TOTAL NUCLEAR-REACTION PROBABILITY OF 270 TO 390 MEV N-14 IONS IN SI AND CSI
WARNER, RE; CARPENTER, CL; FETTER, JM; WAITE, WF; WILSCHUT, HW; HOOGDUIN, JM
A magnetic spectrograph and position-sensitive detectors were used to measure the total nuclear reaction probability eta(R) for alpha + CsI at 116 MeV, N-14 + CsI at 265 and 385 MeV, and N-14 + Si at 271 and 390 MeV. From these eta(R)'s, average reaction cross sections sigma(R) were deduced for
Ozone-surface reactions in five homes: surface reaction probabilities, aldehyde yields, and trends.
Wang, H; Morrison, G
2010-06-01
Field experiments were conducted in five homes during three seasons (summer 2005, summer 2006 and winter 2007) to quantify ozone-initiated secondary aldehyde yields, surface reaction probabilities, and any temporal trends over a 1.5-year interval. Surfaces examined include living room carpets, bedroom carpets, kitchen floors, kitchen counters, and living room walls. Reaction probabilities for all surfaces for all seasons ranged from 9.4 × 10⁻⁸ to 1.0 × 10⁻⁴. There were no significant temporal trends in reaction probabilities for any surfaces from summer 2005 to summer 2006, nor over the entire 1.5-year period, indicating that it may take significantly longer than this period for surfaces to exhibit any 'ozone aging' or lowering of ozone-surface reactivity. However, all surfaces in three houses exhibited a significant decrease in reaction probabilities from summer 2006 to winter 2007. The total yield of aldehydes for summer 2005 was nearly identical to that for summer 2006, but both were significantly higher than for winter 2007. We also observed that older carpets were consistently less reactive than newer carpets, but that countertops remained consistently reactive, probably because of occupant activities such as cooking and cleaning. Ozone reactions taking place at indoor surfaces significantly influence personal exposure to ozone and volatile reaction products. These field studies show that indoor surfaces lose their ability to react with ozone only slowly, over multi-year time frames, probably because of a combination of large reservoirs of reactive coatings and periodic additions of reactive coatings in the form of cooking, cleaning, and skin-oil residues. When considering exposure to ozone and its reaction products, and in the absence of dramatic changes in occupancy, activities or furnishings, indoor surface reactivity is expected to change very slowly.
International Nuclear Information System (INIS)
Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.; Fitzgerald, R.
2010-01-01
Numerical values of charged-particle thermonuclear reaction rates for nuclei in the A=14 to 40 region are tabulated. The results are obtained using a method, based on Monte Carlo techniques, that has been described in the preceding paper of this issue (Paper I). We present a low rate, median rate and high rate which correspond to the 0.16, 0.50 and 0.84 quantiles, respectively, of the cumulative reaction rate distribution. The meaning of these quantities is in general different from the commonly reported, but statistically meaningless expressions, 'lower limit', 'nominal value' and 'upper limit' of the total reaction rate. In addition, we approximate the Monte Carlo probability density function of the total reaction rate by a lognormal distribution and tabulate the lognormal parameters μ and σ at each temperature. We also provide a quantitative measure (Anderson-Darling test statistic) for the reliability of the lognormal approximation. The user can implement the approximate lognormal reaction rate probability density functions directly in a stellar model code for studies of stellar energy generation and nucleosynthesis. For each reaction, the Monte Carlo reaction rate probability density functions, together with their lognormal approximations, are displayed graphically for selected temperatures in order to provide a visual impression. Our new reaction rates are appropriate for bare nuclei in the laboratory. The nuclear physics input used to derive our reaction rates is presented in the subsequent paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
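The tabulated low, median and high rates are the 0.16, 0.50 and 0.84 quantiles, so under the lognormal approximation they reduce very nearly to exp(μ − σ), exp(μ) and exp(μ + σ). A minimal sketch of recovering those quantiles from the tabulated lognormal parameters; the numerical values of μ and σ below are made up for illustration, not taken from the tables:

```python
import math
from statistics import NormalDist

def rate_quantile(mu, sigma, q):
    """Quantile q of a lognormal reaction-rate distribution
    with parameters mu and sigma."""
    return math.exp(mu + sigma * NormalDist().inv_cdf(q))

# Illustrative (invented) lognormal parameters at one temperature:
mu, sigma = math.log(3.2e-15), 0.35
low, median, high = (rate_quantile(mu, sigma, q) for q in (0.16, 0.50, 0.84))
# Since inv_cdf(0.16) is about -0.99, low is close to exp(mu - sigma),
# and the median is exactly exp(mu).
```

Note that the three quantiles do not fully specify the distribution; the μ and σ columns do, which is why the tables carry both.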
Jambrina, P. G.; Lara, Manuel; Menéndez, M.; Launay, J.-M.; Aoiz, F. J.
2012-10-01
Cumulative reaction probabilities (CRPs) at various total angular momenta have been calculated for the barrierless reaction S(1D) + H2 → SH + H at total energies up to 1.2 eV using three different theoretical approaches: time-independent quantum mechanics (QM), quasiclassical trajectories (QCT), and statistical quasiclassical trajectories (SQCT). The calculations have been carried out on the widely used potential energy surface (PES) by Ho et al. [J. Chem. Phys. 116, 4124 (2002), 10.1063/1.1431280] as well as on the recent PES developed by Song et al. [J. Phys. Chem. A 113, 9213 (2009), 10.1021/jp903790h]. The results show that the differences between these two PES are relatively minor and mostly related to the different topologies of the well. In addition, the agreement between the three theoretical methodologies is good, even for the highest total angular momenta and energies. In particular, the good accordance between the CRPs obtained with dynamical methods (QM and QCT) and the statistical model (SQCT) indicates that the reaction can be considered statistical in the whole range of energies in contrast with the findings for other prototypical barrierless reactions. In addition, total CRPs and rate coefficients in the range of 20-1000 K have been calculated using the QCT and SQCT methods and have been found somewhat smaller than the experimental total removal rates of S(1D).
Shiryaev, A N
1996-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.
Energy Technology Data Exchange (ETDEWEB)
Kanki, T [Osaka Univ., Toyonaka (Japan). Coll. of General Education
1976-12-01
We present a quark-gluon-parton model in which quark-partons and gluons make clusters corresponding to two or three constituent quarks (or anti-quarks) in the meson or in the baryon, respectively. We explicitly construct the constituent quark state (cluster) by employing the Kuti-Weisskopf theory and by requiring scaling. The quark additivity of the hadronic total cross sections and the quark counting rules on the threshold powers of various distributions are satisfied. For small x (Feynman fraction), it is shown that the constituent quarks and quark-partons have quite different probability distributions. We apply our model to hadron-hadron inclusive reactions, and clarify that the fragmentation and the diffractive processes relate to the constituent quark distributions, while the processes in or near the central region are controlled by the quark-partons. Our model gives a reasonable interpretation of the experimental data and much improves the usual 'constituent interchange model' result near and in the central region (x ≈ x_T ≈ 0).
Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability
Energy Technology Data Exchange (ETDEWEB)
Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)
2016-06-14
Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and the high variability of chemical species populations. One approach to accelerating the simulation is to allow multiple reaction firings before performing an update, on the assumption that reaction propensities change by a negligible amount during the time interval. Species with small populations that participate in the firings of fast reactions significantly affect both the performance and the accuracy of this approach, and the problem worsens when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection-based mechanism to select the next reaction firing. A reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if that probability is set to one. Our new algorithm reduces the computational cost both of selecting the next reaction firing and of updating the reaction propensities.
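As a rough illustration of the rejection mechanism described above (not the authors' implementation: the two-reaction system, rate constants, bound fractions and the p_min threshold are all invented here), a single selection step might look like:

```python
import random

# Hypothetical two-reaction system on one species S:
# R0: S -> 2S with propensity k0*x, R1: S -> 0 with propensity k1*x.
k0, k1 = 1.0, 1.1

def propensities(x):
    return [k0 * x, k1 * x]

def rssa_step(x, lo_frac=0.9, hi_frac=1.1, p_min=0.95):
    """One rejection-based selection. Propensity bounds come from an
    assumed fluctuation interval around x; the exact propensity is only
    evaluated when the guaranteed ratio a_lo/a_hi falls below p_min."""
    a_hi = [a * hi_frac for a in propensities(x)]
    a_lo = [a * lo_frac for a in propensities(x)]
    total_hi = sum(a_hi)
    while True:
        # candidate reaction chosen proportional to its upper bound
        r = random.uniform(0.0, total_hi)
        j = 0 if r < a_hi[0] else 1
        u = random.random()
        if u <= a_lo[j] / a_hi[j]:
            break              # accepted using the lower bound alone
        if a_lo[j] / a_hi[j] >= p_min:
            break              # bounded acceptance: accept approximately
        if u <= propensities(x)[j] / a_hi[j]:
            break              # exact rejection test as a fallback
    return j

random.seed(1)
firings = [rssa_step(100) for _ in range(1000)]
```

When p_min = 1 the middle branch never fires and the selection reduces to the exact rejection-based algorithm, which is the limiting behaviour the abstract describes.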
Mielke, Steven L.; Truhlar, Donald G.; Schwenke, David W.
1991-01-01
Improved techniques and well-optimized basis sets are presented for application of the outgoing wave variational principle to calculate converged quantum mechanical reaction probabilities. They are illustrated with calculations for the reactions D + H2 yields HD + H with total angular momentum J = 3 and F + H2 yields HF + H with J = 0 and 3. The optimization involves the choice of distortion potential, the grid for calculating half-integrated Green's functions, the placement, width, and number of primitive distributed Gaussians, and the computationally most efficient partition between dynamically adapted and primitive basis functions. Benchmark calculations with 224-1064 channels are presented.
International Nuclear Information System (INIS)
Gocmen, C.
2007-01-01
When the total solar eclipse came into question, people connected eclipses with the earthquake of 17.08.1999. We reasoned that if any physical parameters on the Earth change during a total solar eclipse, we should be able to measure those changes, so we carried out the project 'To Measure Probable Physical Changes On The Earth During Total Solar Eclipse Using Geophysical Methods'. We made gravity, magnetic and self-potential measurements at Konya and Ankara during the total solar eclipse of 29 March 2006, as well as on the day before and the day after the eclipse. Measurements continued around the clock for three days at Konya and during daytime in Ankara. Bogazici University Kandilli Observatory provided magnetic values recorded in Istanbul, which we compared with our own magnetic values, and the Turkish State Meteorological Service sent us temperature and air-pressure observations for the three days in Konya and Ankara. We interpreted all of these data.
International Nuclear Information System (INIS)
Sargsyan, V.V.; Adamian, G.G.; Antonenko, N.V.; Gomes, P.R.S.
2014-01-01
We suggest simple and useful methods to extract reaction and capture (fusion) cross sections from experimental elastic and quasi-elastic backscattering data. The direct measurement of the reaction or capture (fusion) cross section is a difficult task, since it would require measuring the individual cross sections of many reaction channels, most of which could be reached only by specific experiments. This would require different experimental setups, not always available at the same laboratory; consequently, such direct measurements would demand a large amount of beam time and would probably take some years to complete. Because of that, measurements of elastic scattering angular distributions covering the full angular range, combined with optical model analysis, have been used to determine reaction cross sections. This traditional method consists of deriving the parameters of the complex optical potentials that fit the experimental elastic scattering angular distributions and then deriving the reaction cross sections predicted by these potentials. Even so, both the experimental part and the analysis of this latter method are not so simple. In the present work we present a much simpler method to determine reaction and capture (fusion) cross sections. It consists of measuring only elastic or quasi-elastic scattering at one backward angle; from that, the extraction of the reaction or capture cross section can easily be performed. (author)
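A minimal sketch of one reading of the backscattering idea: at a backward angle, the depletion of the measured quasi-elastic yield relative to the Rutherford prediction signals flux lost to reaction channels. The yields and the simple 1 − ratio formula below are illustrative assumptions, not the authors' exact expressions:

```python
# Hedged sketch: reaction probability at one backward angle from the
# ratio of quasi-elastic to Rutherford yields (all numbers hypothetical).
def reaction_probability(qe_yield, rutherford_yield):
    """1 - dsigma_qe/dsigma_Rutherford at a single backward angle."""
    return 1.0 - qe_yield / rutherford_yield

# hypothetical yields at three bombarding energies
# (well below, near, and above the Coulomb barrier)
probs = [reaction_probability(y, 1000.0) for y in (995.0, 750.0, 300.0)]
```

The single-angle measurement replaces a full angular distribution, which is the practical simplification the abstract emphasizes.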
Resonance capture reactions with a total energy detector
International Nuclear Information System (INIS)
Macklin, R.L.
1978-01-01
The determination of nuclear reaction rates is considered; the Moxon-Rae detector and pulse height weighting are reviewed. This method has been especially useful in measuring (n,γ) cross sections. Strength functions and level spacing can be derived from (n,γ) yields. The relevance of neutron capture data to astrophysical nucleosynthesis is pointed out. The total gamma energy detection method has been applied successfully to radiative neutron capture cross section measurements. A bibliography of most of the published papers reporting neutron capture cross sections measured by the pulse height weighting technique is included. 55 references
The modulation of simple reaction time by the spatial probability of a visual stimulus
Directory of Open Access Journals (Sweden)
Carreiro L.R.R.
2003-01-01
Simple reaction time (SRT) in response to visual stimuli can be influenced by many stimulus features. The speed and accuracy with which observers respond to a visual stimulus may be improved by prior knowledge about the stimulus location, which can be obtained by manipulating the spatial probability of the stimulus. However, when higher spatial probability is achieved by holding constant the stimulus location throughout successive trials, the resulting improvement in performance can also be due to local sensory facilitation caused by the recurrent spatial location of a visual target (position priming). The main objective of the present investigation was to quantitatively evaluate the modulation of SRT by the spatial probability structure of a visual stimulus. In two experiments the volunteers had to respond as quickly as possible to the visual target presented on a computer screen by pressing an optic key with the index finger of the dominant hand. Experiment 1 (N = 14) investigated how SRT changed as a function of both the different levels of spatial probability and the subject's explicit knowledge about the precise probability structure of visual stimulation. We found a gradual decrease in SRT with increasing spatial probability of a visual target regardless of the observer's previous knowledge concerning the spatial probability of the stimulus. Error rates, below 2%, were independent of the spatial probability structure of the visual stimulus, suggesting the absence of a speed-accuracy trade-off. Experiment 2 (N = 12) examined whether changes in SRT in response to a spatially recurrent visual target might be accounted for simply by sensory and temporally local facilitation. The findings indicated that the decrease in SRT brought about by a spatially recurrent target was associated with its spatial predictability, and could not be accounted for solely in terms of sensory priming.
Secondary standard neutron detector for measuring total reaction cross sections
International Nuclear Information System (INIS)
Sekharan, K.K.; Laumer, H.; Gabbard, F.
1975-01-01
A neutron detector has been constructed and calibrated for the accurate measurement of total neutron-production cross sections. The detector consists of a 24-inch diameter polyethylene sphere in which eight ¹⁰BF₃ counters have been installed radially. The relative efficiency of this detector has been determined for average neutron energies from 30 keV to 1.5 MeV by counting neutrons from ⁷Li(p,n)⁷Be. By adjusting the radial positions of the ¹⁰BF₃ counters in the polyethylene sphere, the efficiency for neutron detection was made nearly constant over this energy range. The absolute efficiency for the same neutron energy range has been measured by counting the neutrons from the ⁵¹V(p,n)⁵¹Cr and ⁵⁷Fe(p,n)⁵⁷Co reactions and determining the absolute number of residual nuclei produced during the measurement of neutron yield. Details of the absolute efficiency measurements and the use of the detector for measuring total neutron yields from neutron-producing reactions such as ²³Na(p,n)²³Mg are given
Ozone deposition velocities, reaction probabilities and product yields for green building materials
Lamble, S. P.; Corsi, R. L.; Morrison, G. C.
2011-12-01
Indoor surfaces can passively remove ozone that enters buildings, reducing occupant exposure without an energy penalty. However, reactions between ozone and building surfaces can generate and release aerosols and irritating and carcinogenic gases. To identify desirable indoor surfaces, the deposition velocity, reaction probability and carbonyl product yields of building materials considered green (listed, recycled, sustainable, etc.) were quantified. Nineteen separate floor, wall or ceiling materials were tested in a 10 L flow-through laboratory reaction chamber. Inlet ozone concentrations were maintained between 150 and 200 ppb (generally much lower in chamber air), relative humidity at 50%, temperature at 25 °C, and exposure occurred over 24 h. Deposition velocities ranged from 0.25 m h⁻¹ for a linoleum-style flooring up to 8.2 m h⁻¹ for a clay-based paint; the corresponding reaction probabilities ranged from 8.8 × 10⁻⁷ to 6.9 × 10⁻⁵. For all materials, product yields of C₁ through C₁₂ saturated n-aldehydes plus acetone ranged from undetectable to greater than 0.70. The most promising material was a clay wall plaster, which exhibited a high deposition velocity (5.0 m h⁻¹) and a low product yield (
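Deposition velocity and reaction probability can be linked through a resistance-in-series picture, 1/v_d = 1/v_t + 4/(γ⟨v⟩), with ⟨v⟩ the Boltzmann mean molecular speed of ozone. This is a hedged sketch of that conversion; the transport-limited velocity v_t = 12 m h⁻¹ is an assumed value, not one reported in the abstract:

```python
import math

# Mean thermal speed of ozone (M = 48 g/mol) at 25 degrees C.
k_B, T = 1.380649e-23, 298.15
m_o3 = 48e-3 / 6.02214076e23
v_mean = math.sqrt(8 * k_B * T / (math.pi * m_o3))   # roughly 360 m/s

def gamma_from_vd(v_d, v_t):
    """Reaction probability gamma from deposition velocity v_d and an
    ASSUMED transport-limited velocity v_t (both in m/h)."""
    v_d, v_t = v_d / 3600.0, v_t / 3600.0            # m/h -> m/s
    return 4.0 / (v_mean * (1.0 / v_d - 1.0 / v_t))

# Clay-plaster example from the abstract (v_d = 5.0 m/h); v_t is invented.
gamma = gamma_from_vd(5.0, 12.0)
```

With these assumed numbers γ comes out in the 10⁻⁵ range, consistent with the span of reaction probabilities quoted above; the result is sensitive to the assumed v_t when v_d approaches the transport limit.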
International Nuclear Information System (INIS)
Misawa, T.; Itakura, H.
1995-01-01
The present article focuses on a dynamical simulation of molecular motion in liquids. In the simulation involving diffusion-controlled reaction with discrete time steps, lack of information regarding the trajectory within the time step may result in a failure to count the number of reactions of the particles within the step. In order to rectify this, an interpolated diffusion process is used. The process is derived from a stochastic interpolation formula recently developed by the first author [J. Math. Phys. 34, 775 (1993)]. In this method, the probability that reaction has occurred during the time step given the initial and final positions of the particles is calculated. Some numerical examples confirm that the theoretical result corresponds to an improvement over the Clifford-Green work [Mol. Phys. 57, 123 (1986)] on the same matter
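The interpolation correction described above addresses the chance that a reaction occurred inside a time step even though both endpoint positions lie outside the reaction zone. For a one-dimensional walk with an absorbing boundary, the standard Brownian-bridge crossing formula gives that probability in closed form; it is shown here as an analogy to the cited method, not as the authors' exact formula:

```python
import math

def bridge_absorption_prob(x0, x1, D, dt):
    """Probability that a 1-D Brownian path with diffusion coefficient D,
    pinned at x0 > 0 and x1 > 0 at the ends of a step of length dt,
    touched the absorbing boundary at 0 in between (standard
    Brownian-bridge first-passage result)."""
    return math.exp(-x0 * x1 / (D * dt))

# Example: particles one unit from the boundary, unit D, unit step.
p_missed = bridge_absorption_prob(1.0, 1.0, 1.0, 1.0)   # exp(-1)
```

A simulation that only checks endpoint positions would count none of these crossings; weighting each step by this probability recovers the reactions lost to time discretization.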
Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios
Energy Technology Data Exchange (ETDEWEB)
Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W
2005-04-21
Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, this paper introduces a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. The method was benchmarked by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(²³⁶U(d,pf))/P(²³⁸U(d,pf)), which serves as a surrogate for the known cross-section ratio ²³⁶U(n,f)/²³⁸U(n,f). In addition, the P(²³⁸U(d,d′f))/P(²³⁶U(d,d′f)) ratio, a surrogate for the ²³⁷U(n,f)/²³⁵U(n,f) cross-section ratio, was measured for the first time over an unprecedented range of excitation energies.
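The advantage of the ratio can be sketched with toy numbers: a common detection efficiency multiplies both fission probabilities and cancels when the ratio is formed, leaving mainly the statistical uncertainty of the coincidence counts. All counts below are hypothetical, not data from the experiment:

```python
import math

def prob_ratio(nf_1, ns_1, nf_2, ns_2):
    """Ratio P1/P2 of fission probabilities from fission-coincident counts
    nf and singles counts ns for two targets (hypothetical data), plus the
    relative statistical uncertainty assuming Poisson-dominated nf counts.
    Any common efficiency factor in nf_1 and nf_2 cancels in the ratio."""
    p1, p2 = nf_1 / ns_1, nf_2 / ns_2
    rel = math.sqrt(1.0 / nf_1 + 1.0 / nf_2)
    return p1 / p2, rel

ratio, rel_err = prob_ratio(1200, 50000, 900, 48000)
```

Systematic effects that differ between the two targets (target thickness, angular coverage) do not cancel and would still need separate treatment.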
Naine, Tarun Bharath; Gundawar, Manoj Kumar
2017-09-01
We demonstrate a very powerful correlation between the discrete probability of distances of neighboring cells and the thermal wave propagation rate for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. Because no analytical solution exists and the differences in ignition times of adjacent reaction cells follow non-Markovian statistics, the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulation. However, such simulations, which are based on Monte Carlo methods, require several iterations of calculations for different realizations of the distribution of adjacent cells. For several one-dimensional systems differing in the value of the shaping parameter of the gamma distribution, we show that the average reaction front propagation rates obtained from a discrete probability between two limits agree excellently with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations: the thermal wave propagation rates can now be calculated based only on the macroscopic entity of discrete probability.
DEFF Research Database (Denmark)
Skousgaard, Søren Glud; Hjelmborg, Jacob; Skytthe, Axel
2015-01-01
INTRODUCTION: Primary hip osteoarthritis, radiographic as well as symptomatic, is highly associated with increasing age in both genders. However, little is known about the mechanisms behind this, in particular whether this increase is caused by genetic factors. This study examined the risk and heritability of primary osteoarthritis of the hip leading to a total hip arthroplasty, and whether this heritability increased with increasing age. METHODS: In a nationwide population-based follow-up study 118,788 twins from the Danish Twin Register and 90,007 individuals from the Danish Hip Arthroplasty Register ... not have had a total hip arthroplasty at the time of follow-up. RESULTS: There were 94,063 twins eligible for analyses, comprising 835 cases of 36 concordant and 763 discordant twin pairs. The probability increased particularly from 50 years of age. After sex and age adjustment a significant additive ...
Koglin, Johnathon
Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides is essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n,f) and neutron capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments, reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus as in the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the experimental data needed to constrain models. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the 240Pu(alpha, alpha'f) reaction, a surrogate for 239Pu(n,f), at a beam energy of 35.9(2) MeV, at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle. Fission probabilities were separated into five 500 keV bins from 5.5 MeV to
Cluster geometry and survival probability in systems driven by reaction-diffusion dynamics
International Nuclear Information System (INIS)
Windus, Alastair; Jensen, Henrik J
2008-01-01
We consider a reaction-diffusion model incorporating the reactions A→φ, A→2A and 2A→3A. Depending on the relative rates for sexual and asexual reproduction of the quantity A, the model exhibits either a continuous or first-order absorbing phase transition to an extinct state. A tricritical point separates the two phase lines. While we comment on this critical behaviour, the main focus of the paper is on the geometry of the population clusters that form. We observe the different cluster structures that arise at criticality for the three different types of critical behaviour and show that there exists a linear relationship for the survival probability against initial cluster size at the tricritical point only.
Energy Technology Data Exchange (ETDEWEB)
Korhonen, Marko [Department of Mathematics and Statistics, University of Helsinki, FIN-00014 (Finland); Lee, Eunghyun [Centre de Recherches Mathématiques (CRM), Université de Montréal, Quebec H3C 3J7 (Canada)
2014-01-15
We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz, and the resulting model is the q-boson model by Sasamoto and Wadati ["Exact results for one-dimensional totally asymmetric diffusion models," J. Phys. A 31, 6057-6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin ["Macdonald processes," Probab. Theory Relat. Fields (to be published)]. We find the explicit formula for the transition probability of the q-TAZRP via the Bethe ansatz. Using the transition probability we find the probability distribution of the left-most particle's position at time t. To do so we derive a new identity analogous to the identity for the asymmetric simple exclusion process obtained by Tracy and Widom ["Integral formulas for the asymmetric simple exclusion process," Commun. Math. Phys. 279, 815-844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by a contour integral of a determinant.
Frigm, Ryan C.; Hejduk, Matthew D.; Johnson, Lauren C.; Plakalovic, Dragan
2015-01-01
On-orbit collision risk is becoming an increasing mission risk to all operational satellites in Earth orbit. Managing this risk can be disruptive to mission and operations, presents challenges for decision-makers, and is time-consuming for all parties involved. With the planned capability improvements in detecting and tracking smaller orbital debris and capacity improvements to routinely predict on-orbit conjunctions, this mission risk will continue to grow in terms of likelihood and effort. It is a very real possibility that the future space environment will not allow collision risk management and mission operations to be conducted in the same manner as they are today. This paper presents the concept of a finite conjunction assessment, one in which each discrete conjunction is not treated separately but, rather, as part of a continuous event that must be managed concurrently. The paper also introduces the Total Probability of Collision as an analogous metric for finite conjunction assessment operations and provides several options for its usage in a Concept of Operations.
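A natural way to aggregate discrete conjunctions into a single metric, consistent with the name Total Probability of Collision though not necessarily the authors' exact definition, is the independent-events combination:

```python
from math import prod

def total_pc(pcs):
    """Probability that at least one conjunction in the set results in a
    collision, assuming the individual events are independent:
    P_total = 1 - product(1 - Pc_i)."""
    return 1.0 - prod(1.0 - p for p in pcs)

# Hypothetical Pc values for concurrent conjunctions in one assessment window.
p_total = total_pc([1e-4, 5e-5, 2e-6])
```

For small individual Pc values this is close to the simple sum of the Pc's, which is why summing is sometimes used as a quick approximation.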
Lynch, Gillian C.; Halvick, Philippe; Zhao, Meishan; Truhlar, Donald G.; Yu, Chin-Hui; Kouri, Donald J.; Schwenke, David W.
1991-01-01
Accurate three-dimensional quantum mechanical reaction probabilities are presented for the reaction F + H2 yields HF + H on the new global potential energy surface 5SEC for total angular momentum J = 0 over a range of translational energies from 0.15 to 4.6 kcal/mol. It is found that the v-prime = 3 HF vibrational product state has a threshold as low as that for v-prime = 2.
Analysis of partial and total inelasticities obtained from inclusive reactions
International Nuclear Information System (INIS)
Bellandi, J.; Covolan, R.; Costa, C.G.; Montanha, J.; Mundim, L.M.
1994-01-01
An independent analysis of models for the energy dependence of the inelasticity is presented, based on experimental data from inclusive reactions of the type pp → cX (c = π±, K±, p±). 6 refs., 2 figs., 1 tab
Monte Carlo calculation of the total probability for gamma-Ray interaction in toluene
International Nuclear Information System (INIS)
Grau Malonda, A.; Garcia-Torano, E.
1983-01-01
Interaction and absorption probabilities for gamma rays with energies between 1 and 1000 keV have been computed and tabulated. A toluene-based scintillator solution has been assumed in the computation, for both point sources and homogeneously dispersed radioactive material. These tables may be applied to cylinders with radii between 0.25 cm and 1.25 cm and heights between 0.20 cm and 4.07 cm. (Author) 26 refs
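A toy version of such a Monte Carlo computation, for a point source on the axis of a cylinder of the tabulated dimensions; the attenuation coefficient value is an assumption for illustration, not taken from the paper:

```python
import math
import random

def interaction_probability(mu, radius, height, n=20000, seed=0):
    """Monte Carlo estimate of the total interaction probability for
    gamma rays emitted isotropically from a point source at the centre
    of a cylinder; mu is an ASSUMED linear attenuation coefficient
    (1/cm) for the scintillator at one gamma energy."""
    rng = random.Random(seed)
    half = height / 2.0
    acc = 0.0
    for _ in range(n):
        # isotropic direction: cos(theta) uniform on [-1, 1]
        cos_t = rng.uniform(-1.0, 1.0)
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # path length to the side wall or end cap, whichever comes first
        t_side = radius / sin_t if sin_t > 0.0 else math.inf
        t_cap = half / abs(cos_t) if cos_t != 0.0 else math.inf
        L = min(t_side, t_cap)
        acc += 1.0 - math.exp(-mu * L)     # interaction before escape
    return acc / n

# Largest tabulated cylinder (r = 1.25 cm, h = 4.07 cm); mu is invented.
p = interaction_probability(mu=0.096, radius=1.25, height=4.07)
```

A homogeneously dispersed source would add an average over emission points, which is the main extra cost of the second tabulated geometry.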
Photocatalytic degradation of paracetamol: intermediates and total reaction mechanism.
Moctezuma, Edgar; Leyva, Elisa; Aguilar, Claudia A; Luna, Raúl A; Montalvo, Carlos
2012-12-01
The advanced oxidation of paracetamol (PAM) promoted by the TiO2/UV system in aqueous medium was investigated. Monitoring of the reaction by HPLC and TOC demonstrated that, while the oxidation of paracetamol is quite efficient under these conditions, its mineralization is not complete. HPLC indicated the formation of hydroquinone, benzoquinone, p-aminophenol and p-nitrophenol in the reaction mixtures. Further evidence of p-nitrophenol formation was obtained by following the reaction with UV-vis spectroscopy. Continuous monitoring by IR spectroscopy demonstrated the breaking of the aromatic amide present in PAM and the subsequent formation of several aromatic intermediates such as p-aminophenol and p-nitrophenol. These aromatic compounds were eventually converted into trans-unsaturated carboxylic acids. Based on these experimental results, an alternative deacylation mechanism for the photocatalytic oxidation of paracetamol is proposed. Our studies also demonstrated IR spectroscopy to be a useful technique for investigating the oxidative mechanisms of pharmaceutical compounds. Copyright © 2012 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Musho, M.K.; Kozak, J.J.
1984-01-01
A method is presented for calculating exactly the relative width (σ²)^(1/2)/⟨n⟩, the skewness γ1, and the kurtosis γ2 characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere [P. A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)], is based on the use of group theoretic arguments within the framework of the theory of finite Markov processes
Directory of Open Access Journals (Sweden)
Marini P.
2016-01-01
Fission and gamma-decay probabilities of 237U and 239Np have been measured, for the first time simultaneously in dedicated experiments, via the surrogate reactions 238U(3He,4He)237U and 238U(3He,d)239Np, respectively. While good agreement between our data and neutron-induced data is found for the fission probabilities, the gamma-decay probabilities are several times higher than the corresponding neutron-induced data for each studied nucleus. We study the role of the different spin distributions populated in the surrogate and neutron-induced reactions. The compound-nucleus spin distribution populated in the surrogate reaction is extracted from the measured gamma-decay probabilities and used as an input parameter in the statistical model to predict fission probabilities, which are compared to our data. A strong disagreement between our data and the prediction is obtained. Preliminary results from an additional dedicated experiment confirm the observed discrepancies, indicating the need for a better understanding of the formation and decay processes of the compound nucleus.
Energy Technology Data Exchange (ETDEWEB)
Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Roberts, Billy J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kutchenreiter, Mark C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wilcox, Steve [Solar Resource Solutions, LLC, Lakewood, CO (United States); Stoffel, Tom [Solar Resource Solutions, LLC, Lakewood, CO (United States)
2017-07-21
The National Renewable Energy Laboratory (NREL) and collaborators have created a clear-sky probability analysis to help guide viewers of the August 21, 2017, total solar eclipse, the first continent-spanning eclipse in nearly 100 years in the United States. Using cloud and solar data from NREL's National Solar Radiation Database (NSRDB), the analysis provides cloudless sky probabilities specific to the date and time of the eclipse. Although this paper is not intended to be an eclipse weather forecast, the detailed maps can help guide eclipse enthusiasts to likely optimal viewing locations. Additionally, high-resolution data are presented for the centerline of the path of totality, representing the likelihood for cloudless skies and atmospheric clarity. The NSRDB provides industry, academia, and other stakeholders with high-resolution solar irradiance data to support feasibility analyses for photovoltaic and concentrating solar power generation projects.
Bar-Sela, Gil; Abu-Amna, Mahmoud; Hadad, Salim; Haim, Nissim; Shahar, Eduardo
2015-09-01
Vemurafenib and dabrafenib are both orally bioavailable small molecule agents that block mitogen-activated protein kinase signalling in patients with melanoma and BRAF(V600E) mutation. Generalized hypersensitivity reactions to vemurafenib or dabrafenib have not been described. Continuing vemurafenib or dabrafenib therapy despite hypersensitivity reaction is especially important in patients with melanoma and BRAF(V600E) mutation, in whom this mutation plays a critical role in tumour growth. Desensitization protocols to overcome hypersensitivity reactions by gradual reintroduction of small amounts of the offending drug up to full therapeutic doses are available for many anti-cancer agents, including vemurafenib but, to the best of our knowledge, have not been reported for dabrafenib. We describe a patient with metastatic melanoma who developed Type I hypersensitivity reaction to vemurafenib and to subsequent treatment with dabrafenib, and who was successfully treated by drug desensitization which allowed safe prolonged continuation of dabrafenib. The development of hypersensitivity reactions to both dabrafenib and vemurafenib in the current case may be because these drugs have similar chemical structures and cross-react. However, hypersensitivity reaction to a non-medicinal ingredient shared by the two drugs is also possible. Oral desensitization appears to be an option for patients with Type I hypersensitivity to dabrafenib. This approach may permit clinicians to safely administer dabrafenib to patients who experience hypersensitivity reactions to this life-prolonging medication. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Reaction probability of molecular deuterium with a disordered InSb (110) surface
International Nuclear Information System (INIS)
Wolf, B.; Zehe, A.
1987-01-01
A detailed experimental analysis of the interaction of molecular deuterium with sputter-damaged InSb surfaces by the aid of SIMS is given. The sticking probability of D2 and its transformation to a chemisorbed state, resulting in InD+ signals in SIMS measurements, can be determined by adsorption experiments both with and without a hot tungsten filament. The calculated sticking probability of D2, 2 × 10^-4, is at least three orders of magnitude higher than the known value for a cleavage plane of InSb
International Nuclear Information System (INIS)
Khayat, Omid; Afarideh, Hossein; Mohammadnia, Meisam
2015-01-01
In solid state nuclear track detectors of the chemically etched type, nuclear tracks whose centers lie closer than two track radii emerge as overlapping tracks. Track overlapping in this type of detector causes track count losses, which become rather severe at high track densities. Therefore, track counting under these conditions should include a correction factor for the count losses from the different orders of track overlapping, since a number of overlapping tracks may be counted as one track. Another aspect of the problem arises when imaging the whole area of the detector and counting all tracks is not possible; in these conditions a statistical generalization method is desired, applicable to counting a segmented area of the detector so that the results can be generalized to the whole surface. There is also a challenge in counting densely overlapped tracks, because insufficient geometrical or contextual information is available. In this paper we present a statistical counting method which gives the user a relation between the track-overlapping probabilities on a segmented area of the detector surface and the total number of tracks. To apply the proposed method, one can estimate the total number of tracks on a solid state detector of arbitrary shape and dimensions from the average track area, the whole detector surface area, and some orders of the track-overlapping probabilities. It is shown that this method is applicable to high and ultra-high density track images, and that the count-loss error can be attenuated using a statistical generalization approach. - Highlights: • A correction factor for count losses of different track overlapping orders. • For cases where imaging the whole area of the detector is not possible. • Presenting a statistical generalization method for segmented areas. • Giving a relation between the track overlapping probabilities and the total tracks
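The relation between overlap probabilities and the total track count can be illustrated with a simple Poisson (Boolean) overlap model: if track centers form a Poisson process of density ρ and two tracks of radius r overlap when their centers are closer than 2r, the fraction of non-overlapping (singlet) tracks is exp(-4πr²ρ), which can be inverted to recover the true density. This toy model is an assumption for illustration, not the authors' statistical method.

```python
import math

def isolated_fraction(density, radius):
    """Fraction of tracks with no overlapping neighbor under a Poisson
    (Boolean) model: two tracks overlap when centers are closer than 2r."""
    return math.exp(-density * math.pi * (2.0 * radius) ** 2)

def estimate_true_density(f_isolated, radius):
    """Invert the model: recover the true track density from the measured
    fraction of non-overlapping (singlet) tracks on a segmented area."""
    return -math.log(f_isolated) / (math.pi * (2.0 * radius) ** 2)
```

Multiplying the recovered density by the whole detector area then generalizes a segmented count to the full surface, in the spirit of the method described above.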
Anisotropy evidence for the K-shell ionization probability in the Ag(p,p)Ag reaction
International Nuclear Information System (INIS)
Andriamonje, S.
1976-01-01
The ionization probability of silver by 1 MeV protons has been measured at large angles up to 110°. The experimental results have been obtained using coincidences between scattered protons and K X-rays. The angular dependence of the ionization probability at small impact parameters indicates an anisotropy, as expected by Ciochetti and Molinari in their theoretical study of the K-shell ionization probability associated with nuclear reactions. The results have been compared to the predictions of the binary encounter approximation (BEA) method, including relativistic corrections for deflection and binding energy. The anisotropy coefficient deduced from the comparison of experimental and theoretical results is in good agreement with expected values [fr]
DEFF Research Database (Denmark)
Nielsen, Thomas E.; Le Quement, Sebastian; Juhl, Martin
2005-01-01
A study of the Stille reaction of alkenyl iodides and stannanes with structural resemblance to retrosynthetic fragments of a projected total synthesis of the marine alkaloid zoanthamine was carried out. A range of reaction conditions was examined, and a protocol developed by Corey utilizing excess...
A simple functional form for proton-208Pb total reaction cross sections
International Nuclear Information System (INIS)
Majumdar, S.; Deb, P.K.; Amos, K.
2001-01-01
A simple functional form has been found that gives a good representation of the total reaction cross sections for the scattering from 208Pb of protons with energies in the range 30 to 300 MeV. The total reaction cross sections calculated under this approximation compare well (to within a few percent) with those determined from the microscopic optical model potentials
Transition probabilities of 36Cl and 36Ar excited states in heavy ion reactions
International Nuclear Information System (INIS)
Costa, G.J.; Alexander, T.K.; Forster, J.S.; McDonald, A.B.; Towner, I.S.
The reactions 2H(35Cl,pγ)36Cl and 2H(35Cl,nγ)36Ar have been used to determine, by the recoil-distance method, the lifetimes of levels in 36Cl and 36Ar, respectively. Large discrepancies exist in the literature for some lifetimes of 36Cl levels. Transition rates found for the decay of the negative-parity states in 36Ar (4178 (3-), 4974 (2-) and 5171 (5-) keV) are compared with the Maripuu-Hokken model and with RPA and TDA predictions [fr]
Universal trend for heavy-ion total reaction cross sections at energies above the Coulomb barrier
International Nuclear Information System (INIS)
Tavares, O.A.P.; Medeiros, E.L.; Morcelle, V.
2010-06-01
Heavy-ion total reaction cross section measurements for more than one thousand one hundred reaction cases, covering 61 target nuclei in the range 6Li-238U and 158 projectile nuclei from 2H up to 84Kr (mostly exotic ones), have been analysed in a systematic way by using an empirical, three-parameter formula applicable to projectile kinetic energies above the Coulomb barrier. The analysis has shown that the average total nuclear binding energy per nucleon of the interacting nuclei and their radii are the chief quantities describing the cross section patterns. A great number of the cross section data (87%) have been quite satisfactorily reproduced by the proposed formula; therefore, total reaction cross section predictions for new, not yet experimentally investigated reaction cases can be obtained to within 25 percent (or much less) uncertainty (author)
Total Synthesis of (+)-Cytosporolide A via a Biomimetic Hetero-Diels-Alder Reaction.
Takao, Ken-Ichi; Noguchi, Shuji; Sakamoto, Shu; Kimura, Mizuki; Yoshida, Keisuke; Tadano, Kin-Ichi
2015-12-23
The first total synthesis of (+)-cytosporolide A was achieved by a biomimetic hetero-Diels-Alder reaction of (-)-fuscoatrol A with o-quinone methide generated from (+)-CJ-12,373. The dienophile, highly oxygenated caryophyllene sesquiterpenoid (-)-fuscoatrol A, was synthesized from the synthetic intermediate in our previous total synthesis of (+)-pestalotiopsin A. The o-quinone methide precursor, isochroman carboxylic acid (+)-CJ-12,373, was synthesized through a Kolbe-Schmitt reaction and an oxa-Pictet-Spengler reaction. The hetero-Diels-Alder reaction of these two compounds proceeded with complete chemo-, regio-, and stereoselectivity to produce the complicated pentacyclic ring system of the cytosporolide skeleton. This total synthesis unambiguously demonstrates that natural cytosporolide A has the structure previously suggested.
Energy Technology Data Exchange (ETDEWEB)
Burke, J. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hughes, R. O. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Escher, J. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Scielzo, N. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Casperson, R. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ressler, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Saastamoinen, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ota, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Park, H. I. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ross, T. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCleskey, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCleskey, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Austin, R. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rapisarda, G. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-09-21
This technical report documents the surrogate reaction method and experimental results used to determine the desired neutron-induced ^{87}Y(n,γ) cross section and the known ^{90}Zr(n,γ) cross section. This experiment was performed at the STARLiTeR apparatus located at the Texas A&M Cyclotron Institute using the K150 Cyclotron, which produced a 28.56 MeV proton beam. The proton beam impinged on Y and Zr targets to produce the nuclear reactions ^{89}Y(p,d)^{88}Y and ^{92}Zr(p,d)^{91}Zr. Both particle singles data and particle-gamma ray coincidence data were measured during the experiment. These data were used to determine the γ-ray probability as a function of energy for these reactions. The results for the γ-ray probabilities as a function of energy for both these nuclei are documented here. For completeness, extensive tabulated and graphical results are provided in the appendices.
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
Energy Technology Data Exchange (ETDEWEB)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
2018-01-01
We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d + 1)$ dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
Dependence of compound nucleus formation probability on K equilibration time in heavy-ion reactions
International Nuclear Information System (INIS)
Yadav, C.; Thomas, R.G.; Mohanty, A.K.; Kapoor, S.S.
2014-01-01
In the present work, we have carried out an analysis of fragment-anisotropy data for systems selected such that Z1Z2 < 1600 and ZCN < 96, so that both quasifission (QF) and fast fission (FF) are absent and the anomalous anisotropies are due only to pre-equilibrium fission (PEF). It may also be noted that in such cases Jcr (the J above which the fusion pocket vanishes) is less than JBf=0 (the J at which the liquid-drop fission barrier vanishes), so that all J values contribute to PEF as well. According to the PEF model, the observed angular anisotropy of fission fragments in heavy-ion-induced reactions can be written as an admixture of two components: the anisotropy from compound-nucleus (CN) fission and the anisotropy due to non-compound-nucleus (NCN) fission
Quinones as dienophiles in the Diels-Alder reaction: history and applications in total synthesis.
Nawrat, Christopher C; Moody, Christopher J
2014-02-17
In the canon of reactions available to the organic chemist engaged in total synthesis, the Diels-Alder reaction is among the most powerful and well understood. Its ability to rapidly generate molecular complexity through the simultaneous formation of two carbon-carbon bonds is almost unrivalled, and this is reflected in the great number of reported applications of this reaction. Historically, the use of quinones as dienophiles is highly significant, being the very first example investigated by Diels and Alder. Herein, we review the application of the Diels-Alder reaction of quinones in the total synthesis of natural products. The highlighted examples span some 60 years from the landmark syntheses of morphine (1952) and reserpine (1956) by Gates and Woodward, respectively, through to the present day examples, such as the tetracyclines. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Xie, Wei-Qi; Gong, Yi-Xian; Yu, Kong-Xian
2017-09-01
This paper proposes a new reaction headspace gas chromatographic (HS-GC) method for efficiently quantifying the total starch content in wheat flours. A weighed amount of wheat flour was oxidized by potassium dichromate under acidic conditions in a sealed headspace vial. The results show that the starch in wheat flour can be completely converted to carbon dioxide under the given conditions (100 °C for 40 min), and the total starch content in a wheat flour sample can be indirectly quantified by detecting the CO2 formed in the oxidation reaction. The data showed that the relative standard deviation of the reaction HS-GC method in the precision test was less than 3.06%, and the relative differences between the new method and the reference method (titration) were no more than 8.90%. The new reaction HS-GC method is automated, accurate, and can be a reliable tool for determining the total starch content in wheat flours in both laboratory and industrial applications. Graphical abstract: The total starch content in wheat flour can be indirectly quantified by GC detection of the CO2 formed in the oxidation reaction between wheat flour and potassium dichromate under acidic conditions.
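The indirect quantification rests on simple stoichiometry: on complete oxidation, each anhydroglucose unit (C6H10O5, 162.14 g/mol) of starch yields six molecules of CO2. A minimal sketch of the back-calculation follows; the function names are illustrative, and the paper's GC calibration details are not reproduced here.

```python
M_ANHYDROGLUCOSE = 162.14  # g/mol per (C6H10O5) repeat unit of starch

def starch_mass_from_co2(n_co2_mol):
    """Mass of starch (g) implied by the measured moles of CO2,
    assuming complete oxidation: one anhydroglucose unit -> 6 CO2."""
    return (n_co2_mol / 6.0) * M_ANHYDROGLUCOSE

def total_starch_percent(n_co2_mol, sample_mass_g):
    """Total starch content (wt%) of a flour sample."""
    return 100.0 * starch_mass_from_co2(n_co2_mol) / sample_mass_g
```

For example, 6 mmol of measured CO2 would correspond to 0.16214 g of starch under this assumption.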
6,7Li + 28Si total reaction cross sections at near barrier energies
International Nuclear Information System (INIS)
Pakou, A.; Musumarra, A.; Pierroutsakou, D.; Alamanos, N.; Assimakopoulos, P.A.; Divis, N.; Doukelis, G.; Gillibert, A.; Harissopulos, S.; Kalyva, G.; Kokkoris, M.; Lagoyannis, A.; Mertzimekis, T.J.; Nicolis, N.G.; Papachristodoulou, C.; Perdikakis, G.; Roubos, D.; Rusek, K.; Spyrou, S.; Zarkadas, Ch.
2007-01-01
Total reaction cross section measurements for the 6,7Li + 28Si systems have been performed at near-barrier energies. The results indicate that, with respect to the potential anomaly at the barrier, 6Li and 7Li on light targets exhibit a similar energy dependence of the imaginary potential. Comparisons are made with 6,7Li cross sections on light and heavy targets, extracted via previous elastic scattering measurements, and also with CDCC calculations. Energy-dependent parametrisations are also obtained for the total reaction cross sections of 6,7Li on Si, as well as on any target, at near-barrier energies
International Nuclear Information System (INIS)
Schopper, H.; Moorhead, W.G.; Morrison, D.R.O.
1988-01-01
The aim of this report is to present a compilation of cross-sections (i.e. reaction rates) of elementary particles at high energy. The data are presented in the form of tables, plots and some fits, which should be easy for the reader to use and may enable him to estimate cross-sections for presently unmeasured energies. We have analyzed all the data published in the major Journals and Reviews for momenta of the incoming particles larger than ≅ 50 MeV/c, since the early days of elementary particle physics and, for each reaction, we have selected the best cross-section data available. We have restricted our attention to integrated cross-sections, such as total cross-sections, exclusive and inclusive cross-sections etc., at various incident beam energies. We have disregarded data affected by geometrical and/or kinematical cuts which would make them not directly comparable to other data at different energies. Also, in the case of exclusive reactions, we have left out data where not all of the particles in the final state were unambiguously identified. This work contains reactions induced by neutrinos, gammas, charged pions, kaons, nucleons, antinucleons and hyperons. (orig./HSI)
Shu, Shi; Morrison, Glenn C
2011-05-15
Ozone can react homogeneously with unsaturated organic compounds in buildings to generate undesirable products. However, these reactions can also occur on indoor surfaces, especially for low-volatility organics. Conversion rates of ozone with α-terpineol, a representative low-volatility compound, were quantified on surfaces that mimic indoor substrates. Rates were measured for α-terpineol adsorbed to beads of glass, polyvinylchloride (PVC), and dry latex paint, in a plug flow reactor. A newly defined second-order surface reaction rate coefficient, k(2), was derived from the flow reactor model. The value of k(2) ranged from 0.68 × 10(-14) cm(4)s(-1)molecule(-1) for α-terpineol adsorbed to PVC to 3.17 × 10(-14) cm(4)s(-1)molecule(-1) for glass, but was insensitive to relative humidity. Further, k(2) is only weakly influenced by the adsorbed mass but instead appears to be more strongly related to the interfacial activity of α-terpineol. The minimum reaction probability ranged from 3.79 × 10(-6) for glass at 20% RH to 6.75 × 10(-5) for PVC at 50% RH. The combination of high equilibrium surface coverage and high reactivity for α-terpineol suggests that surface conversion rates are fast enough to compete with or even overwhelm other removal mechanisms in buildings such as gas-phase conversion and air exchange.
The heavy-ion total reaction cross-section and nuclear transparency
International Nuclear Information System (INIS)
Rego, R.A.; Hussein, M.S.
1982-10-01
The total reaction cross section of heavy ions at intermediate energies is discussed. The special role played by the individual nucleon-nucleon collisions in determining the nuclear transparency is analysed. Several competing effects arising from the nuclear and Coulomb interactions between the two ions are found to be important in determining σR at lower energies. (Author) [pt]
A new closed form expression for the total reaction cross-section of heavy ions
International Nuclear Information System (INIS)
Rego, R.A.; Hussein, M.S.
1989-02-01
A new analytical expression for the heavy-ion total reaction cross-section which exhibits the macroscopic features of the transparency factor is derived. Comparisons with optical model calculations are made for 12C + 208Pb and 16O + 208Pb at several energies. (author)
Effect of imaginary part of an optical potential on reaction total cross sections
International Nuclear Information System (INIS)
Afanas'ev, G.N.; Dobromyslov, M.B.; Kim Yng Pkhung; Shilov, V.M.
1977-01-01
The effect of the imaginary part of an optical potential on total reaction cross sections is explained. The complex rectangular well model is used (a real rectangular well with an imaginary part W0). Partial permeabilities and cross sections for 16O + 27Al reactions are presented, and the S-matrix is shown to remain unitary. Oscillations of the partial permeabilities and cross sections are observed for small values of the imaginary depth W0, which no longer occur at larger W0; this corresponds to the overlapping and non-overlapping quasistationary levels in the complex rectangular well
Catalytic Asymmetric Total Synthesis of (+)- and (-)-Paeoveitol via a Hetero-Diels-Alder Reaction.
Li, Tian-Ze; Geng, Chang-An; Yin, Xiu-Juan; Yang, Tong-Hua; Chen, Xing-Long; Huang, Xiao-Yan; Ma, Yun-Bao; Zhang, Xue-Mei; Chen, Ji-Jun
2017-02-03
The first catalytic asymmetric total synthesis of (+)- and (-)-paeoveitol has been accomplished in 42% overall yield via a biomimetic hetero-Diels-Alder reaction. The chiral phosphoric acid catalyzed hetero-Diels-Alder reaction showed excellent diastereo- and enantioselectivity (>99:1 dr and 90% ee); two rings and three stereocenters were constructed in a single step to produce (-)-paeoveitol on a scale of 452 mg. This strategy enabled us to selectively synthesize both paeoveitol enantiomers from the same substrates by simply changing the enantiomer of the catalyst.
Microscopic theory of the total reaction cross section and application to stable and exotic nuclei
International Nuclear Information System (INIS)
Hussein, M.S.; Rego, R.A.; Bertulani, C.A.
1990-09-01
The multiple scattering theory is used to develop a theoretical framework for the calculation of the heavy-ion total reaction cross section. The double-scattering contribution to the ion-ion tρ1ρ2 interaction is calculated and found to contribute at most a 10% effect on σR. It is found that, whereas at intermediate energies the tρ1ρ2 interaction accounts reasonably well for the total reaction cross section, indicating the predominance at these energies of single-nucleon knockout, it underestimates σR at lower energies by a large amount. This is mainly due to the absence in tρ1ρ2 of fusion and inelastic surface excitation. The case of exotic (neutron- and proton-rich) nuclei is also discussed. (author)
Reaction and total cross sections for low energy π+ and π- on isospin zero nuclei
International Nuclear Information System (INIS)
Saunders, A.; Høibraten, S.; Kraushaar, J.J.; Kriss, B.J.; Peterson, R.J.; Ristinen, R.A.; Brack, J.T.; Hofman, G.; Gibson, E.F.; Morris, C.L.
1996-01-01
Reaction and total cross sections for π + and π - on targets of 2 H, 6 Li, C, Al, Si, S, and Ca have been measured for beam energies from 42 to 65 MeV. The cross sections are proportional to the target mass at 50 MeV, consistent with transparency to these projectiles. The cross sections are compared to theoretical calculations. copyright 1996 The American Physical Society
Measurement of total reaction cross sections of exotic neutron rich nuclei
International Nuclear Information System (INIS)
Mittig, W.; Chouvel, J.M.; Wen Long, Z.
1987-01-01
Total reaction cross-sections of neutron-rich nuclei from C to Mg in a thick Si target have been measured using detection of the associated γ-rays in 4π geometry. The cross-section increases strongly with neutron excess, indicating an increase of as much as 15% in the reduced strong-absorption radius with respect to stable nuclei
Total Synthesis of Ustiloxin D Utilizing an Ammonia-Ugi Reaction.
Brown, Aaron L; Churches, Quentin I; Hutton, Craig A
2015-10-16
Total synthesis of the highly functionalized cyclic peptide natural product, ustiloxin D, has been achieved in a convergent manner. Our strategy incorporates an asymmetric allylic alkylation to construct the tert-alkyl aryl ether linkage between the dopa and isoleucine residues. The elaborated β-hydroxydopa derivative is rapidly converted to a linear tripeptide through an ammonia-Ugi reaction. Subsequent cyclization and global deprotection affords ustiloxin D in six steps from a known β-hydroxydopa derivative.
International Nuclear Information System (INIS)
Hencken, K.; Trautmann, D.; Baur, G.
1995-01-01
We calculate the impact-parameter-dependent total probability P_total(b) for the electromagnetic production of electron-positron pairs in relativistic heavy-ion collisions in lowest order. We study especially impact parameters smaller than the Compton wavelength of the electron, where the equivalent-photon approximation cannot be used. Calculations with and without a form factor for the heavy ions are done; the influence is found to be small. The lowest-order results are found to violate unitarity and are used for the calculation of multiple-pair production probabilities with the help of the approximate Poisson distribution already found in earlier publications
Modeling networks of coupled enzymatic reactions using the total quasi-steady state approximation.
Directory of Open Access Journals (Sweden)
Andrea Ciliberto
2007-03-01
In metabolic networks, metabolites are usually present in great excess over the enzymes that catalyze their interconversion, and describing the rates of these reactions by the Michaelis-Menten rate law is perfectly valid. This rate law assumes that the concentration of enzyme-substrate complex (C) is much less than the free substrate concentration (S0). However, in protein interaction networks, the enzymes and substrates are all proteins in comparable concentrations, and neglecting C with respect to S0 is not valid. Borghans, De Boer, and Segel developed an alternative description of enzyme kinetics that is valid when C is comparable to S0. We extend this description, which Borghans et al. call the total quasi-steady state approximation, to networks of coupled enzymatic reactions. First, we analyze an isolated Goldbeter-Koshland switch when enzymes and substrates are present in comparable concentrations. Then, on the basis of a real example of the molecular network governing cell cycle progression, we couple two and three Goldbeter-Koshland switches together to study the effects of feedback in networks of protein kinases and phosphatases. Our analysis shows that the total quasi-steady state approximation provides an excellent kinetic formalism for protein interaction networks, because (1) it unveils the modular structure of the enzymatic reactions, (2) it suggests a simple algorithm to formulate correct kinetic equations, and (3) contrary to classical Michaelis-Menten kinetics, it succeeds in faithfully reproducing the dynamics of the network both qualitatively and quantitatively.
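The tQSSA described in this abstract has a closed form worth illustrating. For a single reaction E + S ⇌ C → E + P with total substrate S̄ = S + C, the complex concentration C solves a quadratic. The sketch below (all parameter values illustrative, not from the paper) compares the tQSSA rate with the classical Michaelis-Menten (sQSSA) rate when enzyme and substrate concentrations are comparable:

```python
import math

def tqssa_rate(s_total, e_total, k2, km):
    """Total QSSA: C solves C^2 - (E_T + K_M + S_bar)*C + E_T*S_bar = 0;
    the physically meaningful root is the smaller one. Rate = k2 * C."""
    b = e_total + km + s_total
    c = (b - math.sqrt(b * b - 4.0 * e_total * s_total)) / 2.0
    return k2 * c

def sqssa_rate(s, e_total, k2, km):
    """Classical Michaelis-Menten (standard QSSA) rate law."""
    return k2 * e_total * s / (km + s)

# Illustrative values: enzyme comparable to substrate, where sQSSA breaks down.
k2, km = 1.0, 1.0
e_total, s_total = 1.0, 1.0
print("tQSSA rate:", tqssa_rate(s_total, e_total, k2, km))
print("sQSSA rate:", sqssa_rate(s_total, e_total, k2, km))
```

With these values the sQSSA overestimates the rate, since it ignores the substrate sequestered in complex.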
Total reaction cross section and forward glory for 12C + 16O
International Nuclear Information System (INIS)
Villari, A.C.C.; Lepine-Szily, A.; Lichtenthaler Filho, R.; Portezan Filho, O.; Obuti, M.M.
1988-12-01
A model-independent method is proposed for the determination of the total reaction cross section from elastic angular distributions. This method, based on the generalized optical theorem, was applied to 16 complete elastic angular distributions of the 12C + 16O system at energies between E_CM = 8.55 MeV and E_CM = 26.74 MeV. Some of the angular distributions were measured at the São Paulo Pelletron Laboratory; angular distributions measured by the Erlangen and Saclay groups were also used. The reaction cross section is compared with fusion measurements and with σ_R obtained by indirect methods. The existence of, and conditions for the observation of, nuclear forward glory scattering are investigated. (author)
Stress-reaction during hypokinesia and its effect on total resistance of the animal body
International Nuclear Information System (INIS)
Chernov, I.P.
1980-01-01
Experiments on rats showed that a three-phase stress reaction develops during the formation of the hypokinetic syndrome. This reaction is confirmed by specific changes in the general state of the organism and in body mass, and by the activity of the hypothalamic-hypophysial-adrenal system, evaluated from oscillations in the relative mass of the pituitary body and adrenal glands and by karyometry of neurons of the hypothalamic arcuate nucleus and cells of the zona fasciculata of the adrenal glands. Hypokinetic stress affects the total resistance of the body and its sensitivity to gamma-irradiation at a dose of 800 rad. At a definite stage of development, hypokinetic stress forms a state of heightened 'cross' resistance
Elastic scattering and total reaction cross section for the 6He + 58Ni system
Morcelle, V.; Lichtenthäler, R.; Lépine-Szily, A.; Guimarães, V.; Mendes, D. R., Jr.; Pires, K. C. C.; de Faria, P. N.; Barioni, A.; Gasques, L.; Morais, M. C.; Shorto, J. M. B.; Zamora, J. C.; Scarduelli, V.; Condori, R. Pampa; Leistenschneider, E.
2014-11-01
Elastic scattering measurements of the 6He + 58Ni system have been performed at a laboratory energy of 21.7 MeV. The 6He secondary beam was produced by the transfer reaction 9Be(7Li,6He) and impinged on 58Ni and 197Au targets, using the Radioactive Ion Beam (RIB) facility RIBRAS, installed in the Pelletron Laboratory of the Institute of Physics of the University of São Paulo, Brazil. The elastic angular distribution was obtained in the angular range from 15° to 80° in the center-of-mass frame. Optical-model calculations have been performed using a hybrid potential to fit the experimental data. The total reaction cross section was derived.
Total reaction cross section for 12C+16O below the Coulomb barrier
International Nuclear Information System (INIS)
Cujec, B.; Barnes, C.A.
1976-01-01
The energy dependence of the total reaction cross section, σ(E), for 12C + 16O has been measured over the range E_c.m. = 4-12 MeV by detecting γ-rays from the various possible residual nuclei with two large NaI(Tl) detectors placed close to the target. This technique for measuring total reaction cross sections was explored in some detail and shown to yield reliable values for σ(E). Although the principal emphasis of this work was placed on obtaining reliable cross sections, a preliminary study has been made of the suitability of various methods for extrapolating the cross section to still lower energies. The statistical model provides a good fit, with a reasonable value for the strength function, ⟨θ²⟩ = 6.8×10⁻², over the range E_c.m. = 6.5-12 MeV, but predicts cross sections which are much too large for E_c.m. < 6.5 MeV. Optical-model fits at low energies are especially sensitive to the radius and diffuseness of the imaginary component of the potential and, since these are still poorly known at present, such extrapolations may be wrong by orders of magnitude. A simple barrier penetration model gives a moderately good fit to the data and seems to provide the safest extrapolation to lower energies at the present time. It is clear, however, that our knowledge of the heavy-ion reaction mechanism at low energies is incomplete, and that cross-section measurements at still lower energies are needed to establish the correct procedure for extrapolating heavy-ion reaction cross sections to low energies. (Auth.)
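The sub-barrier extrapolation problem discussed in this abstract is commonly framed through the Gamow penetrability, σ(E) = S(E) E⁻¹ exp(−2πη), with η the Sommerfeld parameter. The sketch below uses only these standard textbook formulas (it is not the paper's particular barrier-penetration model) to show how steeply the penetrability falls for 12C + 16O below the barrier:

```python
import math

ALPHA = 1.0 / 137.035999   # fine-structure constant
AMU_MEV = 931.494          # atomic mass unit in MeV/c^2

def sommerfeld_eta(z1, z2, a1, a2, e_cm_mev):
    """Sommerfeld parameter eta = Z1*Z2*e^2/(hbar*v), non-relativistic:
    eta = Z1*Z2*alpha*sqrt(mu c^2 / (2 E_cm))."""
    mu = a1 * a2 / (a1 + a2) * AMU_MEV   # reduced mass in MeV/c^2
    return z1 * z2 * ALPHA * math.sqrt(mu / (2.0 * e_cm_mev))

def gamow_factor(z1, z2, a1, a2, e_cm_mev):
    """exp(-2*pi*eta): the s-wave penetrability scale entering
    sigma(E) = S(E)/E * exp(-2*pi*eta)."""
    return math.exp(-2.0 * math.pi * sommerfeld_eta(z1, z2, a1, a2, e_cm_mev))

# 12C + 16O: the penetrability drops by many orders of magnitude
# across the measured range E_c.m. = 4-12 MeV.
for e in (4.0, 6.0, 8.0, 12.0):
    print(e, gamow_factor(6, 8, 12.0, 16.0, e))
```

This exponential sensitivity is why the choice of extrapolation model (statistical, optical, or barrier penetration) matters so much at the lowest energies.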
Valence quark annihilation and the total charge in the forward hemisphere of hadron-hadron reactions
International Nuclear Information System (INIS)
Szczekowski, M.
1980-01-01
The consequences of the valence quark annihilation (VQA) mechanism for the energy behaviour of the total net charge of final-state particles in the forward c.m.s. hemisphere (Q_F) in K±p, π+p and pp reactions are examined. The data are in qualitative agreement with VQA model predictions and suggest that at low energies (p_LAB ≈ 10 GeV/c) the VQA provides the dominant contribution to Q_F in K-p and π±p interactions. (author)
Parameterization of α-nucleus total reaction cross section at intermediate energies
International Nuclear Information System (INIS)
Alvi, M A; Abdulmomen, M A
2008-01-01
Applying a Coulomb correction factor to the Glauber model, we have derived a closed expression for the α-nucleus total reaction cross section, σ_R. Under the approximation of the rigid-projectile model, the elastic S-matrix element S_el(b) is evaluated from the phenomenological N-α amplitude and a Gaussian fit to the Helm-model form factor. Excellent agreement with the experimental data has been achieved by performing two-parameter fits to the α-nucleus σ_R data in the energy range from about 75 to 193 MeV. One of the parameters was found to be energy independent, while the other, as expected, shows an energy dependence similar to that of the N-α total cross section.
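The abstract does not reproduce the fitted closed expression, but a generic geometric parameterization with a Coulomb correction factor of the form (1 − B/E_cm) conveys the structure such formulas share; everything below is an illustrative sketch, not the authors' two-parameter fit:

```python
import math

def sigma_r_geometric(ap, at, zp, zt, e_cm_mev, r0=1.2):
    """Geometric total reaction cross section in mb with a Coulomb
    correction factor (1 - B/E_cm). A generic sketch: r0 is a typical
    radius parameter, B is a sharp-sphere Coulomb barrier estimate."""
    R = r0 * (ap ** (1 / 3) + at ** (1 / 3))   # interaction radius, fm
    B = 1.44 * zp * zt / R                     # barrier, MeV (e^2 ~ 1.44 MeV*fm)
    if e_cm_mev <= B:
        return 0.0                             # below barrier: no geometric absorption
    return 10.0 * math.pi * R * R * (1.0 - B / e_cm_mev)  # 1 fm^2 = 10 mb

# alpha + 12C: the Coulomb factor suppresses sigma_R near the barrier
# and the cross section approaches the geometric limit at high energy.
print(sigma_r_geometric(4, 12, 2, 6, 30.0))
print(sigma_r_geometric(4, 12, 2, 6, 100.0))
```

Fitted parameterizations like the one in the paper refine this picture with energy-dependent transparency and surface terms, but the barrier factor plays the same role.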
Study of p-4He Total Reaction cross section using Glauber and Modified Glauber Models
International Nuclear Information System (INIS)
Tag El Din, I.M.A.; Taha, M.M.; Hassan, S.S.A.
2012-01-01
The total nuclear reaction cross-section for p-4He in the energy range from 25 to 1000 MeV is calculated within the Glauber and modified Glauber models. The modified Glauber model is introduced via both the Coulomb trajectory of the projectile and calculation of the effective radius of interaction. The effects of a density-dependent total cross-section and of the phase variation of the nucleon-nucleon scattering amplitude are studied. It is pointed out that the phase variation of the nucleon-nucleon amplitude plays a significant role in describing σ_R at E_p ..., with γ = 2 fm² at ρ = ρ0 = 0 and γ = 2 fm² at ρ = ρ0 = 0.17 fm⁻³.
Preobrazhenskaia, L A; Ioffe, M E; Mats, V N
2004-01-01
The role of the prefrontal cortex in the reaction of active choice between two feeders was investigated under changes in the value and probability of reinforcement. The experiments were performed on two dogs with prefrontal ablation (g. proreus). Before the lesions, the dogs were taught to receive food from two different feeders in response to conditioned stimuli with equally probable alimentary reinforcement. After ablation, the dogs ran from one feeder to the other during the inter-trial intervals, and in response to the conditioned stimuli they repeatedly chose the same feeder. This disturbance of behavior was completely restored after some time. In experiments with competition between probability and value of reinforcement, the dogs chose the feeder with low-probability but better-quality reinforcement. In experiments with equal value but different probability, the intact dogs chose the feeder with the higher probability, whereas in our experiments the dogs with prefrontal lesions chose each feeder equiprobably. Thus, under conditions of free behavior, one of the functions of the prefrontal cortex is the choice of reactions with the higher probability of reinforcement.
Directory of Open Access Journals (Sweden)
Bhargava P
2007-01-01
The principal objectives of arthroplasty are relief of pain and enhancement of range of motion. Currently, postoperative pain and functional capacity are assessed largely on the basis of subjective evaluation scores. Because of the lack of control inherent in this method, it is often difficult to interpret data presented by different observers in the critical evaluation of surgical methods, new components and modes of rehabilitation. Gait analysis is a rapid, simple and reliable method to assess functional outcome. This study was undertaken in an effort to evaluate the gait characteristics of patients who underwent arthroplasty, using an Ultraflex gait analyzer. Materials and Methods: The study was based on the assessment of gait and the weight-bearing pattern of both hips in patients who underwent total hip replacement, and its comparison with an age- and sex-matched control group. Twenty subjects of the total arthroplasty group with unilateral involvement, operated by the posterior approach at our institution and with a minimum six-month postoperative period, were selected. The control group was age- and sex-matched, randomly selected from the general population. Gait analysis was done using an Ultraflex gait analyzer. Gait parameters and vertical ground reaction forces (VGRF) were assessed by measuring the gait-cycle properties, step-time parameters and VGRF variables. Data for the affected limb were compared with the unaffected limb as well as the control group to assess the weight-bearing pattern. Statistical analysis was done by t-test. Results: Frequency is reduced and gait-cycle duration increased in the total arthroplasty group as compared with controls. Step-time parameters including step time, stance time and single-support time are significantly reduced (P < .05), while double-support time and single-swing time are significantly increased (P < .05) in the THR group. Forces over each sensor are increased more on the unaffected limb of the THR group as compared to
Guo, Chengye; Wang, Houyu; Zhang, Lei; Fan, Liuyin; Cao, Chengxi
2013-11-01
A visual, rapid and accurate moving reaction boundary titration (MRBT) method was used for the determination of the total protein in soya-bean milk. During the process, the moving reaction boundary (MRB) was formed by hydroxyl ions in the catholyte and soya-bean milk proteins immobilized in polyacrylamide gel (PAG), and an acid-base indicator was used to denote the boundary motion. The velocity of the MRB has a relationship with protein concentration, which was used to obtain a standard curve. A paired t-test showed no significant difference in protein content between the MRBT and Kjeldahl methods at the 95% confidence interval. The MRBT procedure required about 10 min, and it showed linearity in the range 2.0-14.0 g/L, a low limit of detection (0.05 g/L), good precision (intra-day RSD < 1.90%, inter-day RSD < 4.39%), and high recoveries (97.41%-99.91%). In addition, non-protein nitrogen (NPN) such as melamine added to the soya-bean milk had a weak influence on the MRBT results.
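The standard-curve step described above can be illustrated with an ordinary least-squares fit of boundary velocity against protein concentration, then inverted to read off an unknown sample. The calibration numbers below are hypothetical, not data from the study:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a*x + b (a standard curve)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical calibration: boundary velocity (mm/min) vs protein (g/L),
# spanning the method's stated linear range of 2.0-14.0 g/L.
conc = [2.0, 6.0, 10.0, 14.0]
vel = [0.40, 0.80, 1.20, 1.60]

a, b = fit_line(conc, vel)
unknown_velocity = 1.0
print((unknown_velocity - b) / a)   # invert the curve: concentration of an unknown
```

In practice the fit would also report R² and confidence limits before the curve is used quantitatively.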
Preliminary assessment of the Velocity Pump Reaction Turbine as a geothermal total-flow expander
Energy Technology Data Exchange (ETDEWEB)
Demuth, O.J.
1985-01-01
A preliminary evaluation was made of the Velocity Pump Reaction Turbine (VPRT) as a total-flow expander in a geothermal-electric conversion cycle. Values of geofluid effectiveness of VPRT systems were estimated for conditions consisting of: a 360°F geothermal resource, 60°F wet-bulb ambient temperature, zero and 0.003 mass concentrations of dissolved noncondensible gas in the geofluid, 100 and 120°F condensing temperatures, and engine efficiencies ranging from 0.4 to 1.0. Achievable engine efficiencies were estimated to range from 0.47 to 0.77, with plant geofluid effectiveness values ranging as high as 9.5 W·h/lbm geofluid. This value is competitive with magnitudes of geofluid effectiveness projected for advanced binary plants, and is on the order of 40% higher than estimates for dual-flash steam systems and other total-flow systems reviewed. Because of its potentially high performance and relative simplicity, the VPRT system appears to warrant further investigation toward its use in a well-head geothermal plant. 13 refs., 5 figs.
Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction
Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho
2018-02-01
A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at a standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm⁻² in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample, and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. The experimental results revealed that the proposed LOC device had a performance comparable to the conventional bulky TP analysis
Koyunoglu, Cemil; Karaca, Hüseyin
2017-12-01
Given the high cost of the tetralin solvent commonly used in liquefaction, the use of manure with EL is an important consideration as an alternative hydrogen-transfer source. In addition, since catalyst price is another cost factor, red mud (a commonly available byproduct of aluminium production) reduces the cost of liquefaction of coal, biomass, and coal combined with biomass, which makes putting EL liquefaction on the agenda for our country another important factor. The liquefaction experiments on hydrogen transfer from manure to coal were conducted at fixed conditions: catalyst concentration of 9%, liquid/solid ratio of 3/1, reaction time of 60 min, manure/lignite ratio of 1/3, reaction temperature of 400 °C, stirring speed of 400 rpm, and initial nitrogen pressure of 20 bar. To demonstrate the hydrogen transfer from manure to coal, experiments were also carried out with coal alone, using tetralin (a known hydrogen carrier) and distilled water (not a hydrogen donor) as solvents in the co-liquefaction experiments, under an inert (N2) gas atmosphere. According to the liquefaction results, the total liquid-product (oil + gas) conversion was 38.3% using the tetralin solvent, whereas with distilled water and EL combined with manure it was 7.4%. According to the calorific-value and elemental-analysis results, only the (H/C)atomic ratio of the coal obtained using tetralin increased relative to the liquefaction with manure and distilled water. The increase in the amount of hydrogen, due to hydrogen transfer from the manure onto the solid surface of the coal and onto the inner pore surfaces of the coal during liquefaction, brings about the evaluation of the coal as a
International Nuclear Information System (INIS)
Pensado, Osvaldo; Mancillas, James
2007-01-01
An approach is described to estimate mean consequences and confidence bounds on the mean of seismic events with low probability of breaching components of the engineered barrier system. The approach is aimed at complementing total system performance assessment models used to understand consequences of scenarios leading to radionuclide releases in geologic nuclear waste repository systems. The objective is to develop an efficient approach to estimate mean consequences associated with seismic events of low probability, employing data from a performance assessment model with a modest number of Monte Carlo realizations. The derived equations and formulas were tested with results from a specific performance assessment model. The derived equations appear to be one method to estimate mean consequences without having to use a large number of realizations. (authors)
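A common way to estimate the mean consequence of a low-probability event without an enormous number of realizations is to condition on the event occurring and weight the conditional mean by the event probability, E[C] = p · E[C | event]. The sketch below is a generic illustration of that idea with a hypothetical consequence distribution; it is not the paper's derived equations:

```python
import random
import statistics

def mean_consequence(p_event, consequence_given_event, n=10_000, seed=1):
    """Estimate E[C] = p * E[C | event] by sampling the conditional
    consequence distribution directly, so no realization is 'wasted'
    on the event not occurring. Returns (estimate, 95% half-width)."""
    rng = random.Random(seed)
    samples = [consequence_given_event(rng) for _ in range(n)]
    mean_c = statistics.fmean(samples)
    # Normal-approximation 95% half-width on the conditional mean
    half = 1.96 * statistics.stdev(samples) / n ** 0.5
    return p_event * mean_c, p_event * half

# Hypothetical: seismic event probability 1e-4 per year, with a
# right-skewed (lognormal) consequence given that the event occurs.
est, hw = mean_consequence(1e-4, lambda r: r.lognormvariate(0.0, 0.5))
print(est, hw)
```

Direct Monte Carlo over the joint distribution would need on the order of 1/p realizations just to see the event once; conditioning makes a modest sample size informative.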
L-myo-inosose-1 as a probable intermediate in the reaction catalyzed by myo-inositol oxygenase
International Nuclear Information System (INIS)
Naber, N.I.; Swan, J.S.; Hamilton, G.A.
1986-01-01
In previous investigations, it was necessary to have Fe(II) and cysteine present in order to assay the catalytic activity of purified hog kidney myo-inositol oxygenase. In the present study it was found that, if this purified nonheme iron enzyme is slowly frozen in solution with glutathione and stored at -20 °C, it is fully active in the absence of activators if catalase is present to remove adventitious H2O2. With this simpler assay system it was possible to clarify the effects of several variables on the enzymic reaction. Thus, the maximum velocity is pH-dependent with a maximum around pH 9.5, but the apparent Km for myo-inositol (air atmosphere) remains constant at 5.0 mM throughout a broad pH range. The enzyme is quite specific for its substrate myo-inositol, is very sensitive to oxidants and reductants, but is not affected by a variety of complexing agents, nucleotides, sulfhydryl reagents, etc. In other experiments it was found that L-myo-inosose-1, a potential intermediate in the enzymic reaction, is a potent competitive inhibitor (Ki = 62 μM), while other inososes and a solution thought to contain D-glucodialdehyde, another potential intermediate, are weak inhibitors. Also, both a kinetic deuterium isotope effect (kH/kD = 2.1) and a tritium isotope effect (kH/kT = 7.5) are observed for the enzymic reaction when [1-²H]- and [1-³H]-myo-inositol are used as reactants. These latter results are considered strong evidence that the oxygenase reaction proceeds by a pathway involving L-myo-inosose-1 as an intermediate rather than by an alternative pathway that would have D-glucodialdehyde as the intermediate
Kozaka, Takashi; Miyakoshi, Naoki; Mukai, Chisato
2007-12-21
The total syntheses of (-)-magellanine, (+)-magellaninone, and (+)-paniculatine were completed from diethyl L-tartrate via a common intermediate in a stereoselective manner. The crucial steps in these syntheses involved two intramolecular Pauson-Khand reactions of enynes: the first Pauson-Khand reaction constructed the bicyclo[4.3.0] carbon framework, corresponding to the A and B rings of these alkaloids, in a highly stereoselective manner, whereas the second Pauson-Khand reaction stereoselectively produced the bicyclo[3.3.0] skeleton, which could be converted into the C and D rings of the target natural products.
Elastic scattering and total reaction cross section for the 6He + 27Al system
International Nuclear Information System (INIS)
Benjamim, E.A.; Lepine-Szily, A.; Mendes Junior, D.R.; Lichtenthaeler, R.; Guimaraes, V.; Gomes, P.R.S.; Chamon, L.C.; Hussein, M.S.; Moro, A.M.; Arazi, A.; Padron, I.; Alcantara Nunez, J.; Assuncao, M.; Barioni, A.; Camargo, O.; Denke, R.Z.; Faria, P.N. de; Pires, K.C.C.
2007-01-01
The elastic scattering of the radioactive halo nucleus 6He on a 27Al target was measured at four energies close to the Coulomb barrier using the RIBRAS (Radioactive Ion Beams in Brazil) facility. The São Paulo Potential (SPP) was used, and its diffuseness and imaginary strength were adjusted to fit the elastic scattering angular distributions. Reaction cross-sections were extracted from the optical-model fits. The reduced reaction cross-sections of 6He on 27Al are similar to those for stable, weakly bound projectiles such as 6,7Li and 9Be, and larger than for the stable, tightly bound projectile 16O on 27Al
DEFF Research Database (Denmark)
Tanner, David Ackland; Hagberg, Lars
1998-01-01
A convergent enantioselective total synthesis of the neurotoxic spirocyclic alkaloid (-)-perhydrohistrionicotoxin (2) is described. A Lewis acid-mediated intramolecular imine ene-type reaction was used for the key spirocyclisation step (14 to 3, with 3 being obtained as a single diastereoisomer...
Aung, T. T.; Fujii, T.; Amo, M.; Suzuki, K.
2017-12-01
Understanding the potential methane flux from the Pleistocene fore-arc basin filled with turbiditic sedimentary formations along the eastern Nankai Trough is important for the quantitative assessment of gas hydrate resources. We consider that generated methane can exist in a sedimentary basin in three major components: methane in methane hydrate, free gas, and methane dissolved in water. Generation of biomethane strongly depends on microbial activity, and microbes in turn survive in a diverse range of temperature, salinity and pH. This study aims to understand the effect of reaction temperature and total organic carbon (TOC) on the generation of biomethane and its components. Biomarker analysis and culture-experiment results for the core samples from the eastern Nankai Trough reveal that the methane generation rate peaks at various temperatures ranging from 12.5° to 35°. A simulation study of biomethane generation was made using a commercial basin-scale simulator, PetroMod, with different reaction temperatures and total organic carbon to predict how these affect the generation of biomethane. The reaction model is set by a Gaussian distribution with constant hydrogen index and a standard deviation of 1. A series of simulation cases with peak reaction temperatures ranging from 12.5° to 35° and total organic carbon of 0.6% to 3% were conducted and analyzed. The simulation results show a linear decrease in generation potential with increasing reaction temperature, and the decrease becomes larger in models with higher total organic carbon. At higher reaction temperatures, >30°, extremely low generation potential was found. This is due to the fact that the source formation modeled is less than 1 km in thickness and most of the formation does not reach temperatures above 30°. In terms of the components, methane in methane hydrate and free methane increase with increasing TOC; a drastic increase in free methane was observed in the model with 3% TOC. The methane amount dissolved in water shows almost
Tandem ring-closing metathesis/isomerization reactions for the total synthesis of violacein
DEFF Research Database (Denmark)
Petersen, Mette Terp; Nielsen, Thomas Eiland
2013-01-01
A series of 5-substituted 2-pyrrolidinones was synthesized through a one-pot ruthenium alkylidene-catalyzed tandem RCM/isomerization/nucleophilic addition sequence. The intermediates resulting from RCM/isomerization showed reactivity toward electrophiles in aldol condensation reactions which...
International Nuclear Information System (INIS)
Abul-Magd, A.Y.; Talib aly al Hinai, M.
2000-01-01
In the framework of Glauber's multiple scattering theory we propose a closed-form expression for the total nucleus-nucleus reaction cross-section. We adopt the Gaussian and the two-parameter Fermi radial shapes to describe the nuclear density distributions of the projectile and the target, respectively. The present formula is used to study different systems over a wide energy range, including low-energy reactions, where the role of the Coulomb repulsion is taken into account. The present predictions reasonably reproduce the experimental data
Martin, Alex D; Siamaki, Ali R; Belecki, Katherine; Gupton, B Frank
2015-02-06
A direct and efficient total synthesis has been developed for telmisartan, a widely prescribed treatment for hypertension. This approach brings together two functionalized benzimidazoles using a high-yielding Suzuki reaction that can be catalyzed by either a homogeneous palladium source or graphene-supported palladium nanoparticles. The ability to perform the cross-coupling reaction was facilitated by the regio-controlled preparation of the 2-bromo-1-methylbenzimidazole precursor. This convergent approach provides telmisartan in an overall yield of 72% while circumventing many issues associated with previously reported processes.
Energy Technology Data Exchange (ETDEWEB)
Zhang, H; Kong, V; Jin, J [Georgia Regents University Cancer Center, Augusta, GA (Georgia); Ren, L; Zhang, Y; Giles, W [Duke University Medical Center, Durham, NC (United States)
2015-06-15
Purpose: To present a cone-beam computed tomography (CBCT) system which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed and used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image for pTV, using projections after IPSF processing. A probability map was generated depending on the confidence of estimation in IPSF for the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image of a Catphan phantom, and was compared to the conventional CBCT image without SMOG, images without IPSF (SMOG + FDK and SMOG + mask-TV), and the image without pTV (SMOG + IPSF + FDK). Results: The conventional CBCT without SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + TV) artifacts, possibly due to using projections with missing data. The two approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also halves the imaging dose. Conclusion: The proposed technique is promising for improving CBCT image quality while reducing imaging dose.
Influence of Minimally Invasive Total Hip Replacement on Hip Reaction Forces and Their Orientations
Weber, Tim; Al-Munajjed, Amir A.; Verkerke, Gijsbertus Jacob; Dendorfer, Sebastian; Renkawitz, Tobias
2014-01-01
Minimally invasive surgery (MIS) is becoming increasingly popular. Supporters claim that the main advantages of MIS total hip replacement (THR) are less pain and a faster rehabilitation and recovery. Critics claim that safety and efficacy of MIS are yet to be determined. We focused on a
The total quasi-steady-state approximation for complex enzyme reactions
DEFF Research Database (Denmark)
Pedersen, Morten Gram; Bersani, A. M.; Bersani, E.
2008-01-01
) approximation (or standard quasi-steady-state approximation (sQSSA)), which is valid when the enzyme concentration is sufficiently small. This condition is usually fulfilled for in vitro experiments, but often breaks down in vivo. The total QSSA (tQSSA), which is valid for a broader range of parameters covering...
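The standard tQSSA closed form from the literature (not spelled out in this truncated abstract) can be sketched numerically. The formulas below are the usual ones for a single Michaelis-Menten reaction under the total QSSA; the function names are illustrative.

```python
import math

def tqssa_complex(s_total, e_total, km):
    """Enzyme-substrate complex C under the total QSSA.

    s_total is total substrate (free substrate + complex), e_total the
    total enzyme, km = (k_off + k_cat) / k_on. C is the smaller root of
    the quadratic C^2 - (E_T + K_M + S)C + E_T*S = 0.
    """
    b = e_total + km + s_total
    return (b - math.sqrt(b * b - 4.0 * e_total * s_total)) / 2.0

def tqssa_rate(s_total, e_total, km, kcat):
    """Reaction velocity d[product]/dt = kcat * C under the tQSSA."""
    return kcat * tqssa_complex(s_total, e_total, km)
```

In the low-enzyme limit this recovers the sQSSA rate kcat*E_T*S/(K_M + S), which is why the tQSSA is valid for a broader parameter range than the standard approximation.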
An 11-Step Total Synthesis of Magellanine through a Gold(I)-Catalyzed Dehydro Diels-Alder Reaction.
McGee, Philippe; Bétournay, Geneviève; Barabé, Francis; Barriault, Louis
2017-05-22
We have developed an innovative strategy for the formation of angular carbocycles via a gold(I)-catalyzed dehydro Diels-Alder reaction. This transformation provides rapid access to a variety of complex angular cores in excellent diastereoselectivities and high yields. The usefulness of this Au(I)-catalyzed cycloaddition was further demonstrated by accomplishing an 11-step total synthesis of (±)-magellanine. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Horiuchi, W.; Hatakeyama, S.; Ebata, S.; Suzuki, Y.
2017-08-01
Low-lying electric-dipole (E1) strength of a neutron-rich nucleus contains information on neutron-skin thickness, deformation, and shell evolution. We discuss the possibility of making use of total reaction cross sections on 40Ca, 120Sn, and 208Pb targets to probe the E1 strength of neutron-rich Ca, Ni, and Sn isotopes. They exhibit large enhancement of the E1 strength at neutron number N > 28, 50, and 82, respectively, due to a change of the single-particle orbits near the Fermi surface participating in the transitions. The density distributions and the electric-multipole strength functions of those isotopes are calculated by the Hartree-Fock+BCS and canonical-basis time-dependent Hartree-Fock-Bogoliubov methods, respectively, using three kinds of Skyrme-type effective interaction. The nuclear and Coulomb breakup processes are described with the Glauber model and the equivalent photon method, respectively, in which the effect of the finite charge distribution is taken into account. The three Skyrme interactions give different results for the total reaction cross sections because of different Coulomb breakup contributions. The contribution of the low-lying E1 strength is amplified when a low incident energy is chosen. With an appropriate choice of incident energy and target nucleus, the total reaction cross section can be complementary to Coulomb excitation for analyzing the low-lying E1 strength of unstable nuclei.
Woodworth, Benjamin D; Mead, Rebecca L; Nichols, Courtney N; Kolling, Derrick R J
2015-03-01
Microalgae are an attractive biofuel feedstock because of their high lipid to biomass ratios, lipid compositions that are suitable for biodiesel production, and the ability to grow on varied carbon sources. While algae can grow autotrophically, supplying an exogenous carbon source can increase growth rates and allow heterotrophic growth in the absence of light. Time course analyses of dextrose-supplemented Chlorella vulgaris batch cultures demonstrate that light availability directly influences growth rate, chlorophyll production, and total lipid accumulation. Parallel photomixotrophic and heterotrophic cultures grown to stationary phase reached the same amount of biomass, but total lipid content was higher for algae grown in the presence of light (an average of 1.90 mg/mL vs. 0.77 mg/mL over 5 days of stationary phase growth). Copyright © 2014 Elsevier Ltd. All rights reserved.
van der Velden, C A; Tolk, J J; Janssen, R P A; Reijman, M
2017-05-01
The aim of this study was to assess the currently available evidence about when patients might resume driving after elective, primary total hip arthroplasty (THA) or total knee arthroplasty (TKA) undertaken for osteoarthritis (OA). In February 2016, EMBASE, MEDLINE, Web of Science, Scopus, Cochrane, PubMed Publisher, CINAHL, EBSCO and Google Scholar were searched for clinical studies reporting on 'THA', 'TKA', 'car driving', 'reaction time' and 'brake response time'. Two researchers (CAV and JJT) independently screened the titles and abstracts for eligibility and assessed the risk of bias. Both fixed- and random-effects models were used to pool data and calculate mean differences (MD) and 95% confidence intervals (CI) between pre- and post-operative total brake response time (TBRT). A total of 19 studies were included. The assessment of the risk of bias showed that one study was at high risk, six studies at moderate risk and 12 studies at low risk. Meta-analysis of TBRT showed an MD decrease of 25.54 ms (95% CI -32.02 to 83.09) two weeks after a right-sided THA, and of 18.19 ms (95% CI -6.13 to 42.50) four weeks after a right-sided TKA, when compared with the pre-operative value. The TBRT returned to baseline two weeks after a right-sided THA and four weeks after a right-sided TKA. These results may serve as guidelines for orthopaedic surgeons when advising patients on when to resume driving. However, the advice should be individualised. Cite this article: Bone Joint J 2017;99-B:566-76. ©2017 The British Editorial Society of Bone & Joint Surgery.
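The fixed-effect pooling used in such a meta-analysis is standard inverse-variance weighting. A minimal sketch (the function name and the example numbers are illustrative, not data from the cited studies):

```python
import math

def pooled_mean_difference(mds, ses):
    """Inverse-variance fixed-effect pooling of per-study mean differences.

    mds: list of per-study mean differences; ses: their standard errors.
    Each study is weighted by 1/SE^2; returns (pooled MD, 95% CI bounds).
    """
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
```

A random-effects model, as also used in the study, would additionally inflate each study's variance by a between-study heterogeneity term before weighting.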
International Nuclear Information System (INIS)
Zemlyanaya, E.V.; Lukyanov, K.V.; Lukyanov, V.K.; Hanna, K.M.
2009-01-01
The microscopic optical potential (OP) is calculated for K+-meson scattering on the 12C and 40Ca nuclei at intermediate energies. This potential has no free parameters and is based on the known kaon-nucleon amplitude and nuclear density distribution functions. The Klein-Gordon equation is then written in the form of a relativistic Schrödinger equation, in which terms quadratic in the potential were estimated and found to be negligible. The resulting equation was adapted to the problem at hand and solved numerically. Relativization is shown to play a significant role. A good agreement with the experimental data on differential elastic cross sections is obtained. However, to explain the data on total reaction cross sections, an additional surface term of the OP was introduced to account for the influence of peripheral nuclear reaction channels
Elastic scattering and total reaction cross section for the {sup 6}He+{sup 58}Ni system
Energy Technology Data Exchange (ETDEWEB)
Morcelle, V. [Instituto de Física - Universidade Federal Fluminense, 24210-346, Rio de Janeiro, Brazil and Universidade Federal de Itajubá, 35900-030, Itabira (Brazil); Lichtenthäler, R.; Lépine-Szily, A.; Guimarães, V.; Gasques, L.; Scarduelli, V.; Condori, R. Pampa; Leistenschneider, E. [Depto de Física Nuclear, Universidade de São Paulo, C.P. 66318, 05389-970, São Paulo (Brazil); Mendes Jr, D. R.; Faria, P. N. de [Instituto de Física - Universidade Federal Fluminense, 24210-346, Rio de Janeiro (Brazil); Pires, K. C. C. [Universidade Tecnológica Federal do Paraná, 86300-000, Cornélio Procópio (Brazil); Barioni, A. [Instituto de Física, Universidade Federal da Bahia, 40210-340, Bahia (Brazil); Morais, M. C. [Centro Brasileiro de Pesquisas Físicas, 22290-180, Rio de Janeiro (Brazil); Shorto, J. M. B. [Instituto de Pesquisas Energéticas e Nucleares- IPEN, 05508-000, São Paulo (Brazil); Zamora, J. C. [Departament of Physics, Technische Universität Darmstadt (Germany)
2014-11-11
Elastic scattering measurements of {sup 6}He + {sup 58}Ni system have been performed at the laboratory energy of 21.7 MeV. The {sup 6}He secondary beam was produced by a transfer reaction {sup 9}Be ({sup 7}Li, {sup 6}He) and impinged on {sup 58}Ni and {sup 197}Au targets, using the Radioactive Ion Beam (RIB) facility, RIBRAS, installed in the Pelletron Laboratory of the Institute of Physics of the University of São Paulo, Brazil. The elastic angular distribution was obtained in the angular range from 15° to 80° in the center of mass frame. Optical model calculations have been performed using a hybrid potential to fit the experimental data. The total reaction cross section was derived.
Guo, Cheng-ye; Wang, Hou-yu; Liu, Xiao-ping; Fan, Liu-yin; Zhang, Lei; Cao, Cheng-xi
2013-05-01
In this paper, moving reaction boundary titration (MRBT) was developed for rapid and accurate quantification of total protein in infant milk powder, based on the concept of moving reaction boundary (MRB) electrophoresis. In the method, the MRB was formed by hydroxide ions and the acidic residues of milk proteins immobilized via cross-linked polyacrylamide gel (PAG), and an acid-base indicator was used to denote the boundary motion. As a proof of concept, we chose five brands of infant milk powder to study the feasibility of the MRBT method. A calibration curve of MRB velocity versus logarithmic total protein content was established based on the visual signal of MRB motion as a function of logarithmic milk protein content. Only a weak influence of nonprotein nitrogen (NPN) reagents (e.g., melamine and urea) on the MRBT method was observed, because the MRB was formed with hydroxide ions and the acidic residues of the captured milk proteins, rather than with the alkaline residues or the added NPN reagents. The total protein contents in infant milk powder samples detected via the MRBT method were in good agreement with those obtained by the classic Kjeldahl method. In addition, the developed method was much faster than the Kjeldahl method. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
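A calibration of velocity against logarithmic content, as described above, is an ordinary least-squares fit of v = a + b·log10(content). The sketch below is illustrative (synthetic numbers, hypothetical function names), showing the fit and its inversion to estimate protein content from an observed boundary velocity:

```python
import math

def fit_calibration(contents, velocities):
    """Least-squares fit of boundary velocity v = a + b * log10(content)."""
    xs = [math.log10(c) for c in contents]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(velocities) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, velocities)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def predict_content(a, b, velocity):
    """Invert the calibration: estimate content from a measured velocity."""
    return 10 ** ((velocity - a) / b)
```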
First results of total and partial cross-section measurements of the 107Ag(p,γ)108Cd reaction
Heim, Felix; Mayer, Jan; Scholz, Philipp; Spieker, Mark; Zilges, Andreas
2018-01-01
The γ process is assumed to play an important role in the nucleosynthesis of the majority of the p nuclei. Since the network of the γ process includes so many different reactions and - mainly unstable - nuclei, cross-section values are predominantly calculated in the scope of the Hauser-Feshbach statistical model. The values heavily depend on the nuclear-physics input parameters. The results of total and partial cross-section measurements are used to improve the accuracy of the theoretical calculations. In order to extend the experimental database, the 107Ag(p,γ)108Cd reaction was studied via the in-beam method at the high-efficiency HPGe γ-ray spectrometer HORUS at the University of Cologne. Proton beams with energies between 3.5 MeV and 5.0 MeV were provided by the 10 MV FN-Tandem accelerator leading to the determination of four new total cross-section values. After slight adjustments of the nuclear level density and γ-ray strength function an excellent agreement between theoretical calculations and experimentally deduced values for both total and partial cross sections has been obtained.
Sobieraj, Diana M; Freyer, Craig W
2010-01-01
To report a case of an adverse drug reaction (ADR) in a patient with type 2 diabetes mellitus taking prickly pear cactus (PPC), glipizide, and metformin. A 58-year-old Mexican male with type 2 diabetes mellitus being treated with metformin 1000 mg twice daily and extended-release glipizide 10 mg daily was referred to the pharmacist for medication education. He denied taking herbal supplements or experiencing hypoglycemia. Two hemoglobin A(1c) values (6.8% and 6.7%) obtained over the past year demonstrated glycemic control, which was supported by his reported fasting blood glucose readings of 113-132 mg/dL. One month later, the patient reported 4 hypoglycemic events with blood glucose readings of 49-68 mg/dL, which resulted in discontinuation of glipizide. One month later, the patient denied any further hypoglycemia. During medication reconciliation he reported consuming crude PPC pads daily for 2 months for glucose control. Literature suggests that PPC has an effect on lowering blood glucose levels in patients with type 2 diabetes mellitus, although few identified data describe ADRs from combining PPC with other agents used in treating type 2 diabetes mellitus. A literature search of MEDLINE (through December 2009) using the search terms diabetes mellitus, prickly pear cactus, nopal, opuntia, metformin, glipizide, glyburide, glimepiride, and sulfonylurea revealed no case reports of the described ADR. One case report describing the blood glucose-lowering effect of PPC in a patient concurrently taking oral antihyperglycemics documented an episode of hypoglycemia, although the Naranjo probability scale was not applied. One patient survey discovered the most common drug-herbal interaction in the given population to be between PPC and antihyperglycemic agents, resulting in hypoglycemia. In our case, use of the Naranjo probability scale suggests the ADR to be probable. The mechanism may be due to the additive glucose lowering of the 3 agents consumed concurrently by the
International Nuclear Information System (INIS)
Kaur, Ramanpreet; Vikas
2015-01-01
2-Aminopropionitrile (2-APN), a probable candidate as a chiral astrophysical molecule, is a precursor to the amino acid alanine. Stereochemical pathways in 2-APN are explored using the Global Reaction Route Mapping (GRRM) method, employing high-level quantum-mechanical computations. Besides predicting the conventional mechanism for chiral inversion, which proceeds through an achiral intermediate, the GRRM search reveals a counterintuitive flipping mechanism for 2-APN that proceeds through chiral intermediates. The feasibility of the proposed stereochemical pathways, in terms of the Gibbs free-energy change, is analyzed at temperature conditions akin to the interstellar medium. Notably, the stereoinversion in 2-APN is found to be more feasible than the dissociation of 2-APN and of the intermediates involved along the stereochemical pathways, and the flipping barrier is as low as 3.68 kJ/mol along one of the pathways. The pathways proposed for the inversion of chirality in 2-APN may provide significant insight into the extraterrestrial origin of life
Musijowski, Jacek; Trojanowicz, Marek; Szostek, Bogdan; da Costa Lima, José Luis Fontes; Lapa, Rui; Yamashita, Hiroki; Takayanagi, Toshio; Motomizu, Shoji
2007-09-26
Considering recent reports on the widespread occurrence of, and concerns about, perfluoroalkyl substances (PFAS) in environmental and biological systems, the analysis of these compounds has gained much attention in recent years. The majority of analyte-specific methods are based on LC/MS/MS or GC/MS detection; however, many environmental or biological studies would benefit from a total organic fluorine (TOF) determination. The presented work aimed at developing a method for TOF determination. TOF is determined as the amount of inorganic fluoride obtained after a defluorination reaction conducted off-line with sodium biphenyl reagent directly on the sorbent, without elution of the retained analytes. The recovered fluoride was analyzed using a flow-injection system with either fluorimetric or potentiometric detection. The TOF method was tested using perfluorocarboxylic acids (PFCA), including perfluorooctanoic acid (PFOA), as model compounds. Considering the low concentrations of PFAS in natural samples, solid-phase extraction was evaluated as a preconcentration procedure. Several carbon-based sorbents were tested, namely multi-wall carbon nanotubes, carbon nanofibres and activated carbon. Good sorption of all analytes was achieved, and the defluorination reaction could be carried out directly on the sorbent bed. Recoveries obtained for PFCAs adsorbed on an activated carbon sorbent and measured as TOF were 99.5±1.7, 110±9.4, 95±26, 120±32 and 110±12 for C4, C6, C8, C10 and C12-PFCA, respectively. Two flow systems that would enable the defluorination reaction and fluoride determination in a single system were designed and tested.
Inclusive proton spectra and total reaction cross sections for proton-nucleus scattering at 800 MeV
International Nuclear Information System (INIS)
McGill, J.A.
1981-08-01
Current applications of multiple scattering theory to describe the elastic scattering of medium energy protons from nuclei have been shown to be quite successful in reproducing the experimental cross sections. These calculations use the impulse approximation, wherein the scattering from individual nucleons in the nucleus is described by the scattering amplitude for a free nucleon. Such an approximation restricts the inelastic channels to those initiated by nucleon-nucleon scattering. As a first step in determining the nature of p + nucleus scattering at 800 MeV, both total reaction cross sections and (p,p') inclusive cross sections were measured and compared to the free p + p cross sections for hydrogen, deuterium, calcium 40, carbon 12, and lead 208. It is concluded that as much as 85% of all reactions in a nucleus proceed from interactions with a single nucleon in the nucleus, and that the impulse approximation is a good starting point for a microscopic description of p + nucleus interactions at 800 MeV
African Journals Online (AJOL)
abp
19 October 2017 ... Reaction to Mohamed Said Nakhli et al. concerning the article: "When the axillary block remains the only alternative in a 5-year-old child". ... Bertini L, Savoia G, De Nicola A, Ivani G, Gravino E, Albani A et al ... 2010;7(2):101-.
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
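A toy version of a conjunction collision-probability calculation can be sketched with Monte Carlo sampling. This is not the cited method: the isotropic Gaussian position error in the encounter plane, the parameter names, and the sample counts are all simplifying assumptions for illustration.

```python
import random

def collision_probability(miss_x, miss_y, sigma, radius, n=200_000, seed=1):
    """Monte Carlo estimate of a conjunction collision probability.

    Assumes an isotropic Gaussian position error (standard deviation
    `sigma`) in the 2-D encounter plane around the nominal miss vector
    (miss_x, miss_y); `radius` is the combined hard-body radius. Counts
    the fraction of sampled relative positions falling inside the disk.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = miss_x + rng.gauss(0.0, sigma)
        y = miss_y + rng.gauss(0.0, sigma)
        if x * x + y * y < radius * radius:
            hits += 1
    return hits / n
```

For a zero nominal miss distance the exact answer is the Rayleigh integral 1 - exp(-R²/(2σ²)), which makes a convenient sanity check on the estimator.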
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
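Two of the listed topics, Lundberg's inequality and exact solutions, can be illustrated in the one case where the Cramér-Lundberg model has a simple closed form: exponential claim sizes. The formulas below are the standard textbook results; the function names are illustrative.

```python
import math

def ruin_probability_exponential(u, lam, mu, c):
    """Exact Cramér-Lundberg ruin probability for exponential claims.

    Poisson claim arrival rate `lam`, mean claim size `mu`, premium rate
    `c` (must exceed lam*mu for a positive safety loading). Then
    psi(u) = (lam*mu/c) * exp(-gamma*u) with the adjustment coefficient
    gamma = (c - lam*mu) / (c*mu).
    """
    gamma = (c - lam * mu) / (c * mu)
    return (lam * mu / c) * math.exp(-gamma * u)

def lundberg_bound(u, lam, mu, c):
    """Lundberg's inequality upper bound exp(-gamma*u) for the same model."""
    gamma = (c - lam * mu) / (c * mu)
    return math.exp(-gamma * u)
```

The exact solution sits strictly below the Lundberg bound by the factor lam*mu/c < 1, which is the sense in which the inequality is tight in the exponent but not in the constant.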
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
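The classical one-sample construction that these generalized plots extend pairs the empirical distribution function with the hypothesized CDF at the sorted data points. A minimal sketch (conventions such as i/n versus (i-0.5)/n vary; the names here are illustrative):

```python
import math

def normal_cdf(x, mean=0.0, std=1.0):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (std * math.sqrt(2.0))))

def pp_plot_points(sample, cdf=normal_cdf):
    """Classical one-sample P-P plot points (i/n, F(x_(i))) for sorted data.

    Under the null hypothesis that the data come from F, the points
    should lie close to the diagonal of the unit square.
    """
    xs = sorted(sample)
    n = len(xs)
    return [((i + 1) / n, cdf(x)) for i, x in enumerate(xs)]
```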
Peeters, M; Huang, C L; Vonk, L A; Lu, Z F; Bank, R A; Helder, M N; Doulabi, B Zandieh
2016-11-01
Studies which consider the molecular mechanisms of degeneration and regeneration of cartilaginous tissues are seriously hampered by problematic ribonucleic acid (RNA) isolations due to low cell density and the dense, proteoglycan-rich extracellular matrix of cartilage. Proteoglycans tend to co-purify with RNA, they can absorb the full spectrum of UV light, and they are potent inhibitors of the polymerase chain reaction (PCR). Therefore, the objective of the present study is to compare and optimise different homogenisation methods and RNA isolation kits for an array of cartilaginous tissues. Tissue samples such as the nucleus pulposus (NP), annulus fibrosus (AF), articular cartilage (AC) and meniscus were collected from goats and homogenised by either the MagNA Lyser or Freezer Mill. RNA of duplicate samples was subsequently isolated by either TRIzol (benchmark), or the RNeasy Lipid Tissue, RNeasy Fibrous Tissue, or Aurum Total RNA Fatty and Fibrous Tissue kits. RNA yield, purity, and integrity were determined, and gene expression levels of type II collagen and aggrecan were measured by real-time PCR. No differences between the two homogenisation methods were found. RNA isolation using the RNeasy Fibrous and Lipid kits resulted in the purest RNA (A260/A280 ratio), whereas TRIzol isolations resulted in RNA that was not as pure and showed a larger difference in gene expression between duplicate samples compared with both RNeasy kits. The Aurum kit showed low reproducibility. For the extraction of high-quality RNA from cartilaginous structures, we suggest homogenisation of the samples by the MagNA Lyser. For AC, NP and AF we recommend the RNeasy Fibrous kit, whereas for the meniscus the RNeasy Lipid kit is advised. Cite this article: M. Peeters, C. L. Huang, L. A. Vonk, Z. F. Lu, R. A. Bank, M. N. Helder, B. Zandieh Doulabi. Optimisation of high-quality total ribonucleic acid isolation from cartilaginous tissues for real-time polymerase chain reaction analysis. Bone Joint Res 2016
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators ...
Influence of nucleon density distribution in nucleon emission probability
International Nuclear Information System (INIS)
Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.
2014-01-01
Different decay modes are observed in heavy-ion reactions at low to intermediate energies. It is interesting to study total neutron emission in these reactions, which may receive contributions from many of these decay modes. In an attempt to understand the importance of the mean field and the entrance-channel angular momentum, we study their influence on the emission probability of nucleons in heavy-ion reactions in this work. This study owes its significance to the fact that, once the populations of different states are determined, the emission probability governs the double-differential neutron yield
Grinstead, Charles M; Snell, J Laurie
2011-01-01
This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V
1997-01-01
This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.
Borisova, Kseniya K; Kvyatkovskaya, Elizaveta A; Nikitina, Eugeniya V; Aysin, Rinat R; Novikov, Roman A; Zubkov, Fedor I
2018-04-20
A rare example of chemospecificity in the tandem Diels-Alder reaction of activated alkynes and bis-dienes has been revealed. The reaction between bis-furyl dienes and DMAD occurs at 25-80 °C and leads to kinetically controlled "pincer" adducts, 4a,8a-disubstituted 1,4:5,8-diepoxynaphthalenes. On the contrary, only thermodynamically controlled "domino" adducts (2,3-disubstituted 1,4:5,8-diepoxynaphthalenes) are formed in the same reaction at 140 °C. The "pincer" adducts can be transformed into the "domino" adducts on heating. The rate constants for reactions of both types were calculated using dynamic 1H NMR spectroscopy.
Kerschgens, I. P.; Claveau, E.; Wanner, M.J.; Ingemann, S.; van Maarseveen, J.H.; Hiemstra, H.
2012-01-01
The pharmacologically interesting indole alkaloids (-)-mitragynine, (+)-paynantheine and (+)-speciogynine were synthesised in nine steps from 4-methoxytryptamine by a route featuring (i) an enantioselective thiourea-catalysed Pictet-Spengler reaction, providing the tetrahydro-β-carboline ring and
Taj, Rafiq A; Green, James R
2010-12-03
The application of the Nicholas reaction chemistry of 2,7-dioxygenated naphthalenes in the synthesis of cyclohepta[de]naphthalenes and in the synthesis of (±)-microstegiol (1) is presented. The substitution profile of Nicholas monosubstitution (predominantly C-1) and disubstitution reactions (predominantly 1,6-) on 2,7-dioxygenated naphthalenes is reported. Application of a 1,8-dicondensation product and selected C-1 monocondensation products to the construction of cyclohepta[de]naphthalenes by way of ring-closing metathesis and intramolecular Friedel-Crafts reactions, respectively, is described. Deprotection of the C-7 oxygen function to the corresponding naphthol allows tautomerization to cyclohepta[de]naphthalen-1-ones upon seven-membered-ring closure in most cases, and replacement of the C-2 oxygen function in the naphthalene by a methyl group ultimately allows the synthesis of (±)-microstegiol.
Fillion, Eric; Fishlock, Dan
2005-09-28
The first synthesis of taiwaniaquinol B, a 6-nor-5(6-->7)abeoabietane-type diterpenoid exhibiting the uncommon fused 6-5-6 tricyclic carbon skeleton, was accomplished in 15 steps. A Lewis acid-promoted tandem intramolecular Friedel-Crafts/carbonyl alpha-tert-alkylation reaction was exploited as the core strategy for the synthesis of the sterically congested 1-indanone-containing tricyclic structure. This multiple carbon-carbon bond forming reaction exploits the unique reactivity of Meldrum's acid. The facile precursor synthesis makes this a useful methodology for the expedient modification and assembly of sterically congested 1-indanone-containing ring systems.
Roos, J.A.; Korf, S.J.; Veehof, R.H.J.; van Ommen, J.G.; Ross, J.R.H.
1989-01-01
Experiments using gas mixtures of O2, C2H6 or C2H4 and CH4 or He have been carried out with a Li/MgO catalyst using a well-mixed reaction system which show that the total oxidation products, CO and CO2, are formed predominantly from ethylene, formed in the oxidative coupling of methane. It is
Total Reaction Cross Section of Silicon Induced by ^{4}He in the Energy Range 3-10 MeV/u
Ugryumov, V Yu; Basybekov, K B; Bialkowski, E; Budzanowski, A; Duysebaev, A D; Duysebaev, B A; Zholdybaev, T K; Ismailov, K M; Kadyrzhanov, K K; Kalpakchieva, R; Kugler, A; Kukhtina, I N; Kushniruk, V F; Kuterbekov, K A; Mukhambetzhan, A; Penionzhkevich, Yu E; Sadykov, B M; Skwirczynska, I; Sobolev, Yu G
2003-01-01
The energy dependence of the total reaction cross section for alpha-particles on nat-Si has been directly and accurately measured by the transmission method. These data show that sigma_R has a different energy dependence from theoretical predictions at low energies. The sigma_R corrections due to inelastic scattering to the first excited state were made by integrating the corresponding angular distributions.
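The transmission method rests on exponential attenuation of the beam: N_out = N_in·exp(-n·σ), where n is the target's areal density in atoms/cm², so σ_R = ln(N_in/N_out)/n. A minimal sketch of that arithmetic (function names and the numerical example are illustrative, not the experiment's values):

```python
import math

AVOGADRO = 6.02214076e23

def areal_density(thickness_cm, density_g_cm3, molar_mass_g):
    """Target atoms per cm^2 from thickness, density, and molar mass."""
    return density_g_cm3 * thickness_cm * AVOGADRO / molar_mass_g

def sigma_reaction(n_in, n_out, atoms_per_cm2):
    """Total reaction cross section, in barns, from a transmission count.

    n_in beam particles hit the target and n_out survive without
    reacting; sigma = ln(N_in/N_out) / n, converted from cm^2 to barns
    (1 b = 1e-24 cm^2).
    """
    sigma_cm2 = math.log(n_in / n_out) / atoms_per_cm2
    return sigma_cm2 / 1e-24
```

The inelastic-scattering correction mentioned in the abstract amounts to removing counted "survivors" that actually underwent a reaction, which raises the deduced σ_R.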
Total reaction cross section of silicon induced by 4He in the energy range 3-10 MeV/u
International Nuclear Information System (INIS)
Ugryumov, V.Yu.; Kuznetsov, I.V.; Kalpakchieva, R.
2003-01-01
The energy dependence of the total reaction cross section for α-particles on natSi has been directly and accurately measured by the transmission method. These data show that σ_R has a different energy dependence from theoretical predictions at low energies. The σ_R corrections due to inelastic scattering to the first excited state were made by integrating the corresponding angular distributions
International Nuclear Information System (INIS)
Sihver, L.; Kanai, T.
1992-07-01
We have developed a computer code for calculations of energy loss (dE/dx) and range distributions for heavy ions in any media. The results from our calculations are in very good agreement with previous calculations. We have developed semiempirical total reaction cross section formulae for proton-nucleus (with Zp ≤ 26) and nucleus-nucleus (with Zp and Zt ≤ 26) reactions. These formulae apply for incident energies above 15 MeV and 100 MeV/nucleon, respectively. From the total reaction cross sections, we can calculate the mean free paths and the fluence distributions of protons and heavy ions in any media. We have compared all the calculated reaction cross sections and mean free paths with experimental data, and the agreement is good. We have also constructed a procedure for calculating projectile fragment production cross sections by scaling semiempirical proton-nucleus partial cross section systematics. The scaling is performed using a scaling parameter deduced from our reaction cross section formulae, and additional enhancement factors. All products with atomic number ranging from that of the projectile (Zp) down to Z=2 can be calculated. The agreement between the calculated cross sections and the experimental data is better than earlier published results. (author)
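The step from a total reaction cross section to a mean free path is the standard relation λ = A/(N_A·ρ·σ). A minimal sketch of that conversion (the function name and the numerical example are illustrative assumptions, not values from the cited code):

```python
AVOGADRO = 6.02214076e23

def mean_free_path_cm(sigma_barn, density_g_cm3, molar_mass_g):
    """Nuclear mean free path lambda = 1 / (n * sigma) in cm.

    n is the number density of target nuclei (N_A * rho / A, per cm^3)
    and sigma the total reaction cross section (converted from barns).
    """
    sigma_cm2 = sigma_barn * 1e-24
    n_per_cm3 = AVOGADRO * density_g_cm3 / molar_mass_g
    return 1.0 / (n_per_cm3 * sigma_cm2)
```

The interaction probability over a path length t then follows as 1 - exp(-t/λ), which is what a fluence calculation like the one described would integrate through the medium.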
International Nuclear Information System (INIS)
Karamyan, S.A.; Adam, J.; Belov, A.G.; Chaloun, P.; Norseev, Yu.V.; Stegajlov, V.I.
1997-01-01
Fission-fragment mass distribution has been measured via the cumulative yields of radionuclides detected in the ^232Th(γ,f) reaction at Bremsstrahlung endpoint energies of 12 and 24 MeV. Upper limits on the yields have been estimated for the light nuclei ^24Na, ^28Mg, ^38S, etc., for Th and Ta targets exposed to the 24 MeV Bremsstrahlung. The results are discussed in terms of multimodal fission phenomena and cluster emission from a deformed fissioning system or from a compound nucleus.
Energy Technology Data Exchange (ETDEWEB)
Kabdrakhimova, G. D., E-mail: gaukharkd@gmail.com [L.N.Gumilyov Eurasian National University (Kazakhstan); Sobolev, Yu. G.; Kuhtina, I. N. [Joint Institute for Nuclear Research (Russian Federation); Kuterbekov, K. A. [L.N.Gumilyov Eurasian National University (Kazakhstan); Mendibaev, K. O.; Penionzhkevich, Yu. E. [Joint Institute for Nuclear Research (Russian Federation)
2017-01-15
Experimental excitation functions in terms of the total cross sections for ^6He + Si nuclear reactions are analyzed in the energy range between 5 and 50 MeV/A, and a brief survey of the procedures used to obtain experimental data is given. Particular attention is given to describing experiments performed in beams of radioactive nuclei from the accelerators of the Laboratory of Nuclear Reactions at the Joint Institute for Nuclear Research (JINR, Dubna). The experimental data in question are analyzed on the basis of a semimicroscopic optical model.
Measurement of the helicity-dependent total cross-section for the γn→pπ⁻π⁰ reaction
International Nuclear Information System (INIS)
Ahrens, J.; Arends, H.J.; Beck, R.; Heid, E.; Jahn, O.; Lang, M.; Martinez-Fabregate, M.; Tamas, G.; Thomas, A.; Altieri, S.; Panzeri, A.; Pinelli, T.; Annand, J.R.M.; McGeorge, J.C.; Protopopescu, D.; Rosner, G.; Blackston, M.A.; Weller, H.R.; Bradtke, C.; Dutz, H.; Klein, F.; Rohlof, C.; Braghieri, A.; Pedroni, P.; Hose, N. d'; Fix, A.; Kondratiev, R.; Lisin, V.; Meyer, W.; Reicherz, G.; Rostomyan, T.; Ryckbosch, D.
2011-01-01
The helicity dependence of the total cross-section for the γn→pπ⁻π⁰ reaction has been measured for the first time at incident photon energies from 450 to 800 MeV. The measurement was performed with the large-acceptance detector DAPHNE at the tagged photon beam facility of the MAMI accelerator in Mainz. Both the measured unpolarized and the helicity-dependent observables are not well described by the existing theoretical models. (orig.)
International Nuclear Information System (INIS)
Bueyuekuslu, H.; Kaplan, A.; Aydin, A.; Tel, E.; Yildirim, G.
2010-01-01
In this study, proton total reaction cross sections have been investigated for isotopes such as ^12C, ^27Al, ^9Be, ^16O, ^181Ta, ^197Au, ^6Li, and ^14N for proton beams up to 600 MeV. Calculation of the proton total cross sections has been carried out with the analytic expression formulated by M.A. Alvi, using Coulomb-modified Glauber theory with the Helm-model nuclear form factor. The obtained results have been discussed and compared with the available experimental data and found to be in agreement.
Rodríguez-Sánchez, Jose Luis; David, Jean-Christophe; Mancusi, Davide; Boudard, Alain; Cugnon, Joseph; Leray, Sylvie
2017-11-01
The prediction of one-nucleon-removal cross sections by the Liège intranuclear-cascade model has been improved using a refined description of the matter and energy densities in the nuclear surface. Hartree-Fock-Bogoliubov calculations with the Skyrme interaction are used to obtain a more realistic description of the radial-density distributions of protons and neutrons, as well as the excitation-energy uncorrelation at the nuclear surface due to quantum effects and short-range correlations. The results are compared with experimental data covering a large range of nuclei, from carbon to uranium, and projectile kinetic energies. We find that the new approach is in good agreement with experimental data of one-nucleon-removal cross sections covering a broad range in nuclei and energies. The new ingredients also improve the description of total reaction cross sections induced by protons at low energies, the production cross sections of heaviest residues close to the projectile, and the triple-differential cross sections for one-proton removal. However, other observables such as quadruple-differential cross sections of coincident protons do not present any sizable sensitivity to the new approach. Finally, the model is also tested for light-ion-induced reactions. It is shown that the new parameters can give a reasonable description of the nucleus-nucleus total reaction cross sections at high energies.
International Nuclear Information System (INIS)
Suwoto
2002-01-01
Integral testing of the neutron cross sections for stainless steel SUS-310 contained in various nuclear data files has been performed. The shielding benchmark calculations for SUS-310 were analysed through the ORNL Broomstick Experiment calculation performed by R.E. Maerker at ORNL, USA. Assessments with the JENDL-3.1, JENDL-3.2, ENDF/B-IV and ENDF/B-VI nuclear data files, and with data from GEEL, have also been carried out. The overall calculated results for SUS-310 show good agreement with the experimental data, although underestimates appear below 3 MeV for all nuclear data files. These underestimation tendencies are clearly caused by iron, which makes up more than half of the stainless steel compound: the total neutron cross sections of iron contained in the various nuclear data files are relatively low in that energy range.
Directory of Open Access Journals (Sweden)
Octavio Novaro
2012-01-01
We review ab initio studies based on quantum mechanics of the most important reaction mechanisms leading to C–H, Si–H, and Ge–H bond breaking of methane, silane, and germane, respectively, by a metal atom in its lowest states in Cs symmetry: X (2nd excited state, 1st excited state, and ground state) + YH4 → H3XYH → H + XYH3 and XH + YH3, with X = Au, Zn, Cd, Hg, Al, and Ga, and Y = C, Si, and Ge. Important issues considered here are (a) the role that the occupation of the d-, s-, or p-shells of the metal atom plays in the interactions with a methane, silane, or germane molecule; (b) the role of either singlet or doublet excited states of the metals on the reaction barriers; and (c) the role of transition probabilities for the different families of reacting metals with these gases, using the H–X–Y angle as the reaction coordinate. The breaking of the Y–H bond of YH4 is useful in the production of amorphous hydrogenated films, necessary in several fields of industry.
International Nuclear Information System (INIS)
Varlamov, V.V.; Efimkin, N.G.; Ishkhanov, B.S.; Sapunenko, V.V.; Stepanov, M.E.
1993-01-01
A method based on reduction is proposed for the evaluation of photonuclear reaction cross sections obtained under significant systematic uncertainties (different apparatus functions, calibration and normalization uncertainties). The evaluation method consists of using the real apparatus function (photon spectrum) of each individual experiment to reduce the data to a representation generated by an apparatus function of better quality. The task is to find the most reasonably achievable monoenergetic representation (MRAMR) of the information about the cross section contained in different experimental observables, and to take into account the experimental uncertainties of the calibration and normalization procedures. The method was used to obtain evaluated total photoneutron (γ,xn) reaction cross sections for ^16O, ^28Si, ^natCu, ^141Pr, and ^208Pb. 79 refs., 19 figs., 6 tabs.
International Nuclear Information System (INIS)
Dokmeci, D.; Akpolat, M.; Aydogdu, N.; Uzal, C.; Turan, N.F.
2004-01-01
Radiation therapy plays an important role in curative and palliative treatments of malignant diseases. Because of the lipid component of the membrane, lipid peroxidation has been reported to be particularly susceptible to radiation damage. However, lipid peroxidation is reversed by cellular defense mechanisms, and the use of various antioxidants involved in these mechanisms has recently been suggested to be beneficial. Ibuprofen is known to have antioxidative and/or free-radical-scavenging activities. Our purpose is to examine the antioxidant capacity of ibuprofen in hamsters undergoing total body irradiation (TBI). Ibuprofen was given by gavage at a dose of 10 mg/kg for 15 consecutive days. After this period, the animals were exposed to TBI with ^60Co gamma irradiation at a single dose of 8 Gy. Twenty-four hours after radiation exposure, the hamsters were killed and blood samples were taken. Plasma thiobarbituric acid reactive substances (TBARS) increased significantly after radiation exposure, and ibuprofen diminished the amount of TBARS. Significant protection against the radiation-induced changes in the activities of superoxide dismutase (SOD) and catalase was also recorded in the blood of ibuprofen-treated, irradiated hamsters. These results suggest that ibuprofen, with its antioxidant capacity, could play a modulatory role against cellular damage caused by free radicals induced by TBI. (author)
Energy Technology Data Exchange (ETDEWEB)
Ahrens, J.; Arends, H.J.; Beck, R.; Heid, E.; Jahn, O.; Lang, M.; Martinez-Fabregate, M.; Tamas, G.; Thomas, A. [Universitaet Mainz, Institut fuer Kernphysik, Mainz (Germany); Altieri, S.; Panzeri, A.; Pinelli, T. [INFN, Sezione di Pavia, Pavia (Italy); Universita di Pavia, Dipartimento di Fisica Nucleare e Teorica, Pavia (Italy); Annand, J.R.M.; McGeorge, J.C.; Protopopescu, D.; Rosner, G. [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Blackston, M.A.; Weller, H.R. [Duke University, Department of Physics, Durham, NC (United States); Bradtke, C.; Dutz, H.; Klein, F.; Rohlof, C. [Universitaet Bonn, Physikalisches Institut, Bonn (Germany); Braghieri, A.; Pedroni, P. [INFN, Sezione di Pavia, Pavia (Italy); Hose, N. d' [DSM/DAPNIA/SPhN, CEA Saclay, Gif-sur-Yvette Cedex (France); Fix, A. [Tomsk Polytechnic University, Laboratory of Mathematical Physics, Tomsk (Russian Federation); Kondratiev, R.; Lisin, V. [Academy of Science, INR, Moscow (Russian Federation); Meyer, W.; Reicherz, G. [Ruhr-Universitaet Bochum, Insitut fuer Experimentalphysik, Bochum (Germany); Rostomyan, T. [Universiteit Gent, Subatomaire en Stralingsfysica, Gent (Belgium); INFN, Sezione di Pavia, Pavia (Italy); Ryckbosch, D. [Universiteit Gent, Subatomaire en Stralingsfysica, Gent (Belgium)
2011-03-15
The helicity dependence of the total cross-section for the γn→pπ⁻π⁰ reaction has been measured for the first time at incident photon energies from 450 to 800 MeV. The measurement was performed with the large-acceptance detector DAPHNE at the tagged photon beam facility of the MAMI accelerator in Mainz. Both the measured unpolarized and the helicity-dependent observables are not well described by the existing theoretical models. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Ou, I.; Yamada, Y.; Mori, T.; Yano, T.; Sakuda, M. [Department of Physics, Okayama University, Okayama 700-8530 (Japan); Tamii, A.; Suzuki, T.; Yosoi, M.; Aoi, N.; Ideguchi, E.; Hashimoto, T.; Miki, K.; Ito, T.; Iwamoto, C.; Yamamoto, T. [Research Center for Nuclear Physics (RCNP), Osaka University, Ibaraki, Osaka 567-0047 (Japan); Akimune, H. [Department of Physics, Konan University, Okamoto 8-9-1, Higashinada, Kobe 658-8501 (Japan)
2015-05-15
We propose to measure the γ-ray emission probability from excited states above 5 MeV, including the giant resonance of ^16O and ^12C, as a function of excitation energy in 1-MeV steps. Here, we measure both the excitation energy (E_x = 5-30 MeV) at forward scattering angles (0°-3°) of the ^16O, ^12C(p,p′) reaction using the Grand Raiden spectrometer and the energy of the γ-rays (E_γ) using an array of NaI(Tl) counters. The purpose of the experiment is to provide basic and important information not only for the γ-ray production from primary neutral-current neutrino-oxygen (-carbon) interactions but also for that from the secondary hadronic (neutron-oxygen and -carbon) interactions.
Hu, Liyuan; Song, Yushou; Hou, Yingwei; Liu, Huilan; Li, Hui
2018-07-01
A semi-microscopic analytical expression for the nucleus-nucleus total reaction cross section (σ_R) was proposed based on the strong absorption model. It is suitable for stable nuclei at intermediate energies. The matter density distributions of the nuclei and the nucleon-nucleon total cross section were both considered. In particular, the Fermi motion of the nucleons in a nucleus was also taken into account. The parametrization of σ_R was applied to colliding systems including ^12C. The experimental data at energies from 30 to 1000 MeV/nucleon were well reproduced, from which an approach for deriving σ_R without adjustable parameters was developed. The necessity of considering the Fermi motion effect in the parametrization is discussed.
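The strong-absorption picture behind such parametrizations can be illustrated with its simplest geometric form, σ_R = πR²(1 − B_c/E_cm). This is a generic textbook sketch, not the paper's semi-microscopic expression (which additionally folds in matter densities, the NN cross section, and Fermi motion); the radius constant r0 = 1.2 fm is a conventional assumed value.

```python
import math

def sigma_r_geometric(ap, at, zp, zt, e_cm_mev, r0=1.2):
    """Sharp-cutoff strong-absorption estimate of sigma_R in mb:
    sigma_R = pi * R^2 * (1 - B_c / E_cm), with interaction radius
    R = r0 * (Ap^(1/3) + At^(1/3)) and Coulomb barrier B_c at R."""
    R = r0 * (ap ** (1 / 3) + at ** (1 / 3))   # fm
    e2 = 1.44                                  # MeV*fm
    barrier = zp * zt * e2 / R                 # MeV
    sigma_fm2 = math.pi * R * R * max(0.0, 1.0 - barrier / e_cm_mev)
    return sigma_fm2 * 10.0                    # 1 fm^2 = 10 mb

# 12C + 12C well above the Coulomb barrier: roughly 0.86 b
sig = sigma_r_geometric(12, 12, 6, 6, 100.0)
```

The (1 − B_c/E) factor suppresses σ_R near the barrier; at intermediate energies the geometric term dominates, which is why refinements like the one above concentrate on the surface densities rather than the gross size.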
International Nuclear Information System (INIS)
Berezin, F.N.; Kisurin, V.A.; Nemets, O.F.; Ofengenden, R.G.; Pugach, V.M.; Pavlenko, Yu.N.; Patlan', Yu.V.; Savrasov, S.S.
1981-01-01
An experimental technique for the investigation of three-particle nuclear reactions in kinematically complete experiments is described. The technique provides storage of one-dimensional and two-dimensional energy spectra from several detectors. A block diagram of the measuring system using this technique is presented. The system consists of analog equipment for fast-slow coincidences and of a two-processor complex based on the M-400 computer with a common bus. The two-processor complex, in which each computer has direct access to the memory of the other, makes it possible to separate the functions of data collection and on-line data presentation and to perform the necessary physical calculations. The software of the measuring complex, which includes programs written in ASSEMBLER for the first computer and functional programs written in BASIC for the second, is described. The software of the first computer includes the DISPETCHER dialog control program, a driver package for the control of external devices, an applied program package, and system modules. The technique was tested in an experiment investigating the d + ^10B → α+α+α three-particle reaction at a deuteron energy of 13.6 MeV. The two-dimensional energy spectrum of the reaction obtained with this technique is presented. [ru]
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
International Nuclear Information System (INIS)
Cherkaoui-Tadili, R.
1982-01-01
The total reaction cross-section σ_R for interactions between heavy ions is predicted to decrease rapidly with the energy of the incident projectile over the energy range 10 MeV/A - 100 MeV/A. We present here an experimental method for measuring σ_R in order to test the model-based predictions. The method consists in counting the number of all incoming projectiles and the number of outgoing projectiles that did not interact with the target. The difference between these two numbers corresponds to the number of particles that reacted with the target nuclei and is therefore proportional to σ_R. Values of σ_R have been measured for the system ^12C + ^12C at incident energies of 112 MeV and 996 MeV. The results, 1444 ± 70 (112 MeV) and 994 ± 50 (996 MeV), show a total reaction cross-section decreasing with energy, as predicted by the Glauber model and by optical-model fits to elastic scattering. [fr]
Kim, Ji Yeun; Lee, Jung-Lim
2014-01-01
Background: This study describes the first multiplex real-time polymerase chain reaction assay developed, as a multipurpose assessment, for the simultaneous quantification of total bacteria and three Vibrio spp. (V. parahaemolyticus, V. vulnificus and V. anguillarum) in fish and seawater. The consumption of raw finfish as sushi or sashimi has been increasing the chance of Vibrio outbreaks in consumers. Freshness and quality of fishery products also depend on the total bacterial populations present. Results: The detection sensitivity of the specific targets for the multiplex assay was 1 CFU mL⁻¹ in pure culture and seawater, and 10 CFU g⁻¹ in fish. While total bacterial counts by the multiplex assay were similar to those obtained by cultural methods, the levels of Vibrio detected by the multiplex assay were generally higher than by cultural methods for the same populations. Among the natural samples without Vibrio spp. inoculation, eight out of 10 seawater and three out of 20 fish samples were determined to contain Vibrio spp. Conclusion: Our data demonstrate that this multiplex assay could be useful for the rapid detection and quantification of Vibrio spp. and total bacteria, as a multipurpose tool for surveillance of fish and water quality as well as a diagnostic method. © 2014 The Authors. Journal of the Science of Food and Agriculture published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:24752974
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize the fuel-handling events that occurred at nuclear power plants; the categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
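The event-count step above reduces to a simple binomial rate estimate. A minimal sketch follows, with invented counts (not data from the Framatome report); the zero-event branch uses the standard "rule of three" approximation for a 95% upper bound.

```python
def misload_probability(misloads, total_moves):
    """Point estimate and an approximate 95% upper bound for the
    per-movement misload probability, under a binomial model."""
    p_hat = misloads / total_moves
    if misloads == 0:
        upper = 3.0 / total_moves   # 'rule of three' for zero observed events
    else:
        # normal-approximation upper bound
        upper = p_hat + 1.96 * (p_hat * (1 - p_hat) / total_moves) ** 0.5
    return p_hat, upper

# Invented counts: 2 misloads observed in 40 000 assembly movements
p, u = misload_probability(2, 40_000)
```

The same estimator applies to the damage-event category; only the event counts change.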
Energy Technology Data Exchange (ETDEWEB)
Ambrosino, G; Sorriaux, A [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires
1964-07-01
The experiment described consists in the measurement of the total cross-sections of various materials (aluminium, carbon, fluorine and hydrogen) for mono-energetic 2.77 MeV neutrons obtained from the d,d reaction. The measurement is carried out by transmission. The neutrons are detected by means of a plastic scintillator mounted on a 56 AVP photomultiplier, and are isolated from all secondary phenomena (background noise, scattered neutrons) by coincidence with the helium-3 particles associated with the neutrons from the reaction ^2_1D(^2_1D,n)^3_2He. The helium-3 particles are detected by a PN junction diode operated with reverse bias. An absorption exponential has been traced out using measurements made on seven aluminium bars. The accuracy of the total cross-section measurements is about 10^-2. (authors)
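The seven-bar analysis amounts to a straight-line fit of −ln(T) against absorber thickness, whose slope is nσ. A sketch with synthetic data and an assumed cross-section value (not the measured one):

```python
import math

# Transmission through increasing aluminium thickness follows
# T = exp(-n * sigma * x), so the slope of -ln(T) vs x gives n * sigma.
n_al = 6.022e23 * 2.70 / 26.98          # aluminium atoms per cm^3
sigma_true = 2.5e-24                    # assumed 2.5 b total cross section
xs = [2.0 * i for i in range(1, 8)]     # seven bars in 2 cm steps
ts = [math.exp(-n_al * sigma_true * x) for x in xs]

# least-squares slope through the origin of y = -ln(T) against x
num = sum(x * (-math.log(t)) for x, t in zip(xs, ts))
den = sum(x * x for x in xs)
sigma_fit = (num / den) / n_al
```

With noise-free synthetic transmissions the fit recovers the assumed cross section exactly; in the real measurement the ~1% accuracy quoted above comes from counting statistics on each bar.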
Constraints on the ^22Ne(α,n)^25Mg reaction rate from ^natMg+n total and ^25Mg(n,γ) cross sections
Koehler, Paul
2002-10-01
The ^22Ne(α,n)^25Mg reaction is the neutron source during the s process in massive and intermediate-mass stars, as well as a secondary neutron source during the s process in low-mass stars. Therefore, an accurate determination of this rate is important for a better understanding of the origin of nuclides heavier than iron as well as for improving s-process models. Also, because the s process produces seed nuclides for a later p process in massive stars, an accurate value for this rate is important for a better understanding of the p process. Because the lowest observed resonance in direct ^22Ne(α,n)^25Mg measurements is considerably above the most important energy range for s-process temperatures, the uncertainty in this rate is dominated by the poorly known properties of states in ^26Mg between this resonance and threshold. Neutron measurements can observe these states with much better sensitivity and determine their parameters much more accurately than direct ^22Ne(α,n)^25Mg measurements. I have analyzed previously reported Mg+n total and ^25Mg(n,γ) cross sections to obtain a much improved set of resonance parameters for states in ^26Mg in this region, and an improved estimate of the uncertainty in the ^22Ne(α,n)^25Mg reaction rate. This work was supported by the U.S. DOE under contract No. DE-AC05-00OR22725 with UT-Battelle, LLC.
Hue, B. M.; Isataev, T.; Erdemchimeg, B.; Artukh, A. G.; Aznabaev, D.; Davaa, S.; Klygin, S. A.; Kononenko, G. A.; Khuukhenkhuu, G.; Kuterbekov, K.; Lukyanov, S. M.; Mikhailova, T. I.; Maslov, V. A.; Mendibaev, K.; Sereda, Yu M.; Penionzhkevich, Yu E.; Vorontsov, A. N.
2017-12-01
Preliminary results of measurements of the total reaction cross sections σ_R and neutron-removal cross sections σ_-xn for the weakly bound nuclei ^6He, ^8Li, ^9Be and ^10Be in the energy range (20-35) A MeV on a ^28Si target are presented. The secondary beams of light nuclei were produced by bombardment of a Be target with the ^22Ne (35 A MeV) primary beam and separated by the COMBAS fragment-separator. In the dispersive focal plane a horizontal slit defined the momentum acceptance as 1%, and a wedge degrader of 200 μm Al was installed. The Bρ of the second section of the fragment-separator was adjusted for measurements in the energy range (20-35) A MeV. Two-neutron removal cross sections for ^6He and ^10Be and one-neutron removal cross sections for ^8Li and ^9Be were measured.
Duysebaev, A D; Kuchtina, I N; Sadykov, B M; Slusarenko, L I; Tokarevsky, V V; Fayans, S A
2001-01-01
A complex analysis of experimental data on elastic scattering, inelastic scattering and total reaction cross-sections of alpha-particles on ^{90,94}Zr nuclei is performed. Values of the deformation lengths and of the neutron-proton multipole matrix element ratios for the 2_1^+ and 3_1^+ states of ^{90,92,94,96}Zr nuclei for different types of particles are obtained, and a comparative analysis is made. Experimental data for inelastic scattering of 35.4, 40.0, 50.1 and 65.0 MeV alpha-particles on ^{90,94}Zr nuclei are analysed to understand the phase shifts in the framework of the unified approach.
Compton, N.; Taylor, C. E.; Hicks, K.; Cole, P.; Zachariou, N.; Ilieva, Y.; Nadel-Turonski, P.; Klempt, E.; Nikonov, V. A.; Sarantsev, A. V.; Adhikari, K. P.; Adhikari, S.; Akbar, Z.; Anefalos Pereira, S.; Avakian, H.; Baltzell, N. A.; Battaglieri, M.; Batourine, V.; Bedlinskiy, I.; Biselli, A. S.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Camp, M.; Cao, Frank Thanh; Cao, T.; Carman, D. S.; Celentano, A.; Charles, G.; Chetry, T.; Ciullo, G.; Clark, L.; Cole, P. L.; Contalbrigo, M.; Cortes, O.; Crede, V.; D'Angelo, A.; Dashyan, N.; De Vita, R.; De Sanctis, E.; Deur, A.; Djalali, C.; Dupre, R.; Egiyan, H.; El Alaoui, A.; El Fassi, L.; Elouadrhiri, L.; Eugenio, P.; Fedotov, G.; Filippi, A.; Fleming, J. A.; Fradi, A.; Gavalian, G.; Ghandilyan, Y.; Giovanetti, K. L.; Girod, F. X.; Glazier, D. I.; Gleason, C.; Golovatch, E.; Gothe, R. W.; Griffioen, K. A.; Guidal, M.; Guo, L.; Hafidi, K.; Hakobyan, H.; Hanretty, C.; Harrison, N.; Heddle, D.; Holtrop, M.; Hughes, S. M.; Hyde, C. E.; Ireland, D. G.; Ishkhanov, B. S.; Isupov, E. L.; Jenkins, D.; Jo, H. S.; Joo, K.; Joosten, S.; Keller, D.; Khachatryan, G.; Khachatryan, M.; Khandaker, M.; Kim, W.; Klein, A.; Klein, F. J.; Kubarovsky, V.; Kuleshov, S. V.; Lanza, L.; Lenisa, P.; Livingston, K.; Lu, H. Y.; MacGregor, I. J. D.; Markov, N.; McKinnon, B.; Meyer, C. A.; Mineeva, T.; Mirazita, M.; Mokeev, V.; Montgomery, R. A.; Movsisyan, A.; Munevar, E.; Munoz Camacho, C.; Murdoch, G.; Nadel-Turonski, P.; Niccolai, S.; Niculescu, G.; Niculescu, I.; Osipenko, M.; Ostrovidov, A. I.; Paolone, M.; Paremuzyan, R.; Park, K.; Pasyuk, E.; Phelps, W.; Pisano, S.; Pogorelko, O.; Price, J. W.; Prok, Y.; Protopopescu, D.; Raue, B. A.; Ripani, M.; Ritchie, B. G.; Rizzo, A.; Rosner, G.; Sabatié, F.; Salgado, C.; Schumacher, R. A.; Sharabian, Y. G.; Simonyan, A.; Skorodumina, Iu.; Smith, G. D.; Sokhan, D.; Sparveris, N.; Stankovic, I.; Stepanyan, S.; Strakovsky, I. 
I.; Strauch, S.; Taiuti, M.; Torayev, B.; Trivedi, A.; Ungaro, M.; Voskanyan, H.; Voutier, E.; Walford, N. K.; Watts, D. P.; Wei, X.; Wood, M. H.; Zachariou, N.; Zhang, J.; CLAS Collaboration
2017-12-01
We report the first measurement of differential and total cross sections for the γd → K⁰Λ(p) reaction, using data from the CLAS detector at the Thomas Jefferson National Accelerator Facility. Data collected during two separate experimental runs were studied, with photon-energy coverage of 0.8-3.6 GeV and 0.5-2.6 GeV, respectively. The two measurements are consistent, giving confidence in the method and in the determination of systematic uncertainties. The cross sections are compared with predictions from the KAON-MAID theoretical model (without kaon exchange), which deviate from the data at higher W and at forward kaon angles. These data, along with previously published cross sections for K⁺Λ photoproduction, provide essential constraints on the nucleon resonance spectrum. A first partial-wave analysis was performed that describes the data without the introduction of new resonances.
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
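The definition above is easy to exhibit numerically. A minimal sketch, with an assumed toy setup (not an example from the paper): suppose evidence E says the response is non-negative, while the model M is a Gaussian regression; the leakage is M's mass on the impossible region y < 0.

```python
from statistics import NormalDist

def leakage(mu, sigma):
    """Probability mass the model M = Normal(mu, sigma) assigns to the
    region y < 0, which evidence E declares impossible."""
    return NormalDist(mu, sigma).cdf(0.0)

# A fitted mean one standard deviation above zero still leaks ~16%
print(round(leakage(1.0, 1.0), 4))  # 0.1587
```

The leakage shrinks as mu/sigma grows but never reaches zero, which is the sense in which such a model cannot be calibrated empirically against E.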
ROBERTS, DA; BECCHETTI, FD; BROWN, JA; JANECKE, J; PHAM, K; ODONNELL, TW; WARNER, RE; RONNINGEN, RM; WILSCHUT, HW
1995-01-01
A primary O-17 beam has been used to produce a 22.3 MeV/nucleon F-18(m) isomeric secondary beam via a single nucleon transfer reaction on a carbon target. The total nuclear reaction cross sections for F-18(m) and F-18(g.s.) in silicon were measured in a stack of seven silicon solid-state detectors.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
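A classic example of this phenomenon (whether it is among the article's three problems is not stated here) is the matching, or derangement, problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e.

```python
import math
from fractions import Fraction

def derangement_probability(n):
    """P(no fixed point) for a uniformly random permutation of n items:
    sum_{k=0}^{n} (-1)^k / k!, the inclusion-exclusion series for 1/e."""
    return float(sum(Fraction((-1) ** k, math.factorial(k))
                     for k in range(n + 1)))

print(abs(derangement_probability(12) - 1 / math.e) < 1e-9)  # True
```

The convergence is remarkably fast: the truncation error is below 1/(n+1)!, so already at n = 12 the probability agrees with 1/e to ten decimal places.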
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
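The probability function of the binomial distribution mentioned above, together with its mean and standard deviation, is compact to compute directly; a minimal sketch:

```python
from math import comb

def binom_pmf(n, k, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def binom_mean(n, p):
    return n * p

def binom_std(n, p):
    return (n * p * (1 - p)) ** 0.5

# Four fair coin flips: P(exactly 2 heads) = 6 * (1/2)^4 = 0.375
print(binom_pmf(4, 2, 0.5), binom_mean(4, 0.5), binom_std(4, 0.5))
```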
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Directory of Open Access Journals (Sweden)
Erfan Shahabpoor
2018-06-01
Continuous monitoring of natural human gait in real-life environments is essential in many applications, including disease monitoring, rehabilitation, and professional sports. Wearable inertial measurement units are successfully used to measure body kinematics in real-life environments and to estimate the total walking ground reaction forces GRF(t) using equations of motion. However, for inverse dynamics and clinical gait analysis, the GRF(t) of each foot is required separately. Using an experimental dataset of 1243 tri-axial separate-foot GRF(t) time histories measured by the authors across eight years, this study proposes the 'Twin Polynomial Method' (TPM) to estimate the tri-axial left- and right-foot GRF(t) signals from the total GRF(t) signals. For each gait cycle, TPM fits polynomials of degree five, eight, and nine to the known single-support parts of the left- and right-foot vertical, anterior-posterior, and medial-lateral GRF(t) signals, respectively, to extrapolate the unknown double-support parts of the corresponding GRF(t) signals. Validation of the proposed method both against force plate measurements (the gold standard) in the laboratory and in a real-life environment showed a peak-to-peak normalized root mean square error of less than 2.5%, 6.5% and 7.5% for the estimated GRF(t) signals in the vertical, anterior-posterior and medial-lateral directions, respectively. These values show considerable improvement compared with the currently available GRF(t) decomposition methods in the literature.
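The core TPM step — fit a polynomial to the known single-support part of one foot's signal and extrapolate it over the double-support interval — can be sketched as follows. The degree-5 choice matches the vertical component reported above, but the signal here is a smooth synthetic stand-in, not real GRF data.

```python
import numpy as np

def extrapolate_double_support(t_known, f_known, t_unknown, degree=5):
    """Fit a polynomial to the known single-support samples of one
    foot's GRF and evaluate it over the double-support interval."""
    coeffs = np.polyfit(t_known, f_known, degree)
    return np.polyval(coeffs, t_unknown)

# Synthetic single-support vertical force (N) over 0-0.4 s, then a
# 0.1 s double-support extension to extrapolate into
t = np.linspace(0.0, 0.4, 50)
f = 800.0 * np.sin(np.pi * t / 0.5)     # smooth stand-in for a GRF curve
t_ds = np.linspace(0.4, 0.5, 10)
f_ds = extrapolate_double_support(t, f, t_ds)
```

In the actual method this is done per cycle and per axis (degrees five, eight, and nine), and the two feet's extrapolations are reconciled against the measured total GRF(t).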
Sundeep Chaitanya, V; Das, Madhusmita; Eisenbach, Tiffany L; Amoako, Angela; Rajan, Lakshmi; Horo, Ilse; Ebenezer, Mannam
2016-06-01
With the absence of an effective diagnostic tool for leprosy, cases with a negative bacteriological index and limited clinical manifestations often pose diagnostic challenges. In this study, we investigated the utility of a novel Mycobacterium leprae specific 112-bp DNA sequence in the promoter region of probable 4-alpha-glucanotransferase (pseudogene, ML1545) for polymerase chain reaction (PCR) based diagnosis of leprosy, in comparison to that of the RLEP gene. DNA was extracted from slit skin scrapings of 180 newly diagnosed untreated leprosy cases that were classified as per Ridley-Jopling classification and bacteriological index (BI). Primers were designed using Primer Blast 3.0 and PCR was performed with annealing temperatures of 61°C for ML1545 and 58°C for the RLEP gene using conventional gradient PCR. The results indicated a significant increase in PCR positivity of ML1545 when compared to RLEP across the study groups: 164/180 (91.11%) were positive for ML1545 whereas 114/180 (63.33%) were positive for RLEP (p < .0001). Of the leprosy cases with negative BI, 28 (48.28%) were positive for RLEP and 48 (82.76%) were positive for ML1545 (p = .0001, z = 3.8). Of the 42 borderline tuberculoid leprosy cases, 23 (54.76%) were positive for RLEP whereas 37 (88.09%) were positive for ML1545, consistent with the leprosy and BI-positive groups. ML1545 can be a potential gene target for PCR-based diagnosis of leprosy, especially in cases where clinical manifestations are minimal. Copyright © 2016 Asian-African Society for Mycobacteriology. Published by Elsevier Ltd. All rights reserved.
Toward a generalized probability theory: conditional probabilities
International Nuclear Information System (INIS)
Cassinelli, G.
1979-01-01
The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
Energy Technology Data Exchange (ETDEWEB)
Grau Malonda, A; Garcia-Torano, E
1983-07-01
Interaction and absorption probabilities for gamma-rays with energies between 1 and 1000 keV have been computed and tabulated. A toluene-based scintillator solution has been assumed in the computation. Both point sources and homogeneously dispersed radioactive material have been considered. These tables may be applied to cylinders with radii between 0.25 cm and 1.25 cm and heights between 0.20 cm and 4.07 cm. (Author) 26 refs.
Energy Technology Data Exchange (ETDEWEB)
Hayashi, M [Saitama Medical College (Japan)
1974-01-01
We evaluate the energy spectrum of the photons emitted in the reaction e+e- → mu+mu-gamma, and the hard-photon correction to the total cross-section of the reaction e+e- → mu+mu-. We develop a simple technique based on analytical QED formulae, in particular on current conservation.
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory, closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Dodsworth, Jeremy A; McDonald, Austin I; Hedlund, Brian P
2012-08-01
To inform hypotheses regarding the relative importance of chemolithotrophic metabolisms in geothermal environments, we calculated free energy yields of 26 chemical reactions potentially supporting chemolithotrophy in two US Great Basin hot springs, taking into account the effects of changing reactant and product activities on the Gibbs free energy as each reaction progressed. Results ranged from 1.2 × 10(-5) to 3.6 J kg(-1) spring water, or 3.7 × 10(-5) to 11.5 J s(-1) based on measured flow rates, with aerobic oxidation of CH(4) or NH4(+) giving the highest average yields. Energy yields calculated without constraining pH were similar to those at constant pH, except for reactions where H(+) was consumed, which often had significantly lower yields when pH was unconstrained. In contrast to the commonly used normalization of reaction chemical affinities per mole of electrons transferred, reaction energy yields for a given oxidant varied by several orders of magnitude and were more sensitive to differences in the activities of products and reactants. The high energy yield of aerobic ammonia oxidation is consistent with previous observations of significant ammonia oxidation rates and abundant ammonia-oxidizing archaea in sediments of these springs. This approach offers an additional lens through which to view the thermodynamic landscape of geothermal springs. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
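The progress-dependent yields described above follow from evaluating the Gibbs energy at the instantaneous activity quotient, dG = dG0 + RT ln Q. A minimal sketch, with illustrative values rather than the measured hot-spring activities:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def gibbs_energy(dG0, T, Q):
    """Gibbs energy of reaction (J/mol) at activity quotient Q:
    dG = dG0 + R*T*ln(Q). dG0 and Q here are illustrative only."""
    return dG0 + R * T * math.log(Q)

# Example: an exergonic reaction (dG0 = -300 kJ/mol) at 80 degrees C
# becomes less favorable as products accumulate (Q grows).
T = 353.15  # K
for Q in (1e-6, 1.0, 1e6):
    dG = gibbs_energy(-300e3, T, Q)
    print(f"Q = {Q:g}: dG = {dG / 1e3:.1f} kJ/mol")
```

Integrating dG over reaction progress, weighted by the amount reacted, gives the per-kilogram energy yields reported in the abstract.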
International Nuclear Information System (INIS)
Fraassen, B.C. van
1979-01-01
The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ, hence of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction…
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var…
Feng, Wei; Jiang, Danfeng; Kee, Choon-Wee; Liu, Hongjun; Tan, Choon-Hong
2016-02-04
Hydroisoquinoline derivatives were prepared in moderate to good enantioselectivities via a bicyclic guanidine-catalyzed tandem isomerization/intramolecular Diels-Alder (IMDA) reaction of alkynes. With this synthetic method, the first enantioselective synthesis of (+)-alpha-yohimbine was completed in 9 steps from the IMDA products. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
International Nuclear Information System (INIS)
Owen, Andrew W.; McAulay, Edith A.J.; Nordon, Alison; Littlejohn, David; Lynch, Thomas P.; Lancaster, J. Steven; Wright, Robert G.
2014-01-01
Highlights: • High efficiency thermal vaporiser designed and used for on-line reaction monitoring. • Concentration profiles of all reactants and products obtained from mass spectra. • By-product formed from the presence of an impurity detected by MS but not MIR. • Mass spectrometry can detect trace and bulk components unlike molecular spectrometry. - Abstract: A specially designed thermal vaporiser was used with a process mass spectrometer designed for gas analysis to monitor the esterification of butan-1-ol and acetic anhydride. The reaction was conducted at two scales: in a 150 mL flask and a 1 L jacketed batch reactor, with liquid delivery flow rates to the vaporiser of 0.1 and 1.0 mL min(-1), respectively. Mass spectrometry measurements were made at selected ion masses, and classical least squares multivariate linear regression was used to produce concentration profiles for the reactants, products and catalyst. The extent of reaction was obtained from the butyl acetate profile and found to be 83% and 76% at 40 °C and 20 °C, respectively, at the 1 L scale. Reactions in the 1 L reactor were also monitored by in-line mid-infrared (MIR) spectrometry; off-line gas chromatography (GC) was used as a reference technique when building partial least squares (PLS) multivariate calibration models for prediction of butyl acetate concentrations from the MIR spectra. In validation experiments, good agreement was achieved between the concentration of butyl acetate obtained from in-line MIR spectra and off-line GC. In the initial few minutes of the reaction the profiles for butyl acetate derived from on-line direct liquid sampling mass spectrometry (DLSMS) differed from those of in-line MIR spectrometry owing to the 2 min transfer time between the reactor and mass spectrometer. As the reaction proceeded, however, the difference between the concentration profiles became less noticeable. DLSMS had advantages over in-line MIR spectrometry as it was easier to generate…
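The classical least squares step mentioned above treats each mass spectrum as a linear combination of pure-component reference spectra and recovers the mixture coefficients (proportional to concentrations) by least squares. The reference intensities below are invented for illustration; they are not the actual ion masses or spectra from the study.

```python
import numpy as np

# Pure-component reference intensities at 4 selected ion masses
# (rows = components, columns = m/z channels; made-up values)
S = np.array([
    [1.0, 0.2, 0.0, 0.1],   # butan-1-ol
    [0.1, 1.0, 0.3, 0.0],   # acetic anhydride
    [0.0, 0.2, 1.0, 0.4],   # butyl acetate
])

# A measured spectrum: mixture of the three components
measured = 0.5 * S[0] + 0.2 * S[1] + 0.3 * S[2]

# CLS: least-squares solve S.T @ c = measured for the coefficients c
c, *_ = np.linalg.lstsq(S.T, measured, rcond=None)
print(c.round(3))  # recovers [0.5, 0.2, 0.3]
```

Repeating this solve for each spectrum in a time series yields the concentration profiles described in the abstract.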
Fassheber, Nancy; Dammeier, Johannes; Friedrichs, Gernot
2014-06-21
The overall rate constant of reaction (2), NCN + H, which plays a key role in prompt-NO formation in flames, has been directly measured at temperatures above 962 K. The rate constants are best represented by the combination of two Arrhenius expressions, k2/(cm(3) mol(-1) s(-1)) = 3.49 × 10(14) exp(-33.3 kJ mol(-1)/RT) + 1.07 × 10(13) exp(+10.0 kJ mol(-1)/RT), with a small uncertainty of ±20% at T = 1600 K and ±30% at the upper and lower experimental temperature limits. The two Arrhenius terms can basically be attributed to the contributions of reaction channel (2a), yielding CH + N2, and channel (2b), yielding HCN + N, as the products. A more refined analysis, taking into account experimental and theoretical literature data, provided a consistent rate constant set for k2a, its reverse reaction k1a (CH + N2 → NCN + H), and k2b, as well as a value for the controversial enthalpy of formation of NCN, ΔfH = 450 kJ mol(-1). The analysis verifies the expected strong temperature dependence of the branching fraction ϕ = k2b/k2, with reaction channel (2b) dominating at the experimental high-temperature limit. In contrast, reaction (2a) dominates at the low-temperature limit, with a possible minor contribution of the HNCN-forming recombination channel (2d) at T < 1150 K.
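The two-term Arrhenius fit quoted above can be evaluated directly; the sketch below reproduces the expression from the abstract (units: cm(3) mol(-1) s(-1), activation energies in kJ mol(-1)).

```python
import math

R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

def k2(T):
    """Overall NCN + H rate constant (cm^3 mol^-1 s^-1) from the
    two-term Arrhenius fit quoted in the abstract."""
    return (3.49e14 * math.exp(-33.3 / (R * T))     # dominates at high T
            + 1.07e13 * math.exp(+10.0 / (R * T)))  # dominates at low T

for T in (1000.0, 1600.0, 2500.0):
    print(f"T = {T:.0f} K: k2 = {k2(T):.3g} cm^3 mol^-1 s^-1")
```

The crossing of the two terms with temperature mirrors the branching between channels (2b) and (2a) described in the abstract.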
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular…
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
Bitsakis, E.I.; Nicolaides, C.A.
1989-01-01
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
K X-rays and nuclear reaction times in the deep inelastic reactions U+U and U+Pb at 7.5 MeV/amu
International Nuclear Information System (INIS)
Stoller, C.
1985-01-01
The K-shell ionisation probability of the heavy reaction products emerging from binary deep inelastic collisions of U + U and U + Pb at 7.5 MeV/amu has been measured as a function of the total kinetic energy loss - Q. After subtraction of the ionisation probability due to internal conversion of γ-rays, a strongly Q-dependent Psub(K) is found, in agreement with theoretical predictions relating the change in ionisation probability to the nuclear sticking time. The deduced nuclear reaction times are in qualitative agreement with predictions from nuclear models of deep inelastic reactions. (orig.)
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an…
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields.
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to the time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Irreversibility and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more; these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly…
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability, provided a given probability of…
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F…
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving…
DEFF Research Database (Denmark)
Overgaard, Martin; Cangemi, Claudia; Jensen, Martin L
2015-01-01
PURPOSE: Targeted proteomics using SRM-MS combined with stable isotope dilution has emerged as a promising quantitative technique for the study of circulating protein biomarkers. The purpose of this study was to develop and characterize robust quantitative assays for the emerging cardiovascular biomarker fibulin-1 and its circulating isoforms in human plasma. EXPERIMENTAL DESIGN: We used bioinformatics analysis to predict total and isoform-specific tryptic peptides for absolute quantitation using SRM-MS. Fibulin-1 was quantitated by nanoflow-LC-SRM-MS in undepleted plasma and by time-resolved immunofluorometric assay (TRIFMA). Both methods were validated and compared to a commercial ELISA (CircuLex). Molecular size determination was performed under native conditions by SEC analysis coupled to SRM-MS and TRIFMA. RESULTS: Absolute quantitation of total fibulin-1, isoforms -1C and -1D was performed by SRM…
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
…either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still…
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including…
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire. Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin…
Thomas, Peter; von der Helm, Christine; Schopf, Christoph; Mazoochian, Farhad; Frommelt, Lars; Gollwitzer, Hans; Schneider, Josef; Flaig, Michael; Krenn, Veit; Thomas, Benjamin; Summer, Burkhard
2015-01-01
We performed a combined approach to identify suspected allergy to knee arthroplasty (TKR): patch test (PT), lymphocyte transformation test (LTT), histopathology (overall grading; T- and B-lymphocytes, macrophages, and neutrophils), and semiquantitative Real-time-PCR-based periprosthetic inflammatory mediator analysis (IFNγ, TNFα, IL1-β, IL-2, IL-6, IL-8, IL-10, IL17, and TGFβ). We analyzed 25 TKR patients with yet unexplained complications like pain, effusion, and reduced range of motion. They consisted of 20 patients with proven metal sensitization (11 with PT reactions; 9 with only LTT reactivity). Control specimens were from 5 complicated TKR patients without metal sensitization, 12 OA patients before arthroplasty, and 8 PT patients without arthroplasty. Lymphocytic infiltrates were seen and fibrotic (Type IV membrane) tissue response was most frequent in the metal sensitive patients, for example, in 81% of the PT positive patients. The latter also had marked periprosthetic IFNγ expression. 8/9 patients with revision surgery using Ti-coated/oxinium based implants reported symptom relief. Our findings demonstrate that combining allergy diagnostics with histopathology and periprosthetic cytokine assessment could allow us to design better diagnostic strategies.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.
Transition probabilities for atoms
International Nuclear Information System (INIS)
Kim, Y.K.
1980-01-01
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Quantum wave packet study of D+OF reaction
International Nuclear Information System (INIS)
Kurban, M.; Karabulut, E.; Tutuk, R.; Goektas, F.
2010-01-01
The quantum dynamics of the D+OF reaction on the adiabatic potential energy surface of the ground 1³A′ state has been studied by using a time-dependent quantum real wave packet method. The state-to-state and state-to-all reaction probabilities for total angular momentum J = 0 have been calculated. The probabilities for J > 0 have been obtained by J-shifting the J = 0 results by means of a capture model. From these, the integral cross sections and initial-state-selected rate constants have been calculated. The initial-state-selected reaction probabilities and reaction cross section show a threshold but do not manifest any resonances, and the initial-state-selected rate constants are sensitive to the temperature.
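The J-shifting idea above can be sketched numerically: the J > 0 probability is approximated by evaluating the J = 0 probability at an energy reduced by a centrifugal shift, roughly E_J ≈ B·J(J+1). The toy threshold form of `p_J0`, the rotational constant `B`, and all units below are hypothetical placeholders, not values from the study:

```python
import numpy as np

def p_J0(E):
    # toy J=0 reaction probability with a threshold (illustrative only)
    return np.where(E > 0.1, 1.0 - np.exp(-(np.maximum(E, 0.1) - 0.1) / 0.05), 0.0)

def p_J(E, J, B=1e-3):
    # J-shifting approximation: shift the energy by the centrifugal term
    return p_J0(E - B * J * (J + 1))

def cross_section(E, Jmax=50, B=1e-3):
    # sigma(E) proportional to (1/E) * sum_J (2J+1) P_J(E), arbitrary units
    Js = np.arange(Jmax + 1)
    return np.sum((2 * Js + 1) * p_J(E, Js, B)) / E
```

Higher J contributes only once the shifted energy clears the threshold, which is what produces the smooth rise of the cross section above threshold.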
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Csörgö, Tamás; Aspell, P; Atanassov, I; Avati, V; Baechler, J; Berardi, V; Berretti, M; Bossini, E; Bozzo, M; Brogi, P; Brücken, E; Buzzo, A; Cafagna, F S; Calicchio, M; Catanesi, M G; Covault, C; Csanád, M; Deile, M; Dimovasili, E; Doubek, M; Eggert, K; Eremin, V; Ferretti, R; Ferro, F; Fiergolski, A; Garcia, F; Giani, S; Greco, V; Grzanka, L; Heino, J; Hilden, T; Intonti, M R; Janda, M; Kaspar, J; Kopal, J; Kundrát, V; Kurvinen, K; Lami, S; Latino, G; Lauhakangas, R; Leszko, T; Lippmaa, E; Lokajícek, M; Lo Vetere, M; Lucas Rodríguez, F; Macrí, M; Magaletti, L; Magazzù, G; Mercadante, A; Meucci, M; Minutoli, S; Nemes, F; Niewiadomski, H; Noschis, E; Novák, T; Oliveri, E; Oljemark, F; Orava, R; Oriunno, M; Österberg, K; Palazzi, P; Perrot, A-L; Pedreschi, E; Petäjäjärvi, J; Procházka, J; Quinto, M; Radermacher, E; Radicioni, E; Ravotti, F; Robutti, E; Ropelewski, L; Ruggiero, G; Saarikko, H; Sanguinetti, G; Santroni, A; Scribano, A; Sette, G; Snoeys, W; Spinella, F; Sziklai, J; Taylor, C; Turini, N; Vacek, V; Vítek, M; Welti, J; Whitmore, J
2012-01-01
Proton-proton elastic scattering has been measured by the TOTEM experiment at the CERN Large Hadron Collider at $\sqrt{s} = 7$ TeV in special runs with the Roman Pot detectors placed as close to the outgoing beam as seven times the transverse beam size. The differential cross-section measurements are reported in the |t|-range of 0.36 to 2.5 GeV$^2$. Extending the range of data to low |t| values, from 0.02 to 0.33 GeV$^2$, and utilizing the luminosity measurements of CMS, the total proton-proton cross section at $\sqrt{s} = 7$ TeV is measured to be $(98.3 \pm 0.2^{\mathrm{stat}} \pm 2.8^{\mathrm{syst}})$ mb.
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability is replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic approaches. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome…
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Sugiyama, Koji; Kawanishi, Shinji; Oki, Yasuhiro; Kamiya, Marin; Hanada, Ryosuke; Egi, Masahiro; Akai, Shuji
2018-04-01
One-pot sequential reactions using the acyl moieties installed by enzymatic dynamic kinetic resolution of alcohols have been little investigated. In this work, the acryloyl moiety installed via the lipase/oxovanadium combo-catalyzed dynamic kinetic resolution of a racemic dienol [4-(cyclohex-1-en-1-yl)but-3-en-2-ol or 1-(cyclohex-1-en-1-yl)but-2-en-1-ol] with a (Z)-3-(phenylsulfonyl)acrylate underwent an intramolecular Diels-Alder reaction in a one-pot procedure to produce an optically active naphtho[2,3-c]furan-1(3H)-one derivative (98% ee). This method was successfully applied to the asymmetric total synthesis of (-)-himbacine. Copyright © 2017 Elsevier Ltd. All rights reserved.
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text…
Spallation reactions: calculations
International Nuclear Information System (INIS)
Bertini, H.W.
1975-01-01
Current methods for calculating spallation reactions over various energy ranges are described and evaluated. Recent semiempirical fits to existing data will probably yield the most accurate predictions for these reactions in general. However, if the products in question have binding energies appreciably different from their isotopic neighbors, and if the cross section is approximately 30 mb or larger, then the intranuclear-cascade-evaporation approach is probably better suited. (6 tables, 12 figures, 34 references) (U.S.)
Benchmark calculations of thermal reaction rates. I - Quantal scattering theory
Chatfield, David C.; Truhlar, Donald G.; Schwenke, David W.
1991-01-01
The thermal rate coefficient for the prototype reaction H + H2 yields H2 + H with zero total angular momentum is calculated by summing, averaging, and numerically integrating state-to-state reaction probabilities calculated by time-independent quantum-mechanical scattering theory. The results are very carefully converged with respect to all numerical parameters in order to provide high-precision benchmark results for confirming the accuracy of new methods and testing their efficiency.
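The summing-and-averaging step can be illustrated schematically: sum state-to-state probabilities into a cumulative reaction probability N(E), Boltzmann-weight it, and integrate over energy, so that k(T) is proportional to the integral of N(E)·exp(-E/kB·T). The smooth-step N(E) and barrier height below are toy stand-ins, not the actual H + H2 scattering results:

```python
import numpy as np

KB = 3.1668e-6  # Boltzmann constant in hartree/K

def cumulative_probability(E, E0=0.015):
    # toy cumulative reaction probability N(E): smooth step above a
    # hypothetical barrier E0 (hartree); real N(E) comes from summing
    # state-to-state scattering probabilities
    return 1.0 / (1.0 + np.exp(-(E - E0) / 0.002))

def rate_integral(T, Emax=0.1, n=2000):
    # Boltzmann-weighted energy integral, proportional to k(T)
    E = np.linspace(0.0, Emax, n)
    integrand = cumulative_probability(E) * np.exp(-E / (KB * T))
    dE = E[1] - E[0]
    return float(np.sum(integrand) * dE)
```

Because the Boltzmann factor samples more of the above-barrier region as T rises, the integral (and hence the rate) grows with temperature, which is the behavior the benchmark calculations quantify precisely.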
Probability mapping of contaminants
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)
1994-04-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
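The post-processing step described above reduces, in essence, to counting exceedances across the equally likely simulations: the probability map at each parcel is the fraction of simulated images above the threshold. A minimal sketch with synthetic lognormal fields standing in for the geostatistical simulations (the threshold and field parameters are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, n_parcels = 500, 20

# rows = equally likely simulated contamination images,
# columns = remediation-unit-sized parcels (synthetic stand-in data)
simulations = rng.lognormal(mean=1.0, sigma=0.5, size=(n_sims, n_parcels))

threshold = 3.0  # hypothetical clean-up or personnel-hazard level

# probability map: fraction of simulations exceeding the threshold
# at each parcel
prob_exceed = (simulations > threshold).mean(axis=0)
```

Each entry of `prob_exceed` is directly interpretable as the probability that the corresponding parcel exceeds the specified contamination level, which is what feeds the cost-based decision models.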
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
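A common one-parameter convention for such a pair (Tsallis-style, used here for illustration; the paper's own normalization may differ) is ln_q(x) = (x^(1-q) - 1)/(1-q) with inverse exp_q(x) = [1 + (1-q)x]^(1/(1-q)), both reducing to ln and exp as q → 1:

```python
import numpy as np

def ln_q(x, q):
    # one-parameter generalized logarithm (Tsallis-style convention)
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def exp_q(x, q):
    # generalized exponential: the inverse of ln_q
    if np.isclose(q, 1.0):
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))
```

The pair can then be used to deform pdfs, e.g. replacing exp(-x) by exp_q(-x) yields power-law tails for q ≠ 1, which is how generalizations of the Gaussian and stretched exponential arise.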
International Nuclear Information System (INIS)
Haratym, Z.; Kempisty, T.; Mikolajewski, S.; Rurarz, E.
1999-01-01
The status of in vivo neutron activation analysis techniques for the measurement of total body calcium in human subjects is reviewed. Relevant data on the nuclear characteristics of calcium isotopes during interaction with neutrons, ranging from slow up to 14 MeV neutrons, are presented. Physical aspects of the measurement of in vivo total body calcium (TBCa) using the 44K activity induced in the 44Ca(n,p)44K (T1/2 = 22.3 min) reaction by 14 MeV neutrons are discussed. The measurement of delayed γ-rays emitted during the decay of activities induced in enriched 44Ca, natural Ca, a phantom filled with a water solution of natural calcium, and a skeletal arm is considered. Results of measurements on the phantom and skeletal arm indicate the possibility of measuring TBCa using the 44K activity. (author)
Probable Gastrointestinal Toxicity of Kombucha Tea
Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David
1997-01-01
Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462
Probable maximum flood control
International Nuclear Information System (INIS)
DeGabriele, C.E.; Wu, C.L.
1991-11-01
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
Nucleus-nucleus total reaction cross sections
International Nuclear Information System (INIS)
DeVries, R.M.; Peng, J.C.
1980-01-01
We compare σ_R(E) for nucleus-nucleus systems (obtained from existing direct measurements and derived from elastic scattering data) with nucleon-nucleon and nucleon-nucleus data. The energy dependence of σ_R(E) for nucleus-nucleus systems is found to be quite rapid; there appears to be no evidence for an energy-independent, geometric σ_R. Simple parameter-free microscopic calculations are able to quantitatively reproduce the data and thus emphasize the dominance of nucleon-nucleon interactions in medium-energy nucleus-nucleus collisions
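For reference, the "geometric" picture that the data rule out can be stated in one formula (an illustrative textbook estimate with an assumed radius parameter r0, not the microscopic calculation used in the paper): σ_R ≈ π r0²(A_p^(1/3) + A_t^(1/3))², independent of energy.

```python
import math

R0 = 1.2  # fm; a commonly assumed nuclear radius parameter (illustrative)

def geometric_sigma_r(a_proj, a_targ):
    """Energy-independent geometric reaction cross section, in millibarns.

    Uses sigma = pi * (R_p + R_t)**2 with R = R0 * A**(1/3); 1 fm^2 = 10 mb.
    """
    radius_sum = R0 * (a_proj ** (1 / 3) + a_targ ** (1 / 3))  # fm
    return math.pi * radius_sum ** 2 * 10.0  # mb

# e.g. 12C + 12C gives roughly 0.95 barns in this crude picture
sigma = geometric_sigma_r(12, 12)
```

The point of the abstract is precisely that measured σ_R(E) varies too rapidly with energy to be captured by any such constant.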
Indian Academy of Sciences (India)
An axiomatic development of such a model is given below. It is also shown ... teacher needs to decide which students deserve to be promoted to the next class - it is not ... whether an unborn child would be a boy or a girl, the total number of births in a ..... that the outcome of the previous trials has no influence on the next trial.
... page: //medlineplus.gov/ency/article/003483.htm Total protein. The total protein test measures the total amount of two classes ...
Probability and rational choice
Directory of Open Access Journals (Sweden)
David Botting
2014-05-01
Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
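The gap between the probability of a hypothesis and the probability of the next instance can be made concrete with a Bayesian toy model (my illustration, not the author's argument): after k successes in n Bernoulli trials with a uniform prior, the predictive probability of the next success is Laplace's (k+1)/(n+2), while the posterior probability of a hypothesis such as "the bias exceeds 1/2" is a different number entirely.

```python
def next_instance_probability(k, n):
    """Laplace's rule of succession: predictive probability of the next success."""
    return (k + 1) / (n + 2)

def hypothesis_probability(k, n, steps=20000):
    """Posterior P(theta > 1/2 | k successes in n trials) under a uniform prior,
    computed by midpoint integration of the Beta(k+1, n-k+1) posterior."""
    def density(theta):
        return theta ** k * (1 - theta) ** (n - k)
    full_w = 1.0 / steps
    half_w = 0.5 / steps
    total = sum(density((i + 0.5) * full_w) for i in range(steps)) * full_w
    upper = sum(density(0.5 + (i + 0.5) * half_w) for i in range(steps)) * half_w
    return upper / total

# 7 successes in 10 trials: the two quantities clearly disagree.
pred = next_instance_probability(7, 10)  # 8/12 ~ 0.667
hyp = hypothesis_probability(7, 10)      # ~0.887
```

Methods calibrated to get one of these numbers right need not get the other right, which is the confusion the paper diagnoses.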
COVAL, Compound Probability Distribution for Function of Probability Distribution
International Nuclear Information System (INIS)
Astolfi, M.; Elbaz, J.
1979-01-01
1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
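The problem COVAL solves numerically can be sketched by Monte Carlo (an illustration of the problem statement, not COVAL's own transformation algorithm; the response function and load distributions below are assumptions): sample the input variables from their distributions, push them through the function, and read off the distribution of the result.

```python
import random
import statistics

random.seed(42)

def response(load_a, load_b):
    """Hypothetical function of the random variables, e.g. a combined stress."""
    return load_a + 0.5 * load_b

# Assumed distributions of the input variables: independent normals.
samples = [response(random.gauss(100, 10), random.gauss(50, 5))
           for _ in range(100_000)]

mean = statistics.fmean(samples)  # analytically 125 for this linear response
# Probability that the response exceeds a design threshold:
p_exceed = sum(s > 150 for s in samples) / len(samples)
```

For reliability analysis the quantity of interest is exactly such an exceedance probability of the response distribution.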
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Tel, G.
We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
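As a minimal illustration of a probability distribution as "a function providing the probabilities of occurrence of outcomes", the normal cumulative distribution can be written with the error function and used to compute the probability that a value falls in an interval (a standard sketch, not taken from the paper):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Probability that a standard normal value lies within +/- 1.96:
p = normal_cdf(1.96) - normal_cdf(-1.96)  # ~0.95
```

The familiar 95% interval of introductory statistics drops out directly.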
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Prediction and probability in sciences
International Nuclear Information System (INIS)
Klein, E.; Sacquin, Y.
1998-01-01
This book reports the 7 presentations made at the third meeting 'Physics and Fundamental Questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its range of application spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Poisson Processes in Free Probability
An, Guimei; Gao, Mingchu
2015-01-01
We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...
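For background (a standard free-probability fact, not a result specific to this paper), the free analogue of the Poisson law with rate λ > 0 and jump size 1 is the Marchenko–Pastur law:

```latex
% Free Poisson (Marchenko--Pastur) law with rate \lambda and jump size 1;
% an atom at 0 appears when \lambda < 1.
\mu = \max(0,\,1-\lambda)\,\delta_0
    + \frac{\sqrt{(b-x)(x-a)}}{2\pi x}\,\mathbf{1}_{[a,b]}(x)\,dx,
\qquad a = (1-\sqrt{\lambda})^2, \quad b = (1+\sqrt{\lambda})^2 .
```

It arises as the free limit of small Bernoulli convolutions, in parallel with the classical Poisson limit theorem generalized in the paper.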
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Owens, Tom
2006-01-01
This article presents an interview with James Howe, author of "The Misfits" and "Totally Joe". In this interview, Howe discusses tolerance, diversity and the parallels between his own life and his literature. Howe's four books in addition to "The Misfits" and "Totally Joe" and his list of recommended books with lesbian, gay, bisexual, transgender,…
Quantum Dynamics Study of the Isotopic Effect on Capture Reactions: HD, D2 + CH3
Wang, Dunyou; Kwak, Dochan (Technical Monitor)
2002-01-01
Time-dependent wave-packet-propagation calculations are reported for the isotopic reactions HD + CH3 and D2 + CH3, in six degrees of freedom and for zero total angular momentum. Initial-state-selected reaction probabilities for different initial rotational-vibrational states are presented in this study. This study shows that excitation of HD (D2) enhances the reactivity, whereas excitation of the CH3 umbrella mode has the opposite effect. This is consistent with the reaction of H2 + CH3. The comparison of these three isotopic reactions also shows the isotopic effects in the initial-state-selected reaction probabilities. The cumulative reaction probabilities (CRPs) are obtained by summing over initial-state-selected reaction probabilities. The energy-shift approximation, which accounts for the contribution of the degrees of freedom missing in the six-dimensional calculation, is employed to obtain approximate full-dimensional CRPs. The rate-constant comparison shows that the H2 + CH3 reaction has the largest reactivity, followed by HD + CH3, with D2 + CH3 the smallest.
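The two reductions described above, summing state-selected probabilities into a CRP and shifting the energy axis to approximate missing degrees of freedom, can be sketched as follows (hypothetical probability data, not the paper's computed values):

```python
# Initial-state-selected reaction probabilities P_i(E) on a common energy grid
# (illustrative numbers only).
energies = [0.4, 0.5, 0.6, 0.7]  # eV
state_probs = {
    (0, 0): [0.00, 0.02, 0.10, 0.25],  # (v, j) ground state
    (1, 0): [0.05, 0.15, 0.30, 0.45],  # vibrationally excited: enhanced
}

# Cumulative reaction probability: N(E) = sum over initial states of P_i(E).
crp = [sum(p[k] for p in state_probs.values()) for k in range(len(energies))]

def shifted_crp(energy, delta_e, grid=energies, values=crp):
    """Energy-shift approximation: evaluate the reduced-dimensional CRP at
    E - delta_e (linear interpolation) to mimic a full-dimensional CRP."""
    e = energy - delta_e
    for (e0, n0), (e1, n1) in zip(zip(grid, values), zip(grid[1:], values[1:])):
        if e0 <= e <= e1:
            return n0 + (n1 - n0) * (e - e0) / (e1 - e0)
    return 0.0
```

The shift delta_e plays the role of the zero-point energy of the omitted modes.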
International Nuclear Information System (INIS)
Sutherland, D.E.; Ferguson, R.M.; Simmons, R.L.; Kim, T.H.; Slavin, S.; Najarian, J.S.
1983-01-01
Total lymphoid irradiation by itself can produce sufficient immunosuppression to prolong the survival of a variety of organ allografts in experimental animals. The degree of prolongation is dose-dependent and is limited by the toxicity that occurs with higher doses. Total lymphoid irradiation is more effective before transplantation than after, but when used after transplantation can be combined with pharmacologic immunosuppression to achieve a positive effect. In some animal models, total lymphoid irradiation induces an environment in which fully allogeneic bone marrow will engraft and induce permanent chimerism in the recipients who are then tolerant to organ allografts from the donor strain. If total lymphoid irradiation is ever to have clinical applicability on a large scale, it would seem that it would have to be under circumstances in which tolerance can be induced. However, in some animal models graft-versus-host disease occurs following bone marrow transplantation, and methods to obviate its occurrence probably will be needed if this approach is to be applied clinically. In recent years, patient and graft survival rates in renal allograft recipients treated with conventional immunosuppression have improved considerably, and thus the impetus to utilize total lymphoid irradiation for its immunosuppressive effect alone is less compelling. The future of total lymphoid irradiation probably lies in devising protocols in which maintenance immunosuppression can be eliminated, or nearly eliminated, altogether. Such protocols are effective in rodents. Whether they can be applied to clinical transplantation remains to be seen
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Vol. 315, No. 1 (2017), pp. 240-248. ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: decomposition integral; superdecomposition integral; probability inequalities. Subject RIV: BA - General Mathematics. OECD field: Statistics and probability. Impact factor: 1.357 (2016). http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
International Nuclear Information System (INIS)
Aroumougame, R.; Gupta, R.K.
1979-01-01
The possible reaction partners of a cool compound-nucleus reaction for the synthesis of the elements Z = 104, 106 and 108 are studied in terms of the potential energy surfaces, interaction barriers and nuclear shapes calculated within the framework of the fragmentation theory based on the two-centre shell model. An estimate of the total reaction cross-section suggests that, for larger fusion probabilities, the mass and charge asymmetries are the only essential criteria for the optimum choice of a cooler compound-nucleus reaction: the larger the mass and charge asymmetries, the larger the fusion cross-section. (auth.)
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
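A nearest-neighbor "probability machine" of the kind described can be sketched in a few lines (a from-scratch illustration; the paper's own examples rely on existing R packages): the estimated probability of class 1 at a query point is simply the fraction of class-1 labels among its k nearest neighbors.

```python
def knn_probability(x_query, data, k=5):
    """Estimate P(Y = 1 | x) as the class-1 fraction among the k nearest
    neighbors of x_query. `data` is a list of (x, y) pairs with y in {0, 1}."""
    neighbors = sorted(data, key=lambda pair: abs(pair[0] - x_query))[:k]
    return sum(y for _, y in neighbors) / k

# Toy 1-D data set: class 1 tends to occur for larger x.
data = [(-2.0, 0), (-1.5, 0), (-1.0, 0), (-0.5, 0), (-0.2, 0),
        (0.2, 1), (0.5, 1), (1.0, 1), (1.5, 1), (2.0, 1)]

p_hi = knn_probability(1.2, data)   # deep in the class-1 region -> close to 1
p_lo = knn_probability(-1.8, data)  # deep in the class-0 region -> close to 0
```

Because the estimator is a local average of 0/1 labels, its consistency for the regression function is exactly consistency for the conditional probability, which is the point of the paper.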
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
Grandy, W. T. Jr.
2009-01-01
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
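The core effect, parameter uncertainty raising the realized failure frequency above the nominal level, can be reproduced in a small simulation (an illustration under an assumed normal model, not the paper's general location-scale derivation): a 5% threshold estimated from only a few samples is exceeded noticeably more than 5% of the time.

```python
import random
import statistics

random.seed(7)

NOMINAL = 0.05  # target failure probability
Z_95 = 1.645    # standard normal 95% quantile
N_DATA = 10     # small sample used to estimate the threshold
REPS = 20000

failures = 0
for _ in range(REPS):
    # The decision-maker estimates mean and sd from limited data...
    data = [random.gauss(0.0, 1.0) for _ in range(N_DATA)]
    threshold = statistics.fmean(data) + Z_95 * statistics.stdev(data)
    # ...then a fresh realization of the risk factor may exceed it.
    if random.gauss(0.0, 1.0) > threshold:
        failures += 1

realized = failures / REPS  # noticeably above the nominal 5%
```

The excess over 5% is the Student-t effect of plugging estimated parameters into a normal quantile; it shrinks as N_DATA grows, matching the paper's remedy (1) of adjusting the nominal level to the data-set size.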
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Experimental evidence for the reducibility of multifragment emission probabilities
International Nuclear Information System (INIS)
Wozniak, G.J.; Tso, K.; Phair, L.
1995-01-01
Multifragmentation has been studied for 36Ar-induced reactions on a 197Au target at E/A = 80 and 110 MeV, and for 129Xe-induced reactions on several targets (natCu, 89Y, 165Ho, 197Au) at E/A = 40, 50 and 60 MeV. The probability of emitting n intermediate-mass fragments is shown to be binomial at each transverse energy and reducible to an elementary binary probability p. For each target and at each bombarding energy, this probability p shows a thermal nature by giving linear Arrhenius plots. For the 129Xe-induced reactions, a nearly universal linear Arrhenius plot is observed at each bombarding energy, indicating a large degree of target independence
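The reducibility claim can be stated compactly in code (illustrative, with an assumed number of emission chances m, an assumed p, and an assumed Arrhenius form; none of these values are fitted to the data): the n-fragment probability is binomial, P(n) = C(m, n) p^n (1-p)^(m-n), and the thermal character appears as linearity of ln(1/p) in 1/sqrt(E_t).

```python
import math

def emission_probability(n, m, p):
    """Binomial probability of emitting n intermediate-mass fragments
    out of m chances, each with elementary binary probability p."""
    return math.comb(m, n) * p**n * (1 - p) ** (m - n)

# The distribution over n is normalized for any p:
m, p = 20, 0.15
total = sum(emission_probability(n, m, p) for n in range(m + 1))

def p_of_transverse_energy(e_t, barrier=30.0):
    """Assumed thermal form p(E_t) = exp(-B / sqrt(E_t)), which makes the
    'Arrhenius plot' of ln(1/p) versus 1/sqrt(E_t) a straight line."""
    return math.exp(-barrier / math.sqrt(e_t))
```

Extracting p at each transverse energy and checking the linearity of ln(1/p) is the analysis summarized in the abstract.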
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Joint probabilities and quantum cognition
International Nuclear Information System (INIS)
Acacio de Barros, J.
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Joint probabilities and quantum cognition
Energy Technology Data Exchange (ETDEWEB)
Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)
2012-12-18
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher default correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
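The Merton-style link between default probabilities and default correlations can be sketched numerically. The following is a hypothetical one-factor Monte Carlo illustration, not the paper's model or parameters: two firms share a common factor, each defaults when a latent standard-normal asset value falls below a threshold, and for a fixed asset correlation the implied default correlation rises with the default probability.

```python
import math, random

def _norm_ppf(p):
    """Inverse standard-normal CDF by bisection (stdlib only)."""
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def default_stats(pd_level, asset_corr, n_draws=100_000, seed=1):
    """One-factor Merton sketch (illustrative, not the paper's setup):
    returns (observed default frequency, default-indicator correlation)."""
    rng = random.Random(seed)
    k = _norm_ppf(pd_level)          # default threshold matching pd_level
    b = math.sqrt(asset_corr)        # loading on the common factor
    e = math.sqrt(1 - asset_corr)    # idiosyncratic loading
    d1 = d2 = d12 = 0
    for _ in range(n_draws):
        z = rng.gauss(0, 1)                      # common systematic factor
        a = b * z + e * rng.gauss(0, 1) < k      # firm 1 defaults?
        c = b * z + e * rng.gauss(0, 1) < k      # firm 2 defaults?
        d1 += a; d2 += c; d12 += a and c
    p1, p2, p12 = d1 / n_draws, d2 / n_draws, d12 / n_draws
    corr = (p12 - p1 * p2) / math.sqrt(p1 * (1 - p1) * p2 * (1 - p2))
    return p1, corr

# Same asset correlation, higher default probability -> higher default correlation:
p_low, corr_low = default_stats(0.01, 0.3)
p_high, corr_high = default_stats(0.10, 0.3)
```

The comparison at the bottom reproduces the abstract's qualitative claim under these assumed parameters.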
The Probabilities of Unique Events
2012-08-30
Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th, 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the
Probability Matching, Fast and Slow
Koehler, Derek J.; James, Greta
2014-01-01
A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Medication Desensitization: Characterization of Outcomes and Risk Factors for Reactions.
Murray, Taryn S; Rice, Todd W; Wheeler, Arthur P; Phillips, Elizabeth J; Dworski, Ryszard T; Stollings, Joanna L
2016-03-01
Although its mechanisms are poorly understood, desensitization has been used to induce a temporary state of immune unresponsiveness in patients who have IgE-, non-IgE-, or pharmacologically mediated reactions when a drug has no alternatives. The purpose of this study was to characterize the outcomes and identify risk factors for reactions during drug desensitization. A retrospective review of electronic medical records of adult patients undergoing drug desensitization from January 1, 2011, to December 31, 2013, was conducted in 2 intensive care units at a tertiary medical center. We used multivariate analysis to determine if specified risk factors were associated with reacting during the desensitization. Reactions were classified according to the pretest probability prior to desensitization, and then, reactions during desensitization were classified based on the occurrence of cutaneous reactions as follows: successful with no reaction, mild reaction, moderate reaction, or failed. Failure could result from any systemic allergic or cutaneous reaction resulting in procedure termination. The desensitizations were also assessed to determine if the patient required de-escalation secondary to a reaction. A total of 88 desensitizations were performed in 69 patients. Desensitization was completed with no cutaneous reaction in 85% of patients. No baseline characteristic, medication class (P = 0.46), or indication for desensitization (P = 0.59) was associated with having a reaction. Reported histories of urticaria and labored breathing were associated with reacting during desensitization. However, neither history of urticaria nor labored breathing was independently associated with having a reaction in multivariate analysis (OR = 0.979, 95% CI = 0.325-2.952, P = 0.970, and OR = 1.626, 95% CI = 0.536-4.931, P = 0.739, respectively). Drug desensitization is safe for patients who have no alternative for therapy. Reported allergy histories of urticaria and labored breathing are both associated with having a reaction during the
Dependent Human Error Probability Assessment
International Nuclear Information System (INIS)
Simic, Z.; Mikulicic, V.; Vukovic, I.
2006-01-01
This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
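The idea of simultaneous 1-α intervals for the plotted points can be illustrated with a small Monte Carlo sketch. This is an assumption-laden stand-in for the paper's construction, not its exact algorithm: per-position intervals for the order statistics are widened until all n points of a normal sample fall inside them simultaneously with probability about 1-α.

```python
import random

def normal_plot_envelope(n, n_sims=1000, alpha=0.05, seed=0):
    """Monte Carlo envelope for a normal probability plot (illustrative):
    per-position intervals for the n order statistics, widened until the
    SIMULTANEOUS coverage of all n points reaches about 1 - alpha."""
    rng = random.Random(seed)
    sims = [sorted(rng.gauss(0, 1) for _ in range(n)) for _ in range(n_sims)]
    cols = [sorted(c) for c in zip(*sims)]   # distribution of each order statistic
    lo = hi = None
    for step in range(n_sims // 10, 0, -1):  # from narrow intervals to wide
        lo = [c[step] for c in cols]
        hi = [c[n_sims - 1 - step] for c in cols]
        covered = sum(
            all(lo[i] <= s[i] <= hi[i] for i in range(n)) for s in sims
        )
        if covered / n_sims >= 1 - alpha:    # simultaneous coverage reached
            break
    return lo, hi
```

A sample whose sorted values leave this envelope at any position would be flagged as inconsistent with normality at level α.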
Solving probability reasoning based on DNA strand displacement and probability modules.
Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun
2017-12-01
In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
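The arithmetic that such total-probability and conditional-probability modules implement can be expressed in ordinary code; a minimal sketch (function names are illustrative, not from the paper):

```python
def total_probability(priors, likelihoods):
    """Law of total probability over a partition {A_i}:
    P(B) = sum_i P(B | A_i) * P(A_i)."""
    return sum(p * l for p, l in zip(priors, likelihoods))

def posteriors(priors, likelihoods):
    """Bayes' rule on the same partition:
    P(A_i | B) = P(B | A_i) * P(A_i) / P(B)."""
    pb = total_probability(priors, likelihoods)
    return [p * l / pb for p, l in zip(priors, likelihoods)]

# Two equally likely hypotheses with likelihoods 0.8 and 0.4:
pb = total_probability([0.5, 0.5], [0.8, 0.4])    # 0.6
post = posteriors([0.5, 0.5], [0.8, 0.4])         # [2/3, 1/3]
```

The DNA implementation encodes these same sums and products in strand concentrations; the code shows only the target computation.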
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking a Gauss distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are developed, such that the probability of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.
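Under the Gaussian assumption, the chance of exceeding a vibration criterion reduces to a closed form; a minimal sketch (function name and parameters are illustrative, not from the article):

```python
import math

def prob_exceed(criterion, sigma):
    """Chance that a zero-mean Gaussian relative displacement with
    standard deviation sigma exceeds the vibration criterion:
    P(|x| > c) = 1 - erf(c / (sigma * sqrt(2)))."""
    return 1.0 - math.erf(criterion / (sigma * math.sqrt(2.0)))

# A criterion about 2.06 standard deviations away keeps the chance of
# exceeding it below the article's 0.04 target:
p = prob_exceed(2.06, 1.0)
```

Choosing damping and natural frequency to shrink sigma relative to the criterion is what drives the exceedance probability below 0.04.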
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The purpose of the probability tables is: - to obtain dense data representation - to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell.
Directory of Open Access Journals (Sweden)
Lopez Moris E
2016-06-01
Full Text Available Total thyroidectomy is a surgery that removes all the thyroid tissue from the patient. Suspected cancer in a thyroid nodule is the most frequent indication, and it is presumed when a previous fine-needle puncture is positive or when a goiter shows a significant increase in volume or causes symptoms. Less frequent indications are hyperthyroidism, when it is refractory to treatment with Iodine-131 or such treatment is contraindicated, and cases of symptomatic thyroiditis. The thyroid gland has an important anatomic relation with the inferior laryngeal nerve and the parathyroid glands; for this reason it is imperative to perform extremely meticulous dissection to recognize each one of these elements and ensure their preservation. It is also essential to maintain strict hemostasis, in order to avoid any postoperative bleeding that could lead to a suffocating neck hematoma, a feared complication that represents a surgical emergency and endangers the patient’s life. It is essential to follow a formal technique, without skipping steps, and to maintain the prudence and patience that should rule any surgical act.
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
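The decreasing-probability idea can be sketched as a Bayesian update using the empirical survival fraction of historical delay times. This is a sketch of the general approach, not the authors' algorithm, and the delay values below are hypothetical placeholders, not the NOAA data:

```python
def dynamic_probability(p0, delays, t):
    """Bayesian decay of an SEP event probability: p0 is the initial
    forecast after the X-ray peak, `delays` a sample of historical
    peak-to-onset delay times, and t the elapsed time with no onset.
    The survival fraction S(t) acts as P(no onset by t | event occurs)."""
    s = sum(d > t for d in delays) / len(delays)
    return p0 * s / (p0 * s + (1 - p0))

# Hypothetical delay sample (hours) -- placeholders, not the NOAA listing:
delays = [2, 4, 6, 12, 24, 48]
forecast = [dynamic_probability(0.5, delays, t) for t in (0, 6, 24, 48)]
```

As elapsed time without an SEP onset grows, the forecast probability falls from its initial value toward zero, matching the qualitative behavior the abstract describes.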
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Swedish earthquakes and acceleration probabilities
International Nuclear Information System (INIS)
Slunga, R.
1979-03-01
A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that the largest known earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range of 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
2015-01-01
A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...
Risk estimation using probability machines
2014-01-01
Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
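A toy stand-in for the paper's random-forest probability machine: a nonparametric k-nearest-neighbor estimate of P(Y=1|X=x) recovers the conditional probabilities of a logistic data-generating model without assuming its form. All names and parameters here are illustrative assumptions, not the paper's method:

```python
import math, random

def knn_probability(train, x, k=200):
    """Nonparametric conditional probability estimate (toy k-NN
    stand-in for a random-forest probability machine): the average
    outcome of the k training points whose x is nearest to the query."""
    nearest = sorted(train, key=lambda xy: abs(xy[0] - x))[:k]
    return sum(y for _, y in nearest) / k

logistic = lambda z: 1.0 / (1.0 + math.exp(-z))

# Data generated from a true logistic model P(Y=1 | X=x) = logistic(2x):
rng = random.Random(0)
train = []
for _ in range(5000):
    x = rng.uniform(-2.0, 2.0)
    train.append((x, int(rng.random() < logistic(2.0 * x))))

est = knn_probability(train, 1.0)    # nonparametric estimate at x = 1
truth = logistic(2.0)                # true conditional probability
```

The nonparametric estimate tracks the true conditional probability without ever being told the model is logistic, which is the paper's central point about consistency.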
Probability and statistics: A reminder
International Nuclear Information System (INIS)
Clement, B.
2013-01-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It builds on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...
On probability-possibility transformations
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
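Two commonly cited probability-possibility transformations can be sketched as follows. This is a sketch assuming the standard ratio-scale and Dubois-Prade forms; the paper compares several transformations, and these specific formulas are our assumption rather than its contents:

```python
def ratio_transform(p):
    """Possibility from probability by rescaling: pi_i = p_i / max(p)."""
    m = max(p)
    return [pi / m for pi in p]

def dubois_prade_transform(p):
    """Dubois-Prade transformation: pi_i = sum of all p_j with p_j <= p_i,
    which preserves the consistency condition pi_i >= p_i."""
    return [sum(q for q in p if q <= pi) for pi in p]

# Dyadic values chosen so the arithmetic is exact in floating point:
probs = [0.5, 0.375, 0.125]
poss_ratio = ratio_transform(probs)       # [1.0, 0.75, 0.25]
poss_dp = dubois_prade_transform(probs)   # [1.0, 0.5, 0.125]
```

Both transformations assign possibility 1 to the most probable outcome; they differ in how sharply they compress the rest, which is exactly the kind of second-order difference the paper measures.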
The Efficacy of Using Diagrams When Solving Probability Word Problems in College
Beitzel, Brian D.; Staley, Richard K.
2015-01-01
Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Probability matching and strategy availability.
Koehler, Derek J; James, Greta
2010-09-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
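The expected-accuracy gap between matching and maximizing that makes maximizing the "superior alternative strategy" is simple arithmetic; a minimal sketch:

```python
def matching_accuracy(p):
    """Expected accuracy of probability matching: predict each outcome
    with its own probability, so accuracy = p^2 + (1-p)^2."""
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    """Expected accuracy of always predicting the more likely outcome."""
    return max(p, 1 - p)

# With a 75%-likely outcome, matching yields 0.625 expected accuracy
# while maximizing yields 0.75:
gap = maximizing_accuracy(0.75) - matching_accuracy(0.75)
```

Maximizing dominates matching for every p except p = 0.5, which is why endorsing it once it is brought to mind counts as recognizing the superior strategy.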
Probability as a Physical Motive
Directory of Open Access Journals (Sweden)
Peter Martin
2007-04-01
Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.
Logic, Probability, and Human Reasoning
2015-01-01
accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions...[3-6] and they underlie mathematics, science, and technology [7-10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Probability matching and strategy availability
J. Koehler, Derek; Koehler, Derek J.; James, Greta
2010-01-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...
The decay of hot nuclei formed in La-induced reactions at intermediate energies
International Nuclear Information System (INIS)
Libby, B.; Mignerey, A.C.; Madani, H.; Marchetti, A.A.; Colonna, M.; DiToro, M.
1992-01-01
The decay of hot nuclei formed in lanthanum-induced reactions utilizing inverse kinematics has been studied from E/A = 35 to 55 MeV. At each bombarding energy studied, the probability for the multiple emission of complex fragments has been found to be independent of the target. Global features (total charge, source velocity) of the reaction La + Al at E/A = 45 MeV have been reproduced by coupling a dynamical model of the collision stage of the reaction to a statistical model of nuclear decay.
International Nuclear Information System (INIS)
Lane, A.M.
1980-01-01
In reviewing work at Harwell over the past 25 years on nuclear reactions, it is stated that a balance has to be struck, in both experiment and theory, between work on cross-sections of direct practical relevance to reactors and work on those relevant to an overall understanding of reaction processes. The compound nucleus and direct process reactions are described. Having listed the contributions from AERE, Harwell to developments in nuclear reaction research in the period, work on the optical model, neutron capture theory, reactions at doorway states with fine structure, and sum rules for spectroscopic factors is considered in more detail. (UK)
Total Synthesis of Hyperforin.
Ting, Chi P; Maimone, Thomas J
2015-08-26
A 10-step total synthesis of the polycyclic polyprenylated acylphloroglucinol (PPAP) natural product hyperforin from 2-methylcyclopent-2-en-1-one is reported. This route was enabled by a diketene annulation reaction and an oxidative ring expansion strategy designed to complement the presumed biosynthesis of this complex meroterpene. The described work enables the preparation of a highly substituted bicyclo[3.3.1]nonane-1,3,5-trione motif in only six steps and thus serves as a platform for the construction of easily synthesized, highly diverse PPAPs modifiable at every position.
[Biometric bases: basic concepts of probability calculation].
Dinya, E
1998-04-26
The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model, and the random variable are presented.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design in support of...
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
The total angular moment selectivity in 7Li(α, α) 7Li(4.63 MeV, 7/2-) reaction at Eα = 27.2 MeV
International Nuclear Information System (INIS)
Dmitrenko, V.N.; Kozyr', Yu.E.
1995-01-01
The DWBA calculation of the tensor polarisation of residual nuclei for direct inelastic scattering 7Li(α, α)7Li(4.63 MeV, 7/2−) gives the best approximation to experimental data at the selected total angular momentum and parity values Jπ = 13/2+. The microscopic coupled-channel calculation also predicts a significant role for total angular momentum states with J ≥ 13/2 at Eα = 27.2 MeV
Lectures on probability and statistics
International Nuclear Information System (INIS)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
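The a priori dice calculation described above can be sketched directly. This is an illustrative example, not taken from the lectures: with fair dice, every outcome is equally likely, so an event's probability is simply the count of favourable outcomes over the total.

```python
from fractions import Fraction
from itertools import product

def outcome_probability(n_dice, predicate):
    """Classical a priori probability for fair dice:
    P(event) = favourable outcomes / total outcomes."""
    outcomes = list(product(range(1, 7), repeat=n_dice))
    favourable = sum(1 for o in outcomes if predicate(o))
    return Fraction(favourable, len(outcomes))

# Probability that two fair dice sum to 7: 6 of the 36 outcomes qualify
p_seven = outcome_probability(2, lambda o: sum(o) == 7)  # -> 1/6
```

The inverse problem of statistics, inferring the state of the dice from a set of observed rolls, is the harder direction the notes turn to afterwards.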
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Excluding joint probabilities from quantum theory
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.
Hot wire radicals and reactions
International Nuclear Information System (INIS)
Zheng Wengang; Gallagher, Alan
2006-01-01
Threshold ionization mass spectroscopy is used to measure radical (and stable-gas) densities at the substrate of a tungsten hot wire (HW) reactor. We report measurements of the silane reaction probability on the HW and the probability of Si and H release from the HW. We describe a model for the atomic H release, based on the H2 dissociation model. We note major variations in silicon release, with dependence on prior silane exposure. Measured radical densities versus silane pressure yield silicon-silane and H-silane reaction rate coefficients, and the dominant radical fluxes to the substrate
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Consistent post-reaction vibrational energy redistribution in DSMC simulations using TCE model
Borges Sebastião, Israel; Alexeenko, Alina
2016-10-01
The direct simulation Monte Carlo (DSMC) method has been widely applied to study shockwaves, hypersonic reentry flows, and other nonequilibrium flow phenomena. Although there is currently active research on high-fidelity models based on ab initio data, the total collision energy (TCE) and Larsen-Borgnakke (LB) models remain the most often used chemistry and relaxation models in DSMC simulations, respectively. The conventional implementation of the discrete LB model, however, may not satisfy detailed balance when recombination and exchange reactions play an important role in the flow energy balance. This issue can become even more critical in reacting mixtures involving polyatomic molecules, such as in combustion. In this work, this important shortcoming is addressed and an empirical approach to consistently specify the post-reaction vibrational states close to thermochemical equilibrium conditions is proposed within the TCE framework. Following Bird's quantum-kinetic (QK) methodology for populating post-reaction states, the new TCE-based approach involves two main steps. The state-specific TCE reaction probabilities for a forward reaction are first pre-computed from equilibrium 0-D simulations. These probabilities are then employed to populate the post-reaction vibrational states of the corresponding reverse reaction. The new approach is illustrated by application to exchange and recombination reactions relevant to H2-O2 combustion processes.
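The two-step procedure described above can be sketched as follows. The probability table and vibrational-level indices are hypothetical stand-ins for values that would be tabulated from an equilibrium 0-D DSMC run; they are not data from the paper.

```python
import random
from collections import Counter

# Step 1 (assumed input): state-specific forward-reaction probabilities P(v),
# a made-up table standing in for values pre-computed from an equilibrium
# 0-D simulation. v indexes the vibrational level of the product molecule.
forward_prob = {0: 0.55, 1: 0.25, 2: 0.12, 3: 0.08}

def sample_post_reaction_level(prob_table, rng=random):
    """Step 2: sample a post-reaction vibrational level for the reverse
    reaction from the pre-computed forward-reaction probabilities, so that
    forward and reverse channels stay consistent near equilibrium."""
    r = rng.random()
    cumulative = 0.0
    for v, p in sorted(prob_table.items()):
        cumulative += p
        if r < cumulative:
            return v
    return max(prob_table)  # guard against floating-point round-off

random.seed(0)
levels = Counter(sample_post_reaction_level(forward_prob) for _ in range(100_000))
# The sampled level frequencies reproduce the pre-computed table.
```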
Investigation of nuclear structures using transition probabilities
International Nuclear Information System (INIS)
Dewald, A.; Moeller, O.; Peusquens, R.
2002-01-01
Magnetic rotation, which appears as regular M1 bands in the spectra, is a well-established phenomenon in several Pb isotopes. In the A = 130 region, where similar M1 bands are known, e.g. in 124Xe and 128Ba, it is still not clear whether it exists. Crucial experimental observables are the B(M1) values, which are expected to decrease with increasing spin. A recoil distance measurement (RDM) with the EUROBALL spectrometer at Strasbourg and the Koeln plunger, using the reaction 110Pd(18O, 4n)124Xe at a beam energy of 86 MeV, yielded preliminary lifetimes of ground-band states and states of the M1 band. The deduced B(M1) values show the expected behaviour for magnetic rotation. It is also shown that the experimental B(M1) values can be described equally well on the basis of a rotational band. The measured B(E2) values are used to investigate the nuclear deformation of 124Xe as well as the interaction of the ground-state band with two s-bands. Spherical-deformed shape coexistence is investigated by means of electromagnetic transition probabilities in the case of 188Pb. Lifetimes were measured in 188Pb using a novel combination of the Koeln plunger device with the GSFMA set-up at ATLAS, with the reaction 40Ca(152Sm, 4n)188Pb at a beam energy of 725 MeV in inverse kinematics. It is found that the lowest 2+ state is predominantly of prolate structure
Normal tissue complication probability for salivary glands
International Nuclear Information System (INIS)
Rana, B.S.
2008-01-01
The purpose of radiotherapy is to strike a favourable balance between morbidity (due to side effects of radiation) and cure of malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and the various treatment variables of a schedule, viz. daily dose, duration of treatment, total dose, and fractionation, along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and that a statistically acceptable number of patients develop complications, so that a true relation between NTCP and a particular variable is established. In this study, salivary gland complications have been considered. The cases treated on a 60Co teletherapy machine during the period 1994 to 2002 were analyzed, and the clinician's judgement in ascertaining the end points was the only means of observation. The end points were early and late xerostomia, which were considered for NTCP evaluations over a period of 5 years
Future southcentral US wildfire probability due to climate change
Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.
2018-01-01
Globally, changing fire regimes due to climate are one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the sign of the fire probability response (+, −) may shift (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.
Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions
International Nuclear Information System (INIS)
Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.
2010-01-01
A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
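The sampling procedure can be sketched as follows. The resonance strengths, uncertainties, and distribution choices below are invented for illustration and do not correspond to any tabulated reaction; the real evaluation assigns a measured probability density to each nuclear physics input.

```python
import math
import random
import statistics

random.seed(1)

def sampled_rate(rng=random):
    """One Monte Carlo draw of a hypothetical total rate: each input quantity
    is drawn from its own probability density (here two lognormal resonance
    strengths plus a Gaussian nonresonant term truncated at zero)."""
    s1 = rng.lognormvariate(math.log(2.0e-3), 0.3)  # resonance strength 1
    s2 = rng.lognormvariate(math.log(5.0e-4), 0.5)  # resonance strength 2
    nr = max(0.0, rng.gauss(1.0e-4, 2.0e-5))        # nonresonant contribution
    return s1 + s2 + nr

samples = sorted(sampled_rate() for _ in range(20_000))
low = samples[int(0.16 * len(samples))]   # 0.16 quantile -> "low rate"
med = samples[len(samples) // 2]          # median -> recommended rate
high = samples[int(0.84 * len(samples))]  # 0.84 quantile -> "high rate"

# Lognormal approximation of the output rate: mu and sigma from ln(rate)
logs = [math.log(x) for x in samples]
mu, sigma = statistics.fmean(logs), statistics.stdev(logs)
```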
Catalysis of Nuclear Reactions by Electrons
Lipoglavšek, Matej
2018-01-01
Electron screening enhances nuclear reaction cross sections at low energies. We studied the nuclear reaction 1H(19F,αγ)16O in inverse kinematics in different solid hydrogen targets. Measured resonance strengths differed by up to a factor of 10 in different targets. We also studied the 2H(p,γ)3He fusion reaction and observed electrons emitted as reaction products instead of γ rays. In this case electron screening greatly enhances internal conversion probability.
K-forbidden transition probabilities
International Nuclear Information System (INIS)
Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki
2000-01-01
Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A∝180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.
1993-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
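The post-processing step, turning a stack of equally likely simulated images into a probability map, can be sketched as follows. The random stand-in realizations below replace the geostatistical simulations conditioned on the soil geochemical data used in the actual study.

```python
import random

random.seed(2)

# Assumed stand-in: a stack of equally likely simulated contamination maps,
# each a 2-D grid of concentrations (the study would generate these by
# geostatistical simulation honoring the measured sample values).
N_REAL, ROWS, COLS = 200, 4, 4
realizations = [[[random.lognormvariate(1.0, 0.8) for _ in range(COLS)]
                 for _ in range(ROWS)] for _ in range(N_REAL)]

def exceedance_probability_map(realizations, threshold):
    """At each cell, the probability of exceeding the action level is the
    fraction of realizations whose simulated value lies above the threshold."""
    rows, cols = len(realizations[0]), len(realizations[0][0])
    return [[sum(r[i][j] > threshold for r in realizations) / len(realizations)
             for j in range(cols)] for i in range(rows)]

prob_map = exceedance_probability_map(realizations, threshold=5.0)
```

Cells with a high mapped probability would be flagged for remediation; low-probability cells can be left alone with a quantified risk.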
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(−(−ln p)^α) (0 < α < 1), with the fixed points w(1/e) = 1/e and w(1) = 1, which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
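A minimal evaluation of Prelec's weighting function; the value α = 0.65 is chosen only for illustration and is not taken from the paper.

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting function w(p) = exp(-(-ln p)**alpha),
    with 0 < alpha < 1. alpha = 0.65 is an illustrative parameter value."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must lie in (0, 1]")
    return math.exp(-((-math.log(p)) ** alpha))

# Characteristic properties: w(1/e) = 1/e and w(1) = 1, with small
# probabilities overweighted and large probabilities underweighted.
```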
International Nuclear Information System (INIS)
Henning, W.
1979-01-01
Quasielastic reaction studies, because of their capability to microscopically probe nuclear structure, are still of considerable interest in heavy-ion reactions. The recent progress in understanding various aspects of the reaction mechanism make this aim appear closer. The relation between microscopic and macroscopic behavior, as suggested, for example, by the single proton transfer data to individual final states or averaged excitation energy intervals, needs to be explored. It seems particularly useful to extend measurements to higher incident energies, to explore and understand nuclear structure aspects up to the limit of the energy range where they are important
Total Synthesis of Adunctin B.
Dethe, Dattatraya H; Dherange, Balu D
2018-03-16
Total synthesis of (±)-adunctin B, a natural product isolated from Piper aduncum (Piperaceae), has been achieved using two different strategies, in seven and three steps. The efficient approach features a highly atom-economical and diastereoselective Friedel-Crafts acylation, an alkylation reaction, and a palladium-catalyzed Wacker-type oxidative cyclization.
Energy Technology Data Exchange (ETDEWEB)
Turlay, R [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires
1962-07-01
In the first part of this experiment, we determined the total cross section for processes yielding only neutral particles, from 300 to 1600 MeV. For this, we counted the number of incident π−, as defined by a counter telescope, which interact in a liquid-hydrogen target without giving charged particles in a 4π counter surrounding the target. In the second part of this experiment, we separated the reactions π− p → π0 n and π− p → π0 π0 n between 300 and 1100 MeV, by supposing that only these two channels are important at these energies. The separation of these two reactions was realized by placing lead absorbers between the target and the 4π counter and by comparing the counting rates for neutral events with and without lead. The transmission thus measured is a function of the average number of photons produced and therefore of the ratio between the two neutral channels, π0 n and π0 π0 n. In the last section of this work, we discuss the experimental results and compare them to those obtained by other authors in the study of photoproduction and the π-nucleon interaction. (author)
THE BLACK HOLE FORMATION PROBABILITY
Energy Technology Data Exchange (ETDEWEB)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may effect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P {sub BH}(M {sub ZAMS}). Although we find that it is difficult to derive a unique P {sub BH}(M {sub ZAMS}) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P {sub BH}(M {sub ZAMS}) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P {sub BH}(M {sub ZAMS}) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
Indian Academy of Sciences (India)
Unknown
Molecular Modeling Group, Organic Chemical Sciences, Indian Institute of Chemical Technology, Hyderabad ... thus obtained are helpful to model the regioselectivity ... compromise to model Diels–Alder reactions involving ... acceptance.
Optimizing Chemical Reactions with Deep Reinforcement Learning.
Zhou, Zhenpeng; Li, Xiaocheng; Zare, Richard N
2017-12-27
Deep reinforcement learning was employed to optimize chemical reactions. Our model iteratively records the results of a chemical reaction and chooses new experimental conditions to improve the reaction outcome. This model outperformed a state-of-the-art blackbox optimization algorithm by using 71% fewer steps on both simulations and real reactions. Furthermore, we introduced an efficient exploration strategy by drawing the reaction conditions from certain probability distributions, which resulted in an improvement on regret from 0.062 to 0.039 compared with a deterministic policy. Combining the efficient exploration policy with accelerated microdroplet reactions, optimal reaction conditions were determined in 30 min for the four reactions considered, and a better understanding of the factors that control microdroplet reactions was reached. Moreover, our model showed a better performance after training on reactions with similar or even dissimilar underlying mechanisms, which demonstrates its learning ability.
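The exploration strategy described above, drawing reaction conditions from probability distributions rather than following a deterministic policy, can be sketched as a simple stochastic bandit. All reaction conditions and yield numbers below are invented for illustration; this is not the authors' deep reinforcement learning model.

```python
import random

# Hypothetical true "yields" for three candidate temperature settings
# (illustrative numbers only, not from the paper).
TRUE_YIELD = {150: 0.55, 175: 0.80, 200: 0.60}

def run_experiment(temp, rng):
    """Noisy observation of the true yield at a given temperature."""
    return min(1.0, max(0.0, TRUE_YIELD[temp] + rng.gauss(0, 0.05)))

def optimize(n_steps=300, seed=0):
    rng = random.Random(seed)
    counts = {t: 0 for t in TRUE_YIELD}
    means = {t: 0.0 for t in TRUE_YIELD}
    for _ in range(n_steps):
        # Stochastic policy: sample a plausible yield for each condition
        # (uncertainty shrinks with the number of visits), pick the best draw.
        draws = {t: rng.gauss(means[t], 1.0 / (1 + counts[t])) for t in TRUE_YIELD}
        t = max(draws, key=draws.get)
        y = run_experiment(t, rng)
        counts[t] += 1
        means[t] += (y - means[t]) / counts[t]  # running mean update
    return max(means, key=means.get), counts

best, counts = optimize()
```

Sampling conditions from a distribution, as the abstract notes, trades a little short-term performance for exploration, which is what reduces regret relative to a purely deterministic (greedy) policy.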
Adverse reactions to antituberculosis drugs in Manguinhos, Rio de Janeiro, Brazil
Directory of Open Access Journals (Sweden)
Glauciene Santana Damasceno
2013-01-01
OBJECTIVES: This study aimed to characterize and estimate the frequency of adverse reactions to antituberculosis drugs in the population treated at the Centro de Saúde Escola Germano Sinval Faria, a primary health care clinic in Manguinhos, Rio de Janeiro City, and to explore the relationship between adverse drug reactions and some of the patients' demographic and health characteristics. METHODS: This descriptive study was conducted via patient record review of incident cases between 2004 and 2008. RESULTS: Of the 176 patients studied, 41.5% developed one or more adverse reactions to antituberculosis drugs, totaling 126 occurrences. The rate of adverse reactions to antituberculosis drugs was higher among women, patients aged 50 years or older, those with four or more comorbidities, and those who used five or more drugs. Of the total reactions, 71.4% were mild. The organ systems most affected were as follows: the gastrointestinal tract (29.4%), the skin and appendages (21.4%), and the central and peripheral nervous systems (14.3%). Of the patients who experienced adverse reactions to antituberculosis drugs, 65.8% received no drug treatment for their adverse reactions, and 4.1% had one of the antituberculosis drugs suspended because of adverse reactions. "Probable reactions" (75%) predominated over "possible reactions" (24%). In the study sample, 64.3% of the reactions occurred during the first two months of treatment, and most (92.6%) of the reactions were ascribed to the combination of rifampicin + isoniazid + pyrazinamide (Regimen I). A high dropout rate from tuberculosis treatment (24.4%) was also observed. CONCLUSION: This study suggests a high rate of adverse reactions to antituberculosis drugs.
Foundations of the theory of probability
Kolmogorov, AN
2018-01-01
This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
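The calculation the article above builds on, the probability of a biased spinner landing on a particular side, reduces to normalizing the sides' weights (e.g. arc lengths). The weights below are invented for illustration.

```python
from fractions import Fraction

# A hypothetical biased spinner: each side's weight is proportional to the
# arc length it occupies (illustrative numbers).
weights = {"A": 3, "B": 2, "C": 1}

total = sum(weights.values())
probs = {side: Fraction(w, total) for side, w in weights.items()}

# The outcomes are mutually exclusive and exhaustive, so the probabilities
# must sum to 1.
assert sum(probs.values()) == 1
```

Using exact fractions keeps the connection to the geometry explicit: side "A" covers 3/6 of the spinner, so P(A) = 1/2.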
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
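The design in the abstract above, equal marginal probabilities but unequal conditional probabilities, can be illustrated with a small joint table. The numbers below are invented to match the described 0.5 marginal and a 0.9 conditional, not taken from the study.

```python
# Illustrative joint distribution P(color, cue_pair): the marginal probability
# of each color is 0.5, but the conditional probability given the cue pair
# varies, as in the search task described above.
joint = {
    ("red",  "cue1"): 0.45, ("blue", "cue1"): 0.05,
    ("red",  "cue2"): 0.05, ("blue", "cue2"): 0.45,
}

def conditional(color, cue):
    """P(color | cue) = P(color, cue) / P(cue)."""
    p_cue = sum(p for (c, q), p in joint.items() if q == cue)
    return joint[(color, cue)] / p_cue

# Marginal probability of "red" over both cue pairs.
p_red = sum(p for (c, q), p in joint.items() if c == "red")
```

Here P(red) = 0.5 overall, yet P(red | cue1) = 0.9 and P(red | cue2) = 0.1, the kind of contingency participants could exploit only when informed of it.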
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
Directory of Open Access Journals (Sweden)
Salida Mirzoeva
The goal of this study was to elucidate the action of the CD28 mimetic peptide p2TA (AB103), which attenuates an excessive inflammatory response, in mitigating radiation-induced inflammatory injuries. BALB/c and A/J mice were divided into four groups: Control (C), Peptide (P; 5 mg/kg of p2TA peptide), Radiation (R; total body irradiation with 8 Gy γ-rays), and Radiation + Peptide (RP; irradiation followed by p2TA peptide 24 h later). Gastrointestinal tissue damage was evaluated by analysis of jejunum histopathology and immunohistochemistry for cell proliferation (Cyclin D1) and inflammation (COX-2) markers, as well as the presence of macrophages (F4/80). Pro-inflammatory cytokines IL-6 and KC as well as fibrinogen were quantified in plasma samples obtained from the same mice. Our results demonstrated that administration of p2TA peptide significantly reduced the irradiation-induced increase of IL-6 and fibrinogen in plasma 7 days after exposure. Seven days after total body irradiation with 8 Gy of gamma rays, numbers of intestinal crypt cells were reduced and villi were shorter in irradiated animals compared to the controls. The p2TA peptide delivery 24 h after irradiation led to improved morphology of villi and crypts, increased Cyclin D1 expression, decreased COX-2 staining and decreased numbers of macrophages in small intestine of irradiated mice. Our study suggests that attenuation of CD28 signaling is a promising therapeutic approach for mitigation of radiation-induced tissue injury.
Void probability scaling in hadron nucleus interactions
International Nuclear Information System (INIS)
Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima
2002-01-01
Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Two-slit experiment: quantum and classical probabilities
International Nuclear Information System (INIS)
Khrennikov, Andrei
2015-01-01
Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum–classical inter-relation is more complicated (cf, in particular, with the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have the non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
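The violation of the law of total probability mentioned above can be shown numerically in a toy two-slit model: with both slits open, the detection probability differs from the classical mixture P(x|slit1)P(slit1) + P(x|slit2)P(slit2) by an interference term. The propagation amplitude below is a standard textbook toy model, not the letter's own formalism, and all geometry numbers are arbitrary.

```python
import cmath
import math

def amp(slit_x, screen_x, wavelength=1.0, distance=5.0):
    """Toy free-propagation amplitude from a slit to a screen point."""
    r = math.hypot(distance, screen_x - slit_x)
    k = 2 * math.pi / wavelength
    return cmath.exp(1j * k * r) / r

x = 0.7  # an arbitrary screen position; slits sit at +/- 1
a1, a2 = amp(-1.0, x), amp(+1.0, x)

# Both slits open: amplitudes add, then square (interference).
p_quantum = abs(a1 + a2) ** 2 / 2
# Law of total probability with P(slit1) = P(slit2) = 1/2: probabilities add.
p_classical = (abs(a1) ** 2 + abs(a2) ** 2) / 2

# The discrepancy is exactly the interference cross term Re(a1 * conj(a2)).
interference = p_quantum - p_classical
```

Because the cross term is generically nonzero, the two-slit probabilities cannot come from a single classical mixture over slits, which is Feynman's observation; the letter's point is that they can still be embedded in a Kolmogorov model once the context selection is made explicit.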
Heart sounds analysis using probability assessment.
Plesinger, F; Viscor, I; Halamek, J; Jurco, J; Jurak, P
2017-07-31
This paper describes a method for automated discrimination of heart sounds recordings according to the Physionet Challenge 2016. The goal was to decide if the recording refers to normal or abnormal heart sounds or if it is not possible to decide (i.e. 'unsure' recordings). Heart sounds S1 and S2 are detected using amplitude envelopes in the band 15-90 Hz. The averaged shape of the S1/S2 pair is computed from amplitude envelopes in five different bands (15-90 Hz; 55-150 Hz; 100-250 Hz; 200-450 Hz; 400-800 Hz). A total of 53 features are extracted from the data. The largest group of features is extracted from the statistical properties of the averaged shapes; other features are extracted from the symmetry of averaged shapes, and the last group of features is independent of S1 and S2 detection. Generated features are processed using logical rules and probability assessment, a prototype of a new machine-learning method. The method was trained using 3155 records and tested on 1277 hidden records. It resulted in a training score of 0.903 (sensitivity 0.869, specificity 0.937) and a testing score of 0.841 (sensitivity 0.770, specificity 0.913). The revised method led to a test score of 0.853 in the follow-up phase of the challenge. The presented solution achieved 7th place out of 48 competing entries in the Physionet Challenge 2016 (official phase). In addition, the PROBAfind software for probability assessment was introduced.
Probability analysis of MCO over-pressurization during staging
International Nuclear Information System (INIS)
Pajunen, A.L.
1997-01-01
The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience
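The abstract's headline numbers follow directly from multiplying the per-MCO probabilities by the number of staged canisters; a quick check of that arithmetic:

```python
# Checking the arithmetic in the abstract above: with P(water > 1.6 kg) = 0.08
# across 400 staged MCOs, the expected number that could actuate the 100 psid
# pressure relief valve is 0.08 * 400 = 32, matching the stated "approximately
# 32 of 400". The rupture-disk figure uses P(water > 2.5 kg) = 0.01.
n_mco = 400
p_relief = 0.08   # probability an MCO's hydrate water exceeds 1.6 kg
p_rupture = 0.01  # probability an MCO's hydrate water exceeds 2.5 kg

expected_relief = p_relief * n_mco
expected_rupture = p_rupture * n_mco
```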
Energy Technology Data Exchange (ETDEWEB)
Austern, N. [University of Pittsburgh, Pittsburgh, PA (United States)
1963-01-15
In order to give a unified presentation of one point of view, these lectures are devoted only to a detailed development of the standard theories of direct reactions, starting from basic principles. Discussion is given of the present status of the theories, of the techniques used for practical calculation, and of possible future developments. The direct interaction (DI) aspects of a reaction are those which involve only a few of the many degrees of freedom of a nucleus. In fact the minimum number of degrees of freedom which must be involved in a reaction are those required to describe the initial and final channels, and DI studies typically consider these degrees of freedom and no others. Because of this simplicity DI theories may be worked out in painstaking detail. DI processes concern only part of the wave function for a problem. The other part involves complicated excitations of many degrees of freedom, and gives the compound nucleus (CN) effects. While it is extremely interesting to learn how to separate DI and CN effects in an orderly manner, if they are both present in a reaction, no suitable method has yet been found. Instead, current work stresses the kinds of reactions and the kinds of final states in which DI effects dominate and in which CN effects may almost be forgotten. The DI cross-sections which are studied are often extremely large, comparable to elastic scattering cross-sections. (author)
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, and demonstrate their applicability to real-world problems.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
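The quantity estimated above, the probability that a randomly vibrating response first crosses a barrier within some time horizon, can also be approximated by brute-force simulation. The sketch below uses a discretized Ornstein-Uhlenbeck process as a stand-in for the random vibration; it is a Monte Carlo illustration of the first-passage probability, not the paper's integral equation method, and all parameters are arbitrary.

```python
import math
import random

def first_passage_prob(barrier=2.0, n_steps=200, n_paths=2000, seed=1):
    """Monte Carlo estimate of the first-passage (failure) probability:
    the fraction of simulated paths that reach the barrier within the horizon."""
    rng = random.Random(seed)
    dt, gamma, sigma = 0.05, 0.5, 1.0  # time step, damping, noise intensity
    crossed = 0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            # Euler-Maruyama step of dx = -gamma*x dt + sigma dW
            x += -gamma * x * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
            if x >= barrier:
                crossed += 1
                break
    return crossed / n_paths

p_fail = first_passage_prob()
```

The integral equation approach in the abstract exists precisely because such simulations converge slowly for rare crossings; the kernel approximations give the first-passage density without sampling.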
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics.
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
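The classical conditionalization that the paraconsistent version above generalizes is ordinary Bayes' theorem. A minimal worked example, with invented numbers, of updating a prior through the law of total probability:

```python
# Classical (consistent) Bayesian updating: the baseline the paraconsistent
# extension generalizes. All numbers are illustrative.
p_h = 0.3              # prior P(H)
p_e_given_h = 0.9      # likelihood P(E | H)
p_e_given_not_h = 0.2  # likelihood P(E | not H)

# Law of total probability: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Bayes' theorem: P(H|E) = P(E|H)P(H) / P(E)
posterior = p_e_given_h * p_h / p_e
```

In the paraconsistent setting the interesting case is when evidence for H and for its negation can coexist without collapsing the update; the classical formula above presupposes that cannot happen.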
International Nuclear Information System (INIS)
Nguyen Trong Anh
1988-01-01
The 1988 progress report of the Reaction Mechanisms laboratory (Polytechnic School, France) is presented. The research topics are: the valence bond methods, the radical chemistry, the modelling of the transition states by applying geometric constraints, the long range interactions (ion - molecule) in gaseous phase, the reaction sites in gaseous phase and the mass spectroscopy applications. The points of convergence between the investigations of the mass spectroscopy and the theoretical chemistry teams, as well as the purposes guiding the research programs, are discussed. The published papers, the conferences, the congress communications and the theses are also reported. [fr]
Selection of risk reduction portfolios under interval-valued probabilities
International Nuclear Information System (INIS)
Toppila, Antti; Salo, Ahti
2017-01-01
A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.
Modeling of Reaction Calorimeter
Farzad, Reza
2014-01-01
The purpose of this project was to model the reaction calorimeter in order to calculate the heat of absorption, which is the most important parameter in this work. A reaction calorimeter is an apparatus used to measure the heat of absorption of CO2 as well as the total pressure in the vapor phase, based on the vapor-liquid equilibrium state. A mixture of monoethanolamine (MEA) and water was used as a solvent to absorb the CO2. The project was divided into three parts in order to make the programming...
... that don't bother most people (such as venom from bee stings and certain foods, medicines, and pollens) can ... person. If the allergic reaction is from a bee sting, scrape the ... more venom. If the person has emergency allergy medicine on ...
Joint probability distributions and fluctuation theorems
International Nuclear Information System (INIS)
García-García, Reinaldo; Kolton, Alejandro B; Domínguez, Daniel; Lecomte, Vivien
2012-01-01
We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation–dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we endow the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation.
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
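The maximum entropy assignment described above can be sketched on a toy example: the maximum entropy distribution over die faces subject to a fixed mean has the Gibbs form p_i proportional to exp(lambda*i), with the multiplier fixed by the constraint. This is a generic illustration of the method, not the paper's quantum construction; the die and target mean are arbitrary.

```python
import math

def maxent_die(target_mean, faces=range(1, 7)):
    """Maximum entropy distribution over die faces with a fixed mean.
    The Lagrange multiplier of the Gibbs form p_i ~ exp(lam * i) is found
    by bisection, since the constrained mean is monotone in lam."""
    def mean(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)  # mean constrained above the unbiased value 3.5
```

With no constraint beyond normalization the method returns the uniform distribution; each added average (energy, particle number, ...) tilts it toward the corresponding ensemble, which is the link to Maxwell-Boltzmann and its quantum relatives that the abstract mentions.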
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Highlights: a good and solid introduction to probability theory and stochastic processes; logically organized, with writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.
Striatal activity is modulated by target probability.
Hon, Nicholas
2017-06-14
Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.
Defining Probability in Sex Offender Risk Assessment.
Elwood, Richard W
2016-12-01
There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
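The Bayesian reading of individual risk advocated above amounts to updating a base rate by the evidence an assessment provides, most compactly in odds form. The base rate and likelihood ratio below are invented for illustration and are not drawn from any published actuarial scale.

```python
# Bayesian updating in odds form: posterior odds = prior odds * likelihood ratio.
# All numbers are hypothetical.
base_rate = 0.10        # prior probability of recidivism in the reference group
likelihood_ratio = 3.0  # hypothetical LR carried by this individual's score band

prior_odds = base_rate / (1 - base_rate)
posterior_odds = prior_odds * likelihood_ratio
posterior = posterior_odds / (1 + posterior_odds)
```

This is the sense in which only Bayesian probability applies to the single case: the posterior is a degree of belief about this individual, produced by conditioning a group-level prior on his score, rather than a long-run frequency for him alone.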
Total Quality Management in Education. Second Edition.
Sallis, Edward
Quality is at the top of most agendas, and improving quality is probably the most important task facing any institution. Yet quality is difficult to define or measure. This book, the second edition of "Total Quality Management in Education," introduces the key concepts of Total Quality Management (TQM) and demonstrates how they…
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
The First Total Synthesis of Isoliquiritin
Institute of Scientific and Technical Information of China (English)
Anonymous
2002-01-01
The first total synthesis of isoliquiritin was accomplished starting from p-hydroxybenzaldehyde and 2,4-dihydroxyacetophenone. The key step is a condensation reaction. The synthesis does not require protection of the hydroxy groups of the reactants.
Is probability of frequency too narrow?
International Nuclear Information System (INIS)
Martz, H.F.
1993-01-01
Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
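The "probability of frequency" notion discussed above can be illustrated numerically: epistemic uncertainty about a Poisson event frequency is encoded as a probability distribution over that frequency. The sketch below uses a conjugate gamma prior updated by observed events; the prior parameters, event count, and exposure are invented for illustration.

```python
import random

random.seed(0)

# "Probability of frequency": epistemic uncertainty about a Poisson event
# frequency lambda, encoded as a gamma distribution (the conjugate prior).
# Hypothetical numbers: prior gamma(a=2, b=100 reactor-years), then we
# observe 3 events in 200 reactor-years.
a0, b0 = 2.0, 100.0
events, exposure = 3, 200.0

# Conjugate update: posterior is gamma(a0 + events, b0 + exposure).
a1, b1 = a0 + events, b0 + exposure
post_mean = a1 / b1          # point estimate of the frequency (per year)

# Sample the posterior to express "probability of frequency" numerically.
samples = [random.gammavariate(a1, 1.0 / b1) for _ in range(100_000)]
p_exceeds = sum(s > 0.03 for s in samples) / len(samples)

print(f"posterior mean frequency: {post_mean:.4f} /yr")
print(f"P(frequency > 0.03/yr) ~= {p_exceeds:.2f}")
```

The hierarchical extensions mentioned in the abstract (frequency of frequency, and so on) would add a further distribution over the prior parameters a0 and b0 themselves.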
Impact parameter dependence of inner-shell ionization probabilities
International Nuclear Information System (INIS)
Cocke, C.L.
1974-01-01
The probability for ionization of an inner shell of a target atom by a heavy charged projectile is a sensitive function of the impact parameter characterizing the collision. This probability can be measured experimentally by detecting the x-ray resulting from radiative filling of the inner shell in coincidence with the projectile scattered at a determined angle, and by using the scattering angle to deduce the impact parameter. It is conjectured that the functional dependence of the ionization probability may be a more sensitive probe of the ionization mechanism than is a total cross section measurement. Experimental results for the K-shell ionization of both solid and gas targets by oxygen, carbon and fluorine projectiles in the MeV/amu energy range will be presented, and their use in illuminating the inelastic collision process discussed
Human Inferences about Sequences: A Minimal Transition Probability Model.
Directory of Open Access Journals (Sweden)
Florent Meyniel
2016-12-01
Full Text Available The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
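The inference scheme summarized above can be caricatured in a few lines: leaky counts of observed transitions with a uniform prior, scored by surprise (minus log2 of the predicted probability). The leak constant and the toy sequence below are illustrative choices, not the paper's values.

```python
import math

# A minimal sketch of transition-probability inference on a binary sequence:
# Laplace-smoothed transition counts with exponential forgetting ("leak").
def surprise_sequence(seq, leak=0.9):
    counts = {(a, b): 1.0 for a in "AB" for b in "AB"}  # Dirichlet(1) prior
    surprises = []
    prev = None
    for x in seq:
        if prev is not None:
            row = counts[(prev, "A")] + counts[(prev, "B")]
            p = counts[(prev, x)] / row           # predicted transition prob.
            surprises.append(-math.log2(p))       # surprise of observation
            for k in counts:                      # forgetting: leaky counts
                counts[k] = 1.0 + leak * (counts[k] - 1.0)
            counts[(prev, x)] += 1.0
        prev = x
    return surprises

# Repetitions become progressively less surprising; the alternation spikes.
s = surprise_sequence("AAAAAB")
print([round(v, 2) for v in s])
```

This reproduces the qualitative signature the paper explains: expected repetitions yield shrinking surprise, while a violation of the inferred transition structure produces a large surprise signal.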
Evaluating probability measures related to subsurface flow and transport
International Nuclear Information System (INIS)
Cawlfield, J.D.
1991-01-01
Probabilistic modeling approaches are being used increasingly in order to carry out quantified risk analysis and to evaluate the uncertainty existing in subsurface flow and transport analyses. The work presented in this paper addresses three issues: comparison of common probabilistic modeling techniques, recent results regarding the sensitivity of probability measures to likely changes in the uncertain variables for transport in porous media, and a discussion of some questions regarding fundamental modeling philosophy within a probabilistic framework. Recent results indicate that uncertainty regarding average flow velocity controls the probabilistic outcome, while uncertainty in the dispersivity and diffusion coefficient does not seem very important. Uncertainty of reaction terms is important only at early times in the transport process. Questions are posed regarding (1) the inclusion of macrodispersion in a probabilistic analysis, (2) statistics of flow velocity and (3) the notion of an ultimate probability measure for subsurface flow analyses
A Challenge to Ludwig von Mises’s Theory of Probability
Directory of Open Access Journals (Sweden)
Mark R. Crovelli
2010-10-01
Full Text Available The most interesting and completely overlooked aspect of Ludwig von Mises’s theory of probability is the total absence of any explicit definition for probability in his theory. This paper examines Mises’s theory of probability in light of the fact that his theory possesses no definition for probability. It is argued, first, that Mises’s theory differs in important respects from his brother’s famous theory of probability. A defense of the subjective definition for probability is then provided, which is subsequently used to critique Ludwig von Mises’s theory. It is argued that only the subjective definition for probability comports with Mises’s other philosophical positions. Since Mises did not provide an explicit definition for probability, it is suggested that he ought to have adopted a subjective definition.
Energy Technology Data Exchange (ETDEWEB)
Lagana, Antonio; Faginas Lago, Noelia; Rampino, Sergio [Dipartimento di Chimica, Università di Perugia, 06123 Perugia (Italy); Huarte-Larrañaga, Fermín [Computer Simulation and Modeling Lab (CoSMoLab), Parc Científic de Barcelona, 08028 Barcelona (Spain); García, Ernesto [Departamento de Química Física, Universidad del País Vasco, 01006 Vitoria (Spain)], E-mail: lagana05@gmail.com, E-mail: fhuarte@pcb.ub.es, E-mail: e.garcia@ehu.es
2008-10-15
Zero total angular momentum exact quantum calculations of the probabilities of the N + N₂ reaction have been performed on the L3 potential energy surface having a bent transition state. This has allowed us to work out J-shifting estimates of the thermal rate coefficient based on the calculation of either detailed (state-to-state) or cumulative (multiconfiguration) probabilities. The results obtained are used to compare the numerical outcomes and the concurrent computational machineries of both quantum and semiclassical approaches as well as to exploit the potentialities of the J-shifting model. The implications of moving the barrier to reaction from the previously proposed collinear geometry of the LEPS to the bent one of L3 are also investigated by comparing the related detailed reactive probabilities.
Use of probabilistic methods for estimating failure probabilities and directing ISI-efforts
Energy Technology Data Exchange (ETDEWEB)
Nilsson, F.; Brickstad, B. [University of Uppsala (Sweden)]
1988-12-31
Some general aspects of the role of Non-Destructive Testing (NDT) efforts on the resulting probability of core damage are discussed. A simple model for the estimation of the pipe break probability due to IGSCC is presented. It is partly based on analytical procedures, partly on service experience from the Swedish BWR program. Estimates of the break probabilities indicate that further studies are urgently needed. It is found that the uncertainties about the initial crack configuration are large contributors to the total uncertainty. Some effects of the in-service inspection are studied and it is found that the detection probabilities influence the failure probabilities. (authors).
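The qualitative point above, that inspection detection probabilities feed through to failure probabilities, can be sketched with a far simpler model than the paper's; every number below is invented for illustration.

```python
# Toy model: a break requires a crack to initiate, to escape detection in
# every in-service inspection (ISI), and then to cause failure.
p_crack = 1e-3            # hypothetical crack-initiation probability
p_fail_given_crack = 0.05  # hypothetical conditional failure probability

def p_failure(pod_per_inspection, n_inspections):
    """Failure probability given a per-inspection probability of detection."""
    p_missed = (1.0 - pod_per_inspection) ** n_inspections
    return p_crack * p_missed * p_fail_given_crack

print(f"no ISI            : {p_failure(0.0, 0):.2e}")
print(f"POD 0.7, 3 rounds : {p_failure(0.7, 3):.2e}")
```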
On the probability of cure for heavy-ion radiotherapy
International Nuclear Information System (INIS)
Hanin, Leonid; Zaider, Marco
2014-01-01
The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)
Stochastic simulation of enzyme-catalyzed reactions with disparate timescales.
Barik, Debashis; Paul, Mark R; Baumann, William T; Cao, Yang; Tyson, John J
2008-10-01
Many physiological characteristics of living cells are regulated by protein interaction networks. Because the total numbers of these protein species can be small, molecular noise can have significant effects on the dynamical properties of a regulatory network. Computing these stochastic effects is made difficult by the large timescale separations typical of protein interactions (e.g., complex formation may occur in fractions of a second, whereas catalytic conversions may take minutes). Exact stochastic simulation may be very inefficient under these circumstances, and methods for speeding up the simulation without sacrificing accuracy have been widely studied. We show that the "total quasi-steady-state approximation" for enzyme-catalyzed reactions provides a useful framework for efficient and accurate stochastic simulations. The method is applied to three examples: a simple enzyme-catalyzed reaction where enzyme and substrate have comparable abundances, a Goldbeter-Koshland switch, where a kinase and phosphatase regulate the phosphorylation state of a common substrate, and coupled Goldbeter-Koshland switches that exhibit bistability. Simulations based on the total quasi-steady-state approximation accurately capture the steady-state probability distributions of all components of these reaction networks. In many respects, the approximation also faithfully reproduces time-dependent aspects of the fluctuations. The method is accurate even under conditions of poor timescale separation.
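The reduction described above can be sketched as a one-variable Gillespie simulation in which the total quasi-steady-state approximation (tQSSA) supplies the propensity for product formation. The rate constants and copy numbers below are illustrative, not taken from the paper.

```python
import math, random

random.seed(1)

# Toy Gillespie simulation of E + S <-> C -> E + P, reduced with the tQSSA:
# track only the total substrate T = S + C; the enzyme-substrate complex C
# is slaved to T through the quadratic steady-state relation.
E_T = 50.0          # total enzyme copies (illustrative)
k2, KM = 1.0, 30.0  # catalytic rate and Michaelis constant (copy units)

def complex_tqssa(T):
    """Steady-state complex count for total substrate T (tQSSA root)."""
    b = E_T + KM + T
    return 0.5 * (b - math.sqrt(b * b - 4.0 * E_T * T))

def simulate(T0=200):
    t, T = 0.0, float(T0)
    while T > 0:
        a = k2 * complex_tqssa(T)      # propensity of one catalytic event
        t += random.expovariate(a)     # exponential waiting time
        T -= 1                         # one substrate converted to product
    return t

times = [simulate() for _ in range(20)]
mean_t = sum(times) / len(times)
print(f"mean completion time over 20 runs: {mean_t:.1f}")
```

The full model has two fast reactions (binding/unbinding) per slow catalytic event; the tQSSA removes them, which is the efficiency gain the abstract describes.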
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...
Introducing Disjoint and Independent Events in Probability.
Kelly, I. W.; Zwiers, F. W.
Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Collective probabilities algorithm for surface hopping calculations
International Nuclear Information System (INIS)
Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto
2003-01-01
General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm that takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method
Examples of Neutrosophic Probability in Physics
Directory of Open Access Journals (Sweden)
Fu Yuhua
2015-01-01
Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
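The alternative representations the abstract compares can be checked against each other on a standard worked example: the probability of drawing two aces in two cards from a 52-card deck, computed by sequential conditioning and by hypergeometric counting.

```python
from fractions import Fraction
from math import comb

# Sequential conditioning: P(ace first) * P(ace second | ace first).
p_sequential = Fraction(4, 52) * Fraction(3, 51)

# Counting (hypergeometric): favourable 2-card hands / all 2-card hands.
p_counting = Fraction(comb(4, 2), comb(52, 2))

print(p_sequential, p_counting)  # both 1/221
```

Exact rational arithmetic makes the agreement of the two methods an identity rather than a numerical coincidence.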
Some open problems in noncommutative probability
International Nuclear Information System (INIS)
Kruszynski, P.
1981-01-01
A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
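A toy life-table computation along the lines described above: from (entirely invented) death probabilities q_x we build the survivorship column l_x and a curtate life expectancy at birth.

```python
# Hypothetical death probabilities per age interval; the final 1.0 closes
# out the table. These are illustrative numbers, not real mortality data.
qx = [0.01, 0.002, 0.002, 0.003, 0.005, 0.010, 0.030, 0.080, 0.250, 1.0]

lx = [100_000]                      # radix: cohort alive at age 0
for q in qx:
    lx.append(round(lx[-1] * (1 - q)))

# Curtate expectation of life at birth: expected whole intervals survived.
e0 = sum(lx[1:]) / lx[0]
print(f"survivors by age: {lx}")
print(f"curtate life expectancy at birth: {e0:.2f} intervals")
```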
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
47 CFR 1.1623 - Probability calculation.
2010-10-01
Title 47 (Telecommunication), Vol. 1, revised as of 2010-10-01. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall be...
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Against All Odds: When Logic Meets Probability
van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.
2017-01-01
This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
The probability of the false vacuum decay
International Nuclear Information System (INIS)
Kiselev, V.; Selivanov, K.
1983-01-01
The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.
Probability elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
The transition probabilities of the reciprocity model
Snijders, T.A.B.
1999-01-01
The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Fusion probability and survivability in estimates of heaviest nuclei production
International Nuclear Information System (INIS)
Sagaidak, Roman
2012-01-01
A number of theoretical models have been recently developed to predict production cross sections for the heaviest nuclei in fusion-evaporation reactions. All the models reproduce cross sections obtained in experiments quite well. At the same time they give fusion probability values P_fus ≡ P_CN that differ by several orders of magnitude. This difference implies a corresponding distinction in the calculated values of survivability. The production of the heaviest nuclei (from Cm to the region of superheavy elements (SHE) close to Z = 114 and N = 184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing (fusion) model coupled with the standard statistical model (SSM) of compound nucleus (CN) decay. Both models are incorporated into the HIVAP code. Available data on the excitation functions for fission and evaporation residues (ER) produced in very asymmetric combinations can be described rather well within the framework of HIVAP. Cross-section data obtained in these reactions allow one to choose model parameters quite definitely. Thus one can scale and fix macroscopic (liquid-drop) fission barriers for the nuclei involved in the evaporation-fission cascade. In less asymmetric combinations (with ²²Ne and heavier projectiles), effects of fusion suppression caused by quasi-fission start to appear in the entrance channel of the reactions. The P_fus values derived from the capture-fission and fusion-fission cross sections obtained at energies above the Bass barrier were plotted as a function of the Coulomb parameter. For more symmetric combinations one can deduce the P_fus values semi-empirically, using the ER and fission excitation functions measured in experiments and applying the SSM with parameters obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN, as was done for reactions leading to pre-actinide nuclei formation
The enigma of probability and physics
International Nuclear Information System (INIS)
Mayants, L.
1984-01-01
This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are required to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for the safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
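The binomial arithmetic behind the point-estimate demonstration can be sketched as follows. The 29-flaw, zero-miss design is from the abstract; everything else is standard binomial computation, shown here only as an illustration.

```python
from math import comb

# If all 29 flaws must be found, the probability of passing the
# demonstration is p**29, where p is the true POD at that flaw size.
def p_pass(p, n=29, max_misses=0):
    """Probability of at most max_misses misses in n binomial trials."""
    return sum(comb(n, k) * (1 - p) ** k * p ** (n - k)
               for k in range(max_misses + 1))

# A 29/29 success demonstrates 90% POD at roughly 95% confidence: if the
# true POD were only 0.90, passing would occur with probability ~0.047.
print(f"P(pass | p = 0.90) = {p_pass(0.90):.3f}")
print(f"P(pass | p = 0.98) = {p_pass(0.98):.3f}")
```

The optimization trade-off the paper describes lives in this function: demonstration designs (n, max_misses) change how steeply the probability of passing depends on the true POD.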
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
Czech Academy of Sciences Publication Activity Database
Janíček, P.; Fuis, Vladimír; Málek, M.
2010-01-01
Roč. 14, č. 4 (2010), s. 42-51 ISSN 1335-2393 Institutional research plan: CEZ:AV0Z20760514 Keywords: computational modeling * ceramic head * in vivo destructions * hip joint endoprosthesis * probability of rupture Subject RIV: BO - Biophysics
Energy Technology Data Exchange (ETDEWEB)
Xu Yuntao; Xiong Bo; Chang, Yih Chung; Ng, C. Y. [Department of Chemistry, University of California, Davis, California 95616 (United States)
2012-12-28
By employing the newly established vacuum ultraviolet laser pulsed field ionization-photoion (PFI-PI) double quadrupole-double octopole ion guide apparatus, we have measured the rovibrationally selected absolute total cross sections of the ion-molecule reaction H₂O⁺(X²B₁; v₁⁺v₂⁺v₃⁺ = 000; N⁺_Ka⁺Kc⁺) + D₂ → H₂DO⁺ + D in the center-of-mass collision energy (E_cm) range of 0.05-10.00 eV. The pulsing scheme used for the generation of PFI-PIs has made possible the preparation of reactant H₂O⁺(X²B₁; v₁⁺v₂⁺v₃⁺ = 000) ions in single N⁺_Ka⁺Kc⁺ rotational levels with high kinetic energy resolutions. The absolute total cross sections observed in different N⁺_Ka⁺Kc⁺ levels with rotational energies in the range of 0-200 cm⁻¹ were found to exhibit a significant rotational enhancement of the reactivity for the title reaction. In contrast, the measured cross sections reveal a decreasing trend with increasing E_cm, indicating that the rotational enhancement observed is not a total energy effect, but a dynamical effect. Furthermore, the rotational enhancement is found to be more pronounced as E_cm is decreased. This experiment provided evidence that the coupling of the core rotational angular momentum with the orbital angular momentum could play a role in chemical reactivity, particularly at low E_cm.
International Nuclear Information System (INIS)
Shimada, Yoshio
2000-01-01
It is anticipated that changing the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and hence the core damage probability, and that the change will differ depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, that can calculate core damage probability in a short time; various component failure probabilities were varied between 0 and 1, using both Japanese and American initiating event frequency data. The analysis showed: (1) the frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed with care, since the core damage probability changes substantially when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little even when their failure probabilities change by about one order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by one order of magnitude from the base value; when American failure probability data are applied, the increase in core damage probability is large as the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
International Nuclear Information System (INIS)
Hansen, O.
1983-01-01
A brief review is presented of the experimental and theoretical situation regarding transfer reactions and inelastic scattering. In the first category there is very little precision data for heavy projectiles and consequently almost no experience with quantitative theoretical analysis. For inelastic scattering, the rather extensive data strongly support coupled-channels models with collective form factors. At the most backward angles, at intensities about 10^-5 of Rutherford scattering, a second, compound-like mechanism becomes dominant. The description of the interplay of these two opposite mechanisms provides a new challenge for our understanding
Failure-probability driven dose painting
International Nuclear Information System (INIS)
Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Berthelsen, Anne K.; Bentzen, Søren M.
2013-01-01
Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: For patients treated at our center, five tumor subvolumes are delineated, extending outward from the center of the tumor (the PET-positive volume). The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose-response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose-response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%-91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity
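The product-form logic of a multi-compartment TCP model can be sketched as follows. All parameter values (D50, slope, subvolume fractions, prescriptions) are invented for illustration and are not the fitted values from the study; the sketch only shows that redistributing dose toward failure-prone compartments at constant volume-weighted mean dose raises the total TCP.

```python
import numpy as np

# Hypothetical five-compartment model, ordered from tumor center outward.
D50 = np.array([70.0, 66.0, 62.0, 55.0, 50.0])    # dose giving 50% control, Gy
GAMMA50 = 2.0                                     # normalized dose-response slope
VOL = np.array([0.05, 0.10, 0.15, 0.30, 0.40])    # relative subvolume sizes

def tcp_total(doses):
    """Total TCP is the product of per-compartment logistic dose responses:
    TCP_i(D) = 1 / (1 + (D50_i / D)^(4 * gamma50))."""
    return float(np.prod(1.0 / (1.0 + (D50 / doses) ** (4.0 * GAMMA50))))

uniform = np.full(5, 68.0)                           # constant prescription
painted = np.array([84.0, 80.0, 74.0, 66.0, 62.25])  # central dose escalation

# Both plans deliver the same volume-weighted mean dose (treatment intensity),
# and the painted plan stays below the 85 Gy cap.
mean_uniform = float(VOL @ uniform)
mean_painted = float(VOL @ painted)
```

Here tcp_total(painted) exceeds tcp_total(uniform) even though the mean dose is unchanged, which is the qualitative effect the abstract reports.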
Assessing the clinical probability of pulmonary embolism
International Nuclear Information System (INIS)
Miniati, M.; Pistolesi, M.
2001-01-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
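The pretest-to-posttest update described above is Bayes' theorem on the odds scale. The sketch below uses an invented likelihood ratio and the abstract's intermediate-probability prevalence (38%) purely as illustrative numbers.

```python
def posttest_probability(pretest, likelihood_ratio):
    """Bayes' theorem on the odds scale:
    posttest odds = pretest odds * likelihood ratio."""
    odds = pretest / (1.0 - pretest)
    post_odds = odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical example: intermediate clinical probability of PE (38%)
# combined with a positive test whose likelihood ratio is assumed to be 10.
p = posttest_probability(0.38, 10.0)   # ≈ 0.86
```

A likelihood ratio of 1 leaves the probability unchanged, which is why low-yield individual findings only become useful in combination.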
Assumed Probability Density Functions for Shallow and Deep Convection
Steven K Krueger; Peter A Bogenschutz; Marat Khairoutdinov
2010-01-01
The assumed joint probability density function (PDF) between vertical velocity and conserved temperature and total water scalars has been suggested to be a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PD...
International Nuclear Information System (INIS)
Corner, J.; Richardson, K.; Fenton, N.
1990-01-01
'Nuclear Reactions' marks a new development in the study of television as an agency of public policy debate. During the Eighties, nuclear energy became a major international issue. The disasters at Three Mile Island and Chernobyl created a global anxiety about its risks and a new sensitivity to it among politicians and journalists. This book is a case-study of documentary depictions of nuclear energy in television and video programmes and of the interpretations and responses of viewers drawn from many different occupational groupings. How are the complex and specialist arguments about benefit, risk and proof conveyed through the different conventions of commentary, interview and film sequence? What symbolic associations does the visual language of television bring to portrayals of the issue? And how do viewers make sense of various and conflicting accounts, connecting what they see and hear on the screen with their pre-existing knowledge, experience and 'civic' expectations? The authors examine some of the contrasting forms and themes which have been used by programme makers to explain and persuade, and then give a sustained analysis of the nature and sources of viewers' own accounts. 'Nuclear Reactions' inquires into the public meanings surrounding energy and the environment, spelling out in its conclusion some of the implications for future media treatments of this issue. It is also a key contribution to the international literature on 'television knowledge' and the processes of active viewing. (author)
Upgrading Probability via Fractions of Events
Directory of Open Access Journals (Sweden)
Frič Roman
2016-08-01
Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.
Failure probability analysis of optical grid
Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng
2008-11-01
Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems become widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of an entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that the differing requirements of different clients can be satisfied. When an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, both the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new scheduling algorithm guarantees the required failure probability while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
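A minimal sketch of the task-based idea, under the simplifying assumption (mine, not necessarily the paper's) that tasks fail independently and the application fails if any task fails in all of its replicas:

```python
from math import prod

def app_failure_probability(task_fail_probs, replicas=1):
    """Application over a DAG of independent tasks: a task fails only if
    all of its replicas fail (probability p**replicas), and the application
    fails if at least one task fails."""
    return 1.0 - prod(1.0 - p ** replicas for p in task_fail_probs)

tasks = [0.01, 0.02, 0.005]   # hypothetical per-task failure probabilities
no_backup = app_failure_probability(tasks)        # ≈ 0.0347
with_backup = app_failure_probability(tasks, 2)   # ≈ 0.0005
```

This captures why backup strategies trade resource usage for a lower application failure probability, the compromise the MDSA algorithm is said to manage.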
Pirre, Michel; Marceau, Francois J.; Lebras, Georges; Maguin, Francoise; Poulet, Gille; Ramaroson, Radiela
1994-01-01
The impact of new laboratory data for the reaction BrO + HO2 → HOBr + O2 on the depletion of global stratospheric ozone has been estimated using a one-dimensional photochemical model taking into account the heterogeneous reaction on sulphate aerosols which converts N2O5 into HNO3. Assuming an aerosol loading twice as large as the 'background' and a reaction probability of 0.1 for the above heterogeneous reaction, the 6-fold increase in the measured rate constant for the reaction of BrO with HO2 increases the computed depletion of global ozone produced by 20 ppt of total bromine from 2.01 percent to 2.36 percent. The use of the higher rate constant increases the HOBr mixing ratio and makes the bromine partitioning and the ozone depletion very sensitive to the branching ratio of the potential channel forming HBr in the BrO + HO2 reaction.
Uncertainty about probability: a decision analysis perspective
International Nuclear Information System (INIS)
Howard, R.A.
1988-01-01
The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to the usual view. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
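The coin-tossing point can be made concrete with a Beta distribution over the "definitive number" (the coin's true heads probability); the Beta choice and its parameters are my illustrative assumption, not the paper's model.

```python
def predictive_head_probability(a, b):
    """With a Beta(a, b) distribution over the coin's 'definitive' heads
    probability, the probability to assign to heads is the mean a / (a + b),
    and observing a head updates the distribution to Beta(a + 1, b)."""
    return a / (a + b)

# A coin believed fair on average, but not known to be fair: Beta(5, 5).
before = predictive_head_probability(5, 5)       # 0.5
after = predictive_head_probability(5 + 1, 5)    # after seeing one head: 6/11
```

Unless the distribution is a point mass at 1/2 (absolute certainty of fairness), the mean after a head is strictly larger, exactly as the abstract argues.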
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Dependency models and probability of joint events
International Nuclear Information System (INIS)
Oerjasaeter, O.
1982-08-01
Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Probabilities on Streams and Reflexive Games
Directory of Open Access Journals (Sweden)
Andrew Schumann
2014-01-01
Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Computation of the Complex Probability Function
Energy Technology Data Exchange (ETDEWEB)
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it quickly and efficiently for a given z. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of Gauss-Hermite quadrature for the complex probability function.
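The quadrature the abstract describes can be sketched in a few lines: the complex probability (Faddeeva) function w(z) = (i/pi) * Integral of exp(-t^2)/(z - t) over the real line (for Im(z) > 0) is exactly the Gauss-Hermite weight times a smooth integrand, so the nodes and weights apply directly. This is a minimal illustration, not LANL's implementation; accuracy degrades as Im(z) approaches the real axis, which is one of the shortcomings the report discusses.

```python
import numpy as np

def w_gauss_hermite(z, n=100):
    """Approximate w(z) = (i/pi) * sum_k weight_k / (z - node_k), valid for
    Im(z) > 0, using n-point Gauss-Hermite nodes/weights for exp(-t^2)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n)
    return 1j / np.pi * np.sum(weights / (z - nodes))

# Check against the known closed form w(i) = e * erfc(1) ≈ 0.4275836.
approx = w_gauss_hermite(1j)
```

For moderate Im(z) the approximation converges rapidly in n; near the real axis many more nodes (or a different method) are needed.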
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate computations for the multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
Modeling experiments using quantum and Kolmogorov probability
International Nuclear Information System (INIS)
Hess, Karl
2008-01-01
Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.
Keren, G.; Teigen, K.H.
2001-01-01
This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which
Spallation reactions; Reactions de spallation
Energy Technology Data Exchange (ETDEWEB)
Cugon, J.
1996-12-31
Spallation reactions dominate the interactions of hadrons with nuclei in the GeV range (from ~0.1 to ~10 GeV). They correspond to a sometimes important ejection of light particles, leaving most of the time a residue of mass commensurate with the target mass. The main features of the experimental data are briefly reviewed. The most successful theoretical model, namely the intranuclear cascade + evaporation model, is presented. Its physical content, results and possible improvements are critically discussed. Alternative approaches are briefly reviewed. (author). 84 refs.
International Nuclear Information System (INIS)
Turlay, R.
1962-01-01
In the first part of this experiment, we determined the total cross section for processes yielding only neutral particles, from 300 to 1600 MeV. For this, we counted the number of incident π⁻, as defined by a counter telescope, which interact in a liquid-hydrogen target without giving charged particles in a 4π counter surrounding the target. In the second part of this experiment, we separated the reactions π⁻p → π⁰n and π⁻p → π⁰π⁰n between 300 and 1100 MeV, supposing that only these two neutral channels contribute. This separation was realized by placing lead absorbers between the target and the 4π counter and by comparing the counting rates for neutral events with and without lead. The transmission thus measured is a function of the average number of photons produced, and therefore of the ratio between the two neutral channels, π⁰n and π⁰π⁰n. In the last section of this work, we discuss the experimental results and compare them to those obtained by other authors in the study of photoproduction and the π-nucleon interaction. (author) [fr
Energy Technology Data Exchange (ETDEWEB)
Compton, N.; Taylor, C. E.; Hicks, K.; Cole, P.; Zachariou, N.; Ilieva, Y.; Nadel-Turonski, P.; Klempt, E.; Nikonov, V. A.; Sarantsev, A. V.; Adhikari, K. P.; Adhikari, S.; Akbar, Z.; Anefalos Pereira, S.; Avakian, H.; Baltzell, N. A.; Battaglieri, M.; Batourine, V.; Bedlinskiy, I.; Biselli, A. S.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Camp, M.; Cao, Frank Thanh; Cao, T.; Carman, D. S.; Celentano, A.; Charles, G.; Chetry, T.; Ciullo, G.; Clark, L.; Cole, P. L.; Contalbrigo, M.; Cortes, O.; Crede, V.; D' Angelo, A.; Dashyan, N.; De Vita, R.; De Sanctis, E.; Deur, A.; Djalali, C.; Dupre, R.; Egiyan, H.; El Alaoui, A.; El Fassi, L.; Elouadrhiri, L.; Eugenio, P.; Fedotov, G.; Filippi, A.; Fleming, J. A.; Fradi, A.; Gavalian, G.; Ghandilyan, Y.; Giovanetti, K. L.; Girod, F. X.; Glazier, D. I.; Gleason, C.; Golovatch, E.; Gothe, R. W.; Griffioen, K. A.; Guidal, M.; Guo, L.; Hafidi, K.; Hakobyan, H.; Hanretty, C.; Harrison, N.; Heddle, D.; Holtrop, M.; Hughes, S. M.; Hyde, C. E.; Ireland, D. G.; Ishkhanov, B. S.; Isupov, E. L.; Jenkins, D.; Jo, H. S.; Joo, K.; Joosten, S.; Keller, D.; Khachatryan, G.; Khachatryan, M.; Khandaker, M.; Kim, W.; Klein, A.; Klein, F. J.; Kubarovsky, V.; Kuleshov, S. V.; Lanza, L.; Lenisa, P.; Livingston, K.; Lu, H. Y.; MacGregor, I. J. D.; Markov, N.; McKinnon, B.; Meyer, C. A.; Mineeva, T.; Mirazita, M.; Mokeev, V.; Montgomery, R. A.; Movsisyan, A.; Munevar, E.; Munoz Camacho, C.; Murdoch, G.; Nadel-Turonski, P.; Niccolai, S.; Niculescu, G.; Niculescu, I.; Osipenko, M.; Ostrovidov, A. I.; Paolone, M.; Paremuzyan, R.; Park, K.; Pasyuk, E.; Phelps, W.; Pisano, S.; Pogorelko, O.; Price, J. W.; Prok, Y.; Protopopescu, D.; Raue, B. A.; Ripani, M.; Ritchie, B. G.; Rizzo, A.; Rosner, G.; Sabatié, F.; Salgado, C.; Schumacher, R. A.; Sharabian, Y. G.; Simonyan, A.; Skorodumina, Iu.; Smith, G. D.; Sokhan, D.; Sparveris, N.; Stankovic, I.; Stepanyan, S.; Strakovsky, I. 
I.; Strauch, S.; Taiuti, M.; Torayev, B.; Trivedi, A.; Ungaro, M.; Voskanyan, H.; Voutier, E.; Walford, N. K.; Watts, D. P.; Wei, X.; Wood, M. H.; Zachariou, N.; Zhang, J.
2017-12-01
We report the first measurement of differential and total cross sections for the γd → K⁰Λ(p) reaction, using data from the CLAS detector at the Thomas Jefferson National Accelerator Facility. Data collected during two separate experimental runs were studied with photon-energy coverage 0.8-3.6 GeV and 0.5-2.6 GeV, respectively. The two measurements are consistent, giving confidence in the method and determination of systematic uncertainties. The cross sections are compared with predictions from the KAON-MAID theoretical model (without kaon exchange), which deviate from the data at higher W and at forward kaon angles. These data, along with previously published cross sections for K⁺Λ photoproduction, provide essential constraints on the nucleon resonance spectrum. A first partial wave analysis was performed that describes the data without the introduction of new resonances.
Modelling the probability of building fires
Directory of Open Access Journals (Sweden)
Vojtěch Barták
2014-12-01
Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
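The prediction step maps a building's attributes through the fitted logistic coefficients to a probability. The sketch below uses invented coefficients and attribute names; the study's actual predictors and fitted values are not given in the abstract.

```python
import math

# Hypothetical fitted logistic-regression model (illustrative values only).
INTERCEPT = -6.0
COEFFS = {"floor_area_100m2": 0.30, "building_age_decades": 0.15,
          "is_residential": 0.80}

def fire_probability(attributes):
    """Logistic model: P(fire) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    z = INTERCEPT + sum(COEFFS[name] * x for name, x in attributes.items())
    return 1.0 / (1.0 + math.exp(-z))

p = fire_probability({"floor_area_100m2": 4.0, "building_age_decades": 5.0,
                      "is_residential": 1.0})
```

Evaluating this for every building yields the per-building probabilities that the study visualizes as fire probability maps.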
Encounter Probability of Individual Wave Height
DEFF Research Database (Denmark)
Liu, Z.; Burcharth, H. F.
1998-01-01
In design practice, the design wave height is often chosen as the significant wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
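Two of the standard relations mentioned above can be sketched directly; these are common textbook approximations, not the paper's refined method, and the numerical inputs are illustrative.

```python
import math

def expected_max_wave_height(hs, n_waves):
    """Expected maximum of N Rayleigh-distributed individual wave heights in
    a sea state with significant wave height Hs (common approximation):
    H_max ≈ Hs * sqrt(ln(N) / 2)."""
    return hs * math.sqrt(math.log(n_waves) / 2.0)

def encounter_probability(return_period_years, lifetime_years):
    """Probability that the T-year event is exceeded at least once during
    an L-year structure lifetime: 1 - (1 - 1/T)**L."""
    return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

h_max = expected_max_wave_height(hs=6.0, n_waves=1000)   # ≈ 11.2 m
p_enc = encounter_probability(100, 50)                   # ≈ 0.395
```

The paper's point is precisely that combining these two steps naively leaves the lifetime exceedence probability of H_max unquantified, which its method then supplies.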
Predicting binary choices from probability phrase meanings.
Wallsten, Thomas S; Jang, Yoonhee
2008-08-01
The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.
Certainties and probabilities of the IPCC
International Nuclear Information System (INIS)
2004-01-01
Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)
The probability factor in establishing causation
International Nuclear Information System (INIS)
Hebert, J.
1988-01-01
This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr
Bayesian optimization for computationally extensive probability distributions.
Tamura, Ryo; Hukushima, Koji
2018-01-01
An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
Characteristic length of the knotting probability revisited
International Nuclear Information System (INIS)
Uehara, Erica; Deguchi, Tetsuo
2015-01-01
We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
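The characteristic length N_K is the decay constant of P(N) ~ C * exp(-N/N_K), so it can be extracted from simulated knotting probabilities by a linear fit of log P against N. The data below are synthetic (generated from an assumed N_K = 300), not the paper's simulation results.

```python
import math

# Synthetic knotting probabilities P(N) = C * exp(-N / N_K).
TRUE_NK = 300.0
data = [(n, 0.5 * math.exp(-n / TRUE_NK)) for n in range(100, 1001, 100)]

# Least-squares fit of log P(N) = log C - N / N_K.
xs = [n for n, _ in data]
ys = [math.log(p) for _, p in data]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
den = sum((x - x_mean) ** 2 for x in xs)
slope = num / den
nk_estimate = -1.0 / slope   # recovers N_K = 300 on this noise-free data
```

With real simulation data the fit would carry statistical error, and the paper's finding is that the fitted N_K agrees across different knot types K.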
Probability of Survival Decision Aid (PSDA)
National Research Council Canada - National Science Library
Xu, Xiaojiang; Amin, Mitesh; Santee, William R
2008-01-01
A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, taught concurrently with and integrated with statistics through interactive, tailored software applications designed to illustrate the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, and validation, all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem-solving methods
Determining probabilities of geologic events and processes
International Nuclear Information System (INIS)
Hunter, R.L.; Mann, C.J.; Cranwell, R.M.
1985-01-01
The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.
Probability of spent fuel transportation accidents
International Nuclear Information System (INIS)
McClure, J.D.
1981-07-01
The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9/mile
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49–58.
Imprecise Probability Methods for Weapons UQ
Energy Technology Data Exchange (ETDEWEB)
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Escape and transmission probabilities in cylindrical geometry
International Nuclear Information System (INIS)
Bjerke, M.A.
1980-01-01
An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118], with modifications, was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement
Collision Probabilities for Finite Cylinders and Cuboids
Energy Technology Data Exchange (ETDEWEB)
Carlvik, I
1967-05-15
Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.
High throughput nonparametric probability density estimation.
Farmer, Jenny; Jacobs, Donald
2018-01-01
In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists underfitting and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
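The idea of judging a trial CDF against uniform order statistics, which underlies the scoring function described above, can be illustrated with the probability integral transform. The normal candidate CDF and the simple KS-style score below are simplifications for illustration, not the authors' quasi-log-likelihood scoring function:

```python
import numpy as np
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

rng = np.random.default_rng(0)
sample = np.sort(rng.normal(size=500))

# Probability integral transform: if the candidate CDF is correct, the
# transformed values u_k behave like uniform order statistics, whose
# expected values are k/(n+1).
u = np.array([normal_cdf(x) for x in sample])
expected = np.arange(1, len(sample) + 1) / (len(sample) + 1)
score = np.max(np.abs(u - expected))  # a crude measure of atypical fluctuation
print(round(float(score), 3))
```

A wrong trial CDF would push `score` far above the fluctuation level expected for genuinely uniform order statistics, which is how iterative improvement of the trial CDF can be driven.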
Charge-exchange reactions on 36 S
International Nuclear Information System (INIS)
Fifield, L.K.; Catford, W.N.; Orr, N.A.; Ophel, T.R.; Etchegoyen, A.; Etchegoyen, M.C.
1992-11-01
A series of charge-exchange reactions on 36 S targets have been investigated at beam energies ∼7 MeV/A. Pronounced selectivities to different final states in 36 P are observed which depend on the projectile employed. An interpretation of the data in terms of one- and two-step pictures of the reaction mechanism is presented. At least two, and probably all, of the reactions have a significant 1-step contribution to the reaction mechanism at these energies. 22 refs., 5 tabs., 5 figs
Directory of Open Access Journals (Sweden)
M.P. Silva
2006-06-01
coliforms at 45ºC and to compare the Most Probable Number (MPN) and Petrifilm EC methods in their efficiency to detect E. coli in Minas cheese, sausage, fresh vegetable and corn flour samples. Petrifilm EC was a more accurate method than the MPN method, which presented false-negative results or underestimation of E. coli. In conclusion, Petrifilm EC was an efficient and practical method to detect E. coli. Therefore, it may be used as an alternative for enumeration of total coliforms and E. coli in foods.
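For context, the most probable number itself comes from a Poisson maximum-likelihood argument. A minimal single-dilution version (an illustration only, not the multi-dilution MPN tables actually used in food microbiology) is:

```python
import math

# Single-dilution most probable number: inoculate n tubes with volume v each;
# if s tubes stay sterile (negative), the Poisson maximum-likelihood estimate
# of organism concentration is -ln(s/n) / v.
def mpn(n_tubes, n_negative, volume):
    return -math.log(n_negative / n_tubes) / volume

print(round(mpn(10, 3, 1.0), 3))  # ≈ 1.204 organisms per unit volume
```

Real MPN protocols combine three dilution series and look the result up in standard tables, but the underlying estimator is exactly this sterile-fraction calculation.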
International Nuclear Information System (INIS)
Balogh, Brian.
1991-01-01
Chain Reaction is a work of recent American political history. It seeks to explain how and why America came to depend so heavily on its experts after World War II, how those experts translated that authority into political clout, and why that authority and political discretion declined in the 1970s. The author's research into the internal memoranda of the Atomic Energy Commission substantiates his argument in historical detail. It was not the ravages of American anti-intellectualism, as so many scholars have argued, that brought the experts back down to earth. Rather, their decline can be traced to the very roots of their success after World War II. The need to over-state anticipated results in order to garner public support, incessant professional and bureaucratic specialization, and the sheer proliferation of expertise pushed arcane and insulated debates between experts into public forums at the same time that a broad cross section of political participants found it easier to gain access to their own expertise. These tendencies ultimately undermined the political influence of all experts. (author)
Metal-catalyzed asymmetric aldol reactions
Energy Technology Data Exchange (ETDEWEB)
Dias, Luiz C.; Lucca Junior, Emilio C. de; Ferreira, Marco A. B.; Polo, Ellen C., E-mail: ldias@iqm.unicamp.br [Universidade de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica
2012-12-15
The aldol reaction is one of the most powerful and versatile methods for the construction of C-C bonds. Traditionally, this reaction was carried out in a stoichiometric version; however, great efforts have been devoted in recent years to the development of chiral catalysts for aldol reactions. Thus, in this review article, the development of metal-mediated chiral catalysts for the Mukaiyama-type aldol reaction, the reductive aldol reaction and the direct aldol reaction is discussed, along with the application of these catalysts in the total synthesis of complex molecules. (author)
Dynamic effects in fragmentation reactions
International Nuclear Information System (INIS)
Bertsch, G. F.; Esbensen, H.
2002-01-01
Fragmentation reactions offer a useful tool to study the spectroscopy of halo nuclei, but the large extent of the halo wave function makes the reaction theory more difficult. The simple reaction models based on the eikonal approximation for the nuclear interaction or first-order perturbation theory for the Coulomb interaction have systematic errors that they investigate here, comparing to the predictions of complete dynamical calculations. They find that stripping probabilities are underpredicted by the eikonal model, leading to extracted spectroscopic strengths that are too large. In contrast, the Coulomb excitation is overpredicted by the simple theory. They attribute this to a screening effect, as is well known from the Barkas effect on stopping powers. The errors decrease with beam energy as E_beam^-1, and are not significant at beam energies above 50 MeV/u. At lower beam energies, the effects should be taken into account when extracting quantitative spectroscopic strengths
Universal critical wrapping probabilities in the canonical ensemble
Directory of Open Access Journals (Sweden)
Hao Hu
2015-09-01
Universal dimensionless quantities, such as Binder ratios and wrapping probabilities, play an important role in the study of critical phenomena. We study the finite-size scaling behavior of the wrapping probability for the Potts model in the random-cluster representation, under the constraint that the total number of occupied bonds is fixed, so that the canonical ensemble applies. We derive that, in the limit L→∞, the critical values of the wrapping probability are different from those of the unconstrained model, i.e. the model in the grand-canonical ensemble, but still universal, for systems with 2y_t−d>0, where y_t=1/ν is the thermal renormalization exponent and d is the spatial dimension. Similar modifications apply to other dimensionless quantities, such as Binder ratios. For systems with 2y_t−d≤0, these quantities share the same critical universal values in the two ensembles. It is also derived that new finite-size corrections are induced. These findings apply more generally to systems in the canonical ensemble, e.g. the dilute Potts model with a fixed total number of vacancies. Finally, we formulate an efficient cluster-type algorithm for the canonical ensemble, and confirm these predictions by extensive simulations.
Fusion probability and survivability in estimates of heaviest nuclei production
Directory of Open Access Journals (Sweden)
Sagaidak Roman N.
2012-02-01
Production of the heavy and heaviest nuclei (from Po to the region of superheavy elements close to Z=114 and N=184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing model coupled with the statistical model (SM) of de-excitation of a compound nucleus (CN). Excitation functions for fission and evaporation residues (ER) measured in very asymmetric combinations can be described rather well. One can scale and fix macroscopic (liquid-drop) fission barriers for the nuclei involved in the calculation of survivability with the SM. In less asymmetric combinations, effects of fusion suppression caused by quasi-fission (QF) start to appear in the entrance channel of the reactions. QF effects can be taken into account semi-empirically, using fusion probabilities deduced as the ratio of measured ER cross sections to those obtained under the assumption of no fusion suppression in the corresponding reactions. SM parameters (fission barriers) obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN should be used for this evaluation.
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
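The Simpson's-paradox situation that the BK-Plot visualizes can be reproduced with a small numerical example. The counts below are the classic kidney-stone-style numbers, used here purely for illustration:

```python
# Simpson's paradox with a binary confounder Z:
# treatment beats control within each Z stratum, yet looks worse in aggregate.
# strata[z][arm] = (successes, trials)
strata = {
    0: {"treated": (81, 87),   "control": (234, 270)},
    1: {"treated": (192, 263), "control": (55, 80)},
}

def rate(z, arm):
    s, n = strata[z][arm]
    return s / n

# Within each Z stratum, treated beats control ...
assert rate(0, "treated") > rate(0, "control")
assert rate(1, "treated") > rate(1, "control")

# ... but pooled over Z the ordering flips, because Z is associated
# with both the treatment assignment and the outcome.
pooled = {arm: sum(strata[z][arm][0] for z in strata)
               / sum(strata[z][arm][1] for z in strata)
          for arm in ("treated", "control")}
print(pooled["treated"] < pooled["control"])  # True: the paradox
```

A BK-Plot makes the same point geometrically: the pooled rates are weighted averages of the stratum rates, with different weights in the two arms.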
On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!
Directory of Open Access Journals (Sweden)
Mark R. Crovelli
2009-06-01
Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.
Measurement of probability distributions for internal stresses in dislocated crystals
Energy Technology Data Exchange (ETDEWEB)
Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)
2014-11-03
Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called "restricted second moment of the probability distribution" can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
Evolution of an array of elements with logistic transition probability
International Nuclear Information System (INIS)
Majernik, Vladimir; Surda, Anton
1996-01-01
The paper addresses the problem of how the state of an array of elements changes if the transition probabilities of its elements are chosen in the form of a logistic map. This problem leads to a special type of discrete-time Markov chain, which we simulated numerically for different transition probabilities and numbers of elements in the array. We show that the time evolution of the array exhibits a wide scale of behavior depending on the total number of its elements and on the logistic constant a. We point out that this problem can be applied to the description of a spin system with a certain type of mean field and of multispecies ecosystems with an internal noise. (authors)
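A toy mean-field reading of such an array (each element flips to state 1 with a probability given by a logistic map of the current fraction of "up" elements) can be simulated directly. The update rule and parameter values below are assumptions for illustration, not the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(42)

def evolve(n_elements=1000, a=3.5, steps=200):
    """Simulate the array; return the trajectory of the 'up' fraction."""
    state = rng.integers(0, 2, size=n_elements)
    history = []
    for _ in range(steps):
        x = state.mean()                      # current fraction of 1-states
        p = a * x * (1.0 - x)                 # logistic transition probability
        state = (rng.random(n_elements) < p).astype(int)
        history.append(state.mean())
    return history

traj = evolve()
print(round(traj[-1], 3))
```

Varying `a` and `n_elements` exposes the range of behavior the abstract mentions: for small arrays the internal noise dominates, while for large arrays the trajectory tracks the deterministic logistic map of the mean field.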
Impact of spectral smoothing on gamma radiation portal alarm probabilities
International Nuclear Information System (INIS)
Burr, T.; Hamada, M.; Hengartner, N.
2011-01-01
Gamma detector counts are included in radiation portal monitors (RPM) to screen for illicit nuclear material. Gamma counts are sometimes smoothed to reduce variance in the estimated underlying true mean count rate, which is the 'signal' in our context. Smoothing reduces total error variance in the estimated signal if the bias that smoothing introduces is more than offset by the variance reduction. An empirical RPM study for vehicle screening applications is presented for unsmoothed and smoothed gamma counts in low-resolution plastic scintillator detectors and in medium-resolution NaI detectors. - Highlights: → We evaluate options for smoothing counts from gamma detectors deployed for portal monitoring. → A new multiplicative bias correction (MBC) is shown to reduce bias in peak and valley regions. → Performance is measured using mean squared error and detection probabilities for sources. → Smoothing with the MBC improves detection probabilities and the mean squared error.
Maximizing probable oil field profit: uncertainties on well spacing
International Nuclear Information System (INIS)
MacKay, J.A.; Lerche, I.
1997-01-01
The influence of uncertainties in field development costs, well costs, lifting costs, selling price, discount factor, and oil field reserves is evaluated for its impact on assessing probable ranges of uncertainty on present day worth (PDW), oil field lifetime τ_2/3, optimum number of wells (OWI), and the minimum (n_-) and maximum (n_+) number of wells to produce a PDW ≥ 0. The relative importance of different factors in contributing to the uncertainties in PDW, τ_2/3, OWI, n_- and n_+ is also analyzed. Numerical illustrations indicate how the maximum PDW depends on the ranges of parameter values, drawn from probability distributions using Monte Carlo simulations. In addition, the procedure illustrates the relative importance of contributions of individual factors to the total uncertainty, so that one can assess where to place effort to improve ranges of uncertainty; while the volatility of each estimate allows one to determine when such effort is needful. (author)
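The Monte Carlo procedure described here (draw the uncertain inputs from probability distributions, compute PDW for each draw, then read off its probable range) can be sketched as follows. All distributions, parameter values, and the crude one-shot PDW formula are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Uncertain inputs drawn from assumed distributions (all values hypothetical)
reserves = rng.normal(50e6, 10e6, n)       # recoverable barrels
price = rng.uniform(15, 25, n)             # selling price, $/barrel
lifting_cost = rng.uniform(4, 8, n)        # lifting cost, $/barrel
dev_cost = rng.normal(200e6, 40e6, n)      # field development + well costs, $
discount = 0.6                             # lumped lifetime discount factor

# Crude present-day worth per Monte Carlo draw
pdw = discount * reserves * (price - lifting_cost) - dev_cost
print(np.percentile(pdw, [10, 50, 90]))    # probable range of PDW
```

Rerunning with one input held fixed at its mean shows how much of the PDW spread that factor contributes, which is the sensitivity analysis the abstract describes for deciding where tighter data would pay off.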
Uncertainty relation and probability. Numerical illustration
International Nuclear Information System (INIS)
Fujikawa, Kazuo; Umetsu, Koichiro
2011-01-01
The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can…
Comparing coefficients of nested nonlinear probability models
DEFF Research Database (Denmark)
Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders
2011-01-01
In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.
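The KHB idea can be illustrated with simulated data: refit the reduced model with the control variable Z replaced by its residual on X, so that the X coefficients of the two models live on a comparable scale and their difference isolates the mediated part. The data-generating values and the bare-bones Newton logit fitter below are assumptions for illustration, not the khb program itself:

```python
import numpy as np

def logit_fit(X, y, iters=30):
    """Plain Newton-Raphson logistic regression; X includes the intercept."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))
        W = p * (1.0 - p)
        b = b + np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))
    return b

rng = np.random.default_rng(3)
n = 20_000
x = rng.normal(size=n)
z = 0.5 * x + rng.normal(size=n)             # mediator correlated with x
eta = 1.0 * x + 1.0 * z                      # true latent index
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-eta))).astype(float)

ones = np.ones(n)
b_full = logit_fit(np.column_stack([ones, x, z]), y)   # full model
# KHB trick: replace z by its residual on x, so the x coefficient captures
# the total effect on the SAME scale as the full model.
r = z - x * (x @ z) / (x @ x)
b_khb = logit_fit(np.column_stack([ones, x, r]), y)
print(round(b_khb[1] - b_full[1], 2))  # mediated (indirect) part of the X effect
```

With z = 0.5x + noise and a unit z coefficient, the mediated part is about 0.5, and the decomposition recovers it without the rescaling bias that a naive comparison of the full and reduced logit coefficients would suffer.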
A basic course in probability theory
Bhattacharya, Rabi
2016-01-01
This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of their…
Ignition probabilities for Compact Ignition Tokamak designs
International Nuclear Information System (INIS)
Stotler, D.P.; Goldston, R.J.
1989-09-01
A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
Independent events in elementary probability theory
Csenki, Attila
2011-07-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E_1, E_2, …, E_n are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
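The quoted statement is easy to verify exhaustively for small n. Here A = E_1 ∪ E_2 and B = E_3^c are built from disjoint subsets of three jointly independent events; the probabilities p_i are arbitrary choices, and exact rational arithmetic avoids any rounding questions:

```python
from itertools import product
from fractions import Fraction

# Three jointly independent events realized as coordinates of a product space:
# E_i = {omega : omega_i = 1}, with P(omega_i = 1) = p_i.
p = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)]

def prob(event):
    """Sum the product measure over all outcomes satisfying the event."""
    total = Fraction(0)
    for w in product((0, 1), repeat=3):
        q = Fraction(1)
        for wi, pi in zip(w, p):
            q *= pi if wi == 1 else 1 - pi
        if event(w):
            total += q
    return total

A = lambda w: w[0] == 1 or w[1] == 1   # A = E_1 ∪ E_2
B = lambda w: w[2] == 0                # B = complement of E_3

# Independence of A and B: P(A ∩ B) = P(A) P(B), exactly
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)
print(prob(A), prob(B))  # 2/3 3/4
```

Since A uses only E_1, E_2 and B uses only E_3, the product factorizes exactly, which is the content of the tacitly assumed statement for this small case.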
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, such…
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
Python for probability, statistics, and machine learning
Unpingco, José
2016-01-01
This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
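One of the numerical illustrations the abstract alludes to, convergence in probability, can be reproduced with nothing but the standard library; the die-rolling setup and sample size below are illustrative choices, not taken from the book:

```python
import random

def running_means(n, seed=0):
    """Running sample means of fair-die rolls: by the law of large numbers,
    the sample mean converges in probability to the true mean 3.5."""
    rng = random.Random(seed)
    total = 0.0
    means = []
    for i in range(1, n + 1):
        total += rng.randint(1, 6)
        means.append(total / i)
    return means

m = running_means(100_000)
```

Plotting `m` against the roll index shows the characteristic narrowing fluctuations around 3.5.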
EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY
Directory of Open Access Journals (Sweden)
Magdalena Hykšová
2012-03-01
The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development along two parallel lines: on one hand, the theory of geometric probability was formed with little attention paid to applications other than spatial games of chance. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.
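Barbier's 1860 paper grew out of Buffon's needle, the archetypal geometric-probability problem; its crossing probability 2l/(πd) is easy to verify by Monte Carlo (needle length, line spacing and sample size below are arbitrary illustrative choices):

```python
import math
import random

def buffon_pi(n, needle=1.0, gap=2.0, seed=42):
    """Estimate pi from Buffon's needle: for needle <= gap,
    P(cross) = 2 * needle / (pi * gap), so pi = 2 * needle * n / (gap * hits)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.uniform(0.0, gap / 2.0)        # centre's distance to nearest line
        theta = rng.uniform(0.0, math.pi / 2)  # acute angle with the lines
        if x <= (needle / 2.0) * math.sin(theta):
            hits += 1
    return 2.0 * needle * n / (gap * hits)     # invert the crossing probability

est = buffon_pi(200_000)
```

With 200,000 throws the estimate typically lands within a few hundredths of π.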
Probability analysis of nuclear power plant hazards
International Nuclear Information System (INIS)
Kovacs, Z.
1985-01-01
Probabilistic risk analysis, used for quantifying the risk of complex technological systems, especially of nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks Czechoslovakia faces in the development of probabilistic risk analysis are listed, and the composition of the work team for coping with the task is recommended. (J.C.)
Correlations and Non-Linear Probability Models
DEFF Research Database (Denmark)
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit, probit, and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
Geometric modeling in probability and statistics
Calin, Ovidiu
2014-01-01
This book covers topics in information geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
Fixation probability on clique-based graphs
Choi, Jeong-Ok; Yu, Unjong
2018-02-01
The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star graphs is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
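The well-mixed baseline against which such graphs are classified as amplifiers or suppressors has a closed form, and a Monte-Carlo estimate can be checked against it. The sketch below uses the standard birth-death Moran update on a complete graph, not the paper's clique-based graphs; population size, fitness and trial count are arbitrary:

```python
import random

def fixation_probability(N, r, trials=20_000, seed=1):
    """Monte-Carlo fixation probability of one mutant of fitness r in a
    Moran process on a well-mixed population of size N."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        mutants = 1
        while 0 < mutants < N:
            # birth proportional to fitness, death uniform over the population
            p_mut = mutants * r / (mutants * r + (N - mutants))
            birth_is_mutant = rng.random() < p_mut
            death_is_mutant = rng.random() < mutants / N
            mutants += (birth_is_mutant and not death_is_mutant) - \
                       (death_is_mutant and not birth_is_mutant)
        fixed += mutants == N
    return fixed / trials

def moran_exact(N, r):
    """Closed form for the complete graph: rho = (1 - 1/r) / (1 - 1/r**N)."""
    return (1 - 1 / r) / (1 - 1 / r ** N)

est = fixation_probability(10, 1.5)
```

An amplifier pushes the fixation probability of an advantageous mutant above `moran_exact`; a suppressor pushes it below.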
International Nuclear Information System (INIS)
Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan
2001-01-01
In order to estimate the influence of a non-uniform dose distribution on the clinical treatment result, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to different degrees of dose distribution homogeneity. The results show that the tumor control probability corresponding to the same total dose will decrease if the dose distribution homogeneity gets worse. In clinical treatment, the dose distribution homogeneity should be better than 95%.
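The direction of the effect (same mean dose, worse homogeneity, lower control probability) follows from the convexity of the cell-survival term, and can be sketched with a standard Poisson TCP model. The cell number, radiosensitivity alpha and dose values below are illustrative assumptions, not the paper's:

```python
import math

def tcp(doses, cells_per_voxel=1e7, alpha=0.35):
    """Poisson tumour-control probability over voxel doses:
    TCP = prod_i exp(-n_i * exp(-alpha * D_i))  (simple exponential cell kill)."""
    log_tcp = sum(-cells_per_voxel * math.exp(-alpha * d) for d in doses)
    return math.exp(log_tcp)

uniform = [60.0] * 10              # perfectly homogeneous 60 Gy
skewed = [57.0] * 5 + [63.0] * 5   # same mean dose, 95% homogeneity

t_uniform = tcp(uniform)
t_skewed = tcp(skewed)
```

Because the surviving fraction exp(-alpha*D) is convex in D, the underdosed voxels hurt more than the overdosed ones help, so `t_skewed < t_uniform` at equal mean dose.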
Fitting the Probability Distribution Functions to Model Particulate Matter Concentrations
International Nuclear Information System (INIS)
El-Shanshoury, Gh.I.
2017-01-01
The main objective of this study is to identify the best probability distribution and plotting position formula for modeling the concentrations of Total Suspended Particles (TSP) as well as of Particulate Matter with an aerodynamic diameter < 10 μm (PM10). The best distribution provides the estimated probabilities that exceed the threshold limit given by the Egyptian Air Quality Limit Value (EAQLV), and the number of exceedance days is also estimated. The standard limits of the EAQLV for TSP and PM10 concentrations are 24-h averages of 230 μg/m³ and 70 μg/m³, respectively. Five frequency distribution functions with seven plotting-position formulae (empirical cumulative distribution functions) are compared to fit the averages of daily TSP and PM10 concentrations in the year 2014 for Ain Sokhna city. The Quantile-Quantile plot (Q-Q plot) is used as a method for assessing how closely a data set fits a particular distribution. A proper probability distribution that represents the TSP and PM10 concentrations has been chosen based on the statistical performance indicator values. The results show that the Hosking and Wallis plotting position combined with the Frechet distribution gave the best fit for TSP and PM10 concentrations; the Burr distribution with the same plotting position follows the Frechet distribution. The exceedance probability and days over the EAQLV are predicted using the Frechet distribution. In 2014, the exceedance probability and days for TSP concentrations are 0.052 and 19 days, respectively. Furthermore, the PM10 concentration is found to exceed the threshold limit on 174 days
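The two ingredients named in the abstract are short to state in code. Below, the Hosking-Wallis plotting position is taken as p_i = (i - 0.35)/n, and the Frechet shape and scale are invented for illustration (the paper's fitted parameters are not given in the abstract):

```python
import math

def hosking_wallis_positions(n):
    """Empirical non-exceedance probabilities p_i = (i - 0.35) / n for
    the i-th smallest of n ranked observations (Hosking-Wallis form)."""
    return [(i - 0.35) / n for i in range(1, n + 1)]

def frechet_cdf(x, shape, scale):
    """Two-parameter Frechet CDF F(x) = exp(-(x / scale) ** -shape), x > 0."""
    return math.exp(-((x / scale) ** (-shape)))

# illustrative parameters against the TSP limit of 230 ug/m3
p_exceed = 1.0 - frechet_cdf(230.0, shape=2.5, scale=75.0)
days = round(365 * p_exceed)
```

Once a distribution is fitted, the expected number of exceedance days is simply 365 times the exceedance probability of the regulatory threshold.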
An empirical probability model of detecting species at low densities.
Delaney, David G; Leung, Brian
2010-06-01
False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
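A common way to formalize the link between sampling intensity, target density and false negatives is a Poisson encounter model; the per-target detection probability below is an assumed figure for illustration, not an estimate from this study:

```python
import math

def p_false_negative(density, area, p_detect_one=0.8):
    """Probability of missing all targets when targets occur as a Poisson
    process at `density` (per m^2) over the searched `area` (m^2) and each
    present target is detected independently with p_detect_one (assumed)."""
    return math.exp(-density * area * p_detect_one)

def area_needed(density, p_detect_one=0.8, alpha=0.05):
    """Search area required to push the false-negative probability below alpha."""
    return -math.log(alpha) / (density * p_detect_one)

miss = p_false_negative(density=0.01, area=100.0)
```

Inverting the model, as `area_needed` does, is the practical payoff: it tells a surveyor how much effort is required before absence data become trustworthy.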
Duelling idiots and other probability puzzlers
Nahin, Paul J
2002-01-01
What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...
Probability densities and Lévy densities
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler
For positive Lévy processes (i.e. subordinators), formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
Probabilities from entanglement, Born's rule from envariance
International Nuclear Information System (INIS)
Zurek, W.
2005-01-01
I shall discuss consequences of envariance (environment-assisted invariance), a symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance and, hence, for the origin of probabilities in physics. While the derivation of the Born rule for probabilities (p_k = |ψ_k|²) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance, i.e., without invoking the Born rule. (author)
Risk Probability Estimating Based on Clustering
DEFF Research Database (Denmark)
Chen, Yong; Jensen, Christian D.; Gray, Elizabeth
2003-01-01
...of prior experiences, recommendations from a trusted entity, or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...
Fifty challenging problems in probability with solutions
Mosteller, Frederick
1987-01-01
Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
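The classic resolution of the subway puzzle is that equal train frequencies do not imply equal chances: what matters is the offset between the two schedules. A simulation under an assumed 10-minute cycle with the uptown train trailing the downtown one by 1 minute (numbers chosen to illustrate the mechanism, not quoted from Nahin's solution) reproduces the roughly 1-in-10 visit rate:

```python
import random

def dinners_with_mother(days, cycle=10.0, uptown_offset=1.0, seed=7):
    """Simulate the 'Unfair Subway'. Trains in each direction run every
    `cycle` minutes, but the uptown train arrives `uptown_offset` minutes
    after the downtown one (assumed schedule). Marvin arrives at a uniformly
    random time and boards whichever train comes first."""
    rng = random.Random(seed)
    mother = 0
    for _ in range(days):
        t = rng.uniform(0.0, cycle)  # position within the schedule cycle
        # the uptown train wins only if Marvin lands in the short window
        # after a downtown departure and before the next uptown arrival
        if t < uptown_offset:
            mother += 1
    return mother

count = dinners_with_mother(days=20_000)
```

With a 1-minute window in a 10-minute cycle, Marvin sees his mother about 10% of the time, matching "twice in 20 working days" despite the trains being equally frequent.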
Path probabilities of continuous time random walks
International Nuclear Information System (INIS)
Eule, Stephan; Friedrich, Rudolf
2014-01-01
Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).
Lady luck the theory of probability
Weaver, Warren
1982-01-01
"Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day, in business, in love affairs, in forecasting the weather or the stock market, questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa
Bayesian estimation of core-melt probability
International Nuclear Information System (INIS)
Lewis, H.W.
1984-01-01
A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
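The mechanics of the update can be sketched with a conjugate stand-in: a Gamma prior on a Poisson core-melt rate, zero observed melts, and an assumed operating history (none of the numbers below are from the Rasmussen study or from Lewis's paper):

```python
def posterior_mean_rate(alpha, beta, events, exposure):
    """Conjugate update: a Gamma(alpha, beta) prior on a Poisson core-melt
    rate, after observing `events` melts in `exposure` reactor-years,
    becomes Gamma(alpha + events, beta + exposure)."""
    return (alpha + events) / (beta + exposure)

prior_mean = posterior_mean_rate(1.0, 20_000.0, 0, 0.0)        # 5e-5 per reactor-year
post_mean = posterior_mean_rate(1.0, 20_000.0, 0, 40_000.0)    # no melts observed
shrinkage = prior_mean / post_mean
```

With these illustrative numbers, the accident-free record shrinks the mean rate by a factor of 3, the same order as the factors of 2 to 4 reported in the abstract.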
Asymmetric Total Synthesis of Ieodomycin B
Directory of Open Access Journals (Sweden)
Shuangjie Lin
2017-01-01
Ieodomycin B, which shows in vitro antimicrobial activity, was isolated from a marine Bacillus species. A novel asymmetric total synthetic approach to ieodomycin B using commercially available geraniol was achieved. The approach involves the generation of 1,3-trans-dihydroxyl at C-3 and C-5 positions via a Crimmins-modified Evans aldol reaction and a chelation-controlled Mukaiyama aldol reaction of a p-methoxybenzyl-protected aldehyde, as well as the generation of a lactone ring in a deprotection–lactonization one-pot reaction.
Roxo, Sónia; de Almeida, José António; Matias, Filipa Vieira; Mata-Lima, Herlander; Barbosa, Sofia
2016-03-01
This paper proposes a multistep approach for creating a 3D stochastic model of total petroleum hydrocarbon (TPH) grade in potentially polluted soils of a deactivated oil storage site by using chemical analysis results as primary or hard data and classes of sensory perception variables as secondary or soft data. First, the statistical relationship between the sensory perception variables (e.g. colour, odour and oil-water reaction) and TPH grade is analysed, after which the sensory perception variable exhibiting the highest correlation is selected (oil-water reaction in this case study). The probabilities of cells belonging to classes of oil-water reaction are then estimated for the entire soil volume using indicator kriging. Next, local histograms of TPH grade for each grid cell are computed, combining the probabilities of belonging to a specific sensory perception indicator class and conditional to the simulated values of TPH grade. Finally, simulated images of TPH grade are generated by using the P-field simulation algorithm, utilising the local histograms of TPH grade for each grid cell. The set of simulated TPH values allows several calculations to be performed, such as average values, local uncertainties and the probability of the TPH grade of the soil exceeding a specific threshold value.
Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces
International Nuclear Information System (INIS)
Vourdas, A.
2014-01-01
The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H₁, H₂), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H₁), P(H₂) onto the subspaces H₁, H₂. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
Probability for human intake of an atom randomly released into ground, rivers, oceans and air
Energy Technology Data Exchange (ETDEWEB)
Cohen, B L
1984-08-01
Numerical estimates are developed for the probability of an atom randomly released in the top ground layers, in a river, or in the oceans to be ingested orally by a human, and for an atom emitted from an industrial source to be inhaled by a human. Estimates are obtained both for probability per year and for total eventual probability. Results vary considerably for different elements, but typical values for total probabilities are: ground, 3 × 10⁻³; oceans, 3 × 10⁻⁴; rivers, 1.7 × 10⁻⁴; and air, 5 × 10⁻⁶. Probabilities per year are typically 1 × 10⁻⁷ for releases into the ground and 5 × 10⁻⁸ for releases into the oceans. These results indicate that for material with very long-lasting toxicity, it is important to include the pathways from the ground and from the oceans.
Total parenteral nutrition - infants
Total parenteral nutrition (TPN) is a method of feeding that bypasses ...
Total parenteral nutrition
Total parenteral nutrition (TPN) is a method of feeding that bypasses ...
Technique of total thyroidectomy
International Nuclear Information System (INIS)
Rao, R.S.
1999-01-01
It is essential to define the various surgical procedures that are carried out for carcinoma of the thyroid gland. They are subtotal lobectomy, total thyroidectomy and near-total thyroidectomy
Total iron binding capacity
Total iron binding capacity (TIBC) is a blood test to ...
Tropical Cyclone Wind Probability Forecasting (WINDP).
1981-04-01
The Probability Heuristics Model of Syllogistic Reasoning.
Chater, Nick; Oaksford, Mike
1999-01-01
Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…
Probability & Perception: The Representativeness Heuristic in Action
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
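The die-rolling activity described above is straightforward to mirror in code; the trial count and the two compared sequences are arbitrary choices. Every ordered pair of rolls has probability 1/36, so a "special-looking" sequence like (6, 6) is tallied about as often as an "ordinary" one like (5, 3):

```python
import random
from collections import Counter

def sequence_frequencies(trials=60_000, seed=3):
    """Roll a fair die twice per trial and tally each ordered sequence."""
    rng = random.Random(seed)
    return Counter(
        (rng.randint(1, 6), rng.randint(1, 6)) for _ in range(trials)
    )

counts = sequence_frequencies()
```

Comparing `counts[(6, 6)]` with `counts[(5, 3)]` makes the point against the representativeness heuristic: both hover around 60000 / 36.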
Critique of `Elements of Quantum Probability'
Gill, R.D.
1998-01-01
We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension
Independent Events in Elementary Probability Theory
Csenki, Attila
2011-01-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E₁,…
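The point that the pairwise multiplication rule alone does not give joint independence is usually made with Bernstein's example; here is a minimal exhaustive check of it (the three events are the textbook choices, assumed rather than quoted from the article):

```python
from fractions import Fraction
from itertools import product

# Bernstein's example: two fair coin tosses. A = first toss is heads,
# B = second toss is heads, C = the two tosses agree. Each pair of events
# satisfies the multiplication rule, but the triple does not.
omega = list(product("HT", repeat=2))
P = Fraction(1, 4)  # uniform probability on the 4 outcomes

def prob(event):
    return sum(P for w in omega if event(w))

A = lambda w: w[0] == "H"
B = lambda w: w[1] == "H"
C = lambda w: w[0] == w[1]

pairwise = all(
    prob(lambda w, x=X, y=Y: x(w) and y(w)) == prob(X) * prob(Y)
    for X, Y in [(A, B), (A, C), (B, C)]
)
triple = prob(lambda w: A(w) and B(w) and C(w)) == prob(A) * prob(B) * prob(C)
```

Here P(A ∩ B ∩ C) = 1/4 while P(A)P(B)P(C) = 1/8, so `pairwise` holds and `triple` fails: joint independence really does require the multiplication rule for every subset of events.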
Probable Unusual Transmission of Zika Virus
Centers for Disease Control (CDC) Podcasts
This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.
Error probabilities in default Bayesian hypothesis testing
Gu, Xin; Hoijtink, Herbert; Mulder, J.
2016-01-01
This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
Spatial Probability Cuing and Right Hemisphere Damage
Shaqiri, Albulena; Anderson, Britt
2012-01-01
In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…
Sampling, Probability Models and Statistical Reasoning -RE ...
Indian Academy of Sciences (India)
random sampling allows data to be modelled with the help of probability ... based on different trials to get an estimate of the experimental error. ... if e is indeed the true value of the proportion of defectives in the.
Virus isolation: Specimen type and probable transmission
Indian Academy of Sciences (India)
Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), and 1 from a 4-day-old child (transplacental transmission).
Estimating the Probability of Negative Events
Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike
2009-01-01
How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…
Concurrency meets probability: theory and practice (abstract)
Katoen, Joost P.
Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between
Confusion between Odds and Probability, a Pandemic?
Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer
2012-01-01
This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language embedded in the dissemination of information to the general…
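The distinction the manuscript targets reduces to two one-line conversions; exact fractions make the familiar "3 to 1" example unambiguous:

```python
from fractions import Fraction

def odds_from_probability(p):
    """Odds in favour of an event: p / (1 - p)."""
    return p / (1 - p)

def probability_from_odds(o):
    """Invert the map: p = o / (1 + o)."""
    return o / (1 + o)

# "3 to 1 odds in favour" means probability 3/4, not "three times as likely
# as not" in the probability sense that is commonly misreported
p = probability_from_odds(Fraction(3, 1))
o = odds_from_probability(Fraction(1, 2))  # even chances give odds of 1 to 1
```

The two functions are inverses, so round-tripping any probability recovers it exactly, which is a quick classroom check for the confusion the abstract describes.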
Probability in Action: The Red Traffic Light
Shanks, John A.
2007-01-01
Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
Probability & Statistics: Modular Learning Exercises. Teacher Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Probability & Statistics: Modular Learning Exercises. Student Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Conditional probability on MV-algebras
Czech Academy of Sciences Publication Activity Database
Kroupa, Tomáš
2005-01-01
Vol. 149, No. 2 (2005), pp. 369-381. ISSN 0165-0114. R&D Projects: GA AV ČR IAA2075302. Institutional research plan: CEZ:AV0Z10750506. Keywords: conditional probability; tribe; MV-algebra. Subject RIV: BA - General Mathematics. Impact factor: 1.039, year: 2005
Investigating Probability with the NBA Draft Lottery.
Quinn, Robert J.
1997-01-01
Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…