DEFF Research Database (Denmark)
Helles, Glennie; Fonseca, Rasmus
2009-01-01
…seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been done previously, none have, to our knowledge, presented comparable results for the probability distribution of dihedral angles. Results: In this paper we develop an artificial neural network that uses an input-window of amino acids to predict a dihedral angle probability distribution for the middle residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first order statistics. An accuracy comparable to that of secondary…
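The 30°×30° binning of (φ, ψ) space described above can be sketched in a few lines; the grid indexing convention here is our own illustration, not the authors' code:

```python
import math

def dihedral_bin(phi, psi, width=30.0):
    """Map a (phi, psi) pair in degrees to a bin index on a
    (360/width) x (360/width) grid covering the dihedral-angle torus.

    Angles are wrapped into [0, 360) so that e.g. -60 and 300
    land in the same bin. Returns (row, col) bin indices.
    """
    phi = phi % 360.0
    psi = psi % 360.0
    return (int(phi // width), int(psi // width))

print(dihedral_bin(-60.0, -45.0))  # -> (10, 10)
```

A predicted distribution is then a probability per bin, and the "most probable bin" is simply the argmax over the grid.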
Energy Technology Data Exchange (ETDEWEB)
Wang Jun; Liu Haiyan [University of Science and Technology of China, Hefei National Laboratory for Physical Sciences at the Microscale, and Key Laboratory of Structural Biology, School of Life Sciences (China)], E-mail: hyliu@ustc.edu.cn
2007-01-15
Chemical shifts contain substantial information about protein local conformations. We present a method to assign individual protein backbone dihedral angles into specific regions on the Ramachandran map based on the amino acid sequences and the chemical shifts of backbone atoms of tripeptide segments. The method uses a scoring function derived from the Bayesian probability for the central residue of a query tripeptide segment to have a particular conformation. The Ramachandran map is partitioned into representative regions at two levels of resolution. The lower resolution partitioning is equivalent to the conventional definitions of different secondary structure regions on the map. At the higher resolution level, the {alpha} and {beta} regions are further divided into subregions. Predictions are attempted at both levels of resolution. We compared our method with TALOS using the original TALOS database, and obtained comparable results. Although TALOS may produce the best results with currently available databases which are much enlarged, the Bayesian-probability-based approach can provide a quantitative measure for the reliability of predictions.
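The Bayesian scoring idea, choosing the Ramachandran region with the highest posterior given observed chemical shifts, can be caricatured with discrete tables; all probabilities below are invented for illustration, whereas the actual method scores tripeptide segments against a database:

```python
import math

# Toy priors and likelihoods over two coarse Ramachandran regions.
PRIOR = {"alpha": 0.45, "beta": 0.55}
LIKELIHOOD = {  # P(observed shift class | region), illustrative numbers only
    "alpha": {"upfield": 0.7, "downfield": 0.3},
    "beta": {"upfield": 0.2, "downfield": 0.8},
}

def region_score(shift_class):
    """Unnormalized log-posterior score per region; the best-scoring
    region is the predicted conformation for the central residue."""
    return {r: math.log(PRIOR[r]) + math.log(LIKELIHOOD[r][shift_class])
            for r in PRIOR}

scores = region_score("downfield")
print(max(scores, key=scores.get))  # -> beta
```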
Prediction of backbone dihedral angles and protein secondary structure using support vector machines
Directory of Open Access Journals (Sweden)
Hirst Jonathan D
2009-12-01
Full Text Available Abstract Background The prediction of the secondary structure of a protein is a critical step in the prediction of its tertiary structure and, potentially, its function. Moreover, the backbone dihedral angles, highly correlated with secondary structures, provide crucial information about the local three-dimensional structure. Results We predict independently both the secondary structure and the backbone dihedral angles and combine the results in a loop to enhance each prediction reciprocally. Support vector machines, a state-of-the-art supervised classification technique, achieve secondary structure predictive accuracy of 80% on a non-redundant set of 513 proteins, significantly higher than other methods on the same dataset. The dihedral angle space is divided into a number of regions using two unsupervised clustering techniques in order to predict the region in which a new residue belongs. The performance of our method is comparable to, and in some cases more accurate than, other multi-class dihedral prediction methods. Conclusions We have created an accurate predictor of backbone dihedral angles and secondary structure. Our method, called DISSPred, is available online at http://comp.chem.nottingham.ac.uk/disspred/.
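The final assignment step, mapping a residue's (φ, ψ) pair to the nearest of the learned cluster centres under a circular metric, might look like the sketch below; the centres are invented for illustration and are not DISSPred's:

```python
# Illustrative cluster centres in (phi, psi) degrees; not DISSPred's values.
CENTRES = {"A": (-63.0, -43.0), "B": (-120.0, 135.0), "L": (57.0, 40.0)}

def ang_diff(a, b):
    """Shortest signed angular difference in degrees, in (-180, 180]."""
    return (a - b + 180.0) % 360.0 - 180.0

def nearest_region(phi, psi):
    """Assign (phi, psi) to the closest centre using a circular metric."""
    def dist2(c):
        return ang_diff(phi, c[0]) ** 2 + ang_diff(psi, c[1]) ** 2
    return min(CENTRES, key=lambda k: dist2(CENTRES[k]))

print(nearest_region(-60.0, -45.0))   # -> A
print(nearest_region(-140.0, 150.0))  # -> B
```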
Pelloni, S; Provasi, P F; Pagola, G I; Ferraro, M B; Lazzeretti, P
2017-12-07
The trace of tensors that account for the chiroptical response of the H2O2 molecule is a function of the HO-OH dihedral angle. It vanishes at 0° and 180°, due to the presence of molecular symmetry planes, but also for values in the range 90-100° of this angle, in which the molecule is unquestionably chiral. Such an atypical effect is caused by counterbalancing contributions of diagonal tensor components with nearly maximal magnitude but opposite sign, determined by electron flow in open or closed helical paths, and associated with induced electric and magnetic dipole moments and anapole moments. For values of the dihedral angle outside the 90-100° interval, the helical paths become smaller in size, thus reducing the amount of cancellation among diagonal components. Shrinking of the helical paths determines the appearance of extremum values of the tensor traces at approximately 50° and 140° dihedral angles.
Kountouris, Petros; Hirst, Jonathan D
2010-07-31
Beta-turns are secondary structure elements usually classified as coil. Their prediction is important, because of their role in protein folding and their frequent occurrence in protein chains. We have developed a novel method that predicts beta-turns and their types using information from multiple sequence alignments, predicted secondary structures and, for the first time, predicted dihedral angles. Our method uses support vector machines, a supervised classification technique, and is trained and tested on three established datasets of 426, 547 and 823 protein chains. We achieve a Matthews correlation coefficient of up to 0.49, when predicting the location of beta-turns, the highest reported value to date. Moreover, the additional dihedral information improves the prediction of beta-turn types I, II, IV, VIII and "non-specific", achieving correlation coefficients up to 0.39, 0.33, 0.27, 0.14 and 0.38, respectively. Our results are more accurate than other methods. We have created an accurate predictor of beta-turns and their types. Our method, called DEBT, is available online at http://comp.chem.nottingham.ac.uk/debt/.
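The Matthews correlation coefficient quoted above is computed from the four confusion-matrix counts; a minimal sketch (the example counts are made up):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts.
    Returns 0.0 when any marginal is empty (the usual convention)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Hypothetical counts for a beta-turn predictor on one test set:
print(round(mcc(90, 800, 60, 50), 2))  # -> 0.56
```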
Dharmapurikar, Satej S.; Chithiravel, Sundaresan; Mane, Manoj V.; Deshmukh, Gunvant; Krishnamoorthy, Kothandam
2018-03-01
Diketopyrrolopyrrole (DPP) and i-Indigo (i-Ind) are two monomers that are widely explored as active materials in organic field effect transistors and solar cells. These two molecules show impressive charge carrier mobility due to better packing facilitated by quadrupoles. We hypothesized that copolymers of these monomers would also exhibit high charge carrier mobility. However, we envisioned that the dihedral angle at the connecting point between the monomers would play a crucial role in packing as well as charge transport. To understand the impact of the dihedral angle on charge transport, we synthesized three copolymers, wherein the DPP was sandwiched between benzenes, thiophenes and furans. The copolymer of i-Indigo and furan-comprising DPP showed a band gap of 1.4 eV with a very high dihedral angle of 179°. This polymer was found to pack better, with a coherence length of 112 Å. Its hole carrier mobility was the highest among the synthesized polymers, i.e. 0.01 cm²/V s. The copolymer comprising benzene transported neither holes nor electrons. The dihedral angle at the connecting point between i-Indigo and benzene-comprising DPP was 143°, which hampered the packing and consequently the charge transport properties.
Effects of Dihedral Angle on Pool Boiling Heat Transfer from Two Tubes in Vertical Alignment
Energy Technology Data Exchange (ETDEWEB)
Kang, Myeong-Gie [Andong National University, Andong (Korea, Republic of)
2014-10-15
…to study the effects of the dihedral angle (α) and the heat flux of the lower tube on heat transfer enhancement of the upper tube, arranged one above the other in the same vertical plane. The combined effects of the dihedral angle and the heat flux of the lower tube on heat transfer enhancement of the upper tube were investigated. The increase in α eventually increases h_r. When α changes from 2° to 18°, the value of h_r increases by about 20.3% for q″_L = 10 kW/m². The enhancement is clearly observed at the heat fluxes where the convective effect is dominant.
International Nuclear Information System (INIS)
Takahashi, Hideo; Shimada, Ichio
2007-01-01
Novel cross-correlated spin relaxation (CCR) experiments are described, which measure pairwise CCR rates for obtaining peptide dihedral angles Φ. The experiments utilize intra-HNCA-type coherence transfer to refocus two-bond J(NCα) coupling evolution and generate the N(i)-Cα(i) or C′(i-1)-Cα(i) multiple-quantum coherences which are required for measuring the desired CCR rates. The contribution from other coherences is also discussed and an appropriate setting of the evolution delays is presented. These CCR experiments were applied to 15N- and 13C-labeled human ubiquitin. The relevant CCR rates showed a high degree of correlation with the Φ angles observed in the X-ray structure. By utilizing these CCR experiments in combination with those previously established for obtaining the dihedral angle Ψ, we can determine high-resolution structures of peptides that bind weakly to large target molecules.
Disequilibrium dihedral angles in layered intrusions: the microstructural record of fractionation
Holness, Marian; Namur, Olivier; Cawthorn, Grant
2013-04-01
The dihedral angle formed at junctions between two plagioclase grains and a grain of augite is only rarely in textural equilibrium in gabbros from km-scale crustal layered intrusions. The median of a population of these disequilibrium angles, Θcpp, varies systematically within individual layered intrusions, remaining constant over large stretches of stratigraphy with significant increases or decreases associated with the addition or reduction, respectively, of the number of phases on the liquidus of the bulk magma. The step-wise changes in Θcpp are present in the Upper Zone of the Bushveld Complex, the Megacyclic Unit I of the Sept Iles Intrusion, and the Layered Series of the Skaergaard Intrusion. The plagioclase-bearing cumulates of Rum have a bimodal distribution of Θcpp, dependent on whether the cumulus assemblage includes clinopyroxene. The presence of the step-wise changes is independent of the order of arrival of cumulus phases and of the composition of either the cumulus phases or the interstitial liquid inferred to be present in the crystal mush. Step-wise changes in the rate of change of enthalpy with temperature (ΔH) of the cooling and crystallizing magma correspond to the observed variation of Θcpp, with increases of both ΔH and Θcpp associated with the addition of another liquidus phase, and decreases of both associated with the removal of a liquidus phase. The replacement of one phase by another (e.g. olivine ⇔ orthopyroxene) has little effect on ΔH and no discernible effect on Θcpp. An increase of ΔH is manifest as an increase in the fraction of the total enthalpy budget that is the latent heat of crystallization (the fractional latent heat). It also results in an increase in the amount crystallized in each incremental temperature drop (the crystal productivity). An increased fractional latent heat and crystal productivity result in an increased rate of plagioclase growth compared to that of augite during the final stages of solidification.
Lyons, James; Dehzangi, Abdollah; Heffernan, Rhys; Sharma, Alok; Paliwal, Kuldip; Sattar, Abdul; Zhou, Yaoqi; Yang, Yuedong
2014-10-30
Because of the nearly constant distance between two neighbouring Cα atoms, the local backbone structure of proteins can be represented accurately by the angle between C(αi-1)-C(αi)-C(αi+1) (θ) and a dihedral angle rotated about the C(αi)-C(αi+1) bond (τ). The θ and τ angles, as representatives of the structural properties of three to four amino-acid residues, offer a description of backbone conformations that is complementary to the φ and ψ angles (single residue) and secondary structures (>3 residues). Here, we report the first machine-learning technique for sequence-based prediction of θ and τ angles. Predicted angles based on an independent test have a mean absolute error of 9° for θ and 34° for τ, with a distribution on the θ-τ plane close to that of native values. The average root-mean-square distance of 10-residue fragment structures constructed from predicted θ and τ angles is only 1.9 Å from their corresponding native structures. Predicted θ and τ angles are expected to be complementary to predicted φ and ψ angles and secondary structures for use in model validation and template-based as well as template-free structure prediction. The deep neural network learning technique is available as an on-line server called Structural Property prediction with Integrated DEep neuRal network (SPIDER) at http://sparks-lab.org. Copyright © 2014 Wiley Periodicals, Inc.
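The θ and τ angles defined above can be computed directly from consecutive Cα coordinates; this is a generic stdlib sketch of the geometry, not SPIDER's code, and the sign convention for τ may differ from the authors':

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def norm(a): return math.sqrt(dot(a, a))

def theta_angle(ca0, ca1, ca2):
    """Angle Ca(i-1)-Ca(i)-Ca(i+1) in degrees."""
    u, v = sub(ca0, ca1), sub(ca2, ca1)
    return math.degrees(math.acos(dot(u, v) / (norm(u) * norm(v))))

def tau_angle(ca0, ca1, ca2, ca3):
    """Dihedral about the Ca(i)-Ca(i+1) axis in degrees, in (-180, 180]."""
    b1, b2, b3 = sub(ca1, ca0), sub(ca2, ca1), sub(ca3, ca2)
    n1, n2 = cross(b1, b2), cross(b2, b3)
    m1 = cross(n1, tuple(x / norm(b2) for x in b2))
    return math.degrees(math.atan2(dot(m1, n2), dot(n1, n2)))
```

The same four-point dihedral routine applies unchanged to φ and ψ when fed the appropriate backbone atoms instead of Cα positions.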
The power of hard-sphere models: explaining side-chain dihedral angle distributions of Thr and Val.
Zhou, Alice Qinhua; O'Hern, Corey S; Regan, Lynne
2012-05-16
The energy functions used to predict protein structures typically include both molecular-mechanics and knowledge-based terms. In contrast, our approach is to develop robust physics- and geometry-based methods. Here, we investigate to what extent simple hard-sphere models can be used to predict side-chain conformations. The distributions of the side-chain dihedral angle χ(1) of Val and Thr in proteins of known structure show distinctive features: Val side chains predominantly adopt χ(1) = 180°, whereas Thr side chains typically adopt χ(1) = 60° and 300° (i.e., χ(1) = ±60°, the g- and g+ configurations). Several hypotheses have been proposed to explain these differences, including interresidue steric clashes and hydrogen-bonding interactions. In contrast, we show that the observed side-chain dihedral angle distributions for both Val and Thr can be explained using only local steric interactions in a dipeptide mimetic. Our results emphasize the power of simple physical approaches and their importance for future advances in protein engineering and design. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
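The hard-sphere picture can be illustrated with a toy overlap test: two atoms clash when their separation is below the sum of their van der Waals radii. The radii below are illustrative textbook values, not the paper's calibrated parameter set:

```python
import math

# Illustrative van der Waals radii in angstroms; not the authors' values.
VDW = {"C": 1.7, "O": 1.52, "H": 1.2, "N": 1.55}

def clashes(elem1, xyz1, elem2, xyz2, tolerance=0.0):
    """Hard-sphere overlap test: True if the pair is closer than the sum
    of radii minus an optional tolerance (all distances in angstroms)."""
    d = math.dist(xyz1, xyz2)
    return d < VDW[elem1] + VDW[elem2] - tolerance

print(clashes("C", (0.0, 0.0, 0.0), "C", (0.0, 0.0, 3.5)))  # -> False
print(clashes("C", (0.0, 0.0, 0.0), "O", (0.0, 0.0, 2.0)))  # -> True
```

Scanning χ(1) and rejecting rotamers with any clashing atom pair is the essence of the dipeptide-mimetic argument.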
Wako, Hiroshi; Endo, Shigeru
2013-06-01
We have developed a computer program, named PDBETA, that performs normal mode analysis (NMA) based on an elastic network model that uses dihedral angles as independent variables. Taking advantage of the relatively small number of degrees of freedom required to describe a molecular structure in dihedral angle space and a simple potential-energy function independent of atom types, we aimed to develop a program applicable to a full-atom system of any molecule in the Protein Data Bank (PDB). The algorithm for NMA used in PDBETA is the same as the computer program FEDER/2, developed previously. Therefore, the main challenge in developing PDBETA was to find a method that can automatically convert PDB data into molecular structure information in dihedral angle space. Here, we illustrate the performance of PDBETA with a protein-DNA complex, a protein-tRNA complex, and some non-protein small molecules, and show that the atomic fluctuations calculated by PDBETA reproduce the temperature factor data of these molecules in the PDB. A comparison was also made with elastic-network-model based NMA in a Cartesian-coordinate system. Copyright © 2013 Elsevier Ltd. All rights reserved.
Takeuchi, Koh; Frueh, Dominique P.; Sun, Zhen-Yu J.; Hiller, Sebastian; Wagner, Gerhard
2010-01-01
We present a 13C direct detection CACA-TOCSY experiment for samples with alternate 13C–12C labeling. It provides inter-residue correlations between 13Cα resonances of residue i and adjacent Cαs at positions i − 1 and i + 1. Furthermore, longer mixing times yield correlations to Cα nuclei separated by more than one residue. The experiment also provides Cα-to-sidechain correlations, some amino acid type identifications and estimates for ψ dihedral angles. The power of the experiment derives from the alternate 13C–12C labeling with [1,3-13C] glycerol or [2-13C] glycerol, which allows utilizing the small scalar 3JCC couplings that are masked by strong 1JCC couplings in uniformly 13C labeled samples. PMID:20383561
Taguchi, Alexander T; Mattis, Aidas J; O'Malley, Patrick J; Dikanov, Sergei A; Wraight, Colin A
2013-10-15
Only quinones with a 2-methoxy group can act simultaneously as the primary (QA) and secondary (QB) electron acceptors in photosynthetic reaction centers from Rhodobacter sphaeroides. (13)C hyperfine sublevel correlation measurements of the 2-methoxy in the semiquinone states, SQA and SQB, were compared with quantum mechanics calculations of the (13)C couplings as a function of the dihedral angle. X-ray structures support dihedral angle assignments corresponding to a redox potential gap (ΔEm) between QA and QB of ~180 mV. This is consistent with the failure of a ubiquinone analogue lacking the 2-methoxy to function as QB in mutant reaction centers with a ΔEm of ≈160-195 mV.
Energy Technology Data Exchange (ETDEWEB)
Blum, Alexander Simon
2009-06-10
This thesis deals with the possibility of describing the flavor sector of the Standard Model of Particle Physics (with neutrino masses), that is the fermion masses and mixing matrices, with a discrete, non-abelian flavor symmetry. In particular, mass independent textures are considered, where one or several of the mixing angles are determined by group theory alone and are independent of the fermion masses. To this end a systematic analysis of a large class of discrete symmetries, the dihedral groups, is presented. Mass independent textures originating from such symmetries are described and it is shown that such structures arise naturally from the minimization of scalar potentials, where the scalars are gauge singlet flavons transforming non-trivially only under the flavor group. Two models are constructed from this input, one describing leptons, based on the group D4, the other describing quarks and employing the symmetry D14. In the latter model it is the quark mixing matrix element V_ud - basically the Cabibbo angle - which is at leading order predicted from group theory. Finally, discrete flavor groups are discussed as subgroups of a continuous gauge symmetry and it is shown that this implies that the original gauge symmetry is broken by fairly large representations. (orig.)
Accurate Analysis of Target Characteristic in Bistatic SAR Images: A Dihedral Corner Reflectors Case
Ao, Dongyang; Li, Yuanhao; Hu, Cheng; Tian, Weiming
2017-12-22
The dihedral corner reflectors are the basic geometric structure of many targets and are the main contributors to radar cross section (RCS) in synthetic aperture radar (SAR) images. In stealth technologies, the elaborate design of dihedral corners with different opening angles is a useful approach to reducing the high RCS generated by multiple reflections. As bistatic synthetic aperture sensors have flexible geometric configurations and are sensitive to dihedral corners with different opening angles, they are especially suited to stealth target detection. In this paper, the scattering characteristic of dihedral corner reflectors is accurately analyzed in bistatic synthetic aperture images. The variation of RCS with the changing opening angle is formulated, and a method to design a proper bistatic radar for maximizing the detection capability is provided. Both the theoretical analysis and the experiments show that bistatic SAR can detect dihedral corners under a certain bistatic angle, which is related to the geometry of the target structure.
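For orientation on the magnitudes involved, the peak monostatic RCS of an ideal right-angle dihedral corner is commonly approximated by the textbook formula σ = 8πa²b²/λ²; a quick sketch (this is not the paper's bistatic model, which treats arbitrary opening angles):

```python
import math

def dihedral_peak_rcs(a, b, wavelength):
    """Textbook peak monostatic RCS (m^2) of a right-angle dihedral
    corner reflector with plate dimensions a x b (meters)."""
    return 8.0 * math.pi * a**2 * b**2 / wavelength**2

# 0.3 m plates at X-band (lambda ~ 3 cm):
sigma = dihedral_peak_rcs(0.3, 0.3, 0.03)
print(round(10 * math.log10(sigma), 1), "dBsm")  # -> 23.5 dBsm
```

The strong λ⁻² dependence is one reason such corners dominate SAR imagery even when the plates are small.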
International Nuclear Information System (INIS)
Neskovic, N.; Ciric, D.; Perovic, B.
1982-01-01
The survival probability in small angle scattering of low energy alkali ions from alkali covered metal surfaces is considered. The model is based on the momentum approximation. The projectiles are K+ ions and the target is the (001)Ni+K surface. The incident energy is 100 eV and the incident angle 5°. The interaction potential of the projectile and the target consists of the Born-Mayer, the dipole and the image charge potentials. The transition probability function corresponds to the resonant electron transition to the 4s projectile energy level. (orig.)
A fast and accurate dihedral interpolation loop subdivision scheme
Shi, Zhuo; An, Yalei; Wang, Zhongshuai; Yu, Ke; Zhong, Si; Lan, Rushi; Luo, Xiaonan
2018-04-01
In this paper, we propose a fast and accurate dihedral interpolation Loop subdivision scheme for subdivision surfaces based on triangular meshes. In order to solve the problem of surface shrinkage, we keep the limit condition unchanged, which is important. Extraordinary vertices are handled using modified Butterfly rules. Subdivision schemes are computationally costly as the number of faces grows exponentially at higher levels of subdivision. To address this problem, our approach is to use local surface information to adaptively refine the model. This is achieved simply by changing the threshold value of the dihedral angle parameter, i.e., the angle between the normals of a triangular face and its adjacent faces. We then demonstrate the effectiveness of the proposed method for various 3D graphic triangular meshes, and extensive experimental results show that it can match or exceed the expected results at lower computational cost.
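The dihedral-angle test driving the adaptive refinement can be computed from face normals; a minimal sketch under our own conventions, not the authors' implementation:

```python
import math

def normal(tri):
    """Unnormalized normal of a triangle given as three (x, y, z) points."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    return (uy*vz - uz*vy, uz*vx - ux*vz, ux*vy - uy*vx)

def face_dihedral_deg(tri1, tri2):
    """Angle in degrees between the normals of two triangles;
    0 means coplanar (flat), larger values mean a sharper crease."""
    n1, n2 = normal(tri1), normal(tri2)
    d = sum(a * b for a, b in zip(n1, n2))
    m = math.sqrt(sum(a*a for a in n1)) * math.sqrt(sum(a*a for a in n2))
    return math.degrees(math.acos(max(-1.0, min(1.0, d / m))))

def needs_refinement(tri1, tri2, threshold_deg=10.0):
    """Refine only across creases sharper than the threshold."""
    return face_dihedral_deg(tri1, tri2) > threshold_deg
```

Raising the threshold leaves flat regions coarse, which is how the scheme avoids the exponential growth in face count.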
International Nuclear Information System (INIS)
Oliveira, P.M.C. de.
1976-12-01
A method for calculating the K atomic shell ionization probability by heavy-particle impact, in the semi-classical approximation, is presented. In this approximation, the projectile has a classical trajectory. The potential energy due to the projectile is taken as a perturbation of the Hamiltonian of the neutral atom. We use scaled Thomas-Fermi wave functions for the atomic electrons. The method is valid for elements of intermediate atomic number and particle energies of a few MeV. Probabilities are calculated for the case of Ag (Z = 47) and protons of 1 and 2 MeV. Results are given as a function of scattering angle, agree well with known experimental data, and improve on older calculations. (Author)
International Nuclear Information System (INIS)
Wang Lincong; Donald, Bruce Randall
2004-01-01
We have derived a quartic equation for computing the direction of an internuclear vector from residual dipolar couplings (RDCs) measured in two aligning media, and two simple trigonometric equations for computing the backbone (φ,ψ) angles from two backbone vectors in consecutive peptide planes. These equations make it possible to compute, exactly and in constant time, the backbone (φ,ψ) angles for a residue from RDCs in two media on any single backbone vector type. Building upon these exact solutions we have designed a novel algorithm for determining a protein backbone substructure consisting of α-helices and β-sheets. Our algorithm employs a systematic search technique to refine the conformation of both α-helices and β-sheets and to determine their orientations using exclusively the angular restraints from RDCs. The algorithm computes the backbone substructure employing very sparse distance restraints between pairs of α-helices and β-sheets refined by the systematic search. The algorithm has been demonstrated on the protein human ubiquitin using only backbone NH RDCs, plus twelve hydrogen bonds and four NOE distance restraints. Further, our results show that both the global orientations and the conformations of α-helices and β-strands can be determined with high accuracy using only two RDCs per residue. The algorithm requires, as its input, backbone resonance assignments, the identification of α-helices and β-sheets, as well as sparse NOE distance and hydrogen bond restraints. Abbreviations: NMR - nuclear magnetic resonance; RDC - residual dipolar coupling; NOE - nuclear Overhauser effect; SVD - singular value decomposition; DFS - depth-first search; RMSD - root mean square deviation; POF - principal order frame; PDB - protein data bank; SA - simulated annealing; MD - molecular dynamics
Shiryaev, A N
1996-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.
Directory of Open Access Journals (Sweden)
Xiao Liu
2016-01-01
Full Text Available Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that accounts for random disturbances, detection distance constraints, and target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law performs satisfactorily and meets both the midcourse terminal angular constraint and the LOS angle rate requirement.
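The proportional-navigation baseline against which the biased law is compared commands a lateral acceleration proportional to the LOS rotation rate, a = N·Vc·λ̇; a minimal 2D sketch (the bias and angular-constraint terms of BPN are omitted here):

```python
def pn_accel(N, closing_speed, los_rate):
    """Classic proportional navigation: commanded lateral acceleration
    a = N * Vc * lambda_dot (m/s^2). The biased variant discussed in the
    text adds a term shaping the terminal LOS angle; omitted here."""
    return N * closing_speed * los_rate

def los_rate(rel_pos, rel_vel):
    """2D line-of-sight rotation rate (rad/s) from relative position
    and velocity: lambda_dot = (r x v) / |r|^2."""
    rx, ry = rel_pos
    vx, vy = rel_vel
    return (rx * vy - ry * vx) / (rx * rx + ry * ry)

# Target dead ahead at 1 km, drifting sideways at 20 m/s:
lam_dot = los_rate((1000.0, 0.0), (-300.0, 20.0))
print(pn_accel(3.0, 300.0, lam_dot))  # -> 18.0
```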
Automorphic Lie algebras with dihedral symmetry
International Nuclear Information System (INIS)
Knibbeler, V; Lombardo, S; A Sanders, J
2014-01-01
The concept of automorphic Lie algebras arises in the context of reduction groups introduced in the early 1980s in the field of integrable systems. Automorphic Lie algebras are obtained by imposing a discrete group symmetry on a current algebra of Krichever-Novikov type. Past work shows remarkable uniformity between algebras associated to different reduction groups. For example, if the base Lie algebra is sl(2, C) and the poles of the automorphic Lie algebra are restricted to an exceptional orbit of the symmetry group, changing the reduction group does not affect the Lie algebra structure. In this research we fix the reduction group to be the dihedral group and vary the orbit of poles as well as the group action on the base Lie algebra. We find a uniform description of automorphic Lie algebras with dihedral symmetry, valid for poles at exceptional and generic orbits. (paper)
Teaching Molecular Symmetry of Dihedral Point Groups by Drawing Useful 2D Projections
Chen, Lan; Sun, Hongwei; Lai, Chengming
2015-01-01
There are two main difficulties in studying molecular symmetry of dihedral point groups. One is locating the C2 axes perpendicular to the Cn axis, while the other is finding the σd planes which pass through the Cn axis and bisect the angles formed by adjacent C2 axes. In this paper, a…
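The two families of operations described above (the Cn rotations and the C2/σd operations) can be listed explicitly. A minimal sketch, assuming the standard 2D matrix representation viewed down the principal Cn axis, where both the perpendicular C2 axes and the σd mirrors appear as reflections; the closure check is what makes locating them systematically possible:

```python
import math

def dihedral_ops(n):
    """The 2n operations of the dihedral point group Dn as 2x2 matrices
    in the plane perpendicular to the principal Cn axis: n rotations by
    2*pi*k/n, and n two-fold operations (C2 axes / sigma_d mirrors seen in
    projection as reflections across lines at angle pi*k/n)."""
    ops = []
    for k in range(n):
        a = 2.0 * math.pi * k / n
        c, s = math.cos(a), math.sin(a)
        ops.append(((c, -s), (s, c)))    # rotation by 2*pi*k/n
        ops.append(((c, s), (s, -c)))    # reflection across line at angle pi*k/n
    return ops

def matmul(A, B):
    """2x2 matrix product, used to verify group closure."""
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2))
                 for i in range(2))

ops4 = dihedral_ops(4)    # D4 has 8 operations
```

Composing any two of the 2n operations lands back in the same set, which is the group property exploited when hunting for the missing axes and planes.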
Integral pentavalent Cayley graphs on abelian or dihedral groups
Indian Academy of Sciences (India)
MOHSEN GHASEMI
ghasemi@urmia.ac.ir. MS received 8 July 2015; revised 10 July 2016. Abstract. A graph is called integral if all of its eigenvalues are integers. In this paper, we give some results about integral pentavalent Cayley graphs on abelian or dihedral groups.
Registration of Images with N-fold Dihedral Blur
Czech Academy of Sciences Publication Activity Database
Pedone, M.; Flusser, Jan; Heikkila, J.
2015-01-01
Roč. 24, č. 3 (2015), s. 1036-1045 ISSN 1057-7149 R&D Projects: GA ČR GA13-29225S; GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords : Image registration * blurred images * N-fold rotational symmetry * dihedral symmetry * phase correlation Subject RIV: JD - Computer Applications, Robotics Impact factor: 3.735, year: 2015 http://library.utia.cas.cz/separaty/2015/ZOI/flusser-0441247.pdf
Energy Technology Data Exchange (ETDEWEB)
Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor
2015-01-01
SPADnet-I is a prototype fully digital silicon photon counter with high spatial and temporal resolution, based on standard CMOS imaging technology and developed by the SPADnet consortium. Being a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I on the angle of incidence, wavelength and polarization of the incident light was not known. Our targeted application area for this sensor is next-generation PET detector modules, where it will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of the PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm broad wavelength interval around the characteristic emission peak (λ = 420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of the PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced and begins only from 50°.
Design and Polarization Characteristics Analysis of Dihedral Based on Salisbury Screen
Directory of Open Access Journals (Sweden)
Zhang Ran
2016-12-01
Full Text Available Salisbury screens have a number of unique electromagnetic scattering characteristics. When appropriately designed, a Salisbury screen can transform the radar signature of a target. Based on the electromagnetic scattering characteristics of the Salisbury screen, we designed a novel dihedral corner, and theoretically analyzed and simulated its electromagnetic scattering characteristics in this study. The results reveal the monostatic radar cross section curves of the 90° and 60° Salisbury screen dihedrals and metal dihedrals, respectively. Taking an orthogonal dihedral corner as an example, we obtained the polarization scattering matrices for different incident angles. In addition, we investigated the influence of illumination frequency, target attitude, and other key factors on the polarization characteristics of the Salisbury screen dihedral corner. The theoretical and simulation analysis results show that, compared with the conventional metal dihedral corner, the Salisbury screen dihedral corner significantly changes the scattering characteristics and has potential applications in electronic warfare.
Freeman-Durden Decomposition with Oriented Dihedral Scattering
Directory of Open Access Journals (Sweden)
Yan Jian
2014-10-01
Full Text Available In this paper, when the azimuth direction of a polarimetric Synthetic Aperture Radar (SAR) differs from the planting direction of crops, the double bounce of the incident electromagnetic waves from the terrain surface to the growing crops is investigated and compared with the normal double bounce. An oriented dihedral scattering model is developed to explain the investigated double bounce and is introduced into the Freeman-Durden decomposition. The decomposition algorithm corresponding to the improved model is then proposed. Airborne polarimetric SAR data for agricultural land covering two flight tracks are chosen to validate the algorithm; the decomposition results show that, for agricultural vegetated land, the improved Freeman-Durden decomposition has the advantage of increasing the decomposition coherency among the polarimetric SAR data along the different flight tracks.
DEFF Research Database (Denmark)
Thulstrup, Peter Waaben; Hoffmann, Søren Vrønning; Hansen, Bjarke Knud Vilster
2011-01-01
A new analysis of the optical properties of the molecular rotor 1,4-diphenyl-1,3-butadiyne (diphenyl-diacetylene, DPDA) is presented, taking account of the conformational dynamics. The absorption spectra are interpreted in terms of simultaneous contributions from planar as well as non-planar rotamers.
Jindal, Shveta; Dada, Tanuj; Sreenivas, V; Gupta, Viney; Sihota, Ramanjit; Panda, Anita
2010-01-01
Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfields regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted k) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119-0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific (borderline results included as test positives) criteria. The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a higher negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was poor agreement between the overall MRA and GPS classifications. The GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. Disc size should be taken into consideration when interpreting HRT results, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs. PMID:20952832
Strange, P.
2012-01-01
In this paper we demonstrate a surprising aspect of quantum mechanics that is accessible to an undergraduate student. We discuss probability backflow for an electron in a constant magnetic field. It is shown that even for a wavepacket composed entirely of states with negative angular momentum the effective angular momentum can take on positive…
International Nuclear Information System (INIS)
Wang, L.H.; Guo, Y.; Tian, C.F.; Song, X.P.; Ding, B.J.
2010-01-01
Using first-principles density functional theory and nonequilibrium Green's function formalism, we investigate the effect of torsion angle on the rectifying characteristics of 4'-thiolate-biphenyl-4-dithiocarboxylate sandwiched between two Au(111) electrodes. The results show that the torsion angle has an evident influence on rectifying performance of such devices. By increasing the dihedral angle between two phenyl rings, namely changing the magnitude of the intermolecular coupling effect, a different rectifying behavior can be observed in these systems. Our findings highlight that the rectifying characteristics are intimately related to dihedral angles and can provide fundamental guidelines for the design of functional molecular devices.
DEFF Research Database (Denmark)
Nielsen, Bjørn Gilbert; Jensen, Morten Østergaard; Bohr, Henrik
2003-01-01
The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse.
Czech Academy of Sciences Publication Activity Database
Tsuji, H.; Fogarty, H. A.; Ehara, M.; Fukuda, R.; Casher, D. L.; Tamao, K.; Nakatsuji, H.; Michl, Josef
2014-01-01
Roč. 20, č. 30 (2014), s. 9431-9441 ISSN 0947-6539 Institutional support: RVO:61388963 Keywords : conformational effects * electronic spectra * SAC-CI calculations * silicon * UV/Vis spectroscopy Subject RIV: CC - Organic Chemistry Impact factor: 5.731, year: 2014
Modeling Bottom-Up Visual Attention Using Dihedral Group D4 §
Directory of Open Access Journals (Sweden)
Puneet Sharma
2016-08-01
Full Text Available In this paper, first, we briefly describe the dihedral group D4 that serves as the basis for calculating saliency in our proposed model. Second, our saliency model makes two major changes to a recent state-of-the-art model known as group-based asymmetry. First, based on the properties of the dihedral group D4, we simplify the asymmetry calculations associated with the measurement of saliency. This results in an algorithm that reduces the number of calculations by at least half, making it the fastest among the six best algorithms used in this research article. Second, in order to maximize the information across different chromatic and multi-resolution features, the color image space is de-correlated. We evaluate our algorithm against 10 state-of-the-art saliency models. Our results show that by using optimal parameters for a given dataset, our proposed model can outperform the best saliency algorithm in the literature. However, as the differences among the (few) best saliency models are small, we would like to suggest that our proposed model is among the best and the fastest among the best. Finally, as a part of future work, we suggest that our proposed approach to saliency can be extended to include three-dimensional image data.
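The group-based asymmetry idea rests on the 8 transformations of the dihedral group D4: four 90° rotations, each with and without a flip. A minimal stand-alone sketch (not the authors' saliency code) that generates the D4 orbit of an image patch stored as a list of rows:

```python
def d4_orbit(img):
    """Return the 8 images of a 2D array (list of rows) under the dihedral
    group D4: the four rotations by 90 degrees, each with and without a
    horizontal flip."""
    def rot90(m):
        # transpose of the vertically reversed array = 90-degree rotation
        return [list(row) for row in zip(*m[::-1])]
    def hflip(m):
        return [row[::-1] for row in m]
    orbit, cur = [], [list(row) for row in img]
    for _ in range(4):
        orbit.append([row[:] for row in cur])
        orbit.append(hflip(cur))
        cur = rot90(cur)
    return orbit

orbit = d4_orbit([[1, 2], [3, 4]])   # asymmetric patch: all 8 images differ
```

Asymmetry measures compare a patch against its orbit; a perfectly symmetric patch has a collapsed orbit, while a fully asymmetric one, as in the example, yields 8 distinct images.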
Directory of Open Access Journals (Sweden)
Dakun Zhang
2013-01-01
Full Text Available The necessity of a classification study of the common formulae for the cycle decomposition expressions of the dihedral group is illustrated. Considering the reflection and rotation transformations, six common formulae for the cycle decomposition expressions of the group are derived. A generation algorithm for the cycle decomposition expressions of the group is designed, based on the method of permutation conversion and the classification formulae. Algorithm analysis and experimental results show that the generation algorithm based on the classification formulae outperforms the general algorithm based on permutation conversion. This is of great significance for solving the enumeration of necklace combinatorial schemes, especially the structural problems of such schemes, using group theory and computers.
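The objects being classified can be generated directly. The sketch below is an illustration rather than the paper's algorithm: it builds the 2n permutations of the dihedral group acting on the vertices of an n-gon and computes their cycle decompositions, the combination that Burnside-style necklace counting needs:

```python
def cycle_decomposition(perm):
    """Cycle decomposition of a permutation given as a list p that maps
    position i to p[i]."""
    seen, cycles = [False] * len(perm), []
    for i in range(len(perm)):
        if not seen[i]:
            cyc, j = [], i
            while not seen[j]:
                seen[j] = True
                cyc.append(j)
                j = perm[j]
            cycles.append(tuple(cyc))
    return cycles

def dihedral_permutations(n):
    """The 2n elements of the dihedral group acting on the vertices
    0..n-1 of a regular n-gon: n rotations i -> (i+k) mod n and
    n reflections i -> (k-i) mod n."""
    rotations = [[(i + k) % n for i in range(n)] for k in range(n)]
    reflections = [[(k - i) % n for i in range(n)] for k in range(n)]
    return rotations + reflections
```

Averaging 2^(number of cycles) over the 12 elements for n = 6 (Burnside's lemma) gives 13, the number of two-color necklaces distinct up to rotation and reflection.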
Eisenberg, Azaria Solomon; Juszczak, Laura J
2013-07-05
Molecular dynamics (MD), coupled with fluorescence data for charged dipeptides of tryptophanyl glutamic acid (Trp-Glu), reveal a detailed picture of how specific conformation affects fluorescence. Fluorescence emission spectra and time-resolved emission measurements have been collected for all four charged species. MD simulations 20 to 30 ns in length have also been carried out for the Trp-Glu species, as simulation provides aqueous phase conformational data that can be correlated with the fluorescence data. The calculations show that each dipeptide species is characterized by a similar set of six discrete (Chi 1, Chi 2) dihedral angle pairs. The preferred Chi 1 angles (60°, 180°, and 300°) play the significant role in positioning the terminal amine relative to the indole ring. A Chi 1 angle of 60° results in the arching of the backbone over the indole ring and no interaction of the ring with the terminal amine. Chi 1 values of 180° and 300° result in an extension of the backbone away from the indole ring and an NH3 cation-π interaction with indole. This interaction is believed to be responsible for charge transfer quenching. Two fluorescence lifetimes and their corresponding amplitudes correlate with the Chi 1 angle probability distribution for all four charged Trp-Glu dipeptides. Fluorescence emission band maxima are also consistent with the proposed pattern of terminal amine cation quenching of fluorescence. Copyright © 2013 Wiley Periodicals, Inc.
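The Chi 1 and Chi 2 values discussed here are ordinary dihedral angles computed from four consecutive atom positions. A self-contained sketch of that computation using the widely used "praxeolitic" formulation; the example coordinates are synthetic, not from the simulations:

```python
import math

def dihedral_angle(p0, p1, p2, p3):
    """Signed dihedral angle in degrees defined by four points (e.g. four
    consecutive atoms): the angle between the half-plane containing p0 and
    the half-plane containing p3, measured about the p1-p2 axis."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    b0, b1, b2 = sub(p0, p1), sub(p2, p1), sub(p3, p2)
    norm = math.sqrt(dot(b1, b1))
    b1 = tuple(x / norm for x in b1)                  # unit axis vector
    v = sub(b0, tuple(dot(b0, b1) * x for x in b1))   # b0 minus its axis component
    w = sub(b2, tuple(dot(b2, b1) * x for x in b1))   # b2 minus its axis component
    return math.degrees(math.atan2(dot(cross(b1, v), w), dot(v, w)))

# two perpendicular half-planes: the dihedral angle has magnitude 90 degrees
angle = dihedral_angle((0, 0, 1), (0, 0, 0), (1, 0, 0), (1, 1, 0))
```

Binning such angles across an MD trajectory yields exactly the kind of Chi 1/Chi 2 probability distribution the abstract correlates with the fluorescence lifetimes.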
Mohammad, Siti Afiqah; Ali, Nor Muhainiah Mohd; Sarmin, Nor Haniza; Idrus, Nor'ashiqin Mohd; Masri, Rohaidah
2014-06-01
A Bieberbach group is a torsion free crystallographic group, which is an extension of a free abelian group of finite rank by a finite point group, while homological functors of a group include nonabelian tensor square, exterior square and Schur Multiplier. In this paper, some homological functors of a Bieberbach group of dimension four with dihedral point group of order eight are computed.
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
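The finite-horizon ruin probabilities mentioned above are easy to approximate by simulation. A hedged sketch of the classical compound Poisson (Cramér-Lundberg) model with exponential claims; the parameter choices are illustrative, not taken from the book:

```python
import random

def ruin_probability(u, c, lam, mean_claim, horizon, n_paths=4000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability in the
    classical Cramér-Lundberg model.  The surplus at time t is
    R(t) = u + c*t - S(t), where S(t) is a compound Poisson sum of
    exponentially distributed claims; ruin occurs if R drops below 0."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)              # next claim arrival time
            if t > horizon:
                break                              # survived the horizon
            claims += rng.expovariate(1.0 / mean_claim)
            if u + c * t - claims < 0.0:
                ruined += 1                        # surplus went negative
                break
    return ruined / n_paths

# premium rate 1.5 gives a 50% safety loading over the expected claim flow
p_ruin = ruin_probability(u=0.0, c=1.5, lam=1.0, mean_claim=1.0, horizon=100.0)
```

With exponential claims the infinite-horizon answer is known in closed form, which makes this model a standard sanity check before moving to the heavy-tailed settings the book emphasizes.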
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
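For reference, the classical one-sample P-P plot coordinates that the generalized plots extend can be computed as follows; the normal model and the sample here are illustrative choices:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # normal CDF via the error function (no external dependencies)
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pp_points(sample, cdf):
    """Coordinates of a classical one-sample P-P plot: the empirical
    probability (i+1)/n of the i-th order statistic plotted against the
    model probability F(x_(i)).  Points near the diagonal indicate fit."""
    xs = sorted(sample)
    n = len(xs)
    return [((i + 1) / n, cdf(x)) for i, x in enumerate(xs)]

pts = pp_points([-1.2, -0.4, 0.1, 0.3, 0.9, 1.5], normal_cdf)
```

Both coordinates live in [0, 1], so the whole plot sits in the unit square, unlike a Q-Q plot, whose axes are on the data scale.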
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of the lack of a closed-form analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
Grinstead, Charles M; Snell, J Laurie
2011-01-01
This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V
1997-01-01
This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.
International Nuclear Information System (INIS)
Marklund, T.
1978-01-01
The most commonly used methods of assessing the scoliotic deviation measure angles that are not clearly defined in relation to the anatomy of the patient. In order to give an anatomic basis for such measurements it is proposed to define the scoliotic deviation as the deviation the vertebral column makes with the sagittal plane. Both the Cobb and the Ferguson angles may be based on this definition. The present methods of measurement are then attempts to measure these angles. If the plane of these angles is parallel to the film, the measurement will be correct. Errors in the measurements may be incurred by the projection. A hypothetical projection, called a 'rectified orthogonal projection', is presented, which correctly represents all scoliotic angles in accordance with these principles. It can be constructed in practice with the aid of a computer and by performing measurements on two projections of the vertebral column; a scoliotic curve can be represented independent of the kyphosis and lordosis. (Auth.)
Directory of Open Access Journals (Sweden)
Daniel Ting
2010-04-01
Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
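The crudest of the estimation methods contrasted above, the simple histogram, is easy to sketch. The code below bins (phi, psi) pairs into 30°×30° cells and normalizes to a probability distribution; it is only a stand-in for the hierarchical Dirichlet process estimation actually used, and the angle data are synthetic:

```python
def ramachandran_histogram(phi_psi, bin_deg=30):
    """Discrete Ramachandran probability distribution obtained by binning
    (phi, psi) pairs, given in degrees in [-180, 180), into square cells
    of bin_deg x bin_deg; returns an n_bins x n_bins grid summing to 1."""
    n_bins = 360 // bin_deg
    counts = [[0] * n_bins for _ in range(n_bins)]
    for phi, psi in phi_psi:
        i = int((phi + 180) // bin_deg) % n_bins   # phi bin index
        j = int((psi + 180) // bin_deg) % n_bins   # psi bin index
        counts[i][j] += 1
    total = len(phi_psi)
    return [[c / total for c in row] for row in counts]

# synthetic angles: three alpha-ish pairs and one left-handed outlier
dist = ramachandran_histogram([(-60, -45), (-65, -40), (-120, 130), (60, 45)])
```

Histograms like this need large counts per cell to be reliable, which is precisely the sparsity problem the Dirichlet-process priors in the paper are designed to overcome.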
Fauzi, Wan Nor Farhana Wan Mohd; Idrus, Nor'ashiqin Mohd; Masri, Rohaidah; Sarmin, Nor Haniza
2014-07-01
The nonabelian tensor product originated in homotopy theory as well as in algebraic K-theory. The nonabelian tensor square is a special case of the nonabelian tensor product where the product is defined if the two groups act on each other in a compatible way and their actions are taken to be conjugation. In this paper, the computation of the nonabelian tensor square of a Bieberbach group, which is a torsion free crystallographic group, of dimension five with dihedral point group of order eight is determined. Groups, Algorithms and Programming (GAP) software has been used to assist and verify the results.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
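Briggs' notion can be illustrated numerically. Assuming a normal regression model for a quantity that the evidence E says is nonnegative (a hypothetical setup, not an example from the paper), the leakage is simply the model mass placed on the impossible region:

```python
import math

def normal_cdf(x, mu, sigma):
    # normal CDF via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def probability_leakage(mu, sigma, lower=0.0):
    """Probability mass that a Normal(mu, sigma) model places below a
    bound the evidence declares impossible (e.g. negative values of a
    nonnegative quantity) - a hypothetical illustration of probability
    leakage, not code from the paper."""
    return normal_cdf(lower, mu, sigma)

# a model centered one standard deviation above zero still leaks ~15.9%
leak = probability_leakage(mu=1.0, sigma=1.0)
```

Pushing the fitted mean further from the impossible region shrinks the leakage, but for an unbounded-support model it never reaches zero, which is the calibration obstacle the abstract points at.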
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
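The best known of these 1/e problems is the matching (derangement) problem, which is easy to check by simulation; the trial count and seed below are arbitrary choices:

```python
import math
import random

def no_fixed_point_probability(n, trials=20000, seed=7):
    """Simulate the matching problem: the probability that a uniformly
    random permutation of n items leaves no item in its original place
    (a derangement) tends to 1/e as n grows."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        if all(perm[i] != i for i in range(n)):
            hits += 1
    return hits / trials

p_derangement = no_fixed_point_probability(10)
```

Already at n = 10 the simulated value sits within sampling error of 1/e ≈ 0.3679, because the alternating series for the derangement count converges very quickly.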
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Deep learning methods for protein torsion angle prediction.
Li, Haiou; Hou, Jie; Adhikari, Badri; Lyu, Qiang; Cheng, Jianlin
2017-09-18
Deep learning is one of the most powerful machine learning methods and has achieved state-of-the-art performance in many domains. Since deep learning was introduced to the field of bioinformatics in 2012, it has achieved success in a number of areas such as protein residue-residue contact prediction, secondary structure prediction, and fold recognition. In this work, we developed deep learning methods to improve the prediction of torsion (dihedral) angles of proteins. We designed four different deep learning architectures to predict protein torsion angles: a deep neural network (DNN), a deep restricted Boltzmann machine (DRBM), a deep recurrent neural network (DRNN) and a deep recurrent restricted Boltzmann machine (DReRBM), since protein torsion angle prediction is a sequence-related problem. In addition to existing protein features, two new features (predicted residue contact number and the error distribution of torsion angles extracted from sequence fragments) are used as input to each of the four deep learning architectures to predict the phi and psi angles of the protein backbone. The mean absolute error (MAE) of phi and psi angles predicted by DRNN, DReRBM, DRBM and DNN is about 20-21° and 29-30° on an independent dataset. The MAE of the phi angle is comparable to that of existing methods, while the MAE of the psi angle is 29°, 2° lower than that of existing methods. On the latest CASP12 targets, our methods also achieved performance better than or comparable to that of a state-of-the-art method. Our experiments demonstrate that deep learning is a valuable method for predicting protein torsion angles. The deep recurrent network architecture performs slightly better than the deep feed-forward architecture, and the predicted residue contact number and the error distribution of torsion angles extracted from sequence fragments are useful features for improving prediction accuracy.
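When reproducing MAE figures like the 20-21° and 29-30° quoted above, the error must be measured on the circle, since phi and psi wrap at ±180°. A small sketch of that metric, with synthetic angles rather than CASP data:

```python
def angular_mae(predicted, actual):
    """Mean absolute error between two angle lists in degrees, measured
    on the circle so that e.g. 170 and -170 differ by 20, not 340, as is
    appropriate for phi/psi angles that wrap at +/-180."""
    assert len(predicted) == len(actual)
    total = 0.0
    for a, b in zip(predicted, actual):
        d = abs(a - b) % 360.0
        total += min(d, 360.0 - d)     # shorter way around the circle
    return total / len(predicted)

err = angular_mae([170.0, -60.0], [-170.0, -65.0])   # (20 + 5) / 2 = 12.5
```

A naive linear MAE would score the first pair as 340° and wildly inflate the reported error, which is why torsion-angle benchmarks use the wrapped difference.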
International Nuclear Information System (INIS)
Finch, P.E.; Dancer, K.A.; Isaac, P.S.; Links, J.
2011-01-01
The representation theory of the Drinfeld doubles of dihedral groups is used to solve the Yang-Baxter equation. Use of the two-dimensional representations recovers the six-vertex model solution. Solutions in arbitrary dimensions, which are viewed as descendants of the six-vertex model case, are then obtained using tensor product graph methods which were originally formulated for quantum algebras. Connections with the Fateev-Zamolodchikov model are discussed.
Toward a generalized probability theory: conditional probabilities
International Nuclear Information System (INIS)
Cassinelli, G.
1979-01-01
The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
Glaucoma, Open-angle. In open-angle glaucoma, the fluid passes ... 2010 U.S. Age-Specific Prevalence Rates for Glaucoma by Age and Race/Ethnicity. The prevalence of ...
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
International Nuclear Information System (INIS)
Fraassen, B.C. van
1979-01-01
The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
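The simplest concrete CPGF is the log-sum-exp function of the multinomial logit model (i.i.d. Gumbel utility shocks): its gradient is the softmax, i.e. the vector of choice probabilities. A minimal sketch of that special case, not the authors' general construction:

```python
import numpy as np

def cpgf(u):
    """Logit CPGF: G(u) = log(sum_j exp(u_j)), the simplest ARUM case."""
    m = np.max(u)                      # stabilised log-sum-exp
    return m + np.log(np.sum(np.exp(u - m)))

def choice_probs(u):
    """Gradient of the CPGF = multinomial-logit choice probabilities."""
    e = np.exp(u - np.max(u))
    return e / e.sum()

u = np.array([1.0, 0.0, -0.5])
p = choice_probs(u)
print(p, p.sum())                      # probabilities sum to 1

# Finite-difference check that p is indeed the gradient of G:
h = 1e-6
grad = np.array([(cpgf(u + h * np.eye(3)[j]) - cpgf(u - h * np.eye(3)[j])) / (2 * h)
                 for j in range(3)])
print(np.max(np.abs(grad - p)))        # ~1e-10
```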
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
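A common back-of-the-envelope model for this kind of assessment treats impacts as a Poisson process, so that P(collision) = 1 − exp(−N), with N the expected number of encounters over the mission. The sketch below uses this generic model with entirely hypothetical flux and cross-section values; it is not the specific technique of the report.

```python
import math

def collision_probability(flux_per_m2_yr, cross_section_m2, years):
    """Poisson encounter model: expected hits N = flux * area * time,
    P(collision) = 1 - exp(-N)."""
    n_expected = flux_per_m2_yr * cross_section_m2 * years
    return 1.0 - math.exp(-n_expected)

# Hypothetical numbers: flux 1e-5 impacts/m^2/yr, 1000 m^2 station, 10-year mission.
p = collision_probability(1e-5, 1000.0, 10.0)
print(p)  # ~0.095
```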
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Small angle spectrometers: Summary
International Nuclear Information System (INIS)
Courant, E.; Foley, K.J.; Schlein, P.E.
1986-01-01
Aspects of experiments at small angles at the Superconducting Super Collider are considered. Topics summarized include a small angle spectrometer, a high contingency spectrometer, dipole and toroid spectrometers, and magnet choices
Federal Laboratory Consortium — Description: The FTA32 goniometer provides video-based contact angle and surface tension measurement. Contact angles are measured by fitting a mathematical expression...
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
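The frequency interpretation and the binomial distribution mentioned above can be illustrated in a few lines: compute an exact binomial probability, then watch the long-run relative frequency approach it. A sketch with arbitrary parameters (`math.comb` requires Python 3.8+):

```python
import math
import random

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

exact = binom_pmf(3, 10, 0.5)
print(exact)  # 0.1171875 (= 120/1024)

# Frequency interpretation: the long-run relative frequency approaches the probability.
random.seed(0)
trials = 200_000
hits = sum(1 for _ in range(trials)
           if sum(random.random() < 0.5 for _ in range(10)) == 3)
print(hits / trials)  # close to the exact value
```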
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Maadooliat, Mehdi; Gao, Xin; Huang, Jianhua Z.
2012-01-01
Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
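The core idea of a lag-distribution can be sketched in a few lines: histogram the pairs (angle_i, angle_{i+lag}) into a joint probability table and take its SVD. The code below illustrates that idea on synthetic uniform angles; it is not the authors' LagSVD implementation, and the bin count and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic dihedral-angle sequence (degrees), standing in for a protein chain.
phi = rng.uniform(-180, 180, size=5000)

def lag_distribution(angles, lag, bins=12):
    """2-D histogram of (angle_i, angle_{i+lag}) pairs, normalised to a
    joint probability table -- a nonparametric lag-distribution."""
    h, _, _ = np.histogram2d(angles[:-lag], angles[lag:],
                             bins=bins, range=[[-180, 180], [-180, 180]])
    return h / h.sum()

H = lag_distribution(phi, lag=1)
s = np.linalg.svd(H, compute_uv=False)
print(s[:3])  # leading singular values summarise the lag-1 dependency
```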
Maadooliat, Mehdi
2012-08-27
Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
Bitsakis, E.I.; Nicolaides, C.A.
1989-01-01
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Irreversibility and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
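The parallel structure of conditional probabilities and conditional entropies comes from the chain rule H(Y|X) = H(X,Y) − H(X), which mirrors P(y|x) = P(x,y)/P(x). A small numerical check on an arbitrary joint table:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability cells contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A joint distribution P(X, Y) as a table (rows: x, columns: y).
pxy = np.array([[0.3, 0.1],
                [0.2, 0.4]])

px = pxy.sum(axis=1)                       # marginal P(X)
h_joint = entropy(pxy.ravel())
h_x = entropy(px)
h_y_given_x = h_joint - h_x                # chain rule: H(Y|X) = H(X,Y) - H(X)
print(h_y_given_x)

# Direct computation from the conditionals P(y|x) = P(x,y)/P(x) agrees:
direct = -sum(pxy[i, j] * np.log2(pxy[i, j] / px[i])
              for i in range(2) for j in range(2))
print(abs(direct - h_y_given_x))           # ~0
```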
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
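A classroom-style experiment of this kind takes only a few lines: compare the theoretical probability of an event with its observed relative frequency. Here the event is rolling a sum of 7 with two dice; the trial count is arbitrary.

```python
import random
from fractions import Fraction

# Theoretical probability of rolling a sum of 7 with two fair dice: 6 of 36 outcomes.
theoretical = Fraction(6, 36)

random.seed(42)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)
experimental = hits / trials
print(float(theoretical), experimental)  # both close to 1/6
```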
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Scarred resonances and steady probability distribution in a chaotic microcavity
International Nuclear Information System (INIS)
Lee, Soo-Young; Rim, Sunghwan; Kim, Chil-Min; Ryu, Jung-Wan; Kwon, Tae-Yoon
2005-01-01
We investigate scarred resonances of a stadium-shaped chaotic microcavity. It is shown that two components with different chirality of the scarring pattern are slightly rotated in opposite ways from the underlying unstable periodic orbit when the incident angles of the scarring pattern are close to the critical angle for total internal reflection. In addition, the correspondence of the emission pattern with the scarring pattern disappears when the incident angles are much larger than the critical angle. The steady probability distribution gives a consistent explanation of these phenomena and makes it possible to predict the emission pattern in the latter case
International Nuclear Information System (INIS)
Cook, G.O. Jr.; Knight, L.
1979-07-01
The question of optimal projection angles has recently become of interest in the field of reconstruction from projections. Here, studies are concentrated on the n x n pixel space, where iterative algorithms such as ART and direct matrix techniques due to Katz are considered. The best angles are determined in a Gauss--Markov statistical sense as well as with respect to a function-theoretical error bound. The possibility of making photon intensity a function of angle is also examined. Finally, the best angles to use in an ART-like algorithm are studied. A certain set of unequally spaced angles was found to be preferred in several contexts. 15 figures, 6 tables
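An ART-like algorithm of the kind studied here can be sketched as a cyclic Kaczmarz iteration: each measurement row defines a hyperplane, and the image estimate is projected onto each hyperplane in turn. The 2x2 pixel system below is an invented toy example, not the report's setup:

```python
import numpy as np

def art(A, b, iters=200, relax=1.0):
    """Kaczmarz/ART: cycle through rows, projecting x onto each hyperplane a_i . x = b_i."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Toy 2x2 "pixel space": two horizontal, then two vertical ray sums.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0, 4.0])
b = A @ x_true          # consistent (noise-free) projection data
x = art(A, b)
```

For a consistent system started from zero, the iteration converges to the minimum-norm solution, which here coincides with `x_true`.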
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Transition probabilities for atoms
International Nuclear Information System (INIS)
Kim, Y.K.
1980-01-01
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels ''possible to occur'' or ''impossible to occur'' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
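The final step described above is a simple frequency estimate: observed events divided by total FA movements. A minimal sketch (the event counts below are hypothetical, not the ones from the Framatome ANP report):

```python
def event_probability(n_events, n_movements):
    """Point estimate of per-movement probability: observed events / total FA movements."""
    if n_movements <= 0:
        raise ValueError("need a positive number of FA movements")
    return n_events / n_movements

# Hypothetical counts for illustration only:
p_misload = event_probability(3, 100_000)   # misload events per FA movement
p_damage = event_probability(12, 100_000)   # damage events per FA movement
```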
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
DEFF Research Database (Denmark)
Risager, Morten S.; Södergren, Carl Anders
2017-01-01
It is well known that the angles in a lattice acting on hyperbolic n-space become equidistributed. In this paper we determine a formula for the pair correlation density for angles in such hyperbolic lattices. Using this formula we determine, among other things, the asymptotic behavior of the density function in both the small and large variable limits. This extends earlier results by Boca, Pasol, Popa and Zaharescu and Kelmer and Kontorovich in dimension 2 to general dimension n. Our proofs use the decay of matrix coefficients together with a number of careful estimates, and lead......
Probability mapping of contaminants
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)
1994-04-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
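The post-processing step described above (turning many equally likely realizations into a map of exceedance probability) reduces to a per-cell frequency count across the stack of simulated images. A minimal sketch, with random stand-ins for the geostatistical simulations and a hypothetical threshold (real realizations would honor the measured sample values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for geostatistical realizations: 500 equally likely simulated
# contamination maps on a 20x20 grid of remediation-unit-sized parcels.
realizations = rng.lognormal(mean=1.0, sigma=0.5, size=(500, 20, 20))

threshold = 3.0  # hypothetical clean-up threshold

# Probability map: per-cell fraction of realizations exceeding the threshold.
prob_exceed = (realizations > threshold).mean(axis=0)
```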
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
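A generalized logarithm and its inverse exponential can be sketched in one common one-parameter (Tsallis-style) form; the paper's exact parameterization may differ, so treat `q` and the formulas below as an illustrative assumption:

```python
import math

def gen_log(x, q):
    """Generalized logarithm: (x**q - 1)/q, reducing to ln(x) as q -> 0. Assumed form."""
    if q == 0:
        return math.log(x)
    return (x**q - 1.0) / q

def gen_exp(x, q):
    """Inverse of gen_log: (1 + q*x)**(1/q), reducing to exp(x) as q -> 0. Assumed form."""
    if q == 0:
        return math.exp(x)
    return (1.0 + q * x) ** (1.0 / q)
```

By construction `gen_exp(gen_log(x, q), q) == x`, and small `q` recovers the ordinary log/exp pair.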
2014-06-30
precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen...Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1-96. Lecture Notes in...Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165-294. Springer, Berlin (1996) 50. Ledoux
Angle dependence of Andreev scattering at semiconductor-superconductor interfaces
DEFF Research Database (Denmark)
Mortensen, Asger; Flensberg, Karsten; Jauho, Antti-Pekka
1999-01-01
We study the angle dependence of the Andreev scattering at a semiconductor-superconductor interface, generalizing the one-dimensional theory of Blonder, Tinkham, and Klapwijk (BTK). An increase of the momentum parallel to the interface leads to suppression of the probability of Andreev reflection and an increase of the probability of normal reflection. We show that in the presence of a Fermi velocity mismatch between the semiconductor and the superconductor the angles of incidence and transmission are related according to the well-known Snell's law in optics. As a consequence there is a critical angle......
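The Snell's-law relation mentioned in this abstract follows from conservation of the momentum component parallel to the interface, k_in·sin(theta_in) = k_out·sin(theta_out), with a critical angle arising on the larger-wavevector side. A sketch under that assumption (`k_in`, `k_out` are illustrative stand-ins for the Fermi wavevectors on the two sides of the interface):

```python
import math

def transmission_angle(theta_in, k_in, k_out):
    """Snell-type refraction from parallel-momentum conservation:
    k_in*sin(theta_in) = k_out*sin(theta_out). Returns None beyond the critical angle."""
    s = k_in * math.sin(theta_in) / k_out
    if abs(s) > 1.0:
        return None  # total reflection: no propagating transmitted state
    return math.asin(s)

def critical_angle(k_in, k_out):
    """A critical angle exists only when going from the larger-k side (k_in > k_out)."""
    if k_in <= k_out:
        return None
    return math.asin(k_out / k_in)
```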
Solar cell angle of incidence corrections
Burger, Dale R.; Mueller, Robert L.
1995-01-01
Literature on solar array angle of incidence corrections was found to be sparse and contained no tabular data for support. This lack, along with recent data on 27 GaAs/Ge 4 cm by 4 cm cells, initiated the analysis presented in this paper. The literature cites seven possible contributors to angle of incidence effects: cosine, optical front surface, edge, shadowing, UV degradation, particulate soiling, and background color. Only the first three are covered in this paper due to lack of sufficient data. The cosine correction is commonly used but is not sufficient when the incident angle is large. Fresnel reflection calculations require knowledge of the index of refraction of the coverglass front surface. The absolute index of refraction of the coverglass front surface was not known, nor was it measured, due to lack of funds. However, a value for the index of refraction was obtained by examining how the prediction errors varied with different assumed indices and selecting the best fit to the set of measured values. Corrections using front-surface Fresnel reflection along with the cosine correction give very good predictive results when compared to measured data, except that there is a definite trend away from predicted values at larger incident angles. This trend could be related to edge effects and is illustrated by use of a box plot of the errors and by plotting the deviation of the mean against incidence angle. The trend is toward larger deviations at larger incidence angles, and there may be a fourth-order effect involved. A chi-squared test was used to determine if the measurement errors were normally distributed. At 10 degrees the chi-squared test failed, probably due to the very small numbers involved or a bias from the measurement procedure. All other angles showed a good fit to the normal distribution, with increasing goodness-of-fit as the angles increased, which reinforces the very-small-numbers hypothesis. The contributed data only went to 65 degrees
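The combined cosine-plus-Fresnel correction described above can be sketched as follows. The unpolarized Fresnel reflectance formula is standard, but the coverglass index `n = 1.5` is an assumed illustrative value, not the fitted index from the paper:

```python
import math

def fresnel_reflectance(theta_i, n):
    """Unpolarized Fresnel reflectance, air -> medium of refractive index n."""
    theta_t = math.asin(math.sin(theta_i) / n)          # Snell's law
    rs = (math.cos(theta_i) - n * math.cos(theta_t)) / (math.cos(theta_i) + n * math.cos(theta_t))
    rp = (n * math.cos(theta_i) - math.cos(theta_t)) / (n * math.cos(theta_i) + math.cos(theta_t))
    return 0.5 * (rs**2 + rp**2)                        # average of s and p polarizations

def relative_output(theta_deg, n=1.5):
    """Cell output relative to normal incidence: cosine correction times the
    change in front-surface transmission relative to normal incidence."""
    t = math.radians(theta_deg)
    return math.cos(t) * (1.0 - fresnel_reflectance(t, n)) / (1.0 - fresnel_reflectance(0.0, n))
```

At large angles the Fresnel factor pushes the prediction below the plain cosine law, matching the qualitative behavior reported above.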
Probable maximum flood control
International Nuclear Information System (INIS)
DeGabriele, C.E.; Wu, C.L.
1991-11-01
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
From lattice BF gauge theory to area-angle Regge calculus
International Nuclear Information System (INIS)
Bonzom, Valentin
2009-01-01
We consider Riemannian 4D BF lattice gauge theory, on a triangulation of spacetime. Introducing the simplicity constraints which turn BF theory into simplicial gravity, some geometric quantities of Regge calculus, areas, and 3D and 4D dihedral angles, are identified. The parallel transport conditions are taken care of to ensure a consistent gluing of simplices. We show that these gluing relations, together with the simplicity constraints, contain the constraints of area-angle Regge calculus in a simple way, via the group structure of the underlying BF gauge theory. This provides a precise road from constrained BF theory to area-angle Regge calculus. Doing so, a framework combining variables of lattice BF theory and Regge calculus is built. The action takes a form à la Regge and includes the contribution of the Immirzi parameter. In the absence of simplicity constraints, the standard spin foam model for BF theory is recovered. Insertions of local observables are investigated, leading to Casimir insertions for areas and reproducing, for 3D angles, known results obtained through angle operators on spin networks. The present formulation is argued to be suitable for deriving spin foam models from discrete path integrals and to unravel their geometric content.
Gap probability - Measurements and models of a pecan orchard
Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, YI
1992-01-01
Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs with a 50° by 135° view angle, made under the canopy looking upwards at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels: crown and leaf. Crown-level parameters include the shape of the crown envelope and the spacing of crowns; leaf-level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.
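As a toy counterpart to the crown-and-leaf model described above, the classic Beer-Lambert gap fraction for a randomly positioned canopy can be sketched. This is not the authors' geometric-optical model, and `G = 0.5` (a spherical leaf angle distribution) is an assumption:

```python
import math

def gap_probability(lai, theta_deg, G=0.5):
    """Beer-Lambert gap fraction through a random canopy:
    P(theta) = exp(-G * LAI / cos(theta)),
    where G is the leaf projection coefficient and LAI the leaf area index."""
    return math.exp(-G * lai / math.cos(math.radians(theta_deg)))
```

Gap probability falls off with view zenith angle because the path length through foliage grows as 1/cos(theta).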
DEFF Research Database (Denmark)
Miles, James Edward; Frederiksen, Jane V.; Jensen, Bente Rona
2012-01-01
: Pelvic limbs from red foxes (Vulpes vulpes). METHODS: Q angles were measured on hip dysplasia (HD) and whole limb (WL) view radiographs of each limb between the acetabular rim, mid-point (Q1: patellar center, Q2: femoral trochlea), and tibial tuberosity. Errors of 0.5-2.0 mm at measurement landmarks...
African Journals Online (AJOL)
there is a build-up of pressure due to poor outflow of aqueous humor. The outflow obstruction could occur at the trabecular meshwork of the anterior chamber angle or subsequently in the episcleral vein due to raised venous pressure. Such a build-up of pressure results in glaucoma. Elevated intraocular pressure remains the ...
DEFF Research Database (Denmark)
Morgan, Jeannie; Lynnerup, Niels; Hoppa, R.D.
2013-01-01
measurements taken from computed tomography (CT) scans. Previous reports have observed that the lateral angle size in females is significantly larger than in males. The method was applied to an independent series of 77 postmortem CT scans (42 males, 35 females) to validate its accuracy and reliability...... method appears to be of minimal practical use in forensic anthropology and archeology....
Indian Academy of Sciences (India)
Resonance – Journal of Science Education; Volume 17; Issue 9. At Right Angles. Shailesh A Shirali. Information and Announcements, Volume 17, Issue 9, September 2012, pp 920-920. Permanent link: https://www.ias.ac.in/article/fulltext/reso/017/09/0920-0920 ...
International Nuclear Information System (INIS)
Kantrowitz, A.
1976-01-01
A method and apparatus for particle separation are described. The method uses a wide-angle, radially expanding vapor of a particle mixture. In particular, selective ionization of one isotope type in the particle mixture is produced in a multichamber separator, and the ionized isotope type is accelerated out of the path of the vapor expansion for separate collection
Probability and rational choice
Directory of Open Access Journals (Sweden)
David Botting
2014-05-01
Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative, and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
COVAL, Compound Probability Distribution for Function of Probability Distribution
International Nuclear Information System (INIS)
Astolfi, M.; Elbaz, J.
1979-01-01
1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
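COVAL performs a numerical transformation of distributions; a Monte Carlo sketch of the same problem statement (the distribution of a function of random variables, here applied to a structure under random loads) is given below. The load/area model and its parameters are invented for illustration:

```python
import random

random.seed(1)

# Push samples of the input variables through the function and read off the
# output distribution. Hypothetical model: stress = load / area,
# load ~ Normal(1000, 100), area ~ Uniform(9, 11).
samples = []
for _ in range(100_000):
    load = random.gauss(1000.0, 100.0)
    area = random.uniform(9.0, 11.0)
    samples.append(load / area)

samples.sort()
median = samples[len(samples) // 2]          # central value of the stress distribution
p95 = samples[int(0.95 * len(samples))]      # 95th percentile, e.g. for a reliability check
```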
Classical probability model for Bell inequality
International Nuclear Information System (INIS)
Khrennikov, Andrei
2014-01-01
We show that by taking into account the randomness of realization of experimental contexts it is possible to construct a common Kolmogorov space for data collected in these contexts, even though they can be incompatible. We call such a construction 'Kolmogorovization' of contextuality. This construction of a common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics, contexts are determined by selections of pairs of angles (θi, θ′j) fixing orientations of polarization beam splitters. Contrary to common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only randomness of the source (as Bell did), but also randomness of context-realizations (in particular, realizations of pairs of angles (θi, θ′j)). One may (but need not) say that the randomness of 'free will' has to be accounted for.
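For orientation, the singlet correlations and the resulting CHSH value can be computed directly from the standard quantum prediction; the analyzer angles below are the usual CHSH settings (illustrative choices, not taken from the paper):

```python
import numpy as np

# Quantum singlet correlation for polarization analyzers at angles a, b:
# E(a, b) = -cos(2(a - b)).  The four angle settings below are the
# standard CHSH choices, used here only to show the 2*sqrt(2) violation.
def E(a, b):
    return -np.cos(2.0 * (a - b))

a, a2 = 0.0, np.pi / 4             # Alice's two contexts
b, b2 = np.pi / 8, 3 * np.pi / 8   # Bob's two contexts

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, above the classical bound of 2
```

Each term comes from a different pair of analyzer settings, i.e. a different experimental context, which is exactly the context-randomness the paper proposes to model explicitly.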
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Small angle neutron scattering
International Nuclear Information System (INIS)
Bernardini, G.; Cherubini, G.; Fioravanti, A.; Olivi, A.
1976-09-01
A method for the analysis of data derived from neutron small angle scattering measurements has been developed for the case of homogeneous particles, starting from the basic theory and without making any assumption about the form of the particle size distribution function. The experimental scattering curves are interpreted with the aid of a computer by means of a dedicated routine. The parameters obtained are compared with the corresponding ones derived from observations with the transmission electron microscope.
International Nuclear Information System (INIS)
Qiu, S.; Amano, H.; Kasai, A.
1988-01-01
The solid angle in extended alpha source measurement for a series of counting geometries has been obtained by two methods: (1) calculation by means of the Nelson-Blachman series; (2) interpolation from the data table given by Gardner. A particular consequence of the application of the Nelson-Blachman series was deduced which differs from that given by the original author. The applicability of these two methods, as well as of an experimental measurement method, is also evaluated. (author)
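The quantity being tabulated can also be estimated by brute force. The sketch below computes the average solid angle subtended by a coaxial disk detector at points of an extended disk source by Monte Carlo ray sampling; the geometry (source radius, detector radius, separation) is an assumed example, not one of the paper's cases:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo estimate of the source-averaged solid angle of a coaxial
# circular detector, as seen from an extended disk source.  Geometry is
# illustrative only.
r_src, r_det, h = 1.0, 2.0, 3.0   # source radius, detector radius, gap
n = 500_000

# emission points uniform on the source disk
rho = r_src * np.sqrt(rng.random(n))
phi = 2 * np.pi * rng.random(n)
x, y = rho * np.cos(phi), rho * np.sin(phi)

# isotropic emission directions over the upper hemisphere
# (for isotropic emission, cos(theta) is uniform on (0, 1))
u = rng.random(n)
psi = 2 * np.pi * rng.random(n)
s = np.sqrt(1.0 - u**2)
dx, dy = s * np.cos(psi), s * np.sin(psi)

# intersection with the detector plane z = h
xi, yi = x + dx * h / u, y + dy * h / u
hit = xi**2 + yi**2 <= r_det**2

omega = 2 * np.pi * hit.mean()    # source-averaged solid angle, steradians
print(f"average solid angle: {omega:.3f} sr")
```

For the on-axis point alone the exact value is 2π(1 − h/√(h² + r_det²)) ≈ 1.055 sr, so the source-averaged result should land slightly below that.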
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied to statistical analysis.
The influence of flip angle on the magic angle effect
International Nuclear Information System (INIS)
Zurlo, J.V.; Blacksin, M.F.; Karimi, S.
2000-01-01
Objective. To assess the impact of flip angle with gradient sequences on the ''magic angle effect''. We characterized the magic angle effect in various gradient echo sequences and compared the signal-to-noise ratios present on these sequences with those of spin echo sequences. Design. Ten normal healthy volunteers were positioned such that the flexor hallucis longus tendon remained at approximately 55° to the main magnetic field (the magic angle). The tendon was imaged by conventional spin echo T1- and T2-weighted techniques and by a series of gradient techniques. Gradient sequences were altered in both TE and flip angle. Signal-to-noise measurements were obtained at segments of the flexor hallucis longus tendon demonstrating the magic angle effect to quantify the artifact. Signal-to-noise measurements were compared and statistical analysis performed. Similar measurements were taken of the anterior tibialis tendon as an internal control. Results and conclusions. We demonstrated the magic angle effect on all the gradient sequences. The intensity of the artifact was affected by both the TE and the flip angle. Low TE values and a high flip angle demonstrated the greatest magic angle effect. At TE values less than 30 ms, a high flip angle will markedly increase the magic angle effect. (orig.)
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Impact parameter dependence of inner-shell ionization probabilities
International Nuclear Information System (INIS)
Cocke, C.L.
1974-01-01
The probability for ionization of an inner shell of a target atom by a heavy charged projectile is a sensitive function of the impact parameter characterizing the collision. This probability can be measured experimentally by detecting the x-ray resulting from radiative filling of the inner shell in coincidence with the projectile scattered at a determined angle, and by using the scattering angle to deduce the impact parameter. It is conjectured that the functional dependence of the ionization probability may be a more sensitive probe of the ionization mechanism than is a total cross section measurement. Experimental results for the K-shell ionization of both solid and gas targets by oxygen, carbon and fluorine projectiles in the MeV/amu energy range will be presented, and their use in illuminating the inelastic collision process discussed
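The link between scattering angle and impact parameter that this measurement exploits is the classical Rutherford relation b = (d/2)·cot(θ/2), with d = Z₁Z₂e²/E the head-on distance of closest approach. The projectile, target, and energy below are assumed for illustration, not the paper's data:

```python
import numpy as np

# Rutherford relation between scattering angle and impact parameter.
# Illustrative system: oxygen projectile on a copper target.
Z1, Z2 = 8, 29        # projectile and target atomic numbers
E = 16.0              # projectile kinetic energy in MeV (~1 MeV/amu)
e2 = 1.44             # e^2/(4*pi*eps0) in MeV*fm

d = Z1 * Z2 * e2 / E  # head-on distance of closest approach, fm
for theta_deg in (5, 10, 30):
    theta = np.deg2rad(theta_deg)
    b = d / 2.0 / np.tan(theta / 2.0)
    print(theta_deg, round(b, 1))   # impact parameter in fm
```

Small scattering angles thus select large impact parameters, which is why scanning the detection angle maps out the impact-parameter dependence of the ionization probability.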
Variable angle correlation spectroscopy
International Nuclear Information System (INIS)
Lee, Y.K.; Lawrence Berkeley Lab., CA
1994-05-01
In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with 13C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from polycrystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiment is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, to study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system
Prediction and probability in sciences
International Nuclear Information System (INIS)
Klein, E.; Sacquin, Y.
1998-01-01
This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Poisson Processes in Free Probability
An, Guimei; Gao, Mingchu
2015-01-01
We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisson process.
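A concrete way to see a free Poisson law (this is the standard random-matrix illustration, not the paper's construction): the empirical eigenvalue distribution of a Wishart matrix W = XXᵀ/n approximates a scaled free Poisson (Marchenko-Pastur) law with ratio c = d/n. The dimensions below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Empirical check of the Marchenko-Pastur (free Poisson) law via a
# Wishart matrix.  Dimensions are illustrative only.
d, n = 400, 1600
c = d / n                        # aspect ratio, here 0.25

X = rng.standard_normal((d, n))
W = X @ X.T / n
eig = np.linalg.eigvalsh(W)

# Marchenko-Pastur predictions: mean 1, variance c,
# support [(1 - sqrt(c))^2, (1 + sqrt(c))^2] = [0.25, 2.25]
print(eig.mean(), eig.var())     # ≈ 1.0 and ≈ 0.25
print(eig.min(), eig.max())      # ≈ 0.25 and ≈ 2.25
```

The first two moments and the support edges match the free Poisson predictions up to finite-size fluctuations.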
Probability surveys, conditional probabilities and ecological risk assessment
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Equilibrium contact angle or the most-stable contact angle?
Montes Ruiz-Cabello, F J; Rodríguez-Valverde, M A; Cabrerizo-Vílchez, M A
2014-04-01
It is well established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle, obtained from mechanical stimulation of the system, is indeed experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus makes it possible to identify the most stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The drop most stable against vibration should reveal the changeless contact angle, while the drop most stable against gravity should reveal the highest resistance to sliding. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus is on the contact angle of the initial drop configuration. This methodology requires extensive mapping of the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of the physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie or Cassie-Baxter equations accordingly, or for surface energy evaluation. © 2013 Elsevier B.V. All rights reserved.
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
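The nearest-neighbor half of the idea fits in a few lines: estimate P(Y = 1 | X = x) as the fraction of positive labels among the k nearest training points. This is a minimal sketch of the concept on a toy one-dimensional problem, not the authors' R implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal nearest-neighbour "probability machine": the estimated
# probability is the positive-label fraction among the k nearest points.
def knn_prob(X_train, y_train, x, k=100):
    nearest = np.argsort(np.abs(X_train - x))[:k]   # 1-D toy feature
    return y_train[nearest].mean()

# toy data with known truth: P(Y = 1 | x) = x on [0, 1]
n = 5000
X = rng.random(n)
y = (rng.random(n) < X).astype(int)

for x0 in (0.2, 0.5, 0.8):
    print(x0, round(knn_prob(X, y, x0), 2))   # estimates track the true x0
```

Because the true conditional probability here is x itself, the printed estimates should rise with x0; a random forest version replaces the neighborhood with the forest's terminal-node averaging.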
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
Grandy, W. T. Jr.
2009-01-01
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
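The core effect is easy to reproduce by simulation. For a log-normal risk factor, set the threshold at the plug-in 90% quantile estimated from a small sample, then check how often the next observation exceeds it; the sample size and nominal level below are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(4)

# Effect of parameter uncertainty: the plug-in quantile threshold yields
# an exceedance frequency above the nominal level.
n_sample, n_trials = 10, 20_000
z90 = 1.2815515655446004                 # standard normal 90% quantile

logs = rng.standard_normal((n_trials, n_sample))  # log of the risk factor
mu_hat = logs.mean(axis=1)
sd_hat = logs.std(axis=1, ddof=1)
threshold = mu_hat + z90 * sd_hat                 # estimated 90% quantile

new = rng.standard_normal(n_trials)               # next observation (log scale)
rate = np.mean(new > threshold)
print(f"exceedance frequency: {rate:.3f}  (nominal: 0.100)")
```

Note that the exceedance frequency is the same for any true (mu, sigma), since (X_new − X̄)/s is pivotal for a location-scale family, which is the parameter-independence the abstract refers to.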
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
The double Brewster angle effect
Thirion-Lefevre, Laetitia; Guinvarc'h, Régis
2018-01-01
The Double Brewster angle effect (DBE) is an extension of the Brewster angle to double reflection on two orthogonal dielectric surfaces. It results from the combination of two pseudo-Brewster angles occurring in complementary incidence-angle domains. It can be observed over a large range of incidence angles provided that the double-bounce mechanism is present. As a consequence of this effect, we show that the reflection coefficient at VV polarization can be at least 10 dB lower than the reflection coefficient at HH polarization over a wide range of incidence angles, typically from 20° to 70°. It is experimentally demonstrated using a Synthetic Aperture Radar (SAR) image that this effect can be seen on buildings and forests. For large buildings, the difference can reach more than 20 dB.
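The effect can be checked numerically with the Fresnel coefficients: a double bounce on orthogonal surfaces combines incidence angles θ and 90° − θ, so the VV product passes near a pseudo-Brewster null on one surface or the other. The relative permittivity below is an assumed value (real ground and wall materials vary):

```python
import numpy as np

# Fresnel reflection coefficients for a non-magnetic dielectric with
# relative permittivity eps (assumed value for illustration).
def fresnel(theta, eps):
    c = np.cos(theta)
    root = np.sqrt(eps - np.sin(theta) ** 2)
    r_h = (c - root) / (c + root)              # HH (perpendicular)
    r_v = (eps * c - root) / (eps * c + root)  # VV (parallel)
    return r_h, r_v

eps = 6.0
theta = np.deg2rad(40.0)            # incidence on the first surface
theta2 = np.pi / 2 - theta          # complementary angle on the second

rh = fresnel(theta, eps)[0] * fresnel(theta2, eps)[0]
rv = fresnel(theta, eps)[1] * fresnel(theta2, eps)[1]
ratio_db = 20 * np.log10(abs(rv) / abs(rh))
print(f"VV/HH after double bounce: {ratio_db:.1f} dB")  # strongly negative
```

For this permittivity the double-bounce VV return is more than 10 dB below HH at 40° incidence, consistent with the abstract's claim.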
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Joint probabilities and quantum cognition
International Nuclear Information System (INIS)
Acacio de Barros, J.
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
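The simplest classical example of contextual random variables without a joint distribution (the Suppes-Zanotti case, used here only to illustrate the abstract's point) involves three ±1 variables whose three pairwise correlations are all −1. Any joint distribution is a mixture of the 8 deterministic assignments, and over those the sum of the pairwise products never drops below −1, so a target sum of −3 is unattainable:

```python
import itertools

# For three ±1 variables, check the minimum of xy + yz + xz over all
# 8 deterministic assignments; mixtures cannot go below this minimum.
sums = [x * y + y * z + x * z
        for x, y, z in itertools.product((-1, 1), repeat=3)]
print(min(sums))   # -1, so pairwise correlations summing to -3 are impossible
```

Pairwise, each correlation of −1 is individually realizable; it is only the joint requirement that fails, which is the signature of contextuality.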
Joint probabilities and quantum cognition
Energy Technology Data Exchange (ETDEWEB)
Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)
2012-12-18
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabilities...
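The Merton-style mechanism can be sketched with a one-factor model (all parameters assumed for illustration): asset returns share a common factor with correlation rho, an obligor defaults when its return falls below its pd-quantile, and we measure the correlation of the resulting default indicators:

```python
import numpy as np

rng = np.random.default_rng(5)

# One-factor Merton-style sketch: correlation of default indicators for
# two obligors with asset correlation rho and default probability pd.
def default_corr(pd, rho, n=1_000_000):
    z = rng.standard_normal(n)                   # common factor
    a1 = np.sqrt(rho) * z + np.sqrt(1 - rho) * rng.standard_normal(n)
    a2 = np.sqrt(rho) * z + np.sqrt(1 - rho) * rng.standard_normal(n)
    d1 = a1 < np.quantile(a1, pd)                # default indicators
    d2 = a2 < np.quantile(a2, pd)
    return np.corrcoef(d1, d2)[0, 1]

# higher default probabilities give higher default correlations
print(default_corr(0.01, 0.3), default_corr(0.10, 0.3))
```

With the asset correlation held fixed, the default-indicator correlation rises with pd, which is the qualitative result the abstract states.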
The Probabilities of Unique Events
2012-08-30
Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA. August 30th, 2012. ... social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the...
Probability Matching, Fast and Slow
Koehler, Derek J.; James, Greta
2014-01-01
A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...
Angle Performance on Optima XE
International Nuclear Information System (INIS)
David, Jonathan; Satoh, Shu
2011-01-01
Angle control on high energy implanters is important due to shrinking device dimensions, and sensitivity to channeling at high beam energies. On Optima XE, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through a series of narrow slits, and any angle adjustment is made by steering the beam with the corrector magnet. In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen during implant. Using a sensitive channeling condition, we were able to quantify the angle repeatability of Optima XE. By quantifying the sheet resistance sensitivity to both horizontal and vertical angle variation, the total angle variation was calculated as 0.04 deg. (1σ). Implants were run over a five week period, with all of the wafers selected from a single boule, in order to control for any crystal cut variation.
Small angle neutron scattering
Directory of Open Access Journals (Sweden)
Cousin Fabrice
2015-01-01
Full Text Available Small Angle Neutron Scattering (SANS) is a technique that makes it possible to probe the 3-D structure of materials on a typical size range from ∼ 1 nm up to ∼ a few 100 nm, the information obtained being statistically averaged over a sample whose volume is ∼ 1 cm3. This very rich technique enables a full structural characterization of a given object of nanometric dimensions (radius of gyration, shape, volume or mass, fractal dimension, specific area…) through the determination of the form factor, as well as a determination of the way objects are organized within a continuous medium, and therefore a description of the interactions between them, through the determination of the structure factor. The specific properties of neutrons (the possibility of tuning the scattering intensity by isotopic substitution, sensitivity to magnetism, negligible absorption, low energy of the incident neutrons) make it particularly interesting in the fields of soft matter, biophysics, magnetic materials and metallurgy. In particular, the contrast variation methods allow the extraction of some information that cannot be obtained by any other experimental technique. This course is divided into two parts. The first is devoted to the description of the principle of SANS: basics (formalism, coherent scattering/incoherent scattering, notion of elementary scatterer), form factor analysis (I(q→0), Guinier regime, intermediate regime, Porod regime, polydisperse systems), structure factor analysis (2nd virial coefficient, integral equations, characterization of aggregates), and contrast variation methods (how to create contrast in a homogeneous system, matching in ternary systems, extrapolation to zero concentration, Zero Averaged Contrast). It is illustrated by some representative examples. The second describes the experimental aspects of SANS to guide users in their future experiments: description of a SANS spectrometer, resolution of the spectrometer, optimization of
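The Guinier regime mentioned in this abstract lends itself to a short worked example. The sketch below uses synthetic data (a hypothetical radius of gyration and forward intensity, not values from the course): it generates intensities following the Guinier law I(q) ≈ I0·exp(−q²Rg²/3) and recovers Rg from the slope of ln I versus q².

```python
import random
from math import exp, log

random.seed(2)
Rg, I0 = 3.0, 100.0  # hypothetical radius of gyration (nm) and forward intensity

# Synthetic small-q intensities following the Guinier law, with mild noise.
qs = [0.01 * k for k in range(1, 20)]  # q values chosen so q*Rg < 1 (Guinier regime)
Is = [I0 * exp(-(q * Rg) ** 2 / 3) * (1 + random.gauss(0, 0.001)) for q in qs]

# Least-squares fit of ln I against q^2: the slope is -Rg^2/3.
xs, ys = [q * q for q in qs], [log(i) for i in Is]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
    (x - xbar) ** 2 for x in xs
)
Rg_est = (-3 * slope) ** 0.5
print(Rg_est)
```

The recovered Rg_est should be close to the value of 3.0 nm used to generate the data, which is the essence of a Guinier analysis.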
Automated analysis of angle closure from anterior chamber angle images.
Baskaran, Mani; Cheng, Jun; Perera, Shamira A; Tun, Tin A; Liu, Jiang; Aung, Tin
2014-10-21
To evaluate novel software capable of automatically grading angle closure on EyeCam angle images in comparison with manual grading of images, with gonioscopy as the reference standard. In this hospital-based, prospective study, subjects underwent gonioscopy by a single observer, and EyeCam imaging by a different operator. The anterior chamber angle in a quadrant was classified as closed if the posterior trabecular meshwork could not be seen. An eye was classified as having angle closure if there were two or more quadrants of closure. Automated grading of the angle images was performed using customized software. Agreement between the methods was ascertained by κ statistic and comparison of area under receiver operating characteristic curves (AUC). One hundred forty subjects (140 eyes) were included, most of whom were Chinese (102/140, 72.9%) and women (72/140, 51.5%). Angle closure was detected in 61 eyes (43.6%) with gonioscopy in comparison with 59 eyes (42.1%, P = 0.73) using manual grading, and 67 eyes (47.9%, P = 0.24) with automated grading of EyeCam images. The agreement for angle closure diagnosis between gonioscopy and both manual (κ = 0.88; 95% confidence interval [CI], 0.81-0.96) and automated grading of EyeCam images was good (κ = 0.74; 95% CI, 0.63-0.85). The AUC for detecting eyes with gonioscopic angle closure was comparable for manual and automated grading (AUC 0.974 vs. 0.954, P = 0.31) of EyeCam images. Customized software for automated grading of EyeCam angle images was found to have good agreement with gonioscopy. Human observation of the EyeCam images may still be needed to avoid gross misclassification, especially in eyes with extensive angle closure. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
Probably Not: Future Prediction Using Probability and Statistical Inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Measurement of the angle gamma
International Nuclear Information System (INIS)
Aleksan, R.; Sphicas, P.; Massachusetts Inst. of Tech., Cambridge, MA
1993-12-01
The angle γ as defined in the Wolfenstein approximation is not completely out of reach of current or proposed dedicated B experiments. This work represents but a first step in the direction of extracting the third angle of the unitarity triangle, by studying the feasibility of using new decay modes in a hadronic machine. (A.B.). 11 refs., 1 fig., 7 tabs
Nucleation of small angle boundaries
CSIR Research Space (South Africa)
Nabarro, FRN
1996-12-01
Full Text Available The internal stresses induced by the strain gradients in an array of lattice cells delineated by low-angle dislocation boundaries are partially relieved by the creation of new low-angle boundaries. This is shown to be a first-order transition...
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
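The idea of augmenting a normal probability plot with per-point intervals can be sketched by Monte Carlo. The code below builds pointwise intervals for each order statistic of a standard normal sample; note these are pointwise, not the simultaneous intervals the paper constructs (those must be widened so all points are covered jointly), and the sample size and simulation settings are purely illustrative.

```python
import random

random.seed(0)
n, reps, alpha = 20, 2000, 0.05

# Empirical distribution of each order statistic of a standard normal sample.
sims = [sorted(random.gauss(0, 1) for _ in range(n)) for _ in range(reps)]
lo, hi = [], []
for i in range(n):
    col = sorted(s[i] for s in sims)
    lo.append(col[int(reps * alpha / 2)])            # 2.5th percentile
    hi.append(col[int(reps * (1 - alpha / 2)) - 1])  # 97.5th percentile

# A fresh normal sample should mostly fall inside its pointwise intervals.
sample = sorted(random.gauss(0, 1) for _ in range(n))
inside = sum(l <= x <= h for l, h, x in zip(lo, hi, sample))
print(inside, "of", n, "points inside")
```

Plotting `sample` against the interval band (lo, hi) per position would give the augmented probability plot the abstract describes.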
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining lower than the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are developed, so that the possibility of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The purpose of the probability tables is: - to obtain dense data representation - to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...
International Nuclear Information System (INIS)
Tikhonirov, V.V.
1993-01-01
The results of calculations of the intensity and polarization of radiation from channeled and unchanneled e± are presented. The Fourier transformation (FT) is used to calculate numerous matrix elements. The calculations for channeled e+ showed a fast approach of the spectral intensity to its value calculated in the approximation of the self-consistent field (ASCF) with growing photon energy. In the case of 150 GeV unchanneled e- in Ge at T=293 K, the ASCF gives a significantly higher value as compared to the FT. 4 refs., 3 figs
Relationship between the Angle of Repose and Angle of Internal ...
African Journals Online (AJOL)
). The angle of internal friction ... compression chambers. Lorenzen, 1957 (quoted by Mohsenin, 1986), reported that the design of deep ... the equation given for lateral pressure in deep bins as presented by Mohsenin (1986). The presence of moisture ...
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
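The idea of lowering the forecast probability as time passes without an SEP onset can be illustrated with a simple Bayesian update. This is not the authors' algorithm: it assumes, purely for illustration, an exponential delay-time distribution from X-ray peak to SEP onset, whereas the paper derives its decay from the observed NOAA delay-time data.

```python
from math import exp

def dynamic_sep_probability(p0, t_hours, mean_delay=12.0):
    """Bayes-updated SEP event probability given that no onset has been
    observed t_hours after the flare. Assumes (hypothetically) an
    exponential delay-time distribution with the given mean in hours."""
    survive = exp(-t_hours / mean_delay)  # P(onset later than t | event occurs)
    return p0 * survive / (p0 * survive + (1 - p0))

# An initial 50% forecast decays as hours pass with no 10 pfu onset.
print([round(dynamic_sep_probability(0.5, t), 3) for t in (0, 6, 12, 24)])
```

The update follows directly from Bayes' rule: the longer no onset is seen, the smaller the posterior probability that an event is still coming, which is the qualitative behavior the abstract calls for.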
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Swedish earthquakes and acceleration probabilities
International Nuclear Information System (INIS)
Slunga, R.
1979-03-01
A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
2015-01-01
A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
Probability and statistics: A reminder
International Nuclear Information System (INIS)
Clement, B.
2013-01-01
The main purpose of these lectures is to provide the reader with the tools needed to data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is build on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...
On probability-possibility transformations
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
Ring magnet firing angle control
International Nuclear Information System (INIS)
Knott, M.J.; Lewis, L.G.; Rabe, H.H.
1975-01-01
A device is provided for controlling the firing angles of thyratrons (rectifiers) in a ring magnet power supply. A phase lock loop develops a smooth ac signal of frequency equal to and in phase with the frequency of the voltage wave developed by the main generator of the power supply. A counter that counts from zero to a particular number each cycle of the main generator voltage wave is synchronized with the smooth AC signal of the phase lock loop. Gates compare the number in the counter with predetermined desired firing angles for each thyratron and with coincidence the proper thyratron is fired at the predetermined firing angle
Tetrahedral Bond Angle from Elementary Trigonometry
Glaister, P.
1997-09-01
The alternative approach of using the scalar (or dot) product of vectors enables the determination of the bond angle in a tetrahedral molecule in a simple way. There is, of course, an even more straightforward derivation suitable for students who are unfamiliar with vectors, or products thereof, but who do know some elementary trigonometry. The starting point is the figure showing triangle OAB. The point O is the center of a cube, and A and B are at opposite corners of a face of that cube in which fits a regular tetrahedron. The required bond angle is α = AÔB; and using Pythagoras' theorem, AB = 2√2 is the diagonal of a face of the cube. Hence, from the right-angled triangle OEB, tan(α/2) = √2 and therefore α = 2 tan⁻¹(√2) ≈ 109° 28' (see Fig. 1).
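The two derivations mentioned in the abstract, the trigonometric route and the dot-product route, agree numerically, as this quick check shows (the identity cos 2θ = (1 − tan²θ)/(1 + tan²θ) with tan θ = √2 gives cos α = −1/3 exactly):

```python
from math import atan, sqrt, degrees, acos

# Trigonometric route from the abstract: alpha = 2*arctan(sqrt(2)).
alpha_trig = degrees(2 * atan(sqrt(2)))

# Dot-product route: vectors from the cube's centre to two tetrahedral
# vertices satisfy cos(alpha) = -1/3.
alpha_dot = degrees(acos(-1.0 / 3.0))

print(alpha_trig, alpha_dot)  # both about 109.47 degrees, i.e. 109° 28'
```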
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Probability matching and strategy availability.
Koehler, Derek J; James, Greta
2010-09-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
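The accuracy gap between the two strategies is easy to verify by simulation. In the sketch below (illustrative parameters only, not the experiments' design), an outcome occurs with probability 0.7: matching predicts it with probability 0.7, while maximizing always predicts it, so their expected accuracies are 0.7² + 0.3² = 0.58 versus 0.70.

```python
import random

random.seed(1)
p, trials = 0.7, 100_000

# Matching: predict the frequent outcome with the same probability it occurs.
# A prediction is correct when the predicted and realized outcomes coincide.
match_correct = sum(
    (random.random() < p) == (random.random() < p) for _ in range(trials)
) / trials

# Maximizing: always predict the frequent outcome.
max_correct = sum(random.random() < p for _ in range(trials)) / trials

print(round(match_correct, 3), round(max_correct, 3))
```

The simulated accuracies converge to 0.58 and 0.70, which is why maximizing is the superior strategy the abstract says does not come readily to mind.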
Probability as a Physical Motive
Directory of Open Access Journals (Sweden)
Peter Martin
2007-04-01
Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.
Logic, Probability, and Human Reasoning
2015-01-01
accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning ... To be rational is to be able to make deductions ... [3-6] and they underlie mathematics, science, and technology [7-10]. Plato claimed that emotions upset reasoning. However, individuals in the grip ... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Probability matching and strategy availability
Koehler, Derek J.; James, Greta
2010-01-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...
[Biometric bases: basic concepts of probability calculation].
Dinya, E
1998-04-26
The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
Frequency scaling for angle gathers
Zuberi, M. A H; Alkhalifah, Tariq Ali
2014-01-01
Angle gathers provide an extra dimension for analyzing the velocity after migration. Space-shift and time-shift imaging conditions are two methods used to obtain angle gathers, but both are reasonably expensive. By scaling the time-lag axis of the time-shifted images, the computational cost of the time-shift imaging condition can be considerably reduced. In imaging, and more so in full waveform inversion, frequency-domain Helmholtz solvers are used more often to solve for the wavefields than conventional time-domain extrapolators. In such cases, we do not need to extend the image; instead we scale the frequency axis of the frequency-domain image to obtain the angle gathers more efficiently. Application on synthetic data demonstrates these features.
Angle imaging: Advances and challenges
Quek, Desmond T L; Nongpiur, Monisha E; Perera, Shamira A; Aung, Tin
2011-01-01
Primary angle closure glaucoma (PACG) is a major form of glaucoma in large populous countries in East and South Asia. The high visual morbidity from PACG is related to the destructive nature of the asymptomatic form of the disease. Early detection of anatomically narrow angles is important and the subsequent prevention of visual loss from PACG depends on an accurate assessment of the anterior chamber angle (ACA). This review paper discusses the advantages and limitations of newer ACA imaging technologies, namely ultrasound biomicroscopy, Scheimpflug photography, anterior segment optical coherence tomography and EyeCam, highlighting the current clinical evidence comparing these devices with each other and with clinical dynamic indentation gonioscopy, the current reference standard. PMID:21150037
Lectures on probability and statistics
International Nuclear Information System (INIS)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
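The a priori dice calculation described in this abstract can be made concrete with a few lines of code (a generic illustration, not code from the lectures; the function name is ours):

```python
from fractions import Fraction
from itertools import product

def probability_of_sum(n_dice, target):
    """A priori probability that n fair six-sided dice sum to `target`,
    found by enumerating all equally likely outcomes."""
    outcomes = list(product(range(1, 7), repeat=n_dice))
    hits = sum(1 for roll in outcomes if sum(roll) == target)
    return Fraction(hits, len(outcomes))

print(probability_of_sum(2, 7))  # 6 of the 36 outcomes sum to 7, i.e. 1/6
```

The inverse, statistical problem would start instead from observed rolls and infer the dice's bias, which is what the rest of the lectures address.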
Variable angle asymmetric cut monochromator
International Nuclear Information System (INIS)
Smither, R.K.; Fernandez, P.B.
1993-09-01
A variable incident angle, asymmetric cut, double crystal monochromator was tested for use on beamlines at the Advanced Photon Source (APS). For both undulator and wiggler beams the monochromator can expand the footprint of the beam on the surface of the crystals to 50 times the area of the incident beam; this will reduce the slope errors by a factor of 2500. The asymmetric cut allows one to increase the acceptance angle for incident radiation and obtain a better match to the opening angle of the incident beam. This can increase the intensity of the diffracted beam by a factor of 2 to 5 and can make the beam more monochromatic as well. The monochromator consists of two matched, asymmetric cut (18 degrees), silicon crystals mounted so that they can be rotated about three independent axes. Rotation around the first axis controls the Bragg angle. The second rotation axis is perpendicular to the diffraction planes and controls the increase of the area of the footprint of the beam on the crystal surface. Rotation around the third axis controls the angle between the surface of the crystal and the wider, horizontal axis of the beam and can make the footprint a rectangle with a minimum length for this area. The asymmetric cut is 18 degrees for the matched pair of crystals, which allows one to expand the footprint area by a factor of 50 for Bragg angles up to 19.15 degrees (6 keV for Si[111] planes). This monochromator, with proper cooling, will be useful for analyzing the high intensity x-ray beams produced by both undulators and wigglers at the APS
An oilspill trajectory analysis model with a variable wind deflection angle
Samuels, W.B.; Huang, N.E.; Amstutz, D.E.
1982-01-01
The oilspill trajectory movement algorithm consists of a vector sum of the surface drift component due to wind and the surface current component. In the U.S. Geological Survey oilspill trajectory analysis model, the surface drift component is assumed to be 3.5% of the wind speed and is rotated 20 degrees clockwise to account for Coriolis effects in the Northern Hemisphere. Field and laboratory data suggest, however, that the deflection angle of the surface drift current can be highly variable. An empirical formula, based on field observations and theoretical arguments relating wind speed to deflection angle, was used to calculate a new deflection angle at each time step in the model. Comparisons of oilspill contact probabilities to coastal areas calculated for constant and variable deflection angles showed that the model is insensitive to this changing angle at low wind speeds. At high wind speeds, some statistically significant differences in contact probabilities did appear. © 1982.
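The trajectory step described in this abstract (surface drift as 3.5% of the wind vector, rotated clockwise, added to the current) can be sketched as follows. The constant 20-degree case is shown; a variable-angle run would compute `deflection_deg` from wind speed using the paper's empirical formula, which the abstract does not give:

```python
import math

def drift_step(wind_u, wind_v, cur_u, cur_v, deflection_deg=20.0):
    """One trajectory step: surface drift is 3.5% of the wind vector,
    rotated clockwise by `deflection_deg` (Coriolis, Northern Hemisphere),
    then added vectorially to the surface current."""
    f = 0.035
    a = math.radians(-deflection_deg)  # negative angle = clockwise rotation
    du = f * (wind_u * math.cos(a) - wind_v * math.sin(a))
    dv = f * (wind_u * math.sin(a) + wind_v * math.cos(a))
    return du + cur_u, dv + cur_v

# Eastward 10 m/s wind plus a weak eastward current (illustrative values).
u, v = drift_step(10.0, 0.0, 0.05, 0.0)
```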
Angle independent velocity spectrum determination
DEFF Research Database (Denmark)
2014-01-01
An ultrasound imaging system (100) includes a transducer array (102) that emits an ultrasound beam and produces at least one transverse pulse-echo field that oscillates in a direction transverse to the emitted ultrasound beam and that receives echoes produced in response thereto, and a spectral velocity estimator (110) that determines a velocity spectrum for flowing structure, which flows at an angle of 90 degrees and at angles less than 90 degrees with respect to the emitted ultrasound beam, based on the received echoes.
Temperature dependence of Brewster's angle.
Guo, Wei
2018-01-01
In this work, a dielectric at a finite temperature is modeled as an ensemble of identical atoms moving randomly around where they are trapped. Light reflection from the dielectric is then discussed in terms of atomic radiation. Specific calculation demonstrates that because of the atoms' thermal motion, Brewster's angle is, in principle, temperature-dependent, and the dependence is weak in the low-temperature limit. It is also found that Brewster's angle is nothing but a result of destructive superposition of electromagnetic radiation from the atoms.
Inoue, N.
2017-12-01
The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions and so on. Toda (2013) pointed out differences in the conditional probability of strike-slip and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture based on the following procedures. Fault geometry was determined from the randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane was not saturated within the assumed width of the seismogenic layer, the fault plane depth was randomly assigned within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike-slip faults, and this result coincides with previous similar studies (i.e. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike-slip and reverse faults, and this trend is similar to the conditional probability of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show low probability. The worldwide compiled reverse fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, there is a possibility that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, shows low probability and is similar to that of strike-slip faults (i.e. Takao et al., 2013). In the future, numerical simulation by considering the failure condition of the surface by the source
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
DEFF Research Database (Denmark)
Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Sillesen, Henrik
1998-01-01
This paper reports on a scanning technique, denoted multi-angle compound imaging (MACI), using spatial compounding. The MACI method also contains elements of frequency compounding, as the transmit frequency is lowered for the highest beam angles in order to reduce grating lobes. Compared to conventional B-mode imaging, MACI offers better defined tissue boundaries and lower variance of the speckle pattern, resulting in an image with reduced random variations. Design and implementation of a compound imaging system is described, images of rubber tubes and porcine aorta are shown and effects on visualization are discussed. The speckle reduction is analyzed numerically and the results are found to be in excellent agreement with existing theory. An investigation of detectability of low-contrast lesions shows significant improvements compared to conventional imaging. Finally, possibilities for improving...
Femoral varus: what's the angle
DEFF Research Database (Denmark)
Miles, James Edward; Svalastoga, Eiliv Lars; Eriksen, Thomas
angles were calculated using Microsoft Excel for the three previously reported techniques and a novel method, which we believed would be more reliable. Reliability between readings was assessed using the within-subject standard deviation and repeatability coefficient, and the effect of angulation...
Excluding joint probabilities from quantum theory
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Time dependent non-extinction probability for prompt critical systems
International Nuclear Information System (INIS)
Gregson, M. W.; Prinja, A. K.
2009-01-01
The time dependent non-extinction probability equation is presented for slab geometry. Numerical solutions are provided for a nested inner/outer iteration routine where the fission terms (both linear and non-linear) are updated and then held fixed over the inner scattering iteration. Time dependent results are presented highlighting the importance of the injection position and angle. The iteration behavior is also described as the steady state probability of initiation is approached for both small and large time steps. Theoretical analysis of the nested iteration scheme is shown and highlights poor numerical convergence for marginally prompt critical systems. An acceleration scheme for the outer iterations is presented to improve convergence of such systems. Theoretical analysis of the acceleration scheme is also provided and the associated decrease in computational run time addressed. (authors)
Determination of velocity vector angles using the directional cross-correlation method
DEFF Research Database (Denmark)
Kortbek, Jacob; Jensen, Jørgen Arendt
2005-01-01
A method for determining both velocity magnitude and angle in any direction is suggested. The method uses focusing along the velocity direction and cross-correlation for finding the correct velocity magnitude. The angle is found by beamforming directional signals in a number of directions and then selecting the angle with the highest normalized correlation between directional signals. The approach is investigated using Field II simulations and data from the experimental ultrasound scanner RASMUS with a parabolic flow having a peak velocity of 0.3 m/s. A 7 MHz linear array transducer is used... The time (k_tprf) between signals to correlate is an important parameter, and a proper choice varies with flow angle and flow velocity. One performance example is given with a fixed value of k_tprf for all flow angles. The angle estimation on measured data for flow at 60° to 90° yields a probability of valid estimates between 68% and 98...
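The angle-selection rule described above (choose the beamforming direction whose directional signals from consecutive emissions correlate best) might be sketched like this; the array shapes and the normalization are our assumptions, not details from the paper:

```python
import numpy as np

def best_angle(sig1, sig2, angles):
    """Return the candidate angle whose directional signals from two
    consecutive emissions have the highest normalized cross-correlation.
    sig1, sig2: (n_angles, n_samples) arrays of beamformed signals."""
    best, best_rho = None, -np.inf
    for k, angle in enumerate(angles):
        a, b = sig1[k], sig2[k]
        # Peak of the full cross-correlation, normalized by signal energies.
        rho = np.max(np.correlate(a, b, "full")) / (
            np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        if rho > best_rho:
            best, best_rho = angle, rho
    return best
```

The velocity magnitude would then be recovered from the lag of the peak along the chosen direction, which this sketch omits.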
K-forbidden transition probabilities
International Nuclear Information System (INIS)
Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki
2000-01-01
Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A≈180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A break down of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.
1993-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
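The post-processing step described above (turning a stack of equally likely geostatistical simulations into maps of the probability of exceeding a contamination level) is, in essence, a per-location frequency count; a minimal sketch, with array shapes assumed:

```python
import numpy as np

def exceedance_probability_map(realizations, threshold):
    """Collapse a stack of equally likely simulations
    (n_realizations x ny x nx) into a map giving, at each location,
    the fraction of realizations exceeding `threshold`."""
    r = np.asarray(realizations, dtype=float)
    return (r > threshold).mean(axis=0)
```

With hundreds of stochastic images, the resulting map approximates the probability of contamination above the regulatory level at each unsampled location.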
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(1/e) = 1/e and w(1) = 1 and has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
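Prelec's weighting function w(p) = exp(-(-ln p)^α) is simple to evaluate; a minimal sketch (the value of α is illustrative, not from the paper):

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting: w(p) = exp(-(-ln p)**alpha),
    with 0 < alpha < 1.  Fixed points: w(1/e) = 1/e and w(1) = 1."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))
```

For α < 1 the function overweights small probabilities and underweights large ones, the inverse-S shape central to prospect theory.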
Contact angles on stretched solids
Mensink, Liz; Snoeijer, Jacco
2017-11-01
The surface energy of solid interfaces plays a central role in wetting, as it dictates the liquid contact angle. Yet, it has been challenging to measure the solid surface energies independently, without making use of Young's law. Here we present Molecular Dynamics (MD) simulations by which we measure the surface properties for all interfaces, including the solids. We observe changes in contact angles upon stretching the solid substrates, showing that the surface energy is actually strain-dependent. This is clear evidence of the so-called Shuttleworth effect, making it necessary to distinguish surface energy from surface tension. We discuss how this effect gives rise to a new class of elasto-capillary phenomena. ERC Consolidator Grant No. 616918.
Disorders of the cerebellopontine angle
International Nuclear Information System (INIS)
Block, F.
2006-01-01
Disorders of the cerebellopontine angle may present with symptoms such as vertigo, hearing problems, or affection of the trigeminal or facial nerve. Ipsilateral ataxia and contralateral hemiparesis develop in the case of a rather large tumor in this region and indicate involvement of the cerebellum and/or brainstem. However, some of these typical symptoms are not recognized by the patient. Thus, when a disorder of the cerebellopontine angle is suspected, the relevant functions have to be tested clinically. In addition, electrophysiology can confirm dysfunction of these cranial nerves. The mainstay of therapy should be treatment of the underlying cause. Nevertheless, it is not seldom necessary to treat symptoms like vertigo or facial pain. (orig.) [de
Measurement of the angle gamma
International Nuclear Information System (INIS)
Aleksan, R.; Kayser, B.; Sphicas, P.
1993-01-01
The angle γ, at least as defined in the Wolfenstein approximation, is not completely out of reach of current or proposed dedicated B experiments. This conclusion depends crucially on the assumed trigger and tagging efficiencies and also on the expected backgrounds. The work summarized here represents but a first step in the direction of extracting the third angle of the unitarity triangle. The theoretical developments during the workshop have resulted in a clearer understanding of the quantities studied. On the experimental side, new decay modes (i.e. in addition to the traditional ρK_S decay) have resulted in expectations for observing CP violation in B_s decays which are not unreasonable. It is conceivable that a dedicated B experiment can probe a fundamental aspect of the Standard Model, the CKM matrix, in multiple ways. In the process, new physics can appear anywhere along the line
Simple and compact expressions for neutrino oscillation probabilities in matter
International Nuclear Information System (INIS)
Minakata, Hisakazu; Parke, Stephen J.
2016-01-01
We reformulate perturbation theory for neutrino oscillations in matter with an expansion parameter related to the ratio of the solar to the atmospheric Δm² scales. Unlike previous works, we use a renormalized basis in which certain first-order effects are taken into account in the zeroth-order Hamiltonian. We show that the new framework has an exceptional feature that leads to the neutrino oscillation probability in matter with the same structure as in vacuum to first order in the expansion parameter. It facilitates immediate physical interpretation of the formulas, and makes the expressions for the neutrino oscillation probabilities extremely simple and compact. We find, for example, that the ν_e disappearance probability at this order is of a simple two-flavor form with an appropriately identified mixing angle and Δm². More generally, all the oscillation probabilities can be written in the universal form with the channel-discrimination coefficient of 0, ±1 or simple functions of θ_23. Despite their simple forms they include all-order effects of θ_13 and all-order effects of the matter potential, to first order in our expansion parameter.
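For reference, the standard two-flavor vacuum formula that the authors' first-order matter expression mirrors (with an effective mixing angle and Δm²) can be evaluated as follows; this is the textbook formula, not the paper's renormalized expansion:

```python
import math

def two_flavor_survival(sin2_2theta, dm2_ev2, L_km, E_gev):
    """Two-flavor vacuum survival probability
    P = 1 - sin^2(2*theta) * sin^2(1.267 * dm2 * L / E),
    with dm2 in eV^2, baseline L in km, and energy E in GeV."""
    phase = 1.267 * dm2_ev2 * L_km / E_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2
```

The paper's claim is that, to first order in its expansion parameter, the ν_e disappearance probability in matter takes exactly this form once θ and Δm² are replaced by their matter-corrected counterparts.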
LHC Report: playing with angles
Mike Lamont for the LHC team
2016-01-01
Ready (after a machine development period), steady (running), go (for a special run)! The crossing angles are an essential feature of the machine set-up. They have to be big enough to reduce the long-range beam-beam effect. The LHC has recently enjoyed a period of steady running and managed to set a new record for “Maximum Stable Luminosity Delivered in 7 days” of 3.29 fb-1 between 29 August and 4 September. The number of bunches per beam remains pegged at 2220 because of the limitations imposed by the SPS beam dump. The bunch population is also somewhat reduced due to outgassing near one of the injection kickers at point 8. Both limitations will be addressed during the year-end technical stop, opening the way for increased performance in 2017. On 10 and 11 September, a two-day machine development (MD) period took place. The MD programme included a look at the possibility of reducing the crossing angle at the high-luminosity interaction points.
Light Scattering at Various Angles
Latimer, Paul; Pyle, B. E.
1972-01-01
The Mie theory of scattering is used to provide new information on how changes in particle volume, with no change in dry weight, should influence light scattering for various scattering angles and particle sizes. Many biological cells (e.g., algal cells, erythrocytes) and large subcellular structures (e.g., chloroplasts, mitochondria) in suspension undergo this type of reversible volume change, a change which is related to changes in the rates of cellular processes. A previous study examined the effects of such volume changes on total scattering. In this paper scattering at 10° is found to follow total scattering closely, but scattering at 45°, 90°, 135°, and 170° behaves differently. Small volume changes can cause very large observable changes in large angle scattering if the sample particles are uniform in size; however, the natural particle size heterogeneity of most samples would mask this effect. For heterogeneous samples of most particle size ranges, particle shrinkage is found to increase large angle scattering. PMID:4556610
Angle comparison using an autocollimator
Geckeler, Ralf D.; Just, Andreas; Vasilev, Valentin; Prieto, Emilio; Dvorácek, František; Zelenika, Slobodan; Przybylska, Joanna; Duta, Alexandru; Victorov, Ilya; Pisani, Marco; Saraiva, Fernanda; Salgado, Jose-Antonio; Gao, Sitian; Anusorn, Tonmueanwai; Leng Tan, Siew; Cox, Peter; Watanabe, Tsukasa; Lewis, Andrew; Chaudhary, K. P.; Thalmann, Ruedi; Banreti, Edit; Nurul, Alfiyati; Fira, Roman; Yandayan, Tanfer; Chekirda, Konstantin; Bergmans, Rob; Lassila, Antti
2018-01-01
Autocollimators are versatile optical devices for the contactless measurement of the tilt angles of reflecting surfaces. An international key comparison (KC) on autocollimator calibration, EURAMET.L-K3.2009, was initiated by the European Association of National Metrology Institutes (EURAMET) to provide information on the capabilities in this field. The Physikalisch-Technische Bundesanstalt (PTB) acted as the pilot laboratory, with a total of 25 international participants from EURAMET and from the Asia Pacific Metrology Programme (APMP) providing measurements. This KC was the first one to utilise a high-resolution electronic autocollimator as a standard. In contrast to KCs in angle metrology which usually involve the full plane angle, it focused on relatively small angular ranges (+/-10 arcsec and +/-1000 arcsec) and step sizes (10 arcsec and 0.1 arcsec, respectively). This document represents the approved final report on the results of the KC. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCL, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
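The autocollimator principle behind this comparison admits a one-line conversion between focal-plane spot displacement and surface tilt. A minimal sketch, assuming the textbook relation d = f·tan(2α) (a mirror tilt of α deflects the reflected beam by 2α); the function name and the numbers are illustrative, not values from the key comparison:

```python
import math

def tilt_from_displacement(d_mm, focal_mm):
    """Mirror tilt angle (arcsec) from focal-plane spot displacement.

    Uses the textbook autocollimator relation d = f * tan(2 * alpha):
    a mirror tilt of alpha deflects the reflected beam by 2 * alpha.
    """
    alpha_rad = 0.5 * math.atan2(d_mm, focal_mm)
    return math.degrees(alpha_rad) * 3600.0  # radians -> arcsec

# A 10 arcsec tilt (the KC's coarser step size) on a hypothetical
# f = 300 mm autocollimator moves the spot by d = f * tan(2 * alpha);
# inverting the relation recovers the tilt.
d = 300.0 * math.tan(2.0 * math.radians(10.0 / 3600.0))
print(round(tilt_from_displacement(d, 300.0), 6))
```

The arctangent form matters only for large tilts; over the ±10 arcsec range of the comparison the small-angle approximation α ≈ d/(2f) is equally good.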
THE BLACK HOLE FORMATION PROBABILITY
Energy Technology Data Exchange (ETDEWEB)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
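The probabilistic framing can be illustrated with a toy sketch. The logistic form and its parameters below are hypothetical placeholders, chosen only to show what a P_BH(M_ZAMS) curve looks like; the paper derives no such closed form:

```python
import math

def p_bh(m_zams, m_half=25.0, width=5.0):
    """Toy logistic sketch of P_BH(M_ZAMS): the probability that a star
    of the given zero-age main-sequence mass (solar masses) collapses to
    a black hole rather than a neutron star. The transition mass and
    width are illustrative, not fitted values from the paper."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

# Low-mass stars almost always leave neutron stars, very massive ones
# mostly black holes; the transition is gradual rather than a sharp cut.
for m in (10, 20, 25, 30, 50):
    print(m, round(p_bh(m), 3))
```

A population synthesis code would then draw a uniform random number per star and compare it to p_bh(m) instead of applying a hard mass cut.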
The Nasolabial Angle Among Patients with Total Cleft Lip and Palate.
Paradowska-Stolarz, Anna M; Kawala, Beata
2015-01-01
The nasolabial angle is the angle measured between the points columella, subnasale and labiale superius. The reference values vary from 90 to 120 degrees (the mean value is 109.8 degrees). In some disorders, the nasolabial angle might change, which influences the facial profile. One such deformity is a cleft. The nasolabial angle might be decreased in cleft patients due to deformation of the nose and upper lip, which might be caused by the reconstructive surgical procedures performed. The aim of the study was to compare the nasolabial angle between groups of patients with total clefts of the lip, alveolar bone and palate and healthy individuals. The cephalometric X-rays of 118 patients with clefts (73 boys and 45 girls) and 101 healthy individuals (32 boys and 69 girls) were used to measure and compare the nasolabial angle. In patients with cleft deformities, the nasolabial angle values were smaller than in healthy individuals. Among the patients with clefts, those with a bilateral type of deformity are characterized by the highest mean values of the nasolabial angle. The angle is smaller in girls than in boys. The nasolabial angle in patients with total clefts of the lip, alveolar bone and palate is statistically smaller than in healthy individuals. This might be a result of either the deformation of the upper lip or (more probably) the nose. The orthodontic treatment should be individualized.
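The three-landmark angle computation such studies rely on can be sketched directly. The coordinates below are hypothetical landmark positions in arbitrary cephalogram units, not study data:

```python
import math

def angle_deg(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c in 2D,
    e.g. columella - subnasale - labiale superius on a cephalogram."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Example landmarks giving an angle near the reported ~110 degree mean.
columella = (0.0, 1.0)
subnasale = (0.0, 0.0)
labiale_superius = (0.94, -0.34)
print(round(angle_deg(columella, subnasale, labiale_superius), 1))
```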
Foundations of the theory of probability
Kolmogorov, AN
2018-01-01
This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
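The geometry-to-probability connection the article builds on can be sketched in a few lines; the sector angles below are an invented example, not the article's figures:

```python
def spinner_probabilities(sector_angles):
    """Probability of landing on each side of a spinner whose sides
    subtend the given central angles (degrees). For a fair spinner the
    angles are equal; biasing a side means enlarging its sector."""
    total = sum(sector_angles)
    return [a / total for a in sector_angles]

# A biased four-sided spinner: the 150-degree sector is hit 150/360
# of the time, linking circle geometry directly to probability.
probs = spinner_probabilities([150, 90, 70, 50])
print(probs)
```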
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
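A minimal sketch of such a cue/colour design, with illustrative joint probabilities chosen to reproduce the stated 0.5 marginal and the 0.1–0.9 conditional range (the actual trial counts are not given in the abstract):

```python
# Joint distribution over (cue combination, target colour) trial types.
# Numbers are illustrative, not the paper's design.
joint = {
    ("AA", "red"): 0.225, ("AA", "green"): 0.025,
    ("AB", "red"): 0.125, ("AB", "green"): 0.125,
    ("BA", "red"): 0.125, ("BA", "green"): 0.125,
    ("BB", "red"): 0.025, ("BB", "green"): 0.225,
}

def conditional(colour, combo):
    """P(colour | cue combination) from the joint distribution."""
    p_combo = sum(p for (c, _), p in joint.items() if c == combo)
    return joint[(combo, colour)] / p_combo

print(conditional("red", "AA"))  # high conditional probability
print(conditional("red", "BB"))  # low conditional probability
p_red = sum(p for (_, col), p in joint.items() if col == "red")
print(p_red)                     # marginal stays at 0.5
```

The point of the construction is exactly the dissociation the study exploits: each colour is equally likely overall, yet highly predictable once the cue combination is known.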
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
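For context, the standard two-flavour *vacuum* oscillation probability, which the matter-effect approximations of the paper refine, can be sketched as follows (the mixing parameters are illustrative defaults, not the paper's values):

```python
import math

def p_oscillation(L_km, E_GeV, sin2_2theta=0.85, dm2_eV2=2.5e-3):
    """Two-flavour vacuum oscillation probability
    P = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with L in km, E in GeV, dm^2 in eV^2. A baseline sketch only;
    the paper treats the harder problem of accurate approximations
    once matter effects modify theta and dm^2."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

print(p_oscillation(0.0, 1.0))           # no baseline, no oscillation
print(round(p_oscillation(500.0, 1.0), 4))
```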
Void probability scaling in hadron nucleus interactions
International Nuclear Information System (INIS)
Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima
2002-01-01
Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Dependent Human Error Probability Assessment
International Nuclear Information System (INIS)
Simic, Z.; Mikulicic, V.; Vukovic, I.
2006-01-01
This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
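The five dependence levels translate into simple conditional-probability equations. A sketch assuming the standard THERP dependence formulas from NUREG/CR-1278 (the example HEP value is illustrative):

```python
def conditional_hep(p, level):
    """Conditional human error probability for the second of two
    dependent actions, given basic HEP p, using the THERP dependence
    equations of NUREG/CR-1278."""
    equations = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return equations[level](p)

# A nominally rare error (p = 1E-3) becomes far more likely once it
# follows a failed, strongly dependent action in the same sequence.
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, round(conditional_hep(1e-3, level), 4))
```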
Small angle scattering and polymers
International Nuclear Information System (INIS)
Cotton, J.P.
1996-01-01
The determination of polymer structure is a problem of interest for both statistical physics and industrial applications. The average polymer structure is defined. Then, it is shown why small angle scattering, associated with isotopic substitution, is very well suited to the measurement of the chain conformation. The corresponding example is the old, but pedagogic, measurement of the chain form factor in the polymer melt. The powerful contrast variation method is illustrated by a recent determination of the concentration profile of a polymer interface. (author) 12 figs., 48 refs
International Nuclear Information System (INIS)
Mickael, M.; Gardner, R.P.; Verghese, K.
1988-01-01
An improved method for calculating the total probability of particle scattering within the solid angle subtended by finite detectors is developed, presented, and tested. The limiting polar and azimuthal angles subtended by the detector are measured from the direction that most simplifies their calculation rather than from the incident particle direction. A transformation of the particle scattering probability distribution function (pdf) is made to match the transformation of the direction from which the limiting angles are measured. The particle scattering probability to the detector is estimated by evaluating the integral of the transformed pdf over the range of the limiting angles measured from the preferred direction. A general formula for transforming the particle scattering pdf is derived from basic principles and applied to four important scattering pdf's; namely, isotropic scattering in the Lab system, isotropic neutron scattering in the center-of-mass system, thermal neutron scattering by the free gas model, and gamma-ray Klein-Nishina scattering. Some approximations have been made to these pdf's to enable analytical evaluations of the final integrals. These approximations are shown to be valid over a wide range of energies and for most elements. The particle scattering probability to spherical, planar circular, and right circular cylindrical detectors has been calculated using the new and previously reported direct approach. Results indicate that the new approach is valid and is computationally faster by orders of magnitude
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability.
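As the baseline such integral-equation methods improve upon, the common Poisson (independent-crossings) approximation built on Rice's out-crossing rate can be sketched; the parameter values are illustrative:

```python
import math

def upcrossing_rate(b, sigma_x, sigma_v):
    """Rice's mean rate of up-crossings of level b for a stationary
    zero-mean Gaussian process with displacement std sigma_x and
    velocity std sigma_v."""
    return (sigma_v / (2 * math.pi * sigma_x)) * math.exp(-b**2 / (2 * sigma_x**2))

def first_passage_probability(b, sigma_x, sigma_v, T):
    """Poisson approximation of the first-passage probability in time T,
    valid when crossings are rare and nearly independent; the paper's
    integral-equation approach refines this kind of estimate."""
    return 1.0 - math.exp(-upcrossing_rate(b, sigma_x, sigma_v) * T)

# Barrier at 3 standard deviations, 60 s of stationary excitation.
print(round(first_passage_probability(b=3.0, sigma_x=1.0, sigma_v=6.28, T=60.0), 4))
```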
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
Directory of Open Access Journals (Sweden)
Maurer Till
2005-04-01
Background: We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, which are calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results: To test this modeling approach, it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator-activated receptor γ (PPARγ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of PPARγ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion: In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.
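Weighted mean dihedral angles of the kind used as PERMOL restraints must be averaged on the circle, not arithmetically. A minimal sketch (function name and example values are illustrative, not from the program):

```python
import math

def weighted_mean_dihedral(angles_deg, weights=None):
    """Weighted circular mean of dihedral angles (degrees): averaging
    the unit vectors avoids the wrap-around artefact of a plain
    arithmetic mean at the +/-180 degree boundary."""
    if weights is None:
        weights = [1.0] * len(angles_deg)
    s = sum(w * math.sin(math.radians(a)) for a, w in zip(angles_deg, weights))
    c = sum(w * math.cos(math.radians(a)) for a, w in zip(angles_deg, weights))
    return math.degrees(math.atan2(s, c))

# Homologous residues with phi near the boundary: the arithmetic mean
# of -170 and 170 is 0 (the wrong side of the circle); the circular
# mean stays near 180.
print(round(weighted_mean_dihedral([-170.0, 170.0]), 1))
```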
An Angle Criterion for Riesz Bases
DEFF Research Database (Denmark)
Lindner, Alexander M; Bittner, B.
1999-01-01
We present a characterization of Riesz bases in terms of the angles between certain finite dimensional subspaces. Correlations between the bounds of the Riesz basis and the size of the angles are derived.
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle, from drug development to manufacture and marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
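The maximum entropy assignment under an average-value constraint can be sketched for discrete energy levels, recovering the familiar Boltzmann-like form; the energies and the constrained mean below are illustrative values, not from the paper:

```python
import math

def maxent_distribution(energies, mean_energy, tol=1e-12):
    """Maximum-entropy probabilities over discrete energy levels subject
    to a fixed average energy. The solution is the Boltzmann form
    p_i proportional to exp(-beta * E_i); beta is found by bisection,
    since the average energy decreases monotonically in beta."""
    def avg(beta):
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(e * wi for e, wi in zip(energies, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if avg(mid) > mean_energy:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# Three levels with the average energy constrained to 0.8: the
# assignment favours the low level, Boltzmann-style.
p = maxent_distribution([0.0, 1.0, 2.0], 0.8)
print([round(x, 4) for x in p])
```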
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Striatal activity is modulated by target probability.
Hon, Nicholas
2017-06-14
Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.
Defining Probability in Sex Offender Risk Assessment.
Elwood, Richard W
2016-12-01
There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
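The single-case Bayesian calculation the article advocates reduces to Bayes' theorem in odds form. A sketch with illustrative, non-actuarial numbers:

```python
def bayes_update(prior, likelihood_ratio):
    """Posterior probability via Bayes' theorem in odds form:
    posterior odds = prior odds * likelihood ratio. This is the
    single-case (Bayesian) probability the article argues for; the
    inputs are illustrative, not actuarial values."""
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A 10% base rate combined with an assessment result whose likelihood
# ratio is 3 yields an individual posterior of about 25%.
print(round(bayes_update(0.10, 3.0), 4))
```

The Frequentist recidivism rate of a score band enters only through the likelihood ratio; the output is a degree of belief about the individual case.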
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
DEFF Research Database (Denmark)
Hahn, Thomas; Foldspang, Anders
1997-01-01
Quadriceps muscle contraction tends to straighten the Q angle. We expected that sports comprising a high amount of quadriceps training could be associated with low Q angles. The aim of the present study was to estimate the Q angle in athletes and to investigate its potential associations with participation in sport. Three hundred and thirty-nine athletes had their Q angle measured. The mean of right-side Q angles was higher than that of the left side, and the mean Q angle was higher in women than in men. The Q angle was positively associated with years of jogging, and negatively with years of soccer, swimming and sports participation at all. It is concluded that the use of Q angle measurements is questionable.
Wafer scale oblique angle plasma etching
Burckel, David Bruce; Jarecki, Jr., Robert L.; Finnegan, Patrick Sean
2017-05-23
Wafer scale oblique angle etching of a semiconductor substrate is performed in a conventional plasma etch chamber by using a fixture that supports multiple separate Faraday cages. Each cage is formed to include an angled grid surface and is positioned such that it sits over a separate one of the die locations on the wafer surface when the fixture is placed over the wafer. The presence of the Faraday cages influences the local electric field surrounding each wafer die, re-shaping the local field into alignment with the angled grid surface. The re-shaped plasma causes the reactive ions to follow a linear trajectory through the plasma sheath and angled grid surface, ultimately impinging the wafer surface at an angle. The selected geometry of the Faraday cage angled grid surface thus determines the angle at which the reactive ions will impinge the wafer.
Is probability of frequency too narrow?
International Nuclear Information System (INIS)
Martz, H.F.
1993-01-01
Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed.
Evaluation of blotchy pigments in the anterior chamber angle as a sign of angle closure
Directory of Open Access Journals (Sweden)
Harsha L Rao
2012-01-01
Full Text Available Background: Blotchy pigments in the anterior chamber (AC) angle are considered diagnostic of primary angle closure (PAC). But there are no reports either on the prevalence of blotchy pigments in AC angles or the validity of this sign. Aims: To determine the prevalence of blotchy pigments in AC angles and to evaluate their relationship with glaucomatous optic neuropathy (GON) in eyes with occludable angles. Setting and Design: Cross-sectional, comparative study. Materials and Methods: Gonioscopy was performed in 1001 eyes of 526 subjects (245 eyes of 148 consecutive, occludable angle subjects and 756 eyes of 378 non-consecutive, open angle subjects), above 35 years of age. Quadrant-wise location of blotchy pigments was documented. Statistical Analysis: Odds of blotchy pigments in occludable angles against that in open angles were evaluated. Relationship of GON with blotchy pigments in occludable angle eyes was evaluated using a multivariate model. Results: Prevalence of blotchy pigments in occludable angles was 28.6% (95% CI, 22.9-34.3) and in open angles was 4.7% (95% CI, 3.2-6.3). Blotchy pigments were more frequently seen in inferior (16%) and superior quadrants (15%) of occludable angles, and inferior quadrant of open angles (4%). Odds of superior quadrant blotchy pigments in occludable angles were 33 times that in open angles. GON was seen in 107 occludable angle eyes. Blotchy pigments were not significantly associated with GON (odds ratio = 0.5; P = 0.1). Conclusions: Blotchy pigments were seen in 28.6% of occludable angle eyes and 4.7% of open angles eyes. Presence of blotchy pigments in the superior quadrant is more common in occludable angles. Presence of GON in occludable angle eyes was not associated with blotchy pigments.
Evaluation of blotchy pigments in the anterior chamber angle as a sign of angle closure
Rao, Harsha L; Mungale, Sachin C; Kumbar, Tukaram; Parikh, Rajul S; Garudadri, Chandra S
2012-01-01
Background: Blotchy pigments in the anterior chamber (AC) angle are considered diagnostic of primary angle closure (PAC). But there are no reports either on the prevalence of blotchy pigments in AC angles or the validity of this sign. Aims: To determine the prevalence of blotchy pigments in AC angles and to evaluate their relationship with glaucomatous optic neuropathy (GON) in eyes with occludable angles. Setting and Design: Cross-sectional, comparative study. Materials and Methods: Gonioscopy was performed in 1001 eyes of 526 subjects (245 eyes of 148 consecutive, occludable angle subjects and 756 eyes of 378 non-consecutive, open angle subjects), above 35 years of age. Quadrant-wise location of blotchy pigments was documented. Statistical Analysis: Odds of blotchy pigments in occludable angles against that in open angles were evaluated. Relationship of GON with blotchy pigments in occludable angle eyes was evaluated using a multivariate model. Results: Prevalence of blotchy pigments in occludable angles was 28.6% (95% CI, 22.9-34.3) and in open angles was 4.7% (95% CI, 3.2-6.3). Blotchy pigments were more frequently seen in inferior (16%) and superior quadrants (15%) of occludable angles, and inferior quadrant of open angles (4%). Odds of superior quadrant blotchy pigments in occludable angles were 33 times that in open angles. GON was seen in 107 occludable angle eyes. Blotchy pigments were not significantly associated with GON (odds ratio = 0.5; P = 0.1). Conclusions: Blotchy pigments were seen in 28.6% of occludable angle eyes and 4.7% of open angles eyes. Presence of blotchy pigments in the superior quadrant is more common in occludable angles. Presence of GON in occludable angle eyes was not associated with blotchy pigments. PMID:23202393
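The odds comparisons reported in the two records above can be recomputed directly from the published prevalences; a minimal sketch (the numbers come from the abstract, the function names are ours):

```python
def odds(p):
    """Convert a prevalence (proportion) to odds."""
    return p / (1.0 - p)

def odds_ratio(p_exposed, p_control):
    """Odds ratio between two group prevalences."""
    return odds(p_exposed) / odds(p_control)

# Overall prevalence of blotchy pigments (from the abstract):
# 28.6% in occludable angles vs 4.7% in open angles.
overall_or = odds_ratio(0.286, 0.047)
print(round(overall_or, 1))  # ≈ 8.1
```

The reported 33-fold odds for the superior quadrant would follow the same computation using the quadrant-specific prevalences, which the abstract does not give in full.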
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...
Introducing Disjoint and Independent Events in Probability.
Kelly, I. W.; Zwiers, F. W.
Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Collective probabilities algorithm for surface hopping calculations
International Nuclear Information System (INIS)
Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto
2003-01-01
General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the exact CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results given by the IP algorithm and those obtained with the Ehrenfest method.
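The ensemble-level idea behind collective probabilities can be illustrated with a deliberately simplified rule: every trajectory currently in state k hops with the same probability, chosen so the classical ensemble tracks the change in the average quantum population. This is a sketch of the concept, not the authors' derived equations:

```python
def collective_hop_prob(pop_t, pop_tdt, k):
    """Same hopping probability for all trajectories in state k,
    chosen so the ensemble reproduces the loss of the average
    quantum population of state k over one time step.
    pop_t, pop_tdt: average quantum populations at t and t+dt."""
    loss = pop_t[k] - pop_tdt[k]      # net population leaving state k
    if loss <= 0.0 or pop_t[k] == 0.0:
        return 0.0                    # no net outflow -> no hops (minimal switching)
    return min(1.0, loss / pop_t[k])

# Average population of state 0 drops from 0.8 to 0.7:
print(collective_hop_prob([0.8, 0.2], [0.7, 0.3], 0))  # 0.125
```

Clipping negative values to zero mirrors the "fewest hops" requirement: hops occur only when the target population grows.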
Examples of Neutrosophic Probability in Physics
Directory of Open Access Journals (Sweden)
Fu Yuhua
2015-01-01
Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Some open problems in noncommutative probability
International Nuclear Information System (INIS)
Kruszynski, P.
1981-01-01
A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
47 CFR 1.1623 - Probability calculation.
2010-10-01
Title 47 Telecommunication, Part 1 (revised 2010-10-01). Probability calculation. Section 1.1623, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall be...
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Against All Odds: When Logic Meets Probability
van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.
2017-01-01
This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
The probability of the false vacuum decay
International Nuclear Information System (INIS)
Kiselev, V.; Selivanov, K.
1983-01-01
A closed expression for the probability of false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.
Probability elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
The transition probabilities of the reciprocity model
Snijders, T.A.B.
1999-01-01
The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
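For any continuous-time Markov chain the transition probability matrix is P(t) = exp(Qt); the explicit expression derived in the record above specializes this to the reciprocity model's dyadic states. As a generic illustration (a two-state chain, not the reciprocity model itself), the matrix exponential has a simple closed form:

```python
import math

def two_state_ctmc_transition(a, b, t):
    """Transition matrix P(t) = exp(Qt) for the two-state
    continuous-time Markov chain with generator
        Q = [[-a, a], [b, -b]]   (a, b > 0).
    The closed form avoids a numerical matrix exponential."""
    s = a + b
    e = math.exp(-s * t)
    p01 = (a / s) * (1.0 - e)   # P(state 0 -> state 1 in time t)
    p10 = (b / s) * (1.0 - e)
    return [[1.0 - p01, p01], [p10, 1.0 - p10]]

P = two_state_ctmc_transition(1.0, 2.0, 0.7)
print(sum(P[0]))  # rows sum to 1.0 (stochastic matrix)
```

The closed form also makes checking properties easy: P(0) is the identity, and P(t) tends to the stationary distribution (b/(a+b), a/(a+b)) as t grows.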
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
The paediatric Bohler's angle and crucial angle of Gissane: a case series
Directory of Open Access Journals (Sweden)
Crawford Haemish A
2011-01-01
Full Text Available Abstract Background: Bohler's angle and the crucial angle of Gissane can be used to assess calcaneal fractures. While the normal adult values of these angles are widely known, the normal paediatric values have not yet been established. Our aim is to investigate Bohler's angle and the crucial angle of Gissane in a paediatric population and establish normal paediatric reference values. Method: We measured Bohler's angle and the crucial angle of Gissane using normal plain ankle radiographs of 763 patients from birth to 14 years of age, completed over a five-year period from July 2003 to June 2008. Results: In our paediatric study group, the mean Bohler's angle was 35.2 degrees and the mean crucial angle of Gissane was 111.3 degrees. In an adult comparison group, the mean Bohler's angle was 39.2 degrees and the mean crucial angle of Gissane was 113.8 degrees. The differences in Bohler's angle and the crucial angle of Gissane between these two groups were statistically significant. Conclusion: We have presented the normal values of Bohler's angle and the crucial angle of Gissane in a paediatric population. These values may provide a useful comparison to assist with the management of paediatric calcaneal fractures.
The enigma of probability and physics
International Nuclear Information System (INIS)
Mayants, L.
1984-01-01
This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (the science of probability) and probabilistic physics (the application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of the kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed; quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws must be reliably detected by these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
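The binomial logic behind a zero-miss point-estimate demonstration can be checked directly. With independent detections, the probability of detecting all n flaws is POD^n; a sketch of this standard argument (not NASA's actual qualification procedure):

```python
def prob_pass_demo(pod, n):
    """Probability of passing an n-flaw, zero-miss demonstration
    when each flaw is detected independently with probability pod."""
    return pod ** n

# The classic 29-of-29 demonstration: if the true POD were only 0.90,
# passing by chance is unlikely (< 5%), which is why 29/29 successes
# support "POD >= 90% at 95% confidence".
p = prob_pass_demo(0.90, 29)
print(round(p, 3))  # 0.047
```

Optimizing the demonstration then amounts to trading n and the acceptance criterion against the resulting pass probability and false-call rate.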
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
International Nuclear Information System (INIS)
Shimada, Yoshio
2000-01-01
It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability. The change is also expected to differ depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, while varying component failure probabilities between 0 and 1 and using either Japanese or American initiating event frequency data. The analysis showed: (1) the frequencies of surveillance tests, preventive maintenance, and parts replacement for motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases; (2) core damage probability is insensitive to surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little when their failure probabilities change by about one order of magnitude; (3) when Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value, whereas with American failure probability data the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency and the like. (author)
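The kind of sensitivity study described above can be illustrated with a deliberately tiny fault-tree style model (purely hypothetical structure and numbers, not the NRC precursor model): core damage requires an initiating event plus failure of both redundant trains, and one train's failure probability is swept.

```python
def core_damage_prob(ie_freq, p_a, p_b):
    """Toy two-train model: core damage probability is the
    initiating event frequency times the probability that both
    redundant (independent) trains fail."""
    return ie_freq * p_a * p_b

# Sensitivity sweep: vary train A's failure probability around a
# hypothetical base value of 1e-3 and report the relative change.
base = core_damage_prob(1e-2, 1e-3, 1e-3)
for p in (1e-4, 1e-3, 1e-2, 1e-1, 1.0):
    print(p, core_damage_prob(1e-2, p, 1e-3) / base)
```

In this toy model the core damage probability scales linearly with the swept component, which is why increases toward 1 dominate the result, echoing the study's caution about components whose base failure probability moves upward.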
International Nuclear Information System (INIS)
Foudray, Angela M K; Levin, Craig S
2007-01-01
PET at the highest level is an inverse problem: reconstruct the location of the emission (which localizes biological function) from detected photons. Ideally, one would like to directly measure an annihilation photon's incident direction on the detector. In the developed algorithm, Bayesian Estimation for Angle Recovery (BEAR), we utilized the increased information gathered from localizing photon interactions in the detector and developed a Bayesian estimator for a photon's incident direction. Probability distribution functions (PDFs) were filled using an interaction-energy-weighted mean or center of mass (COM) reference space, which had the following computational advantages: (1) a significant reduction in the size of the data in measurement space, making further manipulation and searches faster; (2) the construction of COM space does not depend on measurement location, it takes advantage of measurement symmetries, and data can be added to the training set without knowledge and recalculation of prior training data; (3) calculation of the posterior probability map is fully parallelizable and can scale to any number of processors. These PDFs were used to estimate the point spread function (PSF) in incident angle space for (i) algorithm assessment and (ii) providing probability selection criteria for classification. The algorithm calculates both the incident θ and φ angles, with ∼16 degrees RMS in both angles, limiting the incoming direction to a narrow cone. Feature size did not improve using the BEAR algorithm as an angle filter, but the contrast ratio improved 40% on average.
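The posterior-map classification step at the heart of such an estimator can be sketched in a few lines. The likelihood interface here is hypothetical (in the real algorithm the PDFs are filled from measured interactions in COM space); this just shows the Bayes-rule mechanics of picking the most probable angle bin:

```python
def map_angle_bin(features, pdfs, prior):
    """MAP estimate over discrete angle bins.
    pdfs[b] is a callable returning the learned likelihood
    P(features | bin b); prior[b] is the prior for bin b."""
    post = [pdf(features) * pr for pdf, pr in zip(pdfs, prior)]
    total = sum(post)
    post = [p / total for p in post]          # normalised posterior map
    best = max(range(len(post)), key=post.__getitem__)
    return best, post

# Toy likelihoods for three angle bins, uniform prior:
pdfs = [lambda x: 0.2, lambda x: 0.7, lambda x: 0.1]
best, post = map_angle_bin(None, pdfs, [1 / 3, 1 / 3, 1 / 3])
print(best)  # 1
```

Because each bin's posterior term is independent of the others until normalisation, this computation parallelises trivially across bins, which matches the scalability point in the abstract.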
A thermodynamic model of contact angle hysteresis.
Makkonen, Lasse
2017-08-14
When a three-phase contact line moves along a solid surface, the contact angle no longer corresponds to the static equilibrium angle but is larger when the liquid is advancing and smaller when the liquid is receding. The difference between the advancing and receding contact angles, i.e., the contact angle hysteresis, is of paramount importance in wetting and capillarity. For example, it determines the magnitude of the external force that is required to make a drop slide on a solid surface. Until now, fundamental origin of the contact angle hysteresis has been controversial. Here, this origin is revealed and a quantitative theory is derived. The theory is corroborated by the available experimental data for a large number of solid-liquid combinations. The theory is applied in modelling the contact angle hysteresis on a textured surface, and these results are also in quantitative agreement with the experimental data.
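The external force the abstract mentions is commonly estimated from the hysteresis via the Furmidge relation, F = k·w·γ·(cos θ_rec − cos θ_adv). A sketch (the shape prefactor k and the sample values are illustrative, not taken from the paper):

```python
import math

def furmidge_force(width_m, gamma, theta_adv_deg, theta_rec_deg, k=1.0):
    """Retention force [N] on a drop due to contact angle hysteresis
    (Furmidge relation). width_m: drop contact width [m],
    gamma: liquid surface tension [N/m], k: shape prefactor (~1)."""
    ta = math.radians(theta_adv_deg)
    tr = math.radians(theta_rec_deg)
    return k * width_m * gamma * (math.cos(tr) - math.cos(ta))

# Water drop 2 mm wide, advancing 110 deg, receding 90 deg:
print(furmidge_force(2e-3, 0.072, 110.0, 90.0))  # ~4.9e-5 N
```

With zero hysteresis (equal advancing and receding angles) the retention force vanishes, which is why hysteresis, not the static angle itself, controls drop sliding.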
A Property of Crack Propagation at the Specimen of CFRP with Layer Angle
Energy Technology Data Exchange (ETDEWEB)
Hwang, Gue Wan; Cho, Jae Ung [Kongju Univ., Kongju (Korea, Republic of); Cho, Chong Du [Inha Univ., Incheon (Korea, Republic of)
2016-12-15
CFRP is a composite material manufactured from hybrid resin on a carbon fiber base. Because of its high specific strength and light weight, it has been widely used in various fields. In particular, unidirectional carbon fiber can be laid up at a layer angle, and CFRP made with a layer angle has higher strength than CFRP without one. In this paper, the crack growth behavior for each layer angle was investigated through the crack propagation and fracture behavior of CFRP compact tension specimens as the layer angle was changed. The maximum stress decreases and crack propagation slows down as the layer angle increases. However, this trend has a limit: the stress increases again beyond the layer angle of 60°. This result may serve as reference data for assessing the probability of fatigue fracture when defects arise inside CFRP mechanical structures.
A Property of Crack Propagation at the Specimen of CFRP with Layer Angle
International Nuclear Information System (INIS)
Hwang, Gue Wan; Cho, Jae Ung; Cho, Chong Du
2016-01-01
CFRP is a composite material manufactured from hybrid resin on a carbon fiber base. Because of its high specific strength and light weight, it has been widely used in various fields. In particular, unidirectional carbon fiber can be laid up at a layer angle, and CFRP made with a layer angle has higher strength than CFRP without one. In this paper, the crack growth behavior for each layer angle was investigated through the crack propagation and fracture behavior of CFRP compact tension specimens as the layer angle was changed. The maximum stress decreases and crack propagation slows down as the layer angle increases. However, this trend has a limit: the stress increases again beyond the layer angle of 60°. This result may serve as reference data for assessing the probability of fatigue fracture when defects arise inside CFRP mechanical structures.
Assessing the clinical probability of pulmonary embolism
International Nuclear Information System (INIS)
Miniati, M.; Pistolesi, M.
2001-01-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as a pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
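The score stratification described above translates directly into code (thresholds and reported PE prevalences taken from the abstract; the function name is ours):

```python
def clinical_probability(score):
    """Stratify the simple clinical score for suspected PE:
    score <= 4  -> low          (reported PE prevalence ~10%)
    score 5-8   -> intermediate (reported PE prevalence ~38%)
    score >= 9  -> high         (reported PE prevalence ~81%)"""
    if score <= 4:
        return "low"
    if score <= 8:
        return "intermediate"
    return "high"

print(clinical_probability(6))  # intermediate
```

The stratum then serves as the pretest probability that is updated by subsequent objective testing.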
Jiang, Yuzhen; Chang, Dolly S; Zhu, Haogang; Khawaja, Anthony P; Aung, Tin; Huang, Shengsong; Chen, Qianyun; Munoz, Beatriz; Grossi, Carlota M; He, Mingguang; Friedman, David S; Foster, Paul J
2014-09-01
To determine longitudinal changes in angle configuration in the eyes of primary angle-closure suspects (PACS) treated by laser peripheral iridotomy (LPI) and in untreated fellow eyes. Longitudinal cohort study. Primary angle-closure suspects aged 50 to 70 years were enrolled in a randomized, controlled clinical trial. Each participant was treated by LPI in 1 randomly selected eye, with the fellow eye serving as a control. Angle width was assessed in a masked fashion using gonioscopy and anterior segment optical coherence tomography (AS-OCT) before and at 2 weeks, 6 months, and 18 months after LPI. Angle width in degrees was calculated from Shaffer grades assessed under static gonioscopy. Angle configuration was also evaluated using angle opening distance (AOD250, AOD500, AOD750), trabecular-iris space area (TISA500, TISA750), and angle recess area (ARA) measured in AS-OCT images. No significant difference was found in baseline measures of angle configuration between treated and untreated eyes. At 2 weeks after LPI, the drainage angle on gonioscopy widened from a mean of 13.5° at baseline to a mean of 25.7° in treated eyes, which was also confirmed by significant increases in all AS-OCT angle width measures (P<0.001). Between 6 and 18 months after LPI, angle width in treated eyes decreased significantly by most measures, the exceptions being gonioscopy (P = 0.18), AOD250 (P = 0.167) and ARA (P = 0.83). In untreated eyes, angle width consistently decreased across all follow-up visits after LPI, with a more rapid longitudinal decrease compared with treated eyes (P values for all variables ≤0.003). The annual rate of change in angle width was equivalent to 1.2°/year (95% confidence interval [CI], 0.8-1.6) in treated eyes and 1.6°/year (95% CI, 1.3-2.0) in untreated eyes (P<0.001). Angle width of treated eyes increased markedly after LPI, remained stable for 6 months, and then decreased significantly by 18 months after LPI. Untreated eyes experienced a more consistent and rapid decrease in angle width over the same time period. Copyright © 2014 American Academy of Ophthalmology.
A method for the generation of random multiple Coulomb scattering angles
International Nuclear Information System (INIS)
Campbell, J.R.
1995-06-01
A method for the random generation of spatial angles drawn from non-Gaussian multiple Coulomb scattering distributions is presented. The method employs direct numerical inversion of cumulative probability distributions computed from the universal non-Gaussian angular distributions of Marion and Zimmerman. (author). 12 refs., 3 figs
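The direct numerical inversion described above can be sketched as follows. The tabulated distribution here is an illustrative Gaussian-core-plus-tail stand-in, not the universal Marion and Zimmerman distributions.

```python
import bisect
import math
import random

def build_cdf(pdf, thetas):
    """Tabulate a cumulative distribution from pdf values on a theta grid
    (trapezoidal rule), normalized so the last entry is exactly 1."""
    cdf = [0.0]
    for i in range(1, len(thetas)):
        step = 0.5 * (pdf(thetas[i]) + pdf(thetas[i - 1])) * (thetas[i] - thetas[i - 1])
        cdf.append(cdf[-1] + step)
    total = cdf[-1]
    return [c / total for c in cdf]

def sample_angle(thetas, cdf, u):
    """Invert the tabulated CDF at uniform deviate u by linear interpolation."""
    i = bisect.bisect_left(cdf, u)
    if i == 0:
        return thetas[0]
    frac = (u - cdf[i - 1]) / (cdf[i] - cdf[i - 1])
    return thetas[i - 1] + frac * (thetas[i] - thetas[i - 1])

# Illustrative stand-in for a non-Gaussian scattering distribution:
# a Gaussian core plus a power-law tail (angles in units of the core width).
pdf = lambda t: math.exp(-t * t / 2.0) + 0.05 / (1.0 + t) ** 3
thetas = [i * 0.01 for i in range(1001)]
cdf = build_cdf(pdf, thetas)

random.seed(1)
samples = [sample_angle(thetas, cdf, random.random()) for _ in range(10000)]
```

The same inversion works for any tabulated angular distribution, which is what makes the method attractive for non-Gaussian cases where no closed-form sampler exists.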
Upgrading Probability via Fractions of Events
Directory of Open Access Journals (Sweden)
Frič Roman
2016-08-01
Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability, and its impact on research in stochastics, cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
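A minimal sketch of the Łukasiewicz operations mentioned above (these are the standard truncated sum and product of multivalued logic), showing how Boolean {0, 1}-valued events arise as a special case of [0, 1]-valued ones:

```python
def luk_or(a, b):
    """Łukasiewicz (truncated) sum: the upgraded 'union' of [0,1]-valued events."""
    return min(1.0, a + b)

def luk_and(a, b):
    """Łukasiewicz (truncated) product: the upgraded 'intersection'."""
    return max(0.0, a + b - 1.0)

# Classical Boolean events ({0,1}-valued indicators) embed as a special case:
# on {0,1} inputs the operations reduce to ordinary OR and AND.
bool_cases = [(luk_or(0, 1), 1), (luk_and(1, 1), 1), (luk_and(1, 0), 0)]

# "Fractions of events" combine gradually rather than black-and-white:
partial = luk_or(0.4, 0.3)   # ≈ 0.7
```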
Failure probability analysis of optical grid
Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng
2008-11-01
Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are increasingly applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect for operators to consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the reduction in application failure probability achieved by different backup strategies can be compared, so that the differing requirements of different clients can be satisfied. In an optical grid, when an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, both the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm guarantees the failure probability requirement while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
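The task-based quantification of application failure probability can be illustrated under a simplifying independence assumption; this is a sketch of the general idea, not the paper's DAG analysis or the MDSA algorithm:

```python
def app_failure_prob(task_fail, backups=None):
    """Failure probability of an application whose tasks must all succeed.

    task_fail: list of independent per-task failure probabilities.
    backups:   optional dict task_index -> failure probability of an
               independent backup resource; a backed-up task fails only
               if both the primary and its backup fail.
    """
    backups = backups or {}
    p_ok = 1.0
    for i, p in enumerate(task_fail):
        effective = p * backups[i] if i in backups else p
        p_ok *= (1.0 - effective)
    return 1.0 - p_ok

no_backup = app_failure_prob([0.01, 0.02, 0.05])
# Backing up the least reliable task sharply reduces the application-level risk:
with_backup = app_failure_prob([0.01, 0.02, 0.05], backups={2: 0.05})
```

Comparing `no_backup` and `with_backup` for different backup placements is the kind of trade-off a scheduler can evaluate per client requirement.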
Behavior of Tilted Angle Shear Connectors
Khorramian, Koosha; Maleki, Shervin; Shariati, Mahdi; Ramli Sulong, N. H.
2015-01-01
According to recent research, angle shear connectors are appropriate for transferring longitudinal shear forces across the steel-concrete interface. The angle steel profile has been used in different positions as L-shaped or C-shaped shear connectors. The application of angle shear connectors in tilted positions is of interest in this study. This study investigates the behaviour of tilted angle shear connectors under monotonic loading using experimental push-out tests. Eight push-out specimens are tested to investigate the effects of different angle parameters on the ultimate load capacity of connectors. Two different tilted angles of 112.5 and 135 degrees between the angle leg and the steel beam are considered. In addition, angle sizes and lengths are varied. Two failure modes were observed, consisting of concrete crushing-splitting and connector fracture. Increasing the size of the connector increased the maximum load in most cases. In general, the 135-degree tilted angle shear connectors have higher strength and stiffness than the 112.5-degree type. PMID:26642193
The qualitative criterion of transient angle stability
DEFF Research Database (Denmark)
Lyu, R.; Xue, Y.; Xue, F.
2015-01-01
In almost all the literature, the qualitative assessment of transient angle stability extracts the angle information of generators based on the swing curve. As the angle (or angle difference) of concern and the threshold value rely strongly on engineering experience, the validity and robustness...... of these criteria are weak. Based on the stability mechanism from the extended equal area criterion (EEAC) theory, and combining it with abundant simulations of a real system, this paper analyzes the criteria in most of the literature and finds that the results can be too conservative or too optimistic. It is concluded......
Uncertainty about probability: a decision analysis perspective
International Nuclear Information System (INIS)
Howard, R.A.
1988-01-01
The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, it follows further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to the usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
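The coin-tossing point can be checked numerically: treat the probability of heads as an uncertain "definitive number", assign its mean, then update on one observed head. The three-point prior below is an arbitrary illustration, not taken from the source.

```python
# Candidate values of the definitive number p (probability of heads)
# and a prior belief over them for a possibly-biased coin.
candidates = [0.3, 0.5, 0.7]
prior = [0.25, 0.5, 0.25]

def prob_heads(weights):
    """The probability to assign to a single toss is the MEAN (not the
    median) of the distribution over the definitive number."""
    return sum(w * p for w, p in zip(weights, candidates))

def update_on_heads(weights):
    """Bayes' rule after observing one head."""
    post = [w * p for w, p in zip(weights, candidates)]
    z = sum(post)
    return [w / z for w in post]

before = prob_heads(prior)                  # 0.5: the prior is symmetric
after = prob_heads(update_on_heads(prior))  # > 0.5: a head raises P(heads)
```

Unless the prior is a point mass (absolute certainty about fairness), `after` exceeds `before`, exactly as the abstract claims.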
Applications of Trajectory Solid Angle for Probabilistic Safety Assessment
International Nuclear Information System (INIS)
Wong, Po Kee; Wong, Adam E.; Wong, Anita
2002-01-01
In 1974, a well-known research problem in statistical mechanics, entitled 'To determine and define the probability function P_2 of a particle hitting a predetermined area, given all its parameters of generation and ejection', was openly solicited for solution from research and development organizations in the U.S.A. One of the proposed solutions of the problem, initiated at that time, is by means of the Trajectory Solid Angle (TSA). The TSA is defined as the integral of the dot product of the unit tangent of the particle's trajectory with the vector area element, divided by the square of the position vector connecting the point of ejection and the point on the surface to be hit. The invention provides: (1) the precise and unique solution of the previously unsolved P_2 problem; (2) impacts on governmental NRC safety standards, DOD weapon systems, and many activities in the Department of Energy; (3) impacts on updating the contents of physics and mathematics textbooks at all levels; (4) impacts on scientific instruments with applications in high technologies. The importance of the Trajectory Solid Angle can be seen in a letter from the late Institute Professor P. M. Morse of MIT, who reviewed DOE proposal P7900450 (reference No. 7) in 1979 and wrote to the inventor: 'If the Trajectory Solid Angle is correct it will provide a revolutionary concept in physics'. (authors)
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Dependency models and probability of joint events
International Nuclear Information System (INIS)
Oerjasaeter, O.
1982-08-01
Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
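The conditional-probability machinery behind such joint-event estimates can be sketched with the chain rule; the numbers below are illustrative, showing how a common-cause dependency inflates a joint failure probability relative to the independent case:

```python
def joint_prob(p_first, conditionals):
    """P(E1 and E2 and ...) via the chain rule:
    P(E1) * P(E2|E1) * P(E3|E1,E2) * ... (conditionals given in order)."""
    p = p_first
    for c in conditionals:
        p *= c
    return p

# Independent components: P(A and B) = P(A) * P(B)
independent = joint_prob(0.01, [0.01])   # 1e-4
# Common-cause dependency: P(B|A) far exceeds the unconditional P(B)
dependent = joint_prob(0.01, [0.5])      # 5e-3, fifty times larger
```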
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
"This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines." -CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Probabilities on Streams and Reflexive Games
Directory of Open Access Journals (Sweden)
Andrew Schumann
2014-01-01
Full Text Available Probability measures on streams (e.g., on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Computation of the Complex Probability Function
Energy Technology Data Exchange (ETDEWEB)
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the n-th degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....
Modeling experiments using quantum and Kolmogorov probability
International Nuclear Information System (INIS)
Hess, Karl
2008-01-01
Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.
Keren, G.; Teigen, K.H.
2001-01-01
This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which
Determination of stability of epimetamorphic rock slope using Minimax Probability Machine
Directory of Open Access Journals (Sweden)
Manoj Kumar
2016-01-01
Full Text Available The article employs the Minimax Probability Machine (MPM) for the prediction of the stability status of epimetamorphic rock slope. The MPM gives a worst-case bound on the probability of misclassification of future data points. Bulk density (d), height (H), inclination (β), cohesion (c) and internal friction angle (φ) have been used as inputs of the MPM. This study uses the MPM as a classification technique. Two models, the Linear Minimax Probability Machine (LMPM) and the Kernelized Minimax Probability Machine (KMPM), have been developed. The generalization capability of the developed models has been checked by a case study. The experimental results demonstrate that MPM-based approaches are promising tools for the prediction of the stability status of epimetamorphic rock slope.
Optimum Tilt Angle at Tropical Region
Directory of Open Access Journals (Sweden)
S Soulayman
2015-02-01
Full Text Available One of the important parameters that affect the performance of a solar collector is its tilt angle with the horizon, because varying the tilt angle changes the amount of solar radiation reaching the collector surface. But is the rule of thumb that an Equator-facing position is best for a solar collector valid for the tropical region? It is therefore necessary to determine the optimum tilt for both Equator-facing and Pole-oriented collectors. In addition, the question arises: how many adjustments of the collector tilt angle per year are reasonable for a given surface azimuth angle? A mathematical model was used for estimating the solar radiation on a tilted surface, and for determining the optimum tilt angle and orientation (surface azimuth angle) of the solar collector at any latitude. This model was applied for determining the optimum tilt angle and orientation in the tropical zones, on a daily basis, as well as for a specific period. The optimum angle was computed by searching for the values for which the radiation on the collector surface is a maximum for a particular day or a specific period. The results reveal that changing the tilt angle 12 times a year (i.e., using the monthly optimum tilt angle) maintains approximately the total amount of solar radiation near the maximum value that is found by changing the tilt angle daily to its optimum value. This achieves a yearly gain in solar radiation of 11% to 18% over a solar collector fixed on a horizontal surface.
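A stripped-down version of such an optimum-tilt search can be sketched using extraterrestrial (atmosphere-free) irradiation as a stand-in for the paper's full radiation model; the declination and sunset-hour-angle relations are the standard ones, and the search itself is the brute-force maximization the abstract describes:

```python
import math

def declination(n):
    """Solar declination in degrees for day-of-year n (Cooper's formula)."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + n) / 365.0))

def daily_irradiation(lat_deg, tilt_deg, n):
    """Relative daily beam irradiation on an Equator-facing surface tilted
    tilt_deg at latitude lat_deg, geometry only (no atmosphere)."""
    phi = math.radians(lat_deg)
    d = math.radians(declination(n))
    # An Equator-facing surface tilted by beta behaves like a horizontal
    # surface at the effective latitude (phi - beta).
    beta = math.radians(tilt_deg)
    phi_eff = phi - beta if lat_deg >= 0 else phi + beta

    def half_day(lat):
        x = -math.tan(lat) * math.tan(d)
        return math.acos(min(1.0, max(-1.0, x)))

    ws = half_day(phi)                 # sunset hour angle, horizontal
    wst = min(ws, half_day(phi_eff))   # sunset as seen by the tilted surface
    return max(0.0, math.cos(phi_eff) * math.cos(d) * math.sin(wst)
               + wst * math.sin(phi_eff) * math.sin(d))

def best_tilt(lat_deg, days):
    """Integer tilt (degrees) maximizing total irradiation over the period."""
    return max(range(-60, 61),
               key=lambda t: sum(daily_irradiation(lat_deg, t, n) for n in days))
```

At an equinox the optimum tilt of an Equator-facing surface equals the latitude, which is a quick sanity check on the geometry.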
Modelling the probability of building fires
Directory of Open Access Journals (Sweden)
Vojtěch Barták
2014-12-01
Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
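A minimal logistic-regression sketch of this kind of fire-probability model. The building attributes and data below are hypothetical placeholders, not the study's predictors:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression
    (bias term plus one weight per feature); fine for a toy illustration."""
    w = [0.0] * (len(xs[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            err = sigmoid(z) - y
            w[0] -= lr * err
            for j, xj in enumerate(x):
                w[j + 1] -= lr * err * xj
    return w

def fire_probability(w, x):
    """Predicted fire probability for a building with attribute vector x."""
    return sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))

# Hypothetical attributes: [age_in_decades, has_heating_fault], label = fire
data = [([1, 0], 0), ([2, 0], 0), ([3, 1], 1), ([4, 1], 1), ([2, 1], 1), ([1, 0], 0)]
xs, ys = zip(*data)
w = fit_logistic(list(xs), list(ys))
```

The fitted probabilities per building are exactly what gets rasterized into a probability map.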
Encounter Probability of Individual Wave Height
DEFF Research Database (Denmark)
Liu, Z.; Burcharth, H. F.
1998-01-01
wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
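The encounter-probability and Rayleigh-maximum ideas above can be sketched with the standard textbook formulas; this is a generic illustration, not the paper's specific method:

```python
import math

def encounter_probability(return_period_years, lifetime_years):
    """Probability that the T-year event is met or exceeded at least once
    during the structure lifetime (one independent trial per year)."""
    return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

def expected_max_individual_wave(hs, n_waves):
    """Leading-order expected maximum of n Rayleigh-distributed individual
    wave heights in a sea state with significant wave height hs."""
    return hs * math.sqrt(math.log(n_waves) / 2.0)

# A 100-year design event over a 50-year lifetime is encountered with
# probability ~0.39, far from negligible.
p = encounter_probability(100, 50)
# Expected largest single wave among 1000 waves when Hs = 5 m: ~9.3 m.
hmax = expected_max_individual_wave(5.0, 1000)
```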
Predicting binary choices from probability phrase meanings.
Wallsten, Thomas S; Jang, Yoonhee
2008-08-01
The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.
Certainties and probabilities of the IPCC
International Nuclear Information System (INIS)
2004-01-01
Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)
The probability factor in establishing causation
International Nuclear Information System (INIS)
Hebert, J.
1988-01-01
This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr
Bayesian optimization for computationally extensive probability distributions.
Tamura, Ryo; Hukushima, Koji
2018-01-01
An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
Characteristic length of the knotting probability revisited
International Nuclear Information System (INIS)
Uehara, Erica; Deguchi, Tetsuo
2015-01-01
We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
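At large N, extracting the characteristic length reduces to a log-linear fit of the knotting probability against N. A sketch on synthetic data with an assumed N_K = 300 (illustrative, not the paper's simulation data):

```python
import math

def fit_characteristic_length(ns, probs):
    """Least-squares fit of log P(N) = a - N / N_K; returns N_K.
    Assumes the exponential factor dominates (large-N regime)."""
    ys = [math.log(p) for p in probs]
    n_mean = sum(ns) / len(ns)
    y_mean = sum(ys) / len(ys)
    slope = (sum((n - n_mean) * (y - y_mean) for n, y in zip(ns, ys))
             / sum((n - n_mean) ** 2 for n in ns))
    return -1.0 / slope

# Synthetic knotting probabilities generated with N_K = 300
ns = [100, 200, 400, 800, 1600]
probs = [0.02 * math.exp(-n / 300.0) for n in ns]
nk = fit_characteristic_length(ns, probs)   # recovers ~300
```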
Probability of Survival Decision Aid (PSDA)
National Research Council Canada - National Science Library
Xu, Xiaojiang; Amin, Mitesh; Santee, William R
2008-01-01
A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods
Determining probabilities of geologic events and processes
International Nuclear Information System (INIS)
Hunter, R.L.; Mann, C.J.; Cranwell, R.M.
1985-01-01
The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
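A toy version of pre-aggregation over probability distributions: the exact sum-aggregate of two independent uncertain quantities is a convolution, and the approximation step keeps only the most probable outcomes. This is a generic sketch of the idea, not the paper's methods:

```python
from collections import defaultdict

def convolve(dist_a, dist_b):
    """Distribution of the SUM of two independent discrete quantities,
    each given as {value: probability} - a basic pre-aggregation step."""
    out = defaultdict(float)
    for va, pa in dist_a.items():
        for vb, pb in dist_b.items():
            out[va + vb] += pa * pb
    return dict(out)

def approximate(dist, k):
    """Keep only the k most probable outcomes and renormalize,
    trading accuracy for space, as approximate pre-aggregation does."""
    top = sorted(dist.items(), key=lambda kv: -kv[1])[:k]
    z = sum(p for _, p in top)
    return {v: p / z for v, p in top}

a = {1: 0.5, 2: 0.5}
b = {0: 0.9, 10: 0.1}
agg = convolve(a, b)        # {1: 0.45, 2: 0.45, 11: 0.05, 12: 0.05}
small = approximate(agg, 2) # support shrinks to the two likeliest values
```

Since distribution supports grow multiplicatively under convolution, some such pruning or approximation is what keeps pre-aggregated OLAP cubes tractable.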
Probability of spent fuel transportation accidents
International Nuclear Information System (INIS)
McClure, J.D.
1981-07-01
The transported volume of spent fuel, incident/accident experience, and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 × 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile.
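The quoted rate is simple arithmetic: one accident divided by cumulative shipment-miles. The 2-million-mile figure below is not stated in the source; it is the mileage implied by one accident at the quoted 5 × 10^-7 per-mile rate, used here only to show the calculation:

```python
import math

def accident_rate(n_accidents, total_miles):
    """Point estimate of accidents per shipment-mile."""
    return n_accidents / total_miles

def prob_at_least_one(rate_per_mile, miles):
    """Probability of at least one accident on a trip of the given length,
    under a Poisson model with constant per-mile rate."""
    return 1.0 - math.exp(-rate_per_mile * miles)

rate = accident_rate(1, 2.0e6)           # 5e-7 per mile
p_trip = prob_at_least_one(rate, 1000)   # ~5e-4 for a 1000-mile shipment
```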
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58. General Article by Mohan Delampady and V R Padmawar.
Schreurs, Mervin J; Benjaminse, Anne; Lemmink, Koen A P M
2017-10-03
Cutting is an important skill in team sports, but unfortunately it is also related to non-contact ACL injuries. The purpose was to examine knee kinetics and kinematics at different cutting angles. 13 males and 16 females performed cuts at different angles (45°, 90°, 135° and 180°) at maximum speed. 3D kinematics and kinetics were collected. To determine differences across cutting angles (45°, 90°, 135° and 180°) and sex (female, male), a 4×2 repeated measures ANOVA was conducted, followed by post hoc comparisons (Bonferroni) with alpha level set at α≤0.05 a priori. At all cutting angles, males showed greater knee flexion angles than females. Whereas males maintained their knee flexion angle of -42.53°±8.95° across cutting angles, females decreased their knee flexion angle from -40.6°±7.2° when cutting at 45° to -36.81°±9.10° when cutting at 90°, 135° and 180°. Knee moments increased when cutting towards sharper angles and then stabilized relative to the 45° cutting angle. In conclusion, different cutting angles demand different knee kinematics and kinetics. Sharper cutting angles place the knee more at risk. However, females and males handle this differently, which has implications for injury prevention. Copyright © 2017 Elsevier Ltd. All rights reserved.
Imprecise Probability Methods for Weapons UQ
Energy Technology Data Exchange (ETDEWEB)
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Escape and transmission probabilities in cylindrical geometry
International Nuclear Information System (INIS)
Bjerke, M.A.
1980-01-01
An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross-section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time.
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of their probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.
Collision Probabilities for Finite Cylinders and Cuboids
Energy Technology Data Exchange (ETDEWEB)
Carlvik, I
1967-05-15
Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.
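The Monte Carlo calculations that the analytical formulae are compared against can be sketched generically. The estimator below computes a first-flight collision probability for a homogeneous finite cylinder with a spatially uniform, isotropic source; it is a standard textbook construction, not Carlvik's formulae, and the cross section and dimensions are illustrative assumptions.

```python
import math, random

def collision_probability(sigma, radius, height, n=200_000, seed=1):
    """Monte Carlo estimate of the first-flight collision probability for a
    homogeneous finite cylinder (total cross section `sigma` in 1/cm), for a
    uniform isotropic source: average of 1 - exp(-sigma * d), where d is the
    distance from a random interior point to the surface along a random ray."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        # Uniform point inside the cylinder (sqrt sampling for the radius).
        r = radius * math.sqrt(rng.random())
        phi = 2.0 * math.pi * rng.random()
        x, y, z = r * math.cos(phi), r * math.sin(phi), height * rng.random()
        # Isotropic direction from three normal deviates, normalized.
        u, v, w = (rng.gauss(0.0, 1.0) for _ in range(3))
        norm = math.sqrt(u*u + v*v + w*w)
        u, v, w = u / norm, v / norm, w / norm
        # Distance to the lateral surface (quadratic in the ray parameter).
        a = u*u + v*v
        if a > 0:
            b = x*u + y*v
            c = x*x + y*y - radius*radius  # negative: the point is inside
            t_lat = (-b + math.sqrt(b*b - a*c)) / a
        else:
            t_lat = math.inf
        # Distance to the top or bottom cap.
        t_cap = (height - z) / w if w > 0 else (-z / w if w < 0 else math.inf)
        d = min(t_lat, t_cap)
        acc += 1.0 - math.exp(-sigma * d)
    return acc / n

# An optically thick cylinder should give a collision probability close to 1.
print(round(collision_probability(50.0, 10.0, 10.0), 3))
```

The slow statistical convergence of such sampling is exactly why closed-form expressions like Carlvik's are both quicker and more accurate.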
High throughput nonparametric probability density estimation.
Farmer, Jenny; Jacobs, Donald
2018-01-01
In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
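The core of the order-statistics scoring idea can be illustrated with a toy stand-in: if a trial CDF is correct, the transformed sample values u_i = F(x_i) are uniform, and their sorted values track the expected order statistics i/(n+1). The scaled maximum deviation below is a crude, hypothetical proxy for the paper's sample-size-invariant scoring function, not its actual form.

```python
import math, random

def sqr_diagnostic(samples, cdf):
    """If `cdf` is the true CDF of i.i.d. `samples`, the sorted transformed
    values should track i/(n+1). Returns the maximum deviation scaled by
    sqrt(n) so the statistic is roughly comparable across sample sizes.
    (A simplified stand-in for the paper's scoring function.)"""
    n = len(samples)
    u = sorted(cdf(x) for x in samples)
    return max(abs(ui - i / (n + 1)) for i, ui in enumerate(u, 1)) * math.sqrt(n)

rng = random.Random(0)
data = [rng.gauss(0.0, 1.0) for _ in range(2000)]
normal_cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
wrong_cdf = lambda x: 0.5 * (1 + math.erf((x - 1.0) / math.sqrt(2)))  # shifted model

# The correct model scores (much) lower than the misspecified one.
print(sqr_diagnostic(data, normal_cdf) < sqr_diagnostic(data, wrong_cdf))  # → True
```

The paper's method iterates on trial CDFs until such atypical fluctuations disappear, rather than comparing two fixed candidates as done here.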
Presenting features of primary angle ...
African Journals Online (AJOL)
coma were assessed. The diagnosis of primary angle-closure glaucoma was made on presentation if the intra-ocular pressure was > 21 mmHg, or if a glaucomatous visual field was found, in the presence of a partially or totally closed angle or peripheral anterior synechiae. Provocation tests were not performed. Patients ...
Gaugings at angles from orientifold reductions
International Nuclear Information System (INIS)
Roest, Diederik
2009-01-01
We consider orientifold reductions to N = 4 gauged supergravity in four dimensions. A special feature of this theory is that different factors of the gauge group can have relative angles with respect to the electromagnetic SL(2) symmetry. These are crucial for moduli stabilization and de Sitter vacua. We show how such gaugings at angles generically arise in orientifold reductions.
Automatic Cobb Angle Determination From Radiographic Images
Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.
2013-01-01
Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.
Practical evaluation of action-angle variables
International Nuclear Information System (INIS)
Boozer, A.H.
1984-02-01
A practical method is described for establishing action-angle variables for a Hamiltonian system. That is, a given nearly integrable Hamiltonian is divided into an exactly integrable system plus a perturbation in action-angle form. The transformation of variables, which is carried out using a few short trajectory integrations, permits a rapid determination of trajectory properties throughout a phase space volume
Main factors for fatigue failure probability of pipes subjected to fluid thermal fluctuation
International Nuclear Information System (INIS)
Machida, Hideo; Suzuki, Masaaki; Kasahara, Naoto
2015-01-01
It is very important to grasp failure probability and failure mode appropriately to carry out risk reduction measures at nuclear power plants. To clarify the important factors for the failure probability and failure mode of pipes subjected to fluid thermal fluctuation, failure probability analyses were performed by changing the values of the stress range, stress ratio, stress components and threshold of the stress intensity factor range. The important factors for the failure probability are the stress range, stress ratio (mean stress condition) and threshold of the stress intensity factor range. The important factor for the failure mode is the circumferential angle range of the fluid thermal fluctuation. When a large fluid thermal fluctuation acts on the entire circumferential surface of the pipe, the probability of pipe breakage increases, calling for measures to prevent such a failure and reduce the risk to the plant. When the circumferential angle subjected to fluid thermal fluctuation is small, the failure mode of the piping is leakage, and corrective maintenance might be applicable from the viewpoint of risk to the plant. (author)
Energy and angle differential cross sections for the electron-impact double ionization of helium
International Nuclear Information System (INIS)
Colgan, James P.; Pindzola, M.S.; Robicheaux, F.
2008-01-01
Energy and angle differential cross sections for the electron-impact double ionization of helium are calculated using a non-perturbative time-dependent close-coupling method. Collision probabilities are found by projection of a time evolved nine dimensional coordinate space wave function onto fully antisymmetric products of spatial and spin functions representing three outgoing Coulomb waves. At an incident energy of 106 eV, we present double energy differential cross sections and pentuple energy and angle differential cross sections. The pentuple energy and angle differential cross sections are found to be in relative agreement with the shapes observed in recent (e,3e) reaction microscope experiments. Integration of the differential cross sections over all energies and angles yields a total ionization cross section that is also in reasonable agreement with absolute crossed-beams experiments.
Effects of Compound K-Distributed Sea Clutter on Angle Measurement of Wideband Monopulse Radar
Directory of Open Access Journals (Sweden)
Hong Zhu
2017-01-01
Full Text Available The effects of compound K-distributed sea clutter on angle measurement of wideband monopulse radar are investigated in this paper. We apply the conditional probability density function (pdf of monopulse ratio (MR error to analyze these effects. Based on the angle measurement procedure of the wideband monopulse radar, this conditional pdf is first deduced in detail for the case of compound K-distributed sea clutter plus noise. Herein, the spatial correlation of the texture components for each channel clutter and the correlation of the texture components between the sum and difference channel clutters are considered, and two extreme situations for each of them are tackled. Referring to the measured sea clutter data, angle measurement performances in various K-distributed sea clutter plus noise circumstances are simulated, and the effects of compound K-distributed sea clutter on angle measurement are discussed.
Apparent contact angle and contact angle hysteresis on liquid infused surfaces.
Semprebon, Ciro; McHale, Glen; Kusumaatmaja, Halim
2016-12-21
We theoretically investigate the apparent contact angle and contact angle hysteresis of a droplet placed on a liquid infused surface. We show that the apparent contact angle is not uniquely defined by material parameters, but also has a dependence on the relative size between the droplet and its surrounding wetting ridge formed by the infusing liquid. We derive a closed form expression for the contact angle in the limit of vanishing wetting ridge, and compute the correction for small but finite ridge, which corresponds to an effective line tension term. We also predict contact angle hysteresis on liquid infused surfaces generated by the pinning of the contact lines by the surface corrugations. Our analytical expressions for both the apparent contact angle and contact angle hysteresis can be interpreted as 'weighted sums' between the contact angles of the infusing liquid relative to the droplet and surrounding gas phases, where the weighting coefficients are given by ratios of the fluid surface tensions.
A lattice determination of g_A and ⟨x⟩ from overlap fermions
International Nuclear Information System (INIS)
Guertler, M.; Schiller, A.; Streuer, T.; Freie Univ. Berlin
2004-10-01
We present results for the nucleon's axial charge g_A and the first moment ⟨x⟩ of the unpolarized parton distribution function from a simulation of quenched overlap fermions. (orig.)
Laser peripheral iridoplasty for angle-closure.
Ng, Wai Siene; Ang, Ghee Soon; Azuara-Blanco, Augusto
2012-02-15
Angle-closure glaucoma is a leading cause of irreversible blindness in the world. Treatment is aimed at opening the anterior chamber angle and lowering the IOP with medical and/or surgical treatment (e.g. trabeculectomy, lens extraction). Laser iridotomy works by eliminating pupillary block and widens the anterior chamber angle in the majority of patients. When laser iridotomy fails to open the anterior chamber angle, laser iridoplasty may be recommended as one of the options in current standard treatment for angle-closure. Laser peripheral iridoplasty works by shrinking and pulling the peripheral iris tissue away from the trabecular meshwork. Laser peripheral iridoplasty can be used for crisis of acute angle-closure and also in non-acute situations. To assess the effectiveness of laser peripheral iridoplasty in the treatment of narrow angles (i.e. primary angle-closure suspect), primary angle-closure (PAC) or primary angle-closure glaucoma (PACG) in non-acute situations when compared with any other intervention. In this review, angle-closure will refer to patients with narrow angles (PACs), PAC and PACG. We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2011, Issue 12), MEDLINE (January 1950 to January 2012), EMBASE (January 1980 to January 2012), Latin American and Caribbean Literature on Health Sciences (LILACS) (January 1982 to January 2012), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com), ClinicalTrials.gov (www.clinicaltrials.gov) and the WHO International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en). There were no date or language restrictions in the electronic searches for trials. The electronic databases were last searched on 5 January 2012. We included only randomised controlled trials (RCTs) in this review. Patients with narrow angles, PAC or PACG were eligible. We excluded studies that included only patients with acute presentations
Scoliosis angle. Conceptual basis and proposed definition
Energy Technology Data Exchange (ETDEWEB)
Marklund, T [Linkoepings Hoegskola (Sweden)
1978-01-01
The most commonly used methods of assessing the scoliotic deviation measure angles that are not clearly defined in relation to the anatomy of the patient. In order to give an anatomic basis for such measurements it is proposed to define the scoliotic deviation as the deviation the vertebral column makes with the sagittal plane. Both the Cobb and the Ferguson angles may be based on this definition. The present methods of measurement are then attempts to measure these angles. If the plane of these angles is parallel to the film, the measurement will be correct. Errors in the measurements may be incurred by the projection. A hypothetical projection, called a 'rectified orthogonal projection', is presented, which correctly represents all scoliotic angles in accordance with these principles. It can be constructed in practice with the aid of a computer and by performing measurements on two projections of the vertebral column; a scoliotic curve can be represented independent of the kyphosis and lordosis.
The resection angle in apical surgery
DEFF Research Database (Denmark)
von Arx, Thomas; Janner, Simone F M; Jensen, Simon S
2016-01-01
OBJECTIVES: The primary objective of the present radiographic study was to analyse the resection angle in apical surgery and its correlation with treatment outcome, type of treated tooth, surgical depth and level of root-end filling. MATERIALS AND METHODS: In the context of a prospective clinical...... study, cone beam computed tomography (CBCT) scans were taken before and 1 year after apical surgery to measure the angle of the resection plane relative to the longitudinal axis of the root. Further, the surgical depth (distance from the buccal cortex to the most lingual/palatal point of the resection...... or with the retrofilling length. CONCLUSIONS: Statistically significant differences were observed comparing resection angles of different tooth groups. However, the angle had no significant effect on treatment outcome. CLINICAL RELEVANCE: Contrary to common belief, the resection angle in maxillary anterior teeth...
Experimental study of crossing angle collision
International Nuclear Information System (INIS)
Chen, T.; Rice, D.; Rubin, D.; Sagan, D.; Tigner, M.
1993-01-01
The non-linear coupling due to the beam-beam interaction with a crossing angle has been studied. The major effect of a small (∼12 mrad) crossing angle is to excite the 5Q_x ± Q_s = integer coupling resonance family on large amplitude particles, which results in bad lifetime. On CESR, a small crossing angle (∼2.4 mrad) was created at the IP and a reasonable beam-beam tune-shift was achieved. The decay rate of the beam was measured as a function of horizontal tune with and without the crossing angle. The theoretical analysis, simulation and experimental measurements are in good agreement. The resonance strength as a function of crossing angle was also measured
Apparent Contact Angle and Contact Angle Hysteresis on Liquid Infused Surfaces
Semprebon, Ciro; McHale, Glen; Kusumaatmaja, Halim
2016-01-01
We theoretically investigate the apparent contact angle and contact angle hysteresis of a droplet placed on a liquid infused surface. We show that the apparent contact angle is not uniquely defined by material parameters, but also has a strong dependence on the relative size between the droplet and its surrounding wetting ridge formed by the infusing liquid. We derive a closed form expression for the contact angle in the limit of vanishing wetting ridge, and compute the correction for small b...
Alpha-particle emission probabilities of ²³⁶U obtained by alpha spectrometry.
Marouli, M; Pommé, S; Jobbágy, V; Van Ammel, R; Paepen, J; Stroh, H; Benedik, L
2014-05-01
High-resolution alpha-particle spectrometry was performed with an ion-implanted silicon detector in vacuum on a homogeneously electrodeposited ²³⁶U source. The source was measured at different solid angles subtended by the detector, varying between 0.8% and 2.4% of 4π sr, to assess the influence of coincidental detection of alpha-particles and conversion electrons on the measured alpha-particle emission probabilities. Additional measurements were performed using a bending magnet to eliminate conversion electrons, the results of which coincide with normal measurements extrapolated to an infinitely small solid angle. The measured alpha emission probabilities for the three main peaks, 74.20 (5)%, 25.68 (5)% and 0.123 (5)% respectively, are consistent with literature data, but their precision has been improved by at least one order of magnitude in this work. © 2013 Published by Elsevier Ltd.
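The extrapolation to an infinitely small solid angle amounts to a linear fit of the measured emission probability against the subtended solid angle, evaluated at zero. A sketch with invented, exactly linear data (the real measurements carry uncertainties and need not be perfectly linear):

```python
# Hypothetical illustration of the zero-solid-angle extrapolation; the solid
# angles and biased probabilities below are invented, not the paper's data.
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

solid_angle = [0.008, 0.016, 0.024]   # fraction of 4π sr subtended
p_measured  = [74.28, 74.36, 74.44]   # %, summing-biased (invented) values

slope, intercept = linfit(solid_angle, p_measured)
print(f"extrapolated emission probability: {intercept:.2f}%")  # → 74.20%
```

The intercept is the summing-free estimate, which in the paper is cross-checked against the bending-magnet measurements.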
Survival and Growth of Cottonwood Clones After Angle Planting and Base Angle Treatments
W.K. Randall; Harvey E. Kennedy
1976-01-01
Presently, commercial cottonwood plantations in the lower Mississippi Valley are established using vertically planted, unrooted cuttings with a flat (90°) base. Neither survival nor first-year growth of a group of six Stoneville clones was improved by angle planting or cutting base angles diagonally. For one clone, survival was significantly better when base angle was...
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
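Simpson's paradox with a binary confounder, which the BK-Plot is designed to visualize, can be reproduced with the classic kidney-stone-style counts (textbook numbers, not data from this article):

```python
# Textbook-style counts: (successes, trials) per treatment within each
# stratum of a binary confounder (condition severity).
strata = {
    "mild":   {"treated": (81, 87),   "control": (234, 270)},
    "severe": {"treated": (192, 263), "control": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

# The treated group wins within every stratum...
for group in strata.values():
    assert rate(*group["treated"]) > rate(*group["control"])

# ...yet loses in the aggregate, because severity is unevenly allocated.
agg_t = tuple(map(sum, zip(*(g["treated"] for g in strata.values()))))
agg_c = tuple(map(sum, zip(*(g["control"] for g in strata.values()))))
print(rate(*agg_t) < rate(*agg_c))  # → True: the aggregate reverses the sign
```

The BK-Plot makes this reversal visible by plotting the outcome rate against the confounder prevalence for each treatment arm.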
Creation of the π angle standard for flat angle measurements
Energy Technology Data Exchange (ETDEWEB)
Giniotis, V; Rybokas, M, E-mail: gi@ap.vtu.l, E-mail: MRybokas@gama.l [Department of Information Technologies, Vilnius Gediminas Technical University, Sauletekio al. 11, 10223 Vilnius-40 (Lithuania)
2010-07-01
Angle measurements are based mainly on multiangle prisms (polygons) with autocollimators, rotary encoders for high accuracy, and circular scales as the standards of the flat angle. Traceability of angle measurements is based on the standard of the plane angle, a prism (polygon) calibrated to an appropriate accuracy. Some metrological institutions have established special test benches (comparators) equipped with circular scales or rotary encoders of high accuracy and polygons with autocollimators for angle calibration purposes. Nevertheless, the standard (etalon) of plane angle, the polygon, has many restrictions for the transfer of the angle unit, the radian (rad), and other units of angle. It depends on the number of angles formed by the flat sides of the polygon, which is restricted by technological and metrological difficulties related to the production and accuracy determination of the polygon. A possibility to create a standard of the angle equal to π rad, half the circle, or the full angle is proposed. It can be created by a circular scale with a rotation axis of very high accuracy and two precision reading instruments, usually photoelectric microscopes (PM), placed on opposite sides of the circular scale using special alignment steps. A great variety of angle units and values can be measured, and their traceability ensured, by applying a third PM on the scale. Calibration of the circular scale itself, and of other scales or rotary encoders as well, is possible using the proposed method with an implementation of π rad as the primary standard angle. The proposed method makes it possible to ensure traceability of angle measurements at every laboratory having an appropriate environment and reading instruments of appropriate accuracy, together with a rotary table whose rotation axis is of high accuracy, the rotation trajectory (runout) being in the range of 0.05 µm. Short information about the multipurpose angle measurement test bench developed is presented.
On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!
Directory of Open Access Journals (Sweden)
Mark R. Crovelli
2009-06-01
Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.
Angle closure glaucoma in congenital ectropion uvea
Directory of Open Access Journals (Sweden)
Grace M. Wang
2018-06-01
Full Text Available Purpose: Congenital ectropion uvea is a rare anomaly, which is associated with open, but dysplastic iridocorneal angles that cause childhood glaucoma. Herein, we present 3 cases of angle-closure glaucoma in children with congenital ectropion uvea. Observations: Three children were initially diagnosed with unilateral glaucoma secondary to congenital ectropion uvea at 7, 8 and 13 years of age. The three cases showed 360° of ectropion uvea and iris stromal atrophy in the affected eye. In one case, we have photographic documentation of progression to complete angle closure, which necessitated placement of a glaucoma drainage device 3 years after combined trabeculotomy and trabeculectomy. The 2 other cases, which presented as complete angle closure, also underwent glaucoma drainage device implantation. All three cases had early glaucoma drainage device encapsulation (within 4 months and required additional surgery (cycloablation or trabeculectomy. Conclusions and importance: Congenital ectropion uvea can be associated with angle-closure glaucoma, and placement of glaucoma drainage devices in all 3 of our cases showed early failure due to plate encapsulation. Glaucoma in congenital ectropion uvea requires attention to angle configuration and often requires multiple surgeries to obtain intraocular pressure control. Keywords: Congenital ectropion uvea, Juvenile glaucoma, Angle-closure glaucoma, Glaucoma drainage device
Modified Angle's Classification for Primary Dentition.
Chandranee, Kaushik Narendra; Chandranee, Narendra Jayantilal; Nagpal, Devendra; Lamba, Gagandeep; Choudhari, Purva; Hotwani, Kavita
2017-01-01
This study aims to propose a modification of Angle's classification for primary dentition and to assess its applicability in children from Central India, Nagpur. A modification of Angle's classification has been proposed for application in the primary dentition. Small roman numerals i/ii/iii are used for primary dentition notation to represent Angle's Class I/II/III molar relationships as in the permanent dentition, respectively. To assess the applicability of the modified Angle's classification, a cross-sectional population of 2000 preschool children from Central India, 3-6 years of age and residing in the Nagpur metropolitan city of Maharashtra state, was selected randomly as per the inclusion and exclusion criteria. A majority of children (93.35%) were found to have bilateral Class i, followed by 2.5% with bilateral Class ii and 0.2% with bilateral half-cusp Class iii molar relationships as per the modified Angle's classification for primary dentition. About 3.75% of children had various combinations of Class ii relationships and 0.2% of children had a Class iii subdivision relationship. A modification of Angle's classification for application in the primary dentition has been proposed. A cross-sectional investigation using the new classification revealed various Class ii (6.25%) and Class iii (0.4%) molar relationship cases in a preschool children population in the metropolitan city of Nagpur. Application of the modified Angle's classification to other population groups is warranted to validate its routine application in clinical pediatric dentistry.
Modified angle's classification for primary dentition
Directory of Open Access Journals (Sweden)
Kaushik Narendra Chandranee
2017-01-01
Full Text Available Aim: This study aims to propose a modification of Angle's classification for primary dentition and to assess its applicability in children from Central India, Nagpur. Methods: A modification of Angle's classification has been proposed for application in the primary dentition. Small roman numerals i/ii/iii are used for primary dentition notation to represent Angle's Class I/II/III molar relationships as in the permanent dentition, respectively. To assess the applicability of the modified Angle's classification, a cross-sectional population of 2000 preschool children from Central India, 3–6 years of age and residing in the Nagpur metropolitan city of Maharashtra state, was selected randomly as per the inclusion and exclusion criteria. Results: A majority of children (93.35%) were found to have bilateral Class i, followed by 2.5% with bilateral Class ii and 0.2% with bilateral half-cusp Class iii molar relationships as per the modified Angle's classification for primary dentition. About 3.75% of children had various combinations of Class ii relationships and 0.2% of children had a Class iii subdivision relationship. Conclusions: A modification of Angle's classification for application in the primary dentition has been proposed. A cross-sectional investigation using the new classification revealed various Class ii (6.25%) and Class iii (0.4%) molar relationship cases in a preschool children population in the metropolitan city of Nagpur. Application of the modified Angle's classification to other population groups is warranted to validate its routine application in clinical pediatric dentistry.
Preferred nasolabial angle in Middle Eastern population.
Alharethy, Sami
2017-05-01
To define the preferred nasolabial angle measurement in a Middle Eastern population. An observational study was conducted from January 2012 to January 2016 at the Department of Otolaryngology, Head and Neck Surgery, King Abdulaziz University Hospital, King Saud University, Riyadh, Kingdom of Saudi Arabia. A total of 1027 raters, 506 males and 521 females, were asked to choose the most ideal nasolabial angle for 5 male and 5 female lateral photographs whose nasolabial angles were modified with Photoshop into the following angles (85°, 90°, 95°, 100°, 105°, and 110°). Male raters preferred an angle of 89.5° ± 3.5° (mean ± SD) for males and 90.8° ± 5.6° for females, while female raters preferred an angle of 89.3° ± 3.8° for males and 90.5° ± 4.8° for females. An ANOVA test comparing means among the groups gave p = 0.342; there is no statistically significant difference between the groups. The results of our study showed even more acute angles than those found in the literature. They show what the young generation in our region prefers, and suggest that what could be explained as under-rotation of the nasal tip in other cultures is simply the ideal for some Middle Eastern populations.
Uncertainty relation and probability. Numerical illustration
International Nuclear Information System (INIS)
Fujikawa, Kazuo; Umetsu, Koichiro
2011-01-01
The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly...... report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...... to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...
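The risk-neutral benchmark the authors start from is that a proper scoring rule makes truthful reporting optimal. A minimal check with the quadratic (Brier) rule for a binary event, searching reports on a grid; this illustrates the textbook property, not the paper's continuous-event elicitation design:

```python
def expected_quadratic_score(report, p):
    """Expected quadratic (Brier) score for a binary event of true
    probability p: score 1-(1-r)^2 if the event occurs, 1-r^2 otherwise."""
    return p * (1 - (1 - report) ** 2) + (1 - p) * (1 - report ** 2)

p_true = 0.7
reports = [i / 100 for i in range(101)]

# For a risk-neutral agent, the expected score is maximized at the truth.
best = max(reports, key=lambda r: expected_quadratic_score(r, p_true))
print(best)  # → 0.7
```

A risk-averse agent maximizes expected utility of the score rather than the expected score itself, which is precisely what shifts the optimal report away from the true probability in the binary case studied in the earlier literature.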
Comparing coefficients of nested nonlinear probability models
DEFF Research Database (Denmark)
Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders
2011-01-01
In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability…
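The rescaling problem the KHB method addresses can be seen in a toy latent-variable calculation (a minimal sketch with hypothetical coefficients, not the khb program itself): in a probit latent model y* = b_x·x + b_z·z + e with e ~ N(0, 1) and z independent of x, omitting z folds b_z·z into the error, so the identified coefficient shrinks even though z is not a confounder.

```python
import math

# Latent model: y* = b_x * x + b_z * z + e,  e ~ N(0, 1),  z independent of x.
b_x, b_z, var_z = 1.0, 1.5, 1.0   # hypothetical illustrative values

# Probit coefficients are identified only relative to the latent error scale.
# Full model: residual sd = 1. Reduced model (z omitted): residual sd grows.
reduced_sd = math.sqrt(1.0 + b_z**2 * var_z)
b_x_reduced = b_x / reduced_sd

print(b_x_reduced)  # ≈ 0.555: attenuated although z is unrelated to x
```

Naively comparing b_x across the two models would suggest mediation where there is none; that is the attenuation bias the decomposition corrects.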
A basic course in probability theory
Bhattacharya, Rabi
2016-01-01
This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...
Ignition probabilities for Compact Ignition Tokamak designs
International Nuclear Information System (INIS)
Stotler, D.P.; Goldston, R.J.
1989-09-01
A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
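The "probability of ignition" idea can be sketched in miniature: draw the uncertain physics parameters from their estimated distributions, run the power balance for each draw, and count the fraction of draws that ignite. The toy below is purely illustrative; the lognormal confinement multiplier, the Q ~ h² scaling, and the ignition threshold are all hypothetical stand-ins for the actual CIT code.

```python
import random

random.seed(1)

def sample_Q():
    """Toy power-balance surrogate: Q grows with the sampled confinement
    multiplier h (hypothetical lognormal uncertainty, not the CIT code)."""
    h = random.lognormvariate(0.0, 0.25)   # confinement-time multiplier
    return 10.0 * h**2                     # assumed scaling Q ~ h^2

n_trials = 20_000
q_values = [sample_Q() for _ in range(n_trials)]
p_ignition = sum(q > 20.0 for q in q_values) / n_trials  # "ignition": Q > 20
print(p_ignition)
```

The same machinery yields the full distribution of Q, from which statements like "Q of order 10 in other cases" can be read off.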
Independent events in elementary probability theory
Csenki, Attila
2011-07-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
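The tacitly assumed statement can be checked exactly for a small instance. The sketch below (an illustrative verification, not the paper's proof) realizes three jointly independent events on the product space {0,1}³ with arbitrary probabilities, builds A from {E1, E2} and B from {E3} by the permitted operations, and confirms the multiplication rule by exact enumeration.

```python
from fractions import Fraction
from itertools import product

# Three jointly independent events realized on the product space {0,1}^3.
p = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 5)]

def prob(event):
    """Exact total probability of a set of outcomes (w1, w2, w3)."""
    total = Fraction(0)
    for w in event:
        q = Fraction(1)
        for wi, pi in zip(w, p):
            q *= pi if wi else (1 - pi)
        total += q
    return total

omega = list(product([0, 1], repeat=3))
E = [set(w for w in omega if w[i]) for i in range(3)]

A = E[0] | E[1]           # built from the subset {E1, E2} (union)
B = set(omega) - E[2]     # built from the disjoint subset {E3} (complementation)

assert prob(A & B) == prob(A) * prob(B)   # independence holds exactly
print(prob(A), prob(B), prob(A & B))
```

With these probabilities P(A) = 2/3, P(B) = 4/5 and P(A ∩ B) = 8/15, exactly as the statement predicts.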
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
Python for probability, statistics, and machine learning
Unpingco, José
2016-01-01
This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY
Directory of Open Access Journals (Sweden)
Magdalena Hykšová
2012-03-01
The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It traces two parallel lines of development: on the one hand, the theory of geometric probability was formed with little attention paid to applications other than spatial games of chance. On the other hand, practical rules for estimating area or volume fractions and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of that branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.
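The classic bridge between spatial chance games and stereological estimation is Buffon's needle, which Barbier's 1860 paper generalized. A minimal Monte Carlo sketch (an illustration of the standard result, not of Barbier's derivation): for a needle of length l ≤ d dropped on lines spaced d apart, the crossing probability is 2l/(πd), so counting crossings estimates π.

```python
import math
import random

random.seed(0)

def buffon_crossing_probability(length, spacing, drops):
    """Estimate the probability that a needle of given length crosses one of
    the parallel lines a fixed spacing apart (length <= spacing assumed)."""
    hits = 0
    for _ in range(drops):
        center = random.uniform(0, spacing / 2)   # distance to nearest line
        theta = random.uniform(0, math.pi / 2)    # needle angle
        if center <= (length / 2) * math.sin(theta):
            hits += 1
    return hits / drops

l, d, n = 1.0, 2.0, 200_000
p_hat = buffon_crossing_probability(l, d, n)
pi_hat = 2 * l / (p_hat * d)   # invert P = 2l / (pi * d)
print(pi_hat)
```

Read the other way around, the same formula lets one estimate the length of a curve from the number of times a random line grid crosses it, which is exactly the stereological use.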
Probability analysis of nuclear power plant hazards
International Nuclear Information System (INIS)
Kovacs, Z.
1985-01-01
The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of occurrence of a dangerous event and the significance of its consequences. The analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of analysing the consequences of this release and assessing the risk. The sequence of operations in the individual stages is characterized. The tasks that Czechoslovakia faces in developing the probability analysis of risk are listed, and the composition of a work team for coping with the task is recommended. (J.C.)
Correlations and Non-Linear Probability Models
DEFF Research Database (Denmark)
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
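For a single predictor the derived correlation has a simple closed form. In the probit latent model y* = β·x + e with e ~ N(0, 1), corr(x, y*) = β·σ_x / sqrt(β²σ_x² + 1). The sketch below (an illustration of this textbook identity with hypothetical parameter values, not the authors' tests) checks the formula against a Monte Carlo estimate.

```python
import math
import random

random.seed(42)

beta, sigma_x = 0.8, 1.5   # hypothetical latent-model parameters

# Closed form implied by the latent model y* = beta*x + e, e ~ N(0, 1)
rho_formula = beta * sigma_x / math.sqrt(beta**2 * sigma_x**2 + 1.0)

# Monte Carlo check of the same correlation
n = 100_000
xs = [random.gauss(0.0, sigma_x) for _ in range(n)]
ys = [beta * x + random.gauss(0.0, 1.0) for x in xs]
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
rho_mc = cov / math.sqrt(vx * vy)

print(rho_formula, rho_mc)
```

Because the correlation is scale-free, it is unaffected by the rescaling that plagues direct comparisons of β across samples or models.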
Geometric modeling in probability and statistics
Calin, Ovidiu
2014-01-01
This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
Fixation probability on clique-based graphs
Choi, Jeong-Ok; Yu, Unjong
2018-02-01
The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
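Amplification and suppression are judged against the well-mixed (complete-graph) baseline, where the fixation probability has the closed form ρ = (1 − 1/r)/(1 − 1/rᴺ). A minimal Monte Carlo sketch of that baseline (standard birth-death Moran updating; the authors' clique-based graphs are not reproduced here):

```python
import random

random.seed(7)

def fixation_trial(n, r):
    """One Moran run on the complete graph: return True if the single
    initial mutant (relative fitness r) reaches fixation."""
    mutants = 1
    while 0 < mutants < n:
        # birth: choose a reproducer with probability proportional to fitness
        p_mutant_birth = mutants * r / (mutants * r + (n - mutants))
        birth_is_mutant = random.random() < p_mutant_birth
        # death: a uniformly random individual is replaced by the offspring
        death_is_mutant = random.random() < mutants / n
        mutants += birth_is_mutant - death_is_mutant
    return mutants == n

n, r, runs = 6, 2.0, 20_000
rho_mc = sum(fixation_trial(n, r) for _ in range(runs)) / runs
rho_exact = (1 - 1 / r) / (1 - 1 / r**n)   # well-mixed Moran formula
print(rho_mc, rho_exact)
```

A graph is an amplifier if its simulated fixation probability exceeds this baseline for r > 1, and a suppressor if it falls below it.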
Computing angle of arrival of radio signals
Borchardt, John J.; Steele, David K.
2017-11-07
Various technologies pertaining to computing angle of arrival of radio signals are described. A system that is configured for computing the angle of arrival of a radio signal includes a cylindrical sheath wrapped around a cylindrical object, where the cylindrical sheath acts as a ground plane. The system further includes a plurality of antennas that are positioned about an exterior surface of the cylindrical sheath, and receivers respectively coupled to the antennas. The receivers output measurements pertaining to the radio signal. A processing circuit receives the measurements and computes the angle of arrival of the radio signal based upon the measurements.
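The core computation can be illustrated with the simplest case of phase interferometry between two antennas (a simplified sketch; the patented system uses multiple antennas on a cylindrical sheath, and the numbers below are hypothetical): a plane wave arriving at angle θ from broadside produces a phase difference Δφ = 2πd·sin(θ)/λ between elements spaced d apart, which the processing circuit inverts.

```python
import math

def phase_difference(angle_rad, spacing, wavelength):
    """Phase difference (radians) between two antennas separated by `spacing`
    for a plane wave arriving at `angle_rad` from broadside."""
    path_delta = spacing * math.sin(angle_rad)
    return 2 * math.pi * path_delta / wavelength

def angle_of_arrival(dphi, spacing, wavelength):
    """Invert the phase difference (unambiguous when spacing <= wavelength/2)."""
    return math.asin(dphi * wavelength / (2 * math.pi * spacing))

wavelength = 0.30          # ~1 GHz signal, metres (hypothetical)
spacing = 0.15             # half-wavelength element spacing
true_angle = math.radians(25.0)

dphi = phase_difference(true_angle, spacing, wavelength)
recovered = math.degrees(angle_of_arrival(dphi, spacing, wavelength))
print(recovered)  # ≈ 25.0
```

With more than two elements the over-determined set of such measurements is typically solved in a least-squares sense, which is one plausible role of the processing circuit described above.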
Duelling idiots and other probability puzzlers
Nahin, Paul J
2002-01-01
What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St…
Probability densities and Lévy densities
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler
For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
Probabilities from entanglement, Born's rule from envariance
International Nuclear Information System (INIS)
Zurek, W.
2005-01-01
Full text: I shall discuss consequences of envariance (environment-assisted invariance), a symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance and, hence, for the origin of probabilities in physics. While the derivation of Born's rule for probabilities (p_k = |ψ_k|²) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance, i.e., without invoking Born's rule. (author)
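The rule being derived is computationally trivial but worth stating concretely: outcome probabilities are the squared moduli of the normalized amplitudes. A minimal sketch (the amplitudes are an arbitrary example, not from the talk):

```python
import math

# Born's rule: outcome probabilities are squared moduli of the amplitudes.
amplitudes = [complex(1, 1), complex(0, 1), complex(1, 0)]  # unnormalized
norm = math.sqrt(sum(abs(a) ** 2 for a in amplitudes))
state = [a / norm for a in amplitudes]

probs = [abs(c) ** 2 for c in state]   # p_k = |psi_k|^2
print(probs)  # [0.5, 0.25, 0.25] up to float rounding; sums to 1
```

Envariance arguments aim to justify precisely this map from amplitudes to probabilities without assuming it.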
Risk Probability Estimating Based on Clustering
DEFF Research Database (Denmark)
Chen, Yong; Jensen, Christian D.; Gray, Elizabeth
2003-01-01
…of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…
Fifty challenging problems in probability with solutions
Mosteller, Frederick
1987-01-01
Can you solve the problem of ""The Unfair Subway""? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall…
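The standard resolution of the subway puzzle is a timetable effect, and it simulates in a few lines. The schedule below is an assumption chosen to reproduce the book's 2-in-20 observation (both lines every 10 minutes, the uptown train one minute after the downtown train), not a detail stated in the blurb.

```python
import random

random.seed(3)

def dinner(arrival_minute):
    """Downtown trains pass at minutes 0, 10, 20, ...; uptown trains one
    minute later. Marvin boards whichever comes first after he arrives."""
    m = arrival_minute % 10.0
    return "mother" if 0.0 < m <= 1.0 else "girlfriend"

n = 100_000
p_mother = sum(dinner(random.uniform(0.0, 120.0)) == "mother"
               for _ in range(n)) / n
print(p_mother)  # close to 1/10, matching 2 dinners in 20 working days
```

Both directions are equally frequent, yet the one-minute offset gives the uptown train only a one-minute catchment window out of every ten.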
Path probabilities of continuous time random walks
International Nuclear Information System (INIS)
Eule, Stephan; Friedrich, Rudolf
2014-01-01
Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
Probable Gastrointestinal Toxicity of Kombucha Tea
Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David
1997-01-01
Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).
Lady luck the theory of probability
Weaver, Warren
1982-01-01
"Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day, in business, in love affairs, in forecasting the weather or the stock market, questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa…
Bayesian estimation of core-melt probability
International Nuclear Information System (INIS)
Lewis, H.W.
1984-01-01
A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
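The canonical Bayesian algorithm here reduces to a conjugate update: with a Gamma(a, b) prior on the core-melt rate (per reactor-year) and a Poisson likelihood, observing zero core melts in T reactor-years gives posterior mean a/(b + T). The numbers below are hypothetical stand-ins chosen to land in the paper's factor-of-2-to-4 range, not the Rasmussen-study values.

```python
# Conjugate sketch: Gamma(a, b) prior on the core-melt rate (per reactor-year),
# updated with the observation of zero core melts in T reactor-years.
a, b = 1.0, 20000.0          # hypothetical prior; mean a/b = 5e-5 per reactor-year
T = 40000.0                  # hypothetical accumulated reactor-years, no core melt

prior_mean = a / b
posterior_mean = a / (b + T)  # Poisson likelihood with zero observed events
print(prior_mean / posterior_mean)  # factor-of-3 reduction with these inputs
```

The reduction factor is just (b + T)/b, so it grows with the accumulated incident-free operating experience, exactly the qualitative effect the abstract reports.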
ESTIMATING LONG GRB JET OPENING ANGLES AND REST-FRAME ENERGETICS
Energy Technology Data Exchange (ETDEWEB)
Goldstein, Adam [Space Science Office, VP62, NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States); Connaughton, Valerie [Science and Technology Institute, Universities Space Research Association, Huntsville, AL 35805 (United States); Briggs, Michael S.; Burns, Eric, E-mail: adam.m.goldstein@nasa.gov [Center for Space Plasma and Aeronomic Research, University of Alabama in Huntsville, 320 Sparkman Drive, Huntsville, AL 35899 (United States)
2016-02-10
We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies.
Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces
International Nuclear Information System (INIS)
Vourdas, A.
2014-01-01
The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
Optical fibre angle sensor used in MEMS
International Nuclear Information System (INIS)
Golebiowski, J; Milcarz, Sz; Rybak, M
2014-01-01
There is a need for displacement and angle measurements in many movable MEMS structures. The use of fibre-optic sensors makes it possible to measure micrometre displacements and small rotation angles. Advantages of this type of transducer are its simple design, high processing precision, low cost and ability to perform non-contact measurement. The study presents an analysis of a fibre-optic intensity sensor used for measuring the rotation angle of a movable MEMS structure. The intensity of the light at the photodetector depends essentially on the distance between the reflecting surface and the head surface of the fibre transmitting arm, and on the deflection angle. Experimental tests were made for PMMA 980/1000 plastic fibres with Θ_NA = 33°. The study presents both analytical and practical results, and shows that the calculated and experimental characteristics of the analysed transducers are similar.
Gonioscopy in primary angle closure glaucoma.
Bruno, Christina A; Alward, Wallace L M
2002-06-01
Primary angle closure is a condition characterized by obstruction to aqueous humor outflow by the peripheral iris, and results in changes in the iridocorneal angle that are visible through gonioscopic examination. Gonioscopy in these eyes, however, can be difficult. This chapter discusses techniques that might help in the examination. These include beginning the examination with the inferior angle, methods to help in looking over the iris, cycloplegia, locating the corneal wedge, indentation, van Herick estimation, examining the other eye, and topical glycerin. Finally, there is a discussion about the pathology associated with the closed angle, with emphasis on the appearance of iris bombé, plateau iris, and the distinction between iris processes and peripheral anterior synechiae.
International Nuclear Information System (INIS)
Torrianni, I.L.
1983-01-01
The theoretical and experimental problems arising in very-low-angle diffraction experiments on several kinds of materials are discussed. The importance of synchrotron radiation for such problems is shown. (L.C.) [pt
Directional Wide-Angle Range Finder (DWARF)
National Aeronautics and Space Administration — The proposed innovation, the Directional Wide-Angle Range Finder (DWARF), is a laser range-finder with a wide field-of-view (FOV) and a directional…
Angle measurement with laser feedback instrument.
Chen, Wenxue; Zhang, Shulian; Long, Xingwu
2013-04-08
An instrument for angle measurement based on laser feedback has been designed. The measurement technique is based on the principle that when a wave plate placed in a feedback cavity rotates, its phase retardation varies. Phase retardation is a function of the rotation angle of the wave plate, so the angle can be converted to phase retardation. The phase retardation is measured at certain characteristic points identified in the laser output curve, which is modulated by the laser feedback. The angle of a rotating object can be measured if it is connected to the wave plate. The main advantages of this instrument are high resolution, compactness, flexibility, low cost, efficient use of power, and fast response.
Precision Guidance with Impact Angle Requirements
National Research Council Canada - National Science Library
Ford, Jason
2001-01-01
This paper examines a weapon system precision guidance problem in which the objective is to guide a weapon onto a non-manoeuvring target so that a particular desired angle of impact is achieved using...
Tropical Cyclone Wind Probability Forecasting (WINDP).
1981-04-01
…the accuracy of small probabilities (below 10%) is limited by the number of significant digits given; therefore it should be regarded as being…
The Probability Heuristics Model of Syllogistic Reasoning.
Chater, Nick; Oaksford, Mike
1999-01-01
Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…
Probability & Perception: The Representativeness Heuristic in Action
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
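The die-rolling activity mentioned above translates directly into a simulation that confronts the representativeness heuristic: two fully specified two-roll sequences are equally likely (1/36 each), even when one of them "feels" less random. The sketch below is an illustrative stand-in for the physical classroom exercise.

```python
import random

random.seed(11)

# Two specific sequences of two rolls are equally likely (1/36 each), even
# though (6, 6) can "feel" less representative of randomness than (6, 1).
n = 200_000
counts = {(6, 6): 0, (6, 1): 0}
for _ in range(n):
    pair = (random.randint(1, 6), random.randint(1, 6))
    if pair in counts:
        counts[pair] += 1

freqs = {k: v / n for k, v in counts.items()}
print(freqs)  # both frequencies near 1/36 ≈ 0.0278
```

Tallying empirical frequencies against the exact value 1/36 gives students a concrete check on their intuitions without requiring any proof machinery.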
Critique of `Elements of Quantum Probability'
Gill, R.D.
1998-01-01
We analyse the thesis of Kümmerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension…
Independent Events in Elementary Probability Theory
Csenki, Attila
2011-01-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E1,…
Probable Unusual Transmission of Zika Virus
Centers for Disease Control (CDC) Podcasts
This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.
Error probabilities in default Bayesian hypothesis testing
Gu, Xin; Hoijtink, Herbert; Mulder, J.
2016-01-01
This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
Spatial Probability Cuing and Right Hemisphere Damage
Shaqiri, Albulena; Anderson, Britt
2012-01-01
In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…
Sampling, Probability Models and Statistical Reasoning -RE ...
Indian Academy of Sciences (India)
random sampling allows data to be modelled with the help of probability ... based on different trials to get an estimate of the experimental error ... if e is indeed the true value of the proportion of defectives in the ...
Virus isolation: Specimen type and probable transmission
Indian Academy of Sciences (India)
Virus isolation: Specimen type and probable transmission. Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), and 1 from a 4-day-old child (transplacental transmission).
Estimating the Probability of Negative Events
Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike
2009-01-01
How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…
Concurrency meets probability: theory and practice (abstract)
Katoen, Joost P.
Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between
Confusion between Odds and Probability, a Pandemic?
Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer
2012-01-01
This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…
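The distinction at issue here is a simple bijection that is nonetheless routinely blurred. A minimal sketch (function names are mine, not the manuscript's):

```python
def prob_to_odds(p: float) -> float:
    """Odds in favour of an event: p / (1 - p)."""
    if not 0.0 <= p < 1.0:
        raise ValueError("probability must be in [0, 1)")
    return p / (1.0 - p)

def odds_to_prob(odds: float) -> float:
    """Inverse map: odds / (1 + odds)."""
    if odds < 0.0:
        raise ValueError("odds must be non-negative")
    return odds / (1.0 + odds)

# A 75% probability corresponds to odds of 3 (often written 3:1) --
# the two numbers describe the same event but are not interchangeable.
print(prob_to_odds(0.75))  # 3.0
print(odds_to_prob(3.0))   # 0.75
```

Note that odds and probability nearly coincide only for rare events (p small), which is one reason the confusion persists.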
Probability in Action: The Red Traffic Light
Shanks, John A.
2007-01-01
Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
Probability & Statistics: Modular Learning Exercises. Teacher Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Probability & Statistics: Modular Learning Exercises. Student Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Conditional probability on MV-algebras
Czech Academy of Sciences Publication Activity Database
Kroupa, Tomáš
2005-01-01
Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005
Investigating Probability with the NBA Draft Lottery.
Quinn, Robert J.
1997-01-01
Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
Probability from a Socio-Cultural Perspective
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Neutrosophic Probability, Set, And Logic (first version)
Smarandache, Florentin
2000-01-01
This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, neutrosophy, one extends the classical "probability theory", "fuzzy set", and "fuzzy logic" to "neutrosophic probability", "neutrosophic set", and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.
Pade approximant calculations for neutron escape probability
International Nuclear Information System (INIS)
El Wakil, S.A.; Saad, E.A.; Hendi, A.A.
1984-07-01
The neutron escape probability from a non-multiplying slab containing an internal source is defined in terms of a functional relation for the scattering function of the diffuse reflection problem. The Padé approximant technique is used to obtain numerical results, which are compared with exact results. (author)
On a paradox of probability theory
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)
Escape probabilities for fluorescent x-rays
International Nuclear Information System (INIS)
Dance, D.R.; Day, G.J.
1985-01-01
Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
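The note's closing point, that rational approximations for the exponential integral functions are readily available, can be illustrated with the classical Abramowitz–Stegun approximations. This is a generic sketch of E1 itself, not the authors' detector-specific escape-probability expressions:

```python
import math

# Abramowitz & Stegun approximations for the exponential integral
# E1(x) = ∫_x^∞ e^(-t)/t dt, the special function in which the planar
# escape probabilities are expressed.

def exp1(x: float) -> float:
    if x <= 0.0:
        raise ValueError("E1(x) requires x > 0")
    if x <= 1.0:
        # A&S 5.1.53: polynomial approximation, |error| < 2e-7 on (0, 1].
        a = (-0.57721566, 0.99999193, -0.24991055,
             0.05519968, -0.00976004, 0.00107857)
        return sum(c * x**k for k, c in enumerate(a)) - math.log(x)
    # A&S 5.1.56: rational approximation for x >= 1.
    num = x * x + 2.334733 * x + 0.250621
    den = x * x + 3.330657 * x + 1.681534
    return math.exp(-x) / x * (num / den)

print(round(exp1(1.0), 6))  # 0.219384
```

With such closed-form approximations, the analytic escape-probability expressions can be evaluated directly, avoiding the numerical integration the note mentions for intensifying screens.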
Sequential Probability Ratio Tests: Conservative and Robust
Kleijnen, J.P.C.; Shi, Wen
2017-01-01
In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output
Applied probability models with optimization applications
Ross, Sheldon M
1992-01-01
Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.
Quantum probability and conceptual combination in conjunctions.
Hampton, James A
2013-06-01
I consider the general problem of category conjunctions in the light of Pothos & Busemeyer's (P&B's) quantum probability (QP) account of the conjunction fallacy. I argue that their account as presented cannot capture the "guppy effect": the case in which a class is a better member of a conjunction A∧B than it is of either A or B alone.
Axial vector mass spectrum and mixing angles
International Nuclear Information System (INIS)
Caffarelli, R.V.; Kang, K.
1976-01-01
Spectral sum rules of the axial-vector current and axial-vector current-pseudoscalar field are used to study the axial-vector mass spectrum and mixing angles, as well as the decay constants and mixing angles of the pseudoscalar mesons. In general, the result is quite persuasive for the existence of the J^(PC) = 1^(++) multiplet in which one has a canonical D-E mixing. (Auth.)
Contact angle hysteresis on superhydrophobic stripes.
Dubov, Alexander L; Mourran, Ahmed; Möller, Martin; Vinogradova, Olga I
2014-08-21
We study experimentally and discuss quantitatively the contact angle hysteresis on striped superhydrophobic surfaces as a function of the solid fraction, ϕS. It is shown that the receding regime is determined by a longitudinal sliding motion of the deformed contact line. Despite the anisotropy of the texture, the receding contact angle remains isotropic, i.e., is practically the same in the longitudinal and transverse directions. The cosine of the receding angle grows nonlinearly with ϕS. To interpret this we develop a theoretical model, which shows that the value of the receding angle depends both on weak defects at smooth solid areas and on the strong defects due to the elastic energy of the deformed contact line, which scales as ϕS² ln ϕS. The advancing contact angle was found to be anisotropic, except in a dilute regime, and its value is shown to be determined by the rolling motion of the drop. The cosine of the longitudinal advancing angle depends linearly on ϕS, but a satisfactory fit to the data can only be provided if we generalize the Cassie equation to account for weak defects. The cosine of the transverse advancing angle is much smaller and is maximized at ϕS ≃ 0.5. An explanation of its value can be obtained if we invoke an additional energy due to strong defects in this direction, which is shown to be caused by the adhesion of the drop on solid sectors and is proportional to ϕS². Finally, the contact angle hysteresis is found to be quite large and generally anisotropic, but it becomes isotropic when ϕS ≤ 0.2.
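The baseline relation that this paper generalizes is the classical Cassie equation for a flat-topped composite air/solid interface. A minimal sketch of that baseline only (the Young angle value is an assumed illustrative number, and the paper's defect corrections are not included):

```python
import math

def cassie_angle_deg(phi_s: float, theta_y_deg: float) -> float:
    """Classical Cassie equation for a composite (air/solid) interface:
    cos θ* = ϕS (cos θY + 1) − 1, where ϕS is the solid fraction and
    θY the Young angle on the flat solid."""
    if not 0.0 < phi_s <= 1.0:
        raise ValueError("solid fraction must be in (0, 1]")
    cos_star = phi_s * (math.cos(math.radians(theta_y_deg)) + 1.0) - 1.0
    return math.degrees(math.acos(cos_star))

# Assumed Young angle of 110° (typical hydrophobic coating): shrinking
# the solid fraction drives the apparent contact angle toward 180°.
for phi in (1.0, 0.5, 0.2):
    print(phi, round(cassie_angle_deg(phi, 110.0), 1))
```

The linear dependence of cos θ* on ϕS in this baseline is what the measured longitudinal advancing angle follows, before the weak-defect generalization the authors introduce.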
Small-angle neutron-scattering experiments
International Nuclear Information System (INIS)
Hardy, A.D.; Thomas, M.W.; Rouse, K.D.
1981-04-01
A brief introduction to the technique of small-angle neutron scattering is given. The layout and operation of the small-angle scattering spectrometer, mounted on the AERE PLUTO reactor, is also described. Results obtained using the spectrometer are presented for three materials (doped uranium dioxide, Magnox cladding and nitrided steel) of interest to Springfields Nuclear Power Development Laboratories. The results obtained are discussed in relation to other known data for these materials. (author)
Radiodiagnosis of Cerebellopontine-angle tumors
International Nuclear Information System (INIS)
Weyer, K.H. van de
1979-01-01
The most important radiodiagnostic signs of cerebellopontine-angle tumors are demonstrated. The value of plain films and special projections is discussed. The use of recent diagnostic procedures like scintography, CT and cisternography with oily contrast medium is critically analyzed. The advantage and disadvantages of these procedures are discussed according to their usefullness in evaluating size, route of spread and localisation of cerebellopontine-angle tumors. (orig.) [de
Estimating Elevation Angles From SAR Crosstalk
Freeman, Anthony
1994-01-01
Scheme for processing polarimetric synthetic-aperture-radar (SAR) image data yields estimates of elevation angles along radar beam to target resolution cells. By use of estimated elevation angles, measured distances along radar beam to targets (slant ranges), and measured altitude of aircraft carrying SAR equipment, one can estimate height of target terrain in each resolution cell. Monopulselike scheme yields low-resolution topographical data.
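The height-estimation step described above is simple trigonometry once the angle along the beam is known. A flat-earth sketch (the geometry convention, a depression angle measured below horizontal, is my assumption; operational processing also corrects for Earth curvature and refraction):

```python
import math

def terrain_height(altitude_m: float, slant_range_m: float,
                   depression_deg: float) -> float:
    """Flat-earth height of the target resolution cell: with the beam
    depressed by θ below horizontal, slant range R, and platform
    altitude H, the cell height is h = H - R * sin(θ)."""
    return altitude_m - slant_range_m * math.sin(math.radians(depression_deg))

# Aircraft at 8000 m, slant range 10 km, beam depressed 30° below
# horizontal (all values hypothetical).
print(round(terrain_height(8000.0, 10_000.0, 30.0), 1))  # 3000.0
```

Repeating this per resolution cell, with the angle estimated from the crosstalk, yields the low-resolution topographic map the brief describes.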
Expressions for the Total Yaw Angle
2016-09-01
The total yaw angle, γt, of a ballistic projectile is ... elevation angles from spherical coordinates. We again place point A at the end point of V. Now imagine a plane parallel to the y-z plane that includes
Lateral displacement in small angle multiple scattering
Energy Technology Data Exchange (ETDEWEB)
Bichsel, H.; Hanson, K.M.; Schillaci, K.M. (Los Alamos National Lab., NM (USA))
1982-07-01
Values have been calculated for the average lateral displacement in small-angle multiple scattering of protons with energies of several hundred MeV. The calculations incorporate the Molière distribution, which does not make the Gaussian approximations of the distributions in projected angle and lateral deflection. Compared to other published data, such approximations can lead to errors in the lateral displacement of up to 10% in water.
Monte Carlo methods to calculate impact probabilities
Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.
2014-09-01
Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward
Control of Pan-tilt Mechanism Angle using Position Matrix Method
Directory of Open Access Journals (Sweden)
Hendri Maja Saputra
2013-12-01
Control of the Pan-Tilt Mechanism (PTM) angle for the bomb disposal robot Morolipi-V2 using an inertial measurement unit, the x-IMU, has been carried out. The PTM has to be actively controllable, both manually and automatically, in order to correct the orientation of the moving Morolipi-V2 platform. The x-IMU detects the platform orientation and sends the result in order to automatically control the PTM. The orientation is calculated using quaternions combined with the Madgwick and Mahony filter methods. The orientation data, consisting of the roll (α), pitch (β), and yaw (γ) angles from the x-IMU, are then sent to the camera for controlling the PTM motion (pan and tilt angles) after calculating the reverse angle using the position matrix method. Experimental results using the Madgwick and Mahony methods show that the x-IMU can be used to find the robot platform orientation. Acceleration data from the accelerometer and flux from the magnetometer produce noise with standard deviations of 0.015 g and 0.006 G, respectively. The maximum absolute errors of the Madgwick and Mahony methods with respect to the X-axis are 48.45° and 33.91°, respectively. The implementation of the x-IMU as an inertial sensor to control the pan-tilt mechanism shows good results, in which the pan angle tends to follow the yaw angle and the tilt angle the pitch angle, except for a very small angle shift due to the influence of the roll angle.
Angle closure glaucoma in congenital ectropion uvea.
Wang, Grace M; Thuente, Daniel; Bohnsack, Brenda L
2018-06-01
Congenital ectropion uvea is a rare anomaly, which is associated with open, but dysplastic iridocorneal angles that cause childhood glaucoma. Herein, we present 3 cases of angle-closure glaucoma in children with congenital ectropion uvea. Three children were initially diagnosed with unilateral glaucoma secondary to congenital ectropion uvea at 7, 8 and 13 years of age. The three cases showed 360° of ectropion uvea and iris stromal atrophy in the affected eye. In one case, we have photographic documentation of progression to complete angle closure, which necessitated placement of a glaucoma drainage device 3 years after combined trabeculotomy and trabeculectomy. The 2 other cases, which presented as complete angle closure, also underwent glaucoma drainage device implantation. All three cases had early glaucoma drainage device encapsulation (within 4 months) and required additional surgery (cycloablation or trabeculectomy). Congenital ectropion uvea can be associated with angle-closure glaucoma, and placement of glaucoma drainage devices in all 3 of our cases showed early failure due to plate encapsulation. Glaucoma in congenital ectropion uvea requires attention to angle configuration and often requires multiple surgeries to obtain intraocular pressure control.
Neutron spin echo scattering angle measurement (SESAME)
International Nuclear Information System (INIS)
Pynn, R.; Fitzsimmons, M.R.; Fritzsche, H.; Gierlings, M.; Major, J.; Jason, A.
2005-01-01
We describe experiments in which the neutron spin echo technique is used to measure neutron scattering angles. We have implemented the technique, dubbed spin echo scattering angle measurement (SESAME), using thin films of Permalloy electrodeposited on silicon wafers as sources of the magnetic fields within which neutron spins precess. With 30-μm-thick films we resolve neutron scattering angles to about 0.02 deg. with neutrons of 4.66 Å wavelength. This allows us to probe correlation lengths up to 200 nm in an application to small angle neutron scattering. We also demonstrate that SESAME can be used to separate specular and diffuse neutron reflection from surfaces at grazing incidence. In both of these cases, SESAME can make measurements at higher neutron intensity than is available with conventional methods because the angular resolution achieved is independent of the divergence of the neutron beam. Finally, we discuss the conditions under which SESAME might be used to probe in-plane structure in thin films and show that the method has advantages for incident neutron angles close to the critical angle because multiple scattering is automatically accounted for.
Bounding probabilistic safety assessment probabilities by reality
International Nuclear Information System (INIS)
Fragola, J.R.; Shooman, M.L.
1991-01-01
Investigating failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare-event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovia aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions require continual reassessment and the analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates
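The bounding idea in this paper, that long undamaged survival caps an event probability, can be formalized with a standard zero-failure Poisson bound. This is an illustration of the reasoning, not the authors' procedure:

```python
import math

def zero_event_rate_bound(years_observed: float,
                          confidence: float = 0.95) -> float:
    """Upper confidence bound on an annual event rate when zero events
    were observed over `years_observed` years (Poisson model).

    Solves exp(-rate * T) = 1 - confidence: the largest rate still
    consistent with seeing nothing. At 95% this is the 'rule of three',
    approximately 3 / T.
    """
    if years_observed <= 0.0:
        raise ValueError("observation period must be positive")
    return -math.log(1.0 - confidence) / years_observed

# Roughly two millennia of the Segovia aqueduct standing undamaged
# bounds the annual rate of damaging earthquakes near 1.5e-3 per year,
# consistent with the paper's order-of-magnitude bound of 1e-3.
print(round(zero_event_rate_bound(2000.0), 6))  # 0.001498
```

The same calculation, run in reverse, shows how much survival history is needed before a claimed rate (say 1e-6 per year) could ever be checked against reality.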
Establishment probability in newly founded populations
Directory of Open Access Journals (Sweden)
Gusset Markus
2012-06-01
Abstract. Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the "Wissel plot" with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
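The plotting procedure can be sketched numerically: obtain P0(t) from a stochastic model, transform it, fit the linear tail, and read off the two constants. A minimal sketch with synthetic, noise-free data standing in for model output (the constants c1 and ω1 are assumed values, not the paper's wild-dog estimates):

```python
import math

# Synthetic extinction curve P0(t) = 1 - c1 * exp(-w1 * t) with known
# constants, standing in for output of a stochastic population model.
c1_true, w1_true = 0.8, 0.05
ts = list(range(1, 61))
P0 = [1.0 - c1_true * math.exp(-w1_true * t) for t in ts]

# "Wissel plot": y = -ln(1 - P0(t)) is linear in t, with slope w1 and
# y-axis intercept -ln(c1), from which both constants are read off.
ys = [-math.log(1.0 - p) for p in P0]

# Ordinary least-squares line fit (here the data are exactly linear;
# with real model output only the late-time linear part would be fit).
n = len(ts)
mt, my = sum(ts) / n, sum(ys) / n
slope = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) \
    / sum((t - mt) ** 2 for t in ts)
intercept = my - slope * mt

c1_est = math.exp(-intercept)  # establishment probability
w1_est = slope                 # extinction rate once established
print(round(c1_est, 3), round(w1_est, 3))  # 0.8 0.05
```

With c1 < 1 the intercept –ln(c1) is positive; per the abstract, establishment corresponds to the extrapolated intercept turning negative as the initial release size grows.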
The probability and severity of decompression sickness
Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.
2017-01-01
Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928
Undetected angle closure in patients with a diagnosis of open-angle glaucoma.
Varma, Devesh K; Simpson, Sarah M; Rai, Amandeep S; Ahmed, Iqbal Ike K
2017-08-01
The aim of this study was to identify the proportion of patients referred to a tertiary glaucoma centre with a diagnosis of open-angle glaucoma (OAG) who were found to have angle-closure glaucoma. Retrospective chart review. Consecutive new patients referred for glaucoma management to a tertiary centre between July 2010 and December 2011 were reviewed. Patients whose referrals for glaucoma assessment specified angle status as "open" were included. The data collected included the glaucoma specialist's angle assessment, diagnosis, and glaucoma severity. Eyes with a Shaffer angle grading of 0 over 180 degrees or more were classified as "closed." Of 1234 glaucoma referrals, 179 cases specified a diagnosis of OAG or angles known to be open. Of these, 16 (8.9%) were found on examination by the glaucoma specialist to have angle closure. Pseudoexfoliation was present in 4 of 16 patients (25%) in the missed angle-closure glaucoma (ACG) group and 22 of 108 patients (13.5%) in the remaining OAG group. No difference was found in demographic or ocular biometric parameters between those with confirmed OAG and those with missed ACG. Almost 1 in 11 patients referred by ophthalmologists to a tertiary glaucoma centre with a diagnosis of OAG were in fact found to have angle closure. Given the different treatment approaches for ACG versus OAG, this study suggests a need to strengthen angle evaluations. Copyright © 2017 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
Project study of a small-angle neutron scattering apparatus
International Nuclear Information System (INIS)
Schedler, E.; Pollet, J.L.
1979-03-01
This design study deals with the setup of a low-angle scattering apparatus in the HMI reactor hall in Berlin. The experience of other institutes with facilities of a similar type (especially D11 and D17 at the ILL in Grenoble, the setup at the KFA in Juelich, and that at the PTB in Braunschweig) is included to a large extent. The aims of this paper are: to define the necessary boundary conditions for the construction (including installation of a cold source, the beam line, the neutron guide tube, and an extension of the reactor hall); to determine the properties of the planned apparatus, especially in comparison with D11, probably the most versatile instrument; to make decisions on the design of the components; to work out the detailed drawings for construction; and to estimate the costs and the time necessary for construction if industrial manufacturers carry out the project. (orig.) [de
Satake, Eiki; Amato, Philip P.
2008-01-01
This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
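Bayes' rule as used in such derivations reduces to a one-line posterior computation. A generic numeric sketch (the numbers are hypothetical, not the paper's truth-table examples):

```python
def bayes_posterior(prior: float, likelihood: float,
                    likelihood_given_not: float) -> float:
    """P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]."""
    evidence = likelihood * prior + likelihood_given_not * (1.0 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: a 1% prior for hypothesis H, evidence E with
# P(E|H) = 0.90 and P(E|~H) = 0.05. The posterior stays modest because
# the low prior dominates.
post = bayes_posterior(prior=0.01, likelihood=0.90, likelihood_given_not=0.05)
print(round(post, 4))  # 0.1538
```

The same function applied repeatedly, feeding each posterior back as the next prior, is the prior-to-posterior updating the paper's formulas describe.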
Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa
2011-01-01
This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
Probabilities for gravitational lensing by point masses in a locally inhomogeneous universe
International Nuclear Information System (INIS)
Isaacson, J.A.; Canizares, C.R.
1989-01-01
Probability functions for gravitational lensing by point masses that incorporate Poisson statistics and flux conservation are formulated in the Dyer-Roeder construction. Optical depths to lensing for distant sources are calculated using both the method of Press and Gunn (1973) which counts lenses in an otherwise empty cone, and the method of Ehlers and Schneider (1986) which projects lensing cross sections onto the source sphere. These are then used as parameters of the probability density for lensing in the case of a critical (q0 = 1/2) Friedmann universe. A comparison of the probability functions indicates that the effects of angle-averaging can be well approximated by adjusting the average magnification along a random line of sight so as to conserve flux. 17 references
VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES
International Nuclear Information System (INIS)
G.A. Valentine; F.V. Perry; S. Dartevelle
2005-01-01
Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields, such as performance assessment for hazardous and/or radioactive waste disposal sites, that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on the consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall-rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision
Contact angle of unset elastomeric impression materials.
Menees, Timothy S; Radhakrishnan, Rashmi; Ramp, Lance C; Burgess, John O; Lawson, Nathaniel C
2015-10-01
Some elastomeric impression materials are hydrophobic, and it is often necessary to take definitive impressions of teeth coated with some saliva. New hydrophilic materials have been developed. The purpose of this in vitro study was to compare contact angles of water and saliva on 7 unset elastomeric impression materials at 5 time points from the start of mixing. Two traditional polyvinyl siloxane (PVS) (Aquasil, Take 1), 2 modified PVS (Imprint 4, Panasil), a polyether (Impregum), and 2 hybrid (Identium, EXA'lence) materials were compared. Each material was flattened to 2 mm, and a 5 μL drop of distilled water or saliva was deposited on the surface at 25 seconds (t0) after the start of mixing. Contact angle measurements were made with a digital microscope at initial contact (t0), t1=2 seconds, t2=5 seconds, t3=50% working time, and t4=95% working time. Data were analyzed with a generalized linear mixed model analysis, and individual 1-way ANOVA and Tukey HSD post hoc tests (α=.05). For water, materials grouped into 3 categories at all time points: the modified PVS and one hybrid material (Identium) produced the lowest contact angles, the polyether material was intermediate, and the traditional PVS materials and the other hybrid (EXA'lence) produced the highest contact angles. For saliva, Identium, Impregum, and Imprint 4 were in the group with the lowest contact angle at most time points. Modified PVS materials and one of the hybrid materials are more hydrophilic than traditional PVS materials when measured with water. Saliva behaves differently from water in contact angle measurement on unset impression material and produces a lower contact angle on polyether-based materials. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
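The contact angles above were read from digital microscope images of sessile drops. As an illustration of how such an image measurement reduces to a number, the sketch below uses the half-angle (spherical-cap) approximation, a common simplification that is not necessarily the method used in this study; the function name and geometry assumption are mine.

```python
import math

def contact_angle_deg(height_mm, base_diameter_mm):
    """Contact angle of a sessile drop from its height and base diameter,
    assuming the drop profile is a spherical cap (half-angle method):
    theta = 2 * arctan(2h / d)."""
    return math.degrees(2.0 * math.atan2(2.0 * height_mm, base_diameter_mm))

# A hemispherical drop (height equal to the base radius) gives 90 degrees;
# flatter drops, as on a hydrophilic material, give smaller angles.
theta_hemisphere = contact_angle_deg(1.0, 2.0)
theta_flat = contact_angle_deg(0.2, 2.0)
```

The spherical-cap assumption degrades for large drops where gravity flattens the profile, which is one reason 5 μL drops are typical in such studies.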
Joint survival probability via truncated invariant copula
International Nuclear Information System (INIS)
Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol
2016-01-01
Highlights: • We have studied an issue of dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spread. - Abstract: Given an intensity-based credit risk model, this paper studies dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
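The paper's result uses a truncated invariant Farlie–Gumbel–Morgenstern copula applied to integrated shot-noise intensities. As a much simpler illustration of the same ingredient, the sketch below evaluates a joint survival probability under a plain FGM copula with exponential marginal survival functions; the function name and parameter values are illustrative, not taken from the paper.

```python
import math

def fgm_joint_survival(t1, t2, lam1, lam2, theta):
    """Joint survival probability P(T1 > t1, T2 > t2) under a plain
    Farlie-Gumbel-Morgenstern (FGM) copula with exponential marginals.
    theta in [-1, 1] controls the (weak) dependence; theta = 0 gives
    independence."""
    s1 = math.exp(-lam1 * t1)   # marginal survival P(T1 > t1)
    s2 = math.exp(-lam2 * t2)   # marginal survival P(T2 > t2)
    return s1 * s2 * (1.0 + theta * (1.0 - s1) * (1.0 - s2))

# Positive dependence raises the joint survival probability above
# the independence value s1 * s2.
p_indep = fgm_joint_survival(1.0, 1.0, 0.5, 0.5, 0.0)
p_dep = fgm_joint_survival(1.0, 1.0, 0.5, 0.5, 1.0)
```

The FGM family can only express weak dependence (Kendall's tau at most 2/9), which is one motivation for the paper's modified, truncated invariant form.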
Probabilities, causes and propensities in physics
Suárez, Mauricio
2010-01-01
This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneering effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
Measurement of the resonance escape probability
International Nuclear Information System (INIS)
Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.
1957-01-01
The average cadmium ratio in natural uranium rods has been measured using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor under one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr
Approaches to Evaluating Probability of Collision Uncertainty
Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
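As a concrete, deliberately simplified illustration of the quantity whose uncertainty is at issue, the sketch below estimates a 2-D Pc by Monte Carlo for an uncorrelated encounter-plane covariance. Wrapping such a call in an outer loop over resampled inputs (object sizes, covariance scalings) is the kind of procedure that yields a density of Pc values rather than a point estimate. All names and parameter values are illustrative, not the authors' method.

```python
import random

def pc_2d_mc(miss_x, miss_y, sx, sy, hbr, n=200_000, seed=1):
    """Monte Carlo estimate of the 2-D probability of collision: the
    probability that Gaussian relative-position error (standard
    deviations sx, sy in the encounter plane) places the objects
    within the combined hard-body radius hbr of each other."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dx = miss_x + rng.gauss(0.0, sx)
        dy = miss_y + rng.gauss(0.0, sy)
        if dx * dx + dy * dy <= hbr * hbr:
            hits += 1
    return hits / n
```

A head-on geometry with a hard-body radius several sigma wide gives Pc near 1, while a miss distance many sigma away from the hard body gives Pc near 0; operational cases sit in between, where input uncertainty matters most.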
Multiple model cardinalized probability hypothesis density filter
Georgescu, Ramona; Willett, Peter
2011-09-01
The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.
Conditional probabilities in Ponzano-Regge minisuperspace
International Nuclear Information System (INIS)
Petryk, Roman; Schleich, Kristin
2003-01-01
We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes
Bayesian Prior Probability Distributions for Internal Dosimetry
Energy Technology Data Exchange (ETDEWEB)
Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E.
2001-07-01
The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
Probability and statistics for particle physics
Mana, Carlos
2017-01-01
This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems to be handled that are difficult to tackle by other procedures. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
Uncertainty in T1 mapping using the variable flip angle method with two flip angles
International Nuclear Information System (INIS)
Schabel, Matthias C; Morrell, Glen R
2009-01-01
Propagation of errors, in conjunction with the theoretical signal equation for spoiled gradient echo pulse sequences, is used to derive a theoretical expression for uncertainty in quantitative variable flip angle T1 mapping using two flip angles. This expression is then minimized to derive a rigorous expression for optimal flip angles that elucidates a commonly used empirical result. The theoretical expressions for uncertainty and optimal flip angles are combined to derive a lower bound on the achievable uncertainty for a given set of pulse sequence parameters and signal-to-noise ratio (SNR). These results provide a means of quantitatively determining the effect of changing acquisition parameters on T1 uncertainty. (note)
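The two-point variable flip angle estimate that the note analyzes has a closed form derived from the spoiled gradient echo signal equation via the standard linearization S/sin(a) = E1·S/tan(a) + M0(1-E1). The sketch below implements that textbook relation in the noiseless case; parameter values are illustrative and no noise handling is attempted.

```python
import math

def spgr_signal(T1, TR, alpha, M0=1.0):
    """Ideal spoiled gradient-echo signal (TE / T2* effects ignored):
    S = M0 sin(a) (1 - E1) / (1 - E1 cos(a)), with E1 = exp(-TR/T1)."""
    E1 = math.exp(-TR / T1)
    return M0 * math.sin(alpha) * (1.0 - E1) / (1.0 - E1 * math.cos(alpha))

def t1_from_two_angles(S1, S2, a1, a2, TR):
    """Closed-form T1 from two flip-angle measurements: plotting
    y = S/sin(a) against x = S/tan(a) gives a line of slope
    E1 = exp(-TR/T1), so T1 = -TR / ln(slope)."""
    x1, y1 = S1 / math.tan(a1), S1 / math.sin(a1)
    x2, y2 = S2 / math.tan(a2), S2 / math.sin(a2)
    E1 = (y2 - y1) / (x2 - x1)
    return -TR / math.log(E1)

# Noiseless round trip: synthesize two signals, recover T1 exactly.
TR, T1_true = 5.0, 1000.0            # ms (illustrative values)
a1, a2 = math.radians(3.0), math.radians(17.0)
T1_est = t1_from_two_angles(spgr_signal(T1_true, TR, a1),
                            spgr_signal(T1_true, TR, a2), a1, a2, TR)
```

With noisy signals the recovered slope fluctuates, and it is exactly this propagation of signal noise into T1 that the note's expressions quantify and minimize over the choice of flip angles.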
Probability Weighting as Evolutionary Second-best
Herold, Florian; Netzer, Nick
2011-01-01
The economic concept of the second-best involves the idea that multiple simultaneous deviations from a hypothetical first-best optimum may be optimal once the first-best itself can no longer be achieved, since one distortion may partially compensate for another. Within an evolutionary framework, we translate this concept to behavior under uncertainty. We argue that the two main components of prospect theory, the value function and the probability weighting function, are complements in the sec...
Bayesian probability theory and inverse problems
International Nuclear Information System (INIS)
Kopec, S.
1994-01-01
Bayesian probability theory is applied to the approximate solution of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. The expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L2 norm. The use of a spline prior is also shown. (author)
Probability and Statistics in Aerospace Engineering
Rheinfurth, M. H.; Howell, L. W.
1998-01-01
This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.
Statistical models based on conditional probability distributions
International Nuclear Information System (INIS)
Narayanan, R.S.
1991-10-01
We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)
Marrakesh International Conference on Probability and Statistics
Ouassou, Idir; Rachdi, Mustapha
2015-01-01
This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.
Selected papers on analysis, probability, and statistics
Nomizu, Katsumi
1994-01-01
This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.
Clan structure analysis and rapidity gap probability
International Nuclear Information System (INIS)
Lupia, S.; Giovannini, A.; Ugoccioni, R.
1995-01-01
Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be useful in discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)
Introduction to tensorial resistivity probability tomography
Mauriello, Paolo; Patella, Domenico
2005-01-01
The probability tomography approach developed for the scalar resistivity method is here extended to the 2D tensorial apparent resistivity acquisition mode. The rotational invariant derived from the trace of the apparent resistivity tensor is considered, since it gives on the datum plane anomalies confined above the buried objects. Firstly, a departure function is introduced as the difference between the tensorial invariant measured over the real structure and that computed for a reference uni...
Interaction probability value calculi for some scintillators
International Nuclear Information System (INIS)
Garcia-Torano Martinez, E.; Grau Malonda, A.
1989-01-01
Interaction probabilities for 17 gamma-ray energies between 1 and 1000 keV have been computed and tabulated. The tables may be applied to the case of cylindrical vials with radius 1.25 cm and volumes of 5, 10 and 15 ml. Toluene, Toluene/Alcohol, Dioxane-Naphthalene, PCS, INSTAGEL and HISAFE II scintillators are considered. Graphical results for 10 ml are also given. (Author) 11 refs
Probability of collective excited state decay
International Nuclear Information System (INIS)
Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.
1987-01-01
Decay mechanisms of the condensed excited state formed from highly excited (Rydberg) atoms are considered, i.e. the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiation transitions are the basic processes. The corresponding probabilities are calculated and compared. It is ascertained that the 'Rydberg substance' possesses a macroscopic lifetime (several seconds) and is, in a sense, metastable
SureTrak Probability of Impact Display
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
On the universality of knot probability ratios
Energy Technology Data Exchange (ETDEWEB)
Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)
2011-04-22
Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. The asymptotic form implies an expression for the relative probability of a random polygon of length n having prime knot type K rather than prime knot type L. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)
Calculating Cumulative Binomial-Distribution Probabilities
Scheuer, Ernest M.; Bowerman, Paul N.
1989-01-01
Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
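CUMBIN itself is written in C and its internal algorithm is not described in the record above. The sketch below shows one standard way to compute the same quantity, the cumulative binomial probability, using a multiplicative term recurrence that avoids factorial overflow, together with the k-out-of-n reliability use the abstract mentions; all names are mine.

```python
def cum_binom(n, k, p):
    """P(X <= k) for X ~ Binomial(n, p), accumulated with the term
    recurrence t_{i+1} = t_i * (n - i)/(i + 1) * p/(1 - p); no
    factorials, so moderate-to-large n does not overflow."""
    q = 1.0 - p
    term = q ** n              # P(X = 0)
    total = term
    for i in range(k):
        term *= (n - i) / (i + 1) * (p / q)
        total += term
    return total

def k_out_of_n_reliability(n, k, p):
    """Reliability of a k-out-of-n system: at least k of n independent
    components, each working with probability p, must work."""
    return 1.0 - cum_binom(n, k - 1, p)
```

For a 2-out-of-3 system with component reliability 0.9, this gives 1 - [P(X=0) + P(X=1)] = 1 - (0.001 + 0.027) = 0.972, the kind of k-out-of-n availability figure the abstract describes.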
PSA, subjective probability and decision making
International Nuclear Information System (INIS)
Clarotti, C.A.
1989-01-01
PSA is the natural way of making decisions in the face of uncertainty about potentially dangerous plants; subjective probability, subjective utility and Bayes statistics are the ideal tools for carrying out a PSA. To support this statement, the various stages of the PSA procedure are examined in detail, and step by step the superiority of Bayes techniques over sampling-theory machinery is demonstrated
Box-particle probability hypothesis density filtering
Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.
2014-01-01
This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...
Probability and statistics in particle physics
International Nuclear Information System (INIS)
Frodesen, A.G.; Skjeggestad, O.
1979-01-01
Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)
Probable Unusual Transmission of Zika Virus
Centers for Disease Control (CDC) Podcasts
2011-05-23
This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event. Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID). Date Released: 5/25/2011.
Heart sounds analysis using probability assessment
Czech Academy of Sciences Publication Activity Database
Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel
2017-01-01
Vol. 38, No. 8 (2017), pp. 1685-1700. ISSN 0967-3334. R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01. Institutional support: RVO:68081731. Keywords: heart sounds * FFT * machine learning * signal averaging * probability assessment. Subject RIV: FS - Medical Facilities; Equipment. OECD field: Medical engineering. Impact factor: 2.058, year: 2016
Classical probabilities for Majorana and Weyl spinors
International Nuclear Information System (INIS)
Wetterich, C.
2011-01-01
Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ(t)². The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.
A quantum probability model of causal reasoning
Directory of Open Access Journals (Sweden)
Jennifer S Trueblood
2012-05-01
Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment.
Failure probability analysis on mercury target vessel
International Nuclear Information System (INIS)
Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro
2005-03-01
Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed into the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and materials degradation into account. The loads imposed on the target vessel were the static stresses due to thermal expansion and static pre-pressure on the He-gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradations were deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability over the design lifetime is very low, 10^-11 for the safety hull, meaning that it is very unlikely to fail during the design lifetime. On the other hand, the beam window of the mercury vessel, subjected to high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window can be adequately retained in the space between the safety hull and the mercury vessel by using mercury-leakage sensors. (author)
Converting dose distributions into tumour control probability
International Nuclear Information System (INIS)
Nahum, A.E.
1996-01-01
The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
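A minimal numerical sketch in the spirit of the abstract: Poisson-statistics TCP over a dose-volume histogram, with inter-patient heterogeneity modelled by sampling the radiosensitivity from a normal distribution. Cell kill is reduced here to a single linear exponent and every parameter value is invented for illustration, so this is not the author's actual model.

```python
import math
import random

def tcp(doses, vols, alpha_mean, alpha_sd, clonogen_density,
        n_samples=2000, seed=0):
    """Poisson TCP for an inhomogeneous dose distribution given as
    parallel lists of dose (Gy) and volume (cm^3) bins. Inter-patient
    heterogeneity: the radiosensitivity alpha (1/Gy) is sampled from
    N(alpha_mean, alpha_sd) and the population TCP is the mean of the
    per-alpha TCP values."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_samples):
        a = max(rng.gauss(alpha_mean, alpha_sd), 0.0)
        # expected number of surviving clonogens, summed over dose bins
        survivors = sum(clonogen_density * v * math.exp(-a * d)
                        for d, v in zip(doses, vols))
        acc += math.exp(-survivors)   # Poisson probability of zero survivors
    return acc / n_samples
```

With alpha_sd = 0 this collapses to the single-patient Poisson model, whose dose-response curve is unrealistically steep; a nonzero alpha_sd flattens the population curve toward clinically observed slopes, which is the role the abstract assigns to s_a.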
Consistent probabilities in loop quantum cosmology
International Nuclear Information System (INIS)
Craig, David A; Singh, Parampreet
2013-01-01
A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)
Probability and containment of turbine missiles
International Nuclear Information System (INIS)
Yeh, G.C.K.
1976-01-01
With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)
Pipe failure probability - the Thomas paper revisited
International Nuclear Information System (INIS)
Lydell, B.O.Y.
2000-01-01
Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas-approach' used insights from actual failure statistics to calculate the probability of leakage and conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas
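The core decomposition used by approaches of this kind (a leak frequency multiplied by a conditional rupture probability) can be written as a one-line sketch. This is a generic illustration of the structure, not the Thomas correlation itself; the function name and the example numbers are hypothetical.

```python
def system_rupture_frequency(weld_count, leak_freq_per_weld_year, p_rupture_given_leak):
    """Annual rupture frequency for a piping system:
    number of susceptible locations x leak frequency per location-year
    x conditional probability of rupture given leakage."""
    return weld_count * leak_freq_per_weld_year * p_rupture_given_leak
```

For example, 100 welds at a leak frequency of 1e-4 per weld-year with a 5% conditional rupture probability give 5e-4 ruptures per year.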
Nuclear data uncertainties: I, Basic concepts of probability
Energy Technology Data Exchange (ETDEWEB)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
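Several of the probability laws listed in this abstract (binomial, Poisson, Bayes' theorem) are easy to state as code. The sketch below is a generic illustration of those textbook formulas, not material from the report itself.

```python
import math

def binomial_pmf(k, n, p):
    """P(k successes in n trials with success probability p)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """P(k events when the expected count is lam); the low-p, large-n
    limit of the binomial, common for nuclear counting experiments."""
    return lam**k * math.exp(-lam) / math.factorial(k)

def bayes_posterior(prior, likelihood, evidence):
    """Bayes' theorem: P(H|D) = P(D|H) * P(H) / P(D)."""
    return likelihood * prior / evidence
```

With n = 1000 and p = 0.002, the binomial pmf is already close to the Poisson pmf with λ = 2, illustrating the relationship between these laws that the report discusses.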
Complications and Reoperations in Mandibular Angle Fractures.
Chen, Collin L; Zenga, Joseph; Patel, Ruchin; Branham, Gregory
2018-05-01
Mandible angle fractures can be repaired in a variety of ways, with no consensus on complication and reoperation rates. To analyze patient, injury, and surgical factors, including approach to the angle and plating technique, associated with postoperative complications, as well as the rate of reoperation with regard to mandible angle fractures. Retrospective cohort study analyzing the surgical outcomes of patients with mandible angle fractures between January 1, 2000, and December 31, 2015, who underwent open reduction and internal fixation. Patients were eligible if they were aged 18 years or older, had 3 or fewer mandible fractures with 1 involving the mandibular angle, and had adequate follow-up data. Patients with comminuted angle fractures, bilateral angle fractures, and multiple surgical approaches were excluded. A total of 135 patients were included in the study. All procedures were conducted at a single, large academic hospital located in an urban setting. Major complications and reoperation rates. Major complications included in this study were nonunion, malunion, severe malocclusion, severe infection, and exposed hardware. Of 135 patients, 113 (83.7%) were men; median age was 29 years (range, 18-82 years). Eighty-seven patients (64.4%) underwent the transcervical approach and 48 patients (35.6%) received the transoral approach. Fifteen (17.2%) patients in the transcervical group and 9 (18.8%) patients in the transoral group experienced major complications (difference, 1%; 95% CI, -8% to 10%). Thirteen (14.9%) patients in the transcervical group and 8 (16.7%) patients in the transoral group underwent reoperations (difference, 2%; 95% CI, -13% to 17%). Active smoking had a significant effect on the rate of major complications (odds ratio, 4.04; 95% CI, 1.07 to 15.34; P = .04). During repair of noncomminuted mandibular angle fractures, both commonly used approaches, transcervical and transoral, can be used with comparable rates of major complications and reoperation.
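The comparison of complication rates between the two surgical groups reduces to a difference of two proportions with a confidence interval. The sketch below uses a simple Wald interval on the counts reported in the abstract (15/87 vs 9/48); the paper's exact interval method is not stated, so the numbers will not reproduce its quoted bounds exactly.

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Difference of two proportions (p2 - p1) with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    # standard error of the difference of two independent proportions
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se
```

For the major-complication counts, the point estimate is about 1.5% and the interval straddles zero, consistent with the abstract's conclusion that the two approaches perform comparably.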
Graphene spin valve: An angle sensor
Energy Technology Data Exchange (ETDEWEB)
Iqbal, Muhammad Zahir, E-mail: zahir.upc@gmail.com [Faculty of Engineering Sciences, GIK Institute of Engineering Sciences and Technology, Topi 23640, Khyber Pakhtunkhwa (Pakistan); Hussain, Ghulam [Faculty of Engineering Sciences, GIK Institute of Engineering Sciences and Technology, Topi 23640, Khyber Pakhtunkhwa (Pakistan); Siddique, Salma [Department of Bioscience & Biotechnology, Sejong University, Seoul 143-747 (Korea, Republic of); Iqbal, Muhammad Waqas [Department of Physics, Riphah Institute of Computing and Applied Sciences (RICAS), Riphah International University, Lahore (Pakistan)
2017-06-15
Graphene spin valves can be optimized for various spintronic applications by tuning the associated experimental parameters. In this work, we report the angle dependent magnetoresistance (MR) in graphene spin valve for different orientations of applied magnetic field (B). The switching points of spin valve signals show a clear shift towards higher B for each increasing angle of the applied field, thus sensing the response for respective orientation of the magnetic field. The angular variation of B shifts the switching points from ±95 G to ±925 G as the angle is varied from 0° to 90° at 300 K. The observed shifts in switching points become more pronounced (±165 G to ±1450 G) at 4.2 K for similar orientation. A monotonic increase in MR ratio is observed as the angle of magnetic field is varied in the vertical direction at 300 K and 4.2 K temperatures. This variation of B (from 0° to 90°) increases the magnitude of MR ratio from ∼0.08% to ∼0.14% at 300 K, while at 4.2 K it progresses to ∼0.39% from ∼0.14%. The sensitivity related to angular variation of such spin valve structure can be employed for angle sensing applications.
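The MR ratio quoted in this abstract is conventionally defined from the resistances in the parallel and antiparallel magnetization states. The helper below is a generic illustration of that definition; the resistance values are made up, chosen only to reproduce a ratio of the order reported (~0.14%).

```python
def mr_ratio_percent(r_parallel, r_antiparallel):
    """Magnetoresistance ratio in percent: 100 * (R_AP - R_P) / R_P."""
    return 100.0 * (r_antiparallel - r_parallel) / r_parallel
```

A parallel-state resistance of 100.0 Ω and an antiparallel-state resistance of 100.14 Ω give an MR ratio of 0.14%, matching the magnitude measured at 300 K for the 90° orientation.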
A Viewpoint on the Quantity "Plane Angle"
Eder, W. E.
1982-01-01
Properties of the quantity "plane angle" are explored under the hypothesis that it is a dimensional quantity. The exploration proceeds especially with respect to the physical concept, its mathematical treatment, vector concepts, measurement theory, units of related quantities, engineering pragmatism, and SI. An attempt is made to bring these different relations into a rational, logical and consistent framework, and thus to justify the hypothesis. Various types of vectorial quantities are recognized, and their properties described with an outline of the necessary algebraic manipulations. The concept of plane angle is amplified, and its interdependence with the circular arc is explored. The resulting units of plane angle form a class of similar scales of measurement. Consequences of the confirmed hypothesis are developed for mathematical expressions involving trigonometric functions, rotational volumes and areas, mathematical limits, differentiation and series expansion. Consequences for mechanical rotational quantities are developed, with proposals for revisions to a number of expressions for derived units within SI. A revised definition for the quantity "plane angle" is stated to take account of the developed insights. There is a clear need to reconsider the status of plane angle and some other quantities within the international framework of SI.
Calculating the Probability of Returning a Loan with Binary Probability Models
Directory of Open Access Journals (Sweden)
Julian Vasilev
2014-12-01
The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
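The binary logit model underlying this kind of analysis maps a linear combination of borrower and contract features to a probability via the logistic function. The sketch below shows only that mechanism; the coefficient values and feature encoding are hypothetical, not estimates from the article.

```python
import math

def logit_probability(coeffs, intercept, features):
    """P(loan is returned) under a binary logit model:
    p = 1 / (1 + exp(-(b0 + b . x)))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-z))
```

With a positive coefficient on the loan sum, the predicted probability of return rises as the sum rises, which is the direction of effect the article reports.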
High-resolution elastic recoil detection utilizing Bayesian probability theory
International Nuclear Information System (INIS)
Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.
2001-01-01
Elastic recoil detection (ERD) analysis is improved in view of depth resolution and the reliability of the measured spectra. Good statistics at even low ion fluences is obtained utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and using a 40 MeV ¹⁹⁷Au beam. In this way the elemental depth profiles are not essentially altered during analysis even if distributions with area densities below 1×10¹⁴ atoms/cm² are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows deconvolution of the measured spectra using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on ¹³C depth profiles measured on ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4×10¹⁵ atoms/cm².
School and conference on probability theory
International Nuclear Information System (INIS)
Lawler, G.F.
2004-01-01
This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim follows up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this area.
Slope stability probability classification, Waikato Coal Measures, New Zealand
Energy Technology Data Exchange (ETDEWEB)
Lindsay, P.; Gillard, G.R.; Moore, T.A. [CRL Energy, PO Box 29-415, Christchurch (New Zealand); Campbell, R.N.; Fergusson, D.A. [Solid Energy North, Private Bag 502, Huntly (New Zealand)
2001-01-01
Ferm classified lithological units have been identified and described in the Waikato Coal Measures in open pits in the Waikato coal region. These lithological units have been classified geotechnically by mechanical tests and discontinuity measurements. Using these measurements slope stability probability classifications (SSPC) have been quantified based on an adaptation of Hack's [Slope Stability Probability Classification, ITC Delft Publication, Enschede, Netherlands, vol. 43, 1998, 273 pp.] SSPC system, which places less influence on rock quality designation and unconfined compressive strength than previous slope/rock mass rating systems. The Hack weathering susceptibility rating has been modified by using chemical index of alteration values determined from XRF major element analyses. Slaking is an important parameter in slope stability in the Waikato Coal Measures lithologies and hence, a non-subjective method of assessing slaking in relation to the chemical index of alteration has been introduced. Another major component of this adapted SSPC system is the inclusion of rock moisture content effects on slope stability. The main modifications of Hack's SSPC system are the introduction of rock intact strength derived from the modified Mohr-Coulomb failure criterion, which has been adapted for varying moisture content, weathering state and confining pressure. It is suggested that the subjectivity in assessing intact rock strength within broad bands in the initial SSPC system is a major weakness of the initial system. Initial results indicate a close relationship between rock mass strength values, calculated from rock mass friction angles and rock mass cohesion values derived from two established rock mass classification methods (modified Hoek-Brown failure criteria and MRMR) and the adapted SSPC system. The advantage of the modified SSPC system is that slope stability probabilities based on discontinuity-independent and discontinuity-dependent data and a
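The rock mass strength comparisons in this abstract rest on the Mohr-Coulomb failure criterion, which relates shear strength to cohesion, normal stress and friction angle. The sketch below shows only the textbook criterion, not the paper's moisture- and weathering-adjusted modification; the input values are illustrative assumptions.

```python
import math

def mohr_coulomb_shear_strength(cohesion_kpa, normal_stress_kpa, friction_angle_deg):
    """Mohr-Coulomb shear strength: tau = c + sigma_n * tan(phi)."""
    return cohesion_kpa + normal_stress_kpa * math.tan(math.radians(friction_angle_deg))
```

For example, a cohesionless material at 100 kPa normal stress with a 45° friction angle has a shear strength of 100 kPa; adding cohesion shifts the whole failure envelope upward.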
Impact of controlling the sum of error probability in the sequential probability ratio test
Directory of Open Access Journals (Sweden)
Bijoy Kumarr Pradhan
2013-05-01
A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed sample size procedure. The results are applied to the cases when the random variate follows a normal law as well as a Bernoulli law.
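The classical sequential probability ratio test that this work modifies can be sketched for the Bernoulli case as follows. This is Wald's standard SPRT with the usual approximate thresholds A = (1-β)/α and B = β/(1-α), shown as background for the abstract; it does not implement the paper's constraint on the sum α + β.

```python
import math

def sprt_thresholds(alpha, beta):
    """Wald's approximate SPRT bounds on the log-likelihood ratio."""
    upper = math.log((1 - beta) / alpha)  # crossing upward -> reject H0
    lower = math.log(beta / (1 - alpha))  # crossing downward -> accept H0
    return lower, upper

def sprt_bernoulli(observations, p0, p1, alpha=0.05, beta=0.05):
    """Sequentially test H0: p = p0 against H1: p = p1 on 0/1 data.

    Returns the decision and the number of observations consumed.
    """
    lower, upper = sprt_thresholds(alpha, beta)
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        # accumulate the log-likelihood ratio of H1 to H0 for this observation
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "reject H0", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)
```

On data strongly favouring one hypothesis the test stops after only a few observations, which is the sample-size saving that motivates comparing the SPRT against the fixed sample size procedure.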