Fonseca Rasmus
2009-10-01
Abstract. Background: Predicting the three-dimensional structure of a protein from its amino acid sequence is currently one of the most challenging problems in bioinformatics. The internal structure of helices and sheets is highly recurrent and helps reduce the search space significantly. However, random coil segments make up nearly 40% of proteins, and they do not have any apparent recurrent patterns, which complicates the overall accuracy of protein structure prediction methods. Fortunately, previous work has indicated that coil segments are in fact not completely random in structure, and flanking residues do seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been made previously, none have, to our knowledge, presented comparable results for the probability distribution of dihedral angles. Results: We develop an artificial neural network that uses an input window of amino acids to predict a dihedral angle probability distribution for the middle residue in the input window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30° × 30° area of the dihedral angle space) for all amino acids in the data set compared to baseline statistics. An accuracy comparable to that of secondary structure prediction (≈80%) is achieved by observing the 20 bins with the highest output values. Conclusion: Many different protein structure prediction methods exist, and each uses different tools and auxiliary predictions to help determine the native structure. In this work the sequence is used to predict local, context-dependent dihedral angle propensities in coil regions. This predicted distribution can potentially improve tertiary structure prediction
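The 30° × 30° discretization of the (φ, ψ) dihedral angle space described in this abstract can be sketched as a simple index function. This is a minimal illustration only, not the paper's actual implementation; the bin layout and edge conventions are our assumptions:

```python
def bin_index(phi, psi, width=30.0):
    """Map a (phi, psi) pair in degrees, each in [-180, 180),
    to one of (360/width)**2 bins tiling the Ramachandran plot."""
    n = int(360 // width)              # bins per axis: 12 for 30-degree bins
    i = int((phi + 180.0) // width)    # row index from phi
    j = int((psi + 180.0) // width)    # column index from psi
    return i * n + j                   # flat bin index in [0, n*n)
```

A predictor of the kind described would output one probability per bin (144 bins for this width) and report the bin with the highest value as the most probable.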
Disequilibrium dihedral angles in dolerite sills
Holness, Marian B.; Richardson, Chris; Helz, Rosalind T.
2012-01-01
The geometry of clinopyroxene-plagioclase-plagioclase junctions in mafic rocks, measured by the median dihedral angle Θcpp, is created during solidification. In the solidifying Kilauea Iki (Hawaii) lava lake, the wider junctions between plagioclase grains are the first to be filled by pyroxene, followed by the narrower junctions. The final Θcpp, attained when all clinopyroxene-plagioclase-plagioclase junctions are formed, is 78° in the upper crust of the lake, and 85° in the lower solidification front. Θcpp in the 3.5-m-thick Traigh Bhàn na Sgùrra sill (Inner Hebrides) is everywhere 78°. In the Whin Sill (northern England, 38 m thick) and the Portal Peak sill (Antarctica, 129 m thick), Θcpp varies symmetrically, with the lowest values at the margins. The 266-m-thick Basement Sill (Antarctica) has asymmetric variation of Θcpp, attributed to a complex filling history. The chilled margins of the Basement Sill are partially texturally equilibrated, with high Θcpp. The plagioclase grain size in the two widest sills varies asymmetrically, with the coarsest rocks found in the upper third. Both Θcpp and average grain size are functions of model crystallization times. Θcpp increases from 78° to a maximum of ∼100° as the crystallization time increases from 1 to 500 yr. Because the use of grain size as a measure of crystallization time is dependent on an estimate of crystal growth rates, dihedral angles provide a more direct proxy for cooling rates in dolerites.
Measurement of dihedral angles by scanning electron microscopy.
Achutaramayya, G.; Scott, W. D.
1973-01-01
The extension of Hoover's (1971) technique to the case of dihedral-angle measurement is described. Dihedral angles are often determined by interferometry on thermally grooved grain boundaries to obtain information on relative interfacial energies. In the technique considered the measured angles approach the true angles as the tilt angle approaches 90 deg. It is pointed out that the scanning electron microscopy method provides a means of seeing the real root of a groove at a lateral magnification which is higher than that obtainable with interferometry.
Decoding low dihedral angles in gabbroic layered intrusions
Holness, M. B.; Humphreys, M.; Veksler, I. V.
2010-12-01
Texturally equilibrated rocks are granular with a unimodal grain size, smoothly curved grain boundaries, and angles at three-grain junctions of 110-140°. Gabbros are not texturally equilibrated: primocrysts commonly have planar faces whereas later-formed phases fill in the interstitial spaces. Augite-plagioclase-plagioclase dihedral angles (Θcpp) rarely attain the equilibrium value in gabbros, and the population of disequilibrium angles preserves otherwise inaccessible information about rock history. The Θcpp population varies significantly between different basaltic bodies. In a rapidly cooled dolerite, Θcpp has a low median (60-70°) and a high standard deviation (20-25°). The plagioclase-augite grain boundaries are generally planar. In more slowly cooled gabbros in layered intrusions, the angle populations have a higher median (80-110°) with a low standard deviation (10-15°). The plagioclase-augite grain boundaries are generally planar far from the triple junction, but curve within 10 microns of the junction. This curvature is commonly asymmetric. The angle population in solidified gabbros infiltrated by low-temperature melts is similar to that in dolerites, although the low angles are associated with cuspate interstitial grains. The dihedral angle is a function of both the original solidification process and subsequent high-temperature (melt-absent) grain boundary migration. Infilling of a melt pocket by overgrowth of the bounding solid phases necessitates supersaturation, and this is easier to attain for planar faces, resulting in inhibition of augite growth into pores bounded by planar plagioclase grains and an asymmetry of the initial augite-plagioclase-plagioclase junction. If the solidified gabbro is kept sufficiently hot, these initial junction geometries can change during textural equilibration. In the Skaergaard, Rum and Bushveld intrusions, the median Θcpp varies with liquidus assemblage, increasing step-wise on the addition of a new liquidus phase. Locally
Asymmetric dihedral angle offsets for large-size lunar laser ranging retroreflectors
Otsubo, Toshimichi; Kunimori, Hiroo; Noda, Hirotomo; Hanada, Hideo; Araki, Hiroshi; Katayama, Masato
2011-08-01
The distribution of two-dimensional velocity aberration is off-centered by 5 to 6 microradians in lunar laser ranging, due to the stable measurement geometry in the motion of the Earth and the Moon. The optical responses of hollow-type retroreflectors are investigated through numerical simulations, especially focusing on large-size, single-reflector targets that can ultimately minimize the systematic error in future lunar laser ranging. An asymmetric dihedral angle offset, i.e. setting unequal angles between the three back faces, is found to be effective for retroreflectors larger than 100 mm in diameter. Our numerical simulation results reveal that the optimized return energy is approximately 3.5 times larger than in symmetric dihedral angle cases, and the optimized dihedral angle offsets are 0.65-0.8 arcseconds for one angle and zero for the other two angles.
Using Excel To Study The Relation Between Protein Dihedral Angle Omega And Backbone Length
Shew, Christopher; Evans, Samari; Tao, Xiuping
How can uninitiated undergraduate students be involved in computational biophysics research? We made use of Microsoft Excel to carry out calculations of bond lengths, bond angles, and dihedral angles of proteins. Specifically, we studied the protein backbone dihedral angle omega by examining how its distribution varies with the backbone length. It turns out Excel is a respectable tool for this task. An ordinary current-day desktop or laptop can handle the calculations for mid-sized proteins in just seconds. Care has to be taken to enter the formulas for the spreadsheet column by column to minimize the computing load. Supported in part by NSF Grant #1238795.
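The core spreadsheet calculation described above, the dihedral angle defined by four consecutive atoms, can also be written compactly outside Excel. A sketch in Python with NumPy, using the standard atan2 formulation (the function name and atom labels are ours):

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Dihedral angle in degrees defined by four points, e.g. omega from
    the CA(i), C(i), N(i+1), CA(i+1) backbone atoms."""
    b0, b1, b2 = p1 - p0, p2 - p1, p3 - p2
    n1 = np.cross(b0, b1)              # normal of the plane through p0, p1, p2
    n2 = np.cross(b1, b2)              # normal of the plane through p1, p2, p3
    m1 = np.cross(n1, b1 / np.linalg.norm(b1))
    # atan2 preserves the sign of the angle, unlike a plain arccos
    return np.degrees(np.arctan2(np.dot(m1, n2), np.dot(n1, n2)))
```

For a planar trans arrangement of the four atoms this returns ±180°, and for a planar cis arrangement 0°, matching the two well-known peaks of the omega distribution.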
Prediction of backbone dihedral angles and protein secondary structure using support vector machines
Hirst Jonathan D
2009-12-01
Abstract. Background: The prediction of the secondary structure of a protein is a critical step in the prediction of its tertiary structure and, potentially, its function. Moreover, the backbone dihedral angles, highly correlated with secondary structures, provide crucial information about the local three-dimensional structure. Results: We predict independently both the secondary structure and the backbone dihedral angles and combine the results in a loop to enhance each prediction reciprocally. Support vector machines, a state-of-the-art supervised classification technique, achieve a secondary structure predictive accuracy of 80% on a non-redundant set of 513 proteins, significantly higher than other methods on the same dataset. The dihedral angle space is divided into a number of regions using two unsupervised clustering techniques in order to predict the region to which a new residue belongs. The performance of our method is comparable to, and in some cases more accurate than, other multi-class dihedral prediction methods. Conclusions: We have created an accurate predictor of backbone dihedral angles and secondary structure. Our method, called DISSPred, is available online at http://comp.chem.nottingham.ac.uk/disspred/.
Dihedral angle preferences of DNA and RNA binding amino acid residues in proteins.
Ponnuraj, Karthe; Saravanan, Konda Mani
2017-04-01
A protein can interact with DNA or RNA molecules to perform various cellular processes. Identifying or analyzing DNA/RNA-binding amino acid residues is important for understanding the molecular recognition process. It is quite possible to accurately model DNA/RNA-binding amino acid residues in an experimental protein-DNA/RNA complex by using the electron density map, whereas locating or modeling the binding-site amino acid residues in predicted three-dimensional structures of DNA/RNA-binding proteins is still a difficult task. Considering the above facts, in the present work we have carried out a comprehensive analysis of the dihedral angle preferences of DNA- and RNA-binding site amino acid residues by using a classical Ramachandran map. We have computed the backbone dihedral angles of non-DNA/RNA-binding residues and used them as a control dataset for a comparative study. The dihedral angle preferences of DNA- and RNA-binding site residues of all twenty amino acid types are presented. Our analysis clearly revealed that the dihedral angles (φ, ψ) of DNA/RNA-binding amino acid residues prefer to occupy the (-89° to -60°, -59° to -30°) bins. The results presented in this paper will help to model and locate DNA/RNA-binding amino acid residues with better accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.
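The preferred (φ, ψ) region reported in this abstract is straightforward to test for a given residue. A minimal sketch; the exact bin edges and their inclusivity are our assumption from the quoted ranges:

```python
def in_preferred_bin(phi, psi):
    """True if backbone dihedrals (degrees) fall in the bin reported as
    preferred by DNA/RNA-binding residues:
    phi in [-89, -60], psi in [-59, -30]."""
    return -89.0 <= phi <= -60.0 and -59.0 <= psi <= -30.0
```

Counting how often residues of each type satisfy this predicate, for binding versus control residues, reproduces the kind of comparison the study describes.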
Dihedral angle of carbonatite melts in mantle residue near the upper mantle and transition zone
Ghosh, S. K.; Rohrbach, A.; Schmidt, M. W.
2015-12-01
Carbonate melts are thought to be ideal metasomatic agents in the deep upper mantle (Green & Wallace, 1988): these melts are low in viscosity (10⁻¹-10⁻³ Pa·s) compared to primitive basalt (10¹-10² Pa·s) and have the ability to form an interconnected grain-edge melt network at low melt fractions. Stable at pressures above 3 GPa (Dasgupta et al. 2006, Ghosh et al., 2009), they dissolve a number of geochemically incompatible elements much better than silicate melts (Blundy and Dalton, 2000). Previous studies of carbonate melt dihedral angles in olivine-dominated matrices yielded 25-30° at 1-3 GPa, relatively independent of melt composition (Watson et al., 1990) and temperature (Hunter and McKenzie, 1989). Dihedral angles of carbonate melts in contact with deep-mantle silicate phases (e.g. garnet, wadsleyite, and ringwoodite), which constitute more than 70% of the deep upper mantle and transition zone, have not been studied yet. We have performed multi-anvil experiments on carbonate-bearing peridotites with 5.0 wt% CO2 from 13.5 to 20 GPa at 1550 °C to investigate the dihedral angle of magnesio-carbonatite melts in equilibrium with garnet, olivine (and its high-pressure polymorphs), and clinoenstatite. The dihedral angle of carbonate melts in the deep upper mantle and transition zone is ~30° for majorite garnet- and olivine (and its polymorphs)-dominated matrices. It does not change with increasing pressure in the range 13.5-20 GPa. Our results suggest that carbonatite melts forming in the deep upper mantle and transition zone are interconnected at melt fractions of less than 0.01. Consistent with geophysical observations, this could possibly explain low-velocity regions in the deep mantle and transition zone.
Ghanbarzadeh, S.; Hesse, M. A.; Prodanovic, M.; Gardner, J. E.
2013-12-01
Salt deposits in sedimentary basins have long been considered to be a seal against fluid penetration. However, experimental, theoretical, and field evidence suggests that brine (and oil) can wet salt crystal surfaces at higher pressures and temperatures, which can form a percolating network. This network may act as a flow conduit even at low porosities. The aim of this work is to investigate the effects of dihedral angle and porosity on the formation of percolating paths in different salt network lattices. Previous studies considered only simple homogeneous and isotropic geometries; this work extends the analysis to realistic salt textures by presenting a novel numerical method to describe the texturally equilibrated pore shapes in polycrystalline rock salt and brine systems. First, a theoretical interfacial topology was formulated to minimize the interfacial surface between brine and salt. Then, the resulting nonlinear system of ordinary differential equations was solved using the Newton-Raphson method. Results show that the formation of connected fluid channels is more probable at lower dihedral angles and higher porosities. The connectivity of the pore network is hysteretic, because connection and disconnection at the pore throats occur at different porosities for processes with increasing or decreasing porosity. In porous media with anisotropic solids, pores initially connect in the direction of the shorter crystal axis and only at much higher porosities in the other directions. Consequently, even an infinitesimal elongation of the crystal shape can give rise to very strong anisotropy in the permeability of the pore network. Fluid flow was also simulated in the resulting pore network to calculate permeability, capillary entry pressure, and the velocity field. This work enabled us to investigate the opening of pore space and the sealing capacity of rock salts. The obtained pore geometries determine a wide range of petrophysical properties such as permeability and
Zhou, Alice Qinhua; O'Hern, Corey S; Regan, Lynne
2014-10-01
The side-chain dihedral angle distributions of all amino acids have been measured from myriad high-resolution protein crystal structures. However, we do not yet know the dominant interactions that determine these distributions. Here, we explore to what extent the defining features of the side-chain dihedral angle distributions of different amino acids can be captured by a simple physical model. We find that a hard-sphere model for a dipeptide mimetic that includes only steric interactions plus stereochemical constraints is able to recapitulate the key features of the observed backbone-dependent amino acid side-chain dihedral angle distributions of Ser, Cys, Thr, Val, Ile, Leu, Phe, Tyr, and Trp. We find that for certain amino acids, performing the calculations with the amino acid of interest in the central position of a short α-helical segment improves the match between the predicted and observed distributions. We also identify the atomic interactions that give rise to the differences between the predicted distributions for the hard-sphere model of the dipeptide and that of the α-helical segment. Finally, we point out a case where the hard-sphere plus stereochemical constraint model is insufficient to recapitulate the observed side-chain dihedral angle distribution, namely the distribution P(χ₃) for Met.
Optimization of Protein Backbone Dihedral Angles by Means of Hamiltonian Reweighting.
Margreitter, Christian; Oostenbrink, Chris
2016-09-26
Molecular dynamics simulations depend critically on the accuracy of the underlying force fields in properly representing biomolecules. Hence, it is crucial to validate the force-field parameter sets in this respect. In the context of the GROMOS force field, this is usually achieved by comparing simulation data to experimental observables for small molecules. In this study, we develop new amino acid backbone dihedral angle potential energy parameters based on the widely used 54A7 parameter set by matching to experimental J values and secondary structure propensity scales. In order to find the most appropriate backbone parameters, close to 100 000 different combinations of parameters have been screened. However, since the sheer number of combinations considered prohibits actual molecular dynamics simulations for each of them, we instead predicted the values for every combination using Hamiltonian reweighting. While the original 54A7 parameter set fails to reproduce the experimental data, we are able to provide parameters that match significantly better. However, to ensure applicability in the context of larger peptides and full proteins, further studies have to be undertaken.
A simple molecular mechanics integrator in mixed rigid body and dihedral angle space
Vitalis, Andreas, E-mail: a.vitalis@bioc.uzh.ch [Department of Biochemistry, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich (Switzerland); Pappu, Rohit V. [Department of Biomedical Engineering and Center for Biological Systems Engineering, Washington University in St. Louis, One Brookings Drive, Campus Box 1097, St. Louis, Missouri 63130 (United States)
2014-07-21
We propose a numerical scheme to integrate equations of motion in a mixed space of rigid-body and dihedral angle coordinates. The focus of the presentation is biomolecular systems and the framework is applicable to polymers with tree-like topology. By approximating the effective mass matrix as diagonal and lumping all bias torques into the time dependencies of the diagonal elements, we take advantage of the formal decoupling of individual equations of motion. We impose energy conservation independently for every degree of freedom and this is used to derive a numerical integration scheme. The cost of all auxiliary operations is linear in the number of atoms. By coupling the scheme to one of two popular thermostats, we extend the method to sample constant temperature ensembles. We demonstrate that the integrator of choice yields satisfactory stability and is free of mass-metric tensor artifacts, which is expected by construction of the algorithm. Two fundamentally different systems, viz., liquid water and an α-helical peptide in a continuum solvent are used to establish the applicability of our method to a wide range of problems. The resultant constant temperature ensembles are shown to be thermodynamically accurate. The latter relies on detailed, quantitative comparisons to data from reference sampling schemes operating on exactly the same sets of degrees of freedom.
Studies on the Dihedral Angle and Torsional Barriers for 4,4′-Bipyridine
CHEN Wen-kai; LU Chun-hai; XU Jiao; LI Jun-qian
2004-01-01
Using the Hartree-Fock, MP2, and the B3LYP, BLYP, and mPW1PW91 density functional methods, each combined with the 6-31G(d), 6-311G(d), 6-311+G(d), 6-311++G(d,p), cc-pVDZ, and cc-pVTZ basis sets, the equilibrium geometry of 4,4′-bipyridine was optimized and the internal rotational potential barrier heights at 0° (ΔE0) and 90° (ΔE90) were obtained. For the best basis set (cc-pVTZ), the predicted dihedral angle θ ranges from 37.0° to 37.8° for all methods except the Hartree-Fock method (43.7°). This agrees with the estimate from the electron diffraction experimental measurement (37.2°). The inter-ring C-C distance, ranging from 147.2 to 148.7 pm (147 pm experimental), is intermediate between the typical aromatic C-C bond and the aliphatic C-C bond. The results show that the inter-ring π-conjugation between the two pyridyl rings stabilizes the co-planar conformer, and the steric repulsion between the ortho hydrogens belonging to different rings favors the non-planar orthogonal conformer.
Disequilibrium dihedral angles in layered intrusions: the microstructural record of fractionation
Holness, Marian; Namur, Olivier; Cawthorn, Grant
2013-04-01
The dihedral angle formed at junctions between two plagioclase grains and a grain of augite is only rarely in textural equilibrium in gabbros from km-scale crustal layered intrusions. The median of a population of these disequilibrium angles, Θcpp, varies systematically within individual layered intrusions, remaining constant over large stretches of stratigraphy, with significant increases or decreases associated with the addition or reduction, respectively, of the number of phases on the liquidus of the bulk magma. The step-wise changes in Θcpp are present in the Upper Zone of the Bushveld Complex, the Megacyclic Unit I of the Sept Iles Intrusion, and the Layered Series of the Skaergaard Intrusion. The plagioclase-bearing cumulates of Rum have a bimodal distribution of Θcpp, dependent on whether the cumulus assemblage includes clinopyroxene. The presence of the step-wise changes is independent of the order of arrival of cumulus phases and of the composition of either the cumulus phases or the interstitial liquid inferred to be present in the crystal mush. Step-wise changes in the rate of change of enthalpy with temperature (ΔH) of the cooling and crystallizing magma correspond to the observed variation of Θcpp, with increases of both ΔH and Θcpp associated with the addition of another liquidus phase, and decreases of both associated with the removal of a liquidus phase. The replacement of one phase by another (e.g. olivine ⇔ orthopyroxene) has little effect on ΔH and no discernible effect on Θcpp. An increase of ΔH is manifest as an increase in the fraction of the total enthalpy budget that is the latent heat of crystallization (the fractional latent heat). It also results in an increase in the amount crystallized in each incremental temperature drop (the crystal productivity). An increased fractional latent heat and crystal productivity result in an increased rate of plagioclase growth compared to that of augite during the final stages of solidification
Blum, Alexander Simon
2009-06-10
This thesis deals with the possibility of describing the flavor sector of the Standard Model of Particle Physics (with neutrino masses), that is the fermion masses and mixing matrices, with a discrete, non-abelian flavor symmetry. In particular, mass independent textures are considered, where one or several of the mixing angles are determined by group theory alone and are independent of the fermion masses. To this end a systematic analysis of a large class of discrete symmetries, the dihedral groups, is carried out. Mass independent textures originating from such symmetries are described and it is shown that such structures arise naturally from the minimization of scalar potentials, where the scalars are gauge singlet flavons transforming non-trivially only under the flavor group. Two models are constructed from this input, one describing leptons, based on the group D₄, the other describing quarks and employing the symmetry D₁₄. In the latter model it is the quark mixing matrix element V_ud - basically the Cabibbo angle - which is at leading order predicted from group theory. Finally, discrete flavor groups are discussed as subgroups of a continuous gauge symmetry, and it is shown that this implies that the original gauge symmetry is broken by fairly large representations. (orig.)
Taguchi, Alexander T.; Mattis, Aidas J.; O'Malley, Patrick J.; Dikanov, Sergei A.; Wraight, Colin A.
2013-01-01
Only quinones with a 2-methoxy group can act simultaneously as the primary (QA) and secondary (QB) electron acceptors in photosynthetic reaction centers from Rb. sphaeroides. 13C HYSCORE measurements of the 2-methoxy in the semiquinone states, SQA and SQB, were compared with QM calculations of the 13C couplings as a function of dihedral angle. X-ray structures support dihedral angle assignments corresponding to a redox potential gap (ΔEm) between QA and QB of ~180 mV. This is consistent with the failure of a ubiquinone analog lacking the 2-methoxy to function as QB in mutant reaction centers with a ΔEm ≈ 160–195 mV. PMID:24079813
Scattering patterns of dihedral corner reflectors with impedance surfaces
Balanis, Constantine A.; Griesser, Timothy; Liu, Kefeng
The radar cross section patterns of lossy dihedral corner reflectors are calculated using a uniform geometrical theory of diffraction for impedance surfaces. All terms of up to third order reflections are considered for patterns in the principal plane. The surface waves are included whenever they exist for reactive surface impedances. The dihedral corner reflectors examined have right, obtuse, and acute interior angles, and patterns over the entire 360 deg azimuthal plane are calculated. The surface impedances can be different on the four faces of the dihedral corner reflector; however, the surface impedance must be uniform over each face. Computed cross sections are compared with a moment method technique for a dielectric/ferrite absorber coating on a metallic corner reflector. The analysis of the dihedral corner reflector is important because it demonstrates many of the important scattering contributors of complex targets including both interior and exterior wedge diffraction, half-plane diffraction, and dominant multiple reflections and diffractions.
RCS analysis and reduction for lossy dihedral corner reflectors
Griesser, Timothy; Balanis, Constantine A.; Liu, Kefeng
1989-05-01
The radar-cross-section (RCS) patterns of lossy dihedral corner reflectors are calculated, using a uniform geometrical theory of diffraction for impedance surfaces. All terms of up to third-order reflections and diffractions are considered for patterns in the principal plane. The surface waves are included whenever they exist for reactive surface impedances. The dihedral corner reflectors examined have right, obtuse, and acute interior angles, and patterns over the entire 360 deg azimuthal plane are calculated. The surface impedances can be different on the four faces of the dihedral corner reflector; however, the surface impedance must be uniform over each face. Computed cross sections are compared with the results of a moment-method technique for a dielectric/ferrite absorber coating on a metallic corner reflector.
Song Lei; Yang Hua; Zhang Yang; Zhang Haoyu; Huang Jun
2014-01-01
The influence of dihedral layout on the lateral-directional dynamic stability of the tailless flying wing aircraft is discussed in this paper. A tailless flying wing aircraft with a large aspect ratio is selected as the object of study, and the dihedral angle along the spanwise sections is divided into three segments. The influence of dihedral layouts is studied. Based on the stability derivatives calculated by a vortex lattice method code, the linearized small-disturbance equations of the lateral modes are used to determine the mode dynamic characteristics. By comparing 7056 configurations with different dihedral angle layouts, two groups of stability-optimized dihedral layout concepts are created. Flight quality close to Level 2 requirements is achieved in these optimized concepts without any electric stability augmentation system.
Nongpiur, Monisha E; Haaland, Benjamin A; Perera, Shamira A; Friedman, David S; He, Mingguang; Sakata, Lisandro M; Baskaran, Mani; Aung, Tin
2014-01-01
To develop a score along with an estimated probability of disease for detecting angle closure based on anterior segment optical coherence tomography (AS OCT) imaging. Cross-sectional study. A total of 2047 subjects 50 years of age and older were recruited from a community polyclinic in Singapore. All subjects underwent standardized ocular examination including gonioscopy and imaging by AS OCT (Carl Zeiss Meditec). Customized software (Zhongshan Angle Assessment Program) was used to measure AS OCT parameters. Complete data were available for 1368 subjects. Data from the right eyes were used for analysis. A stepwise logistic regression model with Akaike information criterion was used to generate a score that then was converted to an estimated probability of the presence of gonioscopic angle closure, defined as the inability to visualize the posterior trabecular meshwork for at least 180 degrees on nonindentation gonioscopy. Of the 1368 subjects, 295 (21.6%) had gonioscopic angle closure. The angle closure score was calculated from the shifted linear combination of the AS OCT parameters. The score can be converted to an estimated probability of having angle closure using the relationship: estimated probability = e^score / (1 + e^score), where e is the base of the natural exponential. The score performed well in a second independent sample of 178 angle-closure subjects and 301 normal controls, with an area under the receiver operating characteristic curve of 0.94. A score derived from a single AS OCT image, coupled with an estimated probability, provides an objective platform for detection of angle closure. Copyright © 2014 Elsevier Inc. All rights reserved.
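The score-to-probability conversion quoted in this abstract is the standard logistic link. A direct transcription (the function name is ours; the score itself comes from the study's fitted regression, which is not reproduced here):

```python
import math

def probability_from_score(score):
    """Estimated probability of gonioscopic angle closure from the
    AS OCT score: p = e^score / (1 + e^score)."""
    return math.exp(score) / (1.0 + math.exp(score))
```

A score of 0 maps to an estimated probability of 0.5; large positive scores approach 1 and large negative scores approach 0.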
Anonymous
2009-01-01
[Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably; perhaps," it indicates a high degree of likelihood, usually expressing a positive inference or judgment based on the present situation.
Xiao Liu
2016-01-01
Random disturbance factors would lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve the impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law considering the influences of random disturbances, the detection distance constraint, and the target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to determine the appropriate attitude angles and the end position for the midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure the tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law achieves satisfactory performance and can meet both the midcourse terminal angular constraint and the LOS angle rate requirement.
Computing symmetric colorings of the dihedral group
Zelenyuk, Yuliya
2016-06-01
A symmetry on a group G is a mapping G ∋ x ↦ gx⁻¹g ∈ G, where g ∈ G. A subset A ⊆ G is symmetric if it is invariant under some symmetry, that is, A = gA⁻¹g. The notion of symmetry has interesting relations to enumerative combinatorics. A coloring χ is symmetric if χ(gx⁻¹g) = χ(x) for some g ∈ G. We discuss an approach to computing the number of symmetric r-colorings for any finite group. Using this approach we derive the formula for the number of symmetric r-colorings of the dihedral group D3.
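The definitions above can be checked by brute force for D3 (realized here as the symmetric group S3); this sketch merely enumerates colorings rather than using the paper's closed formula:

```python
from itertools import permutations, product

def compose(p, q):
    # permutation composition: (p ∘ q)(i) = p[q[i]]
    return tuple(p[i] for i in q)

def inverse(p):
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

G = list(permutations(range(3)))  # D3 ≅ S3, the 6 symmetries of a triangle

# the symmetry x ↦ g x⁻¹ g, one map per g ∈ G
SYMMETRIES = [{x: compose(g, compose(inverse(x), g)) for x in G} for g in G]

def count_symmetric_colorings(r):
    # a coloring χ: G → {0..r-1} is symmetric if χ(g x⁻¹ g) = χ(x) for some g
    count = 0
    for colors in product(range(r), repeat=len(G)):
        chi = dict(zip(G, colors))
        if any(all(chi[s[x]] == chi[x] for x in G) for s in SYMMETRIES):
            count += 1
    return count
```

Exhaustive enumeration like this is only feasible for tiny r and |G|, which is exactly why a counting formula is worth deriving.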
Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio
2015-12-01
Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite advances in algorithm development and computational strategies, predicting the folded structure of a protein from its amino acid sequence alone remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods based on Genetic Algorithms and Particle Swarm Optimization were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. The results suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing the conformational search space.
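To illustrate how an angle-probability list can seed a metaheuristic, here is a toy genetic-algorithm sketch; the two-basin "APL", the Gaussian jitter, and the single-basin energy function are all stand-in assumptions, not the paper's model:

```python
import math, random

random.seed(1)

# toy angle-probability list: (phi, psi) bin centres with weights
APL = [((-60.0, -45.0), 0.7), ((-120.0, 130.0), 0.3)]
CENTRES = [c for c, _ in APL]
WEIGHTS = [w for _, w in APL]

def sample_pair():
    # draw a residue's (phi, psi) from the APL, with Gaussian jitter
    phi, psi = random.choices(CENTRES, weights=WEIGHTS)[0]
    return (random.gauss(phi, 10.0), random.gauss(psi, 10.0))

def energy(conformation):
    # toy energy favouring the helix-like basin (stand-in for a force field)
    return sum(math.hypot(phi + 60.0, psi + 45.0) for phi, psi in conformation)

def evolve(n_res=8, pop_size=30, gens=40):
    pop = [[sample_pair() for _ in range(n_res)] for _ in range(pop_size)]
    initial_best = min(energy(c) for c in pop)
    for _ in range(gens):
        pop.sort(key=energy)
        pop = pop[:pop_size // 2]            # elitist selection
        while len(pop) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)
            cut = random.randrange(1, n_res)
            child = a[:cut] + b[cut:]        # one-point crossover
            if random.random() < 0.3:        # mutation: resample from the APL
                child[random.randrange(n_res)] = sample_pair()
            pop.append(child)
    return initial_best, min(energy(c) for c in pop)
```

Because both the initial population and the mutations are drawn from the APL, the search never wastes effort in dihedral regions the templates say are improbable, which is the search-space reduction the abstract refers to.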
Dihedral-based segment identification and classification of biopolymers II: polynucleotides.
Nagy, Gabor; Oostenbrink, Chris
2014-01-27
In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers I: Proteins. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400541d), we introduce a new algorithm for structure classification of biopolymeric structures based on main-chain dihedral angles. The DISICL algorithm (short for DIhedral-based Segment Identification and CLassification) classifies segments of structures containing two central residues. Here, we introduce the DISICL library for polynucleotides, which is based on the dihedral angles ε, ζ, and χ for the two central residues of a three-nucleotide segment of a single strand. Seventeen distinct structural classes are defined for nucleotide structures, some of which, to our knowledge, were not described previously in other structure classification algorithms. In particular, DISICL also classifies noncanonical single-stranded structural elements. DISICL is applied to databases of DNA and RNA structures containing 80,000 and 180,000 segments, respectively. The classifications according to DISICL are compared to those of another popular classification scheme in terms of the amount of classified nucleotides, average occurrence and length of structural elements, and pairwise matches of the classifications. While the detailed classification of DISICL adds sensitivity to a structure analysis, it can be readily reduced to eight simplified classes providing a more general overview of the secondary structure in polynucleotides.
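The core idea, assigning a segment to a class by which region of dihedral space its angles fall into, can be sketched as follows; the class names and angle windows here are invented placeholders, not DISICL's actual library:

```python
# hypothetical library: class name -> allowed (epsilon, zeta, chi) windows (deg)
LIBRARY = {
    "class-A": ((-170.0, -140.0), (-80.0, -50.0), (-170.0, -140.0)),
    "class-B": ((-120.0, -90.0), (-10.0, 10.0), (-120.0, -90.0)),
}

def classify(epsilon, zeta, chi):
    # note: periodic wrap-around of dihedrals is ignored in this sketch
    for name, windows in LIBRARY.items():
        if all(lo <= angle <= hi
               for angle, (lo, hi) in zip((epsilon, zeta, chi), windows)):
            return name
    return "unclassified"
```

A real library additionally handles the ±180° periodicity of dihedrals and, as in DISICL, defines the windows over two central residues of a segment.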
The dihedral corner reflector as a reference target
Corona, P.; Ferrara, G.; Gennarelli, C.
The radiation properties of a dihedral corner reflector are analyzed in detail in order to assess the effectiveness of such a device as a standard reference in experimental determinations of radar cross section. A short review of reference targets is presented, and the physical optics approach and the method of images are used to develop a mathematical model for the dihedral corner. Results from a computer program implemented to evaluate the field backscattered from the corner and to compute patterns for various dihedral sizes are reported. It is concluded that the dihedral corner can be conveniently used as a reference target by scanning in a plane containing the corner wedge.
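As a rough cross-check of why dihedrals make good reference targets, the textbook physical-optics peak RCS of an orthogonal dihedral is usually quoted as σ_max = 8π a²b²/λ², with a and b the plate dimensions; take this sketch as that standard estimate, not the paper's full pattern model:

```python
import math

def dihedral_peak_rcs(a, b, wavelength):
    # physical-optics peak RCS (m^2) of an orthogonal dihedral corner reflector
    # textbook estimate: sigma_max = 8 * pi * a^2 * b^2 / lambda^2
    return 8.0 * math.pi * (a * b) ** 2 / wavelength ** 2

# example: 0.3 m x 0.3 m plates at 10 GHz (wavelength ~ 0.03 m)
sigma = dihedral_peak_rcs(0.3, 0.3, 0.03)
sigma_dbsm = 10.0 * math.log10(sigma)
```

The λ⁻² dependence means the same reflector is a much brighter reference at shorter wavelengths, which matters when transferring a calibration between bands.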
Dihedral Galois covers of algebraic varieties and the simple cases
Catanese, Fabrizio; Perroni, Fabio
2017-08-01
In this article we investigate the algebra and geometry of dihedral covers of smooth algebraic varieties. To this aim we first describe the Weil divisors and the Picard group of divisorial sheaves on normal double covers. Then we provide a structure theorem for dihedral covers, that is, given a smooth variety Y, we describe the algebraic "building data" on Y which are equivalent to the existence of such covers π : X → Y. We then introduce two special very explicit classes of dihedral covers: the simple and the almost simple dihedral covers, and we determine their basic invariants. For the simple dihedral covers we also determine their natural deformations. In the last section we give an application to fundamental groups.
Dihedral-based segment identification and classification of biopolymers I: proteins.
Nagy, Gabor; Oostenbrink, Chris
2014-01-27
A new structure classification scheme for biopolymers is introduced, which is solely based on main-chain dihedral angles. It is shown that by dividing a biopolymer into segments containing two central residues, a local classification can be performed. The method is referred to as DISICL, short for Dihedral-based Segment Identification and Classification. Compared to other popular secondary structure classification programs, DISICL is more detailed as it offers 18 distinct structural classes, which may be simplified into a classification in terms of seven more general classes. It was designed with an eye to analyzing subtle structural changes as observed in molecular dynamics simulations of biomolecular systems. Here, the DISICL algorithm is used to classify two databases of protein structures, jointly containing more than 10 million segments. The data is compared to two alternative approaches in terms of the amount of classified residues, average occurrence and length of structural elements, and pairwise matches of the classifications by the different programs. In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers II: Polynucleotides. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400542n), the analysis of polynucleotides is described and applied. Overall, DISICL represents a potentially useful tool to analyze biopolymer structures at a high level of detail.
Consequences of nonorthogonality on the scattering properties of dihedral reflectors
Anderson, W. C.
1987-10-01
Small deviations from orthogonality can reduce drastically the backscattering radar cross section (RCS) of dihedral corner reflectors. The method of physical optics is used to calculate the magnitude of the reductions in RCS which result from modest departures from orthogonality. The theoretical results are then compared with experimental measurements which are found to be in very good agreement.
Blurring and Deblurring Digital Images Using the Dihedral Group
Husein Hadi Abbas Jassim
2015-12-01
A new method of blurring and deblurring digital images is presented. The approach is based on new filters generated from the average filter and H-filters using the action of the dihedral group. These filters, called HB-filters, are used to cause a motion blur and then to deblur the affected images. Enhancing images using HB-filters is also presented and compared to other methods such as Average, Gaussian, and Motion. Results and analysis show that the HB-filters are better in terms of peak signal-to-noise ratio (PSNR) and RMSE.
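The two quality metrics used above can be computed as follows (a minimal sketch over flat pixel lists; real code would operate on 2D image arrays):

```python
import math

def rmse(original, processed):
    # root-mean-square error between two equal-length pixel sequences
    n = len(original)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(original, processed)) / n)

def psnr(original, processed, peak=255.0):
    # peak signal-to-noise ratio in dB; higher means closer to the original
    err = rmse(original, processed)
    return float("inf") if err == 0 else 20.0 * math.log10(peak / err)
```

A deblurring filter that "is better in PSNR" simply produces outputs with a smaller RMSE against the unblurred reference image.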
Active Dihedral Control System for a Torsionally Flexible Wing
Kendall, Greg T. (Inventor); Lisoski, Derek L. (Inventor); Morgan, Walter R. (Inventor); Griecci, John A. (Inventor)
2015-01-01
A span-loaded, highly flexible flying wing, having horizontal control surfaces mounted aft of the wing on extended beams to form local pitch-control devices. Each of five spanwise wing segments of the wing has one or more motors and photovoltaic arrays, and produces its own lift independent of the other wing segments, to minimize inter-segment loads. Wing dihedral is controlled by separately controlling the local pitch-control devices consisting of a control surface on a boom, such that inboard and outboard wing segment pitch changes relative to each other, and thus relative inboard and outboard lift is varied.
The Representations of Quantum Double of Dihedral Groups
Dong, Jingcheng
2011-01-01
Let $k$ be an algebraically closed field of odd characteristic $p$, and let $D_n$ be the dihedral group of order $2n$ such that $p \mid 2n$. Let $D(kD_n)$ denote the quantum double of the group algebra $kD_n$. In this paper, we describe the structures of all finite dimensional indecomposable left $D(kD_n)$-modules, equivalently, of all finite dimensional indecomposable Yetter-Drinfeld $kD_n$-modules, and classify them.
The Chromatic Number of Commuting and Non-commuting Graphs of the Dihedral Group
Handrini Rahayuningtyas
2015-11-01
A commuting graph has as its vertices a set of elements X of a group G, with two distinct vertices joined whenever the corresponding elements commute in G. Let G be a non-abelian group and Z(G) the center of G. The non-commuting graph is the graph whose vertex set is G\Z(G), with two vertices x and y adjacent if and only if xy ≠ yx. A vertex colouring of a graph assigns one of k colours to each vertex so that adjacent vertices never receive the same colour; an edge colouring assigns colours so that two edges sharing a common vertex receive different colours. The smallest number k for which such a colouring exists is called the chromatic number. In this article, we give general formulas for the chromatic numbers of the commuting and non-commuting graphs of the dihedral group.
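A small sketch of the objects involved: build the commuting graph of a dihedral group (elements written r^k s^f) and bound its chromatic number greedily; the greedy bound is only an upper bound, not the paper's closed formula:

```python
from itertools import combinations

def elements(n):
    # dihedral group D_n of order 2n: (k, f) stands for r^k s^f
    return [(k, f) for k in range(n) for f in (0, 1)]

def mult(a, b, n):
    # (r^k1 s^f1)(r^k2 s^f2) = r^(k1 + (-1)^f1 k2) s^(f1+f2), since s r = r^-1 s
    (k1, f1), (k2, f2) = a, b
    return ((k1 + (k2 if f1 == 0 else -k2)) % n, (f1 + f2) % 2)

def commuting_graph(n):
    G = elements(n)
    centre = [z for z in G if all(mult(z, x, n) == mult(x, z, n) for x in G)]
    V = [x for x in G if x not in centre]   # vertices: non-central elements
    E = {frozenset((x, y)) for x, y in combinations(V, 2)
         if mult(x, y, n) == mult(y, x, n)}
    return V, E

def greedy_colouring_bound(V, E):
    # greedy vertex colouring: an upper bound on the chromatic number
    colour = {}
    for v in V:
        used = {colour[u] for u in colour if frozenset((u, v)) in E}
        colour[v] = min(c for c in range(len(V) + 1) if c not in used)
    return max(colour.values()) + 1
```

For D4 the non-central elements split into three commuting pairs, so two colours suffice; closed formulas like those in the paper replace this enumeration for general n.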
Ostermeir, Katja; Zacharias, Martin
2014-01-15
A Hamiltonian Replica-Exchange Molecular Dynamics (REMD) simulation method has been developed that employs a two-dimensional backbone and one-dimensional side chain biasing potential specifically to promote conformational transitions in peptides. To exploit the replica framework optimally, the level of the biasing potential in each replica was appropriately adapted during the simulations. This resulted in both high exchange rates between neighboring replicas and improved occupancy/flow of all conformers in each replica. The performance of the approach was tested on several peptide and protein systems and compared with regular MD simulations and previous REMD studies. Improved sampling of relevant conformational states was observed for unrestrained protein and peptide folding simulations as well as for refinement of a loop structure with restricted mobility of loop flanking protein regions.
Thulstrup, Peter W.; Hoffmann, Søren Vrønning; Hansen, Bjarke K.V.;
2011-01-01
is supported by the results of detailed quantum chemical Time Dependent Density Functional Theory (TD-DFT) calculations. The resulting analysis has profound implications for the understanding of the optical, photochemical, and photophysical characteristics of this and related chromophores, of importance...
A Peptoid Square Helix via Synergistic Control of Backbone Dihedral Angles.
Gorske, Benjamin C; Mumford, Emily M; Gerrity, Charles G; Ko, Imelda
2017-06-21
The continued expansion of the fields of macromolecular chemistry and nanoscience has motivated the development of new secondary structures that can serve as architectural elements of innovative materials, molecular machines, biological probes, and even commercial medicines. Synthetic foldamers are particularly attractive systems for developing such elements because they are specifically designed to facilitate synthetic manipulation and functional diversity. However, relatively few predictive design principles exist that permit both rational and modular control of foldamer secondary structure, while maintaining the capacity for facile diversification of displayed functionality. We demonstrate here that the synergistic application of two such principles in the design of peptoid foldamers yields a new and unique secondary structure that we term an "η-helix" due to its repeating turns, which are highly reminiscent of peptide β-turns. Solution-phase structures of η-helices were obtained by simulated annealing using NOE-derived distance restraints, and the NMR spectra of a series of designed η-helices were altogether consistent with the primary adoption of this structure. The structure is resilient to solvent and temperature changes, and accommodates diversification without requiring postsynthetic manipulation. The unique shape, broad structural stability, and synthetic accessibility of η-helices could facilitate their utilization in a wide range of applications.
Jindal Shveta
2010-01-01
Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfields regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained using HRT version 3.0. Results: The agreement coefficient (weighted κ) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119-0.315). Sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific (borderline results included as test positives) criteria. The MRA sensitivity and specificity were 30.61% and 98% (most specific) and 57.14% and 98% (least specific). The GPS sensitivity and specificity were 81.63% and 73.47% (most specific) and 95.92% and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a lower negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was poor agreement between the overall MRA and GPS classifications. The GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. Disc size should be taken into consideration when interpreting HRT results, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs.
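The likelihood ratios quoted above follow directly from the sensitivity/specificity pairs; a quick check (values transcribed from the abstract):

```python
def likelihood_ratios(sensitivity, specificity):
    # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec  (all as proportions)
    return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

# least-specific MRA cutoff and most-specific GPS cutoff from the abstract
mra_lr_pos, mra_lr_neg = likelihood_ratios(0.5714, 0.98)
gps_lr_pos, gps_lr_neg = likelihood_ratios(0.8163, 0.7347)
```

LR+ = 28.57 corresponds to the least-specific MRA cutoff (57.14% sensitivity at 98% specificity), and LR- = 0.25 to the most-specific GPS cutoff, matching the numbers reported.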
Nielsen, Bjørn Gilbert; Jensen, Morten Østergaard; Bohr, Henrik
2003-01-01
The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity...
Fitting of dihedral terms in classical force fields as an analytic linear least-squares problem.
Hopkins, Chad W; Roitberg, Adrian E
2014-07-28
The derivation and optimization of most energy terms in modern force fields are aided by automated computational tools. It is therefore important to have algorithms to rapidly and precisely train large numbers of interconnected parameters to allow investigators to make better decisions about the content of molecular models. In particular, the traditional approach to deriving dihedral parameters has been a least-squares fit to target conformational energies through variational optimization strategies. We present a computational approach for simultaneously fitting force field dihedral amplitudes and phase constants which is analytic within the scope of the data set. This approach completes the optimal molecular mechanics representation of a quantum mechanical potential energy surface in a single linear least-squares fit by recasting the dihedral potential into a function that is linear in the parameters. We compare the resulting method to a genetic algorithm in terms of computational time and quality of fit for two simple molecules. As suggested in previous studies, arbitrary dihedral phases are only necessary when modeling chiral molecules, which include more than half of drugs currently in use, so we also examined a dihedral parametrization case for the drug amoxicillin and one of its stereoisomers where the target dihedral includes a chiral center. Asymmetric dihedral phases are needed in these types of cases to properly represent the quantum mechanical energy surface and to differentiate between stereoisomers about the chiral center.
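The key observation, that V(φ) = c₀ + Σₙ (Aₙ cos nφ + Bₙ sin nφ) is linear in Aₙ and Bₙ, with amplitude kₙ = √(Aₙ² + Bₙ²) and phase δₙ = atan2(Bₙ, Aₙ) recovered afterwards, can be sketched with plain normal equations (a toy reimplementation of the idea, not the authors' code):

```python
import math

def gauss_solve(A, b):
    # solve A x = b by Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_dihedral(phis, energies, nmax=3):
    # least-squares fit of V(phi) = c0 + sum_n A_n cos(n phi) + B_n sin(n phi)
    def basis(phi):
        row = [1.0]
        for n in range(1, nmax + 1):
            row += [math.cos(n * phi), math.sin(n * phi)]
        return row
    X = [basis(p) for p in phis]
    m = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(m)] for i in range(m)]
    Xty = [sum(r[i] * y for r, y in zip(X, energies)) for i in range(m)]
    beta = gauss_solve(XtX, Xty)
    # amplitude and phase per multiplicity: k_n cos(n phi - delta_n)
    terms = [(math.hypot(beta[2 * n - 1], beta[2 * n]),
              math.atan2(beta[2 * n], beta[2 * n - 1]))
             for n in range(1, nmax + 1)]
    return beta[0], terms
```

Because the fit is linear, it is solved in one shot with no iterative search, which is the speed advantage over a genetic algorithm that the paper measures.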
Sicard, Francois
2012-01-01
Well-Tempered Metadynamics (WTmetaD) is an efficient method to enhance the reconstruction of the free-energy surface of proteins. WTmetaD guarantees faster convergence in the long-time limit in comparison with standard metadynamics. It still suffers, however, from the same limitation, i.e., the nontrivial choice of pertinent collective variables (CVs). To circumvent this problem, we couple WTmetaD with a set of CVs generated from a dihedral Principal Component Analysis (dPCA) on the Ramachandran dihedral angles describing the backbone structure of the protein. The dPCA provides a generic method to extract relevant CVs built from internal coordinates. We illustrate the robustness of this method in the case of the small and very diffusive Met-enkephalin pentapeptide, and highlight a criterion to limit the number of CVs necessary to bias the metadynamics simulation. The free-energy landscape (FEL) of Met-enkephalin built on CVs generated from dPCA is found rugged compared with the FEL built on CVs extracted ...
Dihedral f-tilings of the sphere by rhombi and triangles
Ana M. Breda
2005-12-01
We classify, up to isomorphism, the class of all dihedral f-tilings of S², whose prototiles are a spherical triangle and a spherical rhombus. The equiangular case was considered and classified in Ana M. Breda and Altino F. Santos, Dihedral f-tilings of the sphere by spherical triangles and equiangular well-centered quadrangles. Here we complete the classification by considering the case of non-equiangular rhombi.
Strange, P.
2012-01-01
In this paper we demonstrate a surprising aspect of quantum mechanics that is accessible to an undergraduate student. We discuss probability backflow for an electron in a constant magnetic field. It is shown that even for a wavepacket composed entirely of states with negative angular momentum the effective angular momentum can take on positive…
PO Analysis for RCS of Nonorthogonal Dihedral Corner Reflectors Coated by RAM
[No author listed]
2001-01-01
The backscattering radar cross section (RCS) of nonorthogonal dihedral corner reflectors coated by RAM (radar absorbing materials) is formulated by the method of PO (physical optics), where singly, doubly, and triply reflected contributions are considered. The final expressions are analytical and allow for incidence nonperpendicular to the fold axis of the reflector. The results are compared with those of the MoM (method of moments), which shows that the trend of the backscatter pattern of the dihedral corner reflector can be well predicted by this method.
THE NUCLEAR ENCOUNTER PROBABILITY
SMULDERS, PJM
1994-01-01
This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.
Floating volumetric image formation using a dihedral corner reflector array device.
Miyazaki, Daisuke; Hirano, Noboru; Maeda, Yuki; Yamamoto, Siori; Mukai, Takaaki; Maekawa, Satoshi
2013-01-01
A volumetric display system using an optical imaging device consisting of numerous dihedral corner reflectors placed perpendicular to the surface of a metal plate is proposed. Image formation by the dihedral corner reflector array (DCRA) is free from distortion and has no focal length. In the proposed volumetric display system, a two-dimensional real image is moved by a mirror scanner to scan a three-dimensional (3D) space. Cross-sectional images of a 3D object are displayed in accordance with the position of the image plane. A volumetric image is observed as a stack of the cross-sectional images. The use of the DCRA yields a compact system configuration and volumetric real-image generation with very low distortion. An experimental volumetric display system including a DCRA, a galvanometer mirror, and a digital micro-mirror device was constructed to verify the proposed method. A volumetric image consisting of 1024×768×400 voxels was formed by the experimental system.
Coordination field calculation for rare earth complexes in dihedral symmetry field
范英芳; 杨频; 潘大丰; 王越奎
1995-01-01
The coordination field perturbation matrix element expressions for the D2 field of the terms 2S+1LJ (J = 0-8 and J = 1/2-15/2) with fN (N = 1-13) configurations have been derived. The concrete forms of the DSCPCF parameters Akm in the dihedral fields (D2, C2v) for various ligand numbers (5-12) and their reduction behavior in the higher-symmetry fields (D4, C4v, D2d, D4d, D2h, D4h and Oh) are discussed with the double sphere coordination point charge field (DSCPCF) model and the irreducible tensor operator method. In addition, the corresponding computational schemes have been developed and the computer program DSF.D has been compiled, which is applicable to the spectral analysis of rare earth ion complexes with arbitrary ligand numbers in dihedral, tetragonal and cubic symmetry fields.
Hu, Yao; Shi, Rui
2016-10-01
Optical micro-structure arrays, including microlens arrays and pyramid arrays, have the function of integral imaging or diffractive beam-splitting. Careful measurement of the 3D profile of the array is a basic approach for ensuring its quality. However, due to the limited numerical aperture of microscopy, when the surface is too steep, typically steeper than 45 degrees, little light is reflected or scattered back to the measurement equipment. The signal-to-noise ratio drops below the measurable threshold and information is lost during measurement. In our case, the dihedral angle of the sample surface is 90 degrees. Intuitively, the reflected rays should be parallel to the incident rays after two reflections and can be picked up by the detector. Nevertheless, the white-light interference microscope still showed no information on the 45-degree-inclined surface. In this paper, we study the twice-reflection at a dihedral angle of 90 degrees. We put the sample in the test beam of a spherical interferometer to simulate the situation in the microscope. Simulation and real experiments suggest that the twice-reflected beam is of low spatial coherence and may act as background intensity in the white-light interferogram. This result does not directly lead to a novel testing approach, but it pinpoints the problem and provides a basis for new ideas.
Generating finite cyclic and dihedral groups using sequential insertion systems with interactions
Fong, Wan Heng; Sarmin, Nor Haniza; Turaev, Sherzod; Yosman, Ahmad Firdaus
2017-04-01
The operation of insertion has been studied extensively throughout the years for its impact in many areas of theoretical computer science such as DNA computing. First introduced as a generalization of the concatenation operation, many variants of insertion have been introduced, each with its own computational properties. In this paper, we introduce a new variant, called sequential insertion systems with interactions, that enables the generation of some special types of groups. We show that these new systems are able to generate all finite cyclic and dihedral groups.
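While the paper's systems generate groups via an insertion operation on strings, the target objects themselves are easy to realize concretely; this sketch simply closes a generating set of permutations to recover the cyclic group C_n and dihedral group D_n:

```python
def closure(generators, identity, mult):
    # smallest set containing identity and closed under right-multiplication
    elems = {identity}
    frontier = [identity]
    while frontier:
        x = frontier.pop()
        for g in generators:
            y = mult(x, g)
            if y not in elems:
                elems.add(y)
                frontier.append(y)
    return elems

def perm_mult(p, q):
    # composition of permutations given as tuples: (p*q)(i) = p[q[i]]
    return tuple(p[i] for i in q)

def cyclic_order(n):
    rot = tuple((i + 1) % n for i in range(n))      # rotation i -> i+1 (mod n)
    return len(closure([rot], tuple(range(n)), perm_mult))

def dihedral_order(n):
    rot = tuple((i + 1) % n for i in range(n))
    refl = tuple((-i) % n for i in range(n))        # reflection i -> -i (mod n)
    return len(closure([rot, refl], tuple(range(n)), perm_mult))
```

The closure of {rotation} has n elements and the closure of {rotation, reflection} has 2n, which are exactly the orders of the groups the insertion systems are shown to generate.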
Concerted dihedral rotations give rise to internal friction in unfolded proteins.
Echeverria, Ignacia; Makarov, Dmitrii E; Papoian, Garegin A
2014-06-18
Protein chains undergo conformational diffusion during folding and dynamics, experiencing both thermal kicks and viscous drag. Recent experiments have shown that the corresponding friction can be separated into wet friction, which is determined by the solvent viscosity, and dry friction, where frictional effects arise due to the interactions within the protein chain. Despite important advances, the molecular origins underlying dry friction in proteins have remained unclear. To address this problem, we studied the dynamics of the unfolded cold-shock protein at different solvent viscosities and denaturant concentrations. Using extensive all-atom molecular dynamics simulations we estimated the internal friction time scales and found them to agree well with the corresponding experimental measurements (Soranno et al. Proc. Natl. Acad. Sci. U.S.A. 2012, 109, 17800-17806). Analysis of the reconfiguration dynamics of the unfolded chain further revealed that hops in the dihedral space provide the dominant mechanism of internal friction. Furthermore, the increased number of concerted dihedral moves at physiological conditions suggest that, in such conditions, the concerted motions result in higher frictional forces. These findings have important implications for understanding the folding kinetics of proteins as well as the dynamics of intrinsically disordered proteins.
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne...
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
s-REGULAR DIHEDRAL COVERINGS OF THE COMPLETE GRAPH OF ORDER 4
FENG YANQUAN(冯衍全); J.H. KWAK
2004-01-01
A graph is s-regular if its automorphism group acts regularly on the set of its s-arcs. An infinite family of cubic 1-regular graphs was constructed in [7] as cyclic coverings of the three-dimensional hypercube, and a classification of all s-regular cyclic coverings of the complete bipartite graph of order 6 was given in [8] for each s ≥ 1, whose fibre-preserving automorphism subgroups act arc-transitively. In this paper, the authors classify all s-regular dihedral coverings of the complete graph of order 4 for each s ≥ 1, whose fibre-preserving automorphism subgroups act arc-transitively. As a result, a new infinite family of cubic 1-regular graphs is constructed.
Numerical analysis of magnetic states mixing in the Heisenberg model with the dihedral symmetry
Jaśniewicz-Pacer K.
2013-01-01
Full Text Available The total spin number S is not a 'good quantum number' for the Heisenberg model with single-ion anisotropy, so the Hamiltonian eigenstates with different S may form linear combinations. Sometimes it is assumed that S can be used as an 'approximate quantum number', though some results show that mixing of S-states is important in investigations of magnetic molecules. Some small spin systems with the dihedral symmetry are analyzed to investigate different schemes of mixing and their dependence on the anisotropy parameter. The results show various behaviors of the magnetic state mixing. The mean (over a state) value of the total spin is quite stable for the ground state, but in other cases this dependence is nonlinear and sometimes non-monotonic.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
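The conditions CP1-CP3 above can be checked directly on a small finite space. This is a minimal sketch (the sample space and events are made up for illustration, not taken from the paper) using the usual counting definition µ(V | U) = |V ∩ U| / |U| under a uniform measure:

```python
# Finite sample space with a uniform measure; verify CP1-CP3 for
# mu(V | U) = |V & U| / |U| on hand-picked events.
W = frozenset(range(6))  # e.g., one roll of a die

def mu(V, U):
    """Conditional probability of V given U under the uniform measure on W."""
    V, U = frozenset(V), frozenset(U)
    return len(V & U) / len(U)

U = {1, 2, 3, 4}
# CP1: mu(U | U) = 1
assert mu(U, U) == 1.0
# CP2: additivity over disjoint V1, V2
V1, V2 = {1}, {2, 3}
assert mu(V1 | V2, U) == mu(V1, U) + mu(V2, U)
# CP3: chain rule for V subset of X subset of U
V, X = {1}, {1, 2}
assert abs(mu(V, U) - mu(V, X) * mu(X, U)) < 1e-12
print("CP1-CP3 hold on this finite example")
```

The interesting cases in the paper arise when µ(· | U) must also be defined for events U of probability zero, which the counting definition above cannot handle.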
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando
Daniel Ting
2010-04-01
Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
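The simplest of the density estimators this abstract mentions, and the baseline used elsewhere in this collection (a 30° × 30° binning of the Ramachandran map), can be sketched as a normalized 2D histogram. The angles below are random stand-ins, not real dihedral data, and this is not the authors' hierarchical Dirichlet process code:

```python
import numpy as np

# Normalized histogram over the Ramachandran map: phi/psi in [-180, 180)
# split into 30 x 30 degree bins, giving 12 x 12 = 144 bins.
rng = np.random.default_rng(0)
phi = rng.uniform(-180, 180, size=5000)   # stand-in dihedral angles (degrees)
psi = rng.uniform(-180, 180, size=5000)

edges = np.arange(-180, 181, 30)          # 13 edges -> 12 bins per axis
hist, _, _ = np.histogram2d(phi, psi, bins=[edges, edges])
density = hist / hist.sum()               # empirical bin probabilities

assert density.shape == (12, 12)
assert abs(density.sum() - 1.0) < 1e-12
```

Nonparametric estimators such as the Dirichlet-process approach replace these hard bin counts with smooth densities that share statistical strength across neighboring residue types.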
Angle Tree: Nearest Neighbor Search in High Dimensions with Low Intrinsic Dimensionality
Zvedeniouk, Ilia
2010-01-01
We propose an extension of tree-based space-partitioning indexing structures for data with low intrinsic dimensionality embedded in a high dimensional space. We call this extension an Angle Tree. Our extension can be applied to both classical kd-trees as well as the more recent rp-trees. The key idea of our approach is to store the angle (the "dihedral angle") between the data region (which is a low dimensional manifold) and the random hyperplane that splits the region (the "splitter"). We show that the dihedral angle can be used to obtain a tight lower bound on the distance between the query point and any point on the opposite side of the splitter. This in turn can be used to efficiently prune the search space. We introduce a novel randomized strategy to efficiently calculate the dihedral angle with a high degree of accuracy. Experiments and analysis on real and synthetic data sets shows that the Angle Tree is the most efficient known indexing structure for nearest neighbor queries in terms of preprocessing ...
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
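Probability leakage as defined above is easy to exhibit numerically. In this hedged sketch (the numbers are made up for illustration), the evidence E says the observable is nonnegative, yet a normal model M still puts mass below zero:

```python
from math import erf, sqrt

# Suppose evidence E says y >= 0 (e.g., y is a mass or a count), but the
# model M is Normal(mu=1, sigma=1). M then assigns positive probability
# to the impossible event y < 0; that mass is the probability leakage.
def normal_cdf(x, mu, sigma):
    """Standard closed form of the normal CDF via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

leakage = normal_cdf(0.0, mu=1.0, sigma=1.0)  # P_M(y < 0)
print(f"probability leakage: {leakage:.4f}")   # about 0.1587
```

As the abstract notes, such leakage is common in regression models whose error distributions have unbounded support while the data do not.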
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
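One classic example of an event with probability near 1/e (the article's own three problems are not reproduced here, so this stand-in is chosen for familiarity) is the derangement problem: the probability that a uniformly random permutation has no fixed point tends to 1/e. A quick Monte Carlo sketch:

```python
import random
from math import exp

def is_derangement(n, rng):
    """Shuffle 0..n-1 uniformly and report whether no element stays in place."""
    p = list(range(n))
    rng.shuffle(p)
    return all(p[i] != i for i in range(n))

rng = random.Random(42)
trials = 20000
hits = sum(is_derangement(10, rng) for _ in range(trials))
print(hits / trials, "vs 1/e =", round(exp(-1), 4))  # both near 0.3679
```

Even at n = 10 the convergence to 1/e ≈ 0.3679 is already visible, which is part of why this constant appears so often.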
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram is expressed in terms of the
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, none of the proofs the author encountered is valid
Probability and Relative Frequency
Drieschner, Michael
2016-01-01
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
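The relation between probability and relative frequency that the paper interprets can be illustrated with a simulation (a generic law-of-large-numbers sketch, not the paper's own formal "ladder" construction): observed relative frequencies approach the probability as the number of trials grows.

```python
import random

# Simulate Bernoulli(p) trials and track the relative frequency of success.
rng = random.Random(1)
p = 0.3
freqs = {}
for n in (100, 10000, 100000):
    freqs[n] = sum(rng.random() < p for _ in range(n)) / n
    print(n, round(freqs[n], 4))
# The printed relative frequencies settle near p = 0.3 as n grows.
```

On the "prediction interpretation", the value p = 0.3 is itself the predicted relative frequency, which is why its structure matches that of probability.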
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
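The best-known scoring rule of the kind discussed above is the Brier score: the mean squared difference between the forecast probability and the binary outcome. A hedged sketch with made-up forecasts (this is the classical score the paper builds on, not the paper's own martingale-based loss):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

forecasts = [0.9, 0.7, 0.2, 0.5]
outcomes  = [1,   1,   0,   1]
print(round(brier_score(forecasts, outcomes), 4))  # 0.0975
```

Lower is better; a perfectly sharp and calibrated forecaster scores 0.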
Omotuyi, Olaposi I; Hamada, Tsuyoshi
2015-01-01
Inhibitors of human furin may represent the clinical remedy for very aggressive cancers and for viral and bacterial infections. Most of the currently available inhibitors are weak in terms of potency, drug-likeness, and furin specificity, thereby necessitating the development of newer compounds, especially mechanism-based inhibitors. Here, the roles of the active-site residues Cys198 (C198), His194 (H194), and Ser386 (S386) were investigated using computational site-directed mutagenesis and molecular dynamics (MD) simulation. Data were obtained from six (6) biosystems: wildtype (C198/S386), furin-C198G (S386), furin-S386G (C198), and their peptide (nascent hydrolyzed peptide H2N-RTRR-CO2) bound complexes. The results strongly supported that in wildtype furin, but not the S386G and C198G mutants, following S386/scissile-carbon attack (4.0 Å), the peptide retracted from the active site, representing peptide release post hydrolysis. Furthermore, in the S386G mutant, the C198 side-chain thiolate ion may act as the nucleophile replacement, but competing electron-rich centers (H194, H364) and an energetically unattainable geometric strain on the peptide may constitute the limiting factors. In biosystems not complexed with peptide (representative of the pre-attack state), C198 preferentially engaged the H194 imidazole moiety via a sulfur-π bond system, causing dihedral and positional restraints on the imidazole ring for ultimate alignment of its NE2 hydrogen atom with the side-chain enolate oxygen of S364 required for optimal proton transfer. In summary, small-molecular-weight compounds with dual serine and cysteine protease inhibitory actions may represent a new class of potent and furin-selective compounds for future clinical applications.
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2013-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob
Dynamical Simulation of Probabilities
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed. Special attention is focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulatingexamples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size, and quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability does; and (iv) removing the largest landslides from the analysis leads to enhanced model performance.
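The combination rule in step (v) above is a simple per-pixel operation. This minimal sketch uses made-up stand-in rasters (the study's actual release, impact, and zonal layers come from its statistical models, not from these numbers):

```python
import numpy as np

# Integrated spatial landslide probability per pixel:
#   max(release probability, impact probability * zonal release probability)
release = np.array([[0.05, 0.20], [0.10, 0.02]])  # pixel release probability
impact  = np.array([[0.90, 0.10], [0.50, 0.80]])  # angle-of-reach impact prob.
zonal   = np.array([[0.30, 0.30], [0.40, 0.40]])  # zonal release probability

integrated = np.maximum(release, impact * zonal)
print(integrated)
```

The maximum ensures a pixel that is itself a likely release point keeps its release probability even when no upslope zone is likely to reach it.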
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
Keywords: statistical analysis, probability, information theory, differential equations, statistical processes, stochastic processes, multivariate analysis, distribution theory, decision theory, measure theory, optimization.
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
A new angle on the Euler angles
Markley, F. Landis; Shuster, Malcolm D.
1995-01-01
We present a generalization of the Euler angles to axes beyond the twelve conventional sets. The generalized Euler axes must satisfy the constraint that the first and the third are orthogonal to the second; but the angle between the first and third is arbitrary, rather than being restricted to the values 0 and pi/2, as in the conventional sets. This is the broadest generalization of the Euler angles that provides a representation of an arbitrary rotation matrix. The kinematics of the generalized Euler angles and their relation to the attitude matrix are presented. As a side benefit, the equations for the generalized Euler angles are universal in that they incorporate the equations for the twelve conventional sets of Euler angles in a natural way.
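The constraint described above can be checked numerically: pick a middle axis n2 and any first and third axes orthogonal to it (n1 and n3 need not be orthogonal to each other), and the product of the three axis rotations is a proper rotation. This is only a sketch of the constraint, not the paper's kinematic equations; the axes and angles below are arbitrary choices:

```python
import numpy as np

def axis_rotation(n, angle):
    """Rotation matrix about unit axis n by the given angle (Rodrigues' formula)."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    K = np.array([[0, -n[2], n[1]],
                  [n[2], 0, -n[0]],
                  [-n[1], n[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

n2 = np.array([0.0, 0.0, 1.0])
n1 = np.array([1.0, 0.0, 0.0])               # orthogonal to n2
n3 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)  # orthogonal to n2, 45 deg from n1

R = axis_rotation(n1, 0.3) @ axis_rotation(n2, 1.1) @ axis_rotation(n3, -0.7)
assert np.allclose(R @ R.T, np.eye(3))       # orthogonal
assert np.isclose(np.linalg.det(R), 1.0)     # proper rotation
```

The twelve conventional Euler-angle sets are the special cases where the angle between the first and third axes is 0 or π/2.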
Deep learning methods for protein torsion angle prediction.
Li, Haiou; Hou, Jie; Adhikari, Badri; Lyu, Qiang; Cheng, Jianlin
2017-09-18
Deep learning is one of the most powerful machine learning methods and has achieved state-of-the-art performance in many domains. Since deep learning was introduced to the field of bioinformatics in 2012, it has achieved success in a number of areas such as protein residue-residue contact prediction, secondary structure prediction, and fold recognition. In this work, we developed deep learning methods to improve the prediction of torsion (dihedral) angles of proteins. We designed four different deep learning architectures to predict protein torsion angles: a deep neural network (DNN), a deep restricted Boltzmann machine (DRBM), a deep recurrent neural network (DRNN), and a deep recurrent restricted Boltzmann machine (DReRBM), the recurrent variants because protein torsion angle prediction is a sequence-related problem. In addition to existing protein features, two new features (predicted residue contact number and the error distribution of torsion angles extracted from sequence fragments) are used as input to each of the four deep learning architectures to predict the phi and psi angles of the protein backbone. The mean absolute error (MAE) of the phi and psi angles predicted by DRNN, DReRBM, DRBM and DNN is about 20-21° and 29-30°, respectively, on an independent dataset. The MAE of the phi angle is comparable to that of existing methods, but the MAE of the psi angle is 29°, 2° lower than existing methods. On the latest CASP12 targets, our methods also achieved performance better than or comparable to a state-of-the-art method. Our experiments demonstrate that deep learning is a valuable method for predicting protein torsion angles. The deep recurrent network architecture performs slightly better than the deep feed-forward architecture, and the predicted residue contact number and the error distribution of torsion angles extracted from sequence fragments are useful features for improving prediction accuracy.
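When computing an MAE for angles like the one reported above, the error must be taken modulo 360° so that, for example, -179° and +179° count as 2° apart rather than 358°. A small sketch of this circular MAE (the exact evaluation code of the paper is not shown, so this is a generic implementation of the standard convention):

```python
import numpy as np

def circular_mae(pred_deg, true_deg):
    """Mean absolute angular error in degrees, wrapping differences to [0, 180]."""
    diff = np.abs(np.asarray(pred_deg) - np.asarray(true_deg)) % 360.0
    return float(np.mean(np.minimum(diff, 360.0 - diff)))

print(circular_mae([-179.0, 10.0], [179.0, 40.0]))  # (2 + 30) / 2 = 16.0
```

Without the wrap-around, predictors would be unfairly penalized near the ±180° boundary of the phi/psi range.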
On Quantum Conditional Probability
Isabel Guerra Bobo
2013-02-01
Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
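The Lüders rule the abstract refers to assigns, for a density matrix ρ and projectors P and Q, the value p(Q | P) = Tr(P ρ P Q) / Tr(ρ P). A small numerical sketch on a single qubit (the states and projectors are illustrative choices; whether this quantity deserves the name "conditional probability" is exactly what the paper disputes):

```python
import numpy as np

rho = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit state
P = np.array([[1.0, 0.0], [0.0, 0.0]])     # projector onto |0>
plus = np.array([[0.5, 0.5], [0.5, 0.5]])  # projector onto |+>

def luders(rho, P, Q):
    """Luders conditionalization: Tr(P rho P Q) / Tr(rho P)."""
    return np.trace(P @ rho @ P @ Q).real / np.trace(rho @ P).real

print(luders(rho, P, plus))  # 0.5: after conditioning on |0>, |+> has prob 1/2
```

Classically, conditioning on an event cannot change the probability of an event it contains; the non-commuting projectors here are what break the classical axioms.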
Choice Probability Generating Functions
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
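For the best-known ARUM, i.i.d. Gumbel perturbations (multinomial logit), the CPGF is the log-sum-exp of the deterministic utilities and its gradient is the vector of choice probabilities (the softmax). A hedged numerical check of that gradient property (the utilities are made up; this illustrates only the logit special case, not the paper's general theory):

```python
import numpy as np

v = np.array([1.0, 0.0, -0.5])        # deterministic utilities of 3 alternatives

def cpgf(v):
    """CPGF for the logit case: log-sum-exp of utilities."""
    return np.log(np.sum(np.exp(v)))

probs = np.exp(v) / np.sum(np.exp(v))  # softmax = gradient of the CPGF
eps = 1e-6
grad = np.array([(cpgf(v + eps * np.eye(3)[i]) - cpgf(v)) / eps for i in range(3)])
assert np.allclose(grad, probs, atol=1e-5)
print(np.round(probs, 3))
```

The paper's contribution is characterizing which functions can play the role of `cpgf` for arbitrary ARUMs, not just the logit case.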
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
Probability, Nondeterminism and Concurrency
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians: a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Probability in quantum mechanics
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
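The stimulus described in this abstract, a stepwise nonstationary Bernoulli process, is easy to simulate. The sketch below (Python) generates such a sequence and tracks the hidden parameter with a naive trailing-window estimator; this estimator is purely an illustrative baseline, not the change-point model the authors propose.

```python
import random

random.seed(42)

def stepwise_bernoulli(segments):
    """Generate a stepwise nonstationary Bernoulli sequence.

    `segments` is a list of (p, length) pairs: the hidden parameter
    stays at p for `length` outcomes, then steps to the next value.
    """
    return [1 if random.random() < p else 0
            for p, n in segments for _ in range(n)]

def trailing_estimate(outcomes, window=50):
    """Naive running estimate of the hidden p from a trailing window."""
    return [sum(outcomes[max(0, i - window + 1): i + 1]) / min(i + 1, window)
            for i in range(len(outcomes))]

# One step change: p = 0.2 for 500 trials, then p = 0.8 for 500 trials.
outcomes = stepwise_bernoulli([(0.2, 500), (0.8, 500)])
est = trailing_estimate(outcomes)
print(est[499], est[999])  # near 0.2 before the step, near 0.8 after
```

A window average lags behind step changes and fluctuates constantly, which is precisely the behavior the paper's subjects do not show; that contrast motivates the change-point encoding the model is built on.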
Federal Laboratory Consortium — Description: The FTA32 goniometer provides video-based contact angle and surface tension measurement. Contact angles are measured by fitting a mathematical expression...
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Zurek, W H
2004-01-01
I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, p_k ∝ |ψ_k|². Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
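Born's rule itself, p_k ∝ |ψ_k|², is a one-line computation once the amplitudes are given. A minimal numeric illustration (unrelated to the envariance derivation, which is the abstract's actual subject):

```python
def born_probabilities(amplitudes):
    """Normalize the squared moduli of complex amplitudes into probabilities."""
    weights = [abs(a) ** 2 for a in amplitudes]
    total = sum(weights)
    return [w / total for w in weights]

# An unnormalized two-state superposition: psi = 1*|0> + (1+1j)*|1>.
# Squared moduli are 1 and 2, so the probabilities are 1/3 and 2/3.
probs = born_probabilities([1, 1 + 1j])
print(probs)
```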
Collision Probability Analysis
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
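For the best-known RUM, the multinomial logit, the CPGF is the log-sum-exp function and its gradient recovers the familiar softmax choice probabilities. The sketch below checks this numerically; using the logit case as the concrete instance of the paper's general gradient result is an assumption for illustration.

```python
import math

def cpgf_logit(u):
    """Log-sum-exp: the choice-probability generating function of the logit RUM."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probs_via_gradient(u, h=1e-6):
    """Central-difference gradient of the CPGF: recovers choice probabilities."""
    probs = []
    for i in range(len(u)):
        up, dn = u[:], u[:]
        up[i] += h
        dn[i] -= h
        probs.append((cpgf_logit(up) - cpgf_logit(dn)) / (2 * h))
    return probs

u = [1.0, 0.0, -1.0]
grad = choice_probs_via_gradient(u)
softmax = [math.exp(x - cpgf_logit(u)) for x in u]
print(grad)  # numerically matches the softmax (logit) probabilities
```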
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
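The idea of extrapolating tail probabilities beyond the data from the shape of the tail can be illustrated with a much simpler device than the paper's weighted estimators: an empirical estimate inside the data range, plus an exponential fit to the excesses over a threshold beyond it. This is a sketch under the assumption of an exponential-looking tail, not a reconstruction of the paper's method.

```python
import math
import random

def tail_probability(sample, t, threshold):
    """Estimate P(X > t) from a sample.

    Inside the data range this is the empirical tail frequency; beyond
    `threshold` it extrapolates with an exponential fitted to the mean
    excess over the threshold.
    """
    n = len(sample)
    if t <= threshold:
        return sum(1 for x in sample if x > t) / n
    exceed = [x - threshold for x in sample if x > threshold]
    beta = sum(exceed) / len(exceed)  # mean excess = exponential scale estimate
    return (len(exceed) / n) * math.exp(-(t - threshold) / beta)

random.seed(0)
data = [random.expovariate(1.0) for _ in range(10000)]
est = tail_probability(data, 5.0, 2.0)
print(est)  # the true tail probability here is exp(-5), about 0.0067
```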
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : x_1 + ⋯ + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Contact Angle Effects in Boiling Heat Transfer
Urquiola, Erwin; Fujita, Yasunobu
2002-01-01
This paper reports boiling experiments with pure water and surfactant solutions of SDS on a horizontal heating surface. The static contact angle, rather than the surface tension value, was found to be the leading factor for the results and probably its prev
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Contributions to quantum probability
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
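The heavy-tail effect of superposing Gaussians of different variances is easy to see numerically. In the sketch below the two-point smearing distribution over variances is an arbitrary illustrative choice, not one taken from the paper; the point is only that the mixture's kurtosis exceeds the Gaussian value of 3.

```python
import random

random.seed(1)

def smeared_gaussian_sample(n):
    """Draw from a superposition of zero-mean Gaussians: first draw a
    variance v from a smearing distribution, then draw N(0, v)."""
    out = []
    for _ in range(n):
        v = random.choice([0.5, 2.0])  # illustrative smearing over variances
        out.append(random.gauss(0.0, v ** 0.5))
    return out

xs = smeared_gaussian_sample(100000)
m2 = sum(x * x for x in xs) / len(xs)
m4 = sum(x ** 4 for x in xs) / len(xs)
kurtosis = m4 / m2 ** 2
print(kurtosis)  # exceeds 3: heavier tails than any single Gaussian
```

For this mixture the exact kurtosis is 3·E[v²]/E[v]² = 3·2.125/1.5625 ≈ 4.08.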
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Eliazar, Iddo; Klafter, Joseph
2008-06-01
We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law, commonly perceived as the "universal fractal probability distribution", is merely a special case of the hyper Pareto class.
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability, given the same probability of false alarm and the same parameter estimation data. As quantum probability provided more effective detectors than classical probability in domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than a system which implements set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.
Whiting, Alan B
2014-01-01
Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.
Probably Almost Bayes Decisions
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Generalized Probability Functions
Alexandre Souto Martinez
2009-01-01
From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (PDFs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot PDF, and the generalized Gaussian and Laplace PDFs. Their cumulative functions and moments are also obtained analytically.
Optimal relative view angles for an object viewed multiple times
Gilani, Syed U.; Shende, Apoorva; Nguyen, Bao; Stilwell, Daniel J.
2015-05-01
Typically, the detection of an object of interest improves as we view the object from multiple angles. For cases where viewing angle matters, object detection can be improved further by optimally selecting the relative angles of multiple views. This motivates the search for viewing angles that maximize the expected probability of detection. Although our work is motivated by applications in subsea sensing, our fundamental analysis is easily adapted for other classes of applications. The specific challenge that motivates our work is the selection of optimal viewing angles for subsea sensing in which sonar is used for bathymetric imaging.
Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.
2014-01-01
Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…
Maadooliat, Mehdi
2012-08-27
Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order {exp(-c L^{d+1})} where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the {d=1} case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case {d ≥ 2} are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Learning unbelievable marginal probabilities
Pitkow, Xaq; Miller, Ken D
2011-01-01
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...
Angle dependence of Andreev scattering at semiconductor-superconductor interfaces
Mortensen, Asger; Flensberg, Karsten; Jauho, Antti-Pekka
1999-01-01
and increase of the probability of normal reflection. We show that in the presence of a Fermi velocity mismatch between the semiconductor and the superconductor the angles of incidence and transmission are related according to the well-known Snell's law in optics. As a consequence there is a critical angle...
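The Snell-type relation mentioned in this abstract can be sketched as follows. The convention below writes sin(θ_t) = r·sin(θ_in) with r the relative Fermi-velocity (or Fermi-wavevector) ratio across the interface; the direction of the mismatch that produces a critical angle, and all function names, are illustrative assumptions rather than the paper's exact formulation:

```python
import math

def transmission_angle(theta_in, ratio):
    """Snell-type refraction: sin(theta_t) = ratio * sin(theta_in),
    where `ratio` encodes the Fermi-velocity mismatch across the
    semiconductor-superconductor interface. Returns None beyond the
    critical angle, i.e. total normal reflection (illustrative convention)."""
    s = ratio * math.sin(theta_in)
    if abs(s) > 1.0:
        return None  # no propagating transmitted wave
    return math.asin(s)

def critical_angle(ratio):
    """Largest incidence angle with a propagating transmitted wave;
    pi/2 when ratio <= 1 (no critical angle exists)."""
    return math.asin(1.0 / ratio) if ratio > 1.0 else math.pi / 2
```

For `ratio > 1` the transmitted beam bends away from the normal, so incidence angles above `critical_angle(ratio)` are totally reflected, mirroring the optical analogy the abstract draws.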
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
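The model described in this abstract (frequentist estimation corrupted by random noise) can be sketched as below. The flip-probability noise channel, the event probabilities, and all names are illustrative assumptions, not the authors' exact formulation:

```python
import random

def noisy_conditional_estimate(p_a_given_b, p_b, d, n, seed=0):
    """Estimate P(A|B) from n samples, where each stored flag is
    independently flipped with probability d before counting.
    This models random read noise on retrieved event memories
    (an illustrative parameterization)."""
    rng = random.Random(seed)
    count_b = 0
    count_ab = 0
    for _ in range(n):
        b = rng.random() < p_b
        a = b and (rng.random() < p_a_given_b)
        # read each flag back through a noisy channel
        a_read = a if rng.random() >= d else not a
        b_read = b if rng.random() >= d else not b
        if b_read:
            count_b += 1
            if a_read:
                count_ab += 1
    return count_ab / count_b if count_b else 0.0

noise_free = noisy_conditional_estimate(0.7, 0.5, 0.0, 20000)  # recovers ~0.7
noisy = noisy_conditional_estimate(0.7, 0.5, 0.1, 20000)       # biased by noise
```

With `d = 0` the estimator converges to the true conditional probability; with `d > 0` it is systematically biased toward 0.5, which is the kind of deviation the model uses to explain judgment errors.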
Podzharenko, Volodymyr A.; Kulakov, Pavlo I.
2001-06-01
A photoelectric angle-of-rotation transducer is proposed in which the output voltage is a linear function of the input quantity. The transducer uses a linear photodetector based on a photodiode and operational amplifier pair, whose output voltage is a linear function of the area of the illuminated photosensitive layer, together with a light-flux modulator of special shape that ensures a linear dependence of this area on the angle of rotation. The transducer has good frequency properties and can be used for dynamic measurements of angular velocity and angle of rotation, in precision drive systems and automatic control systems.
Savage's Concept of Probability
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
Probability Theory without Bayes' Rule
Rodriques, Samuel G.
2014-01-01
Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...
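For reference, the standard Bayes-rule inversion that this paper generalizes looks like the following in code; the screening-test numbers are a generic textbook example, not taken from the paper:

```python
def bayes_posterior(p_a_given_b, p_b, p_a_given_not_b):
    """Standard Kolmogorov-style Bayes rule: compute P(B|A) from
    P(A|B), the prior P(B), and P(A|~B). The paper's point is that a
    continuum of alternative inference rules reproduces this result."""
    # total probability of the evidence A
    p_a = p_a_given_b * p_b + p_a_given_not_b * (1.0 - p_b)
    return p_a_given_b * p_b / p_a

# Generic screening example: rare condition (1%), 90% sensitivity,
# 5% false-positive rate.
posterior = bayes_posterior(0.90, 0.01, 0.05)
```

Even with a 90% sensitive test, the posterior here is only about 0.15, because the 1% prior dominates; any alternative rule in the paper's generalized family must reproduce this same number.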
Farley, Gary L.
1990-01-01
Bias-direction or angle-ply weaving is a proposed new process for weaving fibers along the bias in conventional planar fabric or in a complicated three-dimensional multilayer fabric preform of a fiber-reinforced composite structure. It is based upon the movement of racks of needles and the corresponding angle yarns across the fabric as the fabric is being formed. Fibers woven along the bias increase the shear stiffness and shear strength of the preform, increasing its value as a structural member.
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Probability distributions for magnetotellurics
Stodt, John A.
1982-11-01
Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
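The behavior described in this abstract can be explored with a small Monte-Carlo sketch of a transfer-function estimate viewed as a ratio of two complex random variables. The independent Gaussian fractional noise on real and imaginary parts is an illustrative noise model, not the paper's exact joint complex-normal setup, and the true numerator and denominator values are arbitrary:

```python
import math
import random
import statistics

def log_sq_magnitude_samples(num, err, seed=1):
    """Draw samples of log|Z|^2 where Z = numerator/denominator and
    both parts carry independent Gaussian fractional noise `err`
    (illustrative noise model)."""
    rng = random.Random(seed)
    true_num = complex(2.0, 1.0)
    true_den = complex(1.0, -0.5)
    out = []
    for _ in range(num):
        n = true_num * (1.0 + complex(rng.gauss(0.0, err), rng.gauss(0.0, err)))
        d = true_den * (1.0 + complex(rng.gauss(0.0, err), rng.gauss(0.0, err)))
        out.append(math.log(abs(n / d) ** 2))
    return out

vals = log_sq_magnitude_samples(5000, 0.10)
mean = statistics.fmean(vals)   # close to log|Z|^2 = log 4 at 10% errors
sd = statistics.stdev(vals)
```

At 10% errors the sample distribution of `log|Z|^2` stays close to normal around the true value, consistent with the abstract's finding that the logarithm of the squared magnitude tolerates much larger errors than the squared magnitude itself.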
Gap probability - Measurements and models of a pecan orchard
Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, Yi
1992-01-01
Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs with a 50° by 135° view angle, made under the canopy looking upwards at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels, crown and leaf. Crown-level parameters include the shape of the crown envelope and the spacing of crowns; leaf-level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.
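A minimal leaf-level sketch of gap probability is the classic Beer's-law form P(theta) = exp(-G * LAI / cos(theta)); the two-level model in this record adds crown geometry on top of such a leaf level. The spherical leaf-angle-distribution value G = 0.5 and the function names are illustrative assumptions:

```python
import math

def gap_probability(lai, zenith_deg, g=0.5):
    """Beer's-law gap fraction for a single-level turbid canopy:
    P(theta) = exp(-G * LAI / cos(theta)).
    lai: leaf area index; zenith_deg: view zenith angle in degrees;
    g: leaf projection function (0.5 = spherical leaf-angle distribution,
    an illustrative default)."""
    theta = math.radians(zenith_deg)
    return math.exp(-g * lai / math.cos(theta))
```

The gap fraction decreases with leaf area and with view zenith angle, since oblique views traverse a longer path through the canopy.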
RANDOM VARIABLE WITH FUZZY PROBABILITY
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a fuzzy-number probability set was given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, under the closure of fuzzy probability operations, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. Every mathematical description of the RVFP is closed under fuzzy probability operations; as a result, the foundation for perfecting fuzzy probability operation methods is laid.
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable
P. V. Ramakrishna
2009-01-01
This article presents a study of Tip Chordline Sweeping (TCS) and Axial Sweeping (AXS) of low-speed axial compressor rotor blades against the performance of the baseline unswept rotor (UNS) for different tip clearance levels. The first part of the paper discusses the changes in design parameters when the blades are swept, while the second part throws light on the effect of sweep on tip-leakage-flow-related phenomena. 15 domains are studied with 5 sweep configurations (0°, 20° TCS, 30° TCS, 20° AXS, and 30° AXS) and 3 tip clearances (0.0%, 0.7%, and 2.7% of the blade chord). A commercial CFD package is employed for the flow simulations and analysis. Results are well validated with experimental data. Forward sweep reduced the flow incidences; this is true all over the span with axial sweeping, while slightly higher incidences below mid-span are observed with tip chordline sweeping. Sweeping is observed to lessen the flow turning. AXS rotors demonstrated more efficient energy transfer among the rotors. Tip chordline sweep deflected the flow towards the hub, while the effective positive dihedral induced by axial sweeping resulted in outward deflection of the flow streamlines. These deflections are greater at lower mass flow rates.
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Conditionals, probability, and belief revision
Voorbraak, F.
1989-01-01
A famous result obtained in the mid-seventies by David Lewis shows that a straightforward interpretation of probabilities of conditionals as conditional probabilities runs into serious trouble. In this paper we try to circumvent this trouble by defining extensions of probability functions, called
Three paths toward the quantum angle operator
Gazeau, Jean Pierre; Szafraniec, Franciszek Hugon
2016-12-01
We examine mathematical questions around angle (or phase) operator associated with a number operator through a short list of basic requirements. We implement three methods of construction of quantum angle. The first one is based on operator theory and parallels the definition of angle for the upper half-circle through its cosine and completed by a sign inversion. The two other methods are integral quantization generalizing in a certain sense the Berezin-Klauder approaches. One method pertains to Weyl-Heisenberg integral quantization of the plane viewed as the phase space of the motion on the line. It depends on a family of "weight" functions on the plane. The third method rests upon coherent state quantization of the cylinder viewed as the phase space of the motion on the circle. The construction of these coherent states depends on a family of probability distributions on the line.
Kim, Ho Kyung; Cho, Min Kook; Kim, Seong Sik [Pusan National University, Busan (Korea, Republic of)
2007-07-01
In computed tomography (CT), many situations restrict the number of projections or views that can be obtained, leading to artifacts such as streaking and geometrical distortion in the reconstructed images. The speed of motion of the object to be imaged can limit the number of views; cardiovascular imaging is a representative example. The size of an object can also prevent a complete traverse, or geometrical complexity can prevent imaging over a certain range of angles. These situations are frequently met in industrial nondestructive testing and evaluation. Dental CT suffers from a similar situation because the cervical spine reduces x-ray penetration from some directions, so that the available information is not sufficient for standard reconstruction algorithms. Limited-angle tomography is now receiving great attention as a new genre in medical and industrial imaging, popularly known as digital tomosynthesis. In this study, we introduce a modified filtered backprojection method for limited-angle tomography and demonstrate its application to dental imaging.
Maeda, Kei-ichi; Uzawa, Kunihito
2016-12-01
We discuss the dynamical Dp-brane solutions describing any number of Dp-branes whose relative orientations are given by certain SU(2) rotations. These are the generalization of the static angled Dp-brane solutions. We study the collision of dynamical D3-branes with angles in type II string theory, and show that a particular orientation of the smeared D3-brane configuration can provide an example of colliding branes if they have the same charges. Otherwise a singularity appears before the D3-branes collide.
The Art of Probability Assignment
Dimitrov, Vesselin I
2012-01-01
The problem of assigning probabilities when little is known is analyzed for the case where the quantities of interest are physical observables, i.e., can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data are available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are the least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Renyi distance between the original and the shifted distribution.
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that students who attended the probability workshop showed a positive improvement in performance compared with before the workshop. In addition, there exists a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Miles, James Edward; Frederiksen, Jane V.; Jensen, Bente Rona
2012-01-01
: Pelvic limbs from red foxes (Vulpes vulpes). METHODS: Q angles were measured on hip dysplasia (HD) and whole limb (WL) view radiographs of each limb between the acetabular rim, mid-point (Q1: patellar center, Q2: femoral trochlea), and tibial tuberosity. Errors of 0.5-2.0 mm at measurement landmarks...
Abraham, Raymond J; Leonard, Paul; Tormena, Cláudio F
2012-04-01
The ¹H chemical shifts of selected three-membered ring compounds in CDCl₃ solvent were obtained. This allowed the determination of the substituent chemical shifts of the substituents in the three-membered rings and the long-range effect of these rings on the distant protons. The substituent chemical shifts of common substituents in the cyclopropane ring differ considerably from the same substituents in acyclic fragments and in cyclohexane and were modelled in terms of a three-bond (γ)-effect. For long-range protons (more than three bonds removed), the substituent effects of the cyclopropane ring were analysed in terms of the cyclopropane magnetic anisotropy and steric effect. The cyclopropane magnetic anisotropy (ring current) shift was modelled by (a) a single equivalent dipole perpendicular to and at the centre of the cyclopropane ring and (b) three identical equivalent dipoles perpendicular to the ring placed at each carbon atom. Model (b) gave a more accurate description of the ¹H chemical shifts and was the selected model. After parameterization, the overall root mean square error for the dataset of 289 entries was 0.068 ppm. The anisotropic effects are significant for the cyclopropane protons (ca 1 ppm) but decrease rapidly with distance. The heterocyclic rings of oxirane, thiirane and aziridine do not possess a ring current. ³J(HH) couplings of the epoxy ring proton with side-chain protons were obtained and shown to be dependent on both the H-C-C-H and H-C-C-O orientations. Both density functional theory calculations and a simple Karplus-type equation gave general agreement with the observed couplings (root mean square error 0.5 Hz over a 10-Hz range).
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Hidden Variables or Positive Probabilities?
Rothman, T; Rothman, Tony
2001-01-01
Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...
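The quantum-mechanical violation of the CHSH inequality discussed in this abstract can be checked directly from the standard singlet-state correlation E(a,b) = -cos(a-b); the measurement angles below are the textbook optimal settings, assumed here for illustration rather than taken from the paper:

```python
import math

def qm_correlation(a, b):
    """Singlet-state spin correlation for analyzer angles a, b (radians):
    E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

def chsh(a, a_prime, b, b_prime):
    """CHSH combination; local hidden-variable models with
    positive-definite probabilities are bounded by 2."""
    return abs(qm_correlation(a, b) - qm_correlation(a, b_prime)
               + qm_correlation(a_prime, b) + qm_correlation(a_prime, b_prime))

# Standard optimal settings: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4.
s = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)  # = 2*sqrt(2)
```

The value 2√2 ≈ 2.83 exceeds the classical bound of 2, which is the Tsirelson bound realized by quantum mechanics at these settings.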
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Freeman-Durden Decomposition with Oriented Dihedral Scattering
闫剑; 李洋; 尹嫱; 洪文
2014-01-01
In this paper, when the azimuth direction of polarimetric Synthetic Aperture Radar (SAR) differs from the planting direction of crops, the double bounce of the incident electromagnetic waves from the terrain surface to the growing crops is investigated and compared with the normal double bounce. An oriented dihedral scattering model is developed to explain the investigated double bounce and is introduced into the Freeman-Durden decomposition. The decomposition algorithm corresponding to the improved decomposition is then proposed. Airborne polarimetric SAR data for agricultural land covering two flight tracks are chosen to validate the algorithm; the decomposition results show that, for agricultural vegetated land, the improved Freeman-Durden decomposition has the advantage of increasing the decomposition coherency among the polarimetric SAR data along the different flight tracks.
PROBABILITY SURVEYS , CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Contact angle hysteresis explained.
Gao, Lichao; McCarthy, Thomas J
2006-07-04
A view of contact angle hysteresis from the perspectives of the three-phase contact line and of the kinetics of contact line motion is given. Arguments are made that advancing and receding are discrete events that have different activation energies. That hysteresis can be quantified as an activation energy by the changes in interfacial area is argued. That this is an appropriate way of viewing hysteresis is demonstrated with examples.
Morgan, Jeannie; Lynnerup, Niels; Hoppa, R.D.
2013-01-01
... measurements taken from computed tomography (CT) scans. Previous reports have observed that the lateral angle size in females is significantly larger than in males. The method was applied to an independent series of 77 postmortem CT scans (42 males, 35 females) to validate its accuracy and reliability. ... method appears to be of minimal practical use in forensic anthropology and archeology. © 2013 American Academy of Forensic Sciences.
Understanding Students' Beliefs about Probability.
Konold, Clifford
The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist.…
Expected utility with lower probabilities
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Varieties of Belief and Probability
D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik
2015-01-01
For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for ...
Landau-Zener Probability Reviewed
Valencia, C
2008-01-01
We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.
Probability and Statistics: 5 Questions
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Linear Positivity and Virtual Probability
Hartle, J B
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...
Survival probability and ruin probability of a risk model
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are derived using techniques from martingale theory.
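The exponential special case mentioned above has a well-known closed form in the classical Cramér-Lundberg model. The sketch below uses that classical model (constant premium rate, plain Poisson claims), not the paper's extended model with random premium income and p-thinned claims, and the parameter values are purely illustrative:

```python
import math

def ruin_probability_exponential(u, lam, mu, c):
    """Ruin probability in the classical Cramer-Lundberg model with
    Poisson(lam) claim arrivals, Exp(mean mu) claim sizes, and premium
    rate c > lam*mu:  psi(u) = (lam*mu/c) * exp(-R*u), where the
    adjustment coefficient is R = (c - lam*mu) / (c*mu)."""
    if c <= lam * mu:
        return 1.0  # no positive safety loading: ruin is certain
    R = (c - lam * mu) / (c * mu)
    return (lam * mu / c) * math.exp(-R * u)

# Lundberg's inequality psi(u) <= exp(-R*u) holds with room to spare,
# since the prefactor lam*mu/c is strictly below one.
psi = ruin_probability_exponential(u=10.0, lam=1.0, mu=1.0, c=1.5)
```

With these numbers R = 1/3, so the ruin probability decays exponentially in the initial capital u, consistent with the Lundberg inequality cited in the abstract.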
无
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.
Relationship between the Angle of Repose and Angle of Internal ...
Keywords: Angle of repose, angle of internal friction, granular materials, triaxial compression ... such a granular material is sharp, making a steep ... study. Therefore, grains had to be conditioned to the respective moisture contents by adding ...
el Jaroudi, O; Picquenard, E; Demortier, A; Lelieur, J P; Corset, J
2000-06-12
The influence of the cations on bond length, valence, and torsion angle of S4(2-) and S5(2-) anions was examined in a series of solid alkali tetra- and pentasulfides by relating their Raman spectra to their known X-ray structures through a force-field analysis. The IR and Raman spectra of BaS4.H2O and the Raman spectra of (NH4)2S4.nNH3, gamma-Na2S4, and delta-Na2S5 are presented. The similarity of spectra of gamma-Na2S4 with those of BaS4.H2O suggests similar structures of the S4(2-) anions in these two compounds with a torsion angle smaller than 90 degrees. The variations of SS bond length, SSS valence angle, and dihedral angle of Sn2- anions are related to the polarization of the lone pair and electronic charge of the anion by the electric field of the cations. A correlation between the torsion angle and the SSS valence angle is shown as that previously reported between the length of the bond around which the torsion takes place and the dihedral angle value. These geometry changes are explained by the hyperconjugation concept and the electron long-pair repulsion.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
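The key pivotality claim, that failure probabilities do not depend on the true parameters when the threshold is estimated from data in a location-scale family, can be checked by simulation. This is a sketch under illustrative assumptions (normal losses and a threshold of the form xbar + k*s), not the article's exact construction:

```python
import random
import statistics

def failure_frequency(mu, sigma, k, n=30, trials=10000, seed=1):
    """Monte Carlo frequency with which a new normal loss exceeds a
    data-driven threshold xbar + k*s estimated from n past observations."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        data = [rng.gauss(mu, sigma) for _ in range(n)]
        threshold = statistics.mean(data) + k * statistics.stdev(data)
        if rng.gauss(mu, sigma) > threshold:  # the controlled failure event
            failures += 1
    return failures / trials

# In a location-scale family the exceedance event depends only on the
# standardized draws, so the frequency is the same for any true (mu, sigma).
f_standard = failure_frequency(mu=0.0, sigma=1.0, k=2.0)
f_shifted = failure_frequency(mu=50.0, sigma=7.0, k=2.0)
```

Because the two runs share a seed, they use the same standardized draws and the frequencies match exactly, making the parameter-independence visible without Monte Carlo noise.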
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
Probability Ranking in Vector Spaces
Melucci, Massimo
2011-01-01
The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.
Holographic probabilities in eternal inflation.
Bousso, Raphael
2006-11-10
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Local Causality, Probability and Explanation
Healey, Richard A
2016-01-01
In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Variable angle correlation spectroscopy
Lee, Y K [Univ. of California, Berkeley, CA (United States)]
1994-05-01
In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with 13C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from polycrystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiments is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.
Diurnal distribution of sunshine probability
Aydinli, S.
1982-01-01
The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a "side-scene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, depending on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.
Probability representation of classical states
Man'ko, OV; Man'ko, [No Value]; Pilyavets, OV
2005-01-01
Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
The probabilities of unique events.
Sangeet S Khemlani
Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
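The predicted "split the difference" bias can be illustrated with a toy calculation; this is only an illustration of the predicted estimate, not the authors' computer program:

```python
def conjunction_estimate(p_a, p_b):
    """Intuitive estimate for P(A and B) predicted by the theory:
    roughly splitting the difference between the conjunct probabilities."""
    return (p_a + p_b) / 2.0

est = conjunction_estimate(0.9, 0.4)
# The probability calculus requires P(A and B) <= min(P(A), P(B)) = 0.4,
# so a split-the-difference estimate near 0.65 is a conjunction fallacy.
```

Any pair of unequal conjunct probabilities produces an estimate above the smaller one, which is exactly the violation the experiments report.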
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
Joint probabilities and quantum cognition
de Barros, J Acacio
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Three lectures on free probability
2012-01-01
These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.
Angle-deviation optical profilometer
Chen-Tai Tan; Yuan-Sheng Chan; Zhen-Chin Lin; Ming-Hung Chiu
2011-01-01
We propose a new optical profilometer for three-dimensional (3D) surface profile measurement in real time. The deviation angle is based on geometrical optics and is proportional to the apex angle of a test plate. Measuring the reflectivity of a parallelogram prism allows detection of the deviation angle when the beam is incident near the critical angle. The reflectivity is inversely proportional to the deviation angle and proportional to the apex angle and surface height. We use a charge-coupled device (CCD) camera at the image plane to capture the reflectivity profile and obtain the 3D surface profile directly.
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Heterodyne Interferometer Angle Metrology
Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud
2010-01-01
A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations, around two orthogonal axes in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurement of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance of the order of a meter. The nonlinearity of the device is measured to be less than one percent over the measurement range.
Volcano shapes, entropies, and eruption probabilities
Gudmundsson, Agust; Mohajeri, Nahid
2014-05-01
We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to the time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to ...
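The Gibbs-Shannon entropy invoked here is straightforward to compute. A minimal sketch with made-up eruption-frequency bins (the bin values are illustrative, not data from the abstract):

```python
import math

def shannon_entropy(probs):
    """Gibbs-Shannon entropy H = -sum(p_i * ln p_i) of a discrete
    probability distribution (zero-probability bins contribute nothing)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A peaked, edifice-like eruption-frequency distribution over 5 spatial bins
peaked = [0.05, 0.2, 0.5, 0.2, 0.05]
# A uniform distribution, representing a flat volcanic field or zone
uniform = [0.2] * 5

h_peaked = shannon_entropy(peaked)
h_uniform = shannon_entropy(uniform)
# The uniform case attains the maximum ln(n): greatest uncertainty
# about the time and place of the next eruption.
```

This reproduces the abstract's qualitative point: the flat-field (uniform) distribution has the largest entropy, while a sharply peaked edifice corresponds to lower uncertainty about where the next eruption occurs.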
Equilibrium contact angle or the most-stable contact angle?
Montes Ruiz-Cabello, F J; Rodríguez-Valverde, M A; Cabrerizo-Vílchez, M A
2014-04-01
It is well-established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle obtained from mechanical stimuli of the system is indeed experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus enables identification of the most stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The drop most stable against vibration should reveal the unchanging contact angle, while against the gravity force it should reveal the highest resistance to sliding. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus will be on the contact angle of the initial drop configuration. This methodology requires mapping the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie, or Cassie-Baxter equations accordingly, or for surface energy evaluation.
Cluster Membership Probability: Polarimetric Approach
Medhi, Biman J
2013-01-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
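The plotting coordinates underlying a normal probability plot can be sketched as follows. This shows only the basic plot construction, not the simultaneous 1-α intervals the paper develops; the plotting positions (i - 0.5)/n are one common convention among several:

```python
from statistics import NormalDist

def normal_plot_points(sample):
    """Coordinates (theoretical standard-normal quantile, ordered
    observation) for a normal probability plot, using plotting
    positions (i - 0.5)/n."""
    n = len(sample)
    xs = sorted(sample)
    zs = [NormalDist().inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]
    return list(zip(zs, xs))

# For a sample from a normal population the points should fall near a
# straight line with slope sigma and intercept mu.
pts = normal_plot_points([2.1, 1.9, 2.4, 1.6, 2.0])
```

The subjective judgment the paper replaces is precisely "how close to a line are these (z, x) pairs?"; the simultaneous intervals turn that into an objective accept/reject rule.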
Detonation probabilities of high explosives
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived under the requirement that the probability of exceeding vibration criteria VC-E and VC-D be less than 0.04.
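Under the Gaussian assumption stated in the abstract, the exceedance probability reduces to a two-sided normal tail. A minimal sketch follows; the function name, units, and numerical values are illustrative, not taken from the article:

```python
from statistics import NormalDist

def exceedance_probability(sigma_um, criterion_um):
    """P(|displacement| > criterion) for a zero-mean Gaussian response.

    sigma_um: RMS relative displacement of the isolated mass (hypothetical, um)
    criterion_um: vibration criterion level, e.g. a VC-D/VC-E style limit.
    """
    return 2.0 * (1.0 - NormalDist(0.0, sigma_um).cdf(criterion_um))
```

For instance, an RMS response of 0.3 um against a 0.62 um criterion yields an exceedance probability just under 0.04, the kind of target figure the abstract mentions; sweeping `sigma_um` over the damping/frequency parameter space would reproduce the article's style of analysis.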
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes. This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Innovation and social probable knowledge
Marco Crocco
2000-01-01
In this paper some elements of Keynes's theory of probability are used to understand the process of diffusion of an innovation. Based on work done elsewhere (Crocco 1999, 2000), we argue that this process can be viewed as a process of dealing with the collective uncertainty about how to sort a technological problem. Expanding the concepts of weight of argument and probable knowledge to deal with this kind of uncertainty, we argue that the concepts of social weight of argument and social prob...
Knowledge typology for imprecise probabilities.
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Exact Probability Distribution versus Entropy
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
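For small alphabets and short words, the quantity the abstract approximates can be computed exactly by enumeration, which is what such approximations are checked against. A brute-force sketch under a first-order model with i.i.d. letters (the function name is hypothetical):

```python
from itertools import product

def average_guesses(letter_probs, word_len):
    """Exact expected number of guesses when words of length `word_len`
    (letters i.i.d. with probabilities `letter_probs`) are guessed in
    decreasing order of probability.  Feasible only for small sizes."""
    word_p = []
    for letters in product(letter_probs, repeat=word_len):
        p = 1.0
        for lp in letters:
            p *= lp  # word probability = product of letter probabilities
        word_p.append(p)
    word_p.sort(reverse=True)  # optimal strategy: most probable word first
    return sum(rank * p for rank, p in enumerate(word_p, start=1))
```

With uniform letters every ordering is equivalent (e.g. two equiprobable letters and length 3 give (8+1)/2 = 4.5 expected guesses), while a skewed distribution sharply reduces the expected count, illustrating why the average-guess number tracks the distribution more finely than a single entropy value.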
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Fuzzy Markov chains: uncertain probabilities
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
DECOFF Probabilities of Failed Operations
Gintautas, Tomas
A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against standard Alpha...
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to a sparse system, in which the number of balls is smaller than the number of boxes (i.e. energy distribution in gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
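One combinatorial reading of the equal-configurations postulate (my reading, not necessarily the author's derivation) has a closed form: if every weak composition of P balls into L boxes is equally likely, the occupancy of a given box satisfies P(k) = C(P−k+L−2, L−2)/C(P+L−1, L−1), which is maximal at k = 0. A sketch:

```python
from math import comb

def occupancy_pmf(P, L):
    """P(k balls in a given box), k = 0..P, when all weak compositions of
    P indistinguishable balls into L distinguishable boxes are equally
    likely (one reading of the 'equal configurations' postulate)."""
    total = comb(P + L - 1, L - 1)          # number of weak compositions
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]
```

For P = 50 and L = 10 the mean occupancy is 5, yet `occupancy_pmf(50, 10)[0]` is the largest entry: the empty box is the most probable, and the tail decays slowly, matching the abstract's "long tail" claim for dense systems.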
Probability representations of fuzzy systems
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
Generalization of the Euler Angles
Bauer, Frank H. (Technical Monitor); Shuster, Malcolm D.; Markley, F. Landis
2002-01-01
It is shown that the Euler angles can be generalized to axes other than members of an orthonormal triad. As first shown by Davenport, the three generalized Euler axes, hereafter: Davenport axes, must still satisfy the constraint that the first two and the last two axes be mutually perpendicular if these axes are to define a universal set of attitude parameters. Expressions are given which relate the generalized Euler angles, hereafter: Davenport angles, to the 3-1-3 Euler angles of an associated direction-cosine matrix. The computation of the Davenport angles from the attitude matrix and their kinematic equation are presented. The present work offers a more direct development of the Davenport angles than Davenport's original publication and offers additional results.
Small angle neutron scattering
Cousin Fabrice
2015-01-01
Full Text Available Small Angle Neutron Scattering (SANS) is a technique that makes it possible to probe the 3-D structure of materials on a typical size range lying from ∼ 1 nm up to ∼ a few 100 nm, the obtained information being statistically averaged over a sample whose volume is ∼ 1 cm3. This very rich technique enables a full structural characterization of a given object of nanometric dimensions (radius of gyration, shape, volume or mass, fractal dimension, specific area…) through the determination of the form factor, as well as the determination of the way objects are organized within a continuous medium, and therefore a description of the interactions between them, through the determination of the structure factor. The specific properties of neutrons (possibility of tuning the scattering intensity by using isotopic substitution, sensitivity to magnetism, negligible absorption, low energy of the incident neutrons) make it particularly interesting in the fields of soft matter, biophysics, magnetic materials and metallurgy. In particular, the contrast variation methods allow one to extract information that cannot be obtained by any other experimental technique. This course is divided into two parts. The first one is devoted to the description of the principle of SANS: basics (formalism, coherent scattering/incoherent scattering, notion of elementary scatterer), form factor analysis (I(q→0), Guinier regime, intermediate regime, Porod regime, polydisperse system), structure factor analysis (2nd Virial coefficient, integral equations, characterization of aggregates), and contrast variation methods (how to create contrast in a homogeneous system, matching in ternary systems, extrapolation to zero concentration, Zero Averaged Contrast). It is illustrated by some representative examples. The second one describes the experimental aspects of SANS to guide the user in future experiments: description of the SANS spectrometer, resolution of the spectrometer, optimization of
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, as compared with autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary raised elsewhere rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
A Probability Density Function for Neutrino Masses and Mixings
Fortin, Jean-François; Marleau, Luc
2016-01-01
The anarchy principle leading to the see-saw ensemble is studied analytically with the usual tools of random matrix theory. The probability density function for the see-saw ensemble of $N\times N$ matrices is obtained in terms of a multidimensional integral. This integral involves all light neutrino masses, leading to a complicated probability density function. It is shown that the probability density function for the neutrino mixing angles and phases is the appropriate Haar measure. The decoupling of the light neutrino masses and neutrino mixings implies no correlation between the neutrino mass eigenstates and the neutrino mixing matrix, in contradiction with observations but in agreement with some of the claims found in the literature.
Probability density function for neutrino masses and mixings
Fortin, Jean-François; Giasson, Nicolas; Marleau, Luc
2016-12-01
The anarchy principle leading to the seesaw ensemble is studied analytically with the usual tools of random matrix theory. The probability density function for the seesaw ensemble of N × N matrices is obtained in terms of a multidimensional integral. This integral involves all light neutrino masses, leading to a complicated probability density function. It is shown that the probability density function for the neutrino mixing angles and phases is the appropriate Haar measure. The decoupling of the light neutrino masses and neutrino mixings implies no correlation between the neutrino mass eigenstates and the neutrino mixing matrix and leads to a loss of predictive power when comparing with observations. This decoupling is in agreement with some of the claims found in the literature.
Probability biases as Bayesian inference
André C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
Cluster pre-existence probability
Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)
2011-10-15
Pre-existence probabilities of the fragments for the complete binary spectrum of different systems such as {sup 56}Ni, {sup 116}Ba, {sup 226}Ra and {sup 256}Fm are calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, even for negative Q-value systems, the pre-existence probability is calculated. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
Knot probabilities in random diagrams
Cantarella, Jason; Chapman, Harrison; Mastin, Matt
2016-10-01
We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
Probability distributions for multimeric systems.
Albert, Jaroslav; Rooman, Marianne
2016-01-01
We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and, the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidian distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos
2004-01-01
Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase-positive) probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.
Logic, Probability, and Human Reasoning
2015-01-01
3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip… Press 10 Nickerson, R. (2011) Mathematical Reasoning: Patterns, Problems, Conjectures, and Proofs, Taylor & Francis 11 Blanchette, E. and Richards, A… Logic, probability, and human reasoning. P.N. Johnson-Laird, Sangeet S. Khemlani, and Geoffrey P. Goodwin (Princeton University, Princeton, NJ)
Probability and statistics: A reminder
Clément Benoit
2013-07-01
Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1]
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Objective probability and quantum fuzziness
Mohrhoff, U
2007-01-01
This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In that paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...
Empirical and Computational Tsunami Probability
Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.
2008-12-01
A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical
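The two distributions named above combine into the simplest aggregate hazard figure: with Poissonian event occurrence, the chance of at least one tsunami exceeding a size threshold in a time window depends only on the rate of exceedances. The sketch below assumes one common "tapered power law" (tapered Pareto) tail form; the function name and all parameter values are illustrative, not taken from the abstract:

```python
import math

def exceedance_in_window(rate_per_yr, beta, size_min, size_corner,
                         threshold, years):
    """P(at least one tsunami with size > threshold within `years`),
    assuming Poissonian event times and a tapered Pareto size tail:
        P(S > s) = (size_min / s)**beta * exp((size_min - s) / size_corner)
    for s >= size_min (one common tapered power-law form)."""
    tail = ((size_min / threshold) ** beta
            * math.exp((size_min - threshold) / size_corner))
    # thinned Poisson process: exceedances occur at rate (rate * tail)
    return 1.0 - math.exp(-rate_per_yr * tail * years)
```

Setting the threshold equal to the minimum size recovers the plain Poisson window probability 1 − exp(−λT); raising the threshold thins the rate through the tapered tail, which is the computational-analysis counterpart of binning an empirical catalog by runup.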
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
From lattice BF gauge theory to area-angle Regge calculus
Bonzom, Valentin
2009-01-01
We consider Riemannian 4d BF lattice gauge theory, on a triangulation of spacetime. Introducing the simplicity constraints which turn BF theory into simplicial gravity, some geometric quantities of Regge calculus, areas, and 3d and 4d dihedral angles, are identified. The parallel transport conditions are taken care of to ensure a consistent gluing of simplices. We show that these gluing relations, together with the simplicity constraints, contain the constraints of area-angle Regge calculus in a simple way, via the group structure of the underlying BF gauge theory. This provides a precise road from constrained BF theory to area-angle Regge calculus. Doing so, a framework combining variables of lattice BF theory and Regge calculus is built. The action takes a form à la Regge and includes the contribution of the Immirzi parameter. In the absence of simplicity constraints, the standard spin foam model for BF theory is recovered. Insertions of local observables are investigated, leading to Casimir inserti...
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
PROBABILITY MODEL OF GUNTHER GENERATOR
Anonymous
2007-01-01
This paper first constructs the probability model of the Gunther generator, and presents the finite-dimensional joint distribution of the output sequence. The result shows that the output sequence is an independent and uniformly distributed 0-1 random variable sequence. It gives the theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and analyzes the coincidence between the output sequence and input sequences of the Gunther generator. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
Hysteresis during contact angles measurement.
Diaz, M Elena; Fuentes, Javier; Cerro, Ramon L; Savage, Michael D
2010-03-15
A theory, based on the presence of an adsorbed film in the vicinity of the triple contact line, provides a molecular interpretation of intrinsic hysteresis during the measurement of static contact angles. Static contact angles are measured by placing a sessile drop on top of a flat solid surface. If the solid surface has not been previously in contact with a vapor phase saturated with the molecules of the liquid phase, the solid surface is free of adsorbed liquid molecules. In the absence of an adsorbed film, molecular forces configure an advancing contact angle larger than the static contact angle. After some time, due to an evaporation/adsorption process, the interface of the drop coexists with an adsorbed film of liquid molecules as part of the equilibrium configuration, denoted as the static contact angle. This equilibrium configuration is metastable because the droplet has a larger vapor pressure than the surrounding flat film. As the drop evaporates, the vapor/liquid interface contracts and the apparent contact line moves towards the center of the drop. During this process, the film left behind is thicker than the adsorbed film and molecular attraction results in a receding contact angle, smaller than the equilibrium contact angle.
Heffernan, Rhys; Paliwal, Kuldip; Lyons, James; Dehzangi, Abdollah; Sharma, Alok; Wang, Jihua; Sattar, Abdul; Yang, Yuedong; Zhou, Yaoqi
2015-01-01
Direct prediction of protein structure from sequence is a challenging problem. An effective approach is to break it up into independent sub-problems. These sub-problems such as prediction of protein secondary structure can then be solved independently. In a previous study, we found that an iterative use of predicted secondary structure and backbone torsion angles can further improve secondary structure and torsion angle prediction. In this study, we expand the iterative features to include solvent accessible surface area and backbone angles and dihedrals based on Cα atoms. By using a deep learning neural network in three iterations, we achieved 82% accuracy for secondary structure prediction, 0.76 for the correlation coefficient between predicted and actual solvent accessible surface area, 19° and 30° for mean absolute errors of backbone φ and ψ angles, respectively, and 8° and 32° for mean absolute errors of Cα-based θ and τ angles, respectively, for an independent test dataset of 1199 proteins. The accuracy of the method is slightly lower for 72 CASP 11 targets but much higher than those of model structures from current state-of-the-art techniques. This suggests the potentially beneficial use of these predicted properties for model assessment and ranking.
Hf Transition Probabilities and Abundances
Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...
Gd Transition Probabilities and Abundances
Den Hartog, E A; Sneden, C; Cowan, J J
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...
Glaister, P.
1997-09-01
Tetrahedral Bond Angle from Elementary Trigonometry. The alternative approach of using the scalar (or dot) product of vectors enables the determination of the bond angle in a tetrahedral molecule in a simple way. There is, of course, an even more straightforward derivation suitable for students who are unfamiliar with vectors, or products thereof, but who do know some elementary trigonometry. The starting point is the figure showing triangle OAB. The point O is the center of a cube, and A and B are at opposite corners of a face of that cube in which fits a regular tetrahedron. The required bond angle is α = AÔB; using Pythagoras' theorem, AB = 2√2 is the diagonal of a face of the cube. Hence, from the right-angled triangle OEB, tan(α/2) = √2, and therefore α = 2 tan⁻¹(√2) ≈ 109° 28′ (see Fig. 1).
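The derivation sketched in this abstract can be written compactly; here E denotes the midpoint of AB (the center of the cube face), assuming a cube of side 2 centered at O:

```latex
% Cube of side 2 centered at O; A and B are opposite corners of one face,
% E is the midpoint of AB, so OE = 1 and EB = AB/2 = \sqrt{2}.
\[
AB = 2\sqrt{2}, \qquad
\tan\frac{\alpha}{2} = \frac{EB}{OE} = \sqrt{2}, \qquad
\alpha = 2\arctan\sqrt{2} \approx 109^{\circ}28' \approx 109.47^{\circ}.
\]
```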
Oriented angles in affine space
Włodzimierz Waliszewski
2004-05-01
The concept of a smooth oriented angle in an arbitrary affine space is introduced. This concept is based on the kinematic concept of a run. A concept of an oriented angle in such a space is also considered. Next, it is shown that the adequacy of these concepts holds if and only if the affine space in question is of dimension 2 or 1.
Post-Classical Probability Theory
Barnum, Howard
2012-01-01
This paper offers a brief introduction to the framework of "general probabilistic theories", otherwise known as the "convex-operational" approach to the foundations of quantum mechanics. Broadly speaking, the goal of research in this vein is to locate quantum mechanics within a very much more general, but conceptually very straightforward, generalization of classical probability theory. The hope is that, by viewing quantum mechanics "from the outside", we may be able better to understand it. We illustrate several respects in which this has proved to be the case, reviewing work on cloning and broadcasting, teleportation and entanglement swapping, key distribution, and ensemble steering in this general framework. We also discuss a recent derivation of the Jordan-algebraic structure of finite-dimensional quantum theory from operationally reasonable postulates.
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
The Inductive Applications of Probability Calculus
Corrado Gini
2015-06-01
The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author puts in evidence some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.
Probability landscapes for integrative genomics
Benecke Arndt
2008-05-01
Background: The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches as well as the prediction tools are only starting to become available, and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s)-long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results: We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence using any type of relevant experimental and theoretical information, and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion: Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a
王长群; 熊胜利
2004-01-01
A Cayley graph X = Cay(G, S) of a group G is said to be normal if R(G), the group of right multiplications, is normal in Aut(X). An infinite family of normal one-regular Cayley graphs Cay(G, S) of quasi-dihedral groups G = ⟨x, y | x^(2m) = y^2 = 1, x^y = x^(m+1)⟩ is obtained, where S = {x, x^(-1), x^(s+1)y, x^(s-1)y}, m = 2s, and s is an even number greater than 4. In addition, the normal one-regular 4-valent Cayley graphs of quasi-dihedral groups of order 2^r are classified. It is proved that any 4-valent normal one-regular Cayley graph of a quasi-dihedral group G of order 2^r, with r > 3, is isomorphic to Cay(G, {x, x^(-1), x^(s+1)y, x^(s-1)y}), where s = 2^(r-2).
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Statistical convergence of order α in probability
Pratulananda Das; Sanjoy Ghosal; Sumit Som
2016-01-01
In this paper, ideas of different types of convergence of a sequence of random variables in probability, namely, statistical convergence of order α in probability, strong p-Cesàro summability of order α in probability, lacunary statistical convergence or S_θ-convergence of order α in probability, and N_θ-convergence of order α in probability, have been introduced and certain of their basic properties have been studied.
The Semiotic and Conceptual Genesis of Angle
Tanguay, Denis; Venant, Fabienne
2016-01-01
In the present study, we try to understand how students at the end of primary school conceive of angle: Is an angle a magnitude for them or a geometric figure, and how do they manage to coordinate the two aspects in their understanding of the concepts of angle and of angle measurement? With the aim of better grasping the way "angle" is…
Fusion probability in heavy nuclei
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ˜5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections
Probable Linezolid-Induced Pancytopenia
Nita Lakhani
2005-01-01
A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5×10¹²/L, leukocytes 2.9×10⁹/L, platelets 59×10⁹/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, prospectively identify patients at risk, and emphasize weekly hematological monitoring.
The Black Hole Formation Probability
Clausen, Drew; Ott, Christian D
2014-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\\rm BH}(M_{\\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...
Avoiding Negative Probabilities in Quantum Mechanics
Nyambuya, Golden Gadzirayi
2013-01-01
As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(−(−ln p)^α), with 0 < α < 1 and fixed points w(1/e) = 1/e and w(1) = 1, which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
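The Prelec form is straightforward to evaluate numerically. The sketch below checks its fixed points and its characteristic overweighting of small probabilities; the function name and the illustrative default α = 0.65 are choices made for this example, not values from the abstract:

```python
import math

def prelec_w(p: float, alpha: float = 0.65) -> float:
    """Prelec (1998) probability weighting function w(p) = exp(-(-ln p)^alpha)."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be a probability in [0, 1]")
    if p == 0.0:
        return 0.0  # limiting value w(0) = 0
    return math.exp(-((-math.log(p)) ** alpha))

# For any alpha in (0, 1): w(1) = 1 and w(1/e) = 1/e, while small
# probabilities are overweighted (w(p) > p) and large ones underweighted.
```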
Frequency scaling for angle gathers
Zuberi, M. A H
2014-01-01
Angle gathers provide an extra dimension to analyze the velocity after migration. Space-shift and time-shift imaging conditions are two methods used to obtain angle gathers, but both are reasonably expensive. By scaling the time-lag axis of the time-shifted images, the computational cost of the time-shift imaging condition can be considerably reduced. In imaging, and more so in full waveform inversion, frequency-domain Helmholtz solvers are used more often to solve for the wavefields than conventional time-domain extrapolators. In such cases, we do not need to extend the image; instead, we scale the frequency axis of the frequency-domain image to obtain the angle gathers more efficiently. Applications on synthetic data demonstrate such features.
Angle independent velocity spectrum determination
2014-01-01
An ultrasound imaging system (100) includes a transducer array (102) that emits an ultrasound beam, produces at least one transverse pulse-echo field that oscillates in a direction transverse to the emitted ultrasound beam, and receives echoes produced in response thereto, and a spectral velocity estimator (110) that determines a velocity spectrum for flowing structure, which flows at an angle of 90 degrees or at angles less than 90 degrees with respect to the emitted ultrasound beam, based on the received echoes.
Scaling of misorientation angle distributions
Hughes, D.A.; Chrzan, D.C.; Liu, Q.
1998-01-01
The measurement of misorientation angle distributions following different amounts of deformation in cold-rolled aluminum and nickel and compressed stainless steel is reported. The scaling of the dislocation cell boundary misorientation angle distributions is studied. Surprisingly, the distributions for the small to large strain regimes for aluminum, 304L stainless steel, nickel, and copper (taken from the literature) appear to be identical. Hence the distributions may be "universal." These results have significant implications for the development of dislocation based deformation models.
Systematic variations in divergence angle
Okabe, Takuya
2012-01-01
Practical methods for quantitative analysis of radial and angular coordinates of leafy organs of vascular plants are presented and applied to published phyllotactic patterns of various real systems from young leaves on a shoot tip to florets on a flower head. The constancy of divergence angle is borne out with accuracy of less than a degree. It is shown that apparent fluctuations in divergence angle are in large part systematic variations caused by the invalid assumption of a fixed center and/or by secondary deformations, while random fluctuations are of minor importance.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
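The connection the article draws between probability and other areas of mathematics can be illustrated with a minimal sketch: a biased spinner's landing probabilities follow directly from its sector angles. The function name and sector values here are invented for illustration, not taken from the article:

```python
def spinner_probabilities(angles_deg):
    """Map the central angles (in degrees) of a biased spinner's sectors
    to the probability of landing on each sector: p_i = angle_i / 360."""
    total = sum(angles_deg)
    if abs(total - 360.0) > 1e-9:
        raise ValueError("sector angles must sum to 360 degrees")
    return [a / 360.0 for a in angles_deg]

# A three-sided spinner with sectors of 180, 120 and 60 degrees lands
# on those sides with probabilities 1/2, 1/3 and 1/6 respectively.
```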
Conditional probability modulates visual search efficiency.
Cort, Bryan; Anderson, Britt
2013-01-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability-the likelihood of a particular color given a particular combination of two cues-varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
47 CFR 1.1623 - Probability calculation.
2010-10-01
47 CFR Title 47, Telecommunication (2010-10-01): Mass Media Services, General Procedures, § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with certain Non-Expected Utility preference representations that satisfy weak conditions that we identify.
Inferring Beliefs as Subjectively Imprecise Probabilities
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
2012-01-01
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e… probabilities are indeed best characterized as probability distributions with non-zero variance.
Scoring Rules for Subjective Probability Distributions
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;
…report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
The trajectory of the target probability effect.
Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B
2013-05-01
The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Contactless angle detection using permalloy
Eijkel, Kees J.; Rijk, Rolf
1988-01-01
An overview is given of measurements on angle detectors. The detectors consist of a pair of planar-Hall elements opposite to a rotatable magnet. The measurements are performed on a number of planar-Hall elements of different shape and size, and show good agreement with a previously described theoret
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Juliana Bueno-Soler
2016-09-01
Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma…
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
Probability of Failure in Random Vibration
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability…
Effects of slant angle and illumination angle on MTF estimations
Vhengani, LM
2012-07-01
Full Text Available [Figure residue removed: regression plots of Nyquist MTF (cycle/pixel) versus edge slant angle (degrees); Figure 6 shows the regression for positive slant angles.]
On the computability of conditional probability
Ackerman, Nathanael L; Roy, Daniel M
2010-01-01
We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...
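The elementary discrete definition mentioned above, P(A | B) = P(A ∩ B) / P(B), can be sketched over an explicit finite sample space; the die example is an illustrative assumption.

```python
# Discrete conditional probability as a ratio: P(A | B) = P(A and B) / P(B),
# computed over an explicit finite sample space (illustrative fair die).
omega = range(1, 7)               # outcomes of a fair six-sided die
p = {w: 1 / 6 for w in omega}     # uniform probability mass

A = {w for w in omega if w % 2 == 0}   # "even"
B = {w for w in omega if w >= 4}       # "at least 4"

p_B = sum(p[w] for w in B)
p_A_and_B = sum(p[w] for w in A & B)
p_A_given_B = p_A_and_B / p_B

print(p_A_given_B)  # 0.666..., i.e. 2/3: of {4, 5, 6}, the even outcomes are {4, 6}
```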
An Angle Criterion for Riesz Bases
Lindner, Alexander M; Bittner, B.
1999-01-01
We present a characterization of Riesz bases in terms of the angles between certain finite dimensional subspaces. Correlations between the bounds of the Riesz basis and the size of the angles are derived.
Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Sillesen, Henrik
1998-01-01
This paper reports on a scanning technique, denoted multi-angle compound imaging (MACI), using spatial compounding. The MACI method also contains elements of frequency compounding, as the transmit frequency is lowered for the highest beam angles in order to reduce grating lobes. Compared to conventional B-mode imaging, MACI offers better defined tissue boundaries and lower variance of the speckle pattern, resulting in an image with reduced random variations. Design and implementation of a compound imaging system is described, images of rubber tubes and porcine aorta are shown, and effects on visualization are discussed. The speckle reduction is analyzed numerically and the results are found to be in excellent agreement with existing theory. An investigation of detectability of low-contrast lesions shows significant improvements compared to conventional imaging. Finally, possibilities for improving…
Optimisation of Fan Blade Angle
Swaroop M P
2017-01-01
Full Text Available This report presents the optimization of fan blade angle in accordance with the various room temperatures that occur in tropical areas such as India. We took up this work mainly because cooling is nowadays an important factor wherever buildings and rooms are constructed, and ceiling fans are the most commonly used cooling device. It is therefore of utmost importance to tweak the performance of the ceiling fan so that it functions in its most optimal condition. We modeled the fan in a modeling software (SOLIDWORKS), imported it into an analysis software (ANSYS), and generated results for various blade angles (0, 4, 8 and 12.5 degrees) in accordance with room conditions. A trend line curve fitted to the obtained data is expected as the result, which can be crucial for the design of future fans.
Bell Could Become the Copernicus of Probability
Khrennikov, Andrei
2016-07-01
Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
Nucleation of small angle boundaries
Nabarro, FRN
1996-12-01
Full Text Available F.R.N. Nabarro, Condensed Matter Physics Research Unit, University of the Witwatersrand, Private Bag 3, Wits 2050, Johannesburg, and Division of Materials Science and Technology, CSIR, P.O. Box 395, Pretoria, South Africa. [Only front matter and acknowledgments survive in the extracted text.]
LHC Report: playing with angles
Mike Lamont for the LHC team
2016-01-01
Ready (after a machine development period), steady (running), go (for a special run)! The crossing angles are an essential feature of the machine set-up. They have to be big enough to reduce the long-range beam-beam effect. The LHC has recently enjoyed a period of steady running and managed to set a new record for “Maximum Stable Luminosity Delivered in 7 days” of 3.29 fb-1 between 29 August and 4 September. The number of bunches per beam remains pegged at 2220 because of the limitations imposed by the SPS beam dump. The bunch population is also somewhat reduced due to outgassing near one of the injection kickers at point 8. Both limitations will be addressed during the year-end technical stop, opening the way for increased performance in 2017. On 10 and 11 September, a two-day machine development (MD) period took place. The MD programme included a look at the possibility of reducing the crossing angle at the high-luminosity interaction points.
Towards a Categorical Account of Conditional Probability
Robert Furber
2015-11-01
Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Sheng-Cheng Huang
2016-01-01
Full Text Available A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Total variation denoising of probability measures using iterated function systems with probabilities
La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.
2017-01-01
In this paper we present a total variation denoising problem for probability measures using the set of fixed point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.
Bayesian Probabilities and the Histories Algebra
Marlow, Thomas
2006-01-01
We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.
Non-Boolean probabilities and quantum measurement
Niestegge, Gerd
2001-08-03
A non-Boolean extension of the classical probability model is proposed. The non-Boolean probabilities reproduce typical quantum phenomena. The proposed model is more general and more abstract, but easier to interpret, than the quantum mechanical Hilbert space formalism and exhibits a particular phenomenon (state-independent conditional probabilities) which may provide new opportunities for an understanding of the quantum measurement process. Examples of the proposed model are provided, using Jordan operator algebras. (author)
Data analysis recipes: Probability calculus for inference
Hogg, David W
2012-01-01
In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.
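One of the combination rules such a review covers, marginalization of a discrete joint distribution (p(x) = Σ_y p(x, y)), can be sketched directly; the joint table below is an illustrative assumption.

```python
# Marginalization over a discrete joint distribution p(x, y):
# p(x) is obtained by summing the joint over the nuisance variable y.
# The weather table is purely illustrative.
joint = {
    ("rain", "cold"): 0.3, ("rain", "warm"): 0.1,
    ("dry",  "cold"): 0.2, ("dry",  "warm"): 0.4,
}

p_rain = sum(v for (x, _), v in joint.items() if x == "rain")

print(round(p_rain, 10))  # 0.4
```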
Spatial probability aids visual stimulus discrimination
Michael Druker
2010-08-01
Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
The Nasolabial Angle Among Patients with Total Cleft Lip and Palate.
Paradowska-Stolarz, Anna M; Kawala, Beata
2015-01-01
The nasolabial angle is the angle measured between the points columella, subnasale and labiale superius. The reference values vary from 90 to 120 degrees (the mean value is 109.8 degrees). In some disorders, the nasolabial angle may change, which influences the facial profile. One such group of deformities is clefts. The nasolabial angle may be decreased in cleft patients due to deformation of the nose and upper lip, which may be caused by the reconstructive surgical procedures performed. The aim of the study was to compare the nasolabial angle between groups of patients with total clefts of the lip, alveolar bone and palate and healthy individuals. The cephalometric X-rays of 118 patients with clefts (73 boys and 45 girls) and 101 healthy individuals (32 boys and 69 girls) were used to measure and compare the nasolabial angle. In patients with cleft deformities, the nasolabial angle values were smaller than in healthy individuals. Among the patients with clefts, those with a bilateral type of deformity are characterized by the highest mean values of the nasolabial angle. The angle is smaller in girls than in boys. The nasolabial angle in patients with total clefts of lip, alveolar bone and palate is statistically smaller than in healthy individuals. This may be a result of deformation of either the upper lip or (more probably) the nose. Orthodontic treatment should be individualized.
Some New Results on Transition Probability
Yu Quan XIE
2008-01-01
In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.
Probabilities are single-case, or nothing
Appleby, D M
2004-01-01
Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference, what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.
Real analysis and probability solutions to problems
Ash, Robert P
1972-01-01
Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.
Melucci, Massimo
2012-01-01
Probabilistic models require the notion of an event space for defining a probability measure. An event space has a probability measure that obeys the Kolmogorov axioms. However, the probabilities observed from distinct sources, such as the relevance of documents, may not admit a single event space, thus causing some issues. In this article, some results are introduced for determining whether the observed probabilities of relevance of documents admit a single event space. Moreover, an alternative framework of probability is introduced, thus challenging the use of classical probability for ranking documents. Some reflections are made on the convenience of extending classical probabilistic retrieval toward a more general framework which encompasses these issues.
Small angle scattering and polymers
Cotton, J.P. [Laboratoire Leon Brillouin (LLB) - Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France)]
1996-12-31
The determination of polymer structure is a problem of interest for both statistical physics and industrial applications. The average polymer structure is defined. Then, it is shown why small angle scattering, associated with isotopic substitution, is very well suited to the measurement of the chain conformation. The corresponding example is the old, but pedagogic, measurement of the chain form factor in the polymer melt. The powerful contrast variation method is illustrated by a recent determination of the concentration profile of a polymer interface. (author) 12 figs., 48 refs.
Theta angle in holographic QCD
Jarvinen, Matti
2016-01-01
V-QCD is a class of effective holographic models for QCD which fully includes the backreaction of quarks to gluon dynamics. The physics of the theta-angle and the axial anomaly can be consistently included in these models. We analyze their phase diagrams over ranges of values of the quark mass, N_f/N_c, and theta, computing observables such as the topological susceptibility and the meson masses. At small quark mass, where effective chiral Lagrangians are reliable, they agree with the predictions of V-QCD.
Comparing linear probability model coefficients across groups
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more… these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
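As a hedged illustration of what a linear probability model estimates (not the authors' identification analysis): with a single binary regressor, the OLS coefficients reduce to group means of the binary outcome, so they read directly as differences in P(y = 1). The data-generating numbers below are assumptions.

```python
import random

# Linear probability model with one binary regressor: the OLS fit equals
# the group means of the binary outcome. Synthetic, illustrative data.
random.seed(0)
n = 20_000
data = []
for _ in range(n):
    x = random.randint(0, 1)
    p = 0.2 + 0.3 * x                      # true P(y = 1 | x)
    y = 1 if random.random() < p else 0
    data.append((x, y))

mean0 = sum(y for x, y in data if x == 0) / sum(1 for x, _ in data if x == 0)
mean1 = sum(y for x, y in data if x == 1) / sum(1 for x, _ in data if x == 1)
intercept, slope = mean0, mean1 - mean0    # LPM coefficients

print(round(intercept, 2), round(slope, 2))  # close to 0.2 and 0.3
```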
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Recent Developments in Applied Probability and Statistics
Devroye, Luc; Kohler, Michael; Korn, Ralf
2010-01-01
This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.
Examples of Neutrosophic Probability in Physics
Fu Yuhua
2015-01-01
Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of the accelerating expansion of the partial universe.
Average Transmission Probability of a Random Stack
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
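The distinction the abstract turns on, averaging the transmission probability itself versus averaging its logarithm, can be illustrated with a toy Monte Carlo (this is not the paper's slab model; the per-slab transmission factors are an assumption):

```python
import math
import random

# Toy model: total transmission of a "stack" is a product of independent
# per-slab transmission factors (hypothetical distribution).
random.seed(1)

def stack_transmission(n_slabs=20):
    t = 1.0
    for _ in range(n_slabs):
        t *= random.uniform(0.5, 1.0)
    return t

samples = [stack_transmission() for _ in range(20_000)]
mean_T = sum(samples) / len(samples)                                  # average of T itself
T_from_mean_log = math.exp(sum(math.log(t) for t in samples) / len(samples))  # from <ln T>

print(mean_T > T_from_mean_log)  # True: the two notions of "average" genuinely differ
```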
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
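The life-table arithmetic behind the article can be sketched as follows: from age-specific probabilities of dying q_x, build the survivorship column l_x and a crude life expectancy at birth. The q_x values below are illustrative, not real data.

```python
# Illustrative q_x (probability of dying in each age interval); the final
# interval closes the table with q = 1. These are made-up numbers.
qx = [0.01, 0.002, 0.003, 0.005, 0.01, 0.02, 0.05, 0.1, 0.2, 1.0]

lx = [100_000]                    # survivors at the start of each interval (radix 100,000)
for q in qx[:-1]:
    lx.append(lx[-1] * (1 - q))

# Person-years lived, assuming deaths occur mid-interval (standard simplification).
Lx = [(lx[i] + lx[i + 1]) / 2 for i in range(len(lx) - 1)] + [lx[-1] / 2]
e0 = sum(Lx) / lx[0]              # crude life expectancy at birth, in age intervals

print(round(e0, 2))
```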
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Analytical Study of Thermonuclear Reaction Probability Integrals
Chaudhry, M A; Mathai, A M
2000-01-01
An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Probability of Grounding and Collision Events
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
Laboratory-Tutorial activities for teaching probability
Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.
2006-01-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are used to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method, which is used by NASA for qualifying special NDE procedures and which uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization provides an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
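The binomial logic behind the point-estimate demonstration can be sketched as follows; the 29-flaw, no-miss design is from the abstract, while the function itself is a generic binomial computation, not NASA's procedure:

```python
from math import comb

def prob_pass_demonstration(pod, n=29, max_misses=0):
    """Probability that a demonstration with n flaws passes, i.e. at most
    max_misses flaws go undetected, when each flaw is detected
    independently with probability pod (binomial model)."""
    return sum(comb(n, k) * (1 - pod)**k * pod**(n - k)
               for k in range(max_misses + 1))

# If the true POD were only 0.90, a 29/29 (no-miss) demonstration would
# be passed with probability 0.90^29, about 0.047; so passing it supports
# POD > 0.90 at roughly 95% confidence.
print(round(prob_pass_demonstration(0.90), 3))   # ~0.047
```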
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classification vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial parameters and correct-classification probabilities when the classification probabilities vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2016-01-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is achieved by first checking such structures in covariant quantum mechanics, and then passing to spin foam models via the general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the Hilbert space of the canonical theory and the relevant quantum logical structure. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize property transitions and causality in this categorical context in connection with presheaves on quantaloids and respectively causal categories. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Device for Measuring Landslide Critical Angle
Li Xueling; Xia Weisheng; Huang Daoyou; Yu Yun
2016-01-01
Mountain landslides are highly destructive, and the landslide critical angle has long been a major concern; we designed a system that can automatically measure it. The equipment consists of the
Refractivity estimations from an angle-of-arrival spectrum
Zhao Xiao-Feng; Huang Si-Xun
2011-01-01
This paper addresses the estimation of atmospheric refractivity from field measurements at an array of radio receivers, in terms of the angle-of-arrival spectrum. Angle-of-arrival spectrum information is simulated by the ray optics model, and the refractivity is represented by an idealized tri-linear profile. The estimation of the refractivity is organized as an optimization problem, and a genetic algorithm is used to search for the optimal solution among trial refractivity profiles. Theoretical analysis demonstrates the feasibility of this method for retrieving the refractivity parameters. Simulation results indicate that the approach has fair anti-noise ability and that its accuracy depends mainly on the antenna aperture size and its positions.
30 CFR 56.19037 - Fleet angles.
2010-07-01
30 CFR 56.19037 (Mineral Resources; Mine Safety and Health Administration, Department of Labor; Metal and Nonmetal Mine Safety; Sheaves): Fleet angles on hoists installed after November 15, 1979, shall not...
30 CFR 57.19037 - Fleet angles.
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Fleet angles. 57.19037 Section 57.19037 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Sheaves § 57.19037 Fleet angles. Fleet angles on hoists installed after November 15, 1979, shall not...
Survival probability in patients with liver trauma.
Buci, Skender; Kukeli, Agim
2016-08-01
Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
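A hedged sketch of how a fitted logistic model like this turns patient covariates into a survival probability; the coefficients and covariates below are invented for illustration and are not the paper's estimates:

```python
import math

def survival_probability(x, beta, intercept):
    """Survival probability from a fitted logistic model:
    p = 1 / (1 + exp(-(b0 + b.x)))."""
    z = intercept + sum(b * xi for b, xi in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for (injury grade, multi-organ flag,
# days hospitalized, surgical-treatment flag); illustrative only.
beta = [-0.8, -1.1, 0.05, 0.6]
patient = [3, 1, 10, 1]   # grade-3 injury, multiple organs, 10 days, surgery
p = survival_probability(patient, beta, intercept=4.0)
print(round(p, 3))   # ~0.832
```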
Energetic Electron Pitch Angle Diffusion due to Whistler Wave during Terrestrial Storms
XIAO Fu-Liang; HE Hui-Yong
2006-01-01
A concise and elegant expression for cyclotron harmonic resonant quasi-pure pitch-angle diffusion is constructed for parallel-propagating whistler mode waves, with the quasi-linear diffusion coefficient given in terms of the whistler mode wave spectral intensity. Numerical computations are performed for the specific case of energetic electrons interacting with a frequency band of whistler mode turbulence at L ≈ 3. It is found that quasi-pure pitch-angle diffusion driven by the whistler mode scatters energetic electrons from larger pitch angles into the loss cone, causing the pitch-angle distribution to evolve from pancake-shaped before terrestrial storms to flat-topped during the main phase. This probably accounts for the quasi-isotropic pitch-angle distribution observed by the Combined Release and Radiation Effects Satellite (CRRES) spacecraft at L ≈ 3.
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Pre-aggregation for Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....
Are All Probabilities Fundamentally Quantum Mechanical?
Pradhan, Rajat Kumar
2011-01-01
The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from...... the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...... with popular non-expected utility preference representations that satisfy weak conditions....
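The Quadratic Scoring Rule mentioned here is "proper" for a risk-neutral agent, which is why inducing risk neutrality matters; a small numerical check using the generic textbook QSR (not the authors' exact payoff scaling):

```python
def expected_qsr(report, true_p):
    """Expected quadratic score 1 - (report - outcome)^2 for a binary
    event that occurs with true probability true_p."""
    return true_p * (1 - (report - 1.0)**2) + (1 - true_p) * (1 - report**2)

true_p = 0.7
reports = [i / 100 for i in range(101)]
best = max(reports, key=lambda r: expected_qsr(r, true_p))
# For a risk-neutral agent the QSR is proper: reporting the true
# probability maximizes the expected score.
assert abs(best - true_p) < 1e-9
```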
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Computation of the Complex Probability Function
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the n^{th} degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of Gauss-Hermite quadrature for the complex probability function.
Probability amplitude in quantum like games
Grib, A A; Starkov, K
2003-01-01
Examples of games between two partners with mixed strategies, calculated by use of the probability amplitude, are given. The first game is described by the quantum formalism of a spin one-half system for which two noncommuting observables are measured; the second game corresponds to the spin-one case. Quantum logical orthocomplemented nondistributive lattices for these two games are presented. Interference terms for the probability amplitudes are analyzed using the so-called contextual approach to probability (in the von Mises frequency approach). We underline that our games are not based on the use of microscopic systems; the whole scenario is macroscopic.
Basic Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first
Advanced Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob
Comparing linear probability model coefficients across groups
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....
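One of the identified components, the scale parameter, can be illustrated with a small simulation (a generic latent-index setup, not the authors' derivation): two groups share the same latent effect but differ in residual scale, and their linear-probability slopes differ anyway.

```python
import random

def lpm_slope(b, sigma, n=200000, seed=0):
    """Latent-index model y = 1[b*x + e > 0] with e ~ N(0, sigma).
    Returns the linear probability model slope P(y=1|x=1) - P(y=1|x=0)."""
    rng = random.Random(seed)
    p = {}
    for x in (0, 1):
        hits = sum(b * x + rng.gauss(0, sigma) > 0 for _ in range(n))
        p[x] = hits / n
    return p[1] - p[0]

# Same latent effect b = 1 in both groups; only the residual scale
# differs, yet the estimated LPM slopes differ, illustrating the
# group-comparison problem.
s1, s2 = lpm_slope(1.0, 1.0), lpm_slope(1.0, 2.0)
assert s1 > s2 > 0
```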
Tomographic probability representation for quantum fermion fields
Andreev, V A; Man'ko, V I; Son, Nguyen Hung; Thanh, Nguyen Cong; Timofeev, Yu P; Zakharov, S D
2009-01-01
Tomographic probability representation is introduced for fermion fields. The states of the fermions are mapped onto probability distribution of discrete random variables (spin projections). The operators acting on the fermion states are described by fermionic tomographic symbols. The product of the operators acting on the fermion states is mapped onto star-product of the fermionic symbols. The kernel of the star-product is obtained. The antisymmetry of the fermion states is formulated as the specific symmetry property of the tomographic joint probability distribution associated with the states.
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
"This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines." - CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Zika Probably Not Spread Through Saliva: Study
(HealthDay News) Scientists have some interesting news about Zika: you're unlikely to get the virus from saliva.
Teaching Elementary Probability Through its History.
Kunoff, Sharon; Pines, Sylvia
1986-01-01
Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, concurrent with and integrated with statistics, through interactive, tailored software applications designed to illuminate the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods
Encounter Probability of Individual Wave Height
Liu, Z.; Burcharth, H. F.
1998-01-01
wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
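Two of the ingredients described, the encounter probability of a T-year event within a structure lifetime and the Rayleigh-based expected maximum individual wave, can be sketched as follows (standard textbook formulas with invented inputs, not the paper's full method):

```python
import math

def encounter_probability(return_period, lifetime):
    """Probability that the T-year event is exceeded at least once
    during a structure lifetime of L years."""
    return 1.0 - (1.0 - 1.0 / return_period) ** lifetime

def expected_max_wave(hs, n_waves):
    """Rough expected maximum individual wave height in a sea state of
    significant height hs containing n_waves Rayleigh-distributed waves:
    H_max ~ hs * sqrt(ln(N) / 2)."""
    return hs * math.sqrt(0.5 * math.log(n_waves))

print(round(encounter_probability(100, 50), 3))   # ~0.395
print(round(expected_max_wave(5.0, 1000), 2))     # ~9.29 m
```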
Pre-Aggregation with Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
Modelling the probability of building fires
Vojtěch Barták
2014-12-01
Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
Probability calculations under the IAC hypothesis
Wilson, Mark C. (doi:10.1016/j.mathsocsci.2007.05.003)
2012-01-01
We show how powerful algorithms recently developed for counting lattice points and computing volumes of convex polyhedra can be used to compute probabilities of a wide variety of events of interest in social choice theory. Several illustrative examples are given.
Inclusion probability with dropout: an operational formula.
Milot, E; Courteau, J; Crispino, F; Mailly, F
2015-05-01
In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using inclusion probability theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, and hence slightly modifies the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs. likelihood-ratio approaches within the context of low-template amplifications.
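For context, the classic no-dropout version of the PI/RMNE computation that the paper generalizes can be sketched as follows (the allele frequencies are invented; the dropout correction itself is the paper's contribution and is not reproduced here):

```python
def locus_pi(visible_allele_freqs):
    """Classic (no-dropout) one-locus probability of inclusion: the
    chance a random person carries only alleles from the visible set."""
    return sum(visible_allele_freqs) ** 2

def combined_pi(loci):
    """Multi-locus PI is the product of per-locus PIs, assuming
    independent loci."""
    pi = 1.0
    for freqs in loci:
        pi *= locus_pi(freqs)
    return pi

# Hypothetical visible-allele frequencies at three independent loci.
loci = [[0.1, 0.2, 0.3], [0.25, 0.15], [0.3, 0.1, 0.05]]
print(round(combined_pi(loci), 6))   # 0.011664
```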
Hahn, Thomas; Foldspang, Anders
1997-01-01
Quadriceps muscle contraction tends to straighten the Q angle. We expected that sports comprising a high amount of quadriceps training could be associated with low Q angles. The aim of the present study was to estimate the Q angle in athletes and to investigate its potential associations with participation in sport. Three hundred and thirty-nine athletes had their Q angle measured. The mean of the right-side Q angles was higher than that of the left side, and the mean Q angle was higher in women than in men. The Q angle was positively associated with years of jogging, and negatively with years of soccer, swimming and sports participation at all. It is concluded that the use of Q angle measurements is questionable.
Individualized optimal release angles in discus throwing.
Leigh, Steve; Liu, Hui; Hubbard, Mont; Yu, Bing
2010-02-10
The purpose of this study was to determine individualized optimal release angles for elite discus throwers. Three-dimensional coordinate data were obtained for at least 10 competitive trials for each subject. Regression relationships between release speed and release angle, and between aerodynamic distance and release angle, were determined for each subject. These relationships were linear with subject-specific characteristics. The subject-specific relationships between release speed and release angle may be due to subjects' technical and physical characteristics. The subject-specific relationships between aerodynamic distance and release angle may be due to interactions between the release angle, the angle of attack, and the aerodynamic distance. Optimal release angles were estimated for each subject using the regression relationships and equations of projectile motion. The estimated optimal release angle differed between subjects, ranging from 35 to 44 degrees. The results of this study demonstrate that the optimal release angle for discus throwing is thrower-specific. The release angles used by elite discus throwers in competition are not necessarily optimal for all discus throwers, or even for themselves. The results of this study provide significant information for understanding the biomechanics of discus throwing techniques.
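The estimation procedure in the abstract above can be sketched as follows. All regression coefficients are invented for illustration, and the vacuum projectile range stands in for the full equations of projectile motion used in the study:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def release_speed(theta_deg):
    """Assumed subject-specific linear fit: speed drops with angle (m/s)."""
    return 26.0 - 0.08 * theta_deg

def aero_distance(theta_deg):
    """Assumed subject-specific linear fit for aerodynamic distance (m)."""
    return 8.0 - 0.05 * theta_deg

def total_distance(theta_deg):
    """Vacuum projectile range plus the fitted aerodynamic contribution."""
    v = release_speed(theta_deg)
    vacuum = v * v * math.sin(math.radians(2 * theta_deg)) / G
    return vacuum + aero_distance(theta_deg)

# Grid search over candidate release angles for this hypothetical thrower.
best = max(range(25, 56), key=total_distance)
print(best, round(total_distance(best), 2))
```

The key point the sketch captures is that because speed and aerodynamic distance both depend on the angle through subject-specific fits, the optimum is individual rather than universal.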
Wafer scale oblique angle plasma etching
Burckel, David Bruce; Jarecki, Jr., Robert L.; Finnegan, Patrick Sean
2017-05-23
Wafer scale oblique angle etching of a semiconductor substrate is performed in a conventional plasma etch chamber by using a fixture that supports a number of separate Faraday cages. Each cage is formed to include an angled grid surface and is positioned such that it sits over a separate one of the die locations on the wafer surface when the fixture is placed over the wafer. The presence of the Faraday cages influences the local electric field surrounding each wafer die, re-shaping the local field into alignment with the angled grid surface. The re-shaped plasma causes the reactive ions to follow a linear trajectory through the plasma sheath and angled grid surface, ultimately impinging the wafer surface at an angle. The selected geometry of the Faraday cage angled grid surface thus determines the angle at which the reactive ions impinge the wafer.
Transcription and the Pitch Angle of DNA
Olsen, Kasper W
2013-01-01
The question of the value of the pitch angle of DNA is visited from the perspective of a geometrical analysis of transcription. It is suggested that for transcription to be possible, the pitch angle of B-DNA must be smaller than the angle of zero-twist. At the zero-twist angle the double helix is maximally rotated and its strain-twist coupling vanishes. A numerical estimate of the pitch angle for B-DNA based on differential geometry is compared with numbers obtained from existing empirical data. Crystallographic studies show that the pitch angle is approximately 38 deg., less than the corresponding zero-twist angle of 41.8 deg., which is consistent with the suggested principle for transcription.
Survival probability for open spherical billiards
Dettmann, Carl P.; Rahman, Mohammed R.
2014-12-01
We study the survival probability for long times in an open spherical billiard, extending previous work on the circular billiard. We provide details of calculations regarding two billiard configurations, specifically a sphere with a circular hole and a sphere with a square hole. The constant terms of the long-time survival probability expansions have been derived analytically. Terms that vanish in the long time limit are investigated analytically and numerically, leading to connections with the Riemann hypothesis.
Data analysis recipes: Probability calculus for inference
Hogg, David W.
2012-01-01
In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods,...
Ruin Probability in Linear Time Series Model
ZHANG Lihong
2005-01-01
This paper analyzes a continuous-time risk model in which a linear time series model is used for the claim process. Time is discretized stochastically at the instants when claims occur, and Doob's stopping time theorem and martingale inequalities are used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds for the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
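The exponential upper bound mentioned above can be illustrated in the simpler classical (Cramér-Lundberg) risk model with exponentially distributed claims, where the adjustment coefficient has a closed form; this is a standard textbook setting, not the paper's linear time series model, and all parameter values are invented:

```python
import math

lam = 1.0   # claim arrival rate (Poisson)
mu = 2.0    # mean claim size (exponential distribution)
c = 2.5     # premium rate; solvency requires c > lam * mu

# Adjustment coefficient for exponential claims: R = 1/mu - lam/c,
# the positive root of lam * (M_X(r) - 1) = c * r.
R = 1.0 / mu - lam / c

def lundberg_bound(u):
    """Lundberg exponential upper bound exp(-R*u) on the infinite-horizon
    ruin probability with initial capital u."""
    return math.exp(-R * u)

print(round(lundberg_bound(10.0), 4))  # = exp(-1) ~ 0.3679
```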
Representing Uncertainty by Probability and Possibility
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain...
Probabilities and health risks: a qualitative approach.
Heyman, B; Henriksen, M; Maughan, K
1998-11-01
Health risks, defined in terms of the probability that an individual will suffer a particular type of adverse health event within a given time period, can be understood as referencing either natural entities or complex patterns of belief which incorporate the observer's values and knowledge, the position adopted in the present paper. The subjectivity inherent in judgements about adversity and time frames can be easily recognised, but social scientists have tended to accept uncritically the objectivity of probability. Most commonly in health risk analysis, the term probability refers to rates established by induction, and so requires the definition of a numerator and denominator. Depending upon their specification, many probabilities may be reasonably postulated for the same event, and individuals may change their risks by deciding to seek or avoid information. These apparent absurdities can be understood if probability is conceptualised as the projection of expectation onto the external world. Probabilities based on induction from observed frequencies provide glimpses of the future at the price of acceptance of the simplifying heuristic that statistics derived from aggregate groups can be validly attributed to individuals within them. The paper illustrates four implications of this conceptualisation of probability with qualitative data from a variety of sources, particularly a study of genetic counselling for pregnant women in a U.K. hospital. Firstly, the official selection of a specific probability heuristic reflects organisational constraints and values as well as predictive optimisation. Secondly, professionals and service users must work to maintain the facticity of an established heuristic in the face of alternatives. Thirdly, individuals, both lay and professional, manage probabilistic information in ways which support their strategic objectives. Fourthly, predictively sub-optimum schema, for example the idea of AIDS as a gay plague, may be selected because
De Finetti's contribution to probability and statistics
Cifarelli, Donato Michele; Regazzini, Eugenio
1996-01-01
This paper summarizes the scientific activity of de Finetti in probability and statistics. It falls into three sections: Section 1 includes an essential biography of de Finetti and a survey of the basic features of the scientific milieu in which he took the first steps of his scientific career; Section 2 concerns de Finetti's work in probability: (a) foundations, (b) processes with independent increments, (c) sequences of exchangeable random variables, and (d) contributions which fall within ...
Characteristic Functions over C*-Probability Spaces
王勤; 李绍宽
2003-01-01
Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.
Imprecise Probability Methods for Weapons UQ
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Probability, clinical decision making and hypothesis testing
A Banerjee
2009-01-01
Full Text Available Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: ""to present the mathematical analysis underlying probability results"" Special emphases on simulation and discrete decision theory Mathematically-rich, but self-contained text, at a gentle pace Review of calculus and linear algebra in an appendix Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem
Representing Uncertainty by Probability and Possibility
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain...
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
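The zero-inflated binomial mixture likelihood described above can be written down directly for one site. This minimal sketch uses a finite mixture for p with invented values, not the mixture classes analyzed in the article:

```python
from math import comb

def binom_pmf(y, J, p):
    """Binomial probability of y detections in J visits at detection prob p."""
    return comb(J, y) * p**y * (1 - p)**(J - y)

def site_likelihood(y, J, psi, mixture):
    """Zero-inflated binomial mixture likelihood for one site:
    occupied with probability psi; detection probability drawn from a
    finite mixture given as [(p_k, weight_k), ...]."""
    detected = sum(w * binom_pmf(y, J, p) for p, w in mixture)
    zero_inflation = (1 - psi) if y == 0 else 0.0
    return psi * detected + zero_inflation

# Hypothetical example: two detection classes with equal weight.
mixture = [(0.2, 0.5), (0.6, 0.5)]
print(site_likelihood(0, 5, 0.7, mixture))  # 0.418272
```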
Determination of stability of epimetamorphic rock slope using Minimax Probability Machine
Manoj Kumar
2016-01-01
Full Text Available The article employs the Minimax Probability Machine (MPM) for the prediction of the stability status of epimetamorphic rock slope. The MPM gives a worst-case bound on the probability of misclassification of future data points. Bulk density (d), height (H), inclination (β), cohesion (c) and internal friction angle (φ) have been used as inputs to the MPM. This study uses the MPM as a classification technique. Two models, the Linear Minimax Probability Machine (LMPM) and the Kernelized Minimax Probability Machine (KMPM), have been developed. The generalization capability of the developed models has been checked by a case study. The experimental results demonstrate that MPM-based approaches are promising tools for the prediction of the stability status of epimetamorphic rock slope.
Meningiomas of the cerebellopontine angle.
Matthies, C; Carvalho, G; Tatagiba, M; Lima, M; Samii, M
1996-01-01
Meningiomas of the cerebellopontine angle (CPA) represent a clinically and surgically interesting entity. The opportunity of complete surgical excision and the incidence of impairment of neural structures largely depend on the tumour biology, which either leads to displacement of surrounding structures by an expansive type of growth or to an enveloping of neural and vascular structures by an en plaque type of growth. As the origin and the direction of growth are very variable, the exact tumour extension in relation to the neural structures and the tumour origin can sometimes be identified only at the time of surgery. Out of a series of 230 meningiomas of the posterior skull base operated on between 1978 and 1993, data on 134 meningiomas involving the cerebellopontine angle are presented. There were 20% male and 80% female patients, with age at the time of surgery ranging from 18 to 76 years (average 51 years). The clinical presentation was characterized by a predominant disturbance of the cranial nerves V (19%), VII (11%), VIII (67%) and the caudal cranial nerves (6%), and signs of ataxia (28%). 80% of the meningiomas were larger than 30 mm in diameter, 53% led to evident brainstem compression or dislocation, and 85% extended anteriorly to the internal auditory canal. Using the lateral suboccipital approach in the majority of cases, and a combined presigmoidal or combined suboccipital and subtemporal approach in either sequence in 5%, complete tumour removal (Simpson I and II) was accomplished in 95% and subtotal tumour removal in 5%. Histologically, the meningiotheliomatous type was most common (49%), followed by the mixed (19%), fibroblastic (16%), psammomatous (7%), hemangioblastic (7%) and anaplastic (2%) types. Major post-operative complications were CSF leakage (8%), requiring surgical revision in 2%, and hemorrhage (3%), requiring revision in 2%. While the majority of neurological disturbances showed signs of recovery, facial nerve paresis or paralysis was
Tsunami probability in the Caribbean region
Parsons, T.; Geist, E. L.
2008-12-01
We calculated tsunami runup probability at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20km by 20km cells, and the mean tsunami runup rate was determined for each cell. A remarkable ~500-year empirical record was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it's unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c=0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack back-arc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20km by 20km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30 percent regionally.
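The Poissonian probability model used in the study above reduces, per coastal cell, to converting a mean runup rate into an exceedance probability over an exposure window. The rate below is a hypothetical example, not a value from the study:

```python
import math

def poisson_runup_probability(rate_per_year, years=30.0):
    """Probability of at least one tsunami runup in the exposure window,
    assuming runups arrive as a Poisson process: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * years)

# Hypothetical cell with a mean rate of one runup per 100 years.
print(round(poisson_runup_probability(0.01), 4))  # 1 - exp(-0.3) ~ 0.2592
```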
Dynamic contact angle measurements on superhydrophobic surfaces
Kim, Jeong-Hyun; Kavehpour, H. Pirouz; Rothstein, Jonathan P.
2015-03-01
In this paper, the dynamic advancing and receding contact angles of a series of aqueous solutions were measured on a number of hydrophobic and superhydrophobic surfaces using a modified Wilhelmy plate technique. Superhydrophobic surfaces are hydrophobic surfaces with micron or nanometer sized surface roughness. These surfaces have very large static advancing contact angles and little static contact angle hysteresis. In this study, the dynamic advancing and dynamic receding contact angles on superhydrophobic surfaces were measured as a function of plate velocity and capillary number. The dynamic contact angles measured on a smooth hydrophobic Teflon surface were found to obey the scaling with capillary number predicted by the Cox-Voinov-Tanner law, θ_D^3 ∝ Ca. The response of the dynamic contact angle on the superhydrophobic surfaces, however, did not follow the same scaling law. The advancing contact angle was found to remain constant at θ_A = 160°, independent of capillary number. The dynamic receding contact angle measurements on superhydrophobic surfaces were found to decrease with increasing capillary number; however, the presence of slip on the superhydrophobic surface was found to result in a shift in the onset of dynamic contact angle variation to larger capillary numbers. In addition, a much weaker dependence of the dynamic contact angle on capillary number was observed for some of the superhydrophobic surfaces tested.
Maurer Till
2005-04-01
Full Text Available Abstract Background We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results To test this modeling approach, it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator activated receptor γ (Ppar γ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of Ppar γ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.
Laboratory-tutorial activities for teaching probability
Roger E. Feeley
2006-08-01
Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
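The target concept described above, comparing the time spent in a region to the total time, can be illustrated numerically for a classical harmonic oscillator. This is a minimal hypothetical example, not material from the course:

```python
import math

# For a classical particle x(t) = A*sin(w*t), the probability of finding
# it in a region equals the fraction of the period spent there. Here we
# sample one period on a uniform time grid and count samples with x > 0.5.
A = 1.0
N = 100_000
inside = sum(A * math.sin(2 * math.pi * k / N) > 0.5 for k in range(N))
frac = inside / N
print(round(frac, 3))  # sin exceeds 1/2 on one third of the period
```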
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.
Laboratory-tutorial activities for teaching probability
Michael C. Wittmann
2006-08-01
Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
Approximation of Failure Probability Using Conditional Sampling
Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
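The idea in the abstract above, bounding the failure event by a set of simple geometry and then sampling conditionally inside it, can be shown on a toy problem (this is an invented example, not the authors' code):

```python
import random

random.seed(1)

def fails(x, y):
    """Toy failure region: a disc of radius 0.05 centered in the unit square."""
    return (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.05 ** 2

# The disc is bounded by the box B = [0.45, 0.55]^2, whose probability
# under the uniform distribution on the unit square is known analytically.
p_bound = 0.01

# Sample only inside B and combine: P(fail) = P(B) * P(fail | B).
n = 100_000
hits = sum(fails(random.uniform(0.45, 0.55), random.uniform(0.45, 0.55))
           for _ in range(n))
estimate = p_bound * hits / n
print(estimate)  # true value: pi * 0.05**2, about 0.00785
```

Every sample lands where failure is possible, so far fewer points are needed than with plain Monte Carlo over the whole unit square, where roughly 99% of samples would be wasted.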
Computing Earthquake Probabilities on Global Scales
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
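The counting method described above converts the number of small events since the last large one into a conditional probability via a Weibull law. A minimal sketch follows; the scale and shape parameters are illustrative, not the paper's fitted values:

```python
import math

def large_event_probability(n_small, tau=100.0, beta=1.4):
    """Weibull-law conversion of a small-event count into the conditional
    probability of a large event: P = 1 - exp(-(n/tau)^beta).
    tau (scale) and beta (shape) are hypothetical parameters."""
    return 1.0 - math.exp(-((n_small / tau) ** beta))

# Probability grows monotonically with the count of small events.
for n in (10, 100, 300):
    print(n, round(large_event_probability(n), 3))
```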
A Property of Crack Propagation at the Specimen of CFRP with Layer Angle
Hwang, Gue Wan; Cho, Jae Ung [Kongju Univ., Kongju (Korea, Republic of); Cho, Chong Du [Inha Univ., Incheon (Korea, Republic of)
2016-12-15
CFRP is a composite material manufactured from a hybrid resin on a carbon fiber base. Because of its high specific strength and light weight, it has been widely used in various fields. In particular, unidirectional carbon fiber can be laid up at a layer angle, and CFRP made with a layer angle has higher strength than CFRP without one. In this paper, the crack growth behavior at each layer angle was investigated through the crack propagation and fracture behavior of CFRP compact tension specimens as the layer angle was changed. The maximum stress decreases and crack propagation slows down as the layer angle increases, but this trend has a limit: beyond a layer angle of 60°, the stress increases again. The results of this study can be used as data for assessing the probability of fatigue fracture when a defect occurs inside a CFRP mechanical structure.
Caustic graphene plasmons with Kelvin angle
Shi, Xihang; Gao, Fei; Xu, Hongyi; Yang, Zhaoju; Zhang, Baile
2015-01-01
A century-long argument made by Lord Kelvin that all swimming objects have an effective Mach number of 3, corresponding to the Kelvin angle of 19.5 degrees for ship waves, has recently been challenged with the conclusion that the Kelvin angle should gradually transition to the Mach angle as the ship velocity increases. Here we show that a similar phenomenon can happen for graphene plasmons. By analyzing the caustic wave pattern of graphene plasmons stimulated by a swift charged particle moving uniformly above graphene, we show that at low velocities of the charged particle, the caustics of graphene plasmons form the Kelvin angle. At large velocities of the particle, the caustics disappear and the effective semi-angle of the wave pattern approaches the Mach angle. Our study introduces caustic wave theory to the field of graphene plasmonics, and reveals a novel physical picture of graphene plasmon excitation during electron energy-loss spectroscopy measurement.
A thermodynamic model of contact angle hysteresis
Makkonen, Lasse
2017-08-01
When a three-phase contact line moves along a solid surface, the contact angle no longer corresponds to the static equilibrium angle but is larger when the liquid is advancing and smaller when the liquid is receding. The difference between the advancing and receding contact angles, i.e., the contact angle hysteresis, is of paramount importance in wetting and capillarity. For example, it determines the magnitude of the external force that is required to make a drop slide on a solid surface. Until now, fundamental origin of the contact angle hysteresis has been controversial. Here, this origin is revealed and a quantitative theory is derived. The theory is corroborated by the available experimental data for a large number of solid-liquid combinations. The theory is applied in modelling the contact angle hysteresis on a textured surface, and these results are also in quantitative agreement with the experimental data.
Probability, Arrow of Time and Decoherence
Bacciagaluppi, G
2007-01-01
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
A basic course in probability theory
Bhattacharya, Rabi
2016-01-01
This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...
Probabilities and Signalling in Quantum Field Theory
Dickinson, Robert; Millington, Peter
2016-01-01
We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.
Comparing coefficients of nested nonlinear probability models
Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders
2011-01-01
In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.
Channel Capacity Estimation using Free Probability Theory
Ryan, Øyvind
2007-01-01
In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...
7th High Dimensional Probability Meeting
Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan
2016-01-01
This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...
A Thermodynamical Approach for Probability Estimation
Isozaki, Takashi
2012-01-01
The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers over-fitting when insufficient data is available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, where energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small size data.
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
A Revisit to Probability - Possibility Consistency Principles
Mamoni Dhar
2013-03-01
In this article, our main intention is to highlight the fact that the probable links between probability and possibility, established by different authors at different points in time on the basis of some well-known consistency principles, cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and finally suggests a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.
Correlations and Non-Linear Probability Models
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
Python for probability, statistics, and machine learning
Unpingco, José
2016-01-01
This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
The effect of eccentric exercise on position sense and joint reaction angle of the lower limbs.
Paschalis, V; Nikolaidis, M G; Giakas, G; Jamurtas, A Z; Pappas, A; Koutedakis, Y
2007-04-01
Impaired position sense and impaired joint reaction angle of the lower limbs after muscle-damaging activities is a serious functional limitation that may lead to an increased risk of injury, particularly in older populations. The purpose of the present study was to examine whether position sense and joint reaction angle to release can be affected by eccentric exercise-induced muscle damage. Twelve women underwent an isokinetic exercise session of the lower limb. Isometric peak torque, delayed-onset muscle soreness, serum creatine kinase, position sense, and knee joint reaction angle to release were examined before, immediately after, and 24, 48, and 72 h post-exercise. Due to the effect of eccentric exercise, subjects persistently placed their lower limb at a more extended position, representing a shorter knee extensor muscle. Eccentric exercise increased the knee reaction angle of the lower limb after release from 0 degrees and 15 degrees but not from 30 degrees and 45 degrees. Position sense and joint reaction to release were similarly affected by eccentric exercise and independently of visual feedback. Position sense was impaired only immediately post-exercise (probably due to muscle fatigue), whereas impairment of the reaction angle to release persisted up to 3 days post-exercise (probably due to muscle damage). Attenuation of position sense and joint reaction angle of the lower limbs after damaging activities is a serious functional limitation that may lead to an increased risk of injury, particularly in older populations.
Contact angle measurements under thermodynamic equilibrium conditions.
Lages, Carol; Méndez, Eduardo
2007-08-01
The precise control of the ambient humidity during contact angle measurements is needed to obtain stable and valid data. For this purpose, a simple low-cost device was designed, and several modified surfaces relevant to biosensor design were studied. Static contact angle values for these surfaces are lower than advancing contact angles published for ambient conditions, indicating that thermodynamic equilibrium conditions are needed to avoid drop evaporation during the measurements.
Predicting the probability of outbreeding depression.
Frankham, Richard; Ballou, Jonathan D; Eldridge, Mark D B; Lacy, Robert C; Ralls, Katherine; Dudash, Michele R; Fenster, Charles B
2011-06-01
Fragmentation of animal and plant populations typically leads to genetic erosion and increased probability of extirpation. Although these effects can usually be reversed by re-establishing gene flow between population fragments, managers sometimes fail to do so due to fears of outbreeding depression (OD). Rapid development of OD is due primarily to adaptive differentiation from selection or fixation of chromosomal variants. Fixed chromosomal variants can be detected empirically. We used an extended form of the breeders' equation to predict the probability of OD due to adaptive differentiation between recently isolated population fragments as a function of intensity of selection, genetic diversity, effective population sizes, and generations of isolation. Empirical data indicated that populations in similar environments had not developed OD even after thousands of generations of isolation. To predict the probability of OD, we developed a decision tree that was based on the four variables from the breeders' equation, taxonomic status, and gene flow within the last 500 years. The predicted probability of OD in crosses between two populations is elevated when the populations have at least one of the following characteristics: are distinct species, have fixed chromosomal differences, exchanged no genes in the last 500 years, or inhabit different environments. Conversely, the predicted probability of OD in crosses between two populations of the same species is low for populations with the same karyotype, isolated for <500 years, and that occupy similar environments. In the former case, we recommend crossing be avoided or tried on a limited, experimental basis. In the latter case, crossing can be carried out with low probability of OD. We used crosses with known results to test the decision tree and found that it correctly identified cases where OD occurred. Current concerns about OD in recently fragmented populations are almost certainly excessive.
Explosion probability of unexploded ordnance: expert beliefs.
MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G
2008-08-01
This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies
Validation of fluorescence transition probability calculations
Pia, M G; Sudhaka, Manju
2009-01-01
A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimental measurements than calculations based on the Hartree-Slater method.
Fifty challenging problems in probability with solutions
Mosteller, Frederick
1987-01-01
Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall...
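The Unfair Subway puzzle hinges on the trains' interleaved schedule: if trains in each direction run every 10 minutes but the uptown train always arrives 1 minute after the downtown one, a uniformly random arrival catches the uptown train only 10% of the time, matching Marvin's two dinners in 20 days. The specific offsets below are an illustrative assumption, not taken from the book; a seeded Monte Carlo sketch:

```python
import random

def p_uptown(downtown_offset=0.0, uptown_offset=1.0, period=10.0,
             trials=100_000, seed=42):
    """Estimate the chance Marvin's first train is the uptown one,
    assuming trains in each direction run every `period` minutes at
    the given offsets (illustrative schedule, not from the source)."""
    rng = random.Random(seed)
    uptown = 0
    for _ in range(trials):
        t = rng.uniform(0, period)  # Marvin's arrival within one cycle
        # Waiting time until the next train in each direction.
        wait_down = (downtown_offset - t) % period
        wait_up = (uptown_offset - t) % period
        if wait_up < wait_down:
            uptown += 1
    return uptown / trials

p = p_uptown()
# The exact value is 1/10: Marvin catches the uptown train only when he
# arrives during the 1-minute gap right after a downtown departure.
```

The estimate lands close to 0.1, explaining the "50-50" illusion: both directions are served equally often, but not symmetrically in time.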
Probability, statistics, and decision for civil engineers
Benjamin, Jack R
2014-01-01
Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and
Risk Probability Estimating Based on Clustering
Chen, Yong; Jensen, Christian D.; Gray, Elizabeth
2003-01-01
In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based on prior experiences, recommendations from a trusted entity or the reputation of the other entity.
Probability densities and Lévy densities
Barndorff-Nielsen, Ole Eiler
For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.
Harmonic analysis and the theory of probability
Bochner, Salomon
2005-01-01
Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro
Conditional Probabilities and Collapse in Quantum Measurements
Laura, Roberto; Vanni, Leonardo
2008-09-01
We show that including both the system and the apparatus in the quantum description of the measurement process, and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same that would be obtained using the postulate of collapse.
Lady luck the theory of probability
Weaver, Warren
1982-01-01
"Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day ― in business, in love affairs, in forecasting the weather or the stock market ― questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa...
Survival probability in diffractive dijet photoproduction
Klasen, M
2009-01-01
We confront the latest H1 and ZEUS data on diffractive dijet photoproduction with next-to-leading order QCD predictions in order to determine whether a rapidity gap survival probability of less than one is supported by the data. We find evidence for this hypothesis when assuming global factorization breaking for both the direct and resolved photon contributions, in which case the survival probability would have to be E_T^jet-dependent, and when assuming it for the resolved contribution only (or in addition for the related direct initial-state singular contribution), in which case it would be independent of E_T^jet.
Atomic transition probabilities of Nd I
Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.
2011-12-01
Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd i). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.
Concepts of probability in radiocarbon analysis
Bernhard Weninger
2011-12-01
In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology, as well as for an adequate interpretation of their reliability.
Probabilities for separating sets of order statistics.
Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E
2010-04-01
Consider a set of order statistics that arise from sorting samples from two different populations, each with their own, possibly different distribution functions. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
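The Benjamini-Hochberg procedure mentioned above is a short step-up rule on sorted p-values: find the largest rank k with p_(k) ≤ k·α/m and reject the k hypotheses with the smallest p-values. A minimal sketch, with made-up illustrative p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up FDR procedure: sort the m p-values,
    find the largest rank k with p_(k) <= k * alpha / m, and reject
    the hypotheses with the k smallest p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank  # step-up: keep the largest qualifying rank
    return sorted(order[:k])  # indices of rejected hypotheses

# Rank 1 fails its threshold (0.013 > 0.0125) but rank 3 passes
# (0.03 <= 0.0375), so the step-up rule still rejects all three.
rejected = benjamini_hochberg([0.02, 0.03, 0.013, 0.9], alpha=0.05)
# → [0, 1, 2]
```

The example shows why it is called "step-up": a p-value may exceed its own threshold and still be rejected because a larger rank qualifies.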
Exact probability distribution functions for Parrondo's games
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and obtained a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
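The capital-dependent Parrondo game admits an exact analysis because the win probability depends only on the capital mod 3, so each game's average drift follows from the stationary distribution of a three-state Markov chain. The sketch below uses the standard textbook parameters (ε = 0.005), an assumption since the abstract does not fix them, and is not the authors' Fourier-transform method:

```python
def drift(probs, steps=1000):
    """Expected capital gain per round of a game whose win probability
    is probs[c % 3] for capital c, computed from the stationary
    distribution of the capital-mod-3 chain via power iteration."""
    pi = [1 / 3, 1 / 3, 1 / 3]
    for _ in range(steps):
        new = [0.0, 0.0, 0.0]
        for s in range(3):
            new[(s + 1) % 3] += pi[s] * probs[s]        # win:  capital + 1
            new[(s - 1) % 3] += pi[s] * (1 - probs[s])  # lose: capital - 1
        pi = new
    return sum(pi[s] * (2 * probs[s] - 1) for s in range(3))

eps = 0.005
game_a = [0.5 - eps] * 3                      # slightly unfair coin
game_b = [0.1 - eps, 0.75 - eps, 0.75 - eps]  # bad coin only when c % 3 == 0
# Choosing A or B with probability 1/2 each round averages the win
# probabilities state by state.
game_ab = [(a + b) / 2 for a, b in zip(game_a, game_b)]

# Paradox: A and B each drift downward, yet the random mixture drifts up.
```

With these parameters `drift(game_a)` and `drift(game_b)` are both negative while `drift(game_ab)` is positive, which is the Parrondo effect the abstract refers to.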
Duelling idiots and other probability puzzlers
Nahin, Paul J
2002-01-01
What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki
Proposal for Modified Damage Probability Distribution Functions
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St...
Probability in biology: overview of a comprehensive theory of probability in living systems.
Nakajima, Toshiyuki
2013-09-01
Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
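The conditional clade idea in the abstract above can be illustrated with a small sketch: estimate a tree's posterior probability as the product, over its splits, of the empirical probability of each split given its parent clade. This is a minimal reconstruction from the abstract, not Larget's software; the nested-tuple tree representation, the toy sample, and all function names are assumptions for illustration.

```python
from collections import Counter

def leaves(tree):
    """Leaf set of a rooted binary tree given as nested 2-tuples of labels."""
    if isinstance(tree, str):
        return frozenset([tree])
    left, right = tree
    return leaves(left) | leaves(right)

def splits(tree):
    """All (parent clade, split) pairs of a tree; a split is the unordered
    pair of child clades into which the parent clade is divided."""
    if isinstance(tree, str):
        return []
    left, right = tree
    pair = (leaves(tree), frozenset([leaves(left), leaves(right)]))
    return [pair] + splits(left) + splits(right)

# Toy posterior sample of three rooted trees on taxa A-D (invented data)
sample = [
    (("A", "B"), ("C", "D")),
    (("A", "B"), ("C", "D")),
    (("A", "C"), ("B", "D")),
]

# Tabulate how often each split occurs, conditional on its parent clade
parent_counts, split_counts = Counter(), Counter()
for t in sample:
    for parent, split in splits(t):
        parent_counts[parent] += 1
        split_counts[(parent, split)] += 1

def ccd_probability(tree):
    """Posterior estimate: product of conditional clade probabilities
    P(split | parent clade) over all splits of the tree."""
    p = 1.0
    for parent, split in splits(tree):
        p *= split_counts[(parent, split)] / parent_counts[parent]
    return p
```

On this tiny sample the CCD estimate for the first topology equals its sample relative frequency (2/3), but unlike simple frequencies the CCD can also assign nonzero probability to unsampled trees whose clades all appear among sampled trees.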
Xiaoyu Hu
2007-11-01
Full Text Available The outage probability of reverse-link multicarrier (MC) code-division multiple access (CDMA) systems with beamforming in the presence of carrier frequency offset (CFO) is studied. A conventional uniform linear array (ULA) beamformer is utilized. An independent Nakagami fading channel is assumed for each subcarrier of all users. The outage probability is first investigated under a scenario where perfect beamforming is assumed. A closed-form expression of the outage probability is derived. The impact of different types of beamforming impairments on the outage probability is then evaluated, including direction-of-arrival (DOA) estimation errors, angle spreads, and mutual couplings. Numerical results show that the outage probability improves significantly as the number of antenna elements increases. The effect of CFO on the outage probability is reduced significantly when the beamforming technique is employed. Also, it is seen that small beamforming impairments (DOA estimation errors and angle spreads) affect the outage probability only very slightly, and the mutual coupling between adjacent antenna elements does not affect the outage probability noticeably.
Hu Xiaoyu
2008-01-01
Full Text Available The outage probability of reverse-link multicarrier (MC) code-division multiple access (CDMA) systems with beamforming in the presence of carrier frequency offset (CFO) is studied. A conventional uniform linear array (ULA) beamformer is utilized. An independent Nakagami fading channel is assumed for each subcarrier of all users. The outage probability is first investigated under a scenario where perfect beamforming is assumed. A closed-form expression of the outage probability is derived. The impact of different types of beamforming impairments on the outage probability is then evaluated, including direction-of-arrival (DOA) estimation errors, angle spreads, and mutual couplings. Numerical results show that the outage probability improves significantly as the number of antenna elements increases. The effect of CFO on the outage probability is reduced significantly when the beamforming technique is employed. Also, it is seen that small beamforming impairments (DOA estimation errors and angle spreads) affect the outage probability only very slightly, and the mutual coupling between adjacent antenna elements does not affect the outage probability noticeably.
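The abstract above derives a closed-form outage probability under Nakagami fading. As a hedged illustration only (not the paper's actual multicarrier expression), the interference-free single-link case reduces to a Gamma CDF: for integer shape m, P(SNR < threshold) has a finite Poisson-sum form. The function name and the numeric SNR values are assumptions.

```python
import math

def nakagami_outage(mean_snr, threshold_snr, m):
    """P(SNR < threshold) for Nakagami-m fading with integer shape m.
    The instantaneous SNR is Gamma(m, mean/m) distributed, so its CDF is a
    regularized lower incomplete gamma, evaluated here via the Poisson sum."""
    x = m * threshold_snr / mean_snr
    return 1.0 - math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(m))

# m = 1 is Rayleigh fading; larger m models milder fading and lower outage
rayleigh = nakagami_outage(10.0, 5.0, m=1)   # equals 1 - exp(-0.5)
milder = nakagami_outage(10.0, 5.0, m=4)
```

Increasing m (milder fading) lowers the outage probability for the same mean SNR, consistent with the qualitative trends the abstract reports.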
Probable Bright Supernova discovered by PSST
Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.
2016-09-01
A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).
Updating piping probabilities with survived historical loads
Schweckendiek, T.; Kanning, W.
2009-01-01
Piping, also called under-seepage, is an internal erosion mechanism, which can cause the failure of dikes or other flood defence structures. The uncertainty in the resistance of a flood defence against piping is usually large, causing high probabilities of failure for this mechanism. A considerable
Entanglement Mapping VS. Quantum Conditional Probability Operator
Chruściński, Dariusz; Kossakowski, Andrzej; Matsuoka, Takashi; Ohya, Masanori
2011-01-01
The relation between two methods which construct the density operator on a composite system is shown. One of them is called an entanglement mapping and the other a quantum conditional probability operator. On the basis of this relation we discuss quantum correlation by means of some types of quantum entropy.
Assessing Schematic Knowledge of Introductory Probability Theory
Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley
2005-01-01
The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…
Independent Events in Elementary Probability Theory
Csenki, Attila
2011-01-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…
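The distinction the abstract above draws between the pairwise multiplication rule and joint independence has a classic counterexample that can be checked by direct enumeration: two fair coin tosses with events "first is heads", "second is heads", and "the tosses agree". This sketch is illustrative only; the event names are assumptions.

```python
from itertools import product

# Sample space: two fair coin tosses, each of the 4 outcomes has probability 1/4
omega = list(product("HT", repeat=2))

def P(event):
    """Probability of an event (a predicate on outcomes) under the uniform measure."""
    return sum(1 for w in omega if event(w)) / len(omega)

A = lambda w: w[0] == "H"      # first toss is heads
B = lambda w: w[1] == "H"      # second toss is heads
C = lambda w: w[0] == w[1]     # the two tosses agree

# Every pair satisfies the multiplication rule P(E∩F) = P(E)P(F) ...
pairwise = all(
    P(lambda w: e(w) and f(w)) == P(e) * P(f)
    for e, f in [(A, B), (A, C), (B, C)]
)

# ... but the triple does not: P(A∩B∩C) = 1/4, while P(A)P(B)P(C) = 1/8
triple = P(lambda w: A(w) and B(w) and C(w))
```

So A, B, C are pairwise independent yet not jointly independent, which is why joint independence must be required for all subsets of events, not just pairs.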
Rethinking the learning of belief network probabilities
Musick, R.
1996-03-01
Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
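The "rote learning" of conditional probabilities that the abstract above criticizes is, in its simplest form, maximum-likelihood counting of a conditional probability table from data. The sketch below shows that baseline plus Laplace smoothing as one simple refinement; it is an illustrative reconstruction under assumed names (`learn_cpt`, the rain/wet toy data), not the paper's method.

```python
from collections import Counter

def learn_cpt(data, parent_vars, child_var, child_values, alpha=0.0):
    """Estimate P(child | parents) from complete data by counting.
    alpha = 0 gives the rote maximum-likelihood estimate;
    alpha > 0 adds Laplace smoothing, one simple improvement."""
    joint, margin = Counter(), Counter()
    for row in data:
        key = tuple(row[v] for v in parent_vars)
        joint[(key, row[child_var])] += 1
        margin[key] += 1

    def p(child_value, parent_values):
        return (joint[(parent_values, child_value)] + alpha) / (
            margin[parent_values] + alpha * len(child_values))
    return p

# Toy data: it rained in 3 of 5 records; the ground was wet in 2 of those 3
data = [{"rain": r, "wet": w}
        for r, w in [(1, 1), (1, 1), (1, 0), (0, 0), (0, 0)]]
p_ml = learn_cpt(data, ["rain"], "wet", child_values=[0, 1])
p_smooth = learn_cpt(data, ["rain"], "wet", child_values=[0, 1], alpha=1.0)
```

The rote estimate gives P(wet=1 | rain=1) = 2/3, while smoothing pulls it toward uniform (3/5 here); the paper's point is that richer learners can be substituted for this counting step.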
Probability from a Socio-Cultural Perspective
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Probability & Perception: The Representativeness Heuristic in Action
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
Critique of `Elements of Quantum Probability'
Gill, R.D.
1998-01-01
We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors the experiment shows the need to introduce the extension
Probability based calibration of pressure coefficients
Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard
2015-01-01
not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted...
Probability & Statistics: Modular Learning Exercises. Student Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Probability as a theory dependent concept
Atkinson, D; Peijnenburg, J
1999-01-01
It is argued that probability should be defined implicitly by the distributions of possible measurement values characteristic of a theory. These distributions are tested by, but not defined in terms of, relative frequencies of occurrences of events of a specified kind. The adoption of an a priori
Haavelmo's Probability Approach and the Cointegrated VAR
Juselius, Katarina
dependent residuals, normalization, reduced rank, model selection, missing variables, simultaneity, autonomy and identification. Specifically the paper discusses (1) the conditions under which the VAR model represents a full probability formulation of a sample of time-series observations, (2...
Correlations and Non-Linear Probability Models
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
Probability & Statistics: Modular Learning Exercises. Teacher Edition
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Comonotonic Book-Making with Nonadditive Probabilities
Diecidue, E.; Wakker, P.P.
2000-01-01
This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the
Idempotent probability measures on ultrametric spaces
Hubal, Oleksandra; Zarichnyi, Mykhailo
2008-07-01
Following the construction due to Hartog and Vink we introduce a metric on the set of idempotent probability measures (Maslov measures) defined on an ultrametric space. This construction determines a functor on the category of ultrametric spaces and nonexpanding maps. We prove that this functor is the functorial part of a monad on this category. This monad turns out to contain the hyperspace monad.
Investigating Probability with the NBA Draft Lottery.
Quinn, Robert J.
1997-01-01
Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
Probability of boundary conditions in quantum cosmology
Suenobu, Hiroshi; Nambu, Yasusada
2017-02-01
One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe which can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well-known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.
Interstitial lung disease probably caused by imipramine.
Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H
2014-01-01
Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused by imipramine. There is a need to report such rare adverse effects relating ILD and drugs for better management of ILD.
Phonotactic Probability Effects in Children Who Stutter
Anderson, Julie D.; Byrd, Courtney T.
2008-01-01
Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…
Return probability and k-step measures
Dronen, Nicholas
2011-01-01
The notion of return probability -- explored most famously by George Pólya on d-dimensional lattices -- has potential as a measure for the analysis of networks. We present an efficient method for finding return probability distributions for connected undirected graphs. We argue that return probability has the same discriminatory power as existing k-step measures -- in particular, beta centrality (with negative beta), the graph-theoretical power index (GPI), and subgraph centrality. We compare the running time of our algorithm to beta centrality and subgraph centrality and find that it is significantly faster. When return probability is used to measure the same phenomena as beta centrality, it runs in linear time -- O(n+m), where n and m are the number of nodes and edges, respectively -- which takes much less time than either the matrix inversion or the sequence of matrix multiplications required for calculating the exact or approximate forms of beta centrality, respectively. We call this form of return pr...
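The return probabilities discussed above can be computed directly for small graphs: the k-step return probability at node i is the i-th diagonal entry of the k-th power of the random-walk transition matrix. This brute-force sketch is for illustration only (it is not the paper's efficient algorithm, and all names are assumptions).

```python
def transition_matrix(adj):
    """Row-stochastic random-walk matrix from an adjacency list."""
    n = len(adj)
    P = [[0.0] * n for _ in range(n)]
    for i, nbrs in enumerate(adj):
        for j in nbrs:
            P[i][j] = 1.0 / len(nbrs)
    return P

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def return_probabilities(adj, kmax):
    """out[k-1][i] = probability a walk started at i is back at i after k steps."""
    P = transition_matrix(adj)
    Pk = P
    out = []
    for _ in range(kmax):
        out.append([Pk[i][i] for i in range(len(adj))])
        Pk = matmul(Pk, P)
    return out

# 4-cycle: the graph is bipartite, so return is possible only after an even
# number of steps; after 2 steps the walker is back with probability 1/2
cycle4 = [[1, 3], [0, 2], [1, 3], [2, 0]]
probs = return_probabilities(cycle4, 4)
```

Dense matrix powers cost O(k·n³); the paper's contribution is precisely avoiding this cost, achieving linear time in the beta-centrality setting.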
Transforming Probabilities without Violating Stochastic Dominance
P.P. Wakker (Peter)
1989-01-01
The idea of expected utility, to transform payments into their utilities before calculating expectation, traces back at least to Bernoulli (1738). It is a very natural idea to transform, analogously, probabilities. This paper gives heuristic visual arguments to show that the, at first
STRIP: stream learning of influence probabilities
Kutzkov, Konstantin
2013-01-01
cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as Twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning...
PROBABILITY SAMPLING DESIGNS FOR VETERINARY EPIDEMIOLOGY
Xhelil Koleci; Coryn, Chris L.S.; Kristin A. Hobson; Rruzhdi Keci
2011-01-01
The objective of sampling is to estimate population parameters, such as incidence or prevalence, from information contained in a sample. In this paper, the authors describe sources of error in sampling; basic probability sampling designs, including simple random sampling, stratified sampling, systematic sampling, and cluster sampling; estimating a population size if unknown; and factors influencing sample size determination for epidemiological studies in veterinary medicine.
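Of the designs the abstract above lists, stratified sampling with proportional allocation is easy to sketch: within each stratum, draw a simple random sample whose size is proportional to the stratum's share of the population. This is a generic illustration under assumed names (`stratified_sample`, the herd-size strata), not the authors' procedure.

```python
import random

def stratified_sample(strata, n_total, seed=0):
    """Proportional-allocation stratified random sample.
    strata: dict mapping stratum name -> list of sampling units.
    Note: per-stratum sizes are rounded, so they may not sum exactly to n_total."""
    rng = random.Random(seed)
    N = sum(len(units) for units in strata.values())
    sample = {}
    for name, units in strata.items():
        n_h = round(n_total * len(units) / N)   # proportional allocation
        sample[name] = rng.sample(units, n_h)   # simple random sample within stratum
    return sample

# Hypothetical veterinary frame: 60 small herds and 40 large herds
herds = {"small": list(range(60)), "large": list(range(60, 100))}
s = stratified_sample(herds, n_total=10)
```

With 60 small and 40 large herds and n_total = 10, proportional allocation draws 6 small and 4 large herds, so each stratum is represented in proportion to its size.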
Probability & Perception: The Representativeness Heuristic in Action
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
Statistical physics of pairwise probability models
Roudi, Yasser; Aurell, Erik; Hertz, John
2009-01-01
(no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...
Probability from a Socio-Cultural Perspective
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Probability of boundary conditions in quantum cosmology
Nambu, Yasusada; Suenobu, Hiroshi
2017-08-01
One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe which can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.
Error probabilities in default Bayesian hypothesis testing
Gu, Xin; Hoijtink, Herbert; Mulder, J,
2016-01-01
This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
Applied probability models with optimization applications
Ross, Sheldon M
1992-01-01
Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.
Contact angle hysteresis on fluoropolymer surfaces.
Tavana, H; Jehnichen, D; Grundke, K; Hair, M L; Neumann, A W
2007-10-31
Contact angle hysteresis of liquids with different molecular and geometrical properties on high quality films of four fluoropolymers was studied. A number of different causes are identified for hysteresis. With n-alkanes as probe liquids, contact angle hysteresis is found to be strongly related to the configuration of polymer chains. The largest hysteresis is obtained with amorphous polymers whereas the smallest hysteresis occurs for polymers with ordered molecular chains. This is explained in terms of sorption of liquid by the solid and penetration of liquid into the polymer film. Correlation of contact angle hysteresis with the size of n-alkane molecules supports this conclusion. On the films of two amorphous fluoropolymers with different molecular configurations, contact angle hysteresis of one and the same liquid with "bulky" molecules is shown to be quite different. On the surfaces of Teflon AF 1600, with stiff molecular chains, the receding angles of the probe liquids are independent of contact time between solid and liquid and similar hysteresis is obtained for all the liquids. Retention of liquid molecules on the solid surface is proposed as the most likely cause of hysteresis in these systems. On the other hand, with EGC-1700 films that consist of flexible chains, the receding angles are strongly time-dependent and the hysteresis is large. Contact angle hysteresis increases even further when liquids with strong dipolar intermolecular forces are used. In this case, major reorganization of EGC-1700 chains due to contact with the test liquids is suggested as the cause. The effect of rate of motion of the three-phase line on the advancing and receding contact angles, and therefore contact angle hysteresis, is investigated. For low viscous liquids, contact angles are independent of the drop front velocity up to approximately 10 mm/min. This agrees with the results of an earlier study that showed that the rate-dependence of the contact angles is an issue only
The probability and severity of decompression sickness.
Howle, Laurens E; Weber, Paul W; Hada, Ethan A; Vann, Richard D; Denoble, Petar J
2017-01-01
Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild, Type I (manifestations 4-6), and serious, Type II (manifestations 1-3). Additionally, we considered an alternative grouping of mild, Type A (manifestations 3-6), and serious, Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant model fit. The predicted probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no-decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed.
Establishment probability in newly founded populations
Gusset Markus
2012-06-01
Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where −ln(1 − P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 − c1·e^(−ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is −ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
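The "Wissel plot" described above is a straight-line fit: since 1 − P0(t) = c1·e^(−ω1·t), plotting y(t) = −ln(1 − P0(t)) against t gives slope ω1 and intercept −ln(c1). The sketch below recovers both constants from synthetic extinction probabilities; the function name and the chosen c1, ω1 values are assumptions for illustration, not the authors' wild-dog model.

```python
import math

def wissel_fit(times, p0):
    """Least-squares fit of y = a + b*t to y(t) = -ln(1 - P0(t)).
    Returns (a, b), where a = -ln(c1) is the intercept and b = omega1 the slope."""
    ys = [-math.log(1.0 - p) for p in p0]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(ys) / n
    b = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys)) / \
        sum((t - tbar) ** 2 for t in times)
    a = ybar - b * tbar
    return a, b

# Synthetic data from P0(t) = 1 - c1*exp(-omega1*t) with c1 = 0.8, omega1 = 0.05
c1, omega1 = 0.8, 0.05
ts = list(range(1, 21))
p0 = [1.0 - c1 * math.exp(-omega1 * t) for t in ts]
a, b = wissel_fit(ts, p0)
```

Here the recovered intercept is −ln(0.8) ≈ 0.223 > 0, so by the abstract's criterion (negative intercept) this synthetic population would not reach the established phase.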
Automatic cobb angle determination from radiographic images
Sardjono, Tri Arief; Wilkinson, Michael H.F.; Veldhuizen, Albert G.; Ooijen, van Peter M.A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.
2013-01-01
Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.
Automatic Cobb Angle Determination From Radiographic Images
Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.
2013-01-01
Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.
Does gallbladder angle affect gallstone formation?
Sanal, Bekir; Korkmaz, Mehmet; Zeren, Sezgin; Can, Fatma; Elmali, Ferhan; Bayhan, Zulfu
2016-01-01
Morphology of the gallbladder varies considerably from person to person. We believe that one of the morphological variations of the gallbladder is the "gallbladder angle", which, to the best of our knowledge, has never been investigated before. The purpose of this study was to investigate the impact of gallbladder angle on gallstone formation. In this study, 1075 abdominal computed tomography (CT) images were retrospectively examined. Patients with completely normal gallbladders were selected. Among these patients, those with both abdominal ultrasound and blood tests were identified in the hospital records and included in the study. Based on the findings of the ultrasound scans, patients were divided into two groups: patients with gallstones and patients without gallstones. Following the measurement of gallbladder angles on the CT images, the groups were statistically evaluated. The gallbladder angle was smaller in patients with gallstones (49 ± 21 degrees vs. 53 ± 19 degrees), and gallbladders with a larger angle had a 1.015 (1/0.985) times lower risk of gallstone formation. However, these differences were not statistically significant (p > 0.05). A more vertically positioned gallbladder does not affect gallstone formation. However, a smaller gallbladder angle may facilitate gallstone formation in patients with risk factors. Gallstones perhaps develop more easily and earlier in gallbladders with a smaller angle.
Automatic cobb angle determination from radiographic images
Sardjono, Tri Arief; Wilkinson, Michael H.F.; Veldhuizen, Albert G.; van Ooijen, Peter M.A.; Purnama, Ketut E.; Verkerke, Gijsbertus Jacob
2013-01-01
Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.
Acute angle closure glaucoma following ileostomy surgery
Mariana Meirelles Lopes
2015-02-01
Full Text Available Angle-closure glaucoma can be induced by drugs that may cause pupillary dilatation. We report a case of a patient who developed bilateral angle-closure glaucoma after ileostomy surgery because of a systemic atropine injection. This case report highlights the importance of prompt ophthalmologic evaluation in diseases with ocular involvement in order to make accurate diagnoses and provide appropriate treatment.
Apparent contact angle and contact angle hysteresis on liquid infused surfaces
Semprebon, Ciro; McHale, Glen; Kusumaatmaja, Halim
We theoretically investigate the apparent contact angle and contact angle hysteresis of a droplet placed on a liquid infused surface. We show that the apparent contact angle is not uniquely defined by material parameters, but also has a strong dependence on the relative size between the droplet and its surrounding wetting ridge formed by the infusing liquid. We derive a closed form expression for the contact angle in the limit of vanishing wetting ridge, and compute the correction for small but finite ridge, which corresponds to an effective line tension term. We also predict contact angle hysteresis on liquid infused surfaces generated by the pinning of the contact lines by the surface corrugations. Our analytical expressions for both the apparent contact angle and contact angle hysteresis can be interpreted as `weighted sums' between the contact angles of the infusing liquid relative to the droplet and surrounding gas phases, where the weighting coefficients are given by ratios of the fluid surface tensions.
Apparent contact angle and contact angle hysteresis on liquid infused surfaces.
Semprebon, Ciro; McHale, Glen; Kusumaatmaja, Halim
2016-12-21
We theoretically investigate the apparent contact angle and contact angle hysteresis of a droplet placed on a liquid infused surface. We show that the apparent contact angle is not uniquely defined by material parameters, but also has a dependence on the relative size between the droplet and its surrounding wetting ridge formed by the infusing liquid. We derive a closed form expression for the contact angle in the limit of vanishing wetting ridge, and compute the correction for small but finite ridge, which corresponds to an effective line tension term. We also predict contact angle hysteresis on liquid infused surfaces generated by the pinning of the contact lines by the surface corrugations. Our analytical expressions for both the apparent contact angle and contact angle hysteresis can be interpreted as 'weighted sums' between the contact angles of the infusing liquid relative to the droplet and surrounding gas phases, where the weighting coefficients are given by ratios of the fluid surface tensions.
Solid angles III. The role of conformers in solid angle calculations
White, D
1995-06-14
The values of the solid angles Omega for a range of commonly encountered ligands in organometallic chemistry (phosphines, phosphites, amines, arsines and cyclopentadienyl rings) have been determined. The solid angles were derived from a single...
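The abstract is truncated before the computational details, but the standard closed form for the solid angle subtended by a right circular cone (a common first approximation to a ligand's steric profile) is Omega = 2*pi*(1 - cos(theta)), with theta the cone half-angle. A minimal sketch:

```python
import math

def cone_solid_angle(half_angle_deg):
    """Solid angle (steradians) subtended by a right circular cone
    of the given half-angle: Omega = 2*pi*(1 - cos(theta))."""
    return 2.0 * math.pi * (1.0 - math.cos(math.radians(half_angle_deg)))

# A hemisphere corresponds to a 90 degree half-angle: 2*pi steradians
print(cone_solid_angle(90.0))  # -> 6.283185307179586
```

Real ligand solid angles account for the full atomic envelope rather than a single cone, so this relation is only the simplest starting point.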
Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods
Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.
2012-01-01
Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…
Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa
2011-01-01
This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
Adding a visual linear scale probability to the PIOPED probability of pulmonary embolism.
Christiansen, F; Nilsson, T; Måre, K; Carlsson, A
1997-05-01
Reporting a lung scintigraphy diagnosis as a PIOPED categorical probability of pulmonary embolism offers the clinician a wide range of interpretation. Therefore the purpose of this study was to analyze the impact on lung scintigraphy reporting of adding a visual linear scale (VLS) probability assessment to the ordinary PIOPED categorical probability. The study material was a re-evaluation of lung scintigrams from a prospective study of 170 patients. All patients had been examined by lung scintigraphy and pulmonary angiography. The scintigrams were re-evaluated by 3 raters, and the probability of pulmonary embolism was estimated by the PIOPED categorization and by a VLS probability. The test was repeated after 6 months. There was no significant difference (p > 0.05) in the area under the ROC curve between the PIOPED categorization and the VLS for any of the 3 raters. Analysis of agreement among raters and for repeatability demonstrated low agreement in the mid-range of probabilities. A VLS probability estimate did not significantly improve the overall accuracy of the diagnosis compared to the categorical PIOPED probability assessment alone. From the data of our present study we cannot recommend the addition of a VLS score to the PIOPED categorization.
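The study's ROC comparison can be illustrated with a minimal area-under-the-curve computation from rater probability scores. The rank-based formula below is the standard Mann-Whitney estimator of AUC; the scores themselves are invented for demonstration and are not from the study:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case (embolism present)
    receives a higher score than a negative one, ties counted as 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented VLS probability estimates (0-1) for illustration
with_pe = [0.9, 0.7, 0.6, 0.8]     # angiography-positive patients
without_pe = [0.2, 0.4, 0.6, 0.1]  # angiography-negative patients
print(roc_auc(with_pe, without_pe))  # -> 0.96875
```

Comparing two reporting schemes, as in the study, amounts to computing this area for each scheme's scores over the same patients and testing whether the difference is significant.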
Reliable measurement of the receding contact angle.
Korhonen, Juuso T; Huhtamäki, Tommi; Ikkala, Olli; Ras, Robin H A
2013-03-26
Surface wettability is usually evaluated by the contact angle between the perimeter of a water drop and the surface. However, this single measurement is not enough for proper characterization, and the so-called advancing and receding contact angles also need to be measured. Measuring the receding contact angle can be challenging, especially for extremely hydrophobic surfaces. We demonstrate a reliable procedure by using the common needle-in-the-sessile-drop method. Generally, the contact line movement needs to be followed, and true receding movement has to be distinguished from "pseudo-movement" occurring before the receding angle is reached. Depending on the contact angle hysteresis, the initial size of the drop may need to be surprisingly large to achieve a reliable result. Although our motivation for this work was the characterization of superhydrophobic surfaces, we also show that this method works universally, for surfaces ranging from hydrophilic to superhydrophobic.
Development of Tibiofemoral Angle in Korean Children
Yoo, Jae Ho; Cho, Tae-Joon; Chung, Chin Youb; Yoo, Won Joon
2008-01-01
This study was performed to identify the chronological changes of the knee angle, or the tibiofemoral angle, in normal healthy Korean children. Full-length anteroposterior standing radiographs of 818 limbs of 452 Korean children were analyzed. The overall patterns of the chronological changes in the knee angle were similar to those described previously in western or Asian children, but the knee angle development was delayed, i.e., genu varum before 1 yr, neutral at 1.5 yr, increasing genu valgum with a maximum value of 7.8° at 4 yr, followed by a gradual decrease to the adult level of approximately 5-6° of genu valgum at 7 to 8 yr of age. These normative data on chronological changes of knee angles should be taken into consideration when evaluating lower limb alignment in children. PMID:18756063
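The developmental trajectory reported in the abstract can be sketched as a piecewise-linear interpolation between the stated landmarks. Only the listed ages and the 7.8° maximum come from the text; the varus magnitude before 1 yr and the exact adult plateau are assumptions for illustration:

```python
def expected_knee_angle(age_yr):
    """Piecewise-linear interpolation of the tibiofemoral angle
    landmarks from the abstract (positive = valgus, negative = varus).
    Values not stated in the text are marked as assumed."""
    landmarks = [(0.5, -5.0),  # genu varum before 1 yr (assumed magnitude)
                 (1.5, 0.0),   # neutral at 1.5 yr
                 (4.0, 7.8),   # maximum valgus of 7.8 deg at 4 yr
                 (7.5, 5.5)]   # ~5-6 deg adult-level valgus at 7-8 yr
    if age_yr <= landmarks[0][0]:
        return landmarks[0][1]
    if age_yr >= landmarks[-1][0]:
        return landmarks[-1][1]
    for (a0, v0), (a1, v1) in zip(landmarks, landmarks[1:]):
        if a0 <= age_yr <= a1:
            return v0 + (v1 - v0) * (age_yr - a0) / (a1 - a0)

print(expected_knee_angle(1.5))  # -> 0.0
```

A clinical reference would of course use the full normative curves with percentile bands rather than straight-line segments between landmarks.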