Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to…
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications…
Generating target probability sequences and events
Ella, Vaignana Spoorthy
2013-01-01
Cryptography and simulation of systems require that events of pre-defined probability be generated. This paper presents methods to generate target probability events based on the oblivious transfer protocol and target probabilistic sequences using probability distribution functions.
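The abstract's core idea — producing events and sequences of pre-defined probability from a uniform random source via probability distribution functions — can be sketched with plain inverse-CDF sampling. This is a generic illustration, not the paper's oblivious-transfer construction, and all names and parameters below are hypothetical:

```python
import bisect
import itertools
import random

def event(p, rng):
    """Fire an event with target probability p via inverse-CDF on U(0,1)."""
    return rng.random() < p

def sample_sequence(pdf, length, rng):
    """Generate a sequence whose symbols follow a given discrete
    probability distribution, by inverting its cumulative distribution."""
    symbols, probs = zip(*pdf.items())
    cdf = list(itertools.accumulate(probs))   # e.g. [0.5, 0.8, 1.0]
    return [symbols[bisect.bisect(cdf, rng.random())] for _ in range(length)]

rng = random.Random(0)
seq = sample_sequence({"a": 0.5, "b": 0.3, "c": 0.2}, 10_000, rng)
```

With many draws the empirical symbol frequencies land near the target probabilities.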
Generating pseudo-random discrete probability distributions
Energy Technology Data Exchange (ETDEWEB)
Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica
2015-08-15
The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, …, p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
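The normalization approach mentioned above has a particularly simple unbiased variant: normalizing i.i.d. exponential variates yields a probability vector distributed uniformly on the simplex (a flat Dirichlet sample). A minimal sketch, assuming the goal is an unbiased p = (p_1, …, p_d); details of the paper's own methods may differ:

```python
import math
import random

def random_prob_vector(d, rng=random):
    """Unbiased probability vector p = (p_1, ..., p_d): normalize d i.i.d.
    Exp(1) variates, giving a uniform (flat-Dirichlet) point on the simplex."""
    e = [-math.log(1.0 - rng.random()) for _ in range(d)]  # Exp(1) via inverse CDF
    s = sum(e)
    return [x / s for x in e]

p = random_prob_vector(5, random.Random(42))
```

Naive normalization of i.i.d. uniforms would *not* be uniform on the simplex; the exponential variates are what make the sample unbiased.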
Estimating probable flaw distributions in PWR steam generator tubes
Energy Technology Data Exchange (ETDEWEB)
Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)
1997-02-01
This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.
Sharp Bounds by Probability-Generating Functions and Variable Drift
DEFF Research Database (Denmark)
Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten
2011-01-01
We introduce two powerful techniques to the runtime analysis of evolutionary algorithms: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results by Doerr et al…
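The probability-generating-function technique rests on the identity G'(1) = E[T]: for a runtime T that is a sum of independent geometric phases, the PGF is the product of per-phase PGFs, and its derivative at 1 recovers the expected runtime. A toy numerical check with hypothetical per-phase success probabilities (not the paper's analysis):

```python
def geom_pgf(p, s):
    """PGF of a geometric waiting time on {1, 2, ...} with success prob p."""
    return p * s / (1.0 - (1.0 - p) * s)

def runtime_pgf(ps, s):
    """PGF of a sum of independent geometric phases = product of phase PGFs."""
    out = 1.0
    for p in ps:
        out *= geom_pgf(p, s)
    return out

ps = [i / 10.0 for i in range(1, 10)]      # hypothetical per-phase success probs
h = 1e-6                                   # step for a central difference
mean_numeric = (runtime_pgf(ps, 1 + h) - runtime_pgf(ps, 1 - h)) / (2 * h)
mean_exact = sum(1.0 / p for p in ps)      # E[T] = G'(1) = sum of 1/p_i
```

The numeric derivative of the product PGF matches the closed-form mean, which is the kind of bookkeeping the PGF framework automates for sharper bounds.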
Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.
Directory of Open Access Journals (Sweden)
Michael Dougherty
2011-06-01
We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in subadditivity in later probability judgments made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.
EP 1000 steam generator tube rupture analyses
International Nuclear Information System (INIS)
European electrical utility organizations together with Westinghouse and Ansaldo are participating in a program to utilize the Westinghouse passive nuclear plant technology to develop a plant which meets the European Utility Requirements (EUR) and is expected to be licensable in Europe. The program was initiated in 1994 and the plant is designated EP1000. The EP1000 design is notable for simplicity that comes from a reliance on passive safety systems to enhance plant safety. The use of passive safety systems has provided significant and measurable improvements in plant simplification, safety, reliability, investment protection and plant costs. These systems use only natural forces such as gravity, natural circulation, and compressed gas to provide the driving forces for the systems to adequately cool the reactor core following an initiating event. The EP1000 builds on the Westinghouse passive nuclear plant technology to enhance plant safety and meet European Utility Requirements and specific European National Safety Criteria. This paper summarizes the main results of the Steam Generator Tube Rupture (SGTR) analysis activity, performed in Phase 2B of the European Passive Plant Program. The purpose of the study is to provide evidence that the passive safety system performance provides a significant improvement in terms of safety, providing significant margins to steam generator overfilling and reducing the need for operator actions. The behavior of the EP1000 plant following SGTR accidents has been analyzed by means of the RELAP5/Mod3.2 code. Sensitivity cases were performed to address the impact of varying the number of steam generator tubes that rupture, and the potential adverse interactions that could result from operation of control systems (i.e., Chemical and Volume Control System, Startup Feedwater). Analyses have also been performed to define and verify improved protection system logic to avoid possible steam generator safety valve challenges both in the…
On singular probability densities generated by extremal dynamics
Garcia, Guilherme J. M.; Dickman, Ronald
2003-01-01
Extremal dynamics is the mechanism that drives the Bak-Sneppen model into a (self-organized) critical state, marked by a singular stationary probability density p(x). With the aim of understanding this phenomenon, we study the BS model and several variants via mean-field theory and simulation. In all cases, we find that p(x) is singular at one or more points, as a consequence of extremal dynamics. Furthermore we show that the extremal barrier x_i always belongs to the 'prohibited' inter…
Demand and choice probability generating functions for perturbed consumers
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2011-01-01
This paper considers demand systems for utility-maximizing consumers equipped with additive linearly perturbed utility of the form U(x) + m⋅x and faced with general budget constraints x ∈ B. Given compact budget sets, the paper provides necessary as well as sufficient conditions for a demand generating function to be consistent with utility maximization. Within a budget, the convex hull of the demand correspondence is the subdifferential of the demand generating function. The additive random utility discrete choice model (ARUM) is a special case with finite budget sets where utility is… The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing CPGF for applications. The results for ARUM are extended to competing risk survival models.
BHASKAR, Mithun; BENARJI, Mohan; MAHESWARAPU, Sydulu
2012-01-01
This paper presents a novel and superior Genetic Algorithm (GA) based solver for the Optimal Power Flow (OPF) problem. The main contrast to other Genetic Algorithm based approaches is that a novel expert-based initial generation of the population and an adaptive probability approach (variable crossover probability and mutation probability) are adopted in the selection of offspring, together with the roulette wheel technique, which reduces the computation time and increases the solution quality considerably. Select…
Estimation of Mutation Rates from Fluctuation Experiments via Probability Generating Functions
Montgomery-Smith, Stephen; Le, Anh; Smith, George; Billstein, Sidney; Oveys, Hesam; Pisechko, Dylan; Yates, Austin
2016-01-01
This paper calculates probability distributions modeling the Luria-Delbrück experiment. We show that by thinking purely in terms of generating functions, and using a 'backwards in time' paradigm, formulas describing various situations can be easily obtained. This includes a generating function for Haldane's probability distribution due to Ycart. We apply our formulas to both simulated and real data created by looking at yeast cells acquiring an immunization to the antibiotic canavanine…
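The 'jackpot' behaviour that makes the Luria-Delbrück distribution heavy-tailed can be reproduced with a toy forward simulation — not the paper's backwards-in-time generating-function method. The mutation rate, generation count, and the Poisson approximation to per-generation mutations are all illustrative choices:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def luria_delbruck(cultures, generations, mu, rng):
    """Final mutant counts in parallel cultures: wild-type cells double each
    generation, each daughter mutates with probability mu (Poisson approx.),
    and existing mutants double too -- the source of rare 'jackpot' cultures."""
    counts = []
    for _ in range(cultures):
        wt, mut = 1, 0
        for _ in range(generations):
            new = poisson(2 * wt * mu, rng)   # mutations among 2*wt daughters
            mut = 2 * mut + new
            wt = 2 * wt - new
        counts.append(mut)
    return counts

counts = luria_delbruck(500, 14, 2e-5, rng=random.Random(1))
```

Mutations arising early are doubled many times, so a few cultures end with far more mutants than the typical one — the fluctuation the generating-function formulas describe exactly.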
Oil spill risk assessment : probability and impact analyses with future projections
International Nuclear Information System (INIS)
This paper described a risk assessment methodology for oil spills in Washington State. The methodology involved analyzing spill probability by source, oil type, spill volume, season, and geographic zone. The method was used to develop probability distributions for actual spill volumes and probabilistic-based potential spill volumes for current and future risk assessments. The potential impact of spills based on geographic location, season, oil type and oil volume was assessed for inland freshwater and marine locations. Impacts were quantified and applied to spill distributions in order to determine risk scores. Risk quotients for each sector were divided by the grand total of risk quotients in order to derive a percentage or proportion value of risk. The risk scores were then used to develop relative risk matrices by source type and geographic zone for actual spill volumes. A customized spill database was developed to incorporate records of oil spill incidents in Washington waters of at least 50 gallons. Potential worst-case discharge (WCD) volumes were calculated for each spill. It was concluded that the methodology provides a state-of-the-art approach to evaluate the impact of a spill. 7 refs., 16 tabs., 3 figs
Mandal, K. G.; Padhi, J.; Kumar, A.; Ghosh, S.; Panda, D. K.; Mohanty, R. K.; Raychaudhuri, M.
2015-08-01
Rainfed agriculture plays and will continue to play a dominant role in providing food and livelihoods for an increasing world population. Rainfall analyses are helpful for proper crop planning under a changing environment in any region. Therefore, in this paper, an attempt has been made to analyse 16 years of rainfall (1995-2010) at the Daspalla region in Odisha, eastern India for prediction using six probability distribution functions, forecasting the probable date of onset and withdrawal of monsoon, occurrence of dry spells by using Markov chain model and finally crop planning for the region. For prediction of monsoon and post-monsoon rainfall, log Pearson type III and Gumbel distribution were the best-fit probability distribution functions. The earliest and most delayed week of the onset of rainy season was the 20th standard meteorological week (SMW) (14th-20th May) and 25th SMW (18th-24th June), respectively. Similarly, the earliest and most delayed week of withdrawal of rainfall was the 39th SMW (24th-30th September) and 47th SMW (19th-25th November), respectively. The longest and shortest length of rainy season was 26 and 17 weeks, respectively. The chances of occurrence of dry spells are high from the 1st-22nd SMW and again the 42nd SMW to the end of the year. The probability of weeks (23rd-40th SMW) remaining wet varies between 62 and 100% for the region. Results obtained through this analysis would be utilised for agricultural planning and mitigation of dry spells at the Daspalla region in Odisha, India.
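The two-state (dry/wet) Markov chain underlying a dry-spell analysis of this kind reduces to two short formulas: the stationary probability of a dry week and the persistence probability of a dry spell. A sketch with hypothetical transition probabilities (the paper's fitted values are not reproduced here):

```python
def stationary_dry_prob(p_dd, p_wd):
    """Long-run probability of a dry week for a two-state (dry/wet) Markov
    chain with P(dry->dry) = p_dd and P(wet->dry) = p_wd."""
    p_dw = 1.0 - p_dd                 # dry -> wet
    return p_wd / (p_wd + p_dw)

def dry_spell_at_least(k, p_dd):
    """A dry spell that has begun persists for >= k weeks iff the chain
    stays dry for the next k-1 transitions."""
    return p_dd ** (k - 1)

# Hypothetical transition probabilities for an off-monsoon week:
persistence = dry_spell_at_least(3, 0.7)   # P(a dry spell lasts 3+ weeks)
```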
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
A Handoff Technique to Reduce False-Handoff Probability in Next Generation Wireless Networks
Debabrata Sarddar; Kaushik Mandal; Tapas Jana; Utpal Biswas; M.K. Naskar
2010-01-01
Next Generation Wireless Systems (NGWS) include co-existence of current wireless technologies such as WLANs, WiMAX, General Packet Radio Service (GPRS) and Universal Mobile Telecommunications System (UMTS). The most important and challenging issue is seamless handoff management in NGWS to ensure the Quality of Service (QoS). In this article, we propose a GPS based handoff technique to improve handoff probability in NGWS. Using GPS we determine the direction of velocity of the MT (Mobile Termin…
The development of weak alpha source for the application to industrial probability generator
International Nuclear Information System (INIS)
We have been developing a random number generator and probability generator that utilize the natural randomness of radioactive alpha decay. The time intervals between electric pulse signals induced in a semiconductor detector are converted to physical random numbers by measuring each interval in clock-pulse counts. Since the time interval follows an exponential distribution ρ(t) depending on the count rate of the original electric pulse signals, the probability P(Δt) is easily defined by P(Δt) = ∫_{t1}^{t2} ρ(t) dt / ∫_0^∞ ρ(t) dt. The sealed 100 Bq 244Cm source pieces were supplied by AEA Technology. We selected a PIN-diode detector as the alpha detector. The source pieces were subjected to a thermal stress test simulating the bonding process used to mount the source into a small probability-generator chip. We confirmed by SEM observation that the sealed surface was unchanged after the thermal stress. We next plan to replace 244Cm with the natural radioisotope 210Pb. As a first step of this attempt, methods for collecting 210Pb from several natural resources, such as aerosol collected in HEPA filters, ground water, and natural radioactive minerals, were surveyed. (author)
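For exponential inter-pulse times ρ(t) = λe^(-λt), the probability defined above evaluates in closed form to P(Δt) = e^(-λt1) - e^(-λt2), since the normalizing integral is 1. A sketch with an illustrative rate (loosely echoing the 100 Bq source; the interval endpoints are arbitrary), cross-checked by simulation:

```python
import math
import random

def interval_prob(rate, t1, t2):
    """P(t1 <= Δt <= t2) for exponential inter-pulse times
    ρ(t) = λ e^(-λt): the integral is e^(-λ t1) - e^(-λ t2)."""
    return math.exp(-rate * t1) - math.exp(-rate * t2)

rng = random.Random(0)
rate = 100.0                     # illustrative 100 pulses per second
t1, t2 = 0.005, 0.015            # seconds; arbitrary thresholds
analytic = interval_prob(rate, t1, t2)
samples = [rng.expovariate(rate) for _ in range(200_000)]
empirical = sum(t1 <= s <= t2 for s in samples) / len(samples)
```

Choosing t1 and t2 tunes the output probability, which is how a desired P(Δt) would be dialed in.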
Fortran code for generating random probability vectors, unitaries, and quantum states
Directory of Open Access Journals (Sweden)
Jonas Maziero
2016-03-01
The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since, and several ongoing projects targeting its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
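One of the generation tasks the article lists — random quantum state vectors — has a standard one-liner: normalize a vector of i.i.d. complex Gaussians to obtain a Haar-distributed pure state. A sketch in Python rather than Fortran (the normalization route is standard, but treat the details as illustrative of, not identical to, the article's code):

```python
import math
import random

def random_pure_state(d, rng=random):
    """Haar-distributed pure state: normalize a vector of i.i.d. complex
    standard Gaussians (real and imaginary parts both N(0, 1))."""
    v = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(d)]
    norm = math.sqrt(sum(abs(z) ** 2 for z in v))
    return [z / norm for z in v]

psi = random_pure_state(4, random.Random(7))
probs = [abs(z) ** 2 for z in psi]    # Born-rule outcome probabilities
```

The squared amplitudes of such a state form a random probability vector, which is one way the two generation problems are connected.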
Minimization of Handoff Failure Probability for Next-Generation Wireless Systems
Directory of Open Access Journals (Sweden)
Debabrata Sarddar
2010-06-01
During the past few years, advances in mobile communication theory have enabled the development and deployment of different wireless technologies, complementary to each other. Hence, their integration can realize a unified wireless system that has the best features of the individual networks. Next-Generation Wireless Systems (NGWS) integrate different wireless systems, each of which is optimized for some specific services and coverage area, to provide ubiquitous communications to the mobile users. In this paper, we propose to enhance the handoff performance of mobile IP in wireless IP networks by reducing the false handoff probability in the NGWS handoff management protocol. Based on the information of false handoff probability, we analyze its effect on mobile speed and handoff signaling delay.
Fortran code for generating random probability vectors, unitaries, and quantum states
Maziero, Jonas
2015-01-01
The usefulness of generating random configurations is recognized in a variety of contexts, as for instance in the simulation of physical systems, in the verification of bounds and/or ansatz solutions for optimization problems, and in secure communications. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and the several ongoing projects targeting its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
On the probability of exceeding allowable leak rates through degraded steam generator tubes
Energy Technology Data Exchange (ETDEWEB)
Cizelj, L.; Sorsek, I. [Jozef Stefan Institute, Ljubljana (Slovenia); Riesch-Oppermann, H. [Forschungszentrum Karlsruhe (Germany)
1997-02-01
This paper discusses some possible ways of predicting the behavior of the total leak rate through damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at the tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: estimate the probability that the sum of all individual leak rates through degraded tubes exceeds a predefined acceptable value. The probabilistic approach aims at a reliable and computationally tractable estimate of the failure probability. A closed-form solution is given for a special case of exponentially distributed individual leak rates. Some possibilities for the use of computationally efficient First and Second Order Reliability Methods (FORM and SORM) are also discussed. The first numerical example compares the results of the approximate methods with the closed-form results; SORM in particular shows acceptable agreement. The second numerical example considers a realistic case: the Krsko NPP in Slovenia.
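One natural reading of the exponential special case: for i.i.d. exponential individual leak rates, the total is Erlang distributed, so the exceedance probability is a finite sum. A sketch with hypothetical tube counts and leak parameters (not the paper's numbers), cross-checked by Monte Carlo:

```python
import math
import random

def prob_total_exceeds(n, lam, limit):
    """P(S > limit) for S = sum of n i.i.d. Exp(lam) leak rates: S is
    Erlang(n, lam), so P(S > L) = e^(-lam*L) * sum_{k<n} (lam*L)^k / k!."""
    x = lam * limit
    return math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(n))

# Hypothetical numbers: 20 leaking tubes, mean individual leak 1/lam = 0.5,
# allowable total 15 (arbitrary units); Monte Carlo cross-check.
n, lam, limit = 20, 2.0, 15.0
tail = prob_total_exceeds(n, lam, limit)
rng = random.Random(3)
trials = 50_000
hits = sum(sum(rng.expovariate(lam) for _ in range(n)) > limit
           for _ in range(trials))
```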
A Handoff Technique to Reduce False-Handoff Probability in Next Generation Wireless Networks
Directory of Open Access Journals (Sweden)
Debabrata Sarddar
2010-05-01
Next Generation Wireless Systems (NGWS) include co-existence of current wireless technologies such as WLANs, WiMAX, General Packet Radio Service (GPRS) and the Universal Mobile Telecommunications System (UMTS). The most important and challenging issue is seamless handoff management in NGWS to ensure the Quality of Service (QoS). In this article, we propose a GPS-based handoff technique to improve handoff probability in NGWS. Using GPS, we determine the direction of the velocity of the MT (Mobile Terminal). Earlier works proposed handoff based on the Relative Signal Strength (RSS) of the BS (base station). But when an MT is moving towards an NBS whose RSS is less than that of its adjacent NBS, the handover decision is difficult to make; using GPS we can ensure an efficient handoff. The results show that the proposed approach solves this problem and helps to take the right decision, decreasing the false handoff initiation probability.
Generating function for particle-number probability distribution in directed percolation
International Nuclear Information System (INIS)
We derive a generic expression for the generating function (GF) of the particle-number probability distribution (PNPD) for a simple reaction diffusion model that belongs to the directed percolation universality class. Starting with a single particle on a lattice, we show that the GF of the PNPD can be written as an infinite series of cumulants taken at zero momentum. This series can be summed up into a complete form at the level of a mean-field approximation. Using the renormalization group techniques, we determine logarithmic corrections for the GF at the upper critical dimension. We also find the critical scaling form for the PNPD and check its universality numerically in one dimension. The critical scaling function is found to be universal up to two non-universal metric factors
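The central relationship here — that the particle-number probabilities are encoded as coefficients of the generating function — can be illustrated generically: evaluating a GF on roots of unity and inverting the DFT recovers the probabilities. The sketch below uses a Poisson GF, whose coefficients are known exactly; the directed-percolation GF itself is not reproduced here:

```python
import cmath
import math

def pnpd_from_gf(gf, nmax, N=256):
    """Recover p_0..p_nmax from a probability generating function G(z) by an
    inverse DFT over the N-th roots of unity:
        p_n ≈ (1/N) Σ_j G(ω^j) ω^(-jn),  ω = e^(2πi/N),
    exact up to aliasing of the distribution's tail beyond N."""
    vals = [gf(cmath.exp(2j * math.pi * j / N)) for j in range(N)]
    return [
        sum(v * cmath.exp(-2j * math.pi * j * n / N)
            for j, v in enumerate(vals)).real / N
        for n in range(nmax + 1)
    ]

# Sanity check on a GF with known coefficients: Poisson(2), G(z) = e^(2(z-1)),
# whose probabilities are p_n = e^(-2) 2^n / n!.
probs = pnpd_from_gf(lambda z: cmath.exp(2 * (z - 1)), 10)
```

The same extraction would apply to any GF one can evaluate numerically, such as a mean-field approximation.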
Power generation in India: analysing trends and outlook
International Nuclear Information System (INIS)
The objective of this report is to provide up-to-date data, critical analysis and information encompassing all aspects of power generation in India. The report provides a historical overview and future outlook for power generation in India. It also provides an evaluation of private participation in India's power generation segment and of investment opportunities in the Indian power sector. In addition, the report examines policies, regulatory framework and financing of power generation in India. It also highlights key issues and challenges that are restricting the accelerated development of this sector. The report has thirteen chapters in total. (author)
The probability of generating the symmetric group with a commutator condition
Birulia, Raman
2012-01-01
Let B(n) be the set of pairs of permutations from the symmetric group of degree n with a 3-cycle commutator, and let A(n) be the set of those pairs which generate the symmetric or the alternating group of degree n. We find effective formulas for calculating the cardinalities of both sets. More precisely, we show that #B(n)/n! is a discrete convolution of the partition function and a linear combination of divisor functions, while #A(n)/n! is the product of a polynomial and Jordan's totient function. In particular, it follows that the probability that a pair of random permutations with a 3-cycle commutator generates the symmetric or the alternating group of degree n tends to zero as n tends to infinity, which makes a contrast with Dixon's classical result. Key elements of our proofs are Jordan's theorem from the 19th century, a formula by Ramanujan from the 20th century and a technique of square-tiled surfaces developed by French mathematicians Lelievre and Royer in the beginning of the 21st century. This paper...
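The contrast with Dixon's theorem — that unrestricted random pairs do generate the symmetric or alternating group with high probability — can be checked exhaustively for small n by closing a pair of permutations under composition. A purely illustrative sketch for n = 4, without the 3-cycle commutator restriction:

```python
import math
from itertools import permutations

def compose(a, b):
    """Compose permutations given as tuples: (a ∘ b)(i) = a[b[i]]."""
    return tuple(a[i] for i in b)

def generated_group(gens):
    """Closure of the generators under composition; in a finite group the
    generated sub-monoid equals the generated subgroup."""
    group = {tuple(range(len(gens[0])))}
    frontier = list(group)
    while frontier:
        new = []
        for g in frontier:
            for h in gens:
                p = compose(g, h)
                if p not in group:
                    group.add(p)
                    new.append(p)
        frontier = new
    return group

n = 4
perms = list(permutations(range(n)))
# A pair generates A4 or S4 iff its closure has order >= |A4| = n!/2
# (the only subgroups of S4 of order >= 12 are A4 and S4 itself).
big = sum(len(generated_group([a, b])) >= math.factorial(n) // 2
          for a in perms for b in perms)
prob = big / len(perms) ** 2
```

Restricting the count to pairs whose commutator is a 3-cycle would, per the paper, drive this fraction to zero as n grows.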
ANALYSING SOLAR-WIND HYBRID POWER GENERATING SYSTEM
Mustafa ENGİN; Metin ÇOLAK
2005-01-01
In this paper, a solar-wind hybrid power generating system that will be used for security lighting was designed. The hybrid system was installed, and the performance of the solar cells, wind turbine, battery bank, charge regulators and inverter was measured throughout the whole year. Using the measured values, overall system efficiency, reliability and the cost per kWh of the demanded energy were calculated, and the percentage of generated energy by resource was defined. We also include in the paper a discussi…
International Nuclear Information System (INIS)
The evolution of the scalar probability density function (pdf), the conditional scalar dissipation rate, and other statistics including transport properties are studied for passive temperature fluctuations in decaying grid-generated turbulence. The effect of filtering and differentiating the time series is also investigated. For a nonzero mean temperature gradient it is shown that the pdf of the temperature fluctuations has pronounced exponential tails for turbulence Reynolds number (Re_l) greater than 70, but below this value the pdf is close to Gaussian. The scalar dissipation rate, conditioned on the fluctuations, shows that there is a high expectation of dissipation in the presence of the large, rare fluctuations that produce the exponential tails. Significant positive correlation between the mean square scalar fluctuations and the instantaneous scalar dissipation rate is found when exponential tails occur. The case of temperature fluctuations in the absence of a mean gradient is also studied. Here, the results are less definite because the generation of the fluctuations (by means of fine heated wires) causes an asymmetry in the pdf. The results show, however, that the pdf is close to Gaussian and that the correlation between the mean square temperature fluctuations and the instantaneous scalar dissipation rate is very weak. For the linear profile case, measurements over the range 60 ≤ Re_l ≤ 1100 show that the dimensionless heat flux Nu is proportional to Re_l^0.88 and that the transition from a Gaussian pdf to one with exponential tails occurs at Nu ≈ 31, a value close to transitions observed in other recent mixing experiments conducted in entirely different turbulent flows.
Results of Analyses of the Next Generation Solvent for Parsons
International Nuclear Information System (INIS)
Savannah River National Laboratory (SRNL) prepared a nominal 150 gallon batch of Next Generation Solvent (NGS) for Parsons. This material was then analyzed and tested for cesium mass transfer efficiency. The bulk of the results indicate that the solvent is qualified as acceptable for use in the upcoming pilot-scale testing at Parsons Technology Center. This report describes the analysis and testing of a batch of Next Generation Solvent (NGS) prepared in support of pilot-scale testing in the Parsons Technology Center. A total of ∼150 gallons of NGS solvent was prepared in late November of 2011. Details for the work are contained in a controlled laboratory notebook. Analysis of the Parsons NGS solvent indicates that the material is acceptable for use. SRNL is continuing to improve the analytical method for the guanidine.
Automatic generation of alignments for 3D QSAR analyses.
Jewell, N E; Turner, D B; Willett, P; Sexton, G J
2001-01-01
Many 3D QSAR methods require the alignment of the molecules in a dataset, which can require a fair amount of manual effort in deciding upon a rational basis for the superposition. This paper describes the use of FBSS, a program for field-based similarity searching in chemical databases, for generating such alignments automatically. The CoMFA and CoMSIA experiments with several literature datasets show that the QSAR models resulting from the FBSS alignments are broadly comparable in predictive performance with the models resulting from manual alignments. PMID:11774998
Automatic generation of alignments for 3D QSAR analyses
Jewell, N.E.; D.B. Turner; Willett, P.; Sexton, G.J.
2001-01-01
Many 3D QSAR methods require the alignment of the molecules in a dataset, which can require a fair amount of manual effort in deciding upon a rational basis for the superposition. This paper describes the use of FBSS, a program for field-based similarity searching in chemical databases, for generating such alignments automatically. The CoMFA and CoMSIA experiments with several literature datasets show that the QSAR models resulting from the FBSS alignments are broadly comparable in predictive…
Inverse analyses of laser generated dispersive surface waves
International Nuclear Information System (INIS)
This paper presents results on the inverse analysis of laser generated surface waves in an epoxy-bonded layered specimen. Laser ultrasonic experiments were performed and the acquired surface wave signals were processed in the frequency domain to obtain the dispersion relation of the phase velocity in an epoxy-bonded layered specimen. A computer program for calculating the phase velocity dispersion of general isotropic and/or anisotropic layered media was utilized to explore the influence of the epoxy-bonded layer. Inversions of the bonding layer thickness and the elastic wave velocities of the epoxy layer were investigated. The current results show that the thickness and the elastic wave velocities of the bonding layer can be successfully determined. Simultaneous determination of the thickness and the elastic properties of the bonding layer is currently under investigation by the authors.
Next generation sequencing and comparative analyses of Xenopus mitogenomes
Directory of Open Access Journals (Sweden)
Lloyd Rhiannon E
2012-09-01
Protein-coding genes were shown to be under strong negative (purifying) selection, with the genes under the strongest pressure (Complex 4) also being the most highly expressed, highlighting their potentially crucial functions in the mitochondrial respiratory chain. Conclusions: Next generation sequencing of long-PCR amplicons using single-taxon or multi-taxon approaches enabled two new species of Xenopus mtDNA to be fully characterized. We anticipate our complete mitochondrial genome amplification methods to be applicable to other amphibians, and helpful for identifying the most appropriate markers for differentiating species and populations and resolving phylogenies, a pressing need since amphibians are undergoing drastic global decline. Our mtDNAs also provide templates for conserved primer design and the assembly of RNA and DNA reads following high-throughput “omic” techniques such as RNA- and ChIP-seq. These could help us better understand how processes such as mitochondrial replication and gene expression influence Xenopus growth and development, as well as how they evolved and are regulated.
International Nuclear Information System (INIS)
Haploid cells of Saccharomyces cerevisiae were treated with different DNA-damaging agents at various doses. A study of the progeny of individual such cells allowed the assignment of lethal events to distinct post-treatment generations. By microscopically inspecting those cells which were not able to form visible colonies, the authors could discriminate between cells dying from immediately effective lethal hits and those generating microcolonies, probably as a consequence of lethal mutation(s). The experimentally obtained numbers of lethal events were mathematically transformed into mean probabilities of lethal fixations taking place in cells of certain post-treatment generations. Such analyses give detailed insight into the kinetics of lethality as a consequence of different kinds of DNA damage. For example, X-irradiated cells lost viability mainly by lethal hits; only at a higher dose did lethal mutations also occur, fixed in the cells that were in direct contact with the mutagen but not in later generations. Ethyl methanesulfonate (EMS)-treated cells were hit by 00-fixations in a dose-dependent manner. The distribution of all sorts of lethal fixations taken together, which occurred in the EMS-damaged cell families, was not random. For comparison, analyses of cells treated with methyl methanesulfonate, N-methyl-N'-nitro-N-nitrosoguanidine and nitrous acid are also reported.
Ivana, Pavol; Vlcek, Ivan; Ivanova, Marika
2016-01-01
Maxwell's dynamic equation of Faraday's law erroneously predicts that, in a brushless homopolar generator, the relative movement of the wire is equivalent to the relative motion of the conductor in Faraday's homopolar generator, and that an electric intensity must therefore be generated in both devices. Research has shown that it is possible to construct an experimental brushless homopolar generator, which proves that movement of an electrically neutral conductor along the radials of a homogeneous magnetic field does n...
DEFF Research Database (Denmark)
Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo
2015-01-01
extracted with stochastic route generation. The term is easily applicable to large-scale networks and various environments, given its dependence only on a random number generator and the Dijkstra shortest path algorithm. The implementation for revealed preference data, which consist of actual route choices...... is the dependency of the parameter estimates on the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...
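The generation step described above can be sketched as follows. This is a hypothetical toy implementation: link costs are perturbed by uniform random factors before each Dijkstra run, and the empirical sampling frequency of each generated route stands in for the sampling probability that the paper's correction term addresses.

```python
import heapq
import random

def dijkstra(graph, source, target):
    """Shortest path by link cost; graph: {node: {neighbour: cost}}."""
    dist, prev = {source: 0.0}, {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, c in graph[u].items():
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return tuple(reversed(path))

def stochastic_route_set(graph, source, target, n_draws, seed=0):
    """Build a choice set by repeatedly perturbing link costs and
    keeping the shortest path under each draw; returns the empirical
    sampling frequency of each generated route."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_draws):
        perturbed = {u: {v: c * rng.uniform(0.5, 1.5)
                         for v, c in nbrs.items()}
                     for u, nbrs in graph.items()}
        route = dijkstra(perturbed, source, target)
        counts[route] = counts.get(route, 0) + 1
    return {r: k / n_draws for r, k in counts.items()}
```

Because every draw needs only a random number stream and one shortest-path computation, the procedure scales to large networks, which is the applicability property claimed above.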
Chikkagoudar, Satish; Roshan, Usman; Livesay, Dennis
2007-01-01
Probalign computes maximal expected accuracy multiple sequence alignments from partition function posterior probabilities. To date, Probalign is among the very best scoring methods on the BAliBASE, HOMSTRAD and OXBENCH benchmarks. Here, we introduce eProbalign, which is an online implementation of the approach. Moreover, the eProbalign web server doubles as an online platform for post-alignment analysis. The heart-and-soul of the post-alignment functionality is the Probalign Alignment Viewer ...
Actual and perceived low-probability hazards from hydro and nuclear electric power generation
International Nuclear Information System (INIS)
The paper focuses on low-probability, high-consequence events, whose impact on public opinion is large if and when they occur. Hazard is defined as the probability of occurrence of potentially damaging events, whereas risk is used merely as a generic word. In the absence of complete hydro risk studies, a simplified method is developed to estimate the regional seismic hazard for all projected dam sites in a given country. As an illustration, the lower and upper bounds of the seismic hazard are calculated for 86 dam sites in Colombia. This hazard is about two orders of magnitude higher than the hazard posed by a comparable nuclear system. Further, risk perception in Colombia is discussed. Using a questionnaire formulated by the International Atomic Energy Agency, it was found that the Colombian public perceives hydro power stations as posing no risk whatsoever, while nuclear risks are perceived as significant. These results run counter to the calculations given in the paper and to some observed facts: two dam collapses in the last twenty years and several devastating earthquakes during the same time period, each individual event causing hundreds of casualties. It is concluded that public perception is not a very good measure of low-probability hazards. It is suggested, therefore, that public opinion should not be taken into account for specific technological risk decisions, but rather should play a role in defining overall societal safety policies, through whichever political process may be appropriate in each particular society. (author)
Hu, Yaogang; Li, Hui; Liao, Xinglin; Song, Erbing; Liu, Haitao; Chen, Z.
2016-08-01
This study determines the early deterioration condition of critical components of a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, early deterioration condition evaluation poses a challenge to traditional vibration-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as a deterioration indicator are not easily disturbed by uncontrollable noise and the uncertain nature of wind. This paper provides a probability evaluation method of the early deterioration condition of critical components based only on temperature characteristic parameters. First, a dynamic threshold for the deterioration degree function is proposed by analyzing the operational data relating temperature and rotor speed. Second, a probability evaluation method of the early deterioration condition is presented. Finally, two cases show the validity of the proposed probability evaluation method in detecting the early deterioration condition of critical components and in tracking their further deterioration.
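A minimal sketch of what a temperature-based probability evaluation of this kind might look like. The model form (a polynomial fit of temperature against rotor speed), the residual construction, and the threshold rule are illustrative assumptions, not the paper's actual deterioration degree function:

```python
import numpy as np

def deterioration_probability(temps, speeds, coef, base_mu, base_sigma,
                              margin=3.0):
    """Illustrative sketch: predict component temperature from rotor
    speed with a fitted polynomial model (coef, as from np.polyfit),
    form a residual, and report the fraction of observations exceeding
    a threshold derived from healthy-baseline residual statistics
    (base_mu + margin * base_sigma)."""
    residual = temps - np.polyval(coef, speeds)
    threshold = base_mu + margin * base_sigma
    return float(np.mean(residual > threshold)), threshold
```

A residual drifting above the baseline-derived threshold then flags early deterioration long before vibration signatures would, which is the motivation given above for using temperature parameters.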
Ivana, Pavol; Ivanova, Marika
2016-01-01
Maxwell's dynamic equation of Faraday's law erroneously predicts that, in a brushless homopolar generator, the relative movement of the wire is equivalent to the relative motion of the conductor in Faraday's homopolar generator, and that an electric intensity must therefore be generated in both devices. Research has shown that it is possible to construct an experimental brushless homopolar generator, which proves that movement of an electrically neutral conductor along the radials of a homogeneous magnetic field does not induce any voltage. A new description of the operation of Faraday's (with-brushes) homopolar generator is presented here, as equipment that simulates the necessary and sufficient condition for the formation of induction. The brushless homopolar generator, however, meets only a necessary condition, not a sufficient one. This article includes a mathematical analysis showing that the current differential concept of the rotation intensity vector creation is an incorrect theoretical premise with minimal impact on the design of kno...
PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL
International Nuclear Information System (INIS)
We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when the included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS, 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
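The classification step reduces to Bayes' rule once class-conditional flux densities are available. A minimal sketch follows; the density callables stand in for the extreme-deconvolution fits described above, and any concrete densities passed in (e.g. the Gaussians in a test) are purely illustrative:

```python
def quasar_probability(flux, dens_quasar, dens_star, prior_quasar):
    """Posterior probability that an object is a quasar, given
    class-conditional flux densities p(flux | class) and a prior.
    dens_quasar and dens_star are callables returning density values,
    standing in for fitted density models."""
    pq = prior_quasar * dens_quasar(flux)
    ps = (1.0 - prior_quasar) * dens_star(flux)
    return pq / (pq + ps)
```

Because the densities are built per redshift range by integrating the flux-redshift density, the same ratio also yields quasar probabilities restricted to arbitrary redshift ranges, as claimed above.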
Tilly, David; Ahnesjö, Anders
2015-07-01
A fast algorithm is constructed to facilitate dose calculation for a large number of randomly sampled treatment scenarios, each representing a possible realisation of a full treatment with geometric, fraction-specific displacements for an arbitrary number of fractions. The algorithm is applied to construct a dose volume coverage probability map (DVCM), based on dose calculated for several hundred treatment scenarios, to enable the probabilistic evaluation of a treatment plan. For each treatment scenario, the algorithm calculates the total dose by perturbing a pre-calculated dose for the nominal conditions, separately for the primary and scatter dose components. The ratio of the scenario-specific accumulated fluence to the average fluence for an infinite number of fractions is used to perturb the pre-calculated dose. Irregularities in the accumulated fluence may cause numerical instabilities in the ratio, which is mitigated by regularisation through convolution with a dose pencil kernel. Compared to full dose calculations, the algorithm demonstrates a speedup factor of ~1000. The comparisons to full calculations show a 99% gamma index (2%/2 mm) pass rate for a single highly modulated beam in a virtual water phantom subject to setup errors during five fractions. The gamma comparison shows a 100% pass rate for a moving tumour irradiated by a single beam in a lung-like virtual phantom. DVCM iso-probability lines computed with the fast algorithm and with full dose calculation for each of the fractions, for a hypo-fractionated prostate case treated with rotational arc therapy, were almost indistinguishable.
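A one-dimensional sketch of the perturbation-and-regularisation step described above. This is illustrative only: the actual algorithm operates on primary and scatter dose components in three dimensions and regularises with a dose pencil kernel, whereas here a generic smoothing kernel is assumed:

```python
import numpy as np

def perturbed_dose(dose_nominal, fluence_scenario, fluence_expected, kernel):
    """Perturb a pre-calculated nominal dose by the ratio of the
    scenario-specific accumulated fluence to the expected
    (infinite-fraction) average fluence.  The ratio is regularised by
    convolution with a normalised smoothing kernel to suppress
    numerical instabilities from irregular accumulated fluence."""
    ratio = fluence_scenario / fluence_expected
    kernel = kernel / kernel.sum()
    ratio_smooth = np.convolve(ratio, kernel, mode="same")
    return dose_nominal * ratio_smooth
```

The speedup comes from replacing a full dose calculation per scenario with this cheap rescaling of a single pre-calculated dose.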
Minimization of Handoff Failure Probability for Next-Generation Wireless Systems
Debabrata Sarddar; Tapas Jana; Souvik Kumar Saha; Joydeep Banerjee; Utpal Biswas; M. K. Naskar
2010-01-01
During the past few years, advances in mobile communication theory have enabled the development and deployment of different wireless technologies, complementary to each other. Hence, their integration can realize a unified wireless system that has the best features of the individual networks. Next-Generation Wireless Systems (NGWS) integrate different wireless systems, each of which is optimized for some specific services and coverage area to provide ubiquitous communications to the mobile us...
International Nuclear Information System (INIS)
An approach for estimating the leak probability of a flanged joint due to the destruction of its fastening studs is described. The approach consists of two stages: the probability of destroying one stud is calculated at the first stage, and the probability of different combinations of interpositions of intact and destroyed studs is calculated at the second. The calculation of the leak probability in the area of the collector cover of the PGV-1000 steam generator is used as an example of the developed approach.
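The second stage can be sketched as an enumeration over intact/destroyed stud configurations, each weighted by the per-stud failure probability from the first stage. The leak criterion below (two adjacent studs destroyed, studs arranged in a ring) is a hypothetical stand-in for the actual mechanical criterion:

```python
from itertools import product

def leak_probability(n_studs, p_fail, leak_if):
    """Enumerate all intact/destroyed interpositions of n_studs studs,
    weight each configuration by independent per-stud failure
    probability p_fail, and sum the weights of configurations that
    satisfy the leak criterion leak_if."""
    total = 0.0
    for config in product((0, 1), repeat=n_studs):  # 1 = destroyed
        k = sum(config)
        weight = (p_fail ** k) * ((1.0 - p_fail) ** (n_studs - k))
        if leak_if(config):
            total += weight
    return total

def adjacent_pair(config):
    """Hypothetical criterion: some two neighbouring studs (on a ring)
    are both destroyed."""
    n = len(config)
    return any(config[i] and config[(i + 1) % n] for i in range(n))
```

Exhaustive enumeration is exponential in the number of studs, which is acceptable for the small stud counts of a flanged joint.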
Energy Technology Data Exchange (ETDEWEB)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)
2012-07-06
Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding the GHG and CAP emission factors in the GREET (Greenhouse gases, Regulated Emissions, and Energy use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life
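As an illustration of fitting a PDF to emission-factor samples, here is a minimal moment-matching sketch for a lognormal distribution. The document's actual best-fit procedure and distribution families are not reproduced; the lognormal choice and the fitting method are assumptions for illustration:

```python
import numpy as np

def fit_lognormal(samples):
    """Fit a lognormal PDF to positive emission-factor samples by
    matching the mean and standard deviation of their log-values
    (a simple stand-in for a best-fit curve procedure)."""
    logs = np.log(samples)
    return logs.mean(), logs.std()

def lognormal_pdf(x, mu, sigma):
    """Lognormal density with log-mean mu and log-std sigma."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2.0 * sigma ** 2)) / (
        x * sigma * np.sqrt(2.0 * np.pi))
```

Such fitted PDFs can then feed Monte Carlo sampling of life-cycle emissions, which is how uncertainty characterization of this kind is typically used.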
International Nuclear Information System (INIS)
PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and in characterizing the parameters relevant to passive system performance evaluation, given the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, thus making the results strongly dependent upon the expert elicitation process. This prompts the need for a framework for constructing a database to generate probability distributions for the parameters influencing system behaviour. The objective of the task is to develop a consistent framework aimed at creating probability distributions for the parameters relevant to passive system performance evaluation. To achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems, or which generic databases or models provide the best information for the system design. Eventually, in the absence of documented specific reliability data, documented expert judgement arising from a well-structured procedure could be used to produce sound probability distributions for the parameters of interest.
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
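As a small illustration of the generating-function material mentioned above: the probability generating function of a sum of independent discrete random variables is the product of the individual generating functions, which for coefficient arrays is a polynomial convolution.

```python
import numpy as np

# PGF coefficients of one fair die: P(face = k) stored at index k.
die = np.array([0, 1, 1, 1, 1, 1, 1]) / 6.0

# The PGF of the sum of two independent rolls is the product of the
# individual PGFs, i.e. the convolution of their coefficient arrays.
two_dice = np.convolve(die, die)

p_seven = two_dice[7]  # P(sum = 7) = 6/36 = 1/6
```

The same convolution rule iterates to sums of any number of independent rolls, which is the mechanism behind many of the exercises on generating functions.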
Directory of Open Access Journals (Sweden)
Chung-Ho Su
2010-12-01
To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize-entropy-principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
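The cumulative-probability-distribution granulation can be sketched as quantile-based partitioning, so that each linguistic interval carries roughly equal probability mass. This is an illustrative reading of the approach; the interval count and boundary handling are assumptions:

```python
import numpy as np

def cpd_partition(series, n_intervals):
    """Granulate a numeric series into intervals whose boundaries are
    equally spaced quantiles of the empirical cumulative probability
    distribution, giving each interval roughly equal probability
    mass.  Returns the n_intervals + 1 boundary values."""
    qs = np.linspace(0.0, 1.0, n_intervals + 1)
    return np.quantile(series, qs)

def granulate(series, boundaries):
    """Map each observation to its 0-based interval index."""
    idx = np.searchsorted(boundaries, series, side="right") - 1
    return np.clip(idx, 0, len(boundaries) - 2)
```

The resulting interval labels are the understandable "granules" on which a rough set algorithm can then induce human-readable rules.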
International Nuclear Information System (INIS)
To ensure the safe operation of a nuclear power plant (NPP), many postulated accident scenarios were considered and analysed. This research chose and analysed the steam generator tube rupture (SGTR) accident under actual plant conditions using the simulation program PCTRAN. The SGTR accident happens while the NPP is operating in a steady-state condition (power of 3000 MWth, primary pressure of 157 bar and secondary pressure of 63 bar). The accident is initiated by creating a break with an equivalent diameter of 100 mm in the area of the lower row of heat-exchanging tubes. The result of the analysis is compared with a calculation by Shiraz University, Iran, using the thermal-hydraulics code RELAP5/mod3.2, and with the report in the PSAR data of the VVER-1000. This comparison shows that it is possible to use PCTRAN to analyse accidents of the VVER-1000 reactor. (author)
Elastic-plastic fracture mechanics analyses of cracked steam generator tubes under internal pressure
Energy Technology Data Exchange (ETDEWEB)
Kim, Hyeong Keun; Ahn, Min Yong; Moon, Seong In; Chang, Yoon Suk; Kim, Young Jin [Sungkyunkwan Univ., Suwon (Korea, Republic of); Hwang, Seong Sik; Kim, Joung Soo [KAERI, Taejon (Korea, Republic of)
2005-07-01
The structural and leakage integrity of a steam generator tube should be maintained during operation even when a crack exists in it. During the past three decades, several limit load solutions have been proposed to resolve this integrity issue. However, for exact load-carrying-capacity estimation of specific components under different conditions, these solutions have to be modified using large amounts of experimental data. The purpose of this paper is to introduce a new burst pressure estimation scheme based on fracture mechanics analyses for a steam generator tube with an axial or circumferential through-wall crack. To do this, closed-form engineering equations were derived to obtain the relevant parameters from three-dimensional elastic-plastic finite element analyses combined with the reference stress method. Also, a series of structural integrity analyses were carried out using the J-integral calculated from the engineering equations together with fracture toughness data. Thereby, in comparison with the experimental data as well as the corresponding estimates from limit load solutions, it was proven that the proposed estimation scheme can be used as an efficient tool for integrity evaluation of cracked steam generator tubes.
Analyses of Acoustic Streaming Generated by Four Ultrasonic Vibrators in a Vessel
Nakagawa, Masafumi
2004-05-01
When ultrasonic waves are applied, the heat transfer at a heated surface in water increases markedly. The origin of this increase in heat transfer is thought to be the agitation effect from the microjets of cavitation and from acoustic streaming. The method in which four vibrators are used has the ability to further enhance heat transfer. This paper presents the method of using four vibrators to eject an acoustic stream jet at a selected position in the vessel. Analyses of this method are performed to establish it theoretically and to compare it with a previously conducted experiment. The analyses shown in this research indicate that the aspects of acoustic streaming generated by the four vibrators in the vessel can be correctly predicted, and they provide a foundation for the development of this method for the enhancement of heat transfer.
Rincon, Diego F; Hoy, Casey W; Cañas, Luis A
2015-04-01
Most predator-prey models extrapolate functional responses from small-scale experiments assuming spatially uniform within-plant predator-prey interactions. However, some predators focus their search on certain plant regions, and herbivores tend to select leaves to balance their nutrient uptake and exposure to plant defenses. Individual-based models that account for heterogeneous within-plant predator-prey interactions can be used to scale up functional responses, but they require the generation of explicit prey spatial distributions within plant-architecture models. The silverleaf whitefly, Bemisia tabaci biotype B (Gennadius) (Hemiptera: Aleyrodidae), is a significant pest of tomato crops worldwide that exhibits highly aggregated populations at several spatial scales, including within the plant. As part of an analytical framework to understand predator-silverleaf whitefly interactions, the objective of this research was to develop an algorithm to generate explicit spatial counts of silverleaf whitefly nymphs within tomato plants. The algorithm requires the plant size and the number of silverleaf whitefly individuals to distribute as inputs, and includes models that describe infestation probabilities per leaf nodal position and the aggregation pattern of the silverleaf whitefly within tomato plants and leaves. The output is a simulated number of silverleaf whitefly individuals for each leaf and leaflet on one or more plants. Parameter estimation was performed using nymph counts per leaflet censused from 30 artificially infested tomato plants. Validation revealed a substantial agreement between algorithm outputs and independent data that included the distribution of counts of both eggs and nymphs. This algorithm can be used in simulation models that explore the effect of local heterogeneity on whitefly-predator dynamics. PMID:26313173
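A hypothetical sketch of the allocation core of such an algorithm: individuals are distributed among leaf nodal positions according to per-node infestation probabilities, with gamma-distributed random weights overdispersing the allocation to mimic the aggregated (clumped) pattern described above. The gamma-weight mechanism and all parameter names are illustrative stand-ins for the paper's fitted models:

```python
import numpy as np

def distribute_nymphs(n_individuals, node_probs, aggregation=0.5, seed=0):
    """Allocate n_individuals among leaf nodal positions.  node_probs
    gives the relative infestation probability per nodal position;
    per-node gamma weights with shape 'aggregation' (smaller = more
    clumped) overdisperse the multinomial allocation."""
    rng = np.random.default_rng(seed)
    weights = node_probs * rng.gamma(aggregation, 1.0 / aggregation,
                                     size=len(node_probs))
    weights /= weights.sum()
    return rng.multinomial(n_individuals, weights)
```

Repeated draws from such a generator supply the explicit prey spatial distributions that the individual-based predator-prey simulations require.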
Zhao, Song-Feng; Zhou, Xiao-Xin; Lin, C. D.
It is shown that measurement of the alignment-dependent ionization probability and high-order harmonic generation (HHG) of molecules in an intense laser field can be used to probe the orbital symmetry of molecules. In this review, recent progress on the molecular tunneling ionization (MO-ADK) model of Tong et al. [Phys. Rev. A 66, 033402 (2002)] is first reviewed. In particular, an efficient method to obtain wavefunctions of linear molecules in the asymptotic region was developed by solving the time-independent Schrödinger equation with B-spline functions, and molecular potential energy surfaces were constructed based on the density functional theory. The accurate wavefunctions are used to extract improved structure parameters for the MO-ADK model. The loss of accuracy of the MO-ADK model in the low-intensity multiphoton ionization regime is also addressed by comparing with the molecular Perelomov-Popov-Terent'ev (MO-PPT) model, the single-active-electron time-dependent Schrödinger equation (SAE-TDSE) method, and the experimental data. Finally, how the orbital symmetry affects the HHG of molecules within the strong-field approximation (SFA) is reviewed.
International Nuclear Information System (INIS)
Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. Measurements of the electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser are investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar. (authors)
Life cycle analyses applied to first generation bio-fuels consumed in France
International Nuclear Information System (INIS)
This rather voluminous publication reports detailed life-cycle analyses of the present bio-fuel channels, also named first-generation bio-fuels: bio-ethanol, bio-diesel, pure vegetable oils, and oil. After a recall of the general principles adopted for this life-cycle analysis, it reports the modelling of the different channels (agricultural steps, bio-fuel production steps, ethyl tert-butyl ether or ETBE steps, vehicles, animal fats and used vegetable oils, change of soil assignment). It gives synthetic descriptions of the different production routes (methyl ester from different plants, ethanol from different plants). It reports and compares the results obtained in terms of performance
Analyses of an air conditioning system with entropy generation minimization and entransy theory
Yan-Qiu, Wu; Li, Cai; Hong-Juan, Wu
2016-06-01
In this paper, based on the generalized heat transfer law, an air conditioning system is analyzed with entropy generation minimization and the entransy theory. Taking the coefficient of performance (denoted as COP) and the heat flow rate Q_out, which is released into the room, as the optimization objectives, we discuss the applicability of entropy generation minimization and entransy theory to the optimizations. Five numerical cases are presented. Combining the numerical results and theoretical analyses, we conclude that the optimization applicability of the two theories is conditional. If Q_out is the optimization objective, a larger entransy increase rate always leads to a larger Q_out, while a smaller entropy generation rate does not. If we take COP as the optimization objective, neither entropy generation minimization nor the concept of entransy increase is always applicable. Furthermore, we find that the concept of entransy dissipation is not applicable for the discussed cases. Project supported by the Youth Programs of Chongqing Three Gorges University, China (Grant No. 13QN18).
Lexicographic probability, conditional probability, and nonstandard probability
Halpern, Joseph Y.
2003-01-01
The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS's), and nonstandard probability spaces (NPS's) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS's are equivalent to NPS's. However, if the state space is infinite, NPS's are ...
Liu, Zhanqi; Panousis, Con; Smyth, Fiona E; Murphy, Roger; Wirth, Veronika; Cartwright, Glenn; Johns, Terrance G; Scott, Andrew M
2003-08-01
The chimeric monoclonal antibody ch806 specifically targets the tumor-associated mutant epidermal growth factor receptor (de 2-7EGFR or EGFRvIII) and is currently under investigation for its potential use in cancer therapy. The humanised monoclonal antibody hu3S193 specifically targets the Lewis Y epithelial antigen and is currently in Phase I clinical trials in patients with advanced breast, colon, and ovarian carcinomas. To assist the clinical evaluation of ch806 and hu3S193, laboratory assays are required to monitor their serum pharmacokinetics and quantitate any immune responses to the antibodies. Mice immunized with ch806 or hu3S193 were used to generate hybridomas producing antibodies with specific binding to ch806 or hu3S193 and competitive for antigen binding. These anti-idiotype antibodies (designated Ludwig Melbourne Hybridomas, LMH) were investigated as reagents suitable for use as positive controls for HAHA or HACA analyses and for measuring hu3S193 or ch806 in human serum. Anti-idiotypes with the ability to concurrently bind two target antibody molecules were identified, which enabled the development of highly reproducible, sensitive, specific ELISA assays for determining serum concentrations of hu3S193 and ch806, with a 3 ng/mL limit of quantitation, using LMH-3 and LMH-12, respectively. BIAcore analyses determined high apparent binding affinity for both idiotypes: LMH-3 binding immobilized hu3S193, Ka = 4.76 × 10^8 M^-1; LMH-12 binding immobilized ch806, Ka = 1.74 × 10^9 M^-1. Establishment of HAHA or HACA analysis of sera samples using BIAcore was possible using LMH-3 and LMH-12 as positive controls for quantitation of immune responses to hu3S193 or ch806 in patient sera. These anti-idiotypes could also be used to study the penetrance and binding of ch806 or hu3S193 to tumor cells through immunohistochemical analysis of tumor biopsies. The generation of anti-idiotype antibodies capable of concurrently binding a target antibody on each variable
DEFF Research Database (Denmark)
Hu, Y.; Li, H.; Liao, X.;
2016-01-01
-based condition monitoring methods. Owing to their thermal inertia and strong anti-interference capacity, temperature characteristic parameters used as a deterioration indicator are not easily disturbed by uncontrollable noise and the uncertain nature of wind. This paper provides a probability evaluation...
International Nuclear Information System (INIS)
Up to 2009, the author and a colleague conducted trend analyses of problem events related to main generators, emergency diesel generators, breakers, motors and transformers, which are more likely to cause problems than other electric components in nuclear power plants. Among the electric components with a high frequency of defect occurrence, i.e., emergency diesel generators, several years had passed since the last analyses. These are very important components, needed to stop a nuclear reactor safely and to cool it down during losses of external power supply. Trend analyses were therefore conducted a second time. The trend analyses were performed on 80 problem events with emergency diesel generators that occurred in U.S. nuclear power plants in the five years from 2005 through 2009, among events reported in the Licensee Event Reports (LERs: event reports submitted to the NRC by U.S. nuclear power plants) registered in the nuclear information database of the Institute of Nuclear Safety System, Inc. (INSS), as well as on 40 events registered in the Nuclear Information Archives (NUCIA) that occurred in Japanese nuclear power plants in the same time period. The trend analyses of the problem events with emergency diesel generators showed that the frequency of defect occurrence is high in both Japanese and U.S. plants during plant operation and functional tests (that is, defects can be discovered effectively in advance), so that implementation of periodical functional tests during plant operation is an important task for the future. (author)
Directory of Open Access Journals (Sweden)
Long Cui
Full Text Available We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using the Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family.
Cui, Long; Wong, Emily Hoi-Man; Cheng, Guo; Firmato de Almeida, Manoel; So, Man-Ting; Sham, Pak-Chung; Cherny, Stacey S; Tam, Paul Kwong-Hang; Garcia-Barceló, Maria-Mercè
2013-01-01
We present the genetic analyses conducted on a three-generation family (14 individuals) with three members affected with isolated-Hirschsprung disease (HSCR) and one with HSCR and heterochromia iridum (syndromic-HSCR), a phenotype reminiscent of Waardenburg-Shah syndrome (WS4). WS4 is characterized by pigmentary abnormalities of the skin, eyes and/or hair, sensorineural deafness and HSCR. None of the members had sensorineural deafness. The family was screened for copy number variations (CNVs) using the Illumina-HumanOmni2.5-Beadchip and for coding sequence mutations in WS4 genes (EDN3, EDNRB, or SOX10) and in the main HSCR gene (RET). Confocal microscopy and immunoblotting were used to assess the functional impact of the mutations. A heterozygous A/G transition in EDNRB was identified in 4 affected and 3 unaffected individuals. While in EDNRB isoforms 1 and 2 (cellular receptor) the transition results in the abolishment of translation initiation (M1V), in isoform 3 (only in the cytosol) the replacement occurs at Met91 (M91V) and is predicted benign. Another heterozygous transition (c.-248G/A; predicted to affect translation efficiency) in the 5'-untranslated region of EDN3 (EDNRB ligand) was detected in all affected individuals but not in healthy carriers of the EDNRB mutation. Also, a de novo CNV encompassing DACH1 was identified in the patient with heterochromia iridum and HSCR. Since the EDNRB and EDN3 variants only coexist in affected individuals, HSCR could be due to the joint effect of mutations in genes of the same pathway. Iris heterochromia could be due to an independent genetic event and would account for the additional phenotype within the family. PMID:23840513
Analyses of steam generator collector rupture for WWER-1000 using Relap5 code
International Nuclear Information System (INIS)
The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the primary coolant release to the atmosphere, to point out the necessity of a well defined operator strategy for this type of accident, as well as to evaluate the possibility to diagnose the accident and to minimize the radiological impact on the environment.
Analyses of steam generator collector rupture for WWER-1000 using Relap5 code
Energy Technology Data Exchange (ETDEWEB)
Balabanov, E.; Ivanova, A. [Energoproekt, Sofia (Bulgaria)
1995-12-31
The paper presents some of the results of analyses of an accident with a LOCA from the primary to the secondary side of a WWER-1000/320 unit. The objective of the analyses is to estimate the primary coolant release to the atmosphere, to point out the necessity of a well defined operator strategy for this type of accident, as well as to evaluate the possibility to diagnose the accident and to minimize the radiological impact on the environment.
Quantum, classical and semiclassical analyses of photon statistics in harmonic generation
Bajer, J; Bajer, Jiri; Miranowicz, Adam
2001-01-01
In this review, we compare different descriptions of photon-number statistics in harmonic generation processes within quantum, classical and semiclassical approaches. First, we study the exact quantum evolution of harmonic generation by applying numerical methods, including those of Hamiltonian diagonalization and global characteristics. We show explicitly that harmonic generation can indeed serve as a source of nonclassical light. Then, we demonstrate that quasi-stationary sub-Poissonian light can be generated in these quantum processes under conditions corresponding to the so-called no-energy-transfer regime known in classical nonlinear optics. By applying the method of classical trajectories, we demonstrate that the analytical predictions of the Fano factors are in good agreement with the quantum results. On comparing second and higher harmonic generation in the no-energy-transfer regime, we show that the highest noise reduction is achieved in third-harmonic generation with the Fano factor of the ...
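The Fano factor used above to quantify noise reduction is F = Var(n)/⟨n⟩, with F < 1 signalling sub-Poissonian (nonclassical) statistics and F = 1 the Poissonian benchmark. A minimal sketch of that diagnostic; the photon-count records below are illustrative data, not results from the paper:

```python
from statistics import fmean, pvariance

def fano_factor(counts):
    """Fano factor F = Var(n)/<n> of a photon-count record.
    F < 1 indicates sub-Poissonian (nonclassical) statistics."""
    mean = fmean(counts)
    return pvariance(counts, mu=mean) / mean

# Illustrative count records (hypothetical data, not from the paper):
poissonian_like = [0, 8, 4, 2, 6, 4, 3, 5, 4, 4]   # variance close to the mean
sub_poissonian  = [4, 4, 5, 4, 4, 4, 3, 4, 4, 4]   # variance well below the mean

print(fano_factor(poissonian_like))  # 1.05
print(fano_factor(sub_poissonian))   # 0.05
```

Both records share the same mean photon number; only the spread differs, which is exactly what the Fano factor isolates.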
Yongkul Won; Hsiao, Frank S.T.; Doo Yong Yang
2008-01-01
Using time-series and panel data from 1981 to 2005, this paper examines the Granger causality relations between GDP, exports, and FDI among the three first-generation Asian newly industrializing economies (ANIEs): Korea, Taiwan, and Singapore, and the four second-generation ANIEs: Malaysia, the Philippines, and Thailand, in addition to China. We first show the difference between the first- and second-generation ANIEs in terms of real GDP per capita, trade s...
Analysing humanly generated random number sequences: A pattern-based approach
Gravenor, M B; Schulz, M A; Schmalbach, B; Brugger, P; Witt, K.
2012-01-01
In a random number generation task, participants are asked to generate a random sequence of numbers, most typically the digits 1 to 9. Such number sequences are not mathematically random, and both extent and type of bias allow one to characterize the brain's “internal random number generator”. We assume that certain patterns and their variations will frequently occur in humanly generated random number sequences. Thus, we introduce a pattern-based analysis of random number sequences. Twenty he...
Analysing Humanly Generated Random Number Sequences: A Pattern-Based Approach
Schulz, Marc-André; Schmalbach, Barbara; Brugger, Peter; Witt, Karsten
2012-01-01
In a random number generation task, participants are asked to generate a random sequence of numbers, most typically the digits 1 to 9. Such number sequences are not mathematically random, and both extent and type of bias allow one to characterize the brain's “internal random number generator”. We assume that certain patterns and their variations will frequently occur in humanly generated random number sequences. Thus, we introduce a pattern-based analysis of random number sequences. Twenty he...
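A pattern-based analysis of the kind described can be sketched by tallying the step sizes between successive digits and the repetition rate, then comparing them with the uniform-chance expectation (humans famously over-produce +1 "counting" steps and avoid repeats). This is an illustrative sketch, not the authors' actual scoring scheme; the sequence is hypothetical:

```python
from collections import Counter

def successor_pattern_counts(seq, modulus=9):
    """Count signed step sizes between successive digits (1..9),
    wrapping modulo 9 so that 9 -> 1 counts as a +1 step."""
    steps = Counter()
    for a, b in zip(seq, seq[1:]):
        steps[(b - a) % modulus] += 1
    return steps

def repetition_rate(seq):
    """Fraction of adjacent pairs repeating the same digit.
    Chance level for digits 1..9 is 1/9; humans typically avoid repeats."""
    pairs = list(zip(seq, seq[1:]))
    return sum(a == b for a, b in pairs) / len(pairs)

human_like = [1, 2, 3, 7, 8, 9, 4, 5, 2, 6, 7, 1]  # hypothetical sequence
print(successor_pattern_counts(human_like))  # +1 steps dominate
print(repetition_rate(human_like))           # 0.0: no repeats at all
```

Here 6 of the 11 transitions are +1 steps and there are no repetitions, both far from the uniform expectation of 1/9 per step size, which is the kind of bias such analyses quantify.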
Directory of Open Access Journals (Sweden)
M. de la Torre Juárez
2011-03-01
Full Text Available Means, standard deviations, homogeneity parameters used in models based on their ratio, and the probability distribution functions (PDFs) of cloud properties from the MODerate resolution Imaging Spectroradiometer (MODIS) are estimated globally as a function of averaging scale, varying from 5 to 500 km. The properties – cloud fraction, droplet effective radius, and liquid water path – all matter for cloud-climate uncertainty quantification and reduction efforts. Global means and standard deviations are confirmed to change with scale. For the range of scales considered, global means vary only within 3% for cloud fraction, 7% for liquid water path, and 0.2% for cloud particle effective radius. These scale dependences contribute to the uncertainties in their global budgets. Scale dependences of the standard deviations and generalized flatness are compared to predictions for turbulent systems. Analytical expressions are identified that fit best to each observed PDF. While the best analytical PDF fit to each variable differs, all PDFs are well described by log-normal PDFs when the mean is normalized by the standard deviation inside each averaging domain. Importantly, log-normal distributions yield significantly better fits to the observations than Gaussians at all scales. This suggests a possible approach for both sub-grid and unified stochastic modeling of these variables at all scales. The results also highlight the need to establish an adequate spatial resolution for two-stream radiative studies of cloud-climate interactions.
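The claim that log-normal fits beat Gaussians can be checked on any positive-valued sample by comparing the maximized log-likelihoods of the two families (the log-normal MLE is just a Gaussian fit to log x plus the Jacobian term). A self-contained sketch on synthetic data, not MODIS observations:

```python
import math
import random
from statistics import pvariance

def gauss_loglik(xs):
    """Maximized Gaussian log-likelihood with MLE mean and variance:
    -n/2 * (1 + ln(2*pi*sigma_hat^2))."""
    n = len(xs)
    return -0.5 * n * (1.0 + math.log(2.0 * math.pi * pvariance(xs)))

def lognorm_loglik(xs):
    """Maximized log-normal log-likelihood: Gaussian fit to log(x)
    plus the change-of-variables term -sum(log x)."""
    logs = [math.log(x) for x in xs]
    return gauss_loglik(logs) - sum(logs)

random.seed(0)
sample = [random.lognormvariate(0.0, 1.0) for _ in range(500)]  # synthetic skewed values
print(lognorm_loglik(sample) > gauss_loglik(sample))  # True: log-normal fits better
```

For skewed, strictly positive quantities such as liquid water path, the Gaussian family wastes probability mass on negative values, so the likelihood gap is typically large.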
DEFF Research Database (Denmark)
Romanovsky, G.; Xydis, G.; Mutale, J.
2011-01-01
While there are presently different options for renewable and distributed generation (RES/DG) to participate in the UK electricity market, none of the market options is specifically tailored for such types of generation and in particular, the smaller (up to 5 MW) RES/DG. This is because the UK ha...
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
Scale/Analytical Analyses of Freezing and Convective Melting with Internal Heat Generation
Energy Technology Data Exchange (ETDEWEB)
Ali S. Siahpush; John Crepeau; Piyush Sabharwall
2013-07-01
Using a scale/analytical analysis approach, we model phase change (melting) for pure materials with constant internal heat generation, for small Stefan numbers (approximately one). The analysis considers conduction in the solid phase and natural convection, driven by internal heat generation, in the liquid regime. The model is applied for a constant surface temperature boundary condition where the melting temperature is greater than the surface temperature in a cylindrical geometry. The analysis also considers a constant heat flux boundary condition in a cylindrical geometry. We show the time scales in which conduction and convection heat transfer dominate.
Energy Technology Data Exchange (ETDEWEB)
Frick, S. [Institut fuer Energetik und Umwelt gGmbH, Leipzig (Germany); Huenges, E [GeoForschungsZentrum (GFZ), Potsdam (Germany); Jung, R. [Institut fuer Geowissenschaftliche Gemeinschaftsaufgaben (GGA), Hannover (Germany); Kaltschmitt, M. [Institut fuer Energetik und Umwelt gGmbH, Leipzig (Germany); Technische Univ. Hamburg-Harburg, Hamburg (DE). Inst. fuer Umwelttechnik und Energiewirtschaft (IUE)
2007-07-01
Owing to its large resources in Germany, geothermal energy is an option that can contribute notably to future energy provision. The amendment of the EEG (the law on the use of renewables) has therefore drawn much more interest to geothermal electricity generation. Against this background, the objective of this article is to identify the main cost drivers and risks of geothermal power and heat generation under the geological conditions in Germany and to derive recommendations from them. (orig.)
Analyses on the Ionization Instability of Non-Equilibrium Seeded Plasma in an MHD Generator
Le, Chi Kien
2016-06-01
Recently, closed-cycle magnetohydrodynamic power generation system research has focused on improving the isentropic efficiency and the enthalpy extraction ratio. By reducing the cross-section area ratio of the disk magnetohydrodynamic generator, it is believed that a high isentropic efficiency can be achieved with the same enthalpy extraction. In this study, results for a plasma state that take into account the ionization instability of non-equilibrium seeded plasma are added to the theoretical prediction of the relationship between enthalpy extraction and isentropic efficiency. As a result, an electron temperature that reaches the complete seed ionization state without growth of the ionization instability can be realized at a relatively high seed fraction. However, the upper limit of the power generation performance is suggested to remain lower than the value expected at a low seed fraction. It is also suggested that higher power generation performance may be obtained by exploiting the electron temperature range that reaches the complete seed ionization state at a low seed fraction.
Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro
2016-02-01
Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies. PMID:25403321
International Nuclear Information System (INIS)
In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-01
In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
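The Fourier descriptors used as shape features in the framework above can be sketched briefly: a closed contour is treated as a sequence of complex numbers, its discrete Fourier coefficients are computed, and dropping the DC term plus normalizing by the first coefficient yields translation- and scale-invariant magnitudes. The contour data and normalization choices below are illustrative assumptions, not the authors' implementation:

```python
import cmath

def fourier_descriptors(contour, n_coeffs=4):
    """Magnitude-based Fourier descriptors of a closed 2D contour.
    Dropping the k=0 (DC) coefficient gives translation invariance,
    dividing by |c1| gives scale invariance, and taking magnitudes
    gives rotation invariance."""
    z = [complex(x, y) for x, y in contour]
    n = len(z)
    coeffs = [sum(z[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                  for m in range(n)) / n
              for k in range(1, n_coeffs + 1)]
    scale = abs(coeffs[0]) or 1.0
    return [abs(c) / scale for c in coeffs]

# A square contour and a translated, scaled copy yield the same descriptors:
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
moved = [(2 * x + 5, 2 * y - 3) for x, y in square]
print(fourier_descriptors(square))
print(fourier_descriptors(moved))  # equal up to floating-point error
```

These invariant magnitudes are the kind of fixed-length feature vector a support vector machine can consume directly.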
Analysing the statistics of group constants generated by Serpent 2 Monte Carlo code
International Nuclear Information System (INIS)
An important topic in Monte Carlo neutron transport calculations is to verify that the statistics of the calculated estimates are correct. Undersampling, non-converged fission source distribution and inter-cycle correlations may result in inaccurate results. In this paper, we study the effect of the number of neutron histories on the distributions of homogenized group constants and assembly discontinuity factors generated using Serpent 2 Monte Carlo code. We apply two normality tests and a so-called “drift-in-mean” test to the batch-wise distributions of selected parameters generated for two assembly types taken from the MIT BEAVRS benchmark. The results imply that in the tested cases the batch-wise estimates of the studied group constants can be regarded as normally distributed. We also show that undersampling is an issue with the calculated assembly discontinuity factors when the number of neutron histories is small. (author)
Effects of cross sections tables generation and optimization on rod ejection transient analyses
International Nuclear Information System (INIS)
Highlights: • Different cross-section libraries are applied to a rod ejection transient benchmark. • Effects of the optimization of the library grid-point distribution are assessed. • Effects of the library generation are assessed by comparison with other solutions. • Interpolation errors contribute to neutronics uncertainties in modeling transients. - Abstract: Best estimate analysis of rod ejection transients requires 3D kinetics core simulators. If they use cross sections libraries compiled in multidimensional tables, interpolation errors – originated when the core simulator computes the cross sections from the table values – are a source of uncertainty in k-effective calculations that should be accounted for. Those errors depend on the grid covering the domain of state variables and can be easily reduced, in contrast with other sources of uncertainties such as the ones due to nuclear data, by choosing an optimized grid distribution. The present paper assesses the impact of the grid structure on a PWR rod ejection transient analysis using the coupled neutron-kinetics/thermal-hydraulics COBAYA3/COBRA-TF system. For this purpose, the OECD/NEA PWR MOX/UO2 core transient benchmark has been chosen, as material compositions and geometries are available, allowing the use of lattice codes to generate libraries with different grid structures. Since a complete nodal cross-section library is also provided as part of the benchmark specifications, the effects of the library generation on transient behavior are also analyzed. Results showed large discrepancies when using the benchmark library and own-generated libraries when compared with benchmark participants’ solutions. The origin of the discrepancies was found to lie in the nodal cross sections provided in the benchmark
Energetic and Exergetic Analyses of a Direct Steam Generation Solar Thermal Power Plant in Cyprus
Hamidi, Armita
2012-01-01
ABSTRACT: In recent decades, the threat of climate change and other environmental impacts of fossil fuels have reinforced interests in alternative and renewable energy sources for producing electricity. In this regard, solar thermal energy can be utilized in existing power generation plants as replacement for the heat produced by means of fossil fuels. The objective of this study is to investigate the energetic and exergetic feasibility of utilizing a solar thermal power plant in Cyprus. The...
Analysing the automotive power generation system (浅析汽车发电机)
Institute of Scientific and Technical Information of China (English)
牟亮
2016-01-01
This article briefly introduces the structure of the automotive power generation system and the operating principles of each of its components. It addresses the problem that a vehicle electrical schematic shows only a single symbol for the generation system, leaving its internal composition and principles unclear, and it provides useful technical reference material on power generation systems for later research work.
Comparative Analyses on OPR1000 Steam Generator Tube Rupture Event Emergency Operational Guideline
Energy Technology Data Exchange (ETDEWEB)
Lee, Sang Won; Bae, Yeon Kyoung; Kim, Hyeong Teak [Korea Hydro and Nuclear Power Co., Ltd., Taejon (Korea, Republic of)
2006-07-01
The Steam Generator Tube Rupture (SGTR) event is one of the important scenarios with respect to radiation release to the environment. When an SGTR occurs, containment integrity is not effective because of the direct bypass of containment via the ruptured steam generator to the MSSV and MSADV. To prevent this path, the Emergency Operational Guideline of OPR1000 indicates the use of Turbine Bypass Valves (TBVs) as an effective means to depressurize the main steam line and prevent the lifting of the MSSV. However, the TBVs are not operable when offsite power is not available (LOOP). In this situation, the RCS cool-down is achieved by opening the MSADVs of both the intact and the ruptured SG. But this action causes a large radiation release to the environment. To minimize the radiation release to the environment, the KSNP EOG adopts an improved strategy for an SGTR occurring concurrently with LOOP. However, these procedures contain some duplicated steps and branch lines that might confuse the operator during recovery actions. So, in this paper, a comparative analysis of SGTR and SGTR with LOOP is performed and an optimized procedure is proposed.
Comparative Analyses on OPR1000 Steam Generator Tube Rupture Event Emergency Operational Guideline
International Nuclear Information System (INIS)
The Steam Generator Tube Rupture (SGTR) event is one of the important scenarios with respect to radiation release to the environment. When an SGTR occurs, containment integrity is not effective because of the direct bypass of containment via the ruptured steam generator to the MSSV and MSADV. To prevent this path, the Emergency Operational Guideline of OPR1000 indicates the use of Turbine Bypass Valves (TBVs) as an effective means to depressurize the main steam line and prevent the lifting of the MSSV. However, the TBVs are not operable when offsite power is not available (LOOP). In this situation, the RCS cool-down is achieved by opening the MSADVs of both the intact and the ruptured SG. But this action causes a large radiation release to the environment. To minimize the radiation release to the environment, the KSNP EOG adopts an improved strategy for an SGTR occurring concurrently with LOOP. However, these procedures contain some duplicated steps and branch lines that might confuse the operator during recovery actions. So, in this paper, a comparative analysis of SGTR and SGTR with LOOP is performed and an optimized procedure is proposed.
International Nuclear Information System (INIS)
This paper suggests that inexorable changes in society are presenting both challenges and a rich selection of technologies for responding to those challenges. The citizen is more demanding of environmental and personal protection, and of information. Simultaneously, the commercial and government information technology markets are providing new technologies such as commercial off-the-shelf (COTS) software, common datasets, 'open' GIS, recordable CD-ROM, and the World Wide Web. Thus one has the raw ingredients for creating new techniques and tools for spatial analysis, and these tools can support participative study and decision-making. By carrying out a strategy of thorough and demonstrably correct science, design, and development, one can move forward into a new generation of participative risk assessment and routing for radioactive and hazardous materials.
Thermodynamic analyses of a biomass-coal co-gasification power generation system.
Yan, Linbo; Yue, Guangxi; He, Boshu
2016-04-01
A novel chemical looping power generation system is presented based on the biomass-coal co-gasification with steam. The effects of different key operation parameters including biomass mass fraction (Rb), steam to carbon mole ratio (Rsc), gasification temperature (Tg) and iron to fuel mole ratio (Rif) on the system performances like energy efficiency (ηe), total energy efficiency (ηte), exergy efficiency (ηex), total exergy efficiency (ηtex) and carbon capture rate (ηcc) are analyzed. A benchmark condition is set, under which ηte, ηtex and ηcc are found to be 39.9%, 37.6% and 96.0%, respectively. Furthermore, detailed energy Sankey diagram and exergy Grassmann diagram are drawn for the entire system operating under the benchmark condition. The energy and exergy efficiencies of the units composing the system are also predicted. PMID:26826573
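The efficiency measures quoted above (ηe, ηex and their "total" variants including useful heat) follow the usual first-law and second-law definitions: useful output divided by fuel input on an energy or exergy basis. A minimal sketch with made-up stream values; the numbers are illustrative, not from the paper:

```python
def energy_efficiency(w_net, q_heat, fuel_energy):
    """First-law (total energy) efficiency: net power plus useful heat
    over fuel energy input (LHV basis)."""
    return (w_net + q_heat) / fuel_energy

def exergy_efficiency(w_net, ex_heat, fuel_exergy):
    """Second-law (total exergy) efficiency: net power plus the exergy
    of the useful heat over fuel exergy input."""
    return (w_net + ex_heat) / fuel_exergy

# Hypothetical stream values in MW (for illustration only):
w_net, q_heat, ex_heat = 120.0, 40.0, 12.0
fuel_energy, fuel_exergy = 400.0, 420.0

print(round(energy_efficiency(w_net, q_heat, fuel_energy), 3))  # 0.4
print(round(exergy_efficiency(w_net, ex_heat, fuel_exergy), 3))  # 0.314
```

Note how the exergy figure is lower than the energy figure even with similar inputs: low-temperature heat carries much less exergy than energy, which is why ηtex (37.6%) sits below ηte (39.9%) in the abstract.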
Falk, Jessica J; Laib Sampaio, Kerstin; Stegmann, Cora; Lieber, Diana; Kropff, Barbara; Mach, Michael; Sinzger, Christian
2016-09-01
For many questions in human cytomegalovirus (HCMV) research, assays are desired that allow robust and fast quantification of infection efficiencies under high-throughput conditions. The secreted Gaussia luciferase has been demonstrated as a suitable reporter in the context of a fibroblast-adapted HCMV strain, which however is greatly restricted in the number of cell types to which it can be applied. We inserted the Gaussia luciferase expression cassette into the BAC-cloned virus strain TB40-BAC4, which displays the natural broad cell tropism of HCMV and hence allows application to screening approaches in a variety of cell types including fibroblasts, epithelial, and endothelial cells. Here, we applied the reporter virus TB40-BAC4-IE-GLuc to identify mouse hybridoma clones that preferentially neutralize infection of endothelial cells. In addition, as the Gaussia luciferase is secreted into culture supernatants from infected cells it allows kinetic analyses in living cultures. This can speed up and facilitate phenotypic characterization of BAC-cloned mutants. For example, we analyzed a UL74 stop-mutant of TB40-BAC4-IE-GLuc immediately after reconstitution in transfected cultures and found the increase of luciferase delayed and reduced as compared to wild type. Phenotypic monitoring directly in transfected cultures can minimize the risk of compensating mutations that might occur with extended passaging. PMID:27326666
Software tool for analysing the family shopping basket without candidate generation
Directory of Open Access Journals (Sweden)
Roberto Carlos Naranjo Cuervo
2010-05-01
Full Text Available Tools that yield useful knowledge for supporting marketing decisions are currently needed in the e-commerce environment. A process is needed for this which uses a series of data-processing techniques; data-mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool comprised four sub-modules: data pre-processing, data-mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests included performance, functionality and validity of the results, thereby allowing association rules to be found. The results led to concluding that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
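Association-rule mining of the kind the tool implements reduces to computing itemset support and rule confidence over a transaction set; algorithms such as Apriori or FP-Growth exist to avoid exhaustive enumeration on large databases. A minimal stdlib sketch on toy basket data, not the FoodMart database:

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Brute-force frequent itemsets by support; fine for small baskets.
    (Apriori/FP-Growth prune this exhaustive enumeration.)"""
    counts = Counter()
    for t in transactions:
        for r in range(1, len(t) + 1):
            for combo in combinations(sorted(t), r):
                counts[combo] += 1
    n = len(transactions)
    return {items: c / n for items, c in counts.items() if c / n >= min_support}

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    both = sum(antecedent | consequent <= set(t) for t in transactions)
    ante = sum(antecedent <= set(t) for t in transactions)
    return both / ante

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
print(frequent_itemsets(baskets, 0.5))
print(confidence(baskets, {"bread"}, {"milk"}))  # 2/3
```

A rule like bread → milk with confidence 2/3 and support 0.5 is exactly the kind of output the sub-module for results analysis would rank and present.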
International Nuclear Information System (INIS)
A severe accident involves inherently significant uncertainties due to the complex phenomena and wide range of conditions. Because of the high temperatures and pressures involved, experimental validation and practical application are extremely difficult. Owing to these difficulties, little experimental research has been performed and no plant-specific experimental data exist. Instead, computer codes have been developed to simulate the accident, relying on conservative assumptions and margins. This study is an effort to reduce the uncertainty in probabilistic safety assessment and to produce realistic, physically based failure probabilities. The methodology was developed and applied to the OPR1000. The creep rupture failure probabilities of reactor coolant system (RCS) components were evaluated under a station blackout severe accident with all power lost and no recovery of steam generator auxiliary feed-water. The MELCOR 1.8.6 code was used to obtain the plant-specific pressure and temperature history of each part of the RCS, and the creep rupture failure times were calculated by a rate-dependent creep rupture model with the plant-specific data. (author)
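The rate-dependent creep rupture evaluation can be illustrated with a generic life-fraction sketch (not the actual MELCOR/OPR1000 model): a Larson-Miller correlation gives a rupture time for each temperature/stress state, and damage fractions accumulated over the thermal history are summed until they reach unity. The Larson-Miller coefficients below are placeholders, not real material data.

```python
import math

def rupture_time_hours(t_kelvin, stress_mpa, c=20.0,
                       lmp=lambda s: 33000.0 - 5000.0 * math.log10(s)):
    """Larson-Miller rupture time: LMP(sigma) = T * (C + log10(t_r)),
    so t_r = 10 ** (LMP(sigma) / T - C). The LMP correlation here is a
    made-up placeholder, not measured creep data."""
    return 10.0 ** (lmp(stress_mpa) / t_kelvin - c)

def failure_time(history):
    """history: list of (dt_hours, T_kelvin, stress_mpa) steps from a
    pressure/temperature transient. Returns the elapsed time at which
    cumulative damage (life-fraction rule) reaches 1, or None if the
    component survives the whole history."""
    damage, elapsed = 0.0, 0.0
    for dt, t_k, s in history:
        frac = dt / rupture_time_hours(t_k, s)
        if damage + frac >= 1.0:
            # Interpolate within the step for the failure instant.
            return elapsed + dt * (1.0 - damage) / frac
        damage += frac
        elapsed += dt
    return None
```

In a probabilistic setting, sampling the correlation coefficients and the thermal history and re-running this accumulation yields a failure-time distribution rather than a single conservative bound.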
Sun, Jingxuan
2012-01-01
Sun, Jingxuan 2012. Analysing the Risks and Challenges of the Pad Device as an Education Tool and Its Influence on Generation Y: Bachelor’s Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 68. Appendices 2. The objective of this research was to explore the use of the pad device as an education tool. Furthermore, the research work also intended to study the impact of the pad device among young students. The advantages of using the pad devices instead of compu...
Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran
2016-09-01
This study compared the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts and transferring the incisal edge outlines onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life size. Bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark image. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®. Bite mark analyses using computer-assisted overlay generation were done by digitally matching an overlay to the corresponding bite mark image in Adobe Photoshop®. A further comparison method superimposed the cast images on the corresponding bite mark images using Adobe Photoshop® CS6 and GIF-Animator©. During analysis, each precision-determining criterion was given a score in the range 0-3, with higher scores indicating better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05): the computer-assisted animated-superimposition method was the most accurate, followed by computer-assisted overlay generation and lastly the xerographic method. The superior precision of the digital methods is discernible despite human skin being a poor recording medium for bite marks. PMID:27591538
Nakagawa, Masaki; Togashi, Yuichi
2016-01-01
Cell activities primarily depend on chemical reactions, especially those mediated by enzymes, and this has led to these activities being modeled as catalytic reaction networks. Although deterministic ordinary differential equations of concentrations (rate equations) have been widely used for modeling purposes in the field of systems biology, it has been pointed out that these catalytic reaction networks may behave in a way that is qualitatively different from such deterministic representations when the number of molecules for certain chemical species in the system is small. At the same time, representing these phenomena by simple binary (on/off) descriptions that omit the molecule numbers is not feasible either. As recent experiments have revealed the existence of rare chemical species in cells, the importance of being able to model potential small-number phenomena is being recognized. However, most preceding studies were based on numerical simulations, and theoretical frameworks to analyze these phenomena have not been sufficiently developed. Motivated by the small-number issue, this work aimed to develop an analytical framework for the chemical master equation describing the distributional behavior of catalytic reaction networks. For simplicity, we considered networks consisting of two-body catalytic reactions. We used the probability generating function method to obtain the steady-state solutions of the chemical master equation without specifying the parameters. We obtained the time evolution equations of the first- and second-order moments of concentrations, and the steady-state analytical solution of the chemical master equation under certain conditions. These results led to the rank conservation law, the connecting state to the winner-takes-all state, and analysis of 2-molecule, M-species systems. A possible interpretation of the theoretical conclusion for actual biochemical pathways is also discussed. PMID:27047384
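As a minimal illustration of the generating-function approach (far simpler than the paper's two-body catalytic networks): for a birth-death process with production rate k and per-molecule degradation rate g, the steady-state generating function is G(z) = exp(λ(z−1)) with λ = k/g, i.e. a Poisson distribution. The sketch below relaxes the master equation numerically and checks it against that closed form; the rates are invented for illustration.

```python
import math

def birth_death_steady_state(k, g, nmax=40, dt=1e-3, tol=1e-10):
    """Relax the chemical master equation
        dP_n/dt = k*(P_{n-1} - P_n) + g*((n+1)*P_{n+1} - n*P_n)
    by forward Euler on a truncated state space until the time
    derivatives vanish, then return the distribution P."""
    P = [0.0] * (nmax + 1)
    P[0] = 1.0  # start with an empty system
    while True:
        dP = [0.0] * (nmax + 1)
        for n in range(nmax + 1):
            dP[n] = -k * P[n] - g * n * P[n]
            if n > 0:
                dP[n] += k * P[n - 1]
            if n < nmax:
                dP[n] += g * (n + 1) * P[n + 1]
        P = [p + dt * d for p, d in zip(P, dP)]
        if max(abs(d) for d in dP) < tol:
            return P

def poisson_pmf(lam, n):
    """P_n read off the generating function G(z) = exp(lam*(z-1)):
    its n-th Taylor coefficient is exp(-lam) * lam**n / n!."""
    return math.exp(-lam) * lam ** n / math.factorial(n)
```

The same Taylor-coefficient reading of G(z) is what the paper generalizes to networks where no closed form is available a priori.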
Todorovic Balint, Milena; Jelicic, Jelena; Mihaljevic, Biljana; Kostic, Jelena; Stanic, Bojana; Balint, Bela; Pejanovic, Nadja; Lucic, Bojana; Tosic, Natasa; Marjanovic, Irena; Stojiljkovic, Maja; Karan-Djurasevic, Teodora; Perisic, Ognjen; Rakocevic, Goran; Popovic, Milos; Raicevic, Sava; Bila, Jelena; Antic, Darko; Andjelic, Bosko; Pavlovic, Sonja
2016-01-01
The existence of a potential primary central nervous system lymphoma-specific genomic signature that differs from the systemic form of diffuse large B cell lymphoma (DLBCL) has been suggested, but is still controversial. We investigated 19 patients with primary DLBCL of central nervous system (DLBCL CNS) using the TruSeq Amplicon Cancer Panel (TSACP) for 48 cancer-related genes. Next generation sequencing (NGS) analyses have revealed that over 80% of potentially protein-changing mutations were located in eight genes (CTNNB1, PIK3CA, PTEN, ATM, KRAS, PTPN11, TP53 and JAK3), pointing to the potential role of these genes in lymphomagenesis. TP53 was the only gene harboring mutations in all 19 patients. In addition, the presence of mutated TP53 and ATM genes correlated with a higher total number of mutations in other analyzed genes. Furthermore, the presence of mutated ATM correlated with poorer event-free survival (EFS) (p = 0.036). The presence of the mutated SMO gene correlated with earlier disease relapse (p = 0.023), inferior event-free survival (p = 0.011) and overall survival (OS) (p = 0.017), while mutations in the PTEN gene were associated with inferior OS (p = 0.048). Our findings suggest that the TP53 and ATM genes could be involved in the molecular pathophysiology of primary DLBCL CNS, whereas mutations in the PTEN and SMO genes could affect survival regardless of the initial treatment approach. PMID:27164089
International Nuclear Information System (INIS)
Two LSTF experiments were conducted for the OECD/NEA ROSA Project simulating a PWR 0.5% cold-leg small-break LOCA. Steam generator (SG) secondary-side depressurization was performed by fully opening the relief valves at 10 minutes after a safety injection signal, with or without non-condensable gas (air) inflow from the accumulator tanks, under total failure of the high-pressure injection system. Further assumptions were made to conduct enhanced SG depressurization by fully opening the safety valves when the primary pressure decreased to 2 MPa, and no actuation of the low-pressure injection system, both so that natural circulation (NC) phenomena at low pressures could be observed clearly. The primary depressurization rate decreased when non-condensable gas started to enter the primary loops because of degraded condensation heat transfer in the SG U-tubes, while two-phase NC flow continued even after non-condensable gas inflow. Asymmetric NC behaviors appeared between the two loops, probably due to differing numbers of forward-flow SG U-tubes under the influence of non-condensable gas. Post-test analyses using the JAEA-modified RELAP5/MOD3.2.1.2 code indicated that the code has remaining problems in proper prediction of the primary loop flow rate and SG U-tube liquid level behaviors, especially after non-condensable gas inflow. Improvement of the condensation heat transfer model under non-condensable gas mixture conditions and of the SG U-tube model may be necessary for correct analysis of the LSTF SG depressurization transient. (author)
Energy Technology Data Exchange (ETDEWEB)
Brocke, Tobias
2012-07-01
Against the background of energy policy and climate policy decisions, decentralized power generation has gained in importance in Germany. Previous research on this topic has mostly concerned technical, legal, environmental and economic issues, as well as potential analyses for certain forms of power generation. In contrast, the present contribution deals with the organizational and governance structures of decentralized power generation at local and regional level. In particular, it addresses the question of the extent to which decentralized power generation results in the formation of localized production networks. It also considers the importance of the institutional framework and the role of the regulatory, political and civil society actors affected by distributed power generation.
Directory of Open Access Journals (Sweden)
Bijal A Parikh
The bacterial CRISPR-Cas9 system has been adapted for use as a genome editing tool. While several recent reports have indicated that successful genome editing of mice can be achieved, detailed phenotypic and molecular analyses of the mutant animals are limited. Following pronuclear micro-injection of fertilized eggs with either wild-type Cas9 or the nickase mutant (D10A) and single or paired guide RNAs (sgRNAs) targeting the tyrosinase (Tyr) gene, we assessed genome editing in mice using rapid phenotypic readouts (eye and coat color). Mutant mice with insertions or deletions (indels) in Tyr were efficiently generated without detectable off-target cleavage events. Gene correction of a single nucleotide by homologous recombination (HR) could only occur when the sgRNA recognition sites in the donor DNA were modified. Gene repair did not occur if the donor DNA was not modified because Cas9 catalytic activity was completely inhibited. Our results indicate that allelic mosaicism can occur following CRISPR-Cas9-mediated editing in mice and appears to correlate with sgRNA cleavage efficiency at the single-cell stage. We also show that larger than expected deletions may be overlooked based on the screening strategy employed. An unbiased analysis of all the deleted nucleotides in our experiments revealed that the highest frequencies of nucleotide deletions were clustered around the predicted Cas9 cleavage sites, with slightly broader distributions than expected. Finally, additional analysis of founder mice and their offspring indicates that their general health, fertility, and the transmission of genetic changes were not compromised. These results provide the foundation to interpret and predict the diverse outcomes following CRISPR-Cas9-mediated genome editing experiments in mice.
Interpretations of Negative Probabilities
Burgin, Mark
2010-01-01
In this paper, we give a frequency interpretation of negative probability, as well as of extended probability, demonstrating that to a great extent these new types of probabilities behave as conventional probabilities. Extended probability comprises both conventional probability and negative probability. The frequency interpretation of negative probabilities gives supportive evidence to the axiomatic system built in (Burgin, 2009; arXiv:0912.4767) for extended probability as it is demonstra...
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Maximum Probability Domains for Hubbard Models
Acke, Guillaume; Claeys, Pieter W; Van Raemdonck, Mario; Poelmans, Ward; Van Neck, Dimitri; Bultinck, Patrick
2015-01-01
The theory of Maximum Probability Domains (MPDs) is formulated for the Hubbard model in terms of projection operators and generating functions for both exact eigenstates as well as Slater determinants. A fast MPD analysis procedure is proposed, which is subsequently used to analyse numerical results for the Hubbard model. It is shown that the essential physics behind the considered Hubbard models can be exposed using MPDs. Furthermore, the MPDs appear to be in line with what is expected from Valence Bond Theory-based knowledge.
Oguro, Kazumasa; SHIMASAWA Manabu; TAKAHATA Junichiro
2010-01-01
We constructed an overlapping-generations model with endogenous fertility to analyze the effect of child benefits and pensions on welfare for current and future generations. The following results were obtained. First, when financial sustainability is not taken into account, the best policy to improve the welfare of future generations is to increase child benefits, financed by issuing government debt. On the other hand, when financial sustainability is taken into account, the best policy is to...
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
International Nuclear Information System (INIS)
A random pulse and probability generator (RPG) has been developed that uses alpha-particle detection as its random signal source. The collection technique for 222Rn emanated from natural uranium ore was examined for preparing highly pure 210Pb-210Po as an alpha source for the RPG. The yield with a trap refrigerated by liquid nitrogen was observed to be above 99% for 222Rn collection. (author)
Briggs, William M
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
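Briggs's definition can be illustrated with a small sketch: suppose evidence E says the response y is nonnegative (e.g. a rainfall amount), while model M is Gaussian. The leakage is then simply the Gaussian mass on the impossible region y < 0; the numbers used are invented for illustration.

```python
import math

def normal_cdf(x, mu, sigma):
    """Gaussian CDF via the closed form with the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def leakage_nonnegative(mu, sigma):
    """Probability leakage of a Gaussian model M = N(mu, sigma^2) with
    respect to evidence E that y >= 0: the probability M assigns to
    the impossible event y < 0."""
    return normal_cdf(0.0, mu, sigma)
```

A fitted model with mu = 1, sigma = 1 leaks about 15.9% of its probability into the impossible region; no calibration can remove that mass without changing the model, which is the article's point about regression models.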
Cole, David A; Nolen-Hoeksema, Susan; Girgus, Joan; Paul, Gilda
2006-02-01
In 2 longitudinal studies of negative life events and depressive symptoms in adolescents (N = 708) and in children (N = 508), latent trait-state-error structural equation models tested both the stress generation hypothesis and the stress exposure hypothesis. Results strongly suggested that self-reports of depressive symptoms reflect the influence of a perfectly stable trait factor as well as a less stable state factor. Support emerged for both the stress generation model and the stress exposure model. When the state depression factor was modeled as predicting stress, support for the stress generation model appeared to increase with age. When the trait depression factor was modeled as the predictor of stress, support for the stress generation model did not vary with the child's age. In both models, support for the stress exposure remained relatively constant across age. PMID:16492094
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
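One classic example of this phenomenon is the matching (derangement) problem: the probability that a uniform random permutation has no fixed point tends to 1/e as n grows. A quick Monte Carlo check (the problem choice and sample size are ours, not necessarily the article's three problems):

```python
import random

def derangement_probability(n, trials, seed=0):
    """Estimate the probability that a uniform random permutation of
    n items has no fixed point; for large n this tends to 1/e."""
    rng = random.Random(seed)
    items = list(range(n))
    hits = 0
    for _ in range(trials):
        perm = items[:]
        rng.shuffle(perm)
        if all(perm[i] != i for i in range(n)):
            hits += 1
    return hits / trials
```

With n = 20 and 100,000 trials the estimate typically lands within a few thousandths of 1/e ≈ 0.3679, matching the exact inclusion-exclusion value sum((-1)^k / k!) almost perfectly at this n.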
Goldberg, Samuel
2013-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Accidents, probabilities and consequences
International Nuclear Information System (INIS)
Following a brief discussion of the safety of wind-driven power plants and solar power plants, some aspects of the safety of fast breeder and thermonuclear power plants are presented. It is pointed out that no safety evaluation of breeders comparable to the Rasmussen investigation has been carried out, and that discussion of the safety aspects of thermonuclear power has only just begun. Finally, as an illustration of the varying interpretations of risk and safety analyses, four examples are given of predicted probabilities and consequences in Copenhagen of the maximum credible accident at the Barsebaeck plant under the most unfavourable meteorological conditions. These were made by the Environment Commission, the Risoe Research Establishment, REO (a pro-nuclear group) and OOA (an anti-nuclear group), and vary by a factor of over 1000. (JIW)
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as "barycentric calculus". A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)
Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long
2014-11-01
The authors have developed a method to automatically generate non-uniform CFD meshes for image-based human airway models. The sizes of the generated tetrahedral elements vary in both the radial and longitudinal directions to account for the boundary layer and the multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for the surface and volume meshes, respectively. Both programs can specify element sizes by means of a background mesh. The size of an arbitrary element in the domain is a function of wall distance, element size on the wall, and element size at the center of the airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice as long (170 min). The proposed method generates CFD meshes with fine elements near the wall and smooth variation of element size in the longitudinal direction, which are required, e.g., for simulations with high flow rate. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
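The wall-distance-based sizing can be sketched as a background-mesh size field. The linear blend and the flow-rate rule below are illustrative assumptions of ours, since the abstract does not give the exact functional forms used with Gmsh/TetGen.

```python
def element_size(d, d_center, h_wall, h_center):
    """Target edge length for an element at wall distance d, blending
    linearly from a fine wall size h_wall to a coarser size h_center
    at the airway centerline (d_center ~ local lumen radius).

    Illustrative interpolation only; the paper's exact blending
    function is not specified in the abstract.
    """
    t = max(0.0, min(1.0, d / d_center))
    return h_wall + t * (h_center - h_wall)

def wall_size_from_flow(diameter, flow_rate, c=0.05):
    """Hypothetical wall-size rule: finer elements in branches where
    the flow rate is high relative to the airway diameter, so the
    boundary layer is better resolved where it is thinnest."""
    return c * diameter / (1.0 + flow_rate / diameter)
```

Evaluating such a field at the nodes of a coarse background mesh is exactly the mechanism both Gmsh and TetGen expose for non-uniform sizing.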
International Nuclear Information System (INIS)
This paper aims to quantify the influence that the probability distribution selected to fit wind speed data has on the estimation of the annual mean energy production of wind turbines. To perform this task, a comparative analysis between the well-known two-parameter Weibull wind speed distribution and alternative mixtures of finite distribution models (less simple, but providing better fits in many locations) is applied, in order to contrast simplicity versus accuracy. Data fitted from a set of weather stations located in the Canary Islands and a representative sample of commercial wind turbines are taken into account to carry out this analysis. The calculations provide a wide variety of numerical results but, as a general conclusion, the analysis evidences that any improvement in wind data fits given by the use of a mixture of finite distributions, instead of the standard Weibull distribution, is partially or even totally lost when the annual mean energy production is worked out, practically regardless of the weather station, the wind speed distribution model, the turbine size or the turbine concept.
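The first half of the comparison, annual mean energy production under a fitted two-parameter Weibull distribution, reduces to integrating the turbine power curve against the wind-speed density over a year. The sketch below uses an invented 2 MW power curve and invented Weibull parameters, not the Canary Islands data or the paper's turbine sample.

```python
import math

def weibull_pdf(v, k, c):
    """Two-parameter Weibull wind-speed density (shape k, scale c)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def annual_energy_mwh(power_curve, k, c, v_max=30.0, dv=0.01):
    """Annual mean energy production in MWh: integrate the turbine
    power curve (kW as a function of wind speed in m/s) against the
    Weibull density over one year (8760 h), rectangle-rule quadrature."""
    hours = 8760.0
    e_kwh = 0.0
    v = 0.0
    while v < v_max:
        e_kwh += power_curve(v) * weibull_pdf(v, k, c) * dv * hours
        v += dv
    return e_kwh / 1000.0

def power_kw(v):
    """Hypothetical 2 MW turbine: cut-in 3 m/s, rated 12 m/s,
    cut-out 25 m/s, cubic ramp between cut-in and rated."""
    if v < 3.0 or v >= 25.0:
        return 0.0
    if v >= 12.0:
        return 2000.0
    return 2000.0 * ((v - 3.0) / (12.0 - 3.0)) ** 3
```

Replacing `weibull_pdf` with a mixture density changes only the integrand, which is why improvements in the fit can wash out of the integrated energy figure, as the paper concludes.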
International Nuclear Information System (INIS)
To weld steam-generating tubes of EM12 to a tubesheet provided with bosses of 2.25 Cr 1 Mo, STEIN INDUSTRIE has developed a process of internal welding without filler metal. Characterization tests carried out on test assemblies have shown the excellent metallurgical quality of welds performed by this process. High-temperature strength tests showed a safety margin compared with the results of the finite element calculations. The hypotheses made for these calculations, which take into account the elastoviscoplastic properties of the materials, and in particular the extension of the properties of 2.25 Cr 1 Mo to the weld, can therefore be applied to steam-generator sizing calculations. (orig.)
Liu, Feng; Gong, Daping; Zhang, Qian; Wang, Dawei; Cui, Mengmeng; Zhang, Zhiguo; Liu, Guanshan; Wu, Jinxia; Wang, Yuanying
2015-03-01
Tobacco (Nicotiana tabacum L.) is an ideal model system for molecular biological and genetic studies. In this study, activation tagging was used to generate approximately 100,000 transgenic tobacco plants. Southern blot analysis indicated that there were 1.6 T-DNA inserts per line on average in our transformed population. The phenotypes observed include abnormalities in leaf and flower morphology, plant height, flowering time, branching, and fertility. Among 6,000 plants in the T0 generation, 57 displayed obvious phenotypes. Among 4,105 lines in the T1 generation, 311 displayed abnormal phenotypes. Fusion primer and nested integrated PCR was used to identify 963 independent genomic loci of T-DNA insertion sites in 1,257 T1 lines. The distribution of T-DNA insertions was non-uniform and correlated well with the predicted gene density along each chromosome. The insertions were biased toward genic regions and noncoding regions within 5 kb of a gene. Fifteen plants that showed the same phenotype as their parent with a dominant pattern in the T2 generation were chosen randomly to detect the expression levels of genes adjacent to the T-DNA integration sites by semi-quantitative RT-PCR. Fifteen candidate genes were identified. Activation was observed in 7 out of the 15 adjacent genes, including one that was located 13.1 kb away from the enhancer sequence. The activation-tagged population described in this paper will be a highly valuable resource for tobacco functional genomics research using both forward and reverse genetic approaches. PMID:25408504
Gutschow, Christian; The ATLAS collaboration
2016-01-01
The Monte Carlo setups used by ATLAS to model boson+jets and multi-boson processes in 13 TeV pp collisions are described. Comparisons between data and several event generators are provided for key kinematic distributions at 7 TeV, 8 TeV and 13 TeV. Issues associated with sample normalisation and the evaluation of systematic uncertainties are also discussed.
Energy Technology Data Exchange (ETDEWEB)
Buchmayr, B.; Cerjak, H.; Wakonig, H. (Technische Univ., Graz (Austria)); Kleemaier, R.; Nowotny, P. (SGP-VA Energie- und Umwelttechnik GmbH, Vienna (Austria))
1990-09-01
This paper introduces an expert system (ES) for pre-location damage analysis of steam generators. It was developed in collaboration between Simmering-Graz-Pauker AG (SGP) and the Materials Information and Welding Technology Department of Graz Technical University. The paper shows power plant engineers the possible applications of this kind of system and the basic features for realization of the project. (orig.)
Mavko, Borut; Prošek, Andrej
2015-01-01
The Krško nuclear power plant has undertaken a major modernization project. The objectives of the project are: long-term stabilization of the plant's operation, uprating of the net electrical power output, higher availability and enhanced safety of the plant. The modernization also requires a thorough safety re-evaluation and therefore new thermal-hydraulic, mechanical and structural analyses. The thermal-hydraulic part of the safety analysis necessary for the steam generator replacement and ...
Technical and economic analyses of a hydrogen-fed gas turbine with steam injection and co-generation
International Nuclear Information System (INIS)
Enel has been working on a hydrogen programme dealing with both hydrogen production and hydrogen use. The first phase concerns a hydrogen-fed, gas turbine-based co-generative cycle, in which steam injection into the gas turbine is adopted in order to combine high process efficiency with very low nitrogen oxide emissions. This paper presents the main results of the thermodynamic analysis of the co-generative cycle and focuses on the economic evaluation of the plant under different economic and regulatory scenarios. Results show that hydrogen can be used very effectively in this kind of plant: electrical efficiency can reach 40% and global co-generation efficiency can exceed 90% in relatively small-scale power plants. However, the very high specific investment costs associated with small plants require promotion policies if investments are to be profitable. In the future, this kind of plant could provide densely populated areas with electricity and heat with no additional side-effects on the environment. (author)
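The quoted efficiencies combine as simple first-law ratios; the sketch below just formalizes the accounting. The fuel input figure is an invented example, taken on the same (e.g. LHV) basis as the outputs.

```python
def cogeneration_efficiencies(fuel_kw, electric_kw, heat_kw):
    """First-law efficiencies of a co-generative cycle: electrical
    efficiency (electricity out / fuel in) and overall co-generation
    efficiency ((electricity + useful heat) / fuel in)."""
    return electric_kw / fuel_kw, (electric_kw + heat_kw) / fuel_kw
```

For instance, a plant burning 10 MW of hydrogen (LHV basis) that delivers 4 MW of electricity and 5 MW of useful heat reaches the 40% electrical and 90% overall figures cited in the abstract.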
International Nuclear Information System (INIS)
In this study, a small-scale hybrid solar heating, chilling and power generation system, including a parabolic trough solar collector with cavity receiver, a helical screw expander and a silica gel-water adsorption chiller, was proposed and extensively investigated. The system has the merit of running the power generation cycle at a lower temperature level with solar energy more efficiently, and it can provide both thermal energy and power for remote off-grid regions. A case study was carried out to evaluate the annual energy and exergy efficiency of the system under the climate of the northwestern region of China. It is found that the main energy and exergy losses both take place at the parabolic trough collector, amounting to 36.2% and 70.4%, respectively. It is also found that the studied system can have a higher solar energy conversion efficiency than a conventional solar thermal power generation system alone: the energy efficiency can be increased from 10.2% to 58.0%, and the exergy efficiency from 12.5% to 15.2%. Moreover, an economic analysis in terms of cost and payback period (PP) has been carried out. The study reveals that the PP of the proposed system is about 18 years under present energy price conditions. A sensitivity analysis shows that if the interest rate decreases to 3% or energy prices increase by 50%, the PP will be less than 10 years.
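The gap between the 58.0% energy efficiency and the 15.2% exergy efficiency comes from weighting heat by its Carnot factor. A minimal sketch (the reference temperature T0 = 298.15 K and the example numbers are our assumptions, not values from the study):

```python
def exergy_of_heat(q_kw, t_kelvin, t0_kelvin=298.15):
    """Exergy (maximum extractable work) of heat q_kw delivered at
    temperature t_kelvin, relative to a dead state at t0_kelvin:
    Ex = Q * (1 - T0/T), the Carnot factor."""
    return q_kw * (1.0 - t0_kelvin / t_kelvin)
```

For example, 100 kW of heat supplied at 353 K (roughly an 80 °C hot-water supply) carries only about 15.5 kW of exergy, so a hybrid system can look highly efficient in energy terms while remaining far less so in exergy terms.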
International Nuclear Information System (INIS)
In this research, a vortex generator heat exchanger is used to recover exergy from the exhaust of an OM314 diesel engine. Twenty vortex generators with a 30° angle of attack are used to increase heat recovery while keeping the back pressure in the exhaust low. The experiments are performed for five engine loads (0, 20, 40, 60 and 80% of full load), two exhaust gas fractions (50 and 100%) and four water mass flow rates (50, 40, 30 and 20 g/s). After a thermodynamic analysis of the obtained data, an optimization study based on Central Composite Design (CCD) is performed, owing to the complex effect of engine load and water mass flow rate on exergy recovery and irreversibility, to find the best operating condition. - Highlights: • A vortex generator heat exchanger is used for diesel exhaust heat recovery. • A thermodynamic analysis is performed on the experimental data. • Exergy recovery and irreversibility are calculated for different exhaust gas fractions. • An optimization study is performed using the response surface method
Energy Technology Data Exchange (ETDEWEB)
Monniaux, D.
2009-06-15
Software operating critical systems (aircraft, nuclear power plants) should not fail - whereas most computerised systems of daily life (personal computer, ticket vending machines, cell phone) fail from time to time. This is not a simple engineering problem: it has been known since the works of Turing and Cook that proving that programs work correctly is intrinsically hard. In order to solve this problem, one needs methods that are, at the same time, efficient (moderate costs in time and memory), safe (all possible failures should be found), and precise (few warnings about nonexistent failures). In order to reach a satisfactory compromise between these goals, one draws on fields as diverse as formal logic, numerical analysis and 'classical' algorithmics. From 2002 to 2007 I participated in the development of the Astree static analyser. This suggested to me a number of side projects, both theoretical and practical (use of formal proof techniques, analysis of numerical filters...). More recently, I became interested in modular analysis of numerical properties and in the application of constraint-solving techniques (semi-definite programming, SAT and SAT modulo theories) to program analysis. (author)
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2011-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.
Probability and paternity testing.
Elston, R C
1986-01-01
A probability can be viewed as an estimate of a variable that is sometimes 1 and sometimes 0. To have validity, the probability must equal the expected value of that variable. To have utility, the average squared deviation of the probability from the value of that variable should be small. It is shown that probabilities of paternity calculated by the use of Bayes' theorem under appropriate assumptions are valid, but they can vary in utility. In particular, a recently proposed probability of p...
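The validity/utility distinction drawn in this abstract can be illustrated with a short simulation: a stated probability is valid when it equals the long-run frequency of the 0/1 variable, and its utility improves as the average squared deviation (the Brier score) shrinks. The numbers below are illustrative, not from the paper:

```python
import random

random.seed(0)

def mean_squared_deviation(prob, outcomes):
    # Average squared deviation of a stated probability from the 0/1 outcomes.
    return sum((prob - y) ** 2 for y in outcomes) / len(outcomes)

# Outcomes that are 1 with frequency 0.7: the probability 0.7 is "valid".
outcomes = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]
freq = sum(outcomes) / len(outcomes)
print(freq)  # close to 0.7

# The valid probability also has the smaller squared deviation (more utility).
print(mean_squared_deviation(0.7, outcomes))  # close to 0.7 * 0.3 = 0.21
print(mean_squared_deviation(0.5, outcomes))  # exactly 0.25, i.e. worse
```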
Logical Probability Preferences
Saad, Emad
2013-01-01
We present a unified logical framework for representing and reasoning about both quantitative and qualitative probability preferences in probability answer set programming, called probability answer set optimization programs. The proposed framework is vital to allow defining quantitative probability preferences over the possible outcomes of qualitative preferences. We show the application of probability answer set optimization programs to a variant of the well-known nurse rostering problem, c...
International Nuclear Information System (INIS)
To improve the performance of electrochemical devices such as batteries and fuel cells, it is essential to understand reaction hierarchies over wide temporal and spatial ranges. To this end, operando measurement techniques have been developed that enable analysis of the electrode/electrolyte interface of the reaction site, phase transitions of active materials, and macro reactions within real electrodes over various spatial and temporal scales. These analytic techniques pioneer a new way of performing kinetic analysis by introducing axes of space and time into reaction analyses, and are applicable to various types of electrochemical devices. Moreover, a magnesium rechargeable battery featuring the merits of high theoretical energy density, high safety, and easily acquirable raw materials was developed by employing these operando analytic techniques. (author)
Östblom, Göran; Ljunggren Söderman, Maria; Sjöström, Magnus
2010-01-01
Parallel to the efforts of the EU to achieve a significant and overall reduction of waste quantities within the EU, the Swedish parliament enacted an environmental quality objective stating that ‘the total quantity of waste must not increase …’, i.e. an eventual absolute decoupling of waste generation from GDP. The decoupling issue is addressed, in the present paper, by assessing future waste quantities for a number of economic scenarios of the Swedish economy to 2030 with alternative assump...
Uniqueness in ergodic decomposition of invariant probabilities
Zimmermann, Dieter
1992-01-01
We show that for any set of transition probabilities on a common measurable space and any invariant probability, there is at most one representing measure on the set of extremal, invariant probabilities with the $\\sigma$-algebra generated by the evaluations. The proof uses nonstandard analysis.
Directory of Open Access Journals (Sweden)
Roxana Yockteng
2013-12-01
Full Text Available Premise of the study: To study gene expression in plants, high-quality RNA must be extracted in quantities sufficient for subsequent cDNA library construction. Field-based collections are often limited in quantity and quality of tissue and are typically preserved in RNAlater. Obtaining sufficient and high-quality yield from variously preserved samples is essential to studies of comparative biology. We present a protocol for the extraction of high-quality RNA from even the most recalcitrant plant tissues. Methods and Results: Tissues from mosses, cycads, and angiosperm floral organs and leaves were preserved in RNAlater or frozen fresh at −80°C. Extractions were performed and quality was measured for yield and purity. Conclusions: This protocol results in the extraction of high-quality RNA from a variety of plant tissues representing vascular and nonvascular plants. RNA was used for cDNA synthesis to generate libraries for next-generation sequencing and for expression studies using quantitative PCR (qPCR and semiquantitative reverse transcription PCR (RT-PCR.
Agreeing Probability Measures for Comparative Probability Structures
Wakker, Peter
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a $\sigma$-algebra) have agreeing probability measures. Although this was often claimed in the literature, the proofs the author encountered are valid only for $\sigma$-algebras, not for the general case. Here the proof of Niiniluoto (1972) is supplemented. Furthermore an example is presented that reveals many misunderstandings in the literature. At the...
Hedesan, Georgiana D.
2014-01-01
This article discusses a Latin manuscript that can be found in the Jan Baptist Van Helmont (1579–1644) archives in Mechelen (Malines), Belgium. The manuscript bears no author and no title, and begins with the words ‘Exterior homo’, hence being referred by this provisional title in the analysis. Ecclesiastical prosecutors investigating Van Helmont for heresy in 1634 considered that it was written by him, but this was vehemently denied by the Flemish physician. The present article takes a first detailed look at the content of the treatise and ideas contained therein. It hence identifies the manuscript as belonging to a seventeenth-century physician influenced by the ideas of Theophrastus Paracelsus (1493–1541) and his interpreter Petrus Severinus (1542–1602), and containing a complex medical philosophy drawn on alchemical thought. Thus, the anonymous author presents a comprehensive view on the nature and structure of man, as well as an idiosyncratic theory of human generation. Following the analysis of the treatise, the article further compares it with the known works of J.B. Van Helmont, and finds that it is very similar to his ideas. Hence, the article concludes that it is ‘likely’ that the manuscript is indeed written by Van Helmont, although lack of direct evidence prevents certain attribution. PMID:25045180
Bassanezi, Renato B; Bergamin Filho, Armando; Amorim, Lilian; Gimenes-Fernandes, Nelson; Gottwald, Tim R; Bové, Joseph M
2003-04-01
grafted on Rangpur lime. Based on the symptoms of CSD and on its spatial and temporal patterns, our hypothesis is that CSD may be caused by a similar but undescribed pathogen, such as a virus, probably vectored by insects such as aphids, via spatial processes similar to those affecting CTV. PMID:18944366
Benci, Vieri; Wenmackers, Sylvia
2011-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
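A one-screen illustration of the scoring-rule idea in this abstract: the logarithmic score is a proper scoring rule, so its expected loss is minimized when the stated probability equals the actual one. This toy check is generic, not the paper's martingale-based method:

```python
import math

def expected_log_loss(stated_p, true_p):
    # Expected negative log-likelihood when the event occurs with
    # probability true_p but the forecaster states stated_p.
    return -(true_p * math.log(stated_p)
             + (1 - true_p) * math.log(1 - stated_p))

true_p = 0.3
candidates = [0.1, 0.2, 0.3, 0.4, 0.5]
best = min(candidates, key=lambda p: expected_log_loss(p, true_p))
print(best)  # 0.3: the true probability wins under a proper scoring rule
```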
Estimating Small Probabilities for Langevin Dynamics
Aristoff, David
2012-01-01
The problem of estimating small transition probabilities for overdamped Langevin dynamics is considered. A simplification of Girsanov's formula is obtained in which the relationship between the infinitesimal generator of the underlying diffusion and the change of probability measure corresponding to a change in the potential energy is made explicit. From this formula an asymptotic expression for transition probability densities is derived. Separately the problem of estimating the probability ...
Cho, Namjin; Hwang, Byungjin; Yoon, Jung-ki; Park, Sangun; Lee, Joongoo; Seo, Han Na; Lee, Jeewon; Huh, Sunghoon; Chung, Jinsoo; Bang, Duhee
2015-01-01
Interpreting epistatic interactions is crucial for understanding the evolutionary dynamics of complex genetic systems and unveiling the structure and function of genetic pathways. Although high-resolution mapping of en masse variant libraries enables molecular biologists to address genotype-phenotype relationships, long-read sequencing technology remains indispensable for assessing the functional relationship between mutations that lie far apart. Here, we introduce JigsawSeq for multiplexed sequence identification of pooled gene variant libraries by combining a codon-based molecular barcoding strategy and de novo assembly of short-read data. We first validate JigsawSeq on small sub-pools and observe high precision and recall at various experimental settings. With extensive simulations, we then apply JigsawSeq to large-scale gene variant libraries to show that our method can be reliably scaled using next-generation sequencing. JigsawSeq may serve as a rapid screening tool for functional genomics and offer the opportunity to explore evolutionary trajectories of protein variants. PMID:26387459
Der, Zoltan A.; Baumgardt, Douglas R.
1997-07-01
Machinery typically generates mechanical vibrations at multiple, harmonically related frequencies which arise from various mechanically coupled moving components of machines or characteristic nonlinearities in their operational loads. These mechanical vibrations propagate from their origin through the air as acoustic waves and through the earth as various types of seismic waves. Of the two modes of propagation, the seismic mode is the more complicated, since the same harmonic may propagate simultaneously in various wave types (compressional waves, shear waves and various surface wave types) with differing propagation velocities. Moreover, air-to-ground coupling has been shown to occur in some cases. The consequence of this multi-mode propagation is that standing-wave interference patterns are set up over the terrain surrounding the sources, which complicates the frequency-wavenumber analysis and identification of the signals. Since the set of harmonics emitted by a given type of machinery tends to be phase-coupled, higher-order spectral analysis offers a means for detecting and separating such coupled sets and reducing much of the Gaussian background noise and uncoupled sinusoidal noise components. In this paper we utilize sections through bispectral estimates obtained from continuous signals, with durations exceeding a minute, from various types of machinery.
Estimating extreme flood probabilities
International Nuclear Information System (INIS)
Estimates of the exceedance probabilities of extreme floods are needed for the assessment of flood hazard at Department of Energy facilities. A new approach using a joint probability distribution of extreme rainfalls and antecedent soil moisture conditions, along with a rainfall runoff model, provides estimates of probabilities for floods approaching the probable maximum flood. This approach is illustrated for a 570 km2 catchment in Wisconsin and a 260 km2 catchment in Tennessee
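The joint-probability approach sketched in this abstract can be caricatured with a Monte Carlo estimate: sample rainfall and antecedent soil moisture jointly, push each pair through a runoff model, and count threshold exceedances. The distributions, runoff function, and threshold below are invented placeholders, not the calibrated models used for the Wisconsin and Tennessee catchments:

```python
import random

random.seed(1)

def runoff(rainfall_mm, soil_moisture):
    # Toy runoff model: wetter antecedent soil converts a larger
    # fraction of rainfall into flood runoff.
    return rainfall_mm * (0.2 + 0.6 * soil_moisture)

n, threshold = 200_000, 180.0
exceed = 0
for _ in range(n):
    rainfall = random.expovariate(1 / 40.0)  # assumed rainfall tail, mean 40 mm
    moisture = random.random()               # assumed uniform antecedent moisture
    if runoff(rainfall, moisture) > threshold:
        exceed += 1
print(exceed / n)  # small exceedance probability for this toy setup
```

Sampling the two drivers jointly, rather than taking each at its extreme, is what keeps the resulting exceedance probabilities from being needlessly conservative.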
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Gielen, Fabrice; Buryska, Tomas; Van Vliet, Liisa; Butz, Maren; Damborsky, Jiri; Prokop, Zbynek; Hollfelder, Florian
2015-01-01
Analysis of concentration dependencies is key to the quantitative understanding of biological and chemical systems. In experimental tests involving concentration gradients such as inhibitor library screening, the number of data points and the ratio between the stock volume and the volume required in each test determine the quality and efficiency of the information gained. Titerplate assays are currently the most widely used format, even though they require microlitre volumes. Compartmentalization of reactions in pico- to nanoliter water-in-oil droplets in microfluidic devices provides a solution for massive volume reduction. This work addresses the challenge of producing microfluidic-based concentration gradients in a way that every droplet represents one unique reagent combination. We present a simple microcapillary technique able to generate such series of monodisperse water-in-oil droplets (with a frequency of up to 10 Hz) from a sample presented in an open well (e.g., a titerplate). Time-dependent variation of the well content results in microdroplets that represent time capsules of the composition of the source well. By preserving the spatial encoding of the droplets in tubing, each reactor is assigned an accurate concentration value. We used this approach to record kinetic time courses of the haloalkane dehalogenase DbjA and analyzed 150 combinations of enzyme/substrate/inhibitor in less than 5 min, resulting in conclusive Michaelis-Menten and inhibition curves. Avoiding chips and merely requiring two pumps, a magnetic plate with a stirrer, tubing, and a pipet tip, this easy-to-use device rivals the output of much more expensive liquid handling systems using a fraction (∼100-fold less) of the reagents consumed in microwell format. PMID:25496166
International Nuclear Information System (INIS)
We report a systematic study of the effect of cobalt concentration in the growth solution on the crystallization, growth, and optical properties of hydrothermally synthesized Zn1−xCoxO [0 ≤ x ≤ 0.40, x is the weight (wt.) % of Co in the growth solution] nanorods. A dilute Co concentration of 1 wt. % in the growth solution enhances the bulk crystal quality of ZnO nanorods, while a high wt. % leads to distortion in the ZnO lattice that depresses the crystallization, growth, and surface structure quality of ZnO. Although the Co concentration in the growth solution varies from 1 to 40 wt. %, the real doping concentration is limited to 0.28 at. %, owing to the low growth temperature of 80 °C. The enhancement in the crystal quality of ZnO nanorods at dilute Co concentration in the solution is due to strain relaxation; the strain is significantly higher for ZnO nanorods prepared without, and with high wt. % of, Co in the growth solution. Second harmonic generation is used to investigate the net dipole distribution from these coatings, which provides detailed information about the bulk and surface structure quality of the ZnO nanorods at the same time. High-quality ZnO nanorods are fabricated by a low-temperature (80 °C) hydrothermal synthesis method, and no post-synthesis treatment is needed for further crystallization. Therefore, this method is advantageous for the growth of high-quality ZnO coatings on plastic substrates, which may lead toward application in flexible electronics
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Introduction to probability models
Ross, Sheldon M
2006-01-01
Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
International Nuclear Information System (INIS)
In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state, without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also of Palma, Suominen, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also to a trapped-ion quantum register coupled to the ion vibrational modes. (author)
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
On Quantum Conditional Probability
Directory of Open Access Journals (Sweden)
Isabel Guerra Bobo
2013-02-01
Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
Prabhu, Narahari
2011-01-01
Recent research in probability has been concerned with applications such as data mining and finance models. Some aspects of the foundations of probability theory have receded into the background. Yet, these aspects are very important and have to be brought back into prominence.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Directory of Open Access Journals (Sweden)
Elena Druica
2007-05-01
Full Text Available The science of probabilities has earned a special place because it has tried, through its concepts, to build a bridge between theory and experiment. As a formal notion which by definition does not lead to polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. Usually, the economic literature brings into discussion two interpretations of the concept of probability: the objective interpretation, often found under the name of frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications for economics.
Tokinaga, Shozo; Ikeda, Yoshikazu
In investments, it is not easy to identify traders' behavior from stock prices, and agent systems may help us. This paper deals with discriminant analyses of stock prices using the multifractality of time series generated via multi-agent systems and interpolation based on Wavelet Transforms. We assume five types of agents, a part of whom prefer forecast equations or production rules. It is then shown that the time series of the artificial stock price is revealed to be a multifractal time series whose features are defined by the Hausdorff dimension D(h). As a result, we see the relationship between the reliability (reproducibility) of multifractality and D(h) given a sufficient number of time series data points. However, since sufficient samples are generally needed to estimate D(h), we use interpolations of multifractal time series based on the Wavelet Transform.
International Nuclear Information System (INIS)
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
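The chaos-as-randomness idea in this abstract can be illustrated with the fully chaotic logistic map: its invariant density is symmetric about 1/2, so thresholding the orbit yields approximately fair coin flips without any dedicated random number generator. This is a generic illustration, not the authors' non-Lipschitz construction:

```python
def chaotic_bits(x0, n):
    """Iterate the fully chaotic logistic map x -> 4x(1-x) and emit one
    bit per step by thresholding the state at 1/2."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

bits = chaotic_bits(0.123456789, 100_000)
print(sum(bits) / len(bits))  # close to 0.5 for a generic seed
```

In exact arithmetic this map is conjugate to the doubling map, so the thresholded bits are i.i.d. fair coin flips; in floating point the statistics are only approximate.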
Probability elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Stochastic Programming with Probability
Andrieu, Laetitia; Vázquez-Abad, Felisa
2007-01-01
In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...
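The finite-difference gradient estimator for a probability constraint mentioned in this abstract can be sketched as follows. This is a minimal illustration under an assumed toy failure model (failure when a Uniform(0,1) load exceeds a capacity parameter theta), not the authors' actual algorithm or problem:

```python
import random

def failure_probability(theta, n=100_000, seed=0):
    """Monte Carlo estimate of P(failure) for a toy model in which
    failure occurs when a Uniform(0,1) load exceeds the capacity theta.
    For this model the true value is 1 - theta on (0, 1)."""
    rng = random.Random(seed)
    return sum(rng.random() > theta for _ in range(n)) / n

def fd_gradient(theta, h=0.05):
    """Central finite-difference estimate of dP(failure)/dtheta.
    Reusing the same seed gives common random numbers across the two
    evaluations, which sharply reduces the variance of the difference."""
    return (failure_probability(theta + h) - failure_probability(theta - h)) / (2 * h)
```

For the toy model the true gradient is -1 everywhere in (0, 1), so `fd_gradient(0.5)` should land close to -1; the bias-variance trade-off in the step size `h` is exactly the issue the abstract's convolution and finite-difference estimators address.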
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
2014-01-01
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Marshall, Jennings B.
2007-01-01
This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
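The expected-value computations described above are easy to reproduce. The sketch below assumes an American wheel with 38 pockets; the payouts shown (35:1 for a single number, 1:1 for red) are the standard ones, and the helper name is ours, not the article's:

```python
# Expected value per 1-unit stake for American-roulette bets (38 pockets).
def expected_value(payout, winning_pockets, total_pockets=38):
    p_win = winning_pockets / total_pockets
    return p_win * payout - (1 - p_win)  # win payout, or lose the stake

ev_straight_up = expected_value(35, 1)   # single number, pays 35:1
ev_red = expected_value(1, 18)           # red, pays 1:1
# Both equal -2/38, about -5.26 cents lost per dollar staked.
```

That every standard bet shares the same house edge, despite very different variances, is the pedagogical point such examples usually make.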
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Bayesian default probability models
Andrlíková, Petra
2014-01-01
This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...
Directory of Open Access Journals (Sweden)
VICTORIA EUGENIA VALLEJO
Full Text Available This study evaluated the performance of two tetrazolium salts, a traditional one (INT) and a new-generation one (XTT), for estimating the density of hydrocarbon (HC) degrading microorganisms in soils using the Most Probable Number (MPN) technique. Ninety-six composite soil samples from the Colombian coffee-growing region (Ecorregión Cafetera) were analyzed. Degrading microorganisms were recovered on minimal salt medium under an HC-saturated atmosphere, and their HC-degrading capacity was confirmed by successive subcultures on the same medium using diesel as the only carbon source. Counts obtained with the two salts were not significantly different (Student's t test, p < 0.05), but XTT allowed easier visualization of positive wells owing to the solubility of the reduced product. A greater percentage of isolates was obtained using XTT (67%), which suggests that the salt type is relevant for recovering these microorganisms. Additionally, the cell detection limit and the optimal XTT concentration and incubation times for detecting activity were evaluated, using a microplate format for hydrocarbon-degrading microorganisms with Acinetobacter sp. An inhibitory effect was observed in the recovery of cultivable cells when XTT
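The MPN value behind this technique is defined as the maximum-likelihood estimate of organism density given the pattern of positive tubes across dilutions. The sketch below is the standard ML derivation (bisection on the score function), not code from the cited study; the volumes and tube counts in the example are hypothetical:

```python
import math

def mpn_estimate(positives, tubes, volumes, lo=1e-9, hi=1e9, iters=200):
    """Maximum-likelihood Most Probable Number (organisms per unit volume).
    positives[i] of tubes[i] tubes were positive at inoculum volume volumes[i].
    A tube is positive with probability 1 - exp(-mu * v); the score below is
    the derivative of the log-likelihood, positive below the MLE and negative
    above it, so we bisect on its sign (geometrically, to span many decades)."""
    def score(mu):
        s = 0.0
        for g, n, v in zip(positives, tubes, volumes):
            if g > 0:
                s += g * v / (1.0 - math.exp(-mu * v))
            s -= n * v
        return s
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)
```

With a single dilution the estimate has the closed form -ln(1 - g/n)/v, which gives a convenient check: 3 positives out of 5 tubes at unit volume yields ln(2.5) ≈ 0.916 organisms per unit volume.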
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving a...
International Nuclear Information System (INIS)
New Zealand population and cancer statistics have been used to derive the probability that an existing cancer in an individual was the result of a known exposure to radiation. Hypothetical case histories illustrate how sex, race, age at exposure, age at presentation with disease, and the type of cancer affect this probability. The method can be used now to identify claims in which a link between exposure and disease is very strong or very weak, and the types of cancer and population sub-groups for which radiation is most likely to be the causative agent. Advantages and difficulties in using a probability of causation approach in legal or compensation hearings are outlined. The approach is feasible for any carcinogen for which reasonable risk estimates can be made
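The probability-of-causation idea described above reduces to a simple ratio once the baseline and radiation-induced cancer rates for the relevant sex, race, age and cancer type have been looked up. The sketch below shows only that final arithmetic, with illustrative numbers; the demographic rate tables themselves are what the cited work derives:

```python
def probability_of_causation(baseline_rate, radiation_excess_rate):
    """Assigned share: P(this cancer was caused by the exposure | cancer present),
    assuming the baseline and radiation-induced rates add."""
    return radiation_excess_rate / (baseline_rate + radiation_excess_rate)

def poc_from_err(err):
    """Equivalent form in terms of excess relative risk ERR = excess/baseline."""
    return err / (1.0 + err)
```

For example, an excess rate half the size of the baseline rate (ERR = 0.5) gives a probability of causation of 1/3, which a hearing might treat as a weak link; an ERR of 9 gives 0.9, a strong one.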
Minimum Probability Flow Learning
Sohl-Dickstein, Jascha; DeWeese, Michael R
2009-01-01
Learning in probabilistic models is often severely hampered by the general intractability of the normalization factor and its derivatives. Here we propose a new learning technique that obviates the need to compute an intractable normalization factor or sample from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed data distribution into the model distribution, and then setting as the objective the minimization of the initial flow of probability away from the data distribution. Score matching, minimum velocity learning, and certain forms of contrastive divergence are shown to be special cases of this learning technique. We demonstrate the application of minimum probability flow learning to parameter estimation in Ising models, deep belief networks, multivariate Gaussian distributions and a continuous model with a highly general energy function defined as a power series. In the Ising model case, minimum probability flow learning outperforms cur...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Morris, Danielle H; Jones, Michael E; Schoemaker, Minouk J; McFadden, Emily; Ashworth, Alan; Swerdlow, Anthony J
2012-05-15
The authors examined the effect of women's lifestyles on the timing of natural menopause using data from a cross-sectional questionnaire used in the United Kingdom-based Breakthrough Generations Study in 2003-2011. The analyses included 50,678 women (21,511 who had experienced a natural menopause) who were 40-98 years of age at study entry and did not have a history of breast cancer. Cox competing risks proportional hazards models were fitted to examine the relation of age at natural menopause to lifestyle and anthropometric factors. Results were adjusted for age at reporting, smoking status at menopause, parity, and body mass index at age 40 years, as appropriate. All P values were 2-sided. High adult weight (P(trend) and being vegetarian (P < 0.001) were associated with older age at menopause. Neither height nor history of an eating disorder was associated with menopausal age. These findings show the importance of lifestyle factors in determining menopausal age. PMID:22494951
Failure probability of ceramic coil springs
Nohut, Serkan; Schneider, Gerold A.
2009-01-01
Ceramic springs are commercially available, and a detailed reliability analysis of these components would be useful for their introduction in new applications. In this paper, analytical and numerical analyses of the failure probability of coil springs under compression are presented. Based on analytically derived relationships and numerically calculated results, fitting functions for volume and surface flaws are introduced which enable prediction of the failure probability of cera...
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.
Probably Almost Bayes Decisions
DEFF Research Database (Denmark)
Anoulova, S.; Fischer, Paul; Poelt, S.; Simon, H.- U.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Quznetsov, Gunn
1998-01-01
The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.
Quznetsov, G. A.
2003-01-01
The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.
Transition probabilities for atoms
International Nuclear Information System (INIS)
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Energy Technology Data Exchange (ETDEWEB)
Grunewald, Thomas; Finke, Robert; Graetz, Rainer
2010-07-01
Mechanically generated sparks are a potential source of ignition in highly combustible areas. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks, for a material combination of unalloyed steel/unalloyed steel and kinetic impact energies between 3 and 277 Nm, could be determined with acceptable statistical tolerance. In addition, the explosiveness of non-oxidized particles at increased temperatures in over-stoichiometric mixtures was proven. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. Also, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable particle characteristics that correlates with a distinct increase of the ignition probability.
Waste Package Misload Probability
Energy Technology Data Exchange (ETDEWEB)
J.K. Knudsen
2001-11-20
The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
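The final step described above is a frequency estimate: observed events divided by total assembly movements. A minimal sketch, with made-up counts rather than the report's actual data, might also add the standard "rule of three" bound for the zero-event case:

```python
def event_probability(observed_events, total_movements):
    """Point estimate of the per-movement probability from industry event counts."""
    return observed_events / total_movements

def rule_of_three_upper_bound(total_movements):
    """Approximate 95% upper confidence bound on the per-movement probability
    when zero events were observed in total_movements trials."""
    return 3.0 / total_movements

# Hypothetical numbers, for illustration only:
p_misload = event_probability(4, 2_000_000)   # 4 misloads in 2 million moves
```

The rule-of-three bound matters in exactly this setting, where rare events may never have been observed in the available operating history.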
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Waste Package Misload Probability
International Nuclear Information System (INIS)
The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Objectifying Subjective Probabilities
Czech Academy of Sciences Publication Activity Database
Childers, Timothy
Dordrecht: Springer, 2012 - (Weber, M.; Dieks, D.; Gonzalez, W.; Hartman, S.; Stadler, F.; Stöltzner, M.), s. 19-28. (The Philosophy of Science in a European Perspective. 3). ISBN 978-94-007-3029-8. [Pluralism in the Foundations of Statistics. Canterbury (GB), 09.09.2010-10.09.2010] R&D Projects: GA ČR(CZ) GAP401/10/1504 Institutional support: RVO:67985955 Keywords: probabilities * direct inference Subject RIV: AA - Philosophy; Religion
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
A BAYESIAN ABDUCTION MODEL FOR EXTRACTING THE MOST PROBABLE EVIDENCE TO SUPPORT SENSEMAKING
Directory of Open Access Journals (Sweden)
Paul Munya
2015-01-01
Full Text Available In this paper, we discuss the development of a Bayesian Abduction Model of Sensemaking Support (BAMSS) as a tool for information fusion to support prospective sensemaking. Currently, BAMSS can identify the Most Probable Explanation from a Bayesian Belief Network (BBN) and extract the prevalent conditional probability values to help sensemaking analysts understand the cause-effect structure of the adversary information. Actual vignettes from databases of modern insurgencies and asymmetric warfare are used to validate the performance of BAMSS. BAMSS computes the posterior probability of the network edges and performs information fusion using a clustering algorithm. In the model, the friendly force commander uses the adversary information to prospectively make sense of the enemy's intent. Sensitivity analyses were used to confirm the robustness of BAMSS in generating the Most Probable Explanations from a BBN through abductive inference. The simulation results demonstrate the utility of BAMSS as a computational tool to support sensemaking.
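The Most Probable Explanation (MPE) task the abstract refers to is the search for the hidden-variable assignment that maximizes the joint probability given the evidence. A minimal brute-force sketch on a hypothetical two-node network (our invented intent/posture example, not BAMSS's actual BBN) looks like this:

```python
# Hypothetical two-node network: hidden Intent -> observed Posture.
P_INTENT = {"attack": 0.3, "feint": 0.7}
P_POSTURE_GIVEN_INTENT = {
    ("attack", "massing"): 0.8, ("attack", "dispersing"): 0.2,
    ("feint", "massing"): 0.4, ("feint", "dispersing"): 0.6,
}

def most_probable_explanation(observed_posture):
    """Brute-force MPE: the hidden assignment maximizing the joint probability
    P(intent) * P(observed posture | intent)."""
    return max(P_INTENT,
               key=lambda i: P_INTENT[i] * P_POSTURE_GIVEN_INTENT[(i, observed_posture)])
```

Even this toy shows why MPE differs from reading off the likelihood alone: "massing" is more likely under "attack" (0.8 vs 0.4), yet the strong prior on "feint" makes "feint" the most probable explanation (0.7 x 0.4 = 0.28 beats 0.3 x 0.8 = 0.24). Real BBNs replace the enumeration with exact or approximate abductive inference.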
Probability mapping of contaminants
International Nuclear Information System (INIS)
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
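The post-processing step described above, turning a stack of equally likely geostatistical simulations into a probability map, is conceptually simple: for each parcel, count the fraction of simulations in which the simulated value exceeds the threshold. A minimal per-parcel sketch (the simulation values below are placeholders, not Fernald data):

```python
def exceedance_probability(simulated_values, threshold):
    """Fraction of equally likely simulations in which the simulated
    concentration for one parcel exceeds a cleanup or hazard threshold."""
    return sum(v > threshold for v in simulated_values) / len(simulated_values)

# One parcel, four hypothetical simulated concentrations:
p = exceedance_probability([1.0, 2.0, 3.0, 4.0], threshold=2.5)  # 0.5
```

Mapping this quantity over all parcels yields exactly the kind of probability map the abstract describes, which can then feed cost-based decision models for selective remediation.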
Measurement Uncertainty and Probability
Willink, Robin
2013-02-01
Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.
Integration, measure and probability
Pitt, H R
2012-01-01
This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t
Probability Theories and the Justification of Theism
Portugal, Agnaldo Cuoco
2003-01-01
In the present paper I intend to analyse, criticise and suggest an alternative to Richard Swinburne"s use of Bayes"s theorem to justify the belief that there is a God. Swinburne"s contribution here lies in the scope of his project and the interpretation he adopts for Bayes"s formula, a very important theorem of the probability calculus.
Probable maximum flood control
International Nuclear Information System (INIS)
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order {exp(-c L^{d+1})} where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the {d=1} case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case {d ≥ 2} are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Savage's Concept of Probability
Institute of Scientific and Technical Information of China (English)
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
Cross section probability tables in multi-group transport calculations
International Nuclear Information System (INIS)
The use of cross section probability tables in multigroup transport calculations is presented. Emphasis is placed on how probability table parameters are generated in a multigroup cross section processor and how existing transport codes must be modified to use them. In order to illustrate the accuracy obtained by using probability tables, results are presented for a variety of neutron and photon transport problems
International Nuclear Information System (INIS)
In Saudi Arabia, fossil fuel is the main source of power generation. Due to the huge economic and demographic growth, electricity consumption in Saudi Arabia has increased and should continue to increase at a very fast rate. At the moment, more than half a million barrels of oil per day are used directly for power generation. Herein, we assess the power generation situation of the country and its future conditions through a modelling approach. For this purpose, we present the current situation by detailing the existing electricity generation mix. Then we develop an optimization model of the power sector which aims to define the best production and investment pattern to reach the expected demand. Subsequently, we carry out a sensitivity analysis so as to evaluate the robustness of the model, taking into account the variability of integrating other alternative (non-fossil-fuel-based) resources. The results point out that the choices of investment in the power sector strongly affect the potential oil exports of Saudi Arabia. For instance, by decarbonizing half of its generation mix, Saudi Arabia can release around 0.5 million barrels of oil equivalent per day from 2020. Moreover, the total power generation cost can be reduced by up to around 28% per year from 2030 if Saudi Arabia manages to attain the most optimal generation mix structure introduced in the model (50% of power from renewables and nuclear power plants and 50% from fossil power plants). - Highlights: • We model the current and future power generation situation of Saudi Arabia. • We take into account the integration of other alternative resources. • We consider different scenarios of power generation structure for the country. • An optimal generation mix can release a considerable amount of oil for export
RANDOM VARIABLE WITH FUZZY PROBABILITY
Institute of Scientific and Technical Information of China (English)
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a fuzzy-number probability set was given. Going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem, with closure of the operations on fuzzy probabilities, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP are closed under fuzzy probability operations; as a result, the foundation is laid for perfecting fuzzy probability operation methods.
Probability, Logic and Objectivity - The concept of probability of Carl Stumpf
Benedictus, Fedde
2015-01-01
We will show that Carl Stumpf's interpretation of the concept of probability is best understood as that of an objective Bayesian. First we analyse Stumpf's work in relation to that of his contemporary Johannes von Kries, and after that we will discuss various ways in which Stumpf's probability-concept has been construed. By showing that the construals of Stumpf's account by Hans Reichenbach, Richard von Mises and Andreas Kamlah are unfair - and at some points incorrect - we uncover the aspects that are essential to Stumpf's probability interpretation.
The Logic of Parametric Probability
Norman, Joseph W
2012-01-01
The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.
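The abstract's "direct probability encoding" of logical statements as constraints on conditional probabilities can be illustrated with a toy calculation. Here the implication "A implies B" is modeled as the constraint P(B|A) = 1, and sweeping the remaining free parameter bounds P(B); the value P(A) = 0.7 is an assumed number for illustration, not from the paper.

```python
import numpy as np

# Model "A implies B" as the constraint P(B|A) = 1 on a joint distribution
# over two propositions, with P(A) = 0.7 assumed known. The only free
# parameter is then q = P(B | not A); sweeping it bounds P(B).
p_A = 0.7
p_B_values = [1.0 * p_A + q * (1 - p_A)      # law of total probability
              for q in np.linspace(0.0, 1.0, 101)]
p_B_min, p_B_max = min(p_B_values), max(p_B_values)
# The constraint alone pins P(B) to the interval [P(A), 1].
```

This is the simplest possible parametric probability network; the paper's examples reduce richer logical puzzles to systems of such polynomial constraints.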
Dynamic Estimation of Credit Rating Transition Probabilities
Berd, Arthur M.
2009-01-01
We present a continuous-time maximum likelihood estimation methodology for credit rating transition probabilities, taking into account the presence of censored data. We perform rolling estimates of the transition matrices with exponential time weighting with varying horizons and discuss the underlying dynamics of transition generator matrices in the long-term and short-term estimation horizons.
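The idea of exponentially time-weighted estimation of a transition generator matrix can be sketched as follows. This is an illustration of the weighting scheme only, not the authors' exact maximum likelihood estimator, and all state and time values below are made up.

```python
import numpy as np

def weighted_generator(transitions, weighted_durations, T, tau, n_states):
    """Estimate a continuous-time generator with exponential time weighting.

    transitions: list of (time, from_state, to_state) events observed up to T.
    weighted_durations[i]: exponentially weighted time spent in state i.
    An event at time t gets weight exp(-(T - t) / tau), so old data fades.
    """
    counts = np.zeros((n_states, n_states))
    for t, i, j in transitions:
        counts[i, j] += np.exp(-(T - t) / tau)   # weighted transition count
    G = np.zeros((n_states, n_states))
    for i in range(n_states):
        if weighted_durations[i] > 0:
            G[i, :] = counts[i, :] / weighted_durations[i]  # off-diagonal rates
        G[i, i] = 0.0
        G[i, i] = -G[i, :].sum()                 # rows of a generator sum to 0
    return G

# Hypothetical observation window with three ratings and three transitions.
events = [(0.5, 0, 1), (1.5, 1, 2), (2.5, 0, 1)]
G = weighted_generator(events, weighted_durations=[2.0, 1.0, 1.5],
                       T=3.0, tau=5.0, n_states=3)
```

Shortening `tau` makes the estimate track recent rating dynamics; lengthening it recovers an (approximately) unweighted long-horizon estimate.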
How to Read Probability Distributions as Statements about Process
Frank, Steven A.
2014-01-01
Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken o...
Physics with exotic probability theory
Youssef, Saul
2001-01-01
Probability theory can be modified in essentially one way while maintaining consistency with the basic Bayesian framework. This modification results in copies of standard probability theory for real, complex or quaternion probabilities. These copies, in turn, allow one to derive quantum theory while restoring standard probability theory in the classical limit. The argument leading to these three copies constrain physical theories in the same sense that Cox's original arguments constrain alter...
Quantum Foundations : Is Probability Ontological ?
Rosinger, Elemer E
2007-01-01
It is argued that the Copenhagen Interpretation of Quantum Mechanics, founded ontologically on the concept of probability, may be questionable in view of the fact that within Probability Theory itself the ontological status of the concept of probability has always been, and still is, under discussion.
Interpretation of Plateau in High-Harmonic Generation
Institute of Scientific and Technical Information of China (English)
程太旺; 李晓峰; 敖淑艳; 傅盘铭
2003-01-01
The plateau in high-harmonic generation is investigated in the frequency domain. The probability density of an electron in an electromagnetic field is obtained through analysing the quantized-field Volkov state. The plateau of high-harmonic generation reflects the spectral density of the electron at the location of the nucleus after above-threshold ionization.
DEFF Research Database (Denmark)
Jelsøe, Erling; Jæger, Birgit
2015-01-01
When analysing the results of a European-wide citizen consultation on sustainable consumption it is necessary to take a number of issues into account, such as the question of representativity and tensions between national and European identities and between consumer and citizen orientations regarding...
Investigation of probable decays in rhenium isotopes
International Nuclear Information System (INIS)
Making use of effective liquid drop model (ELDM), the feasibility of proton and alpha decays and various cluster decays is analysed theoretically. For different neutron-rich and neutron-deficient isotopes of rhenium in the mass range 150 < A < 200, the half-lives of proton and alpha decays and probable cluster decays are calculated considering the barrier potential as the effective liquid drop one which is the sum of Coulomb, surface and centrifugal potentials. The calculated half-lives for proton decay from various rhenium isotopes are then compared with the universal decay law (UDL) model to assess the efficiency of the present formalism. Geiger-Nuttall plots of the probable decays are analysed and their respective slopes and intercepts are evaluated
International Nuclear Information System (INIS)
A new solvent system, referred to as Next Generation Solvent or NGS, has been developed at Oak Ridge National Laboratory for the removal of cesium from alkaline solutions in the Caustic Side Solvent Extraction process. NGS is proposed for deployment at MCU and at the Salt Waste Processing Facility. This work investigated the chemical compatibility between NGS and 16 M, 8 M, and 3 M nitric acid from contact that may occur in handling of analytical samples from MCU or, for 3 M acid, that may occur during contactor cleaning operations at MCU. This work shows that reactions occurred between NGS components and the high-molarity nitric acid. In the case of 16 M and 8 M nitric acid, organo-nitrate groups are initially generated and attach to the modifier, and with time oxidation reactions convert the modifier into a tarry substance with gases (NOx and possibly CO) evolving. Calorimetric analysis of the organo-nitrate revealed the reaction products are not explosive, nor will they deflagrate. NGS exposure to 3 M nitric acid resulted in much slower reaction kinetics, and the generated products were not energetic. We recommend conducting Accelerating Rate Calorimetry on the materials generated in the 16 M and 8 M nitric acid tests. Also, we recommend continued monitoring of the samples contacting NGS with 3 M nitric acid.
International Nuclear Information System (INIS)
''The Disposal Criticality Analysis Methodology Topical Report'' prescribes an approach to the methodology for performing postclosure criticality analyses within the monitored geologic repository at Yucca Mountain, Nevada. An essential component of the methodology is the ''Configuration Generator Model for In-Package Criticality'' that provides a tool to evaluate the probabilities of degraded configurations achieving a critical state. The configuration generator model is a risk-informed, performance-based process for evaluating the criticality potential of degraded configurations in the monitored geologic repository. The method uses event tree methods to define configuration classes derived from criticality scenarios and to identify configuration class characteristics (parameters, ranges, etc.). The probabilities of achieving the various configuration classes are derived in part from probability density functions for degradation parameters. The NRC has issued ''Safety Evaluation Report for Disposal Criticality Analysis Methodology Topical Report, Revision 0''. That report contained 28 open items that required resolution through additional documentation. Of the 28 open items, numbers 5, 6, 9, 10, 18, and 19 were concerned with a previously proposed software approach to the configuration generator methodology and, in particular, the keff regression analysis associated with the methodology. However, the use of a keff regression analysis is not part of the current configuration generator methodology and, thus, the referenced open items are no longer considered applicable and will not be further addressed
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in probability performance. The study also revealed that motivated students who took part in the probability workshop showed a positive improvement in their probability performance compared with before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Approximation of Failure Probability Using Conditional Sampling
Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
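The gain from sampling conditionally within a bounding set of known probability, rather than over the whole parameter domain, can be sketched on a toy problem. The failure region, bounding set, and sample count below are all assumptions for illustration, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy system: uniform parameters on [0,1]^2; "failure" occurs when both
# parameters exceed 0.95, so the true failure probability is 0.05**2 = 0.0025.
def fails(x):
    return np.all(x > 0.95, axis=-1)

# Plain Monte Carlo needs many samples to see such a rare event. Suppose a
# bounding set B containing the failure region has analytically known
# probability -- here B = [0.9, 1]^2 with P(B) = 0.1**2 = 0.01. Then we can
# sample only within B and rescale:  P(fail) = P(B) * P(fail | B).
n = 20000
p_B = 0.01
x_B = rng.uniform(0.9, 1.0, size=(n, 2))   # samples conditioned on B
p_fail = p_B * fails(x_B).mean()
```

Conditional on B, the failure event has probability 0.25 rather than 0.0025, so every sample is informative; the same sample budget spent on the whole domain would mostly land far from the failure set.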
International Nuclear Information System (INIS)
With the purpose of developing a PGNAA system that can operate in the field under considerable changes in temperature and moisture, a multichannel analyzer (MCA-2k) was designed and developed, which operates compatibly with a BGO detector and connects to a computer through a USB 2.0 port. Besides that, software for acquiring and displaying the prompt gamma spectrum, with a spectrum stabilization function and convenient data handling, was also designed and developed. The first-generation PGNAA system was tested under changing temperature conditions in the laboratory. The results show that the prompt gamma spectrum remained stable as the temperature changed; the peak areas of the elements in the samples changed by about 7%. Besides that, the first-generation PGNAA system was also used to analyze cement and bauxite samples. The results matched those of other analysis methods, such as the chemical method and the INAA method, to within about 10%. (author)
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
International Nuclear Information System (INIS)
The maximum dose of ionizing radiation from the geological disposal of TRU wastes will likely be controlled by poorly sorbing soluble radionuclides, such as I-129. Proposed repository designs for the geological disposal of TRU wastes envisage the use of an engineered barrier composed of a bentonite buffer to limit the migration of such radionuclides by impeding groundwater flow. Cementitious materials will inevitably be used for waste packaging, infilling and adding structural integrity to the repository. Using cementitious materials, however, is problematic because they produce highly alkaline leachates which have the potential to cause a complex series of coupled changes in the porewater chemistry, mineralogy and, ultimately, the mass transport properties of the bentonite buffer. To elucidate the consequences of these coupled changes, reactive-transport model analyses have been conducted for bentonite alteration test cases with the use of different combinations of secondary minerals that will likely form in the bentonite buffer. A dissolution rate equation of smectite (a key component of bentonite) applicable to pH 7-13 and 25-80 °C was proposed and used in the reactive-transport model analyses. It was found that the amount of dissolved smectite at the center of the bentonite buffer was smaller and that in the vicinity of the cement interface was larger when thermodynamically metastable secondary minerals mainly precipitated as compared with the precipitation of stable phases. The calculated temporal and spatial changes of kinetic smectite dissolution were interpreted as a consequence of the changes in Gibbs free energy and porewater chemistry. Furthermore, the bentonite porewater chemistry was also affected by the stoichiometry and thermodynamic stability of the secondary minerals and the kinetics of smectite dissolution. Except in the close proximity of the cement interface, it was found that regardless of the choice of secondary minerals, the effective
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Energy Technology Data Exchange (ETDEWEB)
Jain, D.K.
1992-01-01
Microbially influenced corrosion (MIC) has been found to play a significant role in causing corrosion, especially in those industries which use natural waters. The most significant of the organisms found to cause corrosion are the sulphate-reducing bacteria (SRB), particularly with anoxic deposits or stagnant weirs. In May 1992, the Bruce Nuclear Generating Station B Vacuum Building was inspected for MIC after being in service for 10 years. This report provides results for both on-site MIC inspection and for microbiological analysis of sediments, water, and slime deposits for evidence of MIC bacteria.
Probabilities of multiple quantum teleportation
Woesler, Richard
2002-01-01
Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i.e. for the case that a teleported quantum state is teleported again or even more than two times, for the two-dimensional case, e.g., for the two orthogonal directions of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25%, i.e., surprisingly, this resul...
Probability and statistics: selected problems
Machado, J.A. Tenreiro; Pinto, Carla M. A.
2014-01-01
Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to fast review basic materials in Probability and Statistics. Descriptive statistics are presented first, and probability is reviewed secondly. Discrete and continuous distributions are presented. Sample and estimation with hypothesis testing are presented in the last two chapters. The solutions for proposed excises are listed for readers to references.
Free Probability on a Direct Product of Noncommutative Probability Spaces
Cho, Ilwoo
2005-01-01
In this paper, we observed the amalgamated free probability of a direct product of noncommutative probability spaces. We defined the amalgamated R-transforms, amalgamated moment series and the amalgamated boxed convolution. These enable us to carry out the amalgamated R-transform calculus, as in the scalar-valued case.
Gangopadhyay, Nirupama; Wynne, Kieran; O'Connor, Paula; Gallagher, Eimear; Brunton, Nigel P; Rai, Dilip K; Hayes, Maria
2016-07-15
Angiotensin-I-converting enzyme (ACE-I) plays a key role in control of hypertension, and type-2 diabetes mellitus, which frequently co-exist. Our current work utilised in silico methodologies and peptide databases as tools for predicting release of ACE-I inhibitory peptides from barley proteins. Papain was the enzyme of choice, based on in silico analysis, for experimental hydrolysis of barley protein concentrate, which was performed at the enzyme's optimum conditions (60 °C, pH 6.0) for 24 h. The generated hydrolysate was subjected to molecular weight cut-off (MWCO) filtration, following which the non-ultrafiltered hydrolysate (NUFH), and the generated 3 kDa and 10 kDa MWCO filtrates were assessed for their in vitro ACE-I inhibitory activities. The 3 kDa filtrate (1 mg/ml), that demonstrated highest ACE-I inhibitory activity of 70.37%, was characterised in terms of its peptidic composition using mass spectrometry and 1882 peptides derived from 61 barley proteins were identified, amongst which 15 peptides were selected for chemical synthesis based on their predicted ACE-I inhibitory properties. Of the synthesized peptides, FQLPKF and GFPTLKIF were most potent, demonstrating ACE-I IC50 values of 28.2 μM and 41.2 μM respectively. PMID:26948626
Kandler, Anne; Shennan, Stephen
2015-12-01
Cultural change can be quantified by temporal changes in frequency of different cultural artefacts and it is a central question to identify what underlying cultural transmission processes could have caused the observed frequency changes. Observed changes, however, often describe the dynamics in samples of the population of artefacts, whereas transmission processes act on the whole population. Here we develop a modelling framework aimed at addressing this inference problem. To do so, we firstly generate population structures from which the observed sample could have been drawn randomly and then determine theoretical samples at a later time t2 produced under the assumption that changes in frequencies are caused by a specific transmission process. Thereby we also account for the potential effect of time-averaging processes in the generation of the observed sample. Subsequent statistical comparisons (e.g. using Bayesian inference) of the theoretical and observed samples at t2 can establish which processes could have produced the observed frequency data. In this way, we infer underlying transmission processes directly from available data without any equilibrium assumption. We apply this framework to a dataset describing pottery from settlements of some of the first farmers in Europe (the LBK culture) and conclude that the observed frequency dynamic of different types of decorated pottery is consistent with age-dependent selection, a preference for 'young' pottery types which is potentially indicative of fashion trends. PMID:26674195
de Brevern, Alexandre G.; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain
2015-01-01
Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data information technology innovations from the web and from business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from business intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries. PMID:26125026
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to...
Probability theory and its models
Humphreys, Paul
2008-01-01
This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.
Decision analysis with approximate probabilities
Whalen, Thomas
1992-01-01
This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied, because some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decision making using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second-order maximum entropy principle, performed best overall.
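The two rounding levels described above can be illustrated with a short sketch. Independent rounding of each probability to the nearest tenth can produce sums of 0.9, 1.0, or 1.1, whereas a largest-remainder scheme forces the rounded vector to sum to exactly 1.0. The probability vector and the largest-remainder device are illustrative assumptions of mine, not details taken from the paper:

```python
# Sketch: two ways of rounding a 3-state probability vector to tenths.
# Independent rounding ignores the sum constraint; the largest-remainder
# scheme distributes the leftover tenths so the result sums to 1.0.

def round_independent(p):
    """Round each probability to the nearest tenth, ignoring the sum."""
    return [round(x * 10) / 10 for x in p]

def round_forced(p):
    """Largest-remainder rounding to tenths; the result sums to 1.0."""
    scaled = [x * 10 for x in p]
    floors = [int(s) for s in scaled]
    shortfall = 10 - sum(floors)          # tenths still to distribute
    # Hand the remaining tenths to the entries with the largest remainders.
    order = sorted(range(len(p)), key=lambda i: scaled[i] - floors[i],
                   reverse=True)
    for i in order[:shortfall]:
        floors[i] += 1
    return [f / 10 for f in floors]

p = [0.06, 0.47, 0.47]                    # illustrative probability vector
naive = round_independent(p)              # [0.1, 0.5, 0.5]: sums to 1.1
forced = round_forced(p)                  # [0.0, 0.5, 0.5]: sums to 1.0
print(naive, forced)
```

The example vector is chosen so that independent rounding overshoots to 1.1, making the contrast between the two information levels visible.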
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.
Pretest probability assessment derived from attribute matching
Hollander Judd E; Diercks Deborah B; Pollack Charles V; Johnson Charles L; Kline Jeffrey A; Newgard Craig D; Garvey J Lee
2005-01-01
Abstract Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report describes a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possib...
Subjective probability models for lifetimes
Spizzichino, Fabio
2001-01-01
Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging. The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Survival probability and ruin probability of a risk model
Institute of Scientific and Technical Information of China (English)
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. An explicit formula for the survival probability on the infinite interval is obtained in the special case of the exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained by means of techniques from martingale theory.
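The paper's results are analytic, but a finite-horizon ruin probability for a surplus process of this general type can also be estimated by simulation. The sketch below uses the classical compound-Poisson model with a fixed premium rate and exponential claims; all parameter values are my own illustrative assumptions, and the paper's random premium income and p-thinning of claims are not modelled:

```python
import random

def ruin_probability(u, c=12.0, lam=1.0, mean_claim=10.0,
                     horizon=200.0, n_paths=2000, seed=1):
    """Monte Carlo estimate of P(ruin before `horizon`) for the classical
    surplus process U(t) = u + c*t - S(t), with Poisson(lam) claim arrivals
    and exponential claim sizes. Illustrative sketch only; ruin can occur
    only at claim instants, so the surplus is checked there."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            dt = rng.expovariate(lam)                    # next claim arrival
            t += dt
            if t > horizon:
                break                                    # survived the horizon
            surplus += c * dt                            # premiums since last claim
            surplus -= rng.expovariate(1.0 / mean_claim)  # claim payout
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

psi0 = ruin_probability(0.0)    # ruin probability with zero initial surplus
psi50 = ruin_probability(50.0)  # larger initial surplus: lower ruin probability
print(psi0, psi50)
```

With these parameters the net profit condition c > lam * mean_claim holds, so the estimate decreases as the initial surplus u grows, which the two calls above illustrate.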
Energy Technology Data Exchange (ETDEWEB)
Fondeur, F. F.; Fink, S. D.
2011-12-07
A new solvent system, referred to as Next Generation Solvent or NGS, has been developed at Oak Ridge National Laboratory for the removal of cesium from alkaline solutions in the Caustic Side Solvent Extraction process. The NGS is proposed for deployment at MCU and at the Salt Waste Processing Facility. This work investigated the chemical compatibility between NGS and 16 M, 8 M, and 3 M nitric acid from contact that may occur in handling of analytical samples from MCU or, for 3 M acid, which may occur during contactor cleaning operations at MCU. This work shows that reactions occurred between NGS components and the high-molarity nitric acid. Reaction rates are much faster in 8 M and 16 M nitric acid than in 3 M nitric acid. In the case of 16 M and 8 M nitric acid, the nitric acid reacts with the extractant to initially produce organo-nitrate species. The reaction also releases soluble fluorinated alcohols such as tetrafluoropropanol. With longer contact time, the modifier reacts to produce a tarry substance with evolved gases (NOx and possibly CO). Calorimetric analysis of the reaction product mixtures revealed that the organo-nitrate reaction products are not explosive and will not deflagrate.
Reliability analysis of reactor systems by applying probability method
International Nuclear Information System (INIS)
The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component
A priori probabilities of separable quantum states
International Nuclear Information System (INIS)
Zyczkowski, Horodecki, Sanpera and Lewenstein (ZHSL) recently proposed a 'natural measure' on the N-dimensional quantum systems, but expressed surprise when it led them to conclude that for N = 2×2, disentangled (separable) systems are more probable (0.632±0.002) in nature than entangled ones. We contend, however, that ZHSL's (rejected) intuition has, in fact, a sound theoretical basis, and that the a priori probability of disentangled 2×2 systems should more properly be viewed as (considerably) less than 0.5. We arrive at this conclusion in two quite distinct ways, the first based on classical and the second on quantum considerations. Both approaches, however, replace (in whole or part) the ZHSL (product) measure by ones based on the volume elements of monotone metrics, which in the classical case amounts to adopting the Jeffreys prior of Bayesian theory. Only the quantum-theoretic analysis - which yields the smallest probabilities of disentanglement - uses the minimum number of parameters possible, that is N² - 1, as opposed to N² + N - 1 (although this 'over-parametrization', as recently indicated by Byrd, should be avoidable). However, despite substantial computation, we are not able to obtain precise estimates of these probabilities and the need for additional (possibly supercomputer) analyses is indicated - particularly so for higher-dimensional quantum systems (such as the 2×3 ones, which we also study here). (author)
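Separability probabilities of the kind debated above can be explored numerically: for 2×2 (two-qubit) systems the Peres-Horodecki positive-partial-transpose (PPT) test is exact, so the probability of separability under a chosen measure is estimable by sampling random density matrices. The sketch below samples from the Hilbert-Schmidt (Ginibre-induced) measure, which is one possible choice and not the ZHSL product measure, so the resulting fraction is not expected to match the 0.632 quoted above:

```python
import numpy as np

def random_density_matrix(rng, n=4):
    """Hilbert-Schmidt-distributed density matrix via a complex Ginibre matrix."""
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def is_ppt(rho):
    """Peres-Horodecki test: partial transpose on the second qubit,
    then check that the spectrum stays non-negative."""
    pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    return np.linalg.eigvalsh(pt).min() >= -1e-12

rng = np.random.default_rng(0)
n_samples = 3000
sep = sum(is_ppt(random_density_matrix(rng)) for _ in range(n_samples))
p_sep = sep / n_samples   # for 2x2 systems, PPT is equivalent to separability
print(p_sep)
```

Changing `random_density_matrix` to sample from a different measure (e.g. one built from a monotone metric, as the abstract advocates) changes the estimated probability, which is exactly the measure-dependence the paper is about.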
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.
Uncertainty quantification approaches for advanced reactor analyses.
Energy Technology Data Exchange (ETDEWEB)
Briggs, L. L.; Nuclear Engineering Division
2009-03-24
The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
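The 95%/95% criterion mentioned above is commonly met with Wilks' nonparametric tolerance-limit formula: for a first-order one-sided limit, the number of code runs n must satisfy 1 - γ^n ≥ β, where γ is the coverage (0.95) and β the confidence (0.95). A quick sketch of the sample-size calculation (the classic answer is 59 runs):

```python
import math

def wilks_runs(coverage=0.95, confidence=0.95):
    """Smallest n satisfying 1 - coverage**n >= confidence
    (first-order, one-sided Wilks tolerance limit)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

n = wilks_runs()
print(n)  # 59 code runs support a one-sided 95/95 statement
```

Intuitively, after n runs the probability that the sample maximum fails to bound the 95th percentile is 0.95^n, and n = 59 is the first value driving that below 5%.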
Matsushita, Taku; Hashizuka, Masahiro; Kuriyama, Taisuke; Matsui, Yoshihiko; Shirasaki, Nobutaka
2016-04-01
The effects of two water purification processes (ozonation, and chlorination after ozonation) on the mutagenicity of a solution containing iopamidol (X-ray contrast medium) were investigated by using the Ames assay. No mutagenicity was observed during ozonation. In contrast, mutagenicity was induced by the ozone-treated iopamidol-containing solution after subsequent chlorination, indicating that mutagenic transformation-products (TPs) were generated. Ten of 70 peaks detected on the LC/MS total ion chromatogram (TIC) of the ozone-treated iopamidol-containing solution after chlorination had a positive correlation (r(2) > 0.6) between their peak areas and the observed mutagenicity, suggesting that TPs detected as these peaks may induce mutagenicity. To narrow down the possible contributors to the observed mutagenicity, we compared the areas of the peaks on the TIC-charts with and without chlorination. Of the ten peaks, six were also detected in the ozone-treated iopamidol-containing solution without chlorination, which did not induce mutagenicity, indicating that these peaks were not related to the observed mutagenicity. Accurate m/z values and MS/MS analysis with an orbitrap MS of the remaining four peaks revealed that two of them represented the same TP in the negative and positive ion modes. The three remaining TPs were assessed in four quantitative structure-activity relationship models for predicting Ames mutagenicity. At least one model predicted that two of the three TPs were mutagenic, whereas none of the models predicted that the other TP was a mutagen, suggesting that the former TPs, estimated as N1-acetyl-5-amino-6-chloro-2-iodobenzene-1,3-dicarboxamide and 3-hydroxy-2-{3-[(2-hydroxyethoxy)carbonyl]-2,4,6-triiodo-5-nitrobenzoyl}amino)propanoic acid, could be the candidate compounds that contributed to the observed mutagenicity. PMID:26807944
Kasai, Chika; Sugimoto, Kazushi; Moritani, Isao; Tanaka, Junichiro; Oya, Yumi; Inoue, Hidekazu; Tameda, Masahiko; Shiraki, Katsuya; Ito, Masaaki; Takei, Yoshiyuki; Takase, Kojiro
2016-01-01
Colorectal cancer (CRC) is the third leading cause of cancer-related deaths in Japan. The etiology of CRC has been linked to numerous factors including genetic mutation, diet, life style, inflammation, and recently, the gut microbiota. However, CRC-associated gut microbiota is still largely unexamined. This study used terminal restriction fragment length polymorphism (T-RFLP) and next-generation sequencing (NGS) to analyze and compare gut microbiota of Japanese control subjects and Japanese patients with carcinoma in adenoma. Stool samples were collected from 49 control subjects, 50 patients with colon adenoma, and 9 patients with colorectal cancer (3/9 with invasive cancer and 6/9 with carcinoma in adenoma) immediately before colonoscopy; DNA was extracted from each stool sample. Based on T-RFLP analysis, 12 subjects (six control and six carcinoma in adenoma subjects) were selected; their samples were used for NGS and species-level analysis. T-RFLP analysis showed no significant differences in bacterial population between control, adenoma and cancer groups. However, NGS revealed that i), control and carcinoma in adenoma subjects had different gut microbiota compositions, ii), one bacterial genus (Slackia) was significantly associated with the control group and four bacterial genera (Actinomyces, Atopobium, Fusobacterium, and Haemophilus) were significantly associated with the carcinoma-in-adenoma group, and iii), several bacterial species were significantly associated with each type (control: Eubacterium coprostanoligenes; carcinoma in adenoma: Actinomyces odontolyticus, Bacteroides fragilis, Clostridium nexile, Fusobacterium varium, Haemophilus parainfluenzae, Prevotella stercorea, Streptococcus gordonii, and Veillonella dispar). Gut microbial properties differ between control subjects and carcinoma-in-adenoma patients in this Japanese population, suggesting that gut microbiota is related to CRC prevention and development. PMID:26549775
Diagnosis and Repair of Internal Leakage of the Steam Generator in a Hydrogen Unit
Institute of Scientific and Technical Information of China (English)
马红涛
2011-01-01
Localized over-temperature of the shell wall at the tube-side inlet cone of the converted-gas steam generator in a refinery's new 2 × 10^4 m^3/h (standard conditions) hydrogen unit was analyzed. The results show that deficient lining materials and lining construction quality, an unsound structure and welding procedure at the joints between the heat-exchange tubes and the tube plate, and non-standard operation leading to uncoordinated deformation of the heat-exchange tubes acted in combination to cause not only the localized over-temperature of the inlet cone wall but also cracks in the inlet-end tube plate that penetrated the tube-joint welds and extended across the tube passes and tube bridges. Based on this analysis the equipment was repaired, and measures to prevent and correct this type of problem were proposed.
Directory of Open Access Journals (Sweden)
Smith Derek
2009-01-01
Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening
Convergence of simulated annealing by the generalized transition probability
Nishimori, Hidetoshi; Inoue, Jun-Ichi
1998-01-01
We prove weak ergodicity of the inhomogeneous Markov process generated by the generalized transition probability of Tsallis and Stariolo under power-law decay of the temperature. We thus have a mathematical foundation to conjecture convergence of simulated annealing processes with the generalized transition probability to the minimum of the cost function. An explicitly solvable example in one dimension is analyzed in which the generalized transition probability leads to a fast convergence of ...
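The generalized (Tsallis-Stariolo) acceptance rule reduces to the Metropolis exponential as q → 1, and the abstract's setting uses a power-law temperature decay. A minimal one-dimensional sketch is below; the cost function, schedule constants, proposal width, and exponent are illustrative assumptions of mine, not the schedule for which the paper proves weak ergodicity:

```python
import math
import random

def tsallis_accept(delta_e, temp, q=1.5):
    """Generalized acceptance probability of Tsallis and Stariolo.
    Recovers exp(-delta_e/temp) in the limit q -> 1."""
    if delta_e <= 0:
        return 1.0                       # downhill moves always accepted
    base = 1.0 + (q - 1.0) * delta_e / temp
    if base <= 0.0:                      # outside the support: reject
        return 0.0
    return base ** (-1.0 / (q - 1.0))

def anneal(f, x0, steps=5000, t0=2.0, q=1.5, seed=7):
    """Simulated annealing with the generalized transition probability
    and a power-law temperature schedule (sketch)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for t in range(1, steps + 1):
        temp = t0 / t ** 0.5             # power-law temperature decay
        y = x + rng.uniform(-1.0, 1.0)   # random-walk proposal
        fy = f(y)
        if rng.random() < tsallis_accept(fy - fx, temp, q):
            x, fx = y, fy
            if fy < fbest:
                best, fbest = y, fy
    return best, fbest

xbest, fbest = anneal(lambda x: x * x, x0=10.0)
print(xbest, fbest)
```

The power-law (rather than logarithmic) cooling is precisely what makes the generalized rule attractive in practice: it permits much faster temperature decay while the paper supplies the ergodicity argument.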
Time Varying Transition Probabilities for Markov Regime Switching Models
Bazzi, Marco; Blasques, Francisco; Koopman, Siem Jan; Lucas, Andre
2014-01-01
We propose a new Markov switching model with time varying probabilities for the transitions. The novelty of our model is that the transition probabilities evolve over time by means of an observation driven model. The innovation of the time varying probability is generated by the score of the predictive likelihood function. We show how the model dynamics can be readily interpreted. We investigate the performance of the model in a Monte Carlo study and show that the model is successful in estim...
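A stripped-down illustration of the mechanics: let a transition probability p_t follow a logistic link whose underlying factor f_t evolves through an observation-driven recursion f_{t+1} = ω + β f_t + α s_t. In the paper, s_t is the score of the predictive likelihood; the sketch below substitutes a simple stand-in signal, so it demonstrates only the recursion and the link, not the actual score update:

```python
import math
import random

def logistic(f):
    """Map the unconstrained factor f into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-f))

def simulate_tvp(n=200, omega=0.02, beta=0.95, alpha=0.3, seed=3):
    """Observation-driven time-varying transition probability (sketch).
    s_t here is a noisy stand-in for the predictive-likelihood score."""
    rng = random.Random(seed)
    f, probs = 0.0, []
    for _ in range(n):
        p = logistic(f)                 # transition probability at time t
        probs.append(p)
        s = rng.gauss(0.0, 1.0)         # stand-in driving signal
        f = omega + beta * f + alpha * s
    return probs

probs = simulate_tvp()
print(min(probs), max(probs))
```

The logistic link keeps every p_t strictly inside (0, 1) regardless of how the factor drifts, which is why score-driven transition models typically parameterize the probabilities this way.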
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Gallart, Francesc; Llorens, Pilar; Pérez-Gallego, Nuria; Latron, Jérôme
2016-04-01
The Vallcebre research catchments are located in NE Spain, in a middle mountain area with a Mediterranean sub-humid climate. Most of the bedrock consists of continental red lutites that are easily weathered into loamy soils. This area was intensely used for agriculture in the past, when most of the sunny gentle hillslopes were terraced. The land was progressively abandoned since the mid-20th Century and most of the fields were converted to meadows or were spontaneously forested. Early studies carried out in the terraced Cal Parisa catchment demonstrated the occurrence of two types of frequently saturated areas, ones situated in downslope locations with high topographic index values, and the others located in the inner parts of many terraces, where the shallow water table usually outcrops due to the topographical modifications linked to terrace construction. Both the increased extent of saturated areas and the role of a man-made elementary drainage system designed for depleting water from the terraces suggested that terraced areas would induce an enhanced hydrological response during rainfall events when compared with non-terraced hillslopes. The response of 3 sub-catchments, of increasing area and decreasing percentage of terraced area, during a set of major events collected over more than 15 years has been analysed. The results show that storm runoff depths were roughly proportional to precipitations above 30 mm, although the smallest catchment (Cal Parisa), with the highest percentage of terraces, was able to completely buffer rainfall events of 60 mm in one hour without any runoff when antecedent conditions were dry. Runoff coefficients depended on antecedent conditions and peak discharges were weakly linked to rainfall intensities. Peak lag times, peak runoff rates and recession coefficients were similar in the 3 catchments; the first variable values were in the range between Hortonian and saturation overland flow and the two last ones were in the range of
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Transition probabilities of Br II
Bengtson, R. D.; Miller, M. H.
1976-01-01
Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.
Induction, of and by Probability
Rendell, Larry
2013-01-01
This paper examines some methods and ideas underlying the author's successful probabilistic learning systems (PLS), which have proven uniquely effective and efficient in generalization learning or induction. While the emerging principles are generally applicable, this paper illustrates them in heuristic search, which demands noise management and incremental learning. In our approach, both task performance and learning are guided by probability. Probabilities are incrementally normalized and re...
Trajectory probability hypothesis density filter
García-Fernández, Ángel F.; Svensson, Lennart
2016-01-01
This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory PHD (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. Like the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...
Hf Transition Probabilities and Abundances
Lawler, J. E.; Hartog, E.A. den; Labby, Z. E.; Sneden, C.; Cowan, J. J.; Ivans, I. I.
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement...
Gd Transition Probabilities and Abundances
Hartog, E.A. den; Lawler, J. E.; Sneden, C.; Cowan, J. J.
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has be...
Sm Transition Probabilities and Abundances
Lawler, J. E.; Hartog, E.A. den; Sneden, C.; Cowan, J. J.
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundanc...
Gaussian Probabilities and Expectation Propagation
Cunningham, John P.; Hennig, Philipp; Lacoste-Julien, Simon
2011-01-01
While Gaussian probability densities are omnipresent in applied mathematics, Gaussian cumulative probabilities are hard to calculate in any but the univariate case. We study the utility of Expectation Propagation (EP) as an approximate integration method for this problem. For rectangular integration regions, the approximation is highly accurate. We also extend the derivations to the more general case of polyhedral integration regions. However, we find that in this polyhedral case, EP's answer...
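The quantity the abstract is about is the probability that a correlated Gaussian vector falls in a rectangular region. A plain Monte Carlo baseline (not EP itself) for the bivariate case is easy to sketch and can be checked against the known closed form P(X > 0, Y > 0) = 1/4 + arcsin(ρ)/(2π); the sample size and seed are illustrative choices:

```python
import math
import random

def mc_orthant_prob(rho, n=20000, seed=5):
    """Monte Carlo estimate of P(X > 0, Y > 0) for a standard bivariate
    normal with correlation rho, built via a Cholesky construction."""
    rng = random.Random(seed)
    hits = 0
    scale = math.sqrt(1.0 - rho * rho)
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x, y = z1, rho * z1 + scale * z2   # (x, y) has correlation rho
        if x > 0 and y > 0:
            hits += 1
    return hits / n

rho = 0.5
estimate = mc_orthant_prob(rho)
exact = 0.25 + math.asin(rho) / (2.0 * math.pi)   # equals 1/3 for rho = 0.5
print(estimate, exact)
```

The slow O(1/sqrt(n)) convergence of estimates like this one, especially in higher dimensions, is exactly what motivates deterministic approximations such as EP for these integrals.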
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
Compliance with endogenous audit probabilities
Konrad, Kai A.; Lohse, Tim; Qari, Salmai
2015-01-01
This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how one's own appearance and perfo...
Novel Bounds on Marginal Probabilities
Mooij, Joris M.; Kappen, Hilbert J
2008-01-01
We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal ("belief"). Th...
Stochastics introduction to probability and statistics
Georgii, Hans-Otto
2012-01-01
This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Joint probability distributions for projection probabilities of random orthonormal states
International Nuclear Information System (INIS)
The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal. (paper)
Joint probability distributions for projection probabilities of random orthonormal states
Alonso, L.; Gorin, T.
2016-04-01
The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
Approaches to Evaluating Probability of Collision Uncertainty
Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
Significance of "high probability/low damage" versus "low probability/high damage" flood events
Directory of Open Access Journals (Sweden)
B. Merz
2009-06-01
Full Text Available The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than it is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are probable, when risk aversion is taken into account.
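The expected annual damage (EAD) criterion analysed above is the integral of damage over annual exceedance probability. A minimal sketch, using an invented damage-frequency curve purely for illustration, shows how a trapezoidal EAD splits into contributions from frequent and rare floods:

```python
def expected_annual_damage(points):
    """Trapezoidal integral of damage over annual exceedance probability.
    `points` is a list of (exceedance_probability, damage) pairs sorted by
    decreasing probability, e.g. damages of the 2-, 10-, 100-, 1000-year floods."""
    return sum(
        0.5 * (d1 + d2) * (p1 - p2)
        for (p1, d1), (p2, d2) in zip(points, points[1:])
    )

# Invented curve (million EUR) where damage grows slowly with return period:
curve = [(0.5, 1.0), (0.1, 3.0), (0.01, 6.0), (0.001, 9.0)]
total = expected_annual_damage(curve)         # ~1.27
frequent = expected_annual_damage(curve[:2])  # "high probability/low damage" part
rare = total - frequent                       # "low probability/high damage" part
```

With damage growing slower than the return period, the frequent-flood segment dominates the EAD, mirroring the paper's finding; risk aversion would re-weight the rare segment.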
Energy Technology Data Exchange (ETDEWEB)
Slentoe, E.; Moeller, F.; Winther, M.; Hjort Mikkelsen, M.
2010-10-15
The report examines, in an integrated form, the energy, emissions and welfare-economic implications of introducing Danish-produced biodiesel, i.e. rapeseed diesel (RME) and first- and second-generation wheat ethanol, in two scenarios with low and high rates of blending with fossil-based automotive fuels. Within this project's analytical framework and assumptions, the welfare-economic analysis shows that it would be beneficial for society to realize the biofuel scenarios to some extent at oil prices above $100 a barrel, while it would cause losses at oil prices of $65. In all cases, fossil fuel consumption and CO2-eq emissions are reduced, an effect which is priced and included in the welfare-economic analysis. The implementation of biofuels in Denmark will depend on market prices, which at present do not favor biofuels. The RME currently produced in Denmark is exported to other European countries where there are state subsidies. Subsidies would also be a significant factor in Denmark in achieving objectives for biofuel blending. (ln)
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
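For a zero-mean Gaussian response, the probability of exceeding a displacement criterion reduces to a complementary error function, which is presumably the kind of estimate made above. A sketch with made-up numbers (the RMS response and criterion values are illustrative, not taken from the article):

```python
import math

def exceedance_probability(sigma, criterion):
    """P(|x| > criterion) for a zero-mean Gaussian response with RMS value sigma."""
    return math.erfc(criterion / (sigma * math.sqrt(2.0)))

# Illustrative numbers: 0.9 um RMS relative displacement vs a 3.0 um criterion.
p = exceedance_probability(0.9, 3.0)
# Stiffer or better-damped designs shrink sigma and hence the exceedance chance.
```

Tuning damping and natural frequency so that sigma stays small enough is what keeps this exceedance probability below a target such as 0.04.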
Born Rule and Noncontextual Probability
Logiurato, Fabrizio
2012-01-01
The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there have been many attempts to derive the Born postulate as a theorem, Gleason's being the most prominent. The Gleason derivation, however, is generally considered rather intricate, and its physical meaning, in particular in relation to the noncontextuality of probability (NP), is not quite evident. More recently, we are witnessing a revival of interest in possible demonstrations of the Born rule, like Zurek's and Deutsch's, based on decoherence and on the theory of decisions, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasoning, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...
Incrementalization of Analyses for Next Generation IDEs
Kloppenburg, Sven
2009-01-01
To support developers in their day-to-day work, Integrated Development Environments (IDEs) incorporate more and more ways to help developers focus on the inherent complexities of developing increasingly larger software systems. The complexity of developing large software systems can be categorized into inherent complexity, which stems from the complexity of the problem domain, and accidental complexity, which stems from the shortcomings of the tools and methods used to tackle the problem. For ex...
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fit...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Fusion Probability in Dinuclear System
Hong, Juhee
2015-01-01
Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom, the relative motion and transfer of nucleons. In the presence of the coupling between two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has the quasifission barrier and the inner fusion barrier, and the escape rates can be calculated by the Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate. We investigate the coupling effects on the fusion probability and the cross section of evaporation residue.
Interference of probabilities in dynamics
Energy Technology Data Exchange (ETDEWEB)
Zak, Michail, E-mail: michail.zak@gmail.com [Jet Propulsion Laboratory California Institute of Technology, Pasadena, CA 91109 (United States)
2014-08-15
A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.
Energy Technology Data Exchange (ETDEWEB)
Grunewald, T.; Graetz, R.
2007-09-29
Equipment intended for use in potentially explosive atmospheres must meet the requirements of the European directive 94/9/EC. The manufacturer's declaration of conformity testifies that these requirements are met. The conformity assessment is based on the (ignition) risk assessment, which identifies and estimates the ignition sources. The European standards in the area of directive 94/9/EC (such as EN 1127-1 and EN 13463-1) describe 13 possible ignition sources, mechanically generated sparks being one of them. Statements about the ignition effectiveness, and especially the ignition probability, of mechanically generated sparks for a given kinetic impact energy and a given explosive gas/air mixture are not possible; an extensive literature review confirms this state of affairs. This was and is a problem in drafting and revising standards. Simple ferritic steel is a common construction material for equipment, including non-electrical equipment, intended for use in potentially explosive atmospheres in chemical and mechanical engineering and manufacturing technology. The objective of this study was therefore to obtain statistical ignition probabilities depending on the kinetic impact energy and the minimum ignition energy of the explosive gas/air mixture. The study was carried out with impact testing machines of BAM (Federal Institute for Materials Research and Testing) at three kinetic impact energies. The following results were obtained for all the reference gas/air mixtures of the IEC explosion groups (I methane, IIA propane, IIB ethylene, IIC acetylene, hydrogen): 1. It was not possible to generate ignition-capable mechanical sparks at kinetic impact energies below 3 Nm under the test conditions of this study, i.e. the impact kinetics and impact geometry of the impact machines. 2. Single mechanically generated particles were able to act as a dangerous ignition source through oxidation processes at kinetic impact energies of 10 Nm. Furthermore, the tests have shown that the
Probability analysis of MCO over-pressurization during staging
International Nuclear Information System (INIS)
The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience
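The counts quoted above follow from elementary probability, so a few lines reproduce them (the binomial "at least one" figure at the end is an added illustration under an independence assumption, not a number from the calculation itself):

```python
p_relief = 0.08    # P(hydrate water > 1.6 kg), from the analysis above
p_rupture = 0.01   # P(hydrate water > 2.5 kg)
n_mcos = 400       # number of staged MCOs

expected_relief = n_mcos * p_relief    # expected relief-valve actuations: ~32
expected_rupture = n_mcos * p_rupture  # expected rupture-disk challenges: ~4
# Assuming independent MCOs, the chance that at least one exceeds the
# rupture-disk limit during staging:
p_any_rupture = 1.0 - (1.0 - p_rupture) ** n_mcos
```

The 32-of-400 figure in the text is exactly the expected value 400 × 0.08.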
Pollock on probability in epistemology
Fitelson, Branden
2010-01-01
In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.
ESTIMATION OF AGE TRANSITION PROBABILITIES.
ZINTER, JUDITH R.
This note describes the procedures used in determining DYNAMOD II age transition matrices. A separate matrix for each sex-race group is developed. These matrices will be used as an aid in estimating the transition probabilities in the larger DYNAMOD II matrix relating age to occupational categories. Three steps were used in the procedure--(1)…
Transition probability and preferential gauge
Chen, C.Y.
1999-01-01
This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of Hamiltonian.
Quantum correlations; quantum probability approach
Majewski, W A
2014-01-01
This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...
Diverse Consequences of Algorithmic Probability
Özkural, Eray
2011-01-01
We reminisce and discuss applications of algorithmic probability to a wide range of problems in artificial intelligence, philosophy and technological society. We propose that Solomonoff has effectively axiomatized the field of artificial intelligence, therefore establishing it as a rigorous scientific discipline. We also relate to our own work in incremental machine learning and philosophy of complexity.
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W
2004-01-01
Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described and the possible...
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
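The optimal strategy described above, guessing in decreasing order of probability, has expected cost Σ_i i·p_(i) over the sorted probabilities. A toy sketch (the three-letter alphabet and its probabilities are invented) shows the brute-force computation that the paper's approximations replace for realistic sizes:

```python
import math
from itertools import product

letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}  # toy first-order language model

def expected_guesses(word_len):
    """Average number of guesses when candidate words are tried in
    decreasing order of probability (the optimal guessing strategy)."""
    probs = sorted(
        (math.prod(letter_probs[c] for c in w)
         for w in product(letter_probs, repeat=word_len)),
        reverse=True,
    )
    return sum(rank * p for rank, p in enumerate(probs, start=1))
```

Even in this toy model, the proportion of guesses needed relative to the total number of words shrinks as the word length grows, as the abstract notes; the exhaustive enumeration here is exactly what becomes infeasible for large alphabets and word lengths.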
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Fuzzy Markov chains: uncertain probabilities
James J. Buckley; Eslami, Esfandiar
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
Probability representations of fuzzy systems
Institute of Scientific and Technical Information of China (English)
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
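The COG (center-of-gravity) defuzzification discussed above is the membership-weighted mean, which is the mean-square-optimal point estimate when the normalized membership function is read as a probability density. A minimal discrete sketch (grid and membership values are invented):

```python
def cog_defuzzify(xs, mu):
    """Center-of-gravity defuzzification: sum(x * mu(x)) / sum(mu(x)).
    Reading mu / sum(mu) as a probability mass function, this is its mean,
    i.e. the point estimate minimizing expected squared error."""
    return sum(x * m for x, m in zip(xs, mu)) / sum(mu)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
mu = [0.0, 0.5, 1.0, 0.5, 0.0]   # triangular membership centered at 2
crisp = cog_defuzzify(xs, mu)    # -> 2.0 by symmetry
```

This probabilistic reading of mu is what the paper formalizes as the "inner kernel" distribution of a fuzzy system.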
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This fact is in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (i.e. energy distribution in gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
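The "equal probability for every configuration" postulate is the uniform stars-and-bars ensemble, under which the occupancy distribution of a single box has a simple closed form. A small sketch (the parameter values are chosen arbitrarily for a dense system):

```python
from math import comb

def occupancy_pmf(P, L):
    """P(a given box holds k balls), k = 0..P, when all stars-and-bars
    configurations of P indistinguishable balls in L distinguishable
    boxes are taken to be equally likely."""
    total = comb(P + L - 1, L - 1)
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]

pmf = occupancy_pmf(P=50, L=5)  # dense system: P much greater than L
# The empty box (k = 0) is the single most likely occupancy, even though the
# average occupancy is P / L = 10 -- the "long tail" behavior noted above.
```

The probabilities decrease monotonically in k, so the empty box is the mode while the mean stays at P/L, exactly the counterintuitive feature the abstract highlights.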
Probability Distribution for Flowing Interval Spacing
International Nuclear Information System (INIS)
The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term ''flowing interval spacing'' as opposed to fractured spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled, ''Probability Distribution for Flowing Interval Spacing'', (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) ''Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses'' (CRWMS M and O 1999a) and (2) ''Incorporation of Heterogeneity in SZ Flow and Transport Analyses'', (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be
Probability as a Physical Motive
Directory of Open Access Journals (Sweden)
Peter Martin
2007-04-01
Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.
Probability densities in strong turbulence
Yakhot, Victor
2006-03-01
In this work, using Mellin's transform combined with the Gaussian large-scale boundary condition, we calculate probability densities (PDFs) of velocity increments P(δu,r), velocity derivatives P(u,r) and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF P(δu,r) often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from log-normality. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.
Probability, Information and Statistical Physics
Kuzemsky, A. L.
2016-03-01
In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between the theories. The basic aim is tutorial: to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. The review also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
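An abundance such as log epsilon = 1.00 +/- 0.03 is a mean over many individual lines. As a hedged illustration (the per-line values and the simple unweighted averaging below are made up, not the paper's actual procedure), combining per-line estimates into a mean with a standard error can be sketched as:

```python
import math

def mean_abundance(line_abundances):
    """Combine per-line log-epsilon abundance estimates into an unweighted
    mean with a standard error of the mean (hypothetical helper)."""
    n = len(line_abundances)
    mean = sum(line_abundances) / n
    var = sum((x - mean) ** 2 for x in line_abundances) / (n - 1)
    sem = math.sqrt(var / n)
    return mean, sem

# Illustrative (made-up) per-line values clustered near the quoted result
lines = [0.98, 1.02, 1.01, 0.99, 1.00, 1.00]
mu, err = mean_abundance(lines)
print(f"log epsilon = {mu:.2f} +/- {err:.2f}")
```

In practice each line would carry its own uncertainty from the transition probability and the spectrum fit, motivating the laboratory work the abstract describes.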
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed behaviors are still errors under a strict normative reading, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
The probability of extraterrestrial life
International Nuclear Information System (INIS)
Since the beginning of time, human beings have needed to live in the company of other humans, developing what we now know as human societies. Following this idea, there has been speculation, especially in the present century, about the possibility that human society has the company of other thinking creatures living on other planets somewhere in our galaxy. In this talk we use only reliable data from scientific observers in order to establish a probability. We explain the analysis of the physico-chemical principles which allowed the evolution of organic molecules on our planet, and establish these as the forerunners of life on our planet. On the other hand, the physical processes governing stars, their characteristics and their effects on planets are also explained, as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space is given. (Author)
Classical Probability and Quantum Outcomes
Directory of Open Access Journals (Sweden)
James D. Malley
2014-05-01
Full Text Available There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Relative transition probabilities of cobalt
Roig, R. A.; Miller, M. H.
1974-01-01
We report determinations of cobalt transition probabilities, measured relative to Co I 4150.43 A and Co II 4145.15 A using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, with estimated reliabilities ranging from 8 to 50%.
Probability for primordial black holes
Bousso, R.; Hawking, S. W.
1995-11-01
We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.
Tight Bernoulli tail probability bounds
Dzindzalieta, Dainius
2014-01-01
The purpose of the dissertation is to prove universal tight bounds for deviation-from-the-mean probability inequalities for functions of random variables. Universal means that the bounds are uniform with respect to some class of distributions, the number of variables, and other parameters. The bounds are called tight if we can construct a sequence of random variables such that the upper bounds are achieved. Such inequalities are useful, for example, in insurance mathematics and for constructing...
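As a sketch of what "tight" versus merely "universal" means here, one can compare a classical universal bound (Hoeffding's inequality) with the exact tail of a Bernoulli sum; the specific numbers below are illustrative and not taken from the dissertation.

```python
from math import comb, exp

def binom_tail(n, p, k):
    """Exact P(S >= k) for S ~ Binomial(n, p), i.e. a sum of n Bernoulli(p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def hoeffding_bound(n, p, k):
    """Hoeffding upper bound on P(S - np >= t), with t = k - np > 0."""
    t = k - n * p
    return exp(-2 * t * t / n) if t > 0 else 1.0

n, p, k = 50, 0.5, 35
exact = binom_tail(n, p, k)
bound = hoeffding_bound(n, p, k)
# The universal bound always dominates the exact tail; a *tight* bound
# would be attained by some sequence of random variables.
assert exact <= bound
print(exact, bound)
```

The gap between `exact` and `bound` is what tight inequalities aim to close.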
Asbestos and Probable Microscopic Polyangiitis
Directory of Open Access Journals (Sweden)
George S Rashed Philteos
2004-01-01
Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear antineutrophil cytoplasmic antibody (myeloperoxidase)-positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.
Probability distributions of landslide volumes
M. T. Brunetti; Guzzetti, F.; M. Rossi
2009-01-01
We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10^-4 m^3 ≤ VL ≤ 10^13 m^3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide...
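Kernel density estimation of p(VL) can be sketched as follows. Since the volumes span some seventeen orders of magnitude, the density here is estimated for log10(VL); the data are synthetic stand-ins, not the landslide inventories of the paper.

```python
import math, random

def gaussian_kde(samples, h):
    """Return a kernel density estimate f(x) built from Gaussian kernels
    of bandwidth h centred on each sample."""
    n = len(samples)
    c = 1.0 / (n * h * math.sqrt(2 * math.pi))
    def f(x):
        return c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)
    return f

# Synthetic log10-volumes standing in for a landslide inventory
random.seed(0)
log_volumes = [random.gauss(3.0, 1.5) for _ in range(500)]
pdf = gaussian_kde(log_volumes, h=0.4)
# The estimated density is highest near the centre of the synthetic data
assert pdf(3.0) > pdf(8.0)
```

Working in log-volume is the usual way to handle quantities ranging over many orders of magnitude; the bandwidth h controls the smoothness of the estimate.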
Digital differential analysers
Shilejko, A V; Higinbotham, W
1964-01-01
Digital Differential Analysers presents the principles, operations, design, and applications of digital differential analyzers, a machine with the ability to present initial quantities and the possibility of dividing them into separate functional units performing a number of basic mathematical operations. The book discusses the theoretical principles underlying the operation of digital differential analyzers, such as the use of the delta-modulation method and function-generator units. Digital integration methods and the classes of digital differential analyzer designs are also reviewed. The te
Ruin probabilities for a regenerative Poisson gap generated risk process
DEFF Research Database (Denmark)
Asmussen, Søren; Biard, Romain
A risk process with constant premium rate c and Poisson arrivals of claims is considered. A threshold r is defined for claim interarrival times, such that if k consecutive interarrival times are larger than r, then the next claim has distribution G. Otherwise, the claim size distribution is F. Asy...
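A Monte Carlo sketch of the risk process just described can be written as below. The parameter values, the two claim distributions, and the rule that the run of large gaps does not reset after a G-claim are all assumptions made for illustration (the abstract is truncated and does not pin them down).

```python
import random

def simulate_ruin(c, lam, r, k, draw_F, draw_G, u0, horizon, rng):
    """Simulate one path of the regenerative Poisson-gap risk process.
    Premiums accrue at rate c; claims arrive as a Poisson process (rate lam).
    If k consecutive interarrival gaps exceed r, the next claim is drawn
    from G; otherwise from F. Returns True if reserves go negative."""
    u, t, run = u0, 0.0, 0
    while t < horizon:
        gap = rng.expovariate(lam)
        t += gap
        u += c * gap                      # premium income over the gap
        run = run + 1 if gap > r else 0   # count consecutive large gaps
        claim = draw_G(rng) if run >= k else draw_F(rng)
        u -= claim
        if u < 0:
            return True
    return False

rng = random.Random(1)
trials = 200
ruined = sum(
    simulate_ruin(c=1.2, lam=1.0, r=1.5, k=2,
                  draw_F=lambda g: g.expovariate(1.0),
                  draw_G=lambda g: g.expovariate(0.25),  # heavier claims after long gaps
                  u0=10.0, horizon=200.0, rng=rng)
    for _ in range(trials)
)
print("estimated ruin probability:", ruined / trials)
```

Such a simulation gives a crude empirical check on the asymptotic ruin probabilities that the paper derives analytically.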
Coenocline reconstruction using graph theory and Bayesian probability data generator
Czech Academy of Sciences Publication Activity Database
Čejchan, Petr
Krakow : Royal Society, 2007. s. 1-2. [United Kingdom -Visegrad Frontiers of Science Symposium /4./. 21.02.2007-23.02.2007, Krakow] Institutional research plan: CEZ:AV0Z30130516 Keywords : coenocline * gradient analysis * palaeoecology Subject RIV: DB - Geology ; Mineralogy
International Nuclear Information System (INIS)
The probability of observing different types of ionospheric troughs in Kosmos satellite data (about 3000 orbits) is analysed. Variations of trough appearance probability with season, longitude, local time and magnetic activity are identified and investigated. It is shown that trough formation probability depends on magnetic activity and background ionization; the latter is determined by illumination variations and the neutral wind
Calculational framework for safety analyses of non-reactor nuclear facilities
International Nuclear Information System (INIS)
A calculational framework for the consequences analysis of non-reactor nuclear facilities is presented. The analysis framework starts with accident scenarios which are developed through a traditional hazard analysis and continues with a probabilistic framework for the consequences analysis. The framework encourages the use of response continua derived from engineering judgment and traditional deterministic engineering analyses. The general approach consists of dividing the overall problem into a series of interrelated analysis cells and then devising Markov-chain-like probability transition matrices for each of the cells. An advantage of this division of the problem is that intermediate outputs (as probability state vectors) are generated at each calculational interface. The series of analyses, when combined, yields risk analysis output. The analysis approach is illustrated through application to two non-reactor nuclear analyses: the Ulysses Space Mission, and a hydrogen burn in the Hanford waste storage tanks
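The cell-by-cell propagation described above can be sketched as repeated application of row-stochastic transition matrices to a probability state vector. The two cells and their entries below are hypothetical, invented only to show the mechanics.

```python
def propagate(state, matrices):
    """Propagate a probability state vector through a series of analysis
    cells, each represented by a row-stochastic transition matrix."""
    for M in matrices:
        state = [sum(state[i] * M[i][j] for i in range(len(state)))
                 for j in range(len(M[0]))]
    return state

# Two illustrative cells, e.g. release occurrence then consequence severity
cell1 = [[0.9, 0.1],
         [0.2, 0.8]]
cell2 = [[0.95, 0.05],
         [0.30, 0.70]]
start = [1.0, 0.0]                   # initial probability state vector
v1 = propagate(start, [cell1])       # intermediate output at the interface
v2 = propagate(start, [cell1, cell2])
assert abs(sum(v2) - 1.0) < 1e-12    # probabilities stay normalized
```

The intermediate vector `v1` corresponds to the "intermediate output at each calculational interface" mentioned in the abstract.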
Modeling Banks' Probability of Default
2015-01-01
The unprecedented financial crisis of 2008-2009 has called attention to limitations of existing methods for estimating the default risk of financial institutions. Over the past decade, we have had considerable success at predicting default and credit relative value using Merton-type structural models and Hybrid Probability of Default models. However, generating accurate model-based estimates of default probabilities (PDs) for financial firms has proven difficult. To address this need, I built a...
Probability sampling design in ethnobotanical surveys of medicinal plants
Mariano Martinez Espinosa; Isanete G. C. Bieski; Domingos Tabajara de Oliveira Martins
2012-01-01
Non-probability sampling design can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NS...
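The design-based inference that a probability sampling design licenses can be sketched with simple random sampling and the Horvitz-Thompson estimator; the household data below are invented, not drawn from the cited survey.

```python
import random

def srs_without_replacement(population, n, rng):
    """Simple random sample of size n from a population of size N:
    every unit has known inclusion probability pi = n / N, which is
    what permits design-based statistical inference."""
    return rng.sample(population, n)

def ht_total(sample, pi):
    """Horvitz-Thompson estimator of the population total."""
    return sum(y / pi for y in sample)

rng = random.Random(42)
# Hypothetical: number of medicinal-plant species known per household
population = [rng.randint(0, 20) for _ in range(1000)]
n = 100
sample = srs_without_replacement(population, n, rng)
estimate = ht_total(sample, pi=n / len(population))
true_total = sum(population)
print(estimate, true_total)
```

With a non-probability design the inclusion probabilities are unknown, so no such unbiased estimator exists; that is the point the abstract makes.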
Probability and Statistics The Science of Uncertainty (Revised Edition)
Tabak, John
2011-01-01
Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
Lectures on probability and statistics
International Nuclear Information System (INIS)
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
Hf Transition Probabilities and Abundances
Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...
Gd Transition Probabilities and Abundances
Den Hartog, E A; Sneden, C; Cowan, J J
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...
Lectures on probability and statistics
Energy Technology Data Exchange (ETDEWEB)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
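The two problems the lectures contrast, the a priori dice calculation and the inverse (statistical) problem, can be sketched side by side. The loaded-die model in the second part is an invented example for illustration, not taken from the lectures.

```python
from fractions import Fraction
from collections import Counter

# A priori probability: exact distribution of the sum of two fair dice
outcomes = Counter(a + b for a in range(1, 7) for b in range(1, 7))
p_seven = Fraction(outcomes[7], 36)
assert p_seven == Fraction(1, 6)

# Inverse problem: infer from observed rolls whether a die is loaded.
# Hypothetical model: fair die vs. a die that rolls 6 with probability 1/2
# (each other face then has probability 1/10).
def posterior_loaded(rolls, prior=Fraction(1, 2)):
    like_fair = Fraction(1)
    like_load = Fraction(1)
    for r in rolls:
        like_fair *= Fraction(1, 6)
        like_load *= Fraction(1, 2) if r == 6 else Fraction(1, 10)
    num = prior * like_load
    return num / (num + (1 - prior) * like_fair)

print(float(posterior_loaded([6, 6, 6, 2])))
```

The forward calculation is fixed once the experiment is understood a priori; the inverse calculation depends on the data and on modelling choices, which is why, as the notes warn, its answers are often unsatisfactory in one respect or another.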
Energy Technology Data Exchange (ETDEWEB)
Zuern, Marcel
2010-07-01
The aim of this thesis is to analyse the connection between technological change and the development of global GHG emissions with a quantitative analytic framework. Because of the special importance of the electricity generation sector for the mitigation of CO{sub 2}, special attention is paid to this sector. The analysis of technological progress, particularly in the power generation sector on a global level, places substantial demands on the analytic framework. The great number of actors and the interplay of interdependent factors make an analytical solution to the problem impossible; therefore, a quantitative numerical model is necessary in order to analyse technological change on a global level. For the analysis of innovation and technological progress, the sectoral, regional and temporal dimensions have to be considered explicitly. The analysis should take all economic areas into account, because innovations are not restricted to a certain industrial sector or area of the economy but involve the whole economy. Concerning the geographical dimension, innovations are not bound to a single country but spread out over national borders. Adjustments to technological development take time to unfold, and therefore the analytical framework should cover a long-term horizon. The same requirements regarding the regional, geographical and temporal dimensions apply when analysing measures to reduce GHG emissions. The computable general equilibrium (CGE) model used in this work fulfils all of the requirements listed above. Since the GHG problem is a global one, its analysis demands a model that is appropriate for this level; the structure of CGE models and the use of economic data on the global level provide a suitable methodology. Since adjustments to measures of climate protection, as well as innovations and technological change, need time, a dynamic general equilibrium model with a long-term time horizon is used. Further advantages of CGE
Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen
2002-01-01
This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.
The Inductive Applications of Probability Calculus
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions when investigating the causes of events.
Delayed neutron emission probability measurements
International Nuclear Information System (INIS)
Some neutrons are emitted from fission fragments several seconds to several minutes after fission occurs. These delayed neutrons play a key role in the operation and safety of nuclear reactors [1], but the probabilities of emitting such neutrons (Pn) are not well known. A summary of different databases and compilations of Pn values is presented to show these discrepancies and uncertainties. Experiments are carried out at the Lohengrin mass spectrometer (at the Institut Laue-Langevin in Grenoble) and at the ISOLDE facility (CERN) in order to measure some Pn values. Two different techniques are used: gamma-ray detection or neutron-emission detection. These two techniques and some preliminary results are presented. (authors)
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Transition Probabilities in 189Os
International Nuclear Information System (INIS)
The level structure of 189Os has been studied from the decay of 189Ir (13.3 days) produced by proton spallation at CERN and mass separated in the ISOLDE on-line facility. The gamma-ray spectrum has been recorded both with a high-resolution Si(Li) detector and with Ge(Li) detectors. Three previously unreported transitions were observed, defining a new level at 348.5 keV. Special attention was given to the low-energy level band structure. Several multipolarity mixing ratios were deduced from measured L-subshell ratios which, together with measured level half-lives, gave absolute transition probabilities. The low-level decay properties are discussed in terms of the Nilsson model with the inclusion of Coriolis coupling
Transition probabilities for argon I
International Nuclear Information System (INIS)
Transition probabilities for Ar I lines have been calculated on the basis of the (j,k)-coupling scheme for more than 16000 spectral lines belonging to the transition arrays 4s-np (n=4 to n=9), 5s-np (n=5 to n=9), 6s-np (n=6 to n=9), 7s-np (n=8 to n=9), 4p-ns (n=5 to n=10), 5p-ns (n=6 to n=9), 6p-ns (n=7 to n=8), 4p-nd (n=3 to n=9), 5p-nd (n=4 to n=9), 3d-np (n=5 to n=9), 4d-np (n=6 to n=9), 5d-np (n=7 to n=9), 3d-nf (n=4 to n=9), 4d-nf (n=4 to n=9), 5d-nf (n=5 to n=9), 4f-nd (n=5 to n=9), 5f-nd (n=6 to n=9), 4f-ng (n=5 to n=9), 5f-ng (n=6 to n=9). Insofar as values by other authors exist, comparison is made with these values. It turns out that the results obtained in (j,k)-coupling are close to those obtained in intermediate coupling, except for intercombination lines. For high principal and/or orbital quantum numbers the transition probabilities of a multiplet approach those of the corresponding transitions in atomic hydrogen. The calculated values are applied to construct a simplified argon-atom model, which reflects the real transition properties and which allows simplified but realistic non-equilibrium calculations for argon plasmas which deviate from local thermodynamic equilibrium (LTE)
Probability theory and mathematical statistics for engineers
Pugachev, V S
2014-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Cosmological dynamics in tomographic probability representation
Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.
2004-01-01
The probability representation for quantum states of the universe, in which the states are described by a fair probability distribution instead of a wave function (or density matrix), is developed to consider cosmological dynamics. The evolution of the universe's state is described by a standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Uncertainty Analyses and Strategy
International Nuclear Information System (INIS)
The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository
Fusion probability in heavy nuclei
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ˜5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, Z_pZ_t), as well as with fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections over a wider energy range for
Avoiding Negative Probabilities in Quantum Mechanics
Nyambuya, Golden Gadzirayi
2013-01-01
As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless, if not outright nonsensical. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that, depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
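The post-processing step this record describes, which turns a stack of equally likely simulated images into a map of exceedance probabilities, can be sketched in a few lines. This is a toy illustration with synthetic lognormal fields and a hypothetical cleanup threshold, not the project's actual geostatistical workflow:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for geostatistical realizations: 200 equally likely simulated
# concentration grids (plain lognormal noise here, purely for illustration;
# real realizations would reproduce the sample data and spatial structure).
n_real, nx, ny = 200, 20, 20
realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(n_real, nx, ny))

threshold = 35.0  # hypothetical contamination level

# Probability map: per cell, the fraction of realizations that exceed
# the threshold -- a direct probability of exceeding the specified level.
prob_exceed = (realizations > threshold).mean(axis=0)

print(prob_exceed.shape)  # (20, 20)
```

Because each cell's value is a mean of booleans over realizations, the map is automatically bounded in [0, 1] and can be thresholded to delineate candidate remediation units.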
The Black Hole Formation Probability
Clausen, Drew; Ott, Christian D
2014-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...
Exact Bures Probabilities of Separability
Slater, P B
1999-01-01
We reexamine the question of what constitutes the conditional Bures or "quantum Jeffreys" prior for a certain four-dimensional convex subset (P) of the eight-dimensional convex set (Q) of 3 x 3 density matrices (rho_{Q}). We find that two competing procedures yield related but not identical priors - the prior previously reported (J. Phys. A 29, L271 [1996]) being normalizable over P, the new prior here, not. Both methods rely upon the same formula of Dittmann for the Bures metric tensor g, but differ in the parameterized form of rho employed. In the earlier approach, the input is a member of P, that is rho_{P}, while here it is rho_{Q}, and only after this computation is the conditioning on P performed. Then, we investigate several one-dimensional subsets of the fifteen-dimensional set of 4 x 4 density matrices, to which we apply, in particular, the first methodology. Doing so, we determine exactly the conditional Bures probabilities of separability into product states of 2 x 2 density matrices. We find that ...
Trajectory versus probability density entropy
Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo
2001-07-01
We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.
Random medium shiver movement probability and surface subsidence
Energy Technology Data Exchange (ETDEWEB)
Guo, Z.; Yin, Z.; Wang, J. [China University of Mining and Technology (China). Beijing Campus
2000-06-01
Based on the physical model of random medium, the shiver movement probability has been analysed and the surface movement prediction model under sub-critical extraction condition is obtained. The advantage of this prediction model when compared with the probability integration method is that the movement distance of the inflection point does not need to be determined. And because the rate of surface subsidence is taken directly as the prediction parameter, the prediction accuracy is improved. The results have some reference value for the prediction of surface movement caused by sub-critical extraction. 2 refs., 4 figs., 2 tabs.
Improved Estimates of Survival Probabilities via Isospectral Transformations
Bunimovich, Leonid; Webb, Benjamin
2012-01-01
We consider open systems generated from one-dimensional maps that admit a finite Markov partition and use the recently developed theory of isospectral graph transformations to estimate a system's survival probabilities. We show that these estimates are better than those obtained through a more direct approach.
Extended Fibonacci numbers and polynomials with probability applications
Directory of Open Access Journals (Sweden)
Demetrios L. Antzoulakos
2004-10-01
The extended Fibonacci sequence of numbers and polynomials is introduced and studied. The generating function, recurrence relations, an expansion in terms of multinomial coefficients, and several properties of the extended Fibonacci numbers and polynomials are obtained. Interesting relations between them and probability problems which take into account lengths of success and failure runs are also established.
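The link between order-k (extended) Fibonacci numbers and run-length probabilities can be made concrete with a short dynamic program. A sketch under the assumption of independent Bernoulli trials, using a run-length state instead of the paper's generating-function machinery:

```python
def prob_no_success_run(n, k, p=0.5):
    """P(no run of k consecutive successes in n Bernoulli(p) trials),
    via a DP whose states track the length of the current success run."""
    # state[r] = probability mass whose trailing success run has length r < k
    state = [0.0] * k
    state[0] = 1.0
    for _ in range(n):
        nxt = [0.0] * k
        for r, mass in enumerate(state):
            nxt[0] += mass * (1 - p)      # a failure resets the run
            if r + 1 < k:
                nxt[r + 1] += mass * p    # a success extends the run
            # a success at r == k-1 completes a forbidden run: mass dropped
        state = nxt
    return sum(state)

# For p = 1/2, 2^n * P equals the order-k (extended) Fibonacci count of
# binary strings of length n containing no k consecutive successes.
print(round(prob_no_success_run(4, 2) * 2**4))  # 8 strings of length 4 avoid "11"
```

For k = 2 this recovers the classical Fibonacci numbers (8 = F(6)); larger k gives the extended sequences the paper studies.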
THE BLACK HOLE FORMATION PROBABILITY
International Nuclear Information System (INIS)
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
THE BLACK HOLE FORMATION PROBABILITY
Energy Technology Data Exchange (ETDEWEB)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
Gas prices: realities and probabilities
International Nuclear Information System (INIS)
An assessment of price trends suggests continuing rise in 2001, with some easing of upward price movement in 2002 and 2003. Storage levels as of Nov. 1, 2000 are expected to be at 2.77 Tcf, but if the winter of 2000/2001 proves to be more severe than usual, inventory levels could sink as low as 500 Bcf by April 1, 2001. With increasing demand for natural gas for non-utility electric power generation the major challenge will be to achieve significant supply growth, which means increased developmental drilling and inventory draw-downs, as well as more exploratory drilling in deepwater and frontier regions. Absence of a significant supply response by next summer will affect both growth in demand and in price levels, and the increased demand for electric generation in the summer will create a flatter consumption profile, erasing the traditional summer/winter spread in consumption, further intensifying price volatility. Managing price fluctuations is the second biggest challenge (after potential supply problems) facing the industry
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan Cort
2013-10-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Limit Theorems in Free Probability Theory I
Chistyakov, G. P.; Götze, F.
2006-01-01
Based on a new analytical approach to the definition of additive free convolution on probability measures on the real line we prove free analogs of limit theorems for sums for non-identically distributed random variables in classical Probability Theory.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
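The spinner calculation the article describes reduces to arc length over total circumference. A minimal sketch, with made-up weights, that checks the calculation against a seeded simulation:

```python
import random
from collections import Counter

# Hypothetical biased spinner: sides whose arc lengths are proportional
# to these weights. Each side's landing probability is weight / total.
weights = {"A": 5, "B": 3, "C": 2}
total = sum(weights.values())
probs = {side: w / total for side, w in weights.items()}
print(probs)  # {'A': 0.5, 'B': 0.3, 'C': 0.2}

# A long simulated run should land near the computed probabilities.
rng = random.Random(42)
spins = rng.choices(list(weights), weights=list(weights.values()), k=100_000)
freq = Counter(spins)
print(freq["A"] / 100_000)  # close to 0.5
```

Comparing the empirical frequencies to the computed probabilities is itself a useful classroom exercise on sampling variability.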
Probability distributions of landslide volumes
Directory of Open Access Journals (Sweden)
M. T. Brunetti
2009-03-01
We examine 19 datasets with measurements of landslide volume, V_{L}, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10^{−4} m^{3} ≤ V_{L} ≤ 10^{13} m^{3}. We determine the probability density of landslide volumes, p(V_{L}), using kernel density estimation. Each landslide dataset exhibits heavy-tailed (self-similar) behaviour in its frequency-size distribution, p(V_{L}) as a function of V_{L}, for failures exceeding different threshold volumes, V_{L}*, for each dataset. These non-cumulative heavy-tailed distributions for each dataset are negative power laws, with exponents 1.0 ≤ β ≤ 1.9, averaging β ≈ 1.3. The scaling behaviour of V_{L} for the ensemble of the 19 datasets extends over 17 orders of magnitude, and is independent of lithological characteristics, morphological settings, triggering mechanisms, length of period and extent of the area covered by the datasets, presence or lack of water in the failed materials, and magnitude of gravitational fields. We argue that the statistics of landslide volume are conditioned primarily on the geometrical properties of the slope or rock mass where failures occur. Differences in the values of the scaling exponents reflect the primary landslide types, with rock falls exhibiting a smaller scaling exponent (1.1 ≤ β ≤ 1.4) than slides and soil slides (1.5 ≤ β ≤ 1.9). We argue that the difference is a consequence of the disparity in the mechanics of rock falls and slides.
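The tail-exponent estimation behind such frequency-size studies can be illustrated with synthetic data: sample volumes from a power law with a known β, then recover it by maximum likelihood. The numbers here (β = 1.3, matching the datasets' average, and a unit threshold volume) are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "landslide volumes": a Pareto tail with density
# p(V) ∝ V^(-beta) for V >= v_min, mimicking the paper's heavy tails.
beta_true, v_min, n = 1.3, 1.0, 50_000
u = rng.random(n)
volumes = v_min * (1 - u) ** (-1 / (beta_true - 1))  # inverse-CDF sampling

# Maximum-likelihood (Hill-type) estimate of the power-law exponent.
beta_hat = 1 + n / np.log(volumes / v_min).sum()
print(beta_hat)  # close to beta_true = 1.3
```

In practice the threshold V_{L}* must itself be estimated before fitting, since only the tail above it follows the power law.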
On the measurement probability of quantum phases
Schürmann, Thomas
2006-01-01
We consider the probability by which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.
Equivalence of two orthogonalities between probability measures
Takatsu, Asuka
2011-01-01
Given any two probability measures on a Euclidean space with mean 0 and finite variance, we demonstrate that the two probability measures are orthogonal in the sense of Wasserstein geometry if and only if the two spaces spanned by the supports of each probability measure are orthogonal.
Inferring Beliefs as Subjectively Imprecise Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.; Hole, Arna Risa; Rutström, E. Elisabeth
2012-01-01
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The...... probabilities are indeed best characterized as probability distributions with non-zero variance....
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
On Markov Chains Induced by Partitioned Transition Probability Matrices
Institute of Scientific and Technical Information of China (English)
Thomas KAIJSER
2011-01-01
Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P, then we call M a partition of P. Let K denote the set of probability vectors on S. With every partition M of P we can associate a transition probability function PM on K defined in such a way that if p ∈ K and M ∈ M are such that ‖pM‖ > 0, then, with probability ‖pM‖, the vector p is transferred to the vector pM/‖pM‖. Here ‖·‖ denotes the l1-norm. In this paper we investigate the convergence in distribution for Markov chains generated by transition probability functions induced by partitions of transition probability matrices. The main motivation for this investigation is the application of the convergence results obtained to filtering processes of partially observed Markov chains with denumerable state space.
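The induced transition rule on probability vectors, which picks M ∈ M with probability ‖pM‖ and moves to pM/‖pM‖, is easy to simulate on a finite state space. A sketch with a hypothetical 3-state matrix P split into two nonnegative parts:

```python
import numpy as np

rng = np.random.default_rng(0)

# A transition matrix P on S = {0, 1, 2}, partitioned as P = M1 + M2
# (both entrywise nonnegative). Example matrices chosen by hand.
M1 = np.array([[0.3, 0.1, 0.0],
               [0.2, 0.2, 0.0],
               [0.0, 0.4, 0.1]])
M2 = np.array([[0.1, 0.2, 0.3],
               [0.1, 0.3, 0.2],
               [0.2, 0.1, 0.2]])
P = M1 + M2
assert np.allclose(P.sum(axis=1), 1.0)  # P is row-stochastic

def step(p, partition, rng):
    """One step of the induced chain on probability vectors: choose M
    with probability ||pM||_1, then move to pM / ||pM||_1."""
    masses = [(p @ M).sum() for M in partition]  # these sum to 1
    i = rng.choice(len(partition), p=masses)
    return (p @ partition[i]) / masses[i]

p = np.array([1.0, 0.0, 0.0])  # start concentrated at state 0
for _ in range(10):
    p = step(p, [M1, M2], rng)
print(np.isclose(p.sum(), 1.0))  # True: iterates remain probability vectors
```

The masses ‖pM1‖ and ‖pM2‖ sum to 1 precisely because M1 + M2 = P is stochastic, which is what makes the random choice of part well defined.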
Using inferred probabilities to measure the accuracy of imprecise forecasts
Directory of Open Access Journals (Sweden)
Paul Lehner
2012-11-01
Research on forecasting is effectively limited to forecasts that are expressed with clarity, which is to say that the forecasted event must be sufficiently well defined that it can be clearly resolved whether or not the event occurred, and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately, most real-world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real-world forecasts, and consequently the accuracy of the methods used to generate them. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real-world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
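Once vague verbal forecasts have been mapped to numeric inferred probabilities, the standard quantitative machinery applies unchanged. A minimal sketch with made-up forecast/outcome pairs, scored with the Brier score (one of the scoring rules the abstract mentions):

```python
# Hypothetical inferred probabilities of five events, with their outcomes.
forecasts = [0.9, 0.7, 0.3, 0.8, 0.2]   # inferred probability of each event
outcomes  = [1,   1,   0,   0,   0]     # 1 = event occurred, 0 = did not

# Brier score: mean squared error between forecast and outcome.
brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
print(round(brier, 3))  # 0.174; lower is better, 0 is a perfect forecast
```

The same forecast/outcome pairs can also feed calibration and discrimination analyses, so imprecise forecasts become comparable to well-defined ones on every standard metric.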
Collision strengths and transition probabilities for Co III forbidden lines
Storey, P. J.; Sochi, Taha
2016-07-01
In this paper we compute the collision strengths and their thermally averaged Maxwellian values for electron transitions between the 15 lowest levels of doubly ionized cobalt, Co2+, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.
Collision strengths and transition probabilities for Co III forbidden lines
Storey, P J
2016-01-01
In this paper we compute the collision strengths and their thermally-averaged Maxwellian values for electron transitions between the fifteen lowest levels of doubly-ionised cobalt, Co^{2+}, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.
Quantification of digital forensic hypotheses using probability theory
Overill, RE; Silomon, JAM; Tse, HKS; Chow, KP
2013-01-01
The issue of downloading illegal material from a website onto a personal digital device is considered from the perspective of conventional (Pascalian) probability theory. We present quantitative results for a simple model system by which we analyse and counter the putative defence case that the forensically recovered illegal material was downloaded accidentally by the defendant. The model is applied to two actual prosecutions involving possession of child pornography.
Edgeworth on the Foundations of Ethics and Probability
Alberto Baccini
2004-01-01
This paper analyses the foundation of utilitarian ethics and the theory of probability in the works of Francis Y. Edgeworth. We argue that he pursued a unitary philosophical project, the search for a common epistemological foundation for the social sciences. The common root of the disciplines is the notion of “hereditary experience” derived from Herbert Spencer’s work. We suggest that this reconstruction can modify the overall interpretation of Edgeworth’s thought.
Probability and uncertainty in Keynes's The General Theory
Gillies, D
2003-01-01
Book description: John Maynard Keynes is undoubtedly the most influential Western economist of the twentieth century. His emphasis on the nature and role of uncertainty in economic thought is a dominant theme in his writings. This book brings together a wide array of experts on Keynes' thought such as Gay Tulip Meeks, Sheila Dow and John Davis who discuss, analyse and criticise such themes as Keynesian probability and uncertainty, the foundations of Keynes' economics and the relationship b...
Employment and Wage assimilation of Male First Generation Immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael;
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...
Employment and Wage Assimilation of Male First Generation Immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael;
2000-01-01
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...
Employment and Wage Assimilation of Male First-generation immigrants in Denmark
DEFF Research Database (Denmark)
Husted, Leif; Nielsen, Helena Skyt; Rosholm, Michael;
2001-01-01
Labour market assimilation of Danish first generation male immigrants is analysed based on two panel data sets covering the population of immigrants and 10% of the Danish population during 1984-1995. Wages and employment probabilities are estimated jointly in a random effects model which corrects...
Statistics and probability with applications for engineers and scientists
Gupta, Bhisham C
2013-01-01
Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob
Digital dice computational solutions to practical probability problems
Nahin, Paul J
2013-01-01
Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
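The book's core technique can be shown in miniature: replace an analytic derivation with a seeded Monte Carlo run. Here, estimating the probability that two dice sum to seven (exactly 6/36 analytically):

```python
import random

rng = random.Random(123)
n = 1_000_000  # "a million rolls of a pair of dice"

# Count rolls where the two dice sum to 7.
hits = sum(1 for _ in range(n)
           if rng.randint(1, 6) + rng.randint(1, 6) == 7)
print(hits / n)  # close to the exact value 1/6 ≈ 0.1667
```

The standard error of such an estimate scales as 1/√n, so a million rolls pins the answer to roughly three decimal places; harder problems in the book trade the simple roll for a more elaborate simulated process, but the estimator is the same.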
Pretest probability assessment derived from attribute matching
Directory of Open Access Journals (Sweden)
Hollander Judd E
2005-08-01
Abstract Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method of generating a PTP for acute coronary syndrome (ACS) with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for ability to produce PTP estimation. Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for attribute matching and 0.68 (95% CI 0.62 to 0.77) for the LRE. The attribute matching system categorized 1,670 (24%, 95% CI 23–25%) patients as having a PTP Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE.
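Attribute matching, as described, amounts to an exact-cohort lookup: the pretest probability for a new patient is the outcome prevalence among database patients sharing the same attribute profile. A toy sketch with two binary attributes and invented records (the real system uses eight attributes and 14,796 patients):

```python
# Hypothetical reference database: (attribute profile, had_ACS) pairs.
database = [
    ((1, 0), 1), ((1, 0), 0), ((1, 0), 0), ((1, 0), 0),
    ((0, 1), 1), ((0, 1), 1), ((0, 1), 0),
    ((0, 0), 0), ((0, 0), 0),
]

def pretest_probability(profile, db):
    """PTP = outcome prevalence among patients with this exact profile."""
    matches = [outcome for attrs, outcome in db if attrs == profile]
    if not matches:
        return None  # no exact match in the database: no estimate
    return sum(matches) / len(matches)

print(pretest_probability((1, 0), database))  # 0.25: 1 of 4 matches had ACS
```

Unlike a regression equation, this lookup produces only as many distinct PTP values as there are observed profiles, which is why the abstract reports the number of unique estimates each method yields.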
Reliability and safety analyses under fuzziness
International Nuclear Information System (INIS)
Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications when combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment; the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In this context, this collection of works marks a milestone in the debate between probability theory and fuzzy theory. The volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs
Probability and Quantum Paradigms: the Interplay
International Nuclear Information System (INIS)
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a suggested variant interpretation of wave functions based on photodetection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Probability and Quantum Paradigms: the Interplay
Kracklauer, A. F.
2007-12-01
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a suggested variant interpretation of wave functions based on photodetection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.
Introduction: Research and Developments in Probability Education
Manfred Borovcnik; Ramesh Kapadia
2009-01-01
In the topic study group on probability at ICME 11, a variety of ideas on probability education were presented. Some of the papers have been developed further by the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results, and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the wor...
Probabilities and signalling in quantum field theory
Dickinson, Robert; Forshaw, Jeff; Millington, Peter
2016-01-01
We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators i...
Local Percolation Probabilities for a Natural Sandstone
Hilfer, R.; Rag, T.; Virgi, B.
1996-01-01
Local percolation probabilities are used to characterize the connectivity in porous and heterogeneous media. Together with local porosity distributions, they allow prediction of transport properties \cite{hil91d}. While local porosity distributions are readily obtained, measurements of the local percolation probabilities are more difficult and have not been attempted previously. First measurements of three-dimensional local porosity distributions and percolation probabilities from a pore space re...
Are All Probabilities Fundamentally Quantum Mechanical?
Pradhan, Rajat Kumar
2011-01-01
The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double slit interference experiments are discussed as illustrative prototype e...
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the...... motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
Lagrangian Probability Distributions of Turbulent Flows
Friedrich, R.
2002-01-01
We outline a statistical theory of turbulence based on the Lagrangian formulation of fluid motion. We derive a hierarchy of evolution equations for Lagrangian N-point probability distributions as well as a functional equation for a suitably defined probability functional, which is the analog of Hopf's functional equation. Furthermore, we address the derivation of a generalized Fokker-Planck equation for the joint velocity-position probability density of N fluid particles.
Quantum Statistical Mechanics. III. Equilibrium Probability
Attard, Phil
2014-01-01
Given are a first-principles derivation and formulation of the probabilistic concepts that underlie equilibrium quantum statistical mechanics. The transition to non-equilibrium probability is traversed briefly.
Some New Results on Transition Probability
Institute of Scientific and Technical Information of China (English)
Yu Quan XIE
2008-01-01
In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.
Time and probability in quantum cosmology
Energy Technology Data Exchange (ETDEWEB)
Greensite, J. (San Francisco State Univ., CA (USA). Dept. of Physics and Astronomy)
1990-10-01
A time function, an exactly conserved probability measure, and a time-evolution equation (related to the Wheeler-DeWitt equation) are proposed for quantum cosmology. The time-integral of the probability measure is the measure proposed by Hawking and Page. The evolution equation reduces to the Schroedinger equation, and the probability measure to the Born measure, in the WKB approximation. The existence of this 'Schroedinger limit', which involves a cancellation of time-dependencies in the probability density between the WKB prefactor and the integration measure, is a consequence of Laplacian factor ordering in the Wheeler-DeWitt equation. (orig.).
Real analysis and probability solutions to problems
Ash, Robert P
1972-01-01
Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory. Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.
Bayesian logistic betting strategy against probability forecasting
Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei
2012-01-01
We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy for assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency of avoiding clear-cut forecasts.
Entailment in Probability of Thresholded Generalizations
Bamber, Donald
2013-01-01
A nonmonotonic logic of thresholded generalizations is presented. Given propositions A and B from a language L and a positive integer k, the thresholded generalization A=>B{k} means that the conditional probability P(B|A) falls short of one by no more than c*d^k. A two-level probability structure is defined. At the lower level, a model is defined to be a probability function on L. At the upper level, there is a probability distribution over models. A definition is given of what it means for a...
Advantages of the probability amplitude over the probability density in quantum mechanics
Kurihara, Yoshimasa; Quach, Nhi My Uyen
2013-01-01
We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...
Energy Technology Data Exchange (ETDEWEB)
NONE
2008-07-01
This study was made in the particular context of strong growth in the biofuels market and the involvement of French and European public authorities, and certain Member States (Germany, Netherlands, UK), in the development of certification schemes for first-generation biofuels. The elaboration of such schemes requires a consensus on the methodology to apply when producing Life Cycle Analyses (LCA) of biofuels. To answer this demand, the study built up the methodological referential for biofuel LCAs in order to assess the greenhouse gas (GHG) emissions, fossil fuel consumption and local atmospheric pollutant emissions induced by the different biofuel production pathways. The work consisted of methodological engineering, and was accomplished thanks to the participation of all the members of the Technical Committee of the study. An initial bibliographic review of biofuel LCAs allowed the identification of the main methodological issues (listed below). For each point, the impact of the methodological choices on the biofuel environmental balances was assessed by several sensitivity analyses. The results of these analyses were taken into account in the elaboration of the recommendations: - Consideration of the environmental burdens associated with buildings, equipment and their maintenance - Quantification of nitrous oxide (N₂O) emissions from fields - Impact of Land Use Change (LUC) - Allocation method for the distribution of the environmental impacts of biofuel production pathways between the different products and coproducts generated. Within the framework of this study, we made no distinction in terms of methodological approach between GHG emissions and local pollutant emissions. This results from the fact that the methodological issues cover all the environmental burdens and do not require specific approaches. This executive summary presents the methodological aspects related to biofuel LCAs. The complete report of the study presents in
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
Predicting most probable conformations of a given peptide sequence in the random coil state.
Bayrak, Cigdem Sevim; Erman, Burak
2012-11-01
In this work, we present a computational scheme for finding high probability conformations of peptides. The scheme calculates the probability of a given conformation of the given peptide sequence using the probability distribution of torsion states. Dependence of the states of a residue on the states of its first neighbors along the chain is considered. Prior probabilities of torsion states are obtained from a coil library. Posterior probabilities are calculated by the matrix multiplication Rotational Isomeric States Model of polymer theory. The conformation of a peptide with highest probability is determined by using a hidden Markov model Viterbi algorithm. First, the probability distribution of the torsion states of the residues is obtained. Using the highest probability torsion state, one can generate, step by step, states with lower probabilities. To validate the method, the highest probability state of residues in a given sequence is calculated and compared with probabilities obtained from the Coil Databank. Predictions based on the method are 32% better than predictions based on the most probable states of residues. The ensemble of "n" high probability conformations of a given protein is also determined using the Viterbi algorithm with multistep backtracking. PMID:22955874
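The Viterbi step described above can be sketched on a toy first-order chain. The two torsion "states" and all probability values below are made-up placeholders, not the coil-library statistics of the paper; the dynamic program itself is the standard one:

```python
import math

def viterbi(priors, trans, n_steps):
    """Most probable state sequence of length n_steps for a first-order
    Markov chain; priors maps state -> initial probability, trans maps
    (state, state) -> transition probability."""
    states = list(priors)
    # log-probability of the best path ending in each state
    best = {s: math.log(priors[s]) for s in states}
    back = []
    for _ in range(n_steps - 1):
        new_best, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda r: best[r] + math.log(trans[(r, s)]))
            new_best[s] = best[prev] + math.log(trans[(prev, s)])
            ptr[s] = prev
        back.append(ptr)
        best = new_best
    # backtrack from the best final state
    last = max(states, key=best.get)
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# toy two-state example ('a' and 'b' stand in for torsion bins)
priors = {"a": 0.6, "b": 0.4}
trans = {("a", "a"): 0.8, ("a", "b"): 0.2, ("b", "a"): 0.3, ("b", "b"): 0.7}
print(viterbi(priors, trans, 4))  # ['a', 'a', 'a', 'a']
```

For these toy numbers the self-transitions dominate, so the most probable 4-residue path stays in the higher-prior state throughout.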
Finite-size scaling of survival probability in branching processes
Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro
2015-04-01
Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y e^y/(e^y - 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
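A quick simulation makes the quantity under study concrete. The sketch below uses Poisson offspring (one convenient finite-variance choice, not mandated by the paper) and estimates the fraction of processes still alive after a fixed number of generations:

```python
import math
import random

def survival_fraction(mean_offspring, generations, trials=2000, seed=1):
    """Monte Carlo estimate of the survival probability of a Galton-Watson
    process with Poisson(mean_offspring) offspring after `generations` steps."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method; adequate for small lam
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    alive = 0
    for _ in range(trials):
        pop = 1
        for _ in range(generations):
            pop = sum(poisson(mean_offspring) for _ in range(pop))
            if pop == 0:
                break
        alive += pop > 0
    return alive / trials

# subcritical processes die out; supercritical ones survive with positive probability
print(survival_fraction(0.5, 20), survival_fraction(2.0, 8))
```

At the critical point (mean offspring 1), Kolmogorov's asymptotic gives survival probability decaying like 2/(nσ²), which is the regime where the finite-size scaling above applies.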
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
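The life-table arithmetic the article builds on is a short exercise: from age-specific one-year death probabilities q_x one computes survivors and person-years lived. This sketch uses a made-up four-age table (not real mortality data) and the common mid-year convention for deaths:

```python
def life_expectancy(qx):
    """Expected remaining years at age 0 from a list of one-year death
    probabilities q_x, assuming deaths occur at mid-year and the last
    q_x equals 1 (the table is closed out)."""
    l = 1.0          # survivors to the current age (radix 1)
    total = 0.0      # person-years lived
    for q in qx:
        deaths = l * q
        total += (l - deaths) + 0.5 * deaths  # survivors a full year, deaths half
        l -= deaths
    return total

# toy 4-age table, purely illustrative
print(round(life_expectancy([0.01, 0.02, 0.5, 1.0]), 3))
```

Remaining life expectancy at a later age is the same computation started from that age's q_x onward.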
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-c...... well as for bimodal processes with two dominating frequencies in the structural response....
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability of...
Average Transmission Probability of a Random Stack
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
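The distinction drawn here matters because the two averaging conventions give different answers. In the toy model below (per-gap transmission factors drawn uniformly, an illustrative assumption unrelated to the paper's recurrence relation), Jensen's inequality forces exp(⟨ln T⟩) ≤ ⟨T⟩:

```python
import math
import random

def transmission_averages(n_slabs=20, trials=5000, seed=3):
    """Toy model: the stack transmission T is a product of independent
    per-gap factors uniform on (0.5, 1.0). Returns (mean of T,
    exp(mean of ln T)); by Jensen's inequality the second is smaller."""
    rng = random.Random(seed)
    mean_t = 0.0
    mean_log_t = 0.0
    for _ in range(trials):
        t = 1.0
        for _ in range(n_slabs):
            t *= rng.uniform(0.5, 1.0)
        mean_t += t
        mean_log_t += math.log(t)
    return mean_t / trials, math.exp(mean_log_t / trials)

avg_t, geo_t = transmission_averages()
print(avg_t, geo_t)  # arithmetic mean exceeds the log-average
```

The typical (log-averaged) transmission is noticeably smaller than the arithmetic mean, which is dominated by rare high-transmission stacks; this is why the choice of average matters.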
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
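One standard example of the kind reviewed here: the chance that all draws come from a favored subset is a product of conditional probabilities, e.g. P(two aces from a 52-card deck) = (4/52)(3/51) = 1/221. A small exact computation using Python's Fraction to avoid rounding:

```python
from fractions import Fraction

def prob_all_special(special, total, draws):
    """P(all `draws` items come from the `special` subset) when drawing
    without replacement: product of conditional probabilities."""
    p = Fraction(1)
    for i in range(draws):
        p *= Fraction(special - i, total - i)
    return p

p = prob_all_special(4, 52, 2)  # two aces from a standard deck
print(p, float(p))              # 1/221
```

Each factor conditions on the previous draws having succeeded, which is exactly the conditional-probability reasoning these courses aim to teach.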
Correlation as Probability of Common Descent.
Falk, Ruma; Well, Arnold D.
1996-01-01
One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…
47 CFR 1.1623 - Probability calculation.
2010-10-01
Title 47 (Telecommunication), 2010-10-01 edition. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall...
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Recent Developments in Applied Probability and Statistics
Devroye, Luc; Kohler, Michael; Korn, Ralf
2010-01-01
This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
Analytical theory of the probability distribution function of structure formation
Anderson, Johan; Kim, Eun-Jin
2009-01-01
The probability distribution function (PDF) tails of zonal flow structure formation and the PDF tails of momentum flux, incorporating the effect of a shear flow in ion-temperature-gradient (ITG) turbulence, are computed in the present paper. The bipolar vortex soliton (modon) is assumed to be the coherent structure responsible for bursty and intermittent events driving the PDF tails. It is found that stronger zonal flows are generated in ITG turbulence than in Hasegawa-Mima (HM) turbulence as w...
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classification vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial parameters and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
Laboratory-Tutorial activities for teaching probability
Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.
2006-01-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...
The enigma of probability and physics
International Nuclear Information System (INIS)
This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)
Quantum probability assignment limited by relativistic causality.
Han, Yeong Deok; Choi, Taeseung
2016-01-01
Quantum theory has nonlocal correlations, which bothered Einstein, but which were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment rule called the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether a change of the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule on quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717
Exact capture probability analysis of GSC receivers over Rayleigh fading channel
Nam, Sungsik
2010-01-01
For third-generation systems and ultrawideband systems, RAKE receivers have been introduced because of their ability to combine different replicas of the transmitted signal arriving at different delays in a rich multipath environment. In principle, RAKE receivers combine all resolvable paths, which gives the best performance in a rich diversity environment. However, this is usually costly in terms of the hardware required as the number of RAKE fingers increases. Therefore, generalized selection combining (GSC) RAKE reception was proposed and has been studied by many researchers as an alternative to the two classical fundamental diversity schemes: maximal ratio combining and selection combining. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, the remaining set of uncombined paths affects the overall performance in terms of power loss. Therefore, to gain a full understanding of the performance of GSC RAKE receivers, we introduce in this paper the notion of capture probability, defined as the ratio of the captured power (essentially the combined paths' power) to the total available power. The major difficulty in these problems is to derive the joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
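The capture-probability notion is easy to probe numerically: for i.i.d. Rayleigh fading the path powers are i.i.d. exponential, and GSC keeps the Lc strongest of L paths. This Monte Carlo sketch (illustrative parameters, not the paper's closed-form analysis) estimates the mean captured-power fraction:

```python
import random

def mean_capture_probability(L, Lc, trials=20000, seed=7):
    """Monte Carlo estimate of E[(sum of Lc largest path powers) / (total
    power)] for L i.i.d. unit-mean exponential path powers (Rayleigh fading)."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        powers = sorted((rng.expovariate(1.0) for _ in range(L)), reverse=True)
        acc += sum(powers[:Lc]) / sum(powers)
    return acc / trials

print(mean_capture_probability(L=5, Lc=2))
```

Combining all paths (Lc = L) captures everything by definition, so the estimate there is exactly 1; for Lc < L the fraction falls strictly between Lc/L and 1, since each combined path carries at least the average share of the power.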
Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package
International Nuclear Information System (INIS)
In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by keff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package
Assessing the clinical probability of pulmonary embolism
Energy Technology Data Exchange (ETDEWEB)
Miniati, M. [Consiglio Nazionale delle Ricerche, Institute of Clinical Physiology, Pisa (Italy); Pistolesi, M. [University of Florence, Dept. of Section of Nuclear Medicine Critical Care, Florence (Italy)
2001-12-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was
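The pretest-to-posttest update mentioned above is Bayes' rule in odds form: post-test odds = pretest odds × likelihood ratio of the test result. A sketch with a hypothetical likelihood ratio (the LR value of 8 is illustrative, not taken from the cited studies):

```python
def posttest_probability(pretest, likelihood_ratio):
    """Convert a pretest probability into a post-test probability via
    odds: post-odds = pre-odds * LR, then back to a probability."""
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# intermediate clinical probability 0.38 with a hypothetical positive-test LR of 8
print(round(posttest_probability(0.38, 8.0), 3))  # about 0.831
```

An intermediate pretest probability of 38% combined with a strongly positive test (LR 8) yields a post-test probability of roughly 83%, which is why pretest stratification changes how the same test result is read.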
Assessing the clinical probability of pulmonary embolism
International Nuclear Information System (INIS)
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
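The pretest-to-post-test step described in the abstract can be sketched with Bayes' theorem in odds form. This is a generic illustration, not the scoring rule itself, and the likelihood ratio used below is a hypothetical value for an imaging test:

```python
def post_test_probability(pretest_p, likelihood_ratio):
    """Bayes' theorem in odds form: convert a pretest probability
    and a test likelihood ratio into a post-test probability."""
    pretest_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pretest_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical example: a low-probability patient (10% prevalence,
# as for a score <= 4) and a positive test with an assumed LR+ of 8.
p = post_test_probability(0.10, 8.0)  # 8/17, about 0.47
```

A likelihood ratio of 1 leaves the probability unchanged, which is why an uninformative test cannot move the clinical probability estimate.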
Angular anisotropy representation by probability tables
International Nuclear Information System (INIS)
In this paper, we improve point-wise or group-wise angular anisotropy representation by using probability tables. The starting point of this study was to give more flexibility (sensitivity analysis) and more accuracy (ray effect) to group-wise anisotropy representation by Dirac functions, independently introduced at CEA (Mao, 1998) and at IRSN (Le Cocq, 1998) ten years ago. Building on our experience of cross-section description acquired in CALENDF (Sublet et al., 2006), we introduce two kinds of moment-based probability tables, Dirac (DPT) and Step-wise (SPT) Probability Tables, where the angular probability distribution is represented by Dirac functions or by a step-wise function, respectively. First, we show how we can improve the equi-probable cosine representation of point-wise anisotropy by using step-wise probability tables. Then we show, by Monte Carlo techniques, how we can obtain a more accurate description of group-wise anisotropy than the one usually given by a finite expansion on a Legendre polynomial basis (which can induce negative values), and finally we describe it by Dirac probability tables. This study is carried out in the framework of GALILEE project R and D activities (Coste-Delclaux, 2008). (authors)
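The equi-probable cosine representation mentioned above can be illustrated with a small sketch: given an angular density tabulated on a cosine grid, split [-1, 1] into bins of equal probability and sample by picking a bin uniformly. This is a schematic construction using simple trapezoidal integration, not the CALENDF algorithm itself:

```python
import bisect
import random

def equiprobable_bins(pdf, n_bins, grid):
    """Split the cosine range into n_bins intervals of equal probability,
    given pdf values tabulated on 'grid' (schematic trapezoidal CDF)."""
    cdf = [0.0]
    for i in range(1, len(grid)):
        cdf.append(cdf[-1] + 0.5 * (pdf[i] + pdf[i - 1]) * (grid[i] - grid[i - 1]))
    total = cdf[-1]
    bounds = [grid[0]]
    for k in range(1, n_bins):
        target = total * k / n_bins
        j = bisect.bisect_left(cdf, target)
        # linear interpolation between grid points bracketing the quantile
        f = (target - cdf[j - 1]) / (cdf[j] - cdf[j - 1])
        bounds.append(grid[j - 1] + f * (grid[j] - grid[j - 1]))
    bounds.append(grid[-1])
    return bounds

def sample_mu(bounds):
    """Sample a cosine: pick a bin uniformly, then uniform within it."""
    k = random.randrange(len(bounds) - 1)
    return random.uniform(bounds[k], bounds[k + 1])
```

For an isotropic density the bin boundaries come out evenly spaced; for a forward-peaked density they crowd toward +1, which is exactly what makes the representation equi-probable.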
Measurement of probability distributions for internal stresses in dislocated crystals
Energy Technology Data Exchange (ETDEWEB)
Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)
2014-11-03
Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
Uncertainty about probability: a decision analysis perspective
Energy Technology Data Exchange (ETDEWEB)
Howard, R.A.
1988-03-01
The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
Uncertainty about probability: a decision analysis perspective
International Nuclear Information System (INIS)
The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
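The coin-tossing claim is easy to verify numerically. Below, uncertainty about the definitive number is represented by a two-point distribution; the probability of the next head is the mean (first moment), and observing a head raises the probability of the next one to the ratio of the second moment to the first. The particular two-point distribution is only an illustration:

```python
def moment(dist, k):
    """k-th moment of a discrete distribution over the definitive
    probability p, given as (p, weight) pairs."""
    return sum(w * p ** k for p, w in dist)

# A coin believed fair on average but not known to be fair:
# the definitive number is 0.4 or 0.6 with equal weight.
dist = [(0.4, 0.5), (0.6, 0.5)]

p_head = moment(dist, 1)                               # 0.5
p_head_after_head = moment(dist, 2) / moment(dist, 1)  # 0.52 > 0.5
```

Only when the distribution is a point mass (absolute sureness about the coin's fairness) does observing a head leave the probability of the next head unchanged.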
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory."" - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Radiative lifetimes and atomic transition probabilities
International Nuclear Information System (INIS)
Radiative lifetimes and atomic transition probabilities have been measured for over 35 neutral and singly ionized species in the Wisconsin Atomic Transition Probabilities (WATP) Program since it began in 1980. Radiative lifetimes are measured using time-resolved laser-induced fluorescence of a slow atomic/ionic beam. These lifetimes are combined with branching fractions to yield absolute atomic transition probabilities for neutral and singly ionized species. The branching fractions are determined from emission spectra recorded using the 1.0 m Fourier-transform spectrometer at the National Solar Observatory. The current focus of the WATP Program is on the rare-earth elements, in particular Tm, Dy, and Ho
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the...... same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with...
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Probability Distributions for a Surjective Unimodal Map
Institute of Scientific and Technical Information of China (English)
HongyanSUN; LongWANG
1996-01-01
In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ function, asymmetric, and symmetric, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.
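The role of the binary structure of the initial value is easy to demonstrate numerically: every IEEE double is a dyadic rational, and the tent map shifts its binary expansion one place per step, so every floating-point orbit collapses exactly to the fixed point 0 (the δ-function type) within roughly 60 iterations. This is an illustration of the classification, not a substitute for the paper's analysis:

```python
def tent(x):
    """The surjective tent map on [0, 1]."""
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

# 0.1 is stored as a dyadic rational k / 2**55, and both branches of the
# map are exact in binary floating point, so the orbit reaches 0 exactly.
x = 0.1
for _ in range(100):
    x = tent(x)
# x is now exactly 0.0
```

A genuinely normal number (in Borel's sense) as initial value would instead equidistribute under the uniform invariant density; only initial values with special binary structure, such as dyadic rationals, produce the δ-function type.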
Are All Probabilities Fundamentally Quantum Mechanical?
Pradhan, Rajat Kumar
2011-01-01
The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. Absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.
Miscorrection probability beyond the minimum distance
Cassuto, Yuval; Bruck, Jehoshua
2004-01-01
The miscorrection probability of a list decoder is the probability that the decoder will have at least one non-causal codeword in its decoding sphere. Evaluating this probability is important when using a list decoder as a conventional decoder, since in that case we require the list to contain at most one codeword for most of the errors. A lower bound on the miscorrection is the main result. The key ingredient in the proof is a new combinatorial upper bound on the list size for a general q-ary...
Size constrained unequal probability sampling with a non-integer sum of inclusion probabilities
Grafström, Anton; Qualité, Lionel; Tillé, Yves; Matei, Alina
2012-01-01
More than 50 methods have been developed to draw unequal probability samples with fixed sample size. All these methods require the sum of the inclusion probabilities to be an integer number. There are cases, however, where the sum of desired inclusion probabilities is not an integer. Then, classical algorithms for drawing samples cannot be directly applied. We present two methods to overcome the problem of sample selection with unequal inclusion probabilities when their sum is not an integer ...
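One simple device for the non-integer case, sketched below under our own assumptions (it is not necessarily one of the two methods of the paper), is to append a phantom unit whose inclusion probability absorbs the fractional part, draw a fixed-size sample by systematic unequal-probability sampling, and discard the phantom if it is drawn. Each real unit keeps its desired inclusion probability; only the realized sample size becomes random between the two adjacent integers:

```python
import math
import random

def systematic_sample(p):
    """Fixed-size unequal-probability systematic sampling; the sum of
    the inclusion probabilities p must be an integer."""
    r = random.random()
    sample, cum = [], 0.0
    for i, pi in enumerate(p):
        new = cum + pi
        # unit i is selected if an integer grid point r + k falls in (cum, new]
        if math.floor(new - r) > math.floor(cum - r):
            sample.append(i)
        cum = new
    return sample

def sample_noninteger(p):
    """Handle a non-integer sum of inclusion probabilities by adding a
    phantom unit absorbing the fractional part, then discarding it if drawn
    (one plausible device; the paper's two methods may differ)."""
    frac = math.ceil(sum(p)) - sum(p)
    units = p + [frac] if frac > 1e-12 else p
    phantom = len(p)
    return [i for i in systematic_sample(units) if i != phantom]
```

With probabilities [0.5, 0.5, 0.5], whose sum is 1.5, this returns samples of size 1 or 2 while each unit is still included with probability 0.5.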
Andrew J. Filardo
1998-01-01
This paper discusses a practical estimation issue for time-varying transition probability (TVTP) Markov switching models. Time-varying transition probabilities allow researchers to capture important economic behavior that may be missed using constant (or fixed) transition probabilities. Despite its use, Hamilton’s (1989) filtering method for estimating fixed transition probability Markov switching models may not apply to TVTP models. This paper provides a set of sufficient conditions to justi...
The external costs of low probability-high consequence events: Ex ante damages and lay risks
International Nuclear Information System (INIS)
This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report
The probability of traffic accidents associated with the transport of radioactive wastes
International Nuclear Information System (INIS)
This report evaluates the probability of a container impact during transit between generating and disposal sites. Probabilities per route mile are combined with the characteristics of the transport systems described in previous reports, to allow a comparison of different disposal options to be made. (author)
Institute of Scientific and Technical Information of China (English)
李光耀; 骆佳勇; 封孝松; 龚林平
2013-01-01
The configurations of generator-transformer protections and control and monitoring panels in Xiluodu Hydropower Station are briefly introduced, and the complete longitudinal differential protection, unit-transverse differential protection and rotor earth-fault protection are analyzed in detail. The internal fault simulation of the generator-transformer unit undertaken by Tsinghua University shows that the configuration of the generator-transformer main protection in Xiluodu Hydropower Station is scientific and rational, and dual configuration of the main protection, abnormal operation protection and backup protection is achieved. The design of the generator-transformer protection in Xiluodu Hydropower Station is simple and reliable, and its operation and maintenance are easy.
Certainties and probabilities of the IPCC
International Nuclear Information System (INIS)
Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)
Asymmetry of the work probability distribution
Saha, Arnab; Bhattacharjee, J. K.
2006-01-01
We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.
Transition Probability and the ESR Experiment
McBrierty, Vincent J.
1974-01-01
Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)
Transition Probability Estimates for Reversible Markov Chains
Telcs, Andras
2000-01-01
This paper provides transition probability estimates of transient reversible Markov chains. The key condition of the result is the spatial symmetry and polynomial decay of the Green's function of the chain.
Transition probabilities in superfluid He4
International Nuclear Information System (INIS)
The transition probabilities between various states of superfluid helium-4 are found by using the approximation method of Bogolyubov and making use of his canonical transformations for different states of transitions. (author)
Modelling the probability of building fires
Directory of Open Access Journals (Sweden)
Vojtěch Barták
2014-12-01
Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
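The prediction step of such a model is just the logistic (sigmoid) transform of a linear score in the building attributes. The features and coefficients below are hypothetical placeholders, not the study's fitted values:

```python
import math

def fire_probability(features, coef, intercept):
    """Logistic-regression probability: sigmoid of the linear predictor.
    The coefficients here are hypothetical, not the study's estimates."""
    z = intercept + sum(c * x for c, x in zip(coef, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical building attributes: floor area (in 100 m^2 units),
# age (decades), and a heating-type indicator.
p = fire_probability([2.5, 4.0, 1.0], coef=[0.30, 0.15, 0.80], intercept=-4.0)
```

Each coefficient acts multiplicatively on the odds: under these made-up numbers, setting the heating-type indicator from 0 to 1 multiplies the odds of fire by exp(0.80).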
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698) * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods
Determinantal Probability: Basic Properties and Conjectures
Lyons, Russell
2014-01-01
We describe the fundamental constructions and properties of determinantal probability measures and point processes, giving streamlined proofs. We illustrate these with some important examples. We pose several general questions and conjectures.
Encounter Probability of Individual Wave Height
DEFF Research Database (Denmark)
Liu, Z.; Burcharth, H. F.
1998-01-01
wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of...... the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
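Under the Rayleigh assumption stated above, both the expected maximum individual wave height and its encounter probability over a storm of N waves have simple closed forms. A minimal sketch, using the common first-order approximation for the expected maximum and assuming independent wave heights:

```python
import math

def rayleigh_exceedance(h, hs):
    """P(individual wave height > h) under the Rayleigh model,
    where hs is the significant wave height."""
    return math.exp(-2.0 * (h / hs) ** 2)

def expected_hmax(hs, n_waves):
    """First-order estimate of the expected maximum of n_waves
    Rayleigh-distributed heights (the common design approximation)."""
    return hs * math.sqrt(0.5 * math.log(n_waves))

def encounter_probability(h, hs, n_waves):
    """Probability that at least one of n_waves exceeds h,
    assuming independent heights (a simplifying assumption)."""
    return 1.0 - (1.0 - rayleigh_exceedance(h, hs)) ** n_waves
```

Plugging the expected maximum back in gives an encounter probability near 1 - 1/e ≈ 0.63, which illustrates the paper's point: the expected maximum individual wave height does not by itself correspond to a small exceedance probability within the structure lifetime.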
Representing Uncertainty by Probability and Possibility
DEFF Research Database (Denmark)
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of...
Weak Convergence of Probability Measures Revisited
Salinetti, G.; Wets, R. J.-B.
1987-01-01
The hypo-convergence of upper semicontinuous functions provides a natural framework for the study of the convergence of probability measures. This approach also yields some further characterizations of weak convergence and tightness.
Stimulus probability effects in absolute identification.
Kent, Christopher; Lamberts, Koen
2016-05-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record) PMID: 26478959
Probability of spent fuel transportation accidents
Energy Technology Data Exchange (ETDEWEB)
McClure, J. D.
1981-07-01
The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory tests for impact is approximately 10^-9 per mile.
Krajnc, Božidar; Župec, Janez
2000-01-01
Krško nuclear power plant (NPP) is one of the last pressurized water reactor NPPs of western design in Europe which has decided to replace the existing steam generators and at the same time perform a power uprating. A comprehensive set of design calculations and safety analyses has been performed to demonstrate: - that the new steam generators are compatible with the existing plant, - that the plant can operate safely and with adequate margins at the uprated power. In this power only the mechanic...
Subjective Probability and the Theory of Games
Kadane, Joseph B.; Patrick D. Larkey
1982-01-01
This paper explores some of the consequences of adopting a modern subjective view of probability for game theory. The consequences are substantial. The subjective view of probability clarifies the important distinction between normative and positive theorizing about behavior in games, a distinction that is often lost in the search for "solution concepts" which largely characterizes game theory since the work of von Neumann and Morgenstern. Many of the distinctions that appear important in con...
A case concerning the improved transition probability
Tang, Jian; Wang, An Min
2006-01-01
As is well known, the existing perturbation theory can be applied to calculations of energy, state and transition probability in many quantum systems. However, in our view there are different paths and methods to improve its calculation precision and efficiency. According to an improved scheme of perturbation theory proposed by [An Min Wang, quant-ph/0611217], we reconsider the transition probability and perturbed energy for a Hydrogen atom in a constant magnetic field. We find the results obt...
Atomic transition probabilities of neutral samarium
International Nuclear Information System (INIS)
Absolute atomic transition probabilities from a combination of new emission branching fraction measurements using Fourier transform spectrometer data with radiative lifetimes from recent laser induced fluorescence measurements are reported for 299 lines of the first spectrum of samarium (Sm i). Improved values for the upper and lower energy levels of these lines are also reported. Comparisons to published transition probabilities from earlier experiments show satisfactory and good agreement with two of the four published data sets. (paper)
Validation of fluorescence transition probability calculations
M. G. Pia (INFN, Sezione di Genova); P. Saracco (INFN, Sezione di Genova); Manju Sudhaka (INFN, Sezione di Genova)
2015-01-01
A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimen...
Generalized couplings and convergence of transition probabilities
Kulik, Alexei; Scheutzow, Michael
2015-01-01
We provide sufficient conditions for the uniqueness of an invariant measure of a Markov process as well as for the weak convergence of transition probabilities to the invariant measure. Our conditions are formulated in terms of generalized couplings. We apply our results to several SPDEs for which unique ergodicity has been proven in a recent paper by Glatt-Holtz, Mattingly, and Richards and show that under essentially the same assumptions the weak convergence of transition probabilities actu...
Semiclassical transition probabilities for interacting oscillators
Khlebnikov, S. Yu.
1994-01-01
Semiclassical transition probabilities characterize transfer of energy between "hard" and "soft" modes in various physical systems. We establish the boundary problem for singular Euclidean solutions used to calculate such probabilities. Solutions are found numerically for a system of two interacting quartic oscillators. In the double-well case, we find numerical evidence that certain regular Minkowskian trajectories have approximate stopping points or, equivalently, are approximately pe...
Country Default Probabilities: Assessing and Backtesting
Vogl, Konstantin; Maltritz, Dominik; Huschens, Stefan; Karmann, Alexander
2006-01-01
We address the problem how to estimate default probabilities for sovereign countries based on market data of traded debt. A structural Merton-type model is applied to a sample of emerging market and transition countries. In this context, only few and heterogeneous default probabilities are derived, which is problematic for backtesting. To deal with this problem, we construct likelihood ratio test statistics and quick backtesting procedures.
Transition probability studies in 175Au
Grahn, Tuomas; Watkins, H.; Joss, David; Page, Robert; Carroll, R. J.; Dewald, A.; Greenlees, Paul; Hackstein, M.; Herzberg, Rolf-Dietmar; Jakobsson, Ulrika; Jones, Peter; Julin, Rauno; Juutinen, Sakari; Ketelhut, Steffen; Kröll, Th
2013-01-01
Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from the measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms...
Ruin Probability in Linear Time Series Model
Institute of Scientific and Technical Information of China (English)
ZHANG Lihong
2005-01-01
This paper analyzes a continuous-time risk model in which a linear time series model is used to describe the claim process. Time is discretized stochastically at the instants when claims occur, and Doob's stopping time theorem and martingale inequalities are used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds for the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
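The exponential (Lundberg) bound mentioned above can be illustrated in the simplest classical setting. For the Cramér-Lundberg model with exponential claims, the infinite-horizon ruin probability has a closed textbook form, and the martingale argument yields the upper bound exp(-Ru). This sketch ignores the paper's linear time series claim structure and uses illustrative parameter values.

```python
from math import exp

def ruin_probability_exponential(u, mu, theta):
    """Exact infinite-horizon ruin probability for the classical
    Cramer-Lundberg model with exponential(mean mu) claims and
    premium loading theta (a standard textbook result)."""
    R = theta / ((1.0 + theta) * mu)   # adjustment (Lundberg) coefficient
    return exp(-R * u) / (1.0 + theta)

def lundberg_bound(u, mu, theta):
    """Exponential upper bound exp(-R u) from the martingale argument."""
    R = theta / ((1.0 + theta) * mu)
    return exp(-R * u)

# Illustrative values: initial surplus 10, mean claim 1, 20% premium loading
u, mu, theta = 10.0, 1.0, 0.2
psi = ruin_probability_exponential(u, mu, theta)
bound = lundberg_bound(u, mu, theta)
```

The exact ruin probability sits strictly below the exponential bound, and both decay as the initial surplus grows.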
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphasis on simulation and discrete decision theory. Mathematically rich but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of their probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.
Non-Gaussian Photon Probability Distribution
International Nuclear Information System (INIS)
This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best-fit, if not exact, distribution is a modified Gamma mΓ distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function, and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al.'s (2006) microwave cloaking, and Oulton et al.'s (2008) sub-wavelength confinement, thereby providing a strong case that
Avoiding Negative Probabilities in Quantum Mechanics
2013-01-01
As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless questi...
Breakdown Point Theory for Implied Probability Bootstrap
Lorenzo Camponovo; Taisuke Otsu
2011-01-01
This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulati...
The Pauli equation for probability distributions
International Nuclear Information System (INIS)
The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)
The Pauli equation for probability distributions
Energy Technology Data Exchange (ETDEWEB)
Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man'ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man'ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it
2001-04-27
The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)
Characteristic Functions over C*-Probability Spaces
Institute of Scientific and Technical Information of China (English)
王勤; 李绍宽
2003-01-01
Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.
On the Robustness of Most Probable Explanations
Chan, Hei; Darwiche, Adnan
2012-01-01
In Bayesian networks, a Most Probable Explanation (MPE) is a complete variable instantiation with a highest probability given the current evidence. In this paper, we discuss the problem of finding robustness conditions of the MPE under single parameter changes. Specifically, we ask the question: How much change in a single network parameter can we afford to apply while keeping the MPE unchanged? We will describe a procedure, which is the first of its kind, that computes this answer for each p...
Analysis of probability of defects in the disposal canisters
International Nuclear Information System (INIS)
This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability here means that the welding of the canister lid has no critical defects from the long-term safety point of view. From the reliability point of view, the reliability of the welding process (that no critical defects arise) and of the non-destructive testing (NDT) process (that all critical defects are detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. The possibility of human error in the NDT process was also taken into account in a simple manner. At this moment there is very little representative data with which to determine the reliability of welding, and the available NDT data are not well suited to the needs of this study. The calculations presented here are therefore based on expert judgement and on several assumptions that have not yet been verified. The Bayesian probability model shows the importance of uncertainty in the estimation of the reliability parameters. The effect of this uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters, compared with the binomial distribution that applies when the parameter values are known. In order to reduce the uncertainty, more information is needed on the reliability of both the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since their role is not reflected in the typical test data used to estimate 'normal process variation'. The reported model should be seen as a tool for quantifying the roles of different methods and procedures in the weld inspection process. (orig.)
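The flattening effect described above, where parameter uncertainty spreads out the distribution of the number of defective canisters, can be illustrated by comparing a binomial distribution (known defect rate) with a beta-binomial distribution (Beta prior on the rate). The prior and canister count below are hypothetical, not the report's values.

```python
from math import comb, lgamma, exp

def binom_pmf(k, n, p):
    """Binomial pmf: number of defects out of n canisters with known rate p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def beta_binom_pmf(k, n, a, b):
    """Beta-binomial pmf: binomial with a Beta(a, b) prior on the defect rate."""
    logB = lambda x, y: lgamma(x) + lgamma(y) - lgamma(x + y)
    return comb(n, k) * exp(logB(k + a, n - k + b) - logB(a, b))

n = 100            # hypothetical number of canisters
a, b = 2.0, 98.0   # hypothetical Beta prior, mean defect rate 2%
p = a / (a + b)    # matched known defect rate for the binomial comparison

mean_bb = n * a / (a + b)                               # same mean as binomial
var_binom = n * p * (1 - p)
var_bb = n * p * (1 - p) * (a + b + n) / (a + b + 1)    # beta-binomial variance
```

Both distributions share the same mean, but the beta-binomial variance is strictly larger, which is exactly the flattening the report attributes to parameter uncertainty.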
The cumulative reaction probability as eigenvalue problem
Manthe, Uwe; Miller, William H.
1993-09-01
It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = ∑_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation of probabilities, eigenreaction probabilities, which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than for previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems: transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0.
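The central identity can be checked numerically: if P is a Hermitian matrix with eigenvalues p_k in [0, 1], then N(E) is the sum of those eigenvalues, i.e. the trace of P. The eigenvalues below are made up for illustration; in the paper they come from an operator constructed from the Hamiltonian.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical eigenreaction probabilities p_k in [0, 1]
p = np.array([0.97, 0.60, 0.08, 0.001])

# Synthesize a Hermitian matrix with exactly these eigenvalues
A = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)            # random orthogonal matrix
P = Q @ np.diag(p) @ Q.T          # Hermitian "probability operator"

eigvals = np.linalg.eigvalsh(P)   # recover the eigenreaction probabilities
N_E = eigvals.sum()               # cumulative reaction probability N(E)
```

Diagonalization recovers the p_k regardless of the basis in which P is expressed, and their sum equals the trace of P, mirroring the paper's N(E) = ∑_k p_k(E).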
The cumulative reaction probability as eigenvalue problem
International Nuclear Information System (INIS)
It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = ∑_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation of probabilities, eigenreaction probabilities, which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than for previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems: transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0
Integrated-Circuit Pseudorandom-Number Generator
Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur
1992-01-01
Integrated circuit produces 8-bit pseudorandom numbers from specified probability distribution at rate of 10 MHz. Using Boolean logic, circuit implements pseudorandom-number-generating algorithm. Circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying specified nonuniform probability distribution are generated by processing the uniformly distributed outputs of the eight 12-bit generators through "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
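A software analogue of the described pipeline, offered as a sketch rather than the circuit's actual logic: each output bit is drawn using the conditional probability that it is 1 given the bits already fixed, which reproduces an arbitrary 8-bit target distribution from uniform random draws. The triangular target distribution is illustrative.

```python
import random

def sample_bits(pmf, rng):
    """Draw an 8-bit value from pmf (length 256) one bit at a time, using
    the conditional probability that the next bit is 1 given the bits
    already fixed -- a software analogue of the hardware pipeline of
    comparators and memories implementing conditional probabilities."""
    lo, hi = 0, 256              # current range of candidate values
    for _ in range(8):
        mid = (lo + hi) // 2
        mass_lo = sum(pmf[lo:mid])
        mass_hi = sum(pmf[mid:hi])
        total = mass_lo + mass_hi
        # conditional probability that this bit is 1 given the prefix
        p_one = mass_hi / total if total > 0 else 0.5
        if rng.random() < p_one:
            lo = mid             # bit is 1: keep upper half
        else:
            hi = mid             # bit is 0: keep lower half
    return lo

rng = random.Random(42)
# Hypothetical nonuniform target: symmetric triangular pmf over 0..255
weights = [min(v, 255 - v) + 1 for v in range(256)]
s = sum(weights)
pmf = [w / s for w in weights]
samples = [sample_bits(pmf, rng) for _ in range(2000)]
```

Because each bit decision conditions on the prefix, the eight binary draws jointly reproduce the target distribution exactly, just as the hardware pipeline does bit by bit.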
Three Dimensional Deformation of Mining Area Detection by InSAR and Probability Integral Model
H. D. Fan; Gao, X. X.; Cheng, D; Zhao, W. Y.; Zhao, C. L.
2015-01-01
A new solution algorithm combining D-InSAR and the probability integral method is proposed to generate the three-dimensional deformation in a mining area. The details are as follows: according to the geological and mining data, a set of control points is established, containing correctly phase-unwrapped points at the subsidence basin edge generated by D-InSAR and several GPS points; the modulus method is used to calculate the optimum parameters of the probability integral prediction; F...
International Nuclear Information System (INIS)
A methodology for estimating the probability of defect detection and the probability of defect absence in tube components and weldments is presented. The suggested approach provides important information and may be applied to the service-life assessment of NPP equipment. Concrete examples of the calculation of reliability indexes are given for steam generator tubes, butt-welded joints of steam generator tubes, and welded joints connecting tubes and tube sheets.
The assessment of low probability containment failure modes using dynamic PRA
Brunett, Acacia Joann
a significant threat to containment integrity. Additional scoping studies on the effect of recovery actions on in-vessel hydrogen generation show that reflooding a partially degraded core does not significantly affect in-vessel hydrogen generation, and the NUREG-1150 assumption that insufficient hydrogen is generated in-vessel to produce an energetic deflagration is confirmed. The DET analyses performed in this work show that very late power recovery produces the potential for very energetic combustion events that are capable of failing containment with a non-negligible probability, and that containment cooling systems have a significant impact on core-concrete attack, and therefore on combustible gas generation ex-vessel. Ultimately, the overall risk of combustion-induced containment failure is low, but its conditional likelihood can have a significant effect on accident mitigation strategies. It is also shown in this work that DETs are particularly well suited to examining low-probability events because of their ability to rediscretize CDFs and observe solution convergence.
Variate generation for probabilistic fracture mechanics and fitness-for-service studies
International Nuclear Information System (INIS)
Atomic Energy of Canada Limited is conducting studies in Probabilistic Fracture Mechanics. These studies are being conducted as part of a fitness-for-service programme in support of CANDU reactors. The Monte Carlo analyses, which form part of the Probabilistic Fracture Mechanics studies, require that variates can be sampled from probability density functions. Accurate pseudo-random numbers are necessary for accurate variate generation. This report details the principles of variate generation, and describes the production and testing of pseudo-random numbers. A new algorithm has been produced for the correct performance of the lattice test for the independence of pseudo-random numbers. Two new pseudo-random number generators have been produced. These generators have excellent randomness properties and can be made fully machine-independent. Versions, in FORTRAN, for VAX and CDC computers are given. Accurate and efficient algorithms for the generation of variates from the specialized probability density functions of Probabilistic Fracture Mechanics are given. 38 refs
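Variate generation from a probability density typically proceeds by inverse-transform sampling: a uniform pseudo-random number u is fed through the inverse CDF of the target distribution. A minimal sketch follows; the distributions and parameter values are illustrative, not those of the AECL study.

```python
import random
from math import log

def exponential_variate(u, lam):
    """Inverse-transform sampling: solve F(x) = u for the exponential CDF
    F(x) = 1 - exp(-lam * x), giving x = -ln(1 - u) / lam."""
    return -log(1.0 - u) / lam

def weibull_variate(u, shape, scale):
    """Inverse CDF of the Weibull distribution, a density often used in
    probabilistic fracture mechanics for flaw sizes (illustrative)."""
    return scale * (-log(1.0 - u)) ** (1.0 / shape)

rng = random.Random(123)
lam = 2.0
xs = [exponential_variate(rng.random(), lam) for _ in range(20000)]
mean_est = sum(xs) / len(xs)   # should approach the true mean 1/lam = 0.5
```

The quality of the variates depends entirely on the quality of the uniform pseudo-random numbers, which is why the report devotes so much attention to testing the generators themselves.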
Perfect Precision Detecting Probability Of An Atom Via Sgc Mechanism
Hamedi, H. R.
2015-06-01
This letter investigates a scheme for highly efficient two-dimensional (2D) atom localization via scanning probe absorption in a Y-type four-level atomic scheme with two orthogonal standing waves. It is shown that, because of the position-dependent atom-field interaction, the spatial probability distribution of the atom can be determined directly by monitoring the probe absorption and gain spectra. The impact of different controlling parameters of the system on 2D localization is studied. We find that, owing to the effect of spontaneously generated coherence (SGC), the atom can be localized at a particular position, and the maximal probability of detecting the atom within the sub-wavelength domain of the two orthogonal standing waves reaches one hundred percent. Phase control of the position-dependent probe absorption is then discussed. The presented scheme may be helpful in laser cooling or atom nanolithography via high-precision and high-resolution atom localization.
Dowie, J.
2001-01-01
If we cross-classify the absolutist-consequentialist distinction with an intuitive-analytical one we can see that economists probably attract the hostility of those in the other three cells as a result of being analytical consequentialists, as much as because of their concern with "costs". Suggesting that some sources of utility (either "outcome" or "process" in origin) are to be regarded as rights cannot, says the analytical consequentialist, overcome the fact that fulfilling and respecting ...
Computing Earthquake Probabilities on Global Scales
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities of large events in such systems. This method counts the number of small events since the last large event and then converts this count into a probability using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near-mean-field systems having long-range interactions, an example being earthquakes with their elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
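The count-to-probability conversion can be sketched with a Weibull law of the form 1 − exp(−(n/β)^α), where n is the number of small events since the last large one. The shape and scale values below are illustrative, not those fitted in the study.

```python
from math import exp

def large_event_probability(n_small, beta, alpha):
    """Convert the count of small events since the last large event into a
    probability using a Weibull law, 1 - exp(-(n/beta)^alpha).
    beta (scale, in event counts) and alpha (shape) are illustrative
    parameters, not values calibrated in the study."""
    return 1.0 - exp(-((n_small / beta) ** alpha))

# The probability grows monotonically with the accumulated count of small events
probs = [large_event_probability(n, beta=500.0, alpha=1.5) for n in (0, 100, 500, 1000)]
```

Immediately after a large event the count resets and the probability is zero; as small events accumulate, the estimated probability of the next large event rises toward one.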
Laboratory-tutorial activities for teaching probability
Directory of Open Access Journals (Sweden)
Roger E. Feeley
2006-08-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math-phobic, with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulty learning the target concepts, such as comparing the ratio of time spent in a region to the total time in all regions. Instead, they often focus on edge effects, pattern-match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum's success.
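The target concept the students struggle with, probability as the ratio of time spent in a region to total time, can be made concrete with a classical harmonic oscillator, which moves slowest near its turning points and so spends most of each period there:

```python
from math import cos, pi

# Classical harmonic oscillator x(t) = A cos(omega t): the probability of
# finding the particle in a region is the fraction of the period spent there.
A, omega = 1.0, 2 * pi          # amplitude and angular frequency (arbitrary)
N = 200000                      # time samples over one full period
count = sum(1 for i in range(N) if abs(A * cos(omega * i / N)) > A / 2)
p_outer = count / N             # time fraction with |x| > A/2

# The particle lingers near the turning points, so it spends about
# two-thirds of the period in the outer half of its range (p_outer ~ 2/3).
```

The discretized time average recovers the analytic value 2/3 for the outer half of the range, the kind of "touchstone" result the activities relate to potential energy graphs.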
On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!
Directory of Open Access Journals (Sweden)
Mark R. Crovelli
2009-06-01
Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.
Reduced reward-related probability learning in schizophrenia patients
Directory of Open Access Journals (Sweden)
Yılmaz A
2012-01-01
Alpaslan Yilmaz,1,2 Fatma Simsek,2 Ali Saffet Gonul2,3. 1Department of Sport and Health, Physical Education and Sports College, Erciyes University, Kayseri, Turkey; 2Department of Psychiatry, SoCAT Lab, Ege University School of Medicine, Bornova, Izmir, Turkey; 3Department of Psychiatry and Behavioral Sciences, Mercer University School of Medicine, Macon, GA, USA. Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients showed significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale, but was not related to antipsychotic dose. In conclusion, the present study showed that schizophrenia patients had impaired reward-based learning and that this was independent of their medication status. Keywords: reinforcement learning, reward, punishment, motivation
Economic choices reveal probability distortion in macaque monkeys
Stauffer, William R.; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram
2015-01-01
Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas ...
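Probability distortion of the kind described above is often summarized by a parametric weighting function. Prelec's one-parameter form is a common choice and is used here purely for illustration; the study itself estimates distortion from the monkeys' choices rather than assuming this functional form.

```python
from math import exp, log

def prelec_weight(p, alpha=0.65):
    """Prelec's one-parameter probability weighting function,
    w(p) = exp(-(-ln p)^alpha). With alpha < 1 it overweights small
    probabilities and underweights large ones -- one common parametric
    form of probability distortion (illustrative choice of alpha)."""
    return exp(-((-log(p)) ** alpha))

w_small = prelec_weight(0.01)   # overweighted: w(0.01) > 0.01
w_large = prelec_weight(0.95)   # underweighted: w(0.95) < 0.95
```

The function has a fixed point at p = 1/e, below which probabilities are overweighted and above which they are underweighted, matching the inverse-S pattern of real-world choice behavior.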
Consistent probabilities in loop quantum cosmology
Craig, David A
2013-01-01
A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler-DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce vs. a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation v...
Python for probability, statistics, and machine learning
Unpingco, José
2016-01-01
This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
Correlations and Non-Linear Probability Models
DEFF Research Database (Denmark)
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between...... the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under...... certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models....
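For a probit-style latent variable model y* = b·x + e with Var(e) = 1, the correlation between the predictor and the latent outcome follows directly from the variance decomposition. This is a sketch of the kind of derived correlation the paper discusses, not the authors' exact estimator:

```python
from math import sqrt

def latent_correlation(b, var_x=1.0, resid_var=1.0):
    """Correlation between predictor x and latent outcome y* in a
    probit-type model y* = b*x + e with Var(e) = resid_var:
    rho = b*sd(x) / sqrt(b^2 * Var(x) + resid_var).
    An illustrative sketch, not the paper's exact derivation."""
    return b * sqrt(var_x) / sqrt(b * b * var_x + resid_var)

rho = latent_correlation(1.0)   # b = 1 on the standard probit scale
```

Because the latent residual variance is fixed by the probit normalization, the derived correlation is comparable across samples even when the raw coefficients b are not, which is the cross-sample comparison problem the paper addresses.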
Probability analysis of nuclear power plant hazards
International Nuclear Information System (INIS)
The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of analysis of the consequences of this release and assessment of the risk. The sequence of operations is characterized for the individual stages. The tasks that Czechoslovakia faces in developing the probability analysis of risk are listed, and the composition of the work team for coping with the task is recommended. (J.C.)
Ignition probabilities for Compact Ignition Tokamak designs
International Nuclear Information System (INIS)
A global power balance code employing Monte Carlo techniques has been developed to study the 'probability of ignition' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data, including a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
Probabilities and Signalling in Quantum Field Theory
Dickinson, Robert; Millington, Peter
2016-01-01
We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Sampling Quantum Nonlocal Correlations with High Probability
González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.
2016-05-01
It is well known that quantum correlations for bipartite dichotomic measurements are those of the form $\gamma = (\langle u_i, v_j \rangle)_{i,j=1}^n$, where the vectors $u_i$ and $v_j$ lie in the unit ball of a real Hilbert space. In this work we study the probability that such correlations are nonlocal as a function of $\alpha = m/n$, where the vectors are sampled according to the Haar measure on the unit sphere of $\mathbb{R}^m$. In particular, we prove the existence of an $\alpha_0 > 0$ such that if $\alpha \le \alpha_0$, $\gamma$ is nonlocal with probability tending to 1 as $n \to \infty$, while for $\alpha > 2$, $\gamma$ is local with probability tending to 1 as $n \to \infty$.
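The sampling model described in the abstract is straightforward to reproduce numerically: a minimal sketch (the function and variable names are illustrative, not from the paper), drawing the vectors uniformly on the unit sphere of R^m by normalizing i.i.d. Gaussian vectors, then forming the correlation matrix of inner products.

```python
import numpy as np

def haar_sphere_samples(n, m, rng):
    """Sample n vectors uniformly (Haar measure) on the unit sphere of R^m
    by normalizing i.i.d. standard Gaussian vectors."""
    x = rng.standard_normal((n, m))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

rng = np.random.default_rng(0)
n, m = 100, 50                 # alpha = m/n = 0.5
u = haar_sphere_samples(n, m, rng)
v = haar_sphere_samples(n, m, rng)
gamma = u @ v.T                # gamma[i, j] = <u_i, v_j>
# Every entry is a valid dichotomic correlation, i.e. lies in [-1, 1]
assert np.all(np.abs(gamma) <= 1.0 + 1e-12)
```

Deciding whether a sampled gamma is local or nonlocal is the hard part (it amounts to a Bell-inequality / Grothendieck-type question) and is what the paper analyzes asymptotically; the sketch only generates the random correlations themselves.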
EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY
Directory of Open Access Journals (Sweden)
Magdalena Hykšová
2012-03-01
The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development along two parallel paths: on the one hand, the theory of geometric probability was formed with little attention paid to applications other than spatial chance games. On the other hand, practical rules for the estimation of area or volume fractions and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.
Uncertainty relation and probability. Numerical illustration
International Nuclear Information System (INIS)
The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the conflicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)
Estimation of functional failure probability of passive systems based on subset simulation method
International Nuclear Information System (INIS)
In order to solve the problem of multi-dimensional epistemic uncertainties and small functional failure probabilities of passive systems, an innovative reliability analysis algorithm, subset simulation based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered, and the probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method offers high computational efficiency and accuracy compared with traditional probability analysis methods. (authors)
A structural model of intuitive probability
Dessalles, Jean-Louis
2011-01-01
Though the ability of human beings to deal with probabilities has been put into question, the assessment of rarity is a crucial competence underlying much of human decision-making and is pervasive in spontaneous narrative behaviour. This paper proposes a new model of rarity and randomness assessment, designed to be cognitively plausible. Intuitive randomness is defined as a function of structural complexity. It is thus possible to assign probability to events without being obliged to consider the set of alternatives. The model is tested on Lottery sequences and compared with subjects' preferences.
Quantum measurements and Kolmogorovian probability theory
Slavnov, D A
2003-01-01
We establish connections between the requirement of measurability of a probability space and the principle of complementarity in quantum mechanics. It is shown that measurability of a probability space implies that the results of quantum measurement depend not only on the properties of the quantum object under consideration, but also on the classical characteristics of the measuring device used. We show that if one takes into account the requirement of measurability in the quantum case, the Bell inequality does not follow from the hypothesis of the existence of an objective reality.
Electric quadrupole transition probabilities for atomic lithium
International Nuclear Information System (INIS)
Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii, and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.
Poisson spaces with a transition probability
Landsman, N. P.
1997-01-01
The common structure of the space of pure states $P$ of a classical or a quantum mechanical system is that of a Poisson space with a transition probability. This is a topological space equipped with a Poisson structure, as well as with a function $p : P \times P \to [0,1]$ with certain properties. The Poisson structure is connected with the transition probabilities through unitarity (in a specific formulation intrinsic to the given context). In classical mechanics, where $p(\rho,\sigma)=\delta_{\rho...
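The classical/quantum dichotomy the abstract alludes to can be made explicit; a sketch (the quantum formula is the standard Born-rule transition probability for unit vectors representing pure states, stated here as background, not quoted from the paper):

```latex
% Transition probability on the space of pure states P:
% classical mechanics -- distinct pure states are perfectly distinguishable
p(\rho, \sigma) = \delta_{\rho\sigma}
% quantum mechanics -- pure states overlap; for unit vectors
% \Psi_\rho, \Psi_\sigma representing \rho, \sigma:
p(\rho, \sigma) = \left| \langle \Psi_\rho, \Psi_\sigma \rangle \right|^2
```

In the classical case $p$ collapses to the Kronecker delta, which is why classical pure-state spaces carry no nontrivial transition-probability structure.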
Transition probability studies in 175Au
International Nuclear Information System (INIS)
Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms of available systematics as a function of atomic number and aligned angular momentum.
Lady luck the theory of probability
Weaver, Warren
1982-01-01
""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa