International Nuclear Information System (INIS)
Helton, J.C.
1996-03-01
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
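The two-probability-space structure can be illustrated with a small Monte Carlo sketch: an outer loop samples the subjective (epistemic) space, an inner loop samples the stochastic space, and each outer sample yields a conditional CCDF. All distributions and the consequence function below are hypothetical stand-ins, not the WIPP models.

```python
import numpy as np

rng = np.random.default_rng(0)

def consequence(k, future):
    # Hypothetical random variable on the product space
    # (a stand-in for the PA consequence function)
    return k * future

n_su, n_st = 50, 1000                 # samples of subjective / stochastic space
levels = np.linspace(0.0, 5.0, 20)    # consequence levels for the CCDF

ccdfs = []
for _ in range(n_su):                 # outer loop: sample (S_su, L_su, p_su)
    k = rng.uniform(0.5, 1.5)         # epistemically uncertain parameter
    c = consequence(k, rng.exponential(1.0, n_st))  # inner: (S_st, L_st, p_st)
    ccdfs.append([(c > lv).mean() for lv in levels])  # conditional CCDF

ccdfs = np.asarray(ccdfs)
mean_ccdf = ccdfs.mean(axis=0)        # expected CCDF over subjective uncertainty
```

Each row of `ccdfs` is a CCDF conditional on one subjective sample; the family of rows displays subjective uncertainty, while each individual CCDF displays stochastic uncertainty.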
Irreversibility and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)
Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete
2014-01-01
Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.
Mouchtouris, S.; Kokkoris, G.
2018-01-01
A generalized equation for the electron energy probability function (EEPF) of inductively coupled Ar plasmas is proposed under conditions of nonlocal electron kinetics and diffusive cooling. The proposed equation describes the local EEPF in a discharge, and the independent variable is the kinetic energy of electrons. The EEPF consists of a bulk and a depleted tail part and incorporates the effect of the plasma potential, Vp, and pressure. Due to diffusive cooling, the break point of the EEPF is at eVp. The pressure alters the shape of the bulk and the slope of the tail part. The parameters of the proposed EEPF are extracted by fitting to measured EEPFs (at one point in the reactor) at different pressures. By coupling the proposed EEPF with a hybrid plasma model, measurements in the gaseous electronics conference reference reactor concerning (a) the electron density and temperature and the plasma potential, either spatially resolved or at different pressures (10-50 mTorr) and powers, and (b) the ion current density at the electrode, are well reproduced. The effect of the choice of the EEPF on the results is investigated by comparison to an EEPF coming from the Boltzmann equation (local electron kinetics approach) and to a Maxwellian EEPF. The accuracy of the results and the fact that the proposed EEPF is predefined render its use a reliable alternative with a low computational cost compared to stochastic electron kinetic models at low pressure conditions, which can be extended to other gases and/or different electron heating mechanisms.
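The bulk-plus-depleted-tail shape with a break at eVp can be sketched as follows. The functional form and all parameter values here are illustrative assumptions, not the parameterization fitted in the paper.

```python
import numpy as np

def eepf(energy, e_vp=15.0, t_bulk=3.0, t_tail=1.0):
    # Illustrative two-part EEPF (energies and temperatures in eV):
    # a Maxwellian-like bulk below the break at e*Vp and a depleted
    # (steeper) tail above it, continuous at the break.
    bulk = np.exp(-energy / t_bulk)
    tail = np.exp(-e_vp / t_bulk) * np.exp(-(energy - e_vp) / t_tail)
    return np.where(energy < e_vp, bulk, tail)

e = np.linspace(0.0, 30.0, 301)
g = eepf(e)   # on a semilog plot this shows two slopes meeting at e_vp
```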
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended…
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
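One common one-parameter family of this kind is obtained by integrating the hyperbola 1/t^(1-q) and inverting; whether this matches the paper's exact parameterization is an assumption, but it illustrates the generalized log/exp pair:

```python
import numpy as np

def gen_log(x, q):
    # One-parameter generalized logarithm; the limit q -> 0 recovers ln(x)
    if abs(q) < 1e-12:
        return np.log(x)
    return (x ** q - 1.0) / q

def gen_exp(y, q):
    # Generalized exponential: the inverse of gen_log (requires 1 + q*y > 0)
    if abs(q) < 1e-12:
        return np.exp(y)
    return (1.0 + q * y) ** (1.0 / q)
```

For q = 0 the pair reduces to the ordinary log/exp; for other q it gives the power-law-tailed generalizations used to build pdfs such as the Zipf-Mandelbrot distribution.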
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Toward a generalized probability theory: conditional probabilities
International Nuclear Information System (INIS)
Cassinelli, G.
1979-01-01
The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability - the likelihood of a particular color given a particular combination of two cues - varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
COVAL, Compound Probability Distribution for Function of Probability Distribution
International Nuclear Information System (INIS)
Astolfi, M.; Elbaz, J.
1979-01-01
1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
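COVAL itself performs a numerical transformation of the distributions; purely as an illustration of the problem being solved (the load model below is invented), a Monte Carlo sketch of the same computation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Distribution of a function g(X, Y) = X + 2*Y of random variables,
# given the distributions of X and Y themselves (e.g. random loads)
x = rng.normal(10.0, 2.0, 100_000)    # load 1: mean 10, std 2
y = rng.normal(3.0, 1.0, 100_000)     # load 2: mean 3, std 1
z = x + 2.0 * y                       # compound variable, ~ N(16, sqrt(8))

# Empirical CDF of the compound variable at a few points
for t in (12.0, 16.0, 20.0):
    print(f"P(Z <= {t}) ~ {(z <= t).mean():.3f}")
```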
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
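Prelec's function w(p) = exp(-(-ln p)^α) is easy to evaluate directly; a minimal sketch (the value α = 0.65 is an arbitrary illustrative choice, not a parameter from the paper):

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting function w(p) = exp(-(-ln p)^alpha),
    with 0 < alpha < 1; alpha = 0.65 is an illustrative choice."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

# Characteristic inverse-S shape: small probabilities are overweighted,
# large ones underweighted, with fixed points w(1/e) = 1/e and w(1) = 1.
for p in (0.01, 1 / math.e, 0.5, 0.9, 1.0):
    print(f"w({p:.3f}) = {prelec_w(p):.3f}")
```

The fixed point w(1/e) = 1/e holds for every α, which is one of the properties Prelec's axiomatization singles out.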
Conditional probabilities in Ponzano-Regge minisuperspace
International Nuclear Information System (INIS)
Petryk, Roman; Schleich, Kristin
2003-01-01
We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes
Conditional probability on MV-algebras
Czech Academy of Sciences Publication Activity Database
Kroupa, Tomáš
2005-01-01
Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005
Statistical models based on conditional probability distributions
International Nuclear Information System (INIS)
Narayanan, R.S.
1991-10-01
We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)
Computation of the Complex Probability Function
Energy Technology Data Exchange (ETDEWEB)
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
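The quadrature idea can be sketched as follows. This is illustrative only: accuracy degrades as z approaches the real axis, which is one of the shortcomings such documents discuss.

```python
import numpy as np

def wofz_gh(z, n=64):
    """Approximate the complex probability (Faddeeva) function
    w(z) = (i/pi) * integral of exp(-t^2) / (z - t) dt
    by n-point Gauss-Hermite quadrature (intended for Im(z) > 0)."""
    t, wts = np.polynomial.hermite.hermgauss(n)   # roots of H_n and weights
    return (1j / np.pi) * np.sum(wts / (z - t))

val = wofz_gh(1j)
# Known reference value: w(i) = e * erfc(1), which is purely real (~0.4276)
print(val)
```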
Students' Understanding of Conditional Probability on Entering University
Reaburn, Robyn
2013-01-01
An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St…"
Eliciting conditional and unconditional rank correlations from conditional probabilities
International Nuclear Information System (INIS)
Morales, O.; Kurowicka, D.; Roelen, A.
2008-01-01
Causes of uncertainties may be interrelated and may introduce dependencies. Ignoring these dependencies may lead to large errors. A number of graphical models in probability theory such as dependence trees, vines and (continuous) Bayesian belief nets [Cooke RM. Markov and entropy properties of tree and vine-dependent variables. In: Proceedings of the ASA section on Bayesian statistical science, 1997; Kurowicka D, Cooke RM. Distribution-free continuous Bayesian belief nets. In: Proceedings of mathematical methods in reliability conference, 2004; Bedford TJ, Cooke RM. Vines-a new graphical model for dependent random variables. Ann Stat 2002; 30(4):1031-68; Kurowicka D, Cooke RM. Uncertainty analysis with high dimensional dependence modelling. New York: Wiley; 2006; Hanea AM, et al. Hybrid methods for quantifying and analyzing Bayesian belief nets. In: Proceedings of the 2005 ENBIS5 conference, 2005; Shachter RD, Kenley CR. Gaussian influence diagrams. Manage Sci 1998; 35(5) .] have been developed to capture dependencies between random variables. The input for these models are various marginal distributions and dependence information, usually in the form of conditional rank correlations. Often expert elicitation is required. This paper focuses on dependence representation, and dependence elicitation. The techniques presented are illustrated with an application from aviation safety
The Probability Approach to English If-Conditional Sentences
Wu, Mei
2012-01-01
Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…
Satake, Eiki; Amato, Philip P.
2008-01-01
This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
Structure functions are not parton probabilities
International Nuclear Information System (INIS)
Brodsky, Stanley J.; Hoyer, Paul; Sannino, Francesco; Marchal, Nils; Peigne, Stephane
2002-01-01
The common view that structure functions measured in deep inelastic lepton scattering are determined by the probability of finding quarks and gluons in the target is not correct in gauge theory. We show that gluon exchange between the fast, outgoing partons and target spectators, which is usually assumed to be an irrelevant gauge artifact, affects the leading twist structure functions in a profound way. This observation removes the apparent contradiction between the projectile (eikonal) and target (parton model) views of diffractive and small x_B phenomena. The diffractive scattering of the fast outgoing quarks on spectators in the target causes shadowing in the DIS cross section. Thus the depletion of the nuclear structure functions is not intrinsic to the wave function of the nucleus, but is a coherent effect arising from the destructive interference of diffractive channels induced by final state interactions. This is consistent with the Glauber-Gribov interpretation of shadowing as a rescattering effect
Modulation Based on Probability Density Functions
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
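The histogram-as-PDF idea can be sketched for a pure sinusoid (illustrative only; the modulation scheme itself is not reproduced here):

```python
import numpy as np

# Sample at least one half cycle of the carrier and histogram the sample
# values; the histogram approximates the arcsine-shaped PDF of a sinusoid.
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
x = np.sin(2.0 * np.pi * t)          # one full cycle

pdf, edges = np.histogram(x, bins=20, range=(-1.0, 1.0), density=True)
```

Because a sinusoid dwells longest near its extremes, the resulting PDF piles up near +1 and -1; changes to the carrier shape change this signature histogram, which is what makes the PDF usable as an information-bearing feature.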
Audio feature extraction using probability distribution function
Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.
2015-05-01
Voice recognition has been one of the popular applications in robotic field. It is also known to be recently used for biometric and multimedia information retrieval system. This technology is attained from successive research on audio feature extraction analysis. Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which is by using only PDF as a feature extraction method itself for speech analysis purpose. Certain pre-processing techniques are performed in prior to the proposed feature extraction method. Subsequently, the PDF result values for each frame of sampled voice signals obtained from certain numbers of individuals are plotted. From the experimental results obtained, it can be seen visually from the plotted data that each individuals' voice has comparable PDF values and shapes.
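A frame-wise version of this feature extraction can be sketched as follows (the frame length, bin count, and test signal are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def pdf_features(signal, frame_len=256, bins=16):
    # Split the signal into frames and use the normalized histogram
    # (empirical PDF) of each frame as that frame's feature vector.
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    lo, hi = signal.min(), signal.max()
    feats = np.array([np.histogram(f, bins=bins, range=(lo, hi),
                                   density=True)[0] for f in frames])
    return feats

rng = np.random.default_rng(2)
sig = np.sin(np.linspace(0, 40 * np.pi, 4096)) + 0.1 * rng.standard_normal(4096)
F = pdf_features(sig)          # one bins-length PDF vector per frame
```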
Maximum Entropy and Probability Kinematics Constrained by Conditionals
Directory of Open Access Journals (Sweden)
Stefan Lukits
2015-03-01
Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey's updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
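Jeffrey's updating principle can be made concrete with a toy example (the prior and constraint below are invented): shift the probability of a partition cell E to a new value while preserving the probabilities conditional on E and on not-E. For this kind of marginal constraint, the update coincides with the minimum-relative-entropy (PME-style) revision.

```python
import numpy as np

p = np.array([0.2, 0.3, 0.1, 0.4])        # prior over four atoms
E = np.array([True, True, False, False])  # partition cell E
q_E = 0.7                                 # new constraint: P'(E) = 0.7

# Jeffrey conditionalization: rescale within each cell of the partition
p_new = np.where(E, q_E * p / p[E].sum(), (1 - q_E) * p / p[~E].sum())
```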
Probability functions in the context of signed involutive meadows
Bergstra, J.A.; Ponse, A.
2016-01-01
The Kolmogorov axioms for probability functions are placed in the context of signed meadows. A completeness theorem is stated and proven for the resulting equational theory of probability calculus. Elementary definitions of probability theory are restated in this framework.
Probability in reasoning: a developmental test on conditionals.
Barrouillet, Pierre; Gauffroy, Caroline
2015-04-01
Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
Computing exact bundle compliance control charts via probability generating functions.
Chen, Binchao; Matis, Timothy; Benneyan, James
2016-06-01
Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
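The PGF approach can be sketched for independent compliance indicators (the probabilities below are invented): multiply the generating functions (1 - p + p·x) of the Bernoulli indicators, and read the exact PMF of the compliance count off the polynomial coefficients.

```python
import numpy as np

def compliance_pmf(probs):
    """Exact PMF of the number of compliant bundle elements: the product of
    the PGFs (1 - p + p*x) of independent Bernoulli indicators, computed as
    a polynomial convolution (the Poisson-binomial distribution)."""
    coeffs = np.array([1.0])                 # PGF of the empty product
    for p in probs:
        coeffs = np.convolve(coeffs, [1.0 - p, p])
    return coeffs                            # coeffs[k] = P(K = k)

pmf = compliance_pmf([0.9, 0.8, 0.95, 0.85])
```

The tails come out exact rather than approximated, which is the point of the PGF method for control-chart limits.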
Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.
The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport. Rhode Island. CPA has been used since the...
De Simone, Andrea; Riotto, Antonio
2011-01-01
The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussi...
Statistical learning of action: the role of conditional probability.
Meyer, Meredith; Baldwin, Dare
2011-12-01
Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults, namely those more successful at identifying actions that had been seen more frequently than comparison sequences, were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
A note on iterated function systems with discontinuous probabilities
International Nuclear Information System (INIS)
Jaroszewska, Joanna
2013-01-01
Highlights: ► A certain iterated function system with discontinuous probabilities is discussed. ► Existence of an invariant measure via the Schauder–Tychonov theorem is established. ► Asymptotic stability of the system under examination is proved. -- Abstract: We consider an example of an iterated function system with discontinuous probabilities. We prove that it possesses an invariant probability measure. We also prove that it is asymptotically stable provided the probabilities are positive.
Modeling highway travel time distribution with conditional probability models
Energy Technology Data Exchange (ETDEWEB)
Oliveira Neto, Francisco Moraes [ORNL]; Chin, Shih-Miao [ORNL]; Hwang, Ho-Ling [ORNL]; Han, Lee [University of Tennessee, Knoxville (UTK)]
2014-01-01
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research focuses on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
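The link-to-link construction described above can be sketched with a toy discretized example. The joint distribution of two successive link travel times is assumed for illustration, not taken from FPM data; the route distribution follows from the joint PMF, while the conditional table captures the upstream-downstream speed correlation.

```python
import numpy as np

# Discretized link travel times (minutes) and an assumed joint PMF for
# two successive links; joint[i, j] = P(T1 = t1[i], T2 = t2[j]).
joint = np.array([[0.30, 0.10],
                  [0.10, 0.50]])
t1 = np.array([10, 20])   # upstream link travel times
t2 = np.array([15, 30])   # downstream link travel times

# Link-to-link conditional distribution P(T2 | T1)
cond = joint / joint.sum(axis=1, keepdims=True)

# Route travel time distribution P(T1 + T2 = t) from the joint PMF
route = {}
for i in range(len(t1)):
    for j in range(len(t2)):
        t = int(t1[i] + t2[j])
        route[t] = route.get(t, 0.0) + joint[i, j]

# Under independence the route PMF would instead be the convolution of
# the two marginals; the conditional structure preserves the correlation.
```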
Decomposition of conditional probability for high-order symbolic Markov chains
Melnik, S. S.; Usatenko, O. V.
2017-07-01
The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
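As a minimal illustration of the object being decomposed, the conditional probability function of a symbolic sequence can be estimated empirically by word counting. This is plain likelihood estimation, one of the two approaches the paper bridges, not the memory-function decomposition itself.

```python
from collections import Counter

def conditional_probability(seq, order):
    """Empirical conditional probability P(next symbol | previous `order`
    symbols), estimated from a symbolic sequence by word counting."""
    ctx = Counter()    # counts of length-`order` words (contexts)
    full = Counter()   # counts of length-(order + 1) words
    for i in range(len(seq) - order):
        ctx[seq[i:i + order]] += 1
        full[seq[i:i + order + 1]] += 1
    return {w: full[w] / ctx[w[:-1]] for w in full}

# Short illustrative binary sequence (assumed data)
seq = "010101100101"
probs = conditional_probability(seq, order=1)   # first-order Markov estimate
```

For higher orders the number of contexts grows exponentially with the alphabet size, which is exactly why a decomposition into low-order memory-function terms becomes attractive.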
Path probability of stochastic motion: A functional approach
Hattori, Masayuki; Abe, Sumiyoshi
2016-06-01
The path probability of a particle undergoing stochastic motion is studied by the use of functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.
Prospect evaluation as a function of numeracy and probability denominator.
Millroth, Philip; Juslin, Peter
2015-05-01
This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
Shiryaev, A N
1996-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.
Explaining regional disparities in traffic mortality by decomposing conditional probabilities.
Goldstein, Gregory P; Clark, David E; Travis, Lori L; Haskins, Amy E
2011-04-01
In the USA, the mortality rate from traffic injury is higher in rural and in southern regions, for reasons that are not well understood. For 1754 (56%) of the 3142 US counties, we obtained data allowing for separation of the deaths/population rate into deaths/injury, injuries/crash, crashes/exposure and exposure/population, with exposure measured as vehicle miles travelled. A 'decomposition method' proposed by Li and Baker was extended to study how the contributions of these components were affected by three measures of rural location, as well as southern location. The method of Li and Baker extended without difficulty to include non-binary effects and multiple exposures. Deaths/injury was by far the most important determinant in the county-to-county variation in deaths/population, and accounted for the greatest portion of the rural/urban disparity. After controlling for the rural effect, injuries/crash accounted for most of the southern/northern disparity. The increased mortality rate from traffic injury in rural areas can be attributed to the increased probability of death given that a person has been injured, possibly due to challenges faced by emergency medical response systems. In southern areas, there is an increased probability of injury given that a person has crashed, possibly due to differences in vehicle, road, or driving conditions.
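The decomposition underlying the method is the identity deaths/population = (deaths/injury) × (injuries/crash) × (crashes/exposure) × (exposure/population), which telescopes because each denominator cancels the next numerator. A sketch with assumed, illustrative county counts:

```python
# Illustrative county counts (assumed values, not the study's data)
deaths, injuries, crashes = 12, 400, 1000
vmt, population = 8.0e8, 50000   # vehicle miles travelled as exposure

# Chain of conditional rates whose product recovers deaths/population
death_per_injury   = deaths / injuries       # severity given injury
injury_per_crash   = injuries / crashes      # injury risk given crash
crash_per_exposure = crashes / vmt           # crash risk per mile
exposure_per_pop   = vmt / population        # miles driven per resident

mortality_rate = (death_per_injury * injury_per_crash
                  * crash_per_exposure * exposure_per_pop)
# Identical to the direct ratio deaths / population
```

Comparing each factor across rural/urban or southern/northern counties then shows which link in the chain drives the disparity in the overall rate.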
Continuation of probability density functions using a generalized Lyapunov approach
Baars, S.; Viebahn, J. P.; Mulder, T. E.; Kuehn, C.; Wubs, F. W.; Dijkstra, H. A.
2017-01-01
Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation.
On Farmer's line, probability density functions, and overall risk
International Nuclear Information System (INIS)
Munera, H.A.; Yadigaroglu, G.
1986-01-01
Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdf's), or on complementary cumulative distribution functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdf's and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value.
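The distinction drawn above can be checked numerically: for a nonnegative consequence variable, the area under the pdf is 1 (probability), while the area under the CCDF equals the expected value, i.e. the overall risk. A sketch with an assumed, illustrative Farmer-type density:

```python
import numpy as np

# Discretized consequence axis with an assumed Farmer-type pdf
# (density decreasing with consequence magnitude; illustrative shape).
x = np.linspace(0.0, 10.0, 100001)
dx = x[1] - x[0]
pdf = np.exp(-x)
pdf /= pdf.sum() * dx            # normalize to a proper density

# Overall risk = expected value of the pdf
risk = (x * pdf).sum() * dx

# CCDF(x) = P(X > x); for a nonnegative variable the area under the
# CCDF equals the expected value, while the area under the pdf is 1.
cdf = np.cumsum(pdf) * dx
area_under_ccdf = (1.0 - cdf).sum() * dx
```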
Decision making generalized by a cumulative probability weighting function
dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto
2018-01-01
Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already consecrated in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
Estimation of functional failure probability of passive systems based on subset simulation method
International Nuclear Information System (INIS)
Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing
2012-01-01
In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm called subset simulation, based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper. The probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computational efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
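The core identity of subset simulation, a small failure probability written as a product of larger conditional probabilities over nested intermediate events, can be illustrated exactly for a standard normal demand. The intermediate levels below are illustrative, and the MCMC sampling machinery that estimates each conditional factor is omitted.

```python
import math

def survival(z):
    """P(Z > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Small failure probability P(Z > 4) expressed as a product of larger
# conditional probabilities over nested intermediate failure events
# F1 = {Z > 1.5}, F2 = {Z > 3.0}, F = {Z > 4.0} (illustrative levels).
levels = [1.5, 3.0, 4.0]
p = survival(levels[0])
for lo, hi in zip(levels, levels[1:]):
    p *= survival(hi) / survival(lo)   # P(Z > hi | Z > lo)

# p telescopes back to the direct small probability P(Z > 4); in subset
# simulation each conditional factor is instead estimated by MCMC samples.
```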
Wigner function and the probability representation of quantum states
Directory of Open Access Journals (Sweden)
Man’ko Margarita A.
2014-01-01
The relation of the Wigner function with the fair probability distribution called tomographic distribution or quantum tomogram associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner–Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having a standard form of averaging in probability theory. New uncertainty relations for the position and momentum are written in terms of optical tomograms suitable for direct experimental check. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.
Optimizing an objective function under a bivariate probability model
X. Brusset; N.M. Temme (Nico)
2007-01-01
The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be
Entanglement probabilities of polymers: a white noise functional approach
International Nuclear Information System (INIS)
Bernido, Christopher C; Carpio-Bernido, M Victoria
2003-01-01
The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel.
Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization
Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.
2014-01-01
Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
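The two best-fitting forms named above can be sketched in their standard parameterizations: the two-parameter Prelec form w(p) = exp(-β(-ln p)^α) and the Linear-in-Log-Odds form w(p) = δp^γ / (δp^γ + (1-p)^γ). The parameter values below are illustrative, not the paper's fitted estimates.

```python
import math

def prelec2(p, alpha, beta):
    """Two-parameter Prelec weighting: w(p) = exp(-beta * (-ln p)**alpha)."""
    return math.exp(-beta * (-math.log(p)) ** alpha)

def linear_in_log_odds(p, gamma, delta):
    """Linear-in-Log-Odds: w(p) = delta*p**gamma / (delta*p**gamma + (1-p)**gamma)."""
    num = delta * p ** gamma
    return num / (num + (1.0 - p) ** gamma)

# Typical inverse-S shape: small probabilities are overweighted and
# large probabilities underweighted (illustrative parameter values).
w_small = prelec2(0.05, alpha=0.65, beta=1.0)
w_large = prelec2(0.95, alpha=0.65, beta=1.0)
```

Their qualitative similarity over most of [0, 1] is precisely what makes empirical discrimination between the forms difficult without optimized designs.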
Probability-density-function characterization of multipartite entanglement
International Nuclear Information System (INIS)
Facchi, P.; Florio, G.; Pascazio, S.
2006-01-01
We propose a method to characterize and quantify multipartite entanglement for pure states. The method hinges upon the study of the probability density function of bipartite entanglement and is tested on an ensemble of qubits in a variety of situations. This characterization is also compared to several measures of multipartite entanglement
Numerical Loading of a Maxwellian Probability Distribution Function
International Nuclear Information System (INIS)
Lewandowski, J.L.V.
2003-01-01
A renormalization procedure for the numerical loading of a Maxwellian probability distribution function (PDF) is formulated. The procedure, which involves the solution of three coupled nonlinear equations, yields a numerically loaded PDF with improved properties for higher velocity moments. This method is particularly useful for low-noise particle-in-cell simulations with electron dynamics
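A simplified two-moment version of the renormalization idea, enforcing the loaded drift and thermal speed exactly by shifting and rescaling rather than solving the paper's three coupled nonlinear equations, can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

def load_maxwellian(n, vth):
    """Load n particle velocities from a 1D Maxwellian and renormalize
    so the sampled mean and thermal speed match the targets exactly
    (a simplified, two-moment stand-in for the full procedure)."""
    v = rng.normal(0.0, vth, n)
    v = v - v.mean()            # enforce zero drift exactly
    v = v * (vth / v.std())     # enforce the thermal speed exactly
    return v

v = load_maxwellian(10000, vth=1.0)
```

Removing the sampling noise in the low-order moments this way is what suppresses spurious currents and heating in low-noise particle-in-cell runs; the full method extends the same idea to higher velocity moments.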
Visualization techniques for spatial probability density function data
Directory of Open Access Journals (Sweden)
Udeepta D Bordoloi
2006-01-01
Novel visualization methods are presented for spatial probability density function data. These are spatial datasets, where each pixel is a random variable and has multiple samples which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.
Determination of the failure probability in the weld region of ap-600 vessel for transient condition
International Nuclear Information System (INIS)
Wahyono, I.P.
1997-01-01
Failure probability in the weld region of the AP-600 vessel was determined for a transient condition scenario. The type of transient is an increase of heat removal from the primary cooling system due to sudden opening of safety valves or steam relief valves on the secondary cooling system or the steam generator. Temperature and pressure in the vessel were taken as the basis of the deterministic calculation of the stress intensity factor. The film coefficient of convective heat transfer is calculated as a function of transient time and water parameters. Pressure, material temperature, flaw depth and transient time are variables for the stress intensity factor. Failure probability was assessed by using the above information together with the flaw and probability distributions of Octavia II and Marshall. The failure probability was calculated by probabilistic fracture mechanics simulation applied to the weld region. Failure of the vessel is assumed as failure of the weld material with one crack for which the applied stress intensity factor is higher than the critical stress intensity factor. The VISA II code (Vessel Integrity Simulation Analysis II) was used for the deterministic calculation and simulation. The failure probability of the material is 1E-5 for the Octavia II distribution and 4E-6 for the Marshall distribution for each postulated transient event. The failure occurred at 1.7 minutes after the start of the transient under a pressure of 12.53 ksi.
Assumed Probability Density Functions for Shallow and Deep Convection
Steven K Krueger; Peter A Bogenschutz; Marat Khairoutdinov
2010-01-01
The assumed joint probability density function (PDF) between vertical velocity and conserved temperature and total water scalars has been suggested to be a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PD...
Energy Technology Data Exchange (ETDEWEB)
Antal, T [Physics Department, Simon Fraser University, Burnaby, BC V5A 1S6 (Canada); Droz, M [Departement de Physique Theorique, Universite de Geneve, CH 1211 Geneva 4 (Switzerland); Racz, Z [Institute for Theoretical Physics, Eoetvoes University, 1117 Budapest, Pazmany setany 1/a (Hungary)
2004-02-06
Finite-size scaling functions are investigated both for the mean-square magnetization fluctuations and for the probability distribution of the magnetization in the one-dimensional Ising model. The scaling functions are evaluated in the limit of the temperature going to zero (T → 0) and the size of the system going to infinity (N → ∞), while N[1 - tanh(J/k_B T)] is kept finite (J being the nearest-neighbour coupling). Exact calculations using various boundary conditions (periodic, antiperiodic, free, block) demonstrate explicitly how the scaling functions depend on the boundary conditions. We also show that the block (small part of a large system) magnetization distribution results are identical to those obtained for free boundary conditions.
Fitting the Probability Distribution Functions to Model Particulate Matter Concentrations
International Nuclear Information System (INIS)
El-Shanshoury, Gh.I.
2017-01-01
The main objective of this study is to identify the best probability distribution and the plotting position formula for modeling the concentrations of Total Suspended Particles (TSP) as well as Particulate Matter with an aerodynamic diameter < 10 μm (PM10). The best distribution provides the estimated probabilities of exceeding the threshold limit given by the Egyptian Air Quality Limit Value (EAQLV), and the number of exceedance days is also estimated. The standard limits of the EAQLV for TSP and PM10 concentrations are 24-h averages of 230 μg/m³ and 70 μg/m³, respectively. Five frequency distribution functions with seven plotting position formulas (empirical cumulative distribution functions) are compared to fit the averages of daily TSP and PM10 concentrations in 2014 for Ain Sokhna city. The Quantile-Quantile plot (Q-Q plot) is used as a method for assessing how closely a data set fits a particular distribution. A proper probability distribution that represents the TSP and PM10 has been chosen based on the statistical performance indicator values. The results show that the Hosking and Wallis plotting position combined with the Frechet distribution gave the best fit for TSP and PM10 concentrations. The Burr distribution with the same plotting position follows the Frechet distribution. The exceedance probability and days over the EAQLV are predicted using the Frechet distribution. In 2014, the exceedance probability and days for TSP concentrations are 0.052 and 19 days, respectively. Furthermore, the PM10 concentration is found to exceed the threshold limit on 174 days.
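Once a Frechet distribution is fitted, the exceedance probability of a limit value and the expected number of exceedance days in a year follow directly from its CDF. The shape and scale values below are illustrative, not the study's fitted parameters.

```python
import math

def frechet_cdf(x, alpha, s, m=0.0):
    """Frechet CDF F(x) = exp(-((x - m)/s)**(-alpha)) for x > m,
    with shape alpha, scale s, and location m."""
    return math.exp(-((x - m) / s) ** (-alpha))

# Exceedance probability of the 24-h TSP limit under a fitted Frechet
# model (alpha and s are assumed illustrative values, not the study's fit)
threshold = 230.0          # EAQLV 24-h limit for TSP, in ug/m3
alpha, s = 2.0, 60.0
p_exceed = 1.0 - frechet_cdf(threshold, alpha, s)

# Expected number of exceedance days in a year of daily averages
days = 365 * p_exceed
```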
Exact probability distribution function for the volatility of cumulative production
Zadourian, Rubina; Klümper, Andreas
2018-04-01
In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the future research on forecasting of the production process.
Sharp Bounds by Probability-Generating Functions and Variable Drift
DEFF Research Database (Denmark)
Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten
2011-01-01
We introduce to the runtime analysis of evolutionary algorithms two powerful techniques: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results by Doerr et al. (GECCO 2010) in several respects. First, the upper bound on the expected running time of the most successful quasirandom evolutionary algorithm for the OneMax function is improved from 1.28 n ln n to 0.982 n ln n, which breaks the barrier of n ln n posed by coupon-collector processes. Compared to the classical...
Satake, Eiki; Vashlishan Murray, Amy
2015-01-01
This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS
Potter, Kristin; Kirby, Robert Michael; Xiu, Dongbin; Johnson, Chris R.
2012-01-01
The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.
Continuation of probability density functions using a generalized Lyapunov approach
Energy Technology Data Exchange (ETDEWEB)
Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)
2017-05-01
Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence
International Nuclear Information System (INIS)
Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del
2009-01-01
Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence is presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics of variabilities have been also observed in other physical systems that are characterized by convection such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasmas and the sea surface temperature fluctuations.
Elements of a function analytic approach to probability.
Energy Technology Data Exchange (ETDEWEB)
Ghanem, Roger Georges (University of Southern California, Los Angeles, CA); Red-Horse, John Robert
2008-02-01
We first provide a detailed motivation for using probability theory as a mathematical context in which to analyze engineering and scientific systems that possess uncertainties. We then present introductory notes on the function analytic approach to probabilistic analysis, emphasizing the connections to various classical deterministic mathematical analysis elements. Lastly, we describe how to use the approach as a means to augment deterministic analysis methods in a particular Hilbert space context, and thus enable a rigorous framework for commingling deterministic and probabilistic analysis tools in an application setting.
DEFF Research Database (Denmark)
Hu, Y.; Li, H.; Liao, X
2016-01-01
This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, early deterioration condition evaluation poses a challenge to traditional vibration-based approaches. This study proposes an evaluation method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method of early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting early deterioration condition and in tracking its further development for the critical components.
Theoretical derivation of wind power probability distribution function and applications
International Nuclear Information System (INIS)
Altunkaynak, Abdüsselam; Erdik, Tarkan; Dabanlı, İsmail; Şen, Zekai
2012-01-01
Highlights: ► The wind power stochastic characteristics derived include the standard deviation and the dimensionless skewness. ► Perturbation theory yields expressions for the wind power statistics from the Weibull probability distribution function (PDF). ► Comparisons with the corresponding characteristics of the wind speed PDF are given. ► The wind power abides by the Weibull PDF. -- Abstract: The instantaneous wind power contained in an air current is directly proportional to the cube of the wind speed. In practice, wind speeds are recorded in the form of a time series, so it is necessary to develop a formulation that takes the statistical parameters of such a series into consideration. The purpose of this paper is to derive the general wind power formulation in terms of the statistical parameters by using perturbation theory, which leads to a general formulation of the wind power expectation and other statistical parameter expressions such as the standard deviation and the coefficient of variation. The formulation is very general and can be applied to any wind speed probability distribution function. Its application to the two-parameter Weibull probability distribution of wind speeds is presented in full detail. It is concluded that, provided wind speed is distributed according to a Weibull distribution, the wind power can be derived from wind speed data. It is possible to determine wind power at any desired risk level; in practical studies 5% or 10% risk levels are most often preferred, and the necessary simple procedure for this purpose is presented in this paper.
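The cube relationship described in the abstract can be sketched numerically. For a Weibull wind speed with scale c and shape k, the n-th raw moment is E[v^n] = c^n Γ(1 + n/k), so the mean power density is 0.5 ρ E[v^3]. The sketch below (with illustrative, not measured, values of c, k and air density ρ) checks the analytic moment against a Monte Carlo estimate:

```python
import math
import random

def weibull_moment(c, k, n):
    """Analytic n-th raw moment of a Weibull(scale=c, shape=k) wind speed."""
    return c**n * math.gamma(1.0 + n / k)

def mc_mean_power(c, k, rho=1.225, n_samples=200_000, seed=1):
    """Monte Carlo estimate of mean wind power density 0.5*rho*E[v^3] (W/m^2)."""
    rng = random.Random(seed)
    total = sum(rng.weibullvariate(c, k) ** 3 for _ in range(n_samples))
    return 0.5 * rho * total / n_samples

c, k = 8.0, 2.0  # illustrative scale (m/s) and shape parameters
analytic = 0.5 * 1.225 * weibull_moment(c, k, 3)
estimate = mc_mean_power(c, k)
print(analytic, estimate)  # the two agree to within Monte Carlo error
```

The same `weibull_moment` call with n = 6 gives the second moment of power, from which the standard deviation and coefficient of variation mentioned in the abstract follow.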
Inoue, N.
2017-12-01
The conditional probability of surface rupture is affected by various factors, such as shallow material properties, earthquake processes, ground motions and so on. Toda (2013) pointed out differences in the conditional probability between strike-slip and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture with the following procedure. Fault geometry was determined from randomly generated magnitudes based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike-slip faults, and this result coincides with previous similar studies (i.e. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike-slip and reverse faults, and this trend is similar to the conditional probability in PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The probability of the combined simulated results of thrust and reverse faults is also low. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, it is possible that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, is low and similar to that of strike-slip faults (i.e. Takao et al., 2013). In the future, numerical simulation considering the failure condition of the surface by the source …
Cytoarchitecture, probability maps and functions of the human frontal pole.
Bludau, S; Eickhoff, S B; Mohlberg, H; Caspers, S; Laird, A R; Fox, P T; Schleicher, A; Zilles, K; Amunts, K
2014-06-01
The frontal pole has expanded more than any other part of the human brain compared to our ancestors. It plays an important role in specifically human behavior and cognitive abilities, e.g. action selection (Kovach et al., 2012). Evidence of divergent functions of its medial and lateral parts has been provided, both in the healthy brain and in psychiatric disorders. The anatomical correlates of such functional segregation, however, are still unknown due to a lack of stereotaxic, microstructural maps obtained in a representative sample of brains. Here we show that the human frontopolar cortex consists of two cytoarchitectonically and functionally distinct areas: lateral frontopolar area 1 (Fp1) and medial frontopolar area 2 (Fp2). Based on observer-independent mapping in serial, cell-body-stained sections of 10 brains, three-dimensional probabilistic maps of areas Fp1 and Fp2 were created. They show, for each position of the reference space, the probability with which each area was found in a particular voxel. Applying these maps as seed regions for a meta-analysis revealed that Fp1 and Fp2 contribute differentially to functional networks: Fp1 was involved in cognition, working memory and perception, whereas Fp2 was part of brain networks underlying affective processing and social cognition. The present study thus disclosed cortical correlates of a functional segregation of the human frontopolar cortex. The probabilistic maps provide a sound anatomical basis for interpreting neuroimaging data in the living human brain, and open new perspectives for analyzing structure-function relationships in the prefrontal cortex. The new data will also serve as a starting point for further comparative studies between human and non-human primate brains, allowing similarities and differences in the organizational principles of the frontal lobe during evolution to be identified as the neurobiological basis for our behavior and cognitive abilities. Copyright © 2013 Elsevier Inc. All rights reserved.
Probability density functions for CP-violating rephasing invariants
Fortin, Jean-François; Giasson, Nicolas; Marleau, Luc
2018-05-01
The implications of the anarchy principle on CP violation in the lepton sector are investigated. A systematic method is introduced to compute the probability density functions for the CP-violating rephasing invariants of the PMNS matrix from the Haar measure relevant to the anarchy principle. Contrary to the CKM matrix, which is hierarchical, it is shown that the Haar measure, and hence the anarchy principle, are very likely to lead to the observed PMNS matrix. Predictions on the CP-violating Dirac rephasing invariant |j_D| and Majorana rephasing invariant |j_1| are also obtained. They correspond to ⟨|j_D|⟩_Haar = π/105 ≈ 0.030 and ⟨|j_1|⟩_Haar = 1/(6π) ≈ 0.053 respectively, in agreement with the experimental hint from T2K of |j_D^exp| ≈ 0.032 ± 0.005 (or ≈ 0.033 ± 0.003) for the normal (or inverted) hierarchy.
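The Haar-averaged prediction ⟨|j_D|⟩ = π/105 quoted above can be checked by direct sampling. The sketch below draws Haar-distributed 3×3 unitaries via QR decomposition of a complex Ginibre matrix (with the standard phase correction) and averages the absolute Jarlskog-type rephasing invariant; the sample size is an arbitrary choice:

```python
import numpy as np

def haar_unitary(n, rng):
    """Sample an n x n Haar-distributed unitary via QR of a complex Ginibre matrix."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))  # column phase correction makes the distribution Haar

def dirac_invariant(u):
    """Dirac rephasing invariant j_D = Im(U_11 U_22 U_12* U_21*)."""
    return np.imag(u[0, 0] * u[1, 1] * np.conj(u[0, 1]) * np.conj(u[1, 0]))

rng = np.random.default_rng(0)
mean_abs_jd = float(np.mean([abs(dirac_invariant(haar_unitary(3, rng)))
                             for _ in range(50_000)]))
print(mean_abs_jd, np.pi / 105)  # both close to 0.030
```

The Monte Carlo mean should land within statistical error of π/105 ≈ 0.030, the value the abstract reports.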
Interactive design of probability density functions for shape grammars
Dang, Minh
2015-11-02
A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density function (pdf) over such a shape space and to sample models according to the designed pdf. First, we propose a user interface that enables a user to quickly provide preference scores for selected shapes and suggest sampling strategies to decide which models to present to the user to evaluate. Second, we propose a novel kernel function to encode the similarity between two procedural models. Third, we propose a framework to interpolate user preference scores by combining multiple techniques: function factorization, Gaussian process regression, automatic relevance determination, and l1 regularization. Fourth, we modify the original grammars to generate models with a pdf proportional to the user preference scores. Finally, we provide evaluations of our user interface and framework parameters and a comparison to other exploratory modeling techniques using modeling tasks in five example shape spaces: furniture, low-rise buildings, skyscrapers, airplanes, and vegetation.
Probability of detection as a function of multiple influencing parameters
Energy Technology Data Exchange (ETDEWEB)
Pavlovic, Mato
2014-10-15
Non-destructive testing is subject to measurement uncertainties. In safety-critical applications, a reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter influencing the probability of detection (POD) of the flaw, which is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for structural integrity. This dissertation investigates applications in which several factors have an important influence on the POD. To devise a reliable estimate of the NDT system capability, it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. In addition, a technique to spatially present POD data, called the volume POD, is developed. The POD data coming from multiple inspections of the same component with different sensors are fused to reach the overall POD of the inspection system.
Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum
Directory of Open Access Journals (Sweden)
Farshid eSepehrband
2016-05-01
Full Text Available Axon diameter is an important neuroanatomical characteristic of the nervous system that is altered in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters obtained from a direct measurement technique (e.g., microscopy) or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters, which has consistently been shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as the location and scale of the mode and the profile of the distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than the other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with its own distribution profile. In addition, we observed that several other distributions outperformed the gamma distribution while having the same number of unknown parameters; these were the inverse Gaussian, log-normal, log-logistic and Birnbaum-Saunders distributions.
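The model-comparison exercise described above can be sketched with `scipy.stats`: fit candidate families by maximum likelihood and rank them by AIC. The data here are synthetic draws from a generalized extreme value (GEV) distribution with invented parameters, standing in for measured diameters, so the GEV family is expected to win the comparison:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic right-skewed "diameters"; the GEV parameters here are illustrative only.
diam = stats.genextreme.rvs(c=-0.25, loc=0.6, scale=0.2, size=3000, random_state=rng)
diam = diam[diam > 0]  # keep physically meaningful (positive) values

def aic(dist, data):
    """Akaike information criterion for a scipy.stats distribution fitted by MLE."""
    params = dist.fit(data)
    return 2 * len(params) - 2 * np.sum(dist.logpdf(data, *params))

aic_gev = aic(stats.genextreme, diam)
aic_gamma = aic(stats.gamma, diam)
print(aic_gev, aic_gamma)  # the generating GEV family attains the lower AIC
```

On real microscopy data one would fit all the families named in the abstract (gamma, inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, GEV) and compare in the same way.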
Probability of detection as a function of multiple influencing parameters
International Nuclear Information System (INIS)
Pavlovic, Mato
2014-01-01
Non-destructive testing is subject to measurement uncertainties. In safety-critical applications, a reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter influencing the probability of detection (POD) of the flaw, which is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for structural integrity. This dissertation investigates applications in which several factors have an important influence on the POD. To devise a reliable estimate of the NDT system capability, it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. In addition, a technique to spatially present POD data, called the volume POD, is developed. The POD data coming from multiple inspections of the same component with different sensors are fused to reach the overall POD of the inspection system.
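A common single-parameter baseline for POD-versus-flaw-size curves (which the multi-parameter model above generalizes) is a logistic hit/miss fit. The sketch below, with invented true parameters and synthetic hit/miss data, fits POD(a) by Newton/IRLS and extracts the size detected with 90% probability; it is a generic illustration, not the dissertation's model:

```python
import numpy as np

def fit_pod_logistic(sizes, hits, iters=50):
    """Fit POD(a) = 1/(1+exp(-(b0+b1*a))) to hit/miss data by Newton/IRLS."""
    X = np.column_stack([np.ones_like(sizes), sizes])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                      # IRLS weights
        grad = X.T @ (hits - p)                # score vector
        hess = X.T @ (X * W[:, None])          # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(42)
true_b0, true_b1 = -6.0, 4.0            # illustrative parameters only
a = rng.uniform(0.2, 3.0, 500)          # flaw sizes, e.g. in mm
p_true = 1.0 / (1.0 + np.exp(-(true_b0 + true_b1 * a)))
y = (rng.uniform(size=500) < p_true).astype(float)

b0, b1 = fit_pod_logistic(a, y)
a90 = (np.log(0.9 / 0.1) - b0) / b1     # flaw size detected with 90% probability
print(b0, b1, a90)
```

Extending `X` with further columns (probe angle, material, depth, ...) gives a minimal multi-parameter POD in the spirit of the abstract.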
Robust functional statistics applied to Probability Density Function shape screening of sEMG data.
Boudaoud, S; Rix, H; Al Harrach, M; Marin, F
2014-01-01
Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographic (sEMG) data in several contexts, such as fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters such as skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. The small sample size induces errors in the estimated HOS parameters, hindering real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics appear to be more robust than HOS parameters to the small-sample-size effect and more accurate in sEMG PDF shape screening applications.
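The small-sample problem motivating this work is easy to reproduce: the sampling variability of the sample skewness on a skewed (log-normal) population is much larger at small N than at large N. A minimal Monte Carlo sketch, with arbitrary sample sizes and trial counts:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def skew_spread(n, trials=500):
    """Std of the sample skewness over repeated draws from a log-normal population."""
    return float(np.std([stats.skew(rng.lognormal(0.0, 0.5, n))
                         for _ in range(trials)]))

small, large = skew_spread(50), skew_spread(2000)
print(small, large)  # the small-sample estimator is far noisier
```

The same experiment with `stats.kurtosis` shows an even stronger small-sample effect, which is the behavior the proposed functional statistics are designed to mitigate.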
Conditional probability of the tornado missile impact given a tornado occurrence
International Nuclear Information System (INIS)
Goodman, J.; Koch, J.E.
1982-01-01
Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (the injection probability η(F) and the height distribution ψ(Z,F)), which are developed in this study, and one plant-specific parameter (the number of potential missiles N_p). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to the calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets
Probability Density Function Method for Observing Reconstructed Attractor Structure
Institute of Scientific and Technical Information of China (English)
陆宏伟; 陈亚珠; 卫青
2004-01-01
A probability density function (PDF) method is proposed for analysing the structure of the reconstructed attractor in computing the correlation dimensions of RR intervals of ten normal old men. The PDF contains important information about the spatial distribution of the phase points in the reconstructed attractor. To the best of our knowledge, this is the first time the PDF method has been put forward for the analysis of the reconstructed attractor structure. Numerical simulations demonstrate that the cardiac systems of healthy old men are complex dynamical systems of about 6-6.5 dimensions. It is found that the PDF is not symmetrically distributed when the time delay is small, while the PDF satisfies a Gaussian distribution when the time delay is big enough. A cluster effect mechanism is presented to explain this phenomenon. The study of the PDF shapes clearly indicates that the time delay plays a more important role than the embedding dimension in the reconstruction. The results demonstrate that the PDF method represents a promising numerical approach for the observation of the reconstructed attractor structure and may provide more information and new diagnostic potential for the analyzed cardiac system.
Radakovic, Nenad; McDougall, Douglas
2012-01-01
This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships,…
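The area-proportional reasoning this note advocates maps directly onto a few lines of arithmetic. The sketch below works through Bayes' theorem on a hypothetical screening example; the prevalence, sensitivity and false-positive rate are invented for illustration:

```python
# Hypothetical screening test: prevalence 1%, sensitivity 95%, false-positive rate 10%.
p_d = 0.01                 # P(disease)
p_pos_given_d = 0.95       # P(positive | disease)
p_pos_given_not_d = 0.10   # P(positive | no disease)

# Law of total probability: P(positive)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem: P(disease | positive)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # 0.088
```

The counterintuitively small posterior (under 9% despite a "95% accurate" test) is exactly what the area-proportional Venn diagram makes visible: the false-positive region dwarfs the true-positive one when prevalence is low.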
Assumed Probability Density Functions for Shallow and Deep Convection
Directory of Open Access Journals (Sweden)
Steven K Krueger
2010-10-01
Full Text Available The assumed joint probability density function (PDF) of vertical velocity and the conserved temperature and total water scalars has been suggested as a relatively computationally inexpensive and unified subgrid-scale (SGS) parameterization for boundary layer clouds and turbulent moments. This paper analyzes the performance of five families of PDFs using large-eddy simulations of deep convection, shallow convection, and a transition from stratocumulus to trade wind cumulus. Three of the PDF families are based on the double Gaussian form and the remaining two are the single Gaussian and a double delta function (analogous to a mass flux model). The assumed PDF method is tested for grid sizes from as small as 0.4 km to as large as 204.8 km. In addition, studies are performed of the PDFs' sensitivity to errors in the input moments and of how well the PDFs diagnose some higher-order moments. In general, the double Gaussian PDFs represent SGS cloud structure and turbulence moments in the boundary layer more accurately than the single Gaussian and double delta function PDFs over the range of grid sizes tested. This is especially true for small SGS cloud fractions. While the most complex PDF, Lewellen-Yoh, better represents shallow convective cloud properties (cloud fraction and liquid water mixing ratio) than the less complex Analytic Double Gaussian 1 PDF, there appears to be no advantage in implementing Lewellen-Yoh for deep convection. However, the Analytic Double Gaussian 1 PDF better represents the liquid water flux, is less sensitive to errors in the input moments, and diagnoses higher-order moments more accurately. Between the Lewellen-Yoh and Analytic Double Gaussian 1 PDFs, it appears that neither family is distinctly better at representing cloudy layers. However, due to the reduced computational cost and fairly robust results, the Analytic Double Gaussian 1 PDF could be an ideal family for SGS cloud and turbulence …
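The core diagnostic step in any assumed-PDF scheme is integrating the PDF above saturation to obtain the SGS cloud fraction. The sketch below does this for a generic two-component Gaussian mixture of the saturation excess s (cloud fraction = mixture mass with s > 0); the weights, means and widths are invented for illustration and are not the moments of any specific family in the paper:

```python
import math

def cloud_fraction_double_gaussian(ws, mus, sigmas, s_crit=0.0):
    """SGS cloud fraction from a two-component Gaussian mixture of the
    saturation excess s: the mixture probability that s > s_crit."""
    def tail(mu, sigma):
        # P(s > s_crit) for one Gaussian component, via the complementary error function
        return 0.5 * math.erfc((s_crit - mu) / (sigma * math.sqrt(2.0)))
    return sum(w * tail(mu, sg) for w, mu, sg in zip(ws, mus, sigmas))

# Illustrative moments: a mostly-subsaturated component and a supersaturated one.
cf = cloud_fraction_double_gaussian(ws=[0.7, 0.3], mus=[-1.0, 0.5], sigmas=[0.5, 0.4])
print(cf)
```

With a single Gaussian the same integral collapses to one `tail` term, which is why single-Gaussian closures struggle with the small cloud fractions the abstract highlights.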
International Nuclear Information System (INIS)
Wayland, J.R.
1977-03-01
The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
Interactive design of probability density functions for shape grammars
Dang, Minh; Lienhard, Stefan; Ceylan, Duygu; Neubert, Boris; Wonka, Peter; Pauly, Mark
2015-01-01
A shape grammar defines a procedural shape space containing a variety of models of the same class, e.g. buildings, trees, furniture, airplanes, bikes, etc. We present a framework that enables a user to interactively design a probability density function (pdf) over such a shape space and to sample models according to the designed pdf.
On the discretization of probability density functions and the ...
Indian Academy of Sciences (India)
… important for most applications or theoretical problems of interest. In statistics … In probability theory, statistics, statistical mechanics, communication theory, and other … (1) by taking advantage of SMVT as a general mathematical approach.
International Nuclear Information System (INIS)
Li Qianshu; Lue Liqiang; Wei Gongmin
2004-01-01
This paper discusses the relationship between the Wigner function, along with other related quasiprobability distribution functions, and the probability density distribution function constructed from the wave function of the Schroedinger equation in quantum phase space, as formulated by Torres-Vega and Frederick (TF). At the same time, a general approach in solving the wave function of the Schroedinger equation of TF quantum phase space theory is proposed. The relationship of the wave functions between the TF quantum phase space representation and the coordinate or momentum representation is thus revealed
Off-critical local height probabilities on a plane and critical partition functions on a cylinder
Directory of Open Access Journals (Sweden)
Omar Foda
2018-03-01
Full Text Available We compute off-critical local height probabilities in regime-III restricted solid-on-solid models in a 4N-quadrant spiral geometry, with periodic boundary conditions in the angular direction, and fixed boundary conditions in the radial direction, as a function of N, the winding number of the spiral, and τ, the departure from criticality of the model, and observe that the result depends only on the product Nτ. In the limit N→1, τ→τ0, such that τ0 is finite, we recover the off-critical local height probability on a plane, τ0-away from criticality. In the limit N→∞, τ→0, such that Nτ=τ0 is finite, and following a conformal transformation, we obtain a critical partition function on a cylinder of aspect-ratio τ0. We conclude that the off-critical local height probability on a plane, τ0-away from criticality, is equal to a critical partition function on a cylinder of aspect-ratio τ0, in agreement with a result of Saleur and Bauer.
ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals
International Nuclear Information System (INIS)
Vogel, J.E.
1983-01-01
1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, the choice depending on the range in which x lies (the third region being x ≥ 4.0). In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x) = 1.0 - erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x) = 1.0 - erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function via the identity erfc(x) = 1.0 - erf(x). This subtraction may cause partial or total loss of significance for certain values of x
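The loss-of-significance caution in item 5 is easy to demonstrate with the standard library's `math.erf`/`math.erfc` (used here as a stand-in for the ERF/ERFC routines): for large x, erf(x) rounds to exactly 1.0 in double precision, so the subtraction 1 - erf(x) returns 0 while the directly computed erfc(x) retains its tiny true value:

```python
import math

x = 10.0
naive = 1.0 - math.erf(x)   # catastrophic cancellation: erf(10) rounds to 1.0
direct = math.erfc(x)       # computed directly, retains full relative precision
print(naive, direct)        # 0.0 versus a value of order 1e-45
```

This is precisely why the routine computes erfc directly in the large-x regions rather than through the identity.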
International Nuclear Information System (INIS)
Kang, Dae Il; Han, Sang Hoon
2006-01-01
RG 1.177 requires that the conditional risk (incremental conditional core damage probability and incremental conditional large early release probability: ICCDP and ICLERP), given that a specific component is out of service (OOS), be quantified for a permanent change of the allowed outage time (AOT) of a safety system. An AOT is the length of time that a particular component or system is permitted to be OOS while the plant is operating. The ICCDP is defined as: ICCDP = [(conditional CDF with the subject equipment OOS) - (baseline CDF with nominal expected equipment unavailabilities)] x [duration of the single AOT under consideration]. Any event rendering the component OOS can start the time clock for the limiting condition of operation of a nuclear power plant. Thus, the largest ICCDP among the ICCDPs estimated from any occurrence of the basic events in the component fault tree should be selected in determining whether the AOT can be extended or not. If the component is under preventive maintenance, the conditional risk can be calculated straightforwardly without changing the CCF probability. The main concern is the estimation of the CCF probability, because of the possibility that other similar components fail due to the same root causes. The quantification of the risk, given that the subject equipment is in a failed state, is performed by setting the identified event of the subject equipment to TRUE. The CCF probabilities are also changed according to the identified failure cause. In the previous studies, however, the ICCDP was quantified with the consideration of the possibility of a simultaneous occurrence of two CCF events. Based on the above, we derived formulas for the CCF probabilities for the cases where a specific component is in a failed state, and we present sample calculation results of the ICCDP for the low pressure safety injection system (LPSIS) of Ulchin Unit 3
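The ICCDP definition quoted above is a one-line calculation once the two core damage frequencies are known. The sketch below uses invented CDF values and a 72-hour AOT purely for illustration; the 5e-7 acceptance guideline mentioned in the comment is the commonly cited RG 1.177 threshold, included here as context rather than as part of this abstract:

```python
HOURS_PER_YEAR = 8760.0

def iccdp(cdf_conditional, cdf_baseline, aot_hours):
    """ICCDP = (conditional CDF - baseline CDF) x AOT,
    with CDFs in per-year units and the AOT converted from hours to years."""
    return (cdf_conditional - cdf_baseline) * aot_hours / HOURS_PER_YEAR

# Illustrative values only: baseline CDF 2e-5/yr, CDF with the component OOS 8e-5/yr.
value = iccdp(8.0e-5, 2.0e-5, 72.0)
print(value)  # ~4.9e-7, just below the commonly cited 5e-7 acceptance guideline
```

The largest such value over all basic events enabling the component outage is the one compared against the guideline, per the selection rule in the abstract.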
Comparison of density estimators. [Estimation of probability density functions
Energy Technology Data Exchange (ETDEWEB)
Kao, S.; Monahan, J.F.
1977-09-01
Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, as are some simulations. The object is to compare the performance of the various methods in small samples and their sensitivity to changes in their parameters, and to attempt to discover at what point a sample is so small that density estimation is no longer worthwhile. (RWR)
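A small-sample comparison of the kind described can be sketched by scoring two common estimators (a Gaussian kernel density estimate and a histogram) against a known density via integrated squared error; the sample size, bin count and seed below are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = rng.standard_normal(100)          # deliberately small sample
grid = np.linspace(-4.0, 4.0, 401)
true_pdf = stats.norm.pdf(grid)

# Gaussian kernel density estimate (Scott's rule bandwidth by default)
kde_pdf = stats.gaussian_kde(sample)(grid)

# Histogram estimate evaluated on the same grid
counts, edges = np.histogram(sample, bins=15, range=(-4.0, 4.0), density=True)
idx = np.clip(np.searchsorted(edges, grid, side="right") - 1, 0, len(counts) - 1)
hist_pdf = counts[idx]

dx = grid[1] - grid[0]
ise_kde = float(np.sum((kde_pdf - true_pdf) ** 2) * dx)   # integrated squared error
ise_hist = float(np.sum((hist_pdf - true_pdf) ** 2) * dx)
print(ise_kde, ise_hist)
```

Repeating this over many seeds and sample sizes approximates the kind of simulation study the report summarizes, including locating the sample size below which neither error is acceptably small.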
International Nuclear Information System (INIS)
Ozgener, B.; Ozgener, H.A.
2005-01-01
A multiregion, multigroup collision probability method with white boundary condition is developed for thermalization calculations of light water moderated reactors. Hydrogen scatterings are treated by Nelkin's kernel while scatterings from other nuclei are assumed to obey the free-gas scattering kernel. The isotropic return (white) boundary condition is applied directly by using the appropriate collision probabilities. Comparisons with alternate numerical methods show the validity of the present formulation. Comparisons with some experimental results indicate that the present formulation is capable of calculating disadvantage factors which are closer to the experimental results than alternative methods
The force distribution probability function for simple fluids by density functional theory.
Rickayzen, G; Heyes, D M
2013-02-28
Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere potentials at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low a density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.
On the probability density interpretation of smoothed Wigner functions
International Nuclear Information System (INIS)
De Aguiar, M.A.M.; Ozorio de Almeida, A.M.
1990-01-01
It has been conjectured that the averages of the Wigner function over phase space volumes, larger than those of minimum uncertainty, are always positive. This is true for Gaussian averaging, so that the Husimi distribution is positive. However, we provide a specific counterexample for the averaging with a discontinuous hat function. The analysis of the specific system of a one-dimensional particle in a box also elucidates the respective advantages of the Wigner and the Husimi functions for the study of the semiclassical limit. The falsification of the averaging conjecture is shown not to depend on the discontinuities of the hat function, by considering the latter as the limit of a sequence of analytic functions. (author)
Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures
Prodromou, Theodosia
2016-01-01
In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…
Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory
Chruściński, Dariusz
2013-03-01
Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.
Quantum-correlation breaking channels, quantum conditional probability and Perron–Frobenius theory
International Nuclear Information System (INIS)
Chruściński, Dariusz
2013-01-01
Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum–classical and classical–classical channels. Applying the quantum analog of Perron–Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum–classical channels to arbitrary quantum channels.
Demand and choice probability generating functions for perturbed consumers
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2011-01-01
This paper considers demand systems for utility-maximizing consumers equipped with additive linearly perturbed utility of the form U(x) + m⋅x and faced with general budget constraints x ∈ B. Given compact budget sets, the paper provides necessary as well as sufficient conditions for a demand genera…
Uncertainty plus Prior Equals Rational Bias: An Intuitive Bayesian Probability Weighting Function
Fennell, John; Baddeley, Roland
2012-01-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several…
The effect of conditional probability of chord progression on brain response: an MEG study.
Directory of Open Access Journals (Sweden)
Seung-Goo Kim
Full Text Available BACKGROUND: Recent electrophysiological and neuroimaging studies have explored how and where musical syntax in Western music is processed in the human brain. An inappropriate chord progression elicits an event-related potential (ERP component called an early right anterior negativity (ERAN or simply an early anterior negativity (EAN in an early stage of processing the musical syntax. Though the possible underlying mechanism of the EAN is assumed to be probabilistic learning, the effect of the probability of chord progressions on the EAN response has not been previously explored explicitly. METHODOLOGY/PRINCIPAL FINDINGS: In the present study, the empirical conditional probabilities in a Western music corpus were employed as an approximation of the frequencies in previous exposure of participants. Three types of chord progression were presented to musicians and non-musicians in order to examine the correlation between the probability of chord progression and the neuromagnetic response using magnetoencephalography (MEG. Chord progressions were found to elicit early responses in a negatively correlating fashion with the conditional probability. Observed EANm (as a magnetic counterpart of the EAN component responses were consistent with the previously reported EAN responses in terms of latency and location. The effect of conditional probability interacted with the effect of musical training. In addition, the neural response also correlated with the behavioral measures in the non-musicians. CONCLUSIONS/SIGNIFICANCE: Our study is the first to reveal the correlation between the probability of chord progression and the corresponding neuromagnetic response. The current results suggest that the physiological response is a reflection of the probabilistic representations of the musical syntax. Moreover, the results indicate that the probabilistic representation is related to the musical training as well as the sensitivity of an individual.
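The empirical conditional probabilities of chord progression used in this study reduce to bigram counts over a corpus. A minimal sketch follows; the corpus below is a made-up toy example of Roman-numeral progressions, not the Western music corpus the authors analyzed:

```python
from collections import Counter

def chord_conditional_probs(progressions):
    """Estimate P(next chord | current chord) from bigram counts."""
    pair_counts, context_counts = Counter(), Counter()
    for seq in progressions:
        for a, b in zip(seq, seq[1:]):
            pair_counts[(a, b)] += 1
            context_counts[a] += 1
    return {(a, b): c / context_counts[a] for (a, b), c in pair_counts.items()}

# Toy corpus (illustrative only).
corpus = [["I", "IV", "V", "I"], ["I", "V", "I"], ["I", "IV", "I"]]
probs = chord_conditional_probs(corpus)
```

Each estimated probability is the count of a chord pair divided by the count of its first chord as a context, the standard maximum-likelihood bigram estimate.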
Consolidity analysis for fully fuzzy functions, matrices, probability and statistics
Directory of Open Access Journals (Sweden)
Walaa Ibrahim Gabr
2015-03-01
Full Text Available The paper presents a comprehensive review of the know-how for developing the systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory included handling new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions and partial fractions of fuzzy polynomials. On the other hand, the handling of fuzzy matrices covered determinants of fuzzy matrices, the eigenvalues of fuzzy matrices, and solving least-squares fuzzy linear equations. The approach also proved applicable in a systematic way to new fuzzy probabilistic and statistical problems, including extending conventional probabilistic and statistical analysis to handle fuzzy random data. Application also covered the consolidity of fuzzy optimization problems. Various solved numerical examples demonstrated that the new consolidity concept is highly effective in solving, in a compact form, the propagation of fuzziness in linear, nonlinear, multivariable and dynamic problems with different types of complexities. Finally, it is demonstrated that the suggested fuzzy mathematics can be easily embedded within normal mathematics through building a special fuzzy-functions library inside the computational Matlab toolbox or using other similar software languages.
Class dependency of fuzzy relational database using relational calculus and conditional probability
Deni Akbar, Mohammad; Mizoguchi, Yoshihiro; Adiwijaya
2018-03-01
In this paper, we propose a design of a fuzzy relational database that deals with a conditional probability relation using fuzzy relational calculus. Previously, several studies investigated equivalence classes in fuzzy databases using similarity or approximate relations, and it is an interesting topic to investigate fuzzy dependency using equivalence classes. Our goal is to introduce a formulation of a fuzzy relational database model using the relational calculus on the category of fuzzy relations. We also introduce general formulas of the relational calculus for database operations such as 'projection', 'selection', 'injection' and 'natural join'. Using the fuzzy relational calculus and conditional probabilities, we introduce notions of equivalence class, redundancy, and dependency in the theory of fuzzy relational databases.
Fram, Miranda S.; Belitz, Kenneth
2011-01-01
We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
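The logistic-regression form described above can be sketched as follows. The coefficients and the sign conventions here are hypothetical placeholders chosen for illustration, not the fitted values from the study; `aridity` is scaled so that larger values mean more arid conditions, and `anthro_score` stands in for the composite anthropogenic score (AS):

```python
import math

def p_detect(aridity, anthro_score, b0=-2.0, b1=2.5, b2=0.7):
    """Logistic model of the probability of detecting perchlorate above a
    threshold concentration. Coefficients are illustrative assumptions."""
    z = b0 + b1 * aridity + b2 * anthro_score
    return 1.0 / (1.0 + math.exp(-z))
```

The logistic link keeps the output strictly between 0 and 1 while letting climate and anthropogenic terms shift the log-odds additively.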
Probability distribution functions for intermittent scrape-off layer plasma fluctuations
Theodorsen, A.; Garcia, O. E.
2018-03-01
A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a super-position of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive definite signal which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes which are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus estimating model parameters requires an approach based on the characteristic function, not the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF and characteristic function of the process is investigated and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
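The filtered Poisson process underlying this model can be sketched as a superposition of one-sided exponential pulses with exponentially distributed amplitudes (the basic case the abstract describes). Pulse shape and all parameter values below are illustrative assumptions:

```python
import math
import random

def shot_noise(duration, rate, tau, amp_mean, dt=0.01, seed=1):
    """Superpose uncorrelated one-sided exponential pulses arriving as a
    Poisson process with exponentially distributed amplitudes."""
    rng = random.Random(seed)
    n = int(duration / dt)
    signal = [0.0] * n
    t = rng.expovariate(rate)            # first Poisson arrival
    while t < duration:
        amp = rng.expovariate(1.0 / amp_mean)
        for k in range(int(t / dt), n):
            decay = math.exp(-(k * dt - t) / tau)
            if decay < 1e-6:             # pulse has effectively decayed
                break
            signal[k] += amp * decay
        t += rng.expovariate(rate)       # next arrival
    return signal

sig = shot_noise(duration=200.0, rate=1.0, tau=1.0, amp_mean=1.0)
mean = sum(sig) / len(sig)  # analytic mean is rate * tau * amp_mean
```

Because every pulse is positive, the resulting signal is positive definite, which is exactly the limitation the abstract notes for quantities such as electric potential.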
Fernandes, Kátia; Verchot, Louis; Baethgen, Walter; Gutierrez-Velez, Victor; Pinedo-Vasquez, Miguel; Martius, Christopher
2017-05-01
In Indonesia, drought driven fires occur typically during the warm phase of the El Niño Southern Oscillation. This was the case of the events of 1997 and 2015 that resulted in months-long hazardous atmospheric pollution levels in Equatorial Asia and record greenhouse gas emissions. Nonetheless, anomalously active fire seasons have also been observed in non-drought years. In this work, we investigated the impact of temperature on fires and found that when the July-October (JASO) period is anomalously dry, the sensitivity of fires to temperature is modest. In contrast, under normal-to-wet conditions, fire probability increases sharply when JASO is anomalously warm. This describes a regime in which an active fire season is not limited to drought years. Greater susceptibility to fires in response to a warmer environment finds support in the high evapotranspiration rates observed in normal-to-wet and warm conditions in Indonesia. We also find that fire probability in wet JASOs would be considerably less sensitive to temperature were it not for the added effect of recent positive trends. Near-term regional climate projections reveal that, despite negligible changes in precipitation, a continuing warming trend will heighten fire probability over the next few decades especially in non-drought years. Mild fire seasons currently observed in association with wet conditions and cool temperatures will become rare events in Indonesia.
Impact of proof test interval and coverage on probability of failure of safety instrumented function
International Nuclear Information System (INIS)
Jin, Jianghong; Pang, Lei; Hu, Bin; Wang, Xiaodong
2016-01-01
Highlights: • Introduction of proof test coverage makes the calculation of the probability of failure for SIF more accurate. • The probability of failure undetected by proof test is independently defined as P_TIF and calculated. • P_TIF is quantified using a reliability block diagram and the simple formula for PFD_avg. • Improving proof test coverage and adopting a reasonable test period can reduce the probability of failure for SIF. - Abstract: Imperfection of the proof test can result in safety function failure of a safety instrumented system (SIS) at any time over its lifetime. IEC 61508 and other references ignored or only superficially analyzed the imperfection of proof tests. In order to further study the impact of proof test imperfection on the probability of failure for a safety instrumented function (SIF), the necessity of the proof test and the influence of its imperfection on system performance were first analyzed theoretically. The probability of failure for the safety instrumented function resulting from the imperfection of the proof test was defined as the probability of test-independent failures (P_TIF), and P_TIF was calculated separately by introducing proof test coverage and adopting a reliability block diagram, with reference to the simplified calculation formula for the average probability of failure on demand (PFD_avg). Research results show that a shorter proof test period and higher proof test coverage yield a smaller probability of failure for the safety instrumented function, and that the probability of failure calculated by introducing proof test coverage is more accurate.
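The split the abstract describes, between dangerous undetected failures revealed by an imperfect proof test and test-independent failures found only at end of life, can be sketched with a simplified single-channel (1oo1) PFD_avg formula. This is a generic textbook-style illustration under stated assumptions, not the paper's exact derivation:

```python
def pfd_avg(lambda_du, test_interval, lifetime, coverage):
    """Simplified 1oo1 average probability of failure on demand: the
    covered fraction of dangerous undetected failures is revealed at each
    proof test, while the uncovered remainder (P_TIF) accumulates until
    end of life."""
    covered = coverage * lambda_du * test_interval / 2.0
    p_tif = (1.0 - coverage) * lambda_du * lifetime / 2.0
    return covered + p_tif
```

With perfect coverage the expression collapses to the familiar lambda_DU * T / 2; with imperfect coverage the lifetime term dominates, which is why the abstract stresses both test interval and coverage.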
Power probability density function control and performance assessment of a nuclear research reactor
International Nuclear Information System (INIS)
Abharian, Amir Esmaeili; Fadaei, Amir Hosein
2014-01-01
Highlights: • In this paper, the performance assessment of a static PDF control system is discussed. • The reactor PDF model is set up based on B-spline functions. • Neutronics and thermal-hydraulics equations are solved concurrently by a reformed Hansen's method. • A principle of performance assessment is put forward for PDF control of the nuclear reactor. - Abstract: One of the main issues in controlling a system is to keep track of the conditions of the system function. The performance condition of the system should be inspected continuously to keep the system in reliable working condition. In this study, the nuclear reactor is considered as a complicated system, and a principle of performance assessment is used for analyzing the performance of the power probability density function (PDF) control of the nuclear research reactor. First, the model of the power PDF is set up; then the controller is designed to make the power PDF trace the given shape, making the reactor a closed-loop system. The operating data of the closed-loop reactor are used to assess the control performance with the performance assessment criteria. The modeling, controller design and performance assessment of the power PDF are all applied to the control of Tehran Research Reactor (TRR) power in a nuclear process. In this paper, the performance assessment of the static PDF control system is discussed, the efficacy and efficiency of the proposed method are investigated, and finally its reliability is proven
International Nuclear Information System (INIS)
Niestegge, Gerd
2010-01-01
In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lueders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases. (general)
Kuhlmann, Beatrice G; Vaterrodt, Bianca; Bayen, Ute J
2012-09-01
Two experiments examined reliance on schematic knowledge in source monitoring. Based on a probability-matching account of source guessing, a schema bias will only emerge if participants do not have a representation of the source-item contingency in the study list, or if the perceived contingency is consistent with schematic expectations. Thus, the account predicts that encoding conditions that affect contingency detection also affect schema bias. In Experiment 1, the schema bias commonly found when schematic information about the sources is not provided before encoding was diminished by an intentional source-memory instruction. In Experiment 2, the depth of processing of schema-consistent and schema-inconsistent source-item pairings was manipulated. Participants consequently overestimated the occurrence of the pairing type they processed in a deep manner, and their source guessing reflected this biased contingency perception. Results support the probability-matching account of source guessing. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Probability laws related to the Jacobi theta and Riemann zeta function and Brownian excursions
Biane, P.; Pitman, J.; Yor, M.
1999-01-01
This paper reviews known results which connect Riemann's integral representations of his zeta function, involving Jacobi's theta function and its derivatives, to some particular probability laws governing sums of independent exponential variables. These laws are related to one-dimensional Brownian motion and to higher dimensional Bessel processes. We present some characterizations of these probability laws, and some approximations of Riemann's zeta function which are related to these laws.
Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.
Fennell, John; Baddeley, Roland
2012-10-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
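The paper's qualitative claim, that combining a stated probability with any reasonable prior overweights low probabilities and underweights high ones, can be sketched by treating the stated probability as a handful of pseudo-observations under a Beta prior. The pseudo-count value below is an arbitrary choice for illustration, not a parameter from the paper:

```python
def bayes_weight(p, prior_a=1.0, prior_b=1.0, n=4.0):
    """Posterior-mean weighting of a stated probability p: treat p as the
    success rate of n pseudo-observations combined with a Beta(a, b) prior.
    The result shrinks p toward the prior mean (0.5 here), inflating small
    p and deflating large p."""
    return (p * n + prior_a) / (n + prior_a + prior_b)

low, high = bayes_weight(0.01), bayes_weight(0.99)
```

With a symmetric prior the weighting function is symmetric about 0.5, reproducing the inverse-S shape of empirical probability weighting functions in its crudest form.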
Bakosi, J.; Franzese, P.; Boybeyi, Z.
2007-11-01
Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth and Pope [Phys. Fluids 29, 387 (1986)] with Durbin's [J. Fluid Mech. 249, 465 (1993)] method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous transport with a nonlocal representation of the near-wall Reynolds stress anisotropy. The presence of walls is incorporated through the imposition of no-slip and impermeability conditions on particles without the use of damping or wall-functions. Information on the turbulent time scale is supplied by the gamma-distribution model of van Slooten et al. [Phys. Fluids 10, 246 (1998)]. Two different micromixing models are compared that incorporate the effect of small scale mixing on the transported scalar: the widely used interaction by exchange with the mean and the interaction by exchange with the conditional mean model. Single-point velocity and concentration statistics are compared to direct numerical simulation and experimental data at Reτ=1080 based on the friction velocity and the channel half width. The joint model accurately reproduces a wide variety of conditional and unconditional statistics in both physical and composition space.
Compact baby universe model in ten dimension and probability function of quantum gravity
International Nuclear Information System (INIS)
Yan Jun; Hu Shike
1991-01-01
The quantum probability functions are calculated for a ten-dimensional compact baby universe model. The authors find that the probability for the Yang-Mills baby universe to undergo a spontaneous compactification down to a four-dimensional spacetime is greater than that to remain in the original homogeneous multidimensional state. Some questions about the large-wormhole catastrophe are also discussed
Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen
2010-04-01
The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed. (c) 2009 Elsevier Inc. All rights reserved.
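The limiting conditional distribution (Yaglom limit) described above can be approximated numerically by evolving the master equation on a truncated state space and renormalizing over the non-extinct states, one concrete realization of the "stationary distributions of related processes" idea. The subcritical linear birth-death rates below are an illustrative choice (for them the LCD is known to be geometric with ratio b/d), not the repertoire model from the paper:

```python
def lcd_approx(birth, death, n_max, t_end=20.0, dt=0.002):
    """Approximate the limiting conditional distribution of a birth-death
    process absorbed at 0: forward-Euler integration of the master equation
    on {1, ..., n_max}, renormalizing to condition on non-extinction."""
    p = [0.0] + [1.0 / n_max] * n_max
    for _ in range(int(t_end / dt)):
        q = p[:]
        for n in range(1, n_max + 1):
            inflow = birth(n - 1) * p[n - 1] if n > 1 else 0.0
            if n < n_max:
                inflow += death(n + 1) * p[n + 1]
            out_rate = death(n) + (birth(n) if n < n_max else 0.0)
            q[n] = p[n] + dt * (inflow - out_rate * p[n])
        total = sum(q[1:])
        p = [0.0] + [x / total for x in q[1:]]
    return p

# Subcritical linear rates: LCD is geometric, P(1) = 1 - b/d = 0.5.
lcd = lcd_approx(birth=lambda n: 0.5 * n, death=lambda n: 1.0 * n, n_max=30)
```

Truncation at n_max is harmless here because the geometric tail decays fast; for slowly decaying LCDs the truncation level would need care.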
Blutner, Reinhard
2009-03-01
Recently, Gerd Niestegge developed a new approach to quantum mechanics via conditional probabilities, extending the well-known proposal to consider the Lüders-von Neumann measurement as a non-classical extension of probability conditionalization. I will apply his powerful and rigorous approach to the treatment of concepts using a geometrical model of meaning. In this model, instances are treated as vectors of a Hilbert space H. In the present approach there are at least two possibilities to form categories. The first possibility sees categories as a mixture of its instances (described by a density matrix). In the simplest case we get the classical probability theory including the Bayesian formula. The second possibility sees categories formed by a distinctive prototype which is the superposition of the (weighted) instances. The construction of prototypes can be seen as transferring a mixed quantum state into a pure quantum state, freezing the probabilistic characteristics of the superposed instances into the structure of the formed prototype. Closely related to the idea of forming concepts by prototypes is the existence of interference effects. Such interference effects are typically found in macroscopic quantum systems and I will discuss them in connection with several puzzles of bounded rationality. The present approach nicely generalizes earlier proposals made by authors such as Diederik Aerts, Andrei Khrennikov, Ricardo Franco, and Jerome Busemeyer. Concluding, I will suggest that an active dialogue between cognitive approaches to logic and semantics and the modern approach of quantum information science is mandatory.
Hawkes-diffusion process and the conditional probability of defaults in the Eurozone
Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin
2016-05-01
This study examines market information embedded in the European sovereign CDS (credit default swap) market by analyzing the sovereign CDSs of 13 Eurozone countries from January 1, 2008, to February 29, 2012, which includes the recent Eurozone debt crisis period. We design the conditional probability of defaults for the CDS prices based on the Hawkes-diffusion process and obtain the theoretical prices of CDS indexes. To estimate the model parameters, we calibrate the model prices to empirical prices obtained from individual sovereign CDS term structure data. The estimated parameters clearly explain both cross-sectional and time-series data. Our empirical results show that the probability of a huge loss event sharply increased during the Eurozone debt crisis, indicating a contagion effect. Even countries with strong and stable economies, such as Germany and France, suffered from the contagion effect. We also find that the probability of small events is sensitive to the state of the economy, spiking several times due to the global financial crisis and the Greek government debt crisis.
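The self-exciting ingredient of a Hawkes-type model, where each credit event raises the arrival intensity of further events (the contagion effect the abstract reports), can be sketched with the standard exponential kernel. The parameter values and event times below are arbitrary illustrations, not calibrated CDS quantities:

```python
import math

def hawkes_intensity(event_times, t, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity of a Hawkes process with exponential kernel:
    baseline mu plus decaying excitation from every past event."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in event_times if ti < t)

events = [1.0, 1.5, 4.0]  # toy event times
```

The branching ratio alpha/beta < 1 keeps the process stationary; each event transiently lifts the intensity above the baseline, producing the clustering that a contagion model needs.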
Probability function of breaking-limited surface elevation. [wind generated waves of ocean
Tung, C. C.; Huang, N. E.; Yuan, Y.; Long, S. R.
1989-01-01
The effect of wave breaking on the probability function of surface elevation is examined. The surface elevation limited by wave breaking, ζ_b(t), is first related to the original wave elevation ζ(t) and its second derivative. An approximate, second-order, nonlinear, non-Gaussian model for ζ(t) of arbitrary but moderate bandwidth is presented, and an expression for the probability density function of ζ_b(t) is derived. The results show clearly that the effect of wave breaking on the probability density function of surface elevation is to introduce a secondary hump on the positive side of the probability density function, a phenomenon also observed in wind wave tank experiments.
Li, Shuying; Zhuang, Jun; Shen, Shifei
2017-07-01
In recent years, various types of terrorist attacks occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but still stays 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
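Before any time-series modeling, the conditional probability being forecast, the chance that an attack in a given month is a bombing, reduces to a ratio of counts per month. A minimal sketch follows; the records below are fabricated toy data, not GTD entries:

```python
from collections import defaultdict

def monthly_cpba(records):
    """Conditional probability of a bombing given an attack, per month.
    records: iterable of (month, tactic) pairs."""
    totals = defaultdict(int)
    bombings = defaultdict(int)
    for month, tactic in records:
        totals[month] += 1
        if tactic == "bombing":
            bombings[month] += 1
    return {m: bombings[m] / totals[m] for m in totals}

toy = [("2011-11", "bombing"), ("2011-11", "armed assault"),
       ("2011-12", "bombing"), ("2011-12", "bombing"),
       ("2011-12", "kidnapping")]
cpba = monthly_cpba(toy)
```

The resulting monthly series is what an ARIMA-plus-intervention model, as in the abstract, would then be fitted to.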
Du, Yuanwei; Guo, Yubin
2015-01-01
The intrinsic mechanism of multimorbidity is difficult to recognize, and prediction and diagnosis are accordingly difficult to carry out. Bayesian networks can help to diagnose multimorbidity in health care, but it is difficult to obtain the conditional probability table (CPT) because of the lack of clinical statistical data. Today, expert knowledge and experience are increasingly used in training Bayesian networks in order to help predict or diagnose diseases, but the CPT in Bayesian networks is usually irrational or ineffective because it ignores realistic constraints, especially in multimorbidity. In order to solve these problems, an evidence reasoning (ER) approach is employed to extract and fuse inference data from experts using a belief distribution and a recursive ER algorithm, based on which an evidence reasoning method for constructing conditional probability tables in Bayesian networks of multimorbidity is presented step by step. A multimorbidity numerical example is used to demonstrate the method and prove its feasibility and applicability. The Bayesian network can be determined as long as the inference assessment is provided by each expert according to his/her knowledge or experience. Our method is more effective than existing methods for extracting expert inference data accurately and fusing it effectively for constructing CPTs in a Bayesian network of multimorbidity.
The distribution function of a probability measure on a space with a fractal structure
Energy Technology Data Exchange (ETDEWEB)
Sanchez-Granero, M.A.; Galvez-Rodriguez, J.F.
2017-07-01
In this work we show how to define a probability measure with the help of a fractal structure. One of the keys of this approach is to use the completion of the fractal structure. Then we use the theory of a cumulative distribution function on a Polish ultrametric space and describe it in this context. Finally, with the help of fractal structures, we prove that a function satisfying the properties of a cumulative distribution function on a Polish ultrametric space is a cumulative distribution function with respect to some probability measure on the space. (Author)
Blue functions: probability and current density propagators in non-relativistic quantum mechanics
International Nuclear Information System (INIS)
Withers, L P Jr
2011-01-01
Like a Green function to propagate a particle's wavefunction in time, a Blue function is introduced to propagate the particle's probability and current density. Accordingly, the complete Blue function has four components. They are constructed from path integrals involving a quantity like the action that we call the motion. The Blue function acts on the displaced probability density as the kernel of an integral operator. As a result, we find that the Wigner density occurs as an expression for physical propagation. We also show that, in quantum mechanics, the displaced current density is conserved bilocally (in two places at one time), as expressed by a generalized continuity equation. (paper)
Steurer, Johann; Held, Ulrike; Miettinen, Olli S
2013-11-01
Knowing about a diagnostic probability requires general knowledge about the way in which the probability depends on the diagnostic indicators involved in the specification of the case at issue. Diagnostic probability functions (DPFs) are generally unavailable at present. Our objective was to illustrate how diagnostic experts' case-specific tacit knowledge about diagnostic probabilities could be garnered in the form of DPFs. Focusing on diagnosis of acute coronary heart disease (ACHD), we presented doctors with extensive experience in hospitals' emergency departments a set of hypothetical cases specified in terms of an inclusive set of diagnostic indicators. We translated the medians of these experts' case-specific probabilities into a logistic DPF for ACHD. The principal result was the experts' typical diagnostic probability for ACHD as a joint function of the set of diagnostic indicators. A related result of note was the finding that the experts' probabilities in any given case had a surprising degree of variability. Garnering diagnostic experts' case-specific tacit knowledge about diagnostic probabilities in the form of DPFs is feasible to accomplish. Thus, once the methodology of this type of work has been "perfected," practice-guiding diagnostic expert systems can be developed. Copyright © 2013 Elsevier Inc. All rights reserved.
Francisco, E.; Pendás, A. Martín; Blanco, M. A.
2008-04-01
Given an N-electron molecule and an exhaustive partition of the real space (R) into m arbitrary regions Ω₁, Ω₂, …, Ωₘ (⋃ᵢ₌₁ᵐ Ωᵢ = R), the edf program computes all the probabilities P(n₁, n₂, …, nₘ) of having exactly n₁ electrons in Ω₁, n₂ electrons in Ω₂, …, and nₘ electrons (n₁ + n₂ + ⋯ + nₘ = N) in Ωₘ. Each Ωᵢ may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ωᵢ. The program can manage both single- and multi-determinant wave functions, which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinant wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MOs). After the P(n₁, n₂, …, nₘ) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n₁, n₂, …, nₘ) probabilities into α and β spin components. Program summary: Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer
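A toy version of the electron-number distribution P(n₁) can illustrate the idea. This is not the edf algorithm (which works from domain-overlap integrals of the actual wave function); it assumes an independent-electron model in which electron i sits in region Ω₁ with probability pᵢ, so n₁ is Poisson-binomial, computed here by direct convolution.

```python
# Independent-electron toy model (NOT edf): electron i is found in region
# Omega_1 with probability p_i, so the count n_1 is Poisson-binomial.
def electron_count_distribution(ps):
    dist = [1.0]                      # P(n_1 = k) with no electrons placed yet
    for p in ps:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1 - p)     # electron stays outside Omega_1
            new[k + 1] += q * p       # electron falls inside Omega_1
        dist = new
    return dist

# Four electrons, two strongly and two weakly localized in Omega_1.
dist = electron_count_distribution([0.9, 0.9, 0.1, 0.1])
print([round(q, 4) for q in dist])  # → [0.0081, 0.1476, 0.6886, 0.1476, 0.0081]
```

The most probable count, n₁ = 2, corresponds to the "average electronic population" of the region; the spread around it is what the localization/delocalization indices summarize.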
Ramachandran, Suchitra; Meyer, Travis; Olson, Carl R
2016-01-01
When monkeys view two images in fixed sequence repeatedly over days and weeks, neurons in area TE of the inferotemporal cortex come to exhibit prediction suppression. The trailing image elicits only a weak response when presented following the leading image that preceded it during training. Induction of prediction suppression might depend either on the contiguity of the images, as determined by their co-occurrence and captured in the measure of joint probability P(A,B), or on their contingency, as determined by their correlation and as captured in the measures of conditional probability P(A|B) and P(B|A). To distinguish between these possibilities, we measured prediction suppression after imposing training regimens that held P(A,B) constant but varied P(A|B) and P(B|A). We found that reducing either P(A|B) or P(B|A) during training attenuated prediction suppression as measured during subsequent testing. We conclude that prediction suppression depends on contingency, as embodied in the predictive relations between the images, and not just on contiguity, as embodied in their co-occurrence. Copyright © 2016 the American Physiological Society.
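The dissociation the study exploits — holding the joint probability P(A,B) fixed while varying the conditionals — can be made concrete with a toy trial-count example (the counts are invented; the actual training regimens differ):

```python
# Two hypothetical training regimens with the same joint probability P(A,B)
# but different conditional probabilities P(A|B) and P(B|A).
def stats(n_ab, n_a_only, n_b_only, n_other):
    n = n_ab + n_a_only + n_b_only + n_other
    p_ab = n_ab / n                          # joint probability P(A,B)
    p_a_given_b = n_ab / (n_ab + n_b_only)   # P(A|B)
    p_b_given_a = n_ab / (n_ab + n_a_only)   # P(B|A)
    return p_ab, p_a_given_b, p_b_given_a

# Regimen 1: A and B always co-occur -> contingency is perfect.
r1 = stats(20, 0, 0, 80)
# Regimen 2: same number of A,B pairings, but A and B also appear alone,
# so both conditionals drop while P(A,B) stays at 0.2.
r2 = stats(20, 20, 20, 40)
print(r1)  # → (0.2, 1.0, 1.0)
print(r2)  # → (0.2, 0.5, 0.5)
```

If prediction suppression tracked contiguity alone, the two regimens would be equivalent; the finding that suppression weakens under regimen 2 is what implicates contingency.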
Directory of Open Access Journals (Sweden)
Abdalla Ahmed Abdel-Ghaly
2016-06-01
This paper suggests the use of the conditional probability integral transformation (CPIT) method as a goodness of fit (GOF) technique in the field of accelerated life testing (ALT), specifically for validating the underlying distributional assumption in the accelerated failure time (AFT) model. The method is based on transforming the data into independent and identically distributed (i.i.d.) Uniform(0, 1) random variables and then applying the modified Watson statistic to test the uniformity of the transformed random variables. This technique is used to validate each of the exponential, Weibull and lognormal distributional assumptions in the AFT model under constant stress and complete sampling. The performance of the CPIT method is investigated via a simulation study. It is concluded that the method performs well in the case of the exponential and lognormal distributions. Finally, a real life example is provided to illustrate the application of the proposed procedure.
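The core transformation step is easy to sketch: if the lifetimes really follow the assumed model, applying that model's CDF maps them to Uniform(0, 1). The rate parameter below is invented, and the modified Watson statistic is replaced by a simple moment check on the transformed sample.

```python
import math, random

# If T ~ Exponential(lam), then U = F(T) = 1 - exp(-lam * T) is Uniform(0, 1).
# A uniformity check on U then validates the distributional assumption.
random.seed(1)
lam = 2.0                                   # illustrative rate, not from the paper
sample = [random.expovariate(lam) for _ in range(5000)]
u = [1.0 - math.exp(-lam * t) for t in sample]

mean_u = sum(u) / len(u)
var_u = sum((x - mean_u) ** 2 for x in u) / len(u)
print(round(mean_u, 2), round(var_u, 3))    # near 0.5 and 1/12 ≈ 0.083
```

Under a wrong distributional assumption the transformed values pile up away from uniformity, which is what the Watson-type statistic detects formally.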
Directory of Open Access Journals (Sweden)
A. B. Levina
2016-03-01
Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices; error detection codes allow such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors, but they have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The error masking probability is a fundamental parameter of security-oriented codes; a detailed study of this parameter allows one to analyze the behavior of the error-correcting code when errors are injected into the encoding device. The complexity of the encoding function, in turn, plays an important role in security-oriented codes: encoding functions with lower computational complexity and a low masking probability best protect the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, increasing the computational complexity decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions of greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking
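The notion of error masking can be illustrated on a tiny example (the code below is mine, not one from the paper): for a code C, an additive error e goes undetected for input codeword c exactly when c ⊕ e is again a codeword. For a linear code this happens for every input at once whenever e is itself a codeword, which is why its masking probability is concentrated at 1 on a few errors.

```python
from itertools import product

# Masking probability of each nonzero error over uniformly chosen codewords:
# error e is masked for input c when c XOR e is also a codeword.
def masking_profile(codewords):
    cw = set(codewords)
    n = len(next(iter(cw)))
    profile = {}
    for e in product((0, 1), repeat=n):
        if not any(e):
            continue  # the zero error is not an error
        masked = sum(1 for c in cw
                     if tuple(a ^ b for a, b in zip(c, e)) in cw)
        profile[e] = masked / len(cw)
    return profile

# Linear [3,2] even-parity code: nonzero codeword errors are always masked,
# every other error is always detected -- an extreme, unsmoothed profile.
parity = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
prof = masking_profile(parity)
print(sorted(set(prof.values())))  # → [0.0, 1.0]
```

A nonlinear, security-oriented encoding would spread these values between 0 and 1, lowering the worst-case masking probability an attacker can exploit — the smoothing effect the abstract describes.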
On the method of logarithmic cumulants for parametric probability density function estimation.
Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane
2013-10-01
Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
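For one family in the discussion the MoLC idea is especially transparent. In the lognormal case the first two log-cumulants — the mean and variance of ln X — equal the parameters μ and σ² directly, so estimation reduces to empirical log-moments; other families (e.g. generalized gamma) require polygamma inversions not shown in this sketch.

```python
import math, random

# MoLC for the lognormal family: estimate (mu, sigma) from the first two
# empirical log-cumulants. True parameters below are chosen for the demo.
random.seed(7)
mu, sigma = 1.0, 0.5
x = [math.exp(random.gauss(mu, sigma)) for _ in range(20000)]

logs = [math.log(v) for v in x]
k1 = sum(logs) / len(logs)                          # first log-cumulant -> mu
k2 = sum((l - k1) ** 2 for l in logs) / len(logs)   # second log-cumulant -> sigma^2

print(round(k1, 1), round(math.sqrt(k2), 1))  # → 1.0 0.5
```

For heavy-tailed data the log-moments exist even when ordinary moments diverge, which is one reason MoLC is attractive for SAR amplitude statistics.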
Quadratic Functionals with General Boundary Conditions
International Nuclear Information System (INIS)
Dosla, Z.; Dosly, O.
1997-01-01
The purpose of this paper is to give the Reid 'Roundabout Theorem' for quadratic functionals with general boundary conditions. In particular, we describe the so-called coupled point and regularity condition introduced in terms of Riccati equation solutions
Rainfall and net infiltration probabilities for future climate conditions at Yucca Mountain
International Nuclear Information System (INIS)
Long, A.; Childs, S.W.
1993-01-01
Performance assessment of repository integrity is a difficult task because it requires predicting the future. This challenge has occupied many scientists who realize that the best assessments are required to maximize the probability of successful repository siting and design. As part of a performance assessment effort directed by the EPRI, the authors have used probabilistic methods to assess the magnitude and timing of net infiltration at Yucca Mountain. A previously published mathematical model for net infiltration incorporated a probabilistic treatment of climate, surface hydrologic processes and a mathematical model of the infiltration process. In this paper, we present the details of the climatological analysis. The precipitation model is event-based, simulating characteristics of modern rainfall near Yucca Mountain, and then extends to the most likely values for different degrees of pluvial climate. Next, the precipitation event model is fed into a process-based infiltration model that considers spatial variability in parameters relevant to net infiltration of Yucca Mountain. The model predicts that average annual net infiltration at Yucca Mountain will range from a mean of about 1 mm under present climatic conditions to a mean of at least 2.4 mm under full glacial (pluvial) conditions. Considerable variations about these means are expected to occur from year to year
Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation
Directory of Open Access Journals (Sweden)
Michal Halas
2012-01-01
This article deals with modelling the probability density function of IPTV traffic packet delay variation. Such modelling is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems and other influences such as the processing delay of the network nodes. To separate these (at least) three types of delay variation, we need a way to measure each type separately. This work is aimed at the delay variation caused by queueing systems, which has the main influence on the form of the probability density function.
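The queueing contribution to delay variation can be sketched with a minimal M/M/1-style simulation whose per-packet sojourn times form an empirical delay distribution. The arrival and service rates below are illustrative, not taken from the article.

```python
import random

# M/M/1-style queue: Poisson arrivals, exponential service, one server.
# Each packet's sojourn time (waiting + service) contributes to the
# empirical delay distribution that a de-jitter buffer must absorb.
random.seed(3)
lam, mu = 0.8, 1.0            # arrival and service rates (load 0.8)
t_arrive, t_free = 0.0, 0.0
delays = []
for _ in range(20000):
    t_arrive += random.expovariate(lam)
    start = max(t_arrive, t_free)     # wait if the server is still busy
    t_free = start + random.expovariate(mu)
    delays.append(t_free - t_arrive)  # sojourn time of this packet

mean_delay = sum(delays) / len(delays)
print(round(mean_delay, 1))           # M/M/1 theory: 1/(mu - lam) = 5.0
```

A histogram of `delays` approximates the queueing part of the delay-variation PDF; sizing the de-jitter buffer to a high quantile of that distribution is the estimation problem the article addresses.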
Directory of Open Access Journals (Sweden)
Cinicioglu Esma Nur
2014-01-01
Dempster–Shafer belief function theory can address a wider class of uncertainty than standard probability theory does, and this fact appeals to researchers in the operations research community looking for potential application areas. However, the lack of a decision theory for belief functions gives rise to the need to use probability transformation methods for decision making. For the representation of statistical evidence, the class of consonant belief functions is used, which is not closed under Dempster's rule of combination but is closed under Walley's rule of combination. In this research, it is shown that the outcomes obtained using Dempster's and Walley's rules result in different probability distributions when the pignistic transformation is used; however, when the plausibility transformation is used, they result in the same probability distribution. This shows that the choice of the combination rule and probability transformation method may have a significant effect on decision making, since it may change the decision alternative selected. This result is illustrated via an example of missile type identification.
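The two transformations named above can be computed for a small mass function on the frame {a, b, c}; the masses are invented for illustration and are not the article's missile-identification example.

```python
# Mass function on frame {a, b, c} (focal elements and masses are invented).
m = {frozenset('a'): 0.4, frozenset('ab'): 0.3, frozenset('abc'): 0.3}

# Pignistic transformation: split each focal mass equally over its elements.
betp = {x: sum(v / len(A) for A, v in m.items() if x in A) for x in 'abc'}

# Plausibility transformation: normalize the singleton plausibilities.
pl = {x: sum(v for A, v in m.items() if x in A) for x in 'abc'}
total = sum(pl.values())
pl_p = {x: pl[x] / total for x in 'abc'}

print({x: round(p, 3) for x, p in betp.items()})  # → {'a': 0.65, 'b': 0.25, 'c': 0.1}
print({x: round(p, 3) for x, p in pl_p.items()})  # → {'a': 0.526, 'b': 0.316, 'c': 0.158}
```

The two transforms disagree on how much probability the singleton 'a' receives, which is exactly why the choice of transformation (and of combination rule upstream) can flip the selected decision alternative.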
Testing the Conditional Mean Function of Autoregressive Conditional Duration Models
DEFF Research Database (Denmark)
Hautsch, Nikolaus
This paper proposes a dynamic proportional hazard (PH) model with non-specified baseline hazard for the modelling of autoregressive duration processes. A categorization of the durations allows us to reformulate the PH model as an ordered response model based on extreme value distributed errors... be subject to censoring structures. In an empirical study based on financial transaction data we present an application of the model to estimate conditional asset price change probabilities. Evaluating the forecasting properties of the model, it is shown that the proposed approach is a promising competitor...
A Novel Adaptive Conditional Probability-Based Predicting Model for User’s Personality Traits
Directory of Open Access Journals (Sweden)
Mengmeng Wang
2015-01-01
With the pervasive increase in social media use, the explosion of user-generated data provides a potentially very rich source of information, which plays an important role in helping online researchers understand users' behaviors deeply. Since users' personality traits are the driving force of their behaviors, in this paper we first extract, along with social network features, linguistic features, emotional statistical features, and topic features from users' Facebook status updates, and then quantify the importance of these features via the Kendall correlation coefficient. Then, on the basis of weighted features and dynamically updated thresholds of personality traits, we deploy a novel adaptive conditional probability-based predicting model, which considers prior knowledge of correlations between personality traits, to predict users' Big Five personality traits. In the experimental work, we explore the existence of correlations between users' personality traits, which provides theoretical support for our proposed method. Moreover, on the same Facebook dataset, compared to other methods, our method achieves an F1-measure of 80.6% when taking correlations between personality traits into account, an impressive improvement of 5.8% over other approaches.
International Nuclear Information System (INIS)
Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing
2012-01-01
In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information about the variables is extracted with some pre-sampling of points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with a combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
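The two-stage idea — pre-sample to locate the failure region, then sample from a density centred there — can be sketched on a toy limit state. Here g(x) = x − 3.5 with x ~ N(0, 1) stands in for the passive-system model; all numbers are illustrative.

```python
import math, random

random.seed(5)
g = lambda x: x - 3.5                        # failure when g(x) >= 0

# Stage 1: crude pre-sampling to locate the failure region.
pre = [x for x in (random.gauss(0, 1) for _ in range(200000)) if g(x) >= 0]
mu_is = sum(pre) / len(pre)                  # centre importance density on failures

# Stage 2: importance sampling from N(mu_is, 1) with likelihood-ratio weights.
def phi(x, m):
    return math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)

n, est = 50000, 0.0
for _ in range(n):
    x = random.gauss(mu_is, 1.0)
    if g(x) >= 0:
        est += phi(x, 0.0) / phi(x, mu_is)   # weight = target pdf / proposal pdf
est /= n

exact = 0.5 * math.erfc(3.5 / math.sqrt(2))  # true P(X >= 3.5)
print(round(est / exact, 1))                 # ratio near 1.0
```

Crude Monte Carlo would need millions of samples for this ~2×10⁻⁴ probability; shifting the sampling density into the failure region recovers it accurately with far fewer evaluations, which is the efficiency gain the abstract claims.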
Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function
Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.
2011-01-01
In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.
Probability density of wave function of excited photoelectron: understanding XANES features
Czech Academy of Sciences Publication Activity Database
Šipr, Ondřej
2001-01-01
Roč. 8, - (2001), s. 232-234 ISSN 0909-0495 R&D Projects: GA ČR GA202/99/0404 Institutional research plan: CEZ:A02/98:Z1-010-914 Keywords : XANES * PED - probability density of wave function Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.519, year: 2001
Bounds for the probability distribution function of the linear ACD process
Fernandes, Marcelo
2003-01-01
This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.
DEFF Research Database (Denmark)
Falk, Anne Katrine Vinther; Gryning, Sven-Erik
1997-01-01
In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials...
Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.
Allen, Jeff; Ghattas, Andrew
2016-06-01
Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
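The Bayes-theorem step the article builds on can be written out directly (the authors' contribution is an estimator that replaces these inputs with estimable quantities; the numbers below are invented):

```python
# Posterior probability of copying given an observed answer-copying statistic,
# via Bayes' theorem. Inputs: population copying rate and the statistic's
# density under copying and under no copying at the observed value.
def posterior_copying(prior, dens_copy, dens_no_copy):
    num = prior * dens_copy
    return num / (num + (1 - prior) * dens_no_copy)

# Example: 2% of examinee pairs copy, and the observed statistic value is
# 30 times more likely under copying than under no copying.
p = posterior_copying(prior=0.02, dens_copy=0.30, dens_no_copy=0.01)
print(round(p, 2))  # → 0.38
```

The example shows why a small p value alone is not the answer: even strong evidence (a 30:1 likelihood ratio) yields a posterior well below certainty when copying is rare in the population.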
Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.
Directory of Open Access Journals (Sweden)
Michael R W Dawson
Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
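A minimal probability-matching simulation in the spirit of this work (the article's networks and training regimens are richer): a single sigmoid output unit trained with the delta rule on a cue rewarded 70% of the time converges toward an output near 0.7 rather than toward 1.

```python
import random, math

# One cue, rewarded with probability 0.7; delta-rule training drives the
# sigmoid output toward the reward probability (probability matching).
random.seed(11)
w, b = 0.0, 0.0
rate, p_reward = 0.02, 0.7
for _ in range(50000):
    reward = 1.0 if random.random() < p_reward else 0.0
    out = 1.0 / (1.0 + math.exp(-(w + b)))  # cue input is 1, so net = w + b
    w += rate * (reward - out)              # delta rule on the cue weight
    b += rate * (reward - out)              # and on the bias

out = 1.0 / (1.0 + math.exp(-(w + b)))
print(round(out, 1))
```

Because the delta rule pushes the output toward the running mean of the reward signal, the unit's activity settles near the reward probability, which is the matching behavior the abstracts describe.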
Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.
2017-12-01
Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
Probability distribution for the Gaussian curvature of the zero level surface of a random function
Hannay, J. H.
2018-04-01
A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.
Mo, Xueyin; Zhang, Jinglu; Fan, Yuan; Svensson, Peter; Wang, Kelun
2015-01-01
To explore the hypothesis that burning mouth syndrome (BMS) is probably a neuropathic pain condition, thermal and mechanical sensory and pain thresholds were tested and compared with age- and gender-matched control participants using a standardized battery of psychophysical techniques. Twenty-five BMS patients (men: 8, women: 17, age: 49.5 ± 11.4 years) and 19 age- and gender-matched healthy control participants were included. The cold detection threshold (CDT), warm detection threshold (WDT), cold pain threshold (CPT), heat pain threshold (HPT), mechanical detection threshold (MDT) and mechanical pain threshold (MPT), in accordance with the German Network of Neuropathic Pain guidelines, were measured at the following four sites: the dorsum of the left hand (hand), the skin at the mental foramen (chin), the tip of the tongue (tongue), and the mucosa of the lower lip (lip). Statistical analysis was performed using ANOVA with repeated measures to compare the means within and between groups. Furthermore, Z-score profiles were generated, and exploratory correlation analyses between QST and clinical variables were performed. Two-tailed tests with a significance level of 5% were used throughout. CDTs (P < 0.02) were significantly lower (less sensitivity) and HPTs (P < 0.001) were significantly higher (less sensitivity) at the tongue and lip in BMS patients compared to control participants. WDT (P = 0.007) was also significantly higher at the tongue in BMS patients compared to control subjects. There were no significant differences in MDT and MPT between the BMS patients and healthy subjects at any of the four test sites. Z-scores showed that significant loss of function can be identified for CDT (Z-score = -0.9 ± 1.1) and HPT (Z-score = 1.5 ± 0.4). There were no significant correlations between QST and clinical variables (pain intensity, duration, depression scores). BMS patients had a significant loss of thermal function but not
International Nuclear Information System (INIS)
Oliveira, P.M.C. de.
1976-12-01
A method for calculating the K atomic shell ionization probability by heavy-particle impact, in the semi-classical approximation, is presented. In this approximation, the projectile follows a classical trajectory. The potential energy due to the projectile is taken as a perturbation of the Hamiltonian of the neutral atom. We use scaled Thomas-Fermi wave functions for the atomic electrons. The method is valid for elements of intermediate atomic number and particle energies of some MeV. Probabilities are calculated for the case of Ag (Z = 47) and protons of 1 and 2 MeV. Results are given as a function of scattering angle; they agree well with known experimental data and also improve on older calculations. (Author) [pt
Electron-trapping probability in natural dosemeters as a function of irradiation temperature
DEFF Research Database (Denmark)
Wallinga, J.; Murray, A.S.; Wintle, A.G.
2002-01-01
The electron-trapping probability in OSL traps as a function of irradiation temperature is investigated for sedimentary quartz and feldspar. A dependency was found for both minerals; this phenomenon could give rise to errors in dose estimation when the irradiation temperature used in laboratory procedures is different from that in the natural environment. No evidence was found for the existence of shallow-trap saturation effects that could give rise to a dose-rate dependency of electron trapping.
Bakosi, J.; Franzese, P.; Boybeyi, Z.
2010-01-01
Dispersion of a passive scalar from concentrated sources in fully developed turbulent channel flow is studied with the probability density function (PDF) method. The joint PDF of velocity, turbulent frequency and scalar concentration is represented by a large number of Lagrangian particles. A stochastic near-wall PDF model combines the generalized Langevin model of Haworth & Pope with Durbin's method of elliptic relaxation to provide a mathematically exact treatment of convective and viscous ...
Moser , Gabriele; Zerubia , Josiane; Serpico , Sebastiano B.
2006-01-01
In remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of the pixel intensities. This paper deals with the problem of probability density function (pdf) estimation in the context of synthetic aperture radar (SAR) amplitude data analysis. Several theoretical and heuristic models for the pdfs of SAR data have been proposed in the literature, which have been proved to be effective for different land-cov...
1981-05-01
Directory of Open Access Journals (Sweden)
Veronica Biazzo
2000-05-01
In this paper, we illustrate an implementation in Maple V of procedures that allow one to propagate precise and imprecise probability assessments exactly. The extension to imprecise assessments is based on a suitable generalization of de Finetti's concept of coherence. The procedures described are supported by examples and relevant cases.
Systematics of the breakup probability function for {sup 6}Li and {sup 7}Li projectiles
Energy Technology Data Exchange (ETDEWEB)
Capurro, O.A., E-mail: capurro@tandar.cnea.gov.ar [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); Pacheco, A.J.; Arazi, A. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Carnelli, P.F.F. [CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); Fernández Niello, J.O. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); and others
2016-01-15
Experimental non-capture breakup cross sections can be used to determine the probability of projectile and ejectile fragmentation in nuclear reactions involving weakly bound nuclei. Recently, the probability of both types of dissociation has been analyzed in nuclear reactions involving {sup 9}Be projectiles on various heavy targets at sub-barrier energies. In the present work we extend this kind of systematic analysis to the case of {sup 6}Li and {sup 7}Li projectiles with the purpose of investigating general features of projectile-like breakup probabilities for reactions induced by stable weakly bound nuclei. For that purpose we have obtained the probabilities of projectile and ejectile breakup for a large number of systems, starting from a compilation of the corresponding reported non-capture breakup cross sections. We parametrize the results in accordance with the previous studies for the case of beryllium projectiles, and we discuss their systematic behavior as a function of the projectile, the target mass and the reaction Q-value.
Wave functions and two-electron probability distributions of the Hooke's-law atom and helium
International Nuclear Information System (INIS)
O'Neill, Darragh P.; Gill, Peter M. W.
2003-01-01
The Hooke's-law atom (hookium) provides an exactly soluble model for a two-electron atom in which the nuclear-electron Coulombic attraction has been replaced by a harmonic one. Starting from the known exact position-space wave function for the ground state of hookium, we present the momentum-space wave function. We also look at the intracules, two-electron probability distributions, for hookium in position, momentum, and phase space. These are compared with the Hartree-Fock results and the Coulomb holes (the difference between the exact and Hartree-Fock intracules) in position, momentum, and phase space are examined. We then compare these results with analogous results for the ground state of helium using a simple, explicitly correlated wave function
International Nuclear Information System (INIS)
Watterson, Ian G.
2007-01-01
Full text: The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of the global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte-Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be simply obtained. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections
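The direct-integration step can be sketched numerically. The following is a minimal illustration, not the CSIRO implementation: it assumes hypothetical beta-distributed PDFs for the global mean warming and for the local change per degree (all ranges and shape parameters are invented), and integrates their product across the joint probability space to evaluate the CDF of the net change at a threshold.

```python
import math

def scaled_beta_pdf(x, a, b, lo, hi):
    # Beta(a, b) density rescaled from [0, 1] to the interval [lo, hi]
    if not (lo < x < hi):
        return 0.0
    t = (x - lo) / (hi - lo)
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * t ** (a - 1) * (1.0 - t) ** (b - 1) / (hi - lo)

def product_cdf(threshold, pdf_g, g_rng, pdf_s, s_rng, n=200):
    # P(G * S <= threshold): direct integration of the product of the two
    # PDFs over the joint probability space (midpoint rule on an n x n grid),
    # assuming the two uncertainties are independent.
    (glo, ghi), (slo, shi) = g_rng, s_rng
    dg, ds = (ghi - glo) / n, (shi - slo) / n
    total = 0.0
    for i in range(n):
        g = glo + (i + 0.5) * dg
        wg = pdf_g(g) * dg
        for j in range(n):
            s = slo + (j + 0.5) * ds
            if g * s <= threshold:
                total += wg * pdf_s(s) * ds
    return total

# Hypothetical inputs: global mean warming G in [1, 3] K at 2070, and
# local precipitation change per degree S in [-15, 5] percent per K.
pdf_g = lambda g: scaled_beta_pdf(g, 2.5, 2.5, 1.0, 3.0)
pdf_s = lambda s: scaled_beta_pdf(s, 2.0, 2.0, -15.0, 5.0)
# probability of a net precipitation decrease, i.e. CDF at zero
p_decrease = product_cdf(0.0, pdf_g, (1.0, 3.0), pdf_s, (-15.0, 5.0))
```

The median and the 10th/90th percentiles quoted in the abstract would follow by scanning the threshold until the CDF crosses 0.5, 0.1 and 0.9.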
International Nuclear Information System (INIS)
Fraassen, B.C. van
1979-01-01
The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)
Exact joint density-current probability function for the asymmetric exclusion process.
Depken, Martin; Stinchcombe, Robin
2004-07-23
We study the asymmetric simple exclusion process with open boundaries and derive the exact form of the joint probability function for the occupation number and the current through the system. We further consider the thermodynamic limit, showing that the resulting distribution is non-Gaussian and that the density fluctuations have a discontinuity at the continuous phase transition, while the current fluctuations are continuous. The derivations are performed by using the standard operator algebraic approach and by the introduction of new operators satisfying a modified version of the original algebra. Copyright 2004 The American Physical Society
International Nuclear Information System (INIS)
Misawa, T.; Itakura, H.
1995-01-01
The present article focuses on a dynamical simulation of molecular motion in liquids. In the simulation involving diffusion-controlled reaction with discrete time steps, lack of information regarding the trajectory within the time step may result in a failure to count the number of reactions of the particles within the step. In order to rectify this, an interpolated diffusion process is used. The process is derived from a stochastic interpolation formula recently developed by the first author [J. Math. Phys. 34, 775 (1993)]. In this method, the probability that reaction has occurred during the time step given the initial and final positions of the particles is calculated. Some numerical examples confirm that the theoretical result corresponds to an improvement over the Clifford-Green work [Mol. Phys. 57, 123 (1986)] on the same matter
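The central conditional probability here, namely that a particle diffusing between two recorded positions touched a reaction boundary inside the time step, has a classical closed form for a one-dimensional Brownian bridge. The sketch below is not the stochastic interpolation formula of the paper; it is the standard bridge-crossing result, P = exp(-(x0 - R)(x1 - R)/(D Δt)) for endpoints above a barrier R and diffusion coefficient D (variance 2D per unit time), checked against a Monte Carlo bridge simulation.

```python
import math, random

def crossing_prob(x0, x1, barrier, D, dt):
    # Probability that a Brownian path with diffusion coefficient D
    # touched the barrier inside the step, given endpoints x0 and x1.
    if x0 <= barrier or x1 <= barrier:
        return 1.0
    return math.exp(-(x0 - barrier) * (x1 - barrier) / (D * dt))

def mc_crossing_prob(x0, x1, barrier, D, dt, n_paths=3000, n_sub=150):
    # Monte Carlo check: build Brownian bridges from x0 to x1 and count
    # how many dip below the barrier (discretisation slightly undercounts).
    hits = 0
    for _ in range(n_paths):
        w, path = 0.0, [0.0]
        for _ in range(n_sub):
            w += random.gauss(0.0, math.sqrt(2.0 * D * dt / n_sub))
            path.append(w)
        w_end = path[-1]
        crossed = any(
            x0 + (x1 - x0) * k / n_sub + (path[k] - w_end * k / n_sub) <= barrier
            for k in range(n_sub + 1)
        )
        hits += crossed
    return hits / n_paths

random.seed(1)
analytic = crossing_prob(1.0, 0.8, 0.0, 1.0, 0.5)
estimate = mc_crossing_prob(1.0, 0.8, 0.0, 1.0, 0.5)
```

With these numbers the closed form gives exp(-1.6), and the simulated bridges agree up to sampling and discretisation error.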
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
International Nuclear Information System (INIS)
Moriya, Netzer
2010-01-01
A method, based on binomial filtering, to estimate the noise level of an arbitrary, smoothed pure signal, contaminated with an additive, uncorrelated noise component is presented. If the noise characteristics of the experimental spectrum are known, as for instance the type of the corresponding probability density function (e.g., Gaussian), the noise properties can be extracted. In such cases, both the noise level, as may arbitrarily be defined, and a simulated white noise component can be generated, such that the simulated noise component is statistically indistinguishable from the true noise component present in the original signal. In this paper we present a detailed analysis of the noise level extraction when the additive noise is Gaussian or Lorentzian. We show that the statistical parameters in these cases (mainly the variance and the half width at half maximum, respectively) can directly be obtained from the experimental spectrum even when the pure signal is erratic. Further discussion is given for cases where the noise probability density function is initially unknown.
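A minimal variant of this idea can be sketched with a repeated finite-difference filter (a close cousin of binomial filtering, not the paper's exact procedure): differencing annihilates the smooth signal while scaling white noise by a known binomial factor, so the noise level can be read off the filtered variance.

```python
import math, random

def noise_std(y, k=3):
    # Apply the finite-difference operator k times: it annihilates slowly
    # varying signal while mapping white noise of std sigma to a sequence
    # with variance C(2k, k) * sigma**2 (the sum of squared binomial weights).
    d = list(y)
    for _ in range(k):
        d = [b - a for a, b in zip(d, d[1:])]
    var = sum(v * v for v in d) / len(d)
    return math.sqrt(var / math.comb(2 * k, k))

# Smooth (erratic-looking is fine too) signal plus Gaussian noise of known level
random.seed(0)
true_sigma = 0.1
signal = [math.sin(0.01 * i) + random.gauss(0.0, true_sigma) for i in range(5000)]
estimate = noise_std(signal)
```

The same estimate on a noise-free signal returns essentially zero, which is the sanity check for the "smooth pure signal" assumption.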
Audio Query by Example Using Similarity Measures between Probability Density Functions of Features
Directory of Open Access Journals (Sweden)
Marko Helén
2010-01-01
This paper proposes a query by example system for generic audio. We estimate the similarity of the example signal and the samples in the queried database by calculating the distance between the probability density functions (pdfs) of their frame-wise acoustic features. Since the features are continuous valued, we propose to model them using Gaussian mixture models (GMMs) or hidden Markov models (HMMs). The models parametrize each sample efficiently and retain sufficient information for similarity measurement. To measure the distance between the models, we apply a novel Euclidean distance, approximations of Kullback-Leibler divergence, and a cross-likelihood ratio test. The performance of the measures was tested in simulations where audio samples are automatically retrieved from a general audio database, based on the estimated similarity to a user-provided example. The simulations show that the distance between probability density functions is an accurate measure of similarity. Measures based on GMMs or HMMs are shown to produce better results than the existing methods based on simpler statistics or histograms of the features. A good performance with low computational cost is obtained with the proposed Euclidean distance.
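Of the distances listed, the Kullback-Leibler divergence between two GMMs has no closed form, so a Monte Carlo approximation is commonly used. The sketch below is a minimal one-dimensional illustration with hand-picked mixtures, not the paper's system or feature set.

```python
import math, random

class GMM1D:
    """Minimal one-dimensional Gaussian mixture model."""
    def __init__(self, weights, means, stds):
        self.comps = list(zip(weights, means, stds))
    def pdf(self, x):
        return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))
                   for w, m, s in self.comps)
    def sample(self):
        r, acc = random.random(), 0.0
        for w, m, s in self.comps:
            acc += w
            if r <= acc:
                return random.gauss(m, s)
        return random.gauss(self.comps[-1][1], self.comps[-1][2])

def symmetric_kl(p, q, n=5000):
    # Estimate each direction by Monte Carlo:
    # KL(p||q) ~ mean of log(p(x)/q(x)) over samples x ~ p.
    kl_pq = sum(math.log(p.pdf(x) / q.pdf(x)) for x in (p.sample() for _ in range(n))) / n
    kl_qp = sum(math.log(q.pdf(x) / p.pdf(x)) for x in (q.sample() for _ in range(n))) / n
    return kl_pq + kl_qp

random.seed(2)
p = GMM1D([0.5, 0.5], [0.0, 3.0], [1.0, 1.0])
near = GMM1D([0.5, 0.5], [0.2, 3.2], [1.0, 1.0])
far = GMM1D([0.5, 0.5], [5.0, 9.0], [1.0, 1.0])
d_near, d_far = symmetric_kl(p, near), symmetric_kl(p, far)
```

As expected, the divergence to a slightly shifted mixture is much smaller than to a distant one, which is the property a retrieval system relies on.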
Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.
2018-05-01
As renewable energies are increasingly integrated into power systems, there is increasing interest in stochastic analysis of power systems. Better techniques should be developed to account for the uncertainty caused by penetration of renewables and consequently analyse its impacts on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of the power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Subject to such random excitation, the Joint Probability Density Function (JPDF) solution for the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. To solve this equation, a numerical method is adopted. Special measures are taken so that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered in a single machine infinite bus power system. The numerical analysis gives the same result as the Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
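As a minimal sketch of the setup (not the generalized FPK solver of the paper), the single-machine SDE can be integrated by Euler-Maruyama and the stationary statistics of the angular velocity compared with the analytic value sigma^2/(2d) implied by the Gibbs-type stationary JPDF of this damped Hamiltonian system; all parameter values below are invented, and the mechanical power input is set to zero for simplicity.

```python
import math, random

def simulate_swing(d=0.5, sigma=0.2, dt=0.01, n_steps=200_000, seed=3):
    # Euler-Maruyama for  d(delta) = omega dt,
    # d(omega) = (-sin(delta) - d*omega) dt + sigma dW
    random.seed(seed)
    delta, omega = 0.0, 0.0
    omegas = []
    sdt = math.sqrt(dt)
    for _ in range(n_steps):
        d_delta = omega * dt
        d_omega = (-math.sin(delta) - d * omega) * dt + sigma * sdt * random.gauss(0.0, 1.0)
        delta += d_delta
        omega += d_omega
        omegas.append(omega)
    return omegas

omegas = simulate_swing()
var_omega = sum(w * w for w in omegas) / len(omegas)
# The stationary JPDF factorises, making omega Gaussian with variance sigma^2/(2d)
predicted = 0.2 ** 2 / (2 * 0.5)
```

This is exactly the kind of Monte Carlo cross-check the abstract describes for the assumed-PDF solution.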
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
International Nuclear Information System (INIS)
Viana, R.S.; Yoriyaz, H.; Santos, A.
2011-01-01
The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimation, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of the E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the E-M iteration, the conditional probability distribution plays a very important role in achieving high image quality. The present work proposes an alternative methodology for generating the conditional probability distribution associated with the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and applying the reciprocity theorem. (author)
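The abstract does not spell out the update rule, but the E-M formulation for emission tomography leads to the standard MLEM iteration. The sketch below uses an invented 3-detector/3-voxel system matrix standing in for the MCNP5-computed conditional probabilities.

```python
def mlem(counts, sysmat, n_iter=2000):
    # Standard MLEM update:
    #   x_v <- (x_v / sens_v) * sum_d A[d][v] * counts[d] / (A x)[d]
    n_det, n_vox = len(sysmat), len(sysmat[0])
    sens = [sum(sysmat[d][v] for d in range(n_det)) for v in range(n_vox)]
    x = [1.0] * n_vox
    for _ in range(n_iter):
        proj = [sum(sysmat[d][v] * x[v] for v in range(n_vox)) for d in range(n_det)]
        x = [x[v] / sens[v] * sum(sysmat[d][v] * counts[d] / proj[d] for d in range(n_det))
             for v in range(n_vox)]
    return x

# Toy problem: hypothetical conditional probabilities A[d][v] that a decay
# in voxel v is detected in detector d, and noiseless measured counts.
A = [[0.8, 0.1, 0.1],
     [0.1, 0.8, 0.1],
     [0.1, 0.1, 0.8]]
x_true = [1.0, 2.0, 3.0]
counts = [sum(A[d][v] * x_true[v] for v in range(3)) for d in range(3)]
x_rec = mlem(counts, A)
```

With consistent noiseless data the iteration recovers the true activity; the quality of the system matrix (here, what MCNP5 would supply) is what controls reconstruction quality in practice.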
Energy Technology Data Exchange (ETDEWEB)
Viana, R.S.; Yoriyaz, H.; Santos, A., E-mail: rodrigossviana@gmail.com, E-mail: hyoriyaz@ipen.br, E-mail: asantos@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)
2011-07-01
The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimation, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of the E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the E-M iteration, the conditional probability distribution plays a very important role in achieving high image quality. The present work proposes an alternative methodology for generating the conditional probability distribution associated with the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and applying the reciprocity theorem. (author)
Development and evaluation of probability density functions for a set of human exposure factors
Energy Technology Data Exchange (ETDEWEB)
Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.
1999-06-01
The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors.
Development and evaluation of probability density functions for a set of human exposure factors
International Nuclear Information System (INIS)
Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.
1999-01-01
The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors
Kendal, W S
2000-04-01
To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
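The composite-PGF argument can be checked by simulating the three elementary processes directly: since a Bernoulli thinning of a Poisson process is Poisson, the lethal-lesion count is Poisson with mean αD + βD², where α is the irreparable intensity and β the product of the repairable intensity and the conversion rate, giving S(D) = exp(-(αD + βD²)). The parameter values below are hypothetical, and the Poisson sampler is Knuth's elementary method.

```python
import math, random

def sample_poisson(lam):
    # Knuth's method; fine for the small means used here
    if lam <= 0.0:
        return 0
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def mc_survival(dose, a_irrep, a_rep, conv_per_dose, n_cells=50_000):
    # Lethal lesions = irreparable Poisson(a_irrep*D) lesions plus those of
    # the Poisson(a_rep*D) repairable lesions that are lethally converted,
    # each with Bernoulli probability conv_per_dose*D. A cell survives iff
    # it carries zero lethal lesions.
    survivors = 0
    for _ in range(n_cells):
        lethal = sample_poisson(a_irrep * dose)
        for _ in range(sample_poisson(a_rep * dose)):
            if random.random() < conv_per_dose * dose:
                lethal += 1
        survivors += (lethal == 0)
    return survivors / n_cells

random.seed(4)
dose, a_irrep, a_rep, conv = 2.0, 0.15, 0.5, 0.05
analytic = math.exp(-(a_irrep * dose + a_rep * conv * dose ** 2))  # linear-quadratic law
empirical = mc_survival(dose, a_irrep, a_rep, conv)
```

The simulated surviving fraction matches the linear-quadratic prediction within Monte Carlo error.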
Homfray, Virginia; Tanton, Clare; Mitchell, Kirstin R; Miller, Robert F; Field, Nigel; Macdowall, Wendy; Wellings, Kaye; Sonnenberg, Pam; Johnson, Anne M; Mercer, Catherine H
2015-07-17
Despite biological advantages of male circumcision in reducing HIV/sexually transmitted infection acquisition, concern is often expressed that it may reduce sexual enjoyment and function. We examine the association between circumcision and sexual function among sexually active men in Britain using data from Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3). Natsal-3 asked about circumcision and included a validated measure of sexual function, the Natsal-SF, which takes into account not only sexual difficulties but also the relationship context and overall level of satisfaction. A stratified probability survey of 6293 men and 8869 women aged 16-74 years, resident in Britain, undertaken 2010-2012, using computer-assisted face-to-face interviewing with computer-assisted self-interview for the more sensitive questions. Logistic regression was used to calculate odds ratios (ORs) to examine the association between reporting male circumcision and aspects of sexual function among sexually active men (n = 4816). The prevalence of male circumcision in Britain was 20.7% [95% confidence interval (CI): 19.3-21.8]. There was no association between male circumcision and, being in the lowest quintile of scores for the Natsal-SF, an indicator of poorer sexual function (adjusted OR: 0.95, 95% CI: 0.76-1.18). Circumcised men were as likely as uncircumcised men to report the specific sexual difficulties asked about in Natsal-3, except that a larger proportion of circumcised men reported erectile difficulties. This association was of borderline statistical significance after adjusting for age and relationship status (adjusted OR: 1.27, 95% CI: 0.99-1.63). Data from a large, nationally representative British survey suggest that circumcision is not associated with men's overall sexual function at a population level.
Probability density functions of photochemicals over a coastal area of Northern Italy
International Nuclear Information System (INIS)
Georgiadis, T.; Fortezza, F.; Alberti, L.; Strocchi, V.; Marani, A.; Dal Bo', G.
1998-01-01
The present paper surveys the findings of experimental studies and analyses of statistical probability density functions (PDFs) applied to air pollutant concentrations to provide an interpretation of the ground-level distributions of photochemical oxidants in the coastal area of Ravenna (Italy). The atmospheric-pollution data set was collected from the local environmental monitoring network for the period 1978-1989. Results suggest that the statistical distribution of surface ozone, once normalised over the solar radiation PDF for the whole measurement period, follows a log-normal law as found for other pollutants. Although the Weibull distribution also offers a good fit of the experimental data, the area's meteorological features seem to favour the former distribution once the statistical index estimates have been analysed. Local transport phenomena are discussed to explain the data tail trends
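The log-normal fitting step can be sketched via the moments of the log-transformed data, with synthetic values standing in for the Ravenna concentration measurements.

```python
import math, random, statistics

def fit_lognormal(xs):
    # If X is log-normal, ln X is normal: estimate mu and sigma
    # from the sample moments of the log-transformed data.
    logs = [math.log(x) for x in xs]
    return statistics.fmean(logs), statistics.pstdev(logs)

def lognormal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

# Synthetic "concentration" data with known parameters
random.seed(5)
data = [math.exp(random.gauss(3.0, 0.5)) for _ in range(20_000)]
mu, sigma = fit_lognormal(data)
median = math.exp(mu)  # the fitted distribution's median
```

A goodness-of-fit comparison against a Weibull alternative, as in the abstract, would then compare the two fitted CDFs against the empirical one.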
Lei, Youming; Zheng, Fan
2016-12-01
Stochastic chaos induced by diffusion processes, with identical spectral density but different probability density functions (PDFs), is investigated in selected lightly damped Hamiltonian systems. The threshold amplitude of diffusion processes for the onset of chaos is derived by using the stochastic Melnikov method together with a mean-square criterion. Two quasi-Hamiltonian systems, namely, a damped single pendulum and damped Duffing oscillator perturbed by stochastic excitations, are used as illustrative examples. Four different cases of stochastic processes are taken as the driving excitations. It is shown that in these two systems the spectral density of diffusion processes completely determines the threshold amplitude for chaos, regardless of the shape of their PDFs, Gaussian or otherwise. Furthermore, the mean top Lyapunov exponent is employed to verify analytical results. The results obtained by numerical simulations are in accordance with the analytical results. This demonstrates that the stochastic Melnikov method is effective in predicting the onset of chaos in the quasi-Hamiltonian systems.
The probability of return conditional on migration duration: evidence from Kosovo
Directory of Open Access Journals (Sweden)
Kotorri Mrika
2017-12-01
The aim of this paper is to conceptualise the migration duration decision within the expected utility maximisation framework, and from that to derive and estimate an empirical proposition. For this purpose, the conceptual framework in Kotorri (2015) is extended, in which households decide to return to the home country conditional on their migration duration. In the empirical analysis, the Cox proportional hazards model is employed. This analysis is the first to investigate migration duration based on a random sample stemming from the Kosovo census of population conducted in 2011. The findings suggest rather mixed support for the household approach. The hazard of return decreases with income, but not nonlinearly. The results indicate that household return migration behaviour is influenced by demographic characteristics, psychic income, and political factors.
On the evolution of the density probability density function in strongly self-gravitating systems
International Nuclear Information System (INIS)
Girichidis, Philipp; Konstandin, Lukas; Klessen, Ralf S.; Whitworth, Anthony P.
2014-01-01
The time evolution of the probability density function (PDF) of the mass density is formulated and solved for systems in free-fall using a simple approximate function for the collapse of a sphere. We demonstrate that a pressure-free collapse results in a power-law tail on the high-density side of the PDF. The slope quickly asymptotes to the functional form P_V(ρ) ∝ ρ^(−1.54) for the (volume-weighted) PDF and P_M(ρ) ∝ ρ^(−0.54) for the corresponding mass-weighted distribution. From the simple approximation of the PDF we derive analytic descriptions for mass accretion, finding that dynamically quiet systems with narrow density PDFs lead to retarded star formation and low star formation rates (SFRs). Conversely, strong turbulent motions that broaden the PDF accelerate the collapse, causing a bursting mode of star formation. Finally, we compare our theoretical work with observations. The measured SFRs are consistent with our model during the early phases of the collapse. Comparison of observed column density PDFs with those derived from our model suggests that observed star-forming cores are roughly in free-fall.
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments, Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables, The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples, Probability Distributions of Markov Chains; The First Step Analysis, Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations. Distributions in the General Case, Simulation: Distribution F...
Porto, Markus; Roman, H Eduardo
2002-04-01
We consider autoregressive conditional heteroskedasticity (ARCH) processes in which the variance σ²(y) depends linearly on the absolute value of the random variable y, σ²(y) = a + b|y|. While for the standard model, where σ²(y) = a + by², the corresponding probability distribution function (PDF) P(y) decays as a power law for |y| → ∞, in the linear case it decays exponentially, P(y) ≈ exp(−α|y|), with α = 2/b. We extend these results to the more general case σ²(y) = a + b|y|^q, with 0 < q < 2. When the history of the ARCH process is taken into account, the resulting PDF becomes a stretched exponential even for q = 1, with a stretching exponent β = 2/3, in much better agreement with the empirical data.
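A quick simulation illustrates the linear-variance model and confirms the stationarity identity E[y²] = a + b·E[|y|] that it implies (the exponential-tail result itself is taken from the paper, not tested here); the parameter values are arbitrary.

```python
import math, random

def simulate_linear_arch(a, b, n_steps, seed=6):
    # ARCH-type process with variance linear in |y|:
    #   y_n = sigma_n * eps_n,  sigma_n^2 = a + b * |y_{n-1}|,  eps_n ~ N(0, 1)
    random.seed(seed)
    y, ys = 0.0, []
    for _ in range(n_steps):
        sigma = math.sqrt(a + b * abs(y))
        y = sigma * random.gauss(0.0, 1.0)
        ys.append(y)
    return ys

ys = simulate_linear_arch(a=0.5, b=1.0, n_steps=200_000)
mean_sq = sum(y * y for y in ys) / len(ys)
mean_abs = sum(abs(y) for y in ys) / len(ys)
# In the stationary state, E[y^2] = E[sigma^2] = a + b * E[|y|] by construction
```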
Characterizing the Lyα forest flux probability distribution function using Legendre polynomials
Energy Technology Data Exchange (ETDEWEB)
Cieplak, Agnieszka M.; Slosar, Anže, E-mail: acieplak@bnl.gov, E-mail: anze@bnl.gov [Brookhaven National Laboratory, Bldg 510, Upton, NY, 11973 (United States)
2017-10-01
The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.
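The coefficient estimator is simple to sketch: for a PDF supported on [-1, 1], the expansion coefficient is c_n = (2n+1)/2 · E[P_n(X)], which is indeed a linear combination of the first n moments. The illustration below uses uniform samples (so the true PDF is the constant 0.5) rather than Lyα forest flux.

```python
import random

def legendre_vals(x, nmax):
    # P_0 .. P_nmax at x via the Bonnet recurrence
    vals = [1.0, x]
    for n in range(1, nmax):
        vals.append(((2 * n + 1) * x * vals[n] - n * vals[n - 1]) / (n + 1))
    return vals[:nmax + 1]

def legendre_coeffs(samples, nmax):
    # c_n = (2n+1)/2 * E[P_n(X)], estimated directly from samples
    sums = [0.0] * (nmax + 1)
    for x in samples:
        for n, v in enumerate(legendre_vals(x, nmax)):
            sums[n] += v
    return [(2 * n + 1) / 2.0 * s / len(samples) for n, s in enumerate(sums)]

def reconstruct_pdf(x, coeffs):
    return sum(c * v for c, v in zip(coeffs, legendre_vals(x, len(coeffs) - 1)))

# Uniform samples on [-1, 1]: the true PDF is the constant 0.5
random.seed(7)
samples = [random.uniform(-1.0, 1.0) for _ in range(20_000)]
coeffs = legendre_coeffs(samples, nmax=6)
```

With uniform data, c_0 is exactly 0.5 and the higher coefficients fluctuate around zero at the sampling-noise level, so the reconstruction stays close to the flat PDF.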
Characterizing the Lyα forest flux probability distribution function using Legendre polynomials
Cieplak, Agnieszka M.; Slosar, Anže
2017-10-01
The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.
Exact probability function for bulk density and current in the asymmetric exclusion process
Depken, Martin; Stinchcombe, Robin
2005-03-01
We examine the asymmetric simple exclusion process with open boundaries, a paradigm of driven diffusive systems, having a nonequilibrium steady-state transition. We provide a full derivation and expanded discussion and digression on results previously reported briefly in M. Depken and R. Stinchcombe, Phys. Rev. Lett. 93, 040602 (2004). In particular we derive an exact form for the joint probability function for the bulk density and current, both for finite systems, and also in the thermodynamic limit. The resulting distribution is non-Gaussian, and while the fluctuations in the current are continuous at the continuous phase transitions, the density fluctuations are discontinuous. The derivations are done by using the standard operator algebraic techniques and by introducing a modified version of the original operator algebra. As a by-product of these considerations we also arrive at a very simple way of calculating the normalization constant appearing in the standard treatment with the operator algebra. Like the partition function in equilibrium systems, this normalization constant is shown to completely characterize the fluctuations, albeit in a very different manner.
Faster exact Markovian probability functions for motif occurrences: a DFA-only approach.
Ribeca, Paolo; Raineri, Emanuele
2008-12-15
The computation of the statistical properties of motif occurrences has an obviously relevant application: patterns that are significantly over- or under-represented in genomes or proteins are interesting candidates for biological roles. However, the problem is computationally hard; as a result, virtually all the existing motif finders use fast but approximate scoring functions, in spite of the fact that they have been shown to produce systematically incorrect results. A few interesting exact approaches are known, but they are very slow and hence not practical in the case of realistic sequences. We give an exact solution, solely based on deterministic finite-state automata (DFA), to the problem of finding the whole relevant part of the probability distribution function of a simple-word motif in a homogeneous (biological) sequence. Out of that, the z-value can always be computed, while the P-value can be obtained either when it is not too extreme with respect to the number of floating-point digits available in the implementation, or when the number of pattern occurrences is moderately low. In particular, for Markov models of moderate order, we manage to obtain an algorithm which is both easily interpretable and efficient. This approach can be used for exact statistical studies of very long genomes and protein sequences, as we illustrate with some examples on the scale of the human genome.
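The DFA idea can be illustrated for the simplest background model, an i.i.d. (order-0 Markov) sequence: build the Knuth-Morris-Pratt matching automaton for a simple word and propagate the joint distribution of (automaton state, occurrence count) one position at a time. This is a sketch of the general approach, not the authors' implementation, and it enumerates the full count distribution rather than only the relevant part.

```python
from collections import defaultdict

def kmp_automaton(pattern, alphabet):
    """DFA whose state is the length of the longest prefix of `pattern`
    matching a suffix of the text read so far; reaching state len(pattern)
    signals that an occurrence just ended."""
    m = len(pattern)
    delta = [{c: 0 for c in alphabet} for _ in range(m + 1)]
    delta[0][pattern[0]] = 1
    fail = 0
    for s in range(1, m + 1):
        for c in alphabet:
            delta[s][c] = delta[fail][c]
        if s < m:
            delta[s][pattern[s]] = s + 1
            fail = delta[fail][pattern[s]]
    return delta

def occurrence_pdf(pattern, probs, n):
    """Exact distribution of the number of (possibly overlapping) occurrences
    of `pattern` in an i.i.d. sequence of length n, letter probabilities `probs`."""
    m = len(pattern)
    delta = kmp_automaton(pattern, list(probs))
    dist = {(0, 0): 1.0}          # (DFA state, occurrence count) -> probability
    for _ in range(n):
        nxt = defaultdict(float)
        for (s, k), p in dist.items():
            for c, pc in probs.items():
                t = delta[s][c]
                nxt[(t, k + (t == m))] += p * pc
        dist = nxt
    pdf = defaultdict(float)
    for (s, k), p in dist.items():
        pdf[k] += p
    return dict(pdf)
```

For "AA" over a fair binary alphabet and n = 3, the eight equally likely strings give counts 0, 1 and 2 with probabilities 5/8, 2/8 and 1/8, which the dynamic programme reproduces exactly.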
Discovering context-aware conditional functional dependencies
Institute of Scientific and Technical Information of China (English)
Yuefeng DU; Derong SHEN; Tiezheng NIE; Yue KOU; Ge YU
2017-01-01
Conditional functional dependencies (CFDs) are important techniques for data consistency. However, CFDs are limited in their ability to 1) provide reasonable values for consistency repairing and 2) detect potential errors. This paper presents context-aware conditional functional dependencies (CCFDs), which contribute to providing reasonable values and detecting potential errors. In particular, we focus on automatically discovering minimal CCFDs. We present context relativity to measure the relationship between CFDs. The overlap of related CFDs can provide reasonable values, which results in more accurate consistency repairing, and some related CFDs are combined into CCFDs. Moreover, we prove that discovering minimal CCFDs is NP-complete, and we design both a precise method and a heuristic method. We also present the dominating value to facilitate the process in both methods. Additionally, the context relativity of the CFDs affects the cleaning results; we give an approximate threshold of context relativity, suggested according to the data distribution. Our empirical evaluation shows that the repairing results are more accurate.
Sanfilippo, Paul G; Hewitt, Alex W; Mackey, David A
2017-04-01
To outline and detail the importance of conditional probability in clinical decision making and discuss the various diagnostic measures eye care practitioners should be aware of in order to improve the scope of their clinical practice. We conducted a review of the importance of conditional probability in diagnostic testing for the eye care practitioner. Eye care practitioners use diagnostic tests on a daily basis to assist in clinical decision making and optimizing patient care and management. These tests provide probabilistic information that can enable the clinician to increase (or decrease) their level of certainty about the presence of a particular condition. While an understanding of the characteristics of diagnostic tests is essential to facilitate proper interpretation of test results and disease risk, many practitioners either confuse or misinterpret these measures. In the interests of their patients, practitioners should be aware of the basic concepts associated with diagnostic testing and the simple mathematical rule that underpins them. Importantly, the practitioner needs to recognize that the prevalence of a disease in the population greatly determines the clinical value of a diagnostic test.
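The closing point, that prevalence largely determines a test's clinical value, follows directly from Bayes' rule. A minimal sketch (the numbers are illustrative, not taken from the review):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test), from Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# The same instrument in two populations: with sensitivity 0.90 and
# specificity 0.95, a positive result implies roughly 67% disease
# probability at 10% prevalence, but under 2% at 0.1% prevalence.
```

This is the "simple mathematical rule" referred to above: the posterior probability of disease depends on the prior (prevalence) just as strongly as on the test characteristics.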
Purcell, Jeremy J.; Rapp, Brenda
2013-01-01
Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
Javitt, D C; Grochowski, S; Shelley, A M; Ritter, W
1998-03-01
Schizophrenia is a severe mental disorder associated with disturbances in perception and cognition. Event-related potentials (ERP) provide a mechanism for evaluating potential mechanisms underlying neurophysiological dysfunction in schizophrenia. Mismatch negativity (MMN) is a short-duration auditory cognitive ERP component that indexes operation of the auditory sensory ('echoic') memory system. Prior studies have demonstrated impaired MMN generation in schizophrenia along with deficits in auditory sensory memory performance. MMN is elicited in an auditory oddball paradigm in which a sequence of repetitive standard tones is interrupted infrequently by a physically deviant ('oddball') stimulus. The present study evaluates MMN generation as a function of deviant stimulus probability, interstimulus interval, interdeviant interval and the degree of pitch separation between the standard and deviant stimuli. The major findings of the present study are first, that MMN amplitude is decreased in schizophrenia across a broad range of stimulus conditions, and second, that the degree of deficit in schizophrenia is largest under conditions when MMN is normally largest. The pattern of deficit observed in schizophrenia differs from the pattern observed in other conditions associated with MMN dysfunction, including Alzheimer's disease, stroke, and alcohol intoxication.
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
Su, Nan-Yao; Lee, Sang-Hee
2008-04-01
Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: exponential decline phase, linear decline phase, equilibrium phase, and postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.
Yilmaz, Isik; Keskin, Inan; Marschalko, Marian; Bednarik, Martin
2010-05-01
This study compares GIS-based collapse susceptibility mapping methods, namely conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN), applied to gypsum rock masses in the Sivas basin (Turkey). A Digital Elevation Model (DEM) was first constructed using GIS software. Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) by means of vegetation cover, and distance from roads and settlements were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the CP, LR and ANN models, and they were then compared by means of their validations. Area Under Curve (AUC) values obtained from all three methodologies showed that the map obtained from the ANN model appears more accurate than the other models, and the results also showed that artificial neural networks are a useful tool in the preparation of collapse susceptibility maps and highly compatible with GIS operating features. Key words: Collapse; doline; susceptibility map; gypsum; GIS; conditional probability; logistic regression; artificial neural networks.
Rodríguez, Cristina; Drummond, Hugh
2018-01-01
In wild long-lived animals, analysis of impacts of stressful natal conditions on adult performance has rarely embraced the entire age span, and the possibility that costs are expressed late in life has seldom been examined. Using 26 years of data from 8541 fledglings and 1310 adults of the blue-footed booby (Sula nebouxii), a marine bird that can live up to 23 years, we tested whether experiencing the warm waters and food scarcity associated with El Niño in the natal year reduces recruitment or survival over the adult lifetime. Warm water in the natal year reduced the probability of recruiting; each additional degree (°C) of water temperature meant a reduction of roughly 50% in fledglings' probability of returning to the natal colony as breeders. Warm water in the current year impacted adult survival, with greater effect at the oldest ages than during early adulthood. However, warm water in the natal year did not affect survival at any age over the adult lifespan. A previous study showed that early recruitment and widely spaced breeding allow boobies that experience warm waters in the natal year to achieve normal fledgling production over the first 10 years; our results now show that this reproductive effort incurs no survival penalty, not even late in life. This pattern is additional evidence of buffering against stressful natal conditions via life-history adjustments. PMID:29410788
Using probability density function in the procedure for recognition of the type of physical exercise
Directory of Open Access Journals (Sweden)
Cakić Nikola
2017-01-01
This paper presents a method for recognition of physical exercises, using only a triaxial accelerometer of a smartphone. The smartphone itself is free to move inside the subject's pocket. Exercises for leg muscle strengthening from the subject's standing position (squat, right knee rise, and lunge with the right leg) were analyzed. All exercises were performed with the accelerometric sensor of a smartphone placed in the pocket next to the leg used for the exercises. In order to test the proposed recognition method, the knee rise exercise of the opposite leg with the same position of the sensor was randomly selected. Filtering of the raw accelerometric signals was carried out using a Butterworth tenth-order low-pass filter. The filtered signals from each of the three axes were described using three signal descriptors. After the descriptors were calculated, a probability density function was constructed for each of the descriptors. The program that implemented the proposed recognition method was executed online within an Android application on the smartphone. Signals from two male and two female subjects were considered as a reference for exercise recognition. The exercise recognition accuracy was 94.22% for the three performed exercises, and 85.33% for all four considered exercises.
Chakrabarty, Ayan; Wang, Feng; Sun, Kai; Wei, Qi-Huo
Prior studies have shown that low symmetry particles such as micro-boomerangs exhibit Brownian motion behaviour rather different from that of high symmetry particles, because convenient tracking points (TPs) are usually inconsistent with the center of hydrodynamic stress (CoH) where the translational and rotational motions are decoupled. In this paper we study the effects of the translation-rotation coupling on the displacement probability distribution functions (PDFs) of boomerang colloid particles with symmetric arms. By tracking the motions of different points on the particle symmetry axis, we show that as the distance between the TP and the CoH is increased, the effects of translation-rotation coupling become pronounced, making the short-time 2D PDF for fixed initial orientation change from elliptical to crescent shaped, and the angle-averaged PDFs change from an ellipsoidal-particle-like PDF to a shape with a Gaussian top and long displacement tails. We also observed that at long times the PDFs revert to Gaussian. The crescent shape of the 2D PDF provides a clear physical picture of the non-zero mean displacements observed in boomerang particles.
International Nuclear Information System (INIS)
Musho, M.K.; Kozak, J.J.
1984-01-01
A method is presented for calculating exactly the relative width (σ²)^(1/2)/⟨n⟩, the skewness γ1, and the kurtosis γ2 characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere [P.A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)], is based on the use of group theoretic arguments within the framework of the theory of finite Markov processes.
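For intuition, the moments in question can also be obtained by brute force on a small lattice: treat the trap as an absorbing state and iterate the substochastic transition matrix to get the exact walk-length distribution. The sketch below covers only the simplest case (a 1D periodic lattice, unbiased nearest-neighbour steps, one deep trap) and is not the group-theoretic method of the paper.

```python
import numpy as np

def walk_length_statistics(n_sites, n_max=5000):
    """Mean, relative width, skewness and excess kurtosis of the walk length
    to trapping for an unbiased nearest-neighbour walk on a 1D periodic
    lattice with a single deep trap at site 0, starting uniformly over the
    non-trap sites."""
    # Transition matrix restricted to the transient (non-trap) sites 1..n-1
    Q = np.zeros((n_sites - 1, n_sites - 1))
    for i in range(1, n_sites):
        for j in ((i - 1) % n_sites, (i + 1) % n_sites):
            if j != 0:                          # site 0 absorbs the walker
                Q[i - 1, j - 1] += 0.5
    p = np.ones(n_sites - 1) / (n_sites - 1)
    pmf, surv_prev = [], 1.0
    for _ in range(n_max):
        p = p @ Q
        surv = p.sum()                          # P(walk length > n)
        pmf.append(surv_prev - surv)            # P(walk length == n)
        surv_prev = surv
    n = np.arange(1, n_max + 1)
    pmf = np.asarray(pmf)
    mean = (n * pmf).sum()
    var = ((n - mean) ** 2 * pmf).sum()
    gamma1 = ((n - mean) ** 3 * pmf).sum() / var ** 1.5   # skewness
    gamma2 = ((n - mean) ** 4 * pmf).sum() / var ** 2 - 3  # excess kurtosis
    return mean, np.sqrt(var) / mean, gamma1, gamma2
```

For a ring of N sites the mean first-passage time from site i to the trap is i(N-i), so the uniformly averaged mean walk length is N(N+1)/6, which gives a sharp check of the iteration (7 exactly for N = 6).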
Probability density function of a puff dispersing from the wall of a turbulent channel
Nguyen, Quoc; Papavassiliou, Dimitrios
2015-11-01
Study of dispersion of passive contaminants in turbulence has proved to be helpful in understanding fundamental heat and mass transfer phenomena. Many simulation and experimental works have been carried out to locate and track motions of scalar markers in a flow. One method is to combine Direct Numerical Simulation (DNS) and Lagrangian Scalar Tracking (LST) to record locations of markers. While this has proved to be useful, high computational cost remains a concern. In this study, we develop a model that could reproduce results obtained by DNS and LST for turbulent flow. Puffs of markers with different Schmidt numbers were released into a flow field at a frictional Reynolds number of 150. The point of release was at the channel wall, so that both diffusion and convection contribute to the puff dispersion pattern, defining different stages of dispersion. Based on outputs from DNS and LST, we seek the most suitable and feasible probability density function (PDF) that represents distribution of markers in the flow field. The PDF would play a significant role in predicting heat and mass transfer in wall turbulence, and would prove to be helpful where DNS and LST are not always available.
Probability distribution functions for ELM bursts in a series of JET tokamak discharges
International Nuclear Information System (INIS)
Greenhough, J; Chapman, S C; Dendy, R O; Ward, D J
2003-01-01
A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence the statistical technique is found to be difficult to apply to low frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour
Representation of Probability Density Functions from Orbit Determination using the Particle Filter
Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell
2012-01-01
Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using the Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining higher order statistical information obtained using the PF. Methods such as the Principal Component Analysis (PCA) are based on utilizing up to second order statistics, and hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios that involve a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
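The contrast drawn with PCA can be made concrete: PCA chooses its projection axes from the sample covariance alone, so any non-Gaussian, higher-order structure in the particle cloud plays no role in that choice. A minimal SVD-based sketch (illustrative names, not the paper's implementation):

```python
import numpy as np

def pca_compress(particles, k):
    """Project an (n_samples x n_dims) particle cloud onto its top-k
    principal components. Only second-order statistics (the covariance)
    determine the axes."""
    mean = particles.mean(axis=0)
    centered = particles - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:k].T, vt[:k], mean

def pca_reconstruct(scores, components, mean):
    """Map compressed scores back to the original state space."""
    return scores @ components + mean
```

With k equal to the state dimension the reconstruction is exact; with k smaller, the discarded directions are chosen purely by variance, which is precisely the limitation that motivates ICA here.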
A joint probability density function of wind speed and direction for wind energy analysis
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Bueno, Celia
2008-01-01
A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a Normal-Weibull mixture distribution singly truncated from below. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R². The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions, (b) takes into account the frequency of null winds, (c) represents the wind direction regimes in zones with several modes or prevailing wind directions, (d) takes into account the correlation between wind speeds and its directions. It can therefore be used in several tasks involved in the evaluation process of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy.
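One standard construction of an angular-linear density with specified marginals is due to Johnson and Wehrly: couple the two variables through a circular density g evaluated at 2π(F_V(v) − H(θ)), which leaves both marginals exactly as specified. The sketch below simplifies the paper's marginals to a single Weibull and a single von Mises component (an assumption for illustration, not the paper's full mixture model).

```python
import numpy as np

def weibull_pdf(v, k, lam):
    return (k / lam) * (v / lam) ** (k - 1) * np.exp(-((v / lam) ** k))

def weibull_cdf(v, k, lam):
    return 1.0 - np.exp(-((v / lam) ** k))

def vonmises_pdf(theta, mu, kappa):
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * np.i0(kappa))

def direction_cdf_grid(mu, kappa, n=4001):
    """Circular CDF of the von Mises direction marginal, tabulated on [0, 2*pi]."""
    grid = np.linspace(0.0, 2.0 * np.pi, n)
    h = vonmises_pdf(grid, mu, kappa)
    cdf = np.concatenate(([0.0], np.cumsum(0.5 * (h[1:] + h[:-1]) * np.diff(grid))))
    return grid, cdf / cdf[-1]

def joint_pdf(v, theta, k, lam, mu, kappa, kappa_c, grid, dir_cdf):
    """f(v, theta) = 2*pi * g(2*pi*(F_V(v) - H(theta))) * f_V(v) * h(theta).
    Any circular density g (here a von Mises with concentration kappa_c)
    induces correlation while preserving both marginals."""
    zeta = 2.0 * np.pi * (weibull_cdf(v, k, lam) - np.interp(theta, grid, dir_cdf))
    return 2.0 * np.pi * vonmises_pdf(zeta, 0.0, kappa_c) \
        * weibull_pdf(v, k, lam) * vonmises_pdf(theta, mu, kappa)
```

Integrating the joint density over direction at any fixed speed recovers the Weibull marginal, which is the defining property of the construction.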
I. Fission Probabilities, Fission Barriers, and Shell Effects. II. Particle Structure Functions
Energy Technology Data Exchange (ETDEWEB)
Jing, Kexing [Univ. of California, Berkeley, CA (United States)
1999-05-01
In Part I, fission excitation functions of the osmium isotopes 185,186,187,189Os produced in 3He + 182,183,184,186W reactions, and of the polonium isotopes 209,210,211,212Po produced in 3He/4He + 206,207,208Pb reactions, were measured with high precision. These excitation functions have been analyzed in detail based upon the transition state formalism. The fission barriers and shell effects for the corresponding nuclei are extracted from the detailed analyses. A novel approach has been developed to determine upper limits of the transient time of the fission process. The upper limits are constrained by the fission probabilities of neighboring isotopes. The upper limits for the transient time set with this new method are 15x10^-21 s and 25x10^-21 s for Os and Po compound nuclei, respectively. In Part II, we report on a search for evidence of optical modulations in the energy spectra of alpha particles emitted from hot compound nuclei. The optical modulations are expected to arise from the alpha-particle interaction with the rest of the nucleus as the particle prepares to exit. Some evidence for the modulations has been observed in the alpha spectra measured in the 3He-induced reactions, 3He + natAg in particular. The identification of the modulations involves a technique that subtracts the bulk statistical background from the measured alpha spectra, in order for the modulations to become visible in the residuals. Due to insufficient knowledge of the background spectra, however, the presented evidence should only be regarded as preliminary and tentative.
Non-Maxwellian electron energy probability functions in the plume of a SPT-100 Hall thruster
Giono, G.; Gudmundsson, J. T.; Ivchenko, N.; Mazouffre, S.; Dannenmayer, K.; Loubère, D.; Popelier, L.; Merino, M.; Olentšenko, G.
2018-01-01
We present measurements of the electron density, the effective electron temperature, the plasma potential, and the electron energy probability function (EEPF) in the plume of a 1.5 kW-class SPT-100 Hall thruster, derived from cylindrical Langmuir probe measurements. The measurements were taken on the plume axis at distances between 550 and 1550 mm from the thruster exit plane, and at different angles from the plume axis at 550 mm for three operating points of the thruster, characterized by different discharge voltages and mass flow rates. The bulk of the electron population can be approximated as a Maxwellian distribution, but the measured distributions were seen to decline faster at higher energy. The measured EEPFs were best modelled with a general EEPF with an exponent α between 1.2 and 1.5, and their axial and angular characteristics were studied for the different operating points of the thruster. As a result, the exponent α from the fitted distribution was seen to be almost constant as a function of the axial distance along the plume, as well as across the angles. However, the exponent α was seen to be affected by the mass flow rate, suggesting a possible relationship with the collision rate, especially close to the thruster exit. The ratio of the specific heats, the γ factor, between the measured plasma parameters was found to be lower than the adiabatic value of 5/3 for each of the thruster settings, indicating the existence of non-trivial kinetic heat fluxes in the near collisionless plume. These results are intended to be used as input and/or testing properties for plume expansion models in further work.
Taking potential probability function maps to the local scale and matching them with land use maps
Garg, Saryu; Sinha, Vinayak; Sinha, Baerbel
2013-04-01
Source-Receptor models have been developed using different methods. Residence-time weighted concentration back trajectory analysis and Potential Source Contribution Function (PSCF) are the two most popular techniques for identification of potential sources of a substance in a defined geographical area. Both techniques use back trajectories calculated using global models and assign values of probability/concentration to various locations in an area. These values represent the probability of threshold exceedances / the average concentration measured at the receptor in air masses with a certain residence time over a source area. Both techniques, however, have only been applied to regional and long-range transport phenomena due to inherent limitation with respect to both spatial accuracy and temporal resolution of the of back trajectory calculations. Employing the above mentioned concepts of residence time weighted concentration back-trajectory analysis and PSCF, we developed a source-receptor model capable of identifying local and regional sources of air pollutants like Particulate Matter (PM), NOx, SO2 and VOCs. We use 1 to 30 minute averages of concentration values and wind direction and speed from a single receptor site or from multiple receptor sites to trace the air mass back in time. The model code assumes all the atmospheric transport to be Lagrangian and linearly extrapolates air masses reaching the receptor location, backwards in time for a fixed number of steps. We restrict the model run to the lifetime of the chemical species under consideration. For long lived species the model run is limited to 180 trees/gridsquare); moderate concentrations for agricultural lands with low tree density (1.5-2.5 ppbv for 250 μg/m3 for traffic hotspots in Chandigarh City are observed. Based on the validation against the land use maps, the model appears to do an excellent job in source apportionment and identifying emission hotspots. Acknowledgement: We thank the IISER
Probability distribution functions of turbulence in seepage-affected alluvial channel
Energy Technology Data Exchange (ETDEWEB)
Sharma, Anurag; Kumar, Bimlesh, E-mail: anurag.sharma@iitg.ac.in, E-mail: bimk@iitg.ac.in [Department of Civil Engineering, Indian Institute of Technology Guwahati, 781039 (India)
2017-02-15
The present experimental study is carried out on the probability distribution functions (PDFs) of turbulent flow characteristics within near-bed-surface and away-from-bed surfaces for both no-seepage and seepage flow. Laboratory experiments were conducted in a plane sand bed for no seepage (NS), 10% seepage (10%S) and 15% seepage (15%S) cases. The experimental calculation of the PDFs of turbulent parameters such as Reynolds shear stress, velocity fluctuations, and bursting events is compared with the theoretical expression obtained by a Gram–Charlier (GC)-based exponential distribution. Experimental observations follow the computed PDF distributions for both no-seepage and seepage cases. The Jensen-Shannon divergence (JSD) method is used to measure the similarity between theoretical and experimental PDFs. The value of JSD for PDFs of velocity fluctuation lies between 0.0005 and 0.003, while the JSD value for PDFs of Reynolds shear stress varies between 0.001 and 0.006. Even with the application of seepage, the PDF distribution of bursting events, sweeps and ejections is well characterized by the exponential distribution of the GC series, except that a slight deflection of inward and outward interactions is observed, which may be due to weaker events. The value of JSD for outward and inward interactions ranges from 0.0013 to 0.032, while the JSD value for sweep and ejection events varies between 0.0001 and 0.0025. The theoretical expression for the PDF of turbulent intensity is developed in the present study, which agrees well with the experimental observations, with JSD between 0.007 and 0.015. The work presented is potentially applicable to the probability distribution of mobile-bed sediments in seepage-affected alluvial channels typically characterized by the various turbulent parameters. The purpose of PDF estimation from experimental data is that it provides a complete numerical description in the areas of turbulent flow, either at a single point or at a finite number of points.
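The similarity measure used here, the Jensen-Shannon divergence between a measured and a theoretical PDF, is straightforward to compute from binned data. A sketch follows (natural-log base assumed; the paper's normalisation may differ):

```python
import numpy as np

def jensen_shannon_divergence(p, q):
    """JSD between two binned PDFs defined over the same bins, in nats.
    Zero iff the distributions coincide; at most ln(2) (disjoint support)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0          # convention: 0 * log(0) = 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

Unlike the Kullback-Leibler divergence it is symmetric and always finite, which makes it convenient for comparing empirical histograms against a fitted Gram-Charlier form.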
Yura, Harold T; Hanson, Steen G
2012-04-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
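The two-step recipe above (color a white Gaussian sample set to the desired spectrum, then map it through an inverse transform to the target amplitude PDF) can be sketched in one dimension with the standard library. This is a simplified analogue: a real 2D implementation would shape the spectrum with an FFT filter, and the unit-mean exponential target distribution here is just an illustrative choice:

```python
import math
import random

random.seed(1)
n = 10_000

# Step 1: white Gaussian noise, colored by a simple moving-average filter
# (a stand-in for shaping the power spectral density via FFT filtering).
white = [random.gauss(0.0, 1.0) for _ in range(n)]
colored = [sum(white[max(0, i - 4):i + 1]) / min(i + 1, 5) for i in range(n)]

# Step 2: map each colored Gaussian sample through Phi (the Gaussian CDF)
# to a uniform value, then through the inverse CDF of the target PDF
# (exponential with unit mean: F^-1(x) = -ln(1 - x)).
sigma = math.sqrt(sum(c * c for c in colored) / n)

def phi(x):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

samples = [-math.log(1.0 - phi(c / sigma)) for c in colored]

mean = sum(samples) / n
print(round(mean, 2))  # close to 1 for a unit-mean exponential
```

The marginal distribution is exact under this transform; as the abstract notes, the resulting spectrum is only approximately preserved, which is why the method is best regarded as an engineering approach.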
Beghein, Caroline; Trampert, Jeannot
2004-01-01
The presence of radial anisotropy in the upper mantle, transition zone and top of the lower mantle is investigated by applying a model space search technique to Rayleigh and Love wave phase velocity models. Probability density functions are obtained independently for S-wave anisotropy, P-wave anisotropy, intermediate parameter η, Vp, Vs and density anomalies. The likelihoods for P-wave and S-wave anisotropy beneath continents cannot be explained by a dry olivine-rich upper mantle at depths larger than 220 km. Indeed, while shear-wave anisotropy tends to disappear below 220 km depth in continental areas, P-wave anisotropy is still present, but its sign changes compared to the uppermost mantle. This could be due to an increase with depth of the amount of pyroxene relative to olivine in these regions, although the presence of water, partial melt or a change in the deformation mechanism cannot be ruled out as yet. A similar observation is made for old oceans, but not for young ones, where VSH > VSV appears likely down to 670 km depth and VPH > VPV down to 400 km depth. The change of sign in P-wave anisotropy seems to be qualitatively correlated with the presence of the Lehmann discontinuity, generally observed beneath continents and some oceans but not beneath ridges. Parameter η shows an age-related depth pattern similar to that of shear-wave anisotropy in the uppermost mantle, and it undergoes the same change of sign as P-wave anisotropy at 220 km depth. The ratio between dln Vs and dln Vp suggests that a chemical component is needed to explain the anomalies in most places at depths greater than 220 km. More tests are needed to assess the robustness of the results for density, but they do not affect the results for anisotropy.
International Nuclear Information System (INIS)
Bakosi, Jozsef; Ristorcelli, Raymond J.
2010-01-01
Probability density function (PDF) methods are extended to variable-density pressure-gradient-driven turbulence. We apply the new method to compute the joint PDF of density and velocity in a non-premixed binary mixture of different-density molecularly mixing fluids under gravity. The full time evolution of the joint PDF is captured in the highly non-equilibrium flow: starting from a quiescent state, transitioning to fully developed turbulence and finally dissipated by molecular diffusion. High-Atwood-number effects (as distinguished from the Boussinesq case) are accounted for: both hydrodynamic turbulence and material mixing are treated at arbitrary density ratios, with the specific volume, mass flux and all their correlations in closed form. An extension of the generalized Langevin model, originally developed for the Lagrangian fluid particle velocity in constant-density shear-driven turbulence, is constructed for variable-density pressure-gradient-driven flows. The persistent small-scale anisotropy, a fundamentally 'non-Kolmogorovian' feature of flows under external acceleration forces, is captured by a tensorial diffusion term based on the external body force. The material mixing model for the fluid density, an active scalar, is developed based on the beta distribution. The beta-PDF is shown to capture the mixing asymmetry and to accurately represent the density through transition, in fully developed turbulence and in the decay process. The joint model for hydrodynamics and active material mixing yields a time-accurate evolution of the turbulent kinetic energy and Reynolds stress anisotropy without resorting to gradient diffusion hypotheses, and represents the mixing state by the density PDF itself, eliminating the need for dubious mixing measures. Direct numerical simulations of the homogeneous Rayleigh-Taylor instability are used for model validation.
Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan
2017-08-01
We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ -1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃ 7 % and ≃ 10 % at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ -1 of the IGM temperature-density relation with a precision of +/- 8.6 % at z = 3 and +/- 6.1 % at z = 5, for an ensemble of ten mock high-resolution quasar spectra. Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
On probability function of trip route choice in passenger transport system of cities
Directory of Open Access Journals (Sweden)
N. Nefedof
2014-02-01
The results of statistical processing of experimental research data in Kharkiv, aimed at determining the relation between the passenger trip choice probability and the actual vehicle waiting time at bus terminals, are presented.
International Nuclear Information System (INIS)
Zegong, Zhou; Changhong, Liu
1995-01-01
On the basis of research into using the original distribution function, shifted by an appropriate distance, as the importance function, this paper takes the variation of the similarity ratio of the original function to the importance function as the objective function; the optimum shifting distance is then obtained by an optimization method. The optimum importance function resulting from the optimization ensures that the number of Monte Carlo simulations is decreased while good estimates of the yearly failure probabilities are still obtained
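The idea of shifting the original density to serve as an importance function can be illustrated with a toy failure probability P(X > 4) for a standard Gaussian X; the shift distance (here taken equal to the threshold) and sample size are illustrative choices, not the optimization procedure of the paper:

```python
import math
import random

random.seed(0)
N = 100_000
t = 4.0  # failure threshold; true P(X > 4) is about 3.17e-5

# Importance sampling: draw from the original density shifted to the
# threshold, g = N(t, 1), and reweight by the likelihood ratio f(x)/g(x)
# = exp(-x^2/2 + (x - t)^2/2).
est = 0.0
for _ in range(N):
    x = random.gauss(t, 1.0)  # shifted (importance) density g
    if x > t:
        est += math.exp(-0.5 * x * x + 0.5 * (x - t) ** 2)
est /= N

true = 0.5 * math.erfc(t / math.sqrt(2.0))
print(est, true)  # the two values should agree to a few percent
```

Crude Monte Carlo with the same N would see only a handful of failures; shifting the sampling density concentrates the samples in the failure region, which is exactly why the choice of shift distance is worth optimizing.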
International Nuclear Information System (INIS)
Mysina, N Yu; Maksimova, L A; Ryabukho, V P; Gorbatenko, B B
2015-01-01
Investigated are statistical properties of the phase difference of oscillations in speckle fields at two points in the far-field diffraction region, for different shapes of the scatterer aperture. Statistical and spatial nonuniformity of the probability density function of the field phase difference is established. Numerical experiments show that, for speckle fields with an oscillating alternating-sign transverse correlation function, a significant nonuniformity of the probability density function of the phase difference in the correlation region of the complex field amplitude, with the most probable values 0 and π, is observed. A natural statistical interference experiment using Young diagrams has confirmed the results of the numerical experiments. (laser applications and other topics in quantum electronics)
Liu, Z.; Kar, J.; Zeng, S.; Tackett, J. L.; Vaughan, M.; Trepte, C. R.; Omar, A. H.; Hu, Y.; Winker, D. M.
2017-12-01
In the CALIPSO retrieval algorithm, detection of layers in the lidar measurements is followed by their classification as "cloud" or "aerosol" using 5-dimensional probability density functions (PDFs). The five dimensions are the mean attenuated backscatter at 532 nm, the layer-integrated total attenuated color ratio, the mid-layer altitude, the integrated volume depolarization ratio and the latitude. The new version 4 (V4) level 2 (L2) data products, released in November 2016, are the first major revision to the L2 product suite since May 2010. Significant calibration changes in the V4 level 1 data necessitated substantial revisions to the V4 L2 cloud-aerosol discrimination (CAD) algorithm. Accordingly, a new set of PDFs was generated to derive the V4 L2 data products. The V4 CAD algorithm is now applied to layers detected in the stratosphere, where volcanic layers and occasional cloud and smoke layers are observed. Previously, these layers were designated as 'stratospheric' and not further classified. The V4 CAD algorithm is also applied to all layers detected at single-shot (333 m) resolution. In prior data releases, single-shot detections were uniformly classified as clouds. The CAD PDFs used in the earlier releases were generated using a full year (2008) of CALIPSO measurements. Because the CAD algorithm was not applied to stratospheric features, the properties of these layers were not incorporated into the PDFs. When building the V4 PDFs, the 2008 data were augmented with additional data from June 2011, and all stratospheric features were included. The Nabro and Puyehue-Cordon volcanoes erupted in June 2011, and volcanic aerosol layers were observed in the upper troposphere and lower stratosphere in both the northern and southern hemispheres. The June 2011 data thus provide the stratospheric aerosol properties needed for comprehensive PDF generation. In contrast to earlier versions of the PDFs, which were generated based solely on observed distributions, construction of the V4 PDFs considered the
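A toy version of PDF-based cloud-aerosol discrimination can be sketched with a single attribute and Gaussian stand-in class PDFs. The operational algorithm uses empirically derived five-dimensional PDFs; the score form and all parameters below are illustrative assumptions, not the CALIPSO values:

```python
import math

def gauss_pdf(x, mu, sigma):
    """1D Gaussian PDF, standing in for an empirically derived class PDF."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def cad_score(x, cloud=(1.0, 0.4), aerosol=(-1.0, 0.6)):
    """Confidence score in [-100, 100]: positive favors cloud, negative aerosol.
    The class means/widths are invented for this sketch."""
    pc = gauss_pdf(x, *cloud)
    pa = gauss_pdf(x, *aerosol)
    return 100.0 * (pc - pa) / (pc + pa)

print(cad_score(1.0))   # strongly positive: classified as cloud
print(cad_score(-1.0))  # strongly negative: classified as aerosol
print(cad_score(0.1))   # near zero: low-confidence classification
```

The appeal of this normalized-difference form is that it reports not just a class label but a confidence, which degrades gracefully where the class PDFs overlap.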
Kinetic and dynamic probability-density-function descriptions of disperse turbulent two-phase flows
Minier, Jean-Pierre; Profeta, Christophe
2015-11-01
This article analyzes the status of two classical one-particle probability density function (PDF) descriptions of the dynamics of discrete particles dispersed in turbulent flows. The first PDF formulation considers only the process made up by particle position and velocity Zp=(xp,Up) and is represented by its PDF p (t ;yp,Vp) which is the solution of a kinetic PDF equation obtained through a flux closure based on the Furutsu-Novikov theorem. The second PDF formulation includes fluid variables into the particle state vector, for example, the fluid velocity seen by particles Zp=(xp,Up,Us) , and, consequently, handles an extended PDF p (t ;yp,Vp,Vs) which is the solution of a dynamic PDF equation. For high-Reynolds-number fluid flows, a typical formulation of the latter category relies on a Langevin model for the trajectories of the fluid seen or, conversely, on a Fokker-Planck equation for the extended PDF. In the present work, a new derivation of the kinetic PDF equation is worked out and new physical expressions of the dispersion tensors entering the kinetic PDF equation are obtained by starting from the extended PDF and integrating over the fluid seen. This demonstrates that, under the same assumption of a Gaussian colored noise and irrespective of the specific stochastic model chosen for the fluid seen, the kinetic PDF description is the marginal of a dynamic PDF one. However, a detailed analysis reveals that kinetic PDF models of particle dynamics in turbulent flows described by statistical correlations constitute incomplete stand-alone PDF descriptions and, moreover, that present kinetic-PDF equations are mathematically ill posed. This is shown to be the consequence of the non-Markovian characteristic of the stochastic process retained to describe the system and the use of an external colored noise. Furthermore, developments bring out that well-posed PDF descriptions are essentially due to a proper choice of the variables selected to describe physical systems
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
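One classical Monte Carlo estimator from this literature, conditioning on the largest summand (the Asmussen-Kroese estimator), can be sketched for i.i.d. Pareto summands; it is shown here as a representative technique, not necessarily the estimator developed in the dissertation, and the tail index, threshold and sample sizes are illustrative:

```python
import random

random.seed(2)
alpha, n, u, N = 2.5, 4, 100.0, 50_000

def pareto():
    # Pareto(alpha) on [1, inf) by inverse-CDF sampling: F(x) = 1 - x^(-alpha)
    return random.random() ** (-1.0 / alpha)

def tail(x):
    # P(X > x) for the Pareto(alpha) distribution above
    return min(1.0, x ** (-alpha))

# Asmussen-Kroese estimator of P(X1 + ... + Xn > u): by symmetry, condition
# on the maximum being the last variable and integrate its tail analytically.
est = 0.0
for _ in range(N):
    xs = sorted(pareto() for _ in range(n - 1))
    s, m = sum(xs), xs[-1]
    est += n * tail(max(u - s, m))
est /= N

# Compare with the single-big-jump asymptotic P(S_n > u) ~ n * P(X > u).
print(est, n * tail(u))
```

Crude Monte Carlo would need on the order of 1/P samples to see a single exceedance; replacing the indicator of the largest jump by its exact tail probability removes most of the variance, which is why such estimators are the standard tool for heavy-tailed sums.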
Boundary conditions of the exact impulse wave function
International Nuclear Information System (INIS)
Gravielle, M.; Miraglia, J.E.
1997-01-01
The behavior of the exact impulse wave function is investigated at intermediate and high impact energies. Numerical details of the wave function and its perturbative potential are reported. We conclude that the impulse wave function does not tend to the proper Coulomb asymptotic limit. For electron capture, however, it is shown that the impulse wave function produces reliable probabilities even for intermediate velocities and symmetric collision systems. copyright 1997 The American Physical Society
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
Whitnah, A. M.; Howes, D. B.
1971-01-01
Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.
International Nuclear Information System (INIS)
Ahmed, M.A.
2013-01-01
There is a growing body of evidence that computer use can adversely affect visual health. Considering the rising number of computer users in Egypt, computer-related visual symptoms might take an epidemic form. In view of that, this study was undertaken to find out the magnitude of visual problems in computer operators and their relationship with various personal and workplace factors. Aim: To evaluate the probable effects of exposure to electromagnetic waves radiated from visual display terminals (VDTs) on some visual functions. Subjects and Methods: One hundred fifty computer operators working in different institutes were randomly selected. They were asked to fill in a pre-tested questionnaire (written in Arabic) after giving their verbal consent. The selected exposed subjects underwent the following clinical assessment: 1- Visual acuity measurements. 2- Refraction (using an autorefractometer). 3- Measurement of ocular dryness defects using the following diagnostic tests: Schirmer test, fluorescein staining, Rose Bengal staining, tear break-up time (TBUT) and the LIPCOF test (lid-parallel conjunctival folds). A control group included one hundred fifty participants working in fields that do not necessitate exposure to video display terminals. Inclusion criteria were as follows: a minimum of three symptoms of computer vision syndrome (CVS), a minimum of one year of exposure to VDTs, and a minimum of 6 h/day on 5 working days/week. Exclusion criteria included candidates with ocular pathology such as glaucoma, optic atrophy, diabetic retinopathy or papilledema. The following complaints were studied: 1- Tired eyes. 2- Burning eyes with excessive tear production. 3- Dry, sore eyes. 4- Blurred near vision (letters on the screen run together). 5- Asthenopia. 6- Neck, shoulder and back aches, and overall bodily fatigue or tiredness. An interventional protective measure was administered to the selected subjects from the exposed group; it included the following (1
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
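The probability function of the binomial distribution mentioned above is the classic worked example from such a text; a minimal stdlib sketch:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# By the binomial theorem, the probabilities over k = 0..n sum to 1.
n, p = 10, 0.3
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(round(sum(pmf), 10))          # 1.0
print(round(binomial_pmf(3, n, p), 4))  # 0.2668, the mode near n*p = 3
```

The mean n*p and standard deviation sqrt(n*p*(1-p)) follow directly from this probability function, tying together several of the topics listed in the abstract.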
Sensitivity analysis of limit state functions for probability-based plastic design
Frangopol, D. M.
1984-01-01
The evaluation of the total probability of a plastic collapse failure P_f for a highly redundant structure of random interdependent plastic moments acted on by random interdependent loads is a difficult and computationally very costly process. The evaluation of reasonable bounds on this probability requires the use of second-moment algebra, which involves many statistical parameters. A computer program which selects the best strategy for minimizing the interval between the upper and lower bounds of P_f is now in its final stage of development. The sensitivity of the resulting bounds of P_f to the various uncertainties involved in the computational process is analyzed. Response sensitivities for both mode and system reliability of an ideal plastic portal frame are shown.
ERG review of containment failure probability and repository functional design criteria
International Nuclear Information System (INIS)
Gopal, S.
1986-06-01
The Engineering Review Group (ERG) was established by the Office of Nuclear Waste Isolation (ONWI) to help evaluate engineering-related issues in the US Department of Energy's nuclear waste repository program. The June 1984 meeting of the ERG considered two topics: (1) statistical probability for containment of nuclides within the waste package and (2) repository design criteria. This report documents the ERG's comments and recommendations on these two subjects and the ONWI response to the specific points raised by ERG
Protein distance constraints predicted by neural networks and probability density functions
DEFF Research Database (Denmark)
Lund, Ole; Frimand, Kenneth; Gorodkin, Jan
1997-01-01
We predict interatomic C-α distances by two independent data driven methods. The first method uses statistically derived probability distributions of the pairwise distance between two amino acids, whilst the latter method consists of a neural network prediction approach equipped with windows taki...... method based on the predicted distances is presented. A homepage with software, predictions and data related to this paper is available at http://www.cbs.dtu.dk/services/CPHmodels/...
Neokosmidis, Ioannis; Kamalakis, Thomas; Chipouras, Aristides; Sphicopoulos, Thomas
2005-01-01
The performance of high-powered wavelength-division multiplexed (WDM) optical networks can be severely degraded by four-wave-mixing- (FWM-) induced distortion. The multicanonical Monte Carlo method (MCMC) is used to calculate the probability-density function (PDF) of the decision variable of a receiver, limited by FWM noise. Compared with the conventional Monte Carlo method previously used to estimate this PDF, the MCMC method is much faster and can accurately estimate smaller error probabilities. The method takes into account the correlation between the components of the FWM noise, unlike the Gaussian model, which is shown not to provide accurate results.
Horne, Avril C.; Szemis, Joanna M.; Webb, J. Angus; Kaur, Simranjit; Stewardson, Michael J.; Bond, Nick; Nathan, Rory
2018-03-01
One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.
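The Conditional Probability Network idea described above can be sketched as a single conditional probability table marginalized over uncertain inputs; all variables, states and numbers below are invented for illustration, not taken from the Australian case study:

```python
# Toy conditional-probability-network fragment: P(response | flow, prior state).
cpt = {
    # (flow_delivered, prior_vegetation_condition): P(vegetation improves)
    (True,  "poor"): 0.60,
    (True,  "good"): 0.85,
    (False, "poor"): 0.10,
    (False, "good"): 0.40,
}

def p_improves(p_flow, p_good_prior):
    """Marginal P(improvement), given P(flow delivered) and P(prior state good)."""
    total = 0.0
    for flow in (True, False):
        for prior in ("poor", "good"):
            p_joint = (p_flow if flow else 1 - p_flow) * \
                      (p_good_prior if prior == "good" else 1 - p_good_prior)
            total += p_joint * cpt[(flow, prior)]
    return total

# Delivering environmental water raises the marginal probability of improvement.
print(round(p_improves(0.9, 0.5), 4))
print(round(p_improves(0.1, 0.5), 4))
```

Documenting the table itself is the point: each entry is an explicit, testable hypothesis about ecological response, and observed outcomes can later be compared against it in the adaptive-management cycle.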
Specht, Matt W; Nicotra, Cassandra M; Kelly, Laura M; Woods, Douglas W; Ricketts, Emily J; Perry-Parrish, Carisa; Reynolds, Elizabeth; Hankinson, Jessica; Grados, Marco A; Ostrander, Rick S; Walkup, John T
2014-03-01
Tic-suppression-based treatments (TSBTs) represent a safe and effective treatment option for Chronic Tic Disorders (CTDs). Prior research has demonstrated that treatment-naive youths with CTDs have the capacity to safely and effectively suppress tics for prolonged periods. It remains unclear how tic suppression is achieved. The current study principally examines how effective suppression is achieved, along with preliminary correlates of the ability to suppress tics. Twelve youths, ages 10 to 17 years, with moderate-to-marked CTDs participated in an alternating sequence of tic-freely and reinforced tic suppression conditions, during which urge intensity and tic frequency were frequently assessed. Tics were half as likely to occur following high-intensity urges during tic suppression (31%) as following low-intensity urges during tic-freely conditions (60%). Age was not associated with ability to suppress. Intelligence indices were associated with, or trended toward, greater ability to suppress tics. Attention difficulties were not associated with ability to suppress but were associated with tic severity. In contrast to our "selective suppression" hypothesis, we found participants equally capable of suppressing their tics regardless of urge intensity during reinforced tic suppression. Tic suppression was achieved with an "across-the-board" effort to resist urges. Preliminary data suggest that ability to suppress may be associated with general cognitive variables rather than age, tic severity, urge severity, or attention. Treatment-naive youths appear to possess a capacity for robust tic suppression. TSBTs may bolster these capacities and/or enable their broader implementation, resulting in symptom improvement. © The Author(s) 2014.
Energy Technology Data Exchange (ETDEWEB)
Barrera, Manuel, E-mail: manuel.barrera@uca.es [Escuela Superior de Ingeniería, University of Cadiz, Avda, Universidad de Cadiz 10, 11519 Puerto Real, Cadiz (Spain); Suarez-Llorens, Alfonso [Facultad de Ciencias, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cadiz (Spain); Casas-Ruiz, Melquiades; Alonso, José J.; Vidal, Juan [CEIMAR, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cádiz (Spain)
2017-05-11
A generic theoretical methodology for the calculation of the efficiency of gamma spectrometry systems is introduced in this work. The procedure is valid for any type of source and detector and can be applied to determine the full-energy-peak and the total efficiency of any source-detector system. The methodology is based on the idea of an underlying probability of detection, which describes the physical model for the detection of the gamma radiation in the particular situation studied. This probability depends explicitly on the direction of the gamma radiation, and this dependence allows the development of more realistic and complex models than the traditional models based on point-source integration. The probability function employed in practice must reproduce the relevant characteristics of the detection process occurring in the particular situation studied. Once the probability is defined, the efficiency calculations can be performed in general by using numerical methods. Monte Carlo integration is especially useful for performing the calculations when complex probability functions are used. The methodology can be used for the direct determination of the efficiency and also for the calculation of corrections that require this determination of the efficiency, as is the case for coincidence-summing, geometric or self-attenuation corrections. In particular, we have applied the procedure to obtain some of the classical self-attenuation correction factors usually employed to correct for the sample attenuation of cylindrical-geometry sources. The methodology clarifies the theoretical basis and approximations associated with each factor by making explicit the probability which is generally hidden and implicit in each model. It has been shown that most of these self-attenuation correction factors can be derived by using a common underlying probability, having this probability a growing level of complexity as it reproduces more precisely
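A direction-dependent detection probability and its Monte Carlo integration can be sketched for an isotropic point source and an idealized detector subtending a cone; the geometry and the probability function below are invented stand-ins for the physical models discussed above:

```python
import math
import random

random.seed(3)
N = 200_000
theta_max = math.radians(30.0)  # detector subtends a 30-degree half-angle cone

def p_detect(theta):
    """Toy direction-dependent detection probability: full efficiency on-axis,
    falling off linearly toward the cone edge, zero outside the cone."""
    return 1.0 - theta / theta_max if theta < theta_max else 0.0

# Monte Carlo over isotropic emission directions: cos(theta) uniform on [-1, 1].
acc = 0.0
for _ in range(N):
    theta = math.acos(random.uniform(-1.0, 1.0))
    acc += p_detect(theta)
efficiency = acc / N

# Analytic check for this toy model:
# efficiency = (1/2) * integral_0^tm (1 - th/tm) sin(th) dth
tm = theta_max
analytic = 0.5 * ((1 - math.cos(tm)) - (math.sin(tm) - tm * math.cos(tm)) / tm)
print(efficiency, analytic)
```

For realistic detectors, p_detect(theta) has no closed form and the angular integral gains attenuation and geometry factors, which is exactly when the Monte Carlo route pays off.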
Problems in probability theory, mathematical statistics and theory of random functions
Sveshnikov, A A
1979-01-01
Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim
Protein single-model quality assessment by feature-based probability density functions.
Cao, Renzhi; Cheng, Jianlin
2016-04-04
Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method-Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob.
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-excited structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
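The classical spectral representation underlying the approach above can be sketched with the standard library. Note the difference from the paper: here each phase is drawn independently (the full high-dimensional case), whereas the paper's random-function constraint expresses all phases through two elementary random variables. The target spectrum is an invented toy choice:

```python
import math
import random

random.seed(4)

def psd(w):
    """Toy one-sided power spectral density (invented for illustration)."""
    return 1.0 / (1.0 + w ** 2)

# Spectral representation: a sum of cosines with random phases,
# X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k).
wmax, K = 10.0, 256
dw = wmax / K
ws = [(k + 0.5) * dw for k in range(K)]           # midpoint frequencies
amps = [math.sqrt(2.0 * psd(w) * dw) for w in ws]
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in ws]

def sample_path(t):
    return sum(a * math.cos(w * t + p) for a, w, p in zip(amps, ws, phases))

# The process variance sum(a^2 / 2) should match the integral of the PSD,
# which is arctan(10) for this toy spectrum.
var = sum(a * a / 2.0 for a in amps)
print(var, math.atan(wmax))
print(sample_path(0.0))
```

Each call to sample_path with a fresh set of phases yields one sample function; the paper's contribution is generating a small, representative ensemble of such paths by constraining the phases, so that PDEM can propagate the full probability information.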
1. Plant functional traits provide a mechanistic basis for understanding ecological variation among plant species and the implications of this variation for species distribution, community assembly and restoration. 2. The bulk of our functional trait understanding, however, is centered on traits rel...
Exponential functionals of Brownian motion, I: Probability laws at fixed time
Matsumoto, Hiroyuki; Yor, Marc
2005-01-01
This paper is the first part of our survey on various results about the distribution of exponential type Brownian functionals defined as an integral over time of geometric Brownian motion. Several related topics are also mentioned.
International Nuclear Information System (INIS)
Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.
1992-10-01
The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed
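A minimal sketch of the distribution-fitting and goodness-of-fit step, using scipy in place of the BMDP and STATGRAPHICS packages and a synthetic stand-in for the fitted M-D parameter estimates:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Stand-in for estimates of one M-D parameter (one value per creep test);
# the real values would come from fitting the model to WIPP salt data
samples = rng.lognormal(mean=1.0, sigma=0.4, size=60)

candidates = {"lognormal": stats.lognorm, "weibull": stats.weibull_min,
              "normal": stats.norm}

results = {}
for name, dist in candidates.items():
    params = dist.fit(samples)                             # maximum likelihood
    ks_stat, p_val = stats.kstest(samples, dist.cdf, args=params)
    results[name] = (ks_stat, p_val)

# Smallest Kolmogorov-Smirnov statistic = best relative goodness of fit
# (note: using fitted parameters makes the nominal p-values optimistic)
best = min(results, key=lambda k: results[k][0])
```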
St. Clair, Caryn; Norwitz, Errol R.; Woensdregt, Karlijn; Cackovic, Michael; Shaw, Julia A.; Malkus, Herbert; Ehrenkranz, Richard A.; Illuzzi, Jessica L.
2011-01-01
We sought to define the risk of neonatal respiratory distress syndrome (RDS) as a function of both lecithin/sphingomyelin (L/S) ratio and gestational age. Amniotic fluid L/S ratio data were collected from consecutive women undergoing amniocentesis for fetal lung maturity at Yale-New Haven Hospital from January 1998 to December 2004. Women were included in the study if they delivered a live-born, singleton, nonanomalous infant within 72 hours of amniocentesis. The probability of RDS was modeled using multivariate logistic regression with L/S ratio and gestational age as predictors. A total of 210 mother-neonate pairs (8 RDS, 202 non-RDS) met criteria for analysis. Both gestational age and L/S ratio were independent predictors of RDS. A probability of RDS of 3% or less was noted at an L/S ratio cutoff of ≥3.4 at 34 weeks, ≥2.6 at 36 weeks, ≥1.6 at 38 weeks, and ≥1.2 at term. Under 34 weeks of gestation, the prevalence of RDS was so high that a probability of 3% or less was not observed by this model. These data describe a means of stratifying the probability of neonatal RDS using both gestational age and the L/S ratio and may aid in clinical decision making concerning the timing of delivery. PMID:18773379
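A hedged sketch of the modeling step: multivariate logistic regression of RDS on gestational age and L/S ratio. The cohort and coefficients below are entirely synthetic assumptions (the actual model was fit to the Yale-New Haven data).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Synthetic cohort (illustrative only): RDS risk falls with both
# gestational age (weeks) and L/S ratio
n = 400
ga = rng.uniform(32.0, 40.0, n)
ls = rng.uniform(1.0, 4.0, n)
p_true = 1.0 / (1.0 + np.exp(-(25.0 - 0.55 * ga - 1.2 * ls)))
y = (rng.uniform(size=n) < p_true).astype(float)

X = np.column_stack([np.ones(n), ga, ls])

def neg_loglik(beta):
    z = X @ beta
    # Bernoulli log-likelihood with logit link, numerically stable form
    return np.sum(np.logaddexp(0.0, z) - y * z)

beta = minimize(neg_loglik, np.zeros(3), method="BFGS").x

def p_rds(ga_weeks, ls_ratio):
    # Predicted probability of RDS for a given gestational age and L/S ratio
    z = beta[0] + beta[1] * ga_weeks + beta[2] * ls_ratio
    return 1.0 / (1.0 + np.exp(-z))
```

Inverting `p_rds` at a target probability (e.g. 3%) for each gestational age would reproduce the kind of L/S cutoff table described in the abstract.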
An Improvement to DCPT: The Particle Transfer Probability as a Function of Particle's Age
International Nuclear Information System (INIS)
L. Pan; G. S. Bodvarsson
2001-01-01
Multi-scale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. The dual-continuum particle tracker (DCPT) is an attractive method for modeling large-scale problems typically encountered in the field, such as those in the unsaturated zone (UZ) of Yucca Mountain, Nevada. Its major advantage is its capability to capture the major features of flow and transport in fractured porous rock (i.e., a fast fracture sub-system combined with a slow matrix sub-system) with reasonable computational resources. However, like other conventional dual-continuum approach-based numerical methods, DCPT (v1.0) is often criticized for failing to capture the transient features of the diffusion depth into the matrix. It may overestimate the transport of tracers through the fractures, especially in cases with large fracture spacing, and predict artificial early breakthroughs. The objective of this study is to develop a new theory for calculating the particle transfer probability that captures the transient features of the diffusion depth into the matrix within the framework of the dual-continuum random walk particle method (RWPM)
Cieplak, Agnieszka; Slosar, Anze
2018-01-01
The Lyman-alpha forest has become a powerful cosmological probe at intermediate redshift. It is a highly non-linear field with much information present beyond the power spectrum. The flux probability distribution function (PDF) in particular has been a successful probe of small-scale physics. However, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which can bias the estimators. Here we argue that measuring the coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values, as is commonly done. Since the n-th Legendre coefficient can be expressed as a linear combination of the first n moments of the field, the coefficients can be measured in the presence of noise, and there is a clear route towards marginalization over the mean flux. Additionally, in the presence of noise, a finite number of these coefficients are well measured, with a very sharp transition into noise dominance. This compresses the information into a small number of well-measured quantities. Finally, we find that measuring fewer quasars with high signal-to-noise yields a higher amount of recoverable information.
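The coefficient/moment relationship can be illustrated with a toy PDF. The density, grid, and truncation order below are assumptions for illustration only; the point is that each coefficient is an integral of the PDF against a Legendre polynomial, i.e. a linear combination of moments.

```python
import numpy as np
from numpy.polynomial import legendre

# Toy "flux PDF" on [-1, 1] (a real forest PDF on [0, 1] would be rescaled):
# p(x) = 3/4 (1 - x^2), whose exact expansion is P0/2 - P2/2
xs = np.linspace(-1.0, 1.0, 2001)
dx = xs[1] - xs[0]
pdf = 0.75 * (1.0 - xs ** 2)

# n-th Legendre coefficient: c_n = (2n + 1)/2 * integral of p(x) P_n(x) dx
coeffs = []
for n in range(7):
    Pn = legendre.legval(xs, [0.0] * n + [1.0])   # P_n on the grid
    coeffs.append((2 * n + 1) / 2.0 * np.sum(pdf * Pn) * dx)

# The truncated expansion reconstructs the PDF
recon = legendre.legval(xs, coeffs)
max_err = float(np.max(np.abs(recon - pdf)))
```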
International Nuclear Information System (INIS)
Tutnov, A.; Alexeev, E.
2001-01-01
The 'PULSAR-2' and 'PULSAR+' codes make it possible to simulate the thermo-mechanical and thermo-physical parameters of WWER fuel elements. A probabilistic approach is used instead of the traditional deterministic one to carry out a sensitivity study of fuel element behavior under the steady-state operation mode. The initial parameters of the fuel elements are given as probability density distributions. Calculations are provided for all possible combinations of initial data such as fuel-cladding gap, fuel density and gas pressure. By dividing the values of these parameters into intervals, the final variants for calculation are obtained. The interval of permissible fuel-cladding gap sizes was divided into 10 equal parts, and those of fuel density and gas pressure into 5 parts each. The probability of realizing each variant is determined by multiplying the probabilities of the separate parameters, because the tolerances of these parameters are distributed independently. Simulation results are presented as probabilistic bar charts. The charts present the probability distributions of the change in fuel outer diameter, hoop stress kinetics and fuel temperature versus irradiation time. A normative safety factor is introduced to check whether any criterion is met and to determine the margin to criterion failure. A probabilistic analysis of fuel element behavior under a Reactivity Initiating Accident (RIA) is also performed, and the probability of fuel element depressurization under a hypothetical RIA is presented
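The variant-enumeration and probability-multiplication step can be sketched as follows; the per-interval probabilities are hypothetical placeholders, not the actual tolerance distributions.

```python
import itertools

# Hypothetical discretization: each input parameter is split into intervals,
# each interval carrying its own probability (summing to 1 per parameter)
gap_probs = [0.1] * 10                       # 10 equal fuel-cladding gap intervals
density_probs = [0.1, 0.2, 0.4, 0.2, 0.1]    # 5 fuel-density intervals
pressure_probs = [0.15, 0.2, 0.3, 0.2, 0.15] # 5 gas-pressure intervals

# One variant per combination of intervals; independence of the tolerances
# means the variant probability is the product of the interval probabilities
variants = [g * d * p for g, d, p in
            itertools.product(gap_probs, density_probs, pressure_probs)]

total = sum(variants)   # should equal 1 up to rounding
```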
Cognitive Functioning and the Probability of Falls among Seniors in Havana, Cuba
Trujillo, Antonio J.; Hyder, Adnan A.; Steinhardt, Laura C.
2011-01-01
This article explores the connection between cognitive functioning and falls among seniors (greater than or equal to 60 years of age) in Havana, Cuba, after controlling for observable characteristics. Using the SABE (Salud, Bienestar, and Envejecimiento) cross-sectional database, we used an econometric strategy that takes advantage of available…
Directory of Open Access Journals (Sweden)
S. J. Azimzadeh
2017-03-01
Full Text Available Introduction In agricultural ecosystems, organic fertilizers play an important role in sustainable agricultural production. In this regard, Sajjadi Nik et al (2011) reported that capsule number per sesame plant increased with vermicompost combined with nitroxin biofertilizer inoculation, so that the highest capsule number per plant (124.7) was observed with 10 t/ha vermicompost with nitroxin inoculation. Seyyedi and Rezvani Moghaddam (2011) reported that seed number per plant and thousand kernel weight under 80 t/ha mushroom compost increased 2.98- and 1.56-fold in comparison with the control. In another experiment, Kato and Yamagishi (2011) reported that the seed yield of wheat under application of manure at 80 t/ha/year for more than 10 years showed a significant increase from 725 to 885 g/m2 in comparison with application of nitrogen fertilizer at a rate of 204 kg/ha. In another study, Rezvani Moghaddam et al (2010) reported that the highest (74.08) and lowest (60.94) seed numbers per capsule in sesame were obtained in the cow manure and control treatments, respectively. The aim of this experiment was to evaluate the effects of municipal waste compost, vermicompost and cow manure fertilizers, in comparison with chemical fertilizer, on yield and yield components of canola under deficit and full irrigation. Materials and Methods In order to evaluate the probability of replacing chemical fertilizers with organic fertilizers in canola cultivation, an experiment was conducted at the research farm of the Mashhad Faculty of Agriculture in 2013. Treatments were fertilizer and irrigation. Irrigation treatments included full and deficit irrigation. Fertilizer treatments included municipal waste compost, vermicompost, manure and chemical fertilizer. The chemical fertilizer included nitrogen and phosphorus. The experiment was conducted as a split plot in a randomized complete block design with three replications. Organic
Chowdhury, Shakhawat
2013-05-01
The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including, human health risks from disinfection by-products (R), disinfection performance (D), and cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual status with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts, which are then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can impart subjective biases in the overall evaluation. In this research, an approach has been introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using the probability density functions (PDF). The cumulative probabilities for different segments of these factors were determined from the cumulative density function, which were then assigned as the BPA for these factors. A case study is presented to demonstrate the application of PDF in DST to evaluate a WTP, leading to the selection of the required level of upgradation for the WTP.
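A minimal sketch of deriving basic probability assignments (BPA) from a PDF rather than from expert judgment: the cumulative probability over each segment of a factor's range becomes that segment's BPA. The normal distribution, the segment boundaries, and the handling of residual mass below are illustrative assumptions.

```python
from scipy import stats

# Hypothetical factor "R" (health risk) characterized by a normal PDF;
# its range is divided into three segments (low / medium / high)
risk = stats.norm(loc=5.0, scale=1.5)
segments = [(0.0, 4.0), (4.0, 6.0), (6.0, 10.0)]

# BPA for each segment = cumulative probability over the segment,
# computed from the CDF instead of being assigned subjectively
bpa = [float(risk.cdf(b) - risk.cdf(a)) for a, b in segments]

# Mass falling outside [0, 10]; one option is to treat it as "unknown"
residual = 1.0 - sum(bpa)
```

The same construction, repeated for D and C, yields the evidence bodies that Dempster's rule then combines.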
Huang, N. E.; Long, S. R.; Bliven, L. F.; Tung, C.-C.
1984-01-01
On the basis of the mapping method developed by Huang et al. (1983), an analytic expression for the non-Gaussian joint probability density function of slope and elevation for nonlinear gravity waves is derived. Various conditional and marginal density functions are also obtained through the joint density function. The analytic results are compared with a series of carefully controlled laboratory observations, and good agreement is noted. Furthermore, the laboratory wind wave field observations indicate that the capillary or capillary-gravity waves may not be the dominant components in determining the total roughness of the wave field. Thus, the analytic results, though derived specifically for the gravity waves, may have more general applications.
Luxton, Gary; Keall, Paul J; King, Christopher R
2008-01-07
To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within approximately 0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD(50), and conversely m and TD(50) are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d(ref), n, v(eff) and the Niemierko equivalent uniform dose (EUD), where d(ref) and v(eff) are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.
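The LKB machinery the abstract builds on can be sketched as follows. The paper's single-parameter exponential substitute for the Lyman probit is not reproduced here (its exact form is given in the paper); the code shows the standard Lyman NTCP and Niemierko EUD it is designed to approximate. The organ parameters and dose distribution below are illustrative.

```python
import numpy as np
from math import erf, sqrt

def lyman_ntcp(dose, td50, m):
    # Lyman NTCP for uniform irradiation: probit in t = (D - TD50)/(m * TD50)
    t = (dose - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

def eud(doses, volumes, n):
    # Niemierko generalized EUD with volume parameter n (exponent a = 1/n):
    # EUD = (sum_i v_i d_i^(1/n))^n for fractional volumes v_i
    v = np.asarray(volumes, dtype=float)
    d = np.asarray(doses, dtype=float)
    return float(np.sum(v * d ** (1.0 / n)) ** n)

# LKB idea: NTCP of a non-uniform dose = Lyman NTCP evaluated at the EUD
doses = [60.0, 40.0, 20.0]      # Gy per sub-volume (illustrative)
vols = [0.2, 0.3, 0.5]          # fractional volumes, summing to 1
ntcp = lyman_ntcp(eud(doses, vols, n=0.5), td50=55.0, m=0.15)
```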
International Nuclear Information System (INIS)
Luxton, Gary; Keall, Paul J; King, Christopher R
2008-01-01
To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within ∼0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD 50 , and conversely m and TD 50 are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d ref , n, v eff and the Niemierko equivalent uniform dose (EUD), where d ref and v eff are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data
Directory of Open Access Journals (Sweden)
Panpan Zhao
2017-05-01
Full Text Available This study investigates the sensitivity and uncertainty of hydrological drought frequency and severity in the Weihe Basin, China during 1960–2012, by using six commonly used univariate probability distributions and three Archimedean copulas to fit the marginal and joint distributions of drought characteristics. The Anderson-Darling method is used for testing the goodness-of-fit of the univariate models, and the Akaike information criterion (AIC) is applied to select the best distribution and copula functions. The results demonstrate that there is a very strong correlation between drought duration and drought severity at the three stations. The drought return period varies depending on the selected marginal distributions and copula functions and, as the return period increases, the differences become larger. In addition, the estimated return periods (both co-occurrence and joint) from the best-fitted copulas are the closest to those from the empirical distribution. Therefore, it is critical to select the appropriate marginal distribution and copula function to model hydrological drought frequency and severity. The results of this study can not only help drought investigations select a suitable probability distribution and copula function, but are also useful for regional water resource management. However, a few limitations remain in this study, such as the assumption of stationarity of the runoff series.
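The AIC-based selection of a marginal distribution can be sketched as follows, with synthetic drought durations and a reduced candidate set (the copula-fitting step is omitted). The gamma data and the fixed-at-zero location are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
durations = rng.gamma(shape=2.0, scale=3.0, size=200)  # toy drought durations

candidates = {
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(durations, floc=0)       # fix location at zero for stability
    loglik = float(np.sum(dist.logpdf(durations, *params)))
    k = len(params) - 1                        # loc is fixed, not estimated
    aic[name] = 2 * k - 2 * loglik             # AIC = 2k - 2 ln L

best = min(aic, key=aic.get)                   # smallest AIC wins
```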
Lin, Yi-Shin; Heinke, Dietmar; Humphreys, Glyn W
2015-04-01
In this study, we applied Bayesian-based distributional analyses to examine the shapes of response time (RT) distributions in three visual search paradigms, which varied in task difficulty. In further analyses we investigated two common observations in visual search-the effects of display size and of variations in search efficiency across different task conditions-following a design that had been used in previous studies (Palmer, Horowitz, Torralba, & Wolfe, Journal of Experimental Psychology: Human Perception and Performance, 37, 58-71, 2011; Wolfe, Palmer, & Horowitz, Vision Research, 50, 1304-1311, 2010) in which parameters of the response distributions were measured. Our study showed that the distributional parameters in an experimental condition can be reliably estimated by moderate sample sizes when Monte Carlo simulation techniques are applied. More importantly, by analyzing trial RTs, we were able to extract paradigm-dependent shape changes in the RT distributions that could be accounted for by using the EZ2 diffusion model. The study showed that Bayesian-based RT distribution analyses can provide an important means to investigate the underlying cognitive processes in search, including stimulus grouping and the bottom-up guidance of attention.
Optimization of functionalization conditions for protein analysis by AFM
Energy Technology Data Exchange (ETDEWEB)
Arroyo-Hernández, María, E-mail: maria.arroyo@ctb.upm.es [Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, 28223 Pozuelo de Alarcón, Madrid (Spain); Departamento de Ciencia de Materiales, ETSI Caminos, Canales y Puertos, Universidad Politécnica de Madrid, 28040 Madrid (Spain); Daza, Rafael; Pérez-Rigueiro, Jose; Elices, Manuel; Nieto-Márquez, Jorge; Guinea, Gustavo V. [Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, 28223 Pozuelo de Alarcón, Madrid (Spain); Departamento de Ciencia de Materiales, ETSI Caminos, Canales y Puertos, Universidad Politécnica de Madrid, 28040 Madrid (Spain)
2014-10-30
Highlights: • Highest fluorescence is obtained for central conditions. • Largest primary amine contribution is obtained for central conditions. • RMS roughness is smaller than 1 nm for all functional films. • Selected deposition conditions lead to proper RMS and functionality values. • LDH proteins adsorbed on AVS-films were observed by AFM. - Abstract: Activated vapor silanization (AVS) is used to functionalize silicon surfaces through deposition of amine-containing thin films. AVS combines vapor silanization and chemical vapor deposition techniques and allows the properties of the functionalized layers (thickness, amine concentration and topography) to be controlled by tuning the deposition conditions. An accurate characterization is performed to correlate the deposition conditions and functional-film properties. In particular, it is shown that smooth surfaces with a sufficient surface density of amine groups may be obtained with this technique. These surfaces are suitable for the study of proteins with atomic force microscopy.
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
Energy Technology Data Exchange (ETDEWEB)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
2018-01-01
We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d + 1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
Liang, Yingjie; Chen, Wen
2018-04-01
The mean squared displacement (MSD) of traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous time random walk model has been employed to characterize this ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variation as the asymptotic waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, whose special case includes the traditional logarithmic ultraslow diffusion model. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function, and is observed to increase even more slowly than that of the logarithmic model. The occurrence of very long waiting times has the largest probability in the case of the inverse Mittag-Leffler function, compared with the power-law model and the logarithmic model. Monte Carlo simulations of the one-dimensional sample path of a single particle are also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting general ultraslow random motion.
International Nuclear Information System (INIS)
Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.; Fitzgerald, R.
2010-01-01
Numerical values of charged-particle thermonuclear reaction rates for nuclei in the A=14 to 40 region are tabulated. The results are obtained using a method, based on Monte Carlo techniques, that has been described in the preceding paper of this issue (Paper I). We present a low rate, median rate and high rate which correspond to the 0.16, 0.50 and 0.84 quantiles, respectively, of the cumulative reaction rate distribution. The meaning of these quantities is in general different from the commonly reported, but statistically meaningless expressions, 'lower limit', 'nominal value' and 'upper limit' of the total reaction rate. In addition, we approximate the Monte Carlo probability density function of the total reaction rate by a lognormal distribution and tabulate the lognormal parameters μ and σ at each temperature. We also provide a quantitative measure (Anderson-Darling test statistic) for the reliability of the lognormal approximation. The user can implement the approximate lognormal reaction rate probability density functions directly in a stellar model code for studies of stellar energy generation and nucleosynthesis. For each reaction, the Monte Carlo reaction rate probability density functions, together with their lognormal approximations, are displayed graphically for selected temperatures in order to provide a visual impression. Our new reaction rates are appropriate for bare nuclei in the laboratory. The nuclear physics input used to derive our reaction rates is presented in the subsequent paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
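For a lognormal rate, rate = exp(N) with N ~ Normal(mu, sigma), the tabulated low/median/high rates correspond to exp(mu - sigma), exp(mu) and exp(mu + sigma), since these are the ~0.16, 0.50 and ~0.84 quantiles. A quick check with illustrative (not tabulated) parameters:

```python
import numpy as np
from scipy import stats

# Illustrative lognormal parameters for one temperature, not tabulated values
mu, sigma = -30.0, 0.25

low = np.exp(mu - sigma)      # ~0.16 quantile ("low rate")
median = np.exp(mu)           # 0.50 quantile ("median rate")
high = np.exp(mu + sigma)     # ~0.84 quantile ("high rate")

# Cross-check against scipy's lognormal quantile function; the 0.16-type
# quantile is exactly the CDF value of a standard normal at -1
dist = stats.lognorm(s=sigma, scale=np.exp(mu))
q16 = float(dist.ppf(stats.norm.cdf(-1.0)))
```

The ratio high/median = exp(sigma) also gives the usual "factor uncertainty" of the rate.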
Energy Technology Data Exchange (ETDEWEB)
Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)
2016-10-15
This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.
International Nuclear Information System (INIS)
Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo
2016-01-01
This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage
Directory of Open Access Journals (Sweden)
Jong Kyeom Lee
2016-10-01
Full Text Available This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.
International Nuclear Information System (INIS)
Halenka, J.; Olchawa, W.
2005-01-01
From experiments, see e.g. [W. Wiese, D. Kelleher, and D. Paquette, Phys. Rev. A 6, 1132 (1972); V. Helbig and K. Nich, J. Phys. B 14, 3573 (1981); J. Halenka, Z. Phys. D 16, 1 (1990); Djurovic, D. Nikolic, I. Savic, S. Sorge, and A.V. Demura, Phys. Rev. E 71, 036407 (2005)], it follows that the hydrogen lines formed in plasma with N_e ≳ 10^16 cm^-3 are asymmetrical. The inhomogeneity of the ionic micro field and the higher-order corrections (quadratic and beyond) in perturbation theory are the reason for this asymmetry. So far, the ion-emitter quadrupole interaction and the quadratic Stark effect have been included in calculations. Recent work shows that a significant discrepancy between calculations and measurements occurs in the wings of the H-beta line in plasmas with cm^-3. It should be stressed here that, e.g., for the energy operator the correction raised by the quadratic Stark effect is proportional to (where is the emitter-perturber distance), similarly to the correction caused by the emitter-perturber octupole interaction and the quadratic correction from the emitter-perturber quadrupole interaction. Thus, it is obvious that a model of the profile calculation is consistent if all the aforementioned corrections are simultaneously included. Such calculations are planned in a future paper. A statistics of the octupole inhomogeneity tensor in a plasma is needed as the first step of such calculations. For the first time, the distribution functions of the octupole inhomogeneity have been calculated in this paper using the Mayer-Mayer cluster expansion method, similarly as for the quadrupole function in the paper [J. Halenka, Z. Phys. D 16, 1 (1990)]. The quantity is the reduced scale of the micro field strength, where is the Holtsmark normal field and is the mean distance defined by the relationship, which is approximately equal to the mean ion-ion distance; whereas is the screening parameter, where is the electronic Debye radius. (author)
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram is expressed in terms of the
Analysis of Cleaning Process for Several Kinds of Soil by Probability Density Functional Method.
Fujimoto, Akihiro; Tanaka, Terumasa; Oya, Masaru
2017-10-01
A method of analyzing the detergency of various soils by assuming normal distributions for the soil adhesion and soil removal forces was developed by considering the relationship between the soil type and the distribution profile of the soil removal force. The effect of the agitation speed on soil removal was also analyzed by this method. Washing test samples were prepared by soiling fabrics with individual soils such as particulate soils, oily dyes, and water-soluble dyes. Washing tests were conducted using a Terg-O-Tometer and four repetitive washing cycles of 5 min each. The transition of the removal efficiencies was recorded in order to calculate the mean value (μ_rl) and the standard deviation (σ_rl) of the removal strength distribution. The level of detergency and the temporal change in the detergency can be represented by μ_rl and σ_rl, respectively. A smaller σ_rl indicates a smaller increase in the detergency with time, which also indicates the existence of a certain amount of soil with a strong adhesion force. As a general trend, the values of σ_rl were greatest for the oily soils, followed by the water-soluble soils and the particulate soils. The relationship between the soil removal processes and the soil adhesion force was expressed on the basis of the transition of the distribution of residual soil. Evaluation of the effects of the agitation speed on μ_rl and σ_rl showed that σ_rl was not affected by the agitation speed; the value of μ_rl for solid soil and oily soil increased with increasing agitation, while the μ_rl of water-soluble soil was not specifically affected by the agitation speed. It can be assumed that the parameter σ_rl is related to the characteristics of the soil and the adhesion condition, and can be applied to estimating the soil removal mechanism.
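One way to read the model is that soil whose removal strength (normally distributed with mean μ_rl and spread σ_rl) falls below the washing stress reached so far is removed. A hypothetical sketch under that reading; the parameter values and stress-per-cycle values are invented, not from the paper.

```python
from scipy import stats

# Hypothetical removal-strength distribution for one soil type
mu_rl, sigma_rl = 1.0, 0.4
strength = stats.norm(mu_rl, sigma_rl)

def cumulative_removal(applied_stress):
    # Fraction of soil removed once the wash has reached `applied_stress`:
    # the normal CDF of the removal-strength distribution
    return float(strength.cdf(applied_stress))

# Repeated 5-min cycles modeled as a slowly rising effective stress
stresses = [0.8, 1.0, 1.1, 1.15]    # one value per washing cycle
removal_by_cycle = [cumulative_removal(s) for s in stresses]
```

A larger σ_rl spreads the CDF out, which is consistent with the abstract's observation that detergency then keeps rising across cycles.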
Pure state condition for the semi-classical Wigner function
International Nuclear Information System (INIS)
Ozorio de Almeida, A.M.
1982-01-01
The Wigner function W(p,q) is a symmetrized Fourier transform of the density matrix ρ(q_1,q_2), representing quantum-mechanical states or their statistical mixtures in phase space. Distinguishing between these two alternatives in the case of density matrices depends on the projection identity ρ² = ρ; its Wigner correspondence is the pure state condition. This criterion is applied to the Wigner functions obtained from standard semiclassical wave functions, identifying as pure states those whose classical invariant tori satisfy the generalized Bohr-Sommerfeld conditions. Superpositions of eigenstates are then examined, and it is found that the Wigner functions corresponding to Gaussian random wave functions are smoothed out in the manner of mixed-state Wigner functions. Attention is also given to the pure-state condition in the case where an angular coordinate is used. (orig.)
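For orientation, the pure-state criterion mentioned in this abstract has a compact standard form: with the normalization ∫W dp dq = 1, the Wigner transcription of the projection identity reads (standard textbook result, stated here for reference, with ⋆ the Moyal star product):

```latex
% Pure-state condition in Wigner form: the star-square of W reproduces W,
% and the phase-space integral of W^2 fixes the purity Tr(rho^2).
\rho^{2} = \rho
\;\Longleftrightarrow\;
W \star W = \frac{1}{2\pi\hbar}\, W ,
\qquad
\int W(p,q)^{2}\, dp\, dq
  = \frac{1}{2\pi\hbar}\operatorname{Tr}\rho^{2}
  = \frac{1}{2\pi\hbar}\quad\text{(pure state)} .
```

A mixed state has Tr ρ² < 1, so its Wigner function fails this equality — which is the sense in which the Gaussian random wave functions discussed above behave like mixed states.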
Directory of Open Access Journals (Sweden)
Chaojiao Sun
2016-01-01
Full Text Available An adaptive neural control scheme is proposed for nonaffine nonlinear systems without using the implicit function theorem or the mean value theorem. The differentiability conditions on the nonaffine nonlinear functions are removed. The control-gain function is modeled even though the nonaffine function may be nondifferentiable. Furthermore, only a semibounded condition on the nonaffine nonlinear function is required in the proposed method, and the basic idea of invariant set theory is constructively introduced to cope with the difficulty in the control design for nonaffine nonlinear systems. It is rigorously proved that all the closed-loop signals are bounded and that the tracking error converges to a small residual set asymptotically. Finally, simulation examples are provided to demonstrate the effectiveness of the designed method.
Hand-related physical function in rheumatic hand conditions
DEFF Research Database (Denmark)
Klokker, Louise; Terwee, Caroline; Wæhrens, Eva Elisabet Ejlersen
2016-01-01
INTRODUCTION: There is no consensus about what constitutes the most appropriate patient-reported outcome measurement (PROM) instrument for measuring physical function in patients with rheumatic hand conditions. Existing instruments lack psychometric testing and vary in feasibility and their psychometric qualities. We aim to develop a PROM instrument to assess hand-related physical function in rheumatic hand conditions. METHODS AND ANALYSIS: We will perform a systematic search to identify existing PROMs for rheumatic hand conditions, and select items relevant for hand-related physical function, as well as those items from the Patient Reported Outcomes Measurement Information System (PROMIS) Physical Function (PF) item bank that are relevant to patients with rheumatic hand conditions. Selection will be based on consensus among reviewers. Content validity of selected items will be established…
Hand-related physical function in rheumatic hand conditions
DEFF Research Database (Denmark)
Klokker, Louise; Terwee, Caroline B; Wæhrens, Eva Ejlersen
2016-01-01
INTRODUCTION: There is no consensus about what constitutes the most appropriate patient-reported outcome measurement (PROM) instrument for measuring physical function in patients with rheumatic hand conditions. Existing instruments lack psychometric testing and vary in feasibility and their psychometric qualities. We aim to develop a PROM instrument to assess hand-related physical function in rheumatic hand conditions. METHODS AND ANALYSIS: We will perform a systematic search to identify existing PROMs for rheumatic hand conditions, and select items relevant for hand-related physical function, as well as those items from the Patient Reported Outcomes Measurement Information System (PROMIS) Physical Function (PF) item bank that are relevant to patients with rheumatic hand conditions. Selection will be based on consensus among reviewers. Content validity of selected items will be established…
International Nuclear Information System (INIS)
Cao Hong-Jun; Zhang Hui-Qiang; Lin Wen-Yi
2012-01-01
Four kinds of presumed probability-density-function (PDF) models for non-premixed turbulent combustion are evaluated in flames with various stoichiometric mixture fractions by using large eddy simulation (LES). The LES code is validated against the experimental data of a classical turbulent jet flame (Sandia flame D). The mean and rms temperatures obtained by the presumed PDF models are compared with the LES results. The β-function model achieves a good prediction for different flames. The rms temperature predicted by the double-δ function model is very small and unphysical in the vicinity of the maximum mean temperature. The clipped-Gaussian model and the multi-δ function model give worse predictions on the extremely fuel-rich or fuel-lean side due to the clipping at the boundary of the mixture fraction space. The results also show that the overall prediction performance of the presumed PDF models is better at moderate stoichiometric mixture fractions than at very small or very large ones.
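The β-function presumed PDF named above is fully determined by the mean and variance of the mixture fraction, via the standard moment-matching relations a = Z̄γ, b = (1 − Z̄)γ with γ = Z̄(1 − Z̄)/var − 1. A minimal sketch (the numeric values of the mean and variance are illustrative assumptions, not from the paper):

```python
from math import gamma

def beta_pdf_params(zbar, zvar):
    # Shape parameters of the presumed beta-PDF from the mean and variance
    # of the mixture fraction (requires 0 < zvar < zbar * (1 - zbar)).
    g = zbar * (1.0 - zbar) / zvar - 1.0
    return zbar * g, (1.0 - zbar) * g

def beta_pdf(z, a, b):
    # Beta distribution density on (0, 1).
    norm = gamma(a + b) / (gamma(a) * gamma(b))
    return norm * z ** (a - 1.0) * (1.0 - z) ** (b - 1.0)

a, b = beta_pdf_params(zbar=0.3, zvar=0.02)

# Midpoint-rule check: the density integrates to ~1 with mean ~0.3.
n = 100000
dz = 1.0 / n
zs = [(i + 0.5) * dz for i in range(n)]
total = sum(beta_pdf(z, a, b) for z in zs) * dz
mean = sum(z * beta_pdf(z, a, b) for z in zs) * dz
```

In a flamelet code the same density would then weight the flamelet temperature and species profiles to produce Favre-averaged quantities; the clipping issue mentioned in the abstract arises only for models that put mass exactly at z = 0 or z = 1.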
Gómez Toledo, Verónica; Gutiérrez Farfán, Ileana; Verduzco-Mendoza, Antonio; Arch-Tirado, Emilio
Tinnitus is defined as the conscious perception of a sensation of sound that occurs in the absence of an external stimulus. This audiological symptom affects 7% to 19% of the adult population. The aim of this study is to describe the associated comorbidities present in patients with tinnitus using joint and conditional probability analysis. Patients of both genders, diagnosed with unilateral or bilateral tinnitus, aged between 20 and 45 years, and with a full computerised medical record, were selected. Study groups were formed on the basis of the following clinical aspects: 1) audiological findings; 2) vestibular findings; 3) comorbidities such as temporomandibular dysfunction, tubal dysfunction, and otosclerosis; and 4) triggering factors of tinnitus, such as noise exposure, respiratory tract infection, and the use of ototoxic agents and/or drugs. Of the patients with tinnitus, 27 (65%) reported hearing loss, 11 (26.19%) temporomandibular dysfunction, and 11 (26.19%) vestibular disorders. When performing the joint probability analysis, it was found that the probability of a patient with tinnitus having hearing loss was 27/42 = 0.65, and 20/42 = 0.47 for the bilateral type. The result was P(A ∩ B) = 30%. Bayes' theorem, P(Ai|B) = P(Ai ∩ B)/P(B), was used, and various probabilities were calculated. For patients with temporomandibular dysfunction and vestibular disorders, a posterior probability of P(Ai|B) = 31.44% was calculated. Consideration should be given to the joint and conditional probability approach as a tool for the study of different pathologies. Copyright © 2016 Academia Mexicana de Cirugía A.C. Publicado por Masson Doyma México S.A. All rights reserved.
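The joint and conditional probability calculation used above is just the definition P(A|B) = P(A ∩ B)/P(B) applied to cross-tabulated patient counts. A minimal sketch — the totals mirror the figures quoted in the abstract (42 tinnitus patients, 27 with hearing loss, 11 with vestibular disorders), but the joint count is a hypothetical value for illustration:

```python
def conditional(p_joint, p_b):
    """P(A|B) = P(A and B) / P(B) -- definition of conditional probability."""
    if p_b == 0:
        raise ValueError("conditioning event has zero probability")
    return p_joint / p_b

n_total = 42   # tinnitus patients (from the abstract)
n_hl = 27      # with hearing loss            -> P(A) = 27/42
n_vest = 11    # with vestibular disorders    -> P(B) = 11/42
n_both = 4     # hypothetical joint count (NOT reported in the abstract)

p_a = n_hl / n_total
p_b = n_vest / n_total
p_ab = n_both / n_total
posterior = conditional(p_ab, p_b)  # P(hearing loss | vestibular disorder)
```

The same two lines of arithmetic reproduce any of the posterior probabilities in the study once the corresponding joint count is known.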
Evaluation of the divided attention condition during functional analyses.
Fahmie, Tara A; Iwata, Brian A; Harper, Jill M; Querim, Angie C
2013-01-01
A common condition included in most functional analyses (FAs) is the attention condition, in which the therapist ignores the client by engaging in a solitary activity (antecedent event) but delivers attention to the client contingent on problem behavior (consequent event). The divided attention condition is similar, except that the antecedent event consists of the therapist conversing with an adult confederate. We compared the typical and divided attention conditions to determine whether behavior in general (Study 1) and problem behavior in particular (Study 2) were more sensitive to one of the test conditions. Results showed that the divided attention condition resulted in faster acquisition or more efficient FA results for 2 of 9 subjects, suggesting that the divided attention condition could be considered a preferred condition when resources are available. © Society for the Experimental Analysis of Behavior.
Harmonic Function of Poincare Cone Condition In Solving Dirichlet ...
African Journals Online (AJOL)
Harmonic Function of Poincare Cone Condition In Solving Dirichlet Problem. ... Journal of the Nigerian Association of Mathematical Physics ... theorem, the Dirichlet problem and the maximum principle, where we conclude that sums, differences and scalar multiples of harmonic functions are again harmonic.
Boundary conditions for quasiclassical Green's function for superfluid Fermi systems
International Nuclear Information System (INIS)
Nagai, K.; Hara, J.
1988-01-01
The authors show that the quasiclassical Green's function for Fermi liquids can be constructed from the solutions of the Bogoliubov-de Gennes equation within the Andreev approximation, and derive self-consistent relations to be satisfied by the quasiclassical Green's function at the surfaces. The so-called normalization condition for the quasiclassical Green's function is obtained from this self-consistent relation. They consider a specularly reflecting wall, a randomly rippled wall, and a proximity boundary as model surfaces. Their boundary condition for the randomly rippled wall is different from those derived by Buchholtz and Rainer and by Buchholtz.
International Nuclear Information System (INIS)
de Jong, F.; Malfliet, R.
1991-01-01
Starting from a relativistic Lagrangian we derive a ''conserving'' approximation for the description of nuclear matter. We show this to be a nontrivial extension of the relativistic Dirac-Brueckner scheme. The calculated saturation point of the equation of state agrees very well with the empirical saturation point. The conserving character of the approach is tested by means of the Hugenholtz-van Hove theorem. We find the theorem fulfilled very well around saturation. A new value for the compression modulus is derived, K = 310 MeV. We also calculate the occupation probabilities at normal nuclear matter densities by means of the spectral function. The average depletion κ of the Fermi sea is found to be κ ≈ 0.11
International Nuclear Information System (INIS)
Beshtoev, Kh.M.
2006-01-01
I have considered three-neutrino vacuum transitions and oscillations in the general case and obtained expressions for neutrino wave functions in three cases: with CP violation, without CP violation, and in the case when direct ν_e - ν_τ transitions are absent, β(θ_13) = 0 (some works indicate this possibility). Then, using the existing experimental data, some analysis has been carried out. This analysis has definitely shown that direct ν_e - ν_τ transitions cannot be absent for the solar neutrinos, i.e., β(θ_13) ≠ 0. It is also shown that the possibility that β(θ_13) = 0 cannot be realized by using the mechanism of resonance enhancement of neutrino oscillations in matter (the Sun). It was found that the probability of ν_e - ν_e neutrino transitions is a positive definite value, if neutrino oscillations really take place, only if the angle of ν_e, ν_τ mixing is β ≤ 15 - 17 deg
International Nuclear Information System (INIS)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J.
2012-01-01
Greenhouse gas (CO₂, CH₄ and N₂O, hereinafter GHG) and criteria air pollutant (CO, NOₓ, VOC, PM₁₀, PM₂.₅ and SOₓ, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.
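Fitting a best-fit statistical curve to emission-factor data, as described above, can be sketched in a few lines. The example below fits a lognormal PDF (a common choice for strictly positive emission factors; the specific distribution families used in GREET are not stated in this abstract) by fitting a normal to the log-transformed data; the sample values are synthetic, for illustration only:

```python
import math
import random
import statistics

# Synthetic "measured" emission factors in g/kWh (illustrative, not GREET data).
random.seed(0)
samples = [random.lognormvariate(math.log(500.0), 0.2) for _ in range(5000)]

# A lognormal fit reduces to a normal fit of the log-transformed data.
logs = [math.log(x) for x in samples]
mu, sigma = statistics.fmean(logs), statistics.stdev(logs)

def lognormal_pdf(x, mu, sigma):
    # Density of the fitted lognormal distribution, for use in Monte Carlo
    # sampling of life-cycle emissions.
    return math.exp(-((math.log(x) - mu) ** 2) / (2 * sigma ** 2)) / (
        x * sigma * math.sqrt(2 * math.pi))

median = math.exp(mu)  # central estimate recovered from the fit
```

The fitted (mu, sigma) pair then fully parameterizes the uncertainty of the emission factor in downstream life-cycle calculations.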
Energy Technology Data Exchange (ETDEWEB)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)
2012-07-06
Greenhouse gas (CO₂, CH₄ and N₂O, hereinafter GHG) and criteria air pollutant (CO, NOₓ, VOC, PM₁₀, PM₂.₅ and SOₓ, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.
International Nuclear Information System (INIS)
Hosoma, Takashi
2017-01-01
In the previous research (JAEA-Research 2015-009), essentials of neutron multiplicity counting mathematics were reconsidered, taking into account experience obtained at the Plutonium Conversion Development Facility, and formulae of the multiplicity distribution were algebraically derived up to septuplets using a probability generating function, to make a strategic move in the future. The principle was reported by K. Böhnel in 1985, but such a high-order expansion was the first case due to its increasing complexity. In this research, characteristics of the high-order correlations were investigated. It was found that higher-order correlation increases rapidly in response to the increase of leakage multiplication, and crosses and leaves lower-order correlations behind when the leakage multiplication is > 1.3, which depends on detector efficiency and counter setting. In addition, fission rates and doubles count rates by fast neutrons and by thermal neutrons in a system where they coexist were algebraically derived, again using a probability generating function. The principle was reported by I. Pázsit and L. Pál in 2012, but such a physical interpretation, i.e. associating their stochastic variables with the fission rate, the doubles count rate and the leakage multiplication, is the first case. From the Rossi-alpha combined distribution and the measured ratio of each area obtained by Differential Die-Away Self-Interrogation (DDSI) and conventional assay data, it is possible to estimate: the number of induced fissions per unit time by fast neutrons and by thermal neutrons; the number of induced fissions (< 1) by one source neutron; and the individual doubles count rates. During the research, a hypothesis introduced in their report was proved to be true. Provisional calculations were done for UO₂ of 1-10 kgU containing ~0.009 wt% ²⁴⁴Cm. (author)
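The probability-generating-function machinery referred to above rests on the fact that the reduced factorial moments of a multiplicity distribution fall out of derivatives of its PGF at z = 1: ν_r = G^(r)(1)/r! = Σ_ν C(ν, r) p(ν), with ν_1 driving the singles rate and ν_2 the doubles rate. A minimal sketch — the distribution p(ν) below is invented for illustration and is not real nuclear data:

```python
from math import comb

def pgf(p, z):
    # Probability generating function G(z) = sum_v p[v] * z**v.
    return sum(pv * z ** v for v, pv in enumerate(p))

def factorial_moment(p, order):
    # Reduced factorial moment nu_r = G^(r)(1)/r! = sum_v C(v, r) * p[v];
    # nu_1 feeds the singles rate, nu_2 the doubles rate in multiplicity counting.
    return sum(comb(v, order) * pv for v, pv in enumerate(p))

# Illustrative multiplicity distribution p(v) for v = 0..4 (NOT measured data).
p = [0.05, 0.25, 0.40, 0.25, 0.05]
nu1 = factorial_moment(p, 1)  # mean neutrons per fission
nu2 = factorial_moment(p, 2)  # mean number of neutron pairs per fission
```

Böhnel-style derivations compose such PGFs (source distribution with the induced-fission cascade) and read the correlated count rates off the resulting derivatives; the rapid growth of high-order correlations with leakage multiplication reflects the repeated composition.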
Venturi, D.; Karniadakis, G. E.
2012-08-01
By using functional integral methods we determine new evolution equations satisfied by the joint response-excitation probability density function (PDF) associated with the stochastic solution to first-order nonlinear partial differential equations (PDEs). The theory is presented both for fully nonlinear and for quasilinear scalar PDEs subject to random boundary conditions, random initial conditions or random forcing terms. Particular applications are discussed for the classical linear and nonlinear advection equations and for the advection-reaction equation. By using a Fourier-Galerkin spectral method we obtain numerical solutions of the proposed response-excitation PDF equations. These numerical solutions are compared against those obtained by using more conventional statistical approaches such as probabilistic collocation and multi-element probabilistic collocation methods. It is found that the response-excitation approach yields accurate predictions of the statistical properties of the system. In addition, it allows one to ascertain directly the tails of probabilistic distributions, thus facilitating the assessment of rare events and associated risks. The computational cost of the response-excitation method is orders of magnitude smaller than that of more conventional statistical approaches if the PDE is subject to high-dimensional random boundary or initial conditions. The question of high dimensionality for evolution equations involving multidimensional joint response-excitation PDFs is also addressed.
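The baseline the paper compares against — sampling-based estimation of a response PDF — is easy to sketch for the linear advection equation, one of the applications named above. For u_t + a u_x = 0 with random initial amplitude u(x, 0) = A sin(x), A ~ Uniform(0, 1), the exact solution u(x, t) = A sin(x − a t) makes the one-point response a simple transformation of A, so its PDF can be estimated by Monte Carlo (all parameter values below are illustrative assumptions):

```python
import math
import random

# Monte Carlo estimate of the one-point response PDF for linear advection
# with a random initial amplitude. Exact solution: u(x, t) = A*sin(x - a*t).
random.seed(1)
a, x, t = 1.0, 1.0, 0.5
s = math.sin(x - a * t)  # deterministic factor at this (x, t); here s > 0

samples = [random.random() * s for _ in range(200000)]

# Histogram-based PDF estimate on [0, s]. Since u = A*s with A uniform on
# (0, 1), the exact response PDF is the constant 1/s on (0, s).
nbins = 20
width = s / nbins
counts = [0] * nbins
for u in samples:
    counts[min(int(u / width), nbins - 1)] += 1
pdf = [c / (len(samples) * width) for c in counts]
```

The point of the response-excitation PDF equations is to obtain this density (and its tails) deterministically, without the sampling noise visible in the histogram estimate.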
International Nuclear Information System (INIS)
Benndorf, Matthias
2012-01-01
Bayes' theorem has proven to be one of the cornerstones in medical decision making. It allows for the derivation of post-test probabilities, which in case of a positive test result become positive predictive values. If several test results are observed successively Bayes' theorem may be used with assumed conditional independence of test results or with incorporated conditional dependencies. Herein it is examined whether radiographic image features should be considered conditionally independent diagnostic tests when post-test probabilities are to be derived. For this purpose the mammographic mass dataset from the UCI (University of California, Irvine) machine learning repository is analysed. It comprises the description of 961 (516 benign, 445 malignant) mammographic mass lesions according to the BI-RADS (Breast Imaging: Reporting and Data System) lexicon. Firstly, an exhaustive correlation matrix is presented for mammography BI-RADS features among benign and malignant lesions separately; correlation can be regarded as measure for conditional dependence. Secondly, it is shown that the derived positive predictive values for the conjunction of the two features “irregular shape” and “spiculated margin” differ significantly depending on whether conditional dependencies are incorporated into the decision process or not. It is concluded that radiographic image features should not generally be regarded as conditionally independent diagnostic tests.
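The core point of the abstract — that post-test probabilities differ depending on whether features are treated as conditionally independent — can be made concrete with a toy calculation. The class sizes below come from the abstract, but every feature frequency is an invented illustration, not a statistic from the UCI mammography dataset:

```python
# Toy illustration (invented frequencies, NOT the UCI data) of how the
# positive predictive value changes when two image features are treated as
# conditionally independent versus when their joint frequency is used.
n_mal, n_ben = 445, 516  # class sizes from the abstract
prior_mal = n_mal / (n_mal + n_ben)

# Hypothetical per-class feature frequencies:
p_shape_mal, p_shape_ben = 0.60, 0.15  # P(irregular shape | class)
p_marg_mal, p_marg_ben = 0.55, 0.10    # P(spiculated margin | class)
# Hypothetical joint frequencies (features correlated within each class):
p_both_mal, p_both_ben = 0.45, 0.05

def ppv(lik_mal, lik_ben):
    # Bayes' theorem: P(malignant | findings) from class-conditional likelihoods.
    num = lik_mal * prior_mal
    return num / (num + lik_ben * (1.0 - prior_mal))

# Naive (conditionally independent) likelihoods multiply per-feature terms;
# the dependence-aware version uses the joint frequencies directly.
ppv_indep = ppv(p_shape_mal * p_marg_mal, p_shape_ben * p_marg_ben)
ppv_joint = ppv(p_both_mal, p_both_ben)
```

With these hypothetical numbers the naive product overstates the PPV relative to the joint-frequency calculation, which is exactly the kind of discrepancy the study quantifies on real BI-RADS data.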
Quantization conditions and functional equations in ABJ(M) theories
International Nuclear Information System (INIS)
Grassi, Alba; Marino, Marcos; Hatsuda, Yasuyuki
2014-12-01
The partition function of ABJ(M) theories on the three-sphere can be regarded as the canonical partition function of an ideal Fermi gas with a non-trivial Hamiltonian. We propose an exact expression for the spectral determinant of this Hamiltonian, which generalizes recent results obtained in the maximally supersymmetric case. As a consequence, we find an exact WKB quantization condition determining the spectrum which is in agreement with numerical results. In addition, we investigate the factorization properties and functional equations for our conjectured spectral determinants. These functional equations relate the spectral determinants of ABJ theories with consecutive ranks of gauge groups but the same Chern-Simons coupling.
Correlation functions for Hermitian many-body systems: Necessary conditions
International Nuclear Information System (INIS)
Brown, E.B.
1994-01-01
Lee [Phys. Rev. B 47, 8293 (1993)] has shown that the odd-numbered derivatives of the Kubo autocorrelation function vanish at t=0. We show that this condition is based on a more general property of nondiagonal Kubo correlation functions. This general property provides that certain functional forms (e.g., simple exponential decay) are not admissible for any symmetric or antisymmetric Kubo correlation function in a Hermitian many-body system. Lee's result emerges as a special case of this result. Applications to translationally invariant systems and systems with rotational symmetries are also demonstrated
Chowdhury, Snehaunshu; Boyette, Wesley; Roberts, William L.
2017-01-01
In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating
Thermal conditions and functional requirements for molten fuel containment
International Nuclear Information System (INIS)
Kang, C.S.; Torri, A.
1980-05-01
This paper discusses the configuration and functional requirements for the molten fuel containment system (MFCS) in the GCFR demonstration plant design. Meltdown conditions following a loss of shutdown cooling (LOSC) accident were studied to define the core debris volume for a realistic meltdown case. Materials and thicknesses of the molten fuel container were defined. Stainless steel was chosen as the sacrificial material and magnesium oxide was chosen as the crucible material. Thermal conditions for an expected quasi-steady state were analyzed. Highlights of the functional requirements which directly affect the MFCS design are discussed
Subramoni, Sujatha; Florez Salcedo, Diana Vanessa; Suarez-Moreno, Zulma R.
2015-01-01
LuxR solo transcriptional regulators contain both an autoinducer binding domain (ABD; N-terminal) and a DNA binding Helix-Turn-Helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distributions as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of distribution of all ABD containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed regarding their taxonomical distribution, predicted functions of neighboring genes and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes which are not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of invariant amino acid residues of ABD questioning their binding to AHLs. In summary, this study provides a detailed overview of distribution of LuxR solos and their probable roles in bacteria with genome sequence information. PMID:25759807
Subramoni, Sujatha; Florez Salcedo, Diana Vanessa; Suarez-Moreno, Zulma R
2015-01-01
LuxR solo transcriptional regulators contain both an autoinducer binding domain (ABD; N-terminal) and a DNA binding Helix-Turn-Helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distributions as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of distribution of all ABD containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed regarding their taxonomical distribution, predicted functions of neighboring genes and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes which are not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of invariant amino acid residues of ABD questioning their binding to AHLs. In summary, this study provides a detailed overview of distribution of LuxR solos and their probable roles in bacteria with genome sequence information.
Directory of Open Access Journals (Sweden)
Sujatha eSubramoni
2015-02-01
Full Text Available LuxR solo transcriptional regulators contain both an autoinducer binding domain (ABD; N-terminal) and a DNA binding Helix-Turn-Helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distributions as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of distribution of all ABD containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed regarding their taxonomical distribution, predicted functions of neighboring genes and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes which are not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of invariant amino acid residues of ABD questioning their binding to AHLs. In summary, this study provides a detailed overview of distribution of LuxR solos and their probable roles in bacteria with genome sequence information.
Flórez-Salamanca, Ludwing; Secades-Villa, Roberto; Budney, Alan J; García-Rodríguez, Olaya; Wang, Shuai; Blanco, Carlos
2013-09-01
This study aims to estimate the odds and predictors of cannabis use disorder (CUD) relapse among individuals in remission. Analyses were done on the subsample of individuals with a lifetime history of a CUD (abuse or dependence) who were in full remission at baseline (Wave 1) of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) (n=2350). Univariate logistic regression models and a hierarchical logistic regression model were implemented to estimate the odds of relapse and identify predictors of relapse at the 3-year follow-up (Wave 2). The relapse rate of CUD was 6.63% over an average follow-up period of 3.6 years. In the multivariable model, the odds of relapse were inversely related to time in remission, whereas having a history of conduct disorder or a major depressive disorder after Wave 1 increased the risk of relapse. Our findings suggest that maintenance of remission is the most common outcome for individuals in remission from a CUD. Treatment approaches may improve rates of sustained remission for individuals with CUD and conduct disorder or major depressive disorder. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Conditionally exponential convex functions on locally compact groups
International Nuclear Information System (INIS)
Okb El-Bab, A.S.
1992-09-01
The main results of the thesis are: 1) The construction of a compact base for the convex cone of all conditionally exponential convex functions. 2) The determination of the extreme parts of this cone. Some supplementary lemmas are proved for this purpose. (author). 8 refs
Effects of drying conditions on the physicochemical and functional ...
African Journals Online (AJOL)
This study aimed to investigate the effect of drying conditions (freeze drying and hot-air oven drying at 40 and 60°C) on the physicochemical and functional properties of red and yellow-fleshed watermelon rind flour. In comparison among the drying processes used in this study, the freeze drying method ...
Conditional mode regression: Application to functional time series prediction
Dabo-Niang, Sophie; Laksaci, Ali
2008-01-01
We consider $\\alpha$-mixing observations and deal with the estimation of the conditional mode of a scalar response variable $Y$ given a random variable $X$ taking values in a semi-metric space. We provide a convergence rate in $L^p$ norm of the estimator. A useful and typical application to functional times series prediction is given.
International Nuclear Information System (INIS)
Liu, L.H.; Xu, X.; Chen, Y.L.
2004-01-01
The laminar flamelet equations in combination with the joint probability density function (PDF) transport equation of mixture fraction and turbulence frequency have been used to simulate turbulent jet diffusion flames. To check the suitability of the presumed shapes of the PDF for the modeling of turbulence-radiation interactions (TRI), two types of presumed joint PDFs are constructed by using the second-order moments of temperature and the species concentrations, which are derived by the laminar flamelet model. The time-averaged radiative source terms and the time-averaged absorption coefficients are calculated by the presumed joint PDF approaches, and compared with those obtained by the laminar flamelet model. By comparison, it is shown that there are obvious differences between the results of the independent PDF approach and the laminar flamelet model. Generally, the results of the dependent PDF approach agree better with those of the flamelet model. For the modeling of TRI, the dependent PDF approach is superior to the independent PDF approach
Energy Technology Data Exchange (ETDEWEB)
Qin, X.; Zhang, S. D. [Qufu Normal University, Qufu (China)
2014-12-15
The six doublet and the two quartet electronic states ({sup 2}Σ{sup +}(2), {sup 2}Σ{sup -}, {sup 2}Π(2), {sup 2}Δ, {sup 4}Σ{sup -}, and {sup 4}Π) of the OH radical have been studied using the multi-reference configuration interaction (MRCI) method, where the Davidson correction, core-valence interaction and relativistic effects are considered with the large basis sets aug-cc-pv5z, aug-cc-pcv5z, and cc-pv5z-DK, respectively. Potential energy curves (PECs) and dipole moment functions are also calculated for these states for internuclear distances ranging from 0.05 nm to 0.80 nm. All possible vibrational levels and rotational constants for the bound states X{sup 2}Π and A{sup 2}Σ{sup +} of OH are predicted by numerically solving the radial Schroedinger equation through the Level program, and spectroscopic parameters, which are in good agreement with experimental results, are obtained. Transition dipole moments between the ground state X{sup 2}Π and other excited states are also computed using MRCI, and the transition probability, lifetime, and Franck-Condon factors for the A{sup 2}Σ{sup +} - X{sup 2}Π transition are discussed and compared with existing experimental values.
Directory of Open Access Journals (Sweden)
Vedenyapin Aleksandr Dmitrievich
2015-11-01
by the function r(x), k_min: P_{n,k} = C_n^k [P(r(x), k_min)]^k [1 - P(r(x), k_min)]^{n-k}. We defined the probability that the total index change will be in the section [x, nS] as the sum of the probabilities of incompatible events for which the number of successes satisfies the entered inequality. Then we analogously defined the probability that the total index change will be in the interval [nI, x). Then the function F(x) was introduced, defined on the whole line, which is identical to this sum (the probability that the total index change will be in the interval (nI, nS]) and is identical to zero and one on the complements. Some properties of the distribution function F(x) are satisfied automatically. A sufficient condition for monotonicity is presented in the form of Theorem 2.
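The binomial sum described above can be written directly. This is a generic sketch of summing P_{n,k} = C(n,k) p^k (1-p)^(n-k) over a range of success counts; the success probability p stands in for the paper's P(r(x), k_min) and is not tied to its specific r(x).

```python
from math import comb

def prob_successes_in(n: int, p: float, k_lo: int, k_hi: int) -> float:
    """P(k_lo <= K <= k_hi) for K ~ Binomial(n, p): a sum of probabilities
    of incompatible events P_{n,k} = C(n,k) p^k (1-p)^(n-k)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_lo, k_hi + 1))
```

Summing over the full range 0..n returns 1, which is the normalization property the distribution function F(x) inherits automatically.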
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data.These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
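For the classical (non-generalized) one-sample case, a P-P plot reduces to comparing the hypothesized CDF against empirical probabilities. A sketch under an assumed standard normal model; the sample size and seed are arbitrary:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.sort(rng.normal(size=200))

# Classical P-P plot coordinates: hypothesized CDF vs empirical probabilities.
theo = norm.cdf(x)
emp = (np.arange(1, x.size + 1) - 0.5) / x.size

# Under the null hypothesis the points (theo, emp) hug the diagonal;
# the maximum deviation is a Kolmogorov-Smirnov-type statistic.
max_dev = np.max(np.abs(emp - theo))
```

The generalized plots of the paper index over the class of closed intervals rather than the half-lines used here.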
Wang, Dingbao
2018-01-01
Following the Budyko framework, soil wetting ratio (the ratio between soil wetting and precipitation) as a function of soil storage index (the ratio between soil wetting capacity and precipitation) is derived from the SCS-CN method and the VIC type of model. For the SCS-CN method, soil wetting ratio approaches one when soil storage index approaches infinity, due to the limitation of the SCS-CN method in which the initial soil moisture condition is not explicitly represented. However, for the ...
DEFF Research Database (Denmark)
Yura, Harold; Hanson, Steen Grüner
2012-01-01
with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
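The transformation described, filtering white noise into a colored Gaussian field and then pushing it through the Gaussian CDF and the target law's inverse CDF, can be sketched as follows. The exponential filter kernel and the exponential target marginal are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(1)

# 1) White Gaussian noise filtered to an assumed spectral shape -> colored Gaussian.
white = rng.normal(size=4096)
kernel = np.exp(-np.arange(20) / 5.0)
colored = np.convolve(white, kernel, mode="same")
colored = (colored - colored.mean()) / colored.std()

# 2) The Gaussian CDF maps the colored field to correlated uniforms; the
#    inverse CDF of the target law imposes the desired marginal distribution.
u = norm.cdf(colored)
samples = expon.ppf(u)
```

The correlation structure of the Gaussian stage is inherited (approximately) by the transformed field, which is what makes this an engineering approach rather than an exact one.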
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Aldars-García, Laila; Ramos, Antonio J; Sanchis, Vicente; Marín, Sonia
2015-10-01
Human exposure to aflatoxins in foods is of great concern. The aim of this work was to use predictive mycology as a strategy to mitigate the aflatoxin burden in pistachio nuts postharvest. The probability of growth and aflatoxin B1 (AFB1) production of aflatoxigenic Aspergillus flavus, isolated from pistachio nuts, under static and non-isothermal conditions was studied. Four theoretical temperature scenarios, including temperature levels observed in pistachio nuts during shipping and storage, were used. Two types of inoculum were included: a cocktail of 25 A. flavus isolates and a single isolate inoculum. Initial water activity was adjusted to 0.87. Logistic models, with temperature and time as explanatory variables, were fitted to the probability of growth and AFB1 production under a constant temperature. Subsequently, they were used to predict probabilities under non-isothermal scenarios, with levels of concordance from 90 to 100% in most cases. Furthermore, the presence of AFB1 in pistachio nuts could be correctly predicted in 70-81% of the cases from a growth model developed in pistachio nuts, and in 67-81% of the cases from an AFB1 model developed in pistachio agar. The information obtained in the present work could be used by producers and processors to predict the time for AFB1 production by A. flavus on pistachio nuts during transport and storage. Copyright © 2015 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Ullrich, J.; Dangendorf, V.; Dexheimer, K.; Do, K.; Kelbch, C.; Kelbch, S.; Schadt, W.; Schmidt-Boecking, H.; Stiebing, K.E.; Roesel, F.; Trautmann, D.
1986-01-01
For 3.6 MeV He impact, the Lsub(I) and Lsub(III) subshell ionization probabilities of Pt have been measured. Due to relativistic effects in the electron wave functions, the Lsub(I) subshell ionization probability Isub(LI)(b) is strongly enhanced at small impact parameters, exceeding even Isub(LIII)(b), in nice agreement with the SCA theory. (orig.)
Cannon, Alex J.
2018-01-01
Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin
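The univariate quantile mapping that MBCn generalizes can be sketched with empirical CDFs. The function below is a generic illustration; the interpolation scheme and variable names are assumptions, not CanRCM4-specific code.

```python
import numpy as np

def quantile_map(model_hist, obs, model_fut):
    """Empirical univariate quantile mapping: send each future model value
    through the model-historical CDF, then through the inverse observed CDF."""
    # Cumulative probability of each future value under the model-historical CDF.
    q = np.interp(model_fut, np.sort(model_hist),
                  np.linspace(0.0, 1.0, model_hist.size))
    # Inverse empirical CDF of the observations at those probabilities.
    return np.interp(q, np.linspace(0.0, 1.0, obs.size), np.sort(obs))
```

Applying the map to the historical model series itself recovers the observed distribution exactly, which is the property MBCn extends to the full multivariate distribution.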
STUDYING OF FUNCTIONAL CONDITION OF THE SMALL INTESTINE IN CHOLELITHIASIS
Directory of Open Access Journals (Sweden)
Ya. M. Vakhrushev
2015-01-01
Full Text Available Aim. Complex research of the functional condition of the small intestine in different stages of cholelithiasis. Materials and methods. 47 patients with different stages of cholelithiasis were examined. There were 29 patients with the first (prestone) stage and 18 with the second (stone) stage of cholelithiasis. In the assessment of the functional condition of the small intestine, clinical data and the results of load tests with sugars were used. Cavitary digestion was studied by a load test with a polysaccharide (soluble starch), membrane digestion with a disaccharide (sucrose), and absorption with a monosaccharide (glucose). Glucose level in blood was determined on an empty stomach, and then 30, 60 and 120 minutes after oral intake of 50 g of glucose, sucrose or starch. Results. The research showed that most patients with cholelithiasis had disturbances in the clinical and functional condition of the small intestine. In the assessment of cavitary digestion, the level of glycemia was reliably lowered by 43% in the prestone stage and by 66% in the stone stage of cholelithiasis in comparison with controls. In the assessment of membrane digestion, in patients with the stone stage of cholelithiasis the level of glycemia was lowered in comparison with the control group and with the prestone stage by 30% and 19%, respectively. Conclusion. In the prestone stage of cholelithiasis there was primarily a decrease of cavitary digestion, while in the stone stage all stages of the hydrolysis-resorptive process in the small intestine were disturbed.
Psychosocial functioning in adults with congenital craniofacial conditions.
Roberts, R M; Mathias, J L
2012-05-01
To examine the psychosocial functioning of adults with congenital craniofacial conditions relative to normative data. Single sample cross-sectional design. The Australian Craniofacial Unit, Women's and Children's Hospital, Adelaide, which is one of the main craniofacial treatment centers in Australia. Adults (N = 93) with congenital craniofacial conditions (excluding cleft lip/palate) who were treated in the Australian Craniofacial Unit. All participants completed self-report scales assessing health-related quality of life (SF-36); life satisfaction, anxiety, and depression (HADS); self-esteem (Rosenberg); appearance-related concerns; perceived social support; and social anxiety. Overall, participants were very similar in psychosocial function to the general population. However, adults with craniofacial conditions were less likely to be married and have children (females), were more likely to be receiving a disability pension, and reported more appearance-related concerns and less social support from friends. They also reported more limitations in both their social activities, due to physical or emotional problems, and usual role activities, because of emotional problems, as well as poorer mental health. These results give cause for considerable optimism about the long-term outcomes of children who are undergoing treatment for craniofacial conditions, while also identifying specific areas that interventions could target.
International Nuclear Information System (INIS)
Jumarie, Guy
2009-01-01
A probability distribution of fractional (or fractal) order is defined by the measure μ{dx} = p(x)(dx)^α, 0 < α < 1. Combining this definition with the fractional Taylor series f(x + h) = E_α(D_x^α h^α)f(x) provided by the modified Riemann-Liouville definition, one can expand a probability calculus parallel to the standard one. A Fourier transform of fractional order using the Mittag-Leffler function is introduced, together with its inversion formula, and it provides a suitable generalization of the characteristic function of fractal random variables. It appears that the state moments of fractional order are especially relevant. The main properties of this fractional probability calculus are outlined; it is shown that it provides a sound approach to Fokker-Planck equations which are fractional in both space and time, and it provides new results in the information theory of non-random functions.
Chrystal, A.; Heikoop, J. M.; Davis, P.; Syme, J.; Hagerty, S.; Perkins, G.; Larson, T. E.; Longmire, P.; Fessenden, J. E.
2010-12-01
Elevated nitrate (NO3-) concentrations in drinking water pose a health risk to the public. The dual stable isotopic signatures of δ15N and δ18O in NO3- in surface- and groundwater are often used to identify and distinguish among sources of NO3- (e.g., sewage, fertilizer, atmospheric deposition). In oxic groundwaters where no denitrification is occurring, direct calculations of mixing fractions using a mass balance approach can be performed if three or fewer sources of NO3- are present, and if the stable isotope ratios of the source terms are defined. There are several limitations to this approach. First, direct calculations of mixing fractions are not possible when four or more NO3- sources may be present. Simple mixing calculations also rely upon treating source isotopic compositions as a single value; however these sources themselves exhibit ranges in stable isotope ratios. More information can be gained by using a probabilistic approach to account for the range and distribution of stable isotope ratios in each source. Fitting probability density functions (PDFs) to the isotopic compositions for each source term reveals that some values within a given isotopic range are more likely to occur than others. We compiled a data set of dual isotopes in NO3- sources by combining our measurements with data collected through extensive literature review. We fit each source term with a PDF, and show a new method to probabilistically solve multiple component mixing scenarios with source isotopic composition uncertainty. This method is based on a modified use of a tri-linear diagram. First, source term PDFs are sampled numerous times using a variation of stratified random sampling, Latin Hypercube Sampling. For each set of sampled source isotopic compositions, a reference point is generated close to the measured groundwater sample isotopic composition. This point is used as a vertex to form all possible triangles between all pairs of sampled source isotopic compositions
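With three or fewer sources and point-valued end members, the direct mass-balance calculation mentioned above is a 3×3 linear solve: two isotope balances plus the constraint that the fractions sum to one. The end-member compositions below are rough illustrative numbers, not values from the compiled data set; the PDF-based approach of the study exists precisely because real sources span ranges rather than single points.

```python
import numpy as np

# Hypothetical end-member (d15N, d18O) compositions for three NO3- sources.
sources = np.array([[10.0, 2.0],    # sewage/manure (assumed values)
                    [0.0, 0.0],     # synthetic fertilizer (assumed values)
                    [2.0, 60.0]])   # atmospheric deposition (assumed values)
mix = np.array([4.0, 8.0])          # measured groundwater sample

# Rows: d15N balance, d18O balance, mass balance f1 + f2 + f3 = 1.
A = np.vstack([sources.T, np.ones(3)])
b = np.append(mix, 1.0)
fractions = np.linalg.solve(A, b)
```

The Latin Hypercube step in the paper repeats this kind of calculation many times with end members drawn from the fitted source PDFs, yielding a distribution of mixing fractions instead of a single answer.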
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
Falck, Ryan S; Landry, Glenn J; Best, John R; Davis, Jennifer C; Chiu, Bryan K; Liu-Ambrose, Teresa
2017-10-01
Mild cognitive impairment (MCI) represents a transition between normal cognitive aging and dementia and may represent a critical time frame for promoting cognitive health through behavioral strategies. Current evidence suggests that physical activity (PA) and sedentary behavior are important for cognition. However, it is unclear whether there are differences in PA and sedentary behavior between people with probable MCI and people without MCI, or whether the relationships of PA and sedentary behavior with cognitive function differ by MCI status. The aims of this study were to examine differences in PA and sedentary behavior between people with probable MCI and people without MCI and whether associations of PA and sedentary behavior with cognitive function differed by MCI status. This was a cross-sectional study. Physical activity and sedentary behavior in adults dwelling in the community (N = 151; at least 55 years old) were measured using a wrist-worn actigraphy unit. The Montreal Cognitive Assessment was used to categorize participants as having probable MCI. Cognitive function was indexed using the Alzheimer Disease Assessment Scale-Cognitive-Plus (ADAS-Cog Plus). Physical activity and sedentary behavior were compared based on probable MCI status, and relationships of ADAS-Cog Plus with PA and sedentary behavior were examined by probable MCI status. Participants with probable MCI (n = 82) had lower PA and higher sedentary behavior than participants without MCI (n = 69). Higher PA and lower sedentary behavior were associated with better ADAS-Cog Plus performance in participants without MCI (β = -.022 and β = .012, respectively) but not in participants with probable MCI. The diagnosis of MCI was not confirmed with a physician; therefore, this study could not conclude how many of the participants categorized as having probable MCI would actually have been diagnosed with MCI by a physician. Participants with probable MCI were less active
Smooth conditional distribution function and quantiles under random censorship.
Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine
2002-09-01
We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
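A doubly smoothed conditional distribution estimator of the kind discussed, smooth in both the covariate and the response, can be sketched as below. The Gaussian kernel and the bandwidths hx, hy are illustrative choices, and this uncensored sketch omits the Kaplan-Meier-type adjustment the paper's estimators use for right-censored responses.

```python
import numpy as np
from scipy.stats import norm

def smooth_cond_cdf(x0, y_grid, X, Y, hx=0.3, hy=0.3):
    """Doubly smoothed estimate of F(y | x0): Nadaraya-Watson kernel weights
    in the covariate, a smoothed (Gaussian-CDF) indicator in the response."""
    w = norm.pdf((X - x0) / hx)
    w = w / w.sum()
    # For each y, a weighted average of smoothed indicators 1{Y <= y}.
    return np.array([np.sum(w * norm.cdf((y - Y) / hy)) for y in y_grid])
```

A conditional alpha-quantile estimate follows by inverting this curve, e.g. the smallest y on the grid with estimated F(y | x0) >= alpha.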
Grinstead, Charles M; Snell, J Laurie
2011-01-01
This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V
1997-01-01
This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.
Frame conditions for a well-functioning end user market
International Nuclear Information System (INIS)
Livik, Klaus
1997-10-01
The aim of this report is to describe and define the different frame conditions necessary for the development of a well-functioning end user market. The report describes the sharing of roles between end users, grid owners, suppliers, system operators and market operators in the power market, and it points out how the interplay between these roles should be arranged. Particular attention is paid to how to involve the end user relations in the five different market roles during the development of a more active end user market. Products and eventual potentials are described and discussed, based on estimates as well as load measurements. 17 figs., 4 tabs
Lithuanian medical tourism cluster: conditions and background for functioning
Directory of Open Access Journals (Sweden)
Korol A. N.
2017-10-01
Full Text Available As the global economy develops, more and more attention is paid to the creation of tourist clusters, which are extremely important for the economy and national competitiveness. This article analyzes the cluster of medical tourism in Lithuania and explores the conditions for its successful functioning. The creation of the medical tourism cluster is highly influenced by a number of factors: the regulation of tourist and medical services, the level of entrepreneurial activity, human resources, and the experience of partnership. In addition, the article analyzes the structure of the medical tourism cluster and determines the prerequisites for the functioning of the Lithuanian medical tourism cluster, including a wide range of services, European standards for the provision of medical services, high qualification of specialists, etc. In writing the article, the methods of systematic and logical analysis of scientific literature were used.
The Sphagnum microbiome supports bog ecosystem functioning under extreme conditions.
Bragina, Anastasia; Oberauner-Wappis, Lisa; Zachow, Christin; Halwachs, Bettina; Thallinger, Gerhard G; Müller, Henry; Berg, Gabriele
2014-09-01
Sphagnum-dominated bogs represent a unique yet widely distributed type of terrestrial ecosystem and strongly contribute to global biosphere functioning. Sphagnum is colonized by highly diverse microbial communities, but less is known about their function. We identified a high functional diversity within the Sphagnum microbiome applying an Illumina-based metagenomic approach followed by de novo assembly and MG-RAST annotation. An interenvironmental comparison revealed that the Sphagnum microbiome harbours specific genetic features that distinguish it significantly from microbiomes of higher plants and peat soils. The differential traits especially support ecosystem functioning by a symbiotic lifestyle under poikilohydric and ombrotrophic conditions. To realise a plasticity-stability balance, we found abundant subsystems responsible for coping with oxidative and drought stresses, for exchanging (mobile) genetic elements, and genes that encode resistance to detrimental environmental factors as well as repair and self-controlling mechanisms. Multiple microbe-microbe and plant-microbe interactions were also found to play a crucial role, as indicated by diverse genes necessary for biofilm formation, interaction via quorum sensing and nutrient exchange. A high proportion of genes involved in the nitrogen cycle and the recycling of organic material supported the role of bacteria in nutrient supply. 16S rDNA analysis indicated a higher structural diversity than had previously been detected using PCR-dependent techniques. Altogether, the diverse Sphagnum microbiome has the ability to support the life of the host plant and the entire ecosystem under changing environmental conditions. Beyond this, the moss microbiome presents a promising bio-resource for environmental biotechnology - with respect to novel enzymes or stress-protecting bacteria. © 2014 John Wiley & Sons Ltd.
On the Hitting Probability of Max-Stable Processes
Hofmann, Martin
2012-01-01
The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ ℝ with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ ℝ twice, and we give a sufficient condition for a positive probability of this event.
Chowdhury, Snehaunshu
2017-01-23
In this study, we demonstrate the use of a scanning mobility particle sizer (SMPS) as an effective tool to measure the probability density functions (PDFs) of soot nanoparticles in turbulent flames. Time-averaged soot PDFs necessary for validating existing soot models are reported at intervals of Δx/D = 5 along the centerline of turbulent, non-premixed, C2H4/N2 flames. The jet exit Reynolds numbers of the flames investigated were 10,000 and 20,000. A simplified burner geometry based on a published design was chosen to aid modelers. Soot was sampled directly from the flame using a sampling probe with a 0.5-mm diameter orifice and diluted with N2 by a two-stage dilution process. The overall dilution ratio was not evaluated. An SMPS system was used to analyze soot particle concentrations in the diluted samples. Sampling conditions were optimized over a wide range of dilution ratios to eliminate the effect of agglomeration in the sampling probe. Two differential mobility analyzers (DMAs) with different size ranges were used separately in the SMPS measurements to characterize the entire size range of particles. In both flames, the PDFs were found to be mono-modal in nature near the jet exit. Further downstream, the profiles were flatter with a fall-off at larger particle diameters. The geometric mean of the soot size distributions was less than 10 nm for all cases and increased monotonically with axial distance in both flames.
Directory of Open Access Journals (Sweden)
Dong Hyun Cho
2017-01-01
Full Text Available Using a simple formula for conditional expectations over continuous paths, we will evaluate conditional expectations which are types of analytic conditional Fourier-Feynman transforms and conditional convolution products of generalized cylinder functions and the functions in a Banach algebra which is the space of generalized Fourier transforms of the measures on the Borel class of L2[0,T]. We will then investigate their relationships. Particularly, we prove that the conditional transform of the conditional convolution product can be expressed by the product of the conditional transforms of each function. Finally we will establish change of scale formulas for the conditional transforms and the conditional convolution products. In these evaluation formulas and change of scale formulas, we use multivariate normal distributions so that the conditioning function does not contain present positions of the paths.
Directory of Open Access Journals (Sweden)
Farnoosh Basaligheh
2015-12-01
Full Text Available One of the conventional methods for temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands a quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and can be considered a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function for the spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from an under-construction tunnel and the collected data are used to suggest a proper Probability Distribution Function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated and three common tests of goodness of fit were used for the evaluation of each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for spacing between the steel sets. It is also noted that, although the probability distribution function for the two different tunnel sections is the same, the parameters of the PDF for the individual sections are different from each other.
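The candidate-fitting and goodness-of-fit workflow can be sketched with scipy. Note that the Wakeby distribution selected by the paper is not available in `scipy.stats`, so common three-parameter candidates stand in here, and the spacing data are synthetic rather than the surveyed measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic steel-set spacings (metres), standing in for the surveyed data.
spacings = rng.gamma(shape=4.0, scale=0.3, size=120)

candidates = {"gamma": stats.gamma,
              "lognorm": stats.lognorm,
              "weibull_min": stats.weibull_min}

ks_stat = {}
for name, dist in candidates.items():
    params = dist.fit(spacings)                      # maximum-likelihood fit
    ks_stat[name] = stats.kstest(spacings, dist.cdf, args=params).statistic

best = min(ks_stat, key=ks_stat.get)                 # smallest KS deviation
```

In practice one would compare several goodness-of-fit criteria (as the paper does with three tests) rather than relying on the KS statistic alone.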
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
International Nuclear Information System (INIS)
Burgazzi, Luciano
2011-01-01
PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and characterizing the parameters relevant to the passive system performance evaluation, given the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, thus making the results strongly dependent upon the expert elicitation process. This prompts the need for the development of a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of the task is to develop a consistent framework aimed at creating probability distributions for the parameters relevant to the passive system performance evaluation. In order to achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems or which generic data bases or models provide the best information for the system design. Eventually, in the absence of documented specific reliability data, documented expert judgement coming out of a well structured procedure could be used to envisage sound probability distributions for the parameters of interest
Resistance of functional Lactobacillus plantarum strains against food stress conditions.
Ferrando, Verónica; Quiberoni, Andrea; Reinhemer, Jorge; Suárez, Viviana
2015-06-01
The survival of three Lactobacillus plantarum strains (Lp 790, Lp 813 and Lp 998) with functional properties was studied taking into account their resistance to thermal, osmotic and oxidative stress factors. The stress treatments applied were: 52 °C for 15 min (phosphate buffer, pH 7; thermal shock), H2O2 0.1% (w/v) for 30 min (oxidative shock) and NaCl aqueous solution at 17, 25 and 30% (w/v) (room temperature, 1 h; osmotic shock). The osmotic stress was also evaluated on cell growth in MRS broth supplemented with 2, 4, 6, 8 and 10% (w/v) NaCl, during 20 h at 30 °C. The cell thermal adaptation was performed in MRS broth, selecting 45 °C for 30 min as the final conditions for all strains. Two strains (Lp 813 and Lp 998) showed, in general, similar behaviour against the three stress factors, being clearly more resistant than Lp 790. An evident difference in growth kinetics in the presence of NaCl was observed between Lp 998 and Lp 813, Lp 998 showing a higher optical density (OD570nm) than Lp 813 at the end of the assay. The selected thermal adaptation improved the thermal resistance of both strains by 2 log orders, but cell growth in the presence of NaCl was enhanced only in Lp 813. Oxidative resistance was not affected by this thermal pre-treatment. These results demonstrate the relevance of cell technological resistance when selecting presumptive "probiotic" cultures, since different stress factors might considerably affect viability or/and performance of the strains. The incidence of stress conditions on the functional properties of the strains used in this work is currently under research in our group. Copyright © 2014 Elsevier Ltd. All rights reserved.
Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk
2018-05-01
The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI is proposed that considers not only the modified probability distribution parameters but also the return period under the non-stationary process. The results were evaluated for two severe drought cases during the last 10 years in South Korea. The SPIs computed under the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these two past droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the probability distribution wider than before. This implies that drought expressed by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.
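The core of any SPI-type index is mapping precipitation totals through a reference distribution into standard normal z-scores. The operational SPI fits a parametric (often gamma) distribution; the sketch below uses a simpler nonparametric stand-in based on empirical plotting-position probabilities, with toy data, to show the mechanics only (it is not the paper's non-stationary formulation):

```python
from statistics import NormalDist

def empirical_spi(values):
    """Nonparametric SPI-like index: map each precipitation total to a
    z-score through its empirical (Weibull plotting-position)
    probability. The operational SPI instead fits a parametric
    distribution, typically a gamma, to the reference climatology."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    spi = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = rank / (n + 1)                  # Weibull plotting position
        spi[i] = NormalDist().inv_cdf(p)    # probability -> z-score
    return spi

# Stationary SPI references the full record; a non-stationary variant
# would re-estimate the reference distribution over time windows, so a
# widening precipitation distribution changes the resulting z-scores.
precip = [55, 60, 42, 70, 30, 65, 80, 20, 75, 50, 90, 10]  # toy totals
spi_full = empirical_spi(precip)
```

The driest total maps to the most negative index value and the wettest to the most positive, mirroring the SPI convention that negative values indicate drought.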
International Nuclear Information System (INIS)
Harvey, L D Danny
2007-01-01
Article 2 of the United Nations Framework Convention on Climate Change (UNFCCC) calls for stabilization of greenhouse gas (GHG) concentrations at levels that prevent dangerous anthropogenic interference (DAI) in the climate system. Until recently, the consensus viewpoint was that the climate sensitivity (the global mean equilibrium warming for a doubling of atmospheric CO2 concentration) was 'likely' to fall between 1.5 and 4.5 K. However, a number of recent studies have generated probability distribution functions (pdfs) for climate sensitivity with the 95th percentile of the expected climate sensitivity as large as 10 K, while some studies suggest that the climate sensitivity is likely to fall in the lower half of the long-standing 1.5-4.5 K range. This paper examines the allowable CO2 concentration as a function of the 95th percentile of the climate sensitivity pdf (ranging from 2 to 8 K) and for the following additional assumptions: (i) a 50th percentile for the pdf of the minimum sustained global mean warming that causes unacceptable harm equal to 1.5 or 2.5 K; and (ii) 1%, 5% or 10% allowable risks of unacceptable harm. For a 1% risk tolerance and the more stringent harm-threshold pdf, the allowable CO2 concentration ranges from 323 to 268 ppmv as the 95th percentile of the climate sensitivity pdf increases from 2 to 8 K, while for a 10% risk tolerance and the less stringent harm-threshold pdf, the allowable CO2 concentration ranges from 531 to 305 ppmv. In both cases it is assumed that non-CO2 GHG radiative forcing can be reduced to half of its present value; otherwise, the allowable CO2 concentration is even smaller. Accounting for the fact that the CO2 concentration will gradually fall if emissions are reduced to zero, and that peak realized warming will then be less than the peak equilibrium warming (related to peak radiative forcing), allows the CO2 concentration to peak at 10-40 ppmv higher than the limiting values given above for a climate
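The basic inversion behind such calculations can be sketched deterministically. Assuming the standard logarithmic CO2 forcing relation and treating the sensitivity value at a chosen pdf percentile as given, the allowable concentration follows directly; this is a heavily simplified illustration (it ignores non-CO2 forcing, harm-threshold uncertainty, and transient effects, all of which the paper treats):

```python
import math

def allowable_co2(harm_threshold_K, sensitivity_K, c_preind=280.0):
    """Allowable CO2 concentration (ppmv) keeping equilibrium warming
    at or below harm_threshold_K, for a given climate sensitivity
    (warming per CO2 doubling).

    Inverts the logarithmic forcing relation
        dT = S * ln(C / C0) / ln(2)
    for C. A risk-based version would insert the sensitivity at the
    chosen percentile of its pdf (e.g. the 95th for a ~5% risk)."""
    return c_preind * 2.0 ** (harm_threshold_K / sensitivity_K)

# As the upper-percentile sensitivity grows, allowable CO2 shrinks:
limits = [allowable_co2(2.0, s) for s in (2.0, 4.5, 8.0)]
```

The monotonic decrease of the limit with increasing sensitivity reproduces the qualitative behaviour reported in the abstract (stricter limits for fatter upper tails of the sensitivity pdf).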
Kim, Jeonglae; Pope, Stephen B.
2014-05-01
A turbulent lean-premixed propane-air flame stabilised by a triangular cylinder as a flame-holder is simulated to assess the accuracy and computational efficiency of combined dimension reduction and tabulation of chemistry. The computational condition matches the Volvo rig experiments. For the reactive simulation, the Lagrangian Large-Eddy Simulation/Probability Density Function (LES/PDF) formulation is used. A novel two-way coupling approach between LES and PDF is applied to obtain resolved density to reduce its statistical fluctuations. Composition mixing is evaluated by the modified Interaction-by-Exchange with the Mean (IEM) model. A baseline case uses In Situ Adaptive Tabulation (ISAT) to calculate chemical reactions efficiently. Its results demonstrate good agreement with the experimental measurements in turbulence statistics, temperature, and minor species mass fractions. For dimension reduction, 11 and 16 represented species are chosen and a variant of Rate Controlled Constrained Equilibrium (RCCE) is applied in conjunction with ISAT to each case. All the quantities in the comparison are indistinguishable from the baseline results using ISAT only. The combined use of RCCE/ISAT reduces the computational time for chemical reaction by more than 50%. However, for the current turbulent premixed flame, chemical reaction takes only a minor portion of the overall computational cost, in contrast to non-premixed flame simulations using LES/PDF, presumably due to the restricted manifold of purely premixed flame in the composition space. Instead, composition mixing is the major contributor to cost reduction since the mean-drift term, which is computationally expensive, is computed for the reduced representation. Overall, a reduction of more than 15% in the computational cost is obtained.
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Velazquez, Sergio
2008-01-01
Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Rₐ²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Rₐ² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Rₐ² increases
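The static method described above is just a numerical integral of the wind-speed density times the power curve. A minimal sketch, assuming a Weibull wind regime and an idealized power curve with a cubic ramp between cut-in and rated speed (the cut-in/rated/cut-out speeds and the 330 kW rating are illustrative, not the paper's turbine data):

```python
import math

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def power_curve(v, cut_in=3.5, rated_v=13.0, cut_out=25.0, p_rated=330.0):
    """Idealized power curve in kW; cubic ramp between cut-in and
    rated speed. All turbine parameters here are illustrative."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_v:
        return p_rated
    return p_rated * (v ** 3 - cut_in ** 3) / (rated_v ** 3 - cut_in ** 3)

def mean_power(k, c, v_max=30.0, dv=0.01):
    """Static method: trapezoidal integration of pdf(v) * P(v)."""
    total = 0.0
    steps = int(v_max / dv)
    for i in range(steps):
        v0, v1 = max(i * dv, 1e-9), (i + 1) * dv
        total += 0.5 * dv * (weibull_pdf(v0, k, c) * power_curve(v0)
                             + weibull_pdf(v1, k, c) * power_curve(v1))
    return total

p_mean = mean_power(k=2.0, c=8.0)   # mean output (kW) for this regime
```

The paper's point is that the quality of the fitted wind-speed density (here the Weibull) propagates directly into this integral, hence into the mean power estimate.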
Energy Technology Data Exchange (ETDEWEB)
Carta, Jose A. [Department of Mechanical Engineering, University of Las Palmas de Gran Canaria, Campus de Tafira s/n, 35017 Las Palmas de Gran Canaria, Canary Islands (Spain); Ramirez, Penelope; Velazquez, Sergio [Department of Renewable Energies, Technological Institute of the Canary Islands, Pozo Izquierdo Beach s/n, 35119 Santa Lucia, Gran Canaria, Canary Islands (Spain)
2008-10-15
Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Rₐ²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Rₐ² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Rₐ² increases. (author)
Directory of Open Access Journals (Sweden)
Đurović Aleksandar
2007-01-01
Full Text Available Background/Aim. Few authors are involved in the home rehabilitation of amputees or their reintegration into the community. It has been remarked that there is a discontinuity between the phases of amputee rehabilitation in Serbia. The aim of the study was to establish the pain characteristics and functional status of amputees two months after the amputation, and to determine their social function and the conditions of their habitation. Methods. This prospective observational study involved 38 elderly amputees with unilateral lower limb amputations. The patients were tested at the hospital on discharge and at their homes two months after the amputation. Pain intensity and functional status were measured by a visual analogue scale (VAS) and by the Functional Independence Measure (FIM). The patients’ social function was assessed using the Social Dysfunction Rating Scale (SDRS) and the conditions of their habitation by the self-created Scale of Conditions of Habitation (SCH). In the statistical analysis we used the Student t test, χ2 test and analysis of variance (ANOVA). Results. The majority of patients (63%) underwent below-knee amputation, caused by diabetes (89%). A significant number of patients (84%, χ2 = 17.78; p < 0.01) were not visited by a physiotherapist or an occupational therapist during the two months at home. In this period, the majority of the amputees (68%) had phantom pain or residual limb pain (21%). Two months after the amputation the pain intensity was significantly lower (VAS = 4.07±2.19 vs. 2.34±1.41; p < 0.001), and the functional status significantly better than on discharge (FIM = 75.13±16.52 vs. 87.87±16.48; p < 0.001). The amputees had an average level of social dysfunction (SDRS = 62.00±11.68) and conditions of habitation (SCH = 7.81±1.97). Conclusion. A total of 38 elderly amputees with unilateral lower limb amputations achieved significant functional improvement and reduction of pain, in spite of their social dysfunction, the absence of socio-medical support
Directory of Open Access Journals (Sweden)
Luis Vicente Chamorro Marcillllo
2013-06-01
Full Text Available Engineering, in both its academic and applied forms, like any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Management of those tables typically presents physical problems (wasteful transport and consultation) and operational problems (incomplete lists and limited accuracy). The study, "Probability distribution function values in mobile phones", permitted determining, through a needs survey applied to students involved in statistics studies at Universidad de Nariño, that the best known and most used values correspond to the Chi-Square, Binomial, Student's t, and Standard Normal distributions. Similarly, it showed users' interest in having the values in question available in an alternative medium that corrects, at least in part, the problems presented by "the famous tables". To contribute to the solution, we built software that allows the values of the most commonly used probability distribution functions to be obtained immediately and dynamically on mobile phones.
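The table values in question are just distribution quantiles, which any runtime can compute on demand. A short sketch using only the Python standard library: the exact standard normal quantile, plus the classical Wilson-Hilferty approximation for chi-square quantiles (an old table substitute, accurate to a few hundredths for moderate degrees of freedom; a production app would use an exact routine):

```python
import math
from statistics import NormalDist

def z_quantile(p):
    """Standard normal quantile (inverse CDF), from the stdlib."""
    return NormalDist().inv_cdf(p)

def chi2_quantile_wh(p, k):
    """Wilson-Hilferty approximation to the chi-square quantile with
    k degrees of freedom: a cube of a shifted normal quantile."""
    z = z_quantile(p)
    return k * (1.0 - 2.0 / (9.0 * k) + z * math.sqrt(2.0 / (9.0 * k))) ** 3

# The familiar "table" values, reproduced on demand:
z_975 = z_quantile(0.975)               # table value ~1.96
chi2_95_10 = chi2_quantile_wh(0.95, 10) # table value ~18.31
```

Computing quantiles directly removes both problems the survey identified: no transport of printed tables, and no accuracy limit from interpolating between tabulated rows.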
Blazed Grating Resonance Conditions and Diffraction Efficiency Optical Transfer Function
Stegenburgs, Edgars
2017-01-08
We introduce a general approach to studying diffraction harmonics or resonances and resonance conditions for blazed reflecting gratings, providing knowledge of the fundamental diffraction pattern and a qualitative understanding of the parameters that predict the most efficient diffraction.
Blazed Grating Resonance Conditions and Diffraction Efficiency Optical Transfer Function
Stegenburgs, Edgars; Alias, Mohd Sharizal B.; Ng, Tien Khee; Ooi, Boon S.
2017-01-01
We introduce a general approach to studying diffraction harmonics or resonances and resonance conditions for blazed reflecting gratings, providing knowledge of the fundamental diffraction pattern and a qualitative understanding of the parameters that predict the most efficient diffraction.
Volkerts, ER; VanLaar, MW; Verbaten, MN; Mulder, G; Maes, RAA
1997-01-01
The primary research question in this investigation concerned whether arousal manipulation by a stimulant (phentermine 20 mg) and a depressant (pentobarbital 100 mg) would oppositely affect choice behaviour in a probability learning task and decision processes manipulated by pay-off. A 3-source
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
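The two-step structure described in the record (geometric collision candidates first, a causation factor second) has a simple arithmetic core, sketched below with illustrative placeholder numbers; the encounter count and causation probability would in practice come from traffic simulation and navigator-behaviour analysis of the kind the report develops:

```python
import math

def annual_collision_frequency(n_encounters_per_year, p_causation):
    """Expected collisions per year = geometric collision candidates
    (ships on collision course if no evasive action were taken) times
    the causation probability that the navigators fail to resolve the
    situation. Both inputs here are illustrative placeholders."""
    return n_encounters_per_year * p_causation

lam = annual_collision_frequency(n_encounters_per_year=120.0,
                                 p_causation=2.0e-4)

# Treating collisions as a Poisson process with rate lam per year:
p_at_least_one = 1.0 - math.exp(-lam)    # P(>= 1 collision in a year)
```

The decomposition is what makes the model "rational": traffic and route geometry determine the first factor, while bridge layout, crew training and look-out practice enter only through the causation factor.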
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
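For contrast with the change-point model the paper proposes, a naive baseline for tracking the hidden parameter of a stepwise nonstationary Bernoulli process is a sliding-window mean; this sketch is only that baseline (it produces the smooth, always-updating estimates that the paper's findings argue against), with simulated data:

```python
import random

def window_estimates(outcomes, window=50):
    """Naive baseline: estimate the hidden Bernoulli parameter as the
    mean of the last `window` outcomes. The paper's model instead
    builds a compact change-point encoding of the sequence and updates
    its estimate intermittently."""
    est = []
    for i in range(len(outcomes)):
        lo = max(0, i + 1 - window)
        chunk = outcomes[lo:i + 1]
        est.append(sum(chunk) / len(chunk))
    return est

random.seed(0)
# Stepwise nonstationary process: p jumps from 0.2 to 0.8 halfway.
outcomes = [1 if random.random() < 0.2 else 0 for _ in range(300)]
outcomes += [1 if random.random() < 0.8 else 0 for _ in range(300)]
est = window_estimates(outcomes)
```

The window estimator drifts gradually across the change point, whereas the subjects described above step abruptly from one estimate to another at irregular intervals.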
Functional magnetic resonance in the conditions of a clinical department
International Nuclear Information System (INIS)
Obenberger, J.; Seidl, Z.; Krasensky, J.; Vitak, T.; Haberzettel, V.
1997-01-01
Functional magnetic resonance is a novel technique enabling non-invasive monitoring of brain function and metabolism with a temporal and spatial resolution unmatched by any other imaging technique. The principle of the method is outlined, and it is demonstrated that such demanding examinations can be performed using state-of-the-art MR instrumentation combined with conventional equipment and GE sequences available at normal clinical departments. The functional MR examination, which does not take much longer than a routine examination, can be improved by fixing the patient's head. As a prerequisite for correlation, the MR instrument has to be interfaced to a computer, and suitable tools for mutual data correlation have to be created. (P.A.)
Anusavice, Kenneth J.; Jadaan, Osama M.; Esquivel–Upshaw, Josephine
2013-01-01
Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. Objective The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Materials and methods Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Results Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). Conclusion CARES/Life results support the proposed crown design and load orientation hypotheses. Significance The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represents an optimal approach for optimizing fixed dental prosthesis designs produced from dental ceramics and for predicting time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. PMID:24060349
Anusavice, Kenneth J; Jadaan, Osama M; Esquivel-Upshaw, Josephine F
2013-11-01
Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8mm/0.8mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4mm/1.2mm). CARES/Life results support the proposed crown design and load orientation hypotheses. The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represents an optimal approach for optimizing fixed dental prosthesis designs produced from dental ceramics and for predicting time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. Copyright © 2013 Academy of Dental Materials. All rights reserved.
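The time-dependent fracture probabilities above rest on Weibull strength statistics combined with a subcritical (slow) crack growth law. A heavily simplified sketch of that combination: every parameter value and the simple power-law degradation of the characteristic strength below are illustrative assumptions, whereas CARES/Life works from measured dynamic fatigue data and full finite-element stress fields:

```python
import math

def weibull_pf(stress_mpa, sigma0_mpa, m):
    """Weibull probability of failure at a given applied stress;
    sigma0 is the characteristic strength, m the Weibull modulus."""
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

def time_dependent_pf(stress_mpa, sigma0_mpa, m, years, n_scg,
                      tau_years=1.0):
    """Hedged sketch: degrade the characteristic strength by an
    assumed power-law slow-crack-growth model,
        sigma0(t) = sigma0 * (1 + t/tau) ** (-1/n),
    then evaluate the Weibull failure probability at that time."""
    degraded = sigma0_mpa * (1.0 + years / tau_years) ** (-1.0 / n_scg)
    return weibull_pf(stress_mpa, degraded, m)

# Illustrative ceramic: sigma0 = 400 MPa, m = 10, SCG exponent n = 20.
pf_1y = time_dependent_pf(100.0, 400.0, m=10.0, years=1.0, n_scg=20.0)
pf_10y = time_dependent_pf(100.0, 400.0, m=10.0, years=10.0, n_scg=20.0)
```

The sketch reproduces the qualitative trend in the abstract: at fixed load, the predicted fracture probability grows with service time.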
Harmonic Function of Poincare Cone Condition In Solving Dirichlet ...
African Journals Online (AJOL)
This paper describes the set of harmonic functions on a given open set U which can be seen as the kernel of the Laplace operator and is therefore a vector space over R .It also reviews the harmonic theorem, the dirichlet problem and maximum principle where we conclude that the application of sums , differences and ...
Energy Technology Data Exchange (ETDEWEB)
Kim, Jong-Min; Lee, Bong-Sang [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2016-10-15
The round robin project was proposed by the PFM Research Subcommittee of the Japan Welding Engineering Society to Asian Society for Integrity of Nuclear Components (ASINCO) members, and is designated in Korea as Phase 2 of A-Pro2. The objective of this phase 2 round robin analysis is to compare the schemes and results related to the assessment of the structural integrity of the RPV for events that are important to safety in the design but have relatively low fracture probability. In this study, probabilistic fracture mechanics analysis was performed for the round robin cases using the PROFAS-RV code. The effects of key parameters such as the transient, fluence level, Cu and Ni content, initial RT_NDT and the RT_NDT shift model on the failure probability were systematically compared and reviewed. These efforts can minimize the uncertainty of the integrity evaluation for the reactor pressure vessel.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Random phenomena fundamentals of probability and statistics for engineers
Ogunnaike, Babatunde A
2009-01-01
Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...
Producing Conditional Mutants for Studying Plant Microtubule Function
Energy Technology Data Exchange (ETDEWEB)
Richard Cyr
2009-09-29
The cytoskeleton, and in particular its microtubule component, participates in several processes that directly affect growth and development in higher plants. Normal cytoskeletal function requires the precise and orderly arrangement of microtubules into several cell-cycle- and developmentally specific arrays. One of these, the cortical array, is notable for its role in directing the deposition of cellulose (the most prominent polymer in the biosphere). An understanding of how these arrays form, and the molecular interactions that contribute to their function, is incomplete. To gain a better understanding of how microtubules work, we have been working to characterize mutants in critical cytoskeletal genes. This characterization is being carried out at the subcellular level using vital microtubule gene constructs. In the last year of funding, colleagues discovered that gamma-tubulin complexes form along the lengths of cortical microtubules, where they act to spawn new microtubules at a characteristic 40° angle. This finding complements nicely the finding from our lab (funded by the DOE) showing that microtubule encounters are angle dependent: high-angle encounters result in catastrophic collisions while low-angle encounters result in favorable zippering. The spawning of new microtubules at 40° from extant microtubules, together with the aforementioned rules of encounters, ensures favorable co-alignment in the array. I was invited to write a News and Views essay on this topic and a PDF is attached (News and Views policy does not permit funding acknowledgments, so I was not allowed to acknowledge support from the DOE).
Pérez-Sánchez, Julio; Senent-Aparicio, Javier
2017-08-01
Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment, and their consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin, located in eastern Spain and marked by the strong seasonality of these latitudes. A daily precipitation data set was utilized for 29 weather stations over a period of 20 years (1993-2013). Furthermore, four sets of dry spell lengths (complete series, monthly maxima, seasonal maxima, and annual maxima) are used and fitted for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good fit for all the weather stations, with the Wakeby distribution emerging as the best, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probability for return periods of 2, 5, 10, and 25 years reveal a northeast-southeast gradient, with longer periods of rainfall below 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.
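The annual-maximum analysis above has two mechanical steps: extract the longest dry run per year, then fit an extreme-value distribution and read off return levels. A stdlib sketch using a method-of-moments Gumbel fit as a simpler stand-in for the Wakeby distribution the study selects (the annual maxima below are toy values, and the 0.1 mm wet-day threshold matches the abstract):

```python
import math
from statistics import mean, stdev

EULER_GAMMA = 0.5772156649015329

def annual_max_dry_spells(daily_precip, days_per_year=365, wet=0.1):
    """Longest run of days with precipitation below `wet` mm, per year."""
    maxima = []
    for y in range(0, len(daily_precip), days_per_year):
        year = daily_precip[y:y + days_per_year]
        longest = run = 0
        for p in year:
            run = run + 1 if p < wet else 0
            longest = max(longest, run)
        maxima.append(longest)
    return maxima

def gumbel_return_level(maxima, T):
    """Method-of-moments Gumbel fit and its T-year return level:
    beta from the sample standard deviation, mu from the mean, then
    invert the Gumbel CDF at probability 1 - 1/T."""
    beta = stdev(maxima) * math.sqrt(6.0) / math.pi
    mu = mean(maxima) - EULER_GAMMA * beta
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

maxima = [23, 31, 28, 41, 19, 35, 27, 38, 30, 25]   # toy annual maxima
rl_10 = gumbel_return_level(maxima, 10)
rl_25 = gumbel_return_level(maxima, 25)
```

Longer return periods give longer design dry spells, which is exactly what the 2-, 5-, 10- and 25-year maps in the study encode spatially.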
Testing of newly developed functional surfaces under pure sliding conditions
DEFF Research Database (Denmark)
Godi, Alessandro; Mohaghegh, Kamran; Grønbæk, J.
2013-01-01
the surfaces in an industrial context. In this paper, a number of experimental tests were performed using a novel test rig, called axial sliding test, simulating the contact of surfaces under pure sliding conditions. The aim of the experiments is to evaluate the frictional behavior of a new typology...... of textured surfaces, the so-called multifunctional surfaces, characterized by a plateau area able to bear loads and a deterministic pattern of lubricant pockets. Six surface typologies, namely three multifunctional and three machined using classical processes, were chosen to slide against a mirror....... The results comparison showed clearly how employing multifunctional surfaces can reduce friction forces up to 50 % at high normal loads compared to regularly ground or turned surfaces. Friction coefficients approximately equal to 0.12 were found for classically machined surfaces, whereas the values were 0...
A Derivation of Probabilities of Correct and Wrongful Conviction in a Criminal Trial
DEFF Research Database (Denmark)
Lando, Henrik
2006-01-01
probabilities are the probability of observing (any given) evidence against individual i given that individual j committed the crime (for any j, including j equal to i). The variables are derived from the conditional probabilities as a function of the standard of proof using simple Bayesian updating.
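The Bayesian updating step referred to above is ordinary posterior computation over the set of possible perpetrators. A minimal sketch with illustrative numbers (the priors, the likelihoods, and the four-suspect setup are hypothetical, not taken from the paper):

```python
def posterior_guilt(prior, likelihoods):
    """Bayesian updating over suspects: P(i | E) is proportional to
    P(E | i) * P(i), where likelihoods[i] is the probability of
    observing the evidence against the defendant given that suspect i
    committed the crime. All numbers here are illustrative."""
    joint = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Suspect 0 is the defendant; the evidence is far more probable if
# the defendant is in fact the perpetrator.
prior = [0.25, 0.25, 0.25, 0.25]
likelihoods = [0.90, 0.05, 0.05, 0.05]
post = posterior_guilt(prior, likelihoods)
```

A conviction standard such as "beyond reasonable doubt" can then be read as a threshold on the defendant's posterior probability, which is how the standard of proof enters the derivation.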
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.
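The "probability distribution" idea for the continuous, normal case can be made concrete in a few lines: the pdf gives the relative likelihood at a point, and the cdf gives the probability of falling at or below it. A small sketch with an invented example population (the mean and standard deviation are illustrative):

```python
from statistics import NormalDist

# Illustrative population: heights normally distributed with
# mean 170 cm and standard deviation 10 cm.
height = NormalDist(mu=170.0, sigma=10.0)

p_below_180 = height.cdf(180.0)      # P(X <= 180), about 0.841
p_within_1sd = height.cdf(180.0) - height.cdf(160.0)   # about 0.683
density_at_mean = height.pdf(170.0)  # 1 / (sigma * sqrt(2 * pi))
```

The 0.683 figure is the familiar "68% within one standard deviation" rule, recovered here directly from the distribution function rather than from a table.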
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Discrete Green's Theorem, Green's Functions and Stable Radiative FDTD Boundary Conditions
Arnold, J.M.; Hon, de B.P.
2007-01-01
We propose a radiative boundary condition for the discrete-grid formulation of Helmholtz’ equation, based on rational approximation in the frequency domain of a Green’s function for the discretised system. This boundary condition is free from instabilities.
Functional status of liverin conditions of radiation and chemical exposure
Directory of Open Access Journals (Sweden)
O. V. Severynovs’ka
2005-09-01
Full Text Available The chronic influence of low-intensity X-rays at doses of 0.15 and 0.25 Gy, and of a mixture of heavy-metal salts at a dose of 2 EPC (extreme permissible concentrations for each metal), acting as a single factor or in combination, on the state of the pro-/antioxidative system in rat liver has been studied. Analysis of the data on combined influence leads to the conclusion that the effects at these doses differ: a surge in lipid peroxidation processes is observed in both cases, but at the lower dose the effects are additive, whereas at 0.25 Gy a synergism of the agent effects with respect to the development of peroxidative reactions is registered. The results testify that technogenic contamination of water with heavy metals aggravates the action of the radiation factor; specifically, it eliminates the hormetic surge of antioxidative activity at 0.15 Gy. Biochemical indices of liver activity, the liver being a central organ of general metabolism, and the structure of morbidity have been studied in liquidators of the Chernobyl accident from the industrial Prydnieprovie region. Disturbances of liver function have been shown, especially in persons who received an exposure dose of about 0.25 Gy. A comparison of these results with the data from the experiments on laboratory animals reveals their mutual accordance and supports the relevance of extrapolating data from model experiments to the health state of people who have undergone a similar influence.
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
Approximation of Measurement Results of “Emergency” Signal Reception Probability
Directory of Open Access Journals (Sweden)
Gajda Stanisław
2017-08-01
Full Text Available The intended aim of this article is to present the approximation results of exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a function of the distance between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a function of distance, makes it possible to determine the range of the EMERGENCY signal for a pre-set confidence level.
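A logistic model of reception probability as a function of distance might look like the following sketch. The parameters d50 (distance of 50% reception) and k (steepness) are hypothetical, not the measured values from the article.

```python
# Hedged sketch: logistic approximation of reception probability vs.
# distance, and its inversion to find the range at a pre-set confidence
# level. Parameters d50 and k are invented for illustration.
import math

def reception_probability(d, d50=100.0, k=0.05):
    """Logistic model: probability of reception decreases with distance d."""
    return 1.0 / (1.0 + math.exp(k * (d - d50)))

def range_for_confidence(p, d50=100.0, k=0.05):
    """Invert the logistic: distance at which reception probability equals p."""
    return d50 - math.log(p / (1 - p)) / k

print(reception_probability(100.0))  # 0.5 at d50, by construction
```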
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
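Probability leakage as defined above can be computed directly in a toy case: a normal model for a quantity that the evidence says is nonnegative places positive mass on impossible negative values. All numbers below are invented for illustration.

```python
# Illustrative computation of probability leakage: model M says
# y ~ Normal(mu, sigma), while evidence E says y >= 0. The leakage is
# the probability M assigns to the impossible region y < 0.
import math

def normal_cdf(x, mu, sigma):
    """CDF of the normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu, sigma = 2.0, 1.5                   # invented model parameters
leakage = normal_cdf(0.0, mu, sigma)   # mass on impossible events y < 0
print(f"probability leakage: {leakage:.4f}")
```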
Conditions for the existence of control functions in nonseparable simultaneous equations models
Blundell, Richard; Matzkin, Rosa L.
2010-01-01
The control function approach (Heckman and Robb (1985)) in a system of linear simultaneous equations provides a convenient procedure to estimate one of the functions in the system using reduced form residuals from the other functions as additional regressors. The conditions on the structural system under which this procedure can be used in nonlinear and nonparametric simultaneous equations have thus far been unknown. In this note, we define a new property of functions called control function s...
Smoothed Conditional Scale Function Estimation in AR(1)-ARCH(1) Processes
Directory of Open Access Journals (Sweden)
Lema Logamou Seknewna
2018-01-01
Full Text Available The estimation of the smoothed conditional scale function for time series was carried out under conditional heteroscedastic innovations by imitating kernel smoothing in the nonparametric QAR-QARCH scheme. The estimation was based on the quantile regression methodology proposed by Koenker and Bassett. A proof of the asymptotic properties of the conditional scale function estimator for this type of process is given and its consistency is shown.
Nakashima, Takahiro
2006-01-01
The functional specification of the mean-standard deviation approach is examined under the location and scale parameter condition. Firstly, the full set of restrictions imposed on the mean-standard deviation function under the location and scale parameter condition is made clear. Secondly, examination based on these restrictions derives new properties of the mean-standard deviation function concerning the applicability of additive separability and the curvature of ex...
δ'-function perturbations and Neumann boundary-conditions by path integration
International Nuclear Information System (INIS)
Grosche, C.
1994-02-01
δ'-function perturbations and Neumann boundary conditions are incorporated into the path integral formalism. The starting point is the consideration of the path integral representation for the one-dimensional Dirac particle together with a relativistic point interaction. The non-relativistic limit yields either a usual δ-function or a δ'-function perturbation; making their strengths infinitely repulsive, one obtains Dirichlet and Neumann boundary conditions, respectively, in the path integral. (orig.)
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
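One classic example of an event with probability 1/e is the chance that a random permutation has no fixed point (a derangement); a quick Monte Carlo check is sketched below, with the permutation size and trial count chosen for illustration.

```python
# Monte Carlo sketch of a classic 1/e problem: the probability that a
# random permutation of n items has no fixed point tends to 1/e as n grows.
import math
import random

def no_fixed_point(n):
    """True if a uniformly random permutation of range(n) is a derangement."""
    perm = list(range(n))
    random.shuffle(perm)
    return all(perm[i] != i for i in range(n))

random.seed(0)
trials = 100_000
estimate = sum(no_fixed_point(10) for _ in range(trials)) / trials
print(estimate, 1 / math.e)  # both close to 0.3679
```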
Gray, Kristen E.; Katon, Jodie G.; Rillamas-Sun, Eileen; Bastian, Lori A.; Nelson, Karin M.; LaCroix, Andrea Z.; Reiber, Gayle E.
2016-01-01
Abstract Purpose of the Study: To compare the number of chronic conditions among a list of 12 and their association with physical function among postmenopausal non-Veteran and Veteran women with diabetes. Design and Methods: Among women with diabetes from the Women's Health Initiative, we compared the average number of chronic conditions between non-Veterans and Veterans and the association of the total number of chronic conditions with subsequent RAND-36 physical function. To examine associations between each condition and subsequent physical function, we compared women with diabetes plus one chronic condition to women with diabetes alone using linear regression in separate models for each condition and for non-Veterans and Veterans. Results: Both non-Veterans (N = 23,542) and Veterans (N = 618) with diabetes had a median of 3 chronic conditions. Decreases in physical function for each additional condition were larger among Veterans than non-Veterans (−6.3 vs. −4.1 points). Decreases in physical function among women with diabetes plus one chronic condition were greater than those reported for diabetes alone for all combinations and were more pronounced among Veterans (non-Veterans: −11.1 to −24.2 points; Veterans: −16.6 to −40.4 points). Hip fracture, peripheral artery disease, cerebrovascular disease, and coronary disease in combination with diabetes were associated with the greatest decreases in physical function. Implications: Chronic conditions were common among postmenopausal women with diabetes and were associated with large declines in physical function, particularly among Veterans. Interventions to prevent and reduce the impact of these conditions and facilitate coordination of care among women with diabetes may help them maintain physical function. PMID:26768385
Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente
2017-04-29
Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles are incorporating roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors installed onboard current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters are within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
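The idea of keeping an estimate within physically meaningful bounds can be sketched with a scalar Kalman update followed by truncation. This is a drastic simplification of the paper's PDF-based truncation and dual Kalman filter, and every number below is invented.

```python
# Minimal 1-D sketch (NOT the authors' DKF algorithm): a scalar Kalman
# measurement update, followed by projecting the estimate back into a
# physically admissible interval. All values are illustrative.

def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update: state x, variance P,
    measurement z with noise variance R."""
    K = P / (P + R)            # Kalman gain
    x_new = x + K * (z - x)    # corrected state estimate
    P_new = (1.0 - K) * P      # reduced estimate variance
    return x_new, P_new

def truncate(x, lo, hi):
    """Project the estimate back into the admissible interval [lo, hi]."""
    return min(max(x, lo), hi)

x, P = 0.0, 1.0                # prior roll-angle estimate (rad) and variance
z, R = 0.9, 0.5                # noisy measurement and its variance
x, P = kalman_update(x, P, z, R)
x = truncate(x, -0.5, 0.5)     # roll angle physically bounded to ±0.5 rad
print(x, P)
```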
Institute of Scientific and Technical Information of China (English)
陆宏伟; 陈亚珠; 卫青
2004-01-01
A probability density function (PDF) method is proposed for analysing the structure of the reconstructed attractor in computing the correlation dimensions of RR intervals of ten normal old men. The PDF contains important information about the spatial distribution of the phase points in the reconstructed attractor. To the best of our knowledge, this is the first time the PDF method has been put forward for the analysis of reconstructed attractor structure. Numerical simulations demonstrate that the cardiac systems of healthy old men are approximately 6-6.5 dimensional complex dynamical systems. It is found that the PDF is not symmetrically distributed when the time delay is small, while the PDF satisfies a Gaussian distribution when the time delay is big enough. A cluster-effect mechanism is presented to explain this phenomenon. By studying the shape of the PDFs, it is clearly indicated that the time delay plays a more important role in the reconstruction than the embedding dimension. The results demonstrate that the PDF method represents a promising numerical approach for the observation of reconstructed attractor structure and may provide more information and new diagnostic potential for the analyzed cardiac system.
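Delay-coordinate reconstruction and an empirical PDF of the embedded points, the basic ingredients of the method described above, can be sketched as follows. The sine signal, delay, and embedding dimension are illustrative only, not the RR-interval data of the study.

```python
# Sketch of delay-coordinate reconstruction and a crude empirical PDF of
# the embedded points. The signal and parameters are illustrative only.
import math

def delay_embed(series, dim, tau):
    """Embed a scalar series into dim-dimensional delay coordinates."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

signal = [math.sin(0.1 * t) for t in range(1000)]
points = delay_embed(signal, dim=3, tau=15)

# empirical PDF of the first embedded coordinate over 10 bins on [-1, 1]
bins = [0] * 10
for pt in points:
    bins[min(int((pt[0] + 1.0) / 0.2), 9)] += 1
pdf = [b / len(points) for b in bins]
print(len(points))  # 970 embedded phase points
```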
Energy Technology Data Exchange (ETDEWEB)
Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor
2015-01-01
SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Being a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I was not known as a function of angle of incidence, wavelength and polarization of the incident light. Our targeted application area for this sensor is next-generation PET detector modules, where the sensors will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm wide wavelength interval around the characteristic emission peak (λ = 420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced; it begins only from 50°.
International Nuclear Information System (INIS)
Rasin, I.M.; Sarapul'tsev, I.A.
1975-01-01
The probability distribution of tissue radiation doses in the skeleton was studied in experiments on swine and dogs. When Sr-90 is introduced into the organism from the day of birth till 90 days, the dose-rate probability distribution is characterized by one or, for adult animals, by two independent aggregates. Each of these aggregates corresponds to the normal distribution law
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Changes in working conditions and physical health functioning among midlife and ageing employees.
Mänty, Minna; Kouvonen, Anne; Lallukka, Tea; Lahti, Jouni; Lahelma, Eero; Rahkonen, Ossi
2015-11-01
The aim of this study was to examine the effect of changes in physical and psychosocial working conditions on physical health functioning among ageing municipal employees. Follow-up survey data were collected from midlife employees of the City of Helsinki, Finland, at three time points: wave 1 (2000-2002), wave 2 (2007), and wave 3 (2012). Changes in physical and psychosocial working conditions were assessed between waves 1 and 2. Physical health functioning was measured by the physical component summary (PCS) of the Short-Form 36 questionnaire at each of the three waves. In total, 2784 respondents (83% women) who remained employed over the follow-up were available for the analyses. Linear mixed-effect models were used to assess the associations and adjust for key covariates (age, gender, obesity, chronic diseases, and health behaviors). Repeated and increased exposure to adverse physical working conditions was associated with greater decline in physical health functioning over time. In contrast, decrease in exposures reduced the decline. Of the psychosocial working conditions, changes in job demands had no effects on physical health functioning. However, decreased job control was associated with greater decline, and repeated high or increased job control reduced the decline in physical health functioning over time. Adverse changes in physical working conditions and job control were associated with greater decline in physical health functioning over time, whereas favorable changes in these exposures reduced the decline. Preventing deterioration and promoting improvement of working conditions are likely to help maintain better physical health functioning among ageing employees.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
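A brute-force Monte Carlo estimate of a first-passage probability (not the paper's integral equation method) gives a feel for the quantity being approximated. The barrier, horizon, and discretized Ornstein-Uhlenbeck response are assumptions for illustration.

```python
# Monte Carlo sketch: fraction of sample paths of a discretized
# Ornstein-Uhlenbeck response that cross a barrier b within time T.
# All parameters are illustrative assumptions, not from the paper.
import math
import random

def first_passage_prob(b=1.5, T=10.0, dt=0.01, paths=500, seed=1):
    random.seed(seed)
    crossed = 0
    for _ in range(paths):
        x = 0.0
        for _ in range(int(T / dt)):
            # Euler-Maruyama step: dx = -x dt + dW
            x += -x * dt + math.sqrt(dt) * random.gauss(0.0, 1.0)
            if x >= b:
                crossed += 1
                break
    return crossed / paths

p_fail = first_passage_prob()
print(f"estimated first-passage probability: {p_fail:.3f}")
```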
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Directory of Open Access Journals (Sweden)
Dhakne Machindra B.
2017-04-01
Full Text Available In this paper we discuss the existence of mild and strong solutions of an abstract nonlinear mixed functional integrodifferential equation with a nonlocal condition by using Sadovskii's fixed point theorem and the theory of fractional powers of operators.
Knafl, Kathleen A; Deatrick, Janet A; Knafl, George J; Gallo, Agatha M; Grey, Margaret; Dixon, Jane
2013-01-01
Understanding patterns of family response to childhood chronic conditions provides a more comprehensive understanding of their influence on family and child functioning. In this paper, we report the results of a cluster analysis based on the six scales comprising the Family Management Measure (FaMM) and the resulting typology of family management. The sample of 575 parents (414 families) of children with diverse chronic conditions fell into four patterns of response (Family Focused, Somewhat Family Focused, Somewhat Condition Focused, Condition Focused) that differed in the extent family life was focused on usual family routines or the demands of condition management. Most (57%) families were in either the Family Focused or Somewhat Family Focused pattern. Patterns of family management were related significantly to family and child functioning, with families in the Family Focused and Somewhat Family Focused patterns demonstrating significantly better family and child functioning than families in the other two patterns. Copyright © 2013. Published by Elsevier Inc.
Functional expansion for evolution operators in a system of many fermions with many conditions
International Nuclear Information System (INIS)
Barrios, S.C.
1985-01-01
We present a mean-field expansion for a many-body system, using integral functionals. The problem is formulated as an initial-condition problem, and the effective dynamics of the body density with given initial conditions is studied. (M.W.O.) [pt
Martínez, Carlos Alberto; Khare, Kshitij; Banerjee, Arunava; Elzo, Mauricio A
2017-03-21
It is important to consider heterogeneity of marker effects and allelic frequencies in across-population genome-wide prediction studies. Moreover, all regression models used in genome-wide prediction overlook randomness of genotypes. In this study, a family of hierarchical Bayesian models to perform across-population genome-wide prediction, modeling genotypes as random variables and allowing population-specific effects for each marker, was developed. The models shared a common structure and differed in the priors used and the assumption about residual variances (homogeneous or heterogeneous). Randomness of genotypes was accounted for by deriving the joint probability mass function of marker genotypes conditional on allelic frequencies and pedigree information. As a consequence, these models incorporated kinship and genotypic information that made it possible not only to account for heterogeneity of allelic frequencies, but also to include individuals with missing genotypes at some or all loci without the need for previous imputation. This was possible because the non-observed fraction of the design matrix was treated as an unknown model parameter. For each model, a simpler version ignoring population structure, but still accounting for randomness of genotypes, was proposed. Implementation of these models and computation of some criteria for model comparison were illustrated using two simulated datasets. Theoretical and computational issues along with possible applications, extensions and refinements were discussed. Some features of the models developed in this study make them promising for genome-wide prediction; the use of the information contained in the probability distribution of genotypes is perhaps the most appealing. Further studies to assess the performance of the models proposed here and also to compare them with conventional models used in genome-wide prediction are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
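A drastically simplified version of a genotype probability mass function conditional on allelic frequencies, here under plain Hardy-Weinberg proportions and without the pedigree conditioning the paper derives, might look like the following sketch.

```python
# Illustrative genotype pmf for a biallelic marker coded 0/1/2 copies of
# the reference allele, conditional on allele frequency p, under
# Hardy-Weinberg proportions. A simplification of the models described
# above, NOT their actual joint pmf (which also conditions on pedigree).

def genotype_pmf(g, p):
    """P(genotype = g copies of the reference allele | allele frequency p)."""
    if g == 0:
        return (1 - p) ** 2
    if g == 1:
        return 2 * p * (1 - p)
    if g == 2:
        return p ** 2
    return 0.0

p = 0.3
probs = [round(genotype_pmf(g, p), 6) for g in (0, 1, 2)]
print(probs)  # [0.49, 0.42, 0.09], summing to 1
```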
Multiple-event probability in general-relativistic quantum mechanics
International Nuclear Information System (INIS)
Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo
2007-01-01
We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times for an ensemble of simultaneous commuting measurements on the joint system+apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse
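The collapse-based multi-event probability mentioned above can be illustrated for a single qubit: with trivial dynamics between measurements, the probability of outcome a followed by outcome b is the product of the Born probabilities along the collapsed history. The states below are chosen purely for illustration.

```python
# Toy single-qubit illustration of multievent probability via the
# collapse rule: P(a at t1, b at t2) = |<a|psi>|^2 * |<b|a>|^2 when the
# first outcome collapses the state onto |a>. Illustrative states only.
import math

def inner(u, v):
    """Inner product <u|v> for states given as coefficient tuples."""
    return sum(x.conjugate() * y for x, y in zip(u, v))

psi = (1 / math.sqrt(2), 1 / math.sqrt(2))   # initial state |+>
a = (1.0, 0.0)                               # first measurement outcome |0>
b = (1 / math.sqrt(2), -1 / math.sqrt(2))    # second measurement outcome |->

p_seq = abs(inner(a, psi)) ** 2 * abs(inner(b, a)) ** 2
print(round(p_seq, 12))  # 0.25
```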
Janković, Bojan
2009-10-01
The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry under isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) behave independently of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (Ea) value was approximately constant (Ea,int = 95.2 kJ mol-1 and Ea,diff = 96.6 kJ mol-1, respectively). The values of Ea calculated by both isoconversional methods are in good agreement with the value of Ea evaluated from the Arrhenius equation (94.3 kJ mol-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used to estimate the kinetic model for the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1−α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddfEa's) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with the corresponding results for the nonisothermal decomposition process of NaHCO3.
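The isothermal Weibull description of a conversion curve, α(t) = 1 − exp(−(t/η)^β), can be sketched as follows, using the quoted shape parameter β = 1.07; the scale parameter η is invented for illustration.

```python
# Sketch of the isothermal Weibull conversion function and its inversion
# (time to reach a given conversion). beta = 1.07 is the shape parameter
# quoted above; eta = 100 is an invented illustrative scale parameter.
import math

def conversion(t, beta=1.07, eta=100.0):
    """Weibull conversion curve: alpha(t) = 1 - exp(-(t/eta)**beta)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def time_for_conversion(alpha, beta=1.07, eta=100.0):
    """Invert the Weibull function: time at which conversion reaches alpha."""
    return eta * (-math.log(1.0 - alpha)) ** (1.0 / beta)

t_half = time_for_conversion(0.5)
print(round(conversion(t_half), 6))  # recovers 0.5
```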
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating probability theory as a branch of measure theory and establishes this relation early. Probability measures in product spaces are introduced right at the start, laying the groundwork for the later claim of the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. The book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. It aims to help the teacher present the theory with ease, and to help the student sustain interest and joy in learning the subject.
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Effects of common mental disorders and physical conditions on role functioning in Spain.
Barbaglia, Gabriela; Duran, Núria; Vilagut, Gemma; Forero, Carlos García; Haro, Josep Maria; Alonso, Jordi
2013-01-01
To examine the effects of common mental disorders and physical conditions on role functioning in Spain. Cross-sectional study of the general adult population of Spain (n = 2,121). Non-psychotic mental disorders were assessed with the Composite International Diagnostic Interview (CIDI 3.0) and physical conditions with a checklist. The role functioning dimension of the WHO Disability Assessment Schedule (WHODAS) was used to assess the number of days in the past month in which respondents were fully or partially limited in performing daily activities. Generalized linear models were used to estimate individual-level associations between specific conditions and role functioning, controlling for co-morbidity. Societal-level estimates were calculated using population attributable risk proportions (PARP). Mental disorders and physical conditions showed a similar number of days with full role limitation (about 20 days per year); in contrast, mental disorders were responsible for twice as many days with partial role limitation as physical conditions (42 vs. 21 days, respectively). If the population were entirely unexposed to mental and physical conditions, days with full limitation would be reduced by 73% and days with partial limitation by 41%. Common health conditions in Spain are associated with considerably more days with role limitation than in other Western countries. There is a need to mainstream disability in the Spanish public health agenda in order to reduce role limitation among individuals with common conditions. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.
Troitskaya, Yuliya; Abramov, Victor; Ermoshkin, Alexey; Zuikova, Emma; Kazakov, Vassily; Sergeev, Daniil; Kandaurov, Alexandr
2014-05-01
(friction velocity and roughness height) were retrieved by velocity profiling and subsequent data processing based on self-similarity of the turbulent boundary layer, and 10-m wind speed was calculated. The wind wave field parameters in the flume were measured by three wire gauges. The measured data on wind waves were used to estimate the short-wave spectra and the slope probability density function of the "long waves" within the composite Bragg theory of microwave radar return. Estimations showed that for co-polarized radar returns the difference between measurements and the predictions of the model is about 1-2 dB, which can be explained by our poor knowledge of the short-wave part of the spectrum. For cross-polarized return the difference exceeds 10 dB, indicating that some non-Bragg mechanisms (short-crested waves, foam, sprays, etc.) are responsible for the depolarization of the returned signal. It seems reasonable then to suppose that the cross-polarized radar return in X- and C-bands will demonstrate a similar dependence on wind speed. We compared the dependence of the cross-polarized X-band radar cross-section on 10-m wind speed obtained in laboratory conditions with the similar dependence obtained in [2] from field data for the C-band radar cross-section and found that the laboratory data follow the median of the field data with a constant bias of -11 dB. Based on the laboratory data, an empirical polynomial geophysical model function was suggested for retrieving wind speeds up to 40 m/s from cross-polarized microwave return, which is in good agreement with the direct measurements. This work was carried out under financial support of the RFBR (project codes № 13-05-00865, 12-05-12093) and by a grant from the Government of the Russian Federation (project code 11.G34.31.0048). References: [1] B. Zhang, W. Perrie, Bull. Amer. Meteor. Soc., 93, 531-541, 2012. [2] G.-J. van Zadelhoff et al., Atmos. Meas. Tech. Discuss., 6, 7945-7984, doi:10.5194/amtd-6-7945-2013, 2013.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Emptiness formation probability of XX-chain in diffusion process
International Nuclear Information System (INIS)
Ogata, Yoshiko
2004-01-01
We study the distribution of the emptiness formation probability of the XX-model in the diffusion process. There exists a Gaussian decay as well as an exponential decay. The Gaussian decay is caused by the existence of a zero point in the Fermi distribution function. The correlation length at each point of the scaling factor varies with the initial condition, monotonically or non-monotonically.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Sex-related differences in amygdala functional connectivity during resting conditions.
Kilpatrick, L A; Zald, D H; Pardo, J V; Cahill, L F
2006-04-01
Recent neuroimaging studies have established a sex-related hemispheric lateralization of amygdala involvement in memory for emotionally arousing material. Here, we examine the possibility that sex-related differences in amygdala involvement in memory for emotional material develop from differential patterns of amygdala functional connectivity evident in the resting brain. Seed voxel partial least square analyses of regional cerebral blood flow data revealed significant sex-related differences in amygdala functional connectivity during resting conditions. The right amygdala was associated with greater functional connectivity in men than in women. In contrast, the left amygdala was associated with greater functional connectivity in women than in men. Furthermore, the regions displaying stronger functional connectivity with the right amygdala in males (sensorimotor cortex, striatum, pulvinar) differed from those displaying stronger functional connectivity with the left amygdala in females (subgenual cortex, hypothalamus). These differences in functional connectivity at rest may link to sex-related differences in medical and psychiatric disorders.
Directory of Open Access Journals (Sweden)
Ahmad El Sayed
2015-01-01
A lifted hydrogen/nitrogen turbulent jet flame issuing into a vitiated coflow is investigated using the conditional moment closure (CMC) supplemented by the presumed mapping function (PMF) approach for the modelling of conditional mixing and velocity statistics. Using a prescribed reference field, the PMF approach yields a presumed probability density function (PDF) for the mixture fraction, which is then used in closing the conditional scalar dissipation rate (CSDR) and conditional velocity in a fully consistent manner. These closures are applied to a lifted flame and the findings are compared to previous results obtained using β-PDF-based closures over a range of coflow temperatures (Tc). The PMF results are in line with those of the β-PDF and compare well to measurements. The transport budgets in mixture fraction and physical spaces and the radical history ahead of the stabilisation height indicate that the stabilisation mechanism is sensitive to Tc. As in the previous β-PDF calculations, autoignition around the "most reactive" mixture fraction remains the controlling mechanism for sufficiently high Tc. Departure from the β-PDF predictions is observed when Tc is decreased, as PMF predicts stabilisation by means of premixed flame propagation. This conclusion is based on the observation that lean mixtures are heated by downstream burning mixtures in a preheat zone developing ahead of the stabilisation height. The spurious sources, which stem from inconsistent CSDR modelling, are further investigated. The findings reveal that their effect is small but non-negligible, most notably within the flame zone.
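For readers unfamiliar with presumed-PDF closures, the moment inversion at their core is short. This sketch shows the standard beta-PDF case (the PMF reference-field construction used in the paper is more elaborate and is not reproduced here); the mean and variance values are arbitrary illustrative inputs:

```python
# Standard moment inversion for a presumed beta-PDF of mixture fraction:
# given the mean and variance of a [0, 1] random variable, recover the
# beta shape parameters (a, b).
def beta_params(mean, var):
    """Shape parameters of a beta PDF with the given mean and variance.

    Requires 0 < var < mean * (1 - mean), i.e. less than the maximum
    variance a [0, 1] random variable with that mean can have.
    """
    g = mean * (1.0 - mean) / var - 1.0
    return mean * g, (1.0 - mean) * g

a, b = beta_params(0.3, 0.05)  # assumed mixture-fraction statistics
# Sanity check: the beta distribution's mean a/(a+b) recovers the input.
print(a, b, a / (a + b))
```

Presumed-PDF methods then integrate the conditional quantities against this PDF; the PMF approach instead derives the PDF from a mapped Gaussian reference field, which is what makes its CSDR and conditional-velocity closures fully consistent.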
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Buurman, Bianca M.; Hoogerduijn, Jita G.; de Haan, Rob J.; Abu-Hanna, Ameen; Lagaay, A. Margot; Verhaar, Harald J.; Schuurmans, Marieke J.; Levi, Marcel; de Rooij, Sophia E.
2011-01-01
Background: To study the prevalence of eighteen geriatric conditions in older patients at admission, their reporting rate in discharge summaries and the impact of these conditions on mortality and functional decline one year after admission. Method: A prospective multicenter cohort study conducted between 2006 and 2008 in two tertiary university teaching hospitals and one regional teaching hospital in the Netherlands. Patients of 65 years and older, acutely admitted and hospitalized for at least 48 hours, were invited to participate. Eighteen geriatric conditions were assessed at hospital admission, and outcomes (mortality, functional decline) were assessed one year after admission. Results: 639 patients were included, with a mean age of 78 years. IADL impairment (83%), polypharmacy (61%), mobility difficulty (59%), high levels of primary caregiver burden (53%), and malnutrition (52%) were most prevalent. Except for polypharmacy and cognitive impairment, the reporting rate of the geriatric conditions in discharge summaries was less than 50%. One year after admission, 35% had died and 33% suffered from functional decline. A high Charlson comorbidity index score, presence of malnutrition, high fall risk, presence of delirium and premorbid IADL impairment were associated with mortality and overall poor outcome (mortality or functional decline). Obesity lowered the risk for mortality. Conclusion: Geriatric conditions were highly prevalent and associated with poor health outcomes after admission. Early recognition of these conditions in acutely hospitalized older patients and improving the handover to the general practitioner could lead to better health outcomes and reduce the burden of hospital admission for older patients. PMID:22110598
Body condition, diet and ecosystem function of red deer (Cervus elaphus) in a fenced nature reserve
Directory of Open Access Journals (Sweden)
Camilla Fløjgaard
2017-07-01
Body condition, as a sign of animal welfare, is of management concern in rewilding projects where fenced animals are subject to winter starvation, which may conflict with animal welfare legislation. Investigating the relationship between body condition, age, sex, diet quality and diet composition is therefore relevant to increase understanding of herbivores' ecosystem function and to inform management. In this study, we focused on red deer, Cervus elaphus, in a fenced nature reserve in Denmark, where the deer are managed as ecosystem engineers to contribute to biodiversity conservation. We measured body mass and body size of 91 culled red deer, and determined diet composition using DNA metabarcoding and diet quality using fecal nitrogen on 246 fecal samples. We found that body condition was predicted by age and diet composition, but not diet quality. We also found that individuals of different body condition had different diets, i.e., the fecal samples of red deer in poorer body condition contained significantly more Ericaceae sequences than those of red deer in good body condition. This may imply that certain functions of red deer in ecosystems, such as regeneration of heather by grazing, may depend on variation in body condition within the population. Our findings call for consideration of the consequences of management practices, including culling or supplemental feeding, on the outcomes of habitat restoration, and more broadly underline the importance of preserving the overall breadth of herbivore ecosystem functions for effective biodiversity conservation.
Tveito, Aslak; Lines, Glenn T; Edwards, Andrew G; McCulloch, Andrew
2016-07-01
Markov models are ubiquitously used to represent the function of single ion channels. However, solving the inverse problem to construct a Markov model of single-channel dynamics from bilayer or patch-clamp recordings remains challenging, particularly for channels involving complex gating processes. Methods for solving the inverse problem are generally based on data from voltage-clamp measurements. Here, we describe an alternative approach to this problem based on measurements of voltage traces. The voltage traces define probability density functions of the functional states of an ion channel. These probability density functions can also be computed by solving a deterministic system of partial differential equations. The inversion is based on tuning the rates of the Markov models used in the deterministic system of partial differential equations such that the solution mimics the properties of the probability density function gathered from (pseudo) experimental data as well as possible. The optimization is done by defining a cost function to measure the difference between the deterministic solution and the solution based on experimental data. By invoking the properties of this function, it is possible to infer whether the rates of the Markov model are identifiable by our method. We present applications to Markov models well known from the literature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
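The tuning idea can be illustrated on a hypothetical two-state (closed/open) channel, where the stationary open probability has a closed form. A brute-force scan stands in for the paper's cost-function optimizer, and all rate values are assumptions:

```python
# Two-state channel, closed <-> open, with opening rate k_co and
# closing rate k_oc. The stationary open probability is
# p_open = k_co / (k_co + k_oc).
def p_open(k_co, k_oc):
    return k_co / (k_co + k_oc)

# Cost: mismatch between the model's state probability and the
# (pseudo) experimental one, as in the inversion described above.
def cost(k_co, k_oc, target):
    return (p_open(k_co, k_oc) - target) ** 2

target = 0.2   # pseudo-experimental open probability (assumed)
k_oc = 40.0    # closing rate in 1/s, taken as known (assumed)

# Brute-force scan over candidate opening rates, a stand-in for a
# proper optimizer over the Markov-model rates.
candidates = [0.5 * i for i in range(1, 101)]
k_co_best = min(candidates, key=lambda k: cost(k, k_oc, target))
print(k_co_best, p_open(k_co_best, k_oc))
```

The scan recovers k_co = 10/s, the unique rate consistent with the target; with more states the probability density functions come from solving the PDE system, but the tune-rates-to-match-densities structure is the same.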
Realizing ecosystem services: wetland hydrologic function along a gradient of ecosystem condition.
McLaughlin, Daniel L; Cohen, Matthew J
2013-10-01
Wetlands provide numerous ecosystem services, from habitat provision to pollutant removal, floodwater storage, and microclimate regulation. Delivery of particular services relies on specific ecological functions, and thus to varying degree on wetland ecological condition, commonly quantified as departure from minimally impacted reference sites. Condition assessments are widely adopted as regulatory indicators of ecosystem function, and for some services (e.g., habitat) links between condition and function are often direct. For others, however, links are more tenuous, and using condition alone to enumerate ecosystem value (e.g., for compensatory mitigation) may underestimate important services. Hydrologic function affects many services cited in support of wetland protection both directly (floodwater retention, microclimate regulation) and indirectly (biogeochemical cycling, pollutant removal). We investigated links between condition and hydrologic function to test the hypothesis, embedded in regulatory assessment of wetland value, that condition predicts function. Condition was assessed using rapid and intensive approaches, including Florida's official wetland assessment tool, in 11 isolated forested wetlands in north Florida (USA) spanning a land use intensity gradient. Hydrologic function was assessed using hydrologic regime (mean, variance, and rates of change of water depth), and measurements of groundwater exchange and evapotranspiration (ET). Despite a wide range in condition, no systematic variation in hydrologic regime was observed; indeed reference sites spanned the full range of variation. In contrast, ET was affected by land use, with higher rates in intensive (agriculture and urban) landscapes in response to higher leaf area. ET determines latent heat exchange, which regulates microclimate, a valuable service in urban heat islands. Higher ET also indicates higher productivity and thus carbon cycling. Groundwater exchange regularly reversed flow direction
Risk estimation using probability machines
2014-01-01
Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
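As a toy version of the idea (a moving-window average standing in for a random forest; the simulated logistic model, sample size, and window width are assumptions), one can check that a nonparametric conditional-probability estimate recovers the true P(Y = 1 | X = x) without assuming the logistic form:

```python
import math
import random

# True data-generating model: logistic in one predictor.
def true_prob(x):
    return 1.0 / (1.0 + math.exp(-(4.0 * x - 2.0)))

# Simulate (x, y) pairs from the logistic model.
random.seed(0)
data = []
for _ in range(5000):
    x = random.random()
    data.append((x, 1 if random.random() < true_prob(x) else 0))

# Nonparametric "probability machine" stand-in: average the binary
# outcomes in a small window around x0.
def window_estimate(x0, half_width=0.05):
    ys = [y for x, y in data if abs(x - x0) <= half_width]
    return sum(ys) / len(ys)

# At x = 0.5 the true conditional probability is exactly 0.5.
print(true_prob(0.5), window_estimate(0.5))
```

No model structure was specified beyond "average locally", which is the point of the risk-machine argument: a consistent nonparametric learner delivers the same conditional probabilities a correctly specified logistic model would, without risking mis-specification.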
Oral health conditions affect functional and social activities of terminally-ill cancer patients
Fischer, D.J.; Epstein, J.B.; Yao, Y.; Wilkie, D.J.
2013-01-01
Purpose: Oral conditions are established complications in terminally-ill cancer patients. Yet despite significant morbidity, the characteristics and impact of oral conditions in these patients are poorly documented. The study objective was to characterize oral conditions in terminally-ill cancer patients to determine the presence, severity, and the functional and social impact of these oral conditions. Methods: This was an observational clinical study including terminally-ill cancer patients (2.5-3 week life expectancy). Data were obtained via the Oral Problems Scale (OPS), which measures the presence of subjective xerostomia, orofacial pain, taste change, and the functional/social impact of oral conditions, and a demographic questionnaire. A standardized oral examination was used to assess objective salivary hypofunction, fungal infection, mucosal erythema, and ulceration. Regression analysis and t tests investigated the associations between measures. Results: Of 104 participants, most were ≥50 years of age, female, and high-school educated; 45% were African American, 43% Caucasian, and 37% married. Oral condition frequencies were: salivary hypofunction (98%), mucosal erythema (50%), ulceration (20%), fungal infection (36%), and other oral problems (46%). Xerostomia, taste change, and orofacial pain all had significant functional impact; patients with oral ulcerations had significantly more orofacial pain with a social impact than patients without ulcers (p=.003). Erythema was significantly associated with fungal infection and with mucosal ulceration. Conclusions: Oral conditions significantly affect functional and social activities in terminally-ill cancer patients. Identification and management of oral conditions in these patients should therefore be an important clinical consideration. PMID:24232310
Hygienic, sanitary, physical, and functional conditions of Brazilian public school food services
Almeida, Kênia Machado de; André, Maria Cláudia Porfirio; Campos, Maria Raquel Hidalgo; Díaz, Mário Ernesto Piscoya
2014-01-01
OBJECTIVE: To verify the physical, functional, hygienic, and sanitary conditions of the food services of municipal schools located in the Brazilian Midwest region. METHODS: This is a cross-sectional study of 296 school food services conducted from February to June 2012. The food services were assessed by a semi-structured check list divided into the following sections: physical conditions, available equipment, food handlers' conduct, and food service cleaning processes and procedures. Th...
The Conditions for Functional Mechanisms of Compensation and Reward for Environmental Services
Directory of Open Access Journals (Sweden)
Brent M. Swallow
2010-12-01
Mechanisms of compensation and reward for environmental services (CRES) are becoming increasingly contemplated as means for managing human-environment interactions. Most of the functional mechanisms in the tropics have been developed within the last 15 years; many developing countries still have had little experience with functional mechanisms. We consider the conditions that foster the origin and implementation of functional mechanisms. Deductive and inductive approaches are combined. Eight hypotheses are derived from theories of institution and policy change. Five case studies, from Latin America, Africa, and Asia, are then reviewed according to a common framework. The results suggest the following to be important conditions for functional CRES mechanisms: (1) localized scarcity of particular environmental services, (2) influence from international environmental agreements and international organizations, (3) government policies and public attitudes favoring a mixture of regulatory and market-based instruments, and (4) security of individual and group property rights.
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Ahmed M. A. El-Sayed; Ebtisam O. Bin-Taher
2011-01-01
In this article, we prove the existence of positive nondecreasing solutions for multi-term fractional-order functional differential equations. We consider Cauchy boundary problems with nonlocal conditions, two-point boundary conditions, integral conditions, and deviated arguments.
Directory of Open Access Journals (Sweden)
Disha Gupta-Ostermann
2015-03-01
In a previous Method Article, we have presented the ‘Structure-Activity Relationship (SAR) Matrix’ (SARM) approach. The SARM methodology is designed to systematically extract structurally related compound series from screening or chemical optimization data and organize these series and associated SAR information in matrices reminiscent of R-group tables. SARM calculations also yield many virtual candidate compounds that form a “chemical space envelope” around related series. To further extend the SARM approach, different methods are developed to predict the activity of virtual compounds. In this follow-up contribution, we describe an activity prediction method that derives conditional probabilities of activity from SARMs and report representative results of first prospective applications of this approach.
Directory of Open Access Journals (Sweden)
Disha Gupta-Ostermann
2015-04-01
In a previous Method Article, we have presented the ‘Structure-Activity Relationship (SAR) Matrix’ (SARM) approach. The SARM methodology is designed to systematically extract structurally related compound series from screening or chemical optimization data and organize these series and associated SAR information in matrices reminiscent of R-group tables. SARM calculations also yield many virtual candidate compounds that form a “chemical space envelope” around related series. To further extend the SARM approach, different methods are developed to predict the activity of virtual compounds. In this follow-up contribution, we describe an activity prediction method that derives conditional probabilities of activity from SARMs and report representative results of first prospective applications of this approach.
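The notion of deriving conditional activity probabilities from matrix-organized SAR data can be sketched in a few lines. The records, the R-group labels, and the Laplace smoothing below are illustrative assumptions, not the SARM method itself:

```python
# Hypothetical SAR-matrix-like table: each record is
# (core scaffold, R-group, active flag). Virtual compounds carrying a
# known R-group can then be scored with P(active | r_group), estimated
# from the observed counts with Laplace smoothing.
records = [
    ("coreA", "R1", 1), ("coreA", "R2", 0), ("coreB", "R1", 1),
    ("coreB", "R2", 0), ("coreC", "R1", 0), ("coreC", "R3", 1),
]

def p_active_given_r(r_group, alpha=1.0):
    hits = [act for _, r, act in records if r == r_group]
    # Laplace-smoothed Bernoulli estimate: alpha pseudo-counts per class.
    return (sum(hits) + alpha) / (len(hits) + 2.0 * alpha)

print(p_active_given_r("R1"))  # (2 + 1) / (3 + 2) = 0.6
```

The smoothing keeps scores defined for sparsely observed substituents, which matters because the virtual compounds in the SARM "chemical space envelope" are exactly the cells with few or no measurements.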
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
M.J. Crop (Meindert); C.C. Baan (Carla); S.S. Korevaar (Sander); J.N.M. IJzermans (Jan); M. Pescatori (Mario); A. Stubbs (Andrew); W.F.J. van IJcken (Wilfred); M.H. Dahlke (Marc); E. Eggenhofer (Elke); W. Weimar (Willem); M.J. Hoogduijn (Martin)
2010-01-01
There is emerging interest in the application of mesenchymal stem cells (MSC) for the prevention and treatment of autoimmune diseases, graft-versus-host disease and allograft rejection. It is, however, unknown how inflammatory conditions affect phenotype and function of MSC. Adipose
Communication: Two types of flat-planes conditions in density functional theory.
Yang, Xiaotian Derrick; Patel, Anand H G; Miranda-Quintana, Ramón Alain; Heidar-Zadeh, Farnaz; González-Espinoza, Cristina E; Ayers, Paul W
2016-07-21
Using results from atomic spectroscopy, we show that there are two types of flat-planes conditions. The first type of flat-planes condition occurs when the energy as a function of the number of electrons of each spin, Nα and Nβ, has a derivative discontinuity on a line segment where the number of electrons, Nα + Nβ, is an integer. The second type of flat-planes condition occurs when the energy has a derivative discontinuity on a line segment where the spin polarization, Nα - Nβ, is an integer, but does not have a discontinuity associated with an integer number of electrons. Type 2 flat planes are rare: we observed just 15 type 2 flat-planes conditions out of the 4884 cases we tested, but their mere existence has implications for the design of exchange-correlation energy density functionals. To facilitate the development of functionals that have the correct behavior with respect to both fractional number of electrons and fractional spin polarization, we present a dataset for the chromium atom and its ions that can be used to test new functionals.
Communication: Two types of flat-planes conditions in density functional theory
Energy Technology Data Exchange (ETDEWEB)
Yang, Xiaotian Derrick; Patel, Anand H. G.; González-Espinoza, Cristina E.; Ayers, Paul W. [Department of Chemistry and Chemical Biology, McMaster University, Hamilton, Ontario L8S 4M1 (Canada); Miranda-Quintana, Ramón Alain [Department of Chemistry and Chemical Biology, McMaster University, Hamilton, Ontario L8S 4M1 (Canada); Laboratory of Computational and Theoretical Chemistry, Faculty of Chemistry, University of Havana, Havana (Cuba); Heidar-Zadeh, Farnaz [Department of Chemistry and Chemical Biology, McMaster University, Hamilton, Ontario L8S 4M1 (Canada); Department of Inorganic and Physical Chemistry, Ghent University, Krijgslaan 281 (S3), 9000 Gent (Belgium); Center for Molecular Modeling, Ghent University, Technologiepark 903, 9052 Zwijnaarde (Belgium)
2016-07-21
Using results from atomic spectroscopy, we show that there are two types of flat-planes conditions. The first type of flat-planes condition occurs when the energy as a function of the number of electrons of each spin, Nα and Nβ, has a derivative discontinuity on a line segment where the number of electrons, Nα + Nβ, is an integer. The second type of flat-planes condition occurs when the energy has a derivative discontinuity on a line segment where the spin polarization, Nα - Nβ, is an integer, but does not have a discontinuity associated with an integer number of electrons. Type 2 flat planes are rare: we observed just 15 type 2 flat-planes conditions out of the 4884 cases we tested, but their mere existence has implications for the design of exchange-correlation energy density functionals. To facilitate the development of functionals that have the correct behavior with respect to both fractional number of electrons and fractional spin polarization, we present a dataset for the chromium atom and its ions that can be used to test new functionals.
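The derivative discontinuities described above are easiest to see in the one-variable analogue. In this sketch the integer-electron energies are made-up numbers: exact density functional theory gives a piecewise-linear E(N) between integer electron counts, so the left and right slopes at an integer N differ.

```python
import math

# Assumed (made-up) ground-state energies at integer electron numbers.
E_int = {4: -10.0, 5: -13.0, 6: -14.2}

def energy(n):
    # Piecewise-linear interpolation of E(N) for fractional N,
    # the 1-D analogue of the flat-planes conditions.
    lo = int(math.floor(n))
    if lo == n:
        return E_int[int(n)]
    w = n - lo
    return (1.0 - w) * E_int[lo] + w * E_int[lo + 1]

# Finite-difference slopes just below and just above N = 5.
left = (energy(5.0) - energy(4.9)) / 0.1
right = (energy(5.1) - energy(5.0)) / 0.1
print(left, right)  # -3.0 vs -1.2: the derivative jumps at the integer
```

The two-spin flat-planes conditions extend this picture to the (Nα, Nβ) plane, where the type 1 and type 2 discontinuities lie along segments of integer Nα + Nβ and integer Nα - Nβ, respectively.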
Hill, R; Larkum, A W D; Frankart, C; Kühl, M; Ralph, P J
2004-01-01
Mass coral bleaching is linked to elevated sea surface temperatures, 1-2 degrees C above average, during periods of intense light. These conditions induce the expulsion of zooxanthellae from the coral host in response to photosynthetic damage in the algal symbionts. The mechanism that triggers this release has not been clearly established, and to further our knowledge of this process, fluorescence rise kinetics have been studied for the first time. Corals that were exposed to elevated temperature (33 degrees C) and light (280 μmol photons m⁻² s⁻¹) showed distinct changes in the fast polyphasic induction of chlorophyll-a fluorescence, indicating biophysical changes in the photochemical processes. The fluorescence rise over the first 2000 ms was monitored in three species of corals for up to 8 h, with a PEA fluorometer and an imaging-PAM. Pocillopora damicornis showed the least impact on the photosynthetic apparatus, while Acropora nobilis was the most sensitive, with Cyphastrea serailia intermediate between the other two species. A. nobilis showed a remarkable capacity for recovery from bleaching conditions. For all three species, a steady decline in the slope of the initial rise and the height of the J-transient was observed, indicating the loss of functional Photosystem II (PS II) centres under elevated-temperature conditions. A significant loss of PS II centres was confirmed by a decline in photochemical quenching when exposed to bleaching stress. Non-photochemical quenching was identified as a significant mechanism for dissipating excess energy as heat under the bleaching conditions. Photophosphorylation could explain this decline in PS II activity. State transitions, a component of non-photochemical quenching, were a probable cause of the high non-photochemical quenching during bleaching, and this mechanism is associated with the phosphorylation-induced dissociation of the light-harvesting complexes from the PS II reaction centres. This reversible process may
Role of Vitamin D in Maintaining Renal Epithelial Barrier Function in Uremic Conditions
Directory of Open Access Journals (Sweden)
Milos Mihajlovic
2017-11-01
Full Text Available As current kidney replacement therapies are not efficient enough for end-stage renal disease (ESRD) treatment, a bioartificial kidney (BAK) device, based on conditionally immortalized human proximal tubule epithelial cells (ciPTEC), could represent an attractive solution. The active transport activity of such a system was recently demonstrated. In addition, endocrine functions of the cells, such as vitamin D activation, are relevant. The organic anion transporter 1 (OAT-1)-overexpressing ciPTEC line presented 1α-hydroxylase (CYP27B1), 24-hydroxylase (CYP24A1) and vitamin D receptor (VDR), responsible for vitamin D activation, degradation and function, respectively. The ability to produce and secrete 1α,25-dihydroxy-vitamin D3 was shown after incubation with the precursor, 25-hydroxy-vitamin D3. The beneficial effect of vitamin D on cell function and behavior in uremic conditions was studied in the presence of an anionic uremic toxins mixture. Vitamin D could restore cell viability, and inflammatory and oxidative status, as shown by cell metabolic activity, interleukin-6 (IL-6) levels and reactive oxygen species (ROS) production, respectively. Finally, vitamin D restored transepithelial barrier function, as evidenced by decreased inulin-FITC leakage in biofunctionalized hollow fiber membranes (HFM) carrying ciPTEC-OAT1. In conclusion, the protective effects of vitamin D in uremic conditions and proven ciPTEC-OAT1 endocrine function encourage the use of these cells for BAK application.
Tamburrini, M; Romano, M; Giardina, B; di Prisco, G
1999-02-01
In the framework of a study on molecular adaptations of the oxygen-transport and storage systems to extreme conditions in Antarctic marine organisms, we have investigated the structure/function relationship in Emperor penguin (Aptenodytes forsteri) myoglobin, in search of correlation with the bird life style. In contrast with previous reports, the revised amino acid sequence contains one additional residue and 15 differences. The oxygen-binding parameters seem well adapted to the diving behaviour of the penguin and to the environmental conditions of the Antarctic habitat. Addition of lactate has no major effect on myoglobin oxygenation over a large temperature range. Therefore, metabolic acidosis does not impair myoglobin function under conditions of prolonged physical effort, such as diving.
Directory of Open Access Journals (Sweden)
Westerholm Roger
2010-07-01
Full Text Available Abstract Background Traffic emissions, including diesel engine exhaust, are associated with increased respiratory and cardiovascular morbidity and mortality. Controlled human exposure studies have demonstrated impaired vascular function after inhalation of exhaust generated by a diesel engine under idling conditions. Objectives To assess the vascular and fibrinolytic effects of exposure to diesel exhaust generated during urban-cycle running conditions that mimic ambient 'real-world' exposures. Methods In a randomised double-blind crossover study, eighteen healthy male volunteers were exposed to diesel exhaust (approximately 250 μg/m³) or filtered air for one hour during intermittent exercise. Diesel exhaust was generated during the urban part of the standardized European Transient Cycle (ETC). Six hours post-exposure, vascular vasomotor and fibrinolytic function was assessed during venous occlusion plethysmography with intra-arterial agonist infusions. Measurements and Main Results Forearm blood flow increased in a dose-dependent manner with both endothelial-dependent (acetylcholine and bradykinin) and endothelial-independent (sodium nitroprusside and verapamil) vasodilators. Diesel exhaust exposure attenuated the vasodilatation to acetylcholine (P Conclusion Exposure to diesel exhaust generated under transient running conditions, as a relevant model of urban air pollution, impairs vasomotor function and endogenous fibrinolysis in a similar way as exposure to diesel exhaust generated at idling. This indicates that adverse vascular effects of diesel exhaust inhalation occur over different running conditions, with varying exhaust composition and concentrations as well as physicochemical particle properties. Importantly, exposure to diesel exhaust under ETC conditions was also associated with a novel finding of impaired calcium channel-dependent vasomotor function. This implies that certain cardiovascular endpoints seem to be related to general diesel
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
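Among the topics this text covers is the law of large numbers; a minimal simulation (my own sketch, not an exercise from the book) shows the sample mean of fair coin flips approaching the true probability 0.5 as the number of trials grows:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def sample_mean_of_flips(n):
    """Mean of n simulated fair coin flips (1 = heads, 0 = tails)."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# By the law of large numbers the sample mean converges to p = 0.5.
for n in (100, 10_000, 1_000_000):
    print(n, sample_mean_of_flips(n))
```

The deviation from 0.5 shrinks roughly like 1/sqrt(n), which is the rate the frequency interpretation of probability implicitly relies on.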
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Functional brain networks involved in decision-making under certain and uncertain conditions
Energy Technology Data Exchange (ETDEWEB)
Farrar, Danielle C.; Moss, Mark B.; Killiany, Ronald J. [Boston University School of Medicine, Department of Anatomy and Neurobiology, Boston, MA (United States); Mian, Asim Z. [Boston University School of Medicine, Department of Radiology, Boston, MA (United States); Budson, Andrew E. [VA Boston Healthcare System, Boston, MA (United States)
2018-01-15
The aim of this study was to describe imaging markers of decision-making under uncertain conditions in normal individuals, in order to provide baseline activity to compare to impaired decision-making in pathological states. In this cross-sectional study, 19 healthy subjects ages 18-35 completed a novel decision-making card-matching task using a Philips 3T scanner and a 32-channel head coil. Functional data were collected in six functional runs. In one condition of the task, the participant was certain of the rule to apply to match the cards; in the other condition, the participant was uncertain. We performed cluster-based comparison of the two conditions using the FSL fMRI Expert Analysis Tool and network-based analysis using MATLAB. The uncertain > certain comparison yielded three clusters - a midline cluster that extended through the midbrain, the thalamus, the bilateral prefrontal cortex and the striatum, plus bilateral parietal/occipital clusters. The certain > uncertain comparison yielded bilateral clusters in the insula, parietal and temporal lobes, as well as a medial frontal cluster. A larger, more connected functional network was found in the uncertain condition. The involvement of the insula, parietal cortex, temporal cortex, ventromedial prefrontal cortex, and orbitofrontal cortex in the certain condition reinforces the notion that certainty is inherently rewarding. For the uncertain condition, involvement of the prefrontal cortex, parietal cortex, striatum, thalamus, amygdala, and hippocampus was expected, as these are areas involved in resolving uncertainty and rule updating. Occipital cortical and midbrain involvement may be attributed to increased visual attention and increased motor control. (orig.)
Functional brain networks involved in decision-making under certain and uncertain conditions
International Nuclear Information System (INIS)
Farrar, Danielle C.; Moss, Mark B.; Killiany, Ronald J.; Mian, Asim Z.; Budson, Andrew E.
2018-01-01
The aim of this study was to describe imaging markers of decision-making under uncertain conditions in normal individuals, in order to provide baseline activity to compare to impaired decision-making in pathological states. In this cross-sectional study, 19 healthy subjects ages 18-35 completed a novel decision-making card-matching task using a Philips 3T scanner and a 32-channel head coil. Functional data were collected in six functional runs. In one condition of the task, the participant was certain of the rule to apply to match the cards; in the other condition, the participant was uncertain. We performed cluster-based comparison of the two conditions using the FSL fMRI Expert Analysis Tool and network-based analysis using MATLAB. The uncertain > certain comparison yielded three clusters - a midline cluster that extended through the midbrain, the thalamus, the bilateral prefrontal cortex and the striatum, plus bilateral parietal/occipital clusters. The certain > uncertain comparison yielded bilateral clusters in the insula, parietal and temporal lobes, as well as a medial frontal cluster. A larger, more connected functional network was found in the uncertain condition. The involvement of the insula, parietal cortex, temporal cortex, ventromedial prefrontal cortex, and orbitofrontal cortex in the certain condition reinforces the notion that certainty is inherently rewarding. For the uncertain condition, involvement of the prefrontal cortex, parietal cortex, striatum, thalamus, amygdala, and hippocampus was expected, as these are areas involved in resolving uncertainty and rule updating. Occipital cortical and midbrain involvement may be attributed to increased visual attention and increased motor control. (orig.)
Probability elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Makhtar, Siti Noormiza; Senik, Mohd Harizal
2018-02-01
The availability of massive amounts of neuronal signals is attracting widespread interest in functional connectivity analysis. Functional interactions estimated by multivariate partial coherence analysis in the frequency domain represent the connectivity strength in this study. Modularity is a network measure for the detection of community structure in network analysis. The discovery of community structure for the functional neuronal network was implemented on multi-electrode array (MEA) signals recorded from hippocampal regions in isoflurane-anaesthetized Lister-hooded rats. The analysis is expected to show modularity changes before and after local unilateral kainic acid (KA)-induced epileptiform activity. The result is presented using a color-coded graphic of the conditional modularity measure for 19 MEA nodes. This network is separated into four sub-regions to show the community detection within each sub-region. The results show that classification of neuronal signals into inter- and intra-modular nodes is feasible using conditional modularity analysis. Estimation of segregation properties using conditional modularity analysis may provide further information about functional connectivity from MEA data.
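The modularity measure used above can be sketched in a few lines. This is the standard Newman modularity for a fixed partition (a generic illustration on a toy graph, not the authors' conditional variant or their MEA data):

```python
from collections import defaultdict

def modularity(edges, communities):
    """Newman modularity Q = sum_c [ e_c/m - (d_c/(2m))^2 ] for an
    undirected, unweighted graph: e_c is the number of edges inside
    community c and d_c the total degree of its nodes."""
    m = len(edges)
    degree = defaultdict(int)
    intra = 0
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
        if communities[u] == communities[v]:
            intra += 1
    deg_sum = defaultdict(int)
    for node, c in communities.items():
        deg_sum[c] += degree[node]
    expected = sum((d / (2 * m)) ** 2 for d in deg_sum.values())
    return intra / m - expected

# Two triangles joined by one bridge edge, split into the obvious communities.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
part = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
print(round(modularity(edges, part), 4))  # → 0.3571
```

A high Q means many more intra-community edges than a degree-matched random graph would produce; community detection searches over partitions for the one maximizing Q.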
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
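The classical conditionalization that the paraconsistent version generalizes is ordinary Bayesian updating, P(H|E) ∝ P(E|H)P(H). A minimal sketch (the hypothesis names and numbers below are illustrative, not from the paper):

```python
def bayes_update(prior, likelihood):
    """Classical conditionalization: P(H|E) = P(E|H) P(H) / P(E).

    prior: dict hypothesis -> P(H); likelihood: dict hypothesis -> P(E|H).
    """
    joint = {h: prior[h] * likelihood[h] for h in prior}
    evidence = sum(joint.values())  # P(E), assumed non-zero
    return {h: p / evidence for h, p in joint.items()}

# A diagnostic test with 95% sensitivity, 5% false-positive rate, 1% base rate:
posterior = bayes_update(
    prior={"disease": 0.01, "healthy": 0.99},
    likelihood={"disease": 0.95, "healthy": 0.05},  # P(positive | H)
)
print(round(posterior["disease"], 3))  # → 0.161
```

The paraconsistent setting modifies exactly this step so that updating remains well defined when the evidence set is contradictory, which classical probability rules out by construction.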
Bohmian Conditional Wave Functions (and the status of the quantum state)
International Nuclear Information System (INIS)
Norsen, Travis
2016-01-01
The de Broglie-Bohm pilot-wave theory - uniquely among realistic candidate quantum theories - allows a straightforward and simple definition of the wave function of a subsystem of some larger system (such as the entire universe). Such sub-system wave functions are called “Conditional Wave Functions” (CWFs). Here we explain this concept and indicate the CWF's role in the Bohmian explanation of the usual quantum formalism, and then develop (and motivate) the more speculative idea that something like single-particle wave functions could replace the (ontologically problematical) universal wave function in some future, empirically adequate, pilot-wave-type theory. Throughout, the presentation is pedagogical, and points are illustrated with simple toy models. (paper)
Directory of Open Access Journals (Sweden)
Gildo Almeida da Silva
2011-06-01
Full Text Available The aim of this work was to study the production of functional killer protein in yeast culture. The cells of Saccharomyces cerevisiae Embrapa 1B (K+R+) killed a strain of Saccharomyces cerevisiae Embrapa 26B (K-R-) in grape must and YEPD media. The lethal effect of toxin-containing supernatant, the effect of aeration upon functional killer protein production, and the correlation between the products of anaerobic metabolism and functional toxin formation were evaluated. The results showed that at low sugar concentration, the toxin of the killer strain of Sacch. cerevisiae was only produced under anaerobic conditions. The system of killer protein production was shown to be regulated by the Pasteur and Crabtree effects. As soon as ethanol was formed, the functional killer toxin was produced. The synthesis of the active killer toxin appeared to be associated with the switch to the fermentation process and with concomitant alcohol dehydrogenase (ADH) activity.
International Nuclear Information System (INIS)
Aftab, S.
2013-01-01
Excellent electrical, mechanical, optical and thermal properties are attributed to carbon nanotubes. Carbon nanotubes need to be functionalized to form a homogeneous dispersion. In this work, catalytically produced carbon nanotubes have been functionalized under two different conditions using the same acid medium. The effect of the two reaction routes on the carbon nanotubes, in terms of the extent of covalent functionalization has been determined by several techniques. Scanning electron microscopy aided in the observation of their morphology and X-ray diffraction was used to ascertain their structure. Other analytical characterization tools employed were Fourier transform infrared spectroscopy, Zeta potential measurement, UV spectroscopy, Oxygen percentage analysis, Boehm's titration and visual dispersion. Results show that carbon nanotubes functionalized by refluxing in the acids are much better dispersed. (author)
Directory of Open Access Journals (Sweden)
Ariana E Sutton-Grier
2011-02-01
Full Text Available Global biodiversity loss has prompted research on the relationship between species diversity and ecosystem functioning. Few studies have examined how plant diversity impacts belowground processes; even fewer have examined how varying resource levels can influence the effect of plant diversity on microbial activity. In a field experiment in a restored wetland, we examined the role of plant trait diversity (or functional diversity, FD), and its interactions with natural levels of variability of soil properties, on a microbial process, denitrification potential (DNP). We demonstrated that FD significantly affected microbial DNP through its interactions with soil conditions; increasing FD led to increased DNP but mainly at higher levels of soil resources. Our results suggest that the effect of species diversity on ecosystem functioning may depend on environmental factors such as resource availability. Future biodiversity experiments should examine how natural levels of environmental variability impact the importance of biodiversity to ecosystem functioning.
Prefrontal-limbic Functional Connectivity during Acquisition and Extinction of Conditioned Fear.
Barrett, Douglas W; Gonzalez-Lima, F
2018-04-15
This study is a new analysis to obtain novel metabolic data on the functional connectivity of prefrontal-limbic regions in Pavlovian fear acquisition and extinction of tone-footshock conditioning. Mice were analyzed with the fluorodeoxyglucose (FDG) autoradiographic method to metabolically map regional brain activity. New FDG data were sampled from the nuclei of the habenula and other regions implicated in aversive conditioning, such as infralimbic cortex, amygdala and periaqueductal gray regions. The activity patterns among these regions were inter-correlated during acquisition, extinction or pseudorandom training to develop a functional connectivity model. Two subdivisions of the habenular complex showed increased activity after acquisition relative to extinction, with the pseudorandom group intermediate between the other two groups. Significant acquisition activation effects were also found in centromedial amygdala, dorsomedial and ventrolateral periaqueductal gray. FDG uptake increases during extinction were found only in dorsal and ventral infralimbic cortex. The overall pattern of activity correlations between these regions revealed extensive but differential functional connectivity during acquisition and extinction training, with less functional connectivity found after pseudorandom training. Interestingly, habenula nuclei showed a distinct pattern of inter-correlations with amygdala nuclei during extinction. The functional connectivity model revealed changing interactions among infralimbic cortex, amygdala, habenula and periaqueductal gray regions through the stages of Pavlovian fear acquisition and extinction. This study provided new data on the contributions of the habenula to fear conditioning, and revealed previously unreported infralimbic-amygdala-habenula-periaqueductal gray interactions implicated in acquisition and extinction of conditioned fear. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
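The inter-correlation step behind a functional connectivity model of this kind can be sketched simply: compute pairwise correlations of regional activity across subjects. The region names and uptake values below are toy data for illustration, not the study's measurements:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_matrix(regions):
    """Pairwise activity correlations; regions: dict name -> values per subject."""
    names = sorted(regions)
    return {(a, b): pearson(regions[a], regions[b])
            for i, a in enumerate(names) for b in names[i + 1:]}

# Toy FDG-uptake values (arbitrary units) for three regions across 5 animals.
regions = {
    "habenula":    [1.0, 1.2, 0.9, 1.4, 1.1],
    "amygdala":    [0.8, 1.1, 0.7, 1.3, 0.9],
    "infralimbic": [1.3, 1.0, 1.2, 0.9, 1.1],
}
for pair, r in connectivity_matrix(regions).items():
    print(pair, round(r, 2))
```

In a study like the one above, each training group (acquisition, extinction, pseudorandom) yields its own correlation matrix, and the pattern of strong pairwise correlations within each group is read as that group's functional network.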
Facet-Dependent Oxidative Goethite Growth As a Function of Aqueous Solution Conditions.
Strehlau, Jennifer H; Stemig, Melissa S; Penn, R Lee; Arnold, William A
2016-10-04
Nitroaromatic compounds are groundwater pollutants that can be degraded through reactions with Fe(II) adsorbed on iron oxide nanoparticles, although little is known about the evolving reactivity of the minerals with continuous pollutant exposure. In this work, Fe(II)/goethite reactivity toward 4-chloronitrobenzene (4-ClNB) as a function of pH, organic matter presence, and reactant concentrations was explored using sequential-spike batch reactors. Reaction rate constants were smaller with lower pH, introduction of organic matter, and diluted reactant concentrations as compared to a reference condition. Reaction rate constants did not change with the number of 4-ClNB spikes for all reaction conditions. Under all conditions, oxidative goethite growth was demonstrated through X-ray diffraction, magnetic characterization, and transmission electron microscopy. Nonparametric statistics were applied to compare histograms of lengths and widths of goethite nanoparticles as a function of varied solution conditions. The conditions that slowed the reaction also resulted in statistically shorter and wider particles than for the faster reactions. Additionally, added organic matter interfered with particle growth on the favorable {021} faces to a greater extent, with statistically reduced rate of growth on the tip facets and increased rate of growth on the side facets. These data demonstrate that oxidative growth of goethite in aqueous systems is dependent on major groundwater variables, such as pH and the presence of organic matter, which could lead to the evolving reactivity of goethite particles in natural environments.
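Rate constants for contaminant loss in batch reactors like these are commonly estimated from a pseudo-first-order fit, k from the slope of ln(C/C0) versus time. The sketch below uses synthetic data and a generic least-squares fit; it illustrates the standard technique, not the authors' specific analysis:

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Least-squares slope of ln(C/C0) vs t gives -k for C = C0 * exp(-k t)."""
    c0 = concentrations[0]
    ys = [math.log(c / c0) for c in concentrations]
    n = len(times)
    x_mean = sum(times) / n
    y_mean = sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(times, ys))
             / sum((x - x_mean) ** 2 for x in times))
    return -slope

# Synthetic decay with k = 0.2 per hour; the fit recovers the input constant.
t = [0.0, 1.0, 2.0, 3.0, 4.0]
c = [1.0 * math.exp(-0.2 * ti) for ti in t]
print(round(pseudo_first_order_k(t, c), 3))  # → 0.2
```

Comparing k across spikes and across conditions (pH, organic matter, dilution), as done above, is then a direct test of whether reactivity evolves with continued pollutant exposure.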
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
Bitsakis, E.I.; Nicolaides, C.A.
1989-01-01
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Lizen, Benoit; Moens, Charlotte; Mouheiche, Jinane; Sacré, Thomas; Ahn, Marie-Thérèse; Jeannotte, Lucie; Salti, Ahmad; Gofflot, Françoise
2017-01-01
Hoxa5 is a member of the Hox gene family that plays critical roles in successive steps of the central nervous system formation during embryonic and fetal development. In the mouse, Hoxa5 was recently shown to be expressed in the medulla oblongata and the pons from fetal stages to adulthood. In these territories, Hoxa5 transcripts are enriched in many precerebellar neurons and several nuclei involved in autonomic functions, while the HOXA5 protein is detected mainly in glutamatergic and GABAergic neurons. However, whether HOXA5 is functionally required in these neurons after birth remains unknown. As a first approach to tackle this question, we aimed at determining the molecular programs downstream of the HOXA5 transcription factor in the context of the postnatal brainstem. A comparative transcriptomic analysis was performed in combination with gene expression localization, using a conditional postnatal Hoxa5 loss-of-function mouse model. After inactivation of Hoxa5 at postnatal days (P)1–P4, we established the transcriptome of the brainstem from P21 Hoxa5 conditional mutants using RNA-Seq analysis. One major finding was the downregulation of several genes associated with synaptic function in Hoxa5 mutant specimens including different actors involved in glutamatergic synapse, calcium signaling pathway, and GABAergic synapse. Data were confirmed and extended by reverse transcription quantitative polymerase chain reaction analysis, and the expression of several HOXA5 candidate targets was shown to co-localize with Hoxa5 transcripts in precerebellar nuclei. Together, these new results revealed that HOXA5, through the regulation of key actors of the glutamatergic/GABAergic synapses and calcium signaling, might be involved in synaptogenesis, synaptic transmission, and synaptic plasticity of the cortico-ponto-cerebellar circuitry in the postnatal brainstem. PMID:29187810
Directory of Open Access Journals (Sweden)
Benoit Lizen
2017-11-01
Full Text Available Hoxa5 is a member of the Hox gene family that plays critical roles in successive steps of the central nervous system formation during embryonic and fetal development. In the mouse, Hoxa5 was recently shown to be expressed in the medulla oblongata and the pons from fetal stages to adulthood. In these territories, Hoxa5 transcripts are enriched in many precerebellar neurons and several nuclei involved in autonomic functions, while the HOXA5 protein is detected mainly in glutamatergic and GABAergic neurons. However, whether HOXA5 is functionally required in these neurons after birth remains unknown. As a first approach to tackle this question, we aimed at determining the molecular programs downstream of the HOXA5 transcription factor in the context of the postnatal brainstem. A comparative transcriptomic analysis was performed in combination with gene expression localization, using a conditional postnatal Hoxa5 loss-of-function mouse model. After inactivation of Hoxa5 at postnatal days (P)1–P4, we established the transcriptome of the brainstem from P21 Hoxa5 conditional mutants using RNA-Seq analysis. One major finding was the downregulation of several genes associated with synaptic function in Hoxa5 mutant specimens including different actors involved in glutamatergic synapse, calcium signaling pathway, and GABAergic synapse. Data were confirmed and extended by reverse transcription quantitative polymerase chain reaction analysis, and the expression of several HOXA5 candidate targets was shown to co-localize with Hoxa5 transcripts in precerebellar nuclei. Together, these new results revealed that HOXA5, through the regulation of key actors of the glutamatergic/GABAergic synapses and calcium signaling, might be involved in synaptogenesis, synaptic transmission, and synaptic plasticity of the cortico-ponto-cerebellar circuitry in the postnatal brainstem.
Schrödinger functional boundary conditions and improvement for N > 3
DEFF Research Database (Denmark)
Hietanen, A.; Karavirta, T.; Vilaseca, P.
2014-01-01
The standard method to calculate non-perturbatively the evolution of the running coupling of an SU(N) gauge theory is based on the Schrodinger functional (SF). In this paper we construct a family of boundary fields for general values of N which enter the standard definition of the SF coupling. We provide spatial boundary conditions for fermions in several representations which reduce the condition number of the squared Dirac operator. In addition, we calculate the improvement coefficients for N > 3 needed to remove boundary cutoff effects from the gauge action. After this, residual cutoff effects...
Energy Technology Data Exchange (ETDEWEB)
Lopez, J. Gonzalez [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Renner, D.B. [Jefferson Lab, Newport News, VA (United States); Shindler, A. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik
2012-08-23
The use of chirally rotated boundary conditions provides a formulation of the Schroedinger functional that is compatible with automatic O(a) improvement of Wilson fermions up to O(a) boundary contributions. The elimination of bulk O(a) effects requires the non-perturbative tuning of the critical mass and one additional boundary counterterm. We present the results of such a tuning in a quenched setup for several values of the renormalized gauge coupling, from perturbative to nonperturbative regimes, and for a range of lattice spacings. We also check that the correct boundary conditions and symmetries are restored in the continuum limit. (orig.)
International Nuclear Information System (INIS)
Lopez, J. Gonzalez; Jansen, K.; Renner, D.B.; Shindler, A.
2012-01-01
The use of chirally rotated boundary conditions provides a formulation of the Schroedinger functional that is compatible with automatic O(a) improvement of Wilson fermions up to O(a) boundary contributions. The elimination of bulk O(a) effects requires the non-perturbative tuning of the critical mass and one additional boundary counterterm. We present the results of such a tuning in a quenched setup for several values of the renormalized gauge coupling, from perturbative to nonperturbative regimes, and for a range of lattice spacings. We also check that the correct boundary conditions and symmetries are restored in the continuum limit. (orig.)
International Nuclear Information System (INIS)
Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K
2011-01-01
We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF₆, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
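The conditioning described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' analysis pipeline: it builds a synthetic 1-D velocity record whose small-scale fluctuations are (by construction) stronger when the large-scale component deviates from the mean, then compares the second-order structure function conditioned on the large-scale velocity near the mean versus far from it. All signal parameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D velocity record: a slowly varying large-scale component plus
# small-scale fluctuations whose amplitude grows with the local large-scale
# velocity (mimicking the dependence reported in the abstract).
n = 200_000
t = np.arange(n)
u_large = np.interp(t, np.linspace(0, n, 200), rng.normal(0.0, 1.0, 200))
u_small = rng.normal(0.0, 1.0, n) * (1.0 + 0.5 * np.abs(u_large))
u = u_large + u_small

def structure_function(u, r):
    """Second-order longitudinal structure function S2(r) = <(u(x+r)-u(x))^2>."""
    du = u[r:] - u[:-r]
    return np.mean(du ** 2)

def conditional_s2(u, u_large, r, lo, hi):
    """S2(r) conditioned on the large-scale velocity lying in [lo, hi) std devs."""
    du = u[r:] - u[:-r]
    uL = 0.5 * (u_large[r:] + u_large[:-r])   # large-scale velocity at midpoint
    s = uL / u_large.std()
    mask = (s >= lo) & (s < hi)
    return np.mean(du[mask] ** 2)

r = 10
s2_near_mean = conditional_s2(u, u_large, r, -0.5, 0.5)
s2_far = conditional_s2(u, u_large, r, 1.5, 2.5)
print(s2_far / s2_near_mean)   # > 1: larger S2 when the large scales deviate
```

By construction the ratio exceeds one, reproducing the qualitative behaviour the study reports for real flows.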
Directory of Open Access Journals (Sweden)
Ahmad Herison
2014-04-01
Full Text Available Mangrove ecosystem existence is important for environment and other organisms because of its ecological and economical values, so that management and preservation of mangrove ecosystem are needed. The purpose of this research was to determine the existing condition of mangrove, both its distribution and its functional transformation in Indah Kapuk Coastal Area. Avicennia marina becomes important as wave attenuation, a form of abrasion antidote. Transect-Square and Spot-Check methods were used to determine the existing condition of A.marina mangrove forests. Autocad program, coordinate converter, Google Earth, Google Map, and Arc View were applied in process of making mangrove distribution map. In western of research location exactly at Station 1 and Station 2, the density value of mangrove was 450 and 825 tree ha-1, respectively with sparse category because they were contaminated by waste and litter. In eastern of research location namely Station 3, Station 4, and Station 5 the mangroves grow well with density value of 650 (sparse, 1,500 (very dense, and 1,200 tree ha-1 (fair, respectively, eventhough the contamination still happened. The mangrove forests around the stations do not function as wave attenuation because there were many waterfront constructions which have replaced the function of mangrove forests to damp the wave. In short, it can be stated that the mangrove's function has changed in a case of wave attenuation. The function of mangrove forests is not determined by mangrove forest density but it is determined by mangrove's free position.
The Impact of Different Environmental Conditions on Cognitive Function: A Focused Review
Taylor, Lee; Watkins, Samuel L.; Marshall, Hannah; Dascombe, Ben J.; Foster, Josh
2016-01-01
Cognitive function defines performance in objective tasks that require conscious mental effort. Extreme environments, namely heat, hypoxia, and cold, can all alter human cognitive function due to a variety of psychological and/or biological processes. The aims of this Focused Review were to discuss: (1) the current state of knowledge on the effects of heat, hypoxic and cold stress on cognitive function, (2) the potential mechanisms underpinning these alterations, and (3) plausible interventions that may maintain cognitive function upon exposure to each of these environmental stressors. The available evidence suggests that the effects of heat, hypoxia, and cold stress on cognitive function are both task and severity dependent. Complex tasks are particularly vulnerable to extreme heat stress, whereas both simple and complex task performance appear to be vulnerable even at moderate altitudes. Cold stress also appears to negatively impact both simple and complex task performance; however, the research in this area is sparse in comparison to that on heat and hypoxia. In summary, this focused review provides updated knowledge regarding the effects of extreme environmental stressors on cognitive function and their biological underpinnings. Tyrosine supplementation may help individuals maintain cognitive function in very hot, hypoxic, and/or cold conditions. However, more research is needed to clarify these and other postulated interventions. PMID:26779029
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
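The classical "time spent on small sections of an orbit" picture mentioned above can be checked numerically. For a harmonic oscillator x(t) = A sin t, the fraction of time spent near position x is proportional to 1/|v(x)|, giving the density p(x) = 1/(π√(A² − x²)). This sketch (an illustration of the classical analogy, not of the fluid theory itself) compares a histogram of uniformly time-sampled positions with that analytic density:

```python
import numpy as np

# Harmonic oscillator x(t) = A*sin(t): sample positions at uniformly spaced
# times over many periods and compare the histogram with the "time spent"
# density p(x) = 1/(pi*sqrt(A^2 - x^2)), which peaks at the turning points.
A = 1.0
t = np.linspace(0, 2000 * np.pi, 2_000_001)
x = A * np.sin(t)

hist, edges = np.histogram(x, bins=50, range=(-A, A), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
p = 1.0 / (np.pi * np.sqrt(A**2 - centers**2))

# Away from the turning points (where p diverges and binning is delicate)
# the empirical and analytic densities agree closely.
inner = np.abs(centers) < 0.8 * A
print(np.max(np.abs(hist[inner] - p[inner])))  # small residual
```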
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
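One simple way probability enters quantum computing, as the abstract notes, is through measurement: amplitudes are complex numbers, and outcome probabilities are their squared magnitudes (the Born rule). A minimal sketch, using a Hadamard gate on a single qubit:

```python
import numpy as np

# A single-qubit state is a unit vector of complex amplitudes; measurement
# probabilities are |amplitude|^2 (the Born rule). A Hadamard gate applied
# to |0> gives equal amplitudes, so repeated measurement is ~50/50.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

psi = H @ ket0              # (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2    # [0.5, 0.5]

# Simulate repeated measurements of identically prepared qubits.
rng = np.random.default_rng(1)
outcomes = rng.choice([0, 1], size=100_000, p=probs)
print(probs, outcomes.mean())   # mean close to 0.5
```

Each individual outcome is irreducibly random; only the ensemble statistics are predicted, which is the interpretive puzzle the article discusses.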
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
A Necessary Moment Condition for the Fractional Functional Central Limit Theorem
DEFF Research Database (Denmark)
Johansen, Søren; Nielsen, Morten Ørregaard
We discuss the moment condition for the fractional functional central limit theorem (FCLT) for partial sums of x(t)=Δ^(-d)u(t), where d ∈ (-1/2,1/2) is the fractional integration parameter and u(t) is weakly dependent. The classical condition is existence of q>max(2,(d+1/2)^(-1)) moments ... of the innovation sequence. When d is close to -1/2 this moment condition is very strong. Our main result is to show that under some relatively weak conditions on u(t), the existence of q≥max(2,(d+1/2)^(-1)) moments is in fact necessary for the FCLT for fractionally integrated processes and that q>max(2,(d+1/2)^(-1)) moments ... are necessary and sufficient for more general fractional processes. Davidson and de Jong (2000) presented a fractional FCLT where only q>2 finite moments are assumed, which is remarkable because it is the only FCLT where the moment condition has been weakened relative to the earlier condition. As a corollary ...
A necessary moment condition for the fractional functional central limit theorem
DEFF Research Database (Denmark)
Johansen, Søren; Nielsen, Morten Ørregaard
We discuss the moment condition for the fractional functional central limit theorem (FCLT) for partial sums of x_{t}=Delta^{-d}u_{t}, where d in (-1/2,1/2) is the fractional integration parameter and u_{t} is weakly dependent. The classical condition is existence of q>max(2,(d+1/2)^{-1}) moments ... of the innovation sequence. When d is close to -1/2 this moment condition is very strong. Our main result is to show that under some relatively weak conditions on u_{t}, the existence of q≥max(2,(d+1/2)^{-1}) moments is in fact necessary for the FCLT for fractionally integrated processes and that q>max(2,(d+1/2)^{-1}) moments are necessary and sufficient for more general fractional processes. Davidson and de Jong (2000) presented a fractional FCLT where only q>2 finite moments are assumed, which is remarkable because it is the only FCLT where the moment condition has been weakened relative to the earlier condition...
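The moment threshold q = max(2, (d+1/2)^{-1}) and the process x_t = Δ^{-d} u_t can both be made concrete. The sketch below is illustrative only (the truncated MA(∞) representation and parameter choices are assumptions, not the paper's setup); it shows how the required number of moments blows up as d approaches -1/2, and simulates one fractionally integrated path and its partial sums:

```python
import numpy as np

def frac_diff_weights(d, n):
    """MA coefficients of Delta^{-d}: psi_0 = 1, psi_k = psi_{k-1}*(k-1+d)/k."""
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    return psi

def moment_threshold(d):
    """q = max(2, (d+1/2)^{-1}) moments, as required for the fractional FCLT."""
    return max(2.0, 1.0 / (d + 0.5))

# The condition bites for d near -1/2: many more finite moments are required.
for d in (-0.4, -0.2, 0.0, 0.3):
    print(d, moment_threshold(d))

# Simulate x_t = Delta^{-d} u_t with iid normal innovations; the partial sums
# S_n then scale like n^{d+1/2} under the FCLT.
rng = np.random.default_rng(0)
d, n = 0.3, 5000
u = rng.normal(size=n)
psi = frac_diff_weights(d, n)
x = np.convolve(u, psi)[:n]   # truncated MA(inf) representation
S = np.cumsum(x)              # partial sum process
print(S[-1])
```

For d = -0.4 the threshold is 10 finite moments, versus the baseline of 2 for d ≥ 0, which is why the authors call the condition "very strong" near -1/2.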
One-Pot Synthesis of 3-Functionalized 4-Hydroxycoumarin under Catalyst-Free Conditions
Directory of Open Access Journals (Sweden)
Yang Gao
2018-01-01
Full Text Available A concise and efficient one-pot synthesis of 3-functionalized 4-hydroxycoumarin derivatives via a three-component domino reaction of 4-hydroxycoumarin, phenylglyoxal, and 3-arylaminocyclopent-2-enone or 4-arylaminofuran-2(5H)-one under catalyst-free and microwave irradiation conditions is described. This synthesis involves a group-assisted purification process, which avoids traditional recrystallization and chromatographic purification methods.
A Research on Functional Status, Environmental Conditions, and Risk of Falls in Dementia
Eshkoor, Sima Ataollahi; Hamid, Tengku Aizan; Nudin, Siti Sa'adiah Hassan; Mun, Chan Yoke
2014-01-01
This study aimed to determine the effects of disability, physical activity, and functional status as well as environmental conditions on the risk of falls among the elderly with dementia after adjusting for sociodemographic factors. Data were derived from a group including 1210 Malaysian elderly who were demented and noninstitutionalized. The study was a national cross-sectional survey that was entitled “Determinants of Health Status among Older Malaysians.” Approximately 17% of subjects expe...
Mytych, Joanna; Ligarski, Mariusz J.
2018-03-01
The quality management systems compliant with ISO 9001:2009 have been thoroughly researched and described in detail in the world literature. The accredited management systems used in testing laboratories and compliant with ISO/IEC 17025:2005 have mainly been described in terms of system design and implementation. They have also been investigated from an analytical point of view. Unfortunately, few studies have concerned the functioning of management systems in accredited testing laboratories. The aim of the following study was to assess the functioning of management systems in accredited testing laboratories in Poland. On 8 October 2015, there were 1,213 accredited testing laboratories in Poland, investigating various scientific areas and substances/objects. There are more and more such laboratories, which face various problems and have different long-term experience when it comes to the implementation, maintenance and improvement of management systems. The article describes the results of the conducted expert assessment (survey) carried out to examine the conditions for the functioning of a management system in an accredited laboratory. It also focuses on the characteristics of the accredited research laboratories in Poland. The authors discuss the selection of the external and internal conditions that may affect an accredited management system and show how the experts assessing the selected conditions were chosen. The survey results are also presented.
Functional approach to the problem of self-gravitating systems: Conditions of integrability
International Nuclear Information System (INIS)
Filippi, Simonetta; Ruffini, Remo; Sepulveda, Alonso
2002-01-01
Using a functional method based on the introduction of a velocity potential to solve the Euler, continuity and Poisson equations, a new analytic study of the equilibrium of self-gravitating rotating systems with a polytropic equation of state has permitted the formulation of the conditions of integrability. For the polytropic index n=1 in the incompressible case (∇·v = 0), we are able to find the conditions for solving the problem of the equilibrium of polytropic self-gravitating systems that rotate and have nonuniform vorticity. This work contains the conditions which give analytic and quasi-analytic solutions for the equilibrium of polytropic stars and galactic systems in Newtonian gravity. In special cases, explicit analytic solutions are presented
Chen, Cong; Zhang, Ning; Li, Weizhong; Song, Yongchen
2015-12-15
Functional groups on silica surfaces under CO2 sequestration conditions are complex due to reactions among supercritical CO2, brine and silica. Molecular dynamics simulations have been performed to investigate the effects of hydroxyl functional groups on wettability. It has been found that wettability shows a strong dependence on functional groups on silica surfaces: silanol number density, space distribution, and deprotonation/protonation degree. For neutral silica surfaces with crystalline structure (Q(3), Q(3)/Q(4), Q(4)), as silanol number density decreases, the contact angle increases from 33.5° to 146.7° at 10.5 MPa and 318 K. When the Q(3) surface changes to an amorphous structure, the water contact angle increases by 20°. The water contact angle decreases by about 12° when 9% of the silanol groups on the Q(3) surface are deprotonated. When the deprotonation degree increases to 50%, the water contact angle decreases to 0°. The dependence of wettability on silica surface functional groups was used to analyze contact angle measurement ambiguity in the literature. The composition of silica surfaces is complicated under CO2 sequestration conditions; the results found in this study may help to better understand wettability of the CO2/brine/silica system.
Estimated conditional score function for missing mechanism model with nonignorable nonresponse
Institute of Scientific and Technical Information of China (English)
CUI Xia; ZHOU Yong
2017-01-01
The missing data mechanism often depends on the values of the responses, which leads to nonignorable nonresponses. In such a situation, inference based on approaches that ignore the missing data mechanism may not be valid. A crucial step is to model the nature of missingness. We specify a parametric model for the missingness mechanism, and then propose a conditional score function approach for estimation. This approach imputes the score function by taking the conditional expectation of the score function for the missing data given the available information. The inference procedure then follows by replacing unknown terms with the related nonparametric estimators based on the observed data. The proposed score function does not suffer from the non-identifiability problem, and the proposed estimator is shown to be consistent and asymptotically normal. We also construct a confidence region for the parameter of interest using the empirical likelihood method. Simulation studies demonstrate that the proposed inference procedure performs well in many settings. We apply the proposed method to a data set from research in a growth hormone and exercise intervention study.
Directory of Open Access Journals (Sweden)
Tao Ma
2017-03-01
Full Text Available Surface functionalization of the sensor chip for probe immobilization is crucial for the biosensing applications of surface plasmon resonance (SPR) sensors. In this paper, we report a method of circulating a dopamine aqueous solution to coat a polydopamine film on the sensing surface for surface functionalization of an SPR chip. A polydopamine film of suitable thickness can be easily prepared by controlling the circulation time, and the biorecognition elements can be immobilized on the polydopamine film for specific molecular interaction analysis. These operations are all performed under flow conditions in the fluidic system, and have the advantages of easy implementation, less time consumed, and low cost, because the reagents and devices used in the operations are routinely applied in most laboratories. In this study, the specific absorption between the protein A probe immobilized on the sensing surface and human immunoglobulin G in the buffer is monitored based on this surface functionalization strategy to demonstrate its feasibility for SPR biosensing applications.
A Research on Functional Status, Environmental Conditions, and Risk of Falls in Dementia
Directory of Open Access Journals (Sweden)
Sima Ataollahi Eshkoor
2014-01-01
Full Text Available This study aimed to determine the effects of disability, physical activity, and functional status as well as environmental conditions on the risk of falls among the elderly with dementia after adjusting for sociodemographic factors. Data were derived from a group including 1210 Malaysian elderly who were demented and noninstitutionalized. The study was a national cross-sectional survey that was entitled “Determinants of Health Status among Older Malaysians.” Approximately 17% of subjects experienced falls. The results showed that ethnic non-Malay (OR=1.73) and functional decline (OR=1.67) significantly increased the risk of falls in samples (P<0.05). It was concluded that functional decline and ethnic non-Malay increased the risk of falls but the increased environmental quality reduced falls.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
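The experiment-first approach described above is easy to demonstrate: estimate a probability by repeated trials and compare it with the formula. A minimal sketch (the two-dice example is my illustration, not from the article):

```python
import random

# Estimate the probability of rolling a sum of 7 with two fair dice by
# experiment, then compare with the theoretical value 6/36 = 1/6.
random.seed(42)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)

experimental = hits / trials
theoretical = 6 / 36
print(experimental, theoretical)   # experimental value close to 0.1667
```

With enough trials the experimental estimate settles near the theoretical value, grounding the formula in observed events rather than computation alone.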
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Nuclear data uncertainties: I, Basic concepts of probability
Energy Technology Data Exchange (ETDEWEB)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
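Conditional probability and Bayes' theorem, listed among the topics above, can be made concrete with a short calculation. The diagnostic-test numbers below are hypothetical, chosen only to illustrate P(A|B) = P(B|A)P(A)/P(B) and the law of total probability:

```python
# Bayes' theorem with hypothetical diagnostic-test numbers.
p_disease = 0.01              # prior P(A)
p_pos_given_disease = 0.95    # P(B|A): sensitivity
p_pos_given_healthy = 0.05    # P(B|not A): false-positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(p_disease_given_pos)   # ~0.161: a positive test is far from conclusive
```

The low posterior despite a sensitive test shows why the prior matters, a point directly relevant to propagating uncertainties in nuclear data.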
Nuclear data uncertainties: I, Basic concepts of probability
International Nuclear Information System (INIS)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
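The idea of decreasing the event probability as the no-event interval grows can be sketched as a Bayesian update. The decay model below is an illustration under assumed parameters (an exponential survival with a chosen median delay), not the algorithm the authors derive from the NOAA delay-time distributions:

```python
import math

def dynamic_sep_probability(p0, hours_elapsed, median_delay_h=6.0):
    """Decay an initial SEP event probability p0 as time passes with no event.

    Illustrative assumption: onset delays after the X-ray peak follow an
    exponential survival curve with the given median. As the no-event
    interval grows, the posterior probability that an event is still
    coming falls (Bayes update against 'no event at all').
    """
    lam = math.log(2) / median_delay_h           # rate from the median delay
    survival = math.exp(-lam * hours_elapsed)    # P(onset later than t | event)
    return p0 * survival / (p0 * survival + (1 - p0))

p0 = 0.5
for t in (0, 6, 12, 24):
    print(t, round(dynamic_sep_probability(p0, t), 3))
```

Starting from 50%, the forecast drops to one third after one median delay and below 6% after four, capturing the operational point that a stale forecast should not keep its initial probability.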
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments ... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still ...
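The elicitation problem above starts from the fact that proper scoring rules reward truthful reports only for risk-neutral subjects. A minimal sketch of that baseline, using the quadratic (Brier) scoring rule as an example mechanism (my illustration; the abstract does not specify this rule):

```python
import numpy as np

# Quadratic scoring rule: report r for event E with true belief p.
# Expected payoff  p*(1-(1-r)^2) + (1-p)*(1-r^2)  is maximized at r = p,
# so a risk-neutral subject optimally reports the true subjective probability.
def expected_score(r, p):
    return p * (1 - (1 - r) ** 2) + (1 - p) * (1 - r ** 2)

p = 0.7
grid = np.linspace(0, 1, 1001)
best = grid[np.argmax(expected_score(grid, p))]
print(best)   # 0.7: truthful reporting is optimal under risk neutrality
```

A risk-averse subject maximizes expected utility of the score rather than the expected score itself, which biases the report toward 0.5; that distortion is what the joint estimation of risk attitudes and beliefs is designed to undo.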
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Osteocalcin protects pancreatic beta cell function and survival under high glucose conditions
Energy Technology Data Exchange (ETDEWEB)
Kover, Karen, E-mail: kkover@cmh.edu [Division of Endocrine/Diabetes, Children's Mercy Hospital & Clinics, Kansas City, MO 64108 (United States); University of Missouri-Kansas City School of Medicine, Kansas City, MO 64108 (United States); Yan, Yun; Tong, Pei Ying; Watkins, Dara; Li, Xiaoyu [Division of Endocrine/Diabetes, Children's Mercy Hospital & Clinics, Kansas City, MO 64108 (United States); University of Missouri-Kansas City School of Medicine, Kansas City, MO 64108 (United States); Tasch, James; Hager, Melissa [Kansas City University Medical Biosciences, Kansas City, MO (United States); Clements, Mark; Moore, Wayne V. [Division of Endocrine/Diabetes, Children's Mercy Hospital & Clinics, Kansas City, MO 64108 (United States); University of Missouri-Kansas City School of Medicine, Kansas City, MO 64108 (United States)
2015-06-19
Diabetes is characterized by progressive beta cell dysfunction and loss due in part to oxidative stress that occurs from gluco/lipotoxicity. Treatments that directly protect beta cell function and survival in the diabetic milieu are of particular interest. A growing body of evidence suggests that osteocalcin, an abundant non-collagenous protein of bone, supports beta cell function and proliferation. Based on previous gene expression data by microarray, we hypothesized that osteocalcin protects beta cells from glucose-induced oxidative stress. To test our hypothesis we cultured isolated rat islets and INS-1E cells in the presence of normal, high, or high glucose ± osteocalcin for up to 72 h. Oxidative stress and viability/mitochondrial function were measured by H₂O₂ assay and Alamar Blue assay, respectively. Caspase 3/7 activity was also measured as a marker of apoptosis. A functional test, glucose stimulated insulin release, was conducted and expression of genes/protein was measured by qRT-PCR/western blot/ELISA. Osteocalcin treatment significantly reduced high glucose-induced H₂O₂ levels while maintaining viability/mitochondrial function. Osteocalcin also significantly improved glucose stimulated insulin secretion and insulin content in rat islets after 48 h of high glucose exposure compared to untreated islets. As expected sustained high glucose down-regulated gene/protein expression of INS1 and BCL2 while increasing TXNIP expression. Interestingly, osteocalcin treatment reversed the effects of high glucose on gene/protein expression. We conclude that osteocalcin can protect beta cells from the negative effects of glucose-induced oxidative stress, in part, by reducing TXNIP expression, thereby preserving beta cell function and survival. - Highlights: • Osteocalcin reduces glucose-induced oxidative stress in beta cells. • Osteocalcin preserves beta cell function and survival under stress conditions. • Osteocalcin reduces glucose
Osteocalcin protects pancreatic beta cell function and survival under high glucose conditions
International Nuclear Information System (INIS)
Kover, Karen; Yan, Yun; Tong, Pei Ying; Watkins, Dara; Li, Xiaoyu; Tasch, James; Hager, Melissa; Clements, Mark; Moore, Wayne V.
2015-01-01
Diabetes is characterized by progressive beta cell dysfunction and loss due in part to oxidative stress that occurs from gluco/lipotoxicity. Treatments that directly protect beta cell function and survival in the diabetic milieu are of particular interest. A growing body of evidence suggests that osteocalcin, an abundant non-collagenous protein of bone, supports beta cell function and proliferation. Based on previous gene expression data by microarray, we hypothesized that osteocalcin protects beta cells from glucose-induced oxidative stress. To test our hypothesis we cultured isolated rat islets and INS-1E cells in the presence of normal, high, or high glucose ± osteocalcin for up to 72 h. Oxidative stress and viability/mitochondrial function were measured by H₂O₂ assay and Alamar Blue assay, respectively. Caspase 3/7 activity was also measured as a marker of apoptosis. A functional test, glucose stimulated insulin release, was conducted and expression of genes/protein was measured by qRT-PCR/western blot/ELISA. Osteocalcin treatment significantly reduced high glucose-induced H₂O₂ levels while maintaining viability/mitochondrial function. Osteocalcin also significantly improved glucose stimulated insulin secretion and insulin content in rat islets after 48 h of high glucose exposure compared to untreated islets. As expected, sustained high glucose down-regulated gene/protein expression of INS1 and BCL2 while increasing TXNIP expression. Interestingly, osteocalcin treatment reversed the effects of high glucose on gene/protein expression. We conclude that osteocalcin can protect beta cells from the negative effects of glucose-induced oxidative stress, in part, by reducing TXNIP expression, thereby preserving beta cell function and survival. - Highlights: • Osteocalcin reduces glucose-induced oxidative stress in beta cells. • Osteocalcin preserves beta cell function and survival under stress conditions. • Osteocalcin reduces glucose-induced TXNIP
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
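The two standard routes such texts take can be illustrated with a small example (invented here, not taken from the paper): sequential conditional probabilities and hypergeometric counting give the same answer for a without-replacement draw.

```python
from math import comb

# Urn with 5 red and 7 blue balls; draw 3 without replacement.
# Probability of exactly 2 red, computed two standard ways.

# 1) Hypergeometric counting: choose 2 of 5 red and 1 of 7 blue.
p_counting = comb(5, 2) * comb(7, 1) / comb(12, 3)

# 2) Sequential conditional probabilities: each of the 3 orderings of
# (red, red, blue) is a product of conditional probabilities, and the
# orderings are equally probable.
p_rrb = (5 / 12) * (4 / 11) * (7 / 10)
p_sequential = 3 * p_rrb

assert abs(p_counting - p_sequential) < 1e-12  # both equal 7/22
```

Either representation suffices; which is pedagogically clearer is exactly the kind of question the abstract raises.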
Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis
Institute of Scientific and Technical Information of China (English)
Anonymous
1990-01-01
A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e., signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for calculating the logic probability of a circuit with random independent inputs. The orthogonal algorithm is then described for computing the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal" so that its logic probability can be easily calculated by summing the logic probabilities of all orthogonal terms of the Boolean function.
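As a rough illustration of why orthogonal (disjoint) terms make signal probabilities additive, here is a Python sketch (not the paper's algorithm) that computes an exact signal probability by Shannon expansion; the two cofactor branches are disjoint by construction, so their probabilities add.

```python
# Exact signal probability of a Boolean function with independent inputs,
# via Shannon expansion f = x*f|x=1 + x'*f|x=0. The two cofactor terms are
# disjoint (orthogonal), so their probabilities simply add.

def signal_prob(f, probs, assignment=()):
    """f: callable taking a tuple of bools; probs: P(x_i = 1) per input."""
    i = len(assignment)
    if i == len(probs):
        return 1.0 if f(assignment) else 0.0
    p = probs[i]
    return (p * signal_prob(f, probs, assignment + (True,)) +
            (1 - p) * signal_prob(f, probs, assignment + (False,)))

# Example: f = x0 OR x1 with P(x0)=0.5, P(x1)=0.5 gives P(f)=0.75
f = lambda x: x[0] or x[1]
assert abs(signal_prob(f, (0.5, 0.5)) - 0.75) < 1e-12
```

The recursion enumerates all input assignments, so it is exponential in the number of inputs; the paper's contribution is computing the same quantity efficiently on the circuit representation.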
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.
Transition probabilities for atoms
International Nuclear Information System (INIS)
Kim, Y.K.
1980-01-01
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Gornicka-Pawlak, Elzbieta; Jabłońska, Anna; Chyliński, Andrzej; Domańska-Janik, Krystyna
2009-01-01
The present study investigated the influence of housing conditions on motor function recovery and exploratory behavior following ouabain-induced focal brain lesion in the rat. During the 30-day post-surgery period, rats were housed individually in standard cages (IS) or in groups in an enriched environment (EE) and were behaviorally tested. The EE lesioned rats showed enhanced recovery from motor impairments in the walking beam task compared with the IS animals. In contrast, in the open field the IS rats (both lesioned and control) traveled a longer distance, showed less habituation and spent less time resting at the home base than the EE animals. Unlike the EE lesioned animals, the lesioned IS rats presented a tendency toward hyperactivity in the post-injury period. Turning tendency was significantly affected by unilateral brain lesion only in the EE rats. We conclude that housing conditions distinctly affected the rats' behavior in classical laboratory tests.
Crosslinking of SAVY-4000 O-rings as a Function of Aging Conditions
Energy Technology Data Exchange (ETDEWEB)
Van Buskirk, Caleb Griffith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-09-08
SAVY-4000 containers were developed as a part of DOE M 441.1-1 to protect workers who handle stored nuclear material from exposure due to loss of containment [1]. The SAVY-4000 is comprised of three parts: a lid, a container, and a cross-linked fluoropolymer O-ring. Degradation of the O-ring during use could limit the lifetime of the SAVY-4000. In order to quantify the chemical changes of the O-ring over time, the molecular weight between crosslinks was determined as a function of aging conditions using a swelling technique. Because the O-ring is a cross-linked polymer, it will absorb solvent into its matrix without dissolving. The relative amount of solvent uptake can be related to the degree of crosslinking using an equation developed by Paul Flory and John Rehner Jr. [3]. This method was used to analyze O-ring samples aged under thermal and ionizing-radiation conditions. It was found that under the harsher thermal aging conditions in the absence of ionizing radiation, the average molecular weight between crosslinks decreased, indicating a rise in crosslinks, which may be attributable to advanced aging with no ionizing radiation present. Conversely, in the presence of ionizing radiation the material was found to have a higher level of cross-linking with age. This information could be used to help predict the lifetime of the O-rings in SAVY-4000 containers under service conditions.
Hygienic, sanitary, physical, and functional conditions of Brazilian public school food services
Directory of Open Access Journals (Sweden)
Kênia Machado de Almeida
2014-06-01
OBJECTIVE: To verify the physical, functional, hygienic, and sanitary conditions of the food services of municipal schools located in the Brazilian Midwest region. METHODS: This is a cross-sectional study of 296 school food services conducted from February to June 2012. The food services were assessed by a semi-structured check list divided into the following sections: physical conditions, available equipment, food handlers' conduct, and food service cleaning processes and procedures. The study variables were classified as compliant or noncompliant with the regulations passed by the National Sanitary Surveillance Agency. RESULTS: Noncompliances were found in all study food services, especially with respect to food service conditions, and the wiring and plumbing in the food preparation area. In this section, 62.7 to 95.9% of the food services did not comply with nine out of the thirteen study items. The main problems were: poorly cleaned external areas, deteriorated walls, floors, ceilings, roofs, drains, and roof gutters; and unscreened doors and windows, allowing the entrance of insects; among others. The main noncompliance regarding processes and procedures was the uncontrolled temperature of the ready-to-eat foods. CONCLUSION: The conditions of the study food services are unsatisfactory for the production of safe meals, possibly compromising meal quality, food safety, and the effectiveness of the School Food Program.
Vázquez-Campos, Xabier; Kinsela, Andrew S; Bligh, Mark W; Harrison, Jennifer J; Payne, Timothy E; Waite, T David
2017-09-01
During the 1960s, small quantities of radioactive materials were codisposed with chemical waste at the Little Forest Legacy Site (Sydney, Australia) in 3-meter-deep, unlined trenches. Chemical and microbial analyses, including functional and taxonomic information derived from shotgun metagenomics, were collected across a 6-week period immediately after a prolonged rainfall event to assess the impact of changing water levels upon the microbial ecology and contaminant mobility. Collectively, results demonstrated that oxygen-laden rainwater rapidly altered the redox balance in the trench water, strongly impacting microbial functioning as well as the radiochemistry. Two contaminants of concern, plutonium and americium, were shown to transition from solid-iron-associated species immediately after the initial rainwater pulse to progressively more soluble moieties as reducing conditions were enhanced. Functional metagenomics revealed the potentially important role that the taxonomically diverse microbial community played in this transition. In particular, aerobes dominated in the first day, followed by an increase of facultative anaerobes/denitrifiers at day 4. Toward the mid-end of the sampling period, the functional and taxonomic profiles depicted an anaerobic community distinguished by a higher representation of dissimilatory sulfate reduction and methanogenesis pathways. Our results have important implications for similar near-surface environmental systems in which redox cycling occurs. IMPORTANCE The role of chemical and microbiological factors in mediating the biogeochemistry of groundwaters from trenches used to dispose of radioactive materials during the 1960s is examined in this study. Specifically, chemical and microbial analyses, including functional and taxonomic information derived from shotgun metagenomics, were collected across a 6-week period immediately after a prolonged rainfall event to assess how changing water levels influence microbial ecology and
Bayoumi, A
2003-01-01
All the existing books in Infinite Dimensional Complex Analysis focus on the problems of locally convex spaces. However, the theory without convexity condition is covered for the first time in this book. This shows that we are really working with a new, important and interesting field. Theory of functions and nonlinear analysis problems are widespread in the mathematical modeling of real world systems in a very broad range of applications. During the past three decades many new results from the author have helped to solve multiextreme problems arising from important situations, non-convex and
Singer-Dudek, Jessica; Oblak, Mara; Greer, R Douglas
2011-01-01
We tested the effects of an observational intervention (Greer & Singer-Dudek, 2008) on establishing children's books as conditioned reinforcers using a delayed multiple baseline design. Three preschool students with mild language and developmental delays served as the participants. Prior to the intervention, books did not function as reinforcers for any of the participants. The observational intervention consisted of a situation in which the participant observed a confederate being presented with access to books contingent on correct responses and the participant received nothing for correct responses. After several sessions of this treatment, the previously neutral books acquired reinforcing properties for maintenance and acquisition responses for all three participants.
Functions of Nitric Oxide (NO) in Roots during Development and under Adverse Stress Conditions
Directory of Open Access Journals (Sweden)
Francisco J. Corpas
2015-05-01
The free radical molecule nitric oxide (NO) is present in the principal organs of plants, where it plays an important role in a wide range of physiological functions. Root growth and development are highly regulated by both internal and external factors such as nutrient availability, hormones, pattern formation, cell polarity and cell cycle control. The presence of NO in roots has opened up new areas of research on the role of NO, including root architecture, nutrient acquisition, microorganism interactions and the response mechanisms to adverse environmental conditions, among others. Additionally, the exogenous application of NO throughout the roots has the potential to counteract specific damages caused by certain stresses. This review aims to provide an up-to-date perspective on NO functions in the roots of higher plants.
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
The probability of the false vacuum decay
International Nuclear Information System (INIS)
Kiselev, V.; Selivanov, K.
1983-01-01
The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given
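For orientation, the structure the abstract describes matches the standard Callan-Coleman semiclassical form (a generic sketch, not an expression reproduced from this paper):

```latex
\frac{\Gamma}{V} \;\simeq\;
\left(\frac{S_E[\phi_b]}{2\pi}\right)
\left|
\frac{\det' \left[-\partial^2 + U''(\phi_b)\right]}
     {\det  \left[-\partial^2 + U''(\phi_{\mathrm{fv}})\right]}
\right|^{-1/2}
e^{-S_E[\phi_b]}
```

Here \(\phi_b\) is the bounce solution, \(\phi_{\mathrm{fv}}\) the false vacuum, and \(\det'\) omits the zero modes; in (1+1) dimensions the two translational zero modes each contribute a factor \((S_E/2\pi)^{1/2}\) to the prefactor. The exponential is the quasiclassical factor and the determinant ratio is the functional determinant referred to in the abstract.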
Partition function zeros for the one-dimensional ordered plasma in Dirichlet boundary conditions
International Nuclear Information System (INIS)
Roumeliotis, J.; Smith, E.R.
1992-01-01
The authors consider the grand canonical partition function for the ordered one-dimensional, two-component plasma at fugacity ζ in an applied electric field E with Dirichlet boundary conditions. The system has a phase transition from a low-coupling phase with equally spaced particles to a high-coupling phase with particles clustered into dipolar pairs. An exact expression for the partition function is developed. In zero applied field the zeros in the ζ plane occupy the imaginary axis from -i∞ to -iζ_c and iζ_c to i∞ for some ζ_c. They also occupy the diamond shape of four straight lines from ±iζ_c to ζ_c and from ±iζ_c to -ζ_c. The fugacity ζ acts like a temperature or coupling variable. The symmetry-breaking field is the applied electric field E. A finite-size scaling representation for the partition function in scaled coupling and scaled electric field is developed. It has standard mean field form. When the scaled coupling is real, the zeros in the scaled field lie on the imaginary axis and pinch the real scaled field axis as the scaled coupling increases. The scaled partition function considered as a function of two complex variables, scaled coupling and scaled field, has zeros on a two-dimensional surface in a domain of four real variables. A numerical discussion of some of the properties of this surface is presented
Directory of Open Access Journals (Sweden)
João M Oliveira
Identifying the environmental gradients that control the functional structure of biological assemblages in reference conditions is fundamental to help river management and predict the consequences of anthropogenic stressors. Fish metrics (density of ecological guilds, and species richness) from 117 least disturbed stream reaches in several western Iberia river basins were modelled with generalized linear models in order to investigate the importance of regional- and local-scale abiotic gradients to variation in functional structure of fish assemblages. Functional patterns were primarily associated with regional features, such as catchment elevation and slope, rainfall, and drainage area. Spatial variations of fish guilds were thus associated with broad geographic gradients, showing (1) pronounced latitudinal patterns, affected mainly by climatic factors and topography, or (2) at the basin level, strong upstream-downstream patterns related to stream position in the longitudinal gradient. Maximum native species richness was observed in midsize streams in accordance with the river continuum concept. The findings of our study emphasized the need to use a multi-scale approach in order to fully assess the factors that govern the functional organization of biotic assemblages in 'natural' streams, as well as to improve biomonitoring and restoration of fluvial ecosystems.
A new wall function boundary condition including heat release effect for supersonic combustion flows
International Nuclear Information System (INIS)
Gao, Zhen-Xun; Jiang, Chong-Wen; Lee, Chun-Hian
2016-01-01
Highlights: • A new wall function including heat release effect is theoretically derived. • The new wall function is a unified form holding for flows with/without combustion. • The new wall function shows good results for a supersonic combustion case. - Abstract: A new wall function boundary condition considering combustion heat release effect (denoted as CWFBC) is proposed, for efficient predictions of skin friction and heat transfer in supersonic combustion flows. Based on a standard flow model including boundary-layer combustion, the Shvab–Zeldovich coupling parameters are introduced to derive a new velocity law-of-the-wall including the influence of combustion. For the temperature law-of-the-wall, it is proposed to use the enthalpy–velocity relation, instead of the Crocco–Busemann equation, to eliminate explicit influence of chemical reactions. The obtained velocity and temperature law-of-the-walls constitute the CWFBC, which is a unified form simultaneously holding for single-species, multi-species mixing and multi-species reactive flows. The subsequent numerical simulations using this CWFBC on an experimental case indicate that the CWFBC could accurately reflect the influences on the skin friction and heat transfer by the chemical reactions and heat release, and show large improvements compared to previous WFBC. Moreover, the CWFBC can give accurate skin friction and heat flux for a coarse mesh with y⁺ up to 200 for the experimental case, except for slightly larger discrepancy of the wall heat flux around ignition position.
THE FUNCTIONS OF THE CIVIL SOCIETY UNDER THE CONDITIONS OF MODERN RUSSIAN MODERNIZATION
Directory of Open Access Journals (Sweden)
Василий Вячеславович Рябев
2013-11-01
This article discusses the functions of civil society in contemporary Russian realities. The purpose is to identify and classify the most important functions of civil society under the conditions of comprehensive modern Russian modernization. The article presents the author's classification of the functions of civil society, based on an analysis of significant studies by foreign and Russian researchers. Each function is analyzed in detail with a focus on those most relevant to modern Russian society, and the potential of Russian civil society is examined with respect to the following issues: the institutionalization of civic activity, anti-corruption policy, the consolidation of democratic forces, and the formation of a legal culture. The conclusions can be used in studies related to civil society; the specific mechanisms for dealing with current social issues by means of civic participation presented in this article may be of interest to government institutions. DOI: http://dx.doi.org/10.12731/2218-7405-2013-10-6
Generation of mice harbouring a conditional loss-of-function allele of Gata6
Directory of Open Access Journals (Sweden)
Duncan Stephen A
2006-04-01
The zinc finger transcription factor GATA6 is believed to have important roles in the development of several organs including the liver, gastrointestinal tract and heart. However, analyses of the contribution of GATA6 toward organogenesis have been hampered because Gata6-/- mice fail to develop beyond gastrulation due to defects in extraembryonic endoderm function. We have therefore generated a mouse line harbouring a conditional loss-of-function allele of Gata6 using Cre/loxP technology. LoxP elements were introduced into introns flanking exon 2 of the Gata6 gene by homologous recombination in ES cells. Mice containing this altered allele were bred to homozygosity and were found to be viable and fertile. To assess the functional integrity of the loxP sites and to confirm that we had generated a Gata6 loss-of-function allele, we bred Gata6 'floxed' mice to EIIa-Cre mice in which Cre is ubiquitously expressed, and to Villin-Cre mice that express Cre in the epithelial cells of the intestine. We conclude that we have generated a line of mice in which GATA6 activity can be ablated in a cell type-specific manner by expression of Cre recombinase. This line of mice can be used to establish the role of GATA6 in regulating embryonic development and various aspects of mammalian physiology.
Louca, Stilianos; Jacques, Saulo M S; Pires, Aliny P F; Leal, Juliana S; González, Angélica L; Doebeli, Michael; Farjalla, Vinicius F
2017-08-01
Phytotelmata in tank-forming Bromeliaceae plants are regarded as potential miniature models for aquatic ecology, but detailed investigations of their microbial communities are rare. Hence, the biogeochemistry in bromeliad tanks remains poorly understood. Here we investigate the structure of bacterial and archaeal communities inhabiting the detritus within the tanks of two bromeliad species, Aechmea nudicaulis and Neoregelia cruenta, from a Brazilian sand dune forest. We used metagenomic sequencing for functional community profiling and 16S sequencing for taxonomic profiling. We estimated the correlation between functional groups and various environmental variables, and compared communities between bromeliad species. In all bromeliads, microbial communities spanned a metabolic network adapted to oxygen-limited conditions, including all denitrification steps, ammonification, sulfate respiration, methanogenesis, reductive acetogenesis and anoxygenic phototrophy. Overall, CO2 reducers dominated in abundance over sulfate reducers, and anoxygenic phototrophs largely outnumbered oxygenic photoautotrophs. Functional community structure correlated strongly with environmental variables, between and within a single bromeliad species. Methanogens and reductive acetogens correlated with detrital volume and canopy coverage, and exhibited higher relative abundances in N. cruenta. A comparison of bromeliads to freshwater lake sediments and soil from around the world, revealed stark differences in terms of taxonomic as well as functional microbial community structure. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.
Probability of Survival Decision Aid (PSDA)
National Research Council Canada - National Science Library
Xu, Xiaojiang; Amin, Mitesh; Santee, William R
2008-01-01
A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...
Computation of Probabilities in Causal Models of History of Science
Directory of Open Access Journals (Sweden)
Osvaldo Pessoa Jr.
2006-12-01
The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, “explaining” — in a probabilistic way, in terms of a single causal model — why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations). Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value p_{Y|X} of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function p_{Y|X}(t), which we take to be the time integral of an exponential distribution function, as is done in the physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which are a consequence of the fact that a composition of causes does not follow an exponential distribution, but a “hypoexponential” one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.
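A minimal sketch of the exponential waiting-time model the abstract describes (the 50%-within-10-years calibration is an assumption for illustration, not a value from the paper):

```python
import math

# Calibrate the rate lam of an exponential waiting-time model from a
# stipulated conditional probability, then evaluate the cumulative
# probability p_{Y|X}(t) = 1 - exp(-lam * t) that advance Y has occurred
# within t years of its cause X.

def rate_from_quantile(p, t):
    """lam such that Y follows X within t years with probability p."""
    return -math.log(1.0 - p) / t

lam = rate_from_quantile(0.5, 10.0)  # assumed: 50% chance within 10 years

def p_y_given_x(t, lam=lam):
    return 1.0 - math.exp(-lam * t)

assert abs(p_y_given_x(10.0) - 0.5) < 1e-12
assert p_y_given_x(20.0) > p_y_given_x(10.0)  # monotone in elapsed time
```

A chain of two such causes has a waiting time that is the sum of two exponentials, which is hypoexponential rather than exponential, which is the inconsistency the abstract raises.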
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
International Nuclear Information System (INIS)
Balderson, Michael; Brown, Derek; Johnson, Patricia; Kirkby, Charles
2016-01-01
The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic–based model for TCP was used to compare mean TCP values for a population of patients who experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15 mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT.
Balderson, Michael; Brown, Derek; Johnson, Patricia; Kirkby, Charles
2016-01-01
The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic-based model for TCP was used to compare mean TCP values for a population of patients who experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT. Copyright © 2016 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.
2017-04-01
We studied the information basis for the assessment of wind power potential on the territory of Russia. We described the methodology to determine the parameters of the Weibull function, which reflects the density of distribution of probabilities of wind flow speeds at a defined basic height above the surface of the earth using the available data on the average speed at this height and its repetition by gradations. The application of the least square method for determining these parameters, unlike the use of graphical methods, allows performing a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of the computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes at different heights, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified the methodology of determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We gave examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia in conditions of deficiency of source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of the wind flow speeds in the presence of a wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics in the global market; as a result, this methodology can become a powerful tool for
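The least-squares determination of the Weibull parameters described above can be sketched by linearizing the cumulative distribution F(v) = 1 - exp(-(v/c)^k), so that ln(-ln(1-F)) = k ln v - k ln c is a straight line in ln v. The bin edges and "true" parameters below are invented for illustration; this is not the authors' code.

```python
import math

# Fit Weibull shape k and scale c to binned wind-speed data by least
# squares on the linearized CDF:
#   F(v) = 1 - exp(-(v/c)^k)  =>  ln(-ln(1-F)) = k*ln(v) - k*ln(c)

def fit_weibull(upper_edges, cum_freq):
    """upper_edges: bin upper speeds (m/s); cum_freq: cumulative frequencies."""
    xs = [math.log(v) for v in upper_edges]
    ys = [math.log(-math.log(1.0 - F)) for F in cum_freq]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))        # least-squares slope
    c = math.exp(mx - my / k)                    # from the intercept
    return k, c

# Synthetic check: cumulative frequencies generated from k=2, c=6 m/s
k_true, c_true = 2.0, 6.0
edges = [2, 4, 6, 8, 10, 12]
F = [1 - math.exp(-(v / c_true) ** k_true) for v in edges]
k, c = fit_weibull(edges, F)
assert abs(k - k_true) < 1e-9 and abs(c - c_true) < 1e-9
```

Because the fit is an ordinary linear regression, residuals on the transformed data give exactly the statistical assessment of the approximation quality that the abstract contrasts with graphical methods.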
Weaver, David R; van der Vinne, Vincent; Giannaris, E Lela; Vajtay, Thomas J; Holloway, Kristopher L; Anaclet, Christelle
2018-04-01
Mice with targeted gene disruption have provided important information about the molecular mechanisms of circadian clock function. A full understanding of the roles of circadian-relevant genes requires manipulation of their expression in a tissue-specific manner, ideally including manipulation with high efficiency within the suprachiasmatic nuclei (SCN). To date, conditional manipulation of genes within the SCN has been difficult. In a previously developed mouse line, Cre recombinase was inserted into the vesicular GABA transporter (Vgat) locus. Since virtually all SCN neurons are GABAergic, this Vgat-Cre line seemed likely to have high efficiency at disrupting conditional alleles in SCN. To test this premise, the efficacy of Vgat-Cre in excising conditional (fl, for flanked by LoxP) alleles in the SCN was examined. Vgat-Cre-mediated excision of conditional alleles of Clock or Bmal1 led to loss of immunostaining for products of the targeted genes in the SCN. Vgat-Cre+; Clock fl/fl; Npas2 m/m mice and Vgat-Cre+; Bmal1 fl/fl mice became arrhythmic immediately upon exposure to constant darkness, as expected based on the phenotype of mice in which these genes are disrupted throughout the body. The phenotype of mice with other combinations of Vgat-Cre+, conditional Clock, and mutant Npas2 alleles also resembled the corresponding whole-body knockout mice. These data indicate that the Vgat-Cre line is useful for Cre-mediated recombination within the SCN, making it useful for Cre-enabled technologies including gene disruption, gene replacement, and opto- and chemogenetic manipulation of the SCN circadian clock.
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
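The maximum-entropy assignment with an average as a constraint can be sketched numerically: maximizing entropy subject to a fixed mean energy yields probabilities of the Boltzmann form p_i ∝ exp(-β·E_i), with β chosen to satisfy the constraint. The discrete energy levels and the solver bracket below are illustrative assumptions, not from the source.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_distribution(energies, mean_energy):
    """Maximum-entropy probabilities over discrete levels subject to
    sum(p) = 1 and sum(p*E) = mean_energy. The solution has the
    Boltzmann form p_i proportional to exp(-beta*E_i)."""
    E = np.asarray(energies, dtype=float)

    def constraint_gap(beta):
        w = np.exp(-beta * (E - E.min()))  # shift for numerical stability
        p = w / w.sum()
        return float(p @ E) - mean_energy

    # The mean is monotone decreasing in beta, so a sign change brackets it.
    beta = brentq(constraint_gap, -50.0, 50.0)
    w = np.exp(-beta * (E - E.min()))
    return w / w.sum()
```

For four levels E = 0, 1, 2, 3 and a target mean of 1, the assigned probabilities decrease with energy, as expected for a positive β.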
Carlsson, Daniel; Pettersson, Hans; Burström, Lage; Nilsson, Tohr; Wahlström, Jens
2016-01-01
This study aimed to examine the effects of 14 months of military training, including cold winter conditions, on neurosensory and vascular function in the hands and feet. Military conscripts (N=54) were assessed with quantitative sensory testing comprising touch, temperature, and vibration perception thresholds and finger systolic blood pressure (FSBP) after local cooling, and with a questionnaire on neurosensory and vascular symptoms, at both baseline and follow-up. Ambient air temperature was recorded with body-worn temperature loggers. The subjects showed reduced sensitivity to touch, warmth, cold and vibration in both the hands and feet, except for vibrotactile perception in digit 2 of the right hand. Cold sensations, white fingers, and pain/discomfort when exposed to cold, as well as pain, increased in both prevalence and severity. There were no statistically significant changes in FSBP after local cooling. Fourteen months of military training in cold winter conditions reduced sensation of touch, warmth, cold, and vibrotactile stimuli in both hands and feet and increased the severity and prevalence of symptoms and pain. Vascular function in the hands, measured by FSBP after local cooling, was not affected.
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is defined by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants; the categories are based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
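A per-demand probability of this kind is typically estimated as observed events divided by total handling demands, with a confidence bound to account for the small event count. The sketch below uses a Jeffreys upper bound; the function name and the event/demand counts in the example are made-up illustrations, not figures from the Framatome report.

```python
from scipy.stats import beta

def misload_probability(events, movements, conf=0.95):
    """Point estimate and upper confidence bound for a per-demand
    event probability, given `events` observed in `movements` demands.
    The bound is the Jeffreys (Beta) upper quantile, a standard choice
    for rare-event frequency data."""
    point = events / movements
    upper = beta.ppf(conf, events + 0.5, movements - events + 0.5)
    return point, upper
```

For instance, 3 hypothetical misloads in 100,000 FA movements give a point estimate of 3e-5 with a somewhat larger 95% upper bound.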
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Gravity and count probabilities in an expanding universe
Bouchet, Francois R.; Hernquist, Lars
1992-01-01
The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.
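The count probabilities in cubic cells used above can be estimated from simulation particle positions by binning the box into cells and tabulating the frequency of each occupancy number. This is a minimal sketch; the function name and the uniform (Poisson-like) test points are assumptions, not the paper's tree-code data.

```python
import numpy as np

def count_probabilities(points, box_size, cell_size):
    """Count-in-cells distribution P(N): the probability that a cubic
    cell of side `cell_size` contains exactly N of the given points."""
    m = int(round(box_size / cell_size))          # cells per axis
    edges = [np.linspace(0.0, m * cell_size, m + 1)] * 3
    counts, _ = np.histogramdd(points, bins=edges)
    counts = counts.ravel().astype(int)
    # pN[N] = fraction of cells holding exactly N points
    return np.bincount(counts) / counts.size
```

For unclustered (uniform) points the resulting P(N) is close to Poisson; gravitational clustering skews it, which is what the moments and void probability quantify.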
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
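The distinction the paper maintains can be illustrated with a two-loop Monte Carlo sketch: the outer loop samples the analyst's (subjective) uncertainty about a model parameter, the inner loop the random process given that parameter, so the output is a distribution of probabilities rather than a single number. The exponential failure model and the rate range below are invented purely for illustration.

```python
import numpy as np

def two_loop_failure_probability(n_outer=200, n_inner=1000, seed=0):
    """Outer loop: subjective uncertainty about a failure rate lam
    (here uniform on [0.5, 1.5] as a stand-in for expert judgment).
    Inner loop: stochastic variability of failure times given lam.
    Returns one failure probability per sampled lam."""
    rng = np.random.default_rng(seed)
    probs = []
    for lam in rng.uniform(0.5, 1.5, n_outer):        # epistemic draw
        times = rng.exponential(1.0 / lam, n_inner)   # aleatory draws
        probs.append(np.mean(times < 1.0))            # P(fail by t=1 | lam)
    return np.array(probs)
```

Collapsing the returned array to its mean would merge the two kinds of uncertainty; keeping the spread is precisely the distinction the paper argues for.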
Directory of Open Access Journals (Sweden)
Eprintsev A.T.
2006-12-01
Salt-induced changes in malate dehydrogenase system activity contribute substantially to cell adaptation to stress conditions. The enzyme systems of C4 plants are of particular interest because of their capacity to adapt to environmental conditions. The present work investigates the role of individual components of the malate dehydrogenase complex of corn mesophyll and bundle sheath cells in forming the adaptive response under stress conditions. During the first stage of adaptation to salt exposure, activation of all enzymes of the malate dehydrogenase system, followed by a decrease in their activity, was observed in the mesophyll. In bundle sheath cells these parameters differed less markedly from the control. Rapid accumulation of pyruvate in the cells and of malate in both investigated tissues was induced. Continued salinity led to a fall in the concentration of these intermediates; the pyruvate concentration dropped below the control level and then rose by the end of the exposure. The results show that sodium chloride induces the Krebs cycle in corn mesophyll and bundle sheath cells and intensifies the Hatch-Slack cycle. The main difference in malate dehydrogenase system function between mesophyll and bundle sheath cells of corn leaves under salinity is that the enzyme activities of the studied complex undergo minimal changes in bundle sheath cells compared with the mesophyll. The role of this enzyme system in the adaptive response of different corn tissues to salt stress is discussed.
International Nuclear Information System (INIS)
Wang Guo; Staunton, Siobhan
2005-01-01
A thorough understanding of the dynamics of radiostrontium in soil is required to allow accurate long-term predictions of its mobility. We have followed the soil solution distribution of 85Sr as a function of time under controlled conditions over 4 months and studied the effect of soil moisture content and organic matter amendments. Data have been compared to redox conditions and soil pH. To fuel the ongoing debate on the validity of distribution coefficient (Kd) values measured in dilute suspension, we have compared values obtained from the activity concentration in soil solution obtained by centrifugation to data obtained in suspension with or without air-drying of the soil samples after incubation. The 85Sr adsorption properties of soil incubated without prior contamination were also measured. There is some time-dependent adsorption of Sr. This is partly due to changing soil composition caused by the decomposition of added organic matter and the anaerobic conditions induced by flooding. There is also a kinetic effect, but adsorption remains largely reversible. Most of the observed effects are lost when soil is suspended in electrolyte solution
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Directory of Open Access Journals (Sweden)
Ahmed M. A. El-Sayed
2011-12-01
In this article, we prove the existence of positive nondecreasing solutions for multi-term fractional-order functional differential equations. We consider Cauchy boundary problems with nonlocal conditions, two-point boundary conditions, integral conditions, and deviated arguments.
Progranulin facilitates conversion and function of regulatory T cells under inflammatory conditions.
Directory of Open Access Journals (Sweden)
Fanhua Wei
Progranulin (PGRN) is known to protect regulatory T cells (Tregs) from negative regulation by TNF-α, and its levels are elevated in various kinds of autoimmune diseases. Whether PGRN directly regulates the conversion of CD4+CD25- T cells into Foxp3-expressing induced regulatory T cells (iTregs), and whether PGRN affects the immunosuppressive function of Tregs, however, remain unknown. In this study we provide evidence demonstrating that PGRN stimulates the conversion of CD4+CD25- T cells into iTregs in a dose-dependent manner in vitro. In addition, PGRN showed synergistic effects with TGF-β1 on the induction of iTregs. PGRN was required for the immunosuppressive function of Tregs, since PGRN-deficient Tregs had a significantly decreased ability to suppress the proliferation of effector T cells (Teff). In addition, PGRN deficiency caused a marked reduction in Treg numbers in the course of inflammatory arthritis, although no significant difference was observed in the numbers of Tregs between wild-type and PGRN-deficient mice during development. Furthermore, PGRN deficiency led to significant upregulation of the Wnt receptor gene Fzd2. Collectively, this study reveals that PGRN directly regulates the numbers and function of Tregs under inflammatory conditions, and provides new insight into the immune regulatory mechanism of PGRN in the pathogenesis of inflammatory and immune-related diseases.
A modular open platform for systematic functional studies under physiological conditions
Mulholland, Christopher B.; Smets, Martha; Schmidtmann, Elisabeth; Leidescher, Susanne; Markaki, Yolanda; Hofweber, Mario; Qin, Weihua; Manzo, Massimiliano; Kremmer, Elisabeth; Thanisch, Katharina; Bauer, Christina; Rombaut, Pascaline; Herzog, Franz; Leonhardt, Heinrich; Bultmann, Sebastian
2015-01-01
Any profound comprehension of gene function requires detailed information about the subcellular localization, molecular interactions and spatio-temporal dynamics of gene products. We developed a multifunctional integrase (MIN) tag for rapid and versatile genome engineering that serves not only as a genetic entry site for the Bxb1 integrase but also as a novel epitope tag for standardized detection and precipitation. For the systematic study of epigenetic factors, including Dnmt1, Dnmt3a, Dnmt3b, Tet1, Tet2, Tet3 and Uhrf1, we generated MIN-tagged embryonic stem cell lines and created a toolbox of prefabricated modules that can be integrated via Bxb1-mediated recombination. We used these functional modules to study protein interactions and their spatio-temporal dynamics as well as gene expression and specific mutations during cellular differentiation and in response to external stimuli. Our genome engineering strategy provides a versatile open platform for efficient generation of multiple isogenic cell lines to study gene function under physiological conditions. PMID:26007658
Effect of heroin-conditioned auditory stimuli on cerebral functional activity in rats
Energy Technology Data Exchange (ETDEWEB)
Trusk, T.C.; Stein, E.A.
1988-08-01
Cerebral functional activity was measured as changes in the distribution of the free fatty acid [1-14C]octanoate in autoradiograms obtained from rats during brief presentation of a tone previously paired with infusions of heroin or saline. Rats were trained in groups of three consisting of one heroin self-administering animal and two animals receiving yoked infusions of heroin or saline. Behavioral experiments in separate groups of rats demonstrated that these training parameters impart secondary reinforcing properties to the tone for animals self-administering heroin, while the tone remains behaviorally neutral in yoked-infusion animals. The optical densities of thirty-seven brain regions were normalized to a relative index for comparisons between groups. Previous pairing of the tone with heroin infusions irrespective of behavior (yoked-heroin vs. yoked-saline groups) produced functional activity changes in fifteen brain areas. In addition, nineteen regional differences in octanoate labeling density were evident when animals previously trained to self-administer heroin were compared with those receiving yoked-heroin infusions, while twelve differences were noted between the yoked-vehicle and self-administration groups. These functional activity changes are presumed related to the secondary reinforcing capacity of the tone acquired by association with heroin, and may identify neural substrates involved in auditory-signalled conditioning of positive reinforcement to opiates.
Effect of heroin-conditioned auditory stimuli on cerebral functional activity in rats
International Nuclear Information System (INIS)
Trusk, T.C.; Stein, E.A.
1988-01-01
Cerebral functional activity was measured as changes in the distribution of the free fatty acid [1-14C]octanoate in autoradiograms obtained from rats during brief presentation of a tone previously paired with infusions of heroin or saline. Rats were trained in groups of three consisting of one heroin self-administering animal and two animals receiving yoked infusions of heroin or saline. Behavioral experiments in separate groups of rats demonstrated that these training parameters impart secondary reinforcing properties to the tone for animals self-administering heroin, while the tone remains behaviorally neutral in yoked-infusion animals. The optical densities of thirty-seven brain regions were normalized to a relative index for comparisons between groups. Previous pairing of the tone with heroin infusions irrespective of behavior (yoked-heroin vs. yoked-saline groups) produced functional activity changes in fifteen brain areas. In addition, nineteen regional differences in octanoate labeling density were evident when animals previously trained to self-administer heroin were compared with those receiving yoked-heroin infusions, while twelve differences were noted between the yoked-vehicle and self-administration groups. These functional activity changes are presumed related to the secondary reinforcing capacity of the tone acquired by association with heroin, and may identify neural substrates involved in auditory-signalled conditioning of positive reinforcement to opiates
High throughput nonparametric probability density estimation.
Farmer, Jenny; Jacobs, Donald
2018-01-01
In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
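The core single-order-statistics idea, that applying the true CDF to sorted samples yields uniform order statistics with expected positions r/(n+1), can be illustrated with a toy scoring function. This is a sketch of the principle only, not the paper's actual scoring function; the normal test case and names are assumptions.

```python
import numpy as np
from scipy.stats import norm

def order_statistic_score(samples, cdf):
    """Score a candidate CDF against samples: under the true CDF the
    transformed, sorted samples should sit near the uniform
    order-statistic means r/(n+1). Smaller score = better fit."""
    u = np.sort(cdf(samples))
    n = len(u)
    expected = np.arange(1, n + 1) / (n + 1)
    return float(np.max(np.abs(u - expected)))

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 2000)
good = order_statistic_score(x, lambda s: norm.cdf(s, 0.0, 1.0))
bad = order_statistic_score(x, lambda s: norm.cdf(s, 1.0, 1.0))  # wrong mean
```

The correctly specified CDF scores markedly lower than the mis-specified one, which is the signal an iterative estimator can exploit to improve trial CDFs.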
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Leterrier, Marina; Corpas, Francisco J; Barroso, Juan B; Sandalio, Luisa M; del Río, Luis A
2005-08-01
In plant cells, ascorbate is a major antioxidant that is involved in the ascorbate-glutathione cycle. Monodehydroascorbate reductase (MDAR) is the enzymatic component of this cycle involved in the regeneration of reduced ascorbate. The identification of the intron-exon organization and the promoter region of the pea (Pisum sativum) MDAR 1 gene was achieved in pea leaves using the method of walking polymerase chain reaction on genomic DNA. The nuclear gene of MDAR 1 comprises nine exons and eight introns, giving a total length of 3,770 bp. The sequence of 544 bp upstream of the initiation codon, which contains the promoter and 5' untranslated region, and 190 bp downstream of the stop codon were also determined. The presence of different regulatory motifs in the promoter region of the gene might indicate distinct responses to various conditions. The expression analysis in different plant organs by northern blots showed that fruits had the highest level of MDAR. Confocal laser scanning microscopy analysis of pea leaves transformed with Agrobacterium tumefaciens having the binary vectors pGD, which contain the autofluorescent proteins enhanced green fluorescent protein and enhanced yellow fluorescent protein with the full-length cDNA for MDAR 1 and catalase, indicated that the MDAR 1 encoded the peroxisomal isoform. The functional analysis of MDAR by activity and protein expression was studied in pea plants grown under eight stress conditions, including continuous light, high light intensity, continuous dark, mechanical wounding, low and high temperature, cadmium, and the herbicide 2,4-dichlorophenoxyacetic acid. This functional analysis is representative of all the MDAR isoforms present in the different cell compartments. Results obtained showed a significant induction by high light intensity and cadmium. On the other hand, expression studies, performed by semiquantitative reverse transcription-polymerase chain reaction demonstrated differential expression patterns of
Braaker, Sonja; Obrist, Martin Karl; Ghazoul, Jaboury; Moretti, Marco
2017-05-01
Increasing development of urban environments creates high pressure on green spaces, with potential negative impacts on biodiversity and ecosystem services. There is growing evidence that green roofs, rooftops covered with vegetation, can help mitigate the loss of urban green spaces by providing new habitats for numerous arthropod species. Whether green roofs can enhance taxonomic and functional diversity and increase connectivity across urbanized areas remains, however, largely unknown. Furthermore, only limited information is available on how environmental conditions shape green roof arthropod communities. We investigated the community composition of arthropods (Apidae, Curculionidae, Araneae and Carabidae) on 40 green roofs and 40 green sites at ground level in the city of Zurich, Switzerland. We assessed how each site's environmental variables (such as area, height, vegetation, substrate and connectivity among sites) affect species richness and functional diversity using generalized linear models. We used an extension of co-inertia analysis (RLQ) and fourth-corner analysis to highlight the mechanisms underlying community assemblages across taxonomic groups in green roof and ground communities. Species richness was higher at ground-level sites, while no difference in functional diversity was found between green roofs and ground sites. Green roof arthropod diversity increased with higher connectivity and plant species richness, irrespective of the substrate depth, height and area of the green roofs. The species trait analysis revealed the mechanisms, related to the environmental predictors, that shape the species assemblages of the different taxa at ground and roof sites. Our study shows the important contribution of green roofs to maintaining high functional diversity of arthropod communities across different taxonomic groups, despite their lower species richness compared with ground sites. Species communities on green roofs were revealed to be characterized
Probability mapping of contaminants
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)
1994-04-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
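The post-processing step described above, turning a stack of equally likely geostatistical simulations into a probability-of-exceedance map, reduces to a per-cell fraction. The sketch below uses invented toy arrays, not the Fernald data.

```python
import numpy as np

def exceedance_probability_map(simulations, threshold):
    """Per-cell probability that contamination exceeds `threshold`,
    estimated as the fraction of equally likely simulated images
    in which that cell exceeds it."""
    sims = np.asarray(simulations, dtype=float)   # shape (n_sims, ny, nx)
    return (sims > threshold).mean(axis=0)
```

For example, two 2x2 simulations with values [[0, 5], [10, 0]] and [[0, 5], [0, 0]] and a threshold of 1.0 yield the map [[0.0, 1.0], [0.5, 0.0]]: cells exceeding the threshold in every simulation map to 1, cells exceeding it in half of them map to 0.5.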
Probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.
1994-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
2014-06-30
...precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖1) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1-96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165-294. Springer, Berlin (1996). 50. Ledoux
Ye, Dong; Wu, Shu-Qun; Yu, Yao; Liu, Lin; Lu, Xin-Pei; Wu, Yue
2014-03-01
In this work, a mask-free method is introduced for patterned nitrogen doping of graphene using a micro-plasma jet under ambient conditions. Raman and X-ray photoelectron spectroscopy spectra indicate that nitrogen atoms are incorporated into the graphene lattice, with the two-dimensional spatial distribution precisely controlled from the millimeter range down to 10 μm. Since the chemistry of the micro-plasma jet can be tuned through the choice of gas mixture, this direct-writing process can be a versatile approach for patterned functionalization of graphene with high spatial resolution, with promising applications in graphene-based electronics.