A bimodal flexible distribution for lifetime data
Ramires, Thiago G.; Ortega, Edwin M. M.; Cordeiro, Gauss M.; Hens, Niel
2016-01-01
A four-parameter extended bimodal lifetime model called the exponentiated log-sinh Cauchy distribution is proposed. It extends the log-sinh Cauchy and folded Cauchy distributions. We derive some of its mathematical properties including explicit expressions for the ordinary moments and generating and quantile functions. The method of maximum likelihood is used to estimate the model parameters. We implement the fit of the model in the GAMLSS package and provide the codes. The flexibility of the...
Doubly localized surface plasmon resonance in bimodally distributed silver nanoparticles.
Ranjan, M
2012-06-01
Growth of bimodally distributed silver nanoparticles by sequential physical vapour deposition (PVD) is reported. The nanoparticles are grown in three steps: first, nanoparticles are grown on a heated substrate and exposed to atmosphere; second, the nanoparticles are vacuum annealed; third, silver is re-deposited. This deposition sequence leads to the formation of bimodally distributed nanoparticles, and by changing the deposition time, different sets of bimodally distributed nanoparticles can be grown. Localized surface plasmon resonance (LSPR) of such nanoparticles generates double plasmon resonance peaks with overlapped absorption spectra; the double peaks provide a quick indication of the existence of two sets of nanoparticles. The LSPR spectra of such bimodally distributed nanoparticles could be modeled with a double Lorentz oscillator model, which indicates that two sets of non-interacting nanoparticles resonate at different plasma frequencies. It is also reported that silver nanoparticles grown on a heated substrate attain a new shape on exposure to atmosphere followed by vacuum annealing at the same temperature, because physisorption of oxygen at the silver surface changes the surface free energy. This re-shaping due to adsorbed oxygen is responsible for the bimodal size distribution of the nanoparticles.
Asymmetric Bimodal Exponential Power Distribution on the Real Line
Directory of Open Access Journals (Sweden)
Mehmet Niyazi Çankaya
2018-01-01
The asymmetric bimodal exponential power (ABEP) distribution is an extension of the generalized gamma distribution to the real line, obtained by adding two parameters that control the peakedness of the two modes. For special values of the peakedness parameters, the distribution reduces to a combination of half-Laplace and half-normal distributions on the real line. The distribution has two parameters controlling the heights of the modes, so its capacity for bimodality is enhanced by these parameters, and a skewness parameter is added to model asymmetry in the data. The location-scale form of this distribution is proposed, and the Fisher information matrix of its parameters is obtained explicitly. Properties of the ABEP distribution are examined, and real data examples illustrate its modelling capacity. Artificial data replicated from the maximum likelihood estimates of the parameters of the ABEP distribution, and of other distributions with a data generation algorithm, are used to test the similarity with the real data. A brief simulation study is presented.
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
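The smearing construction described in this abstract can be illustrated with its best-known special case: smearing a Gaussian's variance with an inverse-gamma distribution yields a Student-t, whose tails are heavier than those of any single Gaussian. A minimal sketch (the degrees of freedom and sample size are illustrative choices, not the paper's examples):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
nu = 10.0  # degrees of freedom of the resulting Student-t

# If v ~ InvGamma(nu/2, nu/2) and x|v ~ N(0, v), then marginally
# x ~ t(nu): the simplest Gaussian variance superposition.
v = 1.0 / rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)
x = rng.normal(0.0, np.sqrt(v))

# The superposition has heavier tails than any single Gaussian:
# theoretical excess kurtosis of t(nu) is 6 / (nu - 4).
excess_kurtosis = np.mean(x**4) / np.mean(x**2) ** 2 - 3.0
print(round(np.var(x), 2), round(excess_kurtosis, 2))
```

The sample variance approaches the theoretical value nu/(nu - 2), while the positive excess kurtosis signals the memory of the variance smearing.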
The Bimodal Color Distribution of Small Kuiper Belt Objects
Wong, Ian; Brown, Michael E.
2017-04-01
We conducted a two-night photometric survey of small Kuiper Belt objects (KBOs) near opposition using the wide-field Hyper Suprime-Cam instrument on the 8.2 m Subaru Telescope. The survey covered about 90 deg² of sky, with each field imaged in the g and i bands. We detected 356 KBOs, ranging in absolute magnitude from 6.5 to 10.4. Filtering for high-inclination objects within the hot KBO population, we show that the g − i color distribution is strongly bimodal, indicative of two color classes: the red and very red subpopulations. After categorizing objects into the two subpopulations by color, we present the first dedicated analysis of the magnitude distributions of the individual color subpopulations and demonstrate that the two distributions are roughly identical in shape throughout the entire size range covered by our survey. Comparing the color distribution of small hot KBOs with that of Centaurs, we find that they have similar bimodal shapes, thereby providing strong confirmation of previous explanations for the attested bimodality of Centaurs. We also show that the magnitude distributions of the two KBO color subpopulations and the two color subpopulations observed in the Jupiter Trojans are statistically indistinguishable. Finally, we discuss a hypothesis describing the origin of the KBO color bimodality based on our survey results. Based on data collected at the Subaru Telescope, which is operated by the National Astronomical Observatory of Japan.
Rapid intensification and the bimodal distribution of tropical cyclone intensity.
Lee, Chia-Ying; Tippett, Michael K; Sobel, Adam H; Camargo, Suzana J
2016-02-03
The severity of a tropical cyclone (TC) is often summarized by its lifetime maximum intensity (LMI), and the climatological LMI distribution is a fundamental feature of the climate system. The distinctive bimodality of the LMI distribution means that major storms (LMI >96 kt) are not very rare compared with less intense storms. Rapid intensification (RI) is the dramatic strengthening of a TC in a short time, and is notoriously difficult to forecast or simulate. Here we show that the bimodality of the LMI distribution reflects two types of storms: those that undergo RI during their lifetime (RI storms) and those that do not (non-RI storms). The vast majority (79%) of major storms are RI storms. Few non-RI storms (6%) become major storms. While the importance of RI has been recognized in weather forecasting, our results demonstrate that RI also plays a crucial role in the TC climatology.
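The mechanism described here, two storm populations (RI and non-RI) each contributing a unimodal component, can be sketched as a simple two-population mixture whose sum is bimodal. The component means, spreads, and proportions below are purely illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical lifetime-maximum-intensity (kt) components:
# non-RI storms peak at weaker intensities, RI storms at major intensities.
non_ri = rng.normal(55.0, 12.0, 7000)   # storms without rapid intensification
ri = rng.normal(115.0, 18.0, 3000)      # storms with an RI episode
lmi = np.concatenate([non_ri, ri])

counts, edges = np.histogram(lmi, bins=np.arange(20, 180, 10))
dip = counts[6]        # bin [80, 90) kt, between the two component peaks
peak = counts.max()    # height of the dominant (non-RI) mode
print(peak, dip)
```

The histogram shows two local maxima with a pronounced dip between them, so major storms (the second mode) are not rare relative to what a single unimodal fit would predict.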
Multifractal Characteristics of Bimodal Mercury Pore Size Distribution Curves
dos Santos Bonini, C.; Alves, M. C.; Paz González, A.
2012-04-01
Characterization of Hg pore size distribution (PSD) curves by monofractal or multifractal analysis has been demonstrated to be a useful tool that allows a better understanding of the organization of the soil pore space. There is also evidence that multiscale analysis of the different segments found in bimodal pore size distributions measured by Hg intrusion can provide further valuable information. In this study we selected bimodal PSDs from samples taken from an experimental area in São Paulo state, Brazil, where a revegetation trial was set up over saprolitic material. The saprolite had been left abandoned after decapitation of an Oxisol for building purposes. The field trial consisted of various treatments with different grass species and amendments. The pore size distribution of the sampled aggregates was measured in the equivalent diameter range from 0.005 to about 50 μm and was characterized by a bimodal pattern, so that two compartments, 0.005 to 0.2 μm and 0.2 to 50 μm, could be distinguished. Multifractal theory was used to analyse both segments, and their scaling properties could be fitted reasonably well with multifractal models. Multifractal parameters were obtained for the equivalent diameter segments above and below 0.2 μm of the pore size distributions studied.
Bimodal distribution of damage morphology generated by ion implantation
International Nuclear Information System (INIS)
Mok, K.R.C.; Jaraiz, M.; Martin-Bragado, I.; Rubio, J.E.; Castrillo, P.; Pinacho, R.; Srinivasan, M.P.; Benistant, F.
2005-01-01
A nucleation and evolution model of damage based on amorphous pockets (APs) has recently been developed and implemented in an atomistic kinetic Monte Carlo simulator. In the model, APs are disordered structures I_nV_m, which are agglomerates of interstitials (I) and vacancies (V). This model has been used to study the composition and size distribution of APs during different ion implantations. Depending strongly on the dose rate, ion mass and implant temperature, the APs can evolve to a defect population whose agglomerates have a similar number of I and V (n ∼ m), to a defect population with pure I (m ∼ 0) and pure V (n ∼ 0) clusters, or to a mixture of APs and clusters. This behaviour corresponds to a bimodal (APs/clusters) distribution of damage. As the APs have a different thermal stability than the I and V clusters, the same damage concentration obtained through different implant conditions has a different damage morphology and, consequently, exhibits a different resistance to subsequent thermal treatments.
Bimodal distribution of glucose is not universally useful for diagnosing diabetes
DEFF Research Database (Denmark)
Vistisen, Dorte; Colagiuri, Stephen; Borch-Johnsen, Knut
2009-01-01
OBJECTIVE: Bimodality in the distribution of glucose has been used to define the cut point for the diagnosis of diabetes. Previous studies on bimodality have primarily been in populations with a high prevalence of type 2 diabetes, including one study in a white Caucasian population. All studies i...
Pande, C. S.; DeGiorgi, V. G.; E Moser, A.
2018-02-01
An attractive processing route for enhancing the yield strength of high-strength nanocrystalline metals and alloys while maintaining high ductility is to develop a bimodal grain size distribution (GSD), in which the finer grains are expected to provide strength while the coarser grains maintain or even enhance ductility. We present a theoretical model predicting the strength of such a system and show, analytically, how the yield stress is related to the various parameters of the bimodal GSD, such as the volume fractions of the two components of the bimodal distribution and their standard deviations.
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is to guess words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average, compared to the total number, decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
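The optimal guessing strategy described, trying words in decreasing order of probability, can be made concrete for a toy first-order "language". The three-letter alphabet and its probabilities are assumed purely for illustration:

```python
import math
from itertools import product

# Toy alphabet with assumed letter probabilities (letters drawn i.i.d.)
letters = {'a': 0.5, 'b': 0.3, 'c': 0.2}
word_len = 3

# Enumerate all words of the given length with their probabilities
words = [(''.join(w), math.prod(letters[c] for c in w))
         for w in product(letters, repeat=word_len)]

# Optimal strategy: guess words in decreasing order of probability
words.sort(key=lambda wp: -wp[1])

# Average number of guesses = sum over ranks r of r * p(word at rank r)
avg_guesses = sum(r * p for r, (_, p) in enumerate(words, start=1))
print(round(avg_guesses, 3))  # well below the midpoint 14 of the 27 words
```

For realistic alphabet and word sizes this enumeration is infeasible, which is exactly why the paper develops approximations to the average number of guesses.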
Ilenkov, R. Ya; Taichenachev, A. V.; Yudin, V. I.; Prudnikov, O. N.
2018-01-01
The work is devoted to the study of the features and parameters of the momentum distributions of atoms laser-cooled on weak optical transitions. It is shown that the atomic distributions are described by a bimodal momentum distribution whose characteristics depend on the parameters of the light field. In a strong field, a velocity-selective coherent population trapping effect is observed.
mocca code for star cluster simulations - VI. Bimodal spatial distribution of blue stragglers
Hypki, Arkadiusz; Giersz, Mirek
2017-11-01
The paper presents an analysis of the formation mechanism and properties of the spatial distributions of blue stragglers in evolving globular clusters, based on numerical simulations performed with the mocca code. First, we present N-body and mocca simulations that attempt to reproduce the simulations of Ferraro et al. (2012). Then, we show the agreement between N-body and the mocca code. Finally, we discuss the formation process of the bimodal distribution. We report that we could not reproduce the simulations from Ferraro et al. (2012). Moreover, we show that the so-called bimodal spatial distribution of blue stragglers is a very transient feature: it can form in one snapshot in time and easily vanish in the next. We also show that the radius of avoidance proposed by Ferraro et al. (2012) goes out of sync with the apparent minimum of the bimodal distribution after about two half-mass relaxation times, without our being able to determine the reason. This finding poses a real challenge for the dynamical clock, which uses this radius to determine the dynamical age of globular clusters. Additionally, the paper discusses a few important problems concerning the apparent visibility of the bimodal distributions, which have to be taken into account when studying the spatial distributions of blue stragglers.
Bimodal grain-size distribution of Chinese loess, and its palaeoclimatic implications
Sun, D.G.; Bloemendal, J.; Rea, D.K.; An, Z.S.; Vandenberghe, J.; Lu, H.; Su, R.; Liu, T.S.
2004-01-01
Grain-size analysis indicates that Chinese loess generally shows a bimodal distribution with a coarse and a fine component. The coarse component, comprising the main part of the loess, has pronounced kurtosis and is well sorted, which is interpreted to be the product of dust storms generated by
Looking for bimodal distributions in multi-fragmentation reactions
International Nuclear Information System (INIS)
Gulminelli, F.
2007-01-01
The presence of a phase transition in a finite system can be deduced, together with its order, from the form of the distribution of the order parameter. This issue has been extensively studied in multifragmentation experiments, with results that do not appear fully consistent. In this paper we discuss the effect of the statistical ensemble or sorting conditions on the form of fragment distributions, and propose a new method, which can be easily implemented experimentally, to discriminate between different fragmentation scenarios. This method, based on a re-weighting of the measured distribution to account for the experimental constraints linked to the energy deposit, is tested on different simple models, and appears to provide a powerful discrimination. (author)
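The re-weighting idea, correcting a measured distribution for the constraints of the sampling ensemble, can be sketched in its simplest form: canonical re-weighting of event energies between two temperatures. The exponential toy model below is an assumption for illustration, not the paper's method in detail:

```python
import numpy as np

rng = np.random.default_rng(6)

# "Measured" events: energies drawn under one ensemble (inverse temp beta1)
beta1, beta2 = 1.0, 1.5
E = rng.exponential(1.0 / beta1, 100_000)

# Re-weight each event to mimic the ensemble at beta2: multiplying by
# exp(-(beta2 - beta1) E) turns the density exp(-beta1 E) into exp(-beta2 E).
w = np.exp(-(beta2 - beta1) * E)
w /= w.sum()

mean_reweighted = np.sum(w * E)   # should approach 1 / beta2
print(round(mean_reweighted, 3))
```

The weighted sample reproduces the moments of the target ensemble without new measurements, which is the essence of discriminating fragmentation scenarios by re-weighting for the experimental energy-deposit constraint.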
Bimodal score distributions and the Myers-Briggs Type Indicator: fact or artifact?
Bess, Tammy L; Harvey, Robert J
2002-02-01
We examined Myers-Briggs Type Indicator (MBTI) score distributions computed using item response theory (IRT) to assess the generalizability of earlier bimodality reports that have been cited in support of the "type" versus "trait" view of personality. Using the BILOG IRT program to score a sample of approximately 12,000 individuals who participated in leadership development programs, theta score distributions for the 4 dimensions of the MBTI computed using 10 (the BILOG default) versus 50 quadrature points were compared. Results indicated that past reports of bimodality were artifacts caused by BILOG's default use of a small number of quadrature points; when larger numbers of points were used, score distributions became strongly center-weighted. Although our findings are not supportive of the "type"-based hypothesis, the extremely high correlations between theta scores (rs > .996) suggest that no practical differences would be expected as a function of the number-of-quadrature-points decision.
The Multivariate Gaussian Probability Distribution
DEFF Research Database (Denmark)
Ahrendt, Peter
2005-01-01
This technical report intends to gather information about the multivariate Gaussian distribution that was previously not (at least to the author's knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical...
Exact results for the Kuramoto model with a bimodal frequency distribution
DEFF Research Database (Denmark)
Martens, Erik Andreas; Barreto, E.; Strogatz, S. H.
2009-01-01
We analyze a large system of globally coupled phase oscillators whose natural frequencies are bimodally distributed. The dynamics of this system has been the subject of long-standing interest. In 1984 Kuramoto proposed several conjectures about its behavior; ten years later, Crawford obtained...... the first analytical results by means of a local center manifold calculation. Nevertheless, many questions have remained open, especially about the possibility of global bifurcations. Here we derive the system’s stability diagram for the special case where the bimodal distribution consists of two equally......, where all the oscillators are desynchronized; partial synchrony, where a macroscopic group of phase-locked oscillators coexists with a sea of desynchronized ones; and a standing wave state, where two counter-rotating groups of phase-locked oscillators emerge. Analytical results are presented......
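Two of the regimes listed in this abstract, incoherence at weak coupling and synchrony at strong coupling, can be reproduced by directly integrating the mean-field Kuramoto equations with a bimodal frequency distribution. The peak positions, widths, and coupling strengths below are illustrative choices, not the paper's analytical values:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000
# Bimodal natural frequencies: two equally weighted narrow peaks
omega = np.concatenate([rng.normal(-1.0, 0.1, N // 2),
                        rng.normal(+1.0, 0.1, N // 2)])
theta0 = rng.uniform(0.0, 2.0 * np.pi, N)

def simulate(K, dt=0.05, steps=4000):
    """Euler-integrate d(theta_i)/dt = omega_i + K r sin(psi - theta_i);
    return the final order parameter r = |<exp(i theta)>|."""
    th = theta0.copy()
    for _ in range(steps):
        z = np.exp(1j * th).mean()            # complex order parameter
        th += dt * (omega + K * abs(z) * np.sin(np.angle(z) - th))
    return abs(np.exp(1j * th).mean())

r_weak, r_strong = simulate(K=0.2), simulate(K=6.0)
print(round(r_weak, 2), round(r_strong, 2))
```

At weak coupling the order parameter stays at its O(N^{-1/2}) incoherent level; at strong coupling the two frequency groups lock together into a single synchronized cluster.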
Bayesian optimization for computationally extensive probability distributions.
Tamura, Ryo; Hukushima, Koji
2018-01-01
An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distribution than the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently when combined with the steepest descent method, and is thus a powerful tool for searching for a better maximizer of computationally extensive probability distributions.
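The core loop, fitting a Gaussian-process surrogate to the evaluated points and taking the extremum of an acquisition function as the next training point, can be sketched with a minimal GP and an upper-confidence-bound rule. The target "posterior", kernel length scale, and UCB constant are all illustrative assumptions; the paper's own model and acquisition differ in detail:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(x):
    # Stand-in for an expensive-to-evaluate log-probability surface
    return -0.5 * (x - 2.0) ** 2 + 0.8 * np.exp(-8.0 * (x + 1.5) ** 2)

grid = np.linspace(-4.0, 4.0, 401)

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel matrix between point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

X = list(rng.uniform(-4.0, 4.0, 3))       # a few random initial evaluations
y = [log_post(x) for x in X]
for _ in range(20):
    Xa, ya = np.array(X), np.array(y)
    K = rbf(Xa, Xa) + 1e-8 * np.eye(len(Xa))
    Ks = rbf(grid, Xa)
    mu = Ks @ np.linalg.solve(K, ya)                       # GP posterior mean
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    ucb = mu + 2.0 * np.sqrt(np.clip(var, 0.0, None))      # acquisition
    X.append(float(grid[np.argmax(ucb)]))                  # next training point
    y.append(log_post(X[-1]))

best = X[int(np.argmax(y))]
print(round(best, 2))  # near the global maximizer at x = 2
```

With only ~23 evaluations of the expensive function, the acquisition maximum steers sampling toward the global mode rather than spreading points uniformly, which is the advantage claimed over random search.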
Angel, M. C.; Mejia, J.; Chang, H. I.; Ochoa, C. A.; Castro, C. L.
2017-12-01
We examine the ability of a regional climate model (RCM) to simulate the observed annual and diurnal cycles of precipitation over Central America (CA), the Caribbean Sea and NW South America (NWSA). The region is dominated by a bimodal precipitation annual cycle: over CA and the Caribbean, the mid-summer drought suppresses the rainy season in July-August; over NWSA, and more pronouncedly over the Andes, the ITCZ meridional migration is argued to dominate the bimodal annual cycle. The intricate land-ocean distribution, the complex terrain, and long-lasting mesoscale convective systems have been related to the intricate bimodal diurnal distribution of precipitation over the far eastern Pacific and NWSA. A CORDEX-CA simulation based on the Weather Research and Forecasting (WRF) model at 25 km grid size, driven by ERA-Interim reanalysis for the period 1979-2015, was implemented. The simulations were evaluated using surface observations and the research-grade high-resolution satellite precipitation products TRMM 3B42 and CMORPH. We use spectral analysis to estimate the mean phase and amplitude spatial patterns at annual and diurnal time scales. We further contrast the consistencies and differences using more focused, higher-resolution simulations based on WRF and ERA-Interim at 10 km grid size and at convection-permitting grid sizes (<4 km). The research discusses the flow and orographic dependencies of the ability to adequately simulate the annual and diurnal bimodal distributions of precipitation. These comparisons provide a high-level validation of the ability of the WRF-based CORDEX-CA simulation to capture one of the rainiest regions on Earth and its basic but challenging small-scale spatial-temporal climate variations.
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the ...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.
The climatic imprint of bimodal distributions in vegetation cover for western Africa
Yin, Zun; Dekker, Stefan C.; van den Hurk, Bart J. J. M.; Dijkstra, Henk A.
2016-06-01
Observed bimodal distributions of woody cover in western Africa provide evidence that alternative ecosystem states may exist under the same precipitation regimes. In this study, we show that bimodality can also be observed in mean annual shortwave radiation and above-ground biomass, which might closely relate to woody cover through vegetation-climate interactions. We therefore expect that using radiation and above-ground biomass enables us to distinguish the two modes of woody cover. However, through conditional histogram analysis, we find that the bimodality of woody cover can still exist under conditions of low mean annual shortwave radiation and low above-ground biomass. This suggests that this specific condition might play a key role in critical transitions between the two modes, while under other conditions no bimodality was found. Based on a land cover map from which anthropogenic land use was removed, six climatic indicators that represent water, energy, climate seasonality and water-radiation coupling are analysed to investigate the coexistence of these indicators with specific land cover types. From this analysis we find that mean annual precipitation is not sufficient to predict potential land cover change. Indicators of climate seasonality are strongly related to the observed land cover type, but they cannot predict a stable forest state under the observed climatic conditions, in contrast to observed forest states. A new indicator (the normalized difference of precipitation) successfully expresses the stability of the precipitation regime and can improve the prediction accuracy of forest states. Next we evaluate land cover predictions based on different combinations of climatic indicators. Regions with a high potential for land cover transitions are revealed. The results suggest that the tropical forest in the Congo basin may be unstable and could decrease significantly. An increase in the area covered by savanna and grass
APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS
Directory of Open Access Journals (Sweden)
T. I. Aliev
2013-03-01
For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating the distributions on the basis of their first two moments are derived using multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by a hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
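The hypoexponential distribution mentioned here is a series of independent exponential phases, so its first two moments follow directly from the phase rates, and tuning the rates sweeps the coefficient of variation continuously, unlike the Erlang distribution. A small sketch for two phases (the rate values are chosen purely for illustration):

```python
import math

def hypoexp_moments(rates):
    """Mean and coefficient of variation of a hypoexponential distribution
    (a series of independent exponential phases with the given rates)."""
    mean = sum(1.0 / r for r in rates)
    var = sum(1.0 / r ** 2 for r in rates)
    return mean, math.sqrt(var) / mean

# Equal rates give the Erlang-2 limit, CV = 1/sqrt(2) ~ 0.707;
# strongly unequal rates approach a single exponential, CV -> 1.
_, cv_balanced = hypoexp_moments([2.0, 2.0])
_, cv_skewed = hypoexp_moments([1.01, 100.0])
print(round(cv_balanced, 3), round(cv_skewed, 3))  # 0.707 0.99
```

With k phases the attainable range is (1/√k, 1), so choosing k and the rates from the first two moments yields the two-moment approximation the abstract describes.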
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly...... report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...... reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes, provided one is willing to assume Subjective Expected Utility.
An all-timescales rainfall probability distribution
Papalexiou, S. M.; Koutsoyiannis, D.
2009-04-01
The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance, as the hydraulic design typically depends strongly on the choice of rainfall model. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing rainfall at all timescales, is a difficult task. A search of the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales, or even for the same timescale, such as the Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, and three- and four-parameter Kappa distributions, among many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions and is capable of describing rainfall over a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution on various rainfall samples from different areas and for timescales varying from sub-hourly to annual.
International Nuclear Information System (INIS)
Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.
2015-01-01
High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal distribution of the magnetic dipole moment. Here, we test this assumption for different types of superparamagnetic iron oxide nanoparticles in the 5–20 nm range, by multimodal fitting of magnetization curves using the MINORIM inversion method. The particles are studied while in dilute colloidal dispersion in a liquid, thereby preventing hysteresis and diminishing the effects of magnetic anisotropy on the interpretation of the magnetization curves. For two different types of well crystallized particles, the magnetic distribution is indeed log-normal, as expected from the physical size distribution. However, two other types of particles, with twinning defects or inhomogeneous oxide phases, are found to have a bimodal magnetic distribution. Our qualitative explanation is that relatively low fields are sufficient to begin aligning the particles in the liquid on the basis of their net dipole moment, whereas higher fields are required to align the smaller domains or less magnetic phases inside the particles. - Highlights: • Multimodal fits of dilute ferrofluids reveal when the particles are multidomain. • No a priori shape of the distribution is assumed by the MINORIM inversion method. • Well crystallized particles have log-normal TEM and magnetic size distributions. • Defective particles can combine a monomodal size and a bimodal dipole moment
X-ray diffraction microstructural analysis of bimodal size distribution MgO nano powder
International Nuclear Information System (INIS)
Suminar Pratapa; Budi Hartono
2009-01-01
An investigation of the characteristics of X-ray diffraction data for an MgO powdered mixture of nano and sub-nano particles has been carried out to reveal crystallite-size-related microstructural information. The MgO powders were prepared by a co-precipitation method followed by heat treatment at 500 °C and 1200 °C for 1 hour; the two temperatures were chosen to obtain two powders with distinct crystallite sizes and size distributions. The powders were then blended in air to give a presumably bimodal-size-distribution MgO nanopowder. High-quality laboratory X-ray diffraction data for the powders were collected and analysed with the Rietveld-based MAUD software using a lognormal size distribution. Results show that the single-mode powders exhibit spherical crystallite sizes (R) of 20(1) nm and 160(1) nm for the 500 °C and 1200 °C data, respectively, with the nanometric powder displaying a narrower crystallite size distribution, indicated by a lognormal dispersion parameter σ of 0.21, compared to 0.01 for the sub-nanometric powder. The mixture exhibits relatively more asymmetric peak broadening, and analysing its X-ray diffraction data with a single-phase approach gives unrealistic results. Introducing a two-phase model for the mixture to accommodate the bimodal-size-distribution characteristics gives R = 100(6) nm and σ = 0.62 for the nanometric phase and R = 170(5) nm and σ = 0.12 for the sub-nanometric phase. (author)
Sampling probability distributions of lesions in mammograms
Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.
2015-03-01
One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in JavaScript Object Notation (JSON) format.
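Sampling lesion locations from a discretised probability distribution function reduces to inverse-transform sampling over the histogram cells. A minimal one-dimensional sketch (the cell list and probabilities below are hypothetical placeholders for the four-dimensional lesion-location PDF described above):

```python
import random

def sample_cell(cells, probs, rng=random):
    """Draw one cell from a discretised PDF by inverting the cumulative sum."""
    u = rng.random()
    cum = 0.0
    for cell, p in zip(cells, probs):
        cum += p
        if u < cum:
            return cell
    return cells[-1]  # guard against floating-point round-off
```

In the full method each "cell" would carry the four coordinates (two per projection), so that sampled positions remain correlated across projections.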
Yu, Wenfei; Li, Tipei; Wu, Mei
1999-01-01
We have investigated the bimodal distribution of the duration of BATSE gamma-ray bursts (GRBs) by analyzing light curves of 64 ms time resolution. We define the average pulse width of GRBs from the auto-correlation function of GRB profiles. The distribution of the average pulse width of GRBs is bimodal, suggesting that GRBs are composed of long-pulse GRBs and short-pulse GRBs. The average pulse width of long-pulse GRBs appears correlated with the peak flux, consistent with the time dilation e...
Calculating Cumulative Binomial-Distribution Probabilities
Scheuer, Ernest M.; Bowerman, Paul N.
1989-01-01
Cumulative-binomial computer program, CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
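The cumulative binomial calculation underlying the k-out-of-n reliability analysis can be sketched by direct summation (an illustration only, not CUMBIN's own algorithm, which is a C program and presumably uses numerically safer recurrences):

```python
from math import comb

def cum_binom(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), by direct summation."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def k_out_of_n_reliability(k, n, p):
    """Reliability of a k-out-of-n system of identical components,
    each functioning independently with probability p: P(X >= k)."""
    return 1.0 - cum_binom(k - 1, n, p)
```

For example, a 2-out-of-3 system with component reliability 0.9 has system reliability 0.972.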
Superferromagnetism in mechanically alloyed fcc Fe23Cu77 with bimodal cluster size distribution
International Nuclear Information System (INIS)
Silva, N J O; Amaral, J S; Amaral, V S; Costa, B F O; Le Caer, G
2009-01-01
Magnetic measurements, x-ray diffraction and Moessbauer spectroscopy were used to characterize a nanostructured fcc Fe23Cu77 at.% alloy prepared by high-energy ball-milling, addressing in particular the effect of clustering on the nature of the interacting magnetic entities. The interpretation of magnetization measurements leads to the conclusion that grains, whose mean size is ∼16 nm, contain two populations of magnetic Fe-rich nanoclusters with a bimodal size distribution. These two sets of clusters contain about 14 and 400 Fe atoms and have magnetic moments of 30 μB and 860 μB, respectively. The inter-cluster ferromagnetic interactions that lead to superferromagnetism with a Curie temperature TC ∼ 220 K can be described by a mean field determined by the smaller clusters only, which account for 90% of the magnetization.
Constraints on probability distributions of grammatical forms
Directory of Open Access Journals (Sweden)
Kostić Aleksandar
2007-01-01
In this study we investigate the constraints on the probability distribution of grammatical forms within morphological paradigms of the Serbian language, where a paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm can be extended to other criteria as well; hence, we can also think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, as paradigms. We took the relative entropy as a measure of the homogeneity of the probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian and the relative entropy was calculated for each paradigm. The obtained results indicate that for most paradigms the relative entropy values fall within the range 0.75-0.9. The non-homogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of the stability of the morphological system.
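The homogeneity measure used here, entropy of the form-frequency distribution normalised by its maximum, is straightforward to reproduce; a minimal sketch with invented counts:

```python
from math import log2

def relative_entropy(counts):
    """Shannon entropy of a frequency distribution, normalised by the
    maximum entropy log2(number of forms), giving a value in [0, 1]."""
    total = sum(counts)
    h = -sum((c / total) * log2(c / total) for c in counts if c > 0)
    return h / log2(len(counts))
```

A perfectly even paradigm scores 1.0; a paradigm dominated by a single form scores near 0.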
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions ... computations of aggregate values. The paper also reports on experiments with the methods. The work is motivated by a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain ...
Parametric probability distributions for anomalous change detection
Energy Technology Data Exchange (ETDEWEB)
Theiler, James P [Los Alamos National Laboratory; Foy, Bernard R [Los Alamos National Laboratory; Wohlberg, Brendt E [Los Alamos National Laboratory; Scovel, James C [Los Alamos National Laboratory
2010-01-01
The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
Probability Distribution for Flowing Interval Spacing
International Nuclear Information System (INIS)
Kuzio, S.
2001-01-01
The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term ''flowing interval spacing'' as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled ''Probability Distribution for Flowing Interval Spacing'' (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) ''Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses'' (CRWMS M and O 1999a) and (2) ''Incorporation of Heterogeneity in SZ Flow and Transport Analyses'' (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be
Yang, Yuqin; Chen, Fangyi; Karasawa, Takatoshi; Ma, Ke-Tao; Guan, Bing-Cai; Shi, Xiao-Rui; Li, Hongzhe; Steyger, Peter S.; Nuttall, Alfred L.; Jiang, Zhi-Gen
2015-01-01
The resting membrane potential (RP) of vascular smooth muscle cells (VSMCs) is a major determinant of cytosolic calcium concentration and vascular tone. The heterogeneity of RPs and its underlying mechanism among different vascular beds remain poorly understood. We compared the RPs and vasomotion properties between the guinea pig spiral modiolar artery (SMA), brain arterioles (BA) and mesenteric arteries (MA). We found: 1) RPs showed a robust bimodal distribution peaked at -76 and -40 mV evenly in the SMA, unevenly at -77 and -51 mV in the BA and ~-71 and -52 mV in the MA. Ba2+ 0.1 mM eliminated their high RP peaks ~-75 mV. 2) Cells with low RP (~-45 mV) hyperpolarized in response to 10 mM extracellular K+, while cells with a high RP depolarized, and cells with intermediate RP (~-58 mV) displayed an initial hyperpolarization followed by prolonged depolarization. Moderate high K+ typically induced dilation, constriction and a dilation followed by constriction in the SMA, MA and BA, respectively. 3) Boltzmann-fit analysis of the Ba2+-sensitive inward rectifier K+ (Kir) whole-cell current showed that the maximum Kir conductance density significantly differed among the vessels, and the half-activation voltage was significantly more negative in the MA. 4) Corresponding to the whole-cell data, computational modeling simulated the three RP distribution patterns and the dynamics of RP changes obtained experimentally, including the regenerative swift shifts between the two RP levels after reaching a threshold. 5) Molecular works revealed strong Kir2.1 and Kir2.2 transcripts and Kir2.1 immunolabeling in all 3 vessels, while Kir2.3 and Kir2.4 transcript levels varied. We conclude that a dense expression of functional Kir2.X channels underlies the more negative RPs in endothelial cells and a subset of VSMC in these arterioles, and the heterogeneous Kir function is primarily responsible for the distinct bimodal RPs among these arterioles. The fast Kir-based regenerative shifts
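The Boltzmann fit used to summarise the Kir conductance has the standard sigmoidal form sketched below (parameter names g_max, v_half and k are the usual maximum conductance, half-activation voltage and slope factor; the values in the usage note are illustrative, not the study's estimates):

```python
from math import exp

def boltzmann(v, g_max, v_half, k):
    """Boltzmann activation curve g(V) = g_max / (1 + exp((V - v_half) / k)).
    With k > 0, conductance is larger at more negative potentials,
    as expected for an inwardly rectifying current."""
    return g_max / (1 + exp((v - v_half) / k))
```

At V = v_half the curve passes through exactly half of g_max, which is what makes the half-activation voltage a convenient comparison point between vessels.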
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Probability evolution method for exit location distribution
Zhu, Jinjie; Chen, Zhen; Liu, Xianbin
2018-03-01
The exit problem in the framework of large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes an exponentially long time as the noise approaches zero, with the majority of that time wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise but may induce certain deviations for large noise. Finally, some possible ways to improve our method are discussed.
Evidence of A Bimodal US GDP Growth Rate Distribution: A Wavelet Approach
Directory of Open Access Journals (Sweden)
Sandro Claudio Lera
2017-04-01
We present a quantitative characterisation of the fluctuations of the annualized growth rate of the real US GDP per capita at many scales, using a wavelet transform analysis of two data sets, quarterly data from 1947 to 2015 and annual data from 1800 to 2010. The chosen mother wavelet (first derivative of the Gaussian function), applied to the logarithm of the real US GDP per capita, provides a robust estimation of the instantaneous growth rate at different scales. Our main finding is that business cycles appear at all scales and the distribution of GDP growth rates can be well approximated by a bimodal function associated with a series of switches between regimes of strong growth rate $\rho_\text{high}$ and regimes of low growth rate $\rho_\text{low}$. The succession of these two regimes compounds to produce a remarkably stable long-term average real annualized growth rate of 1.6% from 1800 to 2010 and $\approx 2.0\%$ since 1950, which is the result of a subtle compensation between the high and low growth regimes that alternate continuously. Thus, the overall growth dynamics of the US economy is punctuated, with phases of strong growth that are intrinsically unsustainable, followed by corrections or consolidation until the next boom starts. We interpret these findings within the theory of "social bubbles" and argue as a consequence that estimations of the cost of the 2008 crisis may be misleading. We also interpret the absence of strong recovery since 2008 as a protracted low growth regime $\rho_\text{low}$ associated with the exceptional nature of the preceding large growth regime.
Probability Distribution for Flowing Interval Spacing
International Nuclear Information System (INIS)
S. Kuzio
2004-01-01
Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be
The probability distribution of extreme precipitation
Korolev, V. Yu.; Gorshenin, A. K.
2017-12-01
On the basis of the negative binomial distribution of the duration of wet periods measured in days, an asymptotic model is proposed for the distribution of the maximum daily rainfall volume during a wet period; it has the form of a mixture of Frechet distributions and coincides with the distribution of a positive power of a random variable having the Fisher-Snedecor distribution. The method of proving the corresponding result is based on limit theorems for extreme order statistics in samples of random size with a mixed Poisson distribution. The adequacy of the proposed models and methods of their statistical analysis is demonstrated by the example of estimating the extreme distribution parameters based on real data.
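A finite mixture of Frechet distributions of the kind described here is easy to evaluate; a minimal sketch (the component weights and parameters below are invented for illustration, not estimates from the paper):

```python
from math import exp

def frechet_cdf(x, alpha, s):
    """CDF of the Frechet distribution, F(x) = exp(-(x/s)**(-alpha)) for x > 0."""
    return exp(-((x / s) ** (-alpha))) if x > 0 else 0.0

def mixture_cdf(x, components):
    """CDF of a finite mixture; components is a list of (weight, alpha, s)
    tuples with weights summing to 1."""
    return sum(w * frechet_cdf(x, a, s) for w, a, s in components)
```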
Matrix-exponential distributions in applied probability
Bladt, Mogens
2017-01-01
This book contains an in-depth treatment of matrix-exponential (ME) distributions and their sub-class of phase-type (PH) distributions. Loosely speaking, an ME distribution is obtained through replacing the intensity parameter in an exponential distribution by a matrix. The ME distributions can also be identified as the class of non-negative distributions with rational Laplace transforms. If the matrix has the structure of a sub-intensity matrix for a Markov jump process we obtain a PH distribution which allows for nice probabilistic interpretations facilitating the derivation of exact solutions and closed form formulas. The full potential of ME and PH unfolds in their use in stochastic modelling. Several chapters on generic applications, like renewal theory, random walks and regenerative processes, are included together with some specific examples from queueing theory and insurance risk. We emphasize our intention towards applications by including an extensive treatment on statistical methods for PH distribu...
Bulk rock and melt inclusion analyses indicate bimodal distribution in Calbuco volcano (Chile)
Montalbano, Salvatrice; Bolle, Olivier; Schiano, Pierre; Cluzel, Nicolas; Vander Auwera, Jacqueline
2014-05-01
Calbuco is an active stratovolcano situated in the central SVZ (Southern Volcanic Zone) of the Andes at 41.2°S. The dominant rock-type is basaltic andesite containing macrocrysts of plagioclase (An57-91), olivine (Fo60-81), clinopyroxene (Mg# 74-85), orthopyroxene (Mg# 66-75) and rare amphibole (mostly pargasitic) in a micro-crystalline matrix. Orthopyroxene frequently occurs as a reaction rim surrounding olivine, suggestive of a peritectic reaction. The oldest lava unit (Calbuco 1) contains basaltic andesites that are notably lower in MgO and higher in Al2O3 than the other samples. Some dacitic compositions have also been identified. Bulk rock analyses define a low-K calc-alkaline trend, although two basalts plot in the tholeiite field in the AFM diagram. Bulk rocks display a differentiation trend of decreasing CaO, FeOt and MgO and increasing K2O and P2O5 with increasing SiO2. Typical negative anomalies in Nb, Ta and Th are shown in spiderdiagrams whereas there is no Eu anomaly in REE patterns. In variation diagrams, a clear compositional gap occurs between 61 and 65 wt. % SiO2. Investigation of melt inclusions was performed on homogenized and naturally quenched inclusions hosted in olivine and clinopyroxene crystals. Their composition mimics the differentiation trend observed in the bulk samples, including a bimodal distribution. The melt inclusions analyzed in olivine range in composition from 45 to 58 wt. % SiO2 whereas those occurring in clinopyroxene range from 70 to 76 wt. % SiO2. The compositional gap of the melt inclusions thus overlaps that of the whole rocks. The observed differentiation trend from basalt to basaltic andesite (49 to 58 wt. % SiO2) perfectly fits published experimental trends acquired on hydrous basalts at different crustal pressures, water concentrations and oxygen fugacities at subduction zones and can be accounted for by a fractional crystallization process where a bulk cumulate made of plagioclase, olivine, clinopyroxene and
Eliciting Subjective Probability Distributions on Continuous Variables
1975-08-01
Approved for Public Release; Distribution Unlimited. Keywords: adjusting, proper scoring rule, fractile, subjective probability, uncertainty measures.
ADVANCES IN DISTRIBUTED OPTIMIZATION USING PROBABILITY COLLECTIVES
DAVID H. WOLPERT; CHARLIE E. M. STRAUSS; DEV RAJNARAYAN
2006-01-01
Recent work has shown how information theory extends conventional full-rationality game theory to allow bounded rational agents. The associated mathematical framework can be used to solve distributed optimization and control problems. This is done by translating the distributed problem into an iterated game, where each agent's mixed strategy (i.e. its stochastically determined move) sets a different variable of the problem. So the expected value of the objective function of the distributed pr...
Probability distribution of machining center failures
International Nuclear Information System (INIS)
Jia Yazhou; Wang Molin; Jia Zhixin
1995-01-01
Through field tracing research for 24 Chinese cutter-changeable CNC machine tools (machining centers) over a period of one year, a database of operation and maintenance for machining centers was built, the failure data was fitted to the Weibull distribution and the exponential distribution, the effectiveness was tested, and the failure distribution pattern of machining centers was found. Finally, the reliability characterizations for machining centers are proposed
Probability distribution of flood flows in Tunisia
Abida, H.; Ellouze, M.
2008-05-01
L (Linear) moments are used in identifying regional flood frequency distributions for different zones across Tunisia. 1134 site-years of annual maximum stream flow data from a total of 42 stations with an average record length of 27 years are considered. The country is divided into two homogeneous regions (northern and central/southern Tunisia) using a heterogeneity measure based on the spread of the sample L-moments among the sites in a given region. Then, selection of the corresponding distribution is achieved through goodness-of-fit comparisons in L-moment diagrams and verified using an L-moment based regional test that compares observed to theoretical values of L-skewness and L-kurtosis for various candidate distributions. The distributions used, which represent five of the most frequently used distributions in the analysis of hydrologic extreme variables, are: (i) Generalized Extreme Value (GEV), (ii) Pearson Type III (P3), (iii) Generalized Logistic (GLO), (iv) Generalized Normal (GNO), and (v) Generalized Pareto (GPA). Spatial trends with respect to the best-fit flood frequency distribution are distinguished: northern Tunisia was shown to be represented by the GNO distribution, while the GNO and GEV distributions give the best fit in central/southern Tunisia.
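The sample L-moments on which this screening rests can be computed from probability-weighted moments; a minimal sketch using the standard unbiased estimators (an illustration, not the study's code):

```python
def sample_lmoments(data):
    """First four sample L-moments via unbiased probability-weighted
    moments b_r; returns (l1, l2, L-skewness t3, L-kurtosis t4)."""
    x = sorted(data)
    n = len(x)
    b = [0.0] * 4
    for i, xi in enumerate(x, start=1):
        b[0] += xi
        b[1] += xi * (i - 1) / (n - 1)
        b[2] += xi * (i - 1) * (i - 2) / ((n - 1) * (n - 2))
        b[3] += xi * (i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3))
    b = [bi / n for bi in b]
    l1 = b[0]                                      # mean
    l2 = 2 * b[1] - b[0]                           # L-scale
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l1, l2, l3 / l2, l4 / l2
```

Plotting the (t3, t4) pairs of all sites against the theoretical curves of the candidate distributions is the essence of the L-moment diagram comparison mentioned above.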
Shaposhnikov, Dmitry S.; Rodin, Alexander V.; Medvedev, Alexander S.; Fedorova, Anna A.; Kuroda, Takeshi; Hartogh, Paul
2018-02-01
We present a new implementation of the hydrological cycle scheme in a general circulation model of the Martian atmosphere. The model includes a semi-Lagrangian transport scheme for water vapor and ice and accounts for the microphysics of phase transitions between them. The hydrological scheme includes processes of saturation, nucleation, particle growth, sublimation, and sedimentation under the assumption of a variable size distribution. The scheme has been implemented in the Max Planck Institute Martian general circulation model and tested assuming monomodal and bimodal lognormal distributions of ice condensation nuclei. We present a comparison of the simulated annual variations, horizontal and vertical distributions of water vapor, and ice clouds with the available observations from instruments on board Mars orbiters. Accounting for the bimodality of the aerosol particle distribution improves the simulations of the annual hydrological cycle, including the predicted ice cloud mass, opacity, number density, and particle radii. The increased number density and lower nucleation rates bring the simulated cloud opacities closer to observations. Simulations show a weak effect of the excess of small aerosol particles on the simulated water vapor distributions.
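A bimodal lognormal assumption of the kind tested here is simply a weighted sum of two lognormal modes; a minimal sketch (the median radii, geometric standard deviations and weight are invented placeholders, not the model's parameters):

```python
from math import exp, log, pi, sqrt

def lognormal_pdf(r, r_med, sigma_g):
    """Lognormal size distribution with median r_med and geometric
    standard deviation sigma_g (> 1)."""
    s = log(sigma_g)
    return exp(-(log(r / r_med)) ** 2 / (2 * s * s)) / (r * s * sqrt(2 * pi))

def bimodal_pdf(r, w, mode1, mode2):
    """Weighted sum of two lognormal modes; mode1 and mode2 are
    (r_med, sigma_g) tuples and w is the weight of the first mode."""
    return w * lognormal_pdf(r, *mode1) + (1 - w) * lognormal_pdf(r, *mode2)
```

The peak of each mode sits at r_med * exp(-ln(sigma_g)^2), slightly below the median, which is the usual lognormal property.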
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
Probability distribution and entropy as a measure of uncertainty
International Nuclear Information System (INIS)
Wang Qiuping A
2008-01-01
The relationship between three probability distributions and their maximizable entropy forms is discussed without postulating entropy properties. For this purpose, the entropy I is defined as a measure of uncertainty of the probability distribution of a random variable x by the variational relationship $dI = d\bar{x} - \overline{dx}$, i.e., the difference between the differential of the mean and the mean of the differentials, a definition underlying the maximization of entropy for the corresponding distribution
Comparative Descriptive Statistics of Skewed Probability Distributions
National Research Council Canada - National Science Library
Fewell, M
2004-01-01
...). Increasingly, OA studies involve simulations of varying levels of sophistication. A feature of all simulations is the use of random variables, and this immediately raises the question of what distribution to employ...
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2017-01-01
Subjective beliefs are elicited routinely in economics experiments. However, such elicitation often suffers from two possible disadvantages. First, beliefs are recovered in the form of a summary statistic, usually the mean, of the underlying latent distribution. Second, recovered beliefs are biased significantly due to risk aversion. We characterize an approach for eliciting the entire subjective belief distribution that is minimally biased due to risk aversion. We offer simulated examples to demonstrate the intuition of our approach. We also provide theory to formally characterize our framework. And we provide experimental evidence which corroborates our theoretical results. We conclude that for empirically plausible levels of risk aversion, one can reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes providing one ...
Bimodal distribution of risk for childhood obesity in urban Baja California, Mexico.
Wojcicki, Janet M; Jimenez-Cruz, Arturo; Bacardi-Gascon, Montserrat; Schwartz, Norah; Heyman, Melvin B
2012-08-01
In Mexico, higher socioeconomic status (SES) has been found to be associated with increased risk for obesity in children. Within developed urban areas, however, there may be increased risk among lower SES children. Students in grades 4-6 from five public schools in Tijuana and Tecate, Mexico, were interviewed and weight, height and waist circumference (WC) measurements were taken. Interviews consisted of questions on food frequency, food insecurity, acculturation, physical activity and lifestyle practices. Multivariate logistic models were used to assess risk factors for obesity (having a body mass index [BMI] ≥95th percentile) and abdominal obesity (a WC >90th percentile) using Stata 11.0. Five hundred and ninety students were enrolled; 43.7% were overweight or obese, 24.3% were obese and 20.2% had abdominal obesity. Independent risk factors for obesity included watching TV in English (odds ratio [OR] 1.60, 95% confidence interval [CI] 1.06-2.41) and perceived child food insecurity (OR 1.57, 95% CI 1.05-2.36). Decreased risk for obesity was associated with female sex (OR 0.64, 95% CI 0.43-0.96), as was regular multivitamin use (OR 0.63, 95% CI 0.42-0.94). Risk of obesity was also decreased with increased taco consumption (≥1×/week; OR 0.64, 95% CI 0.43-0.96). Independent risk factors for abdominal obesity included playing video games ≥1×/week (OR 1.18, 95% CI 1.11-2.96) and older age group (10-11 years, OR 2.47, 95% CI 1.29-4.73 and ≥12 years, OR 2.21, 95% CI 1.09-4.49). Increased consumption of tacos was also associated with decreased risk for abdominal obesity (≥1×/week; OR 0.56, 95% CI 0.40-1.00). We found a bimodal distribution for risk of obesity and abdominal obesity in school-aged children on the Mexican border with the United States. Increased risk for obesity and abdominal obesity was associated with factors indicative of lower and higher SES, including watching TV in English, increased video game playing and perceived food insecurity
Most probable degree distribution at fixed structural entropy
Indian Academy of Sciences (India)
- works with given degree sequence. Here we derive the most probable degree distribution emerging when we distribute stubs (or half-edges) randomly through the nodes of the network by keeping fixed the structural entropy. This degree ...
Incorporating Skew into RMS Surface Roughness Probability Distribution
Stahl, Mark T.; Stahl, H. Philip.
2013-01-01
The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
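The abstract's central claim, that a symmetric Gaussian fit overestimates the mode of a right-skewed roughness distribution, can be illustrated with a short sketch. The skew-normal choice and all parameter values below are our assumptions, not taken from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic right-skewed "RMS roughness" sample (units and parameters illustrative)
roughness = stats.skewnorm.rvs(a=4.0, loc=1.0, scale=0.5, size=1000,
                               random_state=rng)

# Symmetric (Gaussian) treatment: the fitted mode equals the sample mean
mu, sigma = stats.norm.fit(roughness)

# Asymmetric (skew-normal) treatment: mode located numerically on the fitted pdf
a, loc, scale = stats.skewnorm.fit(roughness)
grid = np.linspace(roughness.min(), roughness.max(), 2000)
mode_skew = grid[np.argmax(stats.skewnorm.pdf(grid, a, loc=loc, scale=scale))]
# For right-skewed data the Gaussian "mode" (= mean) sits above the skewed mode
```

For a right-skewed sample, `mode_skew` falls below `mu`, which is the sense in which the Gaussian treatment overestimates the most probable roughness.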
Probability distributions in risk management operations
Artikis, Constantinos
2015-01-01
This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the developments, investigations, selections, and implementations of proactive and reactive risk management operations. The...
How to Read Probability Distributions as Statements about Process
Directory of Open Access Journals (Sweden)
Steven A. Frank
2014-11-01
Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken over the measurement scale that relates changes in observed values to changes in information, and the transformation from the underlying scale on which information dissipates to alternative scales on which probability pattern may be expressed. Information invariances set the commonly observed measurement scales and the relations between them. In particular, a measurement scale for information is defined by its invariance to specific transformations of underlying values into measurable outputs. Essentially all common distributions can be understood within this simple framework of information invariance and measurement scale.
Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas
2016-06-01
Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
Information-theoretic methods for estimating of complicated probability distributions
Zong, Zhi
2006-01-01
Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, and has led to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
Energy Technology Data Exchange (ETDEWEB)
Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)
2014-06-19
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented, and the two solution strategies are the fuzzy transformation via a ranking function and the stochastic transformation, where the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
Fitness Probability Distribution of Bit-Flip Mutation.
Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique
2015-01-01
Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
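For the Onemax case named in the abstract, the fitness distribution under bit-flip mutation can be computed directly by convolving the two binomial counts of flipped ones and flipped zeros. This brute-force sketch is not the paper's Krawtchouk-polynomial derivation, but it yields the same exact distribution, and each probability is visibly a polynomial in p:

```python
from math import comb

def onemax_fitness_dist(n, k, p):
    """Exact distribution of Onemax fitness after uniform bit-flip mutation.

    The current string has k ones among n bits; each bit flips independently
    with probability p.  New fitness = k - X + Y, where X ~ Binom(k, p)
    counts ones flipped to zero and Y ~ Binom(n - k, p) counts zeros
    flipped to one.  Returns {fitness: probability}.
    """
    pmf = lambda m, j: comb(m, j) * p**j * (1 - p)**(m - j)
    dist = {}
    for x in range(k + 1):            # ones flipped to zero
        for y in range(n - k + 1):    # zeros flipped to one
            f = k - x + y
            dist[f] = dist.get(f, 0.0) + pmf(k, x) * pmf(n - k, y)
    return dist

dist = onemax_fitness_dist(n=10, k=7, p=0.1)
```

Summing each term's binomial weights over the pairs (x, y) with k - x + y = f gives the probability of fitness f as a polynomial in p, matching the paper's claim.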
O'Connell, Neil E; Kamper, Steven J; Stevens, Matthew L; Li, Qiang
2017-08-01
The presence of bimodal outcome distributions has been used as a justification for conducting responder analyses, in addition to, or in place of, analyses of the mean between-group difference in clinical trials and systematic reviews of interventions for pain. The aim of this study was to investigate the distribution of participants' pain outcomes for evidence of bimodal distribution. We sourced data on participant outcomes from a convenience sample of 10 trials of nonsurgical interventions (exercise, manual therapy, medication) for spinal pain. We assessed normality using the Shapiro-Wilk test. When the Shapiro-Wilk test suggested non-normality, we inspected distribution plots visually and attempted to classify them. To test whether responder analyses detected a meaningful number of additional patients experiencing substantial improvements, we also calculated the risk difference and number needed to treat to benefit. We found no compelling evidence suggesting that outcomes were bimodally distributed for any of the intervention groups. Responder analysis would not meaningfully alter our interpretation of these data compared with the mean between-group difference. Our findings suggest that bimodal distribution of outcomes should not be assumed in interventions for spinal pain and do not support the automatic prioritization of responder analysis over the between-group difference in the evaluation of treatment effectiveness for pain. Secondary analysis of clinical trials of nonsurgical interventions for spinal pain found no evidence for bimodally distributed outcomes. The findings do not support the automatic prioritization of responder analyses over the average between-group difference in the evaluation of treatment effectiveness for spinal pain. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
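The two analyses the study compares, a Shapiro-Wilk normality check and a responder analysis via risk difference and number needed to treat, can be sketched on synthetic data. Arm sizes, effect sizes, and the responder threshold below are illustrative assumptions, not values from the trials:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical change-from-baseline pain scores for two trial arms
treat = rng.normal(loc=-2.0, scale=1.5, size=120)
control = rng.normal(loc=-1.0, scale=1.5, size=120)

# Shapiro-Wilk: a small p-value flags non-normality, which would then
# prompt visual inspection of the distribution plot (as in the study)
w_stat, p_value = stats.shapiro(treat)

# Responder analysis: "responder" = improvement of at least 2 points
responders_t = float(np.mean(treat <= -2.0))
responders_c = float(np.mean(control <= -2.0))
risk_diff = responders_t - responders_c
nnt = 1.0 / risk_diff   # number needed to treat to benefit
```

Comparing `risk_diff`/`nnt` against the mean between-group difference is the study's way of testing whether responder analysis adds interpretive value.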
Some explicit expressions for the probability distribution of force ...
Indian Academy of Sciences (India)
Some explicit expressions for the probability distribution of force magnitude. SARALEES NADARAJAH, School of Mathematics, University of Manchester ... Radeke et al (2004) give the joint force distribution (pdf) ... Goddard J D 2004 On entropy estimates of contact forces in static granular assemblies.
Optimal design of unit hydrographs using probability distribution and ...
Indian Academy of Sciences (India)
Keywords. Unit hydrograph; rainfall-runoff; hydrology; genetic algorithms; optimization; probability distribution. 1. Introduction. One of the most common interests of hydrologists is the estimation of direct runoff from a watershed for specified distribution of rainfall. This can be achieved either by a system or a physical approach ...
Some explicit expressions for the probability distribution of force ...
Indian Academy of Sciences (India)
Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the ...
Measuring Robustness of Timetables at Stations using a Probability Distribution
DEFF Research Database (Denmark)
Jensen, Lars Wittrup; Landex, Alex
infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time. ... This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay ...
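Under the exponential-delay assumption the record describes, the expected delay carried over to a following train has a closed form as a function of the buffer time, thanks to the memoryless property. A minimal sketch, with an illustrative rate value:

```python
import math

def expected_knock_on_delay(rate, buffer_time):
    """Expected delay passed to the following train when the leading
    train's delay D is Exp(rate) and the timetable provides buffer_time:
    E[max(D - b, 0)] = exp(-rate * b) / rate  (memoryless property)."""
    return math.exp(-rate * buffer_time) / rate

rate = 1.0 / 3.0   # illustrative: rate estimated as 1 / (mean delay of 3 min)
no_buffer = expected_knock_on_delay(rate, 0.0)    # equals the mean delay
with_buffer = expected_knock_on_delay(rate, 5.0)  # shrinks with buffer time
```

In the paper's setting the rate would be estimated from observed station delay data rather than chosen by hand.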
Probability distributions with truncated, log and bivariate extensions
Thomopoulos, Nick T
2018-01-01
This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...
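As an example of the truncated shapes the book tabulates, a right-truncated normal can be explored with `scipy.stats.truncnorm`. This is a generic sketch with illustrative parameter values, not tied to the book's tables:

```python
import numpy as np
from scipy import stats

# Right-truncated normal on (-inf, b]; scipy's truncnorm takes the
# truncation bounds in standard-deviation units of the parent normal
mu, sigma = 10.0, 2.0          # parent normal (illustrative values)
b = 12.0                       # upper truncation point
a_std, b_std = -np.inf, (b - mu) / sigma
rt_norm = stats.truncnorm(a_std, b_std, loc=mu, scale=sigma)

samples = rt_norm.rvs(size=1000, random_state=0)
# Cutting off the upper tail pulls the mean below the parent mean mu
```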
International Nuclear Information System (INIS)
Helton, J.C.
1996-01-01
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
Evidence for Truncated Exponential Probability Distribution of Earthquake Slip
Thingbaijam, Kiran K. S.
2016-07-13
Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
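Fitting the truncated exponential law to slip values by maximum likelihood can be sketched as follows. Synthetic data stand in for the SRCMOD slip models, and the bounded 1-D optimization is our implementation choice, not the paper's:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fit_truncated_exponential(x, x_max):
    """MLE for the rate of an exponential distribution truncated to
    [0, x_max]:  f(x) = lam * exp(-lam * x) / (1 - exp(-lam * x_max))."""
    n = len(x)
    s = float(np.sum(x))
    def nll(lam):   # negative log-likelihood
        return -(n * np.log(lam) - lam * s
                 - n * np.log1p(-np.exp(-lam * x_max)))
    return minimize_scalar(nll, bounds=(1e-6, 100.0), method="bounded").x

rng = np.random.default_rng(1)
lam_true, x_max = 0.8, 5.0          # illustrative rate and maximum slip
# Rejection-sample a truncated exponential to stand in for slip data
raw = rng.exponential(1.0 / lam_true, size=20000)
slip = raw[raw <= x_max][:5000]
lam_hat = fit_truncated_exponential(slip, x_max)
```

The recovered `lam_hat` should be close to the generating rate; in the paper the fitted parameters are then related to average coseismic slip.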
Comparative analysis through probability distributions of a data set
Cristea, Gabriel; Constantinescu, Dan Mihai
2018-02-01
In practice, probability distributions are applied in fields as diverse as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
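The workflow described, fitting several candidate distributions and ordering them by a goodness-of-fit statistic, can be sketched with scipy. Note that Kolmogorov-Smirnov p-values are biased when parameters are estimated from the same data, so this sketch ranks by the KS statistic only; the candidate set and data are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.gamma(shape=2.0, scale=3.0, size=500)   # synthetic sample

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm,
              "norm": stats.norm}
results = {}
for name, dist in candidates.items():
    params = dist.fit(data)                        # maximum-likelihood fit
    # KS distance between the empirical CDF and the fitted CDF;
    # p-values are not trustworthy after fitting, so rank by the statistic
    ks_stat, _ = stats.kstest(data, dist.cdf, args=params)
    results[name] = ks_stat

best = min(results, key=results.get)               # smallest distance wins
```

Anderson-Darling and Chi-Squared statistics could be computed in the same loop to cross-check the ranking, as the paper does.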
Assigning probability distributions to input parameters of performance assessment models
International Nuclear Information System (INIS)
Mishra, Srikanta
2002-02-01
This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically,three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available
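The "simple numerical approach" to the Bayes theorem mentioned above is not spelled out in the abstract; one standard grid-based sketch for updating a failure probability looks like this (prior, data, and grid are illustrative assumptions):

```python
import numpy as np

# Grid-based Bayes update of a failure probability theta, with a
# binomial likelihood (k failures in n trials) and a uniform prior
theta = np.linspace(0.001, 0.999, 999)             # parameter grid
prior = np.full_like(theta, 1.0 / len(theta))      # uniform prior weights

n, k = 20, 3                                       # illustrative data
likelihood = theta**k * (1.0 - theta)**(n - k)
posterior = prior * likelihood
posterior /= posterior.sum()                       # renormalize on the grid

posterior_mean = float((theta * posterior).sum())
# Analytic check: uniform prior + binomial likelihood gives a Beta
# posterior with mean (k + 1) / (n + 2)
```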
Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum
Directory of Open Access Journals (Sweden)
Farshid eSepehrband
2016-05-01
Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters, which has been consistently shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as location and scale of the mode and the profile of distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log normal, log logistic and Birnbaum-Saunders distributions.
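A comparison of the kind described, gamma versus generalized extreme value on right-skewed diameter data, can be sketched with maximum-likelihood fits ranked by AIC. Synthetic GEV-generated data stand in for the electron-microscopy measurements, so here the GEV winning is by construction; the point is the comparison machinery:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic right-skewed "diameters" standing in for EM measurements
diam = stats.genextreme.rvs(c=-0.2, loc=1.0, scale=0.3, size=2000,
                            random_state=rng)
diam = diam[diam > 0]

def aic(loglik, n_params):
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * loglik

p_gam = stats.gamma.fit(diam, floc=0)      # 2 free parameters (shape, scale)
p_gev = stats.genextreme.fit(diam)         # 3 free parameters
aic_gamma = aic(stats.gamma.logpdf(diam, *p_gam).sum(), 2)
aic_gev = aic(stats.genextreme.logpdf(diam, *p_gev).sum(), 3)
```

AIC penalizes the GEV's extra parameter, so a lower `aic_gev` reflects a genuinely better fit, which is the comparison the study reports on real data.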
International Nuclear Information System (INIS)
Santana, Steven Michael; Kirby, Brian J; Antonyak, Marc A; Cerione, Richard A
2014-01-01
Extracellular shed vesicles (ESVs) facilitate a unique mode of cell–cell communication wherein vesicle uptake can induce a change in the recipient cell's state. Despite the intensity of ESV research, currently reported data represent the bulk characterization of concentrated vesicle samples with little attention paid to heterogeneity. ESV populations likely represent diversity in mechanisms of formation, cargo and size. To better understand ESV subpopulations and the signaling cascades implicated in their formation, we characterize ESV size distributions to identify subpopulations in normal and cancerous epithelial cells. We have discovered that cancer cells exhibit bimodal ESV distributions, one small-diameter and another large-diameter population, suggesting that two mechanisms may govern ESV formation, an exosome population and a cancer-specific microvesicle population. Altered glutamine metabolism in cancer is thought to fuel cancer growth but may also support metastatic niche formation through microvesicle production. We describe the role of a glutaminase inhibitor, compound 968, in ESV production. We have discovered that inhibiting glutamine metabolism significantly impairs large-diameter microvesicle production in cancer cells. (paper)
Modeling highway travel time distribution with conditional probability models
Energy Technology Data Exchange (ETDEWEB)
Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)
2014-01-01
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
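The convolution step of the proposed method can be illustrated on two discrete link travel-time distributions. The values are illustrative, and this sketch assumes independent links; the paper's method additionally conditions on the upstream link to capture speed correlation:

```python
import numpy as np

# Discrete travel-time distributions (minutes) for two successive links
support1 = np.array([4, 5, 6])          # link 1 travel times
p1 = np.array([0.2, 0.5, 0.3])
support2 = np.array([7, 8, 9])          # link 2 travel times
p2 = np.array([0.3, 0.4, 0.3])

# Convolution of the two pmfs gives the distribution of the route total
p_route = np.convolve(p1, p2)
support_route = np.arange(support1[0] + support2[0],
                          support1[-1] + support2[-1] + 1)
```

The supports are consecutive integers, so the convolved probabilities align one-to-one with the totals in `support_route`, and the route mean equals the sum of the link means.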
Size effect on strength and lifetime probability distributions of ...
Indian Academy of Sciences (India)
Size effect on strength and lifetime probability distributions of quasibrittle structures. ZDENĚK P BAŽANT and JIA-LIANG LE. Department of Civil and Environmental Engineering, Northwestern University, 2145 Sheridan Road, CEE/A135, Evanston, Illinois 60208, USA; Department of Civil Engineering, University ...
Modelling the Skinner Thesis : Consequences of a Lognormal or a Bimodal Resource Base Distribution
Auping, W.L.
2014-01-01
The copper case is often used as an example in resource depletion studies. Despite these studies, several profound uncertainties remain in the system. One of these uncertainties is the distribution of copper grades in the lithosphere. The Skinner thesis promotes the idea that copper grades may be
Quantum Fourier transform, Heisenberg groups and quasi-probability distributions
International Nuclear Information System (INIS)
Patra, Manas K; Braunstein, Samuel L
2011-01-01
This paper aims to explore the inherent connection between Heisenberg groups, quantum Fourier transform (QFT) and (quasi-probability) distribution functions. Distribution functions for continuous and finite quantum systems are examined from three perspectives and all of them lead to Weyl-Gabor-Heisenberg groups. The QFT appears as the intertwining operator of two equivalent representations arising out of an automorphism of the group. Distribution functions correspond to certain distinguished sets in the group algebra. The marginal properties of a particular class of distribution functions (Wigner distributions) arise from a class of automorphisms of the group algebra of the Heisenberg group. We then study the reconstruction of the Wigner function from the marginal distributions via inverse Radon transform giving explicit formulae. We consider some applications of our approach to quantum information processing and quantum process tomography.
Huang, Xiaowei; Zhang, Yanling; Meng, Long; Qian, Ming; Wong, Kelvin; Abbott, Derek; Zheng, Rongqin; Zheng, Hairong; Niu, Lili
2017-03-01
Echolucent carotid plaques are associated with acute cardiovascular and cerebrovascular events (ACCEs) in atherosclerotic patients. The aim of this study was to develop a computer-aided method for identifying echolucent plaques. A total of 315 ultrasound images of carotid plaques (105 echo-rich, 105 intermediate and 105 echolucent) collected from 153 patients were included in this study. A bimodal gamma distribution was proposed to model the pixel statistics in the gray-scale images of plaques. The discrete Fréchet distance features (DFDFs) of each plaque were extracted based on the statistical model. The most discriminative features (MDFs) were obtained from the DFDFs by linear discriminant analysis, and a k-nearest-neighbor classifier was implemented for classification of the different types of plaques. The classification accuracy for the three types of plaques using MDFs reached 77.46%. When a receiver operating characteristic (ROC) curve was produced to identify echolucent plaques, the area under the curve was 0.831. Our results indicate the potential feasibility of the method for identifying echolucent plaques based on DFDFs. Our method may potentially improve the ability of noninvasive ultrasonic examination in risk prediction of ACCEs for patients with plaques.
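A two-component ("bimodal") gamma density of the kind used to model pixel statistics can be written down directly. The mixture weight and shape/scale values below are illustrative choices, not the paper's fitted parameters:

```python
import numpy as np
from scipy import stats

def bimodal_gamma_pdf(x, w, a1, scale1, a2, scale2):
    """Two-component gamma mixture density for gray-level statistics;
    one component per intensity mode (parameter values are illustrative)."""
    return (w * stats.gamma.pdf(x, a1, scale=scale1)
            + (1.0 - w) * stats.gamma.pdf(x, a2, scale=scale2))

x = np.linspace(0.01, 255.0, 2000)   # 8-bit gray-level range
pdf = bimodal_gamma_pdf(x, w=0.6, a1=4.0, scale1=8.0, a2=20.0, scale2=8.0)

# Trapezoidal check that the density integrates to ~1 on the gray range
area = float(((pdf[:-1] + pdf[1:]) / 2.0 * np.diff(x)).sum())
```

With these parameters the component modes sit near gray levels 24 and 152, giving the two well-separated peaks that motivate the bimodal model.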
Simulation of Daily Weather Data Using Theoretical Probability Distributions.
Bruhn, J. A.; Fry, W. E.; Fick, G. W.
1980-09-01
A computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight. In the weather model Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity and total solar radiation. Each weather variable is described by a known theoretical probability distribution but the values of the parameters describing each distribution are dependent on the occurrence of rainfall. Precipitation occurrence is described by a first-order Markov chain. The amount of rain, given that rain has occurred, is described by a gamma probability distribution. Maximum and minimum temperature are simulated with a trivariate normal probability distribution involving maximum temperature on the previous day, maximum temperature on the current day and minimum temperature on the current day. Parameter values for this distribution are dependent on the occurrence of rain on the previous day. Both minimum relative humidity and total solar radiation are assumed to be normally distributed. The values of the parameters describing the distribution of minimum relative humidity are dependent on rainfall occurrence on the previous day and current day. Parameter values for total solar radiation are dependent on the occurrence of rain on the current day. The assumptions made during model construction were found to be appropriate for actual weather data from Geneva, New York. The performance of the weather model was evaluated by comparing the cumulative frequency distributions of simulated weather data with the distributions of actual weather data from Geneva, New York and Fort Collins, Colorado. For each location, simulated weather data were similar to actual weather data in terms of mean response, variability and autocorrelation. The possible applications of this model when used with models of other components of the agro-ecosystem are discussed.
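The precipitation part of the model, a first-order Markov chain for wet/dry occurrence with gamma-distributed amounts on wet days, can be sketched as follows. Transition probabilities and gamma parameters are illustrative assumptions, not the fitted Geneva or Fort Collins values:

```python
import numpy as np

rng = np.random.default_rng(4)

# First-order Markov chain for wet/dry occurrence (illustrative values)
p_wet_given_dry, p_wet_given_wet = 0.25, 0.60
# Rain amount on wet days: gamma distribution (illustrative shape/scale, mm)
shape, scale = 0.8, 6.0

days = 365
wet = np.zeros(days, dtype=bool)
rain = np.zeros(days)
for t in range(1, days):
    # Occurrence depends only on yesterday's state (first-order chain)
    p_wet = p_wet_given_wet if wet[t - 1] else p_wet_given_dry
    wet[t] = rng.random() < p_wet
    if wet[t]:
        rain[t] = rng.gamma(shape, scale)   # amount, given rain occurred
```

In the full model the temperature, humidity and radiation parameters would then be selected conditionally on the `wet` sequence.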
Exact probability distribution function for the volatility of cumulative production
Zadourian, Rubina; Klümper, Andreas
2018-04-01
In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.
Stochastic self-propagating star formation with anisotropic probability distribution
Jungwiert, B.; Palous, J.
1994-07-01
We present a 2D computer code for stochastic self-propagating star formation (SSPSF) in differentially rotating galaxies. The isotropic probability distribution, used in previous models of Seiden, Gerola and Schulman (Seiden & Schulman, 1990, and references therein), is replaced by an anisotropic one. The motivation is provided by models of expanding large-scale supernova remnants (SNR) in disks with shear (Palous et al. 1990): the distortion of the SNR leads to uneven density distribution along its periphery and, consequently, to uneven distribution of new star forming sites. To model anisotropic SSPSF, we proceed in two steps: first, we eliminate artificial anisotropies inherent to the technique used by Seiden, Gerola and Schulman and, second, we define the probability ellipse on each star forming site. The anisotropy is characterized by its axes ratio and inclination with respect to the galactic center. We show that anisotropic SSPSF is able to produce highly organized spiral structures. Depending on the character of the probability ellipse, we can obtain continuous spiral arms of different length, thickness and pitch angle. The relation of the probability ellipse to rotation curves, interstellar medium (ISM) density and metallicity is discussed, as well as its variation along the Hubble sequence and van den Bergh's luminosity classification of galaxies. To demonstrate applications, we compare our results with two different classes of galaxies: M 101-type grand-design spirals with open and robust arms and NGC 2841-type flocculent galaxies with thin and tightly wound arms.
Geometry of q-Exponential Family of Probability Distributions
Directory of Open Access Journals (Sweden)
Shun-ichi Amari
2011-06-01
The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions obeying the power law rather than the standard Gibbs type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution or the q-exponential family by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and the conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori Probability) estimator.
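The deformed exponential underlying the q-exponential family above has a short closed form, exp_q(x) = [1 + (1 - q)x]^(1/(1-q)) with the usual cutoff at a non-positive base, recovering exp(x) as q → 1. A minimal sketch:

```python
import math

def exp_q(x, q):
    """q-exponential: [1 + (1-q)x]^(1/(1-q)); ordinary exp in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # standard cutoff convention for the q-exponential
    return base ** (1.0 / (1.0 - q))

# q = 2 gives a power-law (Lorentzian-like) tail instead of exponential decay
tail_q = exp_q(0.5, 2.0)          # [1 - 0.5]^(-1) = 2.0
tail_1 = exp_q(0.5, 1.0)          # ordinary exp(0.5)
```

For q > 1 the tail decays as a power law, which is exactly the feature that lets the q-Gibbs family capture the heavy-tailed distributions mentioned in the abstract.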
Outage probability of distributed beamforming with co-channel interference
Yang, Liang
2012-03-01
In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.
Epi-convergence almost surely, in probability and in distribution
Czech Academy of Sciences Publication Activity Database
Lachout, Petr
2006-01-01
Roč. 142, č. 1 (2006), s. 187-214 ISSN 0254-5330 R&D Projects: GA ČR GA201/03/1027 Institutional research plan: CEZ:AV0Z10750506 Keywords : epi-convergence of functions * random functions * convergence almost surely * convergence in probability * convergence in distribution Subject RIV: BA - General Mathematics Impact factor: 0.589, year: 2006
Partial Generalized Probability Weighted Moments for Exponentiated Exponential Distribution
Directory of Open Access Journals (Sweden)
Neema Mohamed El Haroun
2015-09-01
The generalized probability weighted moments are widely used in hydrology for estimating parameters of flood distributions from complete samples. The method of partial generalized probability weighted moments was used to estimate the parameters of distributions from censored samples. This article offers a new method, called partial generalized probability weighted moments (PGPWMs), for the analysis of censored data; it is an extended class of partial generalized probability weighted moments. To illustrate the new method, estimation of the unknown parameters of the exponentiated exponential distribution based on a doubly censored sample is considered. PGPWM estimators for right and left censored samples are obtained as special cases. A simulation study is conducted to investigate the performance of the estimates for the exponentiated exponential distribution. Comparison between estimators is made through simulation via their biases and mean square errors. An illustration with real data is provided.
Estimating probable flaw distributions in PWR steam generator tubes
International Nuclear Information System (INIS)
Gorman, J.A.; Turner, A.P.L.
1997-01-01
This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.
Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence
International Nuclear Information System (INIS)
Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del
2009-01-01
Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence are presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in the plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics of variabilities have also been observed in other physical systems characterized by convection, such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasmas and the sea surface temperature fluctuations.
Yang, Haesang; Seong, Woojae
2018-02-01
Compressional wave speed and attenuation were measured for water-saturated granular media employing five kinds of glass beads having unimodal and bimodal grain size distributions. Glass beads with grain sizes ranging from 250 to 850 μm were used for the acoustic measurements at a frequency range from 350 kHz to 1.1 MHz, which includes the transition range where scattering and non-scattering losses co-exist. The compressional wave speed and attenuation data are presented as a function of frequency and grain size distribution. The compressional wave speed and attenuation data show a variety of frequency dependencies for varying grain size distribution. The observed acoustic properties are investigated for the volume ratio of larger and smaller sized grains in the mixed bimodal media. Also, the measured results are compared with the empirical multiple scattering formula as a function of Rayleigh parameter kd (product of wavenumber in the water k and mean grain diameter of the glass beads d) using weighted mean grain size. The measured results are also discussed, focusing on the geophysical difference between unimodal and bimodal mixed grains.
Tregloan-Reed, J.; Southworth, J.; Mancini, L.; Mollière, P.; Ciceri, S.; Bruni, I.; Ricci, D.; Ayala-Loera, C.; Henning, T.
2018-03-01
We present high-precision photometry of eight separate transit events in the HAT-P-32 planetary system. One transit event was observed simultaneously by two telescopes of which one obtained a simultaneous multiband light curve in three optical bands, giving a total of 11 transit light curves. Due to the filter selection and in conjunction with using the defocused photometry technique, we were able to obtain an extremely high-precision, ground-based transit in the u band (350 nm), with an rms scatter of ≈1 mmag. All 11 transits were modelled using PRISM and GEMC, and the physical properties of the system calculated. We find the mass and radius of the host star to be 1.182 ± 0.041 M⊙ and 1.225 ± 0.015 R⊙, respectively. For the planet, we find a mass of 0.80 ± 0.14 MJup, a radius of 1.807 ± 0.022 RJup, and a density of 0.126 ± 0.023 ρJup. These values are consistent with those found in the literature. We also obtain a new orbital ephemeris for the system T0 = BJD/TDB 2 454 420.447187(96) + 2.15000800(10) × E. We measured the transmission spectrum of HAT-P-32 A b and compared it to theoretical transmission spectra. Our results indicate a bimodal cloud particle distribution consisting of Rayleigh-like haze and grey absorbing cloud particles within the atmosphere of HAT-P-32 A b.
Changes of Probability Distributions in Tsunami Heights with Fault Parameters
Kim, Kwan-Hyuck; Kwon, Hyun-Han; Park, Yong Sung; Cho, Yong-Sik
2017-04-01
This study explored the changes of the probability distribution of tsunami heights along the eastern coastline of Korea for virtual earthquakes. The results confirmed that the probability distribution of tsunami heights changes with the tsunami fault parameters. A statistical model was developed to jointly analyse tsunami heights over a variety of events by modelling the functional relationships: the parameters of a Weibull distribution could be estimated from earthquake characteristics, all within a Bayesian regression framework. The proposed model could be effective and informative for estimating tsunami risk from an earthquake of a given magnitude at a particular location. Indeed, the coefficients of determination between the true and estimated Weibull distribution parameters were over 90% for both virtual and historical tsunamis. Keywords: Tsunami heights, Bayesian model, Regression analysis, Risk analysis Acknowledgements This research was supported by a grant from Study on Solitary Wave Run-up for Hazard Mitigation of Coastal Communities against Sea Level Rise Project [No. 20140437] funded by Korea Institute of Marine Science and Technology promotion.
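The statistical building block of the abstract above is fitting a two-parameter Weibull distribution to tsunami heights. The paper embeds this in a Bayesian regression on fault parameters; the sketch below shows only the plain maximum-likelihood step, on synthetic data, solving the standard profile-likelihood equation for the shape by bisection (a stdlib-only sketch, not the authors' code):

```python
import math, random

def weibull_mle(x, k_lo=0.05, k_hi=20.0, iters=80):
    """MLE for Weibull(shape k, scale lam) from positive samples x."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):
        # derivative condition of the profile log-likelihood in k (monotone in k)
        s1 = sum(v ** k for v in x)
        s2 = sum((v ** k) * math.log(v) for v in x)
        return s2 / s1 - 1.0 / k - mean_log

    for _ in range(iters):          # bisection on the shape parameter
        mid = 0.5 * (k_lo + k_hi)
        if g(mid) > 0:
            k_hi = mid
        else:
            k_lo = mid
    k = 0.5 * (k_lo + k_hi)
    scale = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, scale

# Synthetic "heights" drawn by inverse transform from Weibull(k=2, scale=3)
rng = random.Random(1)
sample = [3.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(5000)]
k_hat, scale_hat = weibull_mle(sample)
```

With these estimates in hand, the paper's regression step amounts to letting k and the scale vary with fault parameters instead of being constants.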
Topologically appropriate coordinates for (Vzz, η) joint probability distributions
International Nuclear Information System (INIS)
Evenson, William E.; Adams, M.; Bunker, Austin; Hodges, Jeffery A.; Matheson, P. L.; Park, Tyler; Stufflebeam, Michael; Sullivan, Francis P.; Zacate, M. O.
2016-01-01
Inhomogeneous broadening (IHB) of hyperfine interactions in materials arises from a distribution of electric field gradients (EFGs) due to randomly distributed defects contributing non-uniformly to the EFG at probe sites. Hyperfine experiments reflect the inhomogeneous distribution of defects through the joint probability distribution function (PDF) of Vzz and η determined by the defect concentration, crystal structure, and defect sites in the crystal. Czjzek showed how to choose coordinates in the (Vzz, η) plane that are consistent with the physical constraints and ordering convention for these EFG parameters. Here we show how to transform to a new set of coordinates that decreases the distortion inherent in Czjzek's representation. These new coordinates allow one to express the joint PDF for random distributions of defects in a form reasonably approximated by the product of two independent marginal distributions. This paper focuses on these topologically appropriate coordinates, with simple examples drawn from Czjzek's work and from our simulations of point defects in cubic lattices as well as random amorphous distributions of defects. Detailed simulations have been carried out for IHB in cubic structures and point charge models relevant to perturbed angular correlation (PAC) experiments.
Methods for fitting a parametric probability distribution to most probable number data.
Williams, Michael S; Ebel, Eric D
2012-07-02
Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
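The MPN likelihood underlying the abstract above is standard: a tube inoculated with volume v is positive with probability 1 - exp(-c·v) for concentration c, so the counts of positive tubes across dilutions define a binomial log-likelihood in c. The sketch below maximizes it by ternary search (the log-likelihood is concave in c); the 3-tube, 10/1/0.1 mL design is the classic table layout, used here only as an illustration:

```python
import math

def mpn_loglik(c, volumes, n_tubes, n_pos):
    """Binomial log-likelihood of concentration c for a dilution series."""
    ll = 0.0
    for v, n, p in zip(volumes, n_tubes, n_pos):
        prob = 1.0 - math.exp(-c * v)      # P(tube positive)
        if p > 0:
            ll += p * math.log(prob)
        if n - p > 0:
            ll += (n - p) * (-c * v)       # log P(tube negative) = -c*v
    return ll

def mpn_estimate(volumes, n_tubes, n_pos, c_lo=1e-4, c_hi=1e4, iters=200):
    """Ternary search for the maximizer of the concave log-likelihood."""
    for _ in range(iters):
        m1 = c_lo + (c_hi - c_lo) / 3.0
        m2 = c_hi - (c_hi - c_lo) / 3.0
        if mpn_loglik(m1, volumes, n_tubes, n_pos) < mpn_loglik(m2, volumes, n_tubes, n_pos):
            c_lo = m1
        else:
            c_hi = m2
    return 0.5 * (c_lo + c_hi)

# Classic 3-dilution design: 10, 1, 0.1 mL inocula, 3 tubes each, pattern 3-2-0
c_hat = mpn_estimate([10.0, 1.0, 0.1], [3, 3, 3], [3, 2, 0])
```

The resulting point estimate (about 0.93 organisms/mL for the 3-2-0 pattern) is the real-valued MPN datum that the paper's two fitting methods then treat either directly or via the underlying tube counts.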
Log-concave Probability Distributions: Theory and Statistical Testing
DEFF Research Database (Denmark)
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacing of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics…
van Rijssel, Jozef; Kuipers, Bonny W M; Erne, Ben
2015-01-01
High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal
Characterizing single-molecule FRET dynamics with probability distribution analysis.
Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N
2010-07-12
Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.
Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification
Directory of Open Access Journals (Sweden)
Huiwu Luo
2015-01-01
The computational procedure for hyperspectral images (HSI) is extremely complex, not only due to the high dimensional information, but also due to the highly correlated data structure; effective processing and analysis of HSI therefore face many difficulties. Dimensionality reduction has been found to be a powerful tool for high dimensional data analysis. Local Fisher's linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in low dimensional space. Experimental results on Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.
Joint Probability Distributions for a Class of Non-Markovian Processes
Baule, A.; Friedrich, R.
2004-01-01
We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H.C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single time probability distributions to the case of N-time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.
Günal, Gülçin; Kip, Çiğdem; Eda Öğüt, S; İlhan, Hasan; Kibar, Güneş; Tuncel, Ali
2018-02-01
Monodisperse silica microspheres with bimodal pore-size distribution were proposed as a high performance sorbent for DNA isolation in batch fashion under equilibrium conditions. The proposed sorbent, including both macroporous and mesoporous compartments, was synthesized 5.1 μm in size by a "staged shape templated hydrolysis and condensation method". Hydrophilic polymer based sorbents were also obtained in the form of monodisperse-macroporous microspheres ca. 5.5 μm in size, with different functionalities, by a developed "multi-stage microsuspension copolymerization" technique. The batch DNA isolation performance of the proposed material was comparatively investigated using polymer based sorbents with similar morphologies. Among all sorbents tried, the best DNA isolation performance was achieved with the monodisperse silica microspheres with bimodal pore size distribution. The collocation of interconnected mesoporous and macroporous compartments within the monodisperse silica microspheres provided a high surface area, reduced the intraparticular mass transfer resistance and facilitated both the adsorption and desorption of DNA. Among the polymer based sorbents, higher DNA isolation yields were achieved with the monodisperse-macroporous polymer microspheres carrying trimethoxysilyl and quaternary ammonium functionalities. However, batch DNA isolation performances of the polymer based sorbents were significantly lower than those of the silica microspheres.
Multiscale probability distribution of pressure fluctuations in fluidized beds
International Nuclear Information System (INIS)
Ghasemi, Fatemeh; Sahimi, Muhammad; Reza Rahimi Tabar, M; Peinke, Joachim
2012-01-01
Analysis of flow in fluidized beds, a common chemical reactor, is of much current interest due to its fundamental as well as industrial importance. Experimental data for the successive increments of the pressure fluctuations time series in a fluidized bed are analyzed by computing a multiscale probability density function (PDF) of the increments. The results demonstrate the evolution of the shape of the PDF from the short to long time scales. The deformation of the PDF across time scales may be modeled by the log-normal cascade model. The results are also in contrast to the previously proposed PDFs for the pressure fluctuations that include a Gaussian distribution and a PDF with a power-law tail. To understand better the properties of the pressure fluctuations, we also construct the shuffled and surrogate time series for the data and analyze them with the same method. It turns out that long-range correlations play an important role in the structure of the time series that represent the pressure fluctuation. (paper)
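The analysis procedure described above is generic: form the increments x(t+τ) - x(t) of the time series at several scales τ and track how the shape of their PDF deforms with τ. The sketch below uses the kurtosis as a simple shape proxy on a synthetic Gaussian random walk, for which the increment PDF stays Gaussian (kurtosis ≈ 3) at every scale; the interesting pressure-fluctuation data in the paper would instead show a scale-dependent shape:

```python
import random

def increment_kurtosis(x, tau):
    """Kurtosis m4/var^2 of the lag-tau increments of series x (3 for Gaussian)."""
    inc = [x[i + tau] - x[i] for i in range(len(x) - tau)]
    n = len(inc)
    mean = sum(inc) / n
    var = sum((v - mean) ** 2 for v in inc) / n
    m4 = sum((v - mean) ** 4 for v in inc) / n
    return m4 / var ** 2

# Synthetic benchmark: a Gaussian random walk (trivially scale-invariant PDF shape)
rng = random.Random(42)
walk = [0.0]
for _ in range(20000):
    walk.append(walk[-1] + rng.gauss(0.0, 1.0))

kurt = {tau: increment_kurtosis(walk, tau) for tau in (1, 16, 256)}
```

Applied to real pressure fluctuations, a kurtosis that decays toward 3 with increasing τ is the signature of the PDF deformation that the log-normal cascade model is fitted to.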
Directory of Open Access Journals (Sweden)
Alufelwi M. Tshavhungwe
2010-07-01
Mesoporous organosilica materials containing ethane groups in their framework were formed with two and three pore sizes (i.e. bimodal and trimodal pores) when synthesised by the sol-gel method in the presence of cobalt ions. The compounds 1,2-bistrimethoxysilylethane and tetraethylorthosilicate were used as silicon sources and the reactions were done in the presence of a surfactant, which served as a template. Diffuse reflectance infrared Fourier transform spectroscopy revealed that organic functional groups were incorporated into the ethanesilica. Powder X-ray diffraction and nitrogen adsorption data indicated that the mesophase and textural properties (surface area, pore volume, pore diameter) of the materials were dependent on the ageing temperature, the amount/ratio of silica precursors and cobalt ion incorporation. Secondary mesopores were drastically reduced by changing the ratio of silicon precursors.
Letzel, Alexander; Gökce, Bilal; Menzel, Andreas; Plech, Anton; Barcikowski, Stephan
2018-03-01
For a known material, the size distribution of a nanoparticle colloid is a crucial parameter that defines its properties. However, measured size distributions are not easy to interpret as one has to consider weighting (e.g. by light absorption, scattering intensity, volume, surface, number) and the way size information was gained. The radius of a suspended nanoparticle can be given as e.g. sphere equivalent, hydrodynamic, Feret or radius of gyration. In this study, gold nanoparticles in water are synthesized by pulsed-laser ablation (LAL) and fragmentation (LFL) in liquids and characterized by various techniques (scanning transmission electron microscopy (STEM), small-angle X-ray scattering (SAXS), analytical disc centrifugation (ADC), dynamic light scattering (DLS) and UV-vis spectroscopy with Mie-Gans theory) to study the comparability of different analytical techniques and determine the method that is preferable for a given task related to laser-generated nanoparticles. In particular, laser-generated colloids are known to be bimodal and/or polydisperse, but bimodality is sometimes not analytically resolved in the literature. In addition, frequently reported small size shifts of the primary particle mode around 10 nm need evaluation of their statistical significance related to the analytical method. Closely related to earlier studies on SAXS, different colloids in defined proportions are mixed and their size as a function of the nominal mixing ratio is analyzed. It is found that the derived particle size is independent of the nominal mixing ratio if the colloid size fractions do not overlap considerably. Conversely, the obtained size for colloids with overlapping size fractions strongly depends on the nominal mixing ratio since most methods cannot distinguish between such fractions. Overall, SAXS and ADC are very accurate methods for particle size analysis. Further, the ability of different methods to determine the nominal mixing ratio of size fractions is studied.
Probability Distributions of Minkowski Distances between Discrete Random Variables.
Schroger, Erich; And Others
1993-01-01
Minkowski distances are used to indicate similarity of two vectors in an N-dimensional space. It is shown how to compute the probability function, the expectation, and the variance for Minkowski distances, including the special cases of city-block distance and Euclidean distance. Critical values for tests of significance are presented in tables. (SLD)
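For small discrete distributions, the exact probability function of the Minkowski distance described above can be obtained by brute-force enumeration of all vector pairs. A minimal sketch with a toy distribution (fair binary coordinates, city-block distance p = 1, two dimensions), not one of the paper's tabulated cases:

```python
from itertools import product
from fractions import Fraction

def minkowski(u, v, p):
    """Minkowski distance of order p between vectors u and v."""
    return sum(abs(a - b) ** p for a, b in zip(u, v)) ** (1.0 / p)

def distance_distribution(support, probs, p, dim):
    """Exact PDF of the distance between two i.i.d. dim-dimensional vectors
    whose coordinates are independent draws from (support, probs)."""
    coord = list(zip(support, probs))
    dist = {}
    for xs in product(coord, repeat=dim):
        for ys in product(coord, repeat=dim):
            pr = Fraction(1)
            for _, xp in xs:
                pr *= xp
            for _, yp in ys:
                pr *= yp
            d = round(minkowski([c for c, _ in xs], [c for c, _ in ys], p), 10)
            dist[d] = dist.get(d, Fraction(0)) + pr
    return dist

# Fair binary coordinate, city-block distance (p = 1), 2 dimensions
dist = distance_distribution([0, 1], [Fraction(1, 2), Fraction(1, 2)], 1, 2)
```

Here the city-block distance counts differing coordinates, so the exact distribution is P(0) = 1/4, P(1) = 1/2, P(2) = 1/4, from which expectation and variance follow directly.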
Most probable degree distribution at fixed structural entropy
Indian Academy of Sciences (India)
This result indicates that scale-free degree distributions emerge naturally when considering network ensembles with small structural entropy. The appearance of the power-law degree distribution reflects the tendency of social, technological and especially biological networks toward 'ordering'. This tendency is at work ...
Some explicit expressions for the probability distribution of force ...
Indian Academy of Sciences (India)
Modified Stieltjes Transform and Generalized Convolutions of Probability Distributions
Directory of Open Access Journals (Sweden)
Lev B. Klebanov
2018-01-01
The classical Stieltjes transform is modified in such a way as to generalize both Stieltjes and Fourier transforms. This transform allows the introduction of new classes of commutative and non-commutative generalized convolutions. A particular case of such a convolution for degenerate distributions appears to be the Wigner semicircle distribution.
Joint probability distributions for a class of non-Markovian processes.
Baule, A; Friedrich, R
2005-02-01
We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H. C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single-time probability distributions to the case of N -time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Tonttila, J.; Romakkaniemi, S.; Räisänen, P.; Kokkola, H.; Järvinen, H.
2012-04-01
Off-line calculations of cloud activation of aerosols using a probability density function (PDF) for vertical velocity (w) are performed. The focus is on the variation of the shape of the PDF using two functional formulations: the Normal distribution PDF and the Pearson type IV PDF. The Normal distribution provides a familiar example, as it has been widely used to approximate vertical velocity distributions in numerous applications, including climate models. Pearson type IV distribution provides an alternative that, to our knowledge, has not been employed before to describe the vertical velocity PDF. The advantage of the Pearson distribution is its versatility in representing skewed and more peaked distribution shapes compared to the Normal distribution, though this is obtained at the expense of increased mathematical complexity. The experiments are performed using a box model, in which the environmental conditions, including the aerosol size distribution (bi-modal) and chemical composition (ammonium-sulphate particles) are prescribed as constants. Measured size distributions comprising clean and polluted cases are used. Cloud activation of aerosols is calculated by integrating over the positive side of the PDF of w, which yields the mean number of activated particles (Nact). The mean, variance, and skewness of the PDFs along with the type of the PDF itself are altered in order to explore the effect of the PDF shape on the activation process. All experiments are repeated for three well-documented activation parameterizations: Lin & Leaitch, Abdul-Razzak & Ghan and Fountoukis & Nenes. The results show that for symmetric distributions of w (skewness = 0) there is a maximum difference of 10-15 % in Nact between the cases with w given by the Normal distribution, and the more peaked Pearson distribution. The largest differences are seen for the most polluted cases. Nact in clean cases will saturate rather quickly with respect to the maximum supersaturation and, hence
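The averaging step described in the abstract above, integrating an activation function over the positive side of the vertical-velocity PDF to get the mean activated number, can be sketched directly. The Gaussian PDF below is the paper's Normal-distribution case; the saturating activation curve n_act(w) is a toy placeholder, not one of the cited parameterizations (Lin & Leaitch, Abdul-Razzak & Ghan, Fountoukis & Nenes):

```python
import math

def normal_pdf(w, mu, sigma):
    """Gaussian PDF of vertical velocity w."""
    return math.exp(-0.5 * ((w - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def mean_activated(mu, sigma, n_total=1000.0, w_half=0.5, dw=1e-3, w_max=10.0):
    """N_act = integral over w > 0 of f(w) * n_act(w) dw (midpoint rule).
    n_act is a TOY saturating activation curve, for illustration only."""
    n_act = lambda w: n_total * w / (w + w_half)
    total = 0.0
    w = dw / 2.0
    while w < w_max:
        total += normal_pdf(w, mu, sigma) * n_act(w) * dw
        w += dw
    return total

n_wide = mean_activated(mu=0.0, sigma=1.0)    # broad updraft PDF
n_narrow = mean_activated(mu=0.0, sigma=0.3)  # peaked updraft PDF
```

Even with this toy curve, the broader PDF activates more particles because it puts more weight on strong updrafts, which is the sensitivity to PDF shape (Normal vs. Pearson type IV) that the paper quantifies.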
Probability Distribution Function of the Upper Equatorial Pacific Current Speeds
National Research Council Canada - National Science Library
Chu, Peter C
2005-01-01
...), constructed from hourly ADCP data (1990-2007) at six stations for the Tropical Atmosphere Ocean project satisfies the two-parameter Weibull distribution reasonably well with different characteristics between El Nino and La Nina events...
Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions
DEFF Research Database (Denmark)
Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette
2016-01-01
We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by asse...... and the randomness of the generation process, sampling may fail to generate a satisfactory puzzle. To avoid failure we employ a strategy using adaptive probabilities which change in response to previous steps of generative process, thus minimizing the risk of failure....
Supervised learning of probability distributions by neural networks
Baum, Eric B.; Wilczek, Frank
1988-01-01
Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
Percentile estimation using the normal and lognormal probability distribution
International Nuclear Information System (INIS)
Bement, T.R.
1980-01-01
Implicitly or explicitly, percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution.
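A minimal Monte Carlo illustration of consequence (1), assuming normality when the data are really lognormal, for a 97.5th percentile; the distribution parameters are illustrative, not values from the survey data:

```python
import numpy as np

rng = np.random.default_rng(0)
# True population: lognormal with log-mean 0 and log-sd 1.
# Its true 97.5th percentile is exp(1.96 * sigma) = exp(1.96).
true_p975 = np.exp(1.96)

sample = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)

# (1) Wrongly assume normality: mean + 1.96 * sd on the raw scale.
est_normal = sample.mean() + 1.96 * sample.std()

# (2) Correctly assume lognormality: estimate on the log scale, then
# back-transform.
logs = np.log(sample)
est_lognormal = np.exp(logs.mean() + 1.96 * logs.std())
```

The normal-assumption estimate systematically misses the heavy right tail, which is the kind of deviation the paper's Monte Carlo study quantifies.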
International Nuclear Information System (INIS)
Birchall, A.; Muirhead, C.R.; James, A.C.
1988-01-01
An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)
Study on probability distribution of fire scenarios in risk assessment to emergency evacuation
International Nuclear Information System (INIS)
Chu Guanquan; Wang Jinhui
2012-01-01
Event tree analysis (ETA) is a frequently used technique to analyze the probability of probable fire scenarios. The event probability is usually characterized by a definite value, which is not appropriate because such estimates may be the result of poor-quality statistics and limited knowledge. Without addressing uncertainties, ETA will give imprecise results and the credibility of the risk assessment will be undermined. This paper presents an approach to address event probability uncertainties and to analyze the probability distribution of probable fire scenarios. ETA is performed to construct probable fire scenarios, and the activation time of every event is characterized as a stochastic variable by considering uncertainties in the fire growth rate and other input variables. To obtain the probability distribution of a probable fire scenario, a Markov chain is combined with ETA. A case study is presented to demonstrate the approach.
Size effect on strength and lifetime probability distributions of ...
Indian Academy of Sciences (India)
The safety factors required to ensure it are still determined empirically, even though they represent much larger and much more uncertain corrections to ... The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.
Cosmological constraints from the convergence 1-point probability distribution
Energy Technology Data Exchange (ETDEWEB)
Patton, Kenneth [The Ohio State Univ., Columbus, OH (United States); Blazek, Jonathan [The Ohio State Univ., Columbus, OH (United States); Ecole Polytechnique Federale de Lausanne (EPFL), Versoix (Switzerland); Honscheid, Klaus [The Ohio State Univ., Columbus, OH (United States); Huff, Eric [The Ohio State Univ., Columbus, OH (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Melchior, Peter [Princeton Univ., Princeton, NJ (United States); Ross, Ashley J. [The Ohio State Univ., Columbus, OH (United States); Suchyta, Eric D. [The Ohio State Univ., Columbus, OH (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2017-06-29
Here, we examine the cosmological information available from the 1-point probability density function (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the Ωm–σ8 plane from the convergence PDF with 188 arcmin^2 pixels compared to the cosmic shear power spectrum with an equivalent number of modes (ℓ < 886). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of 2–3, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.
Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf
2016-04-01
In this study we examined and compared three different probability distribution methods for determining the most suitable model in the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distribution methods in this region.
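The fit-then-test workflow described above can be sketched with SciPy on synthetic inter-event times; the shape and scale below are invented for illustration, not values estimated from the catalogue:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic inter-event times (years), standing in for the M >= 6.0 catalogue.
times = stats.weibull_min.rvs(1.3, scale=8.0, size=200, random_state=rng)

# Fit a two-parameter Weibull (location fixed at 0), then assess the fit
# with the Kolmogorov-Smirnov goodness-of-fit test, as in the study.
shape, loc, scale = stats.weibull_min.fit(times, floc=0)
ks_stat, p_value = stats.kstest(times, "weibull_min", args=(shape, loc, scale))
```

A large K-S p-value (no evidence against the fitted model) is the criterion by which one candidate distribution would be preferred over another.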
International Nuclear Information System (INIS)
Difilippo, F.C.
1994-01-01
For a mixture of two families of resonances, we found the probability distribution for the distance, as first neighbors, between resonances that belong to different families. Integration of this distribution gives the probability of accidental overlapping of resonances of one isotope by resonances of the other, provided that the resonances of each isotope belong to a single family. (author)
Calculation of ruin probabilities for a dense class of heavy tailed distributions
DEFF Research Database (Denmark)
Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady
2015-01-01
of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite...... any such distribution. We prove that formulas from renewal theory, and with a particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest...... such as the renewal density and the ruin probability. It might be of interest to approximate a given heavy tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions and to this end we provide a calibration procedure which works for the approximation...
Garboś, Sławomir; Święcicka, Dorota
2015-11-01
The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples collected in three runs from the water supply zone in Wroclaw (Poland) cannot be simply described by normal or log-normal distributions. Therefore, a numerical method designed for the detection and calculation of bimodal distributions was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (n_I = 72) and from the summer season of 2013 (n_II = 34), allowed the mean U concentrations in drinking water to be estimated: 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency of uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources - migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake. Copyright © 2015 Elsevier Ltd. All rights reserved.
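The paper's numerical bimodality-detection method is not specified in the abstract; a common stand-in is a two-component Gaussian mixture fitted by expectation-maximization (EM), sketched here on synthetic concentrations generated around the reported seasonal means:

```python
import numpy as np
from scipy import stats

def fit_two_normals(x, n_iter=200):
    """Simple EM for a two-component 1-D Gaussian mixture."""
    x = np.asarray(x, float)
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        dens = w * stats.norm.pdf(x[:, None], mu, sd)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and standard deviations.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

rng = np.random.default_rng(1)
# Two seasons with different mean U concentrations (μg/L); sample sizes and
# means follow the abstract, the 0.10 μg/L spread is an assumption.
conc = np.concatenate([rng.normal(0.95, 0.10, 72), rng.normal(1.23, 0.10, 34)])
weights, means, sds = fit_two_normals(conc)
```

The two fitted component means play the role of the "central values of the extracted normal distributions" used for the intake estimates.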
Approximate solution for the reactor neutron probability distribution
International Nuclear Information System (INIS)
Ruby, L.; McSwine, T.L.
1985-01-01
Several authors have studied the Kolmogorov equation for a fission-driven chain-reacting system, written in terms of the generating function G(x,y,z,t) where x, y, and z are dummy variables referring to the neutron, delayed neutron precursor, and detector-count populations, n, m, and c, respectively. Pal and Zolotukhin and Mogil'ner have shown that if delayed neutrons are neglected, the solution is approximately negative binomial for the neutron population. Wang and Ruby have shown that if the detector effect is neglected, the solution, including the effect of delayed neutrons, is approximately negative binomial. All of the authors assumed prompt-neutron emission not exceeding two neutrons per fission. An approximate method of separating the detector effect from the statistics of the neutron and precursor populations has been proposed by Ruby. In this weak-coupling limit, it is assumed that G(x,y,z,t) = H(x,y)I(z,t). Substitution of this assumption into the Kolmogorov equation separates the latter into two equations, one for H(x,y) and the other for I(z,t). Solution of the latter then gives a generating function, which indicates that in the weak-coupling limit, the detector counts are Poisson distributed. Ruby also showed that if the detector effect is neglected in the equation for H(x,y), i.e., the detector efficiency is set to zero, then the resulting equation is identical with that considered by Wang and Ruby. The authors present here an approximate solution for H(x,y) that does not set the detector efficiency to zero
Directory of Open Access Journals (Sweden)
Mahdieh Shakoori Oskooie
2016-12-01
Full Text Available The bimodal microstructures of Al6063 consisting of 15, 30, and 45 vol.% coarse-grained (CG) bands within the ultrafine-grained (UFG) matrix were synthesized via blending of high-energy mechanically milled powders with unmilled powders, followed by hot powder extrusion. The corrosion behavior of the bimodal specimens was assessed by means of polarization, steady-state cyclic polarization and impedance tests, whereas their microstructural features and corrosion products were examined using optical microscopy (OM), scanning transmission electron microscopy (STEM), field emission scanning electron microscopy (FE-SEM), electron backscattered diffraction (EBSD), energy dispersive spectroscopy (EDS), and X-ray diffraction (XRD) techniques. The bimodal Al6063 containing 15 vol.% CG phase exhibits the highest corrosion resistance among the bimodal microstructures and even superior electrochemical behavior compared with the plain UFG and CG materials in the 3.5% NaCl solution. The enhanced corrosion resistance is attributed to the optimum cathode-to-anode surface area ratio that gives rise to the formation of an effective galvanic couple between CG areas and the UFG matrix. The operational galvanic coupling leads to the domination of a "self-anodic protection system" on the bimodal microstructure and consequently forms a uniform thick protective passive layer over it. In contrast, the 45 vol.% CG bimodal specimen shows the least corrosion resistance due to the catastrophic galvanic corrosion in UFG regions. The observed results for UFG Al6063 suggest that metallurgical tailoring of the grain structure in terms of bimodal microstructures leads to simultaneous enhancement in the electrochemical behavior and mechanical properties of passivable alloys that are usually inversely correlated. The mechanism of self-anodic protection for passivable metals with bimodal microstructures is discussed here for the first time.
Sonnino, Giorgio; Steinbrecher, György; Cardinali, Alessandro; Sonnino, Alberto; Tlidi, Mustapha
2013-01-01
Using statistical thermodynamics, we derive a general expression of the stationary probability distribution for thermodynamic systems driven out of equilibrium by several thermodynamic forces. The local equilibrium is defined by imposing the minimum entropy production and the maximum entropy principle under the scale invariance restrictions. The obtained probability distribution presents a singularity that has immediate physical interpretation in terms of the intermittency models. The derived reference probability distribution function is interpreted as time and ensemble average of the real physical one. A generic family of stochastic processes describing noise-driven intermittency, where the stationary density distribution coincides exactly with the one resulted from entropy maximization, is presented.
Beta-binomial regression and bimodal utilization.
Liu, Chuan-Fen; Burgess, James F; Manning, Willard G; Maciejewski, Matthew L
2013-10-01
To illustrate how the analysis of bimodal, U-shaped distributed utilization can be modeled with beta-binomial regression, which is rarely used in health services research. Veterans Affairs (VA) administrative data and Medicare claims in 2001-2004 for 11,123 Medicare-eligible VA primary care users in 2000. We compared the means and distributions of VA reliance (the proportion of all VA/Medicare primary care visits occurring in VA) predicted from beta-binomial, binomial, and ordinary least-squares (OLS) models. The beta-binomial model fits the bimodal distribution of VA reliance better than the binomial and OLS models due to its nondependence on normality and its greater flexibility in shape parameters. Increased awareness of beta-binomial regression may help analysts apply appropriate methods to outcomes with bimodal or U-shaped distributions. © Health Research and Educational Trust.
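The key property, that beta-binomial shape parameters below 1 reproduce a U-shaped outcome, can be sketched with SciPy's `betabinom` and a maximum-likelihood fit. The visit count and parameter values are invented for illustration, not taken from the VA data:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
n_visits = 12  # total primary-care visits per person (illustrative)
# U-shaped reliance: with a < 1 and b < 1, most people cluster near
# all-VA or all-Medicare use rather than the middle.
k = stats.betabinom.rvs(n_visits, 0.4, 0.4, size=2000, random_state=rng)

def neg_loglik(params):
    a, b = np.exp(params)  # log-parameterization keeps the shapes positive
    return -stats.betabinom.logpmf(k, n_visits, a, b).sum()

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
```

Recovering a_hat, b_hat < 1 confirms the fitted model captures the bimodal shape, something a binomial or OLS model cannot represent.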
Separating the contributions of variability and parameter uncertainty in probability distributions
International Nuclear Information System (INIS)
Sankararaman, S.; Mahadevan, S.
2013-01-01
This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters
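The separation of the two uncertainty sources can be illustrated with a nested (two-loop) Monte Carlo and the law of total variance; the paper's own method uses variance-based global sensitivity analysis, so this is only a simplified sketch with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(3)
# Sparse data leaves the distribution parameter itself uncertain: here the
# mean of X is known only up to a normal distribution (outer loop), while
# X varies naturally around that mean (inner loop).
n_outer, n_inner = 2000, 2000
mu_samples = rng.normal(10.0, 0.5, n_outer)                   # parameter uncertainty
x = rng.normal(mu_samples[:, None], 2.0, (n_outer, n_inner))  # natural variability

# Law of total variance: Var(X) = E[Var(X|mu)] + Var(E[X|mu]).
within = x.var(axis=1).mean()    # contribution of natural variability (~4.0)
between = x.mean(axis=1).var()   # contribution of parameter uncertainty (~0.25)
total = x.var()
```

The two terms quantify how much of the overall variance each uncertainty source contributes, which is the quantitative decomposition the paper targets.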
Reactive Sintering of Bimodal WC-Co Hardmetals
Marek Tarraste; Kristjan Juhani; Jüri Pirso; Mart Viljus
2015-01-01
Bimodal WC-Co hardmetals were produced using a novel technology - reactive sintering. Milled and activated tungsten and graphite powders were mixed with a commercial coarse-grained WC-Co powder and then sintered. The microstructure of the produced materials was free of defects and consisted of evenly distributed coarse and fine tungsten carbide grains in a cobalt binder. The microstructure, hardness and fracture toughness of the reactive sintered bimodal WC-Co hardmetals are presented. Developed bimodal har...
Collective motions of globally coupled oscillators and some probability distributions on circle
International Nuclear Information System (INIS)
Jaćimović, Vladimir; Crnkić, Aladin
2017-01-01
In 2010 Kato and Jones described a new family of probability distributions on the circle, obtained as a Möbius transformation of the von Mises distribution. We present a model demonstrating that these distributions appear naturally in the study of populations of coupled oscillators. We use this opportunity to point out certain relations between Directional Statistics and the collective motion of coupled oscillators. - Highlights: • We specify probability distributions on the circle that arise in the Kuramoto model. • We study how the mean-field coupling affects the shape of the distribution of phases. • We discuss potential applications in some experiments on the cell cycle. • We apply Directional Statistics to study the collective dynamics of coupled oscillators.
Ribereau, Pierre; Masiello, Esterina; Naveau, Philippe
2014-01-01
Following the work of Azzalini ([2] and [3]) on the skew normal distribution, we propose an extension of the Generalized Extreme Value (GEV) distribution, the SGEV. This new distribution allows for a better fit of maxima and can be interpreted both as the distribution of maxima when maxima are taken on dependent data and as the distribution of maxima taken over a random block size. We propose to estimate the parameters of the SGEV distribution via the Probability Weighted Moments meth...
New family of probability distributions with applications to Monte Carlo studies
International Nuclear Information System (INIS)
Johnson, M.E.; Tietjen, G.L.; Beckman, R.J.
1980-01-01
A new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies. The distribution includes the exponential power family as a special case. An efficient computational strategy is proposed for random variate generation. An example for testing the hypothesis of unit variance illustrates the advantages of the proposed distribution
International Nuclear Information System (INIS)
Wasastjerna, F.; Lux, I.
1980-03-01
A transmission probability method implemented in the program TPHEX is described. This program was developed for the calculation of neutron flux distributions in hexagonal light water reactor fuel assemblies. The accuracy appears to be superior to diffusion theory, and the computation time is shorter than that of the collision probability method. (author)
A measure of mutual divergence among a number of probability distributions
Directory of Open Access Journals (Sweden)
J. N. Kapur
1987-01-01
major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained to quantify the mutual divergence among two or more probability distributions.
Bigot, Jérémie; Cazelles, Elsa; Papadakis, Nicolas
2017-01-01
The notion of Sinkhorn divergence has recently gained popularity in machine learning and statistics, as it makes feasible the use of smoothed optimal transportation distances for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high-dimension. For data sampled from one or two unknown probability distributions, we derive central limit theo...
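The Sinkhorn iterations behind the entropically regularized distance are short; a minimal NumPy sketch for two discrete distributions on a small metric space (the support, ground cost, and regularization strength are arbitrary choices for illustration):

```python
import numpy as np

def sinkhorn(a, b, cost, eps=0.3, n_iter=10_000, tol=1e-12):
    """Entropically regularized optimal transport between discrete
    distributions a and b via Sinkhorn's matrix-scaling iterations."""
    K = np.exp(-cost / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # scale columns to match marginal b
        u_new = a / (K @ v)          # scale rows to match marginal a
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    plan = u[:, None] * K * v[None, :]
    return (plan * cost).sum(), plan

# Two distributions on a 5-point metric space with |i - j| ground cost.
pts = np.arange(5.0)
cost = np.abs(pts[:, None] - pts[None, :])
a = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
b = np.array([0.0, 0.0, 0.2, 0.3, 0.5])
dist, plan = sinkhorn(a, b, cost)
```

For this example the exact 1-Wasserstein distance is 2.6, and the regularized cost lands close to it; only matrix-vector products are needed, which is what makes the divergence fast to compute in high dimension.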
New bimodal pore catalysts for Fischer-Tropsch synthesis
Energy Technology Data Exchange (ETDEWEB)
Shinoda, Misao; Zhang, Yi; Yoneyama, Yoshiharu; Hasegawa, Kiyoshi; Tsubaki, Noritatsu [Department of Material System and Life Science, School of Engineering, Toyama University, Gofuku 3190, Toyama 930-8555 (Japan)
2004-11-15
A simple preparation method for bimodal pore supports was developed by introducing SiO2 or ZrO2 sols directly into the large pores of SiO2 gel pellets. The pores of the obtained bimodal pore supports were distributed distinctly as two kinds of main pores. Meanwhile, the increased BET surface area and decreased pore volume, compared to those of the original silica gel, indicated that the obtained bimodal pore supports formed according to the designed route. The obtained bimodal pore supports, with cobalt as the supported metal, were applied in liquid-phase Fischer-Tropsch synthesis (FTS). The bimodal pore catalysts presented the best reaction performance in liquid-phase FTS, with a higher reaction rate and lower methane selectivity, because of the spatial promotional effect of the bimodal pore structure and the chemical effect of the porous zirconia inside the large pores of the original silica gel.
Directory of Open Access Journals (Sweden)
Changhao Fan
2017-01-01
Full Text Available In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is considered, whereas other prior information about the training sample, such as probability distribution information, is ignored. Probability distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, and it helps develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information, the probability-distribution-information-weighted SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is considered as a weight and is then introduced into the error coefficient and slack variables of SVR. Thus, the deviation and probability distribution information of the training sample are both used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with those of three SVR-based methods. The results showed that PDISVR performs better than the three other methods.
The distributed failure probability approach to dependent failure analysis, and its application
International Nuclear Information System (INIS)
Hughes, R.P.
1989-01-01
The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
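The quantitative consequence of treating the failure probability as distributed over environments can be sketched in a few lines: for a redundant pair sharing an environment, the joint failure probability is E[p^2], which exceeds the naive independence value E[p]^2 whenever p varies. The lognormal spread and nominal value below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
# Component failure probability varies across 'environments' (maintenance
# quality, manufacture, temperature, ...): lognormal around a nominal 1e-3.
p = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=100_000)
p = np.clip(p, 0.0, 1.0)  # probabilities cannot exceed 1

p_single = p.mean()            # average single-component failure probability
p_pair_indep = p.mean() ** 2   # naive: components treated as independent
p_pair_dfp = (p ** 2).mean()   # DFP: both components share the environment
# Conditional probability the second fails given the first has failed,
# i.e. an effective common-cause factor implied by the distribution.
common_cause = p_pair_dfp / p_single
```

For a lognormal with log-sd σ, the ratio E[p^2]/E[p]^2 = exp(σ^2), so here the dependent-failure probability is roughly e ≈ 2.7 times the independence estimate.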
DEFF Research Database (Denmark)
Schjær-Jacobsen, Hans
2012-01-01
to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known the resulting...... uncertainty can be calculated. The possibility approach is particular well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... by probability distributions is readily done by means of Monte Carlo simulation. Calculation of non-monotonic functions of possibility distributions is done within the theoretical framework of fuzzy intervals, but straight forward application of fuzzy arithmetic in general results in overestimation of interval...
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
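The single-integral form and its trapezoidal evaluation can be sketched as follows; normal distributions are used here purely so an analytic cross-check exists (the paper's point is precisely that normality cannot be assumed in general, and the function accepts any pdf/cdf pair):

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

def prob_slip(pdf_required, cdf_available, lo=-1.0, hi=2.0, n=8001):
    """P(available friction < required friction) as a single integral,
    integral of f_req(x) * F_avail(x) dx, via the trapezoidal rule."""
    x = np.linspace(lo, hi, n)
    return trapezoid(pdf_required(x) * cdf_available(x), x)

required = stats.norm(loc=0.2, scale=0.05)   # required friction (illustrative)
available = stats.norm(loc=0.5, scale=0.10)  # available friction (illustrative)
p_slip = prob_slip(required.pdf, available.cdf)

# Analytic cross-check for the normal-normal case: A - R is normal, so
# P(A < R) = Phi(-(mu_A - mu_R) / sqrt(sd_A^2 + sd_R^2)).
exact = stats.norm.cdf(-0.3 / np.hypot(0.05, 0.10))
```

Because the integrand decays to zero at both limits, the trapezoidal rule is extremely accurate here, which is consistent with the paper's finding that the method is both accurate and efficient.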
Probability distribution of long-run indiscriminate felling of trees in ...
African Journals Online (AJOL)
The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...
Bounds for the probability distribution function of the linear ACD process
Fernandes, Marcelo
2003-01-01
Rio de Janeiro This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.
The distribution function of a probability measure on a space with a fractal structure
Energy Technology Data Exchange (ETDEWEB)
Sanchez-Granero, M.A.; Galvez-Rodriguez, J.F.
2017-07-01
In this work we show how to define a probability measure with the help of a fractal structure. One of the keys of this approach is to use the completion of the fractal structure. Then we use the theory of a cumulative distribution function on a Polish ultrametric space and describe it in this context. Finally, with the help of fractal structures, we prove that a function satisfying the properties of a cumulative distribution function on a Polish ultrametric space is a cumulative distribution function with respect to some probability measure on the space. (Author)
Feynman quasi probability distribution for spin-(1/2), and its generalizations
International Nuclear Information System (INIS)
Colucci, M.
1999-01-01
We examine Feynman's paper Negative probability, in which, after a discussion of the possibility of attributing a real physical meaning to quasi probability distributions, he introduces a new kind of distribution for spin-1/2, with a possible method of generalization to systems with an arbitrary number of states. The principal aim of this article is to shed light upon the method of construction of these distributions, taking into consideration their application to some experiments, and discussing their positive and negative aspects.
Bimodal Nuclear Thermal Rocket Analysis Developments
Belair, Michael; Lavelle, Thomas; Saimento, Charles; Juhasz, Albert; Stewart, Mark
2014-01-01
Nuclear thermal propulsion has long been considered an enabling technology for human missions to Mars and beyond. One concept of operations for these missions utilizes the nuclear reactor to generate electrical power during coast phases, known as bimodal operation. This presentation focuses on the systems modeling and analysis efforts for a NERVA-derived concept. The NERVA bimodal operation derives the thermal energy from the core tie tube elements. Recent analysis has shown potential temperature distributions in the tie tube elements that may limit the thermodynamic efficiency of the closed Brayton cycle used to generate electricity with the current design. The results of this analysis are discussed, as well as the potential implications for a bimodal NERVA-type reactor.
International Nuclear Information System (INIS)
Humbert, Ph.
2005-01-01
In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c² approximation is used, and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
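The inversion step described above, recovering a discrete probability distribution from its generating function evaluated on the unit circle via a discrete Fourier transform, can be illustrated on a toy case. This is a minimal sketch, not the paper's reactor model; the Poisson generating function is chosen only because the exact answer is known.

```python
import cmath
import math

def pgf_to_probabilities(G, n):
    """Invert a probability generating function numerically: evaluate G at the
    n-th roots of unity and apply an inverse DFT,
    p_k = (1/n) * sum_j G(w^j) * w^(-j*k),  w = exp(2*pi*i/n).
    Valid when the distribution carries negligible mass beyond k = n-1."""
    w = cmath.exp(2j * cmath.pi / n)
    vals = [G(w ** j) for j in range(n)]
    return [abs(sum(vals[j] * w ** (-j * k) for j in range(n)) / n)
            for k in range(n)]

# Check against Poisson(lam), whose pgf is exp(lam*(z-1))
lam = 2.0
probs = pgf_to_probabilities(lambda z: cmath.exp(lam * (z - 1.0)), 64)
exact0 = math.exp(-lam)   # P(N=0)
```

The same mechanism underlies the discrete Fourier transform inversion mentioned in the abstract, there applied to a generating function obtained by solving a differential equation rather than given in closed form.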
PROBABILITY DISTRIBUTION OVER THE SET OF CLASSES IN ARABIC DIALECT CLASSIFICATION TASK
Directory of Open Access Journals (Sweden)
O. V. Durandin
2017-01-01
Full Text Available Subject of Research. We propose an approach to a machine learning classification problem that uses information about the probability distribution over the class label set on the training data. The algorithm is illustrated on a complex natural language processing task: classification of Arabic dialects. Method. Each object in the training set is associated with a probability distribution over the class label set instead of a particular class label. The proposed approach solves the classification problem taking this distribution into account in order to improve the quality of the built classifier. Main Results. The suggested approach is illustrated on an automatic Arabic dialect classification example. Mined from the Twitter social network, the analyzed data contain word marks and belong to the following six Arabic dialects: Saudi, Levantine, Algerian, Egyptian, Iraqi, and Jordanian, and to modern standard Arabic (MSA). The results demonstrate an increase in the quality of the built classifier achieved by taking into account probability distributions over the set of classes. Experiments carried out show that even a relatively naive accounting of the probability distributions improves the precision of the classifier from 44% to 67%. Practical Relevance. Our approach and the corresponding algorithm could be used effectively in situations where manual annotation by experts would require significant financial and time resources, but it is possible to create a system of heuristic rules. The implementation of the proposed algorithm makes it possible to decrease data preparation expenses significantly without substantial losses in classification precision.
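The core idea of training against a distribution over class labels can be sketched as follows: the per-sample loss becomes a cross-entropy against the soft target rather than a one-hot label. This is a hedged illustration of the general technique; the numbers and the loss choice are assumptions, not taken from the paper.

```python
import math

def soft_cross_entropy(pred_probs, target_dist):
    """Loss for one sample when the annotation is a probability distribution
    over class labels rather than a single hard label. Reduces to ordinary
    cross-entropy when target_dist is one-hot."""
    return -sum(t * math.log(p) for t, p in zip(target_dist, pred_probs) if t > 0)

hard = [0.0, 0.0, 1.0]     # conventional one-hot label
soft = [0.1, 0.2, 0.7]     # annotator uncertainty kept as a distribution
pred = [0.2, 0.2, 0.6]     # classifier output for the same sample

loss_hard = soft_cross_entropy(pred, hard)
loss_soft = soft_cross_entropy(pred, soft)
```

With heuristic labeling rules, the soft target encodes how confident the rules are in each class, which is the information the paper exploits.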
Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad
2017-10-01
The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran, so regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature, and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations of Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the ability of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters was similar. The results also showed that the distribution type of the parameters which affect ET0 can affect the distribution of reference evapotranspiration.
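The sample L-moments on which L-moment diagrams and distribution selection rest can be computed from probability-weighted moments. A minimal sketch using Hosking's standard estimators (not the paper's own code); t3 = l3/l2 is the L-skewness plotted on such diagrams:

```python
def sample_l_moments(data):
    """First three sample L-moments (l1, l2, l3) via probability-weighted
    moments b_r, using Hosking's unbiased estimators on the sorted sample."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    # terms with i < 1 (resp. i < 2) contribute zero to b1 (resp. b2)
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1 = b0                      # L-location (the mean)
    l2 = 2.0 * b1 - b0           # L-scale (half the Gini mean difference)
    l3 = 6.0 * b2 - 6.0 * b1 + b0
    return l1, l2, l3

l1, l2, l3 = sample_l_moments([1.0, 2.0, 3.0, 4.0, 5.0])
```

For this symmetric sample l3 vanishes, consistent with zero L-skewness.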
Generalized Huberman-Rudnick scaling law and robustness of q-Gaussian probability distributions
Afsar, Ozgur; Tirnakli, Ugur
2013-01-01
We generalize the Huberman-Rudnick universal scaling law for all periodic windows of the logistic map and show the robustness of q-Gaussian probability distributions in the vicinity of the chaos threshold. Our scaling relation is universal for the self-similar windows of the map which exhibit period-doubling subharmonic bifurcations. Using this generalized scaling argument, for all periodic windows, as the chaos threshold is approached, a developing convergence to a q-Gaussian is numerically obtained both in the central regions and in the tails of the probability distributions of sums of iterates.
On the probability distribution of the stochastic saturation scale in QCD
International Nuclear Information System (INIS)
Marquet, C.; Soyez, G.; Xiao Bowen
2006-01-01
It was recently noticed that high-energy scattering processes in QCD have a stochastic nature. An event-by-event scattering amplitude is characterised by a saturation scale which is a random variable. The statistical ensemble of saturation scales formed with all the events is distributed according to a probability law whose cumulants have been recently computed. In this work, we obtain the probability distribution from the cumulants. We prove that it can be considered as Gaussian over a large domain that we specify, and our results are confirmed by numerical simulations.
Directory of Open Access Journals (Sweden)
Diogo de Carvalho Bezerra
2015-12-01
Full Text Available Contributions from the sensitivity analysis of the parameters of the linear programming model for the elicitation of experts' beliefs are presented. The process allows for the calibration of the family of probability distributions obtained in the elicitation process. An experiment to obtain the probability distribution of a future event (the Brazil vs. Spain soccer game in the 2013 FIFA Confederations Cup final) was conducted. The proposed sensitivity analysis step may help to reduce the vagueness of the information given by the expert.
DEFF Research Database (Denmark)
Helles, Glennie; Fonseca, Rasmus
2009-01-01
Predicting the three-dimensional structure of a protein from its amino acid sequence is currently one of the most challenging problems in bioinformatics. The internal structure of helices and sheets is highly recurrent and helps reduce the search space significantly. However, random coil segments ... done previously, none have, to our knowledge, presented comparable results for the probability distribution of dihedral angles. Results: In this paper we develop an artificial neural network that uses an input-window of amino acids to predict a dihedral angle probability distribution for the middle residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first order statistics. An accuracy comparable to that of secondary ...
International Nuclear Information System (INIS)
Caldarola, L.
1976-01-01
A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at time t under the condition that it was down at time t′ (t′ ≤ t). The limitations on the applicability of the method are also discussed. It has been concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)
Duque Escobar, Gonzalo
2015-01-01
The Corredor Bimodal Cafetero is a strategic infrastructure project that links the Magdalena waterway with the Cauca River rail corridor, included in the 2014-2018 National Development Plan and financeable through the shipment of 30 thousand tonnes per day of Andean coal to the Pacific basin. It includes the Cumanday Tunnel to cross the Cordillera Central, the 150-km Ferrocarril Cafetero with a 3% grade between La Dorada and Km 41, and the 108-km Transversal Cafetera for a road of...
Maximizing a Probability: A Student Workshop on an Application of Continuous Distributions
Griffiths, Martin
2010-01-01
For many students meeting, say, the gamma distribution for the first time, it may well turn out to be a rather fruitless encounter unless they are immediately able to see an application of this probability model to some real-life situation. With this in mind, we pose here an appealing problem that can be used as the basis for a workshop activity…
The distribution of FRAX(®)-based probabilities in women from Japan.
Kanis, John A; Johansson, Helena; Odén, Anders; McCloskey, Eugene V
2012-11-01
New assessment guidelines for osteoporosis in Japan include the use of the WHO risk assessment tool (FRAX) that computes the 10-year probability of fracture. The aim of this study was to determine the distribution of fracture probabilities and to assess the impact of probability-based intervention thresholds in women from Japan aged 50 years and older. Age-specific simulation cohorts were constructed from the prevalences of clinical risk factors and femoral neck bone mineral density to determine the distribution of fracture probabilities as assessed by FRAX. These data were used to estimate the number and proportion of women at or above a 10-year fracture probability of 5, 10, 15, 20, 25, and 30 %. In addition, case scenarios that applied a FRAX probability threshold of 15 % were compared with current guidance. In the absence of additional criteria for treatment, a 15 % fracture probability threshold would identify approximately 32 % of women over the age of 50 years (9.3 million women) as eligible for treatment. Because of expected changes in population demography, the 15 % fracture probability threshold would capture approximately 38 % of women over the age of 50 years (12.7 million women), mainly those aged 80 years or older. The introduction of a FRAX threshold of 15 % would permit treatment in women with clinical risk factors that would otherwise fall below previously established intervention thresholds. The incorporation of FRAX into assessment guidelines is likely to redirect treatments for osteoporosis from younger women at low risk to elderly women at high fracture risk.
Directory of Open Access Journals (Sweden)
A. B. Levina
2016-03-01
Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects. Accordingly, error detection codes allow detecting such errors. There are two classes of error-detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors; however, they have a high probability of missing an error caused by algebraic manipulation. In contrast, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection in the encoding device. Likewise, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with lower computational complexity and a low probability of masking best protect the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, increasing computational complexity decreases the difference between the maximum and average values of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking
Audio analysis of statistically instantaneous signals with mixed Gaussian probability distributions
Naik, Ganesh R.; Wang, Wenwu
2012-10-01
In this article, a novel method is proposed to measure the separation qualities of statistically instantaneous audio signals with mixed Gaussian probability distributions. This study evaluates the impact of the Probability Distribution Function (PDF) of the mixed signals on the outcomes of both sub- and super-Gaussian distributions. Different Gaussian measures are evaluated by using various spectral-distortion measures. It aims to compare the different audio mixtures from both super-Gaussian and sub-Gaussian perspectives. Extensive computer simulation confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures. The result based on the objective measures demonstrates the effectiveness of source separation in improving the quality of the separated audio sources.
International Nuclear Information System (INIS)
Gupta, S.S.; Panchapakesan, S.
1975-01-01
A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure
International Nuclear Information System (INIS)
Koshinchanov, Georgy; Dimitrov, Dobri
2008-01-01
The characteristics of rainfall intensity are important for many purposes, including the design of sewage and drainage systems, tuning of flood warning procedures, etc. Those estimates are usually statistical estimates of the intensity of precipitation realized for a certain period of time (e.g. 5, 10 min, etc.) with different return periods (e.g. 20, 100 years, etc.). The traditional approach to evaluating the mentioned precipitation intensities is to process the pluviometer records and fit a probability distribution to samples of intensities valid for certain locations or regions. Those estimates further become part of the state regulations to be used for various economic activities. Two problems occur with this approach: 1. Due to various factors the climate conditions have changed and the precipitation intensity estimates need regular updates; 2. As far as the extremes of the probability distribution are of particular importance for practice, the methodology of the distribution fitting needs specific attention to those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing relevant statistical estimates are: - the method of maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; - as above, but with separate modeling of the probability distribution for the middle and high probability quantiles; - a method similar to the first one, but with an intensity threshold of 0.36 mm/min; - another method proposed by the Russian hydrologist G. A. Aleksiev for regionalization of estimates over some territory, improved and adapted by S. Gerasimov for Bulgaria; - the next method considers only
May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M
2018-03-13
Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.
Wang, S Q; Zhang, H Y; Li, Z L
2016-10-01
Understanding the spatio-temporal distribution of pests in orchards can provide important information that can be used to design monitoring schemes and establish better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated by using probability kriging. Adults of B. minax were captured over two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on both sides of the boundary between the orchard and the woods. The sequential probability kriged maps showed that the adults were estimated in the marginal zone with higher probability, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots and their distance to the forest edge should be considered to enhance control of B. minax in small-scale orchards.
THREE-MOMENT BASED APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING SYSTEMS
Directory of Open Access Journals (Sweden)
T. I. Aliev
2014-03-01
Full Text Available The paper deals with the problem of approximation of probability distributions of random variables defined on the positive real axis with a coefficient of variation different from unity. While using queueing systems as models for computer networks, calculation of characteristics is usually performed at the level of expectation and variance. At the same time, one of the main characteristics of multimedia data transmission quality in computer networks is delay jitter. For jitter calculation, the distribution function of packet time delay should be known. It is shown that changing the third moment of the packet delay distribution changes the calculated jitter by tens or hundreds of percent, with the same values of the first two moments – the expectation value and the delay variation coefficient. This means that the delay distribution approximation for the calculation of jitter should match the third moment of the delay distribution. For random variables with coefficients of variation greater than unity, an iterative approximation algorithm with a hyper-exponential two-phase distribution based on three moments of the approximated distribution is offered. It is shown that for random variables with coefficients of variation less than unity, the impact of the third moment of the distribution becomes negligible, and for approximation of such distributions the Erlang distribution with the first two moments should be used. This approach makes it possible to obtain upper bounds for relevant characteristics, particularly the upper bound of delay jitter.
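A minimal sketch of the two-moment starting point: the standard balanced-means hyperexponential fit for coefficients of variation above unity. Once the first two moments are matched this way, the third moment is fixed by the fit, which is exactly why a three-moment procedure such as the paper's iterative algorithm is needed for jitter; that algorithm itself is not reproduced here.

```python
import math

def fit_h2_balanced(m1, cv):
    """Two-phase hyperexponential fit with balanced means for cv > 1:
    returns (p1, mu1, mu2) such that the mixture p1*Exp(mu1) + (1-p1)*Exp(mu2)
    has mean m1 and coefficient of variation cv."""
    d = math.sqrt((cv**2 - 1.0) / (cv**2 + 1.0))
    p1 = 0.5 * (1.0 + d)
    return p1, 2.0 * p1 / m1, 2.0 * (1.0 - p1) / m1

def h2_moment(r, p1, mu1, mu2):
    """r-th raw moment of the H2 distribution: r! * (p1/mu1^r + (1-p1)/mu2^r)."""
    return math.factorial(r) * (p1 / mu1**r + (1.0 - p1) / mu2**r)

p1, mu1, mu2 = fit_h2_balanced(1.0, 2.0)
m1 = h2_moment(1, p1, mu1, mu2)   # recovers the target mean, 1.0
m2 = h2_moment(2, p1, mu1, mu2)   # recovers m1^2 * (cv^2 + 1) = 5.0
m3 = h2_moment(3, p1, mu1, mu2)   # determined by the two-moment fit
```

For the cv < 1 branch the abstract prescribes an Erlang fit on the first two moments, which similarly leaves no freedom in the third moment.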
Probability distribution of surface wind speed induced by convective adjustment on Venus
Yamamoto, Masaru
2017-03-01
The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized weather research and forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s⁻¹ and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution with a peak at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(>V) = exp[-(V/c)^k] with best-estimate coefficients of Lorenz (2016) is reproduced during convective adjustments induced by a potential energy of ∼7 × 10⁷ J m⁻², which is calculated from the difference in total potential energy between initially unstable and neutral states. The maximum vertical convective heat flux magnitude is proportional to the potential energy of the convective adjustment in the experiments with the initial unstable-layer thickness altered. The present work suggests that convective adjustment is a promising process for producing the wind structure, occasionally generating surface winds of ∼1 m s⁻¹ and retrograde wind patches.
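The Weibull exceedance form quoted above is straightforward to evaluate and to invert for quantiles. A small sketch with hypothetical scale and shape values (the paper's fitted coefficients are not reproduced here):

```python
import math

def weibull_exceedance(v, c, k):
    """Cumulative exceedance P(V > v) = exp(-(v/c)^k)."""
    return math.exp(-((v / c) ** k))

def weibull_quantile(q, c, k):
    """Wind speed exceeded with probability q: v = c * (-ln q)^(1/k)."""
    return c * (-math.log(q)) ** (1.0 / k)

c, k = 0.5, 2.0                        # hypothetical scale (m/s) and shape
p_at_c = weibull_exceedance(c, c, k)   # P(V > c) = exp(-1) for any k
v_med = weibull_quantile(0.5, c, k)    # median wind speed
```

The identity P(V > c) = e⁻¹ makes c directly interpretable as a characteristic speed regardless of the shape parameter.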
Crovelli, R.A.; Balay, R.H.
1991-01-01
A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbyte of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132 column printer. A graphics adapter and color display are optional. © 1991.
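The triangular moments and the two polar aggregation cases can be sketched directly. The fractile aggregation of TRIAGG is not reproduced, and the interpolation rule for partial dependence below is an assumption for illustration, not the program's method.

```python
import math

def tri_mean(a, m, b):
    """Mean of a triangular distribution with min a, mode m, max b."""
    return (a + m + b) / 3.0

def tri_var(a, m, b):
    """Variance of the same triangular distribution."""
    return (a*a + m*m + b*b - a*m - a*b - m*b) / 18.0

def aggregate(components, rho):
    """Aggregate (min, mode, max) triangular components.
    rho=1: perfect positive correlation (standard deviations add);
    rho=0: complete independence (variances add);
    intermediate rho: linear interpolation between the polar cases (assumed)."""
    means = [tri_mean(*c) for c in components]
    sds = [math.sqrt(tri_var(*c)) for c in components]
    sd_indep = math.sqrt(sum(s * s for s in sds))
    sd_corr = sum(sds)
    return sum(means), (1.0 - rho) * sd_indep + rho * sd_corr

comps = [(0.0, 1.0, 2.0), (0.0, 1.0, 2.0)]   # two illustrative provinces
m_ind, s_ind = aggregate(comps, rho=0.0)
m_dep, s_dep = aggregate(comps, rho=1.0)
```

Independence always yields the smaller aggregate spread, which is why the correlation assumption matters for the aggregated fractiles.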
Directory of Open Access Journals (Sweden)
Furui Du
2014-01-01
Full Text Available The traditional mine microseism locating methods are mainly based on the assumption that the wave velocity is uniform through space, which leads to errors, for the assumption goes against the laws of nature. In this paper, the wave velocity is regarded as a random variable, and the probability distribution information of the wave velocity is fused into the traditional locating method. This paper puts forward a microseism source location method for undersea mining conditioned on the probability distribution of the wave velocity and presents the corresponding Monte Carlo solving process. In addition, based on the simulated results of the Monte Carlo method, the space is divided into three areas: the most possible area (area I), the possible area (area II), and the small probability area (area III). With corresponding mathematical formulations, spherical models and cylindrical models are built for the different areas according to whether the source is inside the sensor array. Both the examples and the actual applications show that (1) the method of microseism source location in this paper can greatly improve the accuracy of microseism monitoring, especially for sources beyond the sensor array, and (2) the space-dividing method based on occurrence possibilities of the source can recognize and screen hidden dangers, for it efficiently predicts the probable location range of the source, while the traditional method cannot.
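A one-dimensional sketch of the idea of fusing a wave-velocity distribution into location via Monte Carlo: when the velocity is random, the located position becomes a distribution that can be split into nested probability areas. The two-sensor geometry and the area cutoffs (central 50% and 95%) are illustrative assumptions, not the paper's spherical/cylindrical models.

```python
import random

def locate_with_velocity_uncertainty(dt, L, v_mean, v_sd, n=10000, seed=0):
    """1-D sketch: two sensors at x=0 and x=L record an arrival-time
    difference dt = (2x - L)/v for a source at position x. Sampling the
    uncertain velocity v gives a distribution of positions x = (L + v*dt)/2,
    split into nested probability areas (I: central 50%, II: central 95%;
    everything outside II plays the role of area III)."""
    rng = random.Random(seed)
    xs = sorted((L + rng.gauss(v_mean, v_sd) * dt) / 2.0 for _ in range(n))
    q = lambda p: xs[int(p * (n - 1))]          # empirical quantile
    area1 = (q(0.25), q(0.75))                  # most possible area
    area2 = (q(0.025), q(0.975))                # possible area
    return area1, area2

# Illustrative numbers: dt = 10 ms, sensors 1000 m apart, v ~ N(5000, 500) m/s
area1, area2 = locate_with_velocity_uncertainty(0.01, 1000.0, 5000.0, 500.0)
```

The nesting of the areas is what lets monitoring focus first on the region where the source most probably lies.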
Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence
Directory of Open Access Journals (Sweden)
C. C. Wu
2011-04-01
Full Text Available Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristics of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Reparametrization-covariant theory for on-line learning of probability distributions.
Aida, T
2001-11-01
We discuss the on-line learning of probability distributions in a reparametrization covariant formulation. Reparametrization covariance plays an essential role not only to respect an intrinsic property of "information" but also for pattern recognition problems. We can obtain an optimal on-line learning algorithm with reparametrization invariance, where the conformal gauge connects a covariant formulation with a noncovariant one in a natural way.
Ruin Probabilities and Aggregate Claims Distributions for Shot Noise Cox Processes
DEFF Research Database (Denmark)
Albrecher, H.; Asmussen, Søren
We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate claim size is investigated under these assumptions. For both light-tailed and heavy-tailed claim size distributions, asymptotic estimates for infinite-time and finite-time ruin probabilities are derived. Moreover, we discuss an extension of the model to an adaptive premium rule that is dynamically…
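A finite-time ruin probability of the kind estimated above can be approximated by naive Monte Carlo. The sketch below uses a plain compound Poisson process with light-tailed (exponential) claims and made-up parameters, without the Cox/shot-noise component, so it only illustrates the quantity being studied, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy compound-Poisson risk process: initial capital u, premium rate c,
# Poisson claim rate lam, exponential claims of mean mu, horizon T.
# All parameter values are assumptions for illustration.
u, c, lam, mu, T = 10.0, 1.5, 1.0, 1.0, 100.0

def ruined():
    """One path of R_t = u + c*t - S_t up to T; ruin can only occur at claims."""
    t, claims = 0.0, 0.0
    while True:
        t += rng.exponential(1.0 / lam)      # next claim arrival
        if t > T:
            return False
        claims += rng.exponential(mu)        # claim size
        if u + c * t - claims < 0.0:
            return True

psi_T = np.mean([ruined() for _ in range(2000)])
print(round(psi_T, 3))   # finite-time ruin probability estimate
```

For exponential claims the infinite-horizon ruin probability is known in closed form, which makes such a simulation easy to sanity-check before adding a shot-noise intensity.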
Directory of Open Access Journals (Sweden)
Shulin Lyu
2018-01-01
The σ function, namely, the derivative of the log of the smallest eigenvalue distributions of the finite-n LUE or the JUE, satisfies the Jimbo–Miwa–Okamoto σ form of PV and PVI, although in the shifted Jacobi case, with the weight x^α(1−x)^β, the β parameter does not show up in the equation. We also obtain the asymptotic expansions for the smallest eigenvalue distributions of the Laguerre unitary and Jacobi unitary ensembles after appropriate double scalings, and obtain the constants in the asymptotic expansion of the gap probabilities, expressed in terms of the Barnes G-function evaluated at special points.
Sorriso-Valvo, Luca; Carbone, Vincenzo; Veltri, Pierluigi; Consolini, Giuseppe; Bruno, Roberto
Intermittency in fluid turbulence can be emphasized through the analysis of Probability Distribution Functions (PDF) for velocity fluctuations, which display a strong non-Gaussian behavior at small scales. Castaing et al. (1990) introduced the idea that this behavior can be represented, in the framework of a multiplicative cascade model, by a convolution of Gaussians whose variances are distributed according to a log-normal distribution. In this letter we test this conjecture on MHD solar wind turbulence by fitting the model to the PDF of the bulk speed and magnetic field intensity fluctuations calculated in the solar wind. This fit allows us to calculate a scale-dependent parameter λ², which represents the width of the log-normal distribution of the variances of the Gaussians. The physical implications of the obtained values of the parameter, as well as of its scaling law, are finally discussed.
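The Castaing construction can be checked numerically: build the PDF as a superposition of Gaussians whose standard deviations are log-normally distributed with width λ², then inspect its normalization and flatness. The λ² value below is an arbitrary illustration, not a fitted solar wind value.

```python
import numpy as np

# Castaing et al. (1990) model: the PDF of fluctuations is a superposition of
# Gaussians whose standard deviations sigma follow a log-normal of width lambda^2.
lam2 = 0.3                                            # assumed, illustrative
ln_s = np.linspace(-4, 4, 400) * np.sqrt(lam2)        # +-4 sigma in log-sigma
sig = np.exp(ln_s)
w = np.exp(-ln_s**2 / (2 * lam2)) / (sig * np.sqrt(2 * np.pi * lam2))

x = np.linspace(-15, 15, 3001)
gauss = np.exp(-x[None, :]**2 / (2 * sig[:, None]**2)) / (sig[:, None] * np.sqrt(2 * np.pi))
integrand = w[:, None] * gauss
# trapezoid rule over the (non-uniform) sigma grid
pdf = (0.5 * (integrand[1:] + integrand[:-1]) * np.diff(sig)[:, None]).sum(axis=0)

dx = x[1] - x[0]
norm = pdf.sum() * dx
kurt = (pdf * x**4).sum() * dx / ((pdf * x**2).sum() * dx) ** 2
print(round(norm, 3), round(kurt, 2))   # kurtosis well above the Gaussian value 3
```

The heavy, stretched-exponential-like tails produced by the log-normal weighting are exactly the small-scale non-Gaussian signature the fit exploits.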
Energy Technology Data Exchange (ETDEWEB)
Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)
2009-09-15
The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.
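As a minimal illustration of fitting an extreme value law to maximum pit depths, the sketch below generates synthetic per-segment maxima and fits a Gumbel distribution by the method of moments; the parent distribution and sample sizes are assumptions, not the authors' field data, and the Weibull/Fréchet alternatives would be fitted analogously.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for pit-depth data: the deepest of 200 pits per pipeline
# segment (mm); an exponential parent pushes the maxima toward a Gumbel law.
maxima = rng.exponential(0.5, size=(500, 200)).max(axis=1)

# Method-of-moments Gumbel fit: scale from the std, location from the mean.
euler_gamma = 0.5772156649
beta = maxima.std() * np.sqrt(6.0) / np.pi
mu = maxima.mean() - euler_gamma * beta

# Fitted exceedance probability at the empirical 90th percentile (~0.1 if good).
q = np.quantile(maxima, 0.9)
fitted = 1.0 - np.exp(-np.exp(-(q - mu) / beta))
print(round(mu, 2), round(beta, 2), round(fitted, 3))
```

Comparing fitted and empirical exceedance probabilities at a few quantiles is a quick check before committing to one of the three extreme value families.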
Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen
2010-04-01
The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed. (c) 2009 Elsevier Inc. All rights reserved.
Probability distribution for the Gaussian curvature of the zero level surface of a random function
Hannay, J. H.
2018-04-01
A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.
International Nuclear Information System (INIS)
Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki
1984-01-01
In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high-temperature, high-purity water. The following conclusions were reached: (1) The initiation process of intergranular stress corrosion cracking has been assumed to be approximated by a Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)
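The exponential life model in conclusion (2) is easy to exercise: under Poisson initiation, lifetimes are exponential and the maximum likelihood estimate of the rate is the reciprocal sample mean. The lifetimes below are synthetic, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "cracking lifetimes" (hours); under a Poisson initiation process
# the waiting time to the first crack is exponentially distributed.
life = rng.exponential(scale=800.0, size=60)

lam_hat = 1.0 / life.mean()   # exponential MLE of the initiation rate

# Empirical vs fitted survival probabilities at a few test times.
for t in (200.0, 800.0, 2000.0):
    print(f"t={t:6.0f} h  empirical={np.mean(life > t):.2f}  "
          f"fitted={np.exp(-lam_hat * t):.2f}")
```

Agreement between the empirical and fitted survival fractions over the observed time range is the basic check behind conclusion (3).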
Stefanescu, E. R.; Patra, A.; Sheridan, M. F.; Cordoba, G.
2012-04-01
event. Logistic regression - Here, we define A as a discrete r.v., while B is a continuous one. P(B) represents the probability of having a flow ≥ h_critical at location B, while P(A) represents the probability of having a flow or non-flow at A. Bayes analysis - At this stage of the analysis we consider only the r.v. A, where P(A) represents the probability of having a flow ≥ h_critical at location A. We are interested in observing how the probability of having a flow ≥ h_critical at location A changes when data from the model are taken into consideration. We assume a Beta prior distribution for P(A) and compute P(A|data) using a Maximum Likelihood Estimation (MLE) approach. Bayesian network for causal relationships - Here, we are interested in more than two critical locations, and a directed acyclic graph lets us incorporate the causal relationships between all the chosen locations. Marginal probabilities, along with the joint probability associated with an event, are computed based on the "causal links" between the variables.
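The Beta prior update described above is conjugate to binomial exceedance data, so the posterior is available in closed form. The sketch below uses hypothetical prior hyperparameters and exceedance counts, not values from the study.

```python
# Conjugate Bayes update sketch: a Beta prior on p = P(flow >= h_critical at A)
# updated with hypothetical model-run outcomes (k exceedances in n runs).
a, b = 2.0, 8.0                  # assumed Beta prior, mean 0.2
n_runs, k = 50, 18               # hypothetical simulation outcomes

post_a, post_b = a + k, b + (n_runs - k)     # Beta posterior parameters
post_mean = post_a / (post_a + post_b)
p_mle = k / n_runs                           # data-only MLE, for comparison

print(round(post_mean, 3), round(p_mle, 3))  # prior pulls the estimate toward 0.2
```

The posterior mean sits between the prior mean and the MLE, which is exactly the "probability changes when model data are taken into consideration" behaviour the analysis looks for.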
Simulation of Burn Probabilities and Fire Size Distributions for the Western United States
Finney, M.
2009-04-01
This simulation research was conducted on behalf of five U.S. land management agencies in order to develop a fire risk assessment system for the contiguous land area of the United States. The requirements included generating burn probabilities, characterizing fire behavior variation, and providing a means to evaluate sensitivity to both fire suppression and fuel treatment effects. This paper presents the methods and results of wildland fire size distributions and burn probabilities that were simulated for large units of land that together comprised the entire western United States. The outputs of these simulations are compared with historic data from Federal lands. The methods involved simulating fire ignition and growth for 10,000 to 20,000 "years" of artificial weather. The fire growth simulations were based on previously published methods (Finney 1998, 2002) and, when run repeatedly with different weather and ignition locations, produce fire behavior distributions at each landscape location (e.g. a "cell"). The artificial weather was generated using 1) a time-series analysis of recorded fire danger rating indices for each land unit that served as a proxy of daily and seasonal variation in fuel moisture, and 2) distributions of wind speed and direction from weather records in each unit. The simulations also required spatial data on fuel structure and topography which were provided by the LandFire project for the study area (http://www.landfire.gov). The occurrence and frequency of ignitions were simulated stochastically using empirical relationships that predicted the probability of large fire occurrence from the fire danger rating index. Fire suppression was represented using a modeling analysis of 453 large fires that was used to predict the probability of fire containment (by suppression forces) based on independent predictors of fire growth rates and fuel type. Fuel treatments were implemented into the fuel structure of the landscape to evaluate how these
International Nuclear Information System (INIS)
Vinogradov, S.
2012-01-01
Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown that is limited by a strong negative feedback. An SSPM can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their significance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
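The Borel model is straightforward to evaluate numerically. The sketch below computes the pmf, its mean, and the resulting ENF for an assumed crosstalk parameter μ, recovering the closed-form values mean = 1/(1−μ) and ENF = 1/(1−μ).

```python
import numpy as np

# Borel-distribution model of crosstalk: each Geiger discharge triggers a
# Poisson(mu) number of secondaries, so the total number of fired pixels per
# primary follows P(n) = (mu*n)**(n-1) * exp(-mu*n) / n!, n >= 1.
mu = 0.2                           # assumed mean crosstalk multiplicity
n = np.arange(1, 200)
log_fact = np.concatenate(([0.0], np.cumsum(np.log(np.arange(1.0, 200.0)))))  # log n!
p = np.exp((n - 1) * np.log(mu * n) - mu * n - log_fact[n])

mean = (n * p).sum()
var = (n**2 * p).sum() - mean**2
enf = 1.0 + var / mean**2          # excess noise factor of the multiplication

# Borel: mean = 1/(1-mu), and the ENF works out to 1/(1-mu) as well.
print(round(p.sum(), 6), round(mean, 4), round(enf, 4))
```

Working in log space (log-factorials rather than factorials) keeps the pmf numerically stable out to large n, which matters once μ approaches the critical value 1.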
Directory of Open Access Journals (Sweden)
Tong Yifei
2014-01-01
Crane is a mechanical device used widely to move materials in modern production. It is reported that the energy consumption in China is at least 5–8 times that of other developing countries. Thus, energy consumption becomes an unavoidable topic. There are several reasons for the energy loss, and the camber of the girder is one not to be neglected. In this paper, the problem of the deflections induced by a moving payload in the girder of an overhead travelling crane is examined. The evaluation of a camber giving a counterdeflection of the girder is proposed in order to obtain minimum energy consumption for the trolley moving along a non-straight support. To this aim, probabilistic payload distributions are considered instead of the fixed or rated loads used in other research. Taking a 50/10 t bridge crane as the research object, the probability loads are determined by analysis of load distribution density functions. According to the load distribution, camber design under different probability loads is discussed in detail, as well as the energy consumption distribution. The research results provide a design reference for a reasonable camber that obtains the least energy consumption for climbing corresponding to different P0; thus an energy-saving design can be achieved.
Topologically appropriate coordinates for (Vzz, η) joint probability distributions
Energy Technology Data Exchange (ETDEWEB)
Evenson, William E., E-mail: bill@evenson.ch; Adams, M.; Bunker, Austin; Hodges, Jeffery A.; Matheson, P. L., E-mail: phil.matheson@uvu.edu; Park, Tyler; Stufflebeam, Michael; Sullivan, Francis P. [Utah Valley University, Department of Physics (United States); Zacate, M. O. [Northern Kentucky University, Department of Physics and Geology (United States)
2016-12-15
Inhomogeneous broadening (IHB) of hyperfine interactions in materials arises from a distribution of electric field gradients (EFGs) due to randomly distributed defects contributing non-uniformly to the EFG at probe sites. Hyperfine experiments reflect the inhomogeneous distribution of defects through the joint probability distribution function (PDF) of Vzz and η determined by the defect concentration, crystal structure, and defect sites in the crystal. Czjzek showed how to choose coordinates in the (Vzz, η) plane that are consistent with the physical constraints and ordering convention for these EFG parameters. Here we show how to transform to a new set of coordinates that decreases the distortion inherent in Czjzek’s representation. These new coordinates allow one to express the joint PDF for random distributions of defects in a form reasonably approximated by the product of two independent marginal distributions. This paper focuses on these topologically appropriate coordinates, with simple examples drawn from Czjzek’s work and from our simulations of point defects in cubic lattices as well as random amorphous distributions of defects. Detailed simulations have been carried out for IHB in cubic structures and point charge models relevant to perturbed angular correlation (PAC) experiments.
Park, Tyler; Adams, Mike; Bunker, Austin; Hodges, Jeffery; Stufflebeam, Michael; Evenson, William; Matheson, Phil; Zacate, Matthew
2009-10-01
Materials contain defects, which affect crystal properties such as damping of the correlation signal, G2(t), in time and broadening of the frequency spectrum in perturbed angular correlation (PAC) experiments. We attribute this inhomogeneous broadening (IHB) to the random static defects that produce a distribution of electric field gradients (EFGs). Our goal is to find a relationship between the amount of broadening and the concentration of defects. After simulating the EFGs from random configurations of defects, we map our results from the Vzz-Vxx plane to a coordinate system optimized for the EFG distribution through a Czjzek transformation, followed by a conformal mapping. From histograms in this space, we can define probability distribution functions with parameters that vary according to defect concentration. This allows us to calculate the broadened G2(t) spectrum for any concentration, and, in reverse, identify concentrations given a broadened G2(t) spectrum.
New method for extracting tumors in PET/CT images based on the probability distribution
International Nuclear Information System (INIS)
Nitta, Shuhei; Hontani, Hidekata; Hukami, Tadanori
2006-01-01
In this report, we propose a method for extracting tumors from PET/CT images by referring to the probability distribution of pixel values in the PET image. In the proposed method, first, the organs that normally take up fluorodeoxyglucose (FDG) (e.g., the liver, kidneys, and brain) are extracted. Then, the tumors are extracted from the images. The distribution of pixel values in PET images differs in each region of the body. Therefore, the threshold for detecting tumors is adaptively determined by referring to the distribution. We applied the proposed method to 37 cases and evaluated its performance. This report also presents the results of experiments comparing the proposed method and another method in which the pixel values are normalized for extracting tumors. (author)
Roles of factorial noise in inducing bimodal gene expression
Liu, Peijiang; Yuan, Zhanjiang; Huang, Lifang; Zhou, Tianshou
2015-06-01
Some gene regulatory systems can exhibit bimodal distributions of mRNA or protein although the deterministic counterparts are monostable. This noise-induced bimodality is an interesting phenomenon and has important biological implications, but it is unclear how different sources of expression noise (each source creates so-called factorial noise that is defined as a component of the total noise) contribute separately to this stochastic bimodality. Here we consider a minimal model of gene regulation, which is monostable in the deterministic case. Although simple, this system contains factorial noise of two main kinds: promoter noise due to switching between gene states and transcriptional (or translational) noise due to synthesis and degradation of mRNA (or protein). To better trace the roles of factorial noise in inducing bimodality, we also analyze two limit models, continuous and adiabatic approximations, apart from the exact model. We show that in the case of slow gene switching, the continuous model where only promoter noise is considered can exhibit bimodality; in the case of fast switching, the adiabatic model where only transcriptional or translational noise is considered can also exhibit bimodality but the exact model cannot; and in other cases, both promoter noise and transcriptional or translational noise can cooperatively induce bimodality. Since slow gene switching and large protein copy numbers are characteristics of eukaryotic cells, whereas fast gene switching and small protein copy numbers are characteristics of prokaryotic cells, we infer that eukaryotic stochastic bimodality is induced mainly by promoter noise, whereas prokaryotic stochastic bimodality is induced primarily by transcriptional or translational noise.
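The promoter-noise route to bimodality can be reproduced with a small Gillespie simulation of the telegraph model; all rates below are assumptions chosen to put the system in the slow-switching regime, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Gillespie simulation of the minimal telegraph gene model: the promoter
# toggles ON/OFF while mRNA is made only in the ON state. Slow switching
# (k_on, k_off << d) should produce a bimodal copy-number histogram even
# though the deterministic model is monostable. Rates are assumed.
k_on, k_off, s, d = 0.05, 0.05, 20.0, 1.0

def run(t_end=2000.0, burn_in=100.0):
    t, g, m = 0.0, 0, 0
    t_next, samples = burn_in, []
    while t < t_end:
        rates = np.array([k_on * (1 - g), k_off * g, s * g, d * m])
        total = rates.sum()
        dt = rng.exponential(1.0 / total)
        while t_next < t + dt and t_next < t_end:
            samples.append(m)          # record the held state on a regular grid
            t_next += 1.0
        t += dt
        r = rng.choice(4, p=rates / total)
        if r == 0:   g = 1             # promoter switches ON
        elif r == 1: g = 0             # promoter switches OFF
        elif r == 2: m += 1            # transcription
        else:        m -= 1            # mRNA degradation
    return np.array(samples)

m = run()
low, high = np.mean(m < 5), np.mean(m > 12)
print(round(low, 2), round(high, 2))   # both modes (near 0 and near s/d = 20) populated
```

Speeding up k_on and k_off by a couple of orders of magnitude collapses the two modes into a single peak near the ON-fraction-weighted mean, which is the fast-switching (adiabatic) regime discussed above.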
Dynamical and statistical bimodality in nuclear fragmentation
Mallik, S.; Chaudhuri, G.; Gulminelli, F.
2018-02-01
The origin of bimodal behavior in the residue distribution experimentally measured in heavy ion reactions is reexamined using Boltzmann-Uehling-Uhlenbeck simulations. We suggest that, depending on the incident energy and impact parameter of the reaction, both entrance channel and exit channel effects can be at the origin of the observed behavior. Specifically, fluctuations in the reaction mechanism induced by fluctuations in the collision rate, as well as thermal bimodality directly linked to the nuclear liquid-gas phase transition, are observed in our simulations. Both phenomenologies were previously proposed in the literature but presented as incompatible and contradictory interpretations of the experimental measurements. These results indicate that heavy ion collisions at intermediate energies can be viewed as a powerful tool to study both bifurcations induced by out-of-equilibrium critical phenomena, as well as finite-size precursors of thermal phase transitions.
Guo, L. M.; Zhu, H. B.; Zhang, N. X.
The probability density distribution of the traffic density is analyzed based on empirical data. It is found that the beta distribution fits the measured traffic density remarkably well. A modified traffic model is then proposed to simulate microscopic traffic flow, in which the probability density distribution of the traffic density is taken into account. The model also captures drivers’ speed adaptation by taking into account differences in driving behavior and the dynamic headway. Along with the flux-density diagrams, the velocity evolution diagrams and the spatial-temporal profiles of vehicles are also given. The synchronized flow phase and the wide moving jam phase are reproduced, which is a challenge for cellular automata traffic models. Furthermore, the phenomenon of high-speed car-following is exhibited, which has previously been observed in measured data. The results demonstrate the effectiveness of the proposed model in capturing the complicated dynamic phenomena of traffic flow.
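A beta fit of the kind described can be done by the method of moments; the "measured" densities below are synthetic draws, used only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)

# Method-of-moments beta fit to densities scaled into [0, 1]; the data here
# are synthetic stand-ins for measured traffic occupancies.
rho = rng.beta(2.0, 5.0, size=2000)

m, v = rho.mean(), rho.var()
# Invert mean = a/(a+b) and var = ab/((a+b)^2 (a+b+1)) for a and b.
common = m * (1.0 - m) / v - 1.0
a_hat, b_hat = m * common, (1.0 - m) * common
print(round(a_hat, 2), round(b_hat, 2))   # near the generating (2, 5)
```

The fitted beta can then be sampled to seed the initial densities of a microscopic simulation, which is how the distribution enters the modified model.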
Nuijten, M J
1999-07-01
The aim of the present study is to describe a refinement of a previously presented method, based on the concept of point sensitivity, for dealing with uncertainty in economic studies. The original method was refined by the incorporation of probability distributions, which allow a more accurate assessment of the level of uncertainty in the model. In addition, a bootstrap method was used to create a probability distribution for a fixed input variable based on a limited number of data points. The original method was limited in that the sensitivity measurement was based on a uniform distribution of the variables, and the overall sensitivity measure was based on a subjectively chosen range, which excludes the impact of values outside the range on the overall sensitivity. The concepts of the refined method were illustrated using a Markov model of depression. The application of the refined method substantially changed the ranking of the most sensitive variables compared with the original method: the response rate became the most sensitive variable instead of the 'per diem' for hospitalisation. The refinement of the original method yields sensitivity outcomes which better reflect the real uncertainty in economic studies.
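The bootstrap step, creating a probability distribution for a fixed input variable from a limited number of data points, can be sketched as follows; the twelve observed response rates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

# Bootstrap a distribution for a model input from a small sample, instead of
# assuming a uniform range. The observed response rates are hypothetical.
obs = np.array([0.52, 0.47, 0.61, 0.55, 0.43, 0.58,
                0.50, 0.49, 0.63, 0.45, 0.57, 0.54])

# Resample with replacement many times; the resampled means form the
# input distribution fed into the sensitivity analysis.
boot_means = np.array([rng.choice(obs, size=obs.size, replace=True).mean()
                       for _ in range(10000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(round(obs.mean(), 3), round(lo, 3), round(hi, 3))
```

The percentile interval replaces the subjectively chosen range of the original method with one derived from the data themselves.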
Impact of spike train autostructure on probability distribution of joint spike events.
Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl
2013-05-01
The discussion whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that were suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability for false positives if a different process type is assumed than was actually present. We also determine how manipulations of such spike trains, here the dithering used for the generation of surrogate data, change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the structure of the spike trains as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing.
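A Monte Carlo estimate of the coincidence count distribution between independent gamma renewal trains, of the kind used above, can be sketched as follows; the rates, shape parameter, coincidence window, and trial count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two independent gamma renewal spike trains (shape k controls regularity;
# k=1 is Poisson). Coincidences are counted over many trials to estimate the
# count distribution and its Fano factor. Parameters are illustrative.
def gamma_train(rate, k, T):
    isi = rng.gamma(k, 1.0 / (rate * k), size=int(rate * T * 2) + 50)
    t = np.cumsum(isi)
    return t[t < T]

def coincidences(k, T=100.0, rate=10.0, win=0.005):
    a, b = gamma_train(rate, k, T), gamma_train(rate, k, T)
    # a-spikes with a partner in b within +-win (check both neighbours)
    idx = np.searchsorted(b, a)
    left = np.abs(a - b[np.clip(idx - 1, 0, b.size - 1)]) <= win
    right = np.abs(a - b[np.clip(idx, 0, b.size - 1)]) <= win
    return np.count_nonzero(left | right)

counts = np.array([coincidences(k=4.0) for _ in range(300)])
ff = counts.var() / counts.mean()   # Fano factor of the coincidence count
print(round(counts.mean(), 1), round(ff, 2))
```

Repeating this with different shape parameters k (i.e. different CVs) traces out the CV dependence of the count distribution discussed above.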
Discrete coherent states and probability distributions in finite-dimensional spaces
Energy Technology Data Exchange (ETDEWEB)
Galetti, D.; Marchiolli, M.A.
1995-06-01
Operator bases are discussed in connection with the construction of phase space representatives of operators in finite-dimensional spaces, and their properties are presented. It is also shown how these operator bases allow for the construction of a finite harmonic oscillator-like coherent state. Creation and annihilation operators for the finite-dimensional Fock space are discussed and their expressions in terms of the operator bases are explicitly written. The relevant finite-dimensional probability distributions are obtained, and their limiting behavior for an infinite-dimensional space is calculated and agrees with the well-known results. (author). 20 refs, 2 figs.
On the Meta Distribution of Coverage Probability in Uplink Cellular Networks
Elsawy, Hesham
2017-04-07
This letter studies the meta distribution of the coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP) for modeling the spatial locations of base stations (BSs), we obtain the percentiles of users that achieve a target uplink CP over an arbitrary, but fixed, realization of the PPP. To this end, the effects of the user activity factor (p) and the path-loss compensation factor on the uplink performance are analyzed. The results show that decreasing p and/or increasing the compensation factor reduces the CP variation around the spatially averaged value.
Exact probability distribution function for multifractal random walk models of stocks
Saakian, D. B.; Martirosyan, A.; Hu, Chin-Kun; Struzik, Z. R.
2011-07-01
We investigate the multifractal random walk (MRW) model, popular in the modelling of stock fluctuations in the financial market. The exact probability distribution function (PDF) is derived by employing methods proposed in the derivation of correlation functions in string theory, including the analytical extension of Selberg integrals. We show that the recent results by Y. V. Fyodorov, P. Le Doussal and A. Rosso obtained with the logarithmic Random Energy Model (REM) model are sufficient to derive exact formulas for the PDF of the log returns in the MRW model.
Study of the SEMG probability distribution of the paretic tibialis anterior muscle
International Nuclear Information System (INIS)
Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B
2007-01-01
The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process with zero mean. It has been experimentally shown that this probability distribution can be fitted with less error by a Laplacian-type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In the case of subjects with lesions of the upper motor neuron, the lack of central control affects the muscular tone, the force, and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones, in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function best adjusts to the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.
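The Gaussian vs Laplacian comparison can be phrased as a maximum likelihood contest on the amplitude sample; the "SEMG" data below are synthetic Laplacian draws, so this only shows the mechanics of the comparison.

```python
import numpy as np

rng = np.random.default_rng(8)

# Compare zero-mean Gaussian and Laplacian fits to an amplitude sample by
# maximum likelihood; the data here are synthetic Laplacian draws.
x = rng.laplace(0.0, 0.1, size=5000)

sigma = x.std()                   # Gaussian MLE of the scale (zero mean)
b = np.abs(x).mean()              # Laplacian MLE of the scale
ll_gauss = -0.5 * np.log(2 * np.pi * sigma**2) * x.size - (x**2).sum() / (2 * sigma**2)
ll_lap = -np.log(2 * b) * x.size - np.abs(x).sum() / b

print(ll_lap > ll_gauss)          # the Laplacian should win on these data
```

On real SEMG segments the same log-likelihood comparison (or an information criterion built on it) selects the distribution, and with it the appropriate amplitude estimator.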
Probability distribution functions for intermittent scrape-off layer plasma fluctuations
Theodorsen, A.; Garcia, O. E.
2018-03-01
A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a superposition of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive definite signal, which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes which are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus estimating model parameters requires an approach based on the characteristic function, not the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF and characteristic function of the process is investigated, and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
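For the baseline case with exponentially distributed amplitudes the model can be simulated directly: a filtered Poisson process with one-sided exponential pulses of duration τ arriving at rate ν has a stationary Gamma PDF with shape γ = ντ. The parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Filtered Poisson process: one-sided exponential pulses of duration tau,
# Poisson arrivals at rate nu, exponential amplitudes of mean 1. For this
# choice the stationary PDF is Gamma with shape g = nu*tau. Illustrative values.
nu, tau, T, dt = 0.5, 1.0, 5000.0, 0.1
g = nu * tau

arr = np.cumsum(rng.exponential(1.0 / nu, size=int(nu * T * 2)))
arr = arr[arr < T]
amps = rng.exponential(1.0, size=arr.size)

# Deposit each pulse in its arrival bin, then apply exponential decay per step.
nbins = int(T / dt)
inject = np.zeros(nbins)
np.add.at(inject, (arr / dt).astype(int), amps)
decay = np.exp(-dt / tau)
sig = np.empty(nbins)
acc = 0.0
for i in range(nbins):
    acc = acc * decay + inject[i]
    sig[i] = acc

s = sig[1000:]    # discard the start-up transient
print(round(s.mean(), 2), round(s.var() / s.mean()**2, 2))   # ~g and ~1/g
```

Replacing the exponential amplitude draw with a two-sided (e.g. Laplacian) one produces the non-positive-definite signals discussed above, for which the characteristic-function-based estimation becomes necessary.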
Alimi, Jean-Michel; de Fromont, Paul
2018-04-01
The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.
Bimodal metal micro-nanopowders for powder injection molding
Pervikov, Aleksandr; Rodkevich, Nikolay; Glazkova, Elena; Lerner, Marat
2017-12-01
The paper studies a bimodal metal powder composition designed to prepare feedstock for powder injection molding, as well as the microstructure and porosity of sintered parts. Two kinds of metal powder compositions are used: a mixture of micro- and nanopowders, and a bimodal powder prepared by dispersion of steel wire. The feedstock is prepared by mixing a bimodal metal powder composition with acetylacetone and paraffin wax. The microstructure of the debound parts is observed by scanning electron microscopy. The sintered parts are characterized by density measurements and metallographic analysis. The method used to prepare the metal powder composition is shown to affect the characteristics of the sintered parts. When micro- and nanopowders are mixed, nanoparticles are found in the interstitial spaces among the microparticles, whereas in the bimodal powder the nanoparticles are regularly distributed on the surface of the microparticles, which reduces the porosity of the sintered parts and increases their density toward that of bulk steel.
Reactive Sintering of Bimodal WC-Co Hardmetals
Directory of Open Access Journals (Sweden)
Marek Tarraste
2015-09-01
Full Text Available Bimodal WC-Co hardmetals were produced using a novel technology, reactive sintering. Milled and activated tungsten and graphite powders were mixed with a commercial coarse-grained WC-Co powder and then sintered. The microstructure of the produced materials was free of defects and consisted of evenly distributed coarse and fine tungsten carbide grains in a cobalt binder. The microstructure, hardness, and fracture toughness of the reactive-sintered bimodal WC-Co hardmetals are presented. The developed bimodal hardmetal is promising for demanding wear applications owing to its combination of increased hardness and toughness. Compared to the coarse material there is only a slight decrease in fracture toughness (K1c is 14.7 for the coarse-grained and 14.4 for the bimodal material), while hardness increases from 1290 to 1350 HV. DOI: http://dx.doi.org/10.5755/j01.ms.21.3.7511
Inhomogeneous broadening of PAC spectra with Vzz and η joint probability distribution functions
International Nuclear Information System (INIS)
Evenson, W. E.; Adams, M.; Bunker, A.; Hodges, J.; Matheson, P.; Park, T.; Stufflebeam, M.; Zacate, M. O.
2013-01-01
The perturbed angular correlation (PAC) spectrum, G₂(t), is broadened by the presence of randomly distributed defects in crystals due to a distribution of electric field gradients (EFGs) experienced by probe nuclei. Heuristic approaches to fitting spectra that exhibit such inhomogeneous broadening (ihb) consider only the distribution of EFG magnitudes V_zz, but the physical effect actually depends on the joint probability distribution function (pdf) of V_zz and the EFG asymmetry parameter η. The difficulty in determining the joint pdf leads us to more appropriate representations of the EFG coordinates, and to express the joint pdf as the product of two approximately independent pdfs describing each coordinate separately. We have pursued this case in detail using as an initial illustration of the method a simple point defect model with nuclear spin I = 5/2 in several cubic lattices, where G₂(t) is primarily induced by a defect trapped in the first neighbor shell of a probe and broadening is due to defects distributed at random outside the first neighbor shell. Effects such as lattice relaxation are ignored in this simple test of the method. The simplicity of our model is suitable for gaining insight into ihb with more than V_zz alone. We simulate ihb in this simple case by averaging the net EFGs of 20,000 random defect arrangements, resulting in a broadened average G₂(t). The 20,000 random cases provide a distribution of EFG components which are first transformed to Czjzek coordinates and then further into the full Czjzek half plane by conformal mapping. The topology of this transformed space yields an approximately separable joint pdf for the EFG components. We then fit the nearly independent pdfs and reconstruct G₂(t) as a function of defect concentration. We report results for distributions of defects on simple cubic, face-centered cubic, and body-centered cubic lattices. The method explored here for analyzing ihb is applicable to more realistic cases.
Directory of Open Access Journals (Sweden)
Jayajit Das
2015-07-01
Full Text Available A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution for Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach for both discrete and continuous probability distributions and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
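The abstract's forward example (m = 2, n = 1) can be checked numerically. This sketch is not the paper's MaxEnt method; it only verifies the worked example that X = Y1 + Y2, with independent Uniform[0, 1] inputs, has the triangular density f(x) = x on [0, 1] and 2 − x on [1, 2].

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo sample of X = Y1 + Y2 with Y1, Y2 ~ Uniform[0, 1], independent.
y = rng.uniform(size=(2, 100000))
x = y.sum(axis=0)

# Compare a normalized histogram with the known triangular density.
hist, edges = np.histogram(x, bins=20, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
triangular = np.where(centers < 1.0, centers, 2.0 - centers)
max_err = np.max(np.abs(hist - triangular))
```

Because the density is piecewise linear, each bin's average equals the value at the bin center, so the histogram converges to `triangular` as the sample grows.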
Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.
Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon
2016-01-01
Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as Poisson processes, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that, compared to distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.
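The Monte Carlo idea above can be sketched with a toy version (not the authors' model): gamma-distributed inter-spike intervals give a renewal process with refractory-like history dependence (shape = 1 recovers Poisson), and repeated simulation yields an empirical coincidence-count distribution. Rates, durations, and the coincidence window are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

def spike_train(rate, duration, shape, rng):
    """Spike times with gamma-distributed ISIs; shape=1 is Poisson,
    shape>1 mimics a relative refractory period."""
    n_max = int(3 * rate * duration) + 50
    isi = rng.gamma(shape, 1.0 / (shape * rate), size=n_max)
    t = np.cumsum(isi)
    return t[t < duration]

def coincidence_count(a, b, w):
    """Spikes of train a having at least one partner in sorted train b within +/-w."""
    idx = np.searchsorted(b, a)
    lo = np.clip(idx - 1, 0, b.size - 1)
    hi = np.clip(idx, 0, b.size - 1)
    return int(np.sum((np.abs(a - b[lo]) <= w) | (np.abs(a - b[hi]) <= w)))

# Monte Carlo estimate of the coincidence-count distribution for two
# independent non-Poisson (gamma, shape=4) trains: 20 Hz, 10 s, +/-5 ms window.
counts = np.array([coincidence_count(spike_train(20.0, 10.0, 4.0, rng),
                                     spike_train(20.0, 10.0, 4.0, rng), 0.005)
                   for _ in range(300)])
```

Re-running with `shape=1.0` and comparing `counts.std()` illustrates the abstract's point: the autostructure alone changes the width of the coincidence distribution even when the firing rate is held fixed.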
International Nuclear Information System (INIS)
Booth, J.T.; Zavgorodni, S.F.; Royal Adelaide Hospital, SA
2001-01-01
Uncertainty in the precise quantity of radiation dose delivered to tumours in external beam radiotherapy is present due to many factors, and can result in either spatially uniform (Gaussian) or spatially non-uniform dose errors. These dose errors are incorporated into the calculation of tumour control probability (TCP) and produce a distribution of possible TCP values over a population. We also study the effect of inter-patient cell sensitivity heterogeneity on the population distribution of patient TCPs. This study aims to investigate the relative importance of these three uncertainties (spatially uniform dose uncertainty, spatially non-uniform dose uncertainty, and inter-patient cell sensitivity heterogeneity) on the delivered dose and TCP distribution following a typical course of fractionated external beam radiotherapy. The dose distributions used for patient treatments are modelled in one dimension. Geometric positioning uncertainties during and before treatment are considered as shifts of a pre-calculated dose distribution. Following the simulation of a population of patients, distributions of dose across the patient population are used to calculate mean treatment dose, standard deviation in mean treatment dose, mean TCP, standard deviation in TCP, and TCP mode. These parameters are calculated with each of the three uncertainties included separately. The calculations show that the dose errors in the tumour volume are dominated by the spatially uniform component of dose uncertainty. This could be related to machine specific parameters, such as linear accelerator calibration. TCP calculation is affected dramatically by inter-patient variation in the cell sensitivity and to a lesser extent by the spatially uniform dose errors. The positioning errors with the 1.5 cm margins used cause dose uncertainty outside the tumour volume and have a small effect on mean treatment dose (in the tumour volume) and tumour control. Copyright (2001) Australasian College of
Yampolsky, M; Salafia, C M; Shlakhter, O
2013-06-01
While the mean shape of the human placenta is round with a centrally inserted umbilical cord, significant deviations from this ideal are fairly common and may be clinically meaningful. Traditionally, they are explained by trophotropism. We have proposed a hypothesis explaining typical variations in placental shape by randomly determined fluctuations in the growth process of the vascular tree. It has been recently reported that umbilical cord displacement in a birth cohort has a log-normal probability distribution, which indicates that the displacement between an initial point of origin and the centroid of the mature shape is a result of accumulation of random fluctuations of the dynamic growth of the placenta. To confirm this, we investigate statistical distributions of other features of placental morphology. In a cohort of 1023 term births, digital photographs of placentas were recorded at delivery. Excluding cases with velamentous cord insertion or missing clinical data left 1001 (97.8%) for which placental surface morphology features were measured. Best-fit statistical distributions for them were obtained using EasyFit. The best-fit distributions of umbilical cord displacement, placental disk diameter, area, perimeter, and maximal radius calculated from the cord insertion point are of heavy-tailed type, similar in shape to log-normal distributions. This is consistent with a stochastic origin of deviations of placental shape from normal. Deviations of placental shape descriptors from average have heavy-tailed distributions similar in shape to log-normal. This evidence points away from trophotropism, and towards a spontaneous stochastic evolution of the variants of placental surface shape features. Copyright © 2013 Elsevier Ltd. All rights reserved.
Shen, Xiaojing; Sun, Junying; Kivekäs, Niku; Kristensson, Adam; Zhang, Xiaoye; Zhang, Yangmei; Zhang, Lu; Fan, Ruxia; Qi, Xuefei; Ma, Qianli; Zhou, Huaigang
2018-01-01
In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated using the NanoMap method, based on particle number size distribution (PNSD) data and air mass back trajectories. The lengths of the datasets used were 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region in eastern China, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed around the site within 100-200 km. Differences in the horizontal spatial distribution of new particle source areas at the different sites were connected to typical meteorological conditions at the sites. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with higher particle growth rates (GR) and new particle formation rates (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared to those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends on the length of the dataset of PNSD measurements but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus long-term measurements of PNSD in the planetary boundary layer are necessary in the further study of the spatial extent and probability of NPF events.
DEFF Research Database (Denmark)
Yura, Harold; Hanson, Steen Grüner
2012-01-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
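The two-step recipe described above (spectral coloring of white Gaussian noise, then a memoryless inverse transform of the marginal) can be sketched in one dimension for brevity. The PSD shape and the exponential target distribution are hypothetical stand-ins, and the exact Gaussian-CDF step is approximated here by an empirical rank (probability integral) transform.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096

# Step 1: color white Gaussian noise in the Fourier domain to impose a
# chosen PSD; the filter is the square root of a hypothetical 1/(1+(10 f)^2) PSD.
white = rng.standard_normal(n)
f = np.fft.rfftfreq(n, d=1.0)
amplitude = 1.0 / np.sqrt(1.0 + (10.0 * f) ** 2)
colored = np.fft.irfft(np.fft.rfft(white) * amplitude, n)

# Step 2: map the colored Gaussian marginal onto a target amplitude
# distribution (exponential, mean 1) via rank transform + target inverse CDF.
u = (np.argsort(np.argsort(colored)) + 0.5) / n   # empirical CDF values in (0, 1)
signal = -np.log(1.0 - u)                         # inverse CDF of Exp(1)
```

The rank transform preserves the ordering (and hence much of the correlation structure) of the colored field while forcing the marginal to be exactly the target distribution up to discretization.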
Modeling the probability distribution of positional errors incurred by residential address geocoding
Directory of Open Access Journals (Sweden)
Mazumdar Soumya
2007-01-01
Full Text Available Background: The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results: Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion: Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.
On the probability distribution of daily streamflow in the United States
Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.
2017-06-01
Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for the period of record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
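The flow duration curve described above is straightforward to construct empirically; this is a generic sketch (not the authors' kappa-distribution fitting), using a Weibull plotting position and a hypothetical lognormal sample standing in for a year of daily flows.

```python
import numpy as np

rng = np.random.default_rng(3)

def flow_duration_curve(q):
    """Empirical FDC: flows sorted in descending order against exceedance
    probability p_i = i / (n + 1) (Weibull plotting position)."""
    q_sorted = np.sort(q)[::-1]
    p = np.arange(1, q.size + 1) / (q.size + 1)
    return p, q_sorted

# Hypothetical daily flows spanning several orders of magnitude.
flows = rng.lognormal(mean=2.0, sigma=1.5, size=365)
p, q = flow_duration_curve(flows)
```

Reading the curve at, say, p = 0.95 gives the flow equaled or exceeded 95% of the time, which is how FDC quantiles are typically reported; a fitted distribution's quantile function can then be compared against `(p, q)` point by point.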
Various models for pion probability distributions from heavy-ion collisions
International Nuclear Information System (INIS)
Mekjian, A.Z.; Mekjian, A.Z.; Schlei, B.R.; Strottman, D.; Schlei, B.R.
1998-01-01
Various models for pion multiplicity distributions produced in relativistic heavy-ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting-source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparisons of the pion laser model with Bose-Einstein condensation in a laser trap and with the thermal model are made. The thermal model and hydrodynamic model are also used to illustrate why the number of pions never diverges and why the Bose-Einstein correction effects are relatively small. The pion emission strength η of a Poisson emitter and a critical density η_c are connected in a thermal model by η/η_c = e^(−m/T) < 1, and this fact reduces any Bose-Einstein correction effects in the number and number fluctuation of pions. Fluctuations can be much larger than Poisson in the pion laser model and for a negative binomial description. The clan representation of the negative binomial distribution due to Van Hove and Giovannini is discussed using the present description. Applications to CERN/NA44 and CERN/NA49 data are discussed in terms of the relativistic hydrodynamic model. copyright 1998 The American Physical Society
Directory of Open Access Journals (Sweden)
Han Liwei
2014-07-01
Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions, and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. An improved algorithm for the cloud probability distribution density based on a backward cloud generator was then proposed. This was used to effectively convert parcels of accurate data into concepts that can be described by proper qualitative linguistic values. Such qualitative description was expressed by the cloud numerical characteristics {Ex, En, He}, which can represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. The experimental results proved that the proposed algorithm is feasible and can reveal the changing regularity of the piezometric tube's water level and identify seepage damage in the dam body.
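One commonly cited form of the backward cloud generator estimates {Ex, En, He} from a sample via its mean, mean absolute deviation, and variance. The sketch below shows that textbook version, not the paper's improved algorithm, and the monitoring series is a hypothetical normal cloud.

```python
import numpy as np

def backward_cloud(x):
    """Textbook backward cloud generator (sketch): estimate the cloud
    numerical characteristics {Ex, En, He} from a data sample."""
    ex = x.mean()                                   # expectation Ex
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()  # entropy En
    he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))    # hyper-entropy He
    return ex, en, he

rng = np.random.default_rng(4)
# Hypothetical monitoring series: a normal cloud with Ex = 10, En = 2, He = 0.
sample = rng.normal(10.0, 2.0, size=50000)
ex, en, he = backward_cloud(sample)
```

For a pure Gaussian sample He is near zero (the `max(..., 0.0)` guards against a slightly negative variance difference); a positive He would indicate drop-to-drop dispersion of the entropy, i.e. a "thicker" cloud.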
Bažant, Zdeněk P.; Le, Jia-Liang; Bazant, Martin Z.
2009-01-01
The failure probability of engineering structures such as aircraft, bridges, dams, nuclear structures, and ships, as well as microelectronic components and medical implants, must be kept extremely low, typically … The theory for the strength cdf of quasibrittle structures is refined by deriving it from fracture mechanics of nanocracks propagating by small, activation-energy-controlled, random jumps through the atomic lattice. This refinement also provides a plausible physical justification of the power law for subcritical creep crack growth, hitherto considered empirical. The theory is further extended to predict the cdf of structural lifetime at constant load, which is shown to be size- and geometry-dependent. The size effects on structure strength and lifetime are shown to be related, and the latter to be much stronger. The theory fits previously unexplained deviations of experimental strength and lifetime histograms from the Weibull distribution. Finally, a boundary layer method for numerical calculation of the cdf of structural strength and lifetime is outlined. PMID:19561294
Binomial moments of the distance distribution and the probability of undetected error
Energy Technology Data Exchange (ETDEWEB)
Barg, A. [Lucent Technologies, Murray Hill, NJ (United States). Bell Labs.; Ashikhmin, A. [Los Alamos National Lab., NM (United States)
1998-09-01
In [1] K.A.S. Abdel-Ghaffar derives a lower bound on the probability of undetected error for unrestricted codes. The proof relies implicitly on the binomial moments of the distance distribution of the code. The authors use the fact that these moments count the size of subcodes of the code to give a very simple proof of the bound in [1] by showing that it is essentially equivalent to the Singleton bound. They discuss some combinatorial connections revealed by this proof. They also discuss some improvements of this bound. Finally, they analyze asymptotics. They show that an upper bound on the undetected error exponent that corresponds to the bound of [1] improves known bounds on this function.
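The quantity being bounded above has a simple closed form for a linear code on a binary symmetric channel: with weight distribution A_i, the probability of undetected error is P_ue(p) = Σ_{i≥1} A_i p^i (1−p)^(n−i). This sketch just evaluates that sum for the [3, 1] repetition code (a hypothetical example, not from the paper).

```python
import numpy as np

def p_undetected(A, p):
    """Undetected-error probability of a linear code on a BSC with crossover p,
    given its weight distribution A = [A_0, A_1, ..., A_n] (A_0 = 1)."""
    n = len(A) - 1
    i = np.arange(n + 1)
    total = np.sum(np.asarray(A, dtype=float) * p ** i * (1.0 - p) ** (n - i))
    return float(total - (1.0 - p) ** n)  # drop the i = 0 (no-error) term

# [3, 1] repetition code: codewords 000 and 111, so A = [1, 0, 0, 1].
p = 0.1
pue = p_undetected([1, 0, 0, 1], p)
```

An undetected error occurs only when the error pattern equals a nonzero codeword, so for the repetition code P_ue = p^3; more generally P_ue = O(p^d) where d is the minimum distance, which is why the binomial moments of the distance distribution control the bound.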
Wenger, Seth J; Freeman, Mary C
2008-10-01
Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
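The hierarchical structure described above can be sketched by simulating its data-generating process (a toy version, not the authors' fitted model; all parameter values are hypothetical): site abundance is zero-inflated Poisson, and each replicate count is binomial to represent imperfect detection.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical parameters: occupancy psi, mean abundance lambda at occupied
# sites, per-individual detection probability p, with J replicate visits.
n_sites, n_reps = 2000, 3
psi, lam, p_det = 0.6, 4.0, 0.5

# Latent state: abundance N_i is zero-inflated Poisson.
occupied = rng.uniform(size=n_sites) < psi
N = np.where(occupied, rng.poisson(lam, n_sites), 0)

# Observation model: each visit records Binomial(N_i, p) detected individuals.
counts = rng.binomial(N[:, None], p_det, size=(n_sites, n_reps))

# Moment check: the marginal mean count is psi * lambda * p.
mean_count = counts.mean()
```

The replicate visits are what make psi, lambda, and p separately identifiable: zeros at a site across all visits can come either from true absence (the zero-inflation) or from repeated non-detection, and the within-site variation across visits disentangles the two.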
Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.
2012-01-01
1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest, the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient R package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities
Energy Technology Data Exchange (ETDEWEB)
Evenson, W. E.; Adams, M.; Bunker, A.; Hodges, J.; Matheson, P.; Park, T.; Stufflebeam, M. [Utah Valley University, Department of Physics (United States); Zacate, M. O., E-mail: zacatem1@nku.edu [Northern Kentucky University, Department of Physics and Geology (United States)
2013-05-15
The perturbed angular correlation (PAC) spectrum, G₂(t), is broadened by the presence of randomly distributed defects in crystals due to a distribution of electric field gradients (EFGs) experienced by probe nuclei. Heuristic approaches to fitting spectra that exhibit such inhomogeneous broadening (ihb) consider only the distribution of EFG magnitudes V_zz, but the physical effect actually depends on the joint probability distribution function (pdf) of V_zz and the EFG asymmetry parameter η. The difficulty in determining the joint pdf leads us to more appropriate representations of the EFG coordinates, and to express the joint pdf as the product of two approximately independent pdfs describing each coordinate separately. We have pursued this case in detail using as an initial illustration of the method a simple point defect model with nuclear spin I = 5/2 in several cubic lattices, where G₂(t) is primarily induced by a defect trapped in the first neighbor shell of a probe and broadening is due to defects distributed at random outside the first neighbor shell. Effects such as lattice relaxation are ignored in this simple test of the method. The simplicity of our model is suitable for gaining insight into ihb with more than V_zz alone. We simulate ihb in this simple case by averaging the net EFGs of 20,000 random defect arrangements, resulting in a broadened average G₂(t). The 20,000 random cases provide a distribution of EFG components which are first transformed to Czjzek coordinates and then further into the full Czjzek half plane by conformal mapping. The topology of this transformed space yields an approximately separable joint pdf for the EFG components. We then fit the nearly independent pdfs and reconstruct G₂(t) as a function of defect concentration. We report results for distributions of defects on simple cubic, face-centered cubic, and body-centered cubic lattices. The method explored here for analyzing ihb is
Lee, T. S.; Yoon, S.; Jeong, C.
2012-12-01
The primary purpose of frequency analysis in hydrology is to estimate the magnitude of an event with a given frequency of occurrence. The precision of frequency analysis depends on the selection of an appropriate probability distribution model (PDM) and parameter estimation techniques. A number of PDMs have been developed to describe the probability distribution of the hydrological variables. For each of the developed PDMs, estimated parameters are provided based on alternative estimation techniques, such as the method of moments (MOM), probability weighted moments (PWM), linear function of ranked observations (L-moments), and maximum likelihood (ML). Generally, the results using ML are more reliable than the other methods. However, the ML technique is more laborious than the other methods because an iterative numerical solution, such as the Newton-Raphson method, must be used for the parameter estimation of PDMs. In the meantime, meta-heuristic approaches have been developed to solve various engineering optimization problems (e.g., linear and stochastic, dynamic, nonlinear). These approaches include genetic algorithms, ant colony optimization, simulated annealing, tabu searches, and evolutionary computation methods. Meta-heuristic approaches use a stochastic random search instead of a gradient search so that intricate derivative information is unnecessary. Therefore, the meta-heuristic approaches have been shown to be a useful strategy to solve optimization problems in hydrology. A number of studies focus on using meta-heuristic approaches for estimation of hydrological variables with parameter estimation of PDMs. Applied meta-heuristic approaches offer reliable solutions but use more computation time than derivative-based methods. Therefore, the purpose of this study is to enhance the meta-heuristic approach for the parameter estimation of PDMs by using a recently developed algorithm known as a harmony search (HS). The performance of the HS is compared to the
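As a concrete illustration of the idea, the sketch below minimises a negative log-likelihood with a bare-bones harmony search; the Gumbel example and all tuning constants (harmony memory size, HMCR, PAR, bandwidth) are illustrative defaults, not the study's settings:

```python
import numpy as np

def harmony_search(obj, bounds, rng, n_iter=2000, hms=20, hmcr=0.9, par=0.3, bw=0.05):
    """Minimal harmony search minimising obj over box bounds [(lo, hi), ...]."""
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    dim = len(bounds)
    memory = lo + (hi - lo) * rng.random((hms, dim))      # harmony memory
    scores = np.array([obj(h) for h in memory])
    for _ in range(n_iter):
        new = np.empty(dim)
        for d in range(dim):
            if rng.random() < hmcr:                       # memory consideration
                new[d] = memory[rng.integers(hms), d]
                if rng.random() < par:                    # pitch adjustment
                    new[d] += bw * (hi[d] - lo[d]) * (2.0 * rng.random() - 1.0)
            else:                                         # random consideration
                new[d] = lo[d] + (hi[d] - lo[d]) * rng.random()
        new = np.clip(new, lo, hi)
        s = obj(new)
        worst = int(np.argmax(scores))
        if s < scores[worst]:                             # replace worst harmony
            memory[worst], scores[worst] = new, s
    best = int(np.argmin(scores))
    return memory[best], scores[best]

def gumbel_nll(theta, x):
    """Negative log-likelihood of a Gumbel(mu, beta) sample."""
    mu, beta = theta
    z = (x - mu) / beta
    return np.sum(np.log(beta) + z + np.exp(-z))
```

Because the search is purely stochastic, no gradient of the likelihood is needed, which is the advantage the abstract attributes to meta-heuristics.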
Ballesteros-Paredes, Javier; Vázquez-Semadeni, Enrique; Gazol, Adriana; Hartmann, Lee W.; Heitsch, Fabian; Colín, Pedro
2011-09-01
It has been recently shown that molecular clouds do not exhibit a unique shape for the column density probability distribution function (N-PDF). Instead, clouds without star formation seem to possess a lognormal distribution, while clouds with active star formation develop a power-law tail at high column densities. The lognormal behaviour of the N-PDF has been interpreted in terms of turbulent motions dominating the dynamics of the clouds, while the power-law behaviour occurs when the cloud is dominated by gravity. In the present contribution, we use thermally bi-stable numerical simulations of cloud formation and evolution to show that, indeed, these two regimes can be understood in terms of the formation and evolution of molecular clouds: a very narrow lognormal regime appears when the cloud is being assembled. However, as the global gravitational contraction occurs, the initial density fluctuations are enhanced, resulting, first, in a wider lognormal N-PDF, and later, in a power-law N-PDF. We thus suggest that the observed N-PDFs of molecular clouds are a manifestation of their global gravitationally contracting state. We also show that, contrary to recent suggestions, the exact value of the power-law slope is not unique, as it depends on the projection in which the cloud is being observed.
International Nuclear Information System (INIS)
Raghukiran, Nadimpalli; Kumar, Ravi
2016-01-01
Hypereutectic Al–Si and Al–Si–Sc alloys were spark plasma sintered from corresponding gas-atomized powders. The microstructures of the Al–Si and Al–Si–Sc alloys possessed remarkably refined silicon particles in the size ranges of 0.38–3.5 µm and 0.35–1.16 µm, respectively, in contrast to the silicon particles of size greater than 100 µm typically found in conventionally cast alloys. All the sintered alloys exhibited significant ductility, as high as 85% compressive strain without failure, even in the presence of a relatively high weight fraction of the brittle silicon phase. Moreover, the Al–Si–Sc alloys showed appreciable improvement in compressive strength over their binary counterparts due to the presence of the intermetallic compound AlSi₂Sc₂, 10–20 nm in size, distributed uniformly in the matrix of those alloys. Dry sliding pin-on-disc wear tests showed improvement in the wear performance of the sintered alloys with increasing silicon content. Further, the Al–Si–Sc ternary alloys with relatively lower silicon content exhibited appreciable improvement in wear resistance over their binary counterparts. The Al–Si–Sc alloys, with a bimodal distribution of strengthening phases consisting of ultra-fine (sub-micron) silicon particles and nano-scale AlSi₂Sc₂, improved in strength and wear properties while retaining a significant amount of ductility.
Cieplak, Agnieszka; Slosar, Anze
2018-01-01
The Lyman-alpha forest has become a powerful cosmological probe at intermediate redshift. It is a highly non-linear field with much information present beyond the power spectrum. The flux probability distribution function (PDF) in particular has been a successful probe of small-scale physics. However, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which can bias the estimators. Here we argue that measuring the coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values, as is commonly done. Since the n-th Legendre coefficient can be expressed as a linear combination of the first n moments of the field, the coefficients can be measured in the presence of noise, and there is a clear route towards marginalization over the mean flux. Additionally, in the presence of noise, a finite number of these coefficients are well measured, with a very sharp transition into noise dominance. This compresses the information into a small number of well-measured quantities. Finally, we find that measuring fewer quasars at higher signal-to-noise produces more recoverable information.
Multiple Streaming and the Probability Distribution of Density in Redshift Space
Hui, Lam; Kofman, Lev; Shandarin, Sergei F.
2000-07-01
We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude σl. Unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated.
Exact probability distributions of selected species in stochastic chemical reaction networks.
López-Caamal, Fernando; Marquez-Lago, Tatiana T
2014-09-01
Chemical reactions are discrete, stochastic events. As such, the species' molecular numbers can be described by an associated master equation. However, handling such an equation may become difficult due to the large size of reaction networks. A commonly used approach to forecast the behaviour of reaction networks is to perform computational simulations of such systems and analyse their outcome statistically. This approach, however, might require high computational costs to provide accurate results. In this paper we opt for an analytical approach to obtain the time-dependent solution of the Chemical Master Equation for selected species in a general reaction network. When the reaction networks are composed exclusively of zeroth- and first-order reactions, this analytical approach significantly alleviates the computational burden required by simulation-based methods. By building upon these analytical solutions, we analyse a general monomolecular reaction network with an arbitrary number of species to obtain the exact marginal probability distribution for selected species. Additionally, we study two particular topologies of monomolecular reaction networks, namely (i) an unbranched chain of monomolecular reactions with and without synthesis and degradation reactions and (ii) a circular chain of monomolecular reactions. We illustrate our methodology and alternative ways to use it for non-linear systems by analysing a protein autoactivation mechanism. Later, we compare the computational load required for the implementation of our results and a pure computational approach to analyse an unbranched chain of monomolecular reactions. Finally, we study calcium ion gates in the sarco/endoplasmic reticulum mediated by ryanodine receptors.
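For the simplest such network, synthesis at a constant rate k plus first-order degradation at rate g (zeroth- and first-order reactions only), the exact marginal distribution is Poisson with mean k/g at stationarity, and a Gillespie simulation provides the simulation-based comparison mentioned above. A toy sketch, not the paper's general solution:

```python
import numpy as np
from math import exp, factorial

def gillespie_birth_death(k, g, t_end, rng):
    """One SSA trajectory of 0 -> X (rate k), X -> 0 (rate g*X); returns X(t_end)."""
    t, x = 0.0, 0
    while True:
        a = k + g * x                     # total propensity
        t += rng.exponential(1.0 / a)     # time to next reaction
        if t > t_end:
            return x
        if rng.random() < k / a:          # synthesis event
            x += 1
        else:                             # degradation event
            x -= 1

def poisson_pmf(lam, n):
    """Exact stationary probability P(X = n) for the birth-death process."""
    return exp(-lam) * lam ** n / factorial(n)
```

Averaging many trajectories at a time long compared to 1/g reproduces the Poisson(k/g) law, while the analytical route skips the sampling entirely.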
Characterizing the Lyα forest flux probability distribution function using Legendre polynomials
Energy Technology Data Exchange (ETDEWEB)
Cieplak, Agnieszka M.; Slosar, Anže, E-mail: acieplak@bnl.gov, E-mail: anze@bnl.gov [Brookhaven National Laboratory, Bldg 510, Upton, NY, 11973 (United States)
2017-10-01
The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values, as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal-to-noise.
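The moment property is easy to sketch: since P_n is a degree-n polynomial, the expansion coefficient c_n = (2n+1)/2 · E[P_n(X)] depends only on the first n moments of the rescaled flux. A minimal numerical illustration, with the flux assumed mapped onto [−1, 1] (variable names are ours, not the paper's):

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_pdf_coeffs(samples, nmax):
    """Estimate Legendre expansion coefficients of a PDF supported on [-1, 1].

    For p(x) = sum_n c_n P_n(x), orthogonality gives
    c_n = (2n+1)/2 * E[P_n(X)], a linear combination of the first n moments."""
    coeffs = np.zeros(nmax + 1)
    for n in range(nmax + 1):
        basis = np.zeros(n + 1)
        basis[n] = 1.0                    # selects P_n in legval
        coeffs[n] = (2 * n + 1) / 2.0 * np.mean(L.legval(samples, basis))
    return coeffs

def reconstruct_pdf(coeffs, x):
    """Evaluate the truncated expansion sum_n c_n P_n(x)."""
    return L.legval(x, coeffs)
```

Because each c_n is a sample average, noise in individual pixels propagates linearly into the coefficients, which is what makes the marginalisation straightforward.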
Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw
2011-07-01
Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
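The quoted statistics can be reproduced with a few lines; the lognormal fit below uses log-space moments as a simple stand-in for whatever fitting procedure the authors used:

```python
import numpy as np

def moments_and_lognormal_fit(ed):
    """Sample skewness, excess kurtosis, and a log-space moment fit of a
    lognormal model for a series of (positive) irradiance values."""
    x = np.asarray(ed, float)
    m, s = x.mean(), x.std()
    skew = np.mean(((x - m) / s) ** 3)          # right-skew > 0 near surface
    exkurt = np.mean(((x - m) / s) ** 4) - 3.0  # heavy tails give exkurt >> 0
    logx = np.log(x)
    return skew, exkurt, logx.mean(), logx.std()
```

Applied to a near-surface series the skewness and excess kurtosis are large and positive; at depth they drift toward zero, the normal-distribution limit described above.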
Francisco, E.; Pendás, A. Martín; Blanco, M. A.
2008-04-01
Given an N-electron molecule and an exhaustive partition of the real space (R³) into m arbitrary regions Ω₁, Ω₂, …, Ωₘ (⋃ᵢ₌₁ᵐ Ωᵢ = R³), the edf program computes all the probabilities P(n₁, n₂, …, nₘ) of having exactly n₁ electrons in Ω₁, n₂ electrons in Ω₂, …, and nₘ electrons (n₁ + n₂ + ⋯ + nₘ = N) in Ωₘ. Each Ωᵢ may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ωᵢ. The program can manage both single- and multi-determinant wave functions which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinantal wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MO). After the P(n₁, n₂, …, nₘ) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n₁, n₂, …, nₘ) probabilities into α and β spin components. Program summary. Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer
Debnath, Mukul C.; Liang, Baolai; Laghumavarapu, Ramesh B.; Wang, Guodong; Das, Aparna; Juang, Bor-Chau; Huffaker, Diana L.
2017-06-01
High-quality InAs quantum dots (QDs) with nominal thicknesses of 5.0–8.0 monolayers were grown on a digital AlAs0.56Sb0.44 matrix lattice-matched to the InP(001) substrate. All QDs showed a bimodal size distribution, and their optical properties were investigated by photoluminescence (PL) and time-resolved PL measurements. Power-dependent PL exhibited a linear relationship between the peak energy and the cube root of the excitation power for both the small QD family (SQDF) and the large QD family (LQDF), which is attributed to the type-II transition. The PL intensity, peak energy, and carrier lifetime of the SQDF and LQDF were very sensitive at high temperature. Above 125 K, the ratio of LQDF to SQDF PL intensity increased continuously, the peak energy shifted anomalously in the SQDF, and the long radiative carrier lifetime (≥3.0 ns at 77 K) decreased rapidly in the SQDF and slowly in the LQDF. These results are ascribed to thermally activated carrier escape from the SQDF into the wetting layer, from which carriers then relax into the LQDF with its lower localized energy states.
Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan
2017-08-01
We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF-regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ − 1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃7% and ≃10% at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ − 1 of the IGM temperature-density relation with a precision of ±8.6% at z = 3 and ±6.1% at z = 5, for an ensemble of ten mock high-resolution quasar spectra. Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.
Estimating the distribution of probable age-at-death from dental remains of immature human fossils.
Shackelford, Laura L; Stinespring Harris, Ashley E; Konigsberg, Lyle W
2012-02-01
In two historic longitudinal growth studies, Moorrees et al. (Am J Phys Anthropol 21 (1963) 99-108; J Dent Res 42 (1963) 1490-1502) presented the "mean attainment age" for stages of tooth development for 10 permanent tooth types and three deciduous tooth types. These findings were presented graphically to assess the rate of tooth formation in living children and to age immature skeletal remains. Despite being widely cited, these graphical data are difficult to implement because there are no accompanying numerical values for the parameters underlying the growth data. This analysis generates numerical parameters from the data reported by Moorrees et al. by digitizing 358 points from these tooth formation graphs using DataThief III, version 1.5. Following the original methods, the digitized points for each age transition were conception-corrected and converted to the logarithmic scale to determine a median attainment age for each dental formation stage. These values are subsequently used to estimate age-at-death distributions for immature individuals using a single tooth or multiple teeth, including estimates for 41 immature early modern humans and 25 immature Neandertals. Within-tooth variance is calculated for each age estimate based on a single tooth, and a between-tooth component of variance is calculated for age estimates based on two or more teeth to account for the increase in precision that comes from using additional teeth. Finally, we calculate the relative probability of observing a particular dental formation sequence given known-age reference information and demonstrate its value in estimating age for immature fossil specimens. Copyright © 2011 Wiley Periodicals, Inc.
International Nuclear Information System (INIS)
Cheng, Tianjin; Pandey, Mahesh D.; Weide, J.A.M. van der
2012-01-01
The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known. In this context, the asymptotic cost rate has a limited utility. This paper presents the derivation of the probability distribution of maintenance cost, when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function, then the discrete Fourier transform of the characteristic function leads to the complete probability distribution of cost in a finite time setting. The proposed approach is useful for a precise estimation of prediction limits and optimization of the maintenance cost.
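The finite-time cost distribution can also be explored by Monte Carlo, which makes prediction limits directly visible; the sketch below uses a periodically inspected gamma process with entirely illustrative thresholds and costs (the paper's characteristic-function and Fourier-transform route is exact; this is only a numerical check):

```python
import numpy as np

def cbm_cost(rng, horizon=100.0, tau=5.0, shape_rate=0.5, scale=1.0,
             prev_th=6.0, fail_th=10.0, c_insp=1.0, c_prev=10.0, c_fail=50.0):
    """Total maintenance cost over a finite horizon for gamma-process
    degradation inspected every tau time units (all constants illustrative)."""
    t, x, cost = 0.0, 0.0, 0.0
    while t < horizon:
        t += tau
        # stationary gamma process: increment ~ Gamma(shape_rate * tau, scale)
        x += rng.gamma(shape_rate * tau, scale)
        cost += c_insp
        if x >= fail_th:
            cost += c_fail   # corrective replacement
            x = 0.0
        elif x >= prev_th:
            cost += c_prev   # preventive replacement
            x = 0.0
    return cost
```

Repeating the simulation many times yields the empirical cost distribution, whose quantiles give the prediction interval that an asymptotic cost rate alone cannot provide.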
'Bi-modal' isoscalar giant dipole strength in 58Ni
International Nuclear Information System (INIS)
Nayak, B.K.; Garg, U.; Hedden, M.; Koss, M.; Li, T.; Liu, Y.; Madhusudhana Rao, P.V.; Zhu, S.; Itoh, M.; Sakaguchi, H.; Takeda, H.; Uchida, M.; Yasuda, Y.; Yosoi, M.; Fujimura, H.; Fujiwara, M.; Hara, K.; Kawabata, T.; Akimune, H.; Harakeh, M.N.
2006-01-01
The strength distribution of the isoscalar giant dipole resonance (ISGDR) in ⁵⁸Ni has been obtained over the energy range 10.5–49.5 MeV via extreme forward angle scattering (including 0°) of 386 MeV α particles. We observe a 'bi-modal' E1 strength distribution for the first time in an A < 90 nucleus. The observed ISGDR strength distribution is in reasonable agreement with the predictions of a recent RPA calculation.
Radtke, T.; Fritzsche, S.
2008-11-01
Quantum information science has contributed to our understanding of quantum mechanics and has also provided new and efficient protocols, based on the use of entangled quantum states. To determine the behavior and entanglement of n-qubit quantum registers, symbolic and numerical simulations need to be applied in order to analyze how these quantum information protocols work and which role the entanglement plays hereby. Solution method: Using the computer algebra system Maple, we have developed a set of procedures that support the definition, manipulation and analysis of n-qubit quantum registers. These procedures also help to deal with (unitary) logic gates and (nonunitary) quantum operations that act upon the quantum registers. With the parameterizations of various frequently-applied objects that are implemented in the present version, the program now facilitates a wider range of symbolic and numerical studies. All commands can be used interactively in order to simulate and analyze the evolution of n-qubit quantum systems, both in ideal and noisy quantum circuits. Reasons for new version: In the first version of the FEYNMAN program [1], we implemented the data structures and tools that are necessary to create, manipulate and analyze the state of quantum registers. Later [2,3], support was added to deal with quantum operations (noisy channels) as an ingredient which is essential for studying the effects of decoherence. With the present extension, we add a number of parametrizations of objects frequently utilized in decoherence and entanglement studies, such as Hermitian and unitary matrices, probability distributions, or various kinds of quantum states. This extension therefore provides the basis, for example, for the optimization of a given function over the set of pure states or the simple generation of random objects.
Running time: Most commands that act upon quantum registers with five or less qubits take ⩽10 seconds of processor time on a Pentium 4 processor
Directory of Open Access Journals (Sweden)
Hong-fu Guo
2017-01-01
Particle size and distribution play an important role in ignition. The size and distribution of the cyclotetramethylene tetranitramine (HMX) particles were measured with a Malvern MS2000 laser particle size analyzer before the experiments and calculations. The mean particle size is 161 μm; the minimum and maximum sizes are 80 μm and 263 μm, respectively, and the distribution function resembles a quadratic function. Based on the distribution of micron-scale explosive particles, a microscopic model is established to describe the ignition of HMX particles under a drop weight, predicting both the temperature of the contact zones and the ignition probability of the powdered explosive. The calculated results show that the temperature of the contact zones between the particles and the drop-weight surface increases faster and reaches higher values than that of the contact zones between two neighboring particles. For HMX particles, with all other conditions kept constant, if the drop height is less than 0.1 m, the ignition probability is close to 0. When the drop heights are 0.2 m and 0.3 m, the ignition probabilities are 0.27 and 0.64, respectively, whereas when the drop height is more than 0.4 m, the ignition probability is close to 0.82. The calculated and experimental curves are reasonably close to each other, which indicates that the model is reasonably sound.
Energy Technology Data Exchange (ETDEWEB)
O' Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-10-27
The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
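The core of such a code is a branching-process random walk over neutron lifetimes; a toy single-source version (the rate, capture/fission split, and multiplicity are illustrative constants, not values from the report):

```python
import numpy as np

def neutron_count_at(t_end, rng, rate=1.0, p_fission=0.4, nu=2):
    """Number of neutrons alive at the census time t_end, starting from one
    source neutron. Each neutron reacts after an exponential waiting time:
    with probability p_fission it fissions, releasing nu neutrons; otherwise
    it is captured and the chain branch ends."""
    stack = [0.0]   # birth times of neutrons still to be processed
    alive = 0
    while stack:
        t = stack.pop() + rng.exponential(1.0 / rate)
        if t > t_end:
            alive += 1                  # survives past the census time
        elif rng.random() < p_fission:
            stack.extend([t] * nu)      # fission: nu daughter neutrons
        # otherwise: capture, nothing to add
    return alive
```

Tallying `neutron_count_at` over many histories builds the neutron-number PDF at the final time; with mean offspring nu·p_fission = 0.8 < 1 the system is subcritical and the expected population decays as exp((nu·p_fission − 1)·rate·t).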
Yu, N.; Delrieu, G.; Boudevillain, Brice; Hazenberg, P.; Uijlenhoet, R.
2014-01-01
This study offers a unified formulation of single- and multimoment normalizations of the raindrop size distribution (DSD), which have been proposed in the framework of scaling analyses in the literature. The key point is to consider a well-defined “general distribution” g(x) as the probability
Human and mouse switch-like genes share common transcriptional regulatory mechanisms for bimodality
Directory of Open Access Journals (Sweden)
Tozeren Aydin
2008-12-01
Abstract Background: Gene expression is controlled over a wide range at the transcript level through complex interplay between DNA and regulatory proteins, resulting in profiles of gene expression that can be represented as normal, graded, and bimodal (switch-like) distributions. We have previously performed genome-scale identification and annotation of genes with switch-like expression at the transcript level in mouse, using large microarray datasets for healthy tissue, in order to study the cellular pathways and regulatory mechanisms involving this class of genes. We showed that a large population of bimodal mouse genes encoding for cell membrane and extracellular matrix proteins is involved in communication pathways. This study expands on previous results by annotating human bimodal genes, investigating their correspondence to bimodality in mouse orthologs and exploring possible regulatory mechanisms that contribute to bimodality in gene expression in human and mouse. Results: Fourteen percent of the human genes on the HGU133A array (1847 out of 13076) were identified as bimodal or switch-like. More than 40% were found to have bimodal mouse orthologs. KEGG pathways enriched for bimodal genes included ECM-receptor interaction, focal adhesion, and tight junction, showing strong similarity to the results obtained in mouse. Tissue-specific modes of expression of bimodal genes among brain, heart, and skeletal muscle were common between human and mouse. Promoter analysis revealed a higher than average number of transcription start sites per gene within the set of bimodal genes. Moreover, the bimodal gene set had differentially methylated histones compared to the set of the remaining genes in the genome. Conclusion: The enrichment of bimodal genes among cell membrane and extracellular-matrix proteins makes them candidates for tissue-specificity biomarkers. The commonality of the important roles bimodal genes play in tissue
A HYPOTHESIS FOR THE COLOR BIMODALITY OF JUPITER TROJANS
International Nuclear Information System (INIS)
Wong, Ian; Brown, Michael E.
2016-01-01
One of the most enigmatic and hitherto unexplained properties of Jupiter Trojans is their bimodal color distribution. This bimodality is indicative of two sub-populations within the Trojans, which have distinct size distributions. In this paper, we present a simple, plausible hypothesis for the origin and evolution of the two Trojan color sub-populations. In the framework of dynamical instability models of early solar system evolution, which suggest a common primordial progenitor population for both Trojans and Kuiper Belt objects, we use observational constraints to assert that the color bimodalities evident in both minor body populations developed within the primordial population prior to the onset of instability. We show that, beginning with an initial composition of rock and ices, location-dependent volatile loss through sublimation in this primordial population could have led to sharp changes in the surface composition with heliocentric distance. We propose that the depletion or retention of H₂S ice on the surface of these objects was the key factor in creating an initial color bimodality. Objects that retained H₂S on their surfaces developed characteristically redder colors upon irradiation than those that did not. After the bodies from the primordial population were scattered and emplaced into their current positions, they preserved this primordial color bimodality to the present day. We explore predictions of the volatile loss model—in particular, the effect of collisions within the Trojan population on the size distributions of the two sub-populations—and propose further experimental and observational tests of our hypothesis.
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber-Shiu functions and dependence.
Directory of Open Access Journals (Sweden)
Fang Zheng
2013-04-01
Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher’s linear discriminant analysis, support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide the total classification accuracy of 86.67% and the area (Az of 0.9096 under the receiver operating characteristics curve, which were superior to the results obtained by either the Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564 or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533. Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
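A minimal sketch of the pipeline described in this abstract: kernel density estimates of the two class-conditional feature distributions feed a maximal posterior probability decision. The 2-D features, class locations, and spreads below are invented stand-ins for the VAG feature vectors, not the paper's data.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic 2-D feature vectors for the two groups (hypothetical values).
healthy = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
abnormal = rng.normal(loc=[1.5, 1.0], scale=0.7, size=(100, 2))

# Kernel-based estimate of each class-conditional feature density.
kde_h = gaussian_kde(healthy.T)
kde_a = gaussian_kde(abnormal.T)

def classify(point, prior_h=0.5):
    """Maximal posterior probability decision between the two classes."""
    x = np.asarray(point, float).reshape(2, 1)
    post_h = kde_h(x)[0] * prior_h
    post_a = kde_a(x)[0] * (1.0 - prior_h)
    return "healthy" if post_h >= post_a else "abnormal"
```

With equal priors this reduces to comparing the two estimated densities at the query point, which is why the quality of the density estimate drives the classification accuracy.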
Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei
2014-04-01
Chaos optimization algorithms (COAs) usually utilize chaotic maps such as the logistic map to generate pseudo-random numbers that are mapped to the design variables for global optimization. Many existing studies indicated that COAs can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs, from the new perspective of both the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COAs. To achieve high efficiency with a COA, it is recommended to adopt a chaotic map that generates sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
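The two quantities the abstract highlights can be sketched for the logistic map (parameters are illustrative; at mu = 4 the map is fully chaotic and its Lyapunov exponent is known to be ln 2 ≈ 0.693):

```python
import numpy as np

def logistic_sequence(x0=0.3, n=50000, mu=4.0):
    """Iterate the logistic map x_{k+1} = mu * x_k * (1 - x_k)."""
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = mu * x * (1.0 - x)
        xs[k] = x
    return xs

xs = logistic_sequence()

# Lyapunov exponent as the orbit average of log|f'(x)| = log|mu * (1 - 2x)|;
# for mu = 4 the exact value is ln 2. Points with zero derivative are skipped.
deriv = np.abs(4.0 * (1.0 - 2.0 * xs))
lyap = np.mean(np.log(deriv[deriv > 0]))
```

A histogram of `xs` would also show the non-uniform invariant density of the logistic map, with mass piling up near 0 and 1, which is exactly the PDF property the paper argues matters for search performance.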
Directory of Open Access Journals (Sweden)
Panpan Zhao
2017-05-01
Full Text Available This study investigates the sensitivity and uncertainty of hydrological drought frequency and severity in the Weihe Basin, China during 1960–2012, by using six commonly used univariate probability distributions and three Archimedean copulas to fit the marginal and joint distributions of drought characteristics. The Anderson-Darling method is used for testing the goodness-of-fit of the univariate models, and the Akaike information criterion (AIC) is applied to select the best distribution and copula functions. The results demonstrate that there is a very strong correlation between drought duration and drought severity at the three stations. The drought return period varies depending on the selected marginal distributions and copula functions and, with an increase of the return period, the differences become larger. In addition, the estimated return periods (both co-occurrence and joint) from the best-fitted copulas are the closest to those from the empirical distribution. Therefore, it is critical to select the appropriate marginal distribution and copula function to model hydrological drought frequency and severity. The results of this study can not only help drought investigations select a suitable probability distribution and copula function, but are also useful for regional water resource management. However, a few limitations remain in this study, such as the assumption of stationarity of the runoff series.
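The AIC-based selection of a marginal distribution can be sketched as follows. The data and the candidate set are hypothetical (gamma-distributed synthetic values stand in for the runoff-derived drought characteristics), and the copula-fitting step of the study is not shown:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic positive-valued "drought severity" sample (hypothetical).
data = stats.gamma.rvs(a=2.0, scale=3.0, size=2000, random_state=rng)

candidates = {
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
    "weibull": stats.weibull_min,
}

def aic(dist, sample):
    """AIC of a maximum-likelihood fit with the location fixed at zero."""
    params = dist.fit(sample, floc=0)
    loglik = np.sum(dist.logpdf(sample, *params))
    k = len(params) - 1  # floc was fixed, not estimated
    return 2 * k - 2 * loglik

scores = {name: aic(dist, data) for name, dist in candidates.items()}
best = min(scores, key=scores.get)
```

The same pattern (fit each candidate by maximum likelihood, rank by AIC) extends to copula families once the marginals are chosen.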
Bimodal immune activation in psoriasis.
Christophers, E; Metzler, G; Röcken, M
2014-01-01
Psoriasis is an immune-regulated skin disease with various clinical subtypes and disease activities. The majority of patients present with predominantly stable plaques. At the onset of new lesions, plaque-type psoriasis frequently demonstrates pin-sized and highly inflammatory papules sometimes with an inflammatory border. The histopathology of initial psoriasis differs from stable plaque-type psoriasis. Early lesions demonstrate innate immune cells with neutrophils, degranulating mast cells and macrophages. These are followed by interleukin (IL)-1-dependent T helper (Th)17 cells, finally resulting in the Th1-dominated immunopathology of stable plaque-type psoriasis, where mononuclear cells predominate with interspersed neutrophilic (Munro) microabscesses. These features suggest a bimodal immune pathway where alternate activation of either innate (autoinflammatory) or adaptive (autoimmune) immunity predominates. Neutrophilic infiltrations appear during early psoriasis with Munro abscesses. They are time limited and occur periodically, clinically best seen in linear nail pitting. These features strongly suggest a critical role for an IL-1-Th17-dominated autoinflammation in the initiation of psoriasis, followed by a Th1-dominated late-phase reaction. The concept of bimodal immune activation helps to explain results from therapeutic interventions that are variable and previously only partly understood. © 2013 British Association of Dermatologists.
Impact of Zygosity on Bimodal Phenotype Distributions
DEFF Research Database (Denmark)
Holst-Hansen, Thomas; Abad, Elena; Muntasell, Aura
2017-01-01
Here, we study this question making use of the natural genetic variability of human populations, which allows us to compare the expression profiles of a receptor protein in natural killer cells among donors infected with human cytomegalovirus with one or two copies of the allele. Crucially...
Czech Academy of Sciences Publication Activity Database
Kracík, Jan
2011-01-01
Roč. 52, č. 6 (2011), s. 659-671 ISSN 0888-613X R&D Projects: GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : combining probabilities * Kullback-Leibler divergence * maximum likelihood * expert opinions * linear opinion pool Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2011/AS/kracik-0359399.pdf
Tomovski, Zivorad; Mehrez, Khaled
2016-01-01
By making use of the familiar Mathieu series and its generalizations, the authors derive a number of new integral representations and present a systematic study of probability density functions and probability distributions associated with some generalizations of the Mathieu series. In particular, the mathematical expectation, variance and the characteristic functions, related to the probability density functions of the considered probability distributions are derived. As a consequence, some ...
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Basu, Kinjal; Sengupta, Debapriya
2012-01-01
Consider the problem when $X_1,X_2,..., X_n$ are distributed on a circle following an unknown distribution $F$ on $S^1$. In this article we consider the fully general set-up in which the density can have local features such as discontinuities and edges. Furthermore, there can be outlying data following some discrete distributions. Traditional kernel density estimation methods fail to identify such local features in the data. Here we devise a non-parametric density estimate on ...
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
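The maximum-entropy argument can be checked numerically: among positive distributions constrained to a fixed mean, the exponential has the largest differential entropy. A small sketch (the mean value is arbitrary, and the two competitor distributions are chosen only for illustration):

```python
from math import gamma as gamma_fn
from scipy import stats

mean = 2.0  # fixed first moment, e.g. a mean hop distance or velocity

# Three positive distributions sharing the same mean.
h_expon = stats.expon(scale=mean).entropy()
h_gamma = stats.gamma(a=2.0, scale=mean / 2.0).entropy()
# Weibull mean is scale * Gamma(1 + 1/c); solve for the scale at c = 2.
h_weibull = stats.weibull_min(c=2.0, scale=mean / gamma_fn(1.5)).entropy()
```

Here `h_expon` exceeds both alternatives, which is the one-constraint case of the Jaynes procedure the authors apply with additional moment and covariance constraints.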
On the probability distribution of stock returns in the Mike-Farmer model
Gu, G.-F.; Zhou, W.-X.
2009-02-01
Recently, Mike and Farmer have constructed a very powerful and realistic behavioral model to mimic the dynamic process of stock price formation, based on the empirical regularities of order placement and cancelation in a purely order-driven market, which can successfully reproduce the whole distribution of returns (not only the well-known power-law tails), together with several other important stylized facts. There are three key ingredients in the Mike-Farmer (MF) model: the long memory of order signs characterized by the Hurst index Hs, the distribution of relative order prices x in reference to the same best price described by a Student distribution (or Tsallis’ q-Gaussian), and the dynamics of order cancelation. They showed that different values of the Hurst index Hs and the degrees of freedom αx of the Student distribution can always produce power-law tails in the return distribution fr(r) with different tail exponent αr. In this paper, we study the origin of the power-law tails of the return distribution fr(r) in the MF model, based on extensive simulations with different combinations of the left part L(x) for x < 0 and the right part R(x) for x > 0 of fx(x). We find that power-law tails appear only when L(x) has a power-law tail, regardless of whether R(x) has a power-law tail or not. In addition, we find that the distributions of returns in the MF model at different timescales can be well modeled by Student distributions, whose tail exponents are close to the well-known cubic law and increase with the timescale.
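A minimal illustration of the last step, fitting a Student distribution to heavy-tailed returns by maximum likelihood (synthetic t-distributed data stand in for model-generated returns; for the t density the tail exponent of the CDF equals the degrees of freedom):

```python
from scipy import stats

# Synthetic heavy-tailed "returns" with 3 degrees of freedom (hypothetical).
returns = stats.t.rvs(df=3.0, size=20000, random_state=7)

# Maximum-likelihood fit of a Student distribution; df_hat estimates the
# tail exponent alpha_r discussed above.
df_hat, loc_hat, scale_hat = stats.t.fit(returns)
```

Repeating the fit on returns aggregated over longer timescales would show `df_hat` increasing, the behavior the abstract reports.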
International Nuclear Information System (INIS)
Viana, R.S.; Yoriyaz, H.; Santos, A.
2011-01-01
The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimation, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of the E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the E-M iteration, the conditional probability distribution plays a very important role in achieving a high-quality image. The present work proposes an alternative methodology for generating the conditional probability distribution associated with the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and applying the reciprocity theorem. (author)
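A minimal sketch of the multiplicative E-M (MLEM) update used in emission tomography. The system matrix here is random and purely illustrative; in NSECT it would encode the detection probabilities, which is exactly the conditional distribution the paper proposes to compute with MCNP5:

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative system matrix: A[i, j] = probability that an emission in
# voxel j is recorded in detector bin i (columns normalised to 1).
A = rng.random((40, 8))
A /= A.sum(axis=0)

x_true = rng.random(8) * 10.0
y = A @ x_true            # noiseless projection data for a sanity check

x = np.ones(8)            # uniform starting image
sens = A.sum(axis=0)      # per-voxel sensitivity (1 by construction here)
for _ in range(500):
    # MLEM update: back-project the measured/predicted projection ratio.
    x *= (A.T @ (y / (A @ x))) / sens
```

Each iteration increases the Poisson likelihood, and for consistent noiseless data the predicted projections `A @ x` converge to `y`.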
Jones, Evan; Singal, Jack
2018-01-01
We present results of using individual galaxies' redshift probability information derived from a photometric redshift (photo-z) algorithm, SPIDERz, to identify potential catastrophic outliers in photometric redshift determinations. Using test data comprised of COSMOS multi-band photometry and known spectroscopic redshifts from the 3D-HST survey spanning a wide redshift range, we evaluate this flagging strategy in photo-z determinations using a range of flagging parameter values. These results could potentially be useful for the utilization of photometric redshifts in future large scale surveys, where catastrophic outliers are particularly detrimental to the science goals.
International Nuclear Information System (INIS)
Lajoie, M-A.; Marleau, G.
2010-01-01
The analysis of VHTR fuel tends to be difficult when using deterministic methods currently employed in lattice codes notably because of limitations on geometry representation and the stochastic positioning of spherical elements. The method proposed here and implemented in the lattice code DRAGON is to generate the positions of multi-layered spheres using random sequential addition, and to analyze the resulting geometry using a full three-dimensional spherical collision probability method. The preliminary validation runs are consistent with results obtained using a Monte-Carlo method, for both regularly and randomly positioned pins. (author)
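Random sequential addition itself is simple to sketch: draw a random candidate centre and accept it only if it overlaps no previously accepted sphere. The single-radius, unit-box version below is illustrative; the VHTR case places multi-layered spheres inside fuel-element geometry:

```python
import numpy as np

def random_sequential_addition(n, radius, box=1.0, seed=0, max_tries=100000):
    """Place n equal non-overlapping spheres in a cubic box by RSA."""
    rng = np.random.default_rng(seed)
    centres = []
    tries = 0
    while len(centres) < n and tries < max_tries:
        tries += 1
        c = rng.uniform(radius, box - radius, size=3)  # fully inside the box
        if all(np.linalg.norm(c - p) >= 2 * radius for p in centres):
            centres.append(c)
    return np.array(centres)

pts = random_sequential_addition(n=50, radius=0.05)
```

RSA is only practical up to moderate packing fractions (the jamming limit for spheres is around 38% of volume); the stochastic TRISO packings targeted here are well below that.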
Elizalde, E.; Gaztanaga, E.
1992-01-01
The dependence of counts in cells on the shape of the cell for the large scale galaxy distribution is studied. A very concrete prediction can be done concerning the void distribution for scale invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell to be occupied is bigger for some elongated cells. A phenomenological scale invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
BSA adsorption on bimodal PEO brushes
Bosker, W.T.E.; Iakovlev, P.A.; Norde, W.; Cohen Stuart, M.A.
2005-01-01
BSA adsorption onto bimodal PEO brushes at a solid surface was measured using optical reflectometry. Bimodal brushes consist of long (N=770) and short (N=48) PEO chains and were prepared on PS surfaces, applying mixtures of PS29-PEO48 and PS37-PEO770 block copolymers and using the Langmuir-Blodgett
Multiparameter probability distributions for heavy rainfall modeling in extreme southern Brazil
Directory of Open Access Journals (Sweden)
Samuel Beskow
2015-09-01
New hydrological insights for the region: The Anderson–Darling and Filliben tests were the most restrictive in this study. Based on the Anderson–Darling test, it was found that the Kappa distribution presented the best performance, followed by the GEV. This finding provides evidence that these multiparameter distributions result, for the region of study, in greater accuracy for the generation of intensity–duration–frequency curves and the prediction of peak streamflows and design hydrographs. As a result, this finding can support the design of hydraulic structures and flood management in river basins.
Is extrapair mating random? On the probability distribution of extrapair young in avian broods
Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan
2007-01-01
A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometrical, or Poisson). A review
DEFF Research Database (Denmark)
Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou
2010-01-01
Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple priors to be used. We demonstrate how sequential simulation can be seen as an application of the Gibbs sampler, and how such a Gibbs sampler assisted by sequential simulation can be used to perform a random walk generating realizations of a relatively complex random function. We propose to combine this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems.
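A bare random-walk Metropolis sampler of the kind being extended here can be sketched as follows; the standard-normal target is just a sanity check, not the geophysical posterior:

```python
import numpy as np

def metropolis(logp, x0, step, n, seed=0):
    """Random-walk Metropolis sampler for an unnormalised log-density."""
    rng = np.random.default_rng(seed)
    x, lp = x0, logp(x0)
    out = np.empty(n)
    for i in range(n):
        prop = x + step * rng.standard_normal()
        lp_prop = logp(prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        out[i] = x
    return out

# Sample a standard normal "posterior" as a sanity check.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=50000)
```

The paper's contribution is, in effect, to replace the simple Gaussian proposal with a sequential-simulation step, so that proposals already honor complex prior statistics.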
Leijala, U.; Bjorkqvist, J. V.; Pellikka, H.; Johansson, M. M.; Kahma, K. K.
2017-12-01
Predicting the behaviour of the joint effect of sea level and wind waves is of great significance due to the major impact of flooding events in densely populated coastal regions. As mean sea level rises, the effect of sea level variations accompanied by the waves will be even more harmful in the future. The main challenge when evaluating the effect of waves and sea level variations is that long time series of both variables rarely exist. Wave statistics are also highly location-dependent, thus requiring wave buoy measurements and/or high-resolution wave modelling. As an initial approximation of the joint effect, the variables may be treated as independent random variables, to obtain the probability distribution of their sum. We present results of a case study based on three probability distributions: 1) wave run-up constructed from individual wave buoy measurements, 2) short-term sea level variability based on tide gauge data, and 3) mean sea level projections based on up-to-date regional scenarios. The wave measurements were conducted during 2012-2014 on the coast of the city of Helsinki, located in the Gulf of Finland in the Baltic Sea. The short-term sea level distribution contains the last 30 years (1986-2015) of hourly data from Helsinki tide gauge, and the mean sea level projections are scenarios adjusted for the Gulf of Finland. Additionally, we present a sensitivity test based on six different theoretical wave height distributions representing different wave behaviour in relation to sea level variations. As these wave distributions are merged with one common sea level distribution, we can study how the different shapes of the wave height distribution affect the distribution of the sum, and which one of the components is dominating under different wave conditions. As an outcome of the method, we obtain a probability distribution of the maximum elevation of the continuous water mass, which enables a flexible tool for evaluating different risk levels in the
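Under the independence approximation, the distribution of the sum can be approximated by adding Monte Carlo samples drawn from each marginal. All distributions and parameter values below are invented placeholders, not the Helsinki data:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200_000  # Monte Carlo sample size

# Hypothetical stand-ins (cm) for the three independent components.
runup = rng.weibull(2.0, N) * 50.0       # wave run-up
short_term = rng.normal(0.0, 20.0, N)    # short-term sea-level variability
msl = rng.normal(30.0, 10.0, N)          # mean sea-level projection

# Independence: the sum of samples approximates the convolved distribution.
total = runup + short_term + msl
p_exceed = np.mean(total > 150.0)  # exceedance probability of a flood level
```

Swapping in the six theoretical wave height distributions mentioned above would amount to replacing the `runup` line and re-reading `p_exceed` for each chosen risk level.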
Martin A. Spetich; Zhaofei Fan; Zhen Sui; Michael Crosby; Hong S. He; Stephen R. Shifley; Theodor D. Leininger; W. Keith Moser
2017-01-01
Stresses to trees under a changing climate can lead to changes in forest tree survival, mortality and distribution. For instance, a study examining the effects of human-induced climate change on forest biodiversity by Hansen and others (2001) predicted a 32% reduction in loblolly-shortleaf pine habitat across the eastern United States. However, they also...
Testroet, Eric D; Sherman, Peter; Yoder, Chad; Testroet, Amber; Reynolds, Carmen; O'Neil, Mathew; Lei, Soi Meng; Beitz, Donald C; Baas, Tom J
2017-04-03
Adipocyte sizes from adipose tissue of mature animals form a bimodal distribution, thus reporting mean cell size is misleading. The objectives of this study were to develop a robust method for testing bimodality of porcine adipocytes, describe the size distribution with an informative metric, and statistically test hypertrophy and the appearance of new small adipocytes, possibly resulting from hyperplasia or lipid filling of previously divided fibroblastic cells. Ninety-three percent of adipose samples measured were bimodal, and a method is proposed for testing hyperplasia or lipid filling of previously divided fibroblastic cells based upon the probability of an adipocyte falling into 2 chosen competing "bins" as adiposity increases. We also conclude that increased adiposity is correlated positively with an adipocyte being found in the minor mode (r = 0.46) and correlated negatively with an adipocyte being found in the major mode (r = -0.22), providing evidence of either hyperplasia or lipid filling of previously divided fibroblastic cells. We additionally conclude that as adiposity increases, the mode of the major distribution of cells occurs at a larger adipocyte diameter, indicating hypertrophy.
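One common way to test bimodality of cell-size data is to compare a two-component Gaussian mixture against a single Gaussian with an information criterion. This is a self-contained sketch with invented adipocyte diameters, not the authors' exact test:

```python
import numpy as np
from scipy import stats

def gmm2_em(x, iters=300):
    """Fit a two-component 1-D Gaussian mixture by EM.
    Returns (log-likelihood, component means)."""
    x = np.asarray(x, float)
    mu = np.percentile(x, [25.0, 75.0])          # spread-out initial means
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        joint = w * stats.norm.pdf(x[:, None], mu, sd)   # E-step
        r = joint / joint.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)                               # M-step
        w, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    ll = np.log((w * stats.norm.pdf(x[:, None], mu, sd)).sum(axis=1)).sum()
    return ll, mu

rng = np.random.default_rng(3)
# Hypothetical adipocyte diameters (um): a minor mode of small cells and
# a major mode of large cells, mimicking the bimodal pattern above.
diam = np.concatenate([rng.normal(30, 5, 300), rng.normal(100, 15, 700)])

ll1 = stats.norm.logpdf(diam, diam.mean(), diam.std()).sum()  # unimodal fit
ll2, modes = gmm2_em(diam)
bic1 = 2 * np.log(len(diam)) - 2 * ll1   # 2 free parameters
bic2 = 5 * np.log(len(diam)) - 2 * ll2   # 5 free parameters
print(bic2 < bic1)  # True: the mixture wins, i.e. evidence of bimodality
```

The fitted component means then give the minor-mode and major-mode diameters whose shifts the study tracks as adiposity increases.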
Lahmiri, Salim
2016-03-01
Hybridisation of the bi-dimensional empirical mode decomposition (BEMD) with denoising techniques has been proposed in the literature as an effective approach for image denoising. In this Letter, the Student's probability density function is introduced in the computation of the mean envelope of the data during the BEMD sifting process to make it robust to values that are far from the mean. The resulting BEMD is denoted tBEMD. In order to show the effectiveness of the tBEMD, several image denoising techniques in tBEMD domain are employed; namely, fourth order partial differential equation (PDE), linear complex diffusion process (LCDP), non-linear complex diffusion process (NLCDP), and the discrete wavelet transform (DWT). Two biomedical images and a standard digital image were considered for experiments. The original images were corrupted with additive Gaussian noise with three different levels. Based on peak-signal-to-noise ratio, the experimental results show that PDE, LCDP, NLCDP, and DWT all perform better in the tBEMD than in the classical BEMD domain. It is also found that tBEMD is faster than classical BEMD when the noise level is low. When it is high, the computational cost in terms of processing time is similar. The effectiveness of the presented approach makes it promising for clinical applications.
Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei
2016-01-01
Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851
1980-06-30
Nonparametric Bayes Estimation of Distribution Functions and ... University of South Carolina, Columbia, Department of Mathematics, Computer Science and Statistics. ... more powerful than the sign test. The power of the test was compared with that of the sign test by computer simulations using the Marshall-Olkin bivariate
Perrotta, A
2002-01-01
A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branching ratios, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example. (6 refs).
Uranium concentration and distribution in six peridotite inclusions of probable mantle origin
Haines, E. L.; Zartman, R. E.
1973-01-01
Fission-track activation was used to investigate uranium concentration and distribution in peridotite inclusions in alkali basalt from six localities. Whole-rock uranium concentrations range from 24 to 82 ng/g. Most of the uranium is uniformly distributed in the major silicate phases - olivine, orthopyroxene, and clinopyroxene. Chromian spinels may be classified into two groups on the basis of their uranium content - those which have less than 10 ng/g and those which have 100 to 150 ng/g U. In one sample accessory hydrous phases, phlogopite and hornblende, contain 130 and 300 ng/g U, respectively. The contact between the inclusion and the host basalt is usually quite sharp. Glassy or microcrystalline veinlets found in some samples contain more than 1 microgram/g. Very little uranium is associated with microcrystals of apatite. These results agree with some earlier investigators, who have concluded that suboceanic peridotites contain too little uranium to account for normal oceanic heat flow by conduction alone.
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available To produce probability distributions for regional climate change in surface temperature and precipitation, a probability distribution for global mean temperature increase has been combined with the probability distributions for the appropriate scaling variables, i.e. the changes in regional temperature/precipitation per degree global mean warming. Each scaling variable is assumed to be normally distributed. The uncertainty of the scaling relationship arises from systematic differences between the regional changes from global and regional climate model simulations and from natural variability. The contributions of these sources of uncertainty to the total variance of the scaling variable are estimated from simulated temperature and precipitation data in a suite of regional climate model experiments conducted within the framework of the EU-funded project PRUDENCE, using an Analysis Of Variance (ANOVA. For the area covered in the 2001–2004 EU-funded project SWURVE, five case study regions (CSRs are considered: NW England, the Rhine basin, Iberia, Jura lakes (Switzerland and Mauvoisin dam (Switzerland. The resulting regional climate changes for 2070–2099 vary quite significantly between CSRs, between seasons and between meteorological variables. For all CSRs, the expected warming in summer is higher than that expected for the other seasons. This summer warming is accompanied by a large decrease in precipitation. The uncertainty of the scaling ratios for temperature and precipitation is relatively large in summer because of the differences between regional climate models. Differences between the spatial climate-change patterns of global climate model simulations make significant contributions to the uncertainty of the scaling ratio for temperature. However, no meaningful contribution could be found for the scaling ratio for precipitation due to the small number of global climate models in the PRUDENCE project and natural variability, which is
Directory of Open Access Journals (Sweden)
Denis Cousineau
2008-03-01
Full Text Available This article discusses how to characterize response time (RT frequency distributions in terms of probability functions and how to implement the necessary analysis tools using MATLAB. The first part of the paper discusses the general principles of maximum likelihood estimation. A detailed implementation that allows fitting the popular ex-Gaussian function is then presented followed by the results of a Monte Carlo study that shows the validity of the proposed approach. Although the main focus is the ex-Gaussian function, the general procedure described here can be used to estimate best fitting parameters of various probability functions. The proposed computational tools, written in MATLAB source code, are available through the Internet.
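The article's MATLAB tools have a direct Python counterpart: scipy's `exponnorm` is the ex-Gaussian, parameterised by K = tau/sigma, and its `fit` method performs the maximum likelihood estimation. The RT parameters below are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic response times (s): Gaussian component (mu=0.4, sigma=0.05)
# plus an exponential tail (tau=0.2), the classic ex-Gaussian shape.
rts = rng.normal(0.4, 0.05, 5000) + rng.exponential(0.2, 5000)

# Maximum likelihood fit of the ex-Gaussian; recover tau from K = tau/sigma.
K, mu, sigma = stats.exponnorm.fit(rts)
tau = K * sigma
```

As in the article, the same pattern (write the log-likelihood of any candidate density, then optimize over its parameters) generalizes beyond the ex-Gaussian.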
Nuclear bimodal new vision solar system missions
International Nuclear Information System (INIS)
Mondt, J.F.; Zubrin, R.M.
1996-01-01
This paper presents an analysis of the potential mission capability using space reactor bimodal systems for planetary missions. Missions of interest include the Main belt asteroids, Jupiter, Saturn, Neptune, and Pluto. The space reactor bimodal system, defined by an Air Force study for Earth orbital missions, provides 10 kWe power, 1000 N thrust, 850 s Isp, with a 1500 kg system mass. Trajectories to the planetary destinations were examined and optimal direct and gravity assisted trajectories were selected. A conceptual design for a spacecraft using the space reactor bimodal system for propulsion and power, that is capable of performing the missions of interest, is defined. End-to-end mission conceptual designs for bimodal orbiter missions to Jupiter and Saturn are described. All missions considered use the Delta 3 class or Atlas 2AS launch vehicles. The space reactor bimodal power and propulsion system offers both new-vision ''constellation''-type missions, in which the space reactor bimodal spacecraft acts as a carrier and communication spacecraft for a fleet of microspacecraft deployed at different scientific targets, and conventional missions with only a space reactor bimodal spacecraft and its science payload. copyright 1996 American Institute of Physics
Jacobsen, J L; Saleur, H
2008-02-29
We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L ≫ 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = ⟨N_c⟩ ln 2 ≃ (4 ln 2/π²) ln L, disproving a recent conjecture that this should be related to the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.
Santillán, David; Mosquera, Juan-Carlos; Cueto-Felgueroso, Luis
2017-11-01
Hydraulic fracture trajectories in rocks and other materials are highly affected by spatial heterogeneity in their mechanical properties. Understanding the complexity and structure of fluid-driven fractures and their deviation from the predictions of homogenized theories is a practical problem in engineering and geoscience. We conduct a Monte Carlo simulation study to characterize the influence of heterogeneous mechanical properties on the trajectories of hydraulic fractures propagating in elastic media. We generate a large number of random fields of mechanical properties and simulate pressure-driven fracture propagation using a phase-field model. We model the mechanical response of the material as that of an elastic isotropic material with heterogeneous Young modulus and Griffith energy release rate, assuming that fractures propagate in the toughness-dominated regime. Our study shows that the variance and the spatial covariance of the mechanical properties are controlling factors in the tortuousness of the fracture paths. We characterize the deviation of fracture paths from the homogenous case statistically, and conclude that the maximum deviation grows linearly with the distance from the injection point. Additionally, fracture path deviations seem to be normally distributed, suggesting that fracture propagation in the toughness-dominated regime may be described as a random walk.
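The closing statistical claim (fracture-path deviations approximately normally distributed, as for a random walk) can be illustrated with a toy lateral random walk. This is intuition-building only, not the authors' phase-field model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 2000, 200
# Lateral deviation of each path modeled as a random walk with i.i.d. normal steps
steps = rng.normal(0.0, 1.0, size=(n_paths, n_steps))
deviation = steps.cumsum(axis=1)      # lateral offset vs. distance from injection point
final = deviation[:, -1]

# Check normality of the final deviations via standardized skewness and
# excess kurtosis, both of which are ~0 for a normal distribution
z = (final - final.mean()) / final.std()
skew = np.mean(z**3)
kurt_excess = np.mean(z**4) - 3.0
```

A sum of i.i.d. normal steps is exactly normal at every distance, so the moment checks should return values near zero; the interesting question in the paper is how spatial covariance of the material properties changes this picture.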
The Development of Bimodal Bilingualism: Implications for Linguistic Theory.
Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen
2016-01-01
A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.
A novel broadband bi-mode active frequency selective surface
Xu, Yang; Gao, Jinsong; Xu, Nianxi; Shan, Dongzhi; Song, Naitao
2017-05-01
A novel broadband bi-mode active frequency selective surface (AFSS) is presented in this paper. The proposed structure is composed of a periodic array of convoluted square patches and Jerusalem Crosses. According to simulation results, the frequency response of AFSS definitely exhibits a mode switch feature between band-pass and band-stop modes when the diodes stay in ON and OFF states. In order to apply a uniform bias to each PIN diode, an ingenious biasing network based on the extension of Wheatstone bridge is adopted in prototype AFSS. The test results are in good agreement with the simulation results. A further physical mechanism of the bi-mode AFSS is shown by contrasting the distribution of electric field on the AFSS patterns for the two working states.
International Nuclear Information System (INIS)
Halenka, J.; Olchawa, W.
2005-01-01
Experiments [W. Wiese, D. Kelleher, and D. Paquette, Phys. Rev. A 6, 1132 (1972); V. Helbig and K. Nick, J. Phys. B 14, 3573 (1981); J. Halenka, Z. Phys. D 16, 1 (1990); S. Djurovic, D. Nikolic, I. Savic, S. Sorge, and A. V. Demura, Phys. Rev. E 71, 036407 (2005)] show that hydrogen lines formed in plasmas with N_e ≳ 10^16 cm^-3 are asymmetrical. The inhomogeneity of the ionic microfield and the higher-order corrections (quadratic and beyond) in perturbation theory are the reasons for this asymmetry. So far, the ion-emitter quadrupole interaction and the quadratic Stark effect have been included in calculations. Recent work shows that a significant discrepancy between calculations and measurements occurs in the wings of the H-beta line in such plasmas. It should be stressed that, e.g., for the energy operator, the correction raised by the quadratic Stark effect has the same dependence on the emitter-perturber distance as the correction caused by the emitter-perturber octupole interaction and the quadratic correction from the emitter-perturber quadrupole interaction. Thus, a model of the profile calculation is consistent only if all the aforementioned corrections are included simultaneously. Such calculations are planned in a future paper. A statistics of the octupole inhomogeneity tensor in a plasma is needed as the first step of such calculations. For the first time, the distribution functions of the octupole inhomogeneity have been calculated in this paper, using the Mayer-Mayer cluster expansion method, similarly as for the quadrupole function in the paper [J. Halenka, Z. Phys. D 16, 1 (1990)]. The field strength is expressed on a reduced scale relative to the Holtsmark normal field, with a mean distance approximately equal to the mean ion-ion distance; the screening parameter is defined in terms of the electronic Debye radius. (author)
Irreducible complexity of iterated symmetric bimodal maps
Directory of Open Access Journals (Sweden)
J. P. Lampreia
2005-01-01
We introduce a tree structure for the iterates of symmetric bimodal maps and identify a subset which we prove to be isomorphic to the family of unimodal maps. This subset is used as a second factor for a ∗-product that we define in the space of bimodal kneading sequences. Finally, we give some properties for this product and study the ∗-product induced on the associated Markov shifts.
Fragmentation versus stability in bimodal coalitions
Galam, Serge
1996-02-01
Competing bimodal coalitions among a group of actors are discussed. First, a model from political science is revisited. Most of the model statements are found not to be contained in the model. Second, a new coalition model is built. It accounts for local versus global alignment with respect to the joining of a coalition. The existence of two competing world coalitions is found to yield one unique stable distribution of actors. By contrast, a unique world leadership allows the emergence of unstable relationships. In parallel to regular actors, which have a clear coalition choice, “neutral”, “frustrated” and “risky” actors are produced. The cold-war organisation after World War II is shown to be rather stable. The emergence of a fragmentation process from the disappearance of the eastern group is explained, as well as the continuing stability of the western group. Some hints are obtained about possible policies to stabilize world nation relationships. European construction is analyzed with respect to European stability. Chinese stability is also discussed.
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as the extension theorem, construction of measures, integration, product spaces, the Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables are studied.
Energy Technology Data Exchange (ETDEWEB)
Brodie, Jean P.; Conroy, Charlie; Arnold, Jacob A.; Romanowsky, Aaron J. [University of California Observatories and Department of Astronomy and Astrophysics, University of California, Santa Cruz, CA 95064 (United States); Usher, Christopher; Forbes, Duncan A. [Centre for Astrophysics and Supercomputing, Swinburne University, Hawthorn, VIC 3122 (Australia); Strader, Jay, E-mail: brodie@ucolick.org [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States)
2012-11-10
Due to its proximity (9 Mpc) and the strongly bimodal color distribution of its spectroscopically well-sampled globular cluster (GC) system, the early-type galaxy NGC 3115 provides one of the best available tests of whether the color bimodality widely observed in GC systems generally reflects a true metallicity bimodality. Color bimodality has alternatively been attributed to a strongly nonlinear color-metallicity relation reflecting the influence of hot horizontal-branch stars. Here, we couple Subaru Suprime-Cam gi photometry with Keck/DEIMOS spectroscopy to accurately measure GC colors and a CaT index that measures the Ca II triplet. We find the NGC 3115 GC system to be unambiguously bimodal in both color and the CaT index. Using simple stellar population models, we show that the CaT index is essentially unaffected by variations in horizontal-branch morphology over the range of metallicities relevant to GC systems (and is thus a robust indicator of metallicity) and confirm bimodality in the metallicity distribution. We assess the existing evidence for and against multiple metallicity subpopulations in early- and late-type galaxies and conclude that metallicity bi/multimodality is common. We briefly discuss how this fundamental characteristic links directly to the star formation and assembly histories of galaxies.
International Nuclear Information System (INIS)
Shabbir, Aqsa
2016-01-01
In this doctoral work, pattern recognition techniques are developed and applied to data from tokamak plasmas, in order to contribute to a systematic analysis of edge-localized modes (ELMs). We employ probabilistic models for a quantitative data description geared towards an enhanced systematization of ELM phenomenology. Hence, we start from the point of view that the fundamental object resulting from the observation of a system is a probability distribution, with every single measurement providing a sample from this distribution. In exploring the patterns emerging from the various ELM regimes and relations, we need methods that can handle the intrinsic probabilistic nature of the data. The original contributions of this work are twofold. First, several novel pattern recognition methods in non-Euclidean spaces of probability distribution functions (PDFs) are developed and validated. The second main contribution lies in the application of these and other techniques to a systematic analysis of ELMs in tokamak plasmas. In regard to the methodological aims of the work, we employ the framework of information geometry to develop pattern visualization and classification methods in spaces of probability distributions. In information geometry, a family of probability distributions is considered as a Riemannian manifold. Every point on the manifold represents a single PDF and the distribution parameters provide local coordinates on the manifold. The Fisher information plays the role of a Riemannian metric tensor, enabling calculation of geodesic curves on the surface. The length of such curves yields the geodesic distance (GD) on probabilistic manifolds, which is a natural similarity (distance) measure between PDFs. Equipped with a suitable distance measure, we extrapolate several distance-based pattern recognition methods to the manifold setting. This includes k-nearest neighbor (kNN) and conformal predictor (CP) methods for classification, as well as multidimensional scaling.
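For the univariate normal family, the geodesic distance described here has a closed form obtained by mapping (μ, σ) into the hyperbolic half-plane: d = √2·arccosh(1 + ((μ₁−μ₂)²/2 + (σ₁−σ₂)²)/(2σ₁σ₂)). The sketch below computes this Fisher-Rao distance and uses it in a 1-nearest-neighbor rule; the labels are illustrative toys, not the thesis' ELM data:

```python
import math

def fisher_rao_normal(p, q):
    """Geodesic (Fisher-Rao) distance between two univariate normal PDFs,
    each given as (mu, sigma); closed form via the hyperbolic half-plane."""
    (m1, s1), (m2, s2) = p, q
    u = ((m1 - m2) ** 2 / 2.0 + (s1 - s2) ** 2) / (2.0 * s1 * s2)
    return math.sqrt(2.0) * math.acosh(1.0 + u)

# 1-nearest-neighbor classification in the space of PDFs (toy labels)
train = [((0.0, 1.0), "narrow"), ((0.1, 1.2), "narrow"), ((0.0, 5.0), "broad")]

def knn1(query):
    """Label of the training PDF geodesically closest to the query PDF."""
    return min(train, key=lambda t: fisher_rao_normal(query, t[0]))[1]
```

Note that the geodesic distance is small between two broad distributions whose means differ, reflecting the 1/σ² weighting of the Fisher metric, which is exactly why it is preferred over Euclidean parameter distance.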
de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander
2017-11-01
To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
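The Pareto-style procedure used here (rank-order the categories, accumulate shares, cut at a threshold) can be sketched as follows, with hypothetical counts rather than the clinic's data:

```python
# Hypothetical case counts per diagnostic code (not the paper's data)
counts = {"A": 500, "B": 250, "C": 120, "D": 80, "E": 30, "F": 20}

# Rank-order by frequency, then walk the cumulative distribution until
# the chosen threshold (80% of all cases) is covered
ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(counts.values())
cum, top = 0, []
for code, n in ranked:
    cum += n
    top.append(code)
    if cum / total >= 0.8:
        break
```

With these toy counts, three of six codes already cover 80% of the caseload, which is the kind of concentration the Pareto principle exploits for resource allocation.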
Directory of Open Access Journals (Sweden)
Chung-Ho Su
2010-12-01
To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize-entropy-principle approach and the cumulative-probability-distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
Porto, Markus; Roman, H Eduardo
2002-04-01
We consider autoregressive conditional heteroskedasticity (ARCH) processes in which the variance sigma(2)(y) depends linearly on the absolute value of the random variable y as sigma(2)(y) = a+b absolute value of y. While for the standard model, where sigma(2)(y) = a + b y(2), the corresponding probability distribution function (PDF) P(y) decays as a power law for absolute value of y-->infinity, in the linear case it decays exponentially as P(y) approximately exp(-alpha absolute value of y), with alpha = 2/b. We extend these results to the more general case sigma(2)(y) = a+b absolute value of y(q), with 0 process is taken into account, the resulting PDF becomes a stretched exponential even for q = 1, with a stretched exponent beta = 2/3, in a much better agreement with the empirical data.
Energy Technology Data Exchange (ETDEWEB)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)
2012-07-06
Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle analyses.
International Nuclear Information System (INIS)
Tierney, M.S.
1990-12-01
A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
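The Maximum Entropy step in a procedure of this kind can be illustrated with the classic toy problem: construct the least-informative probability mass function on a finite support given only an expert-supplied mean. The support and target mean below are hypothetical, not WIPP parameters:

```python
import numpy as np
from scipy.optimize import brentq

# Max-entropy PMF on support x subject to a mean constraint has the
# Gibbs form p_i ∝ exp(-lam * x_i); solve for lam by root finding.
x = np.arange(1, 7)          # e.g. faces of a six-sided die
target_mean = 4.5            # expert-judged mean (hypothetical)

def mean_given(lam):
    w = np.exp(-lam * x)
    p = w / w.sum()
    return float((p * x).sum())

lam = brentq(lambda L: mean_given(L) - target_mean, -10.0, 10.0)
p = np.exp(-lam * x)
p /= p.sum()
```

Because the target mean exceeds the uniform mean of 3.5, the solver returns a negative multiplier and the resulting PMF puts increasing weight on larger outcomes, the minimal bias consistent with the constraint.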
Subramoni, Sujatha; Florez Salcedo, Diana Vanessa; Suarez-Moreno, Zulma R
2015-01-01
LuxR solo transcriptional regulators contain both an autoinducer binding domain (ABD; N-terminal) and a DNA-binding Helix-Turn-Helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distribution as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of the distribution of all ABD-containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed regarding their taxonomic distribution, the predicted functions of neighboring genes and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of the ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes which are not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of the invariant amino acid residues of the ABD, calling into question their binding to AHLs. In summary, this study provides a detailed overview of the distribution of LuxR solos and their probable roles in bacteria with genome sequence information.
Venturelli, Ophelia S; El-Samad, Hana; Murray, Richard M
2012-11-27
Feedback loops are ubiquitous features of biological networks and can produce significant phenotypic heterogeneity, including a bimodal distribution of gene expression across an isogenic cell population. In this work, a combination of experiments and computational modeling was used to explore the roles of multiple feedback loops in the bimodal, switch-like response of the Saccharomyces cerevisiae galactose regulatory network. Here, we show that bistability underlies the observed bimodality, as opposed to stochastic effects, and that two unique positive feedback loops established by Gal1p and Gal3p, which both regulate network activity by molecular sequestration of Gal80p, induce this bimodality. Indeed, systematically scanning through different single and multiple feedback loop knockouts, we demonstrate that there is always a concentration regime that preserves the system's bimodality, except for the double deletion of GAL1 and the GAL3 feedback loop, which exhibits a graded response for all conditions tested. The constitutive production rates of Gal1p and Gal3p operate as bifurcation parameters because variations in these rates can also abolish the system's bimodal response. Our model indicates that this second loss of bistability ensues from the inactivation of the remaining feedback loop by the overexpressed regulatory component. More broadly, we show that the sequestration binding affinity is a critical parameter that can tune the range of conditions for bistability in a circuit with positive feedback established by molecular sequestration. In this system, two positive feedback loops can significantly enhance the region of bistability and the dynamic response time.
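A minimal positive-feedback motif (a deliberately simplified stand-in, not the authors' GAL-network model) shows how a production-rate parameter acts as a bifurcation parameter: counting the steady states of dx/dt = α + β·x⁴/(K⁴ + x⁴) − γ·x over a grid distinguishes a graded regime (one fixed point) from a bistable one (three fixed points):

```python
import numpy as np

def fixed_points(beta, alpha=0.1, K=1.0, n=4, gamma=1.0):
    """Count steady states of dx/dt = alpha + beta*x^n/(K^n + x^n) - gamma*x
    by detecting sign changes of the right-hand side on a fine grid."""
    x = np.linspace(1e-6, 20.0, 20000)
    f = alpha + beta * x**n / (K**n + x**n) - gamma * x
    return int(np.count_nonzero(np.sign(f[:-1]) != np.sign(f[1:])))
```

With the defaults, an intermediate feedback strength (beta = 2) gives three fixed points (two stable, one unstable), while weak feedback gives one. Raising the basal rate alpha also collapses the system back to a single fixed point, loosely mirroring the paper's observation that an overexpressed regulatory component can abolish bistability.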
Language choice in bimodal bilingual development
Directory of Open Access Journals (Sweden)
Diane eLillo-Martin
2014-10-01
Bilingual children develop sensitivity to the language used by their interlocutors at an early age, reflected in differential use of each language by the child depending on their interlocutor. Factors such as discourse context and relative language dominance in the community may mediate the degree of language differentiation in preschool-age children. Bimodal bilingual children, acquiring both a sign language and a spoken language, have an even more complex situation. Their Deaf parents vary considerably in access to the spoken language. Furthermore, in addition to code-mixing and code-switching, they use code-blending – expressions in both speech and sign simultaneously – an option uniquely available to bimodal bilinguals. Code-blending is analogous to code-switching sociolinguistically, but is also a way to communicate without suppressing one language. For adult bimodal bilinguals, complete suppression of the non-selected language is cognitively demanding. We expect that bimodal bilingual children also find suppression difficult, and use blending rather than suppression in some contexts. We also expect relative community language dominance to be a factor in children's language choices. This study analyzes longitudinal spontaneous production data from four bimodal bilingual children and their Deaf and hearing interlocutors. Even at the earliest observations, the children produced more signed utterances with Deaf interlocutors and more speech with hearing interlocutors. However, while three of the four children produced >75% speech alone in speech target sessions, they produced <25% sign alone in sign target sessions. All four produced bimodal utterances in both, but more frequently in the sign sessions, potentially because they find suppression of the dominant language more difficult. Our results indicate that these children are sensitive to the language used by their interlocutors, while showing considerable influence from the dominant community language.
Resolving the age bimodality of galaxy stellar populations on kpc scales
Zibetti, Stefano; Gallazzi, Anna R.; Ascasibar, Y.; Charlot, S.; Galbany, L.; García Benito, R.; Kehrig, C.; de Lorenzo-Cáceres, A.; Lyubenova, M.; Marino, R. A.; Márquez, I.; Sánchez, S. F.; van de Ven, G.; Walcher, C. J.; Wisotzki, L.
2017-01-01
Galaxies in the local Universe are known to follow bimodal distributions in the global stellar population properties. We analyse the distribution of the local average stellar population ages of 654 053 sub-galactic regions resolved on ˜1 kpc scales in a volume-corrected sample of 394 galaxies, drawn
Bimodal condensation silicone elastomers as dielectric elastomers
DEFF Research Database (Denmark)
Yu, Liyun; Madsen, Frederikke Bahrt; Skov, Anne Ladegaard
Unimodal refers to a system containing only one polymer. As an alternative to unimodal networks there are the bimodal networks, where two polymers with significantly different molecular weights are mixed with one crosslinker. [2] Silicone rubber can be divided into condensation type and addition type according to the curing reaction. The advantages of condensation silicones compared to addition silicones are the relatively low cost, the curing rate being largely independent of temperature, the excellent adhesion, and the catalyst being nontoxic. [3] In this work, a series of bimodal condensation silicone elastomers...
International Nuclear Information System (INIS)
Biyajima, M.; Shirane, K.; Suzuki, N.
1988-01-01
Moments in pseudorapidity intervals at the CERN Sp̄pS collider (√s = 546 GeV) are analyzed by means of two probability distributions in the pure-birth stochastic process. Our results show that a probability distribution obtained from the Poisson distribution as an initial condition is more useful than that obtained from the Kronecker δ function. Analyses of moments by Koba-Nielsen-Olesen scaling functions derived from solutions of the pure-birth stochastic process are also made. Moreover, analyses of preliminary data at √s = 200 and 900 GeV are added.
Bimodal and Gaussian Ising spin glasses in dimension two
Lundow, P. H.; Campbell, I. A.
2016-02-01
An analysis is given of numerical simulation data to size L = 128 on the archetype square lattice Ising spin glasses (ISGs) with bimodal (±J) and Gaussian interaction distributions. It is well established that the ordering temperature of both models is zero. The Gaussian model has a nondegenerate ground state and thus a critical exponent η ≡ 0, and a continuous distribution of energy levels. For the bimodal model, above a size-dependent crossover temperature T*(L) there is a regime of effectively continuous energy levels; below T*(L) there is a distinct regime dominated by the highly degenerate ground state plus an energy gap to the excited states. T*(L) tends to zero at very large L, leaving only the effectively continuous regime in the thermodynamic limit. The simulation data on both models are analyzed with the conventional scaling variable t = T and with a scaling variable τ_b = T²/(1 + T²) suitable for zero-temperature transition ISGs, together with appropriate scaling expressions. The data for the temperature dependence of the reduced susceptibility χ(τ_b, L) and second-moment correlation length ξ(τ_b, L) in the thermodynamic limit regime are extrapolated to the τ_b = 0 critical limit. The Gaussian critical exponent estimates from the simulations, η = 0 and ν = 3.55(5), are in full agreement with the well-established values in the literature. The bimodal critical exponents, estimated from the thermodynamic limit regime analyses using the same extrapolation protocols as for the Gaussian model, are η = 0.20(2) and ν = 4.8(3), distinctly different from the Gaussian critical exponents.
Refining Bimodal Microstructure of Materials with MSTRUCT
Czech Academy of Sciences Publication Activity Database
Matěj, Z.; Kadlecová, A.; Janeček, M.; Matějová, Lenka; Dopita, M.; Kužel, R.
2014-01-01
Roč. 29, S2 (2014), S35-S41 ISSN 0885-7156 R&D Projects: GA ČR GA14-23274S Grant - others:UK(CZ) UNCE 204023/2012 Institutional support: RVO:67985858 Keywords : XRD * bimodal * crystallite size Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 0.636, year: 2014
Deaf Children's Bimodal Bilingualism and Education
Swanwick, Ruth
2016-01-01
This paper provides an overview of the research into deaf children's bilingualism and bilingual education through a synthesis of studies published over the last 15 years. This review brings together the linguistic and pedagogical work on bimodal bilingualism to inform educational practice. The first section of the review provides a synthesis of…
Nonlinear dynamics of the bimodal optical computer
Caulfield, H. John
1999-03-01
In the bimodal optical computer, linear and nonlinear acts occur in rapid succession, generating solutions to Ax = b. Both chaos and stochastic resonance can appear in some cases. This is the first observation of such complexity effects in optical processors.
Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow-banded processes dominated by a single frequency, as well as for bimodal processes with two dominating frequencies in the structural response.
Energy Technology Data Exchange (ETDEWEB)
Tierney, M.S.
1991-11-01
The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; these preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." The WIPP system's overall probability distribution is calculated for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events or features.
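The combinatorics of the scenario classes are simple to make concrete. The sketch below enumerates the sixteen yes/no combinations of the four postulated events and features; the per-event probabilities and the independence assumption are purely illustrative, not values from the report.

```python
from itertools import product

# The four postulated events/features named in the report
events = [
    "attempted boreholes over rooms and drifts",
    "mining alters ground-water regime",
    "water-withdrawal wells provide alternate pathways",
    "brine pocket below room or drift",
]

# A scenario class is a yes/no combination of the four events: 2**4 = 16 classes
scenario_classes = list(product([False, True], repeat=len(events)))

# Probability of one class under a (purely illustrative) independence assumption
def class_probability(occurs, p_event):
    p = 1.0
    for happened, pe in zip(occurs, p_event):
        p *= pe if happened else (1.0 - pe)
    return p

p_event = [0.10, 0.20, 0.15, 0.30]  # hypothetical per-event probabilities
total = sum(class_probability(c, p_event) for c in scenario_classes)
```

The class probabilities sum to one by construction, which is a useful sanity check before weighting release estimates by scenario.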
Tsai, Keng-Chang; Jian, Jhih-Wei; Yang, Ei-Wen; Hsu, Po-Chiang; Peng, Hung-Pin; Chen, Ching-Tai; Chen, Jun-Bo; Chang, Jeng-Yih; Hsu, Wen-Lian; Yang, An-Suei
2012-01-01
Non-covalent protein-carbohydrate interactions mediate molecular targeting in many biological processes. Prediction of non-covalent carbohydrate binding sites on protein surfaces not only provides insights into the functions of the query proteins; information on key carbohydrate-binding residues could suggest site-directed mutagenesis experiments, design therapeutics targeting carbohydrate-binding proteins, and provide guidance in engineering protein-carbohydrate interactions. In this work, we show that non-covalent carbohydrate binding sites on protein surfaces can be predicted with relatively high accuracy when the query protein structures are known. The prediction capabilities were based on a novel encoding scheme of the three-dimensional probability density maps describing the distributions of 36 non-covalent interacting atom types around protein surfaces. One machine learning model was trained for each of the 30 protein atom types. The machine learning algorithms predicted tentative carbohydrate binding sites on query proteins by recognizing the characteristic interacting atom distribution patterns specific for carbohydrate binding sites from known protein structures. The prediction results for all protein atom types were integrated into surface patches as tentative carbohydrate binding sites based on normalized prediction confidence level. The prediction capabilities of the predictors were benchmarked by a 10-fold cross validation on 497 non-redundant proteins with known carbohydrate binding sites. The predictors were further tested on an independent test set with 108 proteins. The residue-based Matthews correlation coefficient (MCC) for the independent test was 0.45, with prediction precision and sensitivity (or recall) of 0.45 and 0.49 respectively. In addition, 111 unbound carbohydrate-binding protein structures for which the structures were determined in the absence of the carbohydrate ligands were predicted with the trained predictors. The overall
Merging history of three bimodal clusters
Maurogordato, S.; Sauvageot, J. L.; Bourdin, H.; Cappi, A.; Benoist, C.; Ferrari, C.; Mars, G.; Houairi, K.
2011-01-01
We present a combined X-ray and optical analysis of three bimodal galaxy clusters selected as merging candidates at z ~ 0.1. These targets are part of MUSIC (MUlti-Wavelength Sample of Interacting Clusters), which is a general project designed to study the physics of merging clusters by means of multi-wavelength observations. Observations include spectro-imaging with the XMM-Newton EPIC camera, multi-object spectroscopy (260 new redshifts), and wide-field imaging at the ESO 3.6 m and 2.2 m telescopes. We build a global picture of these clusters using X-ray luminosity and temperature maps together with galaxy density and velocity distributions. Idealized numerical simulations were used to constrain the merging scenario for each system. We show that A2933 is very likely an equal-mass advanced pre-merger ~200 Myr before the core collapse, while A2440 and A2384 are post-merger systems (~450 Myr and ~1.5 Gyr after core collapse, respectively). In the case of A2384, we detect a spectacular filament of galaxies and gas spreading over more than 1 h⁻¹ Mpc, which we infer to have been stripped during the previous collision. The analysis of the MUSIC sample allows us to outline some general properties of merging clusters: a strong luminosity segregation of galaxies in recent post-mergers; the existence of preferential axes, corresponding to the merging directions, along which the BCGs and structures on various scales are aligned; the concomitance, in most major merger cases, of secondary merging or accretion events, with groups infalling onto the main cluster; and in some cases the evidence of previous merging episodes in one of the main components. These results are in good agreement with the hierarchical scenario of structure formation, in which clusters are expected to form by successive merging events, and matter is accreted along large-scale filaments. Based on data obtained with the European Southern Observatory, Chile (programs 072.A-0595, 075.A-0264, and 079.A-0425).
Isomap nonlinear dimensionality reduction and bimodality of Asian monsoon convection
Hannachi, A.; Turner, A. G.
2013-04-01
It is known that the empirical orthogonal function method is unable to detect possible nonlinear structure in climate data. Here, isometric feature mapping (Isomap), as a tool for nonlinear dimensionality reduction, is applied to 1958-2001 ERA-40 sea-level pressure anomalies to study nonlinearity of the Asian summer monsoon intraseasonal variability. Using the leading two Isomap time series, the probability density function is shown to be bimodal. A two-dimensional bivariate Gaussian mixture model is then applied to identify the monsoon phases, the obtained regimes representing enhanced and suppressed phases, respectively. The relationship with the large-scale seasonal mean monsoon indicates that the frequency of monsoon regime occurrence is significantly perturbed in agreement with conceptual ideas, with preference for enhanced convection on intraseasonal time scales during large-scale strong monsoons. Trend analysis suggests a shift in concentration of monsoon convection, with less emphasis on South Asia and more on the East China Sea.
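The two-step analysis described above, nonlinear dimensionality reduction followed by a two-component Gaussian mixture, can be sketched with standard scikit-learn tools. The data here are synthetic (points on a noisy curve in 10-D with a bimodal density), standing in for the ERA-40 sea-level pressure anomalies, which are not reproduced here.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for the SLP anomaly fields: points on a curve in 10-D space,
# with a bimodal density along the curve (two "regimes")
t = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(2.0, 1.0, 200)])
X = np.column_stack([np.cos(t), np.sin(t), 0.1 * t]) @ rng.normal(size=(3, 10))

# Isomap: nonlinear dimensionality reduction to the leading two components
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

# Two-component bivariate Gaussian mixture to identify the two phases
gmm = GaussianMixture(n_components=2, random_state=0).fit(emb)
labels = gmm.predict(emb)
```

In the paper's setting, the mixture components would correspond to the enhanced and suppressed monsoon phases; here they simply label the two density modes of the toy data.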
Phenotypic Diversity Using Bimodal and Unimodal Expression of Stress Response Proteins.
Garcia-Bernardo, Javier; Dunlop, Mary J
2016-05-24
Populations of cells need to express proteins to survive the sudden appearance of stressors. However, these mechanisms may be taxing. Populations can introduce diversity, allowing individual cells to stochastically switch between fast-growing and stress-tolerant states. One way to achieve this is to use genetic networks coupled with noise to generate bimodal distributions with two distinct subpopulations, each adapted to a stress condition. Another survival strategy is to rely on random fluctuations in gene expression to produce continuous, unimodal distributions of the stress response protein. To quantify the environmental conditions where bimodal versus unimodal expression is beneficial, we used a differential evolution algorithm to evolve optimal distributions of stress response proteins given environments with sudden fluctuations between low and high stress. We found that bimodality evolved for a large range of environmental conditions. However, we asked whether these findings were an artifact of considering two well-defined stress environments (low and high stress). As noise in the environment increases, or when there is an intermediate environment (medium stress), the benefits of bimodality decrease. Our results indicate that under realistic conditions, a continuum of resistance phenotypes generated through a unimodal distribution is sufficient to ensure survival without a high cost to the population. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
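The trade-off between the two strategies can be illustrated with a toy calculation. The sketch below samples a bimodal population (two subpopulations, fast-growing and stress-tolerant) and a unimodal population with the same mean, then compares the fraction of cells above a survival threshold; all numbers are hypothetical and this is not the differential-evolution model of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Bimodal strategy: 80% fast-growing (low protein), 20% stress-tolerant (high protein)
low = rng.normal(10.0, 2.0, size=int(0.8 * n))
high = rng.normal(50.0, 5.0, size=n - int(0.8 * n))
bimodal = np.concatenate([low, high])

# Unimodal strategy: one broad continuous distribution with the same mean
target_mean = bimodal.mean()
unimodal = rng.gamma(shape=2.0, scale=target_mean / 2.0, size=n)

threshold = 40.0  # hypothetical protein level needed to survive high stress
surv_bimodal = (bimodal > threshold).mean()
surv_unimodal = (unimodal > threshold).mean()
```

With these (arbitrary) parameters the bimodal population keeps a dedicated tolerant subpopulation above the threshold, while the unimodal one relies on its upper tail; shifting the threshold or adding environmental noise changes which strategy wins, which is the paper's central point.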
The Efficiency of the Bimodal System Transportation
Directory of Open Access Journals (Sweden)
Nada Štrumberger
2012-10-01
The development of fast railways results in an increased application of the Trailer Train bimodal system of transportation. Traffic costs are reduced several-fold, particularly the variable costs. On the other hand, environmental pollution from exhaust gases is also reduced. Therefore, by the year 2010 cargo transport should be predominantly carried by fast electric trains, producing less noise, at lower costs, and with a clean environment.
Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K
2016-01-01
Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.
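The model-selection procedure described above, maximum-likelihood fits of lognormal, gamma, and Weibull distributions compared by AIC, can be sketched with `scipy.stats`. The data below are synthetic stand-ins for the 98 incubation periods (the real case data are not reproduced here).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-in for the 98 incubation periods (days)
data = rng.lognormal(mean=3.1, sigma=0.65, size=98)

def aic(loglik, k):
    """Akaike information criterion for k estimated parameters."""
    return 2 * k - 2 * loglik

results = {}
for name, dist in [("lognormal", stats.lognorm),
                   ("gamma", stats.gamma),
                   ("weibull", stats.weibull_min)]:
    params = dist.fit(data, floc=0)  # fix location at zero, as for lifetime data
    loglik = np.sum(dist.logpdf(data, *params))
    results[name] = aic(loglik, k=len(params) - 1)  # loc fixed, not estimated

best = min(results, key=results.get)  # distribution with the lowest AIC
```

The paper's conclusion (lognormal best, mean 27.30 days) comes from the real epidemic data; with other data the AIC ranking can of course differ.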
Chrystal, A.; Heikoop, J. M.; Davis, P.; Syme, J.; Hagerty, S.; Perkins, G.; Larson, T. E.; Longmire, P.; Fessenden, J. E.
2010-12-01
Elevated nitrate (NO3-) concentrations in drinking water pose a health risk to the public. The dual stable isotopic signatures of δ15N and δ18O in NO3- in surface- and groundwater are often used to identify and distinguish among sources of NO3- (e.g., sewage, fertilizer, atmospheric deposition). In oxic groundwaters where no denitrification is occurring, direct calculations of mixing fractions using a mass balance approach can be performed if three or fewer sources of NO3- are present, and if the stable isotope ratios of the source terms are defined. There are several limitations to this approach. First, direct calculations of mixing fractions are not possible when four or more NO3- sources may be present. Simple mixing calculations also rely upon treating source isotopic compositions as a single value; however these sources themselves exhibit ranges in stable isotope ratios. More information can be gained by using a probabilistic approach to account for the range and distribution of stable isotope ratios in each source. Fitting probability density functions (PDFs) to the isotopic compositions for each source term reveals that some values within a given isotopic range are more likely to occur than others. We compiled a data set of dual isotopes in NO3- sources by combining our measurements with data collected through extensive literature review. We fit each source term with a PDF, and show a new method to probabilistically solve multiple component mixing scenarios with source isotopic composition uncertainty. This method is based on a modified use of a tri-linear diagram. First, source term PDFs are sampled numerous times using a variation of stratified random sampling, Latin Hypercube Sampling. For each set of sampled source isotopic compositions, a reference point is generated close to the measured groundwater sample isotopic composition. This point is used as a vertex to form all possible triangles between all pairs of sampled source isotopic compositions
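The Latin Hypercube step of the procedure above can be sketched directly with `scipy.stats.qmc`. The source terms below (means and standard deviations of δ15N for three NO3- sources) are hypothetical placeholders, not the compiled literature values.

```python
import numpy as np
from scipy.stats import qmc, norm

# Hypothetical d15N source terms: (mean, std) per NO3- source, in per mil
sources = {
    "sewage": (10.0, 3.0),
    "fertilizer": (0.0, 2.0),
    "atmospheric": (70.0, 10.0),
}

# Latin Hypercube Sampling: stratified draws from each source's PDF
sampler = qmc.LatinHypercube(d=len(sources), seed=0)
u = sampler.random(n=1000)  # uniform [0, 1) samples, one column per source

# Map the uniform strata through each source's inverse CDF (here normal PDFs)
samples = np.column_stack([
    norm.ppf(u[:, j], loc=m, scale=s)
    for j, (m, s) in enumerate(sources.values())
])
```

Each row of `samples` is one joint draw of source compositions, which would then feed the tri-linear mixing construction described in the abstract; the dual-isotope (δ15N, δ18O) case simply doubles the number of sampled dimensions.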
Bimodal nature in low-energy fission of light actinides
International Nuclear Information System (INIS)
Nagame, Yuichiro; Nishinaka, Ichiro; Tsukada, Kazuaki; Ikezoe, Hiroshi; Otsuki, Tsutomu; Sueki, Keisuke; Nakahara, Hiromichi; Kudo, Hisaaki.
1995-01-01
To solve various problems in the mass division process of light actinides, experiments based on bimodal fission were carried out. Mass and kinetic energy distributions of Th-232 and U-238 were determined. The fissioning nuclei Pa-225 (N = 134) and Pa-227 (N = 136) were produced by Bi-209 + O-16 and Bi-209 + O-18 heavy-ion reactions, and the mass yield distributions were determined by the time-of-flight method and by radiochemical procedures. From the results, two independent deformation processes were demonstrated in the fission of light actinide nuclei. In the process passing through the low fission barrier, the nucleus fissioned after small deformation under the influence of the shell-structure stabilization of the fission products. In the process through the high barrier, however, the nucleus fissioned after large deformation. The asymmetric mass division derived from the former and the symmetric one from the latter. (S.Y.)
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
Resolving the age bimodality of galaxy stellar populations on kpc scales
Zibetti, Stefano; Gallazzi, Anna R.; Ascasibar, Y.; Charlot, S.; Galbany, L.; García Benito, R.; Kehrig, C.; de Lorenzo-Cáceres, A.; Lyubenova, M.; Marino, R. A.; Márquez, I.; Sánchez, S. F.; van de Ven, G.; Walcher, C. J.; Wisotzki, L.
2017-06-01
Galaxies in the local Universe are known to follow bimodal distributions in the global stellar population properties. We analyse the distribution of the local average stellar population ages of 654 053 sub-galactic regions resolved on ˜1 kpc scales in a volume-corrected sample of 394 galaxies, drawn from the Calar Alto Legacy Integral Field Area (CALIFA) DR3 integral-field-spectroscopy survey and complemented by Sloan Digital Sky Survey (SDSS) imaging. We find a bimodal local-age distribution, with an old and a young peak primarily due to regions in early-type galaxies and star-forming regions of spirals, respectively. Within spiral galaxies, the older ages of bulges and interarm regions relative to spiral arms support an internal age bimodality. Although regions of higher stellar mass surface density, μ*, are typically older, μ* alone does not determine the stellar population age and a bimodal distribution is found at any fixed μ*. We identify an 'old ridge' of regions of age ˜9 Gyr, independent of μ*, and a 'young sequence' of regions with age increasing with μ* from 1-1.5 to 4-5 Gyr. We interpret the former as regions containing only old stars, and the latter as regions where the relative contamination of old stellar populations by young stars decreases as μ* increases. The reason why this bimodal age distribution is not inconsistent with the unimodal shape of the cosmic-averaged star formation history is that (I) the dominating contribution by young stars biases the age low with respect to the average epoch of star formation, and (II) the use of a single average age per region is unable to represent the full time extent of the star formation history of 'young sequence' regions.
Emergent bimodality and switch induced by time delays and noises in a synthetic gene circuit
Zhang, Chun; Du, Liping; Xie, Qingshuang; Wang, Tonghuan; Zeng, Chunhua; Nie, Linru; Duan, Weilong; Jia, Zhenglin; Wang, Canjun
2017-10-01
Based on the kinetic model for obtaining emergent bistability proposed by Tan et al. (2009), the effects of the fluctuations of protein synthesis rate and maximum dilution rate, the cross-correlation between the two noises, and the time delay and the strength of the feedback loop in the synthetic gene circuit have been investigated through theoretical analysis and numerical simulation. Our results show that: (i) the fluctuations of protein synthesis rate and maximum dilution rate enhance the emergent bimodality of the probability distribution, while the cross-correlation between the two noises (λ), the time delay (τ), and the strength of the feedback loop (K) cause it to disappear; and (ii) the mean first passage time (MFPT) as a function of the noise strengths exhibits a maximum; this maximum is called noise-delayed switching (NDS) of the high concentration state. The NDS phenomenon shows that noise can modify the stability of a metastable system in a counterintuitive way: the system remains in the metastable state for a longer time compared to the deterministic case. Both τ and K enhance the stability of the ON state. The physical mechanisms for the switch between the ON and OFF states can be explained from the point of view of the effective potential.
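The MFPT computation underlying point (ii) can be sketched with an Euler-Maruyama simulation. The double-well system below is a generic illustrative bistable model, not the gene-circuit equations of Tan et al.; it shows only how an MFPT from the ON state depends on noise strength.

```python
import numpy as np

def mean_first_passage_time(D, n_paths=500, dt=0.01, t_max=50.0, seed=3):
    """Euler-Maruyama estimate of the MFPT from the high state (x = 1) past the
    barrier (x = 0) for a generic bistable system dx = (x - x**3) dt + sqrt(2D) dW.
    This is an illustrative double well, not Tan et al.'s gene-circuit model."""
    rng = np.random.default_rng(seed)
    x = np.ones(n_paths)
    alive = np.ones(n_paths, dtype=bool)
    times = np.full(n_paths, t_max)  # paths that never cross are censored at t_max
    for k in range(int(t_max / dt)):
        drift = (x - x**3) * dt
        kick = np.sqrt(2.0 * D * dt) * rng.standard_normal(n_paths)
        x = np.where(alive, x + drift + kick, x)  # freeze paths that have escaped
        crossed = alive & (x < 0.0)
        times[crossed] = (k + 1) * dt
        alive &= ~crossed
        if not alive.any():
            break
    return times.mean()

# Weaker noise keeps the system in the metastable ON state longer
mfpt_weak = mean_first_passage_time(D=0.15)
mfpt_strong = mean_first_passage_time(D=0.30)
```

In this monotone regime the MFPT simply decreases with noise strength; the NDS maximum reported in the paper arises from the interplay of the two correlated noises and the delay, which this minimal sketch does not include.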
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F…
Directory of Open Access Journals (Sweden)
Luis Vicente Chamorro Marcillo
2013-06-01
Engineering, in both its academic and applied forms, like any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Management of those tables typically presents physical problems (wasteful transport and consultation) and operational problems (incomplete lists and limited accuracy). The study, "Probability distribution function values in mobile phones", permitted determining, through a needs survey applied to students involved in statistics studies at Universidad de Nariño, that the best-known and most-used values correspond to the Chi-Square, Binomial, Student's t, and Standard Normal distributions. Similarly, it showed users' interest in having the values in question available in an alternative medium to correct, at least in part, the problems presented by "the famous tables". To contribute to the solution, we built software that allows immediately and dynamically obtaining, on mobile phones, the values of the most commonly used probability distribution functions.
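The four distributions the survey identified can be evaluated programmatically rather than from tables; the study's software targeted mobile phones, but the same table lookups reduce to one-line calls in, for example, `scipy.stats` (the specific arguments below are illustrative):

```python
from scipy import stats

# Replacing "the famous tables" for the four distributions named in the survey
values = {
    "chi2 (df=10), P(X <= 18.31)": stats.chi2.cdf(18.31, df=10),
    "binomial (n=20, p=0.5), P(X <= 12)": stats.binom.cdf(12, n=20, p=0.5),
    "Student t (df=15), 97.5% quantile": stats.t.ppf(0.975, df=15),
    "standard normal, P(Z <= 1.96)": stats.norm.cdf(1.96),
}
```

Unlike printed tables, these calls give full floating-point accuracy at arbitrary arguments, which addresses the "incomplete lists and limited accuracy" problem the study describes.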
Movement, drivers and bimodality of the South Asian High
Directory of Open Access Journals (Sweden)
M. Nützel
2016-11-01
The South Asian High (SAH) is an important component of the summer monsoon system in Asia. In this study we investigate the location and drivers of the SAH at 100 hPa during the boreal summers of 1979 to 2014 on interannual, seasonal, and synoptic timescales using seven reanalyses and observational data. Our comparison of the different reanalyses focuses especially on the bimodality of the SAH, i.e. the two preferred modes of the SAH centre location: the Iranian Plateau to the west and the Tibetan Plateau to the east. We find that only the National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis shows a clear bimodal structure of the SAH centre distribution with respect to daily and pentad (5-day mean) data. Furthermore, the distribution of the SAH centre location is highly variable from year to year. As in simple model studies, which connect the SAH to heating in the tropics, we find that the mean seasonal cycle of the SAH and its centre are dominated by the expansion of convection in the South Asian region (70–130° E × 15–30° N) on the south-eastern border of the SAH. A composite analysis of precipitation and outgoing long-wave radiation data with respect to the location of the SAH centre reveals that a more westward (eastward) location of the SAH is related to stronger (weaker) convection and rainfall over India and weaker (stronger) precipitation over the western Pacific.
DEFF Research Database (Denmark)
Dimitrov, Nikolay Krasimirov
2016-01-01
… extrapolation techniques: the Weibull, Gumbel and Pareto distributions, and a double-exponential asymptotic extreme value function based on the ACER method. For the successful implementation of a fully automated extrapolation process, we have developed a procedure for automatic identification of tail threshold levels, based on the assumption that the response tail is asymptotically Gumbel distributed. Example analyses were carried out, aimed at comparing the different methods, analysing the statistical uncertainties, and identifying the factors which are critical to the accuracy and reliability …
Penetration in bimodal, polydisperse granular material
Kouraytem, N.
2016-11-07
We investigate the impact penetration of spheres into granular media that are compositions of two discrete size ranges, thus creating a polydisperse bimodal material. We examine the penetration depth as a function of the composition (volume fractions of the respective sizes) and impact speed. Penetration depths were found to vary between δ = 0.5D₀ and δ = 7D₀, which, for mono-modal media only, could be correlated in terms of the total drop height, H = h + δ, as in previous studies, by incorporating correction factors for the packing fraction. Bimodal data can only be collapsed by deriving a critical packing fraction for each mass fraction. The data for the mixed grains exhibit a surprising lubricating effect, which was most significant when the finest grains [d_s ~ O(30) μm] were added to the larger particles [d_l ~ O(200–500) μm], with a size ratio ε = d_l/d_s larger than 3 and mass fractions over 25%, despite the increased packing fraction. We postulate that the small grains get between the large grains and reduce their intergrain friction, but only when their mass fraction is sufficiently large to prevent them from simply rattling in the voids between the large particles. This is supported by our experimental observations of the largest lubrication effect produced by adding small glass beads to a bed of large sand particles with rough surfaces.
Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei
2012-01-01
Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with
Directory of Open Access Journals (Sweden)
Laktineh Imad
2010-04-01
This course constitutes a brief introduction to probability applications in high energy physics. First, the mathematical tools related to the different probability concepts are introduced. The probability distributions commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
A hydroclimatological approach to predicting regional landslide probability using Landlab
Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.
2018-02-01
We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
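The Monte Carlo core of such a model, sampling uncertain soil and hydrologic parameters and counting factor-of-safety failures, can be sketched compactly. The parameter distributions and the simplified infinite-slope formula below are illustrative assumptions, not the calibrated Landlab component.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000  # Monte Carlo iterations

# Hypothetical parameter distributions for one grid cell
phi = np.radians(rng.uniform(30.0, 40.0, n))    # internal friction angle
coh = rng.uniform(2e3, 8e3, n)                  # combined root + soil cohesion [Pa]
theta = np.radians(35.0)                        # slope angle
z = rng.triangular(0.5, 1.5, 3.0, n)            # soil depth [m]
w = np.clip(rng.normal(0.6, 0.2, n), 0.0, 1.0)  # relative wetness (recharge-driven)

rho_s, rho_w, g = 1800.0, 1000.0, 9.81  # soil and water densities, gravity

# Simplified infinite-slope factor of safety with steady subsurface flow
fs = (np.tan(phi) / np.tan(theta)) * (1.0 - w * rho_w / rho_s) + coh / (
    rho_s * g * z * np.sin(theta) * np.cos(theta))

# Annual probability of landslide initiation for this cell
p_failure = np.mean(fs < 1.0)
```

Repeating this per grid cell, with the wetness term driven by annual maximum recharge from a hydrologic model, yields the spatial probability maps whose bimodal distribution the paper reports.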
Directory of Open Access Journals (Sweden)
Dolićanin-Đekić Diana
2013-01-01
This paper investigates the possibility of distinguishing between the effects of radiation coming from two or more radioactive isotopes, by using methods of statistical mathematics. The procedure uses a mixed distribution of an additive type. Mathematical treatment is demonstrated herein on an example of analysis of composite radiation from two radioactive sources.
Directory of Open Access Journals (Sweden)
J.W. Love
2017-04-01
Where FEC data were obtained with less sensitive counting techniques (i.e. McMaster 30 or 15 epg), zero-inflated distributions and their associated central tendency were the most appropriate and would be recommended for use, i.e. the arithmetic group mean divided by the proportion of non-zero counts present; otherwise apparent anthelmintic efficacy could be misrepresented.
Speech Recognition and Cognitive Skills in Bimodal Cochlear Implant Users
Hua, Håkan; Johansson, Björn; Magnusson, Lennart; Lyxell, Björn; Ellis, Rachel J.
2017-01-01
Purpose: To examine the relation between speech recognition and cognitive skills in bimodal cochlear implant (CI) and hearing aid users. Method: Seventeen bimodal CI users (28-74 years) were recruited to the study. Speech recognition tests were carried out in quiet and in noise. The cognitive tests employed included the Reading Span Test and the…
Energy Technology Data Exchange (ETDEWEB)
Rechard, Rob P. [Department of Performance Assessment 6853, Sandia National Laboratories MS-0776, P.O. Box 5800, Albuquerque, NM 87185-0776 (United States)]. E-mail: rprecha@sandia.gov; Tierney, Martin S. [Plantae Research Associates, 415 Camino Manzano Santa Fe, NM 87505 (United States)
2005-04-01
A managed process was used to consistently and traceably develop probability distributions for parameters representing epistemic uncertainty in four preliminary performance assessments (PAs) and the final 1996 PA for the Waste Isolation Pilot Plant. Using a process described in a companion paper, a parameter development team assigned between 67 probability density functions (PDFs) in the 1989 PA and 236 PDFs in the 1996 PA. Across the five iterative PAs, the most commonly used distributions were the uniform PDF and the piecewise-uniform PDF (also referred to as a piecewise-linear cumulative distribution function (CDF)). Other distributions used included the truncated normal, truncated Student-t, and triangular PDFs. In a few instances, discrete delta (piecewise-uniform CDF), beta, and exponential PDFs were also used. The PDFs produced for the 24 most important parameters observed in the five iterative PAs are presented. As background, the parameter lists are also provided, ranging from the 194 parameters documented in the first 1989 PA to the 1471 parameters documented in the 1996 compliance PA.
Energy Technology Data Exchange (ETDEWEB)
Rechard, Rob P. [Performance Assessment Department 6853, MS-0776, P.O. Box 5800, Sandia National Laboratories, Albuquerque, NM 87185-0776 (United States)]. E-mail: rprecha@sandia.gov; Tierney, Martin S. [Plantae Research Associates, 415 Camino Manzano Santa Fe, New Mexico 87505 (United States)
2005-04-01
A managed process was used to consistently and traceably develop probability distributions for parameters representing epistemic uncertainty in four preliminary and the final 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The key to the success of the process was the use of a three-member team consisting of a Parameter Task Leader, PA Analyst, and Subject Matter Expert. This team, in turn, relied upon a series of guidelines for selecting distribution types. The primary function of the guidelines was not to constrain the actual process of developing a parameter distribution but rather to establish a series of well-defined steps where recognized methods would be consistently applied to all parameters. An important guideline was to use a small set of distributions satisfying the maximum entropy formalism. Another important guideline was the consistent use of the log transform for parameters with large ranges (i.e. maximum/minimum > 10^3). A parameter development team assigned 67 probability density functions (PDFs) in the 1989 PA and 236 PDFs in the 1996 PA using these and other guidelines described.
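The log-transform guideline above can be illustrated with a small sketch (the parameter bounds below are hypothetical, not WIPP data): when a parameter's range spans more than three orders of magnitude (maximum/minimum > 10^3), sampling uniformly in log space keeps all decades equally represented, whereas a linear-uniform sample would concentrate near the upper end of the range.

```python
import math
import random

def sample_parameter(p_min, p_max, n, rng=random):
    """Sample a parameter n times, switching to a log-uniform
    distribution when the range spans more than three decades."""
    if p_max / p_min > 1e3:
        lo, hi = math.log(p_min), math.log(p_max)
        return [math.exp(rng.uniform(lo, hi)) for _ in range(n)]
    return [rng.uniform(p_min, p_max) for _ in range(n)]

# Hypothetical permeability-like bounds spanning six decades
rng = random.Random(42)
samples = sample_parameter(1e-6, 1.0, 10_000, rng)
# The median of a log-uniform sample sits near the geometric mean
# (1e-3 here), whereas a linear-uniform median would sit near 0.5.
```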
Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.
2014-01-01
Crayfishes and other freshwater aquatic fauna are particularly at risk globally due to anthropogenic demand, manipulation and exploitation of freshwater resources and yet are often understudied. The Ozark faunal region of Missouri and Arkansas harbours a high level of aquatic biological diversity, especially in regard to endemic crayfishes. Three such endemics, Orconectes eupunctus, Orconectes marchandi and Cambarus hubbsi, are threatened by limited natural distribution and the invasions of Orconectes neglectus.
Bimodal condensation silicone elastomers as dielectric elastomers
DEFF Research Database (Denmark)
Yu, Liyun; Madsen, Frederikke Bahrt; Skov, Anne Ladegaard
as well as high electrical and mechanical breakdown strengths. [1] Most model elastomers are prepared by an end-linking process using a crosslinker with a certain functionality ƒ and a linear polymer with functional groups at both ends; the resulting networks are so-called unimodal networks, where unimodal refers to there being only one polymer in the system. As an alternative to unimodal networks there are bimodal networks, where two polymers with significantly different molecular weights are mixed with one crosslinker. [2] Silicone rubber can be divided into condensation type and addition type ... elastomers were prepared by mixing different mass ratios (9:1, 8:2, 7:3, 6:4, 5:5, 4:6) of long polydimethylsiloxane (PDMS) chains and short PDMS chains. The resulting elastomers were investigated with respect to their rheology, dielectric properties, tensile strength, and electrical breakdown, as well ...
Ben Issaid, Chaouki
2016-06-01
The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channels is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion, a desirable property characterizing the robustness of importance sampling schemes, for both independent identically distributed and independent non-identically distributed cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
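The paper's exact mean-shift scheme is not reproduced here, but the underlying idea can be sketched: sample from a shifted proposal under which the rare left-tail event {sum < threshold} occurs often, then unbias each sample with the likelihood ratio. In the sketch below each Gamma-Gamma variate is the product of two unit-scale Gamma variables, the proposal shrinks the Gamma scale by a factor c < 1, and the shape parameters, shrink factor, and threshold are illustrative assumptions; the threshold is kept moderate so naive Monte Carlo can serve as a cross-check.

```python
import math
import random

def gg_sum(L, k1, k2, rng):
    """Sum of L Gamma-Gamma variates (each a product of two unit-scale Gammas)."""
    return sum(rng.gammavariate(k1, 1.0) * rng.gammavariate(k2, 1.0)
               for _ in range(L))

def outage_naive(L, k1, k2, thr, n, seed=1):
    """Naive Monte Carlo estimate of P(sum < thr)."""
    rng = random.Random(seed)
    return sum(gg_sum(L, k1, k2, rng) < thr for _ in range(n)) / n

def outage_is(L, k1, k2, thr, n, c=0.5, seed=2):
    """Scale-shift importance sampling: draw the underlying Gamma
    variables at a shrunken scale c < 1 so the left-tail event is
    frequent, then reweight by the likelihood ratio."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        w, s = 1.0, 0.0
        for _ in range(L):
            y = rng.gammavariate(k1, c)
            z = rng.gammavariate(k2, c)
            # likelihood ratio f_Gamma(k,1)(x) / f_Gamma(k,c)(x)
            w *= (c ** (k1 + k2)) * math.exp((y + z) * (1.0 / c - 1.0))
            s += y * z
        if s < thr:
            acc += w
    return acc / n

p_mc = outage_naive(2, 2.0, 2.0, 2.0, 100_000)
p_is = outage_is(2, 2.0, 2.0, 2.0, 100_000, c=0.5)
```

For genuinely rare events (much smaller thresholds) the naive estimate collapses to zero while the importance-sampling estimate remains usable, which is the point of the approach.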
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Alves-Brito, Alan; Hau, George K. T.; Forbes, Duncan A.; Spitler, Lee R.; Strader, Jay; Brodie, Jean P.; Rhode, Katherine L.
2011-11-01
We present a large sample of over 200 integrated-light spectra of confirmed globular clusters (GCs) associated with the Sombrero (M104) galaxy taken with the Deep Imaging Multi-Object Spectrograph (DEIMOS) instrument on the Keck telescope. A significant fraction of the spectra have signal-to-noise ratio levels high enough to allow measurements of GC metallicities using the method of Brodie & Huchra. We find a distribution of spectroscopic metallicities in the range -2.2 < [Fe/H] < +0.1 that is bimodal, with peaks at [Fe/H] ≈ -1.4 and -0.6. Thus, the GC system of the Sombrero galaxy, like a few other galaxies now studied in detail, reveals a bimodal spectroscopic metallicity distribution supporting the long-held belief that colour bimodality reflects two metallicity subpopulations. This further suggests that the transformation from optical colour to metallicity for old stellar populations, such as GCs, is not strongly non-linear. We also explore the radial and magnitude distributions with metallicity for GC subpopulations, but small-number statistics prevent any clear trends in these distributions. Based on observations obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California and the National Aeronautics and Space Administration.
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
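For contrast with the generalized construction, a classical one-sample P-P plot simply pairs the fitted model CDF with the empirical CDF at the order statistics; points falling near the diagonal indicate a good fit. A minimal sketch for a normal model (the sample and parameters below are illustrative, not from the paper):

```python
import math
import random

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pp_points(data, cdf):
    """Classical P-P plot points: (F(x_(i)), i/n) over sorted data."""
    xs = sorted(data)
    n = len(xs)
    return [(cdf(x), (i + 1) / n) for i, x in enumerate(xs)]

rng = random.Random(0)
sample = [rng.gauss(0.0, 1.0) for _ in range(500)]
points = pp_points(sample, norm_cdf)
# Maximum vertical distance from the diagonal is a KS-type statistic
dev = max(abs(u - v) for u, v in points)
```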
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Phenotypic Diversity Using Bimodal and Unimodal Expression of Stress Response Proteins
Garcia-Bernardo, Javier; Dunlop, Mary J.
2016-01-01
Populations of cells need to express proteins to survive the sudden appearance of stressors. However, these mechanisms may be taxing. Populations can introduce diversity, allowing individual cells to stochastically switch between fast-growing and stress-tolerant states. One way to achieve this is to use genetic networks coupled with noise to generate bimodal distributions with two distinct subpopulations, each adapted to a stress condition. Another survival strategy is to rely on random fluct...
Velocity selection for ultra-cold atoms using bimodal mazer cavity
International Nuclear Information System (INIS)
Irshad, A.; Qamar, S.
2009-04-01
In this paper, we discuss the velocity selection of ultra-cold three-level atoms in Λ configuration using a micromazer. Our model is the same as discussed by Arun et al., for mazer action in a bimodal cavity. We have shown that significantly narrowed velocity distribution of ultra-cold atoms can be obtained in this system due to the presence of dark states. (author)
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
Stellar Rotation with Kepler and Gaia: Evidence for a Bimodal Star Formation History
Davenport, James
2018-01-01
Kepler stars with rotation periods measured via starspot modulations in their light curves have been matched against the astrometric data from Gaia Data Release 1. A total of 1,299 bright rotating stars were recovered, most with temperatures hotter than 5000 K. From these, 894 were selected as being near the main sequence. These main sequence stars show a bimodality in their rotation period distribution, centered around a ~600 Myr rotation-isochrone. This feature matches the bimodal period distribution found in cooler stars with Kepler, but was previously undetected for solar-type stars due to sample contamination by subgiant and binary stars. A tenuous connection between the rotation period and total proper motion is found, suggesting the period bimodality is due to the age distribution of stars within 300pc of the Sun, rather than a phase of rapid angular momentum loss. I will discuss how the combination of Kepler/K2/TESS with Gaia will enable us to map the star formation history of our galactic neighborhood.
Directory of Open Access Journals (Sweden)
John T. Rees
2000-12-01
A pandeid hydrozoan new to California, Amphinema sp., was collected in 1998 as a hydroid living on the non-indigenous bryozoan Watersipora subtorquata, attached to floats in Bodega Harbor, 80 km north of San Francisco Bay. The hydroid was cultured in the laboratory and the medusae it released were raised to maturity. No species name could be assigned because, although the hydroid colony structure and polyp morphology most closely resemble descriptions of Amphinema rugosum, the immature and adult medusae best resemble A. dinema. These two described species are known from widely spaced locations worldwide, including Europe (British Isles and the Mediterranean), New England, the Caribbean, east Africa, India, Japan and China, implying that they may be transported easily between sites by human activities. Such widespread distributions of both species, coupled with the notable absence of Amphinema sp. from Bodega Harbor during a number of previous field surveys in the 1970s, strongly suggest that Amphinema sp. has been introduced from elsewhere into Bodega Harbor during the past 25 years. Two additional species of Amphinema medusae present on the west coast of North America are discussed.
Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.
2016-01-01
Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions in the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, reaching more than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, where deviations increase up to 50% of estimated risk levels.
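The risk metric described, expected annual exposed population, can be sketched as the population-weighted sum of per-region scarcity probabilities. The sketch below assumes annual water availability follows a Gamma distribution per region and estimates the scarcity probability by Monte Carlo; the region list, Gamma parameters, and demand thresholds are hypothetical, not the study's data.

```python
import math
import random

def scarcity_probability(shape, scale, demand, n=100_000, seed=0):
    """Monte Carlo estimate of P(availability < demand) when annual
    water availability follows a Gamma(shape, scale) distribution."""
    rng = random.Random(seed)
    return sum(rng.gammavariate(shape, scale) < demand for _ in range(n)) / n

def expected_exposed_population(regions):
    """Expected annual exposed population: sum over regions of
    population times the probability of a scarcity year."""
    return sum(pop * scarcity_probability(k, theta, demand)
               for pop, k, theta, demand in regions)

# Hypothetical regions: (population, Gamma shape, Gamma scale, demand)
regions = [
    (1_000_000, 2.0, 50.0, 40.0),   # availability mean 100, demand 40
    (  500_000, 3.0, 30.0, 60.0),   # availability mean 90, demand 60
]
exposed = expected_exposed_population(regions)
```

For integer shape parameters the Gamma (Erlang) CDF has a closed form, which makes a convenient cross-check on the Monte Carlo estimate.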
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Ben Issaid, Chaouki
2017-01-26
The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channels is of major practical importance. In many circumstances, this is related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose robust importance sampling schemes that efficiently evaluate the outage probability of diversity receivers over Gamma-Gamma fading channels. The proposed estimators satisfy the well-known bounded relative error criterion for both maximum ratio combining and equal gain combining cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
Age bimodality in the central region of pseudo-bulges in S0 galaxies
Mishra, Preetish K.; Barway, Sudhanshu; Wadadekar, Yogesh
2017-11-01
We present evidence for a bimodal stellar age distribution of pseudo-bulges of S0 galaxies as probed by the Dn(4000) index. We do not observe any bimodality in the age distribution for pseudo-bulges in spiral galaxies. Our sample is flux limited and contains 2067 S0 and 2630 spiral galaxies drawn from the Sloan Digital Sky Survey. We identify pseudo-bulges in S0 and spiral galaxies based on the position of the bulge on the Kormendy diagram and their central velocity dispersion. Dividing the pseudo-bulges of S0 galaxies into those containing old and young stellar populations, we study the connection between global star formation and pseudo-bulge age on the u - r colour-mass diagram. We find that most old pseudo-bulges are hosted by passive galaxies, while the majority of young pseudo-bulges are hosted by star-forming galaxies. Dividing our sample of S0 galaxies into early-type S0s and S0/a galaxies, we find that old pseudo-bulges are mainly hosted by early-type S0 galaxies, while most of the pseudo-bulges in S0/a galaxies are young. We speculate that morphology plays a strong role in quenching star formation in the discs of these S0 galaxies, which stops the growth of pseudo-bulges, giving rise to old pseudo-bulges and the observed age bimodality.
Functionalized bimodal mesoporous silicas as carriers for controlled aspirin delivery
Gao, Lin; Sun, Jihong; Li, Yuzhen
2011-08-01
The bimodal mesoporous silica modified with 3-aminopropyltriethoxysilane was used as the aspirin carrier. The samples' structure and drug loading and release profiles were characterized with X-ray diffraction, scanning electron microscopy, N2 adsorption and desorption, Fourier transform infrared spectroscopy, TG analysis, elemental analysis and UV spectrophotometry. To further explore the effects of the bimodal mesopores on drug delivery behavior, the unimodal mesoporous material MCM-41 was also modified as an aspirin carrier. Meanwhile, the Korsmeyer-Peppas equation f_t = k·t^n was employed to analyze the dissolution data in detail. It is indicated that the bimodal mesopores allow unrestricted diffusion of drug molecules and therefore lead to a higher loading and faster release than MCM-41. The results show that the aspirin delivery properties are influenced considerably by the mesoporous matrix, with the large pore of the bimodal mesoporous silica being the key to the improved controlled-release properties.
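The Korsmeyer-Peppas relation f_t = k·t^n is linear in log-log coordinates, ln f_t = ln k + n·ln t, so k and the release exponent n can be recovered from dissolution data by ordinary least squares. A minimal sketch with synthetic data (not the paper's measurements):

```python
import math

def fit_korsmeyer_peppas(times, fractions):
    """Fit f_t = k * t**n by least squares on (ln t, ln f_t).
    Returns (k, n); conventionally applied to the early portion
    of the release curve (f_t below roughly 0.6)."""
    lx = [math.log(t) for t in times]
    ly = [math.log(f) for f in fractions]
    m = len(lx)
    mx, my = sum(lx) / m, sum(ly) / m
    n = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    k = math.exp(my - n * mx)
    return k, n

# Synthetic release curve with k = 0.2, n = 0.5 (Fickian diffusion)
times = [0.5, 1, 2, 4, 8]
fractions = [0.2 * t ** 0.5 for t in times]
k_hat, n_hat = fit_korsmeyer_peppas(times, fractions)
```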
Bimodal distribution of fasting gastric acidity in a rural African ...
African Journals Online (AJOL)
Setting. The people of Transkei eat a diet high in linoleic acid, the principal fatty acid in maize. The theory has been put forward that a diet high in linoleic acid and low in fat and riboflavin, such as the traditional diet in Transkei, results in overproduction of prostaglandin E2 in the gastric mucosa, and that this overproduction ...
ORIGINAL ARTICLES Bimodal distribution of fasting gastric acidity ...
African Journals Online (AJOL)
2003-10-18
... in riboflavin. It has been suggested that these dietary elements promote high prostaglandin E2 (PGE2) production in the stomach; PGE2 suppresses gastric ... Table I reports the frequency of consumption of individual food items and the relation to pH, including odds ratios for pH > 4 by item consumed daily (%).
Bimodal Programming: A Survey of Current Clinical Practice.
Siburt, Hannah W; Holmes, Alice E
2015-06-01
The purpose of this study was to determine the current clinical practice in approaches to bimodal programming in the United States. To be specific, if clinicians are recommending bimodal stimulation, who programs the hearing aid in the bimodal condition, and what method is used for programming the hearing aid? An 11-question online survey was created and sent via email to a comprehensive list of cochlear implant programming centers in the United States. The survey was sent to 360 recipients. Respondents in this study represented a diverse group of clinical settings (response rate: 26%). Results indicate little agreement about who programs the hearing aids, when they are programmed, and how they are programmed in the bimodal condition. Analysis of small versus large implant centers indicated small centers are less likely to add a device to the contralateral ear. Although a growing number of cochlear implant recipients choose to wear a hearing aid on the contralateral ear, there is inconsistency in the current clinical approach to bimodal programming. These survey results provide evidence of large variability in the current bimodal programming practices and indicate a need for more structured clinical recommendations and programming approaches.
Sathyachandran, S.; Roy, D. P.; Boschetti, L.
2010-12-01
Spatially and temporally explicit mapping of the amount of biomass burned by fire is needed to estimate atmospheric emissions of greenhouse gases and aerosols. The instantaneous Fire Radiative Power (FRP) [units: W] is retrieved at active fire detections from mid-infrared wavelength remotely sensed data and can be used to estimate the rate of biomass consumed. Temporal integration of FRP measurements over the duration of the fire provides the Fire Radiative Energy (FRE) [units: J] that has been shown to be linearly related to the total biomass burned [units: g]. However, FRE, and thus biomass burned retrieval, is sensitive to the satellite spatial and temporal sampling of FRP, which can be sparse under cloudy conditions and with polar-orbiting sensors such as MODIS. In this paper the FRE is derived in a new way as the product of the fire duration and the first moment of the FRP power law probability distribution. MODIS FRP data retrieved over savanna fires in Australia and deforestation fires in Brazil are shown to have power law distributions with different scaling parameters that are related to the fire energy in these two contrasting systems. The FRE-derived burned biomass estimates computed using this new method are compared to estimates using the conventional temporal FRP integration method and with literature values. The results of the comparison suggest that the new method may provide more reliable burned biomass estimates under sparse satellite sampling conditions if the fire duration and the power law distribution parameters are characterized a priori.
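The new FRE formulation can be sketched in three steps: fit a power-law exponent to FRP samples by maximum likelihood (the Hill estimator for a Pareto-type tail), take the first moment of the fitted distribution, and multiply by the fire duration. The FRP sample, x_min, and one-hour duration below are synthetic assumptions, not MODIS retrievals.

```python
import math
import random

def hill_alpha(frp, x_min):
    """MLE of the exponent alpha for p(x) ∝ x**-alpha, x >= x_min."""
    tail = [x for x in frp if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

def fre_from_power_law(duration_s, alpha, x_min):
    """FRE [J] = duration [s] times the first moment of the fitted
    power law, x_min * (alpha - 1) / (alpha - 2); requires alpha > 2."""
    return duration_s * x_min * (alpha - 1.0) / (alpha - 2.0)

# Synthetic FRP sample [W] from a Pareto law with true alpha = 3:
# inverse-CDF sampling, x = x_min * (1 - u)**(-1 / (alpha - 1))
rng = random.Random(7)
x_min = 1e6
frp = [x_min * (1.0 - rng.random()) ** -0.5 for _ in range(50_000)]
alpha = hill_alpha(frp, x_min)
fre = fre_from_power_law(3600.0, alpha, x_min)  # a one-hour fire
```

Because only the exponent and duration are needed, the estimate does not require densely sampled FRP time series, which is the claimed advantage under sparse satellite sampling.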
Pérez-Sánchez, Julio; Senent-Aparicio, Javier
2017-08-01
Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment, and their consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study characterizes rainfall drought in the Segura River basin in eastern Spain, a region marked by the strong seasonality of these latitudes. A daily precipitation data set from 29 weather stations over a 20-year period (1993-2013) was used. Four sets of dry spell lengths (complete series, monthly maxima, seasonal maxima, and annual maxima) were built and fitted for all the weather stations with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the annual maximum series offered a good fit at all the weather stations, with the Wakeby distribution performing best (a mean p value of 0.9424 in the Kolmogorov-Smirnov test at the 0.2 significance level). Maps of dry spell duration probability for return periods of 2, 5, 10, and 25 years reveal a northeast-southeast gradient, with lengthening periods of annual rainfall below 0.1 mm in the eastern third of the basin, near the Mediterranean slope.
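The fit-and-test workflow for annual-maximum dry spells can be sketched with a single candidate distribution. Wakeby has five parameters and no simple closed form, so this sketch substitutes a Gumbel Max fit by the method of moments together with a hand-rolled Kolmogorov-Smirnov statistic; the dry spell data are synthetic, not the Segura basin records.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(xs):
    """Method-of-moments fit of the Gumbel Max distribution."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = m - EULER_GAMMA * beta
    return mu, beta

def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

def ks_statistic(xs, cdf):
    """Two-sided KS distance between empirical and fitted CDFs."""
    xs = sorted(xs)
    n = len(xs)
    return max(max(abs(cdf(x) - i / n), abs(cdf(x) - (i + 1) / n))
               for i, x in enumerate(xs))

# Synthetic annual-maximum dry spell lengths [days], Gumbel(25, 6),
# sampled by inversion: x = mu - beta * ln(-ln u)
rng = random.Random(3)
spells = [25.0 - 6.0 * math.log(-math.log(rng.random())) for _ in range(1000)]
mu, beta = fit_gumbel(spells)
d = ks_statistic(spells, lambda x: gumbel_cdf(x, mu, beta))
```

A small KS distance d (relative to the critical value at the chosen significance level) indicates the fitted distribution is acceptable, mirroring the selection procedure used in the study.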
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
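The Monte Carlo benchmark side of such a comparison can be illustrated with a toy steady-uniform-flow example: Manning's equation with an uncertain roughness coefficient, propagated by brute-force sampling to get ensemble statistics. All numbers are illustrative assumptions, and the paper's Fokker-Planck solver is not reproduced here; the point is only the shape of the MC workflow the FPE methodology replaces with a single simulation.

```python
import math
import random

def manning_velocity(n, radius=2.0, slope=0.001):
    """Manning's equation V = (1/n) * R**(2/3) * S**(1/2) (SI units)."""
    return (1.0 / n) * radius ** (2.0 / 3.0) * math.sqrt(slope)

def mc_ensemble(n_samples=100_000, seed=11):
    """Ensemble mean and variance of velocity when the roughness
    coefficient is lognormal around a typical channel value of 0.03."""
    rng = random.Random(seed)
    mu, sigma = math.log(0.03), 0.1
    vals = [manning_velocity(math.exp(rng.gauss(mu, sigma)))
            for _ in range(n_samples)]
    mean = sum(vals) / n_samples
    var = sum((v - mean) ** 2 for v in vals) / (n_samples - 1)
    return mean, var

mean_v, var_v = mc_ensemble()
```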
Energy Technology Data Exchange (ETDEWEB)
Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br
2009-07-01
This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated on 62 lines at 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the 380 V bus voltage of an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations with monitoring of the 380 V bus were performed again. A stochastic Monte Carlo simulation (MCS) study was performed to obtain, at each DG level, the sag probability curves and the probability density by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.
Directory of Open Access Journals (Sweden)
BLAGA IRINA
2014-03-01
Full Text Available Maximum rainfall amounts are usually characterized by high intensity, and their effects on the substrate are revealed, at slope level, by the deepening of existing forms of torrential erosion, by the formation of new ones, and by landslide processes. For the 1971-2000 period, the highest rainfall amounts fallen in 24, 48 and 72 hours were extracted and analyzed for the weather stations in the hilly area of Cluj County (Cluj-Napoca, Dej, Huedin and Turda), on the basis of which the variation and the spatial and temporal distribution of precipitation were analyzed. The annual probability of exceedance of maximum rainfall amounts fallen in short time intervals (24, 48 and 72 hours), based on thresholds and class values, was determined using climatological practices and the Hyfran program facilities.
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Localization ability with bimodal hearing aids and bilateral cochlear implants
Seeber, Bernhard U.; Baumann, Uwe; Fastl, Hugo
2004-09-01
After successful cochlear implantation in one ear, some patients continue to use a hearing aid at the contralateral ear. They report an improved reception of speech, especially in noise, as well as a better perception of music when the hearing aid and cochlear implant are used in this bimodal combination. Some individuals in this bimodal patient group also report the impression of an improved localization ability. Similar experiences are reported by the group of bilateral cochlear implantees. In this study, a survey of 11 bimodally and 4 bilaterally equipped cochlear implant users was carried out to assess localization ability. Individuals in the bimodal implant group were all provided with the same type of hearing aid in the opposite ear, and subjects in the bilateral implant group used cochlear implants of the same manufacturer on each ear. Subjects adjusted the spot of a computer-controlled laser-pointer to the perceived direction of sound incidence in the frontal horizontal plane by rotating a trackball. Two subjects of the bimodal group who had substantial residual hearing showed localization ability in the bimodal configuration, whereas using each single device only the subject with better residual hearing was able to discriminate the side of sound origin. Five other subjects with more pronounced hearing loss displayed an ability for side discrimination through the use of bimodal aids, while four of them were already able to discriminate the side with a single device. Of the bilateral cochlear implant group one subject showed localization accuracy close to that of normal hearing subjects. This subject was also able to discriminate the side of sound origin using the first implanted device alone. The other three bilaterally equipped subjects showed limited localization ability using both devices. Among them one subject demonstrated a side-discrimination ability using only the first implanted device.
van Milligen, B. Ph.; Sánchez, R.; Carreras, B. A.; Lynch, V. E.; LaBombard, B.; Pedrosa, M. A.; Hidalgo, C.; Gonçalves, B.; Balbín, R.
2005-05-01
Plasma density fluctuations and electrostatic turbulent fluxes measured at the scrape-off layer of the Alcator C-Mod tokamak [B. LaBombard, R. L. Boivin, M. Greenwald, J. Hughes, B. Lipschultz, D. Mossessian, C. S. Pitcher, J. L. Terry, and S. J. Zweben, Phys. Plasmas 8, 2107 (2001)], the Wendelstein 7-Advanced Stellarator [H. Renner, E. Anabitarte, E. Ascasibar et al., Plasma Phys. Controlled Fusion 31, 1579 (1989)], and the TJ-II stellarator [C. Alejaldre, J. Alonso, J. Botija et al., Fusion Technol. 17, 131 (1990)] are shown to obey a non-Gaussian but apparently universal (i.e., not dependent on device and discharge parameters) probability density distribution (pdf). The fact that a specific shape acts as an attractor for the pdf seems to suggest that emergent behavior and self-regulation are relevant concepts for these fluctuations. This shape is closely similar to the so-called Bramwell, Holdsworth, and Pinton distribution, which does not have any free parameters.
Functionalized bimodal mesoporous silicas as carriers for controlled aspirin delivery
International Nuclear Information System (INIS)
Gao Lin; Sun Jihong; Li Yuzhen
2011-01-01
The bimodal mesoporous silica modified with 3-aminopropyltriethoxysilane was used as the aspirin carrier. The samples' structure and drug loading and release profiles were characterized by X-ray diffraction, scanning electron microscopy, N2 adsorption-desorption, Fourier transform infrared spectroscopy, TG analysis, elemental analysis and UV spectrophotometry. To further explore the effects of the bimodal mesopores on drug delivery behavior, the unimodal mesoporous material MCM-41 was also modified as an aspirin carrier. Meanwhile, the Korsmeyer-Peppas equation f_t = k t^n was employed to analyze the dissolution data in detail. It is indicated that the bimodal mesopores allow unrestricted diffusion of drug molecules and therefore lead to higher loading and faster release than MCM-41. The results show that the aspirin delivery properties are influenced considerably by the mesoporous matrix, and that the large pore of the bimodal mesoporous silica is the key to the improved controlled-release properties. - Graphical abstract: Loading (A) and release profiles (B) of aspirin in N-BMMs and N-MCM-41 indicate that BMMs have a higher drug loading capacity and a faster release rate than MCM-41. Highlights: → Bimodal mesoporous silicas (BMMs) and MCM-41 were modified with amino groups via a post-treatment procedure. → Loading and release profiles of aspirin in modified BMMs and MCM-41. → Modified BMMs have a higher drug loading capacity and a faster release rate than modified MCM-41.
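The Korsmeyer-Peppas analysis mentioned above reduces to a linear fit in log-log coordinates; a minimal sketch with synthetic release data (the values of k and n below are illustrative, not those of the study):

```python
import numpy as np

# Korsmeyer-Peppas model: f_t = k * t^n, so log(f_t) = log(k) + n*log(t).
# The data are synthetic, generated with k = 0.2 and n = 0.5 (Fickian diffusion).
t = np.array([0.5, 1, 2, 4, 8, 12, 24], dtype=float)   # hours
f = 0.2 * t ** 0.5                                     # fractional release

slope, intercept = np.polyfit(np.log(t), np.log(f), 1)
n_est, k_est = slope, np.exp(intercept)
print(n_est, k_est)   # recovers n ~ 0.5 and k ~ 0.2
```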
International Nuclear Information System (INIS)
Buffa, Francesca M.
2000-01-01
The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σ_d; whilst the quantities d and σ_d depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of 10^8 histories from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error on the
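The systematic underestimation described above can be reproduced in a few lines: for the Poisson model tcp = exp(-Σ_i N_i exp(-α d_i)), adding zero-mean noise to a uniform dose lowers tcp because exp(-α d) is convex (Jensen's inequality). All parameter values below are illustrative, not those of the study:

```python
import numpy as np

# Poisson tcp over a voxelized dose distribution, with and without
# MC-like statistical noise on the dose. Parameters are illustrative.
rng = np.random.default_rng(0)

n_vox = 10000
alpha = 0.3                      # radiosensitivity, 1/Gy
clonogens = 1e7 / n_vox          # clonogens per voxel
d_true = np.full(n_vox, 60.0)    # noise-free uniform dose, Gy
d_noisy = d_true + rng.normal(0.0, 2.0, n_vox)   # noisy dose estimate

def poisson_tcp(d):
    # tcp = exp(-sum_i N_i * exp(-alpha * d_i))
    return np.exp(-np.sum(clonogens * np.exp(-alpha * d)))

tcp_clean = poisson_tcp(d_true)
tcp_noisy = poisson_tcp(d_noisy)
print(tcp_clean, tcp_noisy)      # noise systematically lowers tcp
```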
Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L
2013-02-01
The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status: the degree to which water quality belongs to the various output fuzzy sets or classes is provided with percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies. Copyright © 2012 Elsevier Ltd. All rights reserved.
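A minimal sketch of the kernel-smoothing step, using synthetic observations in place of the Cauca River monitoring data (the Gaussian kernel and Silverman bandwidth are common choices, not necessarily the exact ones used in the study):

```python
import numpy as np

# Fit a probability density to one year of a monitored variable (synthetic
# values standing in for, e.g., dissolved oxygen in mg/L), as the basis for
# drawing Monte Carlo inputs.
rng = np.random.default_rng(1)
obs = rng.normal(6.5, 1.2, 200)   # hypothetical yearly observations

# Silverman's rule-of-thumb bandwidth for a Gaussian kernel
h = 1.06 * obs.std() * len(obs) ** (-1 / 5)

def kde_pdf(x):
    """Gaussian kernel density estimate evaluated at points x."""
    u = (x[:, None] - obs[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(obs) * h * np.sqrt(2 * np.pi))

grid = np.linspace(obs.min() - 4, obs.max() + 4, 2000)
area = kde_pdf(grid).sum() * (grid[1] - grid[0])
print(round(area, 4))   # the fitted density integrates to ~1
```

Random inputs for the fuzzy index would then be sampled from this fitted density rather than from an assumed parametric family.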
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Master's students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints), many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
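As a concrete example of a probability distribution in use, the probability that a normally distributed variable falls within one standard deviation of its mean follows directly from the normal CDF:

```python
import math

# Normal CDF from the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Probability of falling within one standard deviation of the mean
p_within_1sd = norm_cdf(1.0) - norm_cdf(-1.0)
print(round(p_within_1sd, 4))   # ~0.6827, the familiar "68%" rule
```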
Aggressive Bimodal Communication in Domestic Dogs, Canis familiaris.
Déaux, Éloïse C; Clarke, Jennifer A; Charrier, Isabelle
2015-01-01
Evidence of animal multimodal signalling is widespread and compelling. Dogs' aggressive vocalisations (growls and barks) have been extensively studied, but without any consideration of the simultaneously produced visual displays. In this study we aimed to categorize dogs' bimodal aggressive signals according to the redundant/non-redundant classification framework. We presented dogs with unimodal (audio or visual) or bimodal (audio-visual) stimuli and measured their gazing and motor behaviours. Responses did not qualitatively differ between the bimodal and two unimodal contexts, indicating that acoustic and visual signals provide redundant information. We could not further classify the signal as 'equivalent' or 'enhancing' as we found evidence for both subcategories. We discuss our findings in relation to the complex signal framework, and propose several hypotheses for this signal's function.
Visualisation and characterisation of heterogeneous bimodal PDMS networks
DEFF Research Database (Denmark)
Bahrt, Frederikke; Daugaard, Anders Egede; Fleury, Clemence
2014-01-01
The existence of short-chain domains in heterogeneous bimodal PDMS networks has been confirmed visually, for the first time, through confocal fluorescence microscopy. The networks were prepared using a controlled reaction scheme where short PDMS chains were reacted below the gelation point...... bimodal networks with short-chain domains within a long-chain network. The average sizes of the short-chain domains were found to vary from 2.1 to 5.7 mm depending on the short-chain content. The visualised network structure could thereafter be correlated to the elastic properties, which were determined...... by rheology. All heterogeneous bimodal networks displayed significantly lower moduli than mono-modal PDMS elastomers prepared from the long polymer chains. Low loss moduli as well as low sol fractions indicate that low elastic moduli can be obtained without compromising the network's structure...
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable
Directory of Open Access Journals (Sweden)
Won Seok Jang
2014-04-01
Full Text Available The Korea Railroad Research Institute (KRRI) has developed a bimodal tram and advanced bus rapid transit (BRT) system, an optimized public transit system that combines the railway's punctual operation with the bus' easy and convenient access. The bimodal tram system provides mass-transportation service with an eco-friendly and human-centered approach. Natural disasters have been increasing worldwide in recent years, including flood, snow, and typhoon disasters. Flooding is the most frequent natural disaster in many countries and is increasingly a concern with climate change; it seriously affects people's lives and productivity, causing considerable economic loss and significant damage. Enhanced conventional disaster management systems are needed to support comprehensive actions to secure safety and convenience. The objective of this study is to develop a prototype version of a Web GIS-based bimodal tram disaster management system (BTDMS) using the Storm Water Management Model (SWMM 5.0) to enhance the on-time operation and safety of the bimodal tram system. The BTDMS was tested at the bimodal tram test railroad by simulating probable maximum flood (PMF) and snow melting to forecast flooding and snow-covered roads. This result could provide the basis for plans to protect against flooding disasters and snow-covered roads in operating the bimodal tram system. The BTDMS will be used to assess and predict weather impacts on roadway conditions and operations and thus has the potential to influence economic growth. The methodology presented in this paper makes it possible to manage the impacts of flooding and snowfall on urban transportation and to enhance the operation of the bimodal tram system. Such a methodology based on modeling could be created for most metropolitan areas in Korea and in many other countries.
Performance Comparisons of Nanoaluminum, Coated Microaluminum and Their Bimodal Mixtures
Woody, D. L.; Dokhan, A.; Johnson, C. E.
2004-07-01
Comparison studies of materials containing standard nano-aluminum (ultrafine) and micro-aluminum coated with BaSO4 were performed. Differential thermal analysis and thermogravimetric analysis were used to observe the effect of adding an unconventional coating to micron-sized aluminum particles. These results were compared to those of ultrafine aluminum particles. Bimodal combinations of ultrafine aluminum and micron-sized aluminum (coated and uncoated) were also observed. These preliminary results showed an interaction between the ultrafine aluminum (UFAL) and micron-sized aluminum in bimodal mixtures.
Straus, D. M.
2016-12-01
The goals of this research are to: (a) identify features of the probability distribution function (pdf) of pentad precipitation over the continental US (CONUS) that are controlled by the configuration of the large-scale fields, including both tails of the pdf, hence droughts and floods, and the overall shape of the pdf, e.g. skewness and kurtosis; (b) estimate the changes in the properties of the pdf controlled by the large-scale in a future climate. We first describe the significant dependence of the observed precipitation pdf conditioned on circulation regimes over CONUS. The regime states, and the number of regimes, are obtained by a method that assures a high degree of significance, and a high degree of pattern correlation between the states in a regime and its average. The regime-conditioned pdfs yield information on times scales from intra-seasonal to inter-annual. We then apply this method to atmospheric simulations run with the EC-Earth version 3 model for historical sea-surface temperatures (SST) and future (RCP8.5 CMIP5 scenario) estimates of SST, at resolutions T255 and T799, to understand what dynamically controlled changes in the precipitation pdf can be expected in a future climate.
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
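The hyper-ensemble combination can be sketched as a least-squares regression of observations on the member forecasts, which performs the local bias removal and reweighting described above; the members and observations below are synthetic:

```python
import numpy as np

# Three synthetic "model" forecasts of one drift component, each biased
# and/or noisy, combined by least squares against the observed signal.
rng = np.random.default_rng(3)

n = 300
truth = np.sin(np.linspace(0, 6, n))                 # observed drift component
members = np.stack([
    truth + 0.4 + rng.normal(0, 0.2, n),             # biased member
    0.7 * truth + rng.normal(0, 0.3, n),             # damped member
    truth + rng.normal(0, 0.5, n),                   # noisy member
], axis=1)

# Least-squares weights with an intercept: bias removal + optimal reweighting
A = np.column_stack([members, np.ones(n)])
w, *_ = np.linalg.lstsq(A, truth, rcond=None)
combined = A @ w

rmse = lambda f: np.sqrt(np.mean((f - truth) ** 2))
print([round(rmse(members[:, i]), 3) for i in range(3)], round(rmse(combined), 3))
```

On the fitting data the least-squares combination cannot do worse than any single member, since each member is itself in the span of the regression.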
International Nuclear Information System (INIS)
Keskin, Mustafa; Erdinc, Ahmet
2004-01-01
As a continuation of the previously published work, the pair approximation of the cluster variation method is applied to study the temperature dependences of the order parameters of the Blume-Emery-Griffiths model with repulsive biquadratic coupling on a body-centered cubic lattice. We obtain metastable and unstable branches of the order parameters besides the stable branches, and the phase transitions of these branches are investigated extensively. We study the dynamics of the model by the path probability method with pair distribution in order to make sure that we find and define the metastable and unstable branches of the order parameters completely and correctly. We present the metastable phase diagram in addition to the equilibrium phase diagram, and the first-order phase transition line for the unstable branches of the quadrupole order parameter is superimposed on the phase diagrams. It is found that the metastable phase diagram and the first-order phase boundary for the unstable quadrupole order parameter always exist at low temperatures, which is consistent with experimental and theoretical works.
Crema, Enrico R; Habu, Junko; Kobayashi, Kenichi; Madella, Marco
2016-01-01
Recent advances in the use of summed probability distributions (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlations with climatic changes.
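A stripped-down sketch of the SPD permutation-test logic, with Gaussian densities in place of real calibrated (multimodal) date distributions and entirely synthetic date sets:

```python
import numpy as np

# Summed probability distribution (SPD): sum the per-date densities on a
# calendar grid, then compare two regions via a permutation test on the
# maximum SPD difference.  All dates are synthetic.
rng = np.random.default_rng(7)
grid = np.arange(3000, 7001, 10.0)        # cal BP

def spd(dates, errors):
    pdfs = np.exp(-0.5 * ((grid[:, None] - dates[None, :]) / errors[None, :]) ** 2)
    pdfs /= pdfs.sum(axis=0, keepdims=True)     # normalize each date's density
    return pdfs.sum(axis=1) / len(dates)        # average into an SPD

region_a = rng.normal(5500, 400, 120)     # hypothetical date sets
region_b = rng.normal(5000, 450, 90)
err_a, err_b = np.full(120, 50.0), np.full(90, 50.0)

observed = np.abs(spd(region_a, err_a) - spd(region_b, err_b)).max()

# Null hypothesis: regions share one date distribution -> reshuffle labels
pooled = np.concatenate([region_a, region_b])
count = 0
for _ in range(200):
    rng.shuffle(pooled)
    stat = np.abs(spd(pooled[:120], err_a) - spd(pooled[120:], err_b)).max()
    count += stat >= observed
p_value = (count + 1) / 201
print(round(observed, 4), p_value)
```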
Durrieu, G; Ciffroy, P; Garnier, J-M
2006-11-01
The objective of the study was to provide global probability density functions (PDFs) representing the uncertainty of distribution coefficients (Kds) in freshwater for radioisotopes of Co, Cs, Sr and I. A comprehensive database containing Kd values referenced in 61 articles was first built, and quality scores were assigned to each data point according to various criteria (e.g. presentation of data, contact times, pH, solid-to-liquid ratio, expert judgement). A weighted bootstrapping procedure was then set up in order to build PDFs, in such a way that more importance is given to the most relevant data points (i.e. those corresponding to typical natural environments). However, it was also assessed that the relevance and the robustness of the PDFs determined by our procedure depended on the number of Kd values in the database. Owing to the large database, conditional PDFs were also proposed for site studies where some parametric information is known (e.g. pH, contact time between radionuclides and particles, solid-to-liquid ratio). Such conditional PDFs reduce the uncertainty on the Kd values. These global and conditional PDFs are useful for end-users of dose models because the uncertainty and sensitivity of Kd values are taken into account.
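The weighted bootstrapping step can be sketched as quality-score-weighted resampling; the Kd values and scores below are hypothetical, not from the 61-article database:

```python
import numpy as np

# Weighted bootstrap: resample literature Kd values with probabilities
# proportional to their quality scores, then summarize the distribution.
rng = np.random.default_rng(5)

kd = np.array([120.0, 300.0, 85.0, 560.0, 210.0, 975.0, 150.0, 430.0])  # L/kg
score = np.array([3, 5, 2, 4, 5, 1, 3, 4], dtype=float)  # expert quality scores
weights = score / score.sum()

# Each draw favors high-quality data points
boot = rng.choice(kd, size=10000, replace=True, p=weights)

lo, med, hi = np.percentile(boot, [5, 50, 95])
print(lo, med, hi)   # percentiles of the weighted empirical Kd distribution
```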
A joint probability density function of wind speed and direction for wind energy analysis
International Nuclear Information System (INIS)
Carta, Jose A.; Ramirez, Penelope; Bueno, Celia
2008-01-01
A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a singly truncated from below Normal-Weibull mixture distribution. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R^2. The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions; (b) takes into account the frequency of null winds; (c) represents the wind direction regimes in zones with several modes or prevailing wind directions; (d) takes into account the correlation between wind speeds and their directions. It can therefore be used in several tasks involved in the evaluation process of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy.
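The angular marginal, a finite von Mises mixture, is easy to sketch; the weights and parameters below are illustrative, and the angular-linear coupling with the speed marginal used to build the full joint density is omitted here:

```python
import numpy as np

def vonmises_pdf(theta, mu, kappa):
    # von Mises density: exp(kappa*cos(theta - mu)) / (2*pi*I0(kappa))
    return np.exp(kappa * np.cos(theta - mu)) / (2 * np.pi * np.i0(kappa))

def direction_pdf(theta):
    # two-component mixture; weights/locations/concentrations are illustrative,
    # e.g. two prevailing wind directions of unequal frequency
    params = [(0.7, np.pi / 4, 4.0), (0.3, np.pi, 2.0)]
    return sum(w * vonmises_pdf(theta, m, k) for w, m, k in params)

theta = np.linspace(-np.pi, np.pi, 4000, endpoint=False)
total = direction_pdf(theta).sum() * (2 * np.pi / 4000)
print(round(total, 4))   # mixture integrates to ~1 over the circle
```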
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
Aspects of stochastic resonance in Josephson junction, bimodal ...
Indian Academy of Sciences (India)
the noise amplitude helps to define maximum SNR or peak SNR for an optimum amplitude of input noise. Although ... Here we consider a typical 2-parameter bimodal cubic map defined by X_{n+1} = b + aX_n - X_n^3 (2) ... due to shuttling with chaotic input of a logistic map (called chaotic resonance) have been reported earlier ...
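The cubic map in Eq. (2) is bistable for suitable parameters, which is the prerequisite for noise-driven shuttling between wells; a minimal check of the two stable fixed points (the parameter values are illustrative, not taken from the article):

```python
# Bimodal cubic map X_{n+1} = b + a*X_n - X_n^3.  With b = 0 and 1 < a < 2
# the map has two stable fixed points at +/-sqrt(a - 1); noise of suitable
# amplitude can then shuttle the state between the two basins.
a, b = 1.8, 0.0

def iterate(x, n=500):
    for _ in range(n):
        x = b + a * x - x ** 3
    return x

x_plus = iterate(0.5)     # converges to +sqrt(0.8) ~ +0.894
x_minus = iterate(-0.5)   # converges to -sqrt(0.8) ~ -0.894
print(x_plus, x_minus)
```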
Interaural bimodal pitch matching with two-formant vowels
DEFF Research Database (Denmark)
Guérit, François; Chalupper, Josef; Santurette, Sébastien
2013-01-01
For bimodal patients, with a hearing aid (HA) in one ear and a cochlear implant (CI) in the opposite ear, usually a default frequency-to-electrode map is used in the CI. This assumes that the human brain can adapt to interaural place-pitch mismatches. This “one-size-fits-all” method might be part...
Stochastic resonance and chaotic resonance in bimodal maps: A ...
Indian Academy of Sciences (India)
We present the results of an extensive numerical study on the phenomenon of stochastic resonance in a bimodal cubic map. Both Gaussian random noise as well as deterministic chaos are used as input to drive the system between the basins. Our main result is that when two identical systems capable of stochastic ...
Application of Bimodal Master Curve Approach on KSNP RPV steel SA508 Gr. 3
International Nuclear Information System (INIS)
Kim, Jongmin; Kim, Minchul; Choi, Kwonjae; Lee, Bongsang
2014-01-01
In this paper, the standard MC approach and BMC are applied to the forging material of the KSNP RPV steel SA508 Gr. 3. A series of fracture toughness tests were conducted in the DBTT transition region, and fracture toughness specimens were extracted from four regions, i.e., the surface, 1/8T, 1/4T and 1/2T. Deterministic material inhomogeneity was reviewed through a conventional MC approach and the random inhomogeneity was evaluated by BMC. In the present paper, four regions, surface, 1/8T, 1/4T and 1/2T, were considered for the fracture toughness specimens of KSNP (Korean Standard Nuclear Plant) SA508 Gr. 3 steel to provide deterministic material inhomogeneity and review the applicability of BMC. T0 determined by a conventional MC has a low value owing to the higher quenching rate at the surface, as expected. However, more than about 15% of the KJC values lay above the 95% probability curves indexed with the standard MC T0 at the surface and 1/8T, which implies the existence of inhomogeneity in the material. To review the applicability of the BMC method, the deterministic inhomogeneity owing to the extraction location and quenching rate is treated as random inhomogeneity. Although the lower and upper bound curves of the BMC covered more KJC values than those of the conventional MC, there is no significant relationship between the BMC analysis lines and the measured KJC values in the higher toughness distribution, and BMC and MC provide almost the same T0 values. Therefore, the standard MC evaluation method is appropriate for this material, even though the standard MC has a narrow upper/lower bound curve range from the RPV evaluation point of view. In reality, the material is not homogeneous: inhomogeneity arises from the specimen location, heat treatment, and the whole manufacturing process. The conventional master curve is therefore of limited applicability to widely scattered fracture toughness data, such as those from the weld region
International Nuclear Information System (INIS)
Levegruen, Sabine; Jackson, Andrew; Zelefsky, Michael J.; Venkatraman, Ennapadam S.; Skwarchuk, Mark W.; Schlegel, Wolfgang; Fuks, Zvi; Leibel, Steven A.; Ling, C. Clifton
2000-01-01
Purpose: To investigate tumor control following three-dimensional conformal radiation therapy (3D-CRT) of prostate cancer and to identify dose-distribution variables that correlate with local control assessed through posttreatment prostate biopsies. Methods and Material: Data from 132 patients, treated at Memorial Sloan-Kettering Cancer Center (MSKCC), who had a prostate biopsy 2.5 years or more after 3D-CRT for T1c-T3 prostate cancer with prescription doses of 64.8-81 Gy were analyzed. Variables derived from the dose distribution in the PTV included: minimum dose (Dmin), maximum dose (Dmax), mean dose (Dmean), and dose to n% of the PTV (Dn), where n = 1%, ..., 99%. The concept of the equivalent uniform dose (EUD) was evaluated for different values of the surviving fraction at 2 Gy (SF2). Four tumor control probability (TCP) models (one phenomenologic model using a logistic function and three Poisson cell kill models) were investigated using two sets of input parameters, one for low and one for high T-stage tumors. Application of both sets to all patients was also investigated. In addition, several tumor-related prognostic variables were examined (including T-stage and Gleason score). Univariate and multivariate logistic regression analyses were performed. The ability of the logistic regression models (univariate and multivariate) to predict the biopsy result correctly was tested by performing cross-validation analyses and evaluating the results in terms of receiver operating characteristic (ROC) curves. Results: In univariate analysis, prescription dose (Dprescr), Dmax, Dmean, and dose to n% of the PTV with n of 70% or less correlate with outcome (p ...). EUD correlates significantly with outcome for SF2 of 0.4 or more, but not for lower SF2 values. Using either of the two input parameter sets, all TCP models correlate with outcome (p ...) ... is limited because the low dose region may not coincide with the tumor location. Instead, for MSKCC prostate cancer patients with their
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
CCD uvby photometry of the bimodal main-sequence cluster NGC 3680
Energy Technology Data Exchange (ETDEWEB)
Anthony-Twarog, B.J.; Twarog, B.A.; Shodhan, S. (Kansas Univ., Lawrence (USA))
1989-11-01
CCD uvby photometry for the intermediate-age, southern open cluster NGC 3680 is analyzed. For a reddening of E(b-y) = 0.034, a true cluster modulus of 9.74 ± 0.20 and a cluster metallicity of [Fe/H] = 0.10 ± 0.09 are derived, based on 18 probable nonbinary members of the cluster brighter than V = 14. The color-magnitude diagram for the cluster suggests that, although the main sequence may be subject to the same bimodal distribution as NGC 752, the likely source in both clusters is a combination of binaries and a sharply curved turnoff. The color-magnitude diagram is compared to the theoretical isochrones of Bertelli et al. (1988), showing an age of (1.9 ± 0.3) × 10⁹ yr. 27 refs.
International Nuclear Information System (INIS)
Tang Hong; Lin Jianzhong
2011-01-01
The extinction coefficient of atmospheric aerosol particles influences the Earth's radiation balance directly or indirectly, and it can be determined by the scattering and absorption characteristics of aerosol particles. The problem of estimating the change of extinction coefficient due to time evolution of bimodal particle size distributions is studied, and two improved methods, for calculating the Brownian coagulation coefficient and the condensation growth rate respectively, are proposed. Through the improved method based on the Otto kernel, the Brownian coagulation coefficient can be expressed simply in powers of particle volume for the entire particle size regime, based on the fitted polynomials of the mean enhancement function. Meanwhile, the improved method based on the Fuchs–Sutugin kernel is developed to obtain the condensation growth rate for the entire particle size regime. The change of the overall extinction coefficient of bimodal distributions undergoing Brownian coagulation and condensation can then be estimated comprehensively for the entire particle size regime. Simulation experiments indicate that the extinction coefficients obtained with the improved methods coincide fairly well with the true values, providing a simple, reliable, and general method to estimate the change of extinction coefficient for the entire particle size regime during bimodal particle dynamic processes.
Energy Technology Data Exchange (ETDEWEB)
Sabooni, S., E-mail: s.sabooni@ma.iut.ac.ir [Department of Materials Engineering, Isfahan University of Technology, 84156-83111 Isfahan (Iran, Islamic Republic of); Karimzadeh, F.; Enayati, M.H. [Department of Materials Engineering, Isfahan University of Technology, 84156-83111 Isfahan (Iran, Islamic Republic of); Ngan, A.H.W. [Department of Mechanical Engineering, The University of Hong Kong, Pokfulam Road, Hong Kong (China)
2015-06-11
In the present study, metastable AISI 304L austenitic stainless steel samples were subjected to different cold rolling reductions from 70% to 93%, followed by annealing at 700 °C for 300 min to form ultrafine grained (UFG) austenite with different grain structures. Transmission electron microscopy (TEM) and nanoindentation were used to characterize the martensitic transformation, in order to relate it to the bimodal distribution of the austenite grain size after subsequent annealing. The results showed that the martensite morphology changed from lath type in the 60% rolled sample to a mixture of lath and dislocation-cell types at higher rolling reductions. Calculation of the Gibbs free energy change during the reversion treatment showed that the reversion mechanism is shear controlled at the annealing temperature, so the morphology of the reverted austenite depends entirely on the morphology of the deformation-induced martensite. It was found that the austenite had a bimodal grain size distribution in the 80% rolled and annealed state, which is related to the existence of different types of martensite. Increasing the rolling reduction to 93%, followed by annealing, changed the grain structure to a monomodal-like structure consisting mostly of small grains of around 300 nm. The existence of a bimodal austenite grain size in the 80% rolled and annealed 304L stainless steel improved ductility while maintaining a high tensile strength in comparison with the 93% rolled and annealed sample.
Mobile Education: Towards Affective Bi-modal Interaction for Adaptivity
Directory of Open Access Journals (Sweden)
Efthymios Alepis
2009-04-01
Full Text Available One important field where mobile technology can make significant contributions is education. However, one criticism of mobile education is that students receive impersonal teaching. Affective computing may offer a solution to this problem. In this paper we describe an affective bi-modal educational system for mobile devices. In our research we describe a novel approach of combining information from two modalities, namely the keyboard and the microphone, through a multi-criteria decision-making theory.
Analysis of soil moisture probability in a tree cropped watershed
Espejo-Perez, Antonio Jesus; Giraldez Cervera, Juan Vicente; Pedrera, Aura; Vanderlinden, Karl
2015-04-01
Probability density functions (pdfs) of soil moisture were estimated for an experimental watershed in Southern Spain, cropped with olive trees. Measurements were made using a capacitance sensor network from June 2011 until May 2013. The network consisted of 22 profiles of sensors, installed close to the tree trunk under the canopy and in the adjacent inter-row area, at 11 locations across the watershed, to assess the influence of rain interception and root-water uptake on the soil moisture distribution. A bimodal pdf described the moisture dynamics at the 11 sites, both under and in-between the trees. Each mode represented the moisture status during either the dry or the wet period of the year. The observed histograms could be decomposed into a lognormal pdf for the dry period and a Gaussian pdf for the wet period. The pdfs showed a larger variation among the different locations at inter-row positions, as compared to under the canopy, reflecting the strict control of the vegetation on soil moisture. At both positions this variability was smaller during the wet season than during the dry period.
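The decomposition described above can be written as a two-component mixture density; a minimal sketch (the mixture weight and distribution parameters below are illustrative, not the fitted values from the study):

```python
import math

def lognormal_pdf(x, mu, sigma):
    """Lognormal density; here used for the dry-season moisture mode."""
    if x <= 0:
        return 0.0
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (x * sigma * math.sqrt(2 * math.pi))

def normal_pdf(x, mu, sigma):
    """Gaussian density; here used for the wet-season moisture mode."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bimodal_moisture_pdf(theta, w_dry, mu_ln, sd_ln, mu_n, sd_n):
    """Two-component mixture: lognormal (dry period) + Gaussian (wet period)."""
    return w_dry * lognormal_pdf(theta, mu_ln, sd_ln) + (1 - w_dry) * normal_pdf(theta, mu_n, sd_n)

# Illustrative parameters for volumetric moisture theta (m3/m3):
# dry mode near 0.08, wet mode near 0.30, 60% of days in the dry regime.
pdf = lambda th: bimodal_moisture_pdf(th, 0.6, math.log(0.08), 0.4, 0.30, 0.04)
```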
On the effect of segregation on intense bimodal bed load
Directory of Open Access Journals (Sweden)
Zrostlík Štěpán
2017-01-01
Full Text Available Open-channel two-phase flow above a granular mobile bed is studied experimentally and theoretically. In the two-phase flow, water serves as a carrying liquid for plastic grains transported as collisional contact load in the upper-stage plane bed regime. The investigation evaluates friction- and transport characteristics of the flow under the condition of intense collisional transport of grains and links them with the internal structure of the two-phase flow. The paper focusses on the effect of bimodal solids (mixed two fractions of grains of similar density and different size and shape) on the flow characteristics and internal structure. Hence, experimental results obtained for the bimodal mixture are compared with results for individual grain fractions. The experiments show that the bimodal character of the transported solids affects the layered internal structure of the flow as a result of fraction segregation due primarily to gravity (kinetic sieving) during transport. The segregation also affects the friction- and transport characteristics of intense bed load. In the paper, the effects are described and quantified.
On the effect of segregation on intense bimodal bed load
Zrostlík, Štěpán; Matoušek, Václav
Open-channel two-phase flow above a granular mobile bed is studied experimentally and theoretically. In the two-phase flow, water serves as a carrying liquid for plastic grains transported as collisional contact load in the upper-stage plane bed regime. The investigation evaluates friction- and transport characteristics of the flow under the condition of intense collisional transport of grains and links them with the internal structure of the two-phase flow. The paper focusses on the effect of bimodal solids (mixed two fractions of grains of similar density and different size and shape) on the flow characteristics and internal structure. Hence, experimental results obtained for the bimodal mixture are compared with results for individual grain fractions. The experiments show that the bimodal character of the transported solids affects the layered internal structure of the flow as a result of fraction segregation due primarily to gravity (kinetic) sieving during transport. The segregation also affects the friction- and transport characteristics of intense bed load. In the paper, the effects are described and quantified.
Particle filtering with path sampling and an application to a bimodal ocean current model
International Nuclear Information System (INIS)
Weare, Jonathan
2009-01-01
This paper introduces a recursive particle filtering algorithm designed to filter high dimensional systems with complicated non-linear and non-Gaussian effects. The method incorporates a parallel marginalization (PMMC) step in conjunction with the hybrid Monte Carlo (HMC) scheme to improve samples generated by standard particle filters. Parallel marginalization is an efficient Markov chain Monte Carlo (MCMC) strategy that uses lower dimensional approximate marginal distributions of the target distribution to accelerate equilibration. As a validation the algorithm is tested on a 2516 dimensional, bimodal, stochastic model motivated by the Kuroshio current that runs along the Japanese coast. The results of this test indicate that the method is an attractive alternative for problems that require the generality of a particle filter but have been inaccessible due to the limitations of standard particle filtering strategies.
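The paper's parallel-marginalization/HMC machinery is beyond a short example, but the baseline it improves on, a plain bootstrap particle filter, can be sketched on a one-dimensional bimodal (double-well) toy model; all model parameters here are illustrative assumptions, not the Kuroshio model:

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500, obs_sd=0.5, seed=1):
    """Plain bootstrap particle filter for a 1-D bimodal state model:
    x_t = x_{t-1} + dt*(x - x**3) + process noise (double-well drift with
    wells near +/-1), observed as y_t = x_t + Gaussian noise."""
    rng = random.Random(seed)
    dt, proc_sd = 0.1, 0.3
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # propagate each particle through the stochastic double-well dynamics
        particles = [x + dt * (x - x ** 3) + rng.gauss(0.0, proc_sd) for x in particles]
        # weight by the Gaussian observation likelihood
        weights = [math.exp(-(y - x) ** 2 / (2 * obs_sd ** 2)) for x in particles]
        total = sum(weights)
        # multinomial resampling (the step whose degeneracy in high dimensions
        # motivates the paper's parallel-marginalization correction)
        particles = rng.choices(particles, weights=[w / total for w in weights], k=n_particles)
        means.append(sum(particles) / n_particles)
    return means

# Repeated observations near +1 should pull the filter mean toward that well.
est = bootstrap_particle_filter([1.0] * 20)
```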
Wear behavior of light-cured resin composites with bimodal silica nanostructures as fillers.
Wang, Ruili; Bao, Shuang; Liu, Fengwei; Jiang, Xiaoze; Zhang, Qinghong; Sun, Bin; Zhu, Meifang
2013-12-01
To enhance wear behavior of resin composites, bimodal silica nanostructures including silica nanoparticles and silica nanoclusters were prepared and proposed as fillers. The silica nanoclusters, a combination of individually dispersed silica nanoparticles and their agglomerations, with size distribution of 0.07-2.70 μm, were fabricated by the coupling reaction between amino and epoxy functionalized silica nanoparticles, which were obtained by the surface modification of silica nanoparticles (~70 nm) using 3-aminopropyl triethoxysilane (APTES) and 3-glycidoxypropyl trimethoxysilane (GPS) as coupling agents, respectively. Silica nanoparticles and nanoclusters were then silanized with 3-methacryloxypropyl trimethoxysilane (γ-MPS) to prepare composites by mixing with bisphenol A glycerolate dimethacrylate (Bis-GMA) and tri(ethylene glycol) dimethacrylate (TEGDMA). Experimental composites with various filler compositions were prepared and their wear behaviors were assessed in this work. The results suggested that composites with increasing addition of silica nanoparticles in co-fillers possessed lower wear volume and smoother worn surfaces. In particular, the composite 53:17, with the optimum weight ratio of silica nanoparticles to silica nanoclusters, presented excellent wear behavior with respect to that of the commercial Esthet-X, although the smallest wear volume was achieved by Z350 XT. The introduction of bimodal silica nanostructures as fillers might provide new insight for the design of resin composites with significantly improved wear resistance. Crown Copyright © 2013. All rights reserved.
DEFF Research Database (Denmark)
Hallas, Jesper; Pottegård, Anton; Støvring, Henrik
2017-01-01
BACKGROUND: In register-based pharmacoepidemiological studies, each day of follow-up is usually categorized either as exposed or unexposed. However, there is an underlying continuous probability of exposure, and by insisting on a dichotomy, researchers unwillingly force a nondifferential misclass...
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
The Inductive Applications of Probability Calculus
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
Full Text Available The Author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The Author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.
Probably Almost Bayes Decisions
DEFF Research Database (Denmark)
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...... discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
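One common convention for such a one-parameter log/exp pair (assumed here for illustration; the paper's exact definition may differ) is ln_q(x) = (x^q − 1)/q with inverse (1 + q·y)^(1/q), both recovering the ordinary ln and exp as q → 0:

```python
import math

def gen_log(x, q):
    """Generalized logarithm ln_q(x) = (x**q - 1)/q; recovers ln(x) as q -> 0."""
    if q == 0:
        return math.log(x)
    return (x ** q - 1.0) / q

def gen_exp(y, q):
    """Generalized exponential, the inverse of gen_log:
    (1 + q*y)**(1/q); recovers exp(y) as q -> 0."""
    if q == 0:
        return math.exp(y)
    return (1.0 + q * y) ** (1.0 / q)
```

For q ≠ 0 the pair is an exact inverse, since (1 + q·ln_q(x))^(1/q) = (x^q)^(1/q) = x.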
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
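The central effect can be reproduced in a small Monte Carlo experiment: set the threshold from a plug-in lognormal fit at a nominal 5% failure level, then average the true exceedance probability over many samples; the averaged failure frequency comes out above the nominal level. Sample size and trial count below are illustrative assumptions:

```python
import math
import random
import statistics

def expected_failure_prob(nominal_z=1.6448536269514722, n_data=20, n_trials=2000, seed=7):
    """Monte Carlo sketch: a threshold is set from an estimated lognormal fit
    at the nominal 95% level; returns the average *true* exceedance probability,
    which exceeds the nominal 5% because of parameter uncertainty."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # true (but unknown to the decision-maker) parameters
    total = 0.0
    for _ in range(n_trials):
        logs = [rng.gauss(mu, sigma) for _ in range(n_data)]
        mu_hat = statistics.fmean(logs)
        sd_hat = statistics.stdev(logs)
        threshold_log = mu_hat + nominal_z * sd_hat  # plug-in 95% quantile
        # true probability that the log-loss exceeds the estimated threshold
        total += 0.5 * math.erfc((threshold_log - mu) / (sigma * math.sqrt(2)))
    return total / n_trials

# Exceeds the nominal 0.05 on average, illustrating the paper's main point.
p = expected_failure_prob()
```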
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
Multiple decomposability of probabilities on contractible locally ...
Indian Academy of Sciences (India)
Identification of probabilities.
Vitányi, Paul M B; Chater, Nick
2017-02-01
Within psychology, neuroscience and artificial intelligence, there has been increasing interest in the proposal that the brain builds probabilistic models of sensory and linguistic input: that is, to infer a probabilistic model from a sample. The practical problems of such inference are substantial: the brain has limited data and restricted computational resources. But there is a more fundamental question: is the problem of inferring a probabilistic model from a sample possible even in principle? We explore this question and find some surprisingly positive and general results. First, for a broad class of probability distributions characterized by computability restrictions, we specify a learning algorithm that will almost surely identify a probability distribution in the limit given a finite i.i.d. sample of sufficient but unknown length. This is similarly shown to hold for sequences generated by a broad class of Markov chains, subject to computability assumptions. The technical tool is the strong law of large numbers. Second, for a large class of dependent sequences, we specify an algorithm which identifies in the limit a computable measure for which the sequence is typical, in the sense of Martin-Löf (there may be more than one such measure). The technical tool is the theory of Kolmogorov complexity. We analyze the associated predictions in both cases. We also briefly consider special cases, including language learning, and wider theoretical implications for psychology.
Krueger, Ute; Schimmelpfeng, Katja
2013-03-01
A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered as a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This is based on the underlying assumption of the caller's independent decision to call or not to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate and the Weibull distribution to be more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
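The Pólya distribution referred to above is the negative binomial; unlike the Poisson, its variance exceeds its mean, which is what makes it suitable for dependent (clustered) emergency calls. A small sketch with illustrative parameters matched to the same mean:

```python
import math

def poisson_pmf(k, lam):
    """Poisson pmf: variance equals the mean lam (no overdispersion)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def polya_pmf(k, r, p):
    """Polya (negative binomial) pmf with integer r: mean r*(1-p)/p,
    variance r*(1-p)/p**2, i.e. overdispersed relative to the Poisson."""
    return math.comb(k + r - 1, k) * p ** r * (1 - p) ** k

# Same mean of 4 calls per interval, but different variance: 4 (Poisson)
# versus 8 (Polya), reflecting multiple callers reporting one incident.
lam, r, p = 4.0, 4, 0.5
```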
Small Low Mass Advanced PBR's for Bi-Modal Operation
Ludewig, Hans; Todosow, Michael; Powell, James R.
1994-07-01
A preliminary assessment is made of a low mass bi-modal reactor for use as a propulsion unit and as a heat source for generating electricity. This reactor is based on the particle bed reactor (PBR) concept. It will be able to generate both thrust and electricity simultaneously. This assessment indicates that the reactor can generate approximately 6.8 × 10⁴ N of thrust using hydrogen as a coolant, and 100 kWe using a closed Brayton cycle (CBC) power conversion system. Two cooling paths pass through the reactor, allowing simultaneous operation of both modes. The development of all the components for this reactor is within the experience base of the NTP project.
Bifurcation Structures in a Bimodal Piecewise Linear Map
Directory of Open Access Journals (Sweden)
Anastasiia Panchuk
2017-05-01
Full Text Available In this paper we present an overview of the results concerning dynamics of a piecewise linear bimodal map. The organizing principles of the bifurcation structures in both regular and chaotic domains of the parameter space of the map are discussed. In addition to the previously reported structures, a family of regions closely related to the so-called U-sequence is described. The boundaries of distinct regions belonging to these structures are obtained analytically using the skew tent map and the map replacement technique.
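A bimodal piecewise linear map of the kind studied here can be sketched as a three-branch function; the slopes, offsets, and border points below are arbitrary illustrative parameters, not the parameterization analyzed in the paper.

```python
def bimodal_map(x, sl, sm, sr, mul, mur, dl=-1.0, dr=1.0):
    """Three-branch piecewise linear map with border points dl < dr.
    sl, sm, sr are branch slopes; mul, mur are the outer-branch offsets."""
    if x < dl:
        return sl * x + mul
    if x <= dr:
        return sm * x
    return sr * x + mur

def orbit(x0, n, **params):
    """Iterate the map n times, returning the full trajectory."""
    xs = [x0]
    for _ in range(n):
        xs.append(bimodal_map(xs[-1], **params))
    return xs
```

Sweeping the slopes and offsets while recording the asymptotic behavior of `orbit` is the numerical counterpart of scanning the parameter space for bifurcation structures.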
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Effect of meta-carborane on segmental dynamics in a bimodal Poly(dimethylsiloxane) network
Energy Technology Data Exchange (ETDEWEB)
Lewicki, J; Maxwell, R S; Patel, M; Herberg, J; Swain, A C; Liggat, J; Pethrick, R
2008-06-11
Bimodal networks of polydimethylsiloxane (PDMS) filled with varying amounts of icosahedral meta-carborane (m-CB) have been developed and characterized by broadband dielectric spectroscopy (BDS) and static ¹H multiple quantum nuclear magnetic resonance (MQ NMR). Both BDS and MQ NMR showed evidence for a decrease in the polymer chain dynamics. BDS spectra quantified a normal-mode relaxation near 40 Hz at 40 °C. The frequency maximum observed for filled samples decreased with increasing m-CB content until contents exceeded 5 wt%. The width of the relaxation spectrum increased with the addition of small quantities of filler and decreased at filler contents greater than 5 wt%. Agglomeration effects were observed at loadings greater than 5 wt%, as manifested by the onset of low-frequency Maxwell-Wagner-Sillars (MWS) processes. The MQ NMR data allowed the characterization of distributions of the residual dipolar couplings ⟨Ω_d⟩, and thus of the dynamic order parameter, Sb, consistent with the bimodal network architecture expected from the synthesis protocol used. Upon addition of less than 10 wt% m-CB filler, the mean ⟨Ω_d⟩ for the longer chains increased by 46% and the width of the distribution increased by 33%. The mean ⟨Ω_d⟩ for the shorter chains increased by much less, indicative of preferential dispersion of the filler particles in the long-chain domains of the network structure. We conclude that the mechanism of reinforcement is likely free-volume space filling at low loadings, transitioning to complex molecular filler and polymer chain interaction phenomena at higher loadings.
Wang, Bo; Li, Hongxia; Cao, Xueyuan; Zhu, Xiaojun; Gan, Zhongxue
2017-04-01
With the rapid development of energy networks, various forms of renewable energy resources are being absorbed into them. Because of the inherent randomness of renewable resources, introducing them into the energy network can destabilize the grid. Proper energy storage is required to reduce the uncertain fluctuations from renewable energy resources. For a concrete model study, this paper presents an explicit method to determine suitable capacities for energy storage, taking into account the economics of the storage, grid losses, and the probability of bus voltage violations, for situations in which wind-power generation is injected into the power network. Furthermore, the influence of the correlation between different wind farms on the optimal storage capacity can also be studied by this method.
Directory of Open Access Journals (Sweden)
Bruno Teixeira Ribeiro
2007-10-01
Full Text Available Probabilistic studies involving climatic variables are of extreme importance for farming activities, construction, tourism, and transportation, among others. Seeking to contribute to the planning of irrigated agriculture, this work aimed to compare probability distributions fitted to decennial and monthly historical series and to estimate probable rainfall for Barbacena County, Minas Gerais State, Brazil. Rainfall data for December, January, and February, from 1942 to 2003, were studied, constituting historical series with 62 years of observations. Daily rainfall depths were totaled over 10-day and monthly periods, and the 2-parameter log-Normal, 3-parameter log-Normal, and Gamma distributions were fitted. The chi-square (χ²) test at the 5% significance level was used to evaluate the adequacy of the distributions for the periods studied. Probable rainfall was estimated for each period using the distribution with the smallest χ² value, at exceedance probability levels of 75, 90, and 98%. The Gamma distribution provided the best fit to the data. The study of probable rainfall is a good tool to aid decision-making in irrigation planning and use.
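The fit-then-quantile procedure described above can be sketched with a method-of-moments Gamma fit and a numerical solve for the depth at a given exceedance probability. This is a stdlib-only illustration under simplifying assumptions (moment fitting instead of maximum likelihood, numerical integration of the density); the data values are made up.

```python
import math

def fit_gamma_moments(data):
    """Method-of-moments estimates for a Gamma(shape k, scale theta)."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    return mean ** 2 / var, var / mean          # k, theta

def gamma_cdf(x, k, theta, steps=2000):
    """Numerical CDF by trapezoidal integration of the Gamma pdf."""
    if x <= 0.0:
        return 0.0
    def pdf(t):
        if t <= 0.0:
            return 0.0                          # adequate for k >= 1 here
        return t ** (k - 1) * math.exp(-t / theta) / (math.gamma(k) * theta ** k)
    h = x / steps
    s = 0.5 * (pdf(0.0) + pdf(x)) + sum(pdf(i * h) for i in range(1, steps))
    return s * h

def probable_rainfall(k, theta, exceedance, hi=1e4):
    """Depth exceeded with the given probability: solve P(X > x) = exceedance."""
    lo, target = 0.0, 1.0 - exceedance
    for _ in range(80):                         # bisection on the CDF
        mid = 0.5 * (lo + hi)
        if gamma_cdf(mid, k, theta) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a Gamma with k = 1 (the exponential special case) the 75% exceedance depth has the closed form -theta·ln(0.75), which the numerical solve reproduces.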
Bimodal Fuzzy Analytic Hierarchy Process (BFAHP) For Coronary Heart Disease Risk Assessment.
Sabahi, Farnaz
2018-04-03
Rooted deeply in medical multiple-criteria decision-making (MCDM), risk assessment is very important, especially when applied to the risk of being affected by deadly diseases such as coronary heart disease (CHD). CHD risk assessment is a stochastic, uncertain, and highly dynamic process influenced by various known and unknown variables. In recent years, there has been great interest in the fuzzy analytic hierarchy process (FAHP), a popular methodology for dealing with uncertainty in MCDM. This paper proposes a new FAHP, the bimodal fuzzy analytic hierarchy process (BFAHP), which augments fuzzy numbers with two aspects of knowledge, probability and validity, to better deal with uncertainty. In BFAHP, fuzzy validity is computed by aggregating the validities of relevant risk factors based on expert knowledge and collective intelligence. By considering both soft and statistical data, we compute the fuzzy probability of risk factors using the Bayesian formulation. In the BFAHP approach, these fuzzy validities and fuzzy probabilities are used to construct a reciprocal comparison matrix. We then aggregate fuzzy probabilities and fuzzy validities in a pairwise manner for each risk factor and each alternative. BFAHP decides between being affected and not being affected by ranking high and low risks. For evaluation, the proposed approach is applied to the risk of being affected by CHD using a real dataset of 152 patients from Iranian hospitals. Simulation results confirm that adding validity in a fuzzy manner yields more confident and clinically useful results, especially in the face of incomplete information, when compared with actual outcomes. Applied to CHD risk assessment on this dataset, BFAHP yields an accuracy above 85% for correct prediction. In addition, this paper finds that the risk factors of diastolic blood pressure in men and high-density lipoprotein in women are more important in CHD than other risk factors.
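The reciprocal-comparison-matrix step at the heart of any AHP variant can be sketched with the classical row geometric-mean prioritization. The crisp numbers below stand in for the paper's fuzzy probability/validity pairs; this is a generic AHP sketch, not the BFAHP aggregation itself.

```python
import math

def ahp_priorities(matrix):
    """Priority weights from a reciprocal pairwise-comparison matrix
    using the row geometric-mean method."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]
```

For a consistent 2x2 matrix saying "factor A is twice as important as factor B", the weights come out 2/3 and 1/3; ranking alternatives by such weights is the final decision step the abstract describes.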
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
International Nuclear Information System (INIS)
Yoon, Suk-Jin; Lee, Sang-Yoon; Kim, Hak-Sub; Cho, Jaeil; Chung, Chul; Sohn, Sangmo T.; Blakeslee, John P.
2011-01-01
The optical color distributions of globular clusters (GCs) in most large elliptical galaxies are bimodal. Based on the assumed linear relationship between GC colors and their metallicities, the bimodality has been taken as evidence of two GC subsystems with different metallicities in each galaxy and has led to a number of theories in the context of galaxy formation. More recent observations and modeling of GCs, however, suggest that the color-metallicity relations (CMRs) are inflected, and thus colors likely trace metallicities in a nonlinear manner. The nonlinearity could produce bimodal color distributions from a broad underlying metallicity spread, even if it is unimodal. Despite the far-reaching implications, whether CMRs are nonlinear and whether the nonlinearity indeed causes the color bimodality are still open questions. Given that the spectroscopic refinement of CMRs is still very challenging, we here propose a new photometric technique to probe the possible nonlinear nature of CMRs. In essence, a color distribution of GCs is a 'projected' distribution of their metallicities. Since the form of CMRs hinges on which color is used, the shape of color distributions varies significantly depending on the color. Among optical colors, the u-band related colors (e.g., u – g and u – z) are theoretically predicted to exhibit significantly less inflected CMRs than other commonly used CMRs (e.g., for g – z). As a case study, we performed Hubble Space Telescope (HST)/WFPC2 archival u-band photometry for the M87 (NGC 4486) GC system with confirmed color bimodality. We show that the u-band color distributions are significantly different from that of g – z and consistent with our model predictions. With more u-band measurements, this method will support or rule out the nonlinear CMR scenario for the origin of GC color bimodality with high confidence. The HST/WFC3 observations in F336W for nearby large elliptical galaxies are highly anticipated in this regard.
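The projection effect described above is easy to reproduce numerically: pushing a unimodal (Gaussian) metallicity spread through a hypothetical inflected color-metallicity relation yields a bimodal color histogram. All numbers below are illustrative, not fitted to any GC data.

```python
import math, random

def inflected_cmr(feh):
    """Hypothetical inflected color-metallicity relation: steepens
    around [Fe/H] = -1 and flattens on both sides (illustrative only)."""
    return 1.2 + 0.5 * math.tanh(2.5 * (feh + 1.0))

def color_histogram(n=20000, bins=30, seed=1):
    """Project a unimodal Gaussian metallicity spread through the CMR
    and bin the resulting colors."""
    rng = random.Random(seed)
    colors = [inflected_cmr(rng.gauss(-1.0, 0.6)) for _ in range(n)]
    lo, hi = min(colors), max(colors)
    counts = [0] * bins
    for c in colors:
        i = min(int((c - lo) / (hi - lo) * bins), bins - 1)
        counts[i] += 1
    return counts
```

The histogram shows two peaks with a dip near the inflection color, even though the input metallicity distribution has a single mode.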
Warren, P. H.
1986-01-01
The geochemical bimodality of pristine rocks led to proposals that a major fraction of the crust (the Mg-rich suite) formed as cumulates in numerous intrusions slightly younger than the magmasphere. It is suggested that assimilation helped to engender the bimodal patterns. Mass/energy balance calculations indicate that large proportions of plagioclase were probably assimilated from the older (magmasphere-generated) ferroan anorthosite crust by most of the Mg-rich intrusive melts. The magmasphere, in the absence of assimilation, probably did not yield appreciable plagioclase until fractional crystallization of mafic silicates had diminished the melt mg ratio to about 0.42. However, assuming identical melt composition, an Mg-rich intrusion assimilating ferroan anorthosite would have reached plagioclase saturation at a much higher mg, about 0.66. It is suggested that the current version of the magmasphere hypothesis (ferroan anorthosites = magmasphere flotation cumulates; Mg-rich rocks = products of younger, localized intrusions) is the only plausible mechanism for engendering the Mg/Fe-related bimodality.
International Nuclear Information System (INIS)
Karamyan, S.A.; Adam, J.; Belov, A.G.; Chaloun, P.; Norseev, Yu.V.; Stegajlov, V.I.
1997-01-01
Fission-fragment mass distribution has been measured by the cumulative yields of radionuclides detected in the ²³²Th(γ,f) reaction at Bremsstrahlung endpoint energies of 12 and 24 MeV. Upper limits on the yields have been estimated for the light nuclei ²⁴Na, ²⁸Mg, ³⁸S, etc., for Th and Ta targets exposed to the 24 MeV Bremsstrahlung. The results are discussed in terms of multimodal fission phenomena and cluster emission from a deformed fissioning system or from a compound nucleus
Gaze-independent ERP-BCIs: augmenting performance through location-congruent bimodal stimuli
Thurlings, Marieke E.; Brouwer, Anne-Marie; Van Erp, Jan B. F.; Werkhoven, Peter
2014-01-01
Gaze-independent event-related potential (ERP) based brain-computer interfaces (BCIs) yield relatively low BCI performance and traditionally employ unimodal stimuli. Bimodal ERP-BCIs may increase BCI performance due to multisensory integration or summation in the brain. An additional advantage of bimodal BCIs may be that the user can choose which modality or modalities to attend to. We studied bimodal, visual-tactile, gaze-independent BCIs and investigated whether or not ERP components’ tAUCs and subsequent classification accuracies are increased for (1) bimodal vs. unimodal stimuli; (2) location-congruent vs. location-incongruent bimodal stimuli; and (3) attending to both modalities vs. to either one modality. We observed an enhanced bimodal (compared to unimodal) P300 tAUC, which appeared to be positively affected by location-congruency (p = 0.056) and resulted in higher classification accuracies. Attending either to one or to both modalities of the bimodal location-congruent stimuli resulted in differences between ERP components, but not in classification performance. We conclude that location-congruent bimodal stimuli improve ERP-BCIs, and offer the user the possibility to switch the attended modality without losing performance. PMID:25249947
Nonlatching positive feedback enables robust bimodality by decoupling expression noise from the mean
Razooky, Brandon S.; Cao, Youfang; Hansen, Maike M. K.; Perelson, Alan S.; Simpson, Michael L.
2017-01-01
Fundamental to biological decision-making is the ability to generate bimodal expression patterns where 2 alternate expression states simultaneously exist. Here, we use a combination of single-cell analysis and mathematical modeling to examine the sources of bimodality in the transcriptional program controlling HIV’s fate decision between active replication and viral latency. We find that the HIV transactivator of transcription (Tat) protein manipulates the intrinsic toggling of HIV’s promoter, the long terminal repeat (LTR), to generate bimodal ON-OFF expression and that transcriptional positive feedback from Tat shifts and expands the regime of LTR bimodality. This result holds for both minimal synthetic viral circuits and full-length virus. Strikingly, computational analysis indicates that the Tat circuit’s noncooperative “nonlatching” feedback architecture is optimized to slow the promoter’s toggling and generate bimodality by stochastic extinction of Tat. In contrast to the standard Poisson model, theory and experiment show that nonlatching positive feedback substantially dampens the inverse noise-mean relationship to maintain stochastic bimodality despite increasing mean expression levels. Given the rapid evolution of HIV, the presence of a circuit optimized to robustly generate bimodal expression appears consistent with the hypothesis that HIV’s decision between active replication and latency provides a viral fitness advantage. More broadly, the results suggest that positive-feedback circuits may have evolved not only for signal amplification but also for robustly generating bimodality by decoupling expression fluctuations (noise) from mean expression levels. PMID:29045398
Nonlatching positive feedback enables robust bimodality by decoupling expression noise from the mean
Energy Technology Data Exchange (ETDEWEB)
Razooky, Brandon S. [Rockefeller Univ., New York, NY (United States). Lab. of Virology and Infectious Disease; Gladstone Institutes (Virology and Immunology), San Francisco, CA (United States); Univ. of California, San Francisco, CA (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Nanophase Materials Science (CNMS); Univ. of Tennessee, Knoxville, TN (United States). Bredesen Center for Interdisciplinary; Cao, Youfang [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hansen, Maike M. K. [Gladstone Institutes (Virology and Immunology), San Francisco, CA (United States); Perelson, Alan S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Simpson, Michael L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Nanophase Materials Science (CNMS); Univ. of Tennessee, Knoxville, TN (United States). Bredesen Center for Interdisciplinary; Weinberger, Leor S. [Gladstone Institutes (Virology and Immunology), San Francisco, CA (United States); Univ. of California, San Francisco, CA (United States). Dept. of Biochemistry and Biophysics; Univ. of California, San Francisco, CA (United States). QB3: California Inst. of Quantitative Biosciences; Univ. of California, San Francisco, CA (United States). Dept. of Pharmaceutical Chemistry
2017-10-18
Fundamental to biological decision-making is the ability to generate bimodal expression patterns where two alternate expression states simultaneously exist. Here in this study, we use a combination of single-cell analysis and mathematical modeling to examine the sources of bimodality in the transcriptional program controlling HIV’s fate decision between active replication and viral latency. We find that the HIV Tat protein manipulates the intrinsic toggling of HIV’s promoter, the LTR, to generate bimodal ON-OFF expression, and that transcriptional positive feedback from Tat shifts and expands the regime of LTR bimodality. This result holds for both minimal synthetic viral circuits and full-length virus. Strikingly, computational analysis indicates that the Tat circuit’s non-cooperative ‘non-latching’ feedback architecture is optimized to slow the promoter’s toggling and generate bimodality by stochastic extinction of Tat. In contrast to the standard Poisson model, theory and experiment show that non-latching positive feedback substantially dampens the inverse noise-mean relationship to maintain stochastic bimodality despite increasing mean-expression levels. Given the rapid evolution of HIV, the presence of a circuit optimized to robustly generate bimodal expression appears consistent with the hypothesis that HIV’s decision between active replication and latency provides a viral fitness advantage. More broadly, the results suggest that positive-feedback circuits may have evolved not only for signal amplification but also for robustly generating bimodality by decoupling expression fluctuations (noise) from mean expression levels.
Fixed-point distributions of short-range Ising spin glasses on hierarchical lattices
Almeida, Sebastião T. O.; Nobre, Fernando D.
2015-03-01
Fixed-point distributions for the couplings of Ising spin glasses with nearest-neighbor interactions on hierarchical lattices are investigated numerically. Hierarchical lattices within the Migdal-Kadanoff family with fractal dimensions in the range 2.58 ≤ D ≤ 7, as well as a lattice of the Wheatstone-Bridge family with fractal dimension D ≈ 3.58, are considered. Three initial distributions for the couplings are analyzed, namely, the Gaussian, bimodal, and uniform ones. In all cases, after a few iterations of the renormalization-group procedure, the associated probability distributions approached universal fixed shapes. For hierarchical lattices of the Migdal-Kadanoff family, the fixed-point distributions were well fitted either by stretched exponentials or by q-Gaussian distributions; both fittings recover the expected Gaussian limit as D → ∞. In the case of the Wheatstone-Bridge lattice, the best fit was found by means of a stretched-exponential distribution.
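The pool-method iteration behind such fixed-point studies can be sketched for a toy Migdal-Kadanoff cell (two parallel branches, each made of two bonds in series, couplings in units of temperature). The cell geometry and pool size here are illustrative choices, not those of the paper.

```python
import math, random

def series(j1, j2):
    """Decimate two bonds in series (couplings measured in units of kT)."""
    return math.atanh(math.tanh(j1) * math.tanh(j2))

def mk_step(pool, rng, branches=2):
    """One renormalization step: each new bond is the parallel sum of
    `branches` series pairs drawn at random from the current pool."""
    return [sum(series(rng.choice(pool), rng.choice(pool))
                for _ in range(branches))
            for _ in range(len(pool))]

def iterate_pool(size, steps, seed=0):
    """Start from a bimodal (+-J) coupling pool and renormalize."""
    rng = random.Random(seed)
    pool = [rng.choice([-1.0, 1.0]) for _ in range(size)]
    for _ in range(steps):
        pool = mk_step(pool, rng)
    return pool
```

Tracking the pool's histogram over iterations is how the fixed-point shape (stretched exponential, q-Gaussian, etc.) is extracted; for this low-dimensional toy cell the couplings renormalize toward zero.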
[What do bimodal bilinguals have to say about bilingual development?]
de Quadros, Ronice Müller; Lillo-Martin, Diane; Pichler, Deborah Chen
2013-07-01
The goal of this work is to present what our research with hearing children of Deaf parents, acquiring Brazilian Sign Language (Libras) and Portuguese, and American Sign Language (ASL) and English (Lillo-Martin et al. 2010), has to say about bilingual development. The data analyzed in this study are part of a database of spontaneous interactions collected longitudinally, alternating contexts of sign and spoken languages. Moreover, data from experimental studies with tests in both pairs of languages are incorporated into the present study. A general view of previous studies on bimodal bilingual acquisition by hearing children of Deaf parents will be presented. Then, we will show some linguistic aspects of this kind of acquisition found in our study and discuss bilingual acquisition.
Event-related potentials to visual, auditory, and bimodal (combined auditory-visual) stimuli.
Isoğlu-Alkaç, Ummühan; Kedzior, Karina; Keskindemirci, Gonca; Ermutlu, Numan; Karamursel, Sacit
2007-02-01
The purpose of this study was to investigate the response properties of event-related potentials to unimodal and bimodal stimulation. The amplitudes of N1 and P2 were larger during bimodal evoked potentials (BEPs) than auditory evoked potentials (AEPs) at the anterior sites, and the amplitudes of P1 were larger during BEPs than visual evoked potentials (VEPs), especially at the parieto-occipital locations. Responses to bimodal stimulation had longer latencies than responses to unimodal stimulation. The N1 and P2 components were larger in amplitude and longer in latency during the bimodal paradigm and predominantly occurred at the anterior sites. Therefore, the current bimodal paradigm can be used to investigate the involvement and location of specific neural generators that contribute to higher processing of sensory information. Moreover, this paradigm may be a useful tool to investigate the level of sensory dysfunction in clinical samples.
Pollacco, Joseph Alexander Paul; Webb, Trevor; McNeill, Stephen; Hu, Wei; Carrick, Sam; Hewitt, Allan; Lilburne, Linda
2017-06-01
Descriptions of soil hydraulic properties, such as the soil moisture retention curve, θ(h), and saturated hydraulic conductivities, Ks, are a prerequisite for hydrological models. Since the measurement of Ks is expensive, it is frequently derived from statistical pedotransfer functions (PTFs). Because it is usually more difficult to describe Ks than θ(h) from pedotransfer functions, Pollacco et al. (2013) developed a physical unimodal model to compute Ks solely from hydraulic parameters derived from the Kosugi θ(h). This unimodal Ks model, which is based on a unimodal Kosugi soil pore-size distribution, was developed by combining the approach of Hagen-Poiseuille with Darcy's law and by introducing three tortuosity parameters. We report here on (1) the suitability of the Pollacco unimodal Ks model to predict Ks for a range of New Zealand soils from the New Zealand soil database (S-map) and (2) further adaptations of this model to dual-porosity structured soils, computing the soil water flux through a continuous function of an improved bimodal pore-size distribution. The improved bimodal Ks model was tested with a New Zealand data set derived from historical measurements of Ks and θ(h) for a range of soils derived from sandstone and siltstone. The Ks data were collected using a small core size of 10 cm diameter, causing large uncertainty in replicate measurements. Predictions of Ks were further improved by distinguishing topsoils from subsoils. Nevertheless, as expected, stratifying the data by soil texture only slightly improved the predictions of the physical Ks models, because the Ks model is based on pore-size distribution and the calibrated parameters were obtained within the physically feasible range. The improvements made to the unimodal Ks model by using the new bimodal Ks model are modest when compared to the unimodal model, which is explained by the poor accuracy of measured total porosity. Nevertheless, the new bimodal model provides an
Directory of Open Access Journals (Sweden)
J. A. P. Pollacco
2017-06-01
Full Text Available Descriptions of soil hydraulic properties, such as the soil moisture retention curve, θ(h), and saturated hydraulic conductivities, Ks, are a prerequisite for hydrological models. Since the measurement of Ks is expensive, it is frequently derived from statistical pedotransfer functions (PTFs). Because it is usually more difficult to describe Ks than θ(h) from pedotransfer functions, Pollacco et al. (2013) developed a physical unimodal model to compute Ks solely from hydraulic parameters derived from the Kosugi θ(h). This unimodal Ks model, which is based on a unimodal Kosugi soil pore-size distribution, was developed by combining the approach of Hagen–Poiseuille with Darcy's law and by introducing three tortuosity parameters. We report here on (1) the suitability of the Pollacco unimodal Ks model to predict Ks for a range of New Zealand soils from the New Zealand soil database (S-map) and (2) further adaptations to this model to adapt it to dual-porosity structured soils by computing the soil water flux through a continuous function of an improved bimodal pore-size distribution. The improved bimodal Ks model was tested with a New Zealand data set derived from historical measurements of Ks and θ(h) for a range of soils derived from sandstone and siltstone. The Ks data were collected using a small core size of 10 cm diameter, causing large uncertainty in replicate measurements. Predictions of Ks were further improved by distinguishing topsoils from subsoil. Nevertheless, as expected, stratifying the data with soil texture only slightly improved the predictions of the physical Ks models because the Ks model is based on pore-size distribution and the calibrated parameters were obtained within the physically feasible range. The improvements made to the unimodal Ks model by using the new bimodal Ks model are modest when compared to the unimodal model, which is explained by the poor accuracy of measured total porosity. Nevertheless, the new bimodal
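A bimodal pore-size density of the Kosugi (lognormal) type can be sketched as a weighted mixture of a matrix mode and a structural (macropore) mode. The median radii, spreads, and weight below are illustrative placeholders, not the calibrated New Zealand values.

```python
import math

def lognormal_pdf(r, rm, sigma):
    """Kosugi-type lognormal pore-size density with median radius rm."""
    return (math.exp(-(math.log(r / rm)) ** 2 / (2.0 * sigma ** 2))
            / (r * sigma * math.sqrt(2.0 * math.pi)))

def bimodal_psd(r, w, rm1, s1, rm2, s2):
    """Weighted mixture of a matrix mode (rm1) and a macropore mode (rm2)."""
    return w * lognormal_pdf(r, rm1, s1) + (1.0 - w) * lognormal_pdf(r, rm2, s2)
```

Integrating the hydraulic conductance contribution of each pore radius over such a continuous bimodal density is the kind of computation the improved Ks model performs.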
Measurement Invariance, Entropy, and Probability
Directory of Open Access Journals (Sweden)
D. Eric Smith
2010-02-01
Full Text Available We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes grading into logarithmic at large magnitudes, leading to observations that often follow Student’s probability distribution which has a Gaussian shape for small fluctuations from the mean and a power law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution. A gamma distribution has a power law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way in which to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information and probability.
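The limiting shapes claimed for the gamma density (a power law at small magnitudes, an exponential at large magnitudes) are easy to verify numerically from the density itself; the shape and scale values below are arbitrary.

```python
import math

def gamma_pdf(x, k, theta=1.0):
    """Gamma density with shape k and scale theta."""
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)
```

For small x the density behaves like x^(k-1), so the ratio of densities at x and 4x approaches 4^(1-k); for large x successive unit steps shrink the density by a factor approaching e^(1/theta).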
International Nuclear Information System (INIS)
Urabe, Itsumasa
2002-01-01
The relevance of concepts brought to mind by stimulus terms concerning atomic energy and radiation utilization has been investigated to learn how people understand the present status of nuclear technology. The relevance of concepts was defined as the frequency distribution of words that came to mind immediately after seeing selected terms needed for present-day life as well as for nuclear engineering. An analysis of knowledge structure shows that a concept of atomic energy has a close relation with that of electric power generation; an understanding of nuclear power utilization may be promoted in relation to an understanding of energy and environmental problems because the concepts of energy, atomic energy, electric power generation, and natural environment have closer relations with one another; a concept of radiation has various relations with harmful radiological health effects, but little relation with industrial, agricultural, and other beneficial uses except of nuclear power generation or medical applications. It also became clear from the investigation that studies on natural radiation may be important to promote an understanding of radiation utilization because a concept of the natural environment does not yet relate to that of natural radiation. (author)
Dickstein, D L; Pullman, M Y; Fernandez, C; Short, J A; Kostakoglu, L; Knesaurek, K; Soleimani, L; Jordan, B D; Gordon, W A; Dams-O'Connor, K; Delman, B N; Wong, E; Tang, C Y; DeKosky, S T; Stone, J R; Cantu, R C; Sano, M; Hof, P R; Gandy, S
2016-09-27
Chronic traumatic encephalopathy (CTE) is a neurodegenerative disorder most commonly associated with repetitive traumatic brain injury (TBI) and characterized by the presence of neurofibrillary tangles of tau protein, known as a tauopathy. Currently, the diagnosis of CTE can only be definitively established postmortem. However, a new positron emission tomography (PET) ligand, [¹⁸F]T807/AV1451, may provide antemortem detection of tau aggregates, and thus various tauopathies, including CTE. Our goal was to examine [¹⁸F]T807/AV1451 retention in athletes with neuropsychiatric symptoms associated with a history of multiple concussions. Here we report a 39-year-old retired National Football League player who suffered 22 concussions and manifested progressive neuropsychiatric symptoms. Emotional lability and irritability were the chief complaints. Serial neuropsychological exams revealed a decline in executive functioning, processing speed, and fine motor skills. Naming was below average, but other cognitive functions were preserved. Structural analysis of longitudinally acquired magnetic resonance imaging scans revealed cortical thinning in the left frontal and lateral temporal areas, as well as volume loss in the basal ganglia. PET with [¹⁸F]florbetapir was negative for amyloidosis. The [¹⁸F]T807/AV1451 PET showed multifocal areas of retention at the cortical gray matter-white matter junction, a distribution considered pathognomonic for CTE. [¹⁸F]T807/AV1451 standard uptake value (SUV) analysis showed increased uptake (SUVr ⩾ 1.1) in bilateral cingulate, occipital, and orbitofrontal cortices, and several temporal areas. Although definitive identification of the neuropathological basis for [¹⁸F]T807/AV1451 retention requires postmortem correlation, our data suggest that [¹⁸F]T807/AV1451 tauopathy imaging may be a promising tool to detect and diagnose CTE-related tauopathy in living subjects.
Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.
1995-01-01
The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
International Nuclear Information System (INIS)
Ouarda, T.B.M.J.; Charron, C.; Chebana, F.
2016-01-01
Highlights: • Review of criteria used to select probability distributions to model wind speed data. • Classical and L-moment ratio diagrams are applied to wind speed data. • The diagrams allow selection of the best distribution to model each wind speed sample. • The goodness-of-fit statistics are more consistent with the L-moment ratio diagram. - Abstract: This paper reviews the different criteria used in the field of wind energy to compare the goodness-of-fit of candidate probability density functions (pdfs) to wind speed records, and discusses their advantages and disadvantages. The moment ratio and L-moment ratio diagram methods are also proposed as alternative methods for choosing the pdfs. These two methods have the advantage of allowing an easy comparison of the fit of several pdfs for several time series (stations) on a single diagram. Plotting the position of a given wind speed data set in these diagrams is instantaneous and conveys more information than a goodness-of-fit criterion, since it provides knowledge of characteristics such as the skewness and kurtosis of the station data set. This paper studies the applicability of these two methods for the selection of pdfs for wind speed data. Both types of diagrams are used to assess the fit of the pdfs for wind speed series in the United Arab Emirates. The analysis of the moment ratio diagrams reveals that the Kappa, Log-Pearson type III and Generalized Gamma distributions best fit all the wind speed series. The Weibull is the best distribution among those with only one shape parameter. Results obtained with the diagrams are compared with those obtained with goodness-of-fit statistics, and good agreement is observed, especially in the case of the L-moment ratio diagram. It is concluded that these diagrams represent a simple and efficient approach to be used as a complementary method to goodness-of-fit criteria.
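The sample L-moment ratios that place a station on an L-moment ratio diagram can be computed from probability-weighted moments. A minimal sketch with a synthetic Weibull-like wind-speed sample (the function name and sample parameters are illustrative, not from the paper):

```python
import numpy as np

def sample_lmoment_ratios(x):
    """Sample L-skewness (t3) and L-kurtosis (t4) via unbiased
    probability-weighted moments b0..b3 of the sorted sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3)
                / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l2 = 2 * b1 - b0                       # L-scale
    l3 = 6 * b2 - 6 * b1 + b0              # third L-moment
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0  # fourth L-moment
    return l3 / l2, l4 / l2

rng = np.random.default_rng(0)
speeds = rng.weibull(2.0, size=50000)  # synthetic "wind speed" station
t3, t4 = sample_lmoment_ratios(speeds)
```

Each station contributes one (t3, t4) point; candidate pdfs appear on the diagram as theoretical points or curves, and the nearest one is the preferred fit.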
Far-from-Equilibrium Route to Superthermal Light in Bimodal Nanolasers
Marconi, Mathias; Javaloyes, Julien; Hamel, Philippe; Raineri, Fabrice; Levenson, Ariel; Yacomotti, Alejandro M.
2018-02-01
Microscale and nanoscale lasers inherently exhibit rich photon statistics due to complex light-matter interaction in a strong spontaneous emission noise background. It is well known that they may display superthermal fluctuations—photon superbunching—in specific situations due to either gain competition, leading to mode-switching instabilities, or carrier-carrier coupling in superradiant microcavities. Here we show a generic route to superbunching in bimodal nanolasers by preparing the system far from equilibrium through a parameter quench. We demonstrate, both theoretically and experimentally, that transient dynamics after a short-pump-pulse-induced quench leads to heavy-tailed superthermal statistics when projected onto the weak mode. We implement a simple experimental technique to access the probability density functions that further enables quantifying the distance from thermal equilibrium via the thermodynamic entropy. The universality of this mechanism relies on the far-from-equilibrium dynamical scenario, which can be mapped to a fast cooling process of a suspension of Brownian particles in a liquid. Our results open up new avenues to mold photon statistics in multimode optical systems and may constitute a test bed to investigate out-of-equilibrium thermodynamics using micro or nanocavity arrays.
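The photon-statistics vocabulary here can be made concrete: thermal (Bose-Einstein) light has g2(0) = 2, coherent light has g2(0) = 1, and "superthermal" or superbunched light exceeds 2. A toy estimate of g2(0) from simulated photon-number samples (this is a generic illustration, not the experimental method of the paper):

```python
import numpy as np

def g2_zero_delay(counts):
    """Estimate g2(0) = <n(n-1)> / <n>^2 from photon-number samples."""
    n = np.asarray(counts, dtype=float)
    return np.mean(n * (n - 1)) / np.mean(n) ** 2

rng = np.random.default_rng(42)
nbar = 5.0
# Bose-Einstein (thermal) photon statistics are geometric on {0, 1, 2, ...}
thermal = rng.geometric(1.0 / (1.0 + nbar), size=200000) - 1
# Coherent (Poissonian) light with the same mean, for comparison
coherent = rng.poisson(nbar, size=200000)
```

A heavy-tailed superthermal distribution of the kind reported would push the estimator above 2.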
Bimodal pollination system of the bromeliad Aechmea nudicaulis involving hummingbirds and bees.
Schmid, S; Schmid, V S; Zillikens, A; Harter-Marques, B; Steiner, J
2011-01-01
In order to compare the effectiveness of birds and insects as pollinators, we studied the floral biology of the bromeliad Aechmea nudicaulis (L.) Grisebach in the biome of the Atlantic rain forest, southern Brazil. On Santa Catarina Island, flowering extends from mid-September to the end of December, with diurnal anthesis. The reproductive system is obligatory xenogamy, thus pollinator-dependent. Flowers secrete 31.84 μl of nectar per day, with a mean sugar concentration of 23.2%. Highest nectar volume and sugar concentration occur at the beginning of anthesis. Most floral traits are characteristic for ornithophily, and nectar production appears to be adapted to the energy demand of hummingbirds. Continued secretion of the sucrose-dominated nectar attracts and binds visitors to inflorescences, strengthening trapline foraging behaviour. Experiments assessing seed set after single flower visits were performed with the most frequent visitors, revealing the hummingbird Thalurania glaucopis as the most effective pollen vector. In addition, bees are also functional pollinators, as substantiated by their high visitation frequency. We conclude that this pollination system is bimodal. Thus, there is redundancy in the pollination service provided by birds and bees, granting a high probability of successful reproduction in Ae. nudicaulis. © 2010 German Botanical Society and The Royal Botanical Society of the Netherlands.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and of the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
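A simplified sketch of the underlying idea: Monte Carlo envelopes for the order statistics of a standard normal sample. Note these are pointwise intervals, not the exact simultaneous 1-α intervals constructed in the paper; names and parameters are illustrative:

```python
import numpy as np

def qq_envelope(n, alpha=0.05, reps=2000, rng=None):
    """Pointwise Monte Carlo envelope for the order statistics of
    standard normal samples of size n."""
    rng = rng or np.random.default_rng(0)
    sims = np.sort(rng.standard_normal((reps, n)), axis=1)
    lo = np.quantile(sims, alpha / 2, axis=0)
    hi = np.quantile(sims, 1 - alpha / 2, axis=0)
    return lo, hi

n = 50
lo, hi = qq_envelope(n)
# Plot lo and hi against the theoretical normal quantiles; a standardized
# sample is judged "close to the line" when its sorted values fall inside.
```

The paper's contribution is to calibrate such intervals so the *joint* coverage is exactly 1-α, which pointwise envelopes like this one do not guarantee.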
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Vajpai, Sanjay Kumar; Sawangrat, Choncharoen; Yamaguchi, Osamu; Ciuca, Octav Paul; Ameyama, Kei
2016-01-01
In the present work, Co-Cr-Mo alloy compacts with a unique bimodal microstructural design, harmonic structure design, were successfully prepared via a powder metallurgy route consisting of controlled mechanical milling of pre-alloyed powders followed by spark plasma sintering. The harmonic structured Co-Cr-Mo alloy with bimodal grain size distribution exhibited relatively higher strength together with higher ductility as compared to the coarse-grained specimens. The harmonic Co-Cr-Mo alloy exhibited a very complex deformation behavior wherein it was found that the higher strength and the high retained ductility are derived from fine-grained shell and coarse-grained core regions, respectively. Finally, it was observed that the peculiar spatial/topological arrangement of stronger fine-grained and ductile coarse-grained regions in the harmonic structure promotes uniformity of strain distribution, leading to improved mechanical properties by suppressing the localized plastic deformation during straining. Copyright © 2015 Elsevier B.V. All rights reserved.
Probability measures on metric spaces
Parthasarathy, K R
2005-01-01
In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom
Nuclear data uncertainties: I, Basic concepts of probability
International Nuclear Information System (INIS)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
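Among the topics listed is Bayes' theorem; in the spirit of the report's simple examples, a minimal numerical illustration (all numbers invented for illustration):

```python
# Bayes' theorem: a detector flags a contaminated sample with
# P(flag | contam) = 0.95, false-flags a clean one with
# P(flag | clean) = 0.05, and contamination has prior P(contam) = 0.01.
p_contam = 0.01
p_flag_given_contam = 0.95
p_flag_given_clean = 0.05

# Total probability of a flag, then the posterior by Bayes' theorem
p_flag = (p_flag_given_contam * p_contam
          + p_flag_given_clean * (1 - p_contam))
p_contam_given_flag = p_flag_given_contam * p_contam / p_flag
print(p_contam_given_flag)  # ~0.161: with a rare condition, a flag alone is weak evidence
```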
Bimodal atomic force microscopy imaging of isolated antibodies in air and liquids
Energy Technology Data Exchange (ETDEWEB)
Martínez, N F; Lozano, J R; Herruzo, E T; García, F; García, R [Instituto de Microelectrónica de Madrid, CSIC, Isaac Newton 8, 28760 Tres Cantos, Madrid (Spain); Richter, C; Sulzbach, T [NanoWorld Services GmbH, Schottkystrasse 10, 91058 Erlangen (Germany)], E-mail: rgarcia@imm.cnm.csic.es
2008-09-24
We have developed a dynamic atomic force microscopy (AFM) method based on the simultaneous excitation of the first two flexural modes of the cantilever. The instrument, called a bimodal atomic force microscope, allows us to resolve the structural components of antibodies in both monomer and pentameric forms. The instrument operates in both high and low quality factor environments, i.e., air and liquids. We show that under the same experimental conditions, bimodal AFM is more sensitive to compositional changes than amplitude modulation AFM. By using theoretical and numerical methods, we study the material contrast sensitivity as well as the forces applied on the sample during bimodal AFM operation.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
Directory of Open Access Journals (Sweden)
José Alves Junqueira Júnior
2007-06-01
Irrigation is nowadays one of the main techniques at the service of agriculture. However, treating irrigation as the only source for meeting crop water demand can lead to over-designed systems, which raises installation costs. One alternative for solving this problem is to consider rainfall at a given probability level, i.e., the probable rainfall, which makes complementary irrigation possible. The objective of this work was therefore to characterize the probable rainfall in the region of the municipality of Madre de Deus, MG, Brazil, comparing four frequency distribution models (Gamma, Normal, and 2- and 3-parameter Log-normal). Daily rainfall depths were totalled over periods of 10, 15 and 30 days and evaluated at 13 different probability levels, for historical series of 57 years of observation, between 1942 and 1999. The Kolmogorov-Smirnov test was applied to assess the adequacy of the models and to determine which model is most suitable for each historical series. The probability models fitted the rainy season best, the 3-parameter Log-normal distribution being the most adequate for the monthly series and the Gamma distribution for the 15-day and 10-day periods.
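The probable-rainfall idea, the depth equalled or exceeded with a chosen probability, can also be sketched distribution-free with an empirical quantile, instead of the fitted Gamma/Log-normal models used in the study; the data below are synthetic and purely illustrative:

```python
import numpy as np

def probable_rainfall(totals, p):
    """Depth (mm) equalled or exceeded with probability p:
    the empirical (1 - p)-quantile of the period totals."""
    return np.quantile(np.asarray(totals, dtype=float), 1.0 - p)

rng = np.random.default_rng(7)
# synthetic 57 "years" of 10-day rainfall totals (mm), gamma-like shape
totals = rng.gamma(shape=2.0, scale=25.0, size=57)
for p in (0.50, 0.75, 0.90):
    print(f"P = {p:.0%}: {probable_rainfall(totals, p):.1f} mm")
```

As in the study, a higher assurance level yields a smaller design depth, which is what allows complementary irrigation systems to be sized below the full crop demand.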
Utterance independent bimodal emotion recognition in spontaneous communication
Tao, Jianhua; Pan, Shifeng; Yang, Minghao; Li, Ya; Mu, Kaihui; Che, Jianfeng
2011-12-01
Emotion expressions are sometimes mixed with the utterance expression in spontaneous face-to-face communication, which complicates emotion recognition. This article introduces methods for reducing the utterance influences in visual parameters for audio-visual-based emotion recognition. The audio and visual channels are first combined under a Multistream Hidden Markov Model (MHMM). The utterance influence is then reduced by finding the residual between the real visual parameters and the outputs of the utterance-related visual parameters. This article introduces the Fused Hidden Markov Model Inversion method, trained on a neutrally expressed audio-visual corpus, to solve the problem. To reduce the computational complexity, the inversion model is further simplified to a Gaussian Mixture Model (GMM) mapping. Compared with traditional bimodal emotion recognition methods (e.g., SVM, CART, Boosting), the utterance reduction method gives better emotion recognition results. The experiments also show the effectiveness of our emotion recognition system when used in a live environment.
Plastic bimodal xylogenesis in conifers from continental Mediterranean climates.
Camarero, Jesús Julio; Olano, José Miguel; Parras, Alfonso
2010-01-01
*Seasonal radial-increment and xylogenesis data can help to elucidate how climate modulates wood formation in conifers. Few xylogenesis studies have assessed how plastic xylogenesis is in sympatric conifer species from continental Mediterranean areas, where low winter temperatures and summer drought constrain growth. *Here, we analysed intra-annual patterns of secondary growth in sympatric conifer species (Juniperus thurifera, Pinus halepensis and Pinus sylvestris). Two field sites (xeric and mesic) were evaluated using dendrometers, microcores and climatic data. *A bimodal pattern of xylogenesis characterized by spring and autumn precipitation and subsequent cambial reactivation was detected in J. thurifera at both study sites and in P. halepensis at the xeric site, but was absent in P. sylvestris where growth was largely controlled by day length. In the xeric site J. thurifera exhibited an increased response to water availability in autumn relative to P. halepensis and summer cambial suppression was more marked in J. thurifera than in P. halepensis. *Juniperus thurifera exhibited increased plasticity in its xylogenesis pattern compared with sympatric pines, enabling this species to occupy sites with more variable climatic conditions. The plastic xylogenesis patterns of junipers in drought-stressed areas may also provide them with a competitive advantage against co-occurring pines.
Time-predictable bimodal volcanism in the Coso Range, California
Bacon, Charles R.
1982-01-01
The bimodal Pleistocene part of the Coso volcanic field has erupted rhyolite and basalt at constant long-term rates during the past ∼0.5 m.y. Both basalt and high-silica rhyolite were erupted in several independent, geologically brief episodes. The interval between eruptions of rhyolite was proportional to the volume of the preceding eruption. Basaltic eruptions appear to have followed a similar pattern. These time-predictable relations would be expected if (1) extensional strain accumulates in roof rocks at a constant rate, (2) the accumulated strain is relieved by near-vertical fractures, which serve as conduits for eruptions, and (3) the volume of erupted material is proportional to the sum of the conduit (dike) widths. The long-term eruption rate of rhyolite is about 5.4 km3/m.y.; that of basalt is about 2.8 km3/m.y. These rates are less than those of magma supply inferred from heat-flow and petrologic arguments by factors of between 100 and 200.
On Probability Distribution Over Possible Worlds
1988-03-08
is a minimal non-zero element (Bell and Machover [7]). 3 First Order Languages. When the move is made to first order languages certain problems... C. C. Chang and H. J. Keisler. Model Theory. North-Holland, Amsterdam, 1973. [7] John Bell and Moshé Machover. A Course in Mathematical Logic. Elsevier
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
SUBARU WEAK-LENSING STUDY OF A2163: BIMODAL MASS STRUCTURE
International Nuclear Information System (INIS)
Okabe, N.; Bourdin, H.; Mazzotta, P.; Maurogordato, S.
2011-01-01
We present a weak-lensing analysis of the merging cluster A2163 using Subaru/Suprime-Cam and CFHT/Mega-Cam data and discuss the dynamics of this cluster merger, based on complementary weak-lensing, X-ray, and optical spectroscopic data sets. From two-dimensional multi-component weak-lensing analysis, we reveal that the cluster mass distribution is well described by three main components: the two-component main cluster A2163-A, with mass ratio 1:8, and its satellite cluster A2163-B. The bimodal mass distribution in A2163-A is similar to the galaxy density distribution, but appears spatially segregated from the brightest X-ray emitting gas region. We discuss the possible origins of this gas-dark-matter offset and suggest that the gas core of the A2163-A subcluster has been stripped away by ram pressure from its dark matter component. The survival of this gas core against the tidal forces exerted by the main cluster lets us infer a subcluster accretion with a non-zero impact parameter. Dominated by the most massive component of A2163-A, the mass distribution of A2163 is well described by a universal Navarro-Frenk-White profile, as shown by a one-dimensional tangential shear analysis, while the singular isothermal sphere profile is strongly ruled out. Comparing this cluster mass profile with profiles derived assuming intracluster medium hydrostatic equilibrium (H.E.) in two opposite regions of the cluster atmosphere has allowed us to confirm the prediction of a departure from H.E. in the eastern cluster side, presumably due to shock heating. Yielding a cluster mass estimate of M_500 = 11.18 (+1.64/−1.46) × 10^14 h^−1 M_⊙, our mass profile confirms the exceptionally high mass of A2163, consistent with previous analyses relying on the cluster dynamical analysis and the Y_X mass proxy.
Stochastic Modeling of Climatic Probabilities.
1979-11-01
...students who contributed in a major way to the success of the project are Sarah Autrey, Jeff Emerson, Karl Grammel, Tom Licknor and Debbie Waite. ...The sophistication and cost of weapons systems and the recognition that the environment degrades or offers opportunities has led to the requirement for... First, make a histogram of the data, and then "smooth" the histogram to obtain a frequency distribution (probability density function).
Ghosh, Indranil
2011-01-01
Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X, are available. We…
Lara-Padilla, Hernan; Mendoza-Buenrostro, Christian; Cardenas, Diego; Rodriguez-Garcia, Aida; Rodriguez, Ciro A.
2017-01-01
The combination of different materials and capabilities to manufacture at several scales open new possibilities in scaffold design for bone regeneration. This work is focused on bimodal scaffolds that combine polylactic acid (PLA) melt extruded strands with polycaprolactone (PCL) electrospun fibers. This type of bimodal scaffold offers better mechanical properties, compared to the use of PCL for the extruded strands, and provides potential a means for controlled drug and/or growth factor deli...
Kong, Xianqi; Brinkmann, Andreas; Terskikh, Victor; Wasylishen, Roderick E; Bernard, Guy M; Duan, Zhuang; Wu, Qichao; Wu, Gang
2016-11-17
We report a combined solid-state (1H, 2H, 13C, 17O) NMR and plane-wave density functional theory (DFT) computational study of the O···H···O low-barrier hydrogen bonds (LBHBs) in two 1,3-diketone compounds: dibenzoylmethane (1) and curcumin (2). In the solid state, both 1 and 2 exist in the cis-keto-enol tautomeric form, each exhibiting an intramolecular LBHB with a short O···O distance (2.435 Å in 1 and 2.455 Å in 2). Whereas numerous experimental (structural and spectroscopic) and computational studies have been reported for the enol isomers of 1,3-diketones, a unified picture about the proton location within an LBHB is still lacking. This work reports for the first time the solid-state 17O NMR data for the O···H···O LBHBs in 1,3-diketones. The central conclusion of this work is that detailed information about the probability density distribution of the proton (nuclear zero-point motion) across an LBHB can be obtained from a combination of solid-state NMR and plane-wave DFT computations (both NMR parameter calculations and ab initio molecular dynamics simulations). We propose that the precise proton probability distribution across an LBHB should provide a common basis on which different and sometimes seemingly contradicting experimental results obtained from complementary techniques, such as X-ray diffraction, neutron diffraction, and solid-state NMR, can be reconciled.
Teachers' Understandings of Probability
Liu, Yan; Thompson, Patrick
2007-01-01
Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
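For the special case of i.i.d. Gumbel shocks (the multinomial logit model), the CPGF is the log-sum-exp of the utilities, and its gradient recovers the choice probabilities. A minimal numeric sketch of this one case (the finite-difference check is illustrative only, not part of the papers' method):

```python
import numpy as np

def cpgf(u):
    """Logit choice-probability generating function: G(u) = log sum_j exp(u_j),
    computed with the max-shift trick for numerical stability."""
    m = np.max(u)
    return m + np.log(np.sum(np.exp(u - m)))

def choice_probs(u):
    """Gradient of the CPGF = multinomial logit (softmax) probabilities."""
    e = np.exp(u - np.max(u))
    return e / e.sum()

u = np.array([1.0, 0.0, -0.5])  # systematic utilities of three alternatives
p = choice_probs(u)

# central finite differences confirm p is the gradient of G at u
eps = 1e-6
grad = np.array([
    (cpgf(u + eps * np.eye(3)[j]) - cpgf(u - eps * np.eye(3)[j])) / (2 * eps)
    for j in range(3)
])
```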
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Multiple regimes of operation in bimodal AFM: understanding the energy of cantilever eigenmodes
Directory of Open Access Journals (Sweden)
Daniel Kiracofe
2013-06-01
One of the key goals in atomic force microscopy (AFM) imaging is to enhance material property contrast with high resolution. Bimodal AFM, where two eigenmodes are simultaneously excited, confers significant advantages over conventional single-frequency tapping mode AFM due to its ability to provide contrast between regions with different material properties under gentle imaging conditions. Bimodal AFM traditionally uses the first two eigenmodes of the AFM cantilever. In this work, the authors explore the use of higher eigenmodes in bimodal AFM (e.g., exciting the first and fourth eigenmodes). It is found that such operation leads to interesting contrast reversals compared to traditional bimodal AFM. A series of experiments and numerical simulations shows that the primary cause of the contrast reversals is not the choice of eigenmode itself (e.g., second versus fourth), but rather the relative kinetic energy between the higher eigenmode and the first eigenmode. This leads to the identification of three distinct imaging regimes in bimodal AFM. This result, which is applicable even to traditional bimodal AFM, should allow researchers to choose cantilever and operating parameters in a more rational manner in order to optimize resolution and contrast during nanoscale imaging of materials.
BDVC (Bimodal Database of Violent Content): A database of violent audio and video
Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro
2017-09-01
Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization and retrieval applications of a single type of content, like text, voice or images, whereas bimodal databases allow two different types of content, like audio-video or image-text, among others, to be associated semantically. The generation of a bimodal audio-video database implies the creation of a connection between the multimedia content through the semantic relation that associates the actions of both types of information. This paper describes in detail the characteristics and methodology used for the creation of the bimodal database of violent content; the semantic relationship is established by the proposed concepts that describe the audiovisual information. The use of bimodal databases in applications related to audiovisual content processing improves semantic performance if and only if these applications process both types of content. This bimodal database contains 580 annotated audiovisual segments, with a total duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool in the generation of applications for the semantic web.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...
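The paper treats the first-passage probability analytically via integral equations; a brute-force Monte Carlo counterpart for a toy lightly damped linear oscillator under white-noise excitation (all parameters invented for illustration) can serve as a sanity check on such approximations:

```python
import numpy as np

def first_passage_prob(barrier, t_end, dt=0.01, paths=4000, zeta=0.05, rng=None):
    """Monte Carlo estimate of P(max X(t) > barrier on [0, t_end]) for a
    unit-frequency oscillator x'' + 2*zeta*x' + x = w(t), w = unit white noise,
    started at rest, integrated with semi-implicit Euler-Maruyama."""
    rng = rng or np.random.default_rng(0)
    n_steps = int(t_end / dt)
    x = np.zeros(paths)
    v = np.zeros(paths)
    crossed = np.zeros(paths, dtype=bool)
    for _ in range(n_steps):
        a = -2.0 * zeta * v - x                      # restoring + damping
        v = v + a * dt + np.sqrt(dt) * rng.standard_normal(paths)
        x = x + v * dt
        crossed |= x > barrier                       # record up-crossings
    return crossed.mean()

p_short = first_passage_prob(3.0, 10.0)
p_long = first_passage_prob(3.0, 50.0)
```

The failure probability is monotone in the exposure time, which is the quantity the integral-equation approximations of the paper estimate far more cheaply.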
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Energy Technology Data Exchange (ETDEWEB)
Balokovic, M. [Department of Astronomy, California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Smolcic, V. [Argelander-Institut fuer Astronomie, Auf dem Hugel 71, D-53121 Bonn (Germany); Ivezic, Z. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Zamorani, G. [INAF-Osservatorio Astronomico di Bologna, via Ranzani 1, I-40127 Bologna (Italy); Schinnerer, E. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Kelly, B. C. [Department of Physics, Broida Hall, University of California, Santa Barbara, CA 93106 (United States)
2012-11-01
We investigate the dichotomy in the radio loudness distribution of quasars by modeling their radio emission and various selection effects using a Monte Carlo approach. The existence of two physically distinct quasar populations, the radio-loud and radio-quiet quasars, is controversial and over the last decade a bimodal distribution of radio loudness of quasars has been both affirmed and disputed. We model the quasar radio luminosity distribution with simple unimodal and bimodal distribution functions. The resulting simulated samples are compared to a fiducial sample of 8300 quasars drawn from the SDSS DR7 Quasar Catalog and combined with radio observations from the FIRST survey. Our results indicate that the SDSS-FIRST sample is best described by a radio loudness distribution which consists of two components, with (12 ± 1)% of sources in the radio-loud component. On the other hand, the evidence for a local minimum in the loudness distribution (bimodality) is not strong and we find that previous claims for its existence were probably affected by the incompleteness of the FIRST survey close to its faint limit. We also investigate the redshift and luminosity dependence of the radio loudness distribution and find tentative evidence that at high redshift radio-loud quasars were rarer, on average louder, and exhibited a smaller range in radio loudness. In agreement with other recent work, we conclude that the SDSS-FIRST sample strongly suggests that the radio loudness distribution of quasars is not a universal function, and that more complex models than presented here are needed to fully explain available observations.
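The unimodal-versus-bimodal comparison described in this abstract can be illustrated with a small two-component Gaussian mixture fit. This is a hypothetical sketch, not the authors' actual pipeline: the radio-loudness values, component parameters and the simulated 12% radio-loud fraction are made up for illustration, and the fit uses a minimal hand-rolled EM loop.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical radio-loudness values: a two-component mixture with 12%
# of sources in the "radio-loud" component, echoing the quoted fraction.
R = np.concatenate([rng.normal(-1.0, 0.6, 8800), rng.normal(1.5, 0.8, 1200)])

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Minimal EM loop for a two-component Gaussian mixture (illustrative only).
w, mu, sd = np.array([0.5, 0.5]), np.array([-2.0, 2.0]), np.array([1.0, 1.0])
for _ in range(200):
    resp = w * norm_pdf(R[:, None], mu, sd)      # E-step: responsibilities
    resp /= resp.sum(axis=1, keepdims=True)
    n = resp.sum(axis=0)                         # M-step: refit parameters
    w, mu = n / len(R), (resp * R[:, None]).sum(axis=0) / n
    sd = np.sqrt((resp * (R[:, None] - mu) ** 2).sum(axis=0) / n)

print(round(float(min(w)), 2))  # recovered radio-loud fraction, near 0.12
```

With well-separated components the EM loop recovers the simulated radio-loud weight; a real analysis would also have to model the survey selection effects the abstract emphasizes.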
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
2014-06-30
set of methods, many of which have their origin in probability in Banach spaces, that arise across a broad range of contemporary problems in different...salesman problem, ... • Probability in Banach spaces: probabilistic limit theorems for Banach-valued random variables, empirical processes, local...theory of Banach spaces, geometric functional analysis, convex geometry. • Mixing times and other phenomena in high-dimensional Markov chains. At
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density is related to the time spent on small sections of an orbit, just as the probability density is in some classical contexts.
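The classical picture invoked at the end of this abstract, probability density as time spent on sections of an orbit, can be checked numerically. A sketch for a harmonic oscillator (all parameters illustrative): the fraction of time spent near a position x is proportional to 1/|v(x)|, which gives the arcsine density peaking at the turning points.

```python
import math

# Time-average view of probability density: sample a harmonic-oscillator
# orbit x(t) = A*sin(t) at uniform times and histogram the positions.
A, steps, nbins = 1.0, 200_000, 20
counts = [0] * nbins
for i in range(steps):
    x = A * math.sin(2.0 * math.pi * i / steps)
    b = min(int((x + A) / (2.0 * A) * nbins), nbins - 1)
    counts[b] += 1

density = [c / steps * nbins / (2.0 * A) for c in counts]
# Time spent near x is proportional to 1/|v(x)|, i.e. the arcsine law:
# the density peaks at the turning points and dips at the orbit center.
print(density[0] > density[nbins // 2])
```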
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
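The decay algorithm described in this abstract can be illustrated as a Bayesian update: given a prior event probability and the distribution of delay times from X-ray peak to SEP onset, the forecast probability drops as time passes with no onset. The lognormal delay distribution, its 8-hour median, and the function name below are assumptions made for illustration, not the parameters fit to the NOAA event list.

```python
import math

def dynamic_sep_probability(p0, t_hours, median_delay=8.0, sigma=1.0):
    """Decay a prior SEP event probability p0 as no onset is observed.

    Illustrative sketch: delay times from flare peak to the 10 pfu
    threshold are assumed lognormal with an 8 h median; the published
    algorithm is instead derived from the NOAA 1976-2014 event list.
    """
    if t_hours <= 0:
        return p0
    # F = fraction of events whose onset delay is below t_hours
    z = (math.log(t_hours) - math.log(median_delay)) / sigma
    F = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    surviving = p0 * (1.0 - F)          # event still pending
    return surviving / (surviving + (1.0 - p0))

# The forecast decays toward zero as hours pass with no SEP onset.
print([round(dynamic_sep_probability(0.5, t), 3) for t in (0, 8, 24, 72)])
```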
Random phenomena fundamentals of probability and statistics for engineers
Ogunnaike, Babatunde A
2009-01-01
Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...
Lanthanide oxide and phosphate nanoparticles for thermometry and bimodal imaging =
Debasu, Mengistie Leweyehu
. Finally, the photoluminescence properties corresponding to energy upconversion and downconversion in (Gd,Yb,Tb)PO4 nanocrystals synthesized by the hydrothermal route are studied. The 1H relaxivity (magnetic resonance) of these materials is investigated, with a view to possible applications in bimodal imaging (luminescence and nuclear magnetic resonance).
Applications of Skew Models Using Generalized Logistic Distribution
Directory of Open Access Journals (Sweden)
Pushpa Narayan Rathie
2016-04-01
We use the skew distribution generation procedure proposed by Azzalini [Scand. J. Stat., 1985, 12, 171–178] to create three new probability distribution functions. These models make use of the normal, Student-t and generalized logistic distributions; see Rathie and Swamee [Technical Research Report No. 07/2006. Department of Statistics, University of Brasilia: Brasilia, Brazil, 2006]. Expressions for the moments about the origin are derived. Graphical illustrations are also provided. The distributions derived in this paper can be seen as generalizations of the distributions given by Nadarajah and Kotz [Acta Appl. Math., 2006, 91, 1–37]. Applications with unimodal and bimodal data are given to illustrate the applicability of the results derived in this paper. The applications include the analysis of the following data sets: (a) spending on public education in various countries in 2003; (b) total expenditure on health in 2009 in various countries; and (c) waiting time between eruptions of the Old Faithful Geyser in Yellowstone National Park, Wyoming, USA. We compare the fit of the distributions introduced in this paper with the distributions given by Nadarajah and Kotz [Acta Appl. Math., 2006, 91, 1–37]. The results show that our distributions, in general, fit the data sets better. The general R codes for fitting the distributions introduced in this paper are given in Appendix A.
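Azzalini's skewing procedure used in this paper takes a symmetric base density g with CDF G and forms f(x) = 2 g(x) G(αx). A minimal Python sketch with the standard logistic distribution as the base kernel (an assumption for brevity; the paper's kernels are normal, Student-t and generalized logistic, with R code in its Appendix A); the numerical check confirms the skewed density still integrates to 1.

```python
import math

def logistic_pdf(x):
    e = math.exp(-x)
    return e / (1.0 + e) ** 2

def logistic_cdf(x):
    return 1.0 / (1.0 + math.exp(-x))

def skew_logistic_pdf(x, alpha):
    # Azzalini-type skewing: f(x) = 2 * g(x) * G(alpha * x)
    return 2.0 * logistic_pdf(x) * logistic_cdf(alpha * x)

# Numerical check: the skewed density integrates to 1 for any alpha.
h = 40.0 / 20000
total = sum(skew_logistic_pdf(-20.0 + i * h, alpha=3.0)
            for i in range(20001)) * h
print(round(total, 4))
```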
Spataru, Aurel
2013-01-01
Probability theory is a rapidly expanding field and is used in many areas of science and technology. Beginning from a basis of abstract analysis, this mathematics book develops the knowledge needed for advanced students to develop a complex understanding of probability. The first part of the book systematically presents concepts and results from analysis before embarking on the study of probability theory. The initial section will also be useful for those interested in topology, measure theory, real analysis and functional analysis. The second part of the book presents the concepts, methodology and fundamental results of probability theory. Exercises are included throughout the text, not just at the end, to teach each concept fully as it is explained, including presentations of interesting extensions of the theory. The complete and detailed nature of the book makes it ideal as a reference book or for self-study in probability and related fields. It covers a wide range of subjects including f-expansions, Fuk-N...
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Marks, Kendra L; Martel, David T; Wu, Calvin; Basura, Gregory J; Roberts, Larry E; Schvartz-Leyzac, Kara C; Shore, Susan E
2018-01-03
The dorsal cochlear nucleus is the first site of multisensory convergence in mammalian auditory pathways. Principal output neurons, the fusiform cells, integrate auditory nerve inputs from the cochlea with somatosensory inputs from the head and neck. In previous work, we developed a guinea pig model of tinnitus induced by noise exposure and showed that the fusiform cells in these animals exhibited increased spontaneous activity and cross-unit synchrony, which are physiological correlates of tinnitus. We delivered repeated bimodal auditory-somatosensory stimulation to the dorsal cochlear nucleus of guinea pigs with tinnitus, choosing a stimulus interval known to induce long-term depression (LTD). Twenty minutes per day of LTD-inducing bimodal (but not unimodal) stimulation reduced physiological and behavioral evidence of tinnitus in the guinea pigs after 25 days. Next, we applied the same bimodal treatment to 20 human subjects with tinnitus using a double-blinded, sham-controlled, crossover study. Twenty-eight days of LTD-inducing bimodal stimulation reduced tinnitus loudness and intrusiveness. Unimodal auditory stimulation did not deliver either benefit. Bimodal auditory-somatosensory stimulation that induces LTD in the dorsal cochlear nucleus may hold promise for suppressing chronic tinnitus, which reduces quality of life for millions of tinnitus sufferers worldwide. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
Early Bimodal Stimulation Benefits Language Acquisition for Children With Cochlear Implants.
Moberly, Aaron C; Lowenstein, Joanna H; Nittrouer, Susan
2016-01-01
Adding a low-frequency acoustic signal to the cochlear implant (CI) signal (i.e., bimodal stimulation) for a period of time early in life improves language acquisition. Children must acquire sensitivity to the phonemic units of language to develop most language-related skills, including expressive vocabulary, working memory, and reading. Acquiring sensitivity to phonemic structure depends largely on having refined spectral (frequency) representations available in the signal, which does not happen with CIs alone. Combining the low-frequency acoustic signal available through hearing aids with the CI signal can enhance signal quality. A period with this bimodal stimulation has been shown to improve language skills in very young children. This study examined whether these benefits persist into childhood. Data were examined for 48 children with CIs implanted under age 3 years, participating in a longitudinal study. All children wore hearing aids before receiving a CI, but upon receiving a first CI, 24 children had at least 1 year of bimodal stimulation (Bimodal group), and 24 children had only electric stimulation subsequent to implantation (CI-only group). Measures of phonemic awareness were obtained at second and fourth grades, along with measures of expressive vocabulary, working memory, and reading. Children in the Bimodal group generally performed better on measures of phonemic awareness, and that advantage was reflected in other language measures. Having even a brief period of time early in life with combined electric-acoustic input provides benefits to language learning into childhood, likely because of the enhancement in spectral representations provided.
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
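The pinching idea described in this abstract can be sketched in a few lines: propagate interval bounds on component probabilities through a calculation, then replace ("pinch") one input with a precise value and measure how much the output interval shrinks. The series-system example and all numbers below are hypothetical, and dependence-free Fréchet bounds stand in for the full PBA machinery.

```python
def union_bounds(pa, pb):
    """Dependence-free (Fréchet) bounds on P(A or B) when each failure
    probability is only known to lie in an interval (lo, hi)."""
    return (max(pa[0], pb[0]), min(1.0, pa[1] + pb[1]))

# Hypothetical component failure-probability intervals.
pa, pb = (0.01, 0.10), (0.02, 0.05)
base = union_bounds(pa, pb)

# "Pinching" input A to a precise value shows how much of the output
# uncertainty that input is responsible for.
pinched = union_bounds((0.05, 0.05), pb)
reduction = 1.0 - (pinched[1] - pinched[0]) / (base[1] - base[0])
print(base, pinched, round(reduction, 2))
```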
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Energy Technology Data Exchange (ETDEWEB)
Keskin, Mustafa [Department of Physics, Erciyes University, 38039 Kayseri (Turkey)], E-mail: keskin@erciyes.edu.tr; Canko, Osman [Department of Physics, Erciyes University, 38039 Kayseri (Turkey)
2008-01-15
We study the thermal variations of the ferromagnetic spin-3/2 Blume-Emery-Griffiths (BEG) model with repulsive biquadratic coupling by using the lowest approximation of the cluster variation method (LACVM) in the absence and presence of an external magnetic field. We obtain metastable and unstable branches of the order parameters besides the stable branches, and the phase transitions of these branches are investigated extensively. The classification of the stable, metastable and unstable states is made by comparing the free energy values of these states. We also study the dynamics of the model by using the path probability method (PPM) with the point distribution in order to make sure that we find and define the metastable and unstable branches of the order parameters completely and correctly. We present the metastable phase diagrams in addition to the equilibrium phase diagrams in the (kT/J, K/J) and (kT/J, D/J) planes. It is found that the metastable phase diagrams always exist at low temperatures, which is consistent with experimental and theoretical works.
International Nuclear Information System (INIS)
Keskin, Mustafa; Canko, Osman
2008-01-01
We study the thermal variations of the ferromagnetic spin-3/2 Blume-Emery-Griffiths (BEG) model with repulsive biquadratic coupling by using the lowest approximation of the cluster variation method (LACVM) in the absence and presence of an external magnetic field. We obtain metastable and unstable branches of the order parameters besides the stable branches, and the phase transitions of these branches are investigated extensively. The classification of the stable, metastable and unstable states is made by comparing the free energy values of these states. We also study the dynamics of the model by using the path probability method (PPM) with the point distribution in order to make sure that we find and define the metastable and unstable branches of the order parameters completely and correctly. We present the metastable phase diagrams in addition to the equilibrium phase diagrams in the (kT/J, K/J) and (kT/J, D/J) planes. It is found that the metastable phase diagrams always exist at low temperatures, which is consistent with experimental and theoretical works.
African Journals Online (AJOL)
Willem Scholtz
“quite probably, also the end of Angola's existence as an independent country”. It went on: “The victory at Cuito Cuanavale for the liberation forces and their Cuban compatriots was therefore decisive in consolidating Angola's independence and achieving that of Namibia.” Therefore, when reflecting on the events, “it is not ...
Indian Academy of Sciences (India)
IAS Admin
He spends several months in India visiting schools, colleges and universities. He enjoys teaching mathematics and statistics at all levels. He loves Indian classical and folk music. This issue of Resonance features Joseph Leonard Doob, who played a critical role in the development of probability theory in the world from.
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for the prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 3; Issue 4. The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics Volume 3 Issue 4 April 1998 pp 103-112. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112. Author Affiliations.
Probability Theory Without Tears!
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 2. Probability Theory Without Tears! S Ramasubramanian. Book Review Volume 1 Issue 2 February 1996 pp 115-116. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116 ...
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...
Indian Academy of Sciences (India)
OF PROBABILITY *. The simplest laws of natural science are those that state the conditions under which some event of interest to us will either certainly occur or certainly not occur; i.e., these conditions may be expressed in one of the following two forms: 1. If a complex (i.e., a set or collection) of conditions S is realized, then.
Application of a bi-modal PBR nuclear propulsion and power system to military missions
Venetoklis, Peter S.
1995-01-01
The rapid proliferation of arms technology and space access, combined with current economic realities in the United States, is creating ever greater demands for more capable space-based military assets. The paper illustrates that bi-modal nuclear propulsion and power based on the Particle Bed Reactor (PBR) is a high-leverage technology that can maximize utility while minimizing cost. Mission benefits offered by the bi-modal PBR, including enhanced maneuverability, lifetime, survivability, payload power, and operational flexibility, are discussed. The ability to deliver desired payloads on smaller boosters is also illustrated. System descriptions and parameters for 10 kWe and 100 kWe power output levels are summarized. It is demonstrated via a design exercise that the bi-modal PBR dramatically enhances the performance of a military satellite in geosynchronous orbit, increasing payload mass, payload power, and maneuverability.
A bimodal power and propulsion system based on cermet fuel and heat pipe energy transport
International Nuclear Information System (INIS)
Polansky, G.F.; Gunther, N.A.; Rochow, R.F.; Bixler, C.H.
1995-01-01
Bimodal space reactor systems provide both thermal propulsion for the spacecraft orbital transfer and electrical power to the spacecraft bus once it is on station. These systems have the potential to increase both the available payload in high energy orbits and the available power to that payload. These increased mass and power capabilities can be used to either reduce mission cost by permitting the use of smaller launch vehicles or to provide increased mission performance from the current launch vehicle. A major barrier to the deployment of these bimodal systems has been the cost associated with their development. This paper describes a bimodal reactor system with performance potential to permit more than 70% of the instrumented payload of the Titan IV/Centaur to be launched from the Atlas IIAS. The development cost is minimized by basing the design on existing component technologies
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
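The frequency-based estimate described in this calculation can be sketched as follows. The event counts below are hypothetical placeholders (the actual figures come from the Framatome ANP 2001a report); the sketch shows a per-movement rate and its use in a campaign-level probability under an independence assumption.

```python
# Hypothetical counts standing in for the categorized fuel-handling events;
# the real figures come from the Framatome ANP 2001a report.
misloads, total_movements = 3, 600_000

p = misloads / total_movements          # per-movement misload probability
# Probability of at least one misload over a campaign of n movements,
# assuming movements are independent.
n = 10_000
p_at_least_one = 1.0 - (1.0 - p) ** n
print(f"{p:.1e} per movement; {p_at_least_one:.3f} over {n} movements")
```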
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
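The Monte Carlo principle mentioned above can be sketched in a few lines: draw samples from assumed input distributions, propagate them through the measurement model, and read a 95 percent interval off the simulated output. The model (a resistance measured as voltage over current) and the input uncertainties are illustrative assumptions, not taken from the book.

```python
import random
import statistics

# Minimal Monte Carlo sketch: propagate assumed input uncertainties
# through a measurement model and extract a 95% interval.
random.seed(1)

N = 100_000
samples = []
for _ in range(N):
    voltage = random.gauss(5.00, 0.02)    # volts; assumed standard uncertainty
    current = random.gauss(0.100, 0.001)  # amperes; assumed standard uncertainty
    samples.append(voltage / current)     # measurand: resistance in ohms

samples.sort()
lo = samples[int(0.025 * N)]  # 2.5th percentile of the simulated output
hi = samples[int(0.975 * N)]  # 97.5th percentile
print(f"R ~ {statistics.mean(samples):.2f} ohm, "
      f"95% interval [{lo:.2f}, {hi:.2f}] ohm")
```

The interval here is a shortest-coverage question in general; taking symmetric percentiles is the simplest common choice.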
Directory of Open Access Journals (Sweden)
Carlos Alberto Viliotti
2012-05-01
Full Text Available Great emphasis is currently placed on seed metering mechanisms, which must meet increasingly strict demands on the longitudinal distribution of seeds, in particular regarding the rates of skips, broken seeds, and double seeds. Evaluating these variables demands considerable time and effort, both for data collection and for processing. The objective of this work was to propose the use of normal probability plots to simplify the treatment of the data and reduce processing time. The evaluation methodology consists of counting broken seeds, skips, and doubles from measurements of the spacing between seeds. Preliminary experiments were carried out with combinations of treatments whose variation factors were the level of the seed reservoir, the leveling of the seed meter, the travel speed, and the seed dosage. The evaluation was carried out in two stages: first, preliminary experiments for building the normal probability plots, and later, experiments with larger samples to assess the influence of the most important factors. A seed meter with a rotating internal ring was evaluated, and the amount of data required for the evaluation was greatly reduced by the normal probability plots, which made it possible to prioritize only the significant factors. The seed dosage (factor D) was the most important factor, showing the greatest significance.
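The normal probability plot used in the work above pairs sorted observations with quantiles of the standard normal; roughly normal data fall near a straight line. A minimal sketch, using hypothetical seed-spacing values rather than the study's data:

```python
from statistics import NormalDist, mean, stdev

# Hypothetical seed spacings in cm (illustrative, not measured data).
spacings_cm = [19.2, 20.1, 19.8, 20.5, 20.0, 19.6, 20.3, 19.9]

x = sorted(spacings_cm)
n = len(x)
# Hazen plotting positions (i - 0.5)/n mapped to standard normal quantiles.
z = [NormalDist().inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]

# If the data are roughly normal, the points (z_i, x_i) lie near a line
# with slope ~ sample standard deviation and intercept ~ sample mean.
for zi, xi in zip(z, x):
    print(f"{zi:+.3f}  {xi:.1f}")
print(f"mean ~ {mean(x):.2f} cm, sd ~ {stdev(x):.2f} cm")
```

Other plotting-position formulas, such as (i - 0.375)/(n + 0.25), are common; the choice matters little for the visual assessment the study relies on.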
Measurement uncertainty and probability
National Research Council Canada - National Science Library
Willink, Robin
2013-01-01
... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...