WorldWideScience

Sample records for bimodal probability distributions

  1. An effective inversion algorithm for retrieving bimodal aerosol particle size distribution from spectral extinction data

    International Nuclear Information System (INIS)

    He, Zhenzong; Qi, Hong; Yao, Yuchen; Ruan, Liming

    2014-01-01

The Ant Colony Optimization algorithm based on the probability density function (PDF-ACO) is applied to estimate the bimodal aerosol particle size distribution (PSD). The direct problem is solved by the modified Anomalous Diffraction Approximation (ADA, an approximation for optically large and soft spheres, i.e., χ ⪢ 1 and |m − 1| ⪡ 1) and the Beer–Lambert law. First, a popular bimodal aerosol PSD and three other bimodal PSDs are retrieved under the dependent model by the multi-wavelength extinction technique. All the results reveal that the PDF-ACO algorithm is an effective technique for investigating the bimodal PSD. Then, the Johnson's S_B (J-S_B) function and the modified beta (M-β) function are employed as general distribution functions to retrieve the bimodal PSDs under the independent model. Finally, the J-S_B and M-β functions are applied to recover actual measured aerosol PSDs over Beijing and Shanghai obtained from the Aerosol Robotic Network (AERONET). The numerical simulation and experimental results demonstrate that these two general functions, especially the J-S_B function, can serve as versatile distribution functions for retrieving the bimodal aerosol PSD when no a priori information about the PSD is available. - Highlights: • Bimodal PSDs are retrieved accurately by ACO based on the probability density function. • The J-S_B and M-β functions can be used as versatile functions to recover bimodal PSDs. • Bimodal aerosol PSDs are estimated more reasonably by the J-S_B function.
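The forward problem described here (ADA extinction plus the Beer–Lambert law) is compact enough to sketch. The following is a minimal illustrative forward model, not the authors' PDF-ACO inversion: it evaluates the van de Hulst ADA extinction efficiency, integrates it over a hypothetical bimodal lognormal PSD (the mode parameters are invented for illustration), and applies the Beer–Lambert law. Units are left arbitrary.

```python
import math

def q_ext_ada(x, m=1.33):
    # van de Hulst ADA extinction efficiency; valid for optically
    # large, soft spheres (x >> 1, |m - 1| << 1)
    rho = 2.0 * x * (m - 1.0)           # phase-shift parameter
    if abs(rho) < 1e-6:
        return 0.5 * rho * rho          # small-rho series limit
    return 2.0 - (4.0 / rho) * math.sin(rho) + (4.0 / rho**2) * (1.0 - math.cos(rho))

def bimodal_lognormal(r, modes):
    # number PSD as a sum of lognormal modes: (N_i, median radius r_i, sigma_i)
    total = 0.0
    for N, r_med, sg in modes:
        ln_s = math.log(sg)
        total += (N / (math.sqrt(2.0 * math.pi) * r * ln_s)
                  * math.exp(-(math.log(r / r_med))**2 / (2.0 * ln_s**2)))
    return total

def extinction_coeff(wavelength_um, modes, r_min=0.01, r_max=10.0, n_pts=2000):
    # beta_ext(lambda) = integral of pi r^2 Q_ext(2 pi r / lambda) n(r) dr,
    # by the trapezoid rule on a logarithmic radius grid
    lg_min, lg_max = math.log(r_min), math.log(r_max)
    h = (lg_max - lg_min) / (n_pts - 1)
    beta = 0.0
    for i in range(n_pts):
        r = math.exp(lg_min + i * h)
        x = 2.0 * math.pi * r / wavelength_um
        f = math.pi * r * r * q_ext_ada(x) * bimodal_lognormal(r, modes) * r  # extra r: dr = r d(ln r)
        beta += f if 0 < i < n_pts - 1 else 0.5 * f
    return beta * h

# hypothetical fine + coarse aerosol modes (number, median radius in um, sigma)
modes = [(1000.0, 0.08, 1.6), (10.0, 1.2, 1.8)]
beta_550 = extinction_coeff(0.55, modes)
transmittance = math.exp(-beta_550 * 1.0)   # Beer-Lambert over a unit path
```

An inversion scheme like PDF-ACO would search the mode parameters until forward-modeled extinction at several wavelengths matches the measured spectra.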

  2. Superthermal photon bunching in terms of simple probability distributions

    Science.gov (United States)

    Lettau, T.; Leymann, H. A. M.; Melcher, B.; Wiersig, J.

    2018-05-01

We analyze the second-order photon autocorrelation function g(2) with respect to the photon probability distribution and discuss the generic features of a distribution that results in superthermal photon bunching [g(2)(0) > 2]. Superthermal photon bunching has been reported for a number of optical microcavity systems that exhibit processes such as superradiance or mode competition. We show that a superthermal photon number distribution cannot be constructed from the principle of maximum entropy if only the intensity and the second-order autocorrelation are given. However, for bimodal systems, an unbiased superthermal distribution can be constructed from second-order correlations and the intensities alone. Our findings suggest modeling superthermal single-mode distributions by a mixture of a thermal and a lasinglike state and thus reveal a generic mechanism in the photon probability distribution responsible for creating superthermal photon bunching. We relate our general considerations to a physical system, i.e., a (single-emitter) bimodal laser, and show that its statistics can be approximated and understood within our proposed model. Furthermore, the excellent agreement of the statistics of the bimodal laser and our model reveals that the bimodal laser is an ideal source of bunched photons, in the sense that it can generate statistics that contain no other features but the superthermal bunching.

  3. An effective inversion algorithm for retrieving bimodal aerosol particle size distribution from spectral extinction data

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Yao, Yuchen; Ruan, Liming

    2014-12-01

The Ant Colony Optimization algorithm based on the probability density function (PDF-ACO) is applied to estimate the bimodal aerosol particle size distribution (PSD). The direct problem is solved by the modified Anomalous Diffraction Approximation (ADA, an approximation for optically large and soft spheres, i.e., χ ⪢ 1 and |m − 1| ⪡ 1) and the Beer-Lambert law. First, a popular bimodal aerosol PSD and three other bimodal PSDs are retrieved under the dependent model by the multi-wavelength extinction technique. All the results reveal that the PDF-ACO algorithm is an effective technique for investigating the bimodal PSD. Then, the Johnson's S_B (J-S_B) function and the modified beta (M-β) function are employed as general distribution functions to retrieve the bimodal PSDs under the independent model. Finally, the J-S_B and M-β functions are applied to recover actual measured aerosol PSDs over Beijing and Shanghai obtained from the Aerosol Robotic Network (AERONET). The numerical simulation and experimental results demonstrate that these two general functions, especially the J-S_B function, can serve as versatile distribution functions for retrieving the bimodal aerosol PSD when no a priori information about the PSD is available.

  4. Asymmetric Bimodal Exponential Power Distribution on the Real Line

    Directory of Open Access Journals (Sweden)

    Mehmet Niyazi Çankaya

    2018-01-01

The asymmetric bimodal exponential power (ABEP) distribution is an extension of the generalized gamma distribution to the real line, obtained by adding two parameters that fit the shape of peakedness in bimodality. For special values of the peakedness parameters, the distribution reduces to a combination of half-Laplace and half-normal distributions on the real line. The distribution has two parameters fitting the height of bimodality, so the capacity for bimodality is enhanced by using these parameters. A skewness parameter is added to model asymmetry in the data. The location-scale form of this distribution is proposed. The Fisher information matrix of these parameters in the ABEP is obtained explicitly. Properties of the ABEP are examined. Real data examples are given to illustrate the modelling capacity of the ABEP. Replicated artificial data, generated from the maximum likelihood estimates of the parameters of the ABEP and of other distributions that have an artificial data generation procedure, are provided to test the similarity with the real data. A brief simulation study is presented.

  5. Bimodal Formation Time Distribution for Infall Dark Matter Halos

    Science.gov (United States)

    Shi, Jingjing; Wang, Huiyuan; Mo, H. J.; Xie, Lizhi; Wang, Xiaoyu; Lapi, Andrea; Sheth, Ravi K.

    2018-04-01

We use an N-body simulation of a box 200 h⁻¹ Mpc on a side to study the mass accretion history (MAH) of dark matter halos to be accreted by larger halos, which we call infall halos. We define a quantity a_nf ≡ (1 + z_f)/(1 + z_peak) to characterize the MAH of infall halos, where z_peak and z_f are the accretion and formation redshifts, respectively. We find that, at given z_peak, their MAH is bimodal. Infall halos are dominated by a young population at high redshift and by an old population at low redshift. For the young population, the a_nf distribution is narrow and peaks at about 1.2, independent of z_peak, while for the old population, the peak position and width of the a_nf distribution both increase with decreasing z_peak and are both larger than those of the young population. This bimodal distribution is found to be closely connected to the two phases in the MAHs of halos. While members of the young population are still in the fast accretion phase at z_peak, those of the old population have already entered the slow accretion phase at z_peak. This bimodal distribution is not found for the whole halo population, nor is it seen in halo merger trees generated with the extended Press–Schechter formalism. The infall halo population at z_peak is, on average, younger than the whole halo population of similar masses identified at the same redshift. We discuss the implications of our findings in connection to the bimodal color distribution of observed galaxies and to the link between central and satellite galaxies.

  6. Bimodal distribution of glucose is not universally useful for diagnosing diabetes

    DEFF Research Database (Denmark)

    Vistisen, Dorte; Colagiuri, Stephen; Borch-Johnsen, Knut

    2009-01-01

    OBJECTIVE: Bimodality in the distribution of glucose has been used to define the cut point for the diagnosis of diabetes. Previous studies on bimodality have primarily been in populations with a high prevalence of type 2 diabetes, including one study in a white Caucasian population. All studies i...

  7. A bimodal flexible distribution for lifetime data

    OpenAIRE

    Ramires, Thiago G.; Ortega, Edwin M. M.; Cordeiro, Gauss M.; Hens, Niel

    2016-01-01

    A four-parameter extended bimodal lifetime model called the exponentiated log-sinh Cauchy distribution is proposed. It extends the log-sinh Cauchy and folded Cauchy distributions. We derive some of its mathematical properties including explicit expressions for the ordinary moments and generating and quantile functions. The method of maximum likelihood is used to estimate the model parameters. We implement the fit of the model in the GAMLSS package and provide the codes. The flexibility of the...

  8. TRACING OUTFLOWS AND ACCRETION: A BIMODAL AZIMUTHAL DEPENDENCE OF Mg II ABSORPTION

    International Nuclear Information System (INIS)

    Kacprzak, Glenn G.; Churchill, Christopher W.; Nielsen, Nikole M.

    2012-01-01

We report a bimodality in the azimuthal angle distribution of gas around galaxies as traced by Mg II absorption: halo gas prefers to exist near the projected galaxy major and minor axes. The bimodality is demonstrated by computing the mean azimuthal angle probability distribution function using 88 spectroscopically confirmed Mg II-absorption-selected galaxies [W_r(2796) ≥ 0.1 Å] and 35 spectroscopically confirmed non-absorbing galaxies [W_r(2796) < 0.1 Å] imaged with the Hubble Space Telescope and the Sloan Digital Sky Survey. The azimuthal angle distribution for non-absorbers is flat, indicating no azimuthal preference for gas characterized by W_r(2796) < 0.1 Å. We find that blue star-forming galaxies clearly drive the bimodality, while red passive galaxies may exhibit an excess along their major axis. These results are consistent with galaxy evolution scenarios in which star-forming galaxies accrete new gas, forming new stars and producing winds, while red galaxies evolve passively owing to reduced gas reservoirs. We further compute an azimuthal-angle-dependent Mg II absorption covering fraction, which is enhanced by as much as 20%-30% along the major and minor axes. The W_r(2796) distribution for gas along the major axis is likely skewed toward weaker Mg II absorption than for gas along the projected minor axis. These combined results are highly suggestive that the bimodality is driven by gas accreted along the galaxy major axis and outflowing along the galaxy minor axis. Adopting these assumptions, we find the opening angles of outflows and inflows to be 100° and 40°, respectively. We find that the probability of detecting outflows is ∼60%, implying that winds are more commonly observed.

  9. Evidence for a bimodal distribution in human communication.

    Science.gov (United States)

    Wu, Ye; Zhou, Changsong; Xiao, Jinghua; Kurths, Jürgen; Schellnhuber, Hans Joachim

    2010-11-02

Interacting human activities underlie the patterns of many social, technological, and economic phenomena. Here we present clear empirical evidence from short-message correspondence that observed human actions are the result of the interplay of three basic ingredients: Poisson initiation of tasks and decision making for task execution in individual humans, as well as interaction among individuals. This interplay leads to a new type of interevent time distribution, neither completely Poisson nor power-law, but a bimodal combination of the two. We show that the events can be separated into independent bursts, which are generated by frequent mutual interactions on short time scales following random initiations of communications over longer times by the individuals. We introduce a minimal model of two interacting priority queues incorporating the three basic ingredients, which fits the distributions well using the parameters extracted from the empirical data. The model can also embrace a range of realistic social interacting systems, such as e-mail and letter communications, when the time scale of processing is taken into account. Our findings provide insight into various human activities at both the individual and network level. Our analysis and modeling of bimodal activity in human communication from the viewpoint of the interplay between processes on different time scales is likely to shed light on bimodal phenomena in other complex systems, such as interevent times in earthquakes, rainfall, forest fires, and economic systems.
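A toy generator (much simpler than the authors' interacting priority-queue model, and with invented rates and counts) illustrates the mechanism: Poisson-initiated conversations followed by rapid reply bursts produce a bimodal interevent-time distribution.

```python
import random

def interevent_times(n_conversations=2000, rate_init=0.01,
                     burst_len=5, rate_reply=2.0, seed=1):
    # toy model: conversations start as a Poisson process (long exponential
    # gaps, mean 1/rate_init); each start triggers a burst of quick replies
    # (short exponential gaps, mean 1/rate_reply)
    rng = random.Random(seed)
    gaps = []
    for _ in range(n_conversations):
        gaps.append(rng.expovariate(rate_init))       # waiting for initiation
        for _ in range(burst_len):
            gaps.append(rng.expovariate(rate_reply))  # rapid exchange
    return gaps

gaps = interevent_times()
frac_short = sum(1 for g in gaps if g < 10.0) / len(gaps)          # burst mode
frac_mid   = sum(1 for g in gaps if 10.0 <= g < 50.0) / len(gaps)  # the gap between modes
frac_long  = sum(1 for g in gaps if g >= 50.0) / len(gaps)         # Poisson mode
```

On a log scale the histogram of `gaps` shows two well-separated peaks, with a depleted region in between, qualitatively mirroring the bimodal combination described above.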

  10. Estimation of Bimodal Urban Link Travel Time Distribution and Its Applications in Traffic Analysis

    Directory of Open Access Journals (Sweden)

    Yuxiong Ji

    2015-01-01

Vehicles travelling on urban streets are heavily influenced by traffic signal controls, pedestrian crossings, and conflicting traffic from cross streets, which result in bimodal travel time distributions, with one mode corresponding to trips without delays and the other to trips with delays. A hierarchical Bayesian bimodal travel time model is proposed to capture the interrupted nature of urban traffic flows. The travel time distributions obtained from the proposed model are then used to analyze traffic operations and estimate the travel time distribution in real time. The advantage of the proposed bimodal model is demonstrated using empirical data, and the results are encouraging.
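A much simpler stand-in for the paper's hierarchical Bayesian model is a two-component Gaussian mixture fitted by EM; the sketch below hand-rolls the EM updates on synthetic link travel times (all means, spreads, and counts are invented for illustration).

```python
import math, random

def em_two_gaussians(x, iters=200):
    # classic EM for a two-component Gaussian mixture
    mu1, mu2 = min(x), max(x)
    s1 = s2 = (max(x) - min(x)) / 4.0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation
        r = []
        for xi in x:
            p1 = w * math.exp(-0.5 * ((xi - mu1) / s1)**2) / s1
            p2 = (1.0 - w) * math.exp(-0.5 * ((xi - mu2) / s2)**2) / s2
            r.append(p1 / (p1 + p2))
        # M-step: update weight, means, and standard deviations
        n1 = sum(r); n2 = len(x) - n1
        w = n1 / len(x)
        mu1 = sum(ri * xi for ri, xi in zip(r, x)) / n1
        mu2 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / n2
        s1 = math.sqrt(sum(ri * (xi - mu1)**2 for ri, xi in zip(r, x)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (xi - mu2)**2 for ri, xi in zip(r, x)) / n2) or 1e-6
    return w, (mu1, s1), (mu2, s2)

# synthetic link travel times (seconds): free-flow mode vs. signal-delayed mode
rng = random.Random(0)
times = ([rng.gauss(30, 5) for _ in range(600)]      # no-delay trips
         + [rng.gauss(75, 12) for _ in range(400)])  # delayed trips
w, (mu1, s1), (mu2, s2) = em_two_gaussians(times)
```

The recovered component means land near the two planted modes, and the mixture weight near the planted 60/40 split.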

  11. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

1 - Nature of the physical problem solved: Computation of the probability distribution of a function of random variables, given the probability distributions of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions.
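COVAL's task, propagating the distributions of input variables through a function, can also be illustrated with plain Monte Carlo sampling (a different method from the numerical transformation COVAL uses); the load/strength reliability numbers below are invented.

```python
import random

def distribution_of_function(f, samplers, n=200_000, seed=42):
    # Monte Carlo stand-in for numerical transformation of distributions:
    # push samples of the input variables through f and study the output
    rng = random.Random(seed)
    return [f(*[s(rng) for s in samplers]) for _ in range(n)]

# toy reliability problem: the structure fails when random load exceeds strength
load     = lambda rng: rng.gauss(80.0, 10.0)
strength = lambda rng: rng.gauss(100.0, 10.0)
margin = distribution_of_function(lambda l, s: s - l, [load, strength])
p_fail = sum(1 for m in margin if m < 0.0) / len(margin)
# analytic check: margin ~ N(20, sqrt(200)), so P(fail) = Phi(-20/sqrt(200)) ~ 0.079
```

With independent normal inputs the failure probability has a closed form, which the Monte Carlo estimate reproduces to within sampling error.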

  12. Multiphase flow modeling of a crude-oil spill site with a bimodal permeability distribution

    Science.gov (United States)

    Dillard, Leslie A.; Essaid, Hedeff I.; Herkelrath, William N.

    1997-01-01

Fluid saturation, particle-size distribution, and porosity measurements were obtained from 269 core samples collected from six boreholes along a 90-m transect at a subregion of a crude-oil spill site, the north pool, near Bemidji, Minnesota. The oil saturation data, collected 11 years after the spill, showed an irregularly shaped oil body that appeared to be affected by sediment spatial variability. The particle-size distribution data were used to estimate the permeability (k) and retention curves for each sample. An additional 344 k estimates were obtained from samples previously collected at the north pool. The 613 k estimates followed a bimodal lognormal distribution, with the two populations corresponding to the two predominant lithologies: a coarse glacial outwash deposit and fine-grained interbedded lenses. A two-step geostatistical approach was used to generate a conditioned realization of k representing the bimodal heterogeneity. A cross-sectional multiphase flow model was used to simulate the flow of oil and water in the presence of air along the north pool transect for an 11-year period. Including a representation of the bimodal aquifer heterogeneity was crucial for reproducing the general features of the observed oil body. When the bimodal heterogeneity was characterized, hysteresis did not have to be incorporated into the model, because a hysteretic effect was produced by the sediment spatial variability. By revising the relative permeability functional relation, an improved reproduction of the observed oil saturation distribution was achieved. Including water table fluctuations in the model did not significantly affect the simulated oil saturation distribution.

  13. TRACING OUTFLOWS AND ACCRETION: A BIMODAL AZIMUTHAL DEPENDENCE OF Mg II ABSORPTION

    Energy Technology Data Exchange (ETDEWEB)

    Kacprzak, Glenn G. [Swinburne University of Technology, Victoria 3122 (Australia); Churchill, Christopher W.; Nielsen, Nikole M., E-mail: gkacprzak@astro.swin.edu.au [New Mexico State University, Las Cruces, NM 88003 (United States)

    2012-11-20

We report a bimodality in the azimuthal angle distribution of gas around galaxies as traced by Mg II absorption: halo gas prefers to exist near the projected galaxy major and minor axes. The bimodality is demonstrated by computing the mean azimuthal angle probability distribution function using 88 spectroscopically confirmed Mg II-absorption-selected galaxies [W_r(2796) ≥ 0.1 Å] and 35 spectroscopically confirmed non-absorbing galaxies [W_r(2796) < 0.1 Å] imaged with Hubble Space Telescope and Sloan Digital Sky Survey. The azimuthal angle distribution for non-absorbers is flat, indicating no azimuthal preference for gas characterized by W_r(2796) < 0.1 Å. We find that blue star-forming galaxies clearly drive the bimodality while red passive galaxies may exhibit an excess along their major axis. These results are consistent with galaxy evolution scenarios where star-forming galaxies accrete new gas, forming new stars and producing winds, while red galaxies exist passively due to reduced gas reservoirs. We further compute an azimuthal angle dependent Mg II absorption covering fraction, which is enhanced by as much as 20%-30% along the major and minor axes. The W_r(2796) distribution for gas along the major axis is likely skewed toward weaker Mg II absorption than for gas along the projected minor axis. These combined results are highly suggestive that the bimodality is driven by gas accreted along the galaxy major axis and outflowing along the galaxy minor axis. Adopting these assumptions, we find that the opening angle of outflows and inflows to be 100° and 40°, respectively. We find that the probability of detecting outflows is ∼60%, implying that winds are more commonly observed.

  14. Bimodal grain-size distribution of Chinese loess, and its palaeoclimatic implications

    NARCIS (Netherlands)

    Sun, D.G.; Bloemendal, J.; Rea, D.K.; An, Z.S.; Vandenberghe, J.; Lu, H.; Su, R.; Liu, T.S.

    2004-01-01

    Grain-size analysis indicates that Chinese loess generally shows a bimodal distribution with a coarse and a fine component. The coarse component, comprising the main part of the loess, has pronounced kurtosis and is well sorted, which is interpreted to be the product of dust storms generated by

  15. Preparation of mesoporous NiO with a bimodal pore size distribution and application in electrochemical capacitors

    Energy Technology Data Exchange (ETDEWEB)

    Wang Dengchao; Ni Wenbin; Pang Huan; Lu Qingyi; Huang Zhongjie [Key Laboratory of Analytical Chemistry for Life Science (MOE), School of Chemistry and Chemical Engineering, Nanjing University, Nanjing 210008 (China); Zhao Jianwei, E-mail: zhaojw@nju.edu.c [Key Laboratory of Analytical Chemistry for Life Science (MOE), School of Chemistry and Chemical Engineering, Nanjing University, Nanjing 210008 (China)

    2010-09-01

Mesoporous nickel oxide with a porous structure exhibiting a bimodal pore size distribution (2.6 and 30.3 nm diameter pores) has been synthesized in this paper. First, a mesoporous precursor of the coordination complex Ni₃(btc)₂·12H₂O (btc = 1,3,5-benzenetricarboxylic acid) is synthesized by a hydrothermal method based on the metal-organic coordination mechanism. Mesoporous NiO with a bimodal size distribution is then obtained by calcining the precursor in air, and is characterized by transmission electron microscopy and N₂ adsorption measurements. Such a unique multiple porous structure indicates a promising application of the obtained NiO as an electrode material for supercapacitors. The electrochemical behavior has been investigated by cyclic voltammetry, electrochemical impedance spectroscopy, and chronopotentiometry in 3 wt.% KOH aqueous electrolyte. The results reveal that the prepared NiO has high capacitance retention at high scan rates and exhibits excellent cycle-life stability due to its special mesoporous character with a bimodal size distribution.

  16. Beta-binomial regression and bimodal utilization.

    Science.gov (United States)

    Liu, Chuan-Fen; Burgess, James F; Manning, Willard G; Maciejewski, Matthew L

    2013-10-01

To illustrate how bimodal, U-shaped utilization distributions can be modeled with beta-binomial regression, which is rarely used in health services research. Veterans Affairs (VA) administrative data and Medicare claims in 2001-2004 for 11,123 Medicare-eligible VA primary care users in 2000. We compared means and distributions of VA reliance (the proportion of all VA/Medicare primary care visits occurring in VA) predicted from beta-binomial, binomial, and ordinary least-squares (OLS) models. The beta-binomial model fits the bimodal distribution of VA reliance better than the binomial and OLS models because it does not depend on normality and has greater flexibility in shape parameters. Increased awareness of beta-binomial regression may help analysts apply appropriate methods to outcomes with bimodal or U-shaped distributions. © Health Research and Educational Trust.
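The U-shape that motivates the beta-binomial model is easy to reproduce: when both shape parameters are below 1, the beta-binomial piles probability mass at the extremes. A small sketch (the parameter values are illustrative, not taken from the paper):

```python
from math import lgamma, exp

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binomial_pmf(k, n, a, b):
    # probability of k successes in n trials when the success probability
    # itself is Beta(a, b) distributed (an overdispersed binomial)
    log_choose = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return exp(log_choose + log_beta(k + a, n - k + b) - log_beta(a, b))

# a, b < 1 puts mass at the extremes: a U-shaped (bimodal) outcome profile,
# like utilization concentrating near 0% and 100% reliance
n, a, b = 12, 0.4, 0.4
pmf = [beta_binomial_pmf(k, n, a, b) for k in range(n + 1)]
```

In a regression setting, the mean and dispersion of this distribution are linked to covariates; a plain binomial, with its single success probability, cannot produce this U-shape at all.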

  17. In-situ observations of a bi-modal ion distribution in the outer coma of comet P/Halley

    Science.gov (United States)

    Thomsen, M. F.; Feldman, W. C.; Wilken, B.; Jockers, K.; Stuedemann, W.

    1987-01-01

Observations obtained by the Johnstone Plasma Analyzer on the Giotto fly-by of comet Halley showed a fairly sudden decrease in the count rate of energetic (about 30 keV) water-group ions inside about 500,000 km from the nucleus. This decrease was accompanied by the appearance of a new water-group ion population at slightly lower energies (less than 10 keV). Close inspection reveals that this lower-energy peak was also present somewhat earlier in the postshock flow but only became prominent near the sudden transition just described. It is shown that the observed bimodal ion distribution is well explained in terms of the velocity history of the accreting solar wind flow in the outer coma. The decline in the count rate of the energetic pick-up distribution is due to a relatively sudden slowing of the bulk flow there, not to a loss of particles. Hence, charge-exchange cooling of the flow is probably not important at these distances from the nucleus. The observations suggest that pitch-angle scattering is fairly efficient, at least after the bow shock, but that energy diffusion is probably not very efficient.

  18. Bimodality emerges from transport model calculations of heavy ion collisions at intermediate energy

    Science.gov (United States)

    Mallik, S.; Das Gupta, S.; Chaudhuri, G.

    2016-04-01

This work is a continuation of our effort [S. Mallik, S. Das Gupta, and G. Chaudhuri, Phys. Rev. C 91, 034616 (2015)], 10.1103/PhysRevC.91.034616, to examine whether signatures of a phase transition can be extracted from transport model calculations of heavy ion collisions at intermediate energy. A signature of a first-order phase transition is the appearance of a bimodal distribution in P_m(k) in finite systems. Here P_m(k) is the probability that the maximum of the multiplicity distribution occurs at mass number k. Using a well-known model for event generation [Boltzmann-Uehling-Uhlenbeck (BUU) plus fluctuations], we study two cases of central collisions: mass 40 on mass 40 and mass 120 on mass 120. Bimodality is seen in both cases. The results are quite similar to those obtained in statistical model calculations. An intriguing feature is seen: at the energy where bimodality occurs, other phase-transition-like signatures appear, with breaks in certain first-order derivatives. We then examine whether such breaks appear in standard BUU calculations without fluctuations. They do. The implication is interesting: if a first-order phase transition occurs, it may be possible to recognize it from ordinary BUU calculations. Probably the reason this has not been seen already is that this aspect was not investigated before.

  19. Inversion of multiwavelength Raman lidar data for retrieval of bimodal aerosol size distribution

    Science.gov (United States)

    Veselovskii, Igor; Kolgotin, Alexei; Griaznov, Vadim; Müller, Detlef; Franke, Kathleen; Whiteman, David N.

    2004-02-01

We report on the feasibility of deriving microphysical parameters of bimodal particle size distributions from Mie-Raman lidar based on a tripled Nd:YAG laser. Such an instrument provides backscatter coefficients at 355, 532, and 1064 nm and extinction coefficients at 355 and 532 nm. The inversion method employed is Tikhonov's inversion with regularization. Special attention has been paid to extending the particle size range for which this inversion scheme works to ~10 μm, which makes the algorithm applicable to large particles, e.g., in investigations of the hygroscopic growth of aerosols. Simulations showed that surface area, volume concentration, and effective radius are derived to an accuracy of ~50% for a variety of bimodal particle size distributions. Measurement cases from the Institute for Tropospheric Research six-wavelength aerosol lidar, taken during the Indian Ocean Experiment in regions where anthropogenic pollution mixes with marine aerosols, were used to test the capabilities of the algorithm on experimental data sets. A benchmark test was attempted for a case representing anthropogenic aerosols between a broken cloud deck. A strong contribution of particle volume in the coarse mode of the particle size distribution was found.
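Tikhonov regularization of a discretized smoothing kernel, the core of the inversion mentioned here, can be sketched in a few lines. The toy problem below (grid size, kernel width, and noise level are all invented) shows the characteristic behavior: the damped solution has a much smaller norm than the nearly unregularized one, which is dominated by amplified noise.

```python
import math, random

def solve(A, b):
    # dense linear solve by Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def tikhonov(K, g, lam):
    # minimize ||K f - g||^2 + lam * ||f||^2  =>  (K^T K + lam I) f = K^T g
    n = len(K[0])
    KtK = [[sum(K[r][i] * K[r][j] for r in range(len(K))) + (lam if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Ktg = [sum(K[r][i] * g[r] for r in range(len(K))) for i in range(n)]
    return solve(KtK, Ktg)

# toy ill-posed problem: a smoothing kernel blurs a bimodal "size distribution"
n = 24
K = [[math.exp(-((i - j) / 3.0) ** 2) for j in range(n)] for i in range(n)]
f_true = [math.exp(-((i - 7.0) / 2.0) ** 2) + 0.5 * math.exp(-((i - 17.0) / 2.0) ** 2)
          for i in range(n)]
rng = random.Random(3)
g = [sum(K[i][j] * f_true[j] for j in range(n)) + rng.gauss(0.0, 0.01)
     for i in range(n)]

norm = lambda v: math.sqrt(sum(vi * vi for vi in v))
f_noisy = tikhonov(K, g, 1e-12)   # essentially unregularized: noise blows up
f_reg   = tikhonov(K, g, 1e-2)    # regularized: damped, stable solution
```

The norm of the Tikhonov solution is non-increasing in the regularization parameter, so the damped solution is guaranteed to be no larger than the nearly unregularized one; with noisy data the difference is dramatic.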

  20. Bimodal distribution of the magnetic dipole moment in nanoparticles with a monomodal distribution of the physical size

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2015-01-01

    High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal distribution of the magnetic dipole moment. Here, we test this assumption for different types of superparamagnetic iron oxide nanoparticles in the 5–20 nm range, by multimodal fitting of magnetization curves using the MINORIM inversion method. The particles are studied while in dilute colloidal dispersion in a liquid, thereby preventing hysteresis and diminishing the effects of magnetic anisotropy on the interpretation of the magnetization curves. For two different types of well crystallized particles, the magnetic distribution is indeed log-normal, as expected from the physical size distribution. However, two other types of particles, with twinning defects or inhomogeneous oxide phases, are found to have a bimodal magnetic distribution. Our qualitative explanation is that relatively low fields are sufficient to begin aligning the particles in the liquid on the basis of their net dipole moment, whereas higher fields are required to align the smaller domains or less magnetic phases inside the particles. - Highlights: • Multimodal fits of dilute ferrofluids reveal when the particles are multidomain. • No a priori shape of the distribution is assumed by the MINORIM inversion method. • Well crystallized particles have log-normal TEM and magnetic size distributions. • Defective particles can combine a monomodal size and a bimodal dipole moment
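The net effect of a bimodal dipole-moment distribution on a magnetization curve can be sketched with a weighted sum of Langevin functions, the standard model for dilute, non-interacting superparamagnetic particles. This is not the MINORIM inversion itself, and the weights and reduced moments below are invented for illustration.

```python
import math

def langevin(x):
    # L(x) = coth(x) - 1/x: equilibrium alignment of a classical dipole
    if abs(x) < 1e-4:
        return x / 3.0            # series limit avoids catastrophic cancellation
    return 1.0 / math.tanh(x) - 1.0 / x

def magnetization(H, populations):
    # dilute, non-interacting superparamagnets: each subpopulation
    # (weight w, reduced moment mu in units of k_B T per field unit)
    # contributes w * mu * L(mu * H)
    return sum(w * mu * langevin(mu * H) for w, mu in populations)

# hypothetical bimodal moment distribution: many weak dipoles, few strong ones
bimodal = [(0.7, 1.0), (0.3, 20.0)]
saturation = sum(w * mu for w, mu in bimodal)   # high-field limit
```

The strong subpopulation dominates the low-field rise (the initial susceptibility goes as the second moment of the distribution, Σ wᵢμᵢ²/3), while the weak one only saturates at much higher fields; an inversion such as MINORIM works backward from this curve shape to the moment distribution.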

  1. X-ray diffraction microstructural analysis of bimodal size distribution MgO nano powder

    International Nuclear Information System (INIS)

    Suminar Pratapa; Budi Hartono

    2009-01-01

An investigation of the characteristics of X-ray diffraction data for an MgO powdered mixture of nano- and sub-nano-sized particles has been carried out to reveal crystallite-size-related microstructural information. The MgO powders were prepared by a co-precipitation method followed by heat treatment at 500 °C and 1200 °C for 1 hour, the difference in temperature being used to obtain two powders with distinct crystallite sizes and size distributions. The powders were then blended in air to give a presumably bimodal-size-distribution MgO nanopowder. High-quality laboratory X-ray diffraction data for the powders were collected and then analysed with the Rietveld-based MAUD software using a lognormal size distribution. Results show that the single-mode powders exhibit spherical crystallite sizes (R) of 20(1) nm and 160(1) nm for the 500 °C and 1200 °C data, respectively, with the nanometric powder displaying a narrower crystallite size distribution, indicated by a lognormal dispersion parameter of 0.21 as compared to 0.01 for the sub-nanometric powder. The mixture exhibits relatively more asymmetric peak broadening. Analysing the X-ray diffraction data for the latter specimen using a single-phase approach gives unrealistic results. Introducing a two-phase model for the double-phase mixture to accommodate the bimodal-size-distribution characteristics gives R = 100(6) and σ = 0.62 for the nanometric phase and R = 170(5) and σ = 0.12 for the sub-nanometric phase. (author)

  2. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated by a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  3. Elaboration of austenitic stainless steel samples with bimodal grain size distributions and investigation of their mechanical behavior

    Science.gov (United States)

    Flipon, B.; de la Cruz, L. Garcia; Hug, E.; Keller, C.; Barbe, F.

    2017-10-01

Samples of 316L austenitic stainless steel with bimodal grain size distributions are elaborated using two distinct routes. The first one is based on powder metallurgy, using spark plasma sintering of two powders with different particle sizes. The second route applies the reverse-annealing method: it consists in inducing martensitic phase transformation by plastic strain and further annealing in order to obtain two austenitic grain populations with different sizes. Microstructural analyses reveal that both methods are suitable to generate significant grain size contrast and to control this contrast according to the elaboration conditions. Mechanical properties under tension are then characterized for different grain size distributions. Crystal plasticity finite element modelling is further applied in a configuration of bimodal distribution to analyse the role played by coarse grains within a matrix of fine grains, considering not only their volume fraction but also their spatial arrangement.

  4. Bimodal height distribution of self-assembled germanium islands grown on Si0.84Ge0.16 pseudo-substrates

    DEFF Research Database (Denmark)

    Pedersen, Erik Vesterlund; Jensen, Flemming; Shiryaev, Sergey Y.

    1998-01-01

    We have investigated the size distribution of germanium islands deposited onto a Si0.84Ge0.16 buffer layer, by atomic force microscopy. The size distribution was found to be bimodal at 630-740 degrees C and consisted of one group of smaller 'pyramidal' islands with a broad distribution of diameters...

  5. Bimodal distribution of damage morphology generated by ion implantation

    International Nuclear Information System (INIS)

    Mok, K.R.C.; Jaraiz, M.; Martin-Bragado, I.; Rubio, J.E.; Castrillo, P.; Pinacho, R.; Srinivasan, M.P.; Benistant, F.

    2005-01-01

    A nucleation and evolution model of damage based on amorphous pockets (APs) has recently been developed and implemented in an atomistic kinetic Monte Carlo simulator. In the model, APs are disordered structures (I_nV_m), which are agglomerates of interstitials (I) and vacancies (V). This model has been used to study the composition and size distribution of APs during different ion implantations. Depending strongly on the dose rate, ion mass and implant temperature, the APs can evolve to a defect population where the agglomerates have a similar number of I and V (n ∼ m), or to a defect population with pure I (m ∼ 0) and pure V (n ∼ 0) clusters, or a mixture of APs and clusters. This behaviour corresponds to a bimodal (APs/clusters) distribution of damage. As the APs have a different thermal stability compared to the I and V clusters, the same damage concentration obtained through different implant conditions has a different damage morphology and, consequently, exhibits a different resistance to subsequent thermal treatments.

  6. Bimodal Nuclear Thermal Rocket Analysis Developments

    Science.gov (United States)

    Belair, Michael; Lavelle, Thomas; Saimento, Charles; Juhasz, Albert; Stewart, Mark

    2014-01-01

    Nuclear thermal propulsion has long been considered an enabling technology for human missions to Mars and beyond. One concept of operations for these missions utilizes the nuclear reactor to generate electrical power during coast phases, known as bimodal operation. This presentation focuses on the systems modeling and analysis efforts for a NERVA derived concept. The NERVA bimodal operation derives the thermal energy from the core tie tube elements. Recent analysis has shown potential temperature distributions in the tie tube elements that may limit the thermodynamic efficiency of the closed Brayton cycle used to generate electricity with the current design. The results of this analysis are discussed as well as the potential implications to a bimodal NERVA type reactor.

  7. Reactive Sintering of Bimodal WC-Co Hardmetals

    Directory of Open Access Journals (Sweden)

    Marek Tarraste

    2015-09-01

    Bimodal WC-Co hardmetals were produced using a novel technology: reactive sintering. Milled and activated tungsten and graphite powders were mixed with a commercial coarse-grained WC-Co powder and then sintered. The microstructure of the produced materials was free of defects and consisted of evenly distributed coarse and fine tungsten carbide grains in a cobalt binder. The microstructure, hardness and fracture toughness of the reactive-sintered bimodal WC-Co hardmetals are presented. The developed bimodal hardmetal is promising for demanding wear applications owing to its increased combined hardness and toughness: compared to the coarse-grained material there is only a slight decrease in fracture toughness (K1c is 14.7 for coarse-grained and 14.4 for bimodal), while hardness is increased from 1290 to 1350 HV. DOI: http://dx.doi.org/10.5755/j01.ms.21.3.7511

  8. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  9. Bimodality in heavy ions collisions: systematic and comparisons

    International Nuclear Information System (INIS)

    Mercier, D.

    2008-11-01

    During the last few years, bimodality in heavy-ion collisions has been observed for different systems over a large energy range (from 35 MeV/u up to 1 GeV/u). In this thesis, the bimodal behaviour of the distribution of the largest fragment (Zmax) is studied for different INDRA data sets. For peripheral collisions (Au+Au from 60 to 150 MeV/u, Xe+Sn 80-100 MeV/u), the influence of sorting and selections on bimodality is tested. Then, two different model-based approaches are considered. In the first one (ELIE), bimodality would mainly reflect the collision geometry and the Fermi motion of the nucleons. In the second one (SMM), bimodality would reflect a phase transition of nuclear matter. The data are in favour of the second model; Zmax can then be considered as an order parameter of the transition. A re-weighting procedure producing a flat excitation-energy distribution is used to compare various bombarding energies and theoretical predictions based on a canonical approach, and a latent heat of the transition is extracted. For central collisions (Ni+Ni from 32 to 74 MeV/u and Xe+Sn from 25 to 50 MeV/u), single-source events are isolated by a discriminant factor analysis. Bimodality is then looked for by cumulating the different incident energies and applying the same excitation-energy re-weighting procedure as for peripheral collisions. Bimodal behaviour is less evident for central collisions than for peripheral ones, and the possible reasons for this difference are discussed. (author)

  10. Modeling the Hydrological Cycle in the Atmosphere of Mars: Influence of a Bimodal Size Distribution of Aerosol Nucleation Particles

    Science.gov (United States)

    Shaposhnikov, Dmitry S.; Rodin, Alexander V.; Medvedev, Alexander S.; Fedorova, Anna A.; Kuroda, Takeshi; Hartogh, Paul

    2018-02-01

    We present a new implementation of the hydrological cycle scheme into a general circulation model of the Martian atmosphere. The model includes a semi-Lagrangian transport scheme for water vapor and ice and accounts for microphysics of phase transitions between them. The hydrological scheme includes processes of saturation, nucleation, particle growth, sublimation, and sedimentation under the assumption of a variable size distribution. The scheme has been implemented into the Max Planck Institute Martian general circulation model and tested assuming monomodal and bimodal lognormal distributions of ice condensation nuclei. We present a comparison of the simulated annual variations, horizontal and vertical distributions of water vapor, and ice clouds with the available observations from instruments on board Mars orbiters. The accounting for bimodality of aerosol particle distribution improves the simulations of the annual hydrological cycle, including predicted ice clouds mass, opacity, number density, and particle radii. The increased number density and lower nucleation rates bring the simulated cloud opacities closer to observations. Simulations show a weak effect of the excess of small aerosol particles on the simulated water vapor distributions.
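
A bimodal lognormal distribution of condensation nuclei, as assumed above, can be sketched as a sum of two lognormal modes. The mode radii, widths, and number fractions below are illustrative placeholders, not the paper's fitted values:

```python
import numpy as np

def lognormal_pdf(r, r_m, sigma_g):
    """Lognormal number distribution dn/dr with median radius r_m
    and geometric standard deviation sigma_g."""
    return np.exp(-np.log(r / r_m) ** 2 / (2 * np.log(sigma_g) ** 2)) / (
        r * np.log(sigma_g) * np.sqrt(2 * np.pi))

def bimodal_pdf(r, n1, r1, s1, n2, r2, s2):
    """Two lognormal modes, e.g. small and large nucleation particles
    (all six parameters are hypothetical illustration values)."""
    return n1 * lognormal_pdf(r, r1, s1) + n2 * lognormal_pdf(r, r2, s2)

r = np.logspace(-3, 1, 2000)                       # radii, micrometres
n = bimodal_pdf(r, 0.8, 0.05, 1.8, 0.2, 1.0, 1.6)  # dn/dr on the grid
# Trapezoid rule: total number should recover n1 + n2.
total = float(np.sum(0.5 * (n[1:] + n[:-1]) * np.diff(r)))
```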

  11. A HYPOTHESIS FOR THE COLOR BIMODALITY OF JUPITER TROJANS

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Ian; Brown, Michael E., E-mail: iwong@caltech.edu [Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, CA 91125 (United States)

    2016-10-01

    One of the most enigmatic and hitherto unexplained properties of Jupiter Trojans is their bimodal color distribution. This bimodality is indicative of two sub-populations within the Trojans, which have distinct size distributions. In this paper, we present a simple, plausible hypothesis for the origin and evolution of the two Trojan color sub-populations. In the framework of dynamical instability models of early solar system evolution, which suggest a common primordial progenitor population for both Trojans and Kuiper Belt objects, we use observational constraints to assert that the color bimodalities evident in both minor body populations developed within the primordial population prior to the onset of instability. We show that, beginning with an initial composition of rock and ices, location-dependent volatile loss through sublimation in this primordial population could have led to sharp changes in the surface composition with heliocentric distance. We propose that the depletion or retention of H2S ice on the surface of these objects was the key factor in creating an initial color bimodality. Objects that retained H2S on their surfaces developed characteristically redder colors upon irradiation than those that did not. After the bodies from the primordial population were scattered and emplaced into their current positions, they preserved this primordial color bimodality to the present day. We explore predictions of the volatile loss model—in particular, the effect of collisions within the Trojan population on the size distributions of the two sub-populations—and propose further experimental and observational tests of our hypothesis.

  12. A HYPOTHESIS FOR THE COLOR BIMODALITY OF JUPITER TROJANS

    International Nuclear Information System (INIS)

    Wong, Ian; Brown, Michael E.

    2016-01-01

    One of the most enigmatic and hitherto unexplained properties of Jupiter Trojans is their bimodal color distribution. This bimodality is indicative of two sub-populations within the Trojans, which have distinct size distributions. In this paper, we present a simple, plausible hypothesis for the origin and evolution of the two Trojan color sub-populations. In the framework of dynamical instability models of early solar system evolution, which suggest a common primordial progenitor population for both Trojans and Kuiper Belt objects, we use observational constraints to assert that the color bimodalities evident in both minor body populations developed within the primordial population prior to the onset of instability. We show that, beginning with an initial composition of rock and ices, location-dependent volatile loss through sublimation in this primordial population could have led to sharp changes in the surface composition with heliocentric distance. We propose that the depletion or retention of H2S ice on the surface of these objects was the key factor in creating an initial color bimodality. Objects that retained H2S on their surfaces developed characteristically redder colors upon irradiation than those that did not. After the bodies from the primordial population were scattered and emplaced into their current positions, they preserved this primordial color bimodality to the present day. We explore predictions of the volatile loss model—in particular, the effect of collisions within the Trojan population on the size distributions of the two sub-populations—and propose further experimental and observational tests of our hypothesis.

  13. Are star formation rates of galaxies bimodal?

    Science.gov (United States)

    Feldmann, Robert

    2017-09-01

    Star formation rate (SFR) distributions of galaxies are often assumed to be bimodal with modes corresponding to star-forming and quiescent galaxies, respectively. Both classes of galaxies are typically studied separately, and SFR distributions of star-forming galaxies are commonly modelled as lognormals. Using both observational data and results from numerical simulations, I argue that this division into star-forming and quiescent galaxies is unnecessary from a theoretical point of view and that the SFR distributions of the whole population can be well fitted by zero-inflated negative binomial distributions. This family of distributions has three parameters that determine the average SFR of the galaxies in the sample, the scatter relative to the star-forming sequence and the fraction of galaxies with zero SFRs, respectively. The proposed distributions naturally account for (I) the discrete nature of star formation, (II) the presence of 'dead' galaxies with zero SFRs and (III) asymmetric scatter. Excluding 'dead' galaxies, the distribution of log SFR is unimodal with a peak at the star-forming sequence and an extended tail towards low SFRs. However, uncertainties and biases in the SFR measurements can create the appearance of a bimodal distribution.
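
A minimal sketch of a zero-inflated negative binomial SFR model of the kind proposed above, assuming the common mean/dispersion parametrization var = mean + alpha·mean²; the parameter values are illustrative, not fitted to any survey:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sample_zinb(pi0, mean, alpha, size, rng):
    """Zero-inflated negative binomial: with probability pi0 a 'dead' galaxy
    (SFR = 0), otherwise a negative binomial draw with the given mean and
    dispersion alpha (var = mean + alpha * mean**2)."""
    n = 1.0 / alpha                  # NB shape parameter
    p = n / (n + mean)               # NB success probability
    dead = rng.random(size) < pi0
    counts = stats.nbinom.rvs(n, p, size=size, random_state=rng)
    return np.where(dead, 0, counts)

# Hypothetical parameters: 30% dead galaxies, mean SFR 5 (discrete units).
sfr = sample_zinb(pi0=0.3, mean=5.0, alpha=0.5, size=100_000, rng=rng)
zero_frac = float(np.mean(sfr == 0))
```

Note that the zero fraction exceeds pi0, because the negative binomial component itself also produces zeros.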

  14. Bimodal SLD Ice Accretion on a NACA 0012 Airfoil Model

    Science.gov (United States)

    Potapczuk, Mark; Tsao, Jen-Ching; King-Steen, Laura

    2016-01-01

    This presentation describes the results of ice accretion measurements on a NACA 0012 airfoil model, from the NASA Icing Research Tunnel, using an icing cloud composed of a bimodal distribution of Supercooled Large Droplets. The data consists of photographs, laser scans of the ice surface, and measurements of the mass of ice for each icing condition. The results of ice shapes accumulated as a result of exposure to an icing cloud with a bimodal droplet distribution were compared to the ice shapes resulting from an equivalent cloud composed of a droplet distribution with a standard bell curve shape.

  15. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
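
A compact sketch of the general idea (a Gaussian-process surrogate plus an acquisition function, here UCB, maximized over a grid to pick the next evaluation point). The RBF kernel, length scale, and toy "posterior" below are assumptions for illustration, not the authors' setup:

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel matrix between 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(xs, ys, grid, noise=1e-6):
    """GP posterior mean and std on a grid given observations (xs, ys)."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(grid, xs)
    mu = Ks @ np.linalg.solve(K, ys)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mu, np.sqrt(np.clip(var, 0.0, None))

def target(x):
    # Stand-in for an expensive-to-evaluate (unnormalized) posterior density.
    return np.exp(-0.5 * ((x - 0.7) / 0.1) ** 2) + 0.5 * np.exp(-0.5 * ((x - 0.2) / 0.05) ** 2)

grid = np.linspace(0.0, 1.0, 401)
xs = np.array([0.0, 0.5, 1.0])
ys = target(xs)
for _ in range(15):                        # BO loop: evaluate where UCB is maximal
    mu, sd = gp_posterior(xs, ys, grid)
    x_next = grid[np.argmax(mu + 2.0 * sd)]
    xs = np.append(xs, x_next)
    ys = np.append(ys, target(x_next))
best = xs[np.argmax(ys)]                   # best maximizer found so far
```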

  16. Measuring oxygen uptake in fishes with bimodal respiration.

    Science.gov (United States)

    Lefevre, S; Bayley, M; McKenzie, D J

    2016-01-01

    Respirometry is a robust method for measurement of oxygen uptake as a proxy for metabolic rate in fishes, and how species with bimodal respiration might meet their demands from water v. air has interested researchers for over a century. The challenges of measuring oxygen uptake from both water and air, preferably simultaneously, have been addressed in a variety of ways, which are briefly reviewed. These methods are not well-suited for the long-term measurements necessary to be certain of obtaining undisturbed patterns of respiratory partitioning, for example, to estimate traits such as standard metabolic rate. Such measurements require automated intermittent-closed respirometry that, for bimodal fishes, has only recently been developed. This paper describes two approaches in enough detail to be replicated by the interested researcher. These methods are for static respirometry. Measuring oxygen uptake by bimodal fishes during exercise poses specific challenges, which are described to aid the reader in designing experiments. The respiratory physiology and behaviour of air-breathing fishes is very complex and can easily be influenced by experimental conditions, and some general considerations are listed to facilitate the design of experiments. Air breathing is believed to have evolved in response to aquatic hypoxia and, probably, associated hypercapnia. The review ends by considering what realistic hypercapnia is, how hypercapnic tropical waters can become and how this might influence bimodal animals' gas exchange. © 2015 The Fisheries Society of the British Isles.

  17. Thermal induced carrier's transfer in bimodal size distribution InAs/GaAs quantum dots

    Science.gov (United States)

    Ilahi, B.; Alshehri, K.; Madhar, N. A.; Sfaxi, L.; Maaref, H.

    2018-06-01

    This work reports on the investigation of the thermally induced carrier transfer mechanism in vertically stacked bimodal size distribution InAs/GaAs quantum dots (QD). A model treating the QD as a localized states ensemble (LSE) has been employed to fit the atypical temperature dependence of the photoluminescence (PL) emission energies and linewidth. The results suggest that thermally activated carrier transfer within the large-size QD family occurs through the neighboring smaller-size QDs as an intermediate channel before direct carrier redistribution. The obtained activation energy also suggests a possible contribution of the wetting layer (WL) continuum states as a second mediator channel for carrier transfer.

  18. The effect of oxide particles on the strength and ductility of bulk iron with a bimodal grain size distribution

    Energy Technology Data Exchange (ETDEWEB)

    Casas, C.; Tejedor, R. [Department of Materials Science and Metallurgical Engineering, ETSEIB, Universitat Politècnica de Catalunya, Av. Diagonal 647, 08028 Barcelona (Spain); Rodríguez-baracaldo, R. [Department of Mechanical Engineering, Universidad Nacional de Colombia, Bogotá. Colombia (Colombia); Benito, J.A., E-mail: Josep.a.benito@upc.edu [Department of Materials Science and Metallurgical Engineering, EUETIB, Universitat Politècnica de Catalunya, Comte d' Urgell 187, 08036 Barcelona (Spain); Fundació CTM Centre Tecnològic de Manresa, Plaça de la Ciencia, 2, 08243 Manresa (Spain); Cabrera, J.M. [Department of Materials Science and Metallurgical Engineering, ETSEIB, Universitat Politècnica de Catalunya, Av. Diagonal 647, 08028 Barcelona (Spain); Fundació CTM Centre Tecnològic de Manresa, Plaça de la Ciencia, 2, 08243 Manresa (Spain)

    2015-03-11

    The strength and ductility of bulk nanostructured and ultrafine-grained iron containing 0.39% oxygen by weight was determined by tensile tests. Samples were obtained by consolidation of milled iron powder at 500 °C. Heat treatments were designed to cover a wide range of grain sizes spanning from 100 to 2000 nm with different percentages of coarse and nanostructured grain areas, which was defined as a bimodal grain size distribution. Transmission electron microscopy was used to determine the diameter, volume fraction and location of oxides in the microstructure. The strength was analysed following two approaches. The first one was based on the strong effect of oxides and involved the use of a mixed particle-grain boundary strengthening model, and the second one was based on simple grain boundary strengthening. The mixed model underestimated the strength of nanostructured samples, whereas the simple grain boundary model worked better. However, for specimens with a bimodal grain size, the fitting of the mixed model was better. In this case, the more effective particle strengthening was related to the dispersion of oxides inside the large ferrite grains. In addition, the bimodal samples showed an acceptable combination of strength and ductility. Again, the ferrite grains containing oxides promoted strain hardening due to the increase in dislocation activity.

  19. Transfer learning for bimodal biometrics recognition

    Science.gov (United States)

    Dan, Zhiping; Sun, Shuifa; Chen, Yanfei; Gan, Haitao

    2013-10-01

    Biometrics recognition aims to identify and predict new personal identities based on existing knowledge. As the use of multiple biometric traits of an individual enables more information to be used for recognition, it has been shown that multi-biometrics can produce higher accuracy than single biometrics. However, a common problem with traditional machine learning is that the training and test data should be in the same feature space and have the same underlying distribution. If the distributions and features differ between training and future data, model performance often drops. In this paper, we propose a transfer learning method for face recognition on bimodal biometrics. The training and test samples of bimodal biometric images are composed of visible light face images and infrared face images. Our algorithm transfers knowledge across feature spaces, relaxing the assumptions of a shared feature space and a shared underlying distribution by automatically learning a mapping between two different but somewhat similar face images. Experiments on face images show that the accuracy of face recognition is greatly improved by the proposed method compared with previous methods, demonstrating its effectiveness and robustness.

  20. Joint Probability Distributions for a Class of Non-Markovian Processes

    OpenAIRE

    Baule, A.; Friedrich, R.

    2004-01-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H.C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single time probability distributions to the case of N-time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.

  1. Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.

  2. Bimodal Nanoparticle Size Distributions Produced by Laser Ablation of Microparticles in Aerosols

    International Nuclear Information System (INIS)

    Nichols, William T.; Malyavanatham, Gokul; Henneke, Dale E.; O'Brien, Daniel T.; Becker, Michael F.; Keto, John W.

    2002-01-01

    Silver nanoparticles were produced by laser ablation of a continuously flowing aerosol of microparticles in nitrogen at varying laser fluences. Transmission electron micrographs were analyzed to determine the effect of laser fluence on the nanoparticle size distribution. These distributions exhibited bimodality, with a large number of particles in a mode at small sizes (3-6 nm) and a second, less populated mode at larger sizes (11-16 nm). Both modes shifted to larger sizes with increasing laser fluence, with the small-size mode shifting by 35% and the larger-size mode by 25% over a fluence range of 0.3-4.2 J/cm². Size histograms for each mode were found to be well represented by log-normal distributions. The distribution of mass displayed a striking shift from the large to the small size mode with increasing laser fluence. These results are discussed in terms of a model of nanoparticle formation from two distinct laser-solid interactions. Initially, laser vaporization of material from the surface leads to condensation of nanoparticles in the ambient gas. Material evaporation occurs until the plasma breakdown threshold of the microparticles is reached, generating a shock wave that propagates through the remaining material. Rapid condensation of the vapor in the low-pressure region occurs behind the traveling shock wave. Measurement of particle size distributions versus gas pressure in the ablation region, as well as versus microparticle feedstock size, confirmed the assignment of the larger size mode to surface vaporization and the smaller size mode to shock-formed nanoparticles.
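
A two-mode lognormal mixture of the kind reported above can be mimicked numerically; the mode medians, widths, and number fraction below are illustrative, chosen only to echo the reported 3-6 nm and 11-16 nm modes. Weighting by d³ shows how the mass can still concentrate in the sparsely populated large mode:

```python
import numpy as np

rng = np.random.default_rng(1)

def bimodal_lognormal_sample(n, f_small, med_s, sig_s, med_l, sig_l, rng):
    """Draw particle diameters (nm) from a two-mode lognormal mixture:
    a fraction f_small from the small ('shock-formed') mode, the rest
    from the large ('surface-vaporization') mode. Illustrative parameters."""
    small = rng.random(n) < f_small
    return np.where(small,
                    rng.lognormal(np.log(med_s), sig_s, n),
                    rng.lognormal(np.log(med_l), sig_l, n))

d = bimodal_lognormal_sample(200_000, 0.9, 4.0, 0.25, 13.0, 0.2, rng)
mass = d ** 3                                    # mass scales with diameter cubed
num_frac_small = float(np.mean(d < 8))           # number fraction below 8 nm
mass_frac_small = float(mass[d < 8].sum() / mass.sum())
```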

  3. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
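
For the Onemax case the fitness distribution under uniform bit-flip mutation can be computed exactly by elementary means: if the parent has k ones among n bits, the child fitness is (k − X) + Y with X ~ Bin(k, p) ones flipped off and Y ~ Bin(n − k, p) zeros flipped on. A direct convolution sketch (not the paper's Krawtchouk-polynomial derivation):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def onemax_mutation_pmf(n, k, p):
    """Exact pmf of the Onemax fitness of a length-n string with k ones
    after bit-flip mutation with per-bit flip probability p."""
    pmf = [0.0] * (n + 1)
    for x in range(k + 1):           # ones flipped to zero
        for y in range(n - k + 1):   # zeros flipped to one
            pmf[k - x + y] += binom_pmf(x, k, p) * binom_pmf(y, n - k, p)
    return pmf

pmf = onemax_mutation_pmf(n=10, k=7, p=0.1)
```

The expected child fitness is k(1 − p) + (n − k)p, which for n = 10, k = 7, p = 0.1 gives 6.6.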

  4. Joint probability distributions for a class of non-Markovian processes.

    Science.gov (United States)

    Baule, A; Friedrich, R

    2005-02-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H. C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single-time probability distributions to the case of N -time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.

  5. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
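
A minimal grid-based sketch of Bayesian interpretation of a bioassay measurement with a lognormal prior, assuming a Gaussian measurement model; the prior median, geometric standard deviation, and measurement values are hypothetical, not Los Alamos parameters:

```python
import numpy as np

def lognormal_prior(x, median, gsd):
    """Lognormal prior density with given median and geometric std dev."""
    return np.exp(-0.5 * (np.log(x / median) / np.log(gsd)) ** 2) / (
        x * np.log(gsd) * np.sqrt(2 * np.pi))

def posterior_mean(y, sigma, median=1.0, gsd=3.0):
    """Grid-based Bayes: true result x with lognormal prior,
    measurement y ~ N(x, sigma)."""
    x = np.linspace(1e-4, 50.0, 200_000)
    post = lognormal_prior(x, median, gsd) * np.exp(-0.5 * ((y - x) / sigma) ** 2)
    post /= post.sum()
    return float((x * post).sum())

m_strong = posterior_mean(y=10.0, sigma=0.5)   # precise measurement dominates
m_weak = posterior_mean(y=10.0, sigma=20.0)    # noisy measurement: prior pulls down
```

With a precise measurement the posterior mean sits near the measured value; with a noisy one it is pulled toward the prior, which is the practical point of choosing the prior carefully.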

  6. Some applications of the fractional Poisson probability distribution

    International Nuclear Information System (INIS)

    Laskin, Nick

    2009-01-01

    Physical and mathematical applications of the recently invented fractional Poisson probability distribution have been presented. As a physical application, a new family of quantum coherent states has been introduced and studied. As mathematical applications, we have developed the fractional generalization of Bell polynomials, Bell numbers, and Stirling numbers of the second kind. The appearance of fractional Bell polynomials is natural if one evaluates the diagonal matrix element of the evolution operator in the basis of newly introduced quantum coherent states. Fractional Stirling numbers of the second kind have been introduced and applied to evaluate the skewness and kurtosis of the fractional Poisson probability distribution function. A representation of the Bernoulli numbers in terms of fractional Stirling numbers of the second kind has been found. In the limit case when the fractional Poisson probability distribution becomes the Poisson probability distribution, all of the above listed developments and implementations turn into the well-known results of the quantum optics and the theory of combinatorial numbers.

  7. Incorporating Skew into RMS Surface Roughness Probability Distribution

    Science.gov (United States)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data, we confirm that the Gaussian distribution overestimates the mode and that application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
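
The mode-overestimation effect is easy to reproduce: for skewed (e.g. lognormal) data, a symmetric Gaussian fit places its mode at the sample mean, which exceeds the true lognormal mode exp(μ − σ²). The roughness numbers below are illustrative, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical skewed RMS roughness data (nm): lognormal with median 2.0 nm.
mu, sigma = np.log(2.0), 0.4
rms = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

gaussian_mode = float(rms.mean())        # a Gaussian fit puts its mode at the mean
true_mode = float(np.exp(mu - sigma**2)) # lognormal mode: exp(mu - sigma^2)
```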

  8. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  9. Bimodal emotion congruency is critical to preverbal infants' abstract rule learning.

    Science.gov (United States)

    Tsui, Angeline Sin Mei; Ma, Yuen Ki; Ho, Anna; Chow, Hiu Mei; Tseng, Chia-huei

    2016-05-01

    Extracting general rules from specific examples is important, as we must face the same challenge displayed in various formats. Previous studies have found that bimodal presentation of grammar-like rules (e.g. ABA) enhanced 5-month-olds' capacity to acquire a rule that infants failed to learn when the rule was presented with visual presentation of the shapes alone (circle-triangle-circle) or auditory presentation of the syllables (la-ba-la) alone. However, the mechanisms and constraints for this bimodal learning facilitation are still unknown. In this study, we used audio-visual relation congruency between bimodal stimulation to disentangle possible facilitation sources. We exposed 8- to 10-month-old infants to an AAB sequence consisting of visual faces with affective expressions and/or auditory voices conveying emotions. Our results showed that infants were able to distinguish the learned AAB rule from other novel rules under bimodal stimulation when the affects in audio and visual stimuli were congruently paired (Experiments 1A and 2A). Infants failed to acquire the same rule when audio-visual stimuli were incongruently matched (Experiment 2B) and when only the visual (Experiment 1B) or the audio (Experiment 1C) stimuli were presented. Our results highlight that bimodal facilitation in infant rule learning depends not only on better statistical probability and redundant sensory information, but also on the relational congruency of audio-visual information. A video abstract of this article can be viewed at https://m.youtube.com/watch?v=KYTyjH1k9RQ. © 2015 John Wiley & Sons Ltd.

  10. THE SLUGGS SURVEY: NGC 3115, A CRITICAL TEST CASE FOR METALLICITY BIMODALITY IN GLOBULAR CLUSTER SYSTEMS

    International Nuclear Information System (INIS)

    Brodie, Jean P.; Conroy, Charlie; Arnold, Jacob A.; Romanowsky, Aaron J.; Usher, Christopher; Forbes, Duncan A.; Strader, Jay

    2012-01-01

    Due to its proximity (9 Mpc) and the strongly bimodal color distribution of its spectroscopically well-sampled globular cluster (GC) system, the early-type galaxy NGC 3115 provides one of the best available tests of whether the color bimodality widely observed in GC systems generally reflects a true metallicity bimodality. Color bimodality has alternatively been attributed to a strongly nonlinear color-metallicity relation reflecting the influence of hot horizontal-branch stars. Here, we couple Subaru Suprime-Cam gi photometry with Keck/DEIMOS spectroscopy to accurately measure GC colors and a CaT index that measures the Ca II triplet. We find the NGC 3115 GC system to be unambiguously bimodal in both color and the CaT index. Using simple stellar population models, we show that the CaT index is essentially unaffected by variations in horizontal-branch morphology over the range of metallicities relevant to GC systems (and is thus a robust indicator of metallicity) and confirm bimodality in the metallicity distribution. We assess the existing evidence for and against multiple metallicity subpopulations in early- and late-type galaxies and conclude that metallicity bi/multimodality is common. We briefly discuss how this fundamental characteristic links directly to the star formation and assembly histories of galaxies.

  11. THE SLUGGS SURVEY: NGC 3115, A CRITICAL TEST CASE FOR METALLICITY BIMODALITY IN GLOBULAR CLUSTER SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    Brodie, Jean P.; Conroy, Charlie; Arnold, Jacob A.; Romanowsky, Aaron J. [University of California Observatories and Department of Astronomy and Astrophysics, University of California, Santa Cruz, CA 95064 (United States); Usher, Christopher; Forbes, Duncan A. [Centre for Astrophysics and Supercomputing, Swinburne University, Hawthorn, VIC 3122 (Australia); Strader, Jay, E-mail: brodie@ucolick.org [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States)

    2012-11-10

    Due to its proximity (9 Mpc) and the strongly bimodal color distribution of its spectroscopically well-sampled globular cluster (GC) system, the early-type galaxy NGC 3115 provides one of the best available tests of whether the color bimodality widely observed in GC systems generally reflects a true metallicity bimodality. Color bimodality has alternatively been attributed to a strongly nonlinear color-metallicity relation reflecting the influence of hot horizontal-branch stars. Here, we couple Subaru Suprime-Cam gi photometry with Keck/DEIMOS spectroscopy to accurately measure GC colors and a CaT index that measures the Ca II triplet. We find the NGC 3115 GC system to be unambiguously bimodal in both color and the CaT index. Using simple stellar population models, we show that the CaT index is essentially unaffected by variations in horizontal-branch morphology over the range of metallicities relevant to GC systems (and is thus a robust indicator of metallicity) and confirm bimodality in the metallicity distribution. We assess the existing evidence for and against multiple metallicity subpopulations in early- and late-type galaxies and conclude that metallicity bi/multimodality is common. We briefly discuss how this fundamental characteristic links directly to the star formation and assembly histories of galaxies.

  12. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st…".

  13. Stellar Rotation with Kepler and Gaia: Evidence for a Bimodal Star Formation History

    Science.gov (United States)

    Davenport, James

    2018-01-01

    Kepler stars with rotation periods measured via starspot modulations in their light curves have been matched against the astrometric data from Gaia Data Release 1. A total of 1,299 bright rotating stars were recovered, most with temperatures hotter than 5000 K. From these, 894 were selected as being near the main sequence. These main sequence stars show a bimodality in their rotation period distribution, centered around a ~600 Myr rotation-isochrone. This feature matches the bimodal period distribution found in cooler stars with Kepler, but was previously undetected for solar-type stars due to sample contamination by subgiant and binary stars. A tenuous connection between the rotation period and total proper motion is found, suggesting the period bimodality is due to the age distribution of stars within 300 pc of the Sun, rather than a phase of rapid angular momentum loss. I will discuss how the combination of Kepler/K2/TESS with Gaia will enable us to map the star formation history of our galactic neighborhood.

  14. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs

  15. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.
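The Poisson TCP model with inter-patient heterogeneity described in this abstract can be sketched in a few lines of Python; every parameter value below (initial clonogen number, mean radiosensitivity, spread s_a, dose) is a hypothetical illustration, not a value from the paper.

```python
import math
import random

random.seed(0)
# Poisson TCP with inter-patient heterogeneity in radiosensitivity;
# all parameter values below are hypothetical illustrations
N0 = 1.0e7                      # initial number of clonogenic cells
alpha_mean, s_a = 0.30, 0.06    # mean radiosensitivity (Gy^-1) and its spread
D = 64.0                        # uniform target dose (Gy)

def tcp_single(alpha):
    """Poisson TCP for one patient: exp(-expected number of surviving clonogens)."""
    return math.exp(-N0 * math.exp(-alpha * D))

# population TCP = mean of individual TCPs over the alpha distribution
alphas = [random.gauss(alpha_mean, s_a) for _ in range(20_000)]
tcp_population = sum(tcp_single(a) for a in alphas) / len(alphas)
tcp_homogeneous = tcp_single(alpha_mean)
```

Averaging over the spread flattens the dose-response curve: the population TCP falls below the TCP of a homogeneous population at the mean radiosensitivity, which is how the model produces clinically realistic slopes.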

  16. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combination of the input distributions is defined by the user by introducing the appropriate FORTRAN statements to the appropriate subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined
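STADIC itself is FORTRAN, but the Monte Carlo combination scheme it implements — sample each input distribution, apply a user-supplied combination function, then summarize the resulting sample — can be sketched in Python; the two input distributions and the product combination below are invented examples.

```python
import random
import statistics

random.seed(42)

# user-supplied combination function (here, a hypothetical product of two inputs)
def combine(x, y):
    return x * y

# draw a Monte Carlo sample from each input distribution and combine
samples = sorted(combine(random.gauss(10.0, 2.0), random.uniform(0.5, 1.5))
                 for _ in range(100_000))

mean = statistics.fmean(samples)
stdev = statistics.stdev(samples)
# empirical 90% confidence limits taken from the sorted sample
lo, hi = samples[int(0.05 * len(samples))], samples[int(0.95 * len(samples))]
```

The sorted sample doubles as an empirical CDF, so confidence limits come straight from order statistics with no analytical work on the combined distribution.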

  17. 'Bi-modal' isoscalar giant dipole strength in 58Ni

    International Nuclear Information System (INIS)

    Nayak, B.K.; Garg, U.; Hedden, M.; Koss, M.; Li, T.; Liu, Y.; Madhusudhana Rao, P.V.; Zhu, S.; Itoh, M.; Sakaguchi, H.; Takeda, H.; Uchida, M.; Yasuda, Y.; Yosoi, M.; Fujimura, H.; Fujiwara, M.; Hara, K.; Kawabata, T.; Akimune, H.; Harakeh, M.N.

    2006-01-01

    The strength distribution of the isoscalar giant dipole resonance (ISGDR) in 58Ni has been obtained over the energy range 10.5-49.5 MeV via extreme forward angle scattering (including 0 deg.) of 386 MeV α particles. We observe a 'bi-modal' E1 strength distribution for the first time in an A < 90 nucleus. The observed ISGDR strength distribution is in reasonable agreement with the predictions of a recent RPA calculation.

  18. Dynamical and statistical bimodality in nuclear fragmentation

    Science.gov (United States)

    Mallik, S.; Chaudhuri, G.; Gulminelli, F.

    2018-02-01

    The origin of bimodal behavior in the residue distribution experimentally measured in heavy ion reactions is reexamined using Boltzmann-Uehling-Uhlenbeck simulations. We suggest that, depending on the incident energy and impact parameter of the reaction, both entrance channel and exit channel effects can be at the origin of the observed behavior. Specifically, fluctuations in the reaction mechanism induced by fluctuations in the collision rate, as well as thermal bimodality directly linked to the nuclear liquid-gas phase transition, are observed in our simulations. Both phenomenologies were previously proposed in the literature but presented as incompatible and contradictory interpretations of the experimental measurements. These results indicate that heavy ion collisions at intermediate energies can be viewed as a powerful tool to study both bifurcations induced by out-of-equilibrium critical phenomena, as well as finite-size precursors of thermal phase transitions.

  19. Resolving the age bimodality of galaxy stellar populations on kpc scales

    NARCIS (Netherlands)

    Zibetti, Stefano; Gallazzi, Anna R.; Ascasibar, Y.; Charlot, S.; Galbany, L.; García Benito, R.; Kehrig, C.; de Lorenzo-Cáceres, A.; Lyubenova, M.; Marino, R. A.; Márquez, I.; Sánchez, S. F.; van de Ven, G.; Walcher, C. J.; Wisotzki, L.

    2017-01-01

    Galaxies in the local Universe are known to follow bimodal distributions in the global stellar population properties. We analyse the distribution of the local average stellar population ages of 654 053 sub-galactic regions resolved on ˜1 kpc scales in a volume-corrected sample of 394 galaxies, drawn

  20. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
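A minimal sketch of the truncated exponential slip model, using inverse-CDF sampling; the scale and maximum-slip values are hypothetical, not fitted to SRCMOD.

```python
import math
import random

random.seed(1)
scale, s_max = 1.0, 3.0   # hypothetical scale parameter and maximum slip (m)
trunc = 1.0 - math.exp(-s_max / scale)

# inverse-CDF sampling of an exponential truncated at s_max:
# F(x) = (1 - exp(-x/scale)) / trunc  for 0 <= x <= s_max
def sample_slip():
    u = random.random()
    return -scale * math.log(1.0 - u * trunc)

slips = [sample_slip() for _ in range(200_000)]
mean_slip = sum(slips) / len(slips)
# analytical mean of the truncated exponential, for comparison
mean_exact = scale - s_max * math.exp(-s_max / scale) / trunc
```

The truncation enforces the physical upper bound on slip that the abstract describes, while the exponential body keeps small slip values the most probable.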

  1. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes.We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserverd.

  2. Spectra of globular clusters in the Sombrero galaxy: evidence for spectroscopic metallicity bimodality

    Science.gov (United States)

    Alves-Brito, Alan; Hau, George K. T.; Forbes, Duncan A.; Spitler, Lee R.; Strader, Jay; Brodie, Jean P.; Rhode, Katherine L.

    2011-11-01

    We present a large sample of over 200 integrated-light spectra of confirmed globular clusters (GCs) associated with the Sombrero (M104) galaxy taken with the Deep Imaging Multi-Object Spectrograph (DEIMOS) instrument on the Keck telescope. A significant fraction of the spectra have signal-to-noise ratio levels high enough to allow measurements of GC metallicities using the method of Brodie & Huchra. We find a distribution of spectroscopic metallicities in the range -2.2 < [Fe/H] < +0.1 that is bimodal, with peaks at [Fe/H]˜-1.4 and -0.6. Thus, the GC system of the Sombrero galaxy, like a few other galaxies now studied in detail, reveals a bimodal spectroscopic metallicity distribution supporting the long-held belief that colour bimodality reflects two metallicity subpopulations. This further suggests that the transformation from optical colour to metallicity for old stellar populations, such as GCs, is not strongly non-linear. We also explore the radial and magnitude distribution with metallicity for GC subpopulations but small number statistics prevent any clear trends in these distributions. Based on observations obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California and the National Aeronautics and Space Administration.

  3. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by using the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
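A hypoexponential variate is a sum of independent exponential stages with distinct rates, and its coefficient of variation is always below unity; a minimal sketch with invented stage rates:

```python
import math
import random
import statistics

random.seed(7)
rates = [1.0, 2.0]   # hypothetical stage rates of the hypoexponential

# a hypoexponential variate is a sum of independent exponential stages
samples = [sum(random.expovariate(r) for r in rates) for _ in range(100_000)]
mean = statistics.fmean(samples)
cv = statistics.stdev(samples) / mean

# analytical first two moments: mean = sum(1/r_i), variance = sum(1/r_i**2)
mean_exact = sum(1.0 / r for r in rates)                        # 1.5
cv_exact = math.sqrt(sum(1.0 / r**2 for r in rates)) / mean_exact
```

Tuning the stage rates moves the coefficient of variation continuously inside (0, 1), which is the flexibility the abstract contrasts with the Erlang distribution's discrete CV values.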

  4. Acid-base and ion balance in fishes with bimodal respiration.

    Science.gov (United States)

    Shartau, R B; Brauner, C J

    2014-03-01

    The evolution of air breathing during the Devonian provided early fishes with bimodal respiration with a stable O2 supply from air. This was, however, probably associated with challenges and trade-offs in terms of acid-base balance and ionoregulation due to reduced gill:water interaction and changes in gill morphology associated with air breathing. While many aspects of acid-base and ionoregulation in air-breathing fishes are similar to water breathers, the specific cellular and molecular mechanisms involved remain largely unstudied. In general, reduced ionic permeability appears to be an important adaptation in the few bimodal fishes investigated but it is not known if this is a general characteristic. The kidney appears to play an important role in minimizing ion loss to the freshwater environment in the few species investigated, and while ion uptake across the gut is probably important, it has been largely unexplored. In general, air breathing in facultative air-breathing fishes is associated with an acid-base disturbance, resulting in an increased partial pressure of arterial CO2 and a reduction in extracellular pH (pHE); however, several fishes appear to be capable of tightly regulating tissue intracellular pH (pHI), despite a large sustained reduction in pHE, a trait termed preferential pHI regulation. Further studies are needed to determine whether preferential pHI regulation is a general trait among bimodal fishes and if this confers reduced sensitivity to acid-base disturbances, including those induced by hypercarbia, exhaustive exercise and hypoxia or anoxia. Additionally, elucidating the cellular and molecular mechanisms may yield insight into whether preferential pHI regulation is a trait ultimately associated with the early evolution of air breathing in vertebrates. © 2014 The Fisheries Society of the British Isles.

  5. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  6. Fitting the Probability Distribution Functions to Model Particulate Matter Concentrations

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2017-01-01

    The main objective of this study is to identify the best probability distribution and the plotting-position formula for modeling the concentrations of Total Suspended Particles (TSP) as well as Particulate Matter with an aerodynamic diameter < 10 μm (PM10). The best distribution provides the estimated probabilities of exceeding the threshold limit given by the Egyptian Air Quality Limit Value (EAQLV), and the number of exceedance days is also estimated. The standard limits of the EAQLV for TSP and PM10 concentrations are 24-h averages of 230 μg/m³ and 70 μg/m³, respectively. Five frequency distribution functions with seven plotting-position formulae (empirical cumulative distribution functions) are compared to fit the averages of daily TSP and PM10 concentrations in the year 2014 for Ain Sokhna city. The Quantile-Quantile plot (Q-Q plot) is used as a method for assessing how closely a data set fits a particular distribution. A proper probability distribution that represents the TSP and PM10 data has been chosen based on the statistical performance indicator values. The results show that the Hosking and Wallis plotting position combined with the Frechet distribution gave the best fit for TSP and PM10 concentrations, with the Burr distribution (using the same plotting position) following the Frechet distribution. The exceedance probability and days over the EAQLV are predicted using the Frechet distribution. In 2014, the exceedance probability and days for TSP concentrations are 0.052 and 19 days, respectively. Furthermore, the PM10 concentration is found to exceed the threshold limit on 174 days.
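The exceedance calculation described here can be sketched with a Fréchet CDF; the shape, scale, and location values below are invented stand-ins, not the parameters fitted in the study.

```python
import math

# hypothetical Frechet parameters, standing in for values fitted to daily TSP data
alpha, scale, loc = 2.5, 120.0, 0.0
threshold = 230.0   # EAQLV 24-h limit for TSP (ug/m^3)

def frechet_cdf(x):
    """Frechet (inverse Weibull) CDF: exp(-((x - loc)/scale)**(-alpha))."""
    return math.exp(-((x - loc) / scale) ** (-alpha))

p_exceed = 1.0 - frechet_cdf(threshold)   # probability a day exceeds the limit
expected_days = 365.0 * p_exceed          # expected exceedance days per year
```

Once a distribution is fitted, the exceedance probability is just one minus its CDF at the regulatory threshold, and the expected number of exceedance days follows by multiplying by the number of days.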

  7. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of day and days of the week. This research focuses on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel-time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distributions as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP 21).
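The convolution step for route travel time can be illustrated on discrete per-link travel-time PMFs. The two toy distributions below are invented, and link independence is assumed for simplicity, whereas the study uses link-to-link conditional probabilities to capture speed correlation.

```python
# discrete travel-time PMFs (minutes -> probability) for two successive links
link_a = {10: 0.6, 12: 0.3, 15: 0.1}
link_b = {5: 0.5, 7: 0.5}

def convolve(pmf_x, pmf_y):
    """Distribution of the sum of two independent discrete travel times."""
    out = {}
    for tx, px in pmf_x.items():
        for ty, py in pmf_y.items():
            out[tx + ty] = out.get(tx + ty, 0.0) + px * py
    return out

route = convolve(link_a, link_b)
mean_route = sum(t * p for t, p in route.items())
```

To model correlated links, `pmf_y` would be replaced by a conditional PMF that depends on the upstream travel time, but the summation structure stays the same.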

  8. The Taylor-expansion method of moments for the particle system with bimodal distribution

    Directory of Open Access Journals (Sweden)

    Liu Yan-Hua

    2013-01-01

    This paper derives the multipoint Taylor expansion method of moments for the bimodal particle system. The collision effects are modeled by the internal and external coagulation terms. Simple theory and numerical tests are performed to verify the effect of the current model.
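The abstract gives little detail of the Taylor-expansion closure itself; as a minimal illustration of describing a bimodal particle population by its moments, the sketch below computes raw moments of a hypothetical two-mode Gaussian mixture (all weights and mode parameters are invented, and this is not the TEMOM closure).

```python
# raw moments of a bimodal particle-size distribution modeled as a weighted
# two-mode mixture; weights, means, and sigmas below are hypothetical
modes = [(0.7, 1.0, 0.1), (0.3, 3.0, 0.3)]   # (weight, mean, sigma) per mode

def mixture_moment(k):
    """k-th raw moment (k = 0, 1, 2) of a mixture of Gaussian modes."""
    total = 0.0
    for w, mu, sigma in modes:
        if k == 0:
            total += w
        elif k == 1:
            total += w * mu
        elif k == 2:
            total += w * (mu**2 + sigma**2)
    return total

m0, m1, m2 = mixture_moment(0), mixture_moment(1), mixture_moment(2)
variance = m2 - m1**2
```

Method-of-moments schemes evolve a small set of such moments per mode instead of the full size distribution, which is what makes bimodal coagulation problems tractable.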

  9. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
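CUMBIN is written in C; the cumulative binomial sum it computes, and its use for k-out-of-n system reliability, can be re-implemented independently in a few lines (this is a sketch, not the NASA code).

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def k_out_of_n_reliability(k, n, p_component):
    """System works if at least k of its n components work: P(X >= k)."""
    return 1.0 - binom_cdf(k - 1, n, p_component)

# e.g. a 2-out-of-3 system of components that each work with probability 0.9
r = k_out_of_n_reliability(2, 3, 0.9)
```

For large n this direct summation loses accuracy and speed, which is why dedicated routines like CUMBIN use numerically careful recurrences instead.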

  10. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that with the increase of probability, the expected effect of ASMC to increase the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge seems to be smaller for any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be well fitted by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of arranging in advance the runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
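The Green-Ampt component of the GABS model can be illustrated by solving the implicit Green-Ampt relation F = Ks·t + S·ln(1 + F/S) for the cumulative infiltration F under ponded conditions; all soil parameter values below are hypothetical.

```python
import math

# hypothetical Green-Ampt parameters for a loamy soil
Ks = 10.0        # saturated hydraulic conductivity (mm/h)
psi = 110.0      # wetting-front suction head (mm)
d_theta = 0.3    # soil moisture deficit (set by the antecedent condition)
t = 1.0          # elapsed time under ponding (h)
S = psi * d_theta   # suction-deficit product (mm)

# implicit Green-Ampt relation F = Ks*t + S*ln(1 + F/S),
# solved by fixed-point iteration (the map is a contraction for F > 0)
F = Ks * t
for _ in range(200):
    F = Ks * t + S * math.log(1.0 + F / S)

f_rate = Ks * (1.0 + S / F)   # infiltration capacity at time t (mm/h)
```

The antecedent soil moisture condition enters through `d_theta`: a wetter soil has a smaller deficit, less infiltration, and hence a larger surface runoff peak, which is the sensitivity the abstract investigates.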

  11. Galex Lyman-alpha Emitters: Physical Properties, Luminosity Bimodality, And Morphologies.

    Science.gov (United States)

    Mallery, Ryan P.

    2010-01-01

    The Galaxy Evolution Explorer spectroscopic survey has uncovered a large statistically significant sample of Lyman-alpha emitters at z ∼ 0.3. ACS imaging of these sources in the COSMOS and AEGIS deep fields reveals that these Lyman-alpha emitters consist of two distinct galaxy morphologies: face-on spiral galaxies and compact starburst/merging systems. The morphology bimodality also results in a bimodal distribution of optical luminosity. A comparison between the UV photometry and MIPS 24 micron detections of these sources indicates that they are bluer, and have less dust extinction than similar star forming galaxies that lack Lyman-alpha detection. Our findings show how the global gas and dust distribution of star forming galaxies inhibits Lyman-alpha emission in star forming galaxies. GALEX is a NASA Small Explorer, launched in April 2003. We gratefully acknowledge NASA's support for construction, operation, and science analysis for the GALEX mission, developed in cooperation with the CNES of France and the Korean Ministry of Science and Technology.

  12. Most probable degree distribution at fixed structural entropy

    Indian Academy of Sciences (India)

    Here we derive the most probable degree distribution emerging … the structural entropy of power-law networks is an increasing function of the exponent … partition function Z of the network as the sum over all degree distributions, with given energy.

  13. Geometry of q-Exponential Family of Probability Distributions

    Directory of Open Access Journals (Sweden)

    Shun-ichi Amari

    2011-06-01

    The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions obeying the power law rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution or the q-exponential family by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and the conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori probability) estimator.
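The q-exponential generalization underlying this family, exp_q(x) = [1 + (1−q)x]^{1/(1−q)} (set to 0 when the base is non-positive, for q > 1), reduces to the ordinary exponential as q → 1; a minimal numerical sketch:

```python
import math

def exp_q(x, q):
    """q-exponential: [1 + (1-q)*x]**(1/(1-q)) for positive base, else 0;
    reduces to exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

y_power = exp_q(1.0, 1.5)          # power-law regime: 0.5**(-2) = 4
y_limit = exp_q(1.0, 1.0 + 1e-6)   # should be close to e
```

For q > 1 the tails of exp_q(-x) decay as a power law rather than exponentially, which is exactly the behavior Tsallis statistics was introduced to capture.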

  14. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. The goodness of fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness of fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness of fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness of fit tests.
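    A minimal sketch of the workflow described above (invented data and an assumed shortlist of candidate models; scipy's generic fitting API, not the authors' software) fits several distributions by maximum likelihood and ranks them by their Kolmogorov-Smirnov distance:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=500)  # stand-in for the measured data set

# Candidate models to compare (an assumed shortlist)
candidates = {"norm": stats.norm, "lognorm": stats.lognorm, "gamma": stats.gamma}

results = {}
for name, dist in candidates.items():
    params = dist.fit(data)                     # maximum-likelihood fit
    ks = stats.kstest(data, name, args=params)  # K-S "distance" between data and fit
    results[name] = ks.statistic

# Smaller K-S statistic = better fit; this ordering is the basis for model selection
for name in sorted(results, key=results.get):
    print(f"{name}: D = {results[name]:.4f}")
```

    Anderson-Darling (`stats.anderson`) and chi-squared statistics can be ranked the same way; each test weights the tails differently, which is why the paper compares all three.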

  15. Nonlinear Color–Metallicity Relations of Globular Clusters. VII. Nonlinear Absorption-line Index versus Metallicity Relations and Bimodal Index Distributions of NGC 5128 Globular Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sooyoung; Yoon, Suk-Jin, E-mail: sjyoon0691@yonsei.ac.kr [Department of Astronomy and Center for Galaxy Evolution Research, Yonsei University, Seoul 120-749 (Korea, Republic of)

    2017-07-01

    Spectroscopy on the globular cluster (GC) system of NGC 5128 revealed bimodality in absorption-line index distributions of its old GCs. GC bimodality is a widely observed and studied phenomenon, conventionally interpreted as evidence that the host galaxy harbors two distinct metallicity groups, with direct implications for models of host galaxy formation and evolution. Such a conventional view of GC bimodality has mainly been based on photometry. Recent GC photometric data, however, presented an alternative perspective in which the nonlinear metallicity-to-color transformation is responsible for the color bimodality of GC systems. Here we apply the same line of analysis to the spectral indices and examine the absorption-line index versus metallicity relations for the NGC 5128 GC system. NGC 5128 GCs display nonlinearity in the metallicity-index planes, most prominently for the Balmer lines and by a non-negligible degree for the metallicity-sensitive magnesium line. We demonstrate that the observed spectroscopic division of NGC 5128 GCs can be caused by the nonlinear nature of the metallicity-to-index conversions, and thus one does not need to resort to two separate GC subgroups. Our analysis incorporating this nonlinearity provides a new perspective on the structure of NGC 5128's GC system, and a further piece of the global picture of the formation of GC systems and their host galaxies.

  16. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
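    The single-integral form can be sketched as follows (the friction distributions and their parameters here are hypothetical, and the normal/normal pairing is only one of the arbitrary combinations the method permits): the probability of slip is P(available < required), evaluated with the trapezoidal rule.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical friction distributions; the method accepts any combination
f_req = stats.norm(loc=0.20, scale=0.05)    # required friction coefficient (gait demand)
f_avail = stats.norm(loc=0.45, scale=0.10)  # available friction coefficient (surface supply)

# Single-integral form: P(slip) = P(avail < req) = integral of f_req(u) * F_avail(u) du
u = np.linspace(0.0, 1.0, 2001)
integrand = f_req.pdf(u) * f_avail.cdf(u)
p_slip = trapezoid(integrand, u)            # trapezoidal rule, as proposed in the paper
print(f"P(slip) = {p_slip:.4e}")
```

    For this normal/normal case the answer can be checked analytically, P(slip) = Phi((mu_req - mu_avail) / sqrt(sigma_req^2 + sigma_avail^2)), roughly 0.013 here, which the trapezoidal estimate reproduces; the value of the single-integral form is that it still works when neither distribution is normal.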

  17. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of Bayes' theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  18. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of Bayes' theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
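    The "simple numerical approach" to Bayes' theorem that the report mentions can be illustrated with a grid calculation; the parameter, prior, noise level, and observations below are invented purely for illustration, not taken from the report:

```python
import numpy as np
from scipy import stats

# Uncertain input parameter theta with a diffuse prior belief (hypothetical setup)
theta = np.linspace(-4.0, 4.0, 4001)
dtheta = theta[1] - theta[0]
prior = stats.norm(0.0, 1.0).pdf(theta)

# New information: three noisy measurements with known noise sigma = 0.5
observations = [0.4, 0.7, 0.5]
likelihood = np.ones_like(theta)
for x in observations:
    likelihood *= stats.norm(theta, 0.5).pdf(x)

# Bayes' theorem on the grid: posterior proportional to prior * likelihood
posterior = prior * likelihood
posterior /= posterior.sum() * dtheta          # renormalize to a proper density

post_mean = (theta * posterior).sum() * dtheta
print(f"posterior mean = {post_mean:.3f}")
```

    Because this toy case is normal-normal conjugate, the grid answer can be checked in closed form: posterior mean = (sum of x / sigma^2) / (1/tau^2 + n/sigma^2) = 6.4/13, about 0.49.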

  19. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be smaller than the spacing estimated here.

  20. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, and neural networks.

  1. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  2. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
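    For a single grid cell, the core of the approach (infinite-slope stability with a steady-state wetness term, sampled by Monte Carlo) can be sketched as below. All parameter values and distributions are invented placeholders, not the calibrated inputs used in the paper or in Landlab's component:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                                    # Monte Carlo sample size

# Fixed cell properties (hypothetical values)
theta = np.radians(35.0)                       # slope angle
H = 1.0                                        # soil depth [m]
rho_s, rho_w, g = 1800.0, 1000.0, 9.81         # soil/water density [kg/m^3], gravity

# Uncertain inputs drawn from assumed probability distributions
phi = np.radians(rng.uniform(28.0, 40.0, N))   # internal friction angle
C = rng.uniform(2000.0, 8000.0, N)             # total cohesion [Pa]
w = rng.uniform(0.0, 1.0, N)                   # relative wetness from annual max recharge

# Infinite-slope factor of safety with saturation ratio w
FS = (np.tan(phi) / np.tan(theta)) * (1.0 - w * rho_w / rho_s) \
     + C / (rho_s * g * H * np.sin(theta) * np.cos(theta))

p_failure = float(np.mean(FS < 1.0))           # annual probability of initiation
print(f"P(FS < 1) = {p_failure:.3f}")
```

    Repeating this calculation for every cell of the digital elevation model, with hydrologic forcing supplied by the macroscale model, yields the spatial probability maps described in the abstract.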

  3. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.

    Science.gov (United States)

    Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A

    2016-07-27

    The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors 0270-6474/16/367817-12$15.00/0.

  4. Significant Corrosion Resistance in an Ultrafine-Grained Al6063 Alloy with a Bimodal Grain-Size Distribution through a Self-Anodic Protection Mechanism

    Directory of Open Access Journals (Sweden)

    Mahdieh Shakoori Oskooie

    2016-12-01

    Full Text Available The bimodal microstructures of Al6063 consisting of 15, 30, and 45 vol. % coarse-grained (CG) bands within the ultrafine-grained (UFG) matrix were synthesized via blending of high-energy mechanically milled powders with unmilled powders, followed by hot powder extrusion. The corrosion behavior of the bimodal specimens was assessed by means of polarization, steady-state cyclic polarization and impedance tests, whereas their microstructural features and corrosion products were examined using optical microscopy (OM), scanning transmission electron microscopy (STEM), field emission scanning electron microscopy (FE-SEM), electron backscattered diffraction (EBSD), energy dispersive spectroscopy (EDS), and X-ray diffraction (XRD) techniques. The bimodal Al6063 containing 15 vol. % CG phase exhibits the highest corrosion resistance among the bimodal microstructures and even superior electrochemical behavior compared with the plain UFG and CG materials in the 3.5% NaCl solution. The enhanced corrosion resistance is attributed to the optimum cathode-to-anode surface area ratio that gives rise to the formation of an effective galvanic couple between CG areas and the UFG matrix. The operational galvanic coupling leads to the domination of a "self-anodic protection system" in the bimodal microstructure and consequently forms a uniform, thick, protective passive layer over it. In contrast, the 45 vol. % CG bimodal specimen shows the least corrosion resistance due to catastrophic galvanic corrosion in UFG regions. The observed results for UFG Al6063 suggest that metallurgical tailoring of the grain structure in terms of bimodal microstructures leads to simultaneous enhancement in the electrochemical behavior and mechanical properties of passivable alloys, properties that are usually inversely correlated. The mechanism of self-anodic protection for passivable metals with bimodal microstructures is discussed here for the first time.

  5. Language choice in bimodal bilingual development

    Directory of Open Access Journals (Sweden)

    Diane eLillo-Martin

    2014-10-01

    Full Text Available Bilingual children develop sensitivity to the language used by their interlocutors at an early age, reflected in differential use of each language by the child depending on their interlocutor. Factors such as discourse context and relative language dominance in the community may mediate the degree of language differentiation in preschool-age children. Bimodal bilingual children, acquiring both a sign language and a spoken language, have an even more complex situation. Their Deaf parents vary considerably in access to the spoken language. Furthermore, in addition to code-mixing and code-switching, they use code-blending – expressions in both speech and sign simultaneously – an option uniquely available to bimodal bilinguals. Code-blending is analogous to code-switching sociolinguistically, but is also a way to communicate without suppressing one language. For adult bimodal bilinguals, complete suppression of the non-selected language is cognitively demanding. We expect that bimodal bilingual children also find suppression difficult, and use blending rather than suppression in some contexts. We also expect relative community language dominance to be a factor in children's language choices. This study analyzes longitudinal spontaneous production data from four bimodal bilingual children and their Deaf and hearing interlocutors. Even at the earliest observations, the children produced more signed utterances with Deaf interlocutors and more speech with hearing interlocutors. However, while three of the four children produced >75% speech alone in speech target sessions, they produced <25% sign alone in sign target sessions. All four produced bimodal utterances in both, but more frequently in the sign sessions, potentially because they find suppression of the dominant language more difficult. Our results indicate that these children are sensitive to the language used by their interlocutors, while showing considerable influence from the dominant community language.

  6. The Centaurus cluster of galaxies. II. The bimodal-velocity structure

    International Nuclear Information System (INIS)

    Lucey, J.R.; Currie, M.J.; Dickens, R.J.

    1985-09-01

    This is the second paper in a series that describes an extensive study of the Centaurus cluster of galaxies. The paper concerns the bimodal velocity distribution of the galaxies in the cluster. The likely location of the two main cluster components is discussed. The data strongly favours the hypothesis that the two components lie within the same cluster. (UK)

  7. Bimodal Programming: A Survey of Current Clinical Practice.

    Science.gov (United States)

    Siburt, Hannah W; Holmes, Alice E

    2015-06-01

    The purpose of this study was to determine the current clinical practice in approaches to bimodal programming in the United States. To be specific, if clinicians are recommending bimodal stimulation, who programs the hearing aid in the bimodal condition, and what method is used for programming the hearing aid? An 11-question online survey was created and sent via email to a comprehensive list of cochlear implant programming centers in the United States. The survey was sent to 360 recipients. Respondents in this study represented a diverse group of clinical settings (response rate: 26%). Results indicate little agreement about who programs the hearing aids, when they are programmed, and how they are programmed in the bimodal condition. Analysis of small versus large implant centers indicated small centers are less likely to add a device to the contralateral ear. Although a growing number of cochlear implant recipients choose to wear a hearing aid on the contralateral ear, there is inconsistency in the current clinical approach to bimodal programming. These survey results provide evidence of large variability in the current bimodal programming practices and indicate a need for more structured clinical recommendations and programming approaches.

  8. Probability distributions for Markov chain based quantum walks

    Science.gov (United States)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesàro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical to the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin, discussed in the literature. The paper closes with an outlook on possible future directions.

  9. Evaluation of burst probability for tubes by Weibull distributions

    International Nuclear Information System (INIS)

    Kao, S.

    1975-10-01

    The investigation of candidate distributions that best describe the burst pressure failure probability characteristics of nuclear power steam generator tubes has been continued. To date it has been found that the Weibull distribution provides an acceptable fit to the available data from both the statistical and physical viewpoints. The reasons for the acceptability of the Weibull distribution are stated, together with the results of tests for the suitability of fit. In exploring the acceptability of the Weibull distribution for the fitting, a graphical method called the "density-gram" is employed instead of the usual histogram. With this method a more sensible graphical observation of the empirical density may be made for cases where the available data are very limited. Based on these methods, estimates of failure pressure are made for the left-tail probabilities.
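    A sketch of the basic procedure (the simulated burst-pressure data and the operating limit are hypothetical, and this is a generic maximum-likelihood Weibull fit, not the report's own estimation code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical burst-pressure sample [MPa] for a tube population
bursts = stats.weibull_min(c=8.0, scale=40.0).rvs(size=60, random_state=rng)

# Two-parameter Weibull fit by maximum likelihood (location pinned at zero)
shape, loc, scale = stats.weibull_min.fit(bursts, floc=0.0)

# Left-tail failure probability: P(burst pressure < assumed operating limit)
p_limit = 25.0
p_fail = stats.weibull_min.cdf(p_limit, shape, loc=loc, scale=scale)
print(f"shape = {shape:.2f}, scale = {scale:.2f}, P(burst < {p_limit}) = {p_fail:.4f}")
```

    The left-tail probability is exactly the quantity of interest here: the chance that a tube bursts below a given pressure, read off the fitted CDF rather than from the sparse empirical tail.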

  10. Tribological properties and morphology of bimodal elastomeric nitrile butadiene rubber networks

    International Nuclear Information System (INIS)

    Guo, Yin; Wang, Jiaxu; Li, Kang; Ding, Xingwu

    2013-01-01

    Highlights: • Bimodal elastomeric NBR as a new material was developed. • The structure of bimodal elastomeric NBR networks was determined. • The relationship between structure and mechanical properties was investigated. • The tribological properties and mechanisms of bimodal NBR were analyzed. • The benefits of bimodal NBR in the field of tribology were discussed. - Abstract: Bimodal nitrile butadiene rubber (NBR) was examined in this study. The molecular structure was determined by dynamic mechanical analysis and transmission electron microscopy. The relationship between the structure and the mechanical properties related to elastomeric tribological properties was investigated. The properties and the mechanisms of friction and wear of bimodal elastomeric NBR networks were also analyzed. The lubricating characteristics of bimodal NBR networks were revealed based on the mechanisms of friction and wear. Results show that bimodal NBR networks are similar to bimodal polydimethylsiloxane networks. The form and density of the network structure can be controlled from elastomeric networks to thermosetting resin networks. The mechanical properties of bimodal NBR networks, such as elasticity, elongation at break, fatigue characteristic, tensile strength, elastic modulus, and thermal stability, can be precisely controlled following the variation in network structure. The friction, wear, and lubrication of bimodal NBR networks can be clearly described according to the principles of tribology. Common elastomers cannot simultaneously reduce friction and wear because of the different mechanisms of friction and wear; however, bimodal elastomer networks can efficiently address this problem.

  11. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    Smith, D.L.; Naberejnev, D.G.

    2004-01-01

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a contrived numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large, a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication.
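    The "disturbing anomaly" is easy to reproduce numerically. In the sketch below (parameter values chosen only for illustration), the naive mean ± standard deviation band extends below zero for an inherently positive quantity, while a 95% interval symmetric in probability stays positive:

```python
import numpy as np
from scipy import stats

mu, sigma = 0.0, 1.5                 # parameters of ln(X); large sigma = large uncertainty
dist = stats.lognorm(s=sigma, scale=np.exp(mu))

mean, std = dist.mean(), dist.std()
naive = (mean - std, mean + std)     # "best value +/- error" interpretation

ci = dist.ppf([0.025, 0.975])        # 95% CI, 2.5% probability in each tail

print("mean +/- std:", naive)        # lower bound is negative: the anomaly
print("95% CI      :", ci)           # strictly positive, as the variable must be
```

    The symmetric-in-probability interval is simply [exp(mu - z*sigma), exp(mu + z*sigma)] with z = 1.96, i.e. the normal interval for ln(X) mapped back through the exponential.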

  12. Simulation of Daily Weather Data Using Theoretical Probability Distributions.

    Science.gov (United States)

    Bruhn, J. A.; Fry, W. E.; Fick, G. W.

    1980-09-01

    A computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight. In the weather model Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity and total solar radiation. Each weather variable is described by a known theoretical probability distribution but the values of the parameters describing each distribution are dependent on the occurrence of rainfall. Precipitation occurrence is described by a first-order Markov chain. The amount of rain, given that rain has occurred, is described by a gamma probability distribution. Maximum and minimum temperature are simulated with a trivariate normal probability distribution involving maximum temperature on the previous day, maximum temperature on the current day and minimum temperature on the current day. Parameter values for this distribution are dependent on the occurrence of rain on the previous day. Both minimum relative humidity and total solar radiation are assumed to be normally distributed. The values of the parameters describing the distribution of minimum relative humidity are dependent on rainfall occurrence on the previous day and current day. Parameter values for total solar radiation are dependent on the occurrence of rain on the current day. The assumptions made during model construction were found to be appropriate for actual weather data from Geneva, New York. The performance of the weather model was evaluated by comparing the cumulative frequency distributions of simulated weather data with the distributions of actual weather data from Geneva, New York and Fort Collins, Colorado. For each location, simulated weather data were similar to actual weather data in terms of mean response, variability and autocorrelation. The possible applications of this model when used with models of other components of the agro-ecosystem are discussed.
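    The precipitation component described above (a first-order Markov chain for occurrence, gamma-distributed amounts on wet days) can be sketched as follows; the transition probabilities and gamma parameters are invented, not those fitted to the Geneva or Fort Collins data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# First-order Markov chain for daily precipitation occurrence (assumed values)
p_wet_given_wet = 0.60
p_wet_given_dry = 0.25

# Rain amount, given that rain occurred: gamma distribution (assumed parameters, mm)
rain_amount = stats.gamma(a=0.8, scale=8.0)

days, wet = 365, False
rainfall = np.zeros(days)
for d in range(days):
    p_wet = p_wet_given_wet if wet else p_wet_given_dry
    wet = rng.random() < p_wet              # Markov step: depends only on yesterday
    if wet:
        rainfall[d] = rain_amount.rvs(random_state=rng)

print(f"wet days: {(rainfall > 0).sum()}, total: {rainfall.sum():.1f} mm")
```

    The remaining variables (temperatures via a trivariate normal conditioned on yesterday's rain, humidity and radiation as conditional normals) would be layered on top of this occurrence process inside the same daily loop.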

  13. Probabilistic analysis and fatigue damage assessment of offshore mooring system due to non-Gaussian bimodal tension processes

    Science.gov (United States)

    Chang, Anteng; Li, Huajun; Wang, Shuqing; Du, Junfeng

    2017-08-01

    Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are firstly investigated, in which the discrepancy arising from Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on present analytical spectral methods are selected to express the statistical distribution of the mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of Dirlik and Tovo-Benasciutti formulas are suitable for separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment due to non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is further validated using a numerical case study of a moored semi-submersible platform.

  14. Diverse Kir expression contributes to distinct bimodal distribution of resting potentials and vasotone responses of arterioles.

    Directory of Open Access Journals (Sweden)

    Yuqin Yang

    Full Text Available The resting membrane potential (RP) of vascular smooth muscle cells (VSMCs) is a major determinant of cytosolic calcium concentration and vascular tone. The heterogeneity of RPs and its underlying mechanism among different vascular beds remain poorly understood. We compared the RPs and vasomotion properties between the guinea pig spiral modiolar artery (SMA), brain arterioles (BA) and mesenteric arteries (MA). We found: (1) RPs showed a robust bimodal distribution peaked at -76 and -40 mV evenly in the SMA, unevenly at -77 and -51 mV in the BA and ~-71 and -52 mV in the MA. Ba²⁺ (0.1 mM) eliminated their high RP peaks ~-75 mV. (2) Cells with low RP (~-45 mV) hyperpolarized in response to 10 mM extracellular K⁺, while cells with a high RP depolarized, and cells with intermediate RP (~-58 mV) displayed an initial hyperpolarization followed by prolonged depolarization. Moderate high K⁺ typically induced dilation, constriction and a dilation followed by constriction in the SMA, MA and BA, respectively. (3) Boltzmann-fit analysis of the Ba²⁺-sensitive inward rectifier K⁺ (Kir) whole-cell current showed that the maximum Kir conductance density significantly differed among the vessels, and the half-activation voltage was significantly more negative in the MA. (4) Corresponding to the whole-cell data, computational modeling simulated the three RP distribution patterns and the dynamics of RP changes obtained experimentally, including the regenerative swift shifts between the two RP levels after reaching a threshold. (5) Molecular work revealed strong Kir2.1 and Kir2.2 transcripts and Kir2.1 immunolabeling in all 3 vessels, while Kir2.3 and Kir2.4 transcript levels varied. We conclude that a dense expression of functional Kir2.X channels underlies the more negative RPs in endothelial cells and a subset of VSMCs in these arterioles, and that the heterogeneous Kir function is primarily responsible for the distinct bimodal RPs among these arterioles. The fast Kir…

  15. The Development of Bimodal Bilingualism: Implications for Linguistic Theory.

    Science.gov (United States)

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2016-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.

  16. Alert-derivative bimodal space power and propulsion systems

    International Nuclear Information System (INIS)

    Houts, M.G.; Ranken, W.A.; Buksa, J.J.

    1994-01-01

    Safe, reliable, low-mass bimodal space power and propulsion systems could have numerous civilian and military applications. This paper discusses potential bimodal systems that could be derived from the ALERT space fission power supply concept. These bimodal concepts have the potential for providing 5 to 10 kW of electrical power and a total impulse of 100 MN-s at an average specific impulse of 770 s. System mass is on the order of 1000 kg.

  17. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented together with two solution strategies: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  18. Human mammary epithelial cells exhibit a bimodal correlated random walk pattern.

    Science.gov (United States)

    Potdar, Alka A; Jeon, Junhwan; Weaver, Alissa M; Quaranta, Vito; Cummings, Peter T

    2010-03-10

    Organisms, at scales ranging from unicellular to mammals, have been known to exhibit foraging behavior described by random walks whose segments conform to Lévy or exponential distributions. For the first time, we present evidence that single cells (mammary epithelial cells) that exist in multi-cellular organisms (humans) follow a bimodal correlated random walk (BCRW). Cellular tracks of MCF-10A pBabe, neuN and neuT random migration on 2-D plastic substrates, analyzed using bimodal analysis, were found to reveal the BCRW pattern. We find two types of exponentially distributed correlated flights (corresponding to what we refer to as the directional and re-orientation phases), each having its own correlation between move step-lengths within flights. The exponential distribution of flight lengths was confirmed using different analysis methods (logarithmic binning with normalization, survival frequency plots and maximum likelihood estimation). Because of the presence of a non-uniform turn angle distribution of move step-lengths within a flight and two different types of flights, we propose that the epithelial random walk is a BCRW comprising two alternating modes with varying degrees of correlation, rather than a simple persistent random walk. A BCRW model rather than a simple persistent random walk correctly matches the super-diffusivity in the cell migration paths, as indicated by simulations based on the BCRW model.
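
    A minimal simulation of a two-mode correlated random walk of the kind described above can be sketched as follows; the step-length means, turn-angle spreads and mode-switching probability are invented for illustration, not fitted to the MCF-10A tracks.

```python
import math
import random

def simulate_bcrw(n_steps, rng):
    """Bimodal correlated random walk: alternate between a 'directional'
    mode (long exponential flights, small turn angles) and a
    're-orientation' mode (short flights, broad turn angles)."""
    modes = {
        "directional":    {"mean_step": 5.0, "turn_sd": 0.2},
        "re-orientation": {"mean_step": 1.0, "turn_sd": 1.5},
    }
    x = y = heading = 0.0
    path = [(x, y)]
    mode = "directional"
    while len(path) <= n_steps:
        m = modes[mode]
        step = rng.expovariate(1.0 / m["mean_step"])  # exponential step lengths
        heading += rng.gauss(0.0, m["turn_sd"])       # correlated turning
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        path.append((x, y))
        # switch modes with some probability, producing alternating flights
        if rng.random() < 0.3:
            mode = "re-orientation" if mode == "directional" else "directional"
    return path

path = simulate_bcrw(500, random.Random(1))
```

    Mean squared displacement computed over such paths grows faster than linearly at intermediate times, the super-diffusive signature the authors match with their BCRW simulations.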

  19. Human mammary epithelial cells exhibit a bimodal correlated random walk pattern.

    Directory of Open Access Journals (Sweden)

    Alka A Potdar

    2010-03-01

    Full Text Available Organisms, at scales ranging from unicellular to mammals, have been known to exhibit foraging behavior described by random walks whose segments conform to Lévy or exponential distributions. For the first time, we present evidence that single cells (mammary epithelial cells) that exist in multi-cellular organisms (humans) follow a bimodal correlated random walk (BCRW). Cellular tracks of MCF-10A pBabe, neuN and neuT random migration on 2-D plastic substrates, analyzed using bimodal analysis, were found to reveal the BCRW pattern. We find two types of exponentially distributed correlated flights (corresponding to what we refer to as the directional and re-orientation phases), each having its own correlation between move step-lengths within flights. The exponential distribution of flight lengths was confirmed using different analysis methods (logarithmic binning with normalization, survival frequency plots and maximum likelihood estimation). Because of the presence of a non-uniform turn angle distribution of move step-lengths within a flight and two different types of flights, we propose that the epithelial random walk is a BCRW comprising two alternating modes with varying degrees of correlation, rather than a simple persistent random walk. A BCRW model rather than a simple persistent random walk correctly matches the super-diffusivity in the cell migration paths, as indicated by simulations based on the BCRW model.

  20. Time dependent and asymptotic neutron number probability distribution calculation using discrete Fourier transform

    International Nuclear Information System (INIS)

    Humbert, Ph.

    2005-01-01

    In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c₂ approximation is used, and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution, which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology, where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
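
    The inversion step, treating the generating function evaluated on the unit circle as the discrete Fourier transform of the probability distribution, can be sketched as follows. The Poisson check case is ours, chosen only because its generating function is known in closed form; it stands in for the numerically computed generating function of the point-reactor model.

```python
import cmath

def pmf_from_generating_function(G, n_max):
    """Recover P(N = n) for n = 0..n_max-1 from a probability generating
    function G(z): evaluate G at the n_max-th roots of unity, then apply
    an inverse discrete Fourier transform."""
    n = n_max
    samples = [G(cmath.exp(2j * cmath.pi * k / n)) for k in range(n)]
    probs = []
    for m in range(n):
        s = sum(samples[k] * cmath.exp(-2j * cmath.pi * k * m / n)
                for k in range(n))
        probs.append((s / n).real)
    return probs

# Check against a known case: Poisson(lam), whose generating
# function is exp(lam * (z - 1)).
lam = 2.0
probs = pmf_from_generating_function(lambda z: cmath.exp(lam * (z - 1)), 32)
```

    With n_max sample points the recovered values are aliased sums P(m) + P(m + n_max) + …, so n_max must be chosen large enough that the tail mass beyond it is negligible.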

  1. Velocity selection for ultra-cold atoms using bimodal mazer cavity

    International Nuclear Information System (INIS)

    Irshad, A.; Qamar, S.

    2009-04-01

    In this paper, we discuss the velocity selection of ultra-cold three-level atoms in the Λ configuration using a micromazer. Our model is the same as that discussed by Arun et al. for mazer action in a bimodal cavity. We have shown that a significantly narrowed velocity distribution of ultra-cold atoms can be obtained in this system due to the presence of dark states. (author)

  2. Age bimodality in the central region of pseudo-bulges in S0 galaxies

    Science.gov (United States)

    Mishra, Preetish K.; Barway, Sudhanshu; Wadadekar, Yogesh

    2017-11-01

    We present evidence for a bimodal stellar age distribution of pseudo-bulges of S0 galaxies as probed by the Dn(4000) index. We do not observe any bimodality in age distribution for pseudo-bulges in spiral galaxies. Our sample is flux limited and contains 2067 S0 and 2630 spiral galaxies drawn from the Sloan Digital Sky Survey. We identify pseudo-bulges in S0 and spiral galaxies based on the position of the bulge on the Kormendy diagram and their central velocity dispersion. Dividing the pseudo-bulges of S0 galaxies into those containing old and young stellar populations, we study the connection between global star formation and pseudo-bulge age on the u - r colour-mass diagram. We find that most old pseudo-bulges are hosted by passive galaxies while the majority of young bulges are hosted by galaxies that are star forming. Dividing our sample of S0 galaxies into early-type S0s and S0/a galaxies, we find that old pseudo-bulges are mainly hosted by early-type S0 galaxies while most of the pseudo-bulges in S0/a galaxies are young. We speculate that morphology plays a strong role in the quenching of star formation in the discs of these S0 galaxies, which stops the growth of pseudo-bulges, giving rise to old pseudo-bulges and the observed age bimodality.

  3. Fragment size distribution in viscous bag breakup of a drop

    Science.gov (United States)

    Kulkarni, Varun; Bulusu, Kartik V.; Plesniak, Michael W.; Sojka, Paul E.

    2015-11-01

    In this study we examine the drop size distribution resulting from the fragmentation of a single drop in the presence of a continuous air jet. Specifically, we study the effect of Weber number, We, and Ohnesorge number, Oh, on the disintegration process. The breakup regime considered is viscous bag breakup, and fragment sizes are measured using phase Doppler anemometry. Both the number and volume fragment size probability distributions are plotted. The volume probability distribution revealed a bi-modal behavior with two distinct peaks: one corresponding to the rim fragments and the other to the bag fragments. This behavior was suppressed in the number probability distribution. Additionally, we employ an in-house particle detection code to isolate the rim fragment size distribution from the total probability distributions. Our experiments showed that the bag fragments are smaller in diameter and larger in number, while the rim fragments are larger in diameter and smaller in number. Furthermore, with increasing We for a given Oh we observe a large number of small-diameter drops and a small number of large-diameter drops. On the other hand, with increasing Oh for a fixed We the opposite is seen.
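
    The contrast between the number and volume distributions comes from the d³ volume weighting. A minimal sketch with invented fragment data shows how a rim mode that is negligible by number can still appear as a distinct peak by volume.

```python
def number_to_volume_fractions(diameters, counts):
    """Convert a number-weighted fragment size histogram into
    volume-weighted fractions (weights proportional to d**3)."""
    volumes = [c * d**3 for d, c in zip(diameters, counts)]
    total = sum(volumes)
    return [v / total for v in volumes]

# Invented data: many small bag fragments, few large rim fragments.
diameters = [10.0, 20.0, 100.0, 120.0]   # micrometres
counts = [900, 800, 6, 4]
vol_frac = number_to_volume_fractions(diameters, counts)
```

    Here the rim bins (the last two) hold under 1% of the fragments by number but well over half of the total volume, so the volume pdf is bimodal while the number pdf is dominated by the small-diameter mode.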

  4. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities.
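
    For orientation, the permutation approach that the exact derivation replaces can be sketched as follows: under the null, each replicate's rank is uniform on 1..n_genes, and the p-value is the probability that the product of k such ranks is at most the observed product. The gene count, ranks and sample size below are invented.

```python
import random

def rank_product_pvalue(ranks, n_genes, n_perm, rng):
    """Permutation estimate of the rank product p-value: the fraction of
    products of k independent uniform ranks on 1..n_genes that are at
    most the observed product. (The paper derives this distribution
    exactly, avoiding the sampling error of this estimate.)"""
    observed = 1
    for r in ranks:
        observed *= r
    k = len(ranks)
    hits = 0
    for _ in range(n_perm):
        prod = 1
        for _ in range(k):
            prod *= rng.randint(1, n_genes)
        if prod <= observed:
            hits += 1
    return hits / n_perm

# A gene ranked 2nd, 5th and 3rd in three replicates of 100 genes:
p = rank_product_pvalue([2, 5, 3], n_genes=100, n_perm=20000,
                        rng=random.Random(0))
```

    For tail probabilities this small, a permutation estimate needs very many draws to be reliable, which is exactly the limitation motivating the exact distribution.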

  5. THE BIMODAL METALLICITY DISTRIBUTION OF THE COOL CIRCUMGALACTIC MEDIUM AT z ∼< 1

    International Nuclear Information System (INIS)

    Lehner, N.; Howk, J. C.; Tripp, T. M.; Tumlinson, J.; Thom, C.; Fox, A. J.; Prochaska, J. X.; Werk, J. K.; O'Meara, J. M.; Ribaudo, J.

    2013-01-01

    We assess the metal content of the cool (∼10⁴ K) circumgalactic medium (CGM) about galaxies at z ∼< 1 using a sample of H I-selected Lyman limit systems (LLS); this H I selection avoids metallicity biases inherent in many previous studies of the low-redshift CGM. We compare the column densities of weakly ionized metal species (e.g., O II, Si II, Mg II) to N(H I) in the strongest H I component of each absorber. We find that the metallicity distribution of the LLS (and hence the cool CGM) is bimodal, with metal-poor and metal-rich branches peaking at [X/H] ≅ –1.6 and –0.3 (or about 2.5% and 50% of solar metallicity). The cool CGM probed by these LLS is predominantly ionized. The metal-rich branch of the population likely traces winds, recycled outflows, and tidally stripped gas; the metal-poor branch has properties consistent with cold accretion streams thought to be a major source of fresh gas for star forming galaxies. Both branches have a nearly equal number of absorbers. Our results thus demonstrate there is a significant mass of previously undiscovered cold metal-poor gas and confirm the presence of metal enriched gas in the CGM of z ∼< 1 galaxies.

  6. Feynman quasi probability distribution for spin-(1/2), and its generalizations

    International Nuclear Information System (INIS)

    Colucci, M.

    1999-01-01

    Feynman's paper Negative probability is examined, in which, after a discussion of the possibility of attributing a real physical meaning to quasi probability distributions, he introduces a new kind of distribution for spin-1/2, with a possible method of generalization to systems with an arbitrary number of states. The principal aim of this article is to shed light upon the method of construction of these distributions, taking into consideration their application to some experiments, and discussing their positive and negative aspects

  7. Influence of dose distribution homogeneity on the tumor control probability in heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan

    2001-01-01

    In order to estimate the influence of a non-uniform dose distribution on the clinical treatment result, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to different dose distribution homogeneities. The results show that the tumor control probability corresponding to the same total dose will decrease if the dose distribution homogeneity gets worse. In clinical treatment, the dose distribution homogeneity should be better than 95%
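
    The qualitative result, that the same mean dose controls the tumor less well when delivered inhomogeneously, can be illustrated with a standard Poisson TCP model; the radiosensitivity and clonogen numbers below are illustrative, not the heavy-ion survival model used in the paper.

```python
import math

def tcp(doses, volume_fractions, alpha=0.35, n_clonogens=1e7):
    """Poisson-model tumor control probability for a tumor split into
    sub-volumes receiving different doses (Gy). Illustrative
    single-parameter exponential survival: SF(D) = exp(-alpha * D)."""
    expected_survivors = 0.0
    for d, v in zip(doses, volume_fractions):
        expected_survivors += n_clonogens * v * math.exp(-alpha * d)
    return math.exp(-expected_survivors)

# Same mean dose (60 Gy) delivered uniformly vs. with a hot and cold half:
uniform = tcp([60.0, 60.0], [0.5, 0.5])
nonuniform = tcp([55.0, 65.0], [0.5, 0.5])
```

    Because cell survival is convex in dose, the cold region's extra survivors outweigh the hot region's extra kill, so the non-uniform plan always yields the lower TCP at equal mean dose.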

  8. Flexible transparent conducting films with embedded silver networks composed of bimodal-sized nanoparticles for heater application

    Science.gov (United States)

    Park, Ji Sun; Song, Yookyung; Park, Daseul; Kim, Yeon-Won; Kim, Yoon Jin

    2018-06-01

    A facile one-pot synthetic method for preparing Ag nanoparticle inks with a bimodal size distribution was newly devised, and the inks were successfully employed as a conducting filler to form metal-mesh type transparent conducting electrodes on a flexible substrate. Bimodal-sized Ag nanoparticles were synthesized through the polyol process, and their size variation was achieved via a finely tuned composition ratio between Ag⁺ ions and polymeric capping agents. The prepared bimodal-sized Ag nanoparticles took the form of well-dispersed Ag nanoparticle inks without adding any dispersants or requiring a dispersion process. By filling the patterned micro-channels engraved on the flexible polymer substrate with a bimodal-sized Ag nanoparticle ink, a metal-mesh type transparent electrode (transmittance: 90% at 550 nm, haze: 1.5, area: 8 × 8 cm²) was fabricated. By applying a DC voltage to the mesh type electrode, a flexible transparent joule heater was successfully achieved with a performance of 4.5 °C s⁻¹ heat-up rate at a low input power density.

  9. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters
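
    A sampling analogue of the first-level decomposition can be sketched via the law of total variance, Var(X) = E[Var(X|θ)] + Var(E[X|θ]), where the first term reflects natural variability and the second the distribution parameter uncertainty. The distributions below are illustrative, not taken from the paper.

```python
import random
import statistics

def variance_contributions(n_outer=2000, n_inner=200, rng=None):
    """Decompose Var(X) for X ~ Normal(mu, 3) with uncertain mean
    mu ~ Normal(10, 2): average the conditional variances (variability)
    and take the variance of the conditional means (parameter
    uncertainty)."""
    rng = rng or random.Random(0)
    cond_means, cond_vars = [], []
    for _ in range(n_outer):
        mu = rng.gauss(10.0, 2.0)                          # parameter uncertainty
        xs = [rng.gauss(mu, 3.0) for _ in range(n_inner)]  # natural variability
        cond_means.append(statistics.fmean(xs))
        cond_vars.append(statistics.pvariance(xs))
    variability = statistics.fmean(cond_vars)             # approx. 3**2 = 9
    param_uncertainty = statistics.pvariance(cond_means)  # approx. 2**2 = 4
    return variability, param_uncertainty

variability, param_uncertainty = variance_contributions()
```

    The paper's variance-based global sensitivity analysis generalizes this idea to model outputs whose several inputs each carry uncertain distribution parameters.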

  10. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)

  11. Gaze-independent ERP-BCIs: augmenting performance through location-congruent bimodal stimuli

    Science.gov (United States)

    Thurlings, Marieke E.; Brouwer, Anne-Marie; Van Erp, Jan B. F.; Werkhoven, Peter

    2014-01-01

    Gaze-independent event-related potential (ERP) based brain-computer interfaces (BCIs) yield relatively low BCI performance and traditionally employ unimodal stimuli. Bimodal ERP-BCIs may increase BCI performance due to multisensory integration or summation in the brain. An additional advantage of bimodal BCIs may be that the user can choose which modality or modalities to attend to. We studied bimodal, visual-tactile, gaze-independent BCIs and investigated whether or not ERP components’ tAUCs and subsequent classification accuracies are increased for (1) bimodal vs. unimodal stimuli; (2) location-congruent vs. location-incongruent bimodal stimuli; and (3) attending to both modalities vs. to either one modality. We observed an enhanced bimodal (compared to unimodal) P300 tAUC, which appeared to be positively affected by location-congruency (p = 0.056) and resulted in higher classification accuracies. Attending either to one or to both modalities of the bimodal location-congruent stimuli resulted in differences between ERP components, but not in classification performance. We conclude that location-congruent bimodal stimuli improve ERP-BCIs, and offer the user the possibility to switch the attended modality without losing performance. PMID:25249947

  12. Geometry planning and image registration in magnetic particle imaging using bimodal fiducial markers

    International Nuclear Information System (INIS)

    Werner, F.; Hofmann, M.; Them, K.; Knopp, T.; Jung, C.; Salamon, J.; Kaul, M. G.; Mummert, T.; Adam, G.; Ittrich, H.; Werner, R.; Säring, D.; Weber, O. M.

    2016-01-01

    Purpose: Magnetic particle imaging (MPI) is a quantitative imaging modality that allows the distribution of superparamagnetic nanoparticles to be visualized. Compared to other imaging techniques like x-ray radiography, computed tomography (CT), and magnetic resonance imaging (MRI), MPI only provides a signal from the administered tracer, but no additional morphological information, which complicates geometry planning and the interpretation of MP images. The purpose of the authors’ study was to develop bimodal fiducial markers that can be visualized by MPI and MRI in order to create MP–MR fusion images. Methods: A certain arrangement of three bimodal fiducial markers was developed and used in a combined MRI/MPI phantom and also during in vivo experiments in order to investigate its suitability for geometry planning and image fusion. An algorithm for automated marker extraction in both MR and MP images and rigid registration was established. Results: The developed bimodal fiducial markers can be visualized by MRI and MPI and allow for geometry planning as well as automated registration and fusion of MR–MP images. Conclusions: To date, exact positioning of the object to be imaged within the field of view (FOV) and the assignment of reconstructed MPI signals to corresponding morphological regions has been difficult. The developed bimodal fiducial markers and the automated image registration algorithm help to overcome these difficulties.

  13. Geometry planning and image registration in magnetic particle imaging using bimodal fiducial markers

    Energy Technology Data Exchange (ETDEWEB)

    Werner, F., E-mail: f.werner@uke.de; Hofmann, M.; Them, K.; Knopp, T. [Section for Biomedical Imaging, University Medical Center Hamburg-Eppendorf, Hamburg 20246, Germany and Institute for Biomedical Imaging, Hamburg University of Technology, Hamburg 21073 (Germany); Jung, C.; Salamon, J.; Kaul, M. G.; Mummert, T.; Adam, G.; Ittrich, H. [Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Hamburg 20246 (Germany); Werner, R.; Säring, D. [Institute for Computational Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg 20246 (Germany); Weber, O. M. [Philips Medical Systems DMC GmbH, Hamburg 22335 (Germany)

    2016-06-15

    Purpose: Magnetic particle imaging (MPI) is a quantitative imaging modality that allows the distribution of superparamagnetic nanoparticles to be visualized. Compared to other imaging techniques like x-ray radiography, computed tomography (CT), and magnetic resonance imaging (MRI), MPI only provides a signal from the administered tracer, but no additional morphological information, which complicates geometry planning and the interpretation of MP images. The purpose of the authors’ study was to develop bimodal fiducial markers that can be visualized by MPI and MRI in order to create MP–MR fusion images. Methods: A certain arrangement of three bimodal fiducial markers was developed and used in a combined MRI/MPI phantom and also during in vivo experiments in order to investigate its suitability for geometry planning and image fusion. An algorithm for automated marker extraction in both MR and MP images and rigid registration was established. Results: The developed bimodal fiducial markers can be visualized by MRI and MPI and allow for geometry planning as well as automated registration and fusion of MR–MP images. Conclusions: To date, exact positioning of the object to be imaged within the field of view (FOV) and the assignment of reconstructed MPI signals to corresponding morphological regions has been difficult. The developed bimodal fiducial markers and the automated image registration algorithm help to overcome these difficulties.

  14. Cancerous epithelial cell lines shed extracellular vesicles with a bimodal size distribution that is sensitive to glutamine inhibition

    International Nuclear Information System (INIS)

    Santana, Steven Michael; Kirby, Brian J; Antonyak, Marc A; Cerione, Richard A

    2014-01-01

    Extracellular shed vesicles (ESVs) facilitate a unique mode of cell–cell communication wherein vesicle uptake can induce a change in the recipient cell's state. Despite the intensity of ESV research, currently reported data represent the bulk characterization of concentrated vesicle samples with little attention paid to heterogeneity. ESV populations likely represent diversity in mechanisms of formation, cargo and size. To better understand ESV subpopulations and the signaling cascades implicated in their formation, we characterize ESV size distributions to identify subpopulations in normal and cancerous epithelial cells. We have discovered that cancer cells exhibit bimodal ESV distributions, one small-diameter and another large-diameter population, suggesting that two mechanisms may govern ESV formation, an exosome population and a cancer-specific microvesicle population. Altered glutamine metabolism in cancer is thought to fuel cancer growth but may also support metastatic niche formation through microvesicle production. We describe the role of a glutaminase inhibitor, compound 968, in ESV production. We have discovered that inhibiting glutamine metabolism significantly impairs large-diameter microvesicle production in cancer cells. (paper)

  15. Disentangling internal and external factors in bimodal acquisition

    NARCIS (Netherlands)

    Hulk, A.; Van den Bogaerde, B.

    2016-01-01

    In this commentary we address some of the internal and external factors which are generally found to interact with purely linguistic factors in the languages of bimodal children, and which we think should be taken into account while analysing the bimodal data.

  16. Generalization of Poisson distribution for the case of changing probability of consequential events

    International Nuclear Information System (INIS)

    Kushnirenko, E.

    1995-01-01

    A generalization of the Poisson distribution to the case of changing probabilities of consequential events is presented. It is shown that the classical Poisson distribution is the special case of this generalized distribution in which the probabilities of the consequential events are constant. Using the generalized Poisson distribution makes it possible in some cases to obtain analytical results instead of performing a Monte Carlo calculation

  17. High temperature tensile properties and fracture characteristics of bimodal 12Cr-ODS steel

    International Nuclear Information System (INIS)

    Chauhan, Ankur; Litvinov, Dimitri; Aktaa, Jarir

    2016-01-01

    This article describes the tensile properties and fracture characteristics of a 12Cr oxide dispersion strengthened (ODS) ferritic steel with a unique elongated bimodal grain size distribution. The tensile tests were carried out at four different temperatures, ranging from room temperature to 700 °C, at a nominal strain rate of 10⁻³ s⁻¹. At room temperature the material exhibits a high tensile strength of 1294 MPa and a high yield strength of 1200 MPa. At 700 °C, the material still exhibits a relatively high tensile strength of 300 MPa. The total elongation-to-failure exceeds 18% over the whole temperature range and has a maximum value of 29% at 600 °C. This superior ductility is attributed to the material's bimodal grain size distribution. In comparison to other commercial, as well as experimental, ODS steels, the material shows an excellent compromise between strength and ductility. The fracture surface studies reveal a change in fracture behavior from a mixed mode fracture at room temperature to fully ductile fracture at 600 °C. At 700 °C, the fracture path changes from intragranular to intergranular fracture, which is associated with a reduced ductility. - Highlights: • The steel has a unique elongated bimodal grain size distribution. • The steel shows an excellent compromise between strength and ductility. • Superior ductility in comparison to other commercial and experimental ODS steels. • Fracture behavior changes from mixed mode fracture at room temperature to fully ductile fracture at 600 °C. • Fracture path changes from intragranular to intergranular fracture at 700 °C.

  18. Family of probability distributions derived from maximal entropy principle with scale invariant restrictions.

    Science.gov (United States)

    Sonnino, Giorgio; Steinbrecher, György; Cardinali, Alessandro; Sonnino, Alberto; Tlidi, Mustapha

    2013-01-01

    Using statistical thermodynamics, we derive a general expression for the stationary probability distribution of thermodynamic systems driven out of equilibrium by several thermodynamic forces. The local equilibrium is defined by imposing the minimum entropy production and the maximum entropy principle under scale invariance restrictions. The obtained probability distribution presents a singularity that has an immediate physical interpretation in terms of intermittency models. The derived reference probability distribution function is interpreted as the time and ensemble average of the real physical one. A generic family of stochastic processes describing noise-driven intermittency, where the stationary density distribution coincides exactly with the one resulting from entropy maximization, is presented.

  19. Study on probability distribution of fire scenarios in risk assessment to emergency evacuation

    International Nuclear Information System (INIS)

    Chu Guanquan; Wang Jinhui

    2012-01-01

    Event tree analysis (ETA) is a frequently used technique to analyze the probability of a probable fire scenario. The event probability is usually characterized by a definite value. It is not appropriate to use a definite value, as these estimates may be the result of poor-quality statistics and limited knowledge. Without addressing uncertainties, ETA will give imprecise results and the credibility of the risk assessment will be undermined. This paper presents an approach to address event probability uncertainties and to analyze the probability distribution of a probable fire scenario. ETA is performed to construct probable fire scenarios. The activation time of every event is characterized as a stochastic variable by considering uncertainties of the fire growth rate and other input variables. To obtain the probability distribution of a probable fire scenario, a Markov chain is proposed in combination with ETA. To demonstrate the approach, a case study is presented.
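The idea of replacing point-valued event probabilities with distributions can be sketched in a few lines. In this hypothetical two-branch example (the branch events and beta parameters are invented for illustration, not taken from the study), Monte Carlo sampling yields a distribution of the scenario probability rather than a single number:

```python
import random
import statistics

def scenario_probability(rng):
    """One Monte Carlo draw of a probable fire scenario's probability.
    Branch probabilities are uncertain, so each is sampled from a
    (hypothetical) beta distribution instead of a fixed point value."""
    p_detection_fails = rng.betavariate(2, 18)   # ~0.10 on average
    p_sprinkler_fails = rng.betavariate(1, 19)   # ~0.05 on average
    # Scenario: detection fails AND sprinkler fails (independent events).
    return p_detection_fails * p_sprinkler_fails

rng = random.Random(0)
draws = [scenario_probability(rng) for _ in range(100_000)]
mean_p = statistics.fmean(draws)
print(round(mean_p, 4))  # close to 0.1 * 0.05 = 0.005
```

Summary statistics or quantiles of `draws` then characterize the scenario probability distribution instead of a single point estimate.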

  20. A new expression of the probability distribution in Incomplete Statistics and fundamental thermodynamic relations

    International Nuclear Information System (INIS)

    Huang Zhifu; Lin Bihong; Chen Jincan

    2009-01-01

    In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under a uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given and the relationship between the new and the original expressions of the probability distribution is discussed.

  1. Collective motions of globally coupled oscillators and some probability distributions on circle

    Energy Technology Data Exchange (ETDEWEB)

    Jaćimović, Vladimir [Faculty of Natural Sciences and Mathematics, University of Montenegro, Cetinjski put, bb., 81000 Podgorica (Montenegro); Crnkić, Aladin, E-mail: aladin.crnkic@hotmail.com [Faculty of Technical Engineering, University of Bihać, Ljubijankićeva, bb., 77000 Bihać, Bosnia and Herzegovina (Bosnia and Herzegovina)

    2017-06-28

    In 2010, Kato and Jones described a new family of probability distributions on the circle, obtained as a Möbius transformation of the von Mises distribution. We present a model demonstrating that these distributions appear naturally in the study of populations of coupled oscillators. We use this opportunity to point out certain relations between Directional Statistics and the collective motion of coupled oscillators. - Highlights: • We specify probability distributions on the circle that arise in the Kuramoto model. • We study how the mean-field coupling affects the shape of the distribution of phases. • We discuss potential applications in some experiments on the cell cycle. • We apply Directional Statistics to study the collective dynamics of coupled oscillators.
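A minimal sketch of the construction mentioned above, assuming the family is obtained by pushing von Mises samples on the unit circle through a disk-preserving Möbius map (the parameter values here are arbitrary, not from the paper):

```python
import cmath
import random

def mobius_vonmises_sample(n, mu=0.0, kappa=2.0, a=0.4 + 0.2j, seed=3):
    """Draw von Mises angles, then push each point on the unit circle
    through the Mobius map z -> (z + a) / (1 + conj(a)*z) with |a| < 1,
    which maps the unit circle onto itself."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z = cmath.exp(1j * rng.vonmisesvariate(mu, kappa))
        w = (z + a) / (1 + a.conjugate() * z)
        out.append(cmath.phase(w))
    return out

angles = mobius_vonmises_sample(5_000)
# All transformed points are still angles on the circle.
print(all(-cmath.pi <= t <= cmath.pi for t in angles))
```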

  2. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  3. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximate...... any such distribution. We prove that formulas from renewal theory, and with a particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest...... such as the renewal density and the ruin probability. It might be of interest to approximate a given heavy tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions and to this end we provide a calibration procedure which works for the approximation...

  4. Comparative DNA isolation behaviours of silica and polymer based sorbents in batch fashion: monodisperse silica microspheres with bimodal pore size distribution as a new sorbent for DNA isolation.

    Science.gov (United States)

    Günal, Gülçin; Kip, Çiğdem; Eda Öğüt, S; İlhan, Hasan; Kibar, Güneş; Tuncel, Ali

    2018-02-01

    Monodisperse silica microspheres with a bimodal pore-size distribution were proposed as a high-performance sorbent for DNA isolation in batch fashion under equilibrium conditions. The proposed sorbent, comprising both macroporous and mesoporous compartments, was synthesized 5.1 μm in size by a "staged shape templated hydrolysis and condensation method". Hydrophilic polymer-based sorbents were also obtained in the form of monodisperse macroporous microspheres ca 5.5 μm in size, with different functionalities, by a developed "multi-stage microsuspension copolymerization" technique. The batch DNA isolation performance of the proposed material was comparatively investigated using polymer-based sorbents with similar morphologies. Among all sorbents tried, the best DNA isolation performance was achieved with the monodisperse silica microspheres with bimodal pore size distribution. The collocation of interconnected mesoporous and macroporous compartments within the monodisperse silica microspheres provided a high surface area, reduced the intraparticular mass-transfer resistance, and facilitated both the adsorption and desorption of DNA. Among the polymer-based sorbents, higher DNA isolation yields were achieved with the monodisperse macroporous polymer microspheres carrying trimethoxysilyl and quaternary ammonium functionalities. However, the batch DNA isolation performances of the polymer-based sorbents were significantly lower with respect to the silica microspheres.

  5. Probability distribution of extreme share returns in Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share prices data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data which will provide a clue on the likely candidates for the best fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO) and Generalized Pareto (GPA), the Lognormal (GNO) and the Pearson (PE3) distributions are evaluated. The method of L-moments is used in parameter estimation. Based on several goodness of fit tests and L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best fitted distribution to represent the weekly and monthly maximum share returns in Malaysia stock market during the studied period, respectively.
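The method of L-moments used above starts from sample probability-weighted moments. A sketch of the first three sample L-moments via the standard unbiased estimators follows (the exponential check is illustrative, not the Malaysian share-return data):

```python
import random

def sample_l_moments(data):
    """First three sample L-moments (l1, l2, l3) from the unbiased
    probability-weighted-moment estimators b0, b1, b2."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * xi for i, xi in enumerate(x)) / (n * (n - 1) * (n - 2))
    l1 = b0                      # L-location (mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # used for L-skewness t3 = l3 / l2
    return l1, l2, l3

rng = random.Random(7)
data = [rng.expovariate(1.0) for _ in range(20_000)]
l1, l2, l3 = sample_l_moments(data)
# For an exponential distribution: l2/l1 = 1/2 and l3/l2 = 1/3.
print(abs(l2 / l1 - 0.5) < 0.02, abs(l3 / l2 - 1 / 3) < 0.05)
```

Matching such sample L-moment ratios against theoretical curves is the basis of the L-moment diagram test mentioned in the abstract.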

  6. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  7. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  8. A formalism to generate probability distributions for performance-assessment modeling

    International Nuclear Information System (INIS)

    Kaplan, P.G.

    1990-01-01

    A formalism is presented for generating probability distributions of parameters used in performance-assessment modeling. The formalism is used when data are either sparse or nonexistent. The appropriate distribution is a function of the known or estimated constraints and is chosen to maximize a quantity known as Shannon's informational entropy. The formalism is applied to a parameter used in performance-assessment modeling. The functional form of the model that defines the parameter, data from the actual field site, and natural analog data are analyzed to estimate the constraints. A beta probability distribution of the example parameter is generated after finding four constraints. As an example of how the formalism is applied to the site characterization studies of Yucca Mountain, the distribution is generated for an input parameter in a performance-assessment model currently used to estimate compliance with disposal of high-level radioactive waste in geologic repositories, 10 CFR 60.113(a)(2), commonly known as the ground water travel time criterion. 8 refs., 2 figs
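The maximum-entropy formalism can be illustrated on a discrete support with a single mean constraint, where the entropy-maximizing pmf takes the Gibbs form and the Lagrange multiplier is found numerically. This is a sketch under simplified assumptions, not the paper's beta-distribution construction:

```python
import math

def maxent_discrete(xs, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy pmf on support xs subject to a mean constraint.
    The solution has the Gibbs form p_i ~ exp(-lam * x_i); lam is
    found by bisection so that sum(p_i * x_i) = target_mean."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:   # mean_for decreases with lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# With the constraint equal to the unconstrained mean, entropy is
# maximized by the uniform distribution (lam = 0).
p = maxent_discrete([0, 1, 2, 3], target_mean=1.5)
print(all(abs(pi - 0.25) < 1e-6 for pi in p))
```

Adding further constraints (e.g. on the variance) adds further multipliers and reshapes the distribution in the same way.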

  9. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  10. Probability Distribution and Deviation Information Fusion Driven Support Vector Regression Model and Its Application

    Directory of Open Access Journals (Sweden)

    Changhao Fan

    2017-01-01

    In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is considered, whereas other prior information of the training sample, such as probability distribution information, is ignored. Probability distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, and it helps develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information, the probability distribution information weight SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is considered as the weight and is then introduced into the error coefficient and slack variables of SVR. Thus, the deviation and probability distribution information of the training sample are both used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with those of three SVR-based methods. The results showed that PDISVR performs better than the three other methods.
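The weighting idea can be sketched outside the SVR machinery with a density-weighted linear fit: each sample is weighted by a kernel density estimate of its target value, so likely outliers (low-density points) get less influence. This is only an analogy to PDISVR's weights, not the published model; all names and values below are illustrative:

```python
import math
import random

def gaussian_kde_weights(ys, bandwidth=1.0):
    """Weight each sample by a kernel density estimate of its target
    value: points in dense regions count more, likely outliers less."""
    n = len(ys)
    w = []
    for yi in ys:
        d = sum(math.exp(-0.5 * ((yi - yj) / bandwidth) ** 2) for yj in ys)
        w.append(d / (n * bandwidth * math.sqrt(2 * math.pi)))
    s = sum(w)
    return [wi / s for wi in w]

def weighted_linear_fit(xs, ys, w):
    """Closed-form weighted least squares for y = a*x + b."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, xs)) / sw
    my = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    num = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, xs, ys))
    den = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, xs))
    a = num / den
    return a, my - a * mx

rng = random.Random(1)
xs = [i / 10 for i in range(50)]
ys = [2 * x + 1 + rng.gauss(0, 0.1) for x in xs]
ys[10] = 30.0  # inject one gross outlier
a, b = weighted_linear_fit(xs, ys, gaussian_kde_weights(ys))
print(abs(a - 2) < 0.5 and abs(b - 1) < 0.5)
```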

  11. Microstructure, plastic deformation and strengthening mechanisms of an Al–Mg–Si alloy with a bimodal grain structure

    International Nuclear Information System (INIS)

    Shakoori Oskooie, M.; Asgharzadeh, H.; Kim, H.S.

    2015-01-01

    Highlights: • Al6063 with bimodal grain structures was fabricated by a powder metallurgy route. • The bimodal alloys showed a reasonable ductility together with a high strength. • Grain boundary strengthening was reduced at higher fraction of coarse grains. • The enhanced tensile ductility was attributed to crack blunting and delamination. - Abstract: Al6063 alloys with bimodal grain size distributions comprised of ultrafine-grained (UFG) and coarse-grained (CG) regions were produced via mechanical milling followed by hot extrusion. High-energy planetary ball milling for 22.5 h with a rotational speed of 350 rpm was employed for the synthesis of nanocrystalline Al6063 powders. The as-milled Al6063 powders were mixed with 15, 30, and 45 vol.% of the unmilled powders, and the powder mixtures were then consolidated via extrusion at 450 °C with an extrusion ratio of 9:1. The microstructure of the bimodal extrudates was investigated using an optical microscope, a transmission electron microscope (TEM), and a field emission scanning electron microscope equipped with an electron backscatter diffraction (EBSD) detector. The deformation behavior was investigated by means of uniaxial tensile tests. The bimodal Al6063 exhibited balanced mechanical properties, including high yield stress and ultimate tensile strength resulting from the UFG regions together with reasonable ductility attained from the CG areas. The fracture surfaces demonstrated a ductile fracture mode, in which the dimple size was correlated with the grain structure. The strengthening mechanisms are discussed based on dislocation models, and the roles of the CGs in the deformation behavior and ductility enhancement of bimodal Al6063 are explored.
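The grain-size contribution to strength discussed above is commonly estimated with the Hall-Petch relation, and the trend in the highlights (less grain-boundary strengthening at higher coarse-grain fraction) can be sketched with a first-order rule of mixtures. All parameter values below are illustrative, not from this article:

```python
def hall_petch(sigma0, k, d):
    """Hall-Petch relation: yield stress (MPa) for grain size d (m),
    with friction stress sigma0 (MPa) and coefficient k (MPa*m^0.5)."""
    return sigma0 + k / d ** 0.5

def bimodal_yield(f_cg, d_cg, d_ufg, sigma0=20.0, k=0.04):
    """Linear rule-of-mixtures estimate for a bimodal structure with a
    volume fraction f_cg of coarse grains (a first-order sketch only;
    load transfer between the two regions is ignored)."""
    return (f_cg * hall_petch(sigma0, k, d_cg)
            + (1 - f_cg) * hall_petch(sigma0, k, d_ufg))

# More coarse grains -> lower predicted yield stress.
print(bimodal_yield(0.15, 5e-6, 200e-9) > bimodal_yield(0.45, 5e-6, 200e-9))
```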

  12. Irreducible complexity of iterated symmetric bimodal maps

    Directory of Open Access Journals (Sweden)

    J. P. Lampreia

    2005-01-01

    We introduce a tree structure for the iterates of symmetric bimodal maps and identify a subset which we prove to be isomorphic to the family of unimodal maps. This subset is used as a second factor for a ∗-product that we define in the space of bimodal kneading sequences. Finally, we give some properties for this product and study the ∗-product induced on the associated Markov shifts.

  13. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    uncertainty can be calculated. The possibility approach is particular well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known the resulting...... are thoroughly discussed in the case of rectangular representation of uncertainty by the uniform probability distribution and the interval, respectively. Also triangular representations are dealt with and compared. Calculation of monotonic as well as non-monotonic functions of variables represented...
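The non-monotonic case discussed above is exactly where naive interval propagation fails: evaluating a function only at the interval endpoints can miss interior extrema. A small sketch (the dense grid-search bound is illustrative, not the paper's method):

```python
def interval_range(f, lo, hi, n_grid=10_001):
    """Range of f over [lo, hi] by dense grid search; for non-monotonic
    f the endpoint values alone are not enough."""
    vals = [f(lo + (hi - lo) * i / (n_grid - 1)) for i in range(n_grid)]
    return min(vals), max(vals)

f = lambda x: (x - 1.0) ** 2   # non-monotonic on [0, 2]
endpoints = (f(0.0), f(2.0))   # both equal 1.0 -> misses the minimum
grid_lo, grid_hi = interval_range(f, 0.0, 2.0)
print(endpoints, (grid_lo, grid_hi))  # (1.0, 1.0) (0.0, 1.0)
```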

  14. WIENER-HOPF SOLVER WITH SMOOTH PROBABILITY DISTRIBUTIONS OF ITS COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mr. Vladimir A. Smagin

    2016-12-01

    The Wiener–Hopf solver with smooth probability distributions of its components is presented. The method is based on hyper-delta approximations of the initial distributions. The use of the Fourier series transformation and the characteristic function allows working with random variables concentrated along the abscissa axis.

  15. Probability distribution of long-run indiscriminate felling of trees in ...

    African Journals Online (AJOL)

    The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...

  16. Theoretical derivation of wind power probability distribution function and applications

    International Nuclear Information System (INIS)

    Altunkaynak, Abdüsselam; Erdik, Tarkan; Dabanlı, İsmail; Şen, Zekai

    2012-01-01

    Highlights: ► Wind power stochastic characteristics, namely the standard deviation and the dimensionless skewness, are derived. ► Perturbation theory yields expressions for the wind power statistics from the Weibull probability distribution function (PDF). ► Comparisons are made with the corresponding characteristics of the wind speed PDF. ► The wind power abides by the Weibull PDF. -- Abstract: The instantaneous wind power contained in the air current is directly proportional to the cube of the wind speed. In practice, there is a record of wind speeds in the form of a time series. It is, therefore, necessary to develop a formulation that takes into consideration the statistical parameters of such a time series. The purpose of this paper is to derive the general wind power formulation in terms of the statistical parameters by using the perturbation theory, which leads to a general formulation of the wind power expectation and other statistical parameter expressions such as the standard deviation and the coefficient of variation. The formulation is very general and can be applied specifically for any wind speed probability distribution function. Its application to the two-parameter Weibull probability distribution of wind speeds is presented in full detail. It is concluded that, provided wind speed is distributed according to a Weibull distribution, the wind power can be derived based on wind speed data. It is possible to determine wind power at any desired risk level; however, in practical studies 5% or 10% risk levels are most often preferred, and the necessary simple procedure is presented for this purpose in this paper.
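For the Weibull case described above, the raw moments have a closed form, E[v^r] = c^r Γ(1 + r/k), so the expected wind power per unit area follows directly from the cubic dependence on wind speed. A minimal sketch (the default air density is a standard sea-level value; the paper's perturbation-theory expressions are not reproduced here):

```python
import math

def weibull_moment(c, k, r):
    """r-th raw moment of a Weibull(scale c, shape k) wind-speed PDF:
    E[v^r] = c^r * Gamma(1 + r/k)."""
    return c**r * math.gamma(1 + r / k)

def mean_wind_power_density(c, k, rho=1.225):
    """Expected wind power per unit rotor area (W/m^2), using the cubic
    dependence of instantaneous power on wind speed: P = 0.5*rho*E[v^3]."""
    return 0.5 * rho * weibull_moment(c, k, 3)

# Rayleigh case (k = 2): E[v^3] = c^3 * Gamma(2.5).
print(round(mean_wind_power_density(c=8.0, k=2.0), 1))
```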

  17. Numerical Loading of a Maxwellian Probability Distribution Function

    International Nuclear Information System (INIS)

    Lewandowski, J.L.V.

    2003-01-01

    A renormalization procedure for the numerical loading of a Maxwellian probability distribution function (PDF) is formulated. The procedure, which involves the solution of three coupled nonlinear equations, yields a numerically loaded PDF with improved properties for higher velocity moments. This method is particularly useful for low-noise particle-in-cell simulations with electron dynamics
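The renormalization idea can be sketched as moment matching: draw the sample, then shift and rescale it so the low-order moments are exact. The cited procedure solves three coupled nonlinear equations to improve higher velocity moments as well; the version below enforces only the first two and is purely illustrative:

```python
import random
import statistics

def load_maxwellian(n, vth=1.0, seed=0):
    """Load n particle velocities from a 1-D Maxwellian and renormalize
    the sample so its first two moments match the target exactly."""
    rng = random.Random(seed)
    v = [rng.gauss(0.0, vth) for _ in range(n)]
    mean = statistics.fmean(v)
    v = [vi - mean for vi in v]           # zero mean exactly
    scale = vth / statistics.pstdev(v)
    return [vi * scale for vi in v]       # exact thermal spread

v = load_maxwellian(10_000, vth=2.0)
print(abs(statistics.fmean(v)) < 1e-9,
      abs(statistics.pstdev(v) - 2.0) < 1e-9)
```

In a particle-in-cell setting this suppresses the spurious mean drift and temperature error of a finite random sample, which is the source of start-up noise.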

  18. On the probability distribution of the stochastic saturation scale in QCD

    International Nuclear Information System (INIS)

    Marquet, C.; Soyez, G.; Xiao Bowen

    2006-01-01

    It was recently noticed that high-energy scattering processes in QCD have a stochastic nature. An event-by-event scattering amplitude is characterised by a saturation scale which is a random variable. The statistical ensemble of saturation scales formed with all the events is distributed according to a probability law whose cumulants have been recently computed. In this work, we obtain the probability distribution from the cumulants. We prove that it can be considered as Gaussian over a large domain that we specify and our results are confirmed by numerical simulations

  19. A bimodal biometric identification system

    Science.gov (United States)

    Laghari, Mohammad S.; Khuwaja, Gulzar A.

    2013-03-01

    Biometrics consists of methods for uniquely recognizing humans based upon one or more intrinsic physical or behavioral traits. Physical traits are related to the shape of the body; behavioral traits are related to the behavior of a person. However, biometric authentication systems suffer from imprecision and difficulty in person recognition due to a number of reasons, and no single biometric is expected to effectively satisfy the requirements of all verification and/or identification applications. Bimodal biometric systems are expected to be more reliable due to the presence of two pieces of evidence and should also be able to meet the severe performance requirements imposed by various applications. This paper presents a neural-network-based bimodal biometric identification system that uses human face and handwritten signature features.

  20. Calculation of magnetization curves and probability distribution for monoclinic and uniaxial systems

    International Nuclear Information System (INIS)

    Sobh, Hala A.; Aly, Samy H.; Yehia, Sherif

    2013-01-01

    We present the application of a simple classical statistical mechanics-based model to selected monoclinic and hexagonal model systems. In this model, we treat the magnetization as a classical vector whose angular orientation is dictated by the laws of equilibrium classical statistical mechanics. We calculate for these anisotropic systems, the magnetization curves, energy landscapes and probability distribution for different sets of relevant parameters and magnetic fields of different strengths and directions. Our results demonstrate a correlation between the most probable orientation of the magnetization vector, the system's parameters, and the external magnetic field. -- Highlights: ► We calculate magnetization curves and probability angular distribution of the magnetization. ► The magnetization curves are consistent with probability results for the studied systems. ► Monoclinic and hexagonal systems behave differently due to their different anisotropies
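The model described above reduces to Boltzmann-weighted averages over the orientation of a classical magnetization vector. A minimal uniaxial sketch by direct numerical integration over the sphere (the energy expression and parameter values are illustrative; the paper also treats monoclinic anisotropy):

```python
import math

def mean_cos_theta(K1, H, Ms=1.0, kT=1.0, n=20_000):
    """Thermal average of cos(theta) for a classical magnetization
    vector with uniaxial anisotropy, energy
    E = K1*sin(theta)^2 - Ms*H*cos(theta)."""
    num = den = 0.0
    for i in range(n):
        th = (i + 0.5) * math.pi / n
        e = K1 * math.sin(th) ** 2 - Ms * H * math.cos(th)
        w = math.exp(-e / kT) * math.sin(th)   # sin(theta): solid-angle weight
        num += w * math.cos(th)
        den += w
    return num / den

# Zero field: up and down orientations are equally probable.
print(abs(mean_cos_theta(K1=5.0, H=0.0)) < 1e-6)
# A strong field along the easy axis pins the moment near theta = 0.
print(mean_cos_theta(K1=5.0, H=50.0) > 0.9)
```

Sweeping `H` traces a magnetization curve, and the integrand weight `w` is (up to normalization) the angular probability distribution the abstract refers to.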

  1. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  2. Aggressive Bimodal Communication in Domestic Dogs, Canis familiaris.

    Science.gov (United States)

    Déaux, Éloïse C; Clarke, Jennifer A; Charrier, Isabelle

    2015-01-01

    Evidence of animal multimodal signalling is widespread and compelling. Dogs' aggressive vocalisations (growls and barks) have been extensively studied, but without any consideration of the simultaneously produced visual displays. In this study we aimed to categorize dogs' bimodal aggressive signals according to the redundant/non-redundant classification framework. We presented dogs with unimodal (audio or visual) or bimodal (audio-visual) stimuli and measured their gazing and motor behaviours. Responses did not qualitatively differ between the bimodal and two unimodal contexts, indicating that acoustic and visual signals provide redundant information. We could not further classify the signal as 'equivalent' or 'enhancing' as we found evidence for both subcategories. We discuss our findings in relation to the complex signal framework, and propose several hypotheses for this signal's function.

  3. Quantum Fourier transform, Heisenberg groups and quasi-probability distributions

    International Nuclear Information System (INIS)

    Patra, Manas K; Braunstein, Samuel L

    2011-01-01

    This paper aims to explore the inherent connection between Heisenberg groups, quantum Fourier transform (QFT) and (quasi-probability) distribution functions. Distribution functions for continuous and finite quantum systems are examined from three perspectives and all of them lead to Weyl-Gabor-Heisenberg groups. The QFT appears as the intertwining operator of two equivalent representations arising out of an automorphism of the group. Distribution functions correspond to certain distinguished sets in the group algebra. The marginal properties of a particular class of distribution functions (Wigner distributions) arise from a class of automorphisms of the group algebra of the Heisenberg group. We then study the reconstruction of the Wigner function from the marginal distributions via inverse Radon transform giving explicit formulae. We consider some applications of our approach to quantum information processing and quantum process tomography.
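The QFT that appears here as an intertwining operator is, concretely, the discrete Fourier transform matrix on a d-dimensional system. A short unitarity check (a standard construction, not specific to this paper):

```python
import cmath

def qft_matrix(d):
    """Discrete (quantum) Fourier transform on a d-dimensional system:
    F[j][k] = omega^(j*k) / sqrt(d) with omega = exp(2*pi*i/d)."""
    omega = cmath.exp(2j * cmath.pi / d)
    s = d ** -0.5
    return [[s * omega ** (j * k) for k in range(d)] for j in range(d)]

def is_unitary(m, tol=1e-9):
    """Check that the rows of m are orthonormal under the complex
    inner product, i.e. m * m^dagger = identity."""
    d = len(m)
    for a in range(d):
        for b in range(d):
            dot = sum(m[a][k] * m[b][k].conjugate() for k in range(d))
            if abs(dot - (1.0 if a == b else 0.0)) > tol:
                return False
    return True

print(is_unitary(qft_matrix(8)))  # True
```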

  4. New family of probability distributions with applications to Monte Carlo studies

    International Nuclear Information System (INIS)

    Johnson, M.E.; Tietjen, G.L.; Beckman, R.J.

    1980-01-01

    A new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies. The distribution includes the exponential power family as a special case. An efficient computational strategy is proposed for random variate generation. An example for testing the hypothesis of unit variance illustrates the advantages of the proposed distribution

  5. Research on bimodal particle extinction coefficient during Brownian coagulation and condensation for the entire particle size regime

    International Nuclear Information System (INIS)

    Tang Hong; Lin Jianzhong

    2011-01-01

    The extinction coefficient of atmospheric aerosol particles influences the earth’s radiation balance directly or indirectly, and it can be determined by the scattering and absorption characteristics of aerosol particles. The problem of estimating the change of extinction coefficient due to time evolution of bimodal particle size distribution is studied, and two improved methods for calculating the Brownian coagulation coefficient and the condensation growth rate are proposed, respectively. Through the improved method based on the Otto kernel, the Brownian coagulation coefficient can be expressed simply in powers of particle volume for the entire particle size regime based on the fitted polynomials of the mean enhancement function. Meanwhile, the improved method based on the Fuchs–Sutugin kernel is developed to obtain the condensation growth rate for the entire particle size regime. The change of the overall extinction coefficient of bimodal distributions undergoing Brownian coagulation and condensation can then be estimated comprehensively for the entire particle size regime. Simulation experiments indicate that the extinction coefficients obtained with the improved methods coincide fairly well with the true values, providing a simple, reliable, and general method to estimate the change of extinction coefficient for the entire particle size regime during the bimodal particle dynamic processes.
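For reference, in the continuum limit the Brownian coagulation coefficient discussed above reduces to the simple Smoluchowski form. The sketch below uses that limit only; the paper's fitted enhancement functions covering the entire size regime are not reproduced:

```python
def coagulation_coefficient(d1, d2, T=298.0, mu=1.8e-5):
    """Continuum-regime Brownian coagulation coefficient (m^3/s) for two
    particle diameters d1, d2 (m) in a gas of viscosity mu (Pa*s), from
    the Smoluchowski kernel K = (2*kB*T/(3*mu)) * (1/d1 + 1/d2) * (d1 + d2)."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    return (2 * kB * T) / (3 * mu) * (1 / d1 + 1 / d2) * (d1 + d2)

# Equal-size particles: the kernel reduces to 8*kB*T/(3*mu),
# independent of the particle size.
k_equal = coagulation_coefficient(1e-7, 1e-7)
print(abs(k_equal - 8 * 1.380649e-23 * 298.0 / (3 * 1.8e-5)) < 1e-22)
```

Size-disparate pairs coagulate faster than equal pairs, which is what drives the evolution of a bimodal distribution toward the larger mode.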

  6. Aggressive Bimodal Communication in Domestic Dogs, Canis familiaris.

    Directory of Open Access Journals (Sweden)

    Éloïse C Déaux

    Evidence of animal multimodal signalling is widespread and compelling. Dogs' aggressive vocalisations (growls and barks) have been extensively studied, but without any consideration of the simultaneously produced visual displays. In this study we aimed to categorize dogs' bimodal aggressive signals according to the redundant/non-redundant classification framework. We presented dogs with unimodal (audio or visual) or bimodal (audio-visual) stimuli and measured their gazing and motor behaviours. Responses did not qualitatively differ between the bimodal and two unimodal contexts, indicating that acoustic and visual signals provide redundant information. We could not further classify the signal as 'equivalent' or 'enhancing' as we found evidence for both subcategories. We discuss our findings in relation to the complex signal framework, and propose several hypotheses for this signal's function.

  7. Precipitation intensity probability distribution modelling for hydrological and construction design purposes

    International Nuclear Information System (INIS)

    Koshinchanov, Georgy; Dimitrov, Dobri

    2008-01-01

    The characteristics of rainfall intensity are important for many purposes, including the design of sewage and drainage systems, the tuning of flood warning procedures, etc. These estimates are usually statistical estimates of the intensity of precipitation realized over a certain period of time (e.g. 5, 10 min, etc.) with different return periods (e.g. 20, 100 years, etc.). The traditional approach to evaluating these precipitation intensities is to process the pluviometer records and fit probability distributions to samples of intensities valid for certain locations or regions. These estimates then become part of the state regulations used for various economic activities. Two problems occur with this approach: 1. Due to various factors, the climate conditions change and the precipitation intensity estimates need regular updating; 2. As the extremes of the probability distribution are of particular importance in practice, the methodology of distribution fitting needs specific attention to those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing the relevant statistical estimates are: - the method of the maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; - as above, but with separate modeling of the probability distribution for the middle and high probability quantiles; - a method similar to the first, but with an intensity threshold of 0.36 mm/min; - another method, proposed by the Russian hydrologist G. A. Aleksiev for the regionalization of estimates over a territory, improved and adapted by S. Gerasimov for Bulgaria; - Next method is considering only

  8. Evidence of A Bimodal US GDP Growth Rate Distribution: A Wavelet Approach

    Directory of Open Access Journals (Sweden)

    Sandro Claudio Lera

    2017-04-01

    We present a quantitative characterisation of the fluctuations of the annualized growth rate of the real US GDP per capita at many scales, using a wavelet transform analysis of two data sets: quarterly data from 1947 to 2015 and annual data from 1800 to 2010. The chosen mother wavelet (the first derivative of the Gaussian function), applied to the logarithm of the real US GDP per capita, provides a robust estimation of the instantaneous growth rate at different scales. Our main finding is that business cycles appear at all scales and the distribution of GDP growth rates can be well approximated by a bimodal function associated with a series of switches between regimes of strong growth rate $\rho_\text{high}$ and regimes of low growth rate $\rho_\text{low}$. The succession of these two regimes compounds to produce a remarkably stable long term average real annualized growth rate of 1.6% from 1800 to 2010 and $\approx 2.0\%$ since 1950, which is the result of a subtle compensation between the high and low growth regimes that alternate continuously. Thus, the overall growth dynamics of the US economy is punctuated, with phases of strong growth that are intrinsically unsustainable, followed by corrections or consolidation until the next boom starts. We interpret these findings within the theory of "social bubbles" and argue as a consequence that estimations of the cost of the 2008 crisis may be misleading. We also interpret the absence of strong recovery since 2008 as a protracted low growth regime $\rho_\text{low}$ associated with the exceptional nature of the preceding large growth regime.
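The compounding of the two regimes can be illustrated with a one-line worked example: the long-run average rate is the time-weighted mean of the regime rates. The regime values and occupation fraction below are assumptions chosen to land near the paper's ~1.6% long-run figure, not estimates from the data.

```python
# Regime-switching sketch: annualized growth alternates between a high
# and a low regime; the long-run average is the time-weighted mean.
# rho_high, rho_low and f_high are illustrative assumptions.
rho_high, rho_low = 0.04, -0.005     # growth rate in each regime
f_high = 0.45                        # fraction of time spent in the high regime
avg = f_high * rho_high + (1 - f_high) * rho_low
print(avg)                           # ~0.015, i.e. roughly 1.5 % per year
```

The point of the sketch is that a modest change in either regime rate or in the occupation fraction moves the long-run average substantially, which is why the authors describe the stable 1.6% figure as a "subtle compensation".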

  9. NONLINEAR COLOR-METALLICITY RELATIONS OF GLOBULAR CLUSTERS. II. A TEST ON THE NONLINEARITY SCENARIO FOR COLOR BIMODALITY USING THE u-BAND COLORS: THE CASE OF M87 (NGC 4486)

    International Nuclear Information System (INIS)

    Yoon, Suk-Jin; Lee, Sang-Yoon; Kim, Hak-Sub; Cho, Jaeil; Chung, Chul; Sohn, Sangmo T.; Blakeslee, John P.

    2011-01-01

    The optical color distributions of globular clusters (GCs) in most large elliptical galaxies are bimodal. Based on the assumed linear relationship between GC colors and their metallicities, the bimodality has been taken as evidence of two GC subsystems with different metallicities in each galaxy and has led to a number of theories in the context of galaxy formation. More recent observations and modeling of GCs, however, suggest that the color-metallicity relations (CMRs) are inflected, and thus colors likely trace metallicities in a nonlinear manner. The nonlinearity could produce bimodal color distributions from a broad underlying metallicity spread, even if that spread is unimodal. Despite the far-reaching implications, whether CMRs are nonlinear and whether the nonlinearity indeed causes the color bimodality are still open questions. Given that the spectroscopic refinement of CMRs is still very challenging, we here propose a new photometric technique to probe the possible nonlinear nature of CMRs. In essence, a color distribution of GCs is a 'projected' distribution of their metallicities. Since the form of a CMR hinges on which color is used, the shape of the color distribution varies significantly depending on the color. Among optical colors, the u-band related colors (e.g., u – g and u – z) are theoretically predicted to exhibit significantly less inflected CMRs than other commonly used colors (e.g., g – z). As a case study, we performed Hubble Space Telescope (HST)/WFPC2 archival u-band photometry for the M87 (NGC 4486) GC system with confirmed color bimodality. We show that the u-band color distributions are significantly different from that of g – z and consistent with our model predictions. With more u-band measurements, this method will support or rule out the nonlinear CMR scenario for the origin of GC color bimodality with high confidence. The HST/WFC3 observations in F336W for nearby large elliptical galaxies are highly anticipated in this regard.
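The projection effect at the heart of this scenario is easy to demonstrate numerically: pushing a unimodal metallicity spread through an inflected (here, tanh-shaped) color-metallicity relation piles probability up on either side of the steep segment and produces a bimodal color histogram. The functional form and all parameters below are hypothetical stand-ins, not the authors' calibrated CMR.

```python
import numpy as np

rng = np.random.default_rng(0)
# A unimodal (Gaussian) metallicity spread, [Fe/H] (assumed parameters):
feh = rng.normal(-0.9, 0.55, 200_000)
# Hypothetical inflected CMR: a steep segment near [Fe/H] ~ -1 projects
# the single metallicity mode into two color clumps.
color = 1.0 + 0.5 * np.tanh(3.0 * (feh + 0.9))
hist, _ = np.histogram(color, bins=40, range=(0.5, 1.5))
# Two peaks near the ends of the color range, with a dip in the middle:
print(hist[2] > hist[19] < hist[37])
```

The mechanism is just the change-of-variables formula: where the relation is steep, the color density f(color) = f([Fe/H]) / |d(color)/d[Fe/H]| is suppressed, carving a dip between two modes even though the input is unimodal.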

  10. Movement, drivers and bimodality of the South Asian High

    Directory of Open Access Journals (Sweden)

    M. Nützel

    2016-11-01

    The South Asian High (SAH) is an important component of the summer monsoon system in Asia. In this study we investigate the location and drivers of the SAH at 100 hPa during the boreal summers of 1979 to 2014 on interannual, seasonal and synoptic timescales using seven reanalyses and observational data. Our comparison of the different reanalyses focuses especially on the bimodality of the SAH, i.e. the two preferred modes of the SAH centre location: the Iranian Plateau to the west and the Tibetan Plateau to the east. We find that only the National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis shows a clear bimodal structure of the SAH centre distribution with respect to daily and pentad (5-day mean) data. Furthermore, the distribution of the SAH centre location is highly variable from year to year. As in simple model studies, which connect the SAH to heating in the tropics, we find that the mean seasonal cycle of the SAH and its centre are dominated by the expansion of convection in the South Asian region (70–130° E × 15–30° N) on the south-eastern border of the SAH. A composite analysis of precipitation and outgoing long-wave radiation data with respect to the location of the SAH centre reveals that a more westward (eastward) location of the SAH is related to stronger (weaker) convection and rainfall over India and weaker (stronger) precipitation over the western Pacific.

  11. Evaluation of the probability distribution of intake from a single measurement on a personal air sampler

    International Nuclear Information System (INIS)

    Birchall, A.; Muirhead, C.R.; James, A.C.

    1988-01-01

    An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)
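The compound structure described above (a Poisson number of particles, each with lognormal activity) is straightforward to sample, and Monte Carlo draws of the Poisson-stopped k-sum give a quick check of its mean. All parameter values below are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
# Monte Carlo sketch of the intake model: the number of particles
# collected is Poisson with mean w, and each particle's activity is
# lognormal, so intake is a Poisson-stopped sum of lognormals (a k-sum).
w = 20.0                           # expected number of particles collected
mu, sigma = 0.0, 1.0               # lognormal parameters of particle activity
k = rng.poisson(w, 20_000)         # particles collected in each trial
total = np.array([rng.lognormal(mu, sigma, n).sum() for n in k])
# mean intake should approach w * E[lognormal] = w * exp(mu + sigma^2 / 2)
print(total.mean(), w * np.exp(mu + sigma**2 / 2))
```

By Wald's identity the mean of the compound sum factorizes into (mean count) × (mean activity), which is what the printed comparison verifies.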

  12. Bimodality and negative heat capacity in multifragmentation

    International Nuclear Information System (INIS)

    Tamain, B.; Bougault, R.; Lopez, O.; Pichon, M.

    2003-01-01

    This contribution addresses the question of the possible link between multifragmentation and the liquid-gas phase transition of nuclear matter. Bimodality seems to be a robust signal of this link, in the sense that theoretical calculations indicate it is preserved even if a sizeable fraction of the available energy has not been shared among all the degrees of freedom. The corresponding measured properties are coherent with what is expected in a liquid-gas phase transition picture. Moreover, bimodality and negative heat capacity are observed for the same set of events. (authors)

  13. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...... and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard...... rates are based on normalized spacing of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...
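For the discrete case mentioned above, log-concavity of a pmf is the condition p_k² ≥ p_{k−1} p_{k+1} for all interior k, and it can be verified directly. The binomial check below is an illustrative sanity test of that definition, not the paper's spacing-based test statistic.

```python
from math import comb

# Discrete log-concavity check: p_k^2 >= p_{k-1} * p_{k+1} for all k.
# The binomial pmf is a classic log-concave example.
n, p = 20, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
log_concave = all(pmf[k]**2 >= pmf[k - 1] * pmf[k + 1] for k in range(1, n))
print(log_concave)   # → True
```

The same three-term inequality, applied to a histogram or an estimated density, is the simplest informal diagnostic before running a formal test of the kind the paper proposes.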

  14. Visualisation and characterisation of heterogeneous bimodal PDMS networks

    DEFF Research Database (Denmark)

    Bahrt, Frederikke; Daugaard, Anders Egede; Fleury, Clemence

    2014-01-01

    The existence of short-chain domains in heterogeneous bimodal PDMS networks has been confirmed visually, for the first time, through confocal fluorescence microscopy. The networks were prepared using a controlled reaction scheme where short PDMS chains were reacted below the gelation point...... bimodal networks with short-chain domains within a long-chain network. The average sizes of the short-chain domains were found to vary from 2.1 to 5.7 μm, depending on the short-chain content. The visualised network structure could thereafter be correlated to the elastic properties, which were determined...... by rheology. All heterogeneous bimodal networks displayed significantly lower moduli than monomodal PDMS elastomers prepared from the long polymer chains. Low loss moduli as well as low sol fractions indicate that low elastic moduli can be obtained without compromising the network's structure......

  15. Bimodal magmatism produced by progressively inhibited crustal assimilation (PICA)

    NARCIS (Netherlands)

    Meade, F.C.; Troll, V.R.; Ellam, R.M.; Freda, C.; Font Morales, L.; Donaldson, C.H.; Klonowska, I.

    2014-01-01

    The origin of bimodal (mafic-felsic) rock suites is a fundamental question in volcanology. Here we use major and trace elements, high-resolution Sr, Nd and Pb isotope analyses, experimental petrology and thermodynamic modelling to investigate bimodal magmatism at the iconic Carlingford Igneous

  16. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  17. Saturated hydraulic conductivity model computed from bimodal water retention curves for a range of New Zealand soils

    Directory of Open Access Journals (Sweden)

    J. A. P. Pollacco

    2017-06-01

    Descriptions of soil hydraulic properties, such as the soil moisture retention curve, θ(h), and saturated hydraulic conductivities, Ks, are a prerequisite for hydrological models. Since the measurement of Ks is expensive, it is frequently derived from statistical pedotransfer functions (PTFs). Because it is usually more difficult to describe Ks than θ(h) from pedotransfer functions, Pollacco et al. (2013) developed a physical unimodal model to compute Ks solely from hydraulic parameters derived from the Kosugi θ(h). This unimodal Ks model, which is based on a unimodal Kosugi soil pore-size distribution, was developed by combining the approach of Hagen–Poiseuille with Darcy's law and by introducing three tortuosity parameters. We report here on (1) the suitability of the Pollacco unimodal Ks model to predict Ks for a range of New Zealand soils from the New Zealand soil database (S-map) and (2) further adaptations to this model to adapt it to dual-porosity structured soils by computing the soil water flux through a continuous function of an improved bimodal pore-size distribution. The improved bimodal Ks model was tested with a New Zealand data set derived from historical measurements of Ks and θ(h) for a range of soils derived from sandstone and siltstone. The Ks data were collected using a small core size of 10 cm diameter, causing large uncertainty in replicate measurements. Predictions of Ks were further improved by distinguishing topsoils from subsoil. Nevertheless, as expected, stratifying the data with soil texture only slightly improved the predictions of the physical Ks models because the Ks model is based on pore-size distribution and the calibrated parameters were obtained within the physically feasible range. The improvements made to the unimodal Ks model by using the new bimodal Ks model are modest when compared to the unimodal model, which is explained by the poor accuracy of measured total porosity. Nevertheless, the new bimodal

  19. Functionalized bimodal mesoporous silicas as carriers for controlled aspirin delivery

    Science.gov (United States)

    Gao, Lin; Sun, Jihong; Li, Yuzhen

    2011-08-01

    The bimodal mesoporous silica modified with 3-aminopropyltriethoxysilane was used as the aspirin carrier. The samples' structure, drug loading and release profiles were characterized with X-ray diffraction, scanning electron microscopy, N2 adsorption and desorption, Fourier transform infrared spectroscopy, TG analysis, elemental analysis and UV spectrophotometry. To further explore the effects of the bimodal mesopores on the drug delivery behavior, the unimodal mesoporous material MCM-41 was also modified as an aspirin carrier. Meanwhile, the Korsmeyer-Peppas equation f_t = k t^n was employed to analyze the dissolution data in detail. The results indicate that the bimodal mesopores are beneficial for unrestricted diffusion of the drug molecules and therefore lead to a higher loading and faster release than those of MCM-41. The results show that the aspirin delivery properties are influenced considerably by the mesoporous matrix, whereas the large pore of the bimodal mesoporous silica is the key to the improved controlled-release properties.
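The Korsmeyer-Peppas equation f_t = k t^n is linear on log-log axes (ln f = ln k + n ln t), so k and n can be recovered by a straight-line fit. The release data below are synthetic, generated from assumed k and n values, purely to show the fitting procedure.

```python
import numpy as np

# Korsmeyer-Peppas fit f_t = k * t^n via the linearised form
# ln f = ln k + n ln t. Synthetic fractional-release data (illustrative):
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0])   # time, hours
f = 0.25 * t**0.45                               # generated with k=0.25, n=0.45
n_fit, logk = np.polyfit(np.log(t), np.log(f), 1)
print(np.exp(logk), n_fit)                       # recovers k ≈ 0.25, n ≈ 0.45
```

In release studies the fitted exponent n is the diagnostic quantity: for spherical carriers, n near 0.43 suggests Fickian diffusion, while larger n indicates anomalous transport.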

  20. THE BIMODAL METALLICITY DISTRIBUTION OF THE COOL CIRCUMGALACTIC MEDIUM AT z ≲ 1

    Energy Technology Data Exchange (ETDEWEB)

    Lehner, N.; Howk, J. C. [Department of Physics, University of Notre Dame, 225 Nieuwland Science Hall, Notre Dame, IN 46556 (United States); Tripp, T. M. [Department of Astronomy, University of Massachusetts, Amherst, MA 01003 (United States); Tumlinson, J.; Thom, C.; Fox, A. J. [Space Telescope Science Institute, Baltimore, MD 21218 (United States); Prochaska, J. X.; Werk, J. K. [UCO/Lick Observatory, University of California, Santa Cruz, CA (United States); O'Meara, J. M. [Department of Physics, Saint Michael's College, Vermont, One Winooski Park, Colchester, VT 05439 (United States); Ribaudo, J. [Department of Physics, Utica College, 1600 Burrstone Road, Utica, New York 13502 (United States)

    2013-06-20

    We assess the metal content of the cool (~10^4 K) circumgalactic medium (CGM) about galaxies at z ≲ 1 using an H I-selected sample of 28 Lyman limit systems (LLS; defined here as absorbers with 16.2 ≲ log N_HI ≲ 18.5) observed in absorption against background QSOs by the Cosmic Origins Spectrograph on board the Hubble Space Telescope. The N_HI selection avoids metallicity biases inherent in many previous studies of the low-redshift CGM. We compare the column densities of weakly ionized metal species (e.g., O II, Si II, Mg II) to N_HI in the strongest H I component of each absorber. We find that the metallicity distribution of the LLS (and hence the cool CGM) is bimodal, with metal-poor and metal-rich branches peaking at [X/H] ≈ -1.6 and -0.3 (or about 2.5% and 50% solar metallicities). The cool CGM probed by these LLS is predominantly ionized. The metal-rich branch of the population likely traces winds, recycled outflows, and tidally stripped gas; the metal-poor branch has properties consistent with cold accretion streams thought to be a major source of fresh gas for star-forming galaxies. Both branches have a nearly equal number of absorbers. Our results thus demonstrate there is a significant mass of previously undiscovered cold metal-poor gas and confirm the presence of metal-enriched gas in the CGM of z ≲ 1 galaxies.

  1. Bounds for the probability distribution function of the linear ACD process

    OpenAIRE

    Fernandes, Marcelo

    2003-01-01

    This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.

  2. On Selection of the Probability Distribution for Representing the Maximum Annual Wind Speed in East Cairo, Egypt

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh. I.; El-Hemamy, S.T.

    2013-01-01

    The main objective of this paper is to identify an appropriate probability model and the best plotting position formula to represent the maximum annual wind speed in east Cairo. This model can be used to estimate the extreme wind speed and return period at a particular site, as well as to determine the radioactive release distribution in case of an accident at a nuclear power plant. Wind speed probabilities can be estimated by using probability distributions. An accurate determination of the probability distribution for maximum wind speed data is very important in estimating the extreme value. The probability plots of the maximum annual wind speed (MAWS) in east Cairo are fitted to six major statistical distributions, namely Gumbel, Weibull, Normal, Log-Normal, Logistic and Log-Logistic, while eight plotting positions of Hosking and Wallis, Hazen, Gringorten, Cunnane, Blom, Filliben, Benard and Weibull are used for determining their exceedance probabilities. A proper probability distribution for representing the MAWS is selected by the statistical test criteria in frequency analysis; therefore, the best plotting position formula for selecting the appropriate probability model representing the MAWS data must be determined. The statistical test criteria, namely the probability plot correlation coefficient (PPCC), the root mean square error (RMSE), the relative root mean square error (RRMSE) and the maximum absolute error (MAE), are used to select the appropriate plotting position and distribution. The data obtained show that the maximum annual wind speed in east Cairo varies from 44.3 km/h to 96.1 km/h over a duration of 39 years. The Weibull plotting position combined with the Normal distribution gave the best fit and the most reliable and accurate predictions of the wind speed in the study area, having the highest value of PPCC and the lowest values of RMSE, RRMSE and MAE.
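The two ingredients of the selection procedure, a plotting position formula and the PPCC criterion, can be sketched in a few lines: assign each sorted observation the Weibull plotting position p_i = i/(n+1), convert to theoretical quantiles of the candidate distribution, and correlate. The 39-value sample below is synthetic, with an assumed mean and spread, not the east Cairo record.

```python
import numpy as np
from statistics import NormalDist

# Weibull plotting positions and the probability plot correlation
# coefficient (PPCC) for a Normal candidate distribution.
rng = np.random.default_rng(2)
x = np.sort(rng.normal(70.0, 12.0, 39))             # synthetic annual maxima, km/h
n = len(x)
p = np.arange(1, n + 1) / (n + 1)                   # Weibull plotting positions i/(n+1)
q = np.array([NormalDist().inv_cdf(v) for v in p])  # standard Normal quantiles
ppcc = np.corrcoef(q, x)[0, 1]                      # closer to 1 = better fit
print(round(ppcc, 3))
```

In the paper's procedure this correlation is computed for each combination of the six candidate distributions and eight plotting positions, and the combination with the highest PPCC (and lowest RMSE/RRMSE/MAE) is retained.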

  3. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  4. Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence

    Directory of Open Access Journals (Sweden)

    C. C. Wu

    2011-04-01

    Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristics of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  5. Probability distributions for first neighbor distances between resonances that belong to two different families

    International Nuclear Information System (INIS)

    Difilippo, F.C.

    1994-01-01

    For a mixture of two families of resonances, we found the probability distribution for the distance, as first neighbors, between resonances that belong to different families. Integration of this distribution gives the probability of accidental overlapping of resonances of one isotope by resonances of the other, provided that the resonances of each isotope belong to a single family. (author)
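The overlap probability described above can be checked by Monte Carlo in a simple limiting case: if each family's resonance positions form a Poisson sequence, the distance from a resonance of one family to its nearest neighbour in the other family (on either side) satisfies P(d < x) = 1 − exp(−2x/D1), where D1 is the mean spacing of the other family. The spacings, the overlap window, and the Poisson assumption itself are all illustrative choices, not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(3)
# Two independent resonance families as Poisson sequences with mean
# spacings D1 and D2 on an interval of length L (illustrative values).
D1, D2, L = 1.0, 3.0, 50_000.0
e1 = np.cumsum(rng.exponential(D1, int(1.2 * L / D1))); e1 = e1[e1 < L]
e2 = np.cumsum(rng.exponential(D2, int(1.2 * L / D2))); e2 = e2[e2 < L]
# Nearest family-1 neighbour of each family-2 resonance:
idx = np.searchsorted(e1, e2).clip(1, len(e1) - 1)
d = np.minimum(np.abs(e2 - e1[idx - 1]), np.abs(e2 - e1[idx]))
Gamma = 0.1                      # "accidental overlap" window, e.g. a width
mc = np.mean(d < Gamma)
print(mc, 1 - np.exp(-2 * Gamma / D1))   # Monte Carlo vs analytic
```

Integrating the first-neighbour distance distribution up to a resonance half-width, as in the final print statement, is exactly the "probability of accidental overlapping" the abstract refers to, here for the Poisson special case.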

  6. Event-related potentials to visual, auditory, and bimodal (combined auditory-visual) stimuli.

    Science.gov (United States)

    Isoğlu-Alkaç, Ummühan; Kedzior, Karina; Keskindemirci, Gonca; Ermutlu, Numan; Karamursel, Sacit

    2007-02-01

    The purpose of this study was to investigate the response properties of event-related potentials to unimodal and bimodal stimulation. The amplitudes of N1 and P2 were larger for bimodal evoked potentials (BEPs) than for auditory evoked potentials (AEPs) at the anterior sites, and the amplitudes of P1 were larger for BEPs than for VEPs, especially at the parieto-occipital locations. Responses to bimodal stimulation had longer latencies than responses to unimodal stimulation. The N1 and P2 components were larger in amplitude and longer in latency during the bimodal paradigm and predominantly occurred at the anterior sites. Therefore, the current bimodal paradigm can be used to investigate the involvement and location of specific neural generators that contribute to higher processing of sensory information. Moreover, this paradigm may be a useful tool to investigate the level of sensory dysfunction in clinical samples.

  7. Diachronic changes in word probability distributions in daily press

    Directory of Open Access Journals (Sweden)

    Stanković Jelena

    2006-01-01

    Changes in the probability distributions of individual words and word types were investigated in two samples of daily press spanning fifty years. The first sample was derived from the Corpus of Serbian Language (CSL) (Kostić, Đ., 2001), which covers the period between 1945 and 1957, and the other from the Ebart Media Documentation (EBR), compiled from seven daily newspapers and five weekly magazines from 2002 and 2003. Each sample consisted of about 1 million words. The obtained results indicate that nouns and adjectives were more frequent in the CSL, while verbs and prepositions are more frequent in the EBR sample, suggesting a decrease in sentence length over the last five decades. Conspicuous changes in the probability distribution of individual words were observed for nouns and adjectives, while minimal or no changes were observed for verbs and prepositions. Such an outcome suggests that nouns and adjectives are most susceptible to diachronic change, while verbs and prepositions appear to be resistant to it.

  8. Probability distributions in conservative energy exchange models of multiple interacting agents

    International Nuclear Information System (INIS)

    Scafetta, Nicola; West, Bruce J

    2007-01-01

    Herein we study energy exchange models of multiple interacting agents that conserve energy in each interaction. The models differ regarding the rules that regulate the energy exchange and boundary effects. We find a variety of stochastic behaviours that manifest as equilibrium energy probability distributions of different types: interaction rules yield not only exponential distributions, such as the familiar Maxwell-Boltzmann-Gibbs distribution of an elastically colliding ideal particle gas, but also uniform distributions, truncated exponential distributions, Gaussian distributions, Gamma distributions, inverse power-law distributions, mixed exponential and inverse power-law distributions, and evolving distributions. This wide variety of distributions should be of value in determining the underlying mechanisms generating the statistical properties of complex phenomena, including those to be found in complex chemical reactions.
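The simplest member of this model family is easy to simulate: two randomly chosen agents pool their energy and re-split it uniformly, which conserves total energy exactly and drives the single-agent distribution to the exponential (Boltzmann-Gibbs) law. The agent count, step count, and uniform-split rule below are one illustrative choice among the many exchange rules the paper surveys.

```python
import numpy as np

rng = np.random.default_rng(4)
# Conservative pairwise exchange: two agents pool their energy and
# re-split it uniformly. At equilibrium the energy distribution is
# exponential with mean <E>, so P(E > <E>) -> exp(-1) ≈ 0.37.
N, steps = 5_000, 300_000
E = np.ones(N)                                   # unit mean energy
pairs = rng.integers(0, N, size=(steps, 2))
splits = rng.random(steps)
for (i, j), r in zip(pairs, splits):
    if i != j:
        pool = E[i] + E[j]
        E[i], E[j] = r * pool, (1 - r) * pool
print(E.mean(), np.mean(E > 1.0))                # mean conserved; tail ~ e^-1
```

Changing the split rule (e.g. saving a fixed fraction before splitting) is what moves the equilibrium away from the exponential toward the Gamma-type and other distributions listed in the abstract.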

  9. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    Science.gov (United States)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran, so the regional analysis of this parameter is important. The ET0 process is, however, affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations of Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting the annual ET0 and its four effective parameters. The RMSE results showed that the ability of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters was similar. The results also showed that the distribution type of the parameters affecting ET0 can affect the distribution of the reference evapotranspiration itself.

  10. Nonlatching positive feedback enables robust bimodality by decoupling expression noise from the mean

    Energy Technology Data Exchange (ETDEWEB)

    Razooky, Brandon S. [Rockefeller Univ., New York, NY (United States). Lab. of Virology and Infectious Disease; Gladstone Institutes (Virology and Immunology), San Francisco, CA (United States); Univ. of California, San Francisco, CA (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Nanophase Materials Science (CNMS); Univ. of Tennessee, Knoxville, TN (United States). Bredesen Center for Interdisciplinary; Cao, Youfang [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hansen, Maike M. K. [Gladstone Institutes (Virology and Immunology), San Francisco, CA (United States); Perelson, Alan S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Simpson, Michael L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Nanophase Materials Science (CNMS); Univ. of Tennessee, Knoxville, TN (United States). Bredesen Center for Interdisciplinary; Weinberger, Leor S. [Gladstone Institutes (Virology and Immunology), San Francisco, CA (United States); Univ. of California, San Francisco, CA (United States). Dept. of Biochemistry and Biophysics; Univ. of California, San Francisco, CA (United States). QB3: California Inst. of Quantitative Biosciences; Univ. of California, San Francisco, CA (United States). Dept. of Pharmaceutical Chemistry

    2017-10-18

    Fundamental to biological decision-making is the ability to generate bimodal expression patterns where two alternate expression states simultaneously exist. Here we use a combination of single-cell analysis and mathematical modeling to examine the sources of bimodality in the transcriptional program controlling HIV’s fate decision between active replication and viral latency. We find that the HIV Tat protein manipulates the intrinsic toggling of HIV’s promoter, the LTR, to generate bimodal ON-OFF expression, and that transcriptional positive feedback from Tat shifts and expands the regime of LTR bimodality. This result holds for both minimal synthetic viral circuits and full-length virus. Strikingly, computational analysis indicates that the Tat circuit’s non-cooperative ‘non-latching’ feedback architecture is optimized to slow the promoter’s toggling and generate bimodality by stochastic extinction of Tat. In contrast to the standard Poisson model, theory and experiment show that non-latching positive feedback substantially dampens the inverse noise-mean relationship to maintain stochastic bimodality despite increasing mean-expression levels. Given the rapid evolution of HIV, the presence of a circuit optimized to robustly generate bimodal expression appears consistent with the hypothesis that HIV’s decision between active replication and latency provides a viral fitness advantage. More broadly, the results suggest that positive-feedback circuits may have evolved not only for signal amplification but also for robustly generating bimodality by decoupling expression fluctuations (noise) from mean expression levels.

  11. DISCLOSING THE RADIO LOUDNESS DISTRIBUTION DICHOTOMY IN QUASARS: AN UNBIASED MONTE CARLO APPROACH APPLIED TO THE SDSS-FIRST QUASAR SAMPLE

    Energy Technology Data Exchange (ETDEWEB)

    Balokovic, M. [Department of Astronomy, California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Smolcic, V. [Argelander-Institut fuer Astronomie, Auf dem Hugel 71, D-53121 Bonn (Germany); Ivezic, Z. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Zamorani, G. [INAF-Osservatorio Astronomico di Bologna, via Ranzani 1, I-40127 Bologna (Italy); Schinnerer, E. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Kelly, B. C. [Department of Physics, Broida Hall, University of California, Santa Barbara, CA 93106 (United States)

    2012-11-01

    We investigate the dichotomy in the radio loudness distribution of quasars by modeling their radio emission and various selection effects using a Monte Carlo approach. The existence of two physically distinct quasar populations, the radio-loud and radio-quiet quasars, is controversial and over the last decade a bimodal distribution of radio loudness of quasars has been both affirmed and disputed. We model the quasar radio luminosity distribution with simple unimodal and bimodal distribution functions. The resulting simulated samples are compared to a fiducial sample of 8300 quasars drawn from the SDSS DR7 Quasar Catalog and combined with radio observations from the FIRST survey. Our results indicate that the SDSS-FIRST sample is best described by a radio loudness distribution which consists of two components, with (12 ± 1)% of sources in the radio-loud component. On the other hand, the evidence for a local minimum in the loudness distribution (bimodality) is not strong and we find that previous claims for its existence were probably affected by the incompleteness of the FIRST survey close to its faint limit. We also investigate the redshift and luminosity dependence of the radio loudness distribution and find tentative evidence that at high redshift radio-loud quasars were rarer, on average louder, and exhibited a smaller range in radio loudness. In agreement with other recent work, we conclude that the SDSS-FIRST sample strongly suggests that the radio loudness distribution of quasars is not a universal function, and that more complex models than presented here are needed to fully explain available observations.
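
The two-component modeling idea can be mimicked with a toy Monte Carlo. Illustrative Python: only the sample size (8300) and the 12% radio-loud fraction come from the abstract; the component locations and widths are invented for the sketch, and real samples additionally need the flux-limit incompleteness the paper models.

```python
import numpy as np

rng = np.random.default_rng(7)
n_quasars = 8300  # size of the fiducial SDSS-FIRST sample

# Toy two-component model of log radio loudness R: a dominant radio-quiet
# component plus a 12% radio-loud component (means/widths are illustrative).
is_loud = rng.random(n_quasars) < 0.12
log_R = np.where(is_loud,
                 rng.normal(2.5, 0.5, n_quasars),    # radio-loud
                 rng.normal(-0.5, 0.6, n_quasars))   # radio-quiet

# With well-separated components a simple cut at log R = 1 recovers the
# input radio-loud fraction; whether a local minimum (bimodality) is seen
# in a histogram depends on the overlap and on survey incompleteness.
frac_loud = np.mean(log_R > 1.0)
```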

  12. DISCLOSING THE RADIO LOUDNESS DISTRIBUTION DICHOTOMY IN QUASARS: AN UNBIASED MONTE CARLO APPROACH APPLIED TO THE SDSS-FIRST QUASAR SAMPLE

    International Nuclear Information System (INIS)

    Baloković, M.; Smolčić, V.; Ivezić, Ž.; Zamorani, G.; Schinnerer, E.; Kelly, B. C.

    2012-01-01

    We investigate the dichotomy in the radio loudness distribution of quasars by modeling their radio emission and various selection effects using a Monte Carlo approach. The existence of two physically distinct quasar populations, the radio-loud and radio-quiet quasars, is controversial and over the last decade a bimodal distribution of radio loudness of quasars has been both affirmed and disputed. We model the quasar radio luminosity distribution with simple unimodal and bimodal distribution functions. The resulting simulated samples are compared to a fiducial sample of 8300 quasars drawn from the SDSS DR7 Quasar Catalog and combined with radio observations from the FIRST survey. Our results indicate that the SDSS-FIRST sample is best described by a radio loudness distribution which consists of two components, with (12 ± 1)% of sources in the radio-loud component. On the other hand, the evidence for a local minimum in the loudness distribution (bimodality) is not strong and we find that previous claims for its existence were probably affected by the incompleteness of the FIRST survey close to its faint limit. We also investigate the redshift and luminosity dependence of the radio loudness distribution and find tentative evidence that at high redshift radio-loud quasars were rarer, on average louder, and exhibited a smaller range in radio loudness. In agreement with other recent work, we conclude that the SDSS-FIRST sample strongly suggests that the radio loudness distribution of quasars is not a universal function, and that more complex models than presented here are needed to fully explain available observations.

  13. The joint probability distribution of structure factors incorporating anomalous-scattering and isomorphous-replacement data

    International Nuclear Information System (INIS)

    Peschar, R.; Schenk, H.

    1991-01-01

    A method to derive joint probability distributions of structure factors is presented which incorporates anomalous-scattering and isomorphous-replacement data in a unified procedure. The structure factors F(H) and F(-H), whose magnitudes differ due to anomalous scattering, are shown to be isomorphously related. This leads to a definition of isomorphism by means of which isomorphous-replacement and anomalous-scattering data can be handled simultaneously. The definition and calculation of the general term of the joint probability distribution for isomorphous structure factors turn out to be crucial. Its analytical form leads to an algorithm by means of which any particular joint probability distribution of structure factors can be constructed. The calculation of the general term is discussed for the case of four isomorphous structure factors in P1, assuming the atoms to be independently and uniformly distributed. A main result is the construction of the probability distribution of the 64 triplet phase sums present in space group P1 amongst four isomorphous structure factors F(H), four isomorphous F(K) and four isomorphous F(-H-K). The procedure is readily generalized for the case where an arbitrary number of isomorphous structure factors are available for F(H), F(K) and F(-H-K). (orig.)

  14. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices. There are two classes of error detecting codes: classical codes and security-oriented codes. The classical codes detect a high percentage of errors, but they have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes; a detailed study of this parameter allows analyzing the behavior of the code when errors are injected into the encoding device. The complexity of the encoding function plays an important role in security-oriented codes: encoding functions with low computational complexity and a low probability of masking provide the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, it decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper also discusses an approach to measuring the error masking probability.
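
For the simplest (linear) encoding functions the masking distribution is degenerate: an additive error is masked, for every transmitted word alike, exactly when the error is itself a nonzero codeword, so the masking fraction over uniformly chosen nonzero errors is (2^k - 1)/(2^n - 1). Security-oriented (nonlinear) encodings reshape this distribution, which is what the paper studies. A small check on the [7,4] Hamming code (illustrative Python; the generator matrix is one standard form):

```python
import numpy as np
from itertools import product

# Generator matrix of the [7,4] Hamming code (systematic form).
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

codewords = {tuple((np.array(m) @ G) % 2) for m in product((0, 1), repeat=4)}

# For a linear code an additive error e is masked (undetected) iff e is a
# nonzero codeword, independently of the transmitted codeword.
n = G.shape[1]
masked = sum(1 for e in product((0, 1), repeat=n)
             if any(e) and tuple(e) in codewords)
q_mask = masked / (2**n - 1)   # fraction of nonzero errors that are masked
```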

  15. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    International Nuclear Information System (INIS)

    Marshman, Emily; Singh, Chandralekha

    2017-01-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations. (paper)
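
The formalism the QuILT targets is compact: expand the state in the observable's eigenbasis and square the amplitudes, P(a_i) = |⟨a_i|ψ⟩|². A minimal numeric check for a spin-1/2 observable (illustrative Python, not part of the QuILT itself):

```python
import numpy as np

# Observable: spin-x for a spin-1/2 particle, in units with hbar = 1.
Sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)

psi = np.array([1.0, 0.0], dtype=complex)   # state |up_z>

# Eigendecomposition gives the measurement outcomes (eigenvalues) and the
# amplitudes <a_i|psi>; squaring the amplitudes gives the probabilities.
evals, evecs = np.linalg.eigh(Sx)
amplitudes = evecs.conj().T @ psi
probs = np.abs(amplitudes) ** 2   # here 1/2 for each outcome +-1/2
```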

  16. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
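
The classical baseline the paper complements runs in one line with scipy; Kuiper's variant (statistic D+ + D− instead of max(D+, D−)) is not in scipy and is only noted in a comment. Illustrative Python, with the heavy-tailed alternative chosen by us:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Draws that do NOT come from the specified density: Student-t with 3
# degrees of freedom, tested against a standard-normal null.
draws = stats.t.rvs(df=3, size=5000, random_state=rng)

ks = stats.kstest(draws, "norm")   # Kolmogorov-Smirnov against N(0, 1)
# Kuiper's variant replaces max(D+, D-) by D+ + D-, giving comparable
# sensitivity in both tails -- exactly the tail-discrepancy regime where,
# as the abstract notes, CDF-based tests can still smooth over small
# low-probability regions.
```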

  17. BDVC (Bimodal Database of Violent Content): A database of violent audio and video

    Science.gov (United States)

    Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro

    2017-09-01

    Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization and retrieval applications dealing with a single type of content such as text, voice or images; bimodal databases, in contrast, allow two different types of content, such as audio-video or image-text, to be associated semantically. The generation of a bimodal audio-video database implies creating a connection between the multimedia contents through the semantic relation that associates the actions of both types of information. This paper describes in detail the characteristics and methodology used for the creation of the bimodal database of violent content; the semantic relationship is established by the proposed concepts that describe the audiovisual information. The use of bimodal databases in applications related to audiovisual content processing increases semantic performance if and only if these applications process both types of content. The bimodal database contains 580 annotated audiovisual segments, with a total duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool for the generation of applications for the semantic web.

  18. Multimode Interference: Identifying Channels and Ridges in Quantum Probability Distributions

    OpenAIRE

    O'Connell, Ross C.; Loinaz, Will

    2004-01-01

    The multimode interference technique is a simple way to study the interference patterns found in many quantum probability distributions. We demonstrate that this analysis not only explains the existence of so-called "quantum carpets," but can explain the spatial distribution of channels and ridges in the carpets. With an understanding of the factors that govern these channels and ridges we have a limited ability to produce a particular pattern of channels and ridges by carefully choosing the ...

  19. Comparision of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
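
The fit-and-test loop, plus the conditional ("elapsed time") probability P(T ≤ t0+Δt | T > t0) = (F(t0+Δt) − F(t0)) / (1 − F(t0)), can be sketched with scipy on synthetic inter-event times. Illustrative Python; `weibull_min` is the 2-parameter Weibull and `invweibull` the Frechet distribution, and all numeric values are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic inter-event times (years) standing in for the M >= 6.0
# catalogue; shape/scale are illustrative, not fitted to real data.
interevent = stats.weibull_min.rvs(1.4, scale=6.0, size=80, random_state=rng)

fits = {}
for name, dist in {"weibull_min": stats.weibull_min,
                   "invweibull": stats.invweibull}.items():
    params = dist.fit(interevent, floc=0)              # location fixed at 0
    fits[name] = stats.kstest(interevent, dist.cdf, args=params)

# Conditional probability of an event within dt years, given t0 quiet years:
c, loc, scale = stats.weibull_min.fit(interevent, floc=0)
F = lambda t: stats.weibull_min.cdf(t, c, loc, scale)
t0, dt = 5.0, 10.0
cond_prob = (F(t0 + dt) - F(t0)) / (1.0 - F(t0))
```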

  20. Development of a Prototype Web GIS-Based Disaster Management System for Safe Operation of the Next Generation Bimodal Tram, South Korea—Focused Flooding and Snowfall

    Directory of Open Access Journals (Sweden)

    Won Seok Jang

    2014-04-01

    Full Text Available The Korea Railroad Research Institute (KRRI) has developed a bimodal tram and advanced bus rapid transit (BRT) system, an optimized public transit system created by mixing the railway’s punctual operation and the bus’ easy and convenient access. The bimodal tram system provides mass-transportation service with an eco-friendly and human-centered approach. Natural disasters have been increasing worldwide in recent years, including flood, snow and typhoon disasters. Flooding is the most frequent natural disaster in many countries and is increasingly a concern with climate change; it seriously affects people’s lives and productivity, causing considerable economic loss and significant damage. Enhanced conventional disaster management systems are needed to support comprehensive actions to secure safety and convenience. The objective of this study is to develop a prototype version of a Web GIS-based bimodal tram disaster management system (BTDMS using the Storm Water Management Model (SWMM 5.0 to enhance on-time operation and safety of the bimodal tram system. The BTDMS was tested at the bimodal tram test railroad by simulating probable maximum flood (PMF and snow melting for forecasting flooding and snow-covered roads. This result could provide the basis for plans to protect against flooding disasters and snow-covered roads in operating the bimodal tram system. The BTDMS will be used to assess and predict weather impacts on roadway conditions and operations and thus has the potential to influence economic growth. The methodology presented in this paper makes it possible to manage impacts of flooding and snowfall on urban transportation and enhance operation of the bimodal tram system. Such a methodology based on modeling could be created for most metropolitan areas in Korea and in many other countries.

  1. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first one concerns the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.
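
One edge of such a network is easy to verify numerically: the sum of k independent Exponential(rate) variables is Gamma(shape = k, scale = 1/rate), with the chi-square distribution as the special case rate = 1/2 and 2k degrees of freedom. A sketch (illustrative Python; k and rate chosen by us):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Sum k i.i.d. Exponential(rate) draws and compare against the Gamma law
# predicted by the relationship diagram.
k, rate = 4, 2.0
sums = stats.expon.rvs(scale=1 / rate, size=(20000, k),
                       random_state=rng).sum(axis=1)

d, p = stats.kstest(sums, stats.gamma(a=k, scale=1 / rate).cdf)
# A large p-value is consistent with the Exponential -> Gamma edge.
```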

  2. Speech Recognition and Cognitive Skills in Bimodal Cochlear Implant Users

    Science.gov (United States)

    Hua, Håkan; Johansson, Björn; Magnusson, Lennart; Lyxell, Björn; Ellis, Rachel J.

    2017-01-01

    Purpose: To examine the relation between speech recognition and cognitive skills in bimodal cochlear implant (CI) and hearing aid users. Method: Seventeen bimodal CI users (28-74 years) were recruited to the study. Speech recognition tests were carried out in quiet and in noise. The cognitive tests employed included the Reading Span Test and the…

  3. The distribution function of a probability measure on a space with a fractal structure

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Granero, M.A.; Galvez-Rodriguez, J.F.

    2017-07-01

    In this work we show how to define a probability measure with the help of a fractal structure. One of the keys of this approach is to use the completion of the fractal structure. Then we use the theory of a cumulative distribution function on a Polish ultrametric space and describe it in this context. Finally, with the help of fractal structures, we prove that a function satisfying the properties of a cumulative distribution function on a Polish ultrametric space is a cumulative distribution function with respect to some probability measure on the space. (Author)

  4. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...

  5. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham; Alouini, Mohamed-Slim

    2017-01-01

    This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP

  6. Functionalized bimodal mesoporous silicas as carriers for controlled aspirin delivery

    International Nuclear Information System (INIS)

    Gao Lin; Sun Jihong; Li Yuzhen

    2011-01-01

    The bimodal mesoporous silica modified with 3-aminopropyltriethoxysilane was used as the aspirin carrier. The samples' structure and drug loading and release profiles were characterized with X-ray diffraction, scanning electron microscopy, N2 adsorption and desorption, Fourier transform infrared spectroscopy, TG analysis, elemental analysis and UV spectrophotometry. To further explore the effects of the bimodal mesopores on the drug delivery behavior, the unimodal mesoporous material MCM-41 was also modified as an aspirin carrier. Meanwhile, the Korsmeyer-Peppas equation f_t = k·t^n was employed to analyze the dissolution data in detail. It is indicated that the bimodal mesopores allow unrestricted diffusion of drug molecules and therefore lead to a higher loading and faster release than MCM-41. The results show that the aspirin delivery properties are influenced considerably by the mesoporous matrix, and that the large pore of the bimodal mesoporous silica is the key to the improved controlled-release properties. - Graphical abstract: Loading (A) and release profiles (B) of aspirin in N-BMMs and N-MCM-41 indicate that BMMs have a higher drug loading capacity and faster release rate than MCM-41. Highlights: → Bimodal mesoporous silicas (BMMs) and MCM-41 modified with amino groups via a post-treatment procedure. → Loading and release profiles of aspirin in modified BMMs and MCM-41. → Modified BMMs have a higher drug loading capacity and faster release rate than modified MCM-41.
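
The Korsmeyer-Peppas analysis is a two-parameter power-law fit, f_t = k·t^n, applied to the fractional release curve. A sketch with invented release data (illustrative Python; these are not the paper's measurements, and the data were synthesized near k = 0.25, n = 0.55):

```python
import numpy as np
from scipy.optimize import curve_fit

# Fractional aspirin release f at sampling times t (hours); synthetic data
# generated near f = 0.25 * t**0.55 for illustration.
t = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0])
f = np.array([0.12, 0.17, 0.25, 0.37, 0.54, 0.67])

model = lambda t, k, n: k * t**n
(k_fit, n_fit), _ = curve_fit(model, t, f, p0=(0.2, 0.5))
# The exponent n characterizes the release mechanism; values near 0.5 are
# conventionally read as diffusion-controlled (Fickian) release.
```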

  7. The p-sphere and the geometric substratum of power-law probability distributions

    International Nuclear Information System (INIS)

    Vignat, C.; Plastino, A.

    2005-01-01

    Links between power law probability distributions and marginal distributions of uniform laws on p-spheres in R^n show that a mathematical derivation of the Boltzmann-Gibbs distribution necessarily passes through power law ones. Results are also given that link the parameters p and n to the value of the non-extensivity parameter q that characterizes these power laws in the context of non-extensive statistics.
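
The p = 2 case is easy to check numerically: a single coordinate of a uniform point on the Euclidean sphere S^{n-1} has the power-law density proportional to (1 − t²)^((n−3)/2) on [−1, 1], with variance exactly 1/n, and tends to a Gaussian as n grows. A sketch (illustrative Python; the dimension n = 6 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 6                                                    # ambient dimension

# Uniform points on S^{n-1}: normalize standard Gaussian vectors.
x = rng.normal(size=(200000, n))
sphere = x / np.linalg.norm(x, axis=1, keepdims=True)

coord = sphere[:, 0]   # marginal density proportional to (1 - t^2)**((n-3)/2)
var = coord.var()      # exact value is 1/n
```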

  8. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides detonated by rain, commonly known as "soil-slips", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in suction when a humid front enters the soil, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model
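
The derived-distributions pipeline is easiest to see as a Monte Carlo: push random storms through a deterministic failure model and read off the exceedance probability of the factor of safety (FOS). Illustrative Python; the exponential storm parameters and the decaying-FOS failure model below are toy placeholders, not the paper's Philip-infiltration-based model:

```python
import numpy as np

rng = np.random.default_rng(21)

# Storms under the RPPP assumption: independent exponential mean intensity
# (mm/h) and duration (h); both parameters are invented for the sketch.
n_storms = 200000
intensity = rng.exponential(20.0, n_storms)
duration = rng.exponential(6.0, n_storms)

def factor_of_safety(i, d, fos_dry=1.8, k=6e-3):
    """Toy stability model: FOS decays with total storm depth i*d from its
    dry-slope value toward the residual value 1."""
    return 1.0 + (fos_dry - 1.0) * np.exp(-k * i * d)

# The storm statistics are "derived through" the deterministic model to
# give the FOS distribution; exceedance probability is then a tail count.
fos = factor_of_safety(intensity, duration)
p_fail = np.mean(fos < 1.05)   # per-storm probability of near-failure
```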

  9. The role of martensitic transformation on bimodal grain structure in ultrafine grained AISI 304L stainless steel

    International Nuclear Information System (INIS)

    Sabooni, S.; Karimzadeh, F.; Enayati, M.H.; Ngan, A.H.W.

    2015-01-01

    In the present study, metastable AISI 304L austenitic stainless steel samples were subjected to different cold rolling reductions from 70% to 93%, followed by annealing at 700 °C for 300 min to form ultrafine grained (UFG) austenite with different grain structures. Transmission electron microscopy (TEM) and nanoindentation were used to characterize the martensitic transformation, in order to relate it to the bimodal distribution of the austenite grain size after subsequent annealing. The results showed that the martensite morphology changed from lath type in the 60% rolled sample to a mixture of lath and dislocation-cell types at higher rolling reductions. Calculation of the Gibbs free energy change during the reversion treatment showed that the reversion mechanism is shear controlled at the annealing temperature, so the morphology of the reverted austenite depends completely on the morphology of the deformation-induced martensite. It was found that the austenite had a bimodal grain size distribution in the 80% rolled and annealed state, which is related to the existence of different types of martensite. Increasing the rolling reduction to 93% followed by annealing changed the grain structure to a monomodal-like structure, mostly covered with small grains of around 300 nm. The existence of a bimodal austenite grain size in the 80% rolled and annealed 304L stainless steel led to improved ductility while maintaining a high tensile strength in comparison with the 93% rolled and annealed sample.

  10. Multiple regimes of operation in bimodal AFM: understanding the energy of cantilever eigenmodes

    Directory of Open Access Journals (Sweden)

    Daniel Kiracofe

    2013-06-01

    Full Text Available One of the key goals in atomic force microscopy (AFM imaging is to enhance material property contrast with high resolution. Bimodal AFM, where two eigenmodes are simultaneously excited, confers significant advantages over conventional single-frequency tapping mode AFM due to its ability to provide contrast between regions with different material properties under gentle imaging conditions. Bimodal AFM traditionally uses the first two eigenmodes of the AFM cantilever. In this work, the authors explore the use of higher eigenmodes in bimodal AFM (e.g., exciting the first and fourth eigenmodes. It is found that such operation leads to interesting contrast reversals compared to traditional bimodal AFM. A series of experiments and numerical simulations shows that the primary cause of the contrast reversals is not the choice of eigenmode itself (e.g., second versus fourth, but rather the relative kinetic energy between the higher eigenmode and the first eigenmode. This leads to the identification of three distinct imaging regimes in bimodal AFM. This result, which is applicable even to traditional bimodal AFM, should allow researchers to choose cantilever and operating parameters in a more rational manner in order to optimize resolution and contrast during nanoscale imaging of materials.

  11. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    Science.gov (United States)

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
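
The two-step recipe (spectral coloring of white Gaussian noise, then a pointwise inverse-CDF transform of the marginal) can be sketched as follows. Illustrative Python: the isotropic low-pass spectrum and the exponential target marginal are arbitrary choices for the demo, and, as the abstract notes, the pointwise transform slightly distorts the spectrum (the "engineering approach"):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
N = 256

# Step 1: color white Gaussian noise with a prescribed power spectrum by
# filtering in the Fourier domain (H is the square root of the PSD).
white = rng.normal(size=(N, N))
kx = np.fft.fftfreq(N)
kmag = np.hypot(*np.meshgrid(kx, kx, indexing="ij"))
H = 1.0 / np.sqrt(1.0 + (kmag / 0.1) ** 2)

colored = np.real(np.fft.ifft2(np.fft.fft2(white) * H))
colored -= colored.mean()
colored /= colored.std()          # unit-variance correlated Gaussian field

# Step 2: map the Gaussian marginal to the desired one via an inverse
# transform: Gaussian CDF -> uniform -> target quantile function.
u = stats.norm.cdf(colored)
field = stats.expon.ppf(u, scale=2.0)   # exponential marginal, mean 2
```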

  12. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses

  13. Predicting dihedral angle probability distributions for protein coil residues from primary sequence using neural networks

    DEFF Research Database (Denmark)

    Helles, Glennie; Fonseca, Rasmus

    2009-01-01

    ...seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been done previously, none have, to our knowledge, presented comparable results for the probability distribution of dihedral angles. Results: In this paper we develop an artificial neural network that uses an input-window of amino acids to predict a dihedral angle probability distribution for the middle residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first order statistics. An accuracy comparable to that of secondary...

  14. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    ...major inequalities due to Shannon, Rényi and Hölder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained for the mutual divergence among two or more probability distributions.

  15. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    Science.gov (United States)

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution, using as data points either the lower, mid or upper bounds of the sampling intervals, or the cumulative distribution of observed values (with either maximum likelihood or non-linear least squares for parameter estimation); we then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to the original distributions from which propagule retention times were simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations, ranging from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We
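    The best-performing method identified here — fitting the parametric CDF to the cumulative distribution of the interval-censored data by non-linear least squares — can be sketched as follows; the lognormal retention times, sample size and 4 h sampling intervals are invented for illustration:

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(1)
    true = stats.lognorm(s=0.5, scale=8.0)   # hypothetical retention-time distribution (hours)
    times = true.rvs(size=500, random_state=rng)

    # Discretize as in a feeding trial: count propagules recovered per sampling interval.
    edges = np.arange(0.0, 49.0, 4.0)        # 4 h sampling intervals
    counts, _ = np.histogram(times, bins=edges)
    ecdf = np.cumsum(counts) / counts.sum()  # empirical CDF at the upper interval bounds

    # Fit the parametric CDF to the empirical cumulative distribution (least squares).
    def cdf(t, s, scale):
        return stats.lognorm.cdf(t, s, scale=scale)

    (p_s, p_scale), _ = optimize.curve_fit(cdf, edges[1:], ecdf, p0=(1.0, 5.0),
                                           bounds=([0.01, 0.1], [10.0, 100.0]))
    ```

    With a few hundred simulated propagules the fit recovers the generating parameters closely, consistent with the small cumulative-probability deviations reported for this method.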

  16. Outage probability of distributed beamforming with co-channel interference

    KAUST Repository

    Yang, Liang

    2012-03-01

    In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.

  17. A transmission probability method for calculation of neutron flux distributions in hexagonal geometry

    International Nuclear Information System (INIS)

    Wasastjerna, F.; Lux, I.

    1980-03-01

    A transmission probability method implemented in the program TPHEX is described. This program was developed for the calculation of neutron flux distributions in hexagonal light water reactor fuel assemblies. The accuracy appears to be superior to diffusion theory, and the computation time is shorter than that of the collision probability method. (author)

  18. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    Science.gov (United States)

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  19. Bimodal nature in low-energy fission of light actinides

    International Nuclear Information System (INIS)

    Nagame, Yuichiro; Nishinaka, Ichiro; Tsukada, Kazuaki; Ikezoe, Hiroshi; Otsuki, Tsutomu; Sueki, Keisuke; Nakahara, Hiromichi; Kudo, Hisaaki.

    1995-01-01

    To address various problems in the mass-division process of light actinides, experiments based on bimodal fission were carried out. Mass and kinetic energy distributions of Th-232 and U-238 were determined. The fissioning nuclei Pa-225 (N = 134) and Pa-227 (N = 136) were produced by the Bi-209 + O-16 and Bi-209 + O-18 heavy-ion reactions, and the mass yield distributions were determined by the time-of-flight method and by radiochemical procedures. The results demonstrate two independent deformation paths in the fission process of light actinide nuclei. On the path through the low fission barrier, the nucleus fissions after small deformation, under the influence of the shell-structure stabilization of the fission products. On the path through the high barrier, however, the nucleus fissions after large deformation. Asymmetric mass division derives from the former and symmetric mass division from the latter. (S.Y.)

  20. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  1. Corredor Bimodal Cafetero

    OpenAIRE

    Duque Escobar, Gonzalo

    2015-01-01

    The Corredor Bimodal Cafetero is a strategic infrastructure project that links the Hidrovía del Magdalena with the Corredor Férreo del río Cauca, included in the National Development Plan 2014/2018 and financeable through the shipment of 30 thousand tonnes per day of Andean coal to the Pacific basin. It includes the Túnel Cumanday for crossing the Cordillera Central, the 150 km Ferrocarril Cafetero with a 3% grade between La Dorada and Km 41, and the 108 km Transversal Cafetera for a road of...

  2. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  3. Synthesis and optical characterization of copper nanoparticles ...

    Indian Academy of Sciences (India)

    TEM analysis showed that the NPs were spherical with a bimodal distribution and an average ... be suspended for a long time [15]. However, by .... However, the optimum adjusted .... They found that the probability of such a bimodal distribution ...

  4. A course in bimodal provability logic

    NARCIS (Netherlands)

    Visser, A.

    The aim of the present paper is twofold: first, I am somewhat dissatisfied with current treatments of Bimodal Provability Logic: the models employed there are singled out by certain syntactical conditions; moreover, they validate the logics under consideration only locally. In this paper I give a

  5. Probabilities of filaments in a Poissonian distribution of points -I

    International Nuclear Information System (INIS)

    Betancort-Rijo, J.

    1989-01-01

    Statistical techniques are devised to assess the likelihood of a Poisson sample of points in two and three dimensions containing specific filamentary structures. For that purpose, the expression of Otto et al. (1986, Astrophys. J., 304) for the probability density of clumps in a Poissonian distribution of points is generalized for any value of the density contrast. A way of counting filaments differing from that of Otto et al. is proposed, because at low density contrast the filaments counted by Otto et al. are distributed in a clumpy fashion, each clump of filaments corresponding to a distinct observed filament. (author)

  6. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    Science.gov (United States)

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbyte of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132 column printer. A graphics adapter and color display are optional. © 1991.
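    Under the triangular model the component means and variances have closed forms, so aggregation under complete independence reduces to summing them (standard deviations combine in quadrature); the province estimates below are invented for illustration:

    ```python
    import math

    def tri_mean_var(lo, mode, hi):
        """Mean and variance of a triangular(min, mode, max) distribution."""
        mean = (lo + mode + hi) / 3.0
        var = (lo**2 + mode**2 + hi**2 - lo*mode - lo*hi - mode*hi) / 18.0
        return mean, var

    # Hypothetical (min, mode, max) resource estimates for three provinces.
    provinces = [(0.0, 2.0, 10.0), (1.0, 3.0, 6.0), (0.5, 1.0, 4.0)]
    moments = [tri_mean_var(*p) for p in provinces]

    # Complete independence: means add, variances add.
    agg_mean = sum(m for m, _ in moments)
    agg_std = math.sqrt(sum(v for _, v in moments))
    ```

    Under perfect positive correlation the standard deviations would instead add linearly, giving the widest aggregate spread of the three situations.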

  7. Idealized models of the joint probability distribution of wind speeds

    Science.gov (United States)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
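    The bivariate Weibull construction described above — power-law transforms of speeds built from correlated, isotropic, mean-zero Gaussian components — can be sampled in a few lines; the correlation, shape and scale values are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000
    rho, k, lam = 0.6, 2.4, 8.0   # illustrative component correlation, Weibull shape/scale

    # Correlated, isotropic, mean-zero Gaussian velocity components at two sites.
    cov = [[1.0, rho], [rho, 1.0]]
    u = rng.multivariate_normal([0.0, 0.0], cov, size=n)   # zonal components
    v = rng.multivariate_normal([0.0, 0.0], cov, size=n)   # meridional components
    s = np.hypot(u, v)            # shape (n, 2): Rayleigh speeds at both sites

    # A Rayleigh speed is Weibull with shape 2; raising it to the power 2/k
    # yields a Weibull marginal with shape k (the power-law transform).
    w = lam * (s / np.sqrt(2.0)) ** (2.0 / k)
    ```

    Each column of `w` then has a Weibull marginal with shape k and scale lam, while the component correlation rho induces the dependence between the two speeds.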

  8. Probability distribution functions of turbulence in seepage-affected alluvial channel

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Anurag; Kumar, Bimlesh, E-mail: anurag.sharma@iitg.ac.in, E-mail: bimk@iitg.ac.in [Department of Civil Engineering, Indian Institute of Technology Guwahati, 781039 (India)

    2017-02-15

    The present experimental study is carried out on the probability distribution functions (PDFs) of turbulent flow characteristics within near-bed and away-from-bed surfaces for both no-seepage and seepage flows. Laboratory experiments were conducted on a plane sand bed for the no seepage (NS), 10% seepage (10%S) and 15% seepage (15%S) cases. The experimentally calculated PDFs of turbulent parameters such as Reynolds shear stress, velocity fluctuations, and bursting events are compared with the theoretical expressions obtained from a Gram–Charlier (GC)-based exponential distribution. Experimental observations follow the computed PDF distributions for both no-seepage and seepage cases. The Jensen-Shannon divergence (JSD) method is used to measure the similarity between theoretical and experimental PDFs. The value of JSD for the PDFs of velocity fluctuations lies between 0.0005 and 0.003, while the JSD value for the PDFs of Reynolds shear stress varies between 0.001 and 0.006. Even with the application of seepage, the PDF distributions of bursting events, sweeps and ejections are well characterized by the exponential distribution of the GC series, except that a slight deflection of inward and outward interactions is observed, which may be due to weaker events. The value of JSD for outward and inward interactions ranges from 0.0013 to 0.032, while the JSD value for sweep and ejection events varies between 0.0001 and 0.0025. A theoretical expression for the PDF of turbulent intensity is developed in the present study, which agrees well with the experimental observations, with JSD between 0.007 and 0.015. The work presented is potentially applicable to the probability distribution of mobile-bed sediments in seepage-affected alluvial channels typically characterized by the various turbulent parameters. The purpose of PDF estimation from experimental data is that it provides a complete numerical description in the areas of turbulent flow either at a single or finite number of points
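    The Jensen-Shannon comparison used in this study to score the agreement between empirical and theoretical PDFs can be sketched as follows, with Gaussian stand-in data in place of measured velocity fluctuations:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.spatial.distance import jensenshannon

    rng = np.random.default_rng(3)
    samples = rng.standard_normal(50_000)    # stand-in for velocity-fluctuation data

    # Empirical PDF from a histogram, and the theoretical PDF on the same bins.
    edges = np.linspace(-5.0, 5.0, 101)
    emp, _ = np.histogram(samples, bins=edges, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    theo = stats.norm.pdf(centers)

    # SciPy's jensenshannon returns the square root of the divergence; square it.
    jsd = jensenshannon(emp, theo, base=2) ** 2
    ```

    With well-matched distributions the divergence comes out close to zero, on the same order as the smallest JSD values quoted in the abstract.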

  9. Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum

    Directory of Open Access Journals (Sweden)

    Farshid eSepehrband

    2016-05-01

    Full Text Available Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters, which has been consistently shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as location and scale of the mode and the profile of distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log normal, log logistic and Birnbaum-Saunders distributions.

  10. Bimodal atomic force microscopy imaging of isolated antibodies in air and liquids

    International Nuclear Information System (INIS)

    Martínez, N F; Lozano, J R; Herruzo, E T; García, F; García, R; Richter, C; Sulzbach, T

    2008-01-01

    We have developed a dynamic atomic force microscopy (AFM) method based on the simultaneous excitation of the first two flexural modes of the cantilever. The instrument, called a bimodal atomic force microscope, allows us to resolve the structural components of antibodies in both monomer and pentameric forms. The instrument operates in both high and low quality factor environments, i.e., air and liquids. We show that under the same experimental conditions, bimodal AFM is more sensitive to compositional changes than amplitude modulation AFM. By using theoretical and numerical methods, we study the material contrast sensitivity as well as the forces applied on the sample during bimodal AFM operation

  11. Early Bimodal Stimulation Benefits Language Acquisition for Children With Cochlear Implants.

    Science.gov (United States)

    Moberly, Aaron C; Lowenstein, Joanna H; Nittrouer, Susan

    2016-01-01

    Adding a low-frequency acoustic signal to the cochlear implant (CI) signal (i.e., bimodal stimulation) for a period of time early in life improves language acquisition. Children must acquire sensitivity to the phonemic units of language to develop most language-related skills, including expressive vocabulary, working memory, and reading. Acquiring sensitivity to phonemic structure depends largely on having refined spectral (frequency) representations available in the signal, which does not happen with CIs alone. Combining the low-frequency acoustic signal available through hearing aids with the CI signal can enhance signal quality. A period with this bimodal stimulation has been shown to improve language skills in very young children. This study examined whether these benefits persist into childhood. Data were examined for 48 children with CIs implanted under age 3 years, participating in a longitudinal study. All children wore hearing aids before receiving a CI, but upon receiving a first CI, 24 children had at least 1 year of bimodal stimulation (Bimodal group), and 24 children had only electric stimulation subsequent to implantation (CI-only group). Measures of phonemic awareness were obtained at second and fourth grades, along with measures of expressive vocabulary, working memory, and reading. Children in the Bimodal group generally performed better on measures of phonemic awareness, and that advantage was reflected in other language measures. Having even a brief period of time early in life with combined electric-acoustic input provides benefits to language learning into childhood, likely because of the enhancement in spectral representations provided.

  12. Universal Probability Distribution Function for Bursty Transport in Plasma Turbulence

    International Nuclear Information System (INIS)

    Sandberg, I.; Benkadda, S.; Garbet, X.; Ropokis, G.; Hizanidis, K.; Castillo-Negrete, D. del

    2009-01-01

    Bursty transport phenomena associated with convective motion present universal statistical characteristics among different physical systems. In this Letter, a stochastic univariate model and the associated probability distribution function for the description of bursty transport in plasma turbulence are presented. The proposed stochastic process recovers the universal distribution of density fluctuations observed in the plasma edge of several magnetic confinement devices and the remarkable scaling between their skewness S and kurtosis K. Similar statistical characteristics have also been observed in other physical systems characterized by convection, such as the x-ray fluctuations emitted by the Cygnus X-1 accretion disc plasmas and the sea surface temperature fluctuations.

  13. Non-Gaussian probability distributions of solar wind fluctuations

    Directory of Open Access Journals (Sweden)

    E. Marsch

    Full Text Available The probability distributions of field differences ∆x(τ) = x(t+τ) − x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lags τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scale τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.
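    The lag-dependent change of shape reported here can be reproduced qualitatively on synthetic data by comparing the flatness (kurtosis) of increment PDFs at short and long lags; the heavy-tailed random walk below is only a crude stand-in for an intermittent solar wind signal:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    # Stand-in series: a random walk with heavy-tailed (Student-t) steps.
    steps = rng.standard_t(df=3, size=200_000)
    x = np.cumsum(steps)

    def increment_flatness(x, tau):
        """Kurtosis (flatness) of the increment PDF dx = x(t+tau) - x(t)."""
        dx = x[tau:] - x[:-tau]
        dx = (dx - dx.mean()) / dx.std()
        return np.mean(dx**4)      # the Gaussian value is 3

    f_short = increment_flatness(x, 1)      # spiky, heavy-tailed increment PDF
    f_long = increment_flatness(x, 1000)    # near-Gaussian increment PDF
    ```

    Short-lag increments retain the spiky, heavy-tailed shape, while long-lag increments are rendered nearly Gaussian by summation, mirroring the behavior seen in the Helios data.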

  14. Deaf Children's Bimodal Bilingualism and Education

    Science.gov (United States)

    Swanwick, Ruth

    2016-01-01

    This paper provides an overview of the research into deaf children's bilingualism and bilingual education through a synthesis of studies published over the last 15 years. This review brings together the linguistic and pedagogical work on bimodal bilingualism to inform educational practice. The first section of the review provides a synthesis of…

  15. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    Science.gov (United States)

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
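    The relation C = Np can be illustrated with a small simulation in which the true population size N stays constant while the detection probability p declines, producing an apparent decline in raw counts; all numbers are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    N = 400                        # true population size (constant across years)
    p_by_year = [0.7, 0.5, 0.3]    # detection probability declining over time

    # Observed counts C ~ Binomial(N, p), averaged over 200 surveys per year.
    counts = [rng.binomial(N, p, size=200).mean() for p in p_by_year]

    naive_trend = counts[-1] / counts[0]                    # looks like a steep "decline"
    adjusted = [c / p for c, p in zip(counts, p_by_year)]   # estimates of N per year
    ```

    The unadjusted counts suggest the population shrank by more than half, while the detection-adjusted estimates recover the constant population size, which is exactly the confound the abstract warns about.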

  16. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  17. Extreme points of the convex set of joint probability distributions with ...

    Indian Academy of Sciences (India)

    Here we address the following problem: If G is a standard ... convex set of all joint probability distributions on the product Borel space (X1 × X2, F1 ⊗ F2) which .... cannot be identically zero when X and Y vary in A1 and u and v vary in H2. Thus.

  18. Flux-probability distributions from the master equation for radiation transport in stochastic media

    International Nuclear Information System (INIS)

    Franke, Brian C.; Prinja, Anil K.

    2011-01-01

    We present numerical investigations into the accuracy of approximations in the master equation for radiation transport in discrete binary random media. Our solutions of the master equation yield probability distributions of particle flux at each element of phase space. We employ the Levermore-Pomraning interface closure and evaluate the effectiveness of closures for the joint conditional flux distribution for estimating scattering integrals. We propose a parameterized model for this joint-pdf closure, varying between correlation neglect and a full-correlation model. The closure is evaluated for a variety of parameter settings. Comparisons are made with benchmark results obtained through suites of fixed-geometry realizations of random media in rod problems. All calculations are performed using Monte Carlo techniques. Accuracy of the approximations in the master equation is assessed by examining the probability distributions for reflection and transmission and by evaluating the moments of the pdfs. The results suggest the correlation-neglect setting in our model performs best and shows improved agreement in the atomic-mix limit. (author)

  19. How the bimodal format of presentation affects working memory: an overview.

    Science.gov (United States)

    Mastroberardino, Serena; Santangelo, Valerio; Botta, Fabiano; Marucci, Francesco S; Olivetti Belardinelli, Marta

    2008-03-01

    The best format for presenting information that later has to be recalled has been investigated in several studies focusing on the impact of bimodal stimulation on working memory performance. An enhancement of participants' performance in terms of correct recall has been repeatedly found when bimodal formats of presentation (i.e., audiovisual) were compared to unimodal formats (i.e., either visual or auditory), with implications for multimedia learning. Several theoretical frameworks have been suggested to account for the bimodal advantage, ranging from those emphasizing early stages of processing (such as automatic alerting effects or multisensory integration processes) to those centred on late stages of processing (as postulated by the dual coding theory). The aim of this paper is to review previous contributions to this topic, providing a comprehensive theoretical framework updated by the latest empirical studies.

  20. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at time t given that it was down at time t' (t' ≤ t). The limitations on the applicability of the method are also discussed. It is concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)
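
For context, the textbook special case that the paper's integral-equation method generalizes is a minimal cut set of independent repairable components with constant failure and repair rates, where the two-state Markov result gives a closed-form unavailability. A minimal sketch with hypothetical rates:

```python
import numpy as np

def component_unavailability(t, lam, mu):
    """Point unavailability of a repairable component with constant failure
    rate lam and repair rate mu, starting from the working state (standard
    two-state Markov result)."""
    return lam / (lam + mu) * (1.0 - np.exp(-(lam + mu) * t))

def cut_set_unavailability(t, components):
    """A minimal cut set is down only when every component in it is down;
    for independent components the unavailabilities multiply."""
    q = 1.0
    for lam, mu in components:
        q = q * component_unavailability(t, lam, mu)
    return q

# Hypothetical (failure rate, repair rate) pairs, per hour.
components = [(1e-3, 1e-1), (2e-3, 5e-2)]
for t in (0.0, 10.0, 100.0, 1e6):
    print(t, cut_set_unavailability(t, components))
```

As t grows, the cut-set unavailability approaches the product of the asymptotic component values lam_i/(lam_i + mu_i); the paper's method addresses the harder cases where this independence structure does not hold.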

  1. A joint probability density function of wind speed and direction for wind energy analysis

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Bueno, Celia

    2008-01-01

    A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a Normal-Weibull mixture distribution, singly truncated from below. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R². The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions, (b) takes into account the frequency of null winds, (c) represents the wind direction regimes in zones with several modes or prevailing wind directions, (d) takes into account the correlation between wind speeds and their directions. It can therefore be used in several tasks involved in the evaluation process of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy.
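
The angular-linear construction referred to above is, to our reading, of the Johnson-Wehrly type, where the joint density is f(v,θ) = 2π g(ζ) f_V(v) f_Θ(θ) with ζ = 2π[F_V(v) − F_Θ(θ)] and g a circular density. The sketch below uses simplified stand-ins for the paper's marginals (a plain Weibull and a single von Mises instead of the fitted mixtures) and checks numerically that the joint density integrates to one.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Simplified stand-ins for the paper's marginals (assumptions, not the fitted
# models): a plain Weibull for speed and a single von Mises for direction;
# the circular "link" density g is also taken to be a von Mises.
speed = stats.weibull_min(c=2.0, scale=8.0)     # f_V, F_V
direction = stats.vonmises(kappa=1.5)           # f_Theta, F_Theta on [-pi, pi)
link = stats.vonmises(kappa=2.0)                # circular density g

def joint_pdf(v, theta):
    """Johnson-Wehrly-type angular-linear joint density."""
    zeta = 2.0 * np.pi * (speed.cdf(v) - direction.cdf(theta))
    return 2.0 * np.pi * link.pdf(zeta) * speed.pdf(v) * direction.pdf(theta)

# Numerical check: the joint density integrates to ~1, and by construction
# its marginals are exactly f_V and f_Theta.
v = np.linspace(0.0, 40.0, 400)
theta = np.linspace(-np.pi, np.pi, 400)
V, TH = np.meshgrid(v, theta, indexing="ij")
total = trapezoid(trapezoid(joint_pdf(V, TH), theta, axis=1), v)
print(total)
```

Because the change of variables u = F_Θ(θ) turns the inner integral into an integral of g over a full period, the marginals of this joint density are exactly the specified f_V and f_Θ, which is the property the paper exploits.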

  2. Non-Gaussian theory of rubberlike elasticity based on rotational isomeric state simulations of network chain configurations. II. Bimodal poly(dimethylsiloxane) networks

    International Nuclear Information System (INIS)

    Curro, J.G.; Mark, J.E.

    1984-01-01

    Bimodal poly(dimethylsiloxane) (PDMS) networks containing a large mole fraction of very short chains have been shown to be unusually tough elastomers. The purpose of this investigation is to understand the rubber elasticity behavior of these bimodal networks. As a first approach, we have assumed that the average chain deformation is affine. This deformation, however, is partitioned nonaffinely between the long and short chains so that the free energy is minimized. Gaussian statistics are used for the long chains. The distribution function for the short chains is found from Monte Carlo calculations. This model predicts an upturn in the stress-strain curve, with a steepness depending on the network composition, as is observed experimentally.

  3. The force distribution probability function for simple fluids by density functional theory.

    Science.gov (United States)

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final result is P(F) ∝ exp(−AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft sphere at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.

  4. Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation

    Directory of Open Access Journals (Sweden)

    Campos-Aranda Daniel Francisco

    2015-09-01

    Full Text Available Design floods are key to sizing new water works and to reviewing the hydrological safety of existing ones. The most reliable method for estimating their magnitudes for given return periods is to fit a probabilistic model to the available records of maximum annual flows. Since the appropriate model is initially unknown, several models must be tested in order to select the most suitable one according to a statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records, and their application has therefore been established as standard practice. The Johnson system comprises three families of distributions, one of which is the Log-Normal model with three fit parameters; it forms the border between the bounded distributions and those with no upper limit. These families of distributions have four fit parameters and converge to the standard normal distribution, so that their predictions are obtained with that model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to the same data. The results of the unbounded distribution of the Johnson system (SJU) are compared to the optimal results from the three distributions. It was found that the predictions of the SJU distribution are similar to those obtained with the other models in the low return periods ( 1000 years. Because of its theoretical support, the SJU model is recommended for flood estimation.
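
SciPy ships the unbounded Johnson family as `johnsonsu`, so the basic workflow described above (fit the SJU model to annual maxima, then read off design floods as quantiles Q_T = F⁻¹(1 − 1/T)) can be sketched as follows. The synthetic record and its parameters are assumptions for illustration; the paper fits 31 real records and uses the normal-transformation machinery of the Johnson system rather than generic maximum likelihood.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic stand-in for a record of annual maximum flows (m^3/s); the
# generating parameters are illustrative, not taken from the paper's records.
flows = stats.johnsonsu(a=-2.0, b=1.5, loc=100.0, scale=80.0).rvs(60, random_state=rng)

# Fit the unbounded Johnson (SU) model by maximum likelihood.
params = stats.johnsonsu.fit(flows)
fitted = stats.johnsonsu(*params)

# Design flood for return period T years: the (1 - 1/T) quantile.
for T in (10, 100, 1000):
    print(T, fitted.ppf(1.0 - 1.0 / T))
```

The quantiles grow with the return period, and the four-parameter SU shape gives the fit the flexibility in skewness and kurtosis that the paper credits for its good performance at high return periods.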

  5. Joint probability distributions and fluctuation theorems

    International Nuclear Information System (INIS)

    García-García, Reinaldo; Kolton, Alejandro B; Domínguez, Daniel; Lecomte, Vivien

    2012-01-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation–dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we embed the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators

  6. THREE-MOMENT BASED APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING SYSTEMS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2014-03-01

    Full Text Available The paper deals with the problem of approximating probability distributions of random variables defined on the positive real axis with a coefficient of variation different from unity. When queueing systems are used as models for computer networks, characteristics are usually calculated at the level of expectation and variance. At the same time, one of the main characteristics of multimedia data transmission quality in computer networks is delay jitter. To calculate jitter, the distribution function of packet delay must be known. It is shown that changing the third moment of the packet delay distribution changes the calculated jitter by tens or hundreds of percent, even with the same values of the first two moments – expectation value and delay variation coefficient. This means that the delay distribution approximation used for jitter calculation should match the third moment of the delay distribution. For random variables with coefficients of variation greater than unity, an iterative approximation algorithm with a two-phase hyper-exponential distribution based on three moments of the approximated distribution is offered. It is shown that for random variables with coefficients of variation less than unity, the impact of the third moment of the distribution becomes negligible, and for approximation of such distributions an Erlang distribution matching the first two moments should be used. This approach makes it possible to obtain upper bounds for the relevant characteristics, particularly an upper bound on delay jitter.
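
The central observation above, that distributions agreeing in their first two moments but differing in the third can give jitter-relevant quantiles tens of percent apart, is easy to check directly. The sketch below matches a gamma and a lognormal on mean and variance (with coefficient of variation greater than one) and compares a high delay quantile; the distributions and values are illustrative stand-ins, not the paper's hyper-exponential construction.

```python
import numpy as np
from scipy import stats

mean, cv = 1.0, 2.0                    # delay mean and coefficient of variation (> 1)
var = (cv * mean) ** 2

# Gamma distribution matched to (mean, var).
k = mean ** 2 / var
d_gamma = stats.gamma(a=k, scale=var / mean)

# Lognormal distribution matched to the same (mean, var).
s2 = np.log(1.0 + var / mean ** 2)
d_lognorm = stats.lognorm(s=np.sqrt(s2), scale=mean * np.exp(-s2 / 2.0))

# Same first two moments, very different third moments and hence tails.
for d in (d_gamma, d_lognorm):
    print(float(d.mean()), float(d.var()), float(d.ppf(0.999)))
```

With these settings the two 0.999 quantiles differ by roughly a quarter even though means and variances coincide exactly, illustrating why a two-moment approximation is insufficient for jitter.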

  7. Quantile selection procedure and associated distribution of ratios of order statistics from a restricted family of probability distributions

    International Nuclear Information System (INIS)

    Gupta, S.S.; Panchapakesan, S.

    1975-01-01

    A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure

  8. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high temperature, high purity water. The following conclusions were reached: (1) Based on the CBB test results, the initiation process of intergranular stress corrosion cracking can be approximated by a Poisson stochastic process. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)

  9. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    Science.gov (United States)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with a cluster size dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and derivation of the distributions is discussed.

  10. On the issues of probability distribution of GPS carrier phase observations

    Science.gov (United States)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

    In common practice the observables related to the Global Positioning System (GPS) are assumed to follow a Gauss-Laplace normal distribution. Actually, full knowledge of the observables' distribution is not required for parameter estimation by means of the least-squares algorithm, which relies on the functional relation between observations and unknown parameters as well as the associated variance-covariance matrix. However, the probability distribution of GPS observations plays a key role in procedures for quality control (e.g. outlier and cycle slip detection, ambiguity resolution) and in reliability-related assessments of the estimation results. Under non-ideal observation conditions with respect to the factors impacting GPS data quality, for example multipath effects and atmospheric delays, the validity of the normal distribution postulate for GPS observations is in doubt. This paper presents a detailed analysis of the distribution properties of GPS carrier phase observations using double difference residuals. For this purpose 1-Hz observation data from the permanent SAPOS

  11. Bimodal Reading: Benefits of a Talking Computer for Average and Less Skilled Readers.

    Science.gov (United States)

    Montali, Julie; Lewandowski, Lawrence

    1996-01-01

    Eighteen average readers and 18 less-skilled readers (grades 8 and 9) were presented with social studies and science passages via a computer either visually (on screen), auditorily (read by digitized voice), or bimodally (on screen, highlighted while being voiced). Less-skilled readers demonstrated comprehension in the bimodal condition equivalent…

  12. Quantifying Young's moduli of protein fibrils and particles with bimodal force spectroscopy.

    Science.gov (United States)

    Gilbert, Jay; Charnley, Mirren; Cheng, Christopher; Reynolds, Nicholas P; Jones, Owen G

    2017-10-19

    Force spectroscopy is a means of obtaining mechanical information of individual nanometer-scale structures in composite materials, such as protein assemblies for use in consumer films or gels. As a recently developed force spectroscopy technique, bimodal force spectroscopy relates frequency shifts in cantilevers simultaneously excited at multiple frequencies to the elastic properties of the contacted material, yet its utility for quantitative characterization of biopolymer assemblies has been limited. In this study, a linear correlation between experimental frequency shift and Young's modulus of polymer films was used to calibrate bimodal force spectroscopy and quantify Young's modulus of two protein nanostructures: β-lactoglobulin fibrils and zein nanoparticles. Cross-sectional Young's modulus of protein fibrils was determined to be 1.6 GPa while the modulus of zein nanoparticles was determined as 854 MPa. Parallel measurement of β-lactoglobulin fibril by a competing pulsed-force technique found a higher cross-sectional Young's modulus, highlighting the importance of comparative calibration against known standards in both pulsed and bimodal force spectroscopies. These findings demonstrate a successful procedure for measuring mechanical properties of individual protein assemblies with potential use in biological or packaging applications using bimodal force spectroscopy.

  13. Exact solutions and symmetry analysis for the limiting probability distribution of quantum walks

    International Nuclear Information System (INIS)

    Xu, Xin-Ping; Ide, Yusuke

    2016-01-01

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.
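
Absent the paper's closed-form solutions, the limiting (time-averaged) distribution can be approximated numerically. The sketch below runs a discrete-time quantum walk on a cycle with a Hadamard coin, one particular instance of the general coin considered in the paper, together with a moving shift, and accumulates the Cesàro average of the position distribution.

```python
import numpy as np

N, steps = 11, 2000
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # Hadamard coin

# psi[x, c]: amplitude at site x of the cycle with coin state c.
psi = np.zeros((N, 2), dtype=complex)
psi[0, 0] = 1.0                         # walker localized at site 0

avg = np.zeros(N)
for _ in range(steps):
    psi = psi @ H.T                     # coin operation at every site
    # Moving shift: coin state 0 steps to the left, coin state 1 to the right.
    psi = np.stack([np.roll(psi[:, 0], -1), np.roll(psi[:, 1], 1)], axis=1)
    avg += (np.abs(psi) ** 2).sum(axis=1)

avg /= steps                            # Cesaro (time-averaged) distribution
print(avg.sum())                        # unitarity: total probability stays 1
```

Varying the coin parameters and the initial coin state in such a simulation is a quick way to observe the symmetric and asymmetric limiting distributions whose conditions the paper derives exactly.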

  14. Exact solutions and symmetry analysis for the limiting probability distribution of quantum walks

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xin-Ping, E-mail: xuxp@mail.ihep.ac.cn [School of Physical Science and Technology, Soochow University, Suzhou 215006 (China); Ide, Yusuke [Department of Information Systems Creation, Faculty of Engineering, Kanagawa University, Yokohama, Kanagawa, 221-8686 (Japan)

    2016-10-15

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.

  15. The bimodal distribution spin Seebeck effect enhancement in epitaxial Ni0.65Zn0.35Al0.8Fe1.2O4 thin film

    Science.gov (United States)

    Wang, Hua; Hou, Dazhi; Kikkawa, Takashi; Ramos, Rafael; Shen, Ka; Qiu, Zhiyong; Chen, Yao; Umeda, Maki; Shiomi, Yuki; Jin, Xiaofeng; Saitoh, Eiji

    2018-04-01

    The temperature dependence of the spin Seebeck effect (SSE) in epitaxial Ni0.65Zn0.35Al0.8Fe1.2O4 (NZA ferrite) thin film has been investigated systematically. The SSE at high fields shows a bimodal distribution enhancement from 3 K to 300 K and is well fitted with a double-peak Lorentzian function. We speculate that the symmetric SSE enhancement in the Pt/NZA ferrite bilayer, which differs from the magnon-polaron-induced asymmetric spikes in the SSE of Pt/YIG [T. Kikkawa et al., Phys. Rev. Lett. 117, 207203 (2016)], may result from magnon-phonon interactions occurring at the intersections of the quantized magnon and phonon dispersions. These SSE results are helpful for the investigation of magnon-phonon interactions in magnetic ultrathin films.

  16. Examining barrier distributions and, in extension, energy derivative of probabilities for surrogate experiments

    International Nuclear Information System (INIS)

    Romain, P.; Duarte, H.; Morillon, B.

    2012-01-01

    The energy derivatives of probabilities are functions suited to a better understanding of certain mechanisms. Applied to compound nuclear reactions, they can provide information on fusion barrier distributions, as originally introduced, and also, as presented here, on fission barrier distributions and heights. By extension, they give access to the compound nucleus spin-parity states preferentially populated through a given entrance channel at a given energy. (authors)

  17. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.
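
Once a prior probability mass function over the finite set of candidate pixel values is in hand (in the paper it is itself obtained by constrained least squares from the image histogram), the per-pixel MAP estimate under Poisson noise is a small computation. In the sketch below the candidate values and the prior are assumed for illustration.

```python
import numpy as np
from scipy import stats

# Candidate true pixel intensities and a prior pmf over them; both are
# assumed here for illustration (in the paper the prior is itself estimated
# from the image histogram by constrained least squares).
values = np.array([1.0, 4.0, 9.0, 16.0])
prior = np.array([0.4, 0.3, 0.2, 0.1])

def map_estimate(n):
    """MAP estimate of the true intensity given an observed Poisson count n."""
    posterior = prior * stats.poisson.pmf(n, values)  # unnormalized posterior
    return values[np.argmax(posterior)]

print([map_estimate(n) for n in (0, 3, 8, 20)])  # -> [1.0, 4.0, 9.0, 16.0]
```

Applying `map_estimate` pixel-wise replaces noisy low-count observations with the most plausible member of the candidate set, which is the enhancement step the abstract describes.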

  18. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    Science.gov (United States)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z)  =  0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f  =  0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  19. Unimodal Versus Bimodal EEG-fMRI Neurofeedback of a Motor Imagery Task

    Directory of Open Access Journals (Sweden)

    Lorraine Perronnet

    2017-04-01

    Full Text Available Neurofeedback is a promising tool for brain rehabilitation and peak performance training. Neurofeedback approaches usually rely on a single brain imaging modality such as EEG or fMRI. Combining these modalities for neurofeedback training could provide richer information to the subject and could thus enable faster and more specific self-regulation. Yet unimodal and multimodal neurofeedback have never been compared before. In the present work, we introduce a simultaneous EEG-fMRI experimental protocol in which participants performed a motor-imagery task under unimodal and bimodal NF conditions. With this protocol we were able to compare for the first time the effects of unimodal EEG-neurofeedback and fMRI-neurofeedback versus bimodal EEG-fMRI-neurofeedback by looking at both EEG and fMRI activations. We also propose a new feedback metaphor for bimodal EEG-fMRI-neurofeedback that integrates both the EEG and the fMRI signal in a single bi-dimensional feedback (a ball moving in 2D). Such a feedback is intended to relieve the cognitive load of the subject by presenting the bimodal neurofeedback task as a single regulation task instead of two. Additionally, this integrated feedback metaphor gives flexibility in defining a bimodal neurofeedback target. Participants were able to regulate activity in their motor regions in all NF conditions. Moreover, motor activations as revealed by offline fMRI analysis were stronger during EEG-fMRI-neurofeedback than during EEG-neurofeedback. This result suggests that EEG-fMRI-neurofeedback could be more specific or more engaging than EEG-neurofeedback. Our results also suggest that during EEG-fMRI-neurofeedback, participants tended to regulate more strongly the modality that was harder to control. Taken together, our results shed first light on the specific mechanisms of bimodal EEG-fMRI-neurofeedback and on its added value as compared to unimodal EEG-neurofeedback and fMRI-neurofeedback.

  20. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    Science.gov (United States)

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  1. A Bernstein-Von Mises Theorem for discrete probability distributions

    OpenAIRE

    Boucheron, S.; Gassiat, E.

    2008-01-01

    We investigate the asymptotic normality of the posterior distribution in the discrete setting, when the model dimension increases with sample size. We consider a probability mass function $\theta_0$ on $\mathbb{N}\setminus\{0\}$ and a sequence of truncation levels $(k_n)_n$ satisfying $k_n^3 \leq n \inf_{i \leq k_n} \theta_0(i)$. Let $\hat{\theta}_n$ denote the maximum likelihood estimate of $(\theta_0(i))_{i \leq k_n}$ and let $\Delta_n(\theta_0)$ denote the $k_n$-dimensional vector whose $i$-th coordinate is defined by $\sqrt{n}(\hat{\theta}_{n}(i)-\theta_{0}(i))$ for $1 \leq i \leq k_n$. We check that under mild ...

  2. Bi-modal Gödel logic over [0,1]-valued Kripke frames

    OpenAIRE

    Caicedo, Xavier; Rodriguez, Ricardo Oscar

    2011-01-01

    We consider the Gödel bi-modal logic determined by fuzzy Kripke models where both the propositions and the accessibility relation are infinitely valued over the standard Gödel algebra [0,1] and prove strong completeness of Fischer Servi intuitionistic modal logic IK plus the prelinearity axiom with respect to this semantics. We axiomatize also the bi-modal analogues of $T$, $S4$, and $S5$ obtained by restricting to models over frames satisfying the [0,1]-valued versions of the structural ...

  3. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful for extracting rates from processes exhibiting dynamic FRET and for testing models of conformational dynamics against experimental data.
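
The dynamic-PDA idea can be mimicked with brute-force Monte Carlo rather than the authors' semi-analytical machinery: simulate two-state interconversion within each observation bin, average the FRET efficiency over the bin, and add binomial photon shot noise. All rates and efficiencies below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

E1, E2 = 0.2, 0.8          # FRET efficiencies of the two states (illustrative)
k12, k21 = 2.0, 1.0        # interconversion rates, in units of 1 / bin time
T = 1.0                    # duration of one observation bin
n_photons = 100            # photons detected per bin
n_bins = 5000

def fraction_in_state1(state):
    """Fraction of the bin spent in state 1 for one stochastic trajectory."""
    t, t1 = 0.0, 0.0
    while t < T:
        rate = k12 if state == 1 else k21
        dwell = min(rng.exponential(1.0 / rate), T - t)
        if state == 1:
            t1 += dwell
        t += dwell
        state = 3 - state
    return t1 / T

p1 = k21 / (k12 + k21)     # stationary probability of state 1
E_obs = np.empty(n_bins)
for i in range(n_bins):
    f1 = fraction_in_state1(1 if rng.random() < p1 else 2)
    E_mean = f1 * E1 + (1.0 - f1) * E2
    E_obs[i] = rng.binomial(n_photons, E_mean) / n_photons  # photon shot noise

print(E_obs.mean())        # close to p1*E1 + (1 - p1)*E2
```

Histogramming `E_obs` for slow rates recovers two static-like peaks, while fast rates merge them into a single intermediate peak, which is precisely the histogram-shape signature that PDA fits to extract timescales.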

  4. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  5. Microbubble embedded with upconversion nanoparticles as a bimodal contrast agent for fluorescence and ultrasound imaging

    International Nuclear Information System (INIS)

    Jin, Birui; Lin, Min; You, Minli; Xu, Feng; Lu, Tianjian; Zong, Yujin; Wan, Mingxi; Duan, Zhenfeng

    2015-01-01

    Bimodal imaging provides an additional imaging signal and thus finds widespread application in clinical diagnostic imaging. Fluorescence/ultrasound bimodal imaging contrast agents using fluorescent dyes or quantum dots for the fluorescence signal have emerged as a promising approach, which however requires visible light or UV irradiation, resulting in photobleaching, photoblinking, auto-fluorescence and limited tissue penetration depth. To surmount these problems, we developed a novel bimodal contrast agent using layer-by-layer assembly of upconversion nanoparticles onto the surface of microbubbles. The resulting microbubbles, with an average size of 2 μm, provide an enhanced ultrasound echo for ultrasound imaging and upconversion emission upon near-infrared irradiation for fluorescence imaging. The developed bimodal contrast agent holds great potential for ultrasound targeting techniques for targeted disease diagnostics and therapy. (paper)

  6. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate claim size is investigated under these assumptions. For both light-tailed and heavy-tailed claim size distributions, asymptotic estimates for infinite-time and finite-time ruin probabilities are derived. Moreover, we discuss an extension of the model to an adaptive premium rule that is dynamically adjusted according to past claims experience.

  7. Probability distribution functions for intermittent scrape-off layer plasma fluctuations

    Science.gov (United States)

    Theodorsen, A.; Garcia, O. E.

    2018-03-01

    A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a superposition of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed to be exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive definite signal which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes which are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus estimating model parameters requires an approach based on the characteristic function, not the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF and characteristic function of the process is investigated, and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
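
The base model, a superposition of uncorrelated pulses arriving at Poisson times, is easy to simulate with a discrete recursion when the pulses are one-sided exponentials. For exponentially distributed amplitudes the stationary distribution is known to be a gamma distribution with shape parameter equal to the ratio of pulse duration to mean waiting time, which the sketch below checks through the first two moments; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

tau_d, tau_w, a_mean = 1.0, 0.2, 1.0   # pulse duration, mean waiting time, mean amplitude
dt, n_steps = 0.02, 200000

gamma_shape = tau_d / tau_w            # "intermittency" parameter of the process

# One-sided exponential pulses: exact exponential decay between bins, plus
# Poisson-arriving, exponentially distributed amplitudes within each bin.
decay = np.exp(-dt / tau_d)
x = np.empty(n_steps)
x[0] = gamma_shape * a_mean            # start near the stationary mean
for i in range(1, n_steps):
    n_arr = rng.poisson(dt / tau_w)
    burst = rng.exponential(a_mean, n_arr).sum() if n_arr else 0.0
    x[i] = x[i - 1] * decay + burst

# Stationary moments of the gamma distribution with shape tau_d/tau_w:
print(x.mean(), gamma_shape * a_mean)        # mean: shape * scale
print(x.var(), gamma_shape * a_mean ** 2)    # variance: shape * scale^2
```

Replacing `rng.exponential` amplitudes with a distribution that is not positive definite produces exactly the situation discussed in the abstract, where the PDF loses its closed form and fitting must go through the empirical characteristic function.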

  8. Audio feature extraction using probability distribution function

    Science.gov (United States)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently for biometric and multimedia information retrieval systems. This technology has grown out of successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed that uses the PDF alone as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of voice signals sampled from a number of individuals are plotted. From the experimental results obtained, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
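
    The idea of a per-frame empirical PDF as a feature vector can be sketched as follows. This is our interpretation only; the frame length, number of bins and amplitude range are illustrative assumptions, not values from the paper.

```python
import math

def pdf_features(samples, frame_len=128, bins=16, lo=-1.0, hi=1.0):
    """Split a signal into frames and return, for each frame, a
    normalized amplitude histogram (an empirical PDF on `bins` bins).
    Values outside [lo, hi) are clipped into the edge bins."""
    features = []
    width = (hi - lo) / bins
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        counts = [0] * bins
        for x in frame:
            idx = min(bins - 1, max(0, int((x - lo) / width)))
            counts[idx] += 1
        # normalize so each feature vector sums to 1 (a probability mass)
        features.append([c / frame_len for c in counts])
    return features

# toy "voice" signal: a 5 Hz sine sampled at 1 kHz for about one second
sig = [math.sin(2 * math.pi * 5 * t / 1000.0) for t in range(1024)]
feats = pdf_features(sig)
```

Frames of a sinusoid produce the characteristic U-shaped amplitude histogram, so different waveforms yield visibly different PDF shapes, which is the property the abstract exploits.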

  9. Refining Bimodal Microstructure of Materials with MSTRUCT

    Czech Academy of Sciences Publication Activity Database

    Matěj, Z.; Kadlecová, A.; Janeček, M.; Matějová, Lenka; Dopita, M.; Kužel, R.

    2014-01-01

    Roč. 29, S2 (2014), S35-S41 ISSN 0885-7156 R&D Projects: GA ČR GA14-23274S Grant - others:UK(CZ) UNCE 204023/2012 Institutional support: RVO:67985858 Keywords : XRD * bimodal * crystallite size Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 0.636, year: 2014

  10. Looking for bimodal distributions in multi-fragmentation reactions

    International Nuclear Information System (INIS)

    Gulminelli, F.

    2007-01-01

    The presence of a phase transition in a finite system can be deduced, together with its order, from the form of the distribution of the order parameter. This issue has been extensively studied in multifragmentation experiments, with results that do not appear fully consistent. In this paper we discuss the effect of the statistical ensemble or sorting conditions on the form of fragment distributions, and propose a new method, which can be easily implemented experimentally, to discriminate between different fragmentation scenarios. This method, based on a re-weighting of the measured distribution to account for the experimental constraints linked to the energy deposit, is tested on different simple models, and appears to provide a powerful discrimination. (author)

  11. Children with dyslexia show a reduced processing benefit from bimodal speech information compared to their typically developing peers.

    Science.gov (United States)

    Schaadt, Gesa; van der Meer, Elke; Pannekamp, Ann; Oberecker, Regine; Männel, Claudia

    2018-01-17

    During information processing, individuals benefit from bimodally presented input, as has been demonstrated for speech perception (i.e., printed letters and speech sounds) or the perception of emotional expressions (i.e., facial expression and voice tuning). While typically developing individuals show this bimodal benefit, school children with dyslexia do not. Currently, it is unknown whether the bimodal processing deficit in dyslexia also occurs for visual-auditory speech processing that is independent of reading and spelling acquisition (i.e., no letter-sound knowledge is required). Here, we tested school children with and without spelling problems on their bimodal perception of video-recorded mouth movements pronouncing syllables. We analyzed the event-related potential Mismatch Response (MMR) to visual-auditory speech information and compared this response to the MMR to monomodal speech information (i.e., auditory-only, visual-only). We found a reduced MMR with later onset to visual-auditory speech information in children with spelling problems compared to children without spelling problems. Moreover, when comparing bimodal and monomodal speech perception, we found that children without spelling problems showed significantly larger responses in the visual-auditory experiment compared to the visual-only response, whereas children with spelling problems did not. Our results suggest that children with dyslexia exhibit general difficulties in bimodal speech perception independently of letter-speech sound knowledge, as apparent in altered bimodal speech perception and lacking benefit from bimodal information. This general deficit in children with dyslexia may underlie the previously reported reduced bimodal benefit for letter-speech sound combinations and similar findings in emotion perception. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    Full Text Available The computational processing of hyperspectral images (HSI) is extremely complex, not only due to the high dimensional information, but also due to the highly correlated data structure. Effective processing and analysis of HSI therefore faces many difficulties. Dimensionality reduction has proved to be a powerful tool for high dimensional data analysis. Local Fisher’s linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in low dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  13. PHOTOMETRIC REDSHIFT PROBABILITY DISTRIBUTIONS FOR GALAXIES IN THE SDSS DR8

    International Nuclear Information System (INIS)

    Sheldon, Erin S.; Cunha, Carlos E.; Mandelbaum, Rachel; Brinkmann, J.; Weaver, Benjamin A.

    2012-01-01

    We present redshift probability distributions for galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 8 imaging data. We used the nearest-neighbor weighting algorithm to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8 and u < 29.0. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training-set redshifts. We derived P(z)'s for individual objects by using training-set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy can reduce the statistical error in measurements that depend on the redshifts of individual galaxies. The spectroscopic training sample is substantially larger than that used for the DR7 release. The newly added PRIMUS catalog is now the most important training set used in this analysis by a wide margin. We expect the primary sources of error in the N(z) reconstruction to be sample variance and spectroscopic failures: The training sets are drawn from relatively small volumes of space, and some samples have large incompleteness. Using simulations we estimated the uncertainty in N(z) due to sample variance at a given redshift to be ∼10%-15%. The uncertainty on calculations incorporating N(z) or P(z) depends on how they are used; we discuss the case of weak lensing measurements. The P(z) catalog is publicly available from the SDSS Web site.
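
    The core of the ensemble estimate described above is a weighted histogram of training-set redshifts. The sketch below shows only that step, under the assumption that the per-galaxy weights (matching the training set's colour-magnitude density to the photometric sample) have already been computed; the function name, bin edges and toy numbers are ours.

```python
def ensemble_nz(train_z, train_weights, edges):
    """Weighted histogram of training-set redshifts: an estimate of the
    ensemble N(z) of the photometric sample.  Each training galaxy
    contributes its weight to the bin containing its redshift; the
    result is normalized to sum to 1.  Redshifts equal to the last
    edge fall outside all bins."""
    nz = [0.0] * (len(edges) - 1)
    total = sum(train_weights)
    for z, w in zip(train_z, train_weights):
        for i in range(len(edges) - 1):
            if edges[i] <= z < edges[i + 1]:
                nz[i] += w / total
                break
    return nz

# toy example: four training galaxies, one down-weighted to zero
edges = [0.0, 0.2, 0.4, 0.6]
nz = ensemble_nz([0.1, 0.1, 0.3, 0.5], [1.0, 1.0, 2.0, 0.0], edges)
```

A per-object P(z) follows the same recipe, restricted to training objects near the photometric object in colour-magnitude space.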

  14. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  15. The limiting conditional probability distribution in a stochastic model of T cell repertoire maintenance.

    Science.gov (United States)

    Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen

    2010-04-01

    The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed. (c) 2009 Elsevier Inc. All rights reserved.
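
    The limiting conditional distribution of an absorbed birth-and-death process can be illustrated by brute-force simulation: run the process many times and histogram the state at a late time over the runs that have not yet hit the absorbing origin. This is a generic sketch, not the paper's analytic treatment; the linear rates, horizon and run counts are illustrative assumptions.

```python
import random

def lcd_estimate(birth, death, n0, t_end, n_runs, seed=1):
    """Monte Carlo sketch of the limiting conditional distribution (LCD):
    simulate a linear birth-and-death process (rates birth*n and death*n,
    absorbing at 0) to time t_end and histogram the state over the runs
    that survive.  With death > birth, eventual extinction is certain,
    mirroring the setting described in the abstract."""
    rng = random.Random(seed)
    survivors = {}
    for _ in range(n_runs):
        n, t = n0, 0.0
        while n > 0:
            t += rng.expovariate((birth + death) * n)  # next event time
            if t > t_end:
                break
            # birth with probability birth/(birth+death), else death
            n += 1 if rng.random() < birth / (birth + death) else -1
        if n > 0:
            survivors[n] = survivors.get(n, 0) + 1
    total = sum(survivors.values())
    return {k: v / total for k, v in sorted(survivors.items())}

lcd = lcd_estimate(birth=0.8, death=1.0, n0=5, t_end=10.0, n_runs=10000)
```

Conditioning on survival is what distinguishes the LCD from the (trivial) stationary distribution, which here puts all mass at 0.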

  16. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
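
    As a concrete example of a continuous probability distribution, the probability that a normally distributed variable falls in an interval is the difference of the cumulative distribution function (CDF) at the endpoints. The mean and standard deviation below are illustrative numbers, not from the paper.

```python
from statistics import NormalDist

# X ~ N(170, 10): P(160 <= X <= 180) = CDF(180) - CDF(160),
# i.e. the familiar ~68% mass within one standard deviation of the mean.
height = NormalDist(mu=170.0, sigma=10.0)
p_within_one_sd = height.cdf(180.0) - height.cdf(160.0)
```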

  17. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can ...

  18. Evolution of twinning in extruded AZ31 alloy with bimodal grain structure

    Energy Technology Data Exchange (ETDEWEB)

    Garcés, G., E-mail: ggarces@cenim.csic.es [Department of Physical Metallurgy, National Centre for Metallurgical Research CENIM-CSIC, Av. De Gregorio del Amo 8, 28040 Madrid (Spain); Oñorbe, E. [CIEMAT, Division of Structural Materials, Avenida Complutense, 40, 28040 Madrid (Spain); Gan, W. [German Engineering Materials Science Centre at MLZ, Helmholtz-Zentrum Geesthacht, Lichtebergstr. 1, D-85747 Garching (Germany); Máthis, K. [Department of Physics of Materials, Faculty of Mathematics and Physics, Charles University, KeKarlovu 5, 121 16 Praha 2 (Czech Republic); Tolnai, D. [Institute of Materials Research, Helmholtz-Zentrum Geesthacht, Max-Planck-Str. 1, 21502 Geesthacht (Germany); Horváth, K. [Department of Physics of Materials, Faculty of Mathematics and Physics, Charles University, KeKarlovu 5, 121 16 Praha 2 (Czech Republic); Pérez, P.; Adeva, P. [Department of Physical Metallurgy, National Centre for Metallurgical Research CENIM-CSIC, Av. De Gregorio del Amo 8, 28040 Madrid (Spain)

    2017-04-15

    Twinning in extruded AZ31 alloy with a bimodal grain structure is studied under compression along the extrusion direction. This study combines in-situ measurements during the compression tests, by synchrotron radiation diffraction and acoustic emission techniques, with the evaluation of the microstructure and texture in post-mortem compression samples deformed to different strains. The microstructure of the alloy is characterized by the coexistence of large areas of fine dynamically recrystallized grains and coarse non-recrystallized grains elongated along the extrusion direction. Twinning occurs initially in the large elongated grains, before the macroscopic yield stress, which is itself controlled by twinning in the equiaxed dynamically recrystallized grains. - Highlights: • The AZ31 extruded at low temperature exhibits a bimodal grain structure. • Twinning takes place before macroscopic yielding in coarse non-DRXed grains. • DRXed grains control the onset of plasticity in magnesium alloys with a bimodal grain structure.

  19. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  20. Particle filtering with path sampling and an application to a bimodal ocean current model

    International Nuclear Information System (INIS)

    Weare, Jonathan

    2009-01-01

    This paper introduces a recursive particle filtering algorithm designed to filter high dimensional systems with complicated non-linear and non-Gaussian effects. The method incorporates a parallel marginalization (PMMC) step in conjunction with the hybrid Monte Carlo (HMC) scheme to improve samples generated by standard particle filters. Parallel marginalization is an efficient Markov chain Monte Carlo (MCMC) strategy that uses lower dimensional approximate marginal distributions of the target distribution to accelerate equilibration. As a validation the algorithm is tested on a 2516 dimensional, bimodal, stochastic model motivated by the Kuroshio current that runs along the Japanese coast. The results of this test indicate that the method is an attractive alternative for problems that require the generality of a particle filter but have been inaccessible due to the limitations of standard particle filtering strategies.
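
    For contrast with the paper's PMMC/HMC-augmented method, the following sketches a plain bootstrap particle filter (propagate, weight by the observation likelihood, resample) on a toy one-dimensional double-well model whose stationary density is bimodal. The model, all parameters and the observation sequence are our illustrative assumptions; this is the standard baseline the paper improves on, not the paper's algorithm.

```python
import math
import random

def bootstrap_filter(obs, n_particles=500, obs_std=0.5, seed=0):
    """Bootstrap particle filter on a double-well state-space model:
    x_t = x_{t-1} + 0.1*(x - x^3) + noise (drift toward wells at +/-1),
    y_t = x_t + Gaussian noise.  Returns the filtered posterior mean
    after each observation."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in obs:
        # propagate each particle through the (assumed) dynamics
        parts = [x + 0.1 * (x - x ** 3) + rng.gauss(0.0, 0.2)
                 for x in parts]
        # weight by the Gaussian observation likelihood
        w = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in parts]
        total = sum(w)
        w = [wi / total for wi in w]
        # multinomial resampling
        parts = rng.choices(parts, weights=w, k=n_particles)
        means.append(sum(parts) / n_particles)
    return means

# observations pinned near the +1 well: the filter should lock onto it
est = bootstrap_filter(obs=[1.0] * 30)
```

In high dimensions the weights of such a filter degenerate rapidly, which is exactly the limitation motivating the MCMC-based improvement step described in the abstract.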

  1. Impact of spike train autostructure on probability distribution of joint spike events.

    Science.gov (United States)

    Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl

    2013-05-01

    The discussion whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that were suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability of false positives if a different process type is assumed than was actually present. We also determine how manipulations of such spike trains, here dithering, as used for the generation of surrogate data, change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the structure of the spike trains as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even when as small as a fraction of the interspike interval, can falsify the inference on coordinated firing.
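
    The Monte Carlo estimation described above can be sketched for two independent gamma renewal processes: generate spike trains with gamma-distributed interspike intervals, count the bins in which both trains fire, repeat over trials, and compute the Fano factor of the coincidence count. Rates, shape parameter, bin width and trial counts below are illustrative assumptions, not the paper's settings.

```python
import random

def gamma_train(rate, shape, duration, rng):
    """Renewal spike train with gamma-distributed interspike intervals:
    mean interval 1/rate; `shape` > 1 makes firing more regular than
    a Poisson process (shape = 1)."""
    spikes, t = [], 0.0
    while True:
        t += rng.gammavariate(shape, 1.0 / (rate * shape))
        if t >= duration:
            return spikes
        spikes.append(t)

def coincidence_count(a, b, binwidth):
    """Number of time bins in which both trains spike at least once."""
    occ_a = {int(s / binwidth) for s in a}
    occ_b = {int(s / binwidth) for s in b}
    return len(occ_a & occ_b)

rng = random.Random(42)
counts = []
for _ in range(400):  # independent trials
    a = gamma_train(rate=20.0, shape=2.0, duration=10.0, rng=rng)
    b = gamma_train(rate=20.0, shape=2.0, duration=10.0, rng=rng)
    counts.append(coincidence_count(a, b, binwidth=0.005))
mean = sum(counts) / len(counts)
fano = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1) / mean
```

Varying `shape` changes the coefficient of variation of the intervals and thereby the width (Fano factor) of the coincidence count distribution, which is the dependence the abstract characterizes.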

  2. Probability Distribution and Projected Trends of Daily Precipitation in China

    Institute of Scientific and Technical Information of China (English)

    CAO; Li-Ge; ZHONG; Jun; SU; Bu-Da; ZHAI; Jian-Qing; Macro; GEMMER

    2013-01-01

    Based on observed daily precipitation data from 540 stations and 3,839 gridded data points from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the ability of CCLM to simulate daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of daily precipitation series and daily precipitation extremes are analyzed. Results show that, except for the western Qinghai-Tibetan Plateau and South China, the distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. The CCLM can well capture the distribution characteristics of daily precipitation over China. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increase of maximum daily rainfall and of the longest non-precipitation period during the flood season in the aforementioned regions also shows increasing trends of droughts and floods in the next 40 years.

  3. Chimera distribution amplitudes for the pion and the longitudinally polarized ρ-meson

    Energy Technology Data Exchange (ETDEWEB)

    Stefanis, N.G., E-mail: stefanis@tp2.ruhr-uni-bochum.de [Institut für Theoretische Physik II, Ruhr-Universität Bochum, D-44780 Bochum (Germany); Pimikov, A.V., E-mail: pimikov@theor.jinr.ru [Bogoliubov Laboratory of Theoretical Physics, JINR, 141980 Dubna (Russian Federation); Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou, 730000 (China)

    2016-01-15

    Using QCD sum rules with nonlocal condensates, we show that the distribution amplitude of the longitudinally polarized ρ-meson may have a short-tailed platykurtic profile in close analogy to our recently proposed platykurtic distribution amplitude for the pion. Such a chimera distribution de facto amalgamates the broad unimodal profile of the distribution amplitude, obtained with a Dyson–Schwinger equations-based computational scheme, with the suppressed tails characterizing the bimodal distribution amplitudes derived from QCD sum rules with nonlocal condensates. We argue that pattern formation, emerging from the collective synchronization of coupled oscillators, can provide a single theoretical scaffolding to study unimodal and bimodal distribution amplitudes of light mesons without recourse to particular computational schemes and the reasons for them.

  4. Nano-scale simulation based study of creep behavior of bimodal nanocrystalline face centered cubic metal.

    Science.gov (United States)

    Meraj, Md; Pal, Snehanshu

    2017-10-11

    In this paper, the creep behavior of nanocrystalline Ni having a bimodal grain structure is investigated using molecular dynamics simulation. Analysis of structural evolution during the creep process has also been performed. It is observed that an increase in the size of the coarse grains causes an improvement in the creep properties of bimodal nanocrystalline Ni. The influence of bimodality (i.e., the size difference between coarse and fine grains) on creep properties is found to be reduced with increasing creep temperature. The dislocation density is observed to decrease exponentially with the progress of creep deformation. A grain boundary diffusion controlled creep mechanism is found to be dominant in the primary creep region and the initial part of the secondary creep region. After that, a shear diffusion transformation mechanism is found to be significantly responsible for deformation, as bimodal nanocrystalline Ni transforms to an amorphous structure with further progress of the creep process. According to Voronoi cluster analysis, the presence of distorted icosahedra has a significant influence on the creep rate in the tertiary creep regime.

  5. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    Science.gov (United States)

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.
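
    The link between the lure standard deviation and the z-ROC slope invoked above can be checked numerically: under the unequal-variance signal-detection model, z(hit) against z(false alarm) across criteria is a line with slope equal to the lure-to-target standard deviation ratio. The parameter values and criteria below are illustrative assumptions.

```python
from statistics import NormalDist

def zroc_slope(mu_t, sd_t, sd_l, criteria):
    """Slope of the z-ROC under the unequal-variance signal-detection
    model: target evidence ~ N(mu_t, sd_t), lure evidence ~ N(0, sd_l).
    For each criterion c, the point is (z(false alarm), z(hit)); the
    points are collinear with slope sd_l / sd_t."""
    std = NormalDist()
    target = NormalDist(mu_t, sd_t)
    lure = NormalDist(0.0, sd_l)
    pts = [(std.inv_cdf(1 - lure.cdf(c)), std.inv_cdf(1 - target.cdf(c)))
           for c in criteria]
    # least-squares slope of z(hit) against z(false alarm)
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

slope = zroc_slope(mu_t=1.0, sd_t=1.25, sd_l=1.0,
                   criteria=[-0.5, 0.0, 0.5, 1.0, 1.5])
```

Increasing `sd_l` (as priming is argued to do) raises the slope, which is the signature reported in the abstract.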

  6. Does bimodal stimulus presentation increase ERP components usable in BCIs?

    Science.gov (United States)

    Thurlings, Marieke E.; Brouwer, Anne-Marie; Van Erp, Jan B. F.; Blankertz, Benjamin; Werkhoven, Peter J.

    2012-08-01

    Event-related potential (ERP)-based brain-computer interfaces (BCIs) employ differences in brain responses to attended and ignored stimuli. Typically, visual stimuli are used. Tactile stimuli have recently been suggested as a gaze-independent alternative. Bimodal stimuli could evoke additional brain activity due to multisensory integration which may be of use in BCIs. We investigated the effect of visual-tactile stimulus presentation on the chain of ERP components, BCI performance (classification accuracies and bitrates) and participants’ task performance (counting of targets). Ten participants were instructed to navigate a visual display by attending (spatially) to targets in sequences of either visual, tactile or visual-tactile stimuli. We observe that attending to visual-tactile (compared to either visual or tactile) stimuli results in an enhanced early ERP component (N1). This bimodal N1 may enhance BCI performance, as suggested by a nonsignificant positive trend in offline classification accuracies. A late ERP component (P300) is reduced when attending to visual-tactile compared to visual stimuli, which is consistent with the nonsignificant negative trend of participants’ task performance. We discuss these findings in the light of affected spatial attention at high-level compared to low-level stimulus processing. Furthermore, we evaluate bimodal BCIs from a practical perspective and for future applications.

  7. RSMASS-D nuclear thermal propulsion and bimodal system mass models

    Science.gov (United States)

    King, Donald B.; Marshall, Albert C.

    1997-01-01

    Two relatively simple models have been developed to estimate reactor, radiation shield, and balance of system masses for a particle bed reactor (PBR) nuclear thermal propulsion concept and a cermet-core power and propulsion (bimodal) concept. The approach was based on the methodology developed for the RSMASS-D models. The RSMASS-D approach for the reactor and shield sub-systems uses a combination of simple equations derived from reactor physics and other fundamental considerations along with tabulations of data from more detailed neutron and gamma transport theory computations. Relatively simple models are used to estimate the masses of other subsystem components of the nuclear propulsion and bimodal systems. Other subsystem components include instrumentation and control (I&C), boom, safety systems, radiator, thermoelectrics, heat pipes, and nozzle. The user of these models can vary basic design parameters within an allowed range to achieve a parameter choice which yields a minimum mass for the operational conditions of interest. Estimated system masses are presented for a range of reactor power levels for the PBR propulsion concept and for both electrical power and propulsion for the cermet-core bimodal concept. The estimated reactor system masses agree with mass predictions from detailed calculations to within xx percent for both models.

  8. Bimodal fuzzy analytic hierarchy process (BFAHP) for coronary heart disease risk assessment.

    Science.gov (United States)

    Sabahi, Farnaz

    2018-04-04

    Rooted deeply in medical multiple criteria decision-making (MCDM), risk assessment is very important, especially when applied to the risk of being affected by deadly diseases such as coronary heart disease (CHD). CHD risk assessment is a stochastic, uncertain, and highly dynamic process influenced by various known and unknown variables. In recent years, there has been great interest in fuzzy analytic hierarchy process (FAHP), a popular methodology for dealing with uncertainty in MCDM. This paper proposes a new FAHP, the bimodal fuzzy analytic hierarchy process (BFAHP), that augments two aspects of knowledge, probability and validity, to fuzzy numbers to better deal with uncertainty. In BFAHP, fuzzy validity is computed by aggregating the validities of relevant risk factors based on expert knowledge and collective intelligence. By considering both soft and statistical data, we compute the fuzzy probability of risk factors using the Bayesian formulation. In the BFAHP approach, these fuzzy validities and fuzzy probabilities are used to construct a reciprocal comparison matrix. We then aggregate fuzzy probabilities and fuzzy validities in a pairwise manner for each risk factor and each alternative. BFAHP decides between being affected and not being affected by ranking the high and low risks. For evaluation, the proposed approach is applied to the risk of being affected by CHD using a real dataset of 152 patients of Iranian hospitals. Simulation results confirm that adding validity in a fuzzy manner increases confidence in the results and is clinically useful, especially in the face of incomplete information, when compared with actual outcomes. Applying the proposed BFAHP to CHD risk assessment of the dataset yields a high accuracy rate, above 85%, for correct prediction. In addition, this paper recognizes that the risk factors of diastolic blood pressure in men and high-density lipoprotein in women are more important in CHD than other risk factors. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Bimodal distribution of risk for childhood obesity in urban Baja California, Mexico.

    Science.gov (United States)

    Wojcicki, Janet M; Jimenez-Cruz, Arturo; Bacardi-Gascon, Montserrat; Schwartz, Norah; Heyman, Melvin B

    2012-08-01

    In Mexico, higher socioeconomic status (SES) has been found to be associated with increased risk for obesity in children. Within developed urban areas, however, there may be increased risk among lower SES children. Students in grades 4-6 from five public schools in Tijuana and Tecate, Mexico, were interviewed, and weight, height and waist circumference (WC) measurements were taken. Interviews consisted of questions on food frequency, food insecurity, acculturation, physical activity and lifestyle practices. Multivariate logistic models were used to assess risk factors for obesity (having a body mass index [BMI] ≥95th percentile) and abdominal obesity (a WC >90th percentile) using Stata 11.0. Five hundred and ninety students were enrolled; 43.7% were overweight or obese, 24.3% were obese and 20.2% had abdominal obesity. Independent risk factors for obesity included watching TV in English (odds ratio [OR] 1.60, 95% confidence interval [CI] 1.06-2.41) and perceived child food insecurity (OR 1.57, 95% CI 1.05-2.36). Decreased risk for obesity was associated with female sex (OR 0.64, 95% CI 0.43-0.96), as was regular multivitamin use (OR 0.63, 95% CI 0.42-0.94). Risk of obesity was also decreased with increased taco consumption (≥1×/week; OR 0.64, 95% CI 0.43-0.96). Independent risk factors for abdominal obesity included playing video games ≥1×/week (OR 1.18, 95% CI 1.11-2.96) and older age group (10-11 years, OR 2.47, 95% CI 1.29-4.73 and ≥12 years, OR 2.21, 95% CI 1.09-4.49). Increased consumption of tacos was also associated with decreased risk for abdominal obesity (≥1×/week; OR 0.56, 95% CI 0.40-1.00). We found a bimodal distribution for risk of obesity and abdominal obesity in school-aged children on the Mexican border with the United States. Increased risk for obesity and abdominal obesity was associated with factors indicative of both lower and higher SES, including watching TV in English, increased video game playing and perceived food insecurity.

  10. Scarred resonances and steady probability distribution in a chaotic microcavity

    International Nuclear Information System (INIS)

    Lee, Soo-Young; Rim, Sunghwan; Kim, Chil-Min; Ryu, Jung-Wan; Kwon, Tae-Yoon

    2005-01-01

    We investigate scarred resonances of a stadium-shaped chaotic microcavity. It is shown that two components with different chirality of the scarring pattern are slightly rotated in opposite ways from the underlying unstable periodic orbit when the incident angles of the scarring pattern are close to the critical angle for total internal reflection. In addition, the correspondence of the emission pattern with the scarring pattern disappears when the incident angles are much larger than the critical angle. The steady probability distribution gives a consistent explanation of these phenomena and makes it possible to predict the emission pattern in the latter case

  11. Loaded dice in Monte Carlo : importance sampling in phase space integration and probability distributions for discrepancies

    NARCIS (Netherlands)

    Hameren, Andreas Ferdinand Willem van

    2001-01-01

    Discrepancies play an important role in the study of uniformity properties of point sets. Their probability distributions aid in analyzing the efficiency of the Quasi Monte Carlo method of numerical integration, which uses point sets that are distributed more uniformly than sets of

  12. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
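
    A calculation of the kind the article describes can be sketched as follows, under the simplifying assumption that each side's probability is proportional to the arc angle of its sector (the article's own spinner construction may use a different model):

```python
def spinner_probabilities(angles_deg):
    """Normalize sector arc angles into a probability distribution."""
    total = sum(angles_deg)
    return [a / total for a in angles_deg]

# A four-sided spinner whose sectors span 90, 90, 120 and 60 degrees:
probs = spinner_probabilities([90, 90, 120, 60])
print([round(p, 3) for p in probs])  # [0.25, 0.25, 0.333, 0.167]
```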

  13. Auditory-somatosensory bimodal stimulation desynchronizes brain circuitry to reduce tinnitus in guinea pigs and humans.

    Science.gov (United States)

    Marks, Kendra L; Martel, David T; Wu, Calvin; Basura, Gregory J; Roberts, Larry E; Schvartz-Leyzac, Kara C; Shore, Susan E

    2018-01-03

    The dorsal cochlear nucleus is the first site of multisensory convergence in mammalian auditory pathways. Principal output neurons, the fusiform cells, integrate auditory nerve inputs from the cochlea with somatosensory inputs from the head and neck. In previous work, we developed a guinea pig model of tinnitus induced by noise exposure and showed that the fusiform cells in these animals exhibited increased spontaneous activity and cross-unit synchrony, which are physiological correlates of tinnitus. We delivered repeated bimodal auditory-somatosensory stimulation to the dorsal cochlear nucleus of guinea pigs with tinnitus, choosing a stimulus interval known to induce long-term depression (LTD). Twenty minutes per day of LTD-inducing bimodal (but not unimodal) stimulation reduced physiological and behavioral evidence of tinnitus in the guinea pigs after 25 days. Next, we applied the same bimodal treatment to 20 human subjects with tinnitus using a double-blinded, sham-controlled, crossover study. Twenty-eight days of LTD-inducing bimodal stimulation reduced tinnitus loudness and intrusiveness. Unimodal auditory stimulation did not deliver either benefit. Bimodal auditory-somatosensory stimulation that induces LTD in the dorsal cochlear nucleus may hold promise for suppressing chronic tinnitus, which reduces quality of life for millions of tinnitus sufferers worldwide. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  14. Brain deactivation in the outperformance in bimodal tasks: an FMRI study.

    Directory of Open Access Journals (Sweden)

    Tzu-Ching Chiang

    Full Text Available While it is known that some individuals can effectively perform two tasks simultaneously, other individuals cannot. How the brain deals with performing simultaneous tasks remains unclear. In the present study, we aimed to assess which brain areas corresponded to various phenomena in task performance. Nineteen subjects were requested to sequentially perform three blocks of tasks, including two unimodal tasks and one bimodal task. The unimodal tasks measured either visual feature binding or auditory pitch comparison, while the bimodal task required performance of the two tasks simultaneously. The functional magnetic resonance imaging (fMRI) results are compatible with previous studies showing that distinct brain areas, such as the visual cortices, frontal eye field (FEF), lateral parietal lobe (BA7), and medial and inferior frontal lobe, are involved in processing of visual unimodal tasks. In addition, the temporal lobes and Brodmann area 43 (BA43) were involved in processing of auditory unimodal tasks. These results lend support to concepts of modality-specific attention. Compared to the unimodal tasks, bimodal tasks required activation of additional brain areas. Furthermore, while deactivated brain areas were related to good performance in the bimodal task, these areas were not deactivated when the subject performed well in only one of the two simultaneous tasks. These results indicate that efficient information processing does not require some brain areas to be overly active; rather, the specific brain areas need to be relatively deactivated to remain alert and perform well on two tasks simultaneously. Meanwhile, it can also offer a neural basis for biofeedback in training courses, such as courses in how to perform multiple tasks simultaneously.

  15. How bilingualism protects the brain from aging: Insights from bimodal bilinguals.

    Science.gov (United States)

    Li, Le; Abutalebi, Jubin; Emmorey, Karen; Gong, Gaolang; Yan, Xin; Feng, Xiaoxia; Zou, Lijuan; Ding, Guosheng

    2017-08-01

    Bilingual experience can delay cognitive decline during aging. A general hypothesis is that the executive control system of bilinguals faces an increased load due to controlling two languages, and this increased load results in a more "tuned brain" that eventually creates a neural reserve. Here we explored whether such a neuroprotective effect is independent of language modality, i.e., not limited to bilinguals who speak two languages but also occurs for bilinguals who use a spoken and a signed language. We addressed this issue by comparing bimodal bilinguals to monolinguals in order to detect age-induced structural brain changes and to determine whether we can detect the same beneficial effects on brain structure, in terms of preservation of gray matter volume (GMV), for bimodal bilinguals as has been reported for unimodal bilinguals. Our GMV analyses revealed a significant interaction effect of age × group in the bilateral anterior temporal lobes, left hippocampus/amygdala, and left insula where bimodal bilinguals showed slight GMV increases while monolinguals showed significant age-induced GMV decreases. We further found through cortical surface-based measurements that this effect was present for surface area and not for cortical thickness. Moreover, to further explore the hypothesis that overall bilingualism provides neuroprotection, we carried out a direct comparison of GMV, extracted from the brain regions reported above, between bimodal bilinguals, unimodal bilinguals, and monolinguals. Bilinguals, regardless of language modality, exhibited higher GMV compared to monolinguals. This finding highlights the general beneficial effects provided by experience handling two language systems, whether signed or spoken. Hum Brain Mapp 38:4109-4124, 2017. © 2017 Wiley Periodicals, Inc.

  16. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the Logistic map, to generate pseudo-random numbers that are mapped to the design variables for global optimization. Many existing studies have indicated that COAs can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs from a new perspective: the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of the chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve high efficiency, it is recommended to adopt a chaotic map that generates sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
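
    One of the two quantities the paper uses to characterize chaotic sequences, the Lyapunov exponent, can be estimated as the orbit average of ln|f′(x)|. A minimal sketch for the Logistic map x → rx(1−x), where the exact value at r = 4 is ln 2 ≈ 0.693 (an illustration only, not the authors' hybrid chaos-BFGS code):

```python
import math

def logistic_lyapunov(r=4.0, x0=0.3, n=20000, burn=1000):
    """Estimate the Lyapunov exponent of the Logistic map x -> r*x*(1-x)
    as the orbit average of ln|f'(x)| = ln|r*(1 - 2*x)|."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        d = abs(r * (1.0 - 2.0 * x))
        acc += math.log(d if d > 0.0 else 1e-300)  # guard the x = 0.5 corner case
        x = r * x * (1.0 - x)
    return acc / n

lam = logistic_lyapunov()
print(lam)  # close to ln 2 ≈ 0.693, the exact value for r = 4
```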

  17. The Efficiency of the Bimodal System Transportation

    Directory of Open Access Journals (Sweden)

    Nada Štrumberger

    2012-10-01

    Full Text Available The development of fast railway results in an increased application of Trailer Train bimodal system transportation. The traffic costs are multiply reduced, particularly the variable costs. On the other hand the environmental pollution from exhaust gases is also reduced. Therefore, by the year 2010 cargo transport should be preponderantly used which would be characterised by fast electric trains producing less noise, at lower costs and with a clean environment.

  18. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    Energy Technology Data Exchange (ETDEWEB)

    Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B [Laboratorio de Ingenieria de Rehabilitacion e Investigaciones Neuromusculares y Sensoriales, Facultad de Ingenieria, UNER, Oro Verde (Argentina)

    2007-11-15

    The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process, with a zero mean. It has been experimentally proved that this probability distribution can be adjusted with less error to a Laplacian type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In the case of subjects with lesions to the superior motor neuron, the lack of central control affects the muscular tone, the force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best adjusts to the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.

  19. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    International Nuclear Information System (INIS)

    Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B

    2007-01-01

    The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process, with a zero mean. It has been experimentally proved that this probability distribution can be adjusted with less error to a Laplacian type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In the case of subjects with lesions to the superior motor neuron, the lack of central control affects the muscular tone, the force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best adjusts to the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed
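
    The Gaussian-versus-Laplacian comparison described in these records can be sketched with maximum-likelihood fits of both families, whose log-likelihoods at the MLEs have the closed forms used below (illustrative synthetic data, not the SEMG recordings):

```python
import math, random

def gaussian_loglik(xs):
    """Max log-likelihood of a Gaussian fit (MLE mean and variance)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return -0.5 * n * (math.log(2.0 * math.pi * var) + 1.0)

def laplacian_loglik(xs):
    """Max log-likelihood of a Laplacian fit (MLE: median and mean |deviation|)."""
    n = len(xs)
    mu = sorted(xs)[n // 2]                   # location MLE: the median
    b = sum(abs(x - mu) for x in xs) / n      # scale MLE
    return -n * (math.log(2.0 * b) + 1.0)

random.seed(0)
# Zero-mean Laplacian samples stand in for a surface-EMG amplitude trace:
data = [random.choice([-1, 1]) * random.expovariate(1.0) for _ in range(5000)]
print(laplacian_loglik(data) > gaussian_loglik(data))  # True: the Laplacian fits better
```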

  20. Spectral shaping of a randomized PWM DC-DC converter using maximum entropy probability distributions

    CSIR Research Space (South Africa)

    Dove, Albert

    2017-01-01

    Full Text Available maintaining constraints in a DC-DC converter is investigated. A probability distribution whose aim is to ensure maximal harmonic spreading and yet maintain constraints is presented. The PDFs are determined from a direct application of the method of Maximum...

  1. Application of a bi-modal PBR nuclear propulsion and power system to military missions

    Science.gov (United States)

    Venetoklis, Peter S.

    1995-01-01

    The rapid proliferation of arms technology and space access combined with current economic realities in the United States are creating ever greater demands for more capable space-based military assets. The paper illustrates that bi-modal nuclear propulsion and power based on the Particle Bed Reactor (PBR) is a high-leverage technology that can maximize utility while minimizing cost. Mission benefits offered by the bi-modal PBR, including enhanced maneuverability, lifetime, survivability, payload power, and operational flexibility, are discussed. The ability to deliver desired payloads on smaller boosters is also illustrated. System descriptions and parameters for 10 kWe and 100 kWe power output levels are summarized. It is demonstrated via a design exercise that bi-modal PBR dramatically enhances performance of a military satellite in geosynchronous orbit, increasing payload mass, payload power, and maneuverability.

  2. Far-from-Equilibrium Route to Superthermal Light in Bimodal Nanolasers

    Directory of Open Access Journals (Sweden)

    Mathias Marconi

    2018-01-01

    Full Text Available Microscale and nanoscale lasers inherently exhibit rich photon statistics due to complex light-matter interaction in a strong spontaneous emission noise background. It is well known that they may display superthermal fluctuations—photon superbunching—in specific situations due to either gain competition, leading to mode-switching instabilities, or carrier-carrier coupling in superradiant microcavities. Here we show a generic route to superbunching in bimodal nanolasers by preparing the system far from equilibrium through a parameter quench. We demonstrate, both theoretically and experimentally, that transient dynamics after a short-pump-pulse-induced quench leads to heavy-tailed superthermal statistics when projected onto the weak mode. We implement a simple experimental technique to access the probability density functions that further enables quantifying the distance from thermal equilibrium via the thermodynamic entropy. The universality of this mechanism relies on the far-from-equilibrium dynamical scenario, which can be mapped to a fast cooling process of a suspension of Brownian particles in a liquid. Our results open up new avenues to mold photon statistics in multimode optical systems and may constitute a test bed to investigate out-of-equilibrium thermodynamics using micro or nanocavity arrays.

  3. Far-from-Equilibrium Route to Superthermal Light in Bimodal Nanolasers

    Science.gov (United States)

    Marconi, Mathias; Javaloyes, Julien; Hamel, Philippe; Raineri, Fabrice; Levenson, Ariel; Yacomotti, Alejandro M.

    2018-02-01

    Microscale and nanoscale lasers inherently exhibit rich photon statistics due to complex light-matter interaction in a strong spontaneous emission noise background. It is well known that they may display superthermal fluctuations—photon superbunching—in specific situations due to either gain competition, leading to mode-switching instabilities, or carrier-carrier coupling in superradiant microcavities. Here we show a generic route to superbunching in bimodal nanolasers by preparing the system far from equilibrium through a parameter quench. We demonstrate, both theoretically and experimentally, that transient dynamics after a short-pump-pulse-induced quench leads to heavy-tailed superthermal statistics when projected onto the weak mode. We implement a simple experimental technique to access the probability density functions that further enables quantifying the distance from thermal equilibrium via the thermodynamic entropy. The universality of this mechanism relies on the far-from-equilibrium dynamical scenario, which can be mapped to a fast cooling process of a suspension of Brownian particles in a liquid. Our results open up new avenues to mold photon statistics in multimode optical systems and may constitute a test bed to investigate out-of-equilibrium thermodynamics using micro or nanocavity arrays.
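
    The superthermal ("superbunching") criterion both of these records refer to is usually stated through the zero-delay second-order coherence g²(0) = ⟨n(n−1)⟩/⟨n⟩²: 1 for coherent (Poissonian) light, 2 for thermal light, and above 2 for superthermal statistics. A minimal numerical check on truncated photon-number distributions (an illustration of the criterion only, not the paper's nanolaser model):

```python
import math

def g2_from_pn(pn):
    """Zero-delay second-order coherence g2(0) = <n(n-1)> / <n>^2
    computed from a photon-number distribution p(n)."""
    mean = sum(n * p for n, p in enumerate(pn))
    fac = sum(n * (n - 1) * p for n, p in enumerate(pn))
    return fac / mean ** 2

nbar, N = 2.0, 100  # mean photon number and truncation of the number ladder
poisson = [math.exp(-nbar) * nbar ** n / math.factorial(n) for n in range(N)]
thermal = [nbar ** n / (1 + nbar) ** (n + 1) for n in range(N)]
print(round(g2_from_pn(poisson), 3))  # 1.0  (coherent light)
print(round(g2_from_pn(thermal), 3))  # 2.0  (thermal light; > 2 is superthermal)
```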

  4. Pluto/Charon exploration utilizing a bi-modal PBR nuclear propulsion/power system

    Science.gov (United States)

    Venetoklis, Peter S.

    1995-01-01

    The paper describes a Pluto/Charon orbiter utilizing a bi-modal nuclear propulsion and power system based on the Particle Bed Reactor. The orbiter is sized for launch to Nuclear-Safe orbit atop a Titan IV or equivalent launch veicle. The bi-modal system provides thermal propulsion for Earth orbital departure and Pluto orbital capture, and 10 kWe of electric power for payload functions and for in-system maneuvering with ion thrusters. Ion thrusters are used to perform inclination changes about Pluto, a transfer from low Pluto orbit to low Charon orbit, and inclination changes about charon. A nominal payload can be deliverd in as little as 15 years, 1000 kg in 17 years, and close to 2000 kg in 20 years. Scientific return is enormously aided by the availability of up to 10 kWe, due to greater data transfer rates and more/better instruments. The bi-modal system can provide power at Pluto/Charon for 10 or more years, enabling an extremely robust, scientifically rewarding, and cost-effective exploration mission.

  5. THE BIMODAL STRUCTURE OF THE SOLAR CYCLE

    Energy Technology Data Exchange (ETDEWEB)

    Du, Z. L., E-mail: zldu@nao.cas.cn [Key Laboratory of Solar Activity, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China)

    2015-05-01

    Some properties of the 11 yr solar cycle can be explained by the current solar dynamo models. However, some other features remain not well understood, such as the asymmetry of the cycle, the double-peaked structure, and the “Waldmeier effect” that a stronger cycle tends to have a shorter rise time and a shorter cycle length. We speculate that the solar cycle is governed by a bi-dynamo model forming two stochastic processes depicted by a bimodal Gaussian function with a time gap of about 2 yr, from which the above features can be reasonably explained. The first one describes the main properties of the cycle dominated by the current solar dynamo models, and the second one occurs either in the rising phase as a short weak explosive perturbation or in the declining phase as a long stochastic perturbation. The above function is the best one selected from several in terms of the Akaike information criterion. Through analyzing different distributions, one might speculate about the dominant physical process inside the convection zone. The secondary (main) process is found to be closely associated with complicated (simple) active ranges. In effect, the bi-dynamo model is a reduced form of a multi-dynamo model, which could occur from the base of the convection zone through its envelope and from low to high heliographic latitude, reflecting the active belts in the convection zone. These results are insensitive to the hemispheric asymmetry, smoothing filters, and distribution functions selected and are expected to be helpful in understanding the formation of solar and stellar cycles.
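
    The model-selection step described above, preferring a bimodal Gaussian by the Akaike information criterion (AIC = 2k − 2 ln L), can be sketched on synthetic data with two peaks separated by about 2 "years". This is a toy EM fit under assumed parameters, not the authors' analysis of solar data:

```python
import math, random

def normpdf(x, mu, sig):
    return math.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * math.sqrt(2 * math.pi))

def loglik_single(xs):
    """Log-likelihood of the best single-Gaussian fit (2 free parameters)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return sum(math.log(normpdf(x, mu, math.sqrt(var))) for x in xs)

def loglik_mixture(xs, iters=200):
    """EM fit of a two-component 1-D Gaussian mixture (5 free parameters)."""
    w, mu1, mu2, s1, s2 = 0.5, min(xs), max(xs), 1.0, 1.0
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = [w * normpdf(x, mu1, s1) /
             (w * normpdf(x, mu1, s1) + (1 - w) * normpdf(x, mu2, s2)) for x in xs]
        # M-step: re-estimate weights, means and widths
        n1 = sum(r); n2 = len(xs) - n1
        w = n1 / len(xs)
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1)
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / n2)
    return sum(math.log(w * normpdf(x, mu1, s1) + (1 - w) * normpdf(x, mu2, s2))
               for x in xs)

def aic(loglik, k):
    return 2 * k - 2 * loglik

random.seed(42)
# Two activity peaks separated by ~2 "years", echoing the bimodal-Gaussian picture:
xs = ([random.gauss(0.0, 0.5) for _ in range(300)] +
      [random.gauss(2.0, 0.5) for _ in range(300)])
print(aic(loglik_mixture(xs), k=5) < aic(loglik_single(xs), k=2))  # True: AIC picks the bimodal fit
```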

  6. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of the large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes an exponentially long time as the noise strength approaches zero. The majority of the time is wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and by applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise, while it may induce certain deviations for large noise. Finally, some possible ways to improve our method are discussed.
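
    For contrast with the interface/reinjection method the paper proposes, the brute-force baseline it improves on looks like the sketch below: Euler-Maruyama steps in a well until the trajectory crosses the absorbing boundary, recording the exit side. The potential V(x) = x²/2 and all parameters are hypothetical; this direct approach is exactly the one whose cost blows up as the noise tends to zero:

```python
import math, random

def exit_side_probability(x0=0.0, noise=0.5, dt=1e-3, trials=300, seed=1):
    """Direct Monte Carlo for a 1-D exit problem:
    dx = -V'(x) dt + sqrt(2*noise*dt) * N(0, 1) with V(x) = x**2 / 2,
    absorbing boundaries at |x| = 1. Returns the fraction of trajectories
    that exit through the left boundary (the crude exit-location statistic)."""
    rng = random.Random(seed)
    step = math.sqrt(2.0 * noise * dt)
    left = 0
    for _ in range(trials):
        x = x0
        while abs(x) < 1.0:
            x += -x * dt + step * rng.gauss(0.0, 1.0)
        if x < 0.0:
            left += 1
    return left / trials

print(exit_side_probability())  # close to 0.5, by the symmetry of the well
```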

  7. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1990-12-01

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)

  8. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, M S

    1990-12-15

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
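
    The Maximum Entropy Formalism mentioned in both records yields distributions of Gibbs form p ∝ exp(−λx) when only a mean is constrained. A minimal sketch on a discrete support, solving for the Lagrange multiplier by bisection (Jaynes' classic loaded-die example, not the WIPP variables themselves):

```python
import math

def maxent_discrete(xs, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy distribution on support xs subject to a fixed mean.
    The solution has Gibbs form p_i ∝ exp(-lam * x_i); the multiplier lam
    is found by bisection, since the implied mean decreases in lam."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid  # mean too high -> need a larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# A die whose average pip count is constrained to 4.5 (a fair die gives 3.5):
p = maxent_discrete([1, 2, 3, 4, 5, 6], target_mean=4.5)
print([round(pi, 3) for pi in p])  # monotonically increasing probabilities
```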

  9. Deformation behavior of multilayered NiFe with bimodal grain size distribution at room and elevated temperature

    Energy Technology Data Exchange (ETDEWEB)

    Fiebig, Jochen, E-mail: jmfiebig@ucdavis.edu [Department of Chemical Engineering and Materials Science, University of California, Davis, CA 95817 (United States); Jian, Jie [Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX 77843-3128 (United States); Kurmanaeva, Lilia [Department of Chemical Engineering and Materials Science, University of California, Davis, CA 95817 (United States); McCrea, Jon [Integran Technologies Inc., Toronto (Canada); Wang, Haiyan [Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX 77843-3128 (United States); Lavernia, Enrique [Department of Chemical Engineering and Materials Science, University of California, Davis, CA 95817 (United States); Department of Chemical Engineering and Materials Science, University of California, Irvine, CA 92697 (United States); Mukherjee, Amiya [Department of Chemical Engineering and Materials Science, University of California, Davis, CA 95817 (United States)

    2016-02-22

    We describe a study of the temperature-dependent deformation behavior of a multilayered NiFe-60 wt%Fe alloy with a layer thickness of 5 μm fabricated by electrodeposition. The structure of adjacent layers alternates between nanocrystalline and coarse-grained. Uniaxial tensile tests at temperatures between 20 °C and 400 °C and strain rates of 10⁻⁴–10⁻² were used to determine the mechanical behavior. Microstructure observations via transmission electron microscopy and fractography were performed to provide insight into the underlying deformation mechanism. The mechanical behavior is discussed in the context of the bimodal microstructure of multilayered samples and the contribution of each sub-layer to strength and ductility. The results reveal that even at higher temperatures the nanocrystalline layer determines the mechanical performance of multilayered materials.

  10. Effects of tensile test parameters on the mechanical properties of a bimodal Al–Mg alloy

    International Nuclear Information System (INIS)

    Magee, Andrew; Ladani, Leila; Topping, Troy D.; Lavernia, Enrique J.

    2012-01-01

    The properties of aluminum alloy (AA) 5083 are shown to be significantly improved by grain size reduction through cryomilling and the incorporation of unmilled Al particles into the material, creating a bimodal grain size distribution consisting of coarse grains in a nanocrystalline matrix. To provide insight into the mechanical behavior and ultimately facilitate engineering applications, the present study reports on the effects of coarse grain ratio, anisotropy, strain rate and specimen size on the elastic–plastic behavior of bimodal AA 5083 evaluated in uniaxial tension tests using a full-factorial experiment design. To determine the governing failure mechanisms under different testing conditions, the specimens’ failure surfaces were analyzed using optical and electron microscopy. The results of the tests were found to conform to Joshi’s plasticity model. Significant anisotropy effects, namely a drastic reduction in strength and ductility, were observed when tension was applied perpendicular (transverse) to the direction of extrusion. These specimens also exhibited a smooth, flat fracture surface morphology with a significantly different surface texture than specimens tested in the axial direction. It was found that decreasing specimen thickness and strain rate served to increase both the strength and ductility of the material. The failure surface morphology was found to differ between specimens of different thicknesses.

  11. On the probability distribution of daily streamflow in the United States

    Science.gov (United States)

    Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.

    2017-06-01

    Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for the period of record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
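
    An empirical flow duration curve of the kind being modeled here can be built directly from a daily-flow record by ranking flows and assigning exceedance probabilities. The sketch below uses Weibull plotting positions p = i/(n+1), a common convention; the paper's own estimator may differ:

```python
def flow_duration_curve(daily_flows):
    """Empirical FDC: sort flows in descending order and attach Weibull
    plotting positions p_i = i / (n + 1) as exceedance probabilities."""
    ordered = sorted(daily_flows, reverse=True)
    n = len(ordered)
    return [(i / (n + 1), q) for i, q in enumerate(ordered, start=1)]

# A hypothetical week of daily flows (arbitrary units):
fdc = flow_duration_curve([12.0, 3.5, 80.2, 3.5, 1.1, 20.0, 7.3])
print(fdc[0])  # (0.125, 80.2): the largest flow is equaled or exceeded 12.5% of the time
```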

  12. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed using Fisher’s linear discriminant analysis, the support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristic curve, which were superior to the results obtained by either Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
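
    The decision rule evaluated in this paper, kernel density estimates per class combined with a maximal-posterior-probability criterion, can be sketched in one dimension as follows (illustrative data and bandwidth, not the VAG features):

```python
import math

def gauss_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kde(x, samples, h):
    """Kernel density estimate at x with a Gaussian kernel of bandwidth h."""
    return sum(gauss_kernel((x - s) / h) for s in samples) / (len(samples) * h)

def map_classify(x, class_samples, priors, h=0.5):
    """Maximal posterior probability decision: argmax_c prior_c * p_hat(x | c)."""
    scores = {c: priors[c] * kde(x, s, h) for c, s in class_samples.items()}
    return max(scores, key=scores.get)

# Hypothetical one-dimensional feature values for two signal groups:
classes = {"normal": [0.1, 0.3, -0.2, 0.0, 0.25],
           "abnormal": [2.1, 1.8, 2.4, 2.0, 1.7]}
priors = {"normal": 0.5, "abnormal": 0.5}
print(map_classify(0.2, classes, priors))  # normal
print(map_classify(2.0, classes, priors))  # abnormal
```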

  13. Near-Infrared Squaraine Dye Encapsulated Micelles for in Vivo Fluorescence and Photoacoustic Bimodal Imaging.

    Science.gov (United States)

    Sreejith, Sivaramapanicker; Joseph, James; Lin, Manjing; Menon, Nishanth Venugopal; Borah, Parijat; Ng, Hao Jun; Loong, Yun Xian; Kang, Yuejun; Yu, Sidney Wing-Kwong; Zhao, Yanli

    2015-06-23

    Combined near-infrared (NIR) fluorescence and photoacoustic imaging techniques present promising capabilities for noninvasive visualization of biological structures. Development of bimodal noninvasive optical imaging approaches by combining NIR fluorescence and photoacoustic tomography demands suitable NIR-active exogenous contrast agents. If aggregation and photobleaching are prevented, squaraine dyes are ideal candidates for fluorescence and photoacoustic imaging. Herein, we report rational selection, preparation, and micelle encapsulation of an NIR-absorbing squaraine dye (D1) for in vivo fluorescence and photoacoustic bimodal imaging. D1 was encapsulated inside micelles constructed from a biocompatible nonionic surfactant (Pluronic F-127) to obtain D1-encapsulated micelles (D1(micelle)) in aqueous conditions. The micelle encapsulation retains both the photophysical features and chemical stability of D1. D1(micelle) exhibits high photostability and low cytotoxicity in biological conditions. Unique properties of D1(micelle) in the NIR window of 800-900 nm enable the development of a squaraine-based exogenous contrast agent for fluorescence and photoacoustic bimodal imaging above 820 nm. In vivo imaging using D1(micelle), as demonstrated by fluorescence and photoacoustic tomography experiments in live mice, shows contrast-enhanced deep tissue imaging capability. The usage of D1(micelle) proven by preclinical experiments in rodents reveals its excellent applicability for NIR fluorescence and photoacoustic bimodal imaging.

  14. Penetration in bimodal, polydisperse granular material

    KAUST Repository

    Kouraytem, Nadia; Thoroddsen, Sigurdur T; Marston, J. O.

    2016-01-01

    We investigate the impact penetration of spheres into granular media which are compositions of two discrete size ranges, thus creating a polydisperse bimodal material. We examine the penetration depth as a function of the composition (volume fractions of the respective sizes) and impact speed. Penetration depths were found to vary between δ = 0.5D₀ and δ = 7D₀, which, for mono-modal media only, could be correlated in terms of the total drop height, H = h + δ, as in previous studies, by incorporating correction factors for the packing fraction. Bimodal data can only be collapsed by deriving a critical packing fraction for each mass fraction. The data for the mixed grains exhibit a surprising lubricating effect, which was most significant when the finest grains [d_s ~ O(30) μm] were added to the larger particles [d_l ~ O(200–500) μm], with a size ratio ε = d_l/d_s larger than 3 and mass fractions over 25%, despite the increased packing fraction. We postulate that the small grains get between the large grains and reduce their intergrain friction, but only when their mass fraction is sufficiently large to prevent them from simply rattling in the voids between the large particles. This is supported by our experimental observations of the largest lubrication effect produced by adding small glass beads to a bed of large sand particles with rough surfaces.

  15. Penetration in bimodal, polydisperse granular material

    KAUST Repository

    Kouraytem, N.

    2016-11-07

    We investigate the impact penetration of spheres into granular media which are compositions of two discrete size ranges, thus creating a polydisperse bimodal material. We examine the penetration depth as a function of the composition (volume fractions of the respective sizes) and impact speed. Penetration depths were found to vary between δ = 0.5D₀ and δ = 7D₀, which, for mono-modal media only, could be correlated in terms of the total drop height, H = h + δ, as in previous studies, by incorporating correction factors for the packing fraction. Bimodal data can only be collapsed by deriving a critical packing fraction for each mass fraction. The data for the mixed grains exhibit a surprising lubricating effect, which was most significant when the finest grains [d_s ~ O(30) μm] were added to the larger particles [d_l ~ O(200–500) μm], with a size ratio ε = d_l/d_s larger than 3 and mass fractions over 25%, despite the increased packing fraction. We postulate that the small grains get between the large grains and reduce their intergrain friction, but only when their mass fraction is sufficiently large to prevent them from simply rattling in the voids between the large particles. This is supported by our experimental observations of the largest lubrication effect produced by adding small glass beads to a bed of large sand particles with rough surfaces.

  16. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks

    International Nuclear Information System (INIS)

    Zhuang Jiancang; Ogata, Yosihiko

    2006-01-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases: when the process is subcritical, critical, and supercritical. One direct use of these probability distributions is to evaluate the probability of an earthquake being a foreshock, and the magnitude distributions of foreshock and non-foreshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background events and triggered events differ, are consistent with the results obtained in [Ogata et al., Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method

  17. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    Science.gov (United States)

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases: when the process is subcritical, critical, and supercritical. One direct use of these probability distributions is to evaluate the probability of an earthquake being a foreshock, and the magnitude distributions of foreshock and non-foreshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background events and triggered events differ, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  18. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which

  19. Polyethyleneimine-loaded bimodal porous silica as low-cost and high-capacity sorbent for CO₂ capture

    Energy Technology Data Exchange (ETDEWEB)

    Witoon, Thongthai, E-mail: fengttwi@ku.ac.th [National Center of Excellence for Petroleum, Petrochemicals and Advance Material, Department of Chemical Engineering, Faculty of Engineering, Kasetsart University, Bangkok 10900 (Thailand); Center for Advanced Studies in Nanotechnology and Its Applications in Chemical Food and Agricultural Industries, Kasetsart University, Bangkok 10900 (Thailand)

    2012-11-15

    In this work, bimodal (meso-macro) porous silicas with different mesopore diameters, synthesized using rice husk ash as a low-cost silica source and chitosan as a natural template, were used as a polyethyleneimine (PEI) support for CO₂ capture. Unimodal porous silica supports with mesopore diameters equivalent to those of the bimodal porous silica supports were prepared for comparison. The effects of different PEI contents (10, 20, 30, 40 and 50 wt%) on CO₂ sorption capacity were systematically investigated. The porous silica supports and the PEI-loaded porous silica supports were characterized by N₂-sorption analysis, scanning electron microscopy, Fourier transform infrared spectroscopy and thermogravimetric analysis. CO₂ sorption measurements of all PEI-loaded porous silica supports were performed at different adsorption temperatures (60, 75, 85, 90, 95 and 105 °C). At low PEI contents (10-20 wt%), the CO₂ sorption of all adsorbents was found to decrease as a function of adsorption temperature, which is characteristic of a thermodynamically controlled regime. A transition from the thermodynamically controlled regime to a kinetically controlled regime was found when the PEI content was increased to 30 wt% for PEI-loaded unimodal porous silicas and 40 wt% for PEI-loaded bimodal porous silicas. At high PEI contents (40-50 wt%), the CO₂ capturing efficiency of the PEI-loaded bimodal porous silicas was found to be considerably greater than that of the PEI-loaded unimodal porous silicas, indicating that most of the amine groups of the PEI molecules loaded on the unimodal porous silica supports were useless; the macroporosity of the bimodal porous silica supports could thus provide a higher effective amine density to adsorb CO₂. Highlights: ► PEI-impregnated bimodal porous silica as low-cost sorbent for CO₂ capture. ► Macropores enhances

  20. Possible detection of a bimodal cloud distribution in the atmosphere of HAT-P-32 A b from multiband photometry

    Science.gov (United States)

    Tregloan-Reed, J.; Southworth, J.; Mancini, L.; Mollière, P.; Ciceri, S.; Bruni, I.; Ricci, D.; Ayala-Loera, C.; Henning, T.

    2018-03-01

    We present high-precision photometry of eight separate transit events in the HAT-P-32 planetary system. One transit event was observed simultaneously by two telescopes, one of which obtained a simultaneous multiband light curve in three optical bands, giving a total of 11 transit light curves. Due to the filter selection and in conjunction with using the defocused photometry technique, we were able to obtain an extremely high-precision, ground-based transit in the u band (350 nm), with an rms scatter of ≈1 mmag. All 11 transits were modelled using PRISM and GEMC, and the physical properties of the system were calculated. We find the mass and radius of the host star to be 1.182 ± 0.041 M⊙ and 1.225 ± 0.015 R⊙, respectively. For the planet, we find a mass of 0.80 ± 0.14 MJup, a radius of 1.807 ± 0.022 RJup, and a density of 0.126 ± 0.023 ρJup. These values are consistent with those found in the literature. We also obtain a new orbital ephemeris for the system: T0 = BJD/TDB 2 454 420.447187(96) + 2.15000800(10) × E. We measured the transmission spectrum of HAT-P-32 A b and compared it to theoretical transmission spectra. Our results indicate a bimodal cloud particle distribution consisting of Rayleigh-like haze and grey absorbing cloud particles within the atmosphere of HAT-P-32 A b.
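
    The quoted ephemeris gives predicted transit mid-times as T(E) = T0 + P × E. A minimal helper, with T0 and P copied from the abstract (uncertainties dropped) and an arbitrary epoch:

```python
# Predicted transit mid-times from the linear ephemeris in the abstract.
T0 = 2454420.447187   # BJD/TDB, reference mid-transit time
P = 2.15000800        # orbital period in days

def transit_time(epoch):
    """Mid-time of transit number `epoch` after the reference transit."""
    return T0 + P * epoch

print(round(transit_time(100) - T0, 6))  # 100 orbits later: 215.0008 days
```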

  1. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    Monte Carlo methods are commonly used to observe the overall distribution and to determine lower or upper bound values in a statistical approach when direct analytical calculation is unavailable. However, these methods are not efficient when the tail area of a distribution is of concern. A new method, entitled 'Two Step Tail Area Sampling', is developed; it uses the assumption of a discrete probability distribution and samples only the tail area without distorting the overall distribution. The method uses a two-step sampling procedure: first, sampling is performed at points separated by large intervals; second, sampling is performed at points separated by small intervals, using check points determined in the first step. Comparison with the Monte Carlo method shows that the results obtained from the new method converge to the analytic value faster than the Monte Carlo method for the same number of calculations. The new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of pressurized light water nuclear reactors
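
    The two-step idea (a coarse pass locates where the tail begins, a fine pass resolves only the tail) can be illustrated with a deterministic stand-in; the standard normal density and the step sizes below are our own choices, not the authors':

```python
# Illustrative sketch of a two-step tail scan: coarse sampling brackets the
# tail region, fine sampling (trapezoid rule) is spent only on the tail.
import math

def pdf(x):  # standard normal density as a stand-in distribution
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def tail_area(threshold, x_max=8.0, coarse=0.5, fine=0.01):
    # Step 1: coarse sampling to locate the start of the tail region.
    x = 0.0
    while x < x_max and x < threshold:
        x += coarse
    start = max(threshold, x - coarse)
    # Step 2: fine sampling only over the tail, leaving the rest untouched.
    area, x = 0.0, start
    while x < x_max:
        area += 0.5 * (pdf(x) + pdf(x + fine)) * fine
        x += fine
    return area

print(round(tail_area(1.96), 4))  # ≈ 0.025 for a standard normal
```

    The effort saved grows with the distance of the threshold from the bulk of the distribution, which is the regime the paper targets.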

  2. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2) and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in the search for the best-fit distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. Conflicts among the criteria in selecting the best distribution were overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of the air pollutant data in Kuala Lumpur.
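
    A minimal sketch of the selection procedure: fit candidate distributions by maximum likelihood and rank them by fit quality. Log-likelihood stands in for the paper's five goodness-of-fit criteria, and the data are synthetic, not the Kuala Lumpur measurements.

```python
# Fit two candidate distributions by MLE and rank them by log-likelihood.
import math, random

random.seed(1)
api = [random.lognormvariate(4.0, 0.4) for _ in range(200)]  # synthetic API data

def loglik_lognormal(xs):
    logs = [math.log(x) for x in xs]
    mu = sum(logs) / len(logs)                      # MLE of the log-mean
    var = sum((l - mu) ** 2 for l in logs) / len(logs)
    return sum(-math.log(x * math.sqrt(2 * math.pi * var))
               - (math.log(x) - mu) ** 2 / (2 * var) for x in xs)

def loglik_exponential(xs):
    lam = len(xs) / sum(xs)                         # MLE of the rate
    return sum(math.log(lam) - lam * x for x in xs)

best = max([("log-normal", loglik_lognormal(api)),
            ("exponential", loglik_exponential(api))], key=lambda t: t[1])
print(best[0])  # the better-fitting candidate for this synthetic sample
```

    The paper extends the same comparison to Gamma and Weibull fits and resolves disagreements among criteria with the weight-of-ranks method.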

  3. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.

  4. Research on Energy-Saving Design of Overhead Travelling Crane Camber Based on Probability Load Distribution

    Directory of Open Access Journals (Sweden)

    Tong Yifei

    2014-01-01

    Full Text Available Crane is a mechanical device used widely to move materials in modern production. It is reported that the energy consumptions of China are at least 5–8 times those of other developing countries. Thus, energy consumption becomes an unavoidable topic. Several factors influence the energy loss, and the camber of the girder is one that should not be neglected. In this paper, the problem of the deflections induced by a moving payload in the girder of an overhead travelling crane is examined. The evaluation of a camber giving a counter-deflection of the girder is proposed in order to obtain minimum energy consumption for the trolley moving along a non-straight support. To this aim, probabilistic payload distributions are considered instead of the fixed or rated loads used in other research. Taking a 50/10 t bridge crane as the research object, the probability loads are determined by analysis of load distribution density functions. According to the load distribution, camber design under different probability loads is discussed in detail, as well as the energy consumption distribution. The research results provide a design reference for a reasonable camber that obtains the least energy consumption for climbing corresponding to different P0; thus, an energy-saving design can be achieved.

  5. Tumour control probability (TCP) for non-uniform activity distribution in radionuclide therapy

    International Nuclear Information System (INIS)

    Uusijaervi, Helena; Bernhardt, Peter; Forssell-Aronsson, Eva

    2008-01-01

    Non-uniform radionuclide distribution in tumours will lead to a non-uniform absorbed dose. The aim of this study was to investigate how tumour control probability (TCP) depends on the radionuclide distribution in the tumour, both macroscopically and at the subcellular level. The absorbed dose in the cell nuclei of tumours was calculated for 90Y, 177Lu, 103mRh and 211At. The radionuclides were uniformly distributed within the subcellular compartment and they were uniformly, normally or log-normally distributed among the cells in the tumour. When all cells contain the same amount of activity, the cumulated activities required for TCP = 0.99 (Ã(TCP=0.99)) were 1.5-2 and 2-3 times higher when the activity was distributed on the cell membrane compared to in the cell nucleus for 103mRh and 211At, respectively. TCP for 90Y was not affected by different radionuclide distributions, whereas for 177Lu it was slightly affected when the radionuclide was in the nucleus. TCP for 103mRh and 211At was affected by different radionuclide distributions to a great extent when the radionuclides were in the cell nucleus and to a lesser extent when the radionuclides were distributed on the cell membrane or in the cytoplasm. When the activity was distributed in the nucleus, Ã(TCP=0.99) increased as the activity distribution became more heterogeneous for 103mRh and 211At, and the increase was large when the activity was normally distributed compared to log-normally distributed. When the activity was distributed on the cell membrane, Ã(TCP=0.99) was not affected for 103mRh and 211At when the activity distribution became more heterogeneous. Ã(TCP=0.99) for 90Y and 177Lu was not affected by different activity distributions, neither macroscopic nor subcellular
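
    The qualitative finding that dose heterogeneity lowers TCP can be reproduced with a minimal Poisson TCP model, TCP = exp(−Σᵢ exp(−α·Dᵢ)) over the cell doses Dᵢ. All radiobiological parameters below are invented for illustration, not taken from the paper.

```python
# Minimal Poisson TCP sketch: compare a uniform dose with a log-normal dose
# distribution of the same mean. Parameters are illustrative only.
import math, random

random.seed(0)
alpha, n_cells, mean_dose = 0.35, 10000, 60.0  # assumed radiosensitivity etc.

def tcp(doses):
    # Probability that no clonogenic cell survives (Poisson statistics).
    return math.exp(-sum(math.exp(-alpha * d) for d in doses))

uniform = [mean_dose] * n_cells
sigma = 0.5  # log-normal shape; mu chosen to keep the same mean dose
mu = math.log(mean_dose) - 0.5 * sigma ** 2
heterogeneous = [random.lognormvariate(mu, sigma) for _ in range(n_cells)]

print(tcp(uniform) > tcp(heterogeneous))  # heterogeneity lowers TCP: True
```

    The under-dosed cells in the heterogeneous case dominate the survival sum, which is why more cumulated activity is needed to reach TCP = 0.99 for heterogeneous distributions.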

  6. Effects of different block size distributions in pressure transient response of naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Montazeri, G.H. [Islamic Azad University, Mahshahr (Iran, Islamic Republic of). Dept. of Chemical and Petroleum Engineering], E-mail: montazeri_gh@yahoo.com; Tahami, S.A. [Mad Daneshgostar Tabnak Co. (MDT),Tehran (Iran, Islamic Republic of); Moradi, B.; Safari, E. [Iranian Central Oil Fields Co, Tehran (Iran, Islamic Republic of)], E-mail: morady.babak@gmail.com

    2011-07-15

    This paper presents a model for pressure transient and derivative analysis for naturally fractured reservoirs using a formulation of interporosity flow that incorporates variations in matrix block size, which is inversely related to fracture intensity. Geologically realistic probability density functions (PDFs) of matrix block size, such as uniform, bimodal, linear and exponential distributions, are examined, and pseudo-steady-state and transient models for interporosity flow are assumed. The results have been physically interpreted and, in contrast to results obtained by other authors, it was found that the shapes of the pressure derivative curves for different PDFs are basically identical within some ranges of block size variability, interporosity skin, PDF parameters and matrix storage capacity. This tool can give insight into the distribution of block sizes and shapes, together with other sources of information such as logs and geological observations. (author)

  7. Mesoporous ethanesilica materials with bimodal and trimodal pore-size distributions synthesised in the presence of cobalt ions

    Directory of Open Access Journals (Sweden)

    Alufelwi M. Tshavhungwe

    2010-07-01

    Full Text Available Mesoporous organosilica materials containing ethane groups in their framework were formed with two and three pore sizes (i.e. bimodal and trimodal pores) when synthesised by the sol-gel method in the presence of cobalt ions. The compounds 1,2-bis(trimethoxysilyl)ethane and tetraethylorthosilicate were used as silicon sources and the reactions were done in the presence of a surfactant, which served as a template. Diffuse reflectance infrared Fourier transform spectroscopy revealed that organic functional groups were incorporated into the ethanesilica. Powder X-ray diffraction and nitrogen adsorption data indicated that the mesophase and textural properties (surface area, pore volume, pore diameter) of the materials were dependent on the ageing temperature, the amount/ratio of silica precursors and cobalt ion incorporation. Secondary mesopores were drastically reduced by changing the ratio of silicon precursors.

  8. Does bimodal stimulus presentation increase ERP components usable in BCIs?

    NARCIS (Netherlands)

    Thurlings, M.E.; Brouwer, A.M.; Erp, J.B.F. van; Blankertz, B.; Werkhoven, P.J.

    2012-01-01

    Event-related potential (ERP)-based brain–computer interfaces (BCIs) employ differences in brain responses to attended and ignored stimuli. Typically, visual stimuli are used. Tactile stimuli have recently been suggested as a gaze-independent alternative. Bimodal stimuli could evoke additional brain

  9. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I. Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II. Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III. Distributions; Ide...

  10. Various models for pion probability distributions from heavy-ion collisions

    International Nuclear Information System (INIS)

    Mekjian, A.Z.; Mekjian, A.Z.; Schlei, B.R.; Strottman, D.; Schlei, B.R.

    1998-01-01

    Various models for pion multiplicity distributions produced in relativistic heavy-ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting-source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparisons are made between the pion laser model, Bose-Einstein condensation in a laser trap, and the thermal model. The thermal model and hydrodynamic model are also used to illustrate why the number of pions never diverges and why the Bose-Einstein correction effects are relatively small. The pion emission strength η of a Poisson emitter and a critical density η_c are connected in a thermal model by η/η_c = e^(−m/T) < 1, and this fact reduces any Bose-Einstein correction effects on the number and number fluctuation of pions. Fluctuations can be much larger than Poisson in the pion laser model and for a negative binomial description. The clan representation of the negative binomial distribution due to Van Hove and Giovannini is discussed using the present description. Applications to CERN/NA44 and CERN/NA49 data are discussed in terms of the relativistic hydrodynamic model. copyright 1998 The American Physical Society
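
    The clan representation mentioned above (a Poisson number of clans, each emitting a logarithmically distributed number of pions, yields negative-binomial totals) can be checked by simulation. The NB parameters below are arbitrary, not fitted to NA44/NA49 data.

```python
# Sample negative-binomial pion multiplicities via the clan construction and
# verify the super-Poissonian fluctuations (variance > mean).
import math, random

random.seed(2)

def poisson(lam):
    # Knuth's multiplicative method for Poisson sampling.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

k_nb, nbar = 3.0, 10.0                     # NB shape k and mean (arbitrary)
n_clans = k_nb * math.log(1 + nbar / k_nb) # mean number of clans
p = nbar / (nbar + k_nb)                   # log-series parameter
norm = -math.log(1 - p)

def sample_log_series():
    # Inversion sampling of P(n) = p**n / (n * norm), n >= 1.
    u, n, cum = random.random(), 1, 0.0
    while True:
        cum += p ** n / (n * norm)
        if u <= cum:
            return n
        n += 1

def sample_pions():
    return sum(sample_log_series() for _ in range(poisson(n_clans)))

draws = [sample_pions() for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(var > mean)  # True: NB variance nbar*(1 + nbar/k) exceeds the mean
```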

  11. Analytical models of probability distribution and excess noise factor of solid state photomultiplier signals with crosstalk

    International Nuclear Information System (INIS)

    Vinogradov, S.

    2012-01-01

    Silicon photomultipliers (SiPM), also called solid state photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown that is limited by a strong negative feedback. An SSPM can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their significance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired-pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
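
    The Borel distribution is easy to explore numerically: P(n) = e^(−μn)·(μn)^(n−1)/n! for n = 1, 2, …, with mean 1/(1−μ). The crosstalk parameter μ below is illustrative, not a measured value.

```python
# Borel pmf for the number of fired pixels per primary Geiger discharge,
# with mu the mean number of crosstalk offspring per avalanche (assumed).
import math

def borel_pmf(n, mu):
    return math.exp(-mu * n) * (mu * n) ** (n - 1) / math.factorial(n)

mu = 0.2
probs = [borel_pmf(n, mu) for n in range(1, 60)]          # tail is negligible
mean = sum(n * p for n, p in zip(range(1, 60), probs))

print(abs(sum(probs) - 1.0) < 1e-9)      # pmf sums to 1
print(abs(mean - 1 / (1 - mu)) < 1e-9)   # mean equals 1/(1 - mu) = 1.25
```

    In the paper's framework this pmf is convolved with the primary photoelectron statistics to obtain the full SSPM signal distribution and the ENF.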

  12. The probability distribution of maintenance cost of a system affected by the gamma process of degradation: Finite time solution

    International Nuclear Information System (INIS)

    Cheng, Tianjin; Pandey, Mahesh D.; Weide, J.A.M. van der

    2012-01-01

    The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known; in this context, the asymptotic cost rate has limited utility. This paper presents the derivation of the probability distribution of maintenance cost when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function; the discrete Fourier transform of the characteristic function then leads to the complete probability distribution of cost in a finite-time setting. The proposed approach is useful for precise estimation of prediction limits and optimization of the maintenance cost.
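
    As a Monte Carlo stand-in for the paper's Fourier-inversion solution, the finite-horizon cost distribution can also be tabulated by simulating the gamma degradation process directly. All thresholds, costs, and rates below are invented for illustration.

```python
# Simulate gamma-process degradation with periodic inspections, preventive
# and corrective replacements, and tabulate the finite-horizon cost.
import random

random.seed(3)

SHAPE_RATE, SCALE = 0.5, 2.0      # gamma increment parameters per unit time
FAIL, PM = 10.0, 8.0              # failure and preventive thresholds
C_INSP, C_PM, C_FAIL = 1.0, 10.0, 50.0
HORIZON, DT = 20.0, 1.0           # finite planning horizon, inspection interval

def one_history():
    cost, x, t = 0.0, 0.0, 0.0
    while t < HORIZON:
        t += DT
        x += random.gammavariate(SHAPE_RATE * DT, SCALE)  # degradation increment
        cost += C_INSP
        if x >= FAIL:
            cost += C_FAIL; x = 0.0   # corrective replacement
        elif x >= PM:
            cost += C_PM; x = 0.0     # preventive replacement
    return cost

costs = sorted(one_history() for _ in range(5000))
median, p95 = costs[2500], costs[int(0.95 * 5000)]
print(median <= p95)  # a prediction interval, not just an asymptotic mean rate
```

    The empirical quantiles (median, 95th percentile) illustrate the prediction-interval information that the asymptotic cost rate alone cannot provide.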

  13. Measuring sensitivity in pharmacoeconomic studies. Refining point sensitivity and range sensitivity by incorporating probability distributions.

    Science.gov (United States)

    Nuijten, M J

    1999-07-01

    The aim of the present study is to describe a refinement of a previously presented method, based on the concept of point sensitivity, to deal with uncertainty in economic studies. The original method was refined by the incorporation of probability distributions, which allow a more accurate assessment of the level of uncertainty in the model. In addition, a bootstrap method was used to create a probability distribution for a fixed input variable based on a limited number of data points. The original method was limited in that the sensitivity measurement was based on a uniform distribution of the variables and the overall sensitivity measure was based on a subjectively chosen range, which excludes the impact of values outside the range on the overall sensitivity. The concepts of the refined method were illustrated using a Markov model of depression. The application of the refined method substantially changed the ranking of the most sensitive variables compared with the original method: the response rate became the most sensitive variable instead of the 'per diem' for hospitalisation. The refinement of the original method yields sensitivity outcomes that better reflect the real uncertainty in economic studies.
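
    The bootstrap step (building a distribution for a fixed input variable from a handful of data points) looks like this in outline; the response-rate data below are hypothetical.

```python
# Resample a small observed data set with replacement to build an empirical
# distribution (and interval) for an otherwise fixed model input.
import random

random.seed(4)
response_rates = [0.52, 0.48, 0.61, 0.55, 0.47, 0.58]  # hypothetical observations

def bootstrap_means(data, n_boot=2000):
    means = []
    for _ in range(n_boot):
        resample = [random.choice(data) for _ in data]  # sample with replacement
        means.append(sum(resample) / len(resample))
    return sorted(means)

means = bootstrap_means(response_rates)
lo, hi = means[50], means[1949]  # rough 95% interval for the input variable
print(lo < sum(response_rates) / len(response_rates) < hi)
```

    The resulting interval can then drive the probabilistic sensitivity measure instead of a subjectively chosen range.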

  14. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions in which probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced; its minimization limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)
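
    The no-signalling condition the paper relies on is easy to verify for the textbook PR-box example; the representation below is ours, not the paper's.

```python
# Toy no-signalling check for a two-setting bipartite PR box:
# outcomes satisfy a XOR b = x AND y, each consistent pair with probability 1/2.
def pr_box(x, y, a, b):
    return 0.5 if (a ^ b) == (x & y) else 0.0

def marginal_a(x, y, a):
    # Alice's marginal for setting x, given Bob's setting y.
    return sum(pr_box(x, y, a, b) for b in (0, 1))

# No-signalling: Alice's marginal must not depend on Bob's setting y.
no_signalling = all(marginal_a(x, 0, a) == marginal_a(x, 1, a)
                    for x in (0, 1) for a in (0, 1))
print(no_signalling)
```

    The paper's result says exactly such boxes admit joint quasi-probability descriptions once negative weights are allowed; minimizing the total negative weight (the probability mass) then singles out preferred quasi-distributions.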

  15. The role of continuous low-frequency harmonicity cues for interrupted speech perception in bimodal hearing.

    Science.gov (United States)

    Oh, Soo Hee; Donaldson, Gail S; Kong, Ying-Yee

    2016-04-01

    Low-frequency acoustic cues have been shown to enhance speech perception by cochlear-implant users, particularly when target speech occurs in a competing background. The present study examined the extent to which a continuous representation of low-frequency harmonicity cues contributes to bimodal benefit in simulated bimodal listeners. Experiment 1 examined the benefit of restoring a continuous temporal envelope to the low-frequency ear while the vocoder ear received a temporally interrupted stimulus. Experiment 2 examined the effect of providing continuous harmonicity cues in the low-frequency ear as compared to restoring a continuous temporal envelope in the vocoder ear. Findings indicate that bimodal benefit for temporally interrupted speech increases when continuity is restored to either or both ears. The primary benefit appears to stem from the continuous temporal envelope in the low-frequency region providing additional phonetic cues related to manner and F1 frequency; a secondary contribution is provided by low-frequency harmonicity cues when a continuous representation of the temporal envelope is present in the low-frequency ear or in both ears. The continuous temporal envelope and harmonicity cues of low-frequency speech are thought to support bimodal benefit by facilitating identification of word and syllable boundaries, and by restoring partial phonetic cues that occur during gaps in the temporally interrupted stimulus.

  16. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Science.gov (United States)

    Chen, Rongda; Wang, Ze

    2013-01-01

    Recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of recovery rates may underestimate the risk. This study introduces two models of the distribution, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are common in daily usage, such as CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a fatal defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds, as Moody's new data show. In order to overcome this flaw, kernel density estimation is introduced, and we compare the simulation results of the histogram, Beta distribution estimation and kernel density estimation to reach the conclusion that the Gaussian kernel density distribution better imitates the distribution of bimodal or multimodal data samples of corporate loans and bonds. Finally, a Chi-square test of the Gaussian kernel density estimation proves that it can fit the curve of recovery rates of loans and bonds. Using the kernel density distribution to precisely delineate the bimodal recovery rates of bonds is therefore optimal in credit risk management.
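
    The central contrast (a rigid parametric Beta fit versus a bimodal-capable Gaussian KDE) can be demonstrated on synthetic data; the recovery-rate sample below is generated, not Moody's data.

```python
# Generate a bimodal recovery-rate sample and show that a Gaussian KDE
# evaluated on a grid reveals two interior modes.
import math, random

random.seed(5)

# Bimodal sample: a low-recovery and a high-recovery cluster.
rates = ([random.betavariate(2, 8) for _ in range(300)] +
         [random.betavariate(8, 2) for _ in range(300)])

def gauss_kde(x, data, h=0.05):
    return sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data) / (
        len(data) * h * math.sqrt(2 * math.pi))

# A single fitted Beta density has a rigid shape (unimodal or U-shaped) and
# cannot place two interior peaks; the KDE recovers both modes directly.
grid = [i / 100 for i in range(1, 100)]
dens = [gauss_kde(x, rates) for x in grid]
peaks = [i for i in range(1, len(dens) - 1)
         if dens[i] > dens[i - 1] and dens[i] > dens[i + 1]]
print(len(peaks) >= 2)  # the KDE sees the bimodal shape
```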

  17. A bimodal power and propulsion system based on cermet fuel and heat pipe energy transport

    International Nuclear Information System (INIS)

    Polansky, G.F.; Gunther, N.A.; Rochow, R.F.; Bixler, C.H.

    1995-01-01

    Bimodal space reactor systems provide both thermal propulsion for the spacecraft orbital transfer and electrical power to the spacecraft bus once it is on station. These systems have the potential to increase both the available payload in high energy orbits and the available power to that payload. These increased mass and power capabilities can be used to either reduce mission cost by permitting the use of smaller launch vehicles or to provide increased mission performance from the current launch vehicle. A major barrier to the deployment of these bimodal systems has been the cost associated with their development. This paper describes a bimodal reactor system with performance potential to permit more than 70% of the instrumented payload of the Titan IV/Centaur to be launched from the Atlas IIAS. The development cost is minimized by basing the design on existing component technologies

  18. Emotional Intensity Modulates the Integration of Bimodal Angry Expressions: ERP Evidence

    Directory of Open Access Journals (Sweden)

    Zhihui Pan

    2017-06-01

    Integration of information from face and voice plays a central role in social interactions. The present study investigated the modulation of emotional intensity on the integration of facial-vocal emotional cues by recording EEG while participants performed an emotion identification task on facial, vocal, and bimodal angry expressions varying in emotional intensity. Behavioral results showed that anger identification rates and reaction speed increased with emotional intensity across modalities. Critically, the P2 amplitudes were larger for bimodal expressions than for the sum of facial and vocal expressions for low emotional intensity stimuli, but not for middle and high emotional intensity stimuli. These findings suggest that emotional intensity modulates the integration of facial-vocal angry expressions, following the principle of inverse effectiveness (IE) in multimodal sensory integration.

  19. Matching Automatic Gain Control Across Devices in Bimodal Cochlear Implant Users.

    Science.gov (United States)

    Veugen, Lidwien C E; Chalupper, Josef; Snik, Ad F M; Opstal, A John van; Mens, Lucas H M

    2016-01-01

    The purpose of this study was to improve bimodal benefit in listeners using a cochlear implant (CI) and a hearing aid (HA) in contralateral ears, by matching the time constants and the number of compression channels of the automatic gain control (AGC) of the HA to the CI. Equivalent AGC was hypothesized to support a balanced loudness for dynamically changing signals like speech and to improve bimodal benefit for speech understanding in quiet and with noise presented from the side(s) at 90 degrees. Fifteen subjects participated in the study, all using the same Advanced Bionics Harmony CI processor and HA (Phonak Naida S IX UP). In a 3-visit crossover design with 4 weeks between sessions, performance was measured using a HA with a standard AGC (syllabic multichannel compression with 1 ms attack time and 50 ms release time) or an AGC that was adjusted to match that of the CI processor (dual AGC broadband compression, 3 and 240 msec attack time, 80 and 1500 msec release time). In all devices, the AGC was activated above a threshold of 63 dB SPL. The authors balanced loudness across the devices for soft and loud input sounds in 3 frequency bands (0 to 548, 548 to 1000, and >1000 Hz). Speech understanding was tested in free field in quiet and in noise for three spatial speaker configurations, with target speech always presented from the front. Single-talker noise was presented from either the CI side or the HA side, or uncorrelated stationary speech-weighted noise or single-talker noise was presented from both sides. Questionnaires were administered to assess differences in perception between the two bimodal fittings. Significant bimodal benefit over the CI alone was found only for the AGC-matched HA in the speech tests with single-talker noise. Compared with the standard HA, matched AGC characteristics significantly improved speech understanding in single-talker noise by 1.9 dB when noise was presented from the HA side. AGC matching increased bimodal benefit.

  20. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    Science.gov (United States)

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.
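    As a toy illustration of this Monte Carlo approach (not the authors' exact inter-spike interval model), one can compare coincidence counts between independent Poisson spike trains and more regular gamma-renewal trains of the same mean rate; all rates, window widths and trial counts below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def spike_train(n_spikes, shape, rate, rng):
    """Renewal spike train from cumulative gamma inter-spike intervals.
    shape=1 gives a Poisson process; shape>1 gives a more regular,
    refractory-like process with the same mean rate (in Hz)."""
    isis = rng.gamma(shape, 1.0 / (shape * rate), n_spikes)
    return np.cumsum(isis)

def coincidences(t1, t2, window):
    """Count spikes of t1 that have a spike of t2 within +/- window seconds."""
    idx = np.searchsorted(t2, t1)
    left = np.abs(t1 - t2[np.clip(idx - 1, 0, len(t2) - 1)]) <= window
    right = np.abs(t1 - t2[np.clip(idx, 0, len(t2) - 1)]) <= window
    return int(np.sum(left | right))

def coincidence_counts(shape, n_trials=200):
    """Monte Carlo sample of joint-spike-event counts for one autostructure."""
    return np.array([
        coincidences(spike_train(300, shape, 30.0, rng),
                     spike_train(300, shape, 30.0, rng), window=0.002)
        for _ in range(n_trials)
    ])

poisson_counts = coincidence_counts(shape=1.0)
regular_counts = coincidence_counts(shape=4.0)
# Compare the widths of the two coincidence-count distributions.
print(poisson_counts.std(), regular_counts.std())
```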

  1. Evaluating the suitability of wind speed probability distribution models: A case of study of east and southeast parts of Iran

    International Nuclear Information System (INIS)

    Alavi, Omid; Mohammadi, Kasra; Mostafaeipour, Ali

    2016-01-01

    Highlights: • Suitability of different wind speed probability functions is assessed. • 5 stations distributed in the east and south-east of Iran are considered as case studies. • The Nakagami distribution is tested for the first time and compared with 7 other functions. • Because wind characteristics differ, the best function is not the same for all stations. - Abstract: Precise information on the wind speed probability distribution is truly significant for many wind energy applications. The objective of this study is to evaluate the suitability of different probability functions for estimating the wind speed distribution at five stations distributed in the east and southeast of Iran. The Nakagami distribution function is utilized for the first time to estimate the distribution of wind speed. The performance of the Nakagami function is compared with seven typically used distribution functions. The results reveal that the most effective function is not the same for all stations. Wind speed characteristics and the quantity and quality of the recorded wind speed data can be considered influential parameters for the performance of the distribution functions. Also, the skewness of the recorded wind speed data may influence the accuracy of the Nakagami distribution. For the Chabahar and Khaf stations the Nakagami distribution shows the highest performance, while for the Lutak, Rafsanjan and Zabol stations the Gamma, Generalized Extreme Value and Inverse-Gaussian distributions offer the best fits, respectively. Based on the analysis, the Nakagami distribution can generally be considered an effective distribution since it provides the best fits in 2 stations and ranks 3rd to 5th in the remaining stations; however, given the close performance of the Nakagami and Weibull distributions and the flexibility of the Weibull function as its widely proven feature, more assessments of the performance of the Nakagami distribution are required.
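    A minimal version of such a comparison is easy to set up with `scipy.stats`. In the sketch below a synthetic Weibull sample stands in for one station's wind record, and only three of the candidate families from the study are tried; the ranking criteria (log-likelihood and the Kolmogorov-Smirnov statistic) are illustrative choices, not the paper's exact metrics.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic hourly wind speeds (m/s) standing in for one station's record.
wind = stats.weibull_min.rvs(1.8, scale=6.0, size=2000, random_state=rng)

candidates = {
    "weibull": stats.weibull_min,
    "nakagami": stats.nakagami,
    "gamma": stats.gamma,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(wind, floc=0)          # speeds are non-negative
    loglik = np.sum(dist.logpdf(wind, *params))
    ks = stats.kstest(wind, dist.name, args=params).statistic
    results[name] = (loglik, ks)

# Rank the candidates by log-likelihood (higher is better).
best = max(results, key=lambda n: results[n][0])
print(best, results[best])
```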

  2. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
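    The inflation of failure frequency by parameter error is easy to reproduce in simulation. The sketch below uses a hypothetical log-normal risk factor and a plug-in 99% threshold estimated from n = 50 observations; the specific numbers are illustrative and are not taken from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
p_nominal = 0.01          # target failure probability
n = 50                    # sample available for estimating the distribution
z = stats.norm.ppf(1 - p_nominal)

trials = 30000
failures = 0
for _ in range(trials):
    sample = rng.lognormal(0.0, 1.0, n)
    mu_hat = np.log(sample).mean()
    sig_hat = np.log(sample).std(ddof=1)
    threshold = np.exp(mu_hat + z * sig_hat)         # plug-in 99% threshold
    failures += rng.lognormal(0.0, 1.0) > threshold  # next period's loss

# With estimated parameters, the realized frequency typically exceeds the
# nominal 1%, which is the paper's point.
print(failures / trials)
```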

  3. New method for extracting tumors in PET/CT images based on the probability distribution

    International Nuclear Information System (INIS)

    Nitta, Shuhei; Hontani, Hidekata; Hukami, Tadanori

    2006-01-01

    In this report, we propose a method for extracting tumors from PET/CT images by referring to the probability distribution of pixel values in the PET image. In the proposed method, first, the organs that normally take up fluorodeoxyglucose (FDG) (e.g., the liver, kidneys, and brain) are extracted. Then, the tumors are extracted from the images. The distribution of pixel values in PET images differs in each region of the body. Therefore, the threshold for detecting tumors is adaptively determined by referring to the distribution. We applied the proposed method to 37 cases and evaluated its performance. This report also presents the results of experiments comparing the proposed method and another method in which the pixel values are normalized for extracting tumors. (author)

  4. A comparison of the probability distribution of observed substorm magnitude with that predicted by a minimal substorm model

    Directory of Open Access Journals (Sweden)

    S. K. Morley

    2007-11-01

    We compare the probability distributions of substorm magnetic bay magnitudes from observations and a minimal substorm model. The observed distribution was derived previously and independently using the IL index from the IMAGE magnetometer network. The model distribution is derived from a synthetic AL index time series created using real solar wind data and a minimal substorm model, which was previously shown to reproduce observed substorm waiting times. There are two free parameters in the model which scale the contributions to AL from the directly-driven DP2 electrojet and the loading-unloading DP1 electrojet, respectively. In a limited region of the 2-D parameter space of the model, the probability distribution of modelled substorm bay magnitudes is not significantly different from the observed distribution. The ranges of the two parameters giving acceptable (95% confidence level) agreement are consistent with expectations using results from other studies. The approximately linear relationship between the two free parameters over these ranges implies that the substorm magnitude simply scales linearly with the solar wind power input at the time of substorm onset.

  5. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham

    2017-04-07

    This letter studies the meta distribution of the coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP) for modeling the spatial locations of base stations (BSs), we obtain the percentiles of users that achieve a target uplink CP over an arbitrary, but fixed, realization of the PPP. To this end, the effects of the user activity factor (p) and the path-loss compensation factor on the uplink performance are analyzed. The results show that decreasing p and/or increasing the path-loss compensation factor reduces the CP variation around the spatially averaged value.

  6. Conditional probability distribution associated to the E-M image reconstruction algorithm for neutron stimulated emission tomography

    International Nuclear Information System (INIS)

    Viana, R.S.; Yoriyaz, H.; Santos, A.

    2011-01-01

    The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimates, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the process of E-M algorithm iteration, the conditional probability distribution plays a very important role to achieve high quality image. This present work proposes an alternative methodology for the generation of the conditional probability distribution associated to the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and with the application of the reciprocity theorem. (author)

  7. Conditional probability distribution associated to the E-M image reconstruction algorithm for neutron stimulated emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Viana, R.S.; Yoriyaz, H.; Santos, A., E-mail: rodrigossviana@gmail.com, E-mail: hyoriyaz@ipen.br, E-mail: asantos@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimates, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the process of E-M algorithm iteration, the conditional probability distribution plays a very important role to achieve high quality image. This present work proposes an alternative methodology for the generation of the conditional probability distribution associated to the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and with the application of the reciprocity theorem. (author)

  8. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    Science.gov (United States)

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
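    For the simplest case covered by such a formula, a single-interval yes/no task with equal priors, the maximum Pc of a maximum-likelihood observer over discrete pmfs is 0.5 Σ_x max[p_A(x), p_B(x)]. The sketch below is not the article's MATLAB code; it simply checks this discrete formula against the classical Gaussian result Pc = Φ(d′/2).

```python
import numpy as np
from scipy import stats

def max_pc_yes_no(p_a, p_b):
    """Maximum proportion correct in a yes/no task with equal priors:
    the ideal observer answers with whichever stimulus makes the
    observation more likely.  p_a and p_b are pmfs on a common grid."""
    p_a = np.asarray(p_a, dtype=float)
    p_b = np.asarray(p_b, dtype=float)
    return 0.5 * np.sum(np.maximum(p_a, p_b))

# Discretize two equal-variance Gaussians separated by d' = 1.
x = np.linspace(-6.0, 8.0, 2001)
pa = stats.norm.pdf(x, 0.0, 1.0); pa /= pa.sum()
pb = stats.norm.pdf(x, 1.0, 1.0); pb /= pb.sum()

print(max_pc_yes_no(pa, pb))  # ~0.6915, i.e. Phi(d'/2) = Phi(0.5)
```

With a sufficiently fine grid the discrete sum reproduces the continuous result, mirroring the article's point about sampling resolution.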

  9. The probability distribution of extreme precipitation

    Science.gov (United States)

    Korolev, V. Yu.; Gorshenin, A. K.

    2017-12-01

    On the basis of the negative binomial distribution of the duration (in days) of wet periods, an asymptotic model is proposed for the distribution of the maximum daily rainfall volume during a wet period, having the form of a mixture of Frechet distributions and coinciding with the distribution of a positive power of a random variable having the Fisher-Snedecor distribution. The proof of the corresponding result is based on limit theorems for extreme order statistics in samples of random size with a mixed Poisson distribution. The adequacy of the proposed models and methods of their statistical analysis is demonstrated by the example of estimating the extreme-value distribution parameters from real data.

  10. Snow-melt flood frequency analysis by means of copula based 2D probability distributions for the Narew River in Poland

    Directory of Open Access Journals (Sweden)

    Bogdan Ozga-Zielinski

    2016-06-01

    New hydrological insights for the region: The results indicated that the 2D normal probability distribution model gives a better probabilistic description of snowmelt floods, characterized by the 2-dimensional random variable (Qmax,f, Vf), than the elliptical Gaussian copula and Archimedean 1-parameter Gumbel–Hougaard copula models, in particular from the viewpoint of probability of exceedance as well as complexity and time of computation. Nevertheless, the copula approach offers a new perspective for estimating the 2D probability distribution of multidimensional random variables. Results showed that the 2D model for snowmelt floods built using the Gumbel–Hougaard copula is much better than the model built using the Gaussian copula.

  11. Uncertainty of Hydrological Drought Characteristics with Copula Functions and Probability Distributions: A Case Study of Weihe River, China

    Directory of Open Access Journals (Sweden)

    Panpan Zhao

    2017-05-01

    This study investigates the sensitivity and uncertainty of hydrological drought frequency and severity in the Weihe Basin, China during 1960–2012, by using six commonly used univariate probability distributions and three Archimedean copulas to fit the marginal and joint distributions of drought characteristics. The Anderson-Darling method is used for testing the goodness-of-fit of the univariate models, and the Akaike information criterion (AIC) is applied to select the best distribution and copula functions. The results demonstrate that there is a very strong correlation between drought duration and drought severity at the three stations. The drought return period varies depending on the selected marginal distributions and copula functions and, as the return period increases, the differences become larger. In addition, the estimated return periods (both co-occurrence and joint) from the best-fitted copulas are the closest to those from the empirical distribution. Therefore, it is critical to select the appropriate marginal distribution and copula function to model hydrological drought frequency and severity. The results of this study can not only help drought investigations select a suitable probability distribution and copula function, but are also useful for regional water resource management. However, a few limitations remain in this study, such as the assumption of stationarity of the runoff series.
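    For an Archimedean copula such as the Gumbel-Hougaard used above, the dependence parameter has a closed-form moment-style estimator via Kendall's tau: θ = 1/(1 − τ). A minimal sketch with synthetic (duration, severity) pairs (not the Weihe data; the dependence is built in by construction):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic drought (duration, severity) pairs with strong positive dependence.
duration = rng.gamma(2.0, 10.0, 300)
severity = duration * rng.lognormal(0.0, 0.3, 300)  # dependent by construction

tau, _ = stats.kendalltau(duration, severity)

# Gumbel-Hougaard copula: tau = 1 - 1/theta  =>  theta = 1/(1 - tau),
# valid for tau in [0, 1); theta = 1 corresponds to independence.
theta = 1.0 / (1.0 - tau)
print(tau, theta)
```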

  12. The monomer-to-dimer transition and bimodal growth of Co-salen on NaCl(001): a high resolution atomic force microscopy study

    International Nuclear Information System (INIS)

    Fremy, S; Schwarz, A; Laemmle, K; Wiesendanger, R; Prosenc, M

    2009-01-01

    Molecules of Co-salen, a paramagnetic metal-organic Schiff base complex, self-assemble into two different well ordered morphologies on a NaCl(001) substrate: nanowires, which form networks, and compact nanocrystallites. Their growth can be controlled by adjusting the deposition parameters. It turns out that the nanowires are metastable. Molecular resolution images suggest that the packing in both morphologies is the same as in bulk Co-salen single crystals. Only the orientation of the c-axis with respect to the substrate is different. The origin of this intriguing bimodal growth is associated with a monomer-to-dimer transition, which probably takes place during initial nucleation at step edges.

  13. The monomer-to-dimer transition and bimodal growth of Co-salen on NaCl(001): a high resolution atomic force microscopy study

    Energy Technology Data Exchange (ETDEWEB)

    Fremy, S; Schwarz, A; Laemmle, K; Wiesendanger, R [Institute of Applied Physics and Microstructure Research Center, University of Hamburg, Jungiusstrasse 11, 20355 Hamburg (Germany); Prosenc, M, E-mail: aschwarz@physnet.uni-hamburg.d [Institute of Inorganic and Applied Chemistry, University of Hamburg, Martin-Luther-King Platz 6, 20146 Hamburg (Germany)

    2009-10-07

    Molecules of Co-salen, a paramagnetic metal-organic Schiff base complex, self-assemble into two different well ordered morphologies on a NaCl(001) substrate: nanowires, which form networks, and compact nanocrystallites. Their growth can be controlled by adjusting the deposition parameters. It turns out that the nanowires are metastable. Molecular resolution images suggest that the packing in both morphologies is the same as in bulk Co-salen single crystals. Only the orientation of the c-axis with respect to the substrate is different. The origin of this intriguing bimodal growth is associated with a monomer-to-dimer transition, which probably takes place during initial nucleation at step edges.

  14. Probability distribution of dose rates in the body tissue as a function of the rhythm of Sr-90 administration and the age of animals

    International Nuclear Information System (INIS)

    Rasin, I.M.; Sarapul'tsev, I.A.

    1975-01-01

    The probability distribution of tissue radiation doses in the skeleton was studied in experiments on swine and dogs. When Sr-90 was administered from the day of birth until 90 days of age, the dose-rate probability distribution is characterized by one aggregate or, for adult animals, by two independent aggregates. Each of these aggregates corresponds to the normal distribution law.

  15. Bimodality and the formation of Saturn's ring particles

    International Nuclear Information System (INIS)

    Gehrels, T.

    1980-01-01

    The F ring appears to have an outer and an inner rim, with only the latter observed by the imaging photopolarimeter (IPP) on the Pioneer Saturn spacecraft. The inside of the G ring, near 2.49 R_S, may also be seen in the optical data. 1979S1 is red as well as dark. The light scattered through the B ring is noticeably red. The A ring has a dense outer rim. The Cassini Division and the French Division (Dollfus Division) have a dark gap near their centers. The C ring becomes weaker toward the center, such that outer, middle, and inner C rings can be recognized. The Pioneer and earth-based observations are explained with a model of the B and A rings based, to some extent, on a bimodal size distribution of particles; the larger ones may be original accretions, while small debris diffuses inward through the Cassini Division and the C ring. During the formation of the ring system, differential gravitation allowed only silicaceous grains of higher density (ρ ≳ 3 g cm⁻³) to coagulate. These serve as interstitial cores for snowy carbonaceous grains, between the times of accretion from interplanetary cometary grains and liberation by collision, followed by diffusion inward to Saturn and final evaporation.

  16. Single-crystal 40Ar/39Ar incremental heating reveals bimodal sanidine ages in the Bishop Tuff

    Science.gov (United States)

    Andersen, N. L.; Jicha, B. R.; Singer, B. S.

    2015-12-01

    The 650 km3 Bishop Tuff (BT) is among the most studied volcanic deposits because it is an extensive marker bed deposited just after the Matuyama-Brunhes boundary. Reconstructions of the vast BT magma reservoir from which high-silica rhyolite erupted have long influenced thinking about how large silicic magma systems are assembled, crystallized, and mixed. Yet, the longevity of the high-silica rhyolitic melt and the exact timing of the eruption remain controversial due to recent conflicting 40Ar/39Ar sanidine vs. SIMS and ID-TIMS U-Pb zircon dates. We have undertaken 21 40Ar/39Ar incremental heating experiments on 2 mm BT sanidine crystals from pumice in 3 widely separated outcrops of early-erupted fall and flow units. Plateau ages yield a bimodal distribution: a younger group has a mean of 766 ka and an older group gives a range between 772 and 782 ka. The younger population is concordant with the youngest ID-TIMS and SIMS U-Pb zircon ages recently published, as well as the astronomical age of the BT in marine sediment. Of the 21 crystals, 17 yield older, non-plateau steps likely affected by excess Ar that would bias traditional 40Ar/39Ar total crystal fusion ages. The small spread in older sanidine ages, together with 25+ kyr of pre-eruptive zircon growth, suggests that the older sanidines are not partially outgassed xenocrysts. A bimodal 40Ar/39Ar age distribution implies that some fraction of rhyolitic melt cooled below the Ar closure temperature at least 10 ky prior to eruption. We propose that rapid "thawing" of a crystalline mush layer released older crystals into rhyolitic melt from which sanidine also nucleated and grew immediately prior to the eruption. High precision 40Ar/39Ar dating can thus provide essential information on thermo-physical processes at the millennial time scale that are critical to interpreting U-Pb zircon age distributions that are complicated by large uncertainties associated with zircon-melt U-Th systematics.

  17. pH and its frequency distribution patterns of Acid Precipitation in Japan

    International Nuclear Information System (INIS)

    Kitamura, Moritsugu; Katou, Takunori; Sekiguchi, Kyoichi

    1991-01-01

    The pH data collected at 29 stations in the Phase-I study of the Acid Precipitation Survey over Japan by the Japan Environment Agency were examined in terms of frequency distribution patterns. This study, undertaken from April 1984 to March 1988, was the first survey of acid precipitation over Japan with identical sampling procedures and subsequent chemical analyses. While the annual mean pH at each station ranged from 4.4 to 5.5, the monthly mean varied more widely, from 4.0 to 7.1. A frequency distribution pattern was obtained for each station and grouped into four classes: class I, a mode at the rank of pH 4.5∼4.9; class II, bimodal with modes above and below this pH region; class III, a mode at a higher pH region; class IV, a mode at a lower pH region. The bimodal pattern was suggestive of precipitation with and without incorporation of significant amounts of basic aerosol of anthropogenic origin during the descent of rain droplets. The patterns of the stations were also classified, on the basis of the summer-winter difference, into another four classes. Winter pH values were appreciably lower than summer values in western parts of Japan and on the Japan Sea coast; we attribute the lower winter pH to the probable contribution of acidic pollutants transported by the strong winter monsoon from the Eurasian Continent. At most stations in northern and eastern Japan, the pH was higher in winter months, reflecting greater incorporation of basic materials, e.g., NH4+ and Ca2+. (author)

  18. Bimodal Networks as Candidates for Electroactive Polymers

    DEFF Research Database (Denmark)

    Bahrt, Frederikke; Daugaard, Anders Egede; Bejenariu, Anca Gabriela

    An alternative network formulation method was adopted in order to obtain a different type of silicone based elastomeric systems - the so-called bimodal networks - using two vinyl-terminated polydimethyl siloxanes (PDMS) of different molecular weight, a labelled crosslinker (3 or 4-functional), an...... themselves between the long chains and show how this leads to unexpectedly good properties for DEAP purposes due both to the low extensibility of the short chains that attach strongly the long chains and to the extensibility of the last ones that retards the rupture process....

  19. Assessing the Adequacy of Probability Distributions for Estimating the Extreme Events of Air Temperature in Dabaa Region

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2015-01-01

    Assessing the adequacy of probability distributions for estimating the extreme events of air temperature in the Dabaa region is one of the prerequisites for any design purpose at the Dabaa site, and can be achieved by a probability approach. In the present study, three extreme value distributions are considered and compared to estimate the extreme events of monthly and annual maximum and minimum temperature. These include the Gumbel/Frechet distributions for estimating the extreme maximum values and the Gumbel/Weibull distributions for estimating the extreme minimum values. The Lieblein technique and the Method of Moments are applied for estimating the distribution parameters. Subsequently, the required design values with a given return period of exceedance are obtained. Goodness-of-fit tests involving Kolmogorov-Smirnov and Anderson-Darling are used for checking the adequacy of fitting the method/distribution for the estimation of maximum/minimum temperature. Mean Absolute Relative Deviation, Root Mean Square Error and Relative Mean Square Deviation are calculated as performance indicators to judge which distribution and method of parameter estimation are the most appropriate for estimating the extreme temperatures. The present study indicated that the Weibull distribution combined with Method of Moments estimators gives the best fit and the most reliable, accurate predictions for estimating the extreme monthly and annual minimum temperature. The Gumbel distribution combined with Method of Moments estimators showed the best fit and accurate predictions for the estimation of the extreme monthly and annual maximum temperature, except for July, August, October and November. The study shows that the combination of the Frechet distribution with the Method of Moments is the most accurate for estimating the extreme maximum temperature in July, August and November, while the Gumbel distribution with the Lieblein technique is the best for October.
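    The Method of Moments step for the Gumbel distribution mentioned above has closed-form estimators: scale β̂ = s√6/π and location μ̂ = x̄ − γβ̂, with γ the Euler-Mascheroni constant. The sketch below applies them to a synthetic series of annual maxima; the numbers stand in for station records and are not from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic annual maximum temperatures (deg C) standing in for station records.
ann_max = stats.gumbel_r.rvs(loc=38.0, scale=2.0, size=60, random_state=rng)

# Method-of-moments estimators for the Gumbel (extreme value type I) distribution.
beta_hat = ann_max.std(ddof=1) * np.sqrt(6.0) / np.pi
mu_hat = ann_max.mean() - np.euler_gamma * beta_hat

# Design value for a T-year return period: the (1 - 1/T) quantile,
# x_T = mu - beta * ln(-ln(1 - 1/T)).
T = 100
design = mu_hat - beta_hat * np.log(-np.log(1.0 - 1.0 / T))
print(mu_hat, beta_hat, design)
```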

  20. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Directory of Open Access Journals (Sweden)

    Rongda Chen

    Recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of the recovery rate may underestimate the risk. This study introduces two models of the distribution, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are in common use, for example in CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a serious defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds in Moody's new data. To overcome this flaw, kernel density estimation is introduced, and we compare the simulation results of the histogram, Beta distribution estimation and kernel density estimation, concluding that the Gaussian kernel density estimate reproduces the bimodal or multimodal distribution of corporate loan and bond recovery rates more faithfully. Finally, a Chi-square test of the Gaussian kernel density estimate confirms that it fits the curve of recovery rates of loans and bonds. Using the kernel density estimate to delineate the bimodal recovery rates of bonds precisely is therefore optimal in credit risk management.

  1. Curve Fitting of the Corporate Recovery Rates: The Comparison of Beta Distribution Estimation and Kernel Density Estimation

    Science.gov (United States)

    Chen, Rongda; Wang, Ze

    2013-01-01

    Recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of the recovery rate may underestimate the risk. This study introduces two models of the distribution, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are in common use, for example in CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a serious defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds in Moody's new data. To overcome this flaw, kernel density estimation is introduced, and we compare the simulation results of the histogram, Beta distribution estimation and kernel density estimation, concluding that the Gaussian kernel density estimate reproduces the bimodal or multimodal distribution of corporate loan and bond recovery rates more faithfully. Finally, a Chi-square test of the Gaussian kernel density estimate confirms that it fits the curve of recovery rates of loans and bonds. Using the kernel density estimate to delineate the bimodal recovery rates of bonds precisely is therefore optimal in credit risk management. PMID:23874558
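
A minimal numeric sketch of the comparison made in these two records: a single Beta distribution fitted by moments cannot place two interior modes, while a Gaussian kernel density estimate resolves both. The synthetic recovery rates and all parameter choices below are illustrative, not Moody's data.

```python
import numpy as np

def beta_mom(x):
    """Method-of-Moments Beta(a, b) fit for data on [0, 1]."""
    m, v = x.mean(), x.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

def gaussian_kde(x, grid):
    """Plain Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth."""
    h = 1.06 * x.std(ddof=1) * len(x) ** (-0.2)
    k = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return k.sum(axis=1) / (len(x) * h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(1)
# synthetic bimodal recovery rates: one cluster near 20%, one near 80%
rates = np.concatenate([rng.beta(8, 32, 500), rng.beta(32, 8, 500)])

a, b = beta_mom(rates)                 # single-Beta fit: no two interior modes possible
grid = np.linspace(0.01, 0.99, 99)
dens = gaussian_kde(rates, grid)
n_modes = int(np.sum((dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])))
```

On these data the moment-fitted Beta degenerates to a U-shape (a, b < 1), placing its mass at the endpoints, while the KDE recovers the two interior modes.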

  2. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    International Nuclear Information System (INIS)

    Caleyo, F.; Velazquez, J.C.; Valor, A.; Hallen, J.M.

    2009-01-01

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  3. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)

    2009-09-15

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.
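
The Monte Carlo approach of this record can be sketched as follows; the power-law pit growth form d = k·t^ν and the parameter spreads are illustrative assumptions, not the authors' soil-calibrated model.

```python
import numpy as np

rng = np.random.default_rng(2)

def pit_depths(age_yr, n=100_000):
    """Monte Carlo sample of pit depths d = k * t^nu at a given pipeline age.

    The growth form and the parameter spreads below are illustrative, not the
    soil-specific model fitted by the authors.
    """
    k = rng.lognormal(mean=np.log(0.15), sigma=0.3, size=n)   # proportionality factor, mm/yr^nu
    nu = rng.normal(0.55, 0.05, size=n)                       # growth exponent (< 1: decelerating)
    return k * age_yr ** nu

young, old = pit_depths(5), pit_depths(40)
rate_young, rate_old = young / 5.0, old / 40.0   # average growth rates, mm/yr
deep_tail = np.quantile(old, 0.99)               # deep-pit tail used in reliability analysis
```

Fitting Weibull, Frechet and Gumbel candidates to such samples at different ages, as the authors do, then shows which extreme value family best describes the depth and rate tails.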

  4. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
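
The maximum entropy assignment described above can be made concrete: given discrete outcomes with effective energies and a mean-energy constraint, the maximum-entropy probabilities take the Gibbs form p_i ∝ exp(-β·E_i), which is how the Maxwell-Boltzmann distribution mentioned in the abstract arises. A minimal numpy sketch (the energy levels are invented for illustration):

```python
import numpy as np

def maxent_probs(energies, mean_energy, tol=1e-10):
    """Maximum-entropy probabilities over discrete outcomes under the
    constraints sum(p) = 1 and sum(p * E) = mean_energy.  The solution has
    the Gibbs form p_i ~ exp(-beta * E_i); beta is located by bisection."""
    E = np.asarray(energies, dtype=float)

    def mean_at(beta):
        logw = -beta * E
        w = np.exp(logw - logw.max())     # shift exponents for numerical stability
        return (w * E).sum() / w.sum()

    lo, hi = -50.0, 50.0                  # mean_at is strictly decreasing in beta
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > mean_energy:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    logw = -beta * E
    w = np.exp(logw - logw.max())
    return w / w.sum(), beta

energies = np.array([0.0, 1.0, 2.0, 3.0])
p, beta = maxent_probs(energies, mean_energy=0.5)
```

A mean below the uniform average forces β > 0, concentrating probability on the low-energy outcomes.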

  5. Matrix-exponential distributions in applied probability

    CERN Document Server

    Bladt, Mogens

    2017-01-01

    This book contains an in-depth treatment of matrix-exponential (ME) distributions and their sub-class of phase-type (PH) distributions. Loosely speaking, an ME distribution is obtained through replacing the intensity parameter in an exponential distribution by a matrix. The ME distributions can also be identified as the class of non-negative distributions with rational Laplace transforms. If the matrix has the structure of a sub-intensity matrix for a Markov jump process we obtain a PH distribution which allows for nice probabilistic interpretations facilitating the derivation of exact solutions and closed form formulas. The full potential of ME and PH unfolds in their use in stochastic modelling. Several chapters on generic applications, like renewal theory, random walks and regenerative processes, are included together with some specific examples from queueing theory and insurance risk. We emphasize our intention towards applications by including an extensive treatment on statistical methods for PH distribu...
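
The probabilistic interpretation of PH distributions mentioned above has a compact computational form: the density is f(x) = α·exp(Tx)·t with exit-rate vector t = -T·1. A sketch, checked against the closed-form Erlang-2 density (an Erlang-2 is exactly a PH distribution with two sequential exponential phases):

```python
import numpy as np
from scipy.linalg import expm

def ph_pdf(x, alpha, T):
    """Density of a phase-type (PH) distribution with initial vector alpha and
    sub-intensity matrix T: f(x) = alpha @ expm(T*x) @ t, where t = -T @ 1 is
    the exit-rate vector."""
    t = -T.sum(axis=1)
    return float(alpha @ expm(T * x) @ t)

# Erlang-2 with rate lam, written as a 2-phase PH distribution
lam = 3.0
alpha = np.array([1.0, 0.0])
T = np.array([[-lam,  lam],
              [ 0.0, -lam]])

x = 0.7
f_ph = ph_pdf(x, alpha, T)
f_erlang = lam ** 2 * x * np.exp(-lam * x)   # closed-form Erlang-2 density
```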

  6. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    S. Kuzio

    2004-01-01

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the "Saturated Zone Flow and Transport Model Abstraction" (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  7. Controllability of a Class of Bimodal Discrete-Time Piecewise Linear Systems

    NARCIS (Netherlands)

    Yurtseven, E.; Camlibel, M.K.; Heemels, W.P.M.H.

    2013-01-01

    In this paper we will provide algebraic necessary and sufficient conditions for the controllability/reachability/null controllability of a class of bimodal discrete-time piecewise linear systems including several instances of interest that are not covered by existing works which focus primarily on

  8. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))

  9. First-order phase transition in the quantum spin glass at T=0

    Energy Technology Data Exchange (ETDEWEB)

    Viana, J. Roberto; Nogueira, Yamilles; Sousa, J. Ricardo de

    2003-05-26

    The van Hemmen model with transverse and random longitudinal fields is studied to analyze the tricritical behavior of the quantum Ising spin glass at T=0. The free energy and order parameter are calculated for two types of probability distributions: Gaussian and bimodal. We obtain the phase diagram in the Ω-H plane, where Ω and H are the transverse and random longitudinal fields, respectively. For the Gaussian distribution the phase transition is of second order, while for the bimodal distribution we observe a second-order transition for high transverse field and a first-order transition for small transverse field, with a tricritical point in the phase diagram.

  10. First-order phase transition in the quantum spin glass at T=0

    International Nuclear Information System (INIS)

    Viana, J. Roberto; Nogueira, Yamilles; Sousa, J. Ricardo de

    2003-01-01

    The van Hemmen model with transverse and random longitudinal fields is studied to analyze the tricritical behavior of the quantum Ising spin glass at T=0. The free energy and order parameter are calculated for two types of probability distributions: Gaussian and bimodal. We obtain the phase diagram in the Ω-H plane, where Ω and H are the transverse and random longitudinal fields, respectively. For the Gaussian distribution the phase transition is of second order, while for the bimodal distribution we observe a second-order transition for high transverse field and a first-order transition for small transverse field, with a tricritical point in the phase diagram

  11. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Stations are often the limiting capacity factor in a railway network. This induces interdependencies, especially at at-grade junctions, causing network effects. This paper presents three traditional methods that can be used to measure the complexity of a station, indicating the robustness...... of the station’s infrastructure layout and plan of operation. However, these three methods do not take the timetable at the station into consideration. Therefore, two methods are introduced in this paper, making it possible to estimate the robustness of different timetables at a station or different...... infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time...

  12. Description of atomic burials in compact globular proteins by Fermi-Dirac probability distributions.

    Science.gov (United States)

    Gomes, Antonio L C; de Rezende, Júlia R; Pereira de Araújo, Antônio F; Shakhnovich, Eugene I

    2007-02-01

    We perform a statistical analysis of atomic distributions as a function of the distance R from the molecular geometrical center in a nonredundant set of compact globular proteins. The number of atoms increases quadratically for small R, indicating a constant average density inside the core, reaches a maximum at a size-dependent distance R_max, and falls rapidly for larger R. The empirical curves turn out to be consistent with the volume increase of spherical concentric solid shells and a Fermi-Dirac distribution in which the distance R plays the role of an effective atomic energy ε(R) = R. The effective chemical potential μ governing the distribution increases with the number of residues, reflecting the size of the protein globule, while the temperature parameter β decreases. Interestingly, βμ is not as strongly dependent on protein size and appears to be tuned to maintain approximately half of the atoms in the high-density interior and the other half in the exterior region of rapidly decreasing density. A normalized size-independent distribution was obtained for the atomic probability as a function of the reduced distance, r = R/R_g, where R_g is the radius of gyration. The global normalized Fermi distribution, F(r), can be reasonably decomposed into Fermi-like subdistributions for different atomic types τ, F_τ(r), with Σ_τ F_τ(r) = F(r), which depend on two additional parameters μ_τ and h_τ. The chemical potential μ_τ affects a scaling prefactor and depends on the overall frequency of the corresponding atomic type, while the maximum position of the subdistribution is determined by h_τ, which appears in a type-dependent atomic effective energy, ε_τ(r) = h_τ r, and is strongly correlated to available hydrophobicity scales. Better adjustments are obtained when the effective energy is not assumed to be necessarily linear, ε*_τ(r) = h*_τ r^α, in which case a correlation with hydrophobicity
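
The shell picture in this abstract (atom count in a thin shell ≈ the shell-volume factor R² times a Fermi-Dirac occupancy with effective energy ε(R) = R) is easy to reproduce numerically. The μ and β values below are illustrative, not the paper's fitted parameters.

```python
import numpy as np

def shell_count(R, mu, beta):
    """Relative number of atoms in a thin shell at radius R: the shell-volume
    factor R^2 times a Fermi-Dirac occupancy with effective energy eps(R) = R."""
    return R ** 2 / (np.exp(beta * (R - mu)) + 1.0)

R = np.linspace(0.0, 30.0, 601)
counts = shell_count(R, mu=15.0, beta=0.8)
R_max = R[np.argmax(counts)]   # size-dependent radius of maximum count
```

The curve rises quadratically inside the core (occupancy near 1), peaks near the chemical potential, and decays rapidly outside, matching the qualitative shape the abstract describes.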

  13. Transient bimodality in interacting particle systems

    International Nuclear Information System (INIS)

    Calderoni, P.; Pellegrinotti, A.; Presutti, E.; Vares, M.E.

    1989-01-01

    The authors consider a system of spins which have values ±1 and evolve according to a jump Markov process whose generator is the sum of two generators, one describing a spin-flip Glauber process, the other a Kawasaki (stirring) evolution. It was proven elsewhere that if the Kawasaki dynamics is speeded up by a factor ε^(-2), then, in the limit ε → 0 (continuum limit), propagation of chaos holds and the local magnetization solves a reaction-diffusion equation. They choose the parameters of the Glauber interaction so that the potential of the reaction term in the reaction-diffusion equation is a double-well potential with quartic maximum at the origin. They assume further that for each ε the system is in a finite interval of Z with ε^(-1) sites and periodic boundary conditions. They specify the initial measure as the product measure with 0 spin average, thus obtaining, in the continuum limit, a constant magnetic profile equal to 0, which is a stationary unstable solution to the reaction-diffusion equation. They prove that at times of the order ε^(-1/2) propagation of chaos does not hold any more and, in the limit as ε → 0, the state becomes a nontrivial superposition of Bernoulli measures with parameters corresponding to the minima of the reaction potential. The coefficients of such a superposition depend on time (on the scale ε^(-1/2)) and at large times (on this scale) the coefficient of the term corresponding to the initial magnetization vanishes (transient bimodality). This differs from what was observed by De Masi, Presutti, and Vares, who considered a reaction potential with quadratic maximum and no bimodal effect was seen, as predicted by Broggi, Lugiato, and Colombo

  14. Application of bimodal distribution to the detection of changes in uranium concentration in drinking water collected by random daytime sampling method from a large water supply zone.

    Science.gov (United States)

    Garboś, Sławomir; Święcicka, Dorota

    2015-11-01

    The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples, collected in three runs from the water supply zone in Wroclaw (Poland), cannot be simply described by normal or log-normal distributions. Therefore, a numerical method designed for the detection and calculation of a bimodal distribution was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (n_I = 72) and from the summer season of 2013 (n_II = 34), allowed estimation of the mean U concentrations in drinking water: 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency of uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources - migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake. Copyright © 2015 Elsevier Ltd. All rights reserved.
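
A numerical method for extracting two overlapping components from a data set like the one described can be sketched as a plain two-component Gaussian-mixture EM. The abstract does not specify the authors' exact algorithm, so this is a generic stand-in, run on synthetic concentrations whose component means mimic the reported 0.947 and 1.23 μg/L.

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """Plain EM for a two-component Gaussian mixture, used here to extract
    the two components of a bimodal concentration data set."""
    mu = np.array([x.min(), x.max()])      # initialize components at the extremes
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        d = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / sd * w
        r = d / d.sum(axis=1, keepdims=True)
        # M-step: weighted moments
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return w, mu, sd

rng = np.random.default_rng(3)
# synthetic two-season uranium concentrations (in micrograms per litre)
conc = np.concatenate([rng.normal(0.95, 0.08, 72), rng.normal(1.23, 0.08, 34)])
w, mu, sd = em_two_gaussians(conc)
```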

  15. Stochastic Economic Dispatch with Wind using Versatile Probability Distribution and L-BFGS-B Based Dual Decomposition

    DEFF Research Database (Denmark)

    Huang, Shaojun; Sun, Yuanzhang; Wu, Qiuwei

    2018-01-01

    This paper focuses on economic dispatch (ED) in power systems with intermittent wind power, which is a very critical issue in future power systems. A stochastic ED problem is formed based on the recently proposed versatile probability distribution (VPD) of wind power. The problem is then analyzed...

  16. Linkage disequilibrium in the insulin gene region: Size variation at the 5{prime} flanking polymorphism and bimodality among {open_quotes}Class I{close_quotes} alleles

    Energy Technology Data Exchange (ETDEWEB)

    McGinnis, R.E.; Spielman, R.S. [Univ. of Pennsylvania School of Medicine, Philadelphia, PA (United States)

    1994-09-01

    The 5{prime} flanking polymorphism (5{prime}FP), a hypervariable region at the 5{prime} end of the insulin gene, has {open_quotes}class 1{close_quotes} alleles (650-900 bp long) that are in positive linkage disequilibrium with insulin-dependent diabetes mellitus (IDDM). The authors report that precise sizing of the 5{prime}FP yields a bimodal frequency distribution of class 1 allele lengths. Class 1 alleles belonging to the lower component (650-750 bp) of the bimodal distribution were somewhat more highly associated with IDDM than were alleles from the upper component (760-900 bp), but the difference was not statistically significant. They also examined 5{prime}FP length variation in relation to allelic variation at nearby polymorphisms. At biallelic RFLPs on both sides of the 5{prime}FP, they found that one allele exhibits near-total association with the upper component of the 5FP class 1 distribution. Such associations represent a little-known but potentially wide-spread form of linkage disequilibrium. In this type of disequilibrium, a flanking allele has near-complete association with a single mode of VNTR alleles whose lengths represent consecutive numbers of tandem repeats (CNTR). Such extreme disequilibrium between a CNTR mode and flanking alleles may originate and persist because length mutations at some VNTR loci usually add or delete only one or two repeat units. 22 refs., 5 figs., 6 tabs.

  17. Introduction and application of non-stationary standardized precipitation index considering probability distribution function and return period

    Science.gov (United States)

    Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk

    2018-05-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI is proposed that considers not only the modified probability distribution parameters but also the return period under a non-stationary process. The results were evaluated for two severe drought cases during the last 10 years in South Korea. The SPI under the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these two past droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which can make the probability distribution wider than before. This implies that drought expressed by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.
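
For reference, the stationary SPI that this record takes as its starting point can be sketched as: fit a Gamma distribution to a reference precipitation record, then map its CDF through the standard-normal quantile function. A moments-based Gamma fit and synthetic monthly totals are used here for illustration; the non-stationary extension would let the Gamma parameters vary over time.

```python
import numpy as np
from scipy import stats

def spi(precip, reference):
    """Sketch of the stationary SPI: fit a Gamma to the reference record by
    moments, then transform the fitted CDF to standard-normal quantiles."""
    m, v = reference.mean(), reference.var()
    shape, scale = m * m / v, v / m          # method-of-moments Gamma fit
    cdf = stats.gamma.cdf(precip, a=shape, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(4)
reference = rng.gamma(shape=4.0, scale=25.0, size=360)   # ~30 years of monthly totals, mm
z = spi(np.array([40.0, 100.0, 220.0]), reference)       # dry, normal, wet months
```

Negative SPI values indicate drier-than-reference conditions; values below about -1.5 are commonly read as severe drought.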

  18. Analyses of moments in pseudorapidity intervals at √s = 546 GeV by means of two probability distributions in pure-birth process

    International Nuclear Information System (INIS)

    Biyajima, M.; Shirane, K.; Suzuki, N.

    1988-01-01

    Moments in pseudorapidity intervals at the CERN Spp̄S collider (√s = 546 GeV) are analyzed by means of two probability distributions in the pure-birth stochastic process. Our results show that a probability distribution obtained from the Poisson distribution as an initial condition is more useful than that obtained from the Kronecker δ function. Analyses of moments by Koba-Nielsen-Olesen scaling functions derived from solutions of the pure-birth stochastic process are also made. Moreover, analyses of preliminary data at √s = 200 and 900 GeV are added

  19. Deaf Parents of Cochlear-Implanted Children: Beliefs on Bimodal Bilingualism

    Science.gov (United States)

    Mitchiner, Julie Cantrell

    2015-01-01

    This study investigated 17 Deaf families in North America with cochlear-implanted children about their attitudes, beliefs, and practices on bimodal bilingualism (defined as using both a visual/manual language and an aural/oral language) in American Sign Language (ASL) and English. A survey and follow-up interviews with 8 families were conducted.…

  20. Probability distribution of magnetization in the one-dimensional Ising model: effects of boundary conditions

    Energy Technology Data Exchange (ETDEWEB)

    Antal, T [Physics Department, Simon Fraser University, Burnaby, BC V5A 1S6 (Canada); Droz, M [Departement de Physique Theorique, Universite de Geneve, CH 1211 Geneva 4 (Switzerland); Racz, Z [Institute for Theoretical Physics, Eoetvoes University, 1117 Budapest, Pazmany setany 1/a (Hungary)

    2004-02-06

    Finite-size scaling functions are investigated both for the mean-square magnetization fluctuations and for the probability distribution of the magnetization in the one-dimensional Ising model. The scaling functions are evaluated in the limit of the temperature going to zero (T → 0) and the size of the system going to infinity (N → ∞) while N[1 - tanh(J/k_B T)] is kept finite (J being the nearest-neighbour coupling). Exact calculations using various boundary conditions (periodic, antiperiodic, free, block) demonstrate explicitly how the scaling functions depend on the boundary conditions. We also show that the block (small part of a large system) magnetization distribution results are identical to those obtained for free boundary conditions.
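
The dependence of the magnetization distribution on boundary conditions can be verified directly by brute-force enumeration for a small chain. This is a sketch of the exact finite-N distribution, not the paper's scaling-limit calculation; N and K are illustrative.

```python
import numpy as np
from itertools import product

def magnetization_pmf(N, K, periodic=True):
    """Exact probability distribution of the total magnetization M = sum_i s_i
    for the 1D Ising model at reduced coupling K = J/(k_B T), by enumeration
    of all 2^N spin configurations (feasible for small N)."""
    bonds = range(N) if periodic else range(N - 1)
    weights = {}
    for spins in product((-1, 1), repeat=N):
        E = sum(spins[i] * spins[(i + 1) % N] for i in bonds)   # bond energy sum
        M = sum(spins)
        weights[M] = weights.get(M, 0.0) + np.exp(K * E)
    Z = sum(weights.values())
    return {M: wgt / Z for M, wgt in weights.items()}

pmf_per = magnetization_pmf(10, K=0.8, periodic=True)
pmf_free = magnetization_pmf(10, K=0.8, periodic=False)
```

Even at this tiny size the boundary conditions visibly reshape the distribution: the extra bond of the periodic chain puts more weight on the fully aligned states.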

  1. Understanding the distinctively skewed and heavy tailed character of atmospheric and oceanic probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Sardeshmukh, Prashant D., E-mail: Prashant.D.Sardeshmukh@noaa.gov [CIRES, University of Colorado, Boulder, Colorado 80309 (United States); NOAA/Earth System Research Laboratory, Boulder, Colorado 80305 (United States); Penland, Cécile [NOAA/Earth System Research Laboratory, Boulder, Colorado 80305 (United States)

    2015-03-15

    The probability distributions of large-scale atmospheric and oceanic variables are generally skewed and heavy-tailed. We argue that their distinctive departures from Gaussianity arise fundamentally from the fact that in a quadratically nonlinear system with a quadratic invariant, the coupling coefficients between system components are not constant but depend linearly on the system state in a distinctive way. In particular, the skewness arises from a tendency of the system trajectory to linger near states of weak coupling. We show that the salient features of the observed non-Gaussianity can be captured in the simplest such nonlinear 2-component system. If the system is stochastically forced and linearly damped, with one component damped much more strongly than the other, then the strongly damped fast component becomes effectively decoupled from the weakly damped slow component, and its impact on the slow component can be approximated as a stochastic noise forcing plus an augmented nonlinear damping. In the limit of large time-scale separation, the nonlinear augmentation of the damping becomes small, and the noise forcing can be approximated as an additive noise plus a correlated additive and multiplicative noise (CAM noise) forcing. Much of the diversity of observed large-scale atmospheric and oceanic probability distributions can be interpreted in this minimal framework.
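
The CAM-noise mechanism summarized above can be demonstrated with a few lines of Euler-Maruyama simulation. The damped scalar SDE and its parameters below are an illustrative one-variable reduction, not the authors' full two-component system.

```python
import numpy as np

def cam_series(n=200_000, dt=0.01, lam=1.0, E=0.5, b=0.5, seed=5):
    """Euler-Maruyama simulation of a linearly damped scalar driven by
    correlated additive and multiplicative (CAM) noise:
        dx = -lam * x dt + (E * x + b) dW .
    The multiplicative term (E != 0) skews the stationary distribution and
    fattens its tails relative to the purely additive case."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = 0.0
    dW = rng.normal(0.0, np.sqrt(dt), size=n - 1)
    for i in range(n - 1):
        x[i + 1] = x[i] - lam * x[i] * dt + (E * x[i] + b) * dW[i]
    return x

x = cam_series()
skew = float(np.mean((x - x.mean()) ** 3) / x.std() ** 3)
excess_kurt = float(np.mean((x - x.mean()) ** 4) / x.std() ** 4 - 3.0)
```

Setting E = 0 recovers a Gaussian (Ornstein-Uhlenbeck) stationary distribution with zero skewness and excess kurtosis, which makes the role of the multiplicative coupling explicit.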

  2. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    Science.gov (United States)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand whether this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), which computes the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the time of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while in the antapex the RIP is slightly larger than average. We present here preliminary maps of the RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  3. Multiscale probability distribution of pressure fluctuations in fluidized beds

    International Nuclear Information System (INIS)

    Ghasemi, Fatemeh; Sahimi, Muhammad; Reza Rahimi Tabar, M; Peinke, Joachim

    2012-01-01

    Analysis of flow in fluidized beds, a common chemical reactor, is of much current interest due to its fundamental as well as industrial importance. Experimental data for the successive increments of the pressure fluctuations time series in a fluidized bed are analyzed by computing a multiscale probability density function (PDF) of the increments. The results demonstrate the evolution of the shape of the PDF from the short to long time scales. The deformation of the PDF across time scales may be modeled by the log-normal cascade model. The results are also in contrast to the previously proposed PDFs for the pressure fluctuations that include a Gaussian distribution and a PDF with a power-law tail. To understand better the properties of the pressure fluctuations, we also construct the shuffled and surrogate time series for the data and analyze them with the same method. It turns out that long-range correlations play an important role in the structure of the time series that represent the pressure fluctuation. (paper)
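
The evolution of the increment PDF across scales can be mimicked with a synthetic record: increments of a running sum of heavy-tailed shocks are strongly non-Gaussian at short scales and relax toward Gaussian at long scales by aggregation. This is a qualitative stand-in for the measured pressure series; the Student-t shocks and all parameters are assumptions.

```python
import numpy as np

def increment_excess_kurtosis(signal, scales):
    """Shape of the increment PDF across time scales, summarized by excess
    kurtosis: well above 0 (heavy tails) at short scales, relaxing toward the
    Gaussian value 0 at long scales."""
    out = {}
    for tau in scales:
        inc = signal[tau:] - signal[:-tau]
        z = (inc - inc.mean()) / inc.std()
        out[tau] = float(np.mean(z ** 4) - 3.0)
    return out

rng = np.random.default_rng(6)
# stand-in "pressure" record: a running sum of heavy-tailed Student-t shocks
signal = np.cumsum(rng.standard_t(df=10, size=200_000))
kurt = increment_excess_kurtosis(signal, scales=[1, 10, 100, 1000])
```

Shuffling the shocks before summation, as the authors do with surrogates, would destroy any temporal correlation while leaving the single-scale increment PDF unchanged.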

  4. NERVA-Derived Concept for a Bimodal Nuclear Thermal Rocket

    International Nuclear Information System (INIS)

    Fusselman, Steven P.; Frye, Patrick E.; Gunn, Stanley V.; Morrison, Calvin Q.; Borowski, Stanley K.

    2005-01-01

    The Nuclear Thermal Rocket is an enabling technology for human exploration missions. The 'bimodal' NTR (BNTR) provides a novel approach to meeting both propulsion and power requirements of future manned and robotic missions. The purpose of this study was to evaluate tie-tube cooling configurations, NTR performance, Brayton cycle performance, and LOX-Augmented NTR (LANTR) feasibility to arrive at a point of departure BNTR configuration for subsequent system definition

  5. Combining bimodal presentation schemes and buzz groups improves clinical reasoning and learning at morning report.

    Science.gov (United States)

    Balslev, Thomas; Rasmussen, Astrid Bruun; Skajaa, Torjus; Nielsen, Jens Peter; Muijtjens, Arno; De Grave, Willem; Van Merriënboer, Jeroen

    2014-12-11

    Morning reports offer opportunities for intensive work-based learning. In this controlled study, we measured learning processes and outcomes during reports on paediatric emergency room patients. Twelve specialists and 12 residents were randomised into four groups and discussed the same two paediatric cases. The groups differed in presentation modality (verbal only vs. verbal + text) and in the use of buzz groups (with vs. without). The verbal interactions were analysed for clinical reasoning processes. Perceptions of learning and judgments of learning were reported in a questionnaire. Diagnostic accuracy was assessed by a 20-item multiple-choice test. Combined bimodal presentation and buzz groups increased the odds ratio of clinical reasoning occurring in the case discussions by a factor of 1.90 (p = 0.013), indicating superior reasoning for buzz groups working with bimodal materials. For specialists, positive effects of bimodal presentation were found on perceptions of learning and on diagnostic accuracy. Combined bimodal presentation and buzz group discussion of emergency cases improves clinicians' clinical reasoning and learning.

  6. Oncogenic Nras has bimodal effects on stem cells that sustainably increase competitiveness.

    Science.gov (United States)

    Li, Qing; Bohin, Natacha; Wen, Tiffany; Ng, Victor; Magee, Jeffrey; Chen, Shann-Ching; Shannon, Kevin; Morrison, Sean J

    2013-12-05

    'Pre-leukaemic' mutations are thought to promote clonal expansion of haematopoietic stem cells (HSCs) by increasing self-renewal and competitiveness; however, mutations that increase HSC proliferation tend to reduce competitiveness and self-renewal potential, raising the question of how a mutant HSC can sustainably outcompete wild-type HSCs. Activating mutations in NRAS are prevalent in human myeloproliferative neoplasms and leukaemia. Here we show that a single allele of oncogenic Nras(G12D) increases HSC proliferation but also increases reconstituting and self-renewal potential upon serial transplantation in irradiated mice, all prior to leukaemia initiation. Nras(G12D) also confers long-term self-renewal potential to multipotent progenitors. To explore the mechanism by which Nras(G12D) promotes HSC proliferation and self-renewal, we assessed cell-cycle kinetics using H2B-GFP label retention and 5-bromodeoxyuridine (BrdU) incorporation. Nras(G12D) had a bimodal effect on HSCs, increasing the frequency with which some HSCs divide and reducing the frequency with which others divide. This mirrored bimodal effects on reconstituting potential, as rarely dividing Nras(G12D) HSCs outcompeted wild-type HSCs, whereas frequently dividing Nras(G12D) HSCs did not. Nras(G12D) caused these effects by promoting STAT5 signalling, inducing different transcriptional responses in different subsets of HSCs. One signal can therefore increase HSC proliferation, competitiveness and self-renewal through bimodal effects on HSC gene expression, cycling and reconstituting potential.

  7. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1991-11-01

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility intended to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculating the overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system, based on the possible occurrence of three events and the presence of one feature: the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." Calculations of the WIPP system's overall probability distribution are presented for five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.

  8. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  9. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
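
    One of the relationships the report surveys, the Poisson law as the large-n, small-p limit of the binomial law, is easy to verify numerically; the n and p values below are arbitrary choices for illustration.

    ```python
    import math

    def binomial_pmf(k, n, p):
        """P(k successes in n Bernoulli trials with success probability p)."""
        return math.comb(n, k) * p**k * (1 - p)**(n - k)

    def poisson_pmf(k, lam):
        """P(k events) under a Poisson law with mean lam."""
        return math.exp(-lam) * lam**k / math.factorial(k)

    # For large n and small p the binomial law approaches the Poisson law
    # with lam = n * p.
    n, p = 1000, 0.003
    for k in range(6):
        b = binomial_pmf(k, n, p)
        q = poisson_pmf(k, n * p)
        assert abs(b - q) < 1e-3
    print("binomial(1000, 0.003) matches Poisson(3) to within 1e-3 for k = 0..5")
    ```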

  10. Noise exposure alters long-term neural firing rates and synchrony in primary auditory and rostral belt cortices following bimodal stimulation.

    Science.gov (United States)

    Takacs, Joseph D; Forrest, Taylor J; Basura, Gregory J

    2017-12-01

    We previously demonstrated that bimodal stimulation (spinal trigeminal nucleus [Sp5] stimulation paired with a best-frequency tone) altered neural tone-evoked and spontaneous firing rates (SFRs) in primary auditory cortex (A1) 15 min after pairing in guinea pigs with and without noise-induced tinnitus. Neural responses were enhanced (+10 ms) or suppressed (0 ms) depending on the bimodal pairing interval. Here we investigated whether bimodal stimulation leads to long-term (up to 2 h) changes in tone-evoked and SFRs and in neural synchrony (a correlate of tinnitus), and whether these long-term bimodal effects are altered following noise exposure. To obviate the effects of permanent hearing loss on the results, firing rates and neural synchrony were measured three weeks following unilateral (left-ear) noise exposure and a temporary threshold shift. Simultaneous extracellular single-unit recordings were made from contralateral (to noise) A1 and dorsal rostral belt (RB), an associative auditory cortical region thought to influence A1, before and after bimodal stimulation (pairing intervals of 0 ms, simultaneous Sp5-tone, and +10 ms, Sp5 preceding tone). Sixty and 120 min after 0 ms pairing, tone-evoked and SFRs were suppressed in sham A1, an effect preserved only 120 min following pairing in noise. Stimulation at +10 ms affected only SFRs 120 min after pairing in sham and noise-exposed A1. Within sham RB, pairing at 0 and +10 ms persistently suppressed tone-evoked and SFRs, while 0 ms pairing in noise markedly enhanced tone-evoked and SFRs for up to 2 h. Together, these findings suggest that bimodal stimulation has long-lasting effects in A1 that also extend to the associative RB, effects that are altered by noise and may have persistent implications for how noise-damaged brains process multi-sensory information. Moreover, prior to bimodal stimulation, noise damage increased neural synchrony in A1, in RB and between A1 and RB neurons. Bimodal stimulation led to persistent changes in neural synchrony.

  11. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  12. Bimodal microstructure and deformation of cryomilled bulk nanocrystalline Al-7.5Mg alloy

    International Nuclear Information System (INIS)

    Lee, Z.; Witkin, D.B.; Radmilovic, V.; Lavernia, E.J.; Nutt, S.R.

    2005-01-01

    The microstructure, mechanical properties and deformation response of a bimodal-structured nanocrystalline Al-7.5Mg alloy were investigated. Grain refinement was achieved by cryomilling of atomized Al-7.5Mg powders; the cryomilled nanocrystalline powders were then blended with 15 and 30% unmilled coarse-grained powders and consolidated by hot isostatic pressing followed by extrusion to produce bulk nanocrystalline alloys. The bimodal bulk nanocrystalline Al-7.5Mg alloys, comprised of nanocrystalline grains separated by coarse-grain regions, show a balance of enhanced yield and ultimate strength with reasonable ductility and toughness compared to comparable conventional alloys and nanocrystalline metals. Tensile and hardness tests suggest unusual deformation mechanisms and interactions between ductile coarse-grain bands and nanocrystalline regions.

  13. Hydrological model calibration for derived flood frequency analysis using stochastic rainfall and probability distributions of peak flows

    Science.gov (United States)

    Haberlandt, U.; Radtke, I.

    2014-01-01

    Derived flood frequency analysis allows the estimation of design floods with hydrological modeling for poorly observed basins, considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and, consequently, the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets, and to propose the most suitable approach. Event-based and continuous observed hourly rainfall data, as well as disaggregated daily rainfall and stochastically generated hourly rainfall data, are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated by using the different calibrated parameter sets for continuous simulation of discharge in an independent validation period and by comparing the model-derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in northern Germany with the hydrological model HEC-HMS (Hydrologic Engineering Center's Hydrologic Modeling System). The results show that (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated on a small sample of extreme values works quite well for the simulation of continuous time series of moderate length but not vice versa, and (III) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows, using stochastic rainfall as input, if its purpose is derived flood frequency analysis.

  14. Design and characterization of a cough simulator.

    Science.gov (United States)

    Zhang, Bo; Zhu, Chao; Ji, Zhiming; Lin, Chao-Hsin

    2017-02-23

    Expiratory droplets from human coughing have long been considered potential carriers of pathogens responsible for respiratory infectious disease transmission. To study disease transmission by human coughing, a transient, repeatable cough simulator has been designed and built. Cough droplets are generated by different mechanisms, such as the breaking of mucus, condensation and high-speed atomization, at different depths of the respiratory tract. These mechanisms produce droplets of different sizes, represented by a bimodal distribution of 'fine' and 'coarse' droplets. The cough simulator is hence designed to generate transient sprays with such bimodal characteristics. It consists of a pressurized gas tank, a nebulizer and an ejector, connected in series and controlled by computerized solenoid valves. The bimodal droplet size distribution is characterized by fibrous collection for the coarse droplets and laser diffraction for the fine droplets. The measured size distributions of coarse and fine droplets are reasonably represented by Rosin-Rammler and log-normal probability density functions, respectively, which together form a bimodal distribution. To assess the hydrodynamic consequences of coughing, including droplet vaporization and polydispersion, a Lagrangian model of droplet trajectories is established, with its ambient flow field predetermined from a computational fluid dynamics simulation.
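
    A bimodal droplet-size model of the kind described above, a weighted sum of a log-normal mode for fine droplets and a Rosin-Rammler (Weibull) mode for coarse droplets, can be sketched as follows. The mode locations, mixture weight and shape parameters are made-up placeholders, not the paper's fitted values.

    ```python
    import math

    def weibull_pdf(d, shape, scale):
        """Rosin-Rammler / Weibull probability density."""
        return (shape / scale) * (d / scale) ** (shape - 1) * math.exp(-((d / scale) ** shape))

    def lognormal_pdf(d, mu, sigma):
        """Log-normal probability density (mu, sigma on the log scale)."""
        return math.exp(-((math.log(d) - mu) ** 2) / (2 * sigma ** 2)) \
            / (d * sigma * math.sqrt(2 * math.pi))

    def bimodal_pdf(d, w_fine=0.7, mu=math.log(8.0), sigma=0.6, shape=2.5, scale=120.0):
        """Mixture: fine mode (log-normal, around 8 um) + coarse mode (Weibull,
        around 120 um).  All parameter values are illustrative assumptions."""
        return w_fine * lognormal_pdf(d, mu, sigma) + (1.0 - w_fine) * weibull_pdf(d, shape, scale)

    # Midpoint-rule check that the mixture integrates to ~1 (diameters in um):
    total = sum(bimodal_pdf(0.5 + i) for i in range(1000))
    print(round(total, 2))
    ```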

  15. Distribution of nuclei in equilibrium stellar matter from the free-energy density in a Wigner-Seitz cell

    Science.gov (United States)

    Grams, G.; Giraud, S.; Fantina, A. F.; Gulminelli, F.

    2018-03-01

    The aim of the present study is to calculate the nuclear distribution associated at finite temperature with any given equation of state of stellar matter based on the Wigner-Seitz approximation, for direct application in core-collapse simulations. The Gibbs free energy of the different configurations is explicitly calculated, with special care devoted to the rearrangement terms, ensuring thermodynamic consistency. The formalism is illustrated with two different applications. First, we work out the nuclear statistical equilibrium cluster distribution for the Lattimer and Swesty equation of state, widely employed in supernova simulations. Second, we explore the effect of including shell structure, and consider realistic nuclear mass tables from the Brussels-Montreal Hartree-Fock-Bogoliubov model (specifically, HFB-24). We show that the whole collapse trajectory is dominated by magic nuclei, with very broad and even bimodal distributions of the cluster probability around magic numbers, demonstrating the importance of using cluster distributions with realistic mass models in core-collapse simulations. Simple analytical expressions are given, allowing further application of the method to any relativistic or nonrelativistic subsaturation equation of state.
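
    The cluster weighting described above can be caricatured with a Boltzmann-like factor p_N ∝ exp(-G_N/T): a cluster with extra binding (a free-energy dip at a magic number) increasingly dominates the distribution as the temperature drops. The free energies below are invented toy numbers, not values from the HFB-24 or Lattimer-Swesty models.

    ```python
    import math

    def cluster_probabilities(free_energies, T):
        """Normalised weights p_N proportional to exp(-G_N / T)
        (free energies and temperature in MeV)."""
        w = [math.exp(-g / T) for g in free_energies]
        z = sum(w)
        return [wi / z for wi in w]

    # Toy free energies with a dip at index 2, playing the role of a magic nucleus:
    G = [5.0, 4.0, 1.0, 4.0, 5.0]
    for T in (2.0, 0.5):
        p = cluster_probabilities(G, T)
        print(T, round(p[2], 2))   # the "magic" cluster dominates more as T drops
    ```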

  16. Probability distribution of wave packet delay time for strong overlapping of resonance levels

    International Nuclear Information System (INIS)

    Lyuboshits, V.L.

    1983-01-01

    The time behaviour of nuclear reactions in the case of high level densities is investigated on the basis of the theory of overlapping resonances. In the framework of a model of n equivalent channels, an analytical expression is obtained for the probability distribution function of the wave packet delay time in compound nucleus production. It is shown that, for strong overlapping of the resonance levels, the relative fluctuation of the delay time is small at the stage of compound nucleus production. A possible increase in the duration of nuclear reactions with rising excitation energy is discussed.

  17. Probability distribution of machining center failures

    International Nuclear Information System (INIS)

    Jia Yazhou; Wang Molin; Jia Zhixin

    1995-01-01

    Through a year-long field-tracing study of 24 Chinese cutter-changeable CNC machine tools (machining centers), a database of machining-center operation and maintenance was built. The failure data were fitted to the Weibull and exponential distributions, the goodness of fit was tested, and the failure distribution pattern of machining centers was found. Finally, reliability characterizations for machining centers are proposed.
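
    A common way to fit a Weibull law to between-failure times, in the spirit of the study above, is median-rank regression: linearise the Weibull CDF and least-squares fit on the transformed scale. The synthetic failure sample below is an assumption for demonstration; the study's actual data and fitting method may differ.

    ```python
    import math
    import random

    def fit_weibull_mrr(times):
        """Median-rank regression: fit ln(-ln(1 - F_i)) against ln(t_i), where
        F(t) = 1 - exp(-(t/eta)**beta); the slope is beta, the intercept gives eta."""
        t = sorted(times)
        n = len(t)
        xs, ys = [], []
        for i, ti in enumerate(t, start=1):
            F = (i - 0.3) / (n + 0.4)        # Bernard's median-rank approximation
            xs.append(math.log(ti))
            ys.append(math.log(-math.log(1.0 - F)))
        mx, my = sum(xs) / n, sum(ys) / n
        beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        eta = math.exp(mx - my / beta)
        return beta, eta

    # Synthetic between-failure times (hours) drawn from a known Weibull law.
    random.seed(1)
    beta_true, eta_true = 1.1, 800.0
    sample = [eta_true * (-math.log(1.0 - random.random())) ** (1.0 / beta_true)
              for _ in range(500)]
    beta, eta = fit_weibull_mrr(sample)
    print(round(beta, 2), round(eta))    # estimates close to the true 1.1 and 800
    ```

    A fitted shape parameter near 1 is consistent with the near-exponential behaviour the exponential fit in the study would also capture.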

  18. Bimodal stimulus timing-dependent plasticity in primary auditory cortex is altered after noise exposure with and without tinnitus.

    Science.gov (United States)

    Basura, Gregory J; Koehler, Seth D; Shore, Susan E

    2015-12-01

    Central auditory circuits are influenced by the somatosensory system, a relationship that may underlie tinnitus generation. In the guinea pig dorsal cochlear nucleus (DCN), pairing spinal trigeminal nucleus (Sp5) stimulation with tones at specific intervals and orders facilitated or suppressed subsequent tone-evoked neural responses, reflecting spike timing-dependent plasticity (STDP). Furthermore, after noise-induced tinnitus, bimodal responses in DCN were shifted from Hebbian to anti-Hebbian timing rules with less discrete temporal windows, suggesting a role for bimodal plasticity in tinnitus. Here, we aimed to determine if multisensory STDP principles like those in DCN also exist in primary auditory cortex (A1), and whether they change following noise-induced tinnitus. Tone-evoked and spontaneous neural responses were recorded before and 15 min after bimodal stimulation in which the intervals and orders of auditory-somatosensory stimuli were randomized. Tone-evoked and spontaneous firing rates were influenced by the interval and order of the bimodal stimuli, and in sham-controls Hebbian-like timing rules predominated as was seen in DCN. In noise-exposed animals with and without tinnitus, timing rules shifted away from those found in sham-controls to more anti-Hebbian rules. Only those animals with evidence of tinnitus showed increased spontaneous firing rates, a purported neurophysiological correlate of tinnitus in A1. Together, these findings suggest that bimodal plasticity is also evident in A1 following noise damage and may have implications for tinnitus generation and therapeutic intervention across the central auditory circuit. Copyright © 2015 the American Physiological Society.

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
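
    For the multinomial logit special case of an ARUM, the CPGF is the familiar logsum G(u) = log Σ_j exp(u_j), and differentiating it recovers the softmax choice probabilities. The sketch below checks this numerically; the utility values are arbitrary.

    ```python
    import math

    def cpgf_logit(u):
        """CPGF of the multinomial logit model: the logsum log(sum_j exp(u_j))."""
        m = max(u)
        return m + math.log(sum(math.exp(v - m) for v in u))

    def choice_probabilities(u, eps=1e-6):
        """Gradient of the CPGF by central differences; for MNL this is the softmax."""
        probs = []
        for i in range(len(u)):
            up = list(u); up[i] += eps
            dn = list(u); dn[i] -= eps
            probs.append((cpgf_logit(up) - cpgf_logit(dn)) / (2.0 * eps))
        return probs

    u = [1.0, 2.0, 0.5]
    p = choice_probabilities(u)
    print([round(x, 3) for x in p], round(sum(p), 6))   # probabilities sum to 1
    ```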

  20. An innovative method for offshore wind farm site selection based on the interval number with probability distribution

    Science.gov (United States)

    Wu, Yunna; Chen, Kaifeng; Xu, Hu; Xu, Chuanbo; Zhang, Haobo; Yang, Meng

    2017-12-01

    There is insufficient research on offshore wind farm site selection in China, and current site-selection methods have several defects. First, information loss arises from two sources: the implicit assumption that the probability distribution over the interval number is uniform, and neglect of the value of decision makers' (DMs') common opinion in evaluating criteria information. Secondly, differences in DMs' utility functions have failed to receive attention. An innovative method is proposed in this article to address these drawbacks. First, a new form of interval number and its weighted operator are proposed to reflect uncertainty and reduce information loss. Secondly, a new stochastic dominance degree is proposed to quantify the interval number with a probability distribution. Thirdly, a two-stage method integrating the weighted operator with the stochastic dominance degree is proposed to evaluate the alternatives. Finally, a case from China demonstrates the effectiveness of this method.

  1. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  2. Bimodal Bilingual Language Development of Hearing Children of Deaf Parents

    Science.gov (United States)

    Hofmann, Kristin; Chilla, Solveig

    2015-01-01

    Adopting a bimodal bilingual language acquisition model, this qualitative case study is the first in Germany to investigate the spoken and sign language development of hearing children of deaf adults (codas). The spoken language competence of six codas within the age range of 3;10 to 6;4 is assessed by a series of standardised tests (SETK 3-5,…

  3. Disentangling the climate-driven bimodal growth pattern in coastal and continental Mediterranean pine stands.

    Science.gov (United States)

    Pacheco, Arturo; Camarero, J Julio; Ribas, Montse; Gazol, Antonio; Gutierrez, E; Carrer, Marco

    2018-02-15

    Mediterranean climate promotes two distinct growth peaks in trees, separated by summer quiescence. This bimodal pattern has been associated with favourable growing conditions during spring and autumn, when mild temperatures and soil-water availability enhance cambial activity. Climatic models predict progressive warming and drying for the Mediterranean Basin, which could shorten or shift the spring and autumn growing seasons. We explored this idea by comparing two sites with different Mediterranean climate types (continental/dry and coastal/wet) and studied how climate drives the bimodal growth pattern in Aleppo pine (Pinus halepensis). Specifically, we investigated the intra-annual changes in wood anatomy and the corresponding formation of intra-annual density fluctuations (IADFs). Trees at both sites were analyzed by dendrometer monitoring and by developing chronologies of wood anatomical traits. Radial-increment dynamics followed a similar bimodal pattern at both sites, but coastal trees showed higher increments during the spring and autumn growth peaks, especially in autumn. The summer rest of cambial activity occurs almost one month earlier at the coastal site than inland. Lumen area and cell-wall thickness were significantly smaller at the continental site, while the increment rate of cell-wall thickness during an IADF event was much higher in the coastal pines. Accumulated soil moisture deficit was the main climatic constraint on tracheid enlargement in continental pines. IADFs were more frequent in the coastal trees, where wood anatomy features recover to average values after such events, whereas inland trees presented a much lower recovery rate. Growth bimodality and the formation of density fluctuations were linked, but the mild climate of the coastal site allows a longer growing season, which explains why trees in this area showed higher and more variable growth rates. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Seed-dispersal distributions by trumpeter hornbills in fragmented landscapes

    Science.gov (United States)

    Lenz, Johanna; Fiedler, Wolfgang; Caprano, Tanja; Friedrichs, Wolfgang; Gaese, Bernhard H.; Wikelski, Martin; Böhning-Gaese, Katrin

    2011-01-01

    Frugivorous birds provide important ecosystem services by transporting seeds of fleshy fruited plants. It has been assumed that seed-dispersal kernels generated by these animals are generally leptokurtic, resulting in little dispersal among habitat fragments. However, little is known about the seed-dispersal distribution generated by large frugivorous birds in fragmented landscapes. We investigated movement and seed-dispersal patterns of trumpeter hornbills (Bycanistes bucinator) in a fragmented landscape in South Africa. Novel GPS loggers provide high-quality location data without bias against recording long-distance movements. We found a very weakly bimodal seed-dispersal distribution with potential dispersal distances up to 14.5 km. Within forest, the seed-dispersal distribution was unimodal with an expected dispersal distance of 86 m. In the fragmented agricultural landscape, the distribution was strongly bimodal with peaks at 18 and 512 m. Our results demonstrate that seed-dispersal distributions differed when birds moved in different habitat types. Seed-dispersal distances in fragmented landscapes show that transport among habitat patches is more frequent than previously assumed, allowing plants to disperse among habitat patches and to track the changing climatic conditions. PMID:21177686
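
    A bimodal dispersal kernel like the one reported for the agricultural landscape can be mimicked by a two-component log-normal mixture with modes near the two observed peaks. The weights and spreads below are hypothetical, chosen only to place peaks near 18 m and 512 m; they are not fitted to the hornbill data.

    ```python
    import math
    import random

    def sample_dispersal(w_short=0.6, mu_short=math.log(18.0), sd_short=0.8,
                         mu_long=math.log(512.0), sd_long=0.7):
        """Draw one dispersal distance (m) from a hypothetical two-log-normal mixture."""
        if random.random() < w_short:
            return random.lognormvariate(mu_short, sd_short)
        return random.lognormvariate(mu_long, sd_long)

    random.seed(3)
    distances = sorted(sample_dispersal() for _ in range(10000))
    median = distances[len(distances) // 2]
    frac_beyond_100m = sum(1 for x in distances if x > 100.0) / len(distances)
    print(round(median), round(frac_beyond_100m, 2))
    ```

    Under these assumed parameters a substantial fraction of seeds travels beyond 100 m, the kind of among-fragment transport the tracking data revealed.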

  5. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  6. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
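
    The paper's central point, that pairwise-defined random variables need not admit a joint distribution, can be illustrated with the classic Suppes-Zanotti criterion for three ±1-valued, zero-mean variables; the moment values below are arbitrary examples, not the paper's oscillator model.

    ```python
    def joint_exists(e12, e13, e23):
        """Suppes-Zanotti criterion: for three +/-1-valued random variables with
        zero means and pairwise moments e_ij, a joint distribution reproducing
        those moments exists iff -1 <= e12 + e13 + e23 <= 1 + 2*min(e12, e13, e23)."""
        s = e12 + e13 + e23
        return -1.0 <= s <= 1.0 + 2.0 * min(e12, e13, e23)

    # Three pairwise perfectly anti-correlated variables: no classical joint exists.
    print(joint_exists(-1.0, -1.0, -1.0))   # -> False
    # A compatible triple of moments:
    print(joint_exists(0.5, 0.5, 0.5))      # -> True
    ```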

  7. Spatial distribution and occurrence probability of regional new particle formation events in eastern China

    Directory of Open Access Journals (Sweden)

    X. Shen

    2018-01-01

    In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated using the NanoMap method, particle number size distribution (PNSD) data and air mass back trajectories. The datasets cover 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed within 100–200 km of the site. The difference in the horizontal spatial distribution of new particle source areas between sites was connected to the typical meteorological conditions at each site. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with higher particle growth rate (GR) and new particle formation rate (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared to those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends on the length of the PNSD dataset but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus long-term measurements of PNSD in the planetary boundary layer are necessary for further study of the spatial extent and probability of NPF events.

  8. Mixture distributions of wind speed in the UAE

    Science.gov (United States)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been the most widely used to characterize the distribution of wind speed. However, it cannot properly model wind speed regimes when the wind speed distribution presents bimodal or kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without first investigating the wind speed distribution. Given these mixture-distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and some detected mixture-distributional characteristics of wind speed. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. To improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used. The Weibull and Kappa distributions were employed as representatives of conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: Normal, Gamma, Weibull and extreme value type-one (EV-1). Three parameter estimation methods were employed to estimate the parameters of the mixture distributions: the Expectation Maximization algorithm, the Least Squares method and the Meta-Heuristic Maximum Likelihood (MHML) method. In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for
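
    Of the pieces named above, the Expectation Maximization step is the easiest to sketch. The toy below fits a two-component Gaussian mixture by EM to a synthetic bimodal "wind speed" sample; the actual study mixes Normal, Gamma, Weibull and EV-1 components, and all numbers here are invented for illustration.

    ```python
    import math
    import random

    def em_two_gaussians(data, iters=200):
        """EM for a two-component Gaussian mixture: alternate E-step
        (responsibilities) and M-step (weight, means, standard deviations)."""
        w, mu1, mu2 = 0.5, min(data), max(data)
        s1 = s2 = (max(data) - min(data)) / 4.0
        phi = lambda x, m, s: math.exp(-((x - m) ** 2) / (2.0 * s * s)) \
            / (s * math.sqrt(2.0 * math.pi))
        for _ in range(iters):
            # E-step: responsibility of component 1 for each observation
            r = [w * phi(x, mu1, s1) / (w * phi(x, mu1, s1) + (1.0 - w) * phi(x, mu2, s2))
                 for x in data]
            # M-step: re-estimate mixture weight, means and standard deviations
            n1 = sum(r); n2 = len(data) - n1
            w = n1 / len(data)
            mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
            mu2 = sum((1.0 - ri) * x for ri, x in zip(r, data)) / n2
            s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1)
            s2 = math.sqrt(sum((1.0 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2)
        return w, (mu1, s1), (mu2, s2)

    # Synthetic bimodal sample: a light-wind mode and a stronger-wind mode (m/s).
    random.seed(2)
    data = [random.gauss(3.0, 0.8) for _ in range(400)] \
         + [random.gauss(9.0, 1.5) for _ in range(200)]
    w, (m1, sd1), (m2, sd2) = em_two_gaussians(data)
    print(round(w, 2), round(m1, 1), round(m2, 1))   # recovers roughly 0.67, 3.0, 9.0
    ```

    Fitting Weibull or EV-1 components works the same way in the E-step; only the M-step changes, since those components need a numerical maximization instead of closed-form updates.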

  9. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  10. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think … analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators …
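
The kind of approximation the dissertation refers to can be sketched in a few lines: for heavy-tailed (subexponential) summands, the tail of a sum behaves asymptotically like the tail of its largest term, P(S_n > x) ~ n · P(X_1 > x). A minimal Monte Carlo check with Pareto summands follows; the parameters are illustrative, not taken from the thesis.

```python
import random

def pareto(alpha, u):
    # Inverse-CDF sample of a Pareto(alpha) variable with scale x_m = 1,
    # using u drawn from (0, 1].
    return u ** (-1.0 / alpha)

random.seed(7)
n, alpha, x, trials = 10, 1.5, 100.0, 200_000
hits = 0
for _ in range(trials):
    s = sum(pareto(alpha, 1.0 - random.random()) for _ in range(n))
    if s > x:
        hits += 1
mc = hits / trials
# Subexponential "single big jump" asymptotic:
# P(S_n > x) ~ n * P(X_1 > x) = n * x**(-alpha) = 0.01 here.
asym = n * x ** (-alpha)
print(mc, asym)
```

At moderate x the Monte Carlo estimate sits above the asymptotic value but is of the same order; the ratio tends to 1 as x grows, which is exactly the regime where sharp asymptotic expressions become useful.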

  11. Bimodal pollination system of the bromeliad Aechmea nudicaulis involving hummingbirds and bees.

    Science.gov (United States)

    Schmid, S; Schmid, V S; Zillikens, A; Harter-Marques, B; Steiner, J

    2011-01-01

    In order to compare the effectiveness of birds and insects as pollinators, we studied the floral biology of the bromeliad Aechmea nudicaulis (L.) Grisebach in the biome of the Atlantic rain forest, southern Brazil. On Santa Catarina Island, flowering extends from mid-September to the end of December, with diurnal anthesis. The reproductive system is obligatory xenogamy, thus pollinator-dependent. Flowers secrete 31.84 μl of nectar per day, with a mean sugar concentration of 23.2%. Highest nectar volume and sugar concentration occur at the beginning of anthesis. Most floral traits are characteristic for ornithophily, and nectar production appears to be adapted to the energy demand of hummingbirds. Continued secretion of the sucrose-dominated nectar attracts and binds visitors to inflorescences, strengthening trapline foraging behaviour. Experiments assessing seed set after single flower visits were performed with the most frequent visitors, revealing the hummingbird Thalurania glaucopis as the most effective pollen vector. In addition, bees are also functional pollinators, as substantiated by their high visitation frequency. We conclude that this pollination system is bimodal. Thus, there is redundancy in the pollination service provided by birds and bees, granting a high probability of successful reproduction in Ae. nudicaulis. © 2010 German Botanical Society and The Royal Botanical Society of the Netherlands.

  12. A statistical model for deriving probability distributions of contamination for accidental releases

    International Nuclear Information System (INIS)

    ApSimon, H.M.; Davison, A.C.

    1986-01-01

    Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)

  13. Probability distributions of placental morphological measurements and origins of variability of placental shapes.

    Science.gov (United States)

    Yampolsky, M; Salafia, C M; Shlakhter, O

    2013-06-01

    While the mean shape of human placenta is round with centrally inserted umbilical cord, significant deviations from this ideal are fairly common, and may be clinically meaningful. Traditionally, they are explained by trophotropism. We have proposed a hypothesis explaining typical variations in placental shape by randomly determined fluctuations in the growth process of the vascular tree. It has been recently reported that umbilical cord displacement in a birth cohort has a log-normal probability distribution, which indicates that the displacement between an initial point of origin and the centroid of the mature shape is a result of accumulation of random fluctuations of the dynamic growth of the placenta. To confirm this, we investigate statistical distributions of other features of placental morphology. In a cohort of 1023 births at term digital photographs of placentas were recorded at delivery. Excluding cases with velamentous cord insertion, or missing clinical data left 1001 (97.8%) for which placental surface morphology features were measured. Best-fit statistical distributions for them were obtained using EasyFit. The best-fit distributions of umbilical cord displacement, placental disk diameter, area, perimeter, and maximal radius calculated from the cord insertion point are of heavy-tailed type, similar in shape to log-normal distributions. This is consistent with a stochastic origin of deviations of placental shape from normal. Deviations of placental shape descriptors from average have heavy-tailed distributions similar in shape to log-normal. This evidence points away from trophotropism, and towards a spontaneous stochastic evolution of the variants of placental surface shape features. Copyright © 2013 Elsevier Ltd. All rights reserved.
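
The log-normal fits reported above are easy to reproduce in spirit: the maximum-likelihood fit of a two-parameter log-normal is obtained by fitting a normal to the logged data. The sketch below uses synthetic data; the study itself fitted measured placental features with EasyFit.

```python
import math
import random

random.seed(3)
# Synthetic "cord displacement" sample drawn from a log-normal law
# with illustrative parameters (not taken from the study).
true_mu, true_sigma = 1.0, 0.5
sample = [math.exp(random.gauss(true_mu, true_sigma)) for _ in range(5000)]

# MLE for a log-normal: fit a normal to the logs.
logs = [math.log(v) for v in sample]
mu_hat = sum(logs) / len(logs)
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / len(logs))
print(mu_hat, sigma_hat)  # both close to the generating values 1.0 and 0.5
```

The heavy right tail of the resulting distribution is the signature the authors point to: accumulated multiplicative random fluctuations produce log-normal-like shapes, which is what distinguishes stochastic growth from directed trophotropism.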

  14. Comparison of bimodal and bilateral cochlear implant users on speech recognition with competing talker, music perception, affective prosody discrimination, and talker identification.

    Science.gov (United States)

    Cullington, Helen E; Zeng, Fan-Gang

    2011-02-01

    Despite excellent performance in speech recognition in quiet, most cochlear implant users have great difficulty with speech recognition in noise, music perception, identifying tone of voice, and discriminating different talkers. This may be partly due to the pitch coding in cochlear implant speech processing. Most current speech processing strategies use only the envelope information; the temporal fine structure is discarded. One way to improve electric pitch perception is to use residual acoustic hearing via a hearing aid on the nonimplanted ear (bimodal hearing). This study aimed to test the hypothesis that bimodal users would perform better than bilateral cochlear implant users on tasks requiring good pitch perception. Four pitch-related tasks were used. 1. Hearing in Noise Test (HINT) sentences spoken by a male talker with a competing female, male, or child talker. 2. Montreal Battery of Evaluation of Amusia. This is a music test with six subtests examining pitch, rhythm and timing perception, and musical memory. 3. Aprosodia Battery. This has five subtests evaluating aspects of affective prosody and recognition of sarcasm. 4. Talker identification using vowels spoken by 10 different talkers (three men, three women, two boys, and two girls). Bilateral cochlear implant users were chosen as the comparison group. Thirteen bimodal and 13 bilateral adult cochlear implant users were recruited; all had good speech perception in quiet. There were no significant differences between the mean scores of the bimodal and bilateral groups on any of the tests, although the bimodal group did perform better than the bilateral group on almost all tests. Performance on the different pitch-related tasks was not correlated, meaning that if a subject performed one task well they would not necessarily perform well on another. The correlation between the bimodal users' hearing threshold levels in the aided ear and their performance on these tasks was weak. Although the bimodal cochlear …

  15. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    Science.gov (United States)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane, generating the peak ground motion. The appropriate ground motion prediction equations (GMPEs) for PSHA can then be applied. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with a constant distribution of the centroid at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. Nowadays there is a trend of using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.
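
The line-fault case can be sketched by Monte Carlo: place a rupture centroid uniformly along a fault segment and record its Euclidean distance to the site, building up the site-to-source distance distribution empirically. The coordinates and geometry below are illustrative assumptions, not values from the paper.

```python
import math
import random

def site_to_rupture_distances(site, a, b, n=100_000, seed=11):
    """Empirical site-to-source distance sample for a line fault.

    The rupture centroid is uniformly distributed along segment a-b
    (a simplifying assumption); distances to the site are collected.
    """
    random.seed(seed)
    out = []
    for _ in range(n):
        t = random.random()
        x = a[0] + t * (b[0] - a[0])
        y = a[1] + t * (b[1] - a[1])
        out.append(math.hypot(x - site[0], y - site[1]))
    return out

# Hypothetical geometry (km): a 100 km east-west fault, site 20 km north of it.
d = site_to_rupture_distances(site=(0.0, 20.0), a=(-50.0, 0.0), b=(50.0, 0.0))
print(min(d), max(d))  # bounded below by 20 km, above by about 53.9 km
```

A histogram of `d` is the empirical FFDD for this geometry; the analytical solution the paper derives for the line case replaces this sampling step.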

  16. Mobile Education: Towards Affective Bi-modal Interaction for Adaptivity

    Directory of Open Access Journals (Sweden)

    Efthymios Alepis

    2009-04-01

    Full Text Available One important field where mobile technology can make significant contributions is education. However one criticism in mobile education is that students receive impersonal teaching. Affective computing may give a solution to this problem. In this paper we describe an affective bi-modal educational system for mobile devices. In our research we describe a novel approach of combining information from two modalities namely the keyboard and the microphone through a multi-criteria decision making theory.

  17. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  18. 'Bimodal' NTR and LANTR propulsion for human missions to Mars/Phobos

    International Nuclear Information System (INIS)

    Borowski, Stanley K.; Dudzinski, Leonard A.; McGuire, Melissa L.

    1999-01-01

    The nuclear thermal rocket (NTR) is one of the leading propulsion options for future human missions to Mars due to its high specific impulse (Isp ∼850-1000 s) and attractive engine thrust-to-weight ratio (∼3-10). Because only a minuscule amount of enriched uranium-235 fuel is consumed in an NTR during the primary propulsion maneuvers of a typical Mars mission, engines configured for both propulsive thrust and modest power generation (referred to as 'bimodal' operation) provide the basis for a robust, 'power-rich' stage enabling propulsive Mars capture and reuse capability. A family of modular 'bimodal' NTR (BNTR) vehicles is described which utilize a common 'core' stage powered by three 66.7 kN (∼15 klbf) BNTRs that produce 50 kWe of total electrical power for crew life support, an active refrigeration/reliquification system for long term, 'zero-boiloff' liquid hydrogen (LH2) storage, and high data rate communications. Compared to other propulsion options, a Mars mission architecture using BNTR transfer vehicles requires fewer transportation system elements, which reduces mission mass, cost and risk because of simplified space operations. For difficult Mars options, such as a Phobos rendezvous and sample return mission, volume (not mass) constraints limit the performance of the 'all LH2' BNTR stage. The use of 'LOX-augmented' NTR (LANTR) engines, operating at a modest oxygen-to-hydrogen (O/H) mixture ratio (MR) of 0.5, helps to increase 'bulk' propellant density and total thrust during the trans-Mars injection (TMI) burn. On all subsequent burns, the bimodal LANTR engines operate on LH2 only (MR=0) to maximize vehicle performance while staying within the mass limits of two ∼80 t 'Magnum' heavy lift launch vehicles (HLLVs)

  19. Authentically radiolabelled Mn(II) complexes as bimodal PET/MR tracers

    Energy Technology Data Exchange (ETDEWEB)

    Vanasschen, Christian; Brandt, Marie; Ermert, Johannes [Institute of Neuroscience and Medicine, INM-5 - Nuclear Chemistry, Forschungszentrum Jülich (Germany); Neumaier, Bernd [Institute for Radiochemistry and Experimental Molecular Imaging, Medical Clinics, University of Cologne (Germany); Coenen, Heinz H [Institute of Neuroscience and Medicine, INM-5 - Nuclear Chemistry, Forschungszentrum Jülich (Germany)

    2015-05-18

    The development of small molecule bimodal PET/MR tracers is mainly hampered by the lack of dedicated preparation methods. Authentic radiolabelling of MR contrast agents ensures easy access to such probes: a ligand, chelating a paramagnetic metal ion (e.g. Mn2+) and the corresponding PET isotope (e.g. 52gMn), leads to a “cocktail mixture” where both imaging reporters exhibit the same pharmacokinetics. Paramagnetic [55Mn(CDTA)]2- shows an excellent compromise between thermodynamic stability, kinetic inertness and MR contrast enhancement. Therefore, the aim of this study was to develop new PET/MR tracers by labelling CDTA ligands with paramagnetic manganese and the β+-emitter 52gMn. N.c.a. 52gMn (t1/2: 5.6 d; Eβ+: 575.8 keV (29.6%)) was produced by proton irradiation of a natCr target followed by cation-exchange chromatography. CDTA was radiolabelled with n.c.a. 52gMn2+ in NaOAc buffer (pH 6) at RT. The complex was purified by RP-HPLC and its stability tested in PBS and blood plasma at 37°C. The redox stability was assessed by monitoring the T1 relaxation (20 MHz) in HEPES buffer (pH 7.4). A functionalized CDTA ligand was synthesized in 5 steps. [52gMn(CDTA)]2- was quantitatively formed within 30 min at RT. The complex was stable for at least 6 days in PBS and blood plasma at 37°C and no oxidation occurred within 7 months storage at RT. Labelling CDTA with an isotopic 52g/55Mn2+ mixture led to the corresponding bimodal PET/MR tracer. Furthermore, a functionalized CDTA ligand was synthesized with an overall yield of 18-25%. [52g/55Mn(CDTA)]2-, the first manganese-based bimodal PET/MR tracer prepared, exhibits excellent stability towards decomplexation and oxidation. This makes the functionalized CDTA ligand highly suitable for designing PET/MR tracers with high relaxivity or targeting properties.

  20. Authentically radiolabelled Mn(II) complexes as bimodal PET/MR tracers

    International Nuclear Information System (INIS)

    Vanasschen, Christian; Brandt, Marie; Ermert, Johannes; Neumaier, Bernd; Coenen, Heinz H

    2015-01-01

    The development of small molecule bimodal PET/MR tracers is mainly hampered by the lack of dedicated preparation methods. Authentic radiolabelling of MR contrast agents ensures easy access to such probes: a ligand, chelating a paramagnetic metal ion (e.g. Mn2+) and the corresponding PET isotope (e.g. 52gMn), leads to a “cocktail mixture” where both imaging reporters exhibit the same pharmacokinetics. Paramagnetic [55Mn(CDTA)]2- shows an excellent compromise between thermodynamic stability, kinetic inertness and MR contrast enhancement. Therefore, the aim of this study was to develop new PET/MR tracers by labelling CDTA ligands with paramagnetic manganese and the β+-emitter 52gMn. N.c.a. 52gMn (t1/2: 5.6 d; Eβ+: 575.8 keV (29.6%)) was produced by proton irradiation of a natCr target followed by cation-exchange chromatography. CDTA was radiolabelled with n.c.a. 52gMn2+ in NaOAc buffer (pH 6) at RT. The complex was purified by RP-HPLC and its stability tested in PBS and blood plasma at 37°C. The redox stability was assessed by monitoring the T1 relaxation (20 MHz) in HEPES buffer (pH 7.4). A functionalized CDTA ligand was synthesized in 5 steps. [52gMn(CDTA)]2- was quantitatively formed within 30 min at RT. The complex was stable for at least 6 days in PBS and blood plasma at 37°C and no oxidation occurred within 7 months storage at RT. Labelling CDTA with an isotopic 52g/55Mn2+ mixture led to the corresponding bimodal PET/MR tracer. Furthermore, a functionalized CDTA ligand was synthesized with an overall yield of 18-25%. [52g/55Mn(CDTA)]2-, the first manganese-based bimodal PET/MR tracer prepared, exhibits excellent stability towards decomplexation and oxidation. This makes the functionalized CDTA ligand highly suitable for designing PET/MR tracers with high relaxivity or targeting properties.

  1. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    Science.gov (United States)

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  2. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions.

    Science.gov (United States)

    Wenger, Seth J; Freeman, Mary C

    2008-10-01

    Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
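
The zero-inflated abundance idea can be sketched with a zero-inflated Poisson likelihood: a site is unoccupied with probability psi, otherwise its count is Poisson(lambda). This toy version fits by coarse grid search and omits the detection-probability and covariate structure of the authors' model.

```python
import math
import random

def poisson(lam):
    # Knuth's multiplication method for a Poisson sample.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def zip_loglik(psi, lam, counts):
    """Log-likelihood of a zero-inflated Poisson: with probability psi the
    site is unoccupied (count 0); otherwise the count is Poisson(lam)."""
    ll = 0.0
    for y in counts:
        if y == 0:
            ll += math.log(psi + (1.0 - psi) * math.exp(-lam))
        else:
            ll += math.log(1.0 - psi) - lam + y * math.log(lam) - math.lgamma(y + 1)
    return ll

random.seed(5)
# Synthetic survey: 40% of sites unoccupied, occupied sites host Poisson(3) counts.
counts = [0 if random.random() < 0.4 else poisson(3.0) for _ in range(2000)]

# Coarse grid-search MLE (a real analysis would use EM or Newton steps).
best_ll, best_psi, best_lam = max(
    (zip_loglik(p / 100.0, l / 10.0, counts), p / 100.0, l / 10.0)
    for p in range(5, 96, 5) for l in range(10, 61, 2))
print(best_psi, best_lam)  # near the generating values 0.4 and 3.0
```

Note the zero class mixes two sources: truly unoccupied sites and occupied sites that happened to yield a zero count; the likelihood's first branch is exactly that mixture, which is what lets occurrence and abundance be estimated simultaneously.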

  3. Distribution and Mobility of Wealth of Nations

    NARCIS (Netherlands)

    R. Paap (Richard); H.K. van Dijk (Herman)

    2009-01-01

    We estimate the empirical bimodal cross-section distribution of real Gross Domestic Product per capita of 120 countries over the period 1960–1989 by a mixture of a Weibull and a truncated normal density. The components of the mixture represent a group of poor and a group of rich

  4. Influence of age, spatial memory, and ocular fixation on localization of auditory, visual, and bimodal targets by human subjects.

    Science.gov (United States)

    Dobreva, Marina S; O'Neill, William E; Paige, Gary D

    2012-12-01

    A common complaint of the elderly is difficulty identifying and localizing auditory and visual sources, particularly in competing background noise. Spatial errors in the elderly may pose challenges and even threats to self and others during everyday activities, such as localizing sounds in a crowded room or driving in traffic. In this study, we investigated the influence of aging, spatial memory, and ocular fixation on the localization of auditory, visual, and combined auditory-visual (bimodal) targets. Head-restrained young and elderly subjects localized targets in a dark, echo-attenuated room using a manual laser pointer. Localization accuracy and precision (repeatability) were quantified for both ongoing and transient (remembered) targets at response delays up to 10 s. Because eye movements bias auditory spatial perception, localization was assessed under target fixation (eyes free, pointer guided by foveal vision) and central fixation (eyes fixed straight ahead, pointer guided by peripheral vision) conditions. Spatial localization across the frontal field in young adults demonstrated (1) horizontal overshoot and vertical undershoot for ongoing auditory targets under target fixation conditions, but near-ideal horizontal localization with central fixation; (2) accurate and precise localization of ongoing visual targets guided by foveal vision under target fixation that degraded when guided by peripheral vision during central fixation; (3) overestimation in horizontal central space (±10°) of remembered auditory, visual, and bimodal targets with increasing response delay. In comparison with young adults, elderly subjects showed (1) worse precision in most paradigms, especially when localizing with peripheral vision under central fixation; (2) greatly impaired vertical localization of auditory and bimodal targets; (3) increased horizontal overshoot in the central field for remembered visual and bimodal targets across response delays; (4) greater vulnerability to

  5. SUBARU WEAK-LENSING STUDY OF A2163: BIMODAL MASS STRUCTURE

    International Nuclear Information System (INIS)

    Okabe, N.; Bourdin, H.; Mazzotta, P.; Maurogordato, S.

    2011-01-01

    We present a weak-lensing analysis of the merging cluster A2163 using Subaru/Suprime-Cam and CFHT/Mega-Cam data and discuss the dynamics of this cluster merger, based on complementary weak-lensing, X-ray, and optical spectroscopic data sets. From two-dimensional multi-component weak-lensing analysis, we reveal that the cluster mass distribution is well described by three main components including the two-component main cluster A2163-A with mass ratio 1:8, and its cluster satellite A2163-B. The bimodal mass distribution in A2163-A is similar to the galaxy density distribution, but appears as spatially segregated from the brightest X-ray emitting gas region. We discuss the possible origins of this gas-dark-matter offset and suggest the gas core of the A2163-A subcluster has been stripped away by ram pressure from its dark matter component. The survival of this gas core from the tidal forces exerted by the main cluster lets us infer a subcluster accretion with a non-zero impact parameter. Dominated by the most massive component of A2163-A, the mass distribution of A2163 is well described by a universal Navarro-Frenk-White profile as shown by a one-dimensional tangential shear analysis, while the singular isothermal sphere profile is strongly ruled out. Comparing this cluster mass profile with profiles derived assuming intracluster medium hydrostatic equilibrium (H.E.) in two opposite regions of the cluster atmosphere has allowed us to confirm the prediction of a departure from H.E. in the eastern cluster side, presumably due to shock heating. Yielding a cluster mass estimate of M500 = 11.18 (+1.64/−1.46) × 10^14 h^−1 M☉, our mass profile confirms the exceptionally high mass of A2163, consistent with previous analyses relying on the cluster dynamical analysis and Y_X mass proxy.

  6. Probability distribution of distance in a uniform ellipsoid: Theory and applications to physics

    International Nuclear Information System (INIS)

    Parry, Michelle; Fischbach, Ephraim

    2000-01-01

    A number of authors have previously found the probability P_n(r) that two points uniformly distributed in an n-dimensional sphere are separated by a distance r. This result greatly facilitates the calculation of self-energies of spherically symmetric matter distributions interacting by means of an arbitrary radially symmetric two-body potential. We present here the analogous results for P_2(r;ε) and P_3(r;ε), which respectively describe an ellipse and an ellipsoid whose major and minor axes are 2a and 2b. It is shown that for ε = (1 − b²/a²)^(1/2) ≤ 1, P_2(r;ε) and P_3(r;ε) can be obtained as an expansion in powers of ε, and our results are valid through order ε⁴. As an application of these results we calculate the Coulomb energy of an ellipsoidal nucleus, and compare our result to an earlier result quoted in the literature. (c) 2000 American Institute of Physics
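
The spherical result the authors build on is easy to check numerically. For the unit ball in three dimensions (the ε = 0 limit), the known density of the distance between two uniform points is P_3(r) = 3r² − (9/4)r³ + (3/16)r⁵ on [0, 2], with mean distance 36/35. A Monte Carlo sketch:

```python
import math
import random

def p3(r):
    # Known density of the distance between two uniform points
    # in a unit-radius ball (the epsilon = 0 case of the paper).
    return 3 * r**2 - (9.0 / 4.0) * r**3 + (3.0 / 16.0) * r**5

def rand_in_unit_ball():
    # Rejection sampling: uniform in the cube, keep points inside the ball.
    while True:
        x, y, z = (random.uniform(-1, 1) for _ in range(3))
        if x * x + y * y + z * z <= 1.0:
            return (x, y, z)

random.seed(2)
n = 200_000
mean_dist = 0.0
for _ in range(n):
    a, b = rand_in_unit_ball(), rand_in_unit_ball()
    mean_dist += math.dist(a, b)
mean_dist /= n
print(mean_dist)  # analytic mean is 36/35 ≈ 1.0286
```

The same sampling scheme, with the rejection region changed to an ellipsoid, gives an empirical check of the paper's ε-expansions for P_3(r;ε).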

  7. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
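
One classical transformation of this kind is easy to state: assign each outcome a possibility equal to the total probability of outcomes no more probable than it (often associated with Dubois and Prade). The specific transformations compared by Klir and Parviz are not reproduced here; this is a single standard example.

```python
def prob_to_poss(p):
    """A classical probability-to-possibility transformation: pi_i is the
    total probability of outcomes no more probable than outcome i.
    It is consistent (pi_i >= p_i) and normalised (max pi = 1)."""
    return [sum(q for q in p if q <= pi) for pi in p]

p = [0.5, 0.3, 0.2]
print(prob_to_poss(p))  # possibilities 1.0, 0.5, 0.2
```

The transformation preserves the ordering of the probabilities while guaranteeing that the most probable outcome is fully possible, which is the consistency property such comparison studies evaluate.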

  8. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.

  9. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
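
The normal-distribution probabilities the article discusses follow directly from the cumulative distribution function, which needs only the error function. A minimal sketch recovering the familiar 68-95-99.7 rule:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Probability mass within 1, 2 and 3 standard deviations of the mean.
for k in (1, 2, 3):
    print(k, round(normal_cdf(k) - normal_cdf(-k), 4))
# 1 -> 0.6827, 2 -> 0.9545, 3 -> 0.9973
```

The same function, applied with a sample mean and standard error in place of mu and sigma, is the building block for the confidence intervals developed later in the series.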

  10. Tools for Bramwell-Holdsworth-Pinton Probability Distribution

    Directory of Open Access Journals (Sweden)

    Mirela Danubianu

    2009-01-01

    Full Text Available This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.

  11. Bimodality in macroscopic dynamics of nuclear fission

    International Nuclear Information System (INIS)

    Bastrukov, S.I.; Salamatin, V.S.; Strteltsova, O.I.; Molodtsova, I.V.; Podgainy, D.V.

    2000-01-01

    The elastodynamic collective model of nuclear fission is outlined, whose underlying idea is that the stiff structure of nuclear shells imparts to the nucleus properties typical of a small piece of an elastic solid. Emphasis is placed on the macroscopic dynamics of nuclear deformations resulting in fission by two energetically different modes. The low-energy S-mode is fission due to disruption of an elongated quadrupole spheroidal shape. The characteristic feature of the high-energy T-mode of division by means of torsional shear deformations is its compact scission configuration. Analytic and numerical estimates for the macroscopic fission-barrier heights are presented, followed by a discussion of fingerprints of the above dynamical bimodality in the available data

  12. Constituent quarks as clusters in quark-gluon-parton model. [Total cross sections, probability distributions]

    Energy Technology Data Exchange (ETDEWEB)

    Kanki, T [Osaka Univ., Toyonaka (Japan). Coll. of General Education

    1976-12-01

    We present a quark-gluon-parton model in which quark-partons and gluons make clusters corresponding to two or three constituent quarks (or anti-quarks) in the meson or in the baryon, respectively. We explicitly construct the constituent quark state (cluster) by employing the Kuti-Weisskopf theory and by requiring scaling. The quark additivity of the hadronic total cross sections and the quark counting rules on the threshold powers of various distributions are satisfied. For small x (Feynman fraction), it is shown that the constituent quarks and quark-partons have quite different probability distributions. We apply our model to hadron-hadron inclusive reactions, and clarify that the fragmentation and diffractive processes relate to the constituent quark distributions, while the processes in or near the central region are controlled by the quark-partons. Our model gives a reasonable interpretation of the experimental data and much improves the usual ''constituent interchange model'' result near and in the central region (x ≈ x_T ≈ 0).

  13. Distribution of sensory taste thresholds for phenylthiocarbamide ...

    African Journals Online (AJOL)

    The ability to taste Phenylthiocarbamide (PTC), a bitter organic compound has been described as a bimodal autosomal trait in both genetic and anthropological studies. This study is based on the ability of a person to taste PTC. The present study reports the threshold distribution of PTC taste sensitivity among some Muslim ...

  14. Concise method for evaluating the probability distribution of the marginal cost of power generation

    International Nuclear Information System (INIS)

    Zhang, S.H.; Li, Y.Z.

    2000-01-01

    In the developing electricity market, many questions on electricity pricing and the risk modelling of forward contracts require the evaluation of the expected value and probability distribution of the short-run marginal cost of power generation at any given time. A concise forecasting method is provided, which is consistent with the definitions of marginal costs and the techniques of probabilistic production costing. The method embodies clear physical concepts, so that it can be easily understood theoretically and computationally realised. A numerical example has been used to test the proposed method. (author)

  15. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    Science.gov (United States)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under-massive regions.

  16. Time shift in slope failure prediction between unimodal and bimodal modeling approaches

    Science.gov (United States)

    Ciervo, Fabio; Casini, Francesca; Nicolina Papa, Maria; Medina, Vicente

    2016-04-01

    Alongside the need for more appropriate mathematical expressions to describe hydro-mechanical soil processes, a challenging issue is accounting for the effects induced by terrain heterogeneities on the physical mechanisms; considering the implications of heterogeneities for time-dependent hydro-mechanical variables would improve the predictive capacity of models such as those used in early warning systems. The presence of heterogeneities in partially saturated slopes results in irregular propagation of the moisture and suction front. To represent mathematically the "dual-implication" generally induced by heterogeneities in the hydraulic behaviour of the terrain, several bimodal hydraulic models have been presented in the literature to replace the conventional sigmoidal/unimodal functions; this presupposes that the scale of the macrostructure is comparable with the local (Darcy) scale, so that the Richards' model can be assumed adequate to reproduce the processes mathematically. The purpose of this work is to examine the differences in simulated flow infiltration processes and slope stability conditions that originate from the preliminary choice of hydraulic model and, contextually, from different approaches to evaluating the factor of safety (FoS). In particular, the results of two approaches are compared. The first combines the conventional expression of the FoS under saturated conditions with the widely used hydraulic model of van Genuchten-Mualem. The second combines a generalized FoS equation for the infinite-slope model under variably saturated soil conditions (Lu and Godt, 2008) with the bimodal functions of Romano et al. (2011) to describe the hydraulic response. The extension of the above-mentioned approach to the bimodal context is based on an analytical method to assess the effects of the hydraulic properties on soil shear, developed by integrating a bimodal lognormal hydraulic function.

  17. The Bi-Modal Organization: Balancing Autopoiesis and Fluid Social Networks for Sustainability

    Science.gov (United States)

    Smith, Peter A. C.; Sharicz, Carol Ann

    2013-01-01

    Purpose: The purpose of this paper is to assist an organization to restructure as a bi-modal organization in order to achieve sustainability in today's highly complex business world. Design/methodology/approach: The paper is conceptual and is based on relevant literature and the authors' research and practice. Findings: Although fluid…

  18. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data, along with some limited field data, were compared with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
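
    The Longuet-Higgins distribution is a Gram-Charlier correction of the Gaussian; a minimal sketch keeping only the leading skewness term (both the truncation to the third Hermite polynomial and the skewness value are our simplifications for illustration):

```python
import numpy as np

def longuet_higgins_pdf(x, skewness=0.2):
    """Gaussian corrected by the third Hermite polynomial H3(x) = x^3 - 3x.
    Only the leading skewness term of the Gram-Charlier series is kept;
    skewness=0.2 is an illustrative value, not taken from the experiments."""
    phi = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
    return phi * (1.0 + (skewness / 6.0) * (x**3 - 3.0 * x))

# By Hermite orthogonality the correction term integrates to zero, so the
# corrected density still integrates to 1.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
area = float(np.sum(longuet_higgins_pdf(x)) * dx)
print(round(area, 4))
```

    The skewness term is what captures the non-Gaussian sharp crests and shallow troughs of wind waves while preserving zero mean and unit variance.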

  19. The correlation of defect distribution in collisional phase with measured cascade collapse probability

    International Nuclear Information System (INIS)

    Morishita, K.; Ishino, S.; Sekimura, N.

    1995-01-01

    The spatial distributions of atomic displacement at the end of the collisional phase of cascade damage processes were calculated using the computer simulation code MARLOWE, which is based on the binary collision approximation (BCA). The densities of atomic displacement were evaluated in high dense regions (HDRs) of cascades in several pure metals (Fe, Ni, Cu, Ag, Au, Mo and W). They were compared with the measured cascade collapse probabilities reported in the literature, where TEM observations were carried out using thin metal foils irradiated by low-dose ions at room temperature. We found that there exist minimum, or ''critical'', values of the atomic displacement density for the HDR to collapse into TEM-visible vacancy clusters. The critical densities are generally independent of the cascade energy in the same metal. Furthermore, the material dependence of the critical densities can be explained by the difference in the vacancy mobility at the melting temperature of the target materials. This critical density calibration, which is extracted from the ion-irradiation experiments and the BCA simulations, is applied to the estimation of cascade collapse probabilities in metals irradiated by fusion neutrons. (orig.)

  20. The trade-off between heat tolerance and metabolic cost drives the bimodal life strategy at the air-water interface

    KAUST Repository

    Fusi, Marco; Cannicci, Stefano; Daffonchio, Daniele; Mostert, Bruce; Pörtner, Hans-Otto; Giomi, Folco

    2016-01-01

    The principle of oxygen and capacity limitation of thermal tolerance in ectotherms suggests that the long-term upper limits of an organism's thermal niche are equivalent to the upper limits of the organism's functional capacity for oxygen provision to tissues. Air-breathing ectotherms show wider thermal tolerances, since they can take advantage of the higher availability of oxygen in air than in water. Bimodal species move from aquatic to aerial media and switch between habitats in response to environmental variations such as cyclical or anomalous temperature fluctuations. Here we tested the prediction that bimodal species cope better with thermal stress than truly aquatic species using the crab Pachygrapsus marmoratus as a model species. When in water, oxygen consumption rates of P. marmoratus acutely rise during warming. Beyond a temperature threshold of 23 °C the crab's aerobic metabolism in air remains lower than in water. In parallel, the haemolymph oxygen partial pressure of submerged animals progressively decreases during warming, while it remains low but constant during emersion. Our results demonstrate the ability of a bimodal breathing ectotherm to extend its thermal tolerance during air-breathing, suggesting that there are temperature-related physiological benefits during the evolution of the bimodal life style.

  2. Bi-Modal Face and Speech Authentication: a BioLogin Demonstration System

    OpenAIRE

    Marcel, Sébastien; Mariéthoz, Johnny; Rodriguez, Yann; Cardinaux, Fabien

    2006-01-01

    This paper presents a bi-modal (face and speech) authentication demonstration system that simulates the login of a user by means of their face and voice. This demonstration is called BioLogin. It runs both on Linux and Windows, and the Windows version is freely available for download. BioLogin is implemented using an open source machine learning library and its machine vision package.

  3. Bimodal MR-PET agent for quantitative pH imaging

    Science.gov (United States)

    Frullano, Luca; Catana, Ciprian; Benner, Thomas; Sherry, A. Dean; Caravan, Peter

    2010-01-01

    Activatable or “smart” magnetic resonance contrast agents have relaxivities that depend on environmental factors such as pH or enzymatic activity, but the MR signal depends on relaxivity and agent concentration – two unknowns. A bimodal approach, incorporating a positron emitter, solves this problem. Simultaneous positron emission tomography (PET) and MR imaging with the bimodal, pH-responsive MR-PET agent GdDOTA-4AMP-F allows direct determination of both concentration (PET) and T1 (MRI), and hence pH. PMID:20191650

  4. On the self-organizing process of large scale shear flows

    Energy Technology Data Exchange (ETDEWEB)

    Newton, Andrew P. L. [Department of Applied Maths, University of Sheffield, Sheffield, Yorkshire S3 7RH (United Kingdom); Kim, Eun-jin [School of Mathematics and Statistics, University of Sheffield, Sheffield, Yorkshire S3 7RH (United Kingdom); Liu, Han-Li [High Altitude Observatory, National Centre for Atmospheric Research, P. O. BOX 3000, Boulder, Colorado 80303-3000 (United States)

    2013-09-15

    Self organization is invoked as a paradigm to explore the processes governing the evolution of shear flows. By examining the probability density function (PDF) of the local flow gradient (shear), we show that shear flows reach a quasi-equilibrium state as the growth of shear is balanced by shear relaxation. Specifically, the PDFs of the local shear are calculated numerically and analytically in reduced 1D and 0D models, where the PDFs are shown to converge to a bimodal distribution in the case of finite correlated temporal forcing. This bimodal PDF is then shown to be reproduced in nonlinear simulations of 2D hydrodynamic turbulence. Furthermore, the bimodal PDF is demonstrated to result from a self-organizing shear flow with linear profile. Similar bimodal structure and linear profile of the shear flow are observed in the Gulf Stream, suggesting self-organization.

  5. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  6. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (greater than about 5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes greater than about 5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
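
    The exponential and mixed-exponential calculations described above can be sketched as follows; the recurrence interval, mixture weights and component means below are illustrative assumptions, not the paper's fitted values:

```python
import math

def annual_prob_exponential(mean_interval_years, dt=1.0):
    """P(at least one event in the next dt years) under an exponential
    (memoryless) interevent model with the given mean recurrence interval."""
    return 1.0 - math.exp(-dt / mean_interval_years)

def conditional_annual_prob(weights, means, t_since_last, dt=1.0):
    """Mixed-exponential interevent model: P(event in next dt | quiet for t)
    = (S(t) - S(t + dt)) / S(t), with survival S(t) = sum_i w_i exp(-t/mu_i).
    Unlike the exponential model, this hazard decreases with elapsed time."""
    S = lambda t: sum(w * math.exp(-t / m) for w, m in zip(weights, means))
    return (S(t_since_last) - S(t_since_last + dt)) / S(t_since_last)

# Illustrative: a ~7,000-year mean interval gives an annual probability on
# the order of the 1.4e-4 quoted for the Lassen Volcanic Center.
p_exp = annual_prob_exponential(7000.0)

# For a mixture, the probability 12,000 years after the last event is lower
# than just after it, as the abstract notes for the regional mafic vents.
w, mu = [0.7, 0.3], [1500.0, 30000.0]
p_now = conditional_annual_prob(w, mu, t_since_last=12000.0)
p_then = conditional_annual_prob(w, mu, t_since_last=0.0)
print(p_exp, p_now < p_then)
```

    The contrast between the two functions is the point: an exponential model is memoryless, while a mixture's hazard declines as the elapsed quiet time makes the long-interval component more likely.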

  7. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similarly to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  8. Processing bimodal stimulus information under alcohol: is there a risk to being redundant?

    Science.gov (United States)

    Fillmore, Mark T

    2010-10-01

    The impairing effects of alcohol are especially pronounced in environments that involve dividing attention across two or more stimuli. However, studies in cognitive psychology have identified circumstances in which the presentation of multiple stimuli can actually facilitate performance. The "redundant signal effect" (RSE) refers to the observation that individuals respond more quickly when information is presented as redundant, bimodal stimuli (e.g., aurally and visually), rather than as a single stimulus presented to either modality alone. The present study tested the hypothesis that the response facilitation attributed to RSE could reduce the degree to which alcohol slows information processing. Two experiments are reported. Experiment 1 demonstrated the validity of a reaction time model of RSE by showing that adults (N = 15) responded more quickly to redundant, bimodal stimuli (visual + aural) versus either stimuli presented individually. Experiment 2 used the RSE model to test the reaction time performance of 20 adults following three alcohol doses (0.0 g/kg, 0.45 g/kg, and 0.65 g/kg). Results showed that alcohol slowed reaction time in a general dose-dependent manner in all three stimulus conditions with the reaction time (RT) speed-advantage of the redundant signal being maintained, even under the highest dose of alcohol. Evidence for an RT advantage to bimodal stimuli under alcohol challenges the general assumption that alcohol impairment is intensified in multistimulus environments. The current study provides a useful model to investigate how drug effects on behavior might be altered in contexts that involve redundant response signals.

  9. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  10. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
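
    An augmented plot of this kind can be sketched as follows. The pointwise Beta-based intervals below are a simplified stand-in for the simultaneous 1-α intervals constructed in the paper, and all function and variable names are ours:

```python
import numpy as np
from scipy import stats

def normal_plot_points_with_intervals(x, alpha=0.05):
    """Ordered sample vs. theoretical normal quantiles, plus pointwise
    (1 - alpha) intervals for each plotted point. The i-th uniform order
    statistic is Beta(i, n - i + 1); mapping its quantiles through a fitted
    normal gives an interval for the i-th ordered observation. Pointwise
    intervals are weaker than the paper's simultaneous ones."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    # Plotting positions and theoretical quantiles of the fitted normal.
    p = (i - 0.5) / n
    mu, sigma = x.mean(), x.std(ddof=1)
    theo = stats.norm.ppf(p, loc=mu, scale=sigma)
    # Pointwise interval for each order statistic.
    lo = stats.norm.ppf(stats.beta.ppf(alpha / 2, i, n - i + 1),
                        loc=mu, scale=sigma)
    hi = stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, i, n - i + 1),
                        loc=mu, scale=sigma)
    return theo, x, lo, hi

rng = np.random.default_rng(0)
theo, ordered, lo, hi = normal_plot_points_with_intervals(rng.normal(size=100))
inside = float(np.mean((ordered >= lo) & (ordered <= hi)))
print(f"fraction of points inside their intervals: {inside:.2f}")
```

    Plotting `ordered` against `theo` with the `lo`/`hi` band gives the augmented normal probability plot; with simultaneous intervals the decision rule would be "reject normality iff any point leaves its interval".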

  11. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    Science.gov (United States)

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  12. The Multivariate Gaussian Probability Distribution

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2005-01-01

    This technical report intends to gather information about the multivariate gaussian distribution, that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical ...

  13. Probability distribution function values in mobile phones

    Directory of Open Access Journals (Sweden)

    Luis Vicente Chamorro Marcillo

    2013-06-01

    Full Text Available Engineering, within its academic and applied forms, as well as any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Management of those tables generally has physical problems (wasteful transport and wasteful consultation) and operational problems (incomplete lists and limited accuracy). The study, "Probability distribution function values in mobile phones", permitted determining, through a needs survey applied to students involved in statistics studies at Universidad de Nariño, that the best known and most used values correspond to the Chi-Square, Binomial, Student's t, and Standard Normal distributions. Similarly, it showed users' interest in having the values in question within an alternative means to correct, at least in part, the problems presented by "the famous tables". To try to contribute to the solution, we built software that allows the values of the most commonly used probability distribution functions to be obtained immediately and dynamically on mobile phones.
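
    The four distributions the survey identified can all be evaluated programmatically; here scipy stands in for the routines such an application would embed (the specific arguments are arbitrary textbook examples, not values from the study):

```python
from scipy import stats

# Standard normal CDF at z = 1.96 (the familiar two-sided 95% point).
print(round(stats.norm.cdf(1.96), 4))             # 0.975
# Chi-square upper 5% critical value with 10 degrees of freedom.
print(round(stats.chi2.ppf(0.95, df=10), 3))      # 18.307
# Student's t two-sided 95% critical value with 20 degrees of freedom.
print(round(stats.t.ppf(0.975, df=20), 3))        # 2.086
# Binomial lower-tail probability P(X <= 3) for n = 10, p = 0.5.
print(round(stats.binom.cdf(3, n=10, p=0.5), 4))  # 0.1719
```

    Each call replaces one lookup in "the famous tables", with none of the interpolation or truncation those tables impose.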

  14. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  15. [Size distributions of aerosol during the Spring Festival in Nanjing].

    Science.gov (United States)

    Wang, Hong-Lei; Zhu, Bin; Shen, Li-Juan; Liu, Xiao-Hui; Zhang, Ze-Feng; Yang, Yang

    2014-02-01

    In order to investigate the impact of firework burning on the spectral distribution of atmospheric aerosol during the Spring Festival in Nanjing, the number concentration and mass concentration of aerosol, as well as the mass concentration of gaseous pollutants, were measured during January 19-31, 2012. The results indicated that the concentration of aerosol between 10-20 nm decreased, while aerosol concentrations in the ranges of 50-100 nm, 100-200 nm and 200-500 nm increased during the firework burning period compared with the non-burning period. However, there was no obvious variation for aerosol between 20-50 nm and 0.5-10 μm. The number concentration spectrum was bimodal during the non-burning period and unimodal during the burning period, with the peak value shifting to the large-diameter section. The mass concentration presented a bimodal distribution; the values of PM2.5/PM10 and PM10/PM10 increased by 10% during the burning period. The firework burning events strongly influenced the density of aerosol between 1.0-2.1 μm.

  16. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists underfitting and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
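
    The central device of the abstract, scoring a trial CDF against uniform order statistics, can be illustrated in a much-simplified form. The r.m.s. score below is our stand-in for the paper's sample-size-invariant scoring function, not its actual definition:

```python
import numpy as np
from scipy import stats

def order_statistic_score(sample, trial_cdf):
    """Push the sorted sample through a trial CDF; if the trial CDF is
    correct, the result behaves like uniform order statistics, whose i-th
    mean is i/(n+1). Score = r.m.s. deviation from those means (smaller
    is better). Illustrative stand-in for the paper's scoring function."""
    u = trial_cdf(np.sort(np.asarray(sample, dtype=float)))
    n = len(u)
    expected = np.arange(1, n + 1) / (n + 1)
    return float(np.sqrt(np.mean((u - expected) ** 2)))

rng = np.random.default_rng(1)
x = rng.normal(size=500)
good = order_statistic_score(x, stats.norm.cdf)                    # correct model
bad = order_statistic_score(x, lambda t: stats.norm.cdf(t, 1, 2))  # wrong model
print(good < bad)  # the correct trial CDF scores lower
```

    Iteratively adjusting the trial CDF to drive such a score down, rather than maximizing a likelihood, is what lets the method resist both underfitting and overfitting.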

  17. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC61400-1. This task is the focus of the present paper, by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, in the case of given mean wind speeds and turbulence levels, is investigated through the scheme of the extreme value distribution instead of the approximate schemes of fitted distributions currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  18. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  19. Approximation of ruin probabilities via Erlangized scale mixtures

    DEFF Research Database (Denmark)

    Peralta, Oscar; Rojas-Nandayapa, Leonardo; Xie, Wangyue

    2018-01-01

In this paper, we extend an existing scheme for numerically calculating the probability of ruin of a classical Cramér–Lundberg reserve process having absolutely continuous but otherwise general claim size distributions. We employ a dense class of distributions that we denominate Erlangized scale...... a simple methodology for constructing a sequence of distributions having the form Π⋆G with the purpose of approximating the integrated tail distribution of the claim sizes. Then we adapt a recent result which delivers an explicit expression for the probability of ruin in the case that the claim size...... distribution is modeled as an Erlangized scale mixture. We provide simplified expressions for the approximation of the probability of ruin and construct explicit bounds for the error of approximation. We complement our results with a classical example where the claim sizes are heavy-tailed....

  20. Polybutadiene latex particle size distribution analysis utilizing a disk centrifuge

    NARCIS (Netherlands)

    Verdurmen, E.M.F.J.; Albers, J.G.; German, A.L.

    1994-01-01

Polybutadiene (I) latexes prepd. by emulsifier-free emulsion polymn. and having particle diam. 50-300 nm for both unimodal and bimodal particle size distributions were analyzed by the line-start (LIST) method in a Brookhaven disk centrifuge photosedimentometer. A special spin fluid was designed to

  1. The probability distribution of side-chain conformations in [Leu] and [Met]enkephalin determines the potency and selectivity to mu and delta opiate receptors

    DEFF Research Database (Denmark)

    Nielsen, Bjørn Gilbert; Jensen, Morten Østergaard; Bohr, Henrik

    2003-01-01

    The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis...... in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity...... of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse....

  2. The probability representation as a new formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Man'ko, Margarita A; Man'ko, Vladimir I

    2012-01-01

    We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.

  3. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2017-01-01

    significantly due to risk aversion. We characterize an approach for eliciting the entire subjective belief distribution that is minimally biased due to risk aversion. We offer simulated examples to demonstrate the intuition of our approach. We also provide theory to formally characterize our framework. And we...... provide experimental evidence which corroborates our theoretical results. We conclude that for empirically plausible levels of risk aversion, one can reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes providing one...

  4. Contralateral Bimodal Stimulation: A Way to Enhance Speech Performance in Arabic-Speaking Cochlear Implant Patients.

    Science.gov (United States)

    Abdeltawwab, Mohamed M; Khater, Ahmed; El-Anwar, Mohammad W

    2016-01-01

The combination of acoustic and electric stimulation as a way to enhance speech recognition performance in cochlear implant (CI) users has generated considerable interest in recent years. The purpose of this study was to evaluate the bimodal advantage of the FS4 speech processing strategy in combination with hearing aids (HA) as a means to improve low-frequency resolution in CI patients. Nineteen postlingual CI adults were selected to participate in this study. All patients wore implants on one side and HA on the contralateral side with residual hearing. Monosyllabic word recognition, speech in noise, and emotion and talker identification were assessed using CI with fine structure processing/FS4 and high-definition continuous interleaved sampling strategies, HA alone, and a combination of CI and HA. The bimodal stimulation showed improvement in speech performance and in emotion identification for the question/statement/order tasks, which was statistically significant compared to CI alone, but there were no statistically significant differences in intragender talker discrimination or in emotion identification for the happy/angry/neutral tasks. The poorest performance was obtained with HA only, and it was statistically significantly worse than the other modalities. The bimodal stimulation showed enhanced speech performance in CI patients, and it mitigates the limitations of electric or acoustic stimulation alone. © 2016 S. Karger AG, Basel.

  5. Influence of nucleon density distribution in nucleon emission probability

    International Nuclear Information System (INIS)

    Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.

    2014-01-01

Different decay modes are observed in heavy ion reactions at low to intermediate energies. It is interesting to study the total neutron emission in these reactions, which may be contributed by all or many of these decay modes. In an attempt to understand the importance of the mean field and the entrance channel angular momentum, we study their influence on the emission probability of nucleons in heavy ion reactions in this work. This study owes its significance to the fact that, once the populations of different states are determined, the emission probability governs the double differential neutron yield.

  6. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  7. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  8. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  9. Thermal significance of fission-track length distributions

    International Nuclear Information System (INIS)

    Crowley, K.D.

    1985-01-01

    The semi-analytical solution of an equation describing the production and shortening of fission tracks in apatite suggests that certain thermal histories have unique length-distribution 'signatures'. Isothermal-heating histories should be characterized by flattened, length-shortened distributions; step-heating histories should be characterized by bimodal track length distributions; and linear-cooling histories should be characterized by negatively skewed, length-shortened distributions. The model formulated here to investigate track length distributions can be used to constrain the thermal histories of natural samples for which unbiased track length data are available - provided that the geologic history of the system of interest can be used to partially constrain one of the unknowns in the model equations, time or temperature. (author)

  10. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are developed, such that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
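    The core probability computation in this setting reduces to a Gaussian tail evaluation. The sketch below (Python; the numbers are hypothetical, and the mapping from damping and natural frequency to the displacement standard deviation is omitted) computes the probability that a zero-mean Gaussian relative displacement stays within a criterion.

```python
from math import erf, sqrt

def prob_within_criterion(sigma, criterion):
    # P(|X| <= criterion) for a zero-mean Gaussian displacement X with
    # standard deviation sigma: erf(criterion / (sigma * sqrt(2))).
    return erf(criterion / (sigma * sqrt(2.0)))

# Hypothetical numbers: an RMS relative displacement of 1 um against a
# 3 um criterion is met with probability ~0.9973 (the familiar 3-sigma rule).
p = prob_within_criterion(1.0, 3.0)
assert 0.997 < p < 0.998
# A 2-sigma margin, by contrast, would exceed the 0.04 target above.
assert 1.0 - prob_within_criterion(1.0, 2.0) > 0.04
```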

  11. Probability Distributions for Cyclone Key Parameters and Cyclonic Wind Speed for the East Coast of Indian Region

    Directory of Open Access Journals (Sweden)

    Pradeep K. Goyal

    2011-09-01

This paper presents a study of the probability distributions of key cyclone parameters and of cyclonic wind speed, based on the cyclone track records obtained from the India Meteorological Department for the east coast region of India. The dataset of historical landfalling storm tracks in India from 1975–2007, with latitude/longitude and landfall locations, is used to map the cyclone tracks in the region of study. Statistical tests were performed to find a best-fit distribution to the track data for each cyclone parameter. These parameters, which include the central pressure difference, the radius of maximum wind speed, the translation velocity, and the track angle with the site, are used to generate digitally simulated cyclones using wind-field simulation techniques. For this, different sets of values for all the cyclone key parameters are generated randomly from their probability distributions. Using these simulated values of the cyclone key parameters, the distribution of wind velocity at a particular site is obtained. The same distribution of wind velocity at the site is also obtained from actual track records and using the distributions of the cyclone key parameters as published in the literature. The simulated distribution is compared with the wind speed distributions obtained from actual track records. The findings are useful in cyclone disaster mitigation.
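    The Monte Carlo step described above can be sketched in miniature: sample a key parameter from its fitted distribution, push each sample through a wind-field relation, and read design quantities off the resulting wind-speed distribution. Everything below is a hypothetical illustration (Python): the lognormal fit, the constant K, and the square-root relation are stand-ins, not the study's fitted models.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10000

# Hypothetical fitted distribution for one key parameter: a lognormal
# central pressure difference (hPa).  In the study, each key parameter
# (pressure difference, radius of maximum wind, translation velocity,
# track angle) gets its own best-fit distribution from the track records.
dp = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)

# Simplified gradient-wind style relation V = K * sqrt(dp); a real
# wind-field simulation would use all key parameters, not just dp.
K = 7.0
v = K * np.sqrt(dp)

v98 = np.percentile(v, 98)  # e.g. a design wind speed at the site
assert v98 > v.mean() > 0.0
```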

  12. Bimodality: A Sign of Critical Behavior in Nuclear Reactions

    International Nuclear Information System (INIS)

    Le Fevre, A.; Aichelin, J.

    2008-01-01

The recently discovered coexistence of multifragmentation and residue production for the same total transverse energy of light charged particles, dubbed bimodality by analogy with the term as introduced in equilibrium thermodynamics, can be well reproduced in numerical simulations of heavy ion reactions. A detailed analysis shows that fluctuations (introduced by elementary nucleon-nucleon collisions) determine which of the exit states is realized. Thus, we can identify bifurcation in heavy ion reactions as a critical phenomenon. The scaling of the coexistence region with beam energy is also well reproduced in these results from the quantum molecular dynamics simulation program.

  13. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  14. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

The relation of the Wigner function with the fair probability distribution called the tomographic distribution, or quantum tomogram, associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner–Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution, both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having the standard form of averaging in probability theory. New uncertainty relations for the position and momentum are written in terms of optical tomograms suitable for a direct experimental check. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.

  15. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    Energy Technology Data Exchange (ETDEWEB)

    O'Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-27

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
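    The flavor of such a calculation can be conveyed with a toy branching-process Monte Carlo (Python). This is not the report's transport code: energy, geometry, and time dependence are all omitted, and the fission multiplicity is fixed rather than sampled, purely for illustration of how moments of the neutron-number PDF are accumulated over histories.

```python
import numpy as np

def neutron_number_moments(p_fission=0.3, nu=2, n_gen=10, n_hist=20000, seed=5):
    # Toy analog Monte Carlo: each neutron in a generation is absorbed
    # with probability 1 - p_fission or induces a fission releasing nu
    # neutrons.  Returns the first two moments of the neutron number
    # after n_gen generations, estimated over n_hist histories.
    rng = np.random.default_rng(seed)
    n = np.ones(n_hist, dtype=np.int64)
    for _ in range(n_gen):
        n = rng.binomial(n, p_fission) * nu
    m1 = n.mean()
    m2 = np.mean(n.astype(float) ** 2)
    return m1, m2

m1, m2 = neutron_number_moments()
# The mean multiplication per generation is k = nu * p_fission = 0.6
# (subcritical), so E[N] after 10 generations is 0.6**10 ~ 0.006.
assert abs(m1 - 0.6 ** 10) < 0.01
assert m2 >= m1  # N is a non-negative integer, so E[N^2] >= E[N]
```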

  16. A probability distribution model of tooth pits for evaluating time-varying mesh stiffness of pitting gears

    Science.gov (United States)

    Lei, Yaguo; Liu, Zongyao; Wang, Delong; Yang, Xiao; Liu, Huan; Lin, Jing

    2018-06-01

Tooth damage often causes a reduction in gear mesh stiffness. Thus time-varying mesh stiffness (TVMS) can be treated as an indication of gear health conditions. This study is devoted to investigating the mesh stiffness variations of a pair of external spur gears with tooth pitting, and proposes a new model for describing tooth pitting based on probability distributions. In the model, considering the appearance and development process of tooth pitting, we model the pitting on the surface of spur gear teeth as a series of pits with a uniform distribution in the direction of tooth width and a normal distribution in the direction of tooth height, respectively. In addition, four pitting degrees, from no pitting to severe pitting, are modeled. Finally, influences of tooth pitting on TVMS are analyzed in detail and the proposed model is validated by comparison with a finite element model. The comparison results show that the proposed model is effective for the TVMS evaluation of pitting gears.
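    The distributional assumption above (uniform across the tooth width, normal across the tooth height) is easy to sketch. The snippet below (Python) samples hypothetical pit-centre coordinates under that model; all parameter names and values are illustrative, not taken from the paper.

```python
import numpy as np

def sample_pits(n_pits, tooth_width, mean_height, height_sigma, seed=0):
    # Pit centres per the proposed model: uniform along the tooth width,
    # normal along the tooth height.  All parameter names are illustrative.
    rng = np.random.default_rng(seed)
    w = rng.uniform(0.0, tooth_width, n_pits)
    h = rng.normal(mean_height, height_sigma, n_pits)
    return w, h

w, h = sample_pits(500, tooth_width=20.0, mean_height=5.0, height_sigma=0.8)
assert (w >= 0.0).all() and (w <= 20.0).all()
assert abs(h.mean() - 5.0) < 0.2
```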

  17. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
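    The coin-tossing claim can be checked with a minimal worked example (Python, using exact fractions). The two-point distribution for the "definitive number" is a hypothetical choice for illustration; the arithmetic shows both that the assigned probability is the mean of that distribution and that observing a head raises it.

```python
from fractions import Fraction

# A discrete stand-in for the "definitive number": the coin's long-run
# frequency of heads is either 0.4 or 0.6, each with probability 1/2.
ps = [Fraction(2, 5), Fraction(3, 5)]
weights = [Fraction(1, 2), Fraction(1, 2)]

# The probability assigned to heads is the MEAN of this distribution.
p_heads = sum(w * p for w, p in zip(weights, ps))
assert p_heads == Fraction(1, 2)

# Observe one head: reweight each candidate by its likelihood (Bayes' rule).
post = [w * p for w, p in zip(weights, ps)]
norm = sum(post)
p_heads_after = sum(wp * p for wp, p in zip(post, ps)) / norm
assert p_heads_after == Fraction(13, 25)  # 0.52 > 0.5
assert p_heads_after > p_heads            # a head increases P(heads)
```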

  18. Effects of Removing Low-Frequency Electric Information on Speech Perception with Bimodal Hearing

    Science.gov (United States)

    Fowler, Jennifer R.; Eggleston, Jessica L.; Reavis, Kelly M.; McMillan, Garnett P.; Reiss, Lina A. J.

    2016-01-01

    Purpose: The objective was to determine whether speech perception could be improved for bimodal listeners (those using a cochlear implant [CI] in one ear and hearing aid in the contralateral ear) by removing low-frequency information provided by the CI, thereby reducing acoustic-electric overlap. Method: Subjects were adult CI subjects with at…

  19. Inverse estimation of the particle size distribution using the Fruit Fly Optimization Algorithm

    International Nuclear Information System (INIS)

    He, Zhenzong; Qi, Hong; Yao, Yuchen; Ruan, Liming

    2015-01-01

The Fruit Fly Optimization Algorithm (FOA) is applied to retrieve the particle size distribution (PSD) for the first time. The direct problems are solved by the modified Anomalous Diffraction Approximation (ADA) and the Beer–Lambert law. First, three commonly used monomodal PSDs, i.e. the Rosin–Rammler (R–R) distribution, the normal (N–N) distribution and the logarithmic normal (L–N) distribution, and the bimodal Rosin–Rammler distribution function are estimated in the dependent model. All the results show that the FOA can be used as an effective technique to estimate the PSDs under the dependent model. Then, an optimal wavelength selection technique is proposed to improve the retrieval results for the bimodal PSD. Finally, combined with two general functions, i.e. the Johnson's S_B (J-S_B) function and the modified beta (M-β) function, the FOA is employed to recover actual measured aerosol PSDs over Beijing and Hangzhou obtained from the aerosol robotic network (AERONET). All the numerical simulations and experimental results demonstrate that the FOA can be used to retrieve actual measured PSDs, and that more reliable and accurate results can be obtained if the J-S_B function is employed.
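    The structure of such an inverse problem can be sketched compactly: a forward model maps PSD parameters to spectral extinction, and an optimizer searches parameter space to match measured extinction. The Python below is a deliberately crude stand-in on both counts: the forward model is a toy (not ADA + Beer–Lambert) and the greedy random search only mimics the FOA idea of scattering candidates around the current best point.

```python
import numpy as np

def extinction(params, wavelengths):
    # Toy forward model standing in for ADA + Beer-Lambert: spectral
    # extinction from a lognormal PSD (median diameter, geometric sigma).
    d_med, sig = params
    d = np.linspace(0.1, 10.0, 200)                    # diameters, um
    psd = np.exp(-0.5 * (np.log(d / d_med) / np.log(sig)) ** 2)
    psd /= psd.sum()
    q = 2.0 - np.exp(-np.outer(1.0 / wavelengths, d))  # crude efficiency
    return q @ (psd * d ** 2)

wl = np.array([0.4, 0.6, 0.8, 1.0])    # wavelengths, um
meas = extinction((1.0, 1.6), wl)      # synthetic "measurement"

# Greedy random search around the current best point, a minimal stand-in
# for the swarm of candidate locations that FOA evaluates each iteration.
rng = np.random.default_rng(11)
guess = np.array([3.0, 2.0])
err0 = best_err = float(np.sum((extinction(guess, wl) - meas) ** 2))
for _ in range(2000):
    cand = guess + rng.normal(0.0, 0.1, size=2)
    if cand[0] <= 0.1 or cand[1] <= 1.05:
        continue
    err = float(np.sum((extinction(cand, wl) - meas) ** 2))
    if err < best_err:
        guess, best_err = cand, err

assert best_err <= err0  # the search never accepts a worse fit
```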

  20. Probability distribution functions for ELM bursts in a series of JET tokamak discharges

    International Nuclear Information System (INIS)

    Greenhough, J; Chapman, S C; Dendy, R O; Ward, D J

    2003-01-01

    A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence the statistical technique is found to be difficult to apply to low frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour
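    The PDF construction and model comparison described above can be sketched on synthetic data (Python). The burst times, bin count, and mean separation are all hypothetical; the point is only that a Poisson process implies exponentially distributed time separations, whose maximum-likelihood scale is the sample mean.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic burst times: a Poisson process has exponential time separations,
# one of the model distributions quantified above.  scale = mean separation.
t = np.cumsum(rng.exponential(scale=2.0e-3, size=3000))
dt = np.diff(t)

# Empirical PDF of time separations, as constructed from the ELM signal.
counts, edges = np.histogram(dt, bins=40, density=True)
assert abs(np.sum(counts * np.diff(edges)) - 1.0) < 1e-9  # normalized PDF

# Maximum-likelihood fit of the inverse-exponential model
# f(dt) = (1/tau) exp(-dt/tau): tau is simply the sample mean.
tau = dt.mean()
assert abs(tau - 2.0e-3) < 3e-4
```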

  1. Phenol-formaldehyde carbon with ordered/disordered bimodal mesoporous structure as high-performance electrode materials for supercapacitors

    Science.gov (United States)

    Cai, Tingwei; Zhou, Min; Han, Guangshuai; Guan, Shiyou

    2013-11-01

A novel phenol-formaldehyde carbon with an ordered/disordered bimodal mesoporous structure is synthesized by a facile evaporation-induced self-assembly strategy under basic aqueous conditions with SiO2 particles as template. The prepared bimodal mesoporous carbons (BMCs) are composed of ordered and disordered mesopores with diameters of about 3.5 nm and 7.0 nm, respectively. They can be employed as supercapacitor electrodes in H2SO4 aqueous electrolyte after a simple acid treatment. BMC exhibits an exceptional specific capacitance of 344 F g-1 at a current density of 0.1 A g-1, although it has a relatively low surface area of 722 m2 g-1. The BMC electrode also displays excellent cycling stability over 10,000 cycles.

  2. THE DEAD SEQUENCE: A CLEAR BIMODALITY IN GALAXY COLORS FROM z = 0 to z = 2.5

    International Nuclear Information System (INIS)

    Brammer, G. B.; Whitaker, K. E.; Van Dokkum, P. G.; Marchesini, D.; Lee, K.-S.; Muzzin, A.; Labbe, I.; Franx, M.; Quadri, R. F.; Kriek, M.; Illingworth, G.; Rudnick, G.

    2009-01-01

We select 25,000 galaxies from the NEWFIRM Medium Band Survey (NMBS) to study the rest-frame U - V color distribution of galaxies at 0 < z ≲ 2.5. The five unique NIR filters of the NMBS enable the precise measurement of photometric redshifts and rest-frame colors for 9900 galaxies at 1 < z < 2.5. The rest-frame U - V color distribution at all z ≲ 2.5 is bimodal, with a red peak, a blue peak, and a population of galaxies in between (the green valley). Model fits to the optical-NIR spectral energy distributions and the distribution of MIPS-detected galaxies indicate that the colors of galaxies in the green valley are determined largely by the amount of reddening by dust. This result does not support the simplest interpretation of green valley objects as a transition from blue star-forming to red quiescent galaxies. We show that correcting the rest-frame colors for dust reddening allows a remarkably clean separation between the red and blue sequences up to z ∼ 2.5. Our study confirms that dusty starburst galaxies can contribute a significant fraction to red-sequence samples selected on the basis of a single rest-frame color (i.e., U - V), so extra care must be taken if samples of truly 'red and dead' galaxies are desired. Interestingly, of galaxies detected at 24 μm, 14% remain on the red sequence after applying the reddening correction.

  3. Calculating the Prior Probability Distribution for a Causal Network Using Maximum Entropy: Alternative Approaches

    Directory of Open Access Journals (Sweden)

    Michael J. Markham

    2011-07-01

Some problems occurring in expert systems can be resolved by employing a causal (Bayesian) network, and methodologies exist for this purpose. These require data in a specific form and make assumptions about the independence relationships involved. Methodologies using Maximum Entropy (ME) are free from these conditions and have the potential to be used in a wider context, including systems consisting of given sets of linear and independence constraints, subject to consistency and convergence. ME can also be used to validate results from the causal network methodologies. Three ME methods for determining the prior probability distribution of causal network systems are considered. The first method is Sequential Maximum Entropy, in which the computation of a progression of local distributions leads to the overall distribution. This is followed by development of the Method of Tribus. The development takes the form of an algorithm that includes the handling of explicit independence constraints. These fall into two groups: those relating parents of vertices, and those deduced from triangulation of the remaining graph. The third method involves a variation in the part of that algorithm which handles independence constraints. Evidence is presented that this adaptation only requires the linear constraints and the parental independence constraints to emulate the second method in a substantial class of examples.

  4. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving

  5. Assignment of probability distributions for parameters in the 1996 performance assessment for the Waste Isolation Pilot Plant. Part 1: description of process

    International Nuclear Information System (INIS)

    Rechard, Rob P.; Tierney, Martin S.

    2005-01-01

A managed process was used to consistently and traceably develop probability distributions for parameters representing epistemic uncertainty in four preliminary performance assessments and the final 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The key to the success of the process was the use of a three-member team consisting of a Parameter Task Leader, PA Analyst, and Subject Matter Expert. This team, in turn, relied upon a series of guidelines for selecting distribution types. The primary function of the guidelines was not to constrain the actual process of developing a parameter distribution but rather to establish a series of well-defined steps where recognized methods would be consistently applied to all parameters. An important guideline was to use a small set of distributions satisfying the maximum entropy formalism. Another important guideline was the consistent use of the log transform for parameters with large ranges (i.e. maximum/minimum > 10^3). A parameter development team assigned 67 probability density functions (PDFs) in the 1989 PA and 236 PDFs in the 1996 PA using these and other guidelines described.
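    The log-transform guideline can be illustrated directly (Python; the six-decade range is a hypothetical example). Sampling the exponent uniformly, the maximum-entropy choice on the log-transformed interval, spreads probability mass evenly across decades instead of concentrating nearly all of it near the upper end of the range.

```python
import numpy as np

rng = np.random.default_rng(3)
lo, hi = 1.0e-6, 1.0   # range spans 6 decades, i.e. maximum/minimum > 10^3

# Log-transform guideline: sample the exponent uniformly, which is the
# maximum-entropy choice on [log10(lo), log10(hi)].
x = 10.0 ** rng.uniform(np.log10(lo), np.log10(hi), size=100_000)

# Each decade then carries (close to) equal probability mass, 1/6 here.
frac = np.mean((x >= 1.0e-3) & (x < 1.0e-2))
assert abs(frac - 1.0 / 6.0) < 0.01
assert lo <= x.min() and x.max() <= hi
```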

  6. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any additive RUM (ARUM) may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
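For the simplest RUM, the multinomial logit, the CPGF is the log-sum-exp function and its gradient recovers the choice probabilities; a minimal sketch of that relationship:

```python
import math

def logsumexp(v):
    """Numerically stable log-sum-exp: the multinomial-logit CPGF G(v)."""
    m = max(v)
    return m + math.log(sum(math.exp(x - m) for x in v))

def choice_probabilities(v):
    """Gradient of G(v) = log sum_j exp(v_j): dG/dv_i = exp(v_i - G),
    which is exactly the logit choice probability of alternative i."""
    g = logsumexp(v)
    return [math.exp(x - g) for x in v]
```

For example, `choice_probabilities([1.0, 2.0, 3.0])` returns probabilities that sum to one and increase with utility; equal utilities give equal shares.
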

  8. Affective and physiological correlates of the perception of unimodal and bimodal emotional stimuli.

    Science.gov (United States)

    Rosa, Pedro J; Oliveira, Jorge; Alghazzawi, Daniyal; Fardoun, Habib; Gamito, Pedro

    2017-08-01

    Despite the multisensory nature of perception, previous research on emotions has focused on unimodal emotional cues, mainly visual stimuli. To the best of our knowledge, there is no evidence on the extent to which incongruent emotional cues from visual and auditory sensory channels affect pupil size. The aims of this study were to investigate the effects of audiovisual emotional information on physiological and affective responses, and to determine the impact of mismatched cues on these physiological indexes. Pupil size, electrodermal activity and subjective affective responses were recorded while 30 participants were exposed to visual and auditory stimuli with varied emotional content in three experimental conditions: pictures and sounds presented alone (unimodal), emotionally matched audio-visual stimuli (bimodal congruent) and emotionally mismatched audio-visual stimuli (bimodal incongruent). The data revealed no effect of emotional incongruence on physiological or affective responses. On the other hand, pupil size covaried with the skin conductance response (SCR), but the subjective experience was partially dissociated from autonomic responses. Emotional stimuli are thus able to trigger physiological responses regardless of valence, sensory modality or level of emotional congruence.

  9. Human fatigue expression recognition through image-based dynamic multi-information and bimodal deep learning

    Science.gov (United States)

    Zhao, Lei; Wang, Zengcai; Wang, Xiaojin; Qi, Yazhou; Liu, Qing; Zhang, Guoxin

    2016-09-01

    Human fatigue is an important cause of traffic accidents. To improve transportation safety, we propose in this paper a framework for fatigue expression recognition using image-based facial dynamic multi-information and a bimodal deep neural network. First, facial landmarks and the texture of the eye region, which complement each other in fatigue expression recognition, are extracted from facial image sequences captured by a single camera. Then, two stacked autoencoder neural networks are trained, one for landmarks and one for texture. Finally, the two trained networks are combined by learning a joint layer on top of them to construct a bimodal deep neural network. The model can be used to extract a unified representation that fuses the landmark and texture modalities and to classify fatigue expressions accurately. The proposed system is tested on a human fatigue dataset obtained from an actual driving environment. The experimental results demonstrate that the proposed method performs stably and robustly, with an average accuracy of 96.2%.

  10. Pattern recognition in spaces of probability distributions for the analysis of edge-localized modes in tokamak plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Shabbir, Aqsa

    2016-07-07

    In this doctoral work, pattern recognition techniques are developed and applied to data from tokamak plasmas, in order to contribute to a systematic analysis of edge-localized modes (ELMs). We employ probabilistic models for a quantitative data description geared towards an enhanced systematization of ELM phenomenology. Hence, we start from the point of view that the fundamental object resulting from the observation of a system is a probability distribution, with every single measurement providing a sample from this distribution. In exploring the patterns emerging from the various ELM regimes and relations, we need methods that can handle the intrinsic probabilistic nature of the data. The original contributions of this work are twofold. First, several novel pattern recognition methods in non-Euclidean spaces of probability distribution functions (PDFs) are developed and validated. The second main contribution lies in the application of these and other techniques to a systematic analysis of ELMs in tokamak plasmas. In regard to the methodological aims of the work, we employ the framework of information geometry to develop pattern visualization and classification methods in spaces of probability distributions. In information geometry, a family of probability distributions is considered as a Riemannian manifold. Every point on the manifold represents a single PDF and the distribution parameters provide local coordinates on the manifold. The Fisher information plays the role of a Riemannian metric tensor, enabling calculation of geodesic curves on the surface. The length of such curves yields the geodesic distance (GD) on probabilistic manifolds, which is a natural similarity (distance) measure between PDFs. Equipped with a suitable distance measure, we extrapolate several distance-based pattern recognition methods to the manifold setting. This includes k-nearest neighbor (kNN) and conformal predictor (CP) methods for classification, as well as multidimensional
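As a concrete instance of the geodesic distance (GD) described above: for univariate Gaussians the Fisher metric makes the manifold hyperbolic, and the GD has a closed form. The sketch below implements that standard result (it is not code from the thesis):

```python
import math

def fisher_rao_normal(mu1, s1, mu2, s2):
    """Fisher-Rao geodesic distance between N(mu1, s1^2) and N(mu2, s2^2).

    The Fisher metric for the normal family is ds^2 = dmu^2/s^2 + 2 ds^2/s^2;
    with u = mu/sqrt(2) this is sqrt(2) times the Poincare half-plane metric,
    so the geodesic distance follows from the half-plane distance formula.
    """
    num = (mu1 - mu2) ** 2 / 2.0 + (s1 - s2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * s1 * s2))
```

A distance of zero between identical distributions and symmetry in the two arguments follow directly; distances like this can then feed the kNN and conformal-predictor classifiers mentioned above.
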

  11. Pattern recognition in spaces of probability distributions for the analysis of edge-localized modes in tokamak plasmas

    International Nuclear Information System (INIS)

    Shabbir, Aqsa

    2016-01-01

    In this doctoral work, pattern recognition techniques are developed and applied to data from tokamak plasmas, in order to contribute to a systematic analysis of edge-localized modes (ELMs). We employ probabilistic models for a quantitative data description geared towards an enhanced systematization of ELM phenomenology. Hence, we start from the point of view that the fundamental object resulting from the observation of a system is a probability distribution, with every single measurement providing a sample from this distribution. In exploring the patterns emerging from the various ELM regimes and relations, we need methods that can handle the intrinsic probabilistic nature of the data. The original contributions of this work are twofold. First, several novel pattern recognition methods in non-Euclidean spaces of probability distribution functions (PDFs) are developed and validated. The second main contribution lies in the application of these and other techniques to a systematic analysis of ELMs in tokamak plasmas. In regard to the methodological aims of the work, we employ the framework of information geometry to develop pattern visualization and classification methods in spaces of probability distributions. In information geometry, a family of probability distributions is considered as a Riemannian manifold. Every point on the manifold represents a single PDF and the distribution parameters provide local coordinates on the manifold. The Fisher information plays the role of a Riemannian metric tensor, enabling calculation of geodesic curves on the surface. The length of such curves yields the geodesic distance (GD) on probabilistic manifolds, which is a natural similarity (distance) measure between PDFs. Equipped with a suitable distance measure, we extrapolate several distance-based pattern recognition methods to the manifold setting. This includes k-nearest neighbor (kNN) and conformal predictor (CP) methods for classification, as well as multidimensional

  12. Investigation of Probability Distributions Using Dice Rolling Simulation

    Science.gov (United States)

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…
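A dice-rolling simulation of the kind the article recommends can be sketched in a few lines (illustrative only; the NCTM materials do not prescribe any particular implementation):

```python
import random
from collections import Counter

def simulate_two_dice(n_rolls, rng=None):
    """Empirical probability distribution of the sum of two fair dice."""
    rng = rng or random.Random(42)
    counts = Counter(rng.randint(1, 6) + rng.randint(1, 6) for _ in range(n_rolls))
    return {s: counts[s] / n_rolls for s in range(2, 13)}

freqs = simulate_two_dice(100_000)
# Theory: P(sum = 7) = 6/36 ~ 0.1667; the simulation should land nearby,
# letting students compare empirical frequencies with the exact distribution.
```
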

  13. Performance Probability Distributions for Sediment Control Best Management Practices

    Science.gov (United States)

    Ferrell, L.; Beighley, R.; Walsh, K.

    2007-12-01

    Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly-used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site, performance expectations are not always achievable in the field. Based on experimental results, product performance is shown to be highly variable. Experiments using the same installation procedures show inconsistent sediment removal performances ranging from greater than 85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, less than optimal installations, and no-treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. The results are then combined with

  14. A discussion on the origin of quantum probabilities

    International Nuclear Information System (INIS)

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-01

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases

  15. The role of the bimodal distribution of ultra-fine silicon phase and nano-scale V-phase (AlSi2Sc2) on spark plasma sintered hypereutectic Al–Si–Sc alloys

    International Nuclear Information System (INIS)

    Raghukiran, Nadimpalli; Kumar, Ravi

    2016-01-01

    Hypereutectic Al–Si and Al–Si–Sc alloys were spark plasma sintered from corresponding gas-atomized powders. The microstructures of the Al–Si and Al–Si–Sc alloys possessed remarkably refined silicon particles in the size ranges of 0.38–3.5 µm and 0.35–1.16 µm respectively, in contrast to the silicon particles of size greater than 100 µm typically found in conventionally cast alloys. All the sintered alloys exhibited significant ductility, as high as 85% compressive strain without failure, even in the presence of a relatively high weight fraction of the brittle silicon phase. Moreover, the Al–Si–Sc alloys showed appreciable improvement in compressive strength over their binary counterparts due to the presence of the intermetallic compound AlSi2Sc2, of size 10–20 nm, distributed uniformly in the matrix of those alloys. Dry sliding pin-on-disc wear tests showed improvement in the wear performance of the sintered alloys with increasing silicon content. Further, the Al–Si–Sc ternary alloys with lower silicon content exhibited appreciable improvement in wear resistance over their binary counterparts. The Al–Si–Sc alloys with a bimodal distribution of strengthening phases, consisting of ultra-fine (sub-micron) silicon particles and nano-scale AlSi2Sc2, improved the strength and wear properties of the alloys while retaining significant ductility.

  16. Wave functions and two-electron probability distributions of the Hooke's-law atom and helium

    International Nuclear Information System (INIS)

    O'Neill, Darragh P.; Gill, Peter M. W.

    2003-01-01

    The Hooke's-law atom (hookium) provides an exactly soluble model for a two-electron atom in which the nuclear-electron Coulombic attraction has been replaced by a harmonic one. Starting from the known exact position-space wave function for the ground state of hookium, we present the momentum-space wave function. We also look at the intracules, two-electron probability distributions, for hookium in position, momentum, and phase space. These are compared with the Hartree-Fock results and the Coulomb holes (the difference between the exact and Hartree-Fock intracules) in position, momentum, and phase space are examined. We then compare these results with analogous results for the ground state of helium using a simple, explicitly correlated wave function
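For orientation, the exact position-space ground state referred to above is known in closed form for one particular confinement strength (force constant k = 1/4 in atomic units, Taut's solution):

```latex
\psi(\mathbf{r}_1,\mathbf{r}_2)\;\propto\;\Bigl(1+\tfrac{1}{2}\,r_{12}\Bigr)\,
e^{-\left(r_1^{2}+r_2^{2}\right)/4},
\qquad r_{12}=\lvert\mathbf{r}_1-\mathbf{r}_2\rvert .
```

The correlation factor (1 + r₁₂/2) is what opens the Coulomb hole relative to the Hartree-Fock intracule discussed in the abstract.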

  17. Improvement of and Parameter Identification for the Bimodal Time-Varying Modified Kanai-Tajimi Power Spectral Model

    Directory of Open Access Journals (Sweden)

    Huiguo Chen

    2017-01-01

    Based on the Kanai-Tajimi power spectrum filtering method proposed by Du Xiuli et al., a genetic algorithm and a quadratic optimization identification technique are employed to improve the bimodal time-varying modified Kanai-Tajimi power spectral model and the parameter identification method proposed by Vlachos et al. Additionally, a method for modeling time-varying power spectrum parameters for ground motion is proposed. The 8244 Orion and Chi-Chi earthquake accelerograms are selected as examples for time-varying power spectral model parameter identification and ground motion simulations to verify the feasibility and effectiveness of the improved bimodal time-varying modified Kanai-Tajimi power spectral model. The results of this study provide important references for designing ground motion inputs for seismic analyses of major engineering structures.
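The stationary Kanai-Tajimi spectrum that the time-varying model generalizes has a standard closed form; a sketch (the parameter values used below are illustrative, not those identified in the paper):

```python
def kanai_tajimi_psd(omega, omega_g, zeta_g, s0):
    """Stationary Kanai-Tajimi power spectral density:

        S(w) = S0 * (wg^4 + 4 zg^2 wg^2 w^2)
               / ((wg^2 - w^2)^2 + 4 zg^2 wg^2 w^2)

    where wg, zg are the ground filter frequency and damping and S0 the
    bedrock white-noise intensity. The bimodal time-varying model in the
    paper lets such parameters evolve with time.
    """
    num = omega_g**4 + 4.0 * zeta_g**2 * omega_g**2 * omega**2
    den = (omega_g**2 - omega**2) ** 2 + 4.0 * zeta_g**2 * omega_g**2 * omega**2
    return s0 * num / den

# Illustrative firm-soil values: wg = 15 rad/s, zg = 0.6.
spectrum = [kanai_tajimi_psd(w / 10.0, 15.0, 0.6, 1.0) for w in range(0, 400)]
```

At ω = 0 the filter passes S0 unchanged, and the spectrum peaks near ω_g, which is the single-mode behaviour the bimodal extension is designed to overcome.
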

  18. A Novel Method of Extraction of Blend Component Structure from SANS Measurements of Homopolymer Bimodal Blends.

    Science.gov (United States)

    Smerdova, Olga; Graham, Richard S; Gasser, Urs; Hutchings, Lian R; De Focatiis, Davide S A

    2014-05-01

    A new method is presented for the extraction of single-chain form factors and interchain interference functions from a range of small-angle neutron scattering (SANS) experiments on bimodal homopolymer blends. The method requires a minimum of three blends, made up of hydrogenated and deuterated components with matched degree of polymerization at two different chain lengths, but with carefully varying deuteration levels. The method is validated through an experimental study on polystyrene homopolymer bimodal blends with [Formula: see text]. By fitting Debye functions to the structure factors, it is shown that there is good agreement between the molar mass of the components obtained from SANS and from chromatography. The extraction method also enables, for the first time, interchain scattering functions to be produced for scattering between chains of different lengths. [Formula: see text].
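Fitting Debye functions to the extracted structure factors relies on the standard form factor of an ideal Gaussian chain; a sketch of that function follows (the blend-decomposition algebra itself is specific to the paper and is not reproduced):

```python
import math

def debye(x):
    """Debye scattering function for an ideal Gaussian chain,
    with x = (q * Rg)^2:  D(x) = 2 (exp(-x) + x - 1) / x^2.

    A series expansion is used for small x to avoid catastrophic
    cancellation in floating point.
    """
    if x < 1e-4:
        return 1.0 - x / 3.0          # D(x) = 1 - x/3 + O(x^2)
    return 2.0 * (math.exp(-x) + x - 1.0) / x**2
```

D(0) = 1 and D(x) → 2/x at large x; fitting this one-parameter shape (via Rg) to a measured single-chain structure factor yields the molar mass comparison described above.
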

  19. Small Low Mass Advanced PBR's for Bi-Modal Operation

    Science.gov (United States)

    Ludewig, Hans; Todosow, Michael; Powell, James R.

    1994-07-01

    A preliminary assessment is made of a low mass bi-modal reactor for use as a propulsion unit and as a heat source for generating electricity. This reactor is based on the particle bed reactor (PBR) concept. It will be able to generate both thrust and electricity simultaneously. This assessment indicates that the reactor can generate approximately 6.8 × 10⁴ N of thrust using hydrogen as a coolant, and 100 kWe using a closed Brayton cycle (CBC) power conversion system. Two cooling paths pass through the reactor, allowing simultaneous operation of both modes. The development of all the components for this reactor is within the experience base of the NTP project.

  20. Strong bimodality in the host halo mass of central galaxies from galaxy-galaxy lensing

    Science.gov (United States)

    Mandelbaum, Rachel; Wang, Wenting; Zu, Ying; White, Simon; Henriques, Bruno; More, Surhud

    2016-04-01

    We use galaxy-galaxy lensing to study the dark matter haloes surrounding a sample of locally brightest galaxies (LBGs) selected from the Sloan Digital Sky Survey. We measure mean halo mass as a function of the stellar mass and colour of the central galaxy. Mock catalogues constructed from semi-analytic galaxy formation simulations demonstrate that most LBGs are the central objects of their haloes, greatly reducing interpretation uncertainties due to satellite contributions to the lensing signal. Over the full stellar mass range probed (log₁₀(M*/M⊙) ≈ 10.3 and above), halo mass at fixed stellar mass depends strongly on colour: passive central galaxies occupy substantially more massive haloes than star-forming centrals of the same stellar mass, with the effect most significant for log₁₀(M*/M⊙) > 10.7. Tests using the mock catalogues and on the data themselves clarify the effects of LBG selection and show that it cannot artificially induce a systematic dependence of halo mass on LBG colour. The bimodality in halo mass at fixed stellar mass is reproduced by the astrophysical model underlying our mock catalogue, but the sign of the effect is inconsistent with recent, nearly parameter-free age-matching models. The sign and magnitude of the effect can, however, be reproduced by halo occupation distribution models with a simple (few-parameter) prescription for type dependence.

  1. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    For the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known.

  2. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    Science.gov (United States)

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the longterm fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome. Copyright © 2015 Elsevier Inc. All rights reserved.
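For orientation, the classical linear birth-death process (without the diversification and HGT terms the paper adds) has a simple closed-form extinction probability; a sketch:

```python
def extinction_probability(birth, death, n0=1):
    """Ultimate extinction probability of a linear birth-death process
    started from n0 individuals.

    Each lineage independently goes extinct with probability death/birth
    when birth > death; a subcritical or critical process (birth <= death)
    goes extinct with certainty.
    """
    if birth <= death:
        return 1.0
    return (death / birth) ** n0
```

Even a supercritical MGE family founded by a single element faces a substantial extinction risk (e.g. 0.5 when births are twice as frequent as deaths), consistent with the paper's finding that new MP families have a high extinction probability; diversification and HGT modify these rates.
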

  3. Optimal design of unit hydrographs using probability distribution and ...

    Indian Academy of Sciences (India)


    The optimization formulation is solved using binary-coded genetic algorithms. Keywords: unit hydrograph; rainfall-runoff; hydrology; genetic algorithms; optimization; probability. Application of the model: data derived from the ...

  4. Distribution of binder in granules produced by means of twin screw granulation

    DEFF Research Database (Denmark)

    Fonteyne, Margot; Fussell, Andrew Luke; Vercruysse, Jurgen

    2014-01-01

    According to the quality by design principle processes may not remain black-boxes and full process understanding is required. The granule size distribution of granules produced via twin screw granulation is often found to be bimodal. The aim of this study was to gain a better understanding...

  5. The Bayesian count rate probability distribution in measurement of ionizing radiation by use of a ratemeter

    Energy Technology Data Exchange (ETDEWEB)

    Weise, K.

    2004-06-01

    Recent metrological developments concerning measurement uncertainty, founded on Bayesian statistics, give rise to a revision of several parts of the DIN 25482 and ISO 11929 standard series. These series stipulate detection limits and decision thresholds for ionizing-radiation measurements. Parts 3 and 4 of these series deal with measurements made using linear-scale analogue ratemeters. A normal frequency distribution of the momentary ratemeter indication for a fixed count rate value is assumed. The actual distribution, which is first calculated numerically by solving an integral equation, differs considerably from the normal distribution, although the latter approximates it for sufficiently large values of the count rate to be measured. As is shown, this similarly holds true for the Bayesian probability distribution of the count rate for sufficiently large given measured values indicated by the ratemeter. This distribution follows from the first one mentioned by means of the Bayes theorem. Its expectation value and variance are needed for the standards to be revised on the basis of Bayesian statistics. Simple expressions are given by the present standards for estimating these parameters and for calculating the detection limit and the decision threshold. As is also shown, the same expressions can similarly be used as sufficient approximations by the revised standards if, roughly, the present indicated value exceeds the reciprocal ratemeter relaxation time constant. (orig.)

  6. Core-powered mass-loss and the radius distribution of small exoplanets

    Science.gov (United States)

    Ginzburg, Sivan; Schlichting, Hilke E.; Sari, Re'em

    2018-05-01

    Recent observations identify a valley in the radius distribution of small exoplanets, with planets in the range 1.5-2.0 R⊕ significantly less common than somewhat smaller or larger planets. This valley may suggest a bimodal population of rocky planets that are either engulfed by massive gas envelopes that significantly enlarge their radius, or do not have detectable atmospheres at all. One explanation of such a bimodal distribution is atmospheric erosion by high-energy stellar photons. We investigate an alternative mechanism: the luminosity of the cooling rocky core, which can completely erode light envelopes while preserving heavy ones, produces a deficit of intermediate sized planets. We evolve planetary populations that are derived from observations using a simple analytical prescription, accounting self-consistently for envelope accretion, cooling and mass-loss, and demonstrate that core-powered mass-loss naturally reproduces the observed radius distribution, regardless of the high-energy incident flux. Observations of planets around different stellar types may distinguish between photoevaporation, which is powered by the high-energy tail of the stellar radiation, and core-powered mass-loss, which depends on the bolometric flux through the planet's equilibrium temperature that sets both its cooling and mass-loss rates.

  7. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
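A CTRW trajectory is straightforward to simulate; the sketch below uses exponential waiting times (which recover ordinary diffusion), whereas the anomalous processes treated in the paper arise from heavy-tailed waiting time distributions:

```python
import random

def simulate_ctrw(t_max, rate=1.0, step=1.0, rng=None):
    """One trajectory of a simple CTRW up to time t_max: i.i.d. exponential
    waiting times (parameter `rate`) between jumps, and symmetric +/- `step`
    jumps. Returns a list of (time, position) renewal points."""
    rng = rng or random.Random(1)
    t, x = 0.0, 0.0
    path = [(0.0, 0.0)]
    while True:
        t += rng.expovariate(rate)      # waiting time until the next jump
        if t > t_max:
            break
        x += step if rng.random() < 0.5 else -step
        path.append((t, x))
    return path

trajectory = simulate_ctrw(50.0)
```

Replacing `expovariate` with a power-law waiting time sampler turns this into the anomalous-diffusion CTRW whose path probability density the paper derives via the Dyson equation.
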

  8. The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory

    International Nuclear Information System (INIS)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K

    2008-01-01

    We study different ways of determining the mean distance ⟨r_n⟩ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating ⟨r_n⟩. Next, we describe two alternative means of deriving the exact expression of ⟨r_n⟩: we review the method using absolute probability and develop an alternative method using conditional probability. Finally, we obtain an approximation to ⟨r_n⟩ from the mean volume between the reference point and its nth neighbour and compare it with the heuristic and exact results
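The exact expression discussed above is commonly written in terms of the unit-ball volume; the sketch below implements the result in its usual form (notation is the standard one, not necessarily the article's):

```python
import math

def mean_nth_neighbour_distance(n, rho, d):
    """Exact mean distance <r_n> to the nth neighbour for points of
    uniform density rho in d dimensions:

        <r_n> = Gamma(n + 1/d) / ( (n-1)! * (rho * c_d)^(1/d) ),

    where c_d = pi^(d/2) / Gamma(d/2 + 1) is the volume of the unit d-ball.
    """
    c_d = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    return math.gamma(n + 1.0 / d) / (math.gamma(n) * (rho * c_d) ** (1.0 / d))
```

Sanity checks: in one dimension with unit density, neighbours sit on either side and ⟨r_n⟩ = n/2; in two dimensions with unit density, the nearest neighbour is at mean distance 1/2.
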

  9. Use of finite mixture distribution models in the analysis of wind energy in the Canarian Archipelago

    International Nuclear Information System (INIS)

    Carta, Jose Antonio; Ramirez, Penelope

    2007-01-01

    The statistical characteristics of hourly mean wind speed data recorded at 16 weather stations located in the Canarian Archipelago are analyzed in this paper. As a result of this analysis we see that the typical two-parameter Weibull wind speed distribution (W-pdf) does not accurately represent all wind regimes observed in that region. However, a Singly Truncated from below Normal Weibull mixture distribution (TNW-pdf) and a two-component Weibull mixture distribution (WW-pdf) developed here do provide very good fits for both unimodal and bimodal wind speed frequency distributions observed in that region, and yield smaller relative errors in determining the annual mean wind power density. The parameters of the distributions are estimated using the least squares method, which is resolved in this paper using the Levenberg-Marquardt algorithm. The suitability of the distributions is judged from the probability plot correlation coefficient R², adjusted for degrees of freedom. Based on the results obtained, we conclude that the two mixture distributions proposed here provide very flexible models for wind speed studies and can be applied in a widespread manner to represent the wind regimes in the Canarian Archipelago and in other regions with similar characteristics. The TNW-pdf takes into account the frequency of null winds, whereas the WW-pdf and W-pdf do not. It can, therefore, better represent wind regimes with high percentages of null wind speeds. However, calculation of the TNW-pdf is markedly slower
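A two-component Weibull mixture (WW-pdf) and the mean wind power density it implies can be sketched as follows (the parameter values below are hypothetical; the paper estimates them by least squares via Levenberg-Marquardt):

```python
import math

def weibull_pdf(v, k, c):
    """Two-parameter Weibull density with shape k and scale c."""
    if v <= 0.0:
        return 0.0
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def ww_pdf(v, w, k1, c1, k2, c2):
    """Two-component Weibull mixture (WW-pdf) with mixing weight w."""
    return w * weibull_pdf(v, k1, c1) + (1.0 - w) * weibull_pdf(v, k2, c2)

def mean_power_density(w, k1, c1, k2, c2, rho_air=1.225, v_max=40.0, nstep=4000):
    """Mean wind power density E[0.5 * rho * v^3] under the WW-pdf,
    via trapezoidal integration on [0, v_max]."""
    h = v_max / nstep
    total = 0.0
    for i in range(nstep + 1):
        v = i * h
        f = 0.5 * rho_air * v**3 * ww_pdf(v, w, k1, c1, k2, c2)
        total += f * (0.5 if i in (0, nstep) else 1.0)
    return total * h

# Hypothetical bimodal regime: 40% calm-ish component, 60% trade-wind component.
mpd = mean_power_density(0.4, 2.0, 6.0, 3.0, 10.0)
```

The mixture integrates to one by construction, and its mean power density matches the moment formula E[v³] = w·c₁³Γ(1+3/k₁) + (1−w)·c₂³Γ(1+3/k₂) times 0.5·ρ.
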

  10. A quantum anharmonic oscillator model for the stock market

    Science.gov (United States)

    Gao, Tingting; Chen, Yu

    2017-02-01

    A financially interpretable quantum model is proposed to study the probability distributions of the stock price return. The dynamics of a quantum particle is considered an analog of the motion of a stock price. The probability distributions of the price return can then be computed from wave functions that evolve according to the Schrödinger equation. Instead of the harmonic oscillator of previous studies, a quantum anharmonic oscillator is applied to a stock in a liquid market. The leptokurtic distributions of the price return can be reproduced by our quantum model with the introduction of mixed states and multiple potentials. The trend-following dominant market, in which the price return follows a bimodal distribution, is discussed as a specific case of the illiquid market.

  11. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
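The maximum-entropy step for, say, travel times can be sketched as follows: among densities on [0, ∞) with a fixed mean μ, Shannon entropy is maximized by the exponential distribution.

```latex
\max_{f}\; -\!\int_{0}^{\infty} f(x)\ln f(x)\,dx
\quad\text{s.t.}\quad
\int_{0}^{\infty} f(x)\,dx = 1,
\qquad
\int_{0}^{\infty} x\,f(x)\,dx = \mu .
```

Stationarity of the Lagrangian gives \(-\ln f(x) - 1 - \lambda_0 - \lambda_1 x = 0\), so \(f(x)\propto e^{-\lambda_1 x}\); imposing the two constraints yields \(f(x)=\mu^{-1}e^{-x/\mu}\), the exponential form cited above for particle velocities and travel times. Adding a variance constraint instead yields the Laplace form found for accelerations.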

  12. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data carry considerable uncertainty owing to limitations in the measurement information, material parameters, loads, geometry, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to handle the uncertain transition between a qualitative concept and its quantitative description. An improved algorithm for the cloud probability distribution density, based on a backward cloud generator, was then proposed. It effectively converts parcels of accurate data into concepts that can be described by appropriate qualitative linguistic values. Each qualitative description is expressed through the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was applied to the observation data of a piezometric tube in an earth-rockfill dam. Experimental results show that the proposed algorithm is feasible: it reveals the variation pattern of the piezometric tube's water level and detects seepage damage in the dam body.
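
    A minimal sketch of the backward-cloud idea follows (one common textbook formulation; the paper's improved algorithm is not reproduced here). Synthetic "drops" are produced by a forward cloud generator, and the numerical characteristics {Ex, En, He} are then recovered from the sample.

```python
import numpy as np

# Hedged sketch of a basic backward cloud generator: estimate {Ex, En, He}
# from a sample of cloud drops. The true characteristics below are assumed.
def backward_cloud(x):
    Ex = x.mean()
    En = np.sqrt(np.pi / 2.0) * np.mean(np.abs(x - Ex))
    He2 = x.var() - En**2
    He = np.sqrt(max(He2, 0.0))   # clip: sampling noise can make He2 < 0
    return Ex, En, He

# Forward cloud generator for synthetic "monitoring data": each drop draws
# its own entropy En' ~ N(En, He^2), then x ~ N(Ex, En'^2).
rng = np.random.default_rng(0)
Ex_t, En_t, He_t, n = 5.0, 1.0, 0.3, 20000
En_prime = rng.normal(En_t, He_t, n)
drops = rng.normal(Ex_t, np.abs(En_prime))
print(backward_cloud(drops))   # estimates close to (5.0, 1.0, 0.3)
```

    The variance of the drops decomposes as En² + He², which is why the hyper-entropy He is recovered from the gap between the sample variance and the squared entropy estimate.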

  13. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and current shortage of, multivariate analysis of natural disasters, and presents a method to estimate joint return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
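
    The joint-return-period construction can be sketched with a Gumbel copula (an illustrative choice; the paper fits its own copula and marginals to the 79 events, and all parameters below are assumptions). With marginal CDF values u and v for wind speed and duration, the probability that both are exceeded is 1 − u − v + C(u, v), and dividing the mean inter-arrival time of events by this probability gives the joint return period.

```python
from math import exp, log

# Gumbel copula: theta = 1 is independence, theta > 1 is positive dependence.
def gumbel_copula(u, v, theta):
    return exp(-(((-log(u))**theta + (-log(v))**theta)**(1.0 / theta)))

def joint_return_period(u, v, theta, mu):
    """Return period (years) of BOTH variables exceeding the levels whose
    marginal CDF values are u and v; mu is the mean inter-arrival time of
    severe dust storms in years (all values here are illustrative)."""
    p_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(X>x, Y>y)
    return mu / p_exceed

# e.g. 90th-percentile wind speed and duration, dependence theta = 2,
# roughly four storms per year on average (mu = 0.25):
print(joint_return_period(0.9, 0.9, 2.0, 0.25))
```

    Under independence (theta = 1) the same 90th-percentile pair has joint exceedance probability 0.01 and a much longer return period; ignoring the dependence between wind speed and duration therefore misstates the risk, which is the point of the bivariate treatment.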

  14. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    Science.gov (United States)

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
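
    As a concrete illustration of one of the compared approaches, the sketch below builds stabilized inverse probability weights for a simulated homoscedastic normal exposure. The data-generating process and variable names are assumptions for illustration, not the authors' cohort data.

```python
import numpy as np

# Stabilized IPW for a continuous exposure A, normal-density approach:
# weight = f(A) / f(A | L), with both densities estimated from the data.
def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd)**2) / (sd * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(1)
n = 5000
L = rng.normal(size=n)                  # confounder
A = 0.5 * L + rng.normal(size=n)        # continuous exposure, homoscedastic

# Denominator: conditional density of A given L from a linear model fit.
b1, b0 = np.polyfit(L, A, 1)            # slope first, then intercept
resid = A - (b0 + b1 * L)
f_den = normal_pdf(A, b0 + b1 * L, resid.std())

# Numerator (stabilizer): marginal density of A.
f_num = normal_pdf(A, A.mean(), A.std())

w = f_num / f_den
print(w.mean())   # stabilized weights average close to 1
```

    For the heteroscedastic exposure in the paper's simulation, the conditional variance would also have to be modelled (or the exposure binned into quantiles), which is why the heteroscedastic-normal and quantile-binning variants were among those compared.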

  15. Role of block copolymer adsorption versus bimodal grafting on nanoparticle self-assembly in polymer nanocomposites.

    Science.gov (United States)

    Zhao, Dan; Di Nicola, Matteo; Khani, Mohammad M; Jestin, Jacques; Benicewicz, Brian C; Kumar, Sanat K

    2016-09-14

    We compare the self-assembly of silica nanoparticles (NPs) with physically adsorbed polystyrene-block-poly(2-vinylpyridine) (PS-b-P2VP) copolymers (BCP) against NPs with grafted bimodal (BM) brushes composed of long, sparsely grafted PS chains and a short dense carpet of P2VP chains. As with grafted NPs, the dispersion state of the BCP NPs can be facilely tuned in PS matrices by varying the PS coverage on the NP surface or by changes in the ratio of the PS graft to matrix chain lengths. Surprisingly, the BCP NPs are remarkably better dispersed than the NPs tethered with bimodal brushes at comparable PS grafting densities. We postulate that this difference arises because of two factors inherent in the synthesis of the NPs: In the case of the BCP NPs the adsorption process is analogous to the chains being "grafted to" the NP surface, while the BM case corresponds to "grafting from" the surface. We have shown that the "grafted from" protocol yields patchy NPs even if the graft points are uniformly placed on each particle. This phenomenon, which is caused by chain conformation fluctuations, is exacerbated by the distribution function associated with the (small) number of grafts per particle. In contrast, in the case of BCP adsorption, each NP is more uniformly coated by a P2VP monolayer driven by the strongly favorable P2VP-silica interactions. Since each P2VP block is connected to a PS chain, we conjecture that these adsorbed systems are closer to the limit of spatially uniform sparse brush coverage than the chemically grafted case. We finally show that the better NP dispersion resulting from BCP adsorption leads to larger mechanical reinforcement than that achieved with BM particles. These results emphasize that physical adsorption of BCPs is a simple, effective and practically promising strategy to direct NP dispersion in a chemically unfavorable polymer matrix.

  16. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the component. In the evaluation, both the shape of the prior distribution and one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying the different types of failure dependencies is therefore an important condition for choosing a prior probability distribution that correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house components and pre-developed components (e.g. COTS, SOUP, etc.). (Author)

  17. The effects of radiotherapy treatment uncertainties on the delivered dose distribution and tumour control probability

    International Nuclear Information System (INIS)

    Booth, J.T.; Zavgorodni, S.F.; Royal Adelaide Hospital, SA

    2001-01-01

    Uncertainty in the precise quantity of radiation dose delivered to tumours in external beam radiotherapy is present due to many factors, and can result in either spatially uniform (Gaussian) or spatially non-uniform dose errors. These dose errors are incorporated into the calculation of tumour control probability (TCP) and produce a distribution of possible TCP values over a population. We also study the effect of inter-patient cell sensitivity heterogeneity on the population distribution of patient TCPs. This study aims to investigate the relative importance of these three uncertainties (spatially uniform dose uncertainty, spatially non-uniform dose uncertainty, and inter-patient cell sensitivity heterogeneity) on the delivered dose and TCP distribution following a typical course of fractionated external beam radiotherapy. The dose distributions used for patient treatments are modelled in one dimension. Geometric positioning uncertainties during and before treatment are considered as shifts of a pre-calculated dose distribution. Following the simulation of a population of patients, distributions of dose across the patient population are used to calculate mean treatment dose, standard deviation in mean treatment dose, mean TCP, standard deviation in TCP, and TCP mode. These parameters are calculated with each of the three uncertainties included separately. The calculations show that the dose errors in the tumour volume are dominated by the spatially uniform component of dose uncertainty. This could be related to machine specific parameters, such as linear accelerator calibration. TCP calculation is affected dramatically by inter-patient variation in the cell sensitivity and to a lesser extent by the spatially uniform dose errors. The positioning errors with the 1.5 cm margins used cause dose uncertainty outside the tumour volume and have a small effect on mean treatment dose (in the tumour volume) and tumour control. Copyright (2001) Australasian College of
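
    The relative impact of the uncertainties can be illustrated with a Poisson TCP model. The paper's one-dimensional dose model and parameter values are not reproduced here; all numbers below are illustrative assumptions. A population of patients is simulated twice: once with a spatially uniform Gaussian dose error, once with inter-patient spread in the radiosensitivity α.

```python
import numpy as np

# Poisson TCP for a uniform dose: TCP = exp(-N * exp(-alpha * D)).
# N (clonogen number), alpha, D and both spreads are assumed values.
def tcp(alpha, dose, n_clonogens=1e7):
    return np.exp(-n_clonogens * np.exp(-alpha * dose))

rng = np.random.default_rng(2)
n_patients = 100000
D = 60.0                                    # prescribed dose (Gy)

# (i) spatially uniform (Gaussian) dose error, e.g. calibration: sd 2%
tcp_dose = tcp(0.3, D * rng.normal(1.0, 0.02, n_patients))
# (ii) inter-patient cell-sensitivity heterogeneity: alpha ~ N(0.3, 0.05^2)
tcp_alpha = tcp(rng.normal(0.3, 0.05, n_patients), D)

print(tcp_dose.mean(), tcp_dose.std())
print(tcp_alpha.mean(), tcp_alpha.std())    # much wider spread
```

    Even a modest spread in α (here 0.05 Gy⁻¹ around 0.3 Gy⁻¹) broadens the population TCP distribution far more than a 2% calibration-type dose error, consistent with the conclusion that inter-patient cell sensitivity dominates the TCP distribution.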

  18. Investigation of heterogeneous asymmetric dihydroxylation over OsO₄-(QN)₂PHAL catalysts of functionalized bimodal mesoporous silica with ionic liquid

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Shenjie [College of Environmental and Energy Engineering, Beijing University of Technology, Beijing 100124 (China); Sun, Jihong, E-mail: jhsun@bjut.edu.cn [College of Environmental and Energy Engineering, Beijing University of Technology, Beijing 100124 (China); Li, Yuzhen; Gao, Lin [College of Environmental and Energy Engineering, Beijing University of Technology, Beijing 100124 (China)

    2011-08-15

    Highlights: • Functionalized bimodal mesoporous silica with MTMSPIm⁺Cl⁻. • Mesoporous catalyst immobilized with OsO₄-(QN)₂PHAL. • Catalysts for the asymmetric dihydroxylation reaction with high yield and enantioselectivity. • Recyclable catalysts. -- Abstract: A novel synthesis of functionalized bimodal mesoporous silica with an ionic liquid (FBMMs) was performed. After grafting 1-methyl-3-(trimethoxysilyl)propylimidazolium chloride onto the surface of bimodal mesoporous silicas, 1,4-bis(9-O-quininyl)phthalazine ((QN)₂-PHAL) and K₂Os(OH)₄·2H₂O were immobilized onto the modified FBMMs by adsorption or ion-exchange methods, and the asymmetric dihydroxylation reaction was then carried out using the solid catalysts. X-ray diffraction, Fourier transform infrared spectroscopy, and N₂ adsorption-desorption were employed to characterize their structure and properties. The results showed that the mesoporous ordering of the bimodal mesoporous silica decreased after functionalization and immobilization of OsO₄-(QN)₂PHAL. The prepared heterogeneous solid catalyst was highly effective in asymmetric dihydroxylation, giving yield and enantioselectivity comparable to those obtained in the homogeneous system, and could be recycled five times with little loss of enantioselectivity. Moreover, the effect of the osmium catalyst on the asymmetric dihydroxylation was investigated.

  19. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with the physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and in quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. This retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  20. Merging history of three bimodal clusters

    Science.gov (United States)

    Maurogordato, S.; Sauvageot, J. L.; Bourdin, H.; Cappi, A.; Benoist, C.; Ferrari, C.; Mars, G.; Houairi, K.

    2011-01-01

    We present a combined X-ray and optical analysis of three bimodal galaxy clusters selected as merging candidates at z ~ 0.1. These targets are part of MUSIC (MUlti-Wavelength Sample of Interacting Clusters), which is a general project designed to study the physics of merging clusters by means of multi-wavelength observations. Observations include spectro-imaging with the XMM-Newton EPIC camera, multi-object spectroscopy (260 new redshifts), and wide-field imaging at the ESO 3.6 m and 2.2 m telescopes. We build a global picture of these clusters using X-ray luminosity and temperature maps together with galaxy density and velocity distributions. Idealized numerical simulations were used to constrain the merging scenario for each system. We show that A2933 is very likely an equal-mass advanced pre-merger ~200 Myr before the core collapse, while A2440 and A2384 are post-merger systems (~450 Myr and ~1.5 Gyr after core collapse, respectively). In the case of A2384, we detect a spectacular filament of galaxies and gas spreading over more than 1 h⁻¹ Mpc, which we infer to have been stripped during the previous collision. The analysis of the MUSIC sample allows us to outline some general properties of merging clusters: a strong luminosity segregation of galaxies in recent post-mergers; the existence of preferential axes - corresponding to the merging directions - along which the BCGs and structures on various scales are aligned; the concomitance, in most major merger cases, of secondary merging or accretion events, with groups infalling onto the main cluster, and in some cases the evidence of previous merging episodes in one of the main components. These results are in good agreement with the hierarchical scenario of structure formation, in which clusters are expected to form by successive merging events, and matter is accreted along large-scale filaments. Based on data obtained with the European Southern Observatory, Chile (programs 072.A-0595, 075.A-0264, and 079.A-0425