WorldWideScience

Sample records for probabilistic distribution coefficients

  1. Optimized probabilistic calibration of safety coefficients in defect severity assessments; Dimensionnement probabiliste optimise des coefficients de securite dans les etudes de nocivite de defauts

    Energy Technology Data Exchange (ETDEWEB)

    Ardillon, E.; Pitner, P.; Barthelet, B. [Electricite de France, Direction des Etudes et Recherches, 92 - Clamart (France)]

    1997-12-31

    The construction codes currently used in nuclear engineering recommend analysis methods and criteria consistent with a deterministic approach. Since 1993, in the framework of work related to the RSEM codes, the EFMT Branch has launched a probabilistic approach to establish a link between the current 'deterministic' rules and failure risk assessments for the structures considered. There is an explicit link between the two approaches in the elementary strength/load case where the variables are Gaussian. This case provides the basis for the proposed methodology. In the complex case discussed in this paper, involving cracked piping with numerous non-Gaussian inputs, for a given failure mode, there is an implicit relationship between the target reliability level and the partial safety coefficients attached to each variable. The mean flaw size is the intermediate parameter used to make this link and allows flexibility in the choice of coefficients, thereby raising the question of optimized calibration. The approach is illustrated by the choice of coefficients based on the coordinates of the most probable failure point, resulting in a single set of coefficients adapted to the immediate vicinity of a given situation. In cases where the criterion must guarantee a given reliability level for a number of different operating situations, no set of coefficients can entirely guarantee the target reliability level. So, an optimized set of coefficients has to be selected, ensuring a reliability level as uniform as possible over the scope considered. This paper compares an initial coefficient proposal with a choice based on the design point method. The intermediate variable in assessing the reliability level is the mean flaw size, which would seem compatible with problems encountered under operating conditions. In addition, realistic risk assessment requires validation of the main variable distribution assumptions. We give an example of adjustment of distribution assumptions to

  2. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
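
    As a concrete illustration of the two-step approach, the sketch below builds a scenario-specific exposure factor distribution by mixing subpopulation archetypes. The lognormal shapes, parameter values, and scenario weights are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import stats

# Step 1 output (assumed here): archetypal body-weight distributions (kg),
# one per demographic subpopulation.  Shapes and parameters are illustrative.
archetypes = {
    ("male", "adult"):   stats.lognorm(s=0.20, scale=80.0),
    ("female", "adult"): stats.lognorm(s=0.22, scale=68.0),
    ("child", "6-11"):   stats.lognorm(s=0.25, scale=32.0),
}

def scenario_distribution(mix, n=100_000, seed=0):
    """Step 2: sample the archetypes according to scenario-specific weights."""
    rng = np.random.default_rng(seed)
    keys = list(mix)
    counts = rng.multinomial(n, [mix[k] for k in keys])
    return np.concatenate([archetypes[k].rvs(c, random_state=rng)
                           for k, c in zip(keys, counts)])

# Hypothetical site scenario: 50% adult males, 30% adult females, 20% children.
bw = scenario_distribution({("male", "adult"): 0.5,
                            ("female", "adult"): 0.3,
                            ("child", "6-11"): 0.2})
print(f"scenario BW: mean {bw.mean():.1f} kg, 95th pct {np.percentile(bw, 95):.1f} kg")
```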

  3. Correlation Coefficients of Probabilistic Hesitant Fuzzy Elements and Their Applications to Evaluation of the Alternatives

    Directory of Open Access Journals (Sweden)

    Zhong-xing Wang

    2017-11-01

    Full Text Available The correlation coefficient is one of the most broadly used indexes in multi-criteria decision-making (MCDM) processes. However, some important issues related to its use within probabilistic hesitant fuzzy environments remain to be addressed. The purpose of this study is to introduce an MCDM method based on correlation coefficients that utilizes probabilistic hesitant fuzzy information. First, the covariance and correlation coefficient between two probabilistic hesitant fuzzy elements (PHFEs) are introduced and their properties are discussed. In addition, the northwest corner rule is introduced to obtain the expected mean of the product of two PHFEs. Second, a weighted correlation coefficient is proposed to make the MCDM method more widely applicable, and its properties are also discussed. Finally, an illustrative example demonstrates the practicality and effectiveness of the proposed method. The correlation coefficient proposed in this paper lies in the interval [−1, 1]; it captures not only the strength of the relationship between the PHFEs but also whether they are positively or negatively related. The advantage of this method is that it avoids inconsistent decision-making results caused by loss of information.
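
    A minimal sketch of the core computation may help: the northwest corner rule couples the two probability vectors, and the resulting joint masses give the expected product, the covariance, and a correlation coefficient in [−1, 1]. Representing a PHFE as sorted value/probability pairs, and all the numbers below, are assumptions for illustration rather than the paper's exact construction.

```python
import numpy as np

def northwest_corner(p, q):
    """Couple two probability vectors with the northwest corner rule,
    returning (i, j, mass) cells of a joint distribution."""
    p, q = list(p), list(q)
    i = j = 0
    cells = []
    while i < len(p) and j < len(q):
        m = min(p[i], q[j])
        cells.append((i, j, m))
        p[i] -= m
        q[j] -= m
        if p[i] <= 1e-12:
            i += 1
        if q[j] <= 1e-12:
            j += 1
    return cells

def phfe_corr(x, px, y, py):
    """Correlation of two PHFEs given as value/probability pairs (assumed
    sorted ascending); the coupling follows the northwest corner rule."""
    mx, my = np.dot(x, px), np.dot(y, py)
    exy = sum(m * x[i] * y[j] for i, j, m in northwest_corner(px, py))
    sx = np.sqrt(np.dot(np.square(x), px) - mx**2)
    sy = np.sqrt(np.dot(np.square(y), py) - my**2)
    return (exy - mx * my) / (sx * sy)

# Two illustrative PHFEs (membership values with probabilities summing to 1).
print(phfe_corr([0.3, 0.5, 0.8], [0.2, 0.5, 0.3],
                [0.4, 0.6, 0.9], [0.3, 0.4, 0.3]))   # ~0.95, inside [-1, 1]
```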

  4. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    Science.gov (United States)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values on hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte-Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high-quality DEM that minimises input data uncertainty and improves the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy to represent the estimated roughness values. Finally, Latin Hypercube Sampling has been used to generate different sets of Manning roughness values, and flood inundation probability maps have been created with Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on binary wet-dry reasoning with the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
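
    A compact sketch of the sampling-and-overlay step, with a toy stand-in for the hydraulic model (the real study runs HEC-RAS): Latin Hypercube Sampling draws Manning's n values from a fitted distribution, each draw yields a binary wet/dry grid, and averaging the grids gives a per-cell inundation probability. The lognormal roughness distribution and the toy stage-roughness relation are assumptions.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

def run_hydraulic_model(n, cells=50):
    """Toy stand-in for a HEC-RAS run: higher roughness -> higher stage ->
    wider inundated strip of a V-shaped valley.  Purely illustrative."""
    depth = 2.0 * (n / 0.045) ** 0.6          # crude stage-roughness relation
    ground = np.abs(np.linspace(-3, 3, cells))
    return (ground < depth).astype(int)       # 1 = wet, 0 = dry

def inundation_probability(n_samples=200, seed=1):
    lhs = qmc.LatinHypercube(d=1, seed=seed).random(n_samples)
    mannings_n = stats.lognorm(s=0.3, scale=0.045).ppf(lhs[:, 0])
    return np.mean([run_hydraulic_model(n) for n in mannings_n], axis=0)

prob_map = inundation_probability()
print("cells flooded with P > 0.8:", int((prob_map > 0.8).sum()))
```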

  5. Probabilistic calibration of safety coefficients for flawed components in nuclear engineering

    International Nuclear Information System (INIS)

    Ardillon, E.; Pitner, P.; Barthelet, B.; Remond, A.

    1996-01-01

    The rules that are currently under application to verify the acceptance of flaws in nuclear components rely on deterministic criteria intended to ensure the safe operation of plants. The interest in having a precise and reliable method to evaluate the safety margins and the integrity of components led Electricite de France to launch an approach to link safety coefficients directly with safety levels. This paper presents a probabilistic methodology to calibrate safety coefficients in relation to reliability target values. The proposed calibration procedure applies to the case of a ferritic flawed pipe using the R6 procedure for assessing the integrity of the structure. (authors). 5 refs., 5 figs

  6. Probabilistic calibration of safety coefficients for flawed components in nuclear engineering

    International Nuclear Information System (INIS)

    Ardillon, E.; Pitner, P.; Barthelet, B.; Remond, A.

    1995-01-01

    The rules currently applied to verify the acceptance of flaws in nuclear components rely on deterministic criteria intended to ensure the safe operation of plants. The interest in having a precise and reliable method to evaluate the safety margins and the integrity of components led Electricite de France to launch an approach to link safety coefficients directly with safety levels. This paper presents a probabilistic methodology to calibrate safety coefficients in relation to reliability target values. The proposed calibration procedure applies to the case of a ferritic flawed pipe using the R6 procedure for assessing the integrity of the structure. (author). 5 refs., 5 figs., 1 tab

  7. Development of database on the distribution coefficient. 1. Collection of the distribution coefficient data

    Energy Technology Data Exchange (ETDEWEB)

    Takebe, Shinichi; Abe, Masayoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    The distribution coefficient is a very important parameter for environmental impact assessment on the disposal of radioactive waste arising from research institutes. A literature survey within the country was carried out for the purpose of selecting reasonable distribution coefficient values for use in safety evaluations. This report arranges the information on the distribution coefficient from each literature source for input to the database, and summarizes it as literature information data on the distribution coefficient. (author)

  8. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    Science.gov (United States)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integrated into the network can cause harmonic pollution, which damages electrical devices and affects the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, mainly sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in the different typical weather conditions was acquired via parameter identification by maximum likelihood estimation. The Monte-Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
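
    The Monte Carlo step can be sketched as follows. The Beta-shaped DPV output per weather type, the per-order harmonic injection levels, and the bus-voltage model are illustrative assumptions rather than values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def thd_samples(weather_alpha, weather_beta, n=10_000):
    """Monte Carlo of THD at a bus as DPV output varies with weather."""
    pv = rng.beta(weather_alpha, weather_beta, n)          # normalized DPV output
    v1 = 1.0 + 0.01 * rng.standard_normal(n)               # fundamental (p.u.)
    base = np.array([0.020, 0.015, 0.008, 0.005])          # 5th/7th/11th/13th at full output
    vh = pv[:, None] * base * (1 + 0.1 * rng.standard_normal((n, base.size)))
    return np.sqrt((vh ** 2).sum(axis=1)) / v1 * 100       # THD in percent

for weather, (a, b) in {"sunny": (5, 2), "cloudy": (2, 2), "rainy": (2, 5)}.items():
    thd = thd_samples(a, b)
    print(f"{weather}: mean THD {thd.mean():.2f}%, 95th pct {np.percentile(thd, 95):.2f}%")
```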

  9. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables. However, probabilistic analysis requires many assumptions due to the lack of data on initial flaw distributions. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theory and Monte Carlo simulation are applied. The results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions
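
    A minimal LEFM Monte Carlo sketch of the idea: draw an assumed initial flaw-depth distribution, grow each flaw with the Paris law, and compare the initial and in-service distributions. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def grow(a0, cycles, C=1e-12, m=3.0, Y=1.12, dS=100.0, dN=1000):
    """Paris-law growth da/dN = C*(dK)^m with dK = Y*dS*sqrt(pi*a),
    integrated explicitly in blocks of dN cycles (a in metres)."""
    a = a0.copy()
    for _ in range(cycles // dN):
        dK = Y * dS * np.sqrt(np.pi * a)        # MPa*sqrt(m)
        a += C * dK ** m * dN
    return a

a_init = rng.exponential(scale=0.001, size=50_000)   # assumed initial depths (m)
a_serv = grow(a_init, cycles=2_000_000)
print(f"initial:    mean {a_init.mean()*1e3:.2f} mm, 99th {np.percentile(a_init, 99)*1e3:.2f} mm")
print(f"in-service: mean {a_serv.mean()*1e3:.2f} mm, 99th {np.percentile(a_serv, 99)*1e3:.2f} mm")
```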

  10. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and the dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues, and a new approach is needed; probabilistic analysis provides a better one. Moreover, as distribution systems...

  11. Soil nuclide distribution coefficients and their statistical distributions

    International Nuclear Information System (INIS)

    Sheppard, M.I.; Beals, D.I.; Thibault, D.H.; O'Connor, P.

    1984-12-01

    Environmental assessments of the disposal of nuclear fuel waste in plutonic rock formations require analysis of the migration of nuclides from the disposal vault to the biosphere. Analyses of nuclide migration via groundwater through the disposal vault, the buffer and backfill, the plutonic rock, and the consolidated and unconsolidated overburden use models requiring distribution coefficients (Kd) to describe the interaction of the nuclides with the geological and man-made materials. This report presents element-specific soil distribution coefficients and their statistical distributions, based on a detailed survey of the literature. Radioactive elements considered were actinium, americium, bismuth, calcium, carbon, cerium, cesium, iodine, lead, molybdenum, neptunium, nickel, niobium, palladium, plutonium, polonium, protactinium, radium, samarium, selenium, silver, strontium, technetium, terbium, thorium, tin, uranium and zirconium. Stable elements considered were antimony, boron, cadmium, tellurium and zinc. Where sufficient data were available, distribution coefficients and their distributions are given for sand, silt, clay and organic soils. Our values are recommended for use in assessments for the Canadian Nuclear Fuel Waste Management Program

  12. Probabilistic Reverse dOsimetry Estimating Exposure Distribution (PROcEED)

    Science.gov (United States)

    PROcEED is a web-based application used to conduct probabilistic reverse dosimetry calculations. The tool is used for estimating a distribution of exposure concentrations likely to have produced biomarker concentrations measured in a population.

  13. Radionuclides distribution coefficient of soil to soil-solution

    International Nuclear Information System (INIS)

    1990-06-01

    The present book addresses various issues related to the coefficient of radionuclide distribution between soil and soil solution. It consists of six sections and two appendices. The second section, following an introductory one, describes the definition of the coefficient and procedures for its calculation. The third section deals with the application of the distribution coefficient to the prediction of movements of radionuclides through soil. Various methods for measuring the coefficient are described in the fourth section. The next section discusses a variety of factors (physical and chemical) that can affect the distribution coefficient. Measurements of the coefficient for different types of soils are listed in the sixth section. An appendix is attached to the book to show various models that can be helpful in applying the distribution coefficient to the transfer of radionuclides from soil into agricultural plants. (N.K.)
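
    For reference, the defining relation described in the second section is the standard equilibrium ratio (a textbook formulation, not quoted from the book):

```latex
% Soil/solution distribution coefficient: ratio of sorbed concentration
% to solution concentration at equilibrium.
\[
  K_d \;=\; \frac{C_{\text{soil}}}{C_{\text{solution}}}
        \;=\; \frac{A_{\text{sorbed}}/m_{\text{soil}}}
                   {A_{\text{dissolved}}/V_{\text{solution}}}
        \qquad [\mathrm{L\,kg^{-1}}]
\]
```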

  14. Probabilistic modeling of fatigue crack growth in Ti-6Al-4V

    International Nuclear Information System (INIS)

    Soboyejo, W.O.; Shen, W.; Soboyejo, A.B.O.

    2001-01-01

    This paper presents the results of a combined experimental and analytical study of the probabilistic nature of fatigue crack growth in Ti-6Al-4V. A simple experimental fracture mechanics framework is presented for the determination of statistical fatigue crack growth parameters from two fatigue tests. The experimental studies show that the variabilities in long fatigue crack growth rate data and in the Paris coefficient are well described by log-normal distributions. The variability in the Paris exponent is also shown to be well characterized by a normal distribution. The measured statistical distributions are incorporated into a probabilistic fracture mechanics framework for the estimation of material reliability. The implications of the results are discussed for the probabilistic analysis of fatigue crack growth in engineering components and structures. (orig.)

  15. Distribution functions of probabilistic automata

    Science.gov (United States)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1, ..., k-1} and, hence, can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) ≤ x }. We study these distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.
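
    A small simulation may make the construction concrete: a two-state probabilistic automaton over {0, 1} emits bits that are read as a radix-2 expansion X(w), and many runs give an empirical F(x). The transition and emission probabilities below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

# state -> list of (probability, emitted bit, next state)
P = {0: [(0.7, 0, 0), (0.3, 1, 1)],
     1: [(0.4, 0, 0), (0.6, 1, 1)]}

def sample_x(depth=40):
    """Run the automaton for `depth` bits and return X(w) in [0, 1]."""
    state, x, weight = 0, 0.0, 0.5
    for _ in range(depth):                     # 40 bits ~ 1e-12 resolution
        probs, bits, nxts = zip(*P[state])
        k = rng.choice(len(probs), p=probs)
        x += bits[k] * weight
        weight *= 0.5
        state = nxts[k]
    return x

xs = np.sort([sample_x() for _ in range(20_000)])
for t in (0.25, 0.5, 0.75):                    # empirical F at a few points
    print(f"F({t}) ~ {np.searchsorted(xs, t) / xs.size:.3f}")
```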

  16. Probabilistic analysis of glass elements with three-parameter Weibull distribution

    International Nuclear Information System (INIS)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-01-01

    Glass and ceramics exhibit brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is probabilistic rather than deterministic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used for fitting material strength results, although it is not always applied correctly. First, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then, the finite element models made for each type of test, the fitting of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, d: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. In summary, this work aims to generalize the use of the three-parameter Weibull function in structural glass elements with stress distributions that are not analytically described, allowing the proposed probabilistic model to be applied to general loading distributions. (Author)

  17. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Yee, Eric [KEPCO International Nuclear Graduate School, Dept. of Nuclear Power Plant Engineering, Ulsan (Korea, Republic of)

    2017-03-15

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a sound assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. Integrating a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the calculated frequencies, suggesting that rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.
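
    The tail effect is easy to see numerically: with the same location and scale, a t-distribution assigns larger exceedance probabilities than a normal. The degrees of freedom and test levels below are assumptions for illustration.

```python
import numpy as np
from scipy import stats

mu, sigma, nu = 0.0, 1.0, 4          # nu = assumed t degrees of freedom
for z in (2.0, 3.0, 4.0):
    p_norm = stats.norm.sf(z, loc=mu, scale=sigma)   # normal exceedance prob.
    p_t = stats.t.sf(z, df=nu, loc=mu, scale=sigma)  # heavier-tailed t version
    print(f"z={z}: normal {p_norm:.2e}, t(df={nu}) {p_t:.2e}, ratio {p_t/p_norm:.1f}")
```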

  18. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    International Nuclear Information System (INIS)

    Yee, Eric

    2017-01-01

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a sound assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. Integrating a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the calculated frequencies, suggesting that rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.

  19. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    Science.gov (United States)

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment
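
    A rough Monte Carlo sketch of the preload CDF, substituting the short-form torque relation F = T/(K·d) for the paper's finite element model; the torque, nut-factor and target-window values are assumptions chosen only to echo the reported orders of magnitude.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

T = rng.normal(0.32, 0.02, n)                # applied torque (N*m), assumed
d = 2.0e-3                                   # nominal screw diameter (m), assumed
for env, (k_mean, k_sd) in {"dry": (0.45, 0.08), "lubricated": (0.25, 0.04)}.items():
    K = rng.normal(k_mean, k_sd, n).clip(0.05)   # nut factor (friction-dominated)
    F = T / (K * d)                              # preload (N)
    in_target = np.mean((F > 550) & (F < 750))   # assumed optimal window (N)
    print(f"{env}: mean {F.mean():.0f} N, sd {F.std():.0f} N, "
          f"P(target) = {in_target:.2%}")
```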

  20. Probabilistic optimization of safety coefficients

    International Nuclear Information System (INIS)

    Marques, M.; Devictor, N.; Magistris, F. de

    1999-01-01

    This article describes a reliability-based method for the optimization of safety coefficients defined and used in design codes. The purpose of the optimization is to determine the partial safety coefficients which minimize an objective function for sets of components and loading situations covered by a design rule. This objective function is a sum of distances between the reliability of the components designed using the safety coefficients and a target reliability. The advantage of this method is shown on the examples of the reactor vessel, a vapour pipe and the safety injection circuit. (authors)

  1. Probabilistic Load Models for Simulating the Impact of Load Management

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper analyzes a distribution system load time series through the autocorrelation coefficient, power spectral density, probabilistic distribution and quantile values. Two probabilistic load models, i.e. the joint-normal model and the autoregressive model of order 12 (AR(12)), are proposed to simulate the impact of load management. The joint-normal model is superior in modeling the tail region of the hourly load distribution and in implementing the change of hourly standard deviation, whereas the AR(12) model requires far fewer parameters and is superior in modeling the autocorrelation. It is concluded that the AR(12) model is favored with limited measurement data and that the joint-normal model may provide better results with a large data set. Both models can be applied in general to model load time series and used in time-sequential simulation of distribution system planning.
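
    A minimal sketch of the AR(12) alternative using statsmodels, on a synthetic sinusoid-plus-noise series that stands in for the measured hourly load data:

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(5)
hours = np.arange(24 * 365)
load = 1.0 + 0.3 * np.sin(2 * np.pi * hours / 24) + 0.05 * rng.standard_normal(hours.size)

fit = AutoReg(load, lags=12).fit()           # constant + 12 lag coefficients
resid_sd = np.std(fit.resid)

# Time-sequential simulation: recurse on the fitted coefficients plus noise.
sim = list(load[:12])
for _ in range(168):                         # one simulated week
    past = sim[-12:][::-1]                   # most recent lag first
    nxt = fit.params[0] + np.dot(fit.params[1:], past) + rng.normal(0, resid_sd)
    sim.append(nxt)
print(f"simulated week: mean {np.mean(sim[12:]):.3f}, sd {np.std(sim[12:]):.3f}")
```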

  2. A combinatorial and probabilistic study of initial and end heights of descents in samples of geometrically distributed random variables and in permutations

    Directory of Open Access Journals (Sweden)

    Helmut Prodinger

    2007-01-01

    Full Text Available In words generated by independent geometrically distributed random variables, we study the l-th descent, which is, roughly speaking, the l-th occurrence of a neighbouring pair ab with a>b. The value a is called the initial height, and b the end height. We study these two random variables (and some similar ones) by combinatorial and probabilistic tools. We find in all instances a generating function Ψ(v,u), where the coefficient of v^j u^i refers to the j-th descent (ascent), and i to the initial (end) height. From this, various conclusions can be drawn, in particular expected values. In the probabilistic part, a Markov chain model is used, which allows us to get explicit expressions for the heights of the second descent. In principle, one could go further, but the complexity of the results forbids it. This is extended to permutations of a large number of elements. Methods from q-analysis are used to simplify the expressions. This is the reason that we confine ourselves to the geometric distribution only. For general discrete distributions, no such tools are available.

  3. Questionnaire on the measurement condition of distribution coefficient

    International Nuclear Information System (INIS)

    Takebe, Shinichi; Kimura, Hideo; Matsuzuru, Hideo

    2001-05-01

    The distribution coefficient is used in various transport models to evaluate the migration behavior of radionuclides in the environment and is a very important parameter in the environmental impact assessment of nuclear facilities. A questionnaire was carried out for the purpose of supporting a proposal for a standard method of measuring the distribution coefficient. This report summarizes the results of the questionnaire on sampling methods and storage conditions, pretreatment methods, the analysis items for the physical/chemical characteristics of the samples, and the distribution coefficient measurement methods and conditions used in research institutes within the country. (author)

  4. Blind RRT: A probabilistically complete distributed RRT

    KAUST Repository

    Rodriguez, Cesar; Denny, Jory; Jacobs, Sam Ade; Thomas, Shawna; Amato, Nancy M.

    2013-01-01

    Rapidly-Exploring Random Trees (RRTs) have been successful at finding feasible solutions for many types of problems. With motion planning becoming more computationally demanding, we turn to parallel motion planning for efficient solutions. Existing work on distributed RRTs has been limited by the overhead that global communication requires. A recent approach, Radial RRT, demonstrated a scalable algorithm that subdivides the space into regions to increase the computation locality. However, if an obstacle completely blocks RRT growth in a region, the planning space is not covered and is thus not probabilistically complete. We present a new algorithm, Blind RRT, which ignores obstacles during initial growth to efficiently explore the entire space. Because obstacles are ignored, free components of the tree become disconnected and fragmented. Blind RRT merges parts of the tree that have become disconnected from the root. We show how this algorithm can be applied to the Radial RRT framework allowing both scalability and effectiveness in motion planning. This method is a probabilistically complete approach to parallel RRTs. We show that our method not only scales but also overcomes the motion planning limitations that Radial RRT has in a series of difficult motion planning tasks. © 2013 IEEE.

  5. Blind RRT: A probabilistically complete distributed RRT

    KAUST Repository

    Rodriguez, Cesar

    2013-11-01

    Rapidly-Exploring Random Trees (RRTs) have been successful at finding feasible solutions for many types of problems. With motion planning becoming more computationally demanding, we turn to parallel motion planning for efficient solutions. Existing work on distributed RRTs has been limited by the overhead that global communication requires. A recent approach, Radial RRT, demonstrated a scalable algorithm that subdivides the space into regions to increase the computation locality. However, if an obstacle completely blocks RRT growth in a region, the planning space is not covered and is thus not probabilistically complete. We present a new algorithm, Blind RRT, which ignores obstacles during initial growth to efficiently explore the entire space. Because obstacles are ignored, free components of the tree become disconnected and fragmented. Blind RRT merges parts of the tree that have become disconnected from the root. We show how this algorithm can be applied to the Radial RRT framework allowing both scalability and effectiveness in motion planning. This method is a probabilistically complete approach to parallel RRTs. We show that our method not only scales but also overcomes the motion planning limitations that Radial RRT has in a series of difficult motion planning tasks. © 2013 IEEE.

  6. Developing and evaluating distributions for probabilistic human exposure assessments

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.

    2002-08-01

    This report describes research carried out at the Lawrence Berkeley National Laboratory (LBNL) to assist the U. S. Environmental Protection Agency (EPA) in developing a consistent yet flexible approach for evaluating the inputs to probabilistic risk assessments. The U.S. EPA Office of Emergency and Remedial Response (OERR) recently released Volume 3 Part A of Risk Assessment Guidance for Superfund (RAGS), as an update to the existing two-volume set of RAGS. The update provides policy and technical guidance on performing probabilistic risk assessment (PRA). Consequently, EPA risk managers and decision-makers need to review and evaluate the adequacy of PRAs for supporting regulatory decisions. A critical part of evaluating a PRA is the problem of evaluating or judging the adequacy of the input distributions in the PRA. Although the overarching theme of this report is the need to improve the ease and consistency of the regulatory review process, the specific objectives are presented in two parts. The objective of Part 1 is to develop a consistent yet flexible process for evaluating distributions in a PRA by identifying the critical attributes of an exposure factor distribution and discussing how these attributes relate to the task-specific adequacy of the input. This objective is carried out with emphasis on the perspective of a risk manager or decision-maker. The proposed evaluation procedure provides consistency to the review process without a loss of flexibility. As a result, the approach described in Part 1 provides an opportunity to apply a single review framework for all EPA regions and yet provide the regional risk manager with the flexibility to deal with site- and case-specific issues in the PRA process. However, as the number of inputs to a PRA increases, so does the complexity of the process for calculating, communicating and managing risk. As a result, there is increasing effort required of both the risk professionals performing the analysis and the risk manager.

  7. Distributed collaborative probabilistic design of multi-failure structure with fluid-structure interaction using fuzzy neural network of regression

    Science.gov (United States)

    Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen

    2018-05-01

    To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network of regression (FR) (called DCFRM) is proposed by integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea of DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and the overall failure mode of the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. The comparison of methods shows that the DCFRM reshapes the probabilistic analysis of multi-failure structures and improves computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and methods of mechanical reliability design.

  8. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    Science.gov (United States)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies like the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector machine regression (SR) (called DCSRM) is proposed by integrating the distributed collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC and improving the performance and reliability of the aeroengine. The comparison of methods shows that the DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.

  9. Evaluation for probabilistic distributions of fatigue life of marine propeller materials by using a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Yoon, Han Yong; Zhang, Jian Wei

    2008-01-01

    Engineering materials have been studied and developed remarkably for a long time, but few reports on marine propeller materials have been presented. Recently, some researchers have studied the material strength of marine propellers. However, studies on the parametric sensitivity and probabilistic distribution of the fatigue life of propeller materials have not yet been made. In this study, a method to predict the probabilistic distributions of the fatigue life of propeller materials is presented, and the influence of several parameters on the life distribution is discussed

  10. An Investigation of the Sampling Distributions of Equating Coefficients.

    Science.gov (United States)

    Baker, Frank B.

    1996-01-01

    Using the characteristic curve method for dichotomously scored test items, the sampling distributions of equating coefficients were examined. Simulations indicate that for the equating conditions studied, the sampling distributions of the equating coefficients appear to have acceptable characteristics, suggesting confidence in the values obtained…

  11. An Investigation of the Sampling Distribution of the Congruence Coefficient.

    Science.gov (United States)

    Broadbooks, Wendy J.; Elmore, Patricia B.

    This study developed and investigated an empirical sampling distribution of the congruence coefficient. The effects of sample size, number of variables, and population value of the congruence coefficient on the sampling distribution of the congruence coefficient were examined. Sample data were generated on the basis of the common factor model and…

  12. The distribution coefficient concept and aspects on experimental distribution studies

    International Nuclear Information System (INIS)

    Allard, B.; Andersson, K.; Torstenfelt, B.

    1983-01-01

    Aspects of the distribution coefficient concept, sorption mechanisms and measurements of sorption phenomena are given. The distribution constant was shown to be pH-dependent. The temperature dependence of the sorption has been studied. The influence of the liquid/solid ratio has been studied for Cs and Sr. (G.B.)

  13. Development of database on the distribution coefficient. 2. Preparation of database

    Energy Technology Data Exchange (ETDEWEB)

    Takebe, Shinichi; Abe, Masayoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    The distribution coefficient is a very important parameter for environmental impact assessment on the disposal of radioactive waste arising from research institutes. The 'Database on the Distribution Coefficient' was built up from information obtained by a literature survey within the country on items such as the value, measuring method and measurement conditions of the distribution coefficient, in order to select reasonable distribution coefficient values for use in safety evaluations. This report explains the outline of the preparation of this database and serves as a user's guide to the database. (author)

  14. Countermeasure against probabilistic blinding attack in practical quantum key distribution systems

    International Nuclear Information System (INIS)

    Qian Yong-Jun; Li Hong-Wei; He De-Yong; Yin Zhen-Qiang; Zhang Chun-Mei; Chen Wei; Wang Shuang; Han Zheng-Fu

    2015-01-01

    In a practical quantum key distribution (QKD) system, imperfect equipment, especially the single-photon detector, can be eavesdropped on via a blinding attack. However, the original blinding attack may be discovered by directly monitoring the detector current. In this paper, we propose a probabilistic blinding attack model, in which Eve applies a blinding attack probabilistically without being caught when only the existing intuitive countermeasure is used. More precisely, our countermeasure solves the problem of how to define the bound given the limited precision of current detection, and we then prove the security of the practical system by taking the current parameter into account. Meanwhile, we discuss the bound of the quantum bit error rate (QBER) introduced by Eve, by which Eve can acquire information without the countermeasure. (paper)

  15. Distributing Correlation Coefficients of Linear Structure-Activity/Property Models

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACA

    2011-12-01

    Full Text Available Quantitative structure-activity/property relationships are mathematical relationships linking chemical structure and activity/property in a quantitative manner. These in silico approaches are frequently used to reduce animal testing in risk assessment, as well as to increase time- and cost-effectiveness in the characterization and identification of active compounds. The aim of our study was to investigate the distribution pattern of the correlation coefficients associated with simple linear relationships linking the structure of the compounds with their activities. A set of the most common ordnance compounds found at naval facilities, with a limited data set covering a range of toxicities to the aquatic ecosystem, and a set of seven properties were studied. Statistically significant models were selected and investigated. The probability density function of the correlation coefficients was investigated using a series of possible continuous distribution laws. Almost 48% of the correlation coefficients proved to fit a Beta distribution, 40% fit a Generalized Pareto distribution, and 12% fit a Pert distribution.
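
    The distribution-fitting step can be sketched with scipy; the synthetic correlation coefficients below stand in for the study's model results, and the candidate families follow the abstract (Beta, Generalized Pareto):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Stand-in sample of correlation coefficients from many simple linear models.
r = rng.beta(8, 3, 500)

for name, dist in {"beta": stats.beta, "genpareto": stats.genpareto}.items():
    params = dist.fit(r)                          # maximum-likelihood fit
    ks = stats.kstest(r, dist.cdf, args=params)   # goodness-of-fit check
    print(f"{name}: params {np.round(params, 3)}, KS p-value {ks.pvalue:.3f}")
```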

  16. Probabilistic distributions of pin gaps within a wire-spaced fuel subassembly and sensitivities of the related uncertainties to pin gap

    International Nuclear Information System (INIS)

    Sakai, K.; Hishida, H.

    1978-01-01

    Probabilistic fuel pin gap distributions within a wire-spaced fuel subassembly and the sensitivities of the related uncertainties to fuel pin gaps are discussed. The analyses consist mainly of expressing a local fuel pin gap in terms of sensitivity functions of the related uncertainties and calculating the corresponding probabilistic distribution by taking all possible combinations of the distributions of the uncertainties. The results of illustrative calculations show that, at a reliability level of 0.9987, the maximum deviation of the pin gap at the cladding hot spot of a center fuel subassembly is 8.05% from its nominal value, and the corresponding probabilistic pin gap distribution is shifted to the narrower side due to the external confinement of the pin bundle by a wrapper tube. (Auth.)

  17. Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

    Science.gov (United States)

    Seth J. Wenger; Nicholas A. Som; Daniel C. Dauwalter; Daniel J. Isaak; Helen M. Neville; Charles H. Luce; Jason B. Dunham; Michael K. Young; Kurt D. Fausch; Bruce E. Rieman

    2013-01-01

    Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing...

  18. Correlation of Cadmium Distribution Coefficients to Soil Characteristics

    DEFF Research Database (Denmark)

    Holm, Peter Engelund; Rootzen, Helle; Borggaard, Ole K.

    2003-01-01

    Studies on whole soil samples have shown that pH is the main parameter controlling the distribution. To identify further the components that are important for Cd binding in soil, we measured Cd distribution coefficients (Kd) at two fixed pH values and at low Cd loadings for 49 soils sampled in Denmark. The Kd values for Cd ranged from 5 to 3000 L kg(-1). The soils were described pedologically and characterized in detail (22 parameters), including determination of the contents of the various minerals in the clay fraction. Correlating parameters were grouped, and step-wise regression analysis revealed which soil components (interlayered clay minerals [HIM], chlorite, quartz, microcline, plagioclase) were significant in explaining the Cd distribution coefficient.

  19. Study of Power Fluctuation from Dispersed Generations and Loads and its Impact on a Distribution Network through a Probabilistic Approach

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2007-01-01

    In order to assess the performance of a distribution system under normal operating conditions with large integration of renewable energy based dispersed generation (DG) units, probabilistic modeling of the distribution system is necessary to take into consideration the stochastic behavior of load demands and DG units such as wind generation and combined heat and power plant generation. This paper classifies probabilistic models of load demands and DG units into summer and winter periods, weekdays and weekends, as well as the 24 hours of a day. The voltage results from the probabilistic load flow...

  1. Transport coefficients in Lorentz plasmas with the power-law kappa-distribution

    International Nuclear Information System (INIS)

    Jiulin, Du

    2013-01-01

    Transport coefficients in Lorentz plasmas with the power-law κ-distribution are studied by using the transport equation and macroscopic laws of a Lorentz plasma without magnetic field. Expressions for the electric conductivity, thermoelectric coefficient, and thermal conductivity for the power-law κ-distribution are accurately derived. It is shown that these transport coefficients are significantly modified by the κ-parameter, and in the limit κ→∞ they reduce to the standard forms for a Maxwellian distribution

  2. Analysis of infant cry through weighted linear prediction cepstral coefficients and Probabilistic Neural Network.

    Science.gov (United States)

    Hariharan, M; Chee, Lim Sin; Yaacob, Sazali

    2012-06-01

    Acoustic analysis of infant cry signals has been proven to be an excellent tool for automatic detection of the pathological status of an infant. This paper investigates the application of parameter weighting for linear prediction cepstral coefficients (LPCCs) to provide a robust representation of infant cry signals. Three classes of infant cry signals were considered: normal cry signals, cry signals from deaf babies, and cries of babies with asphyxia. A Probabilistic Neural Network (PNN) is suggested to classify the infant cry signals into normal and pathological cries. The PNN is trained with different spread factors (smoothing parameters) to obtain better classification accuracy. The experimental results demonstrate that the suggested features and classification algorithms give a very promising classification accuracy of above 98%, which indicates that the suggested method can help medical professionals diagnose the pathological status of an infant from cry signals.

  3. Effects of varying the step particle distribution on a probabilistic transport model

    International Nuclear Information System (INIS)

    Bouzat, S.; Farengo, R.

    2005-01-01

    The consequences of varying the step particle distribution on a probabilistic transport model, which captures the basic features of transport in plasmas and was recently introduced in Ref. 1 [B. Ph. van Milligen et al., Phys. Plasmas 11, 2272 (2004)], are studied. Different superdiffusive transport mechanisms generated by a family of distributions with algebraic decays (Tsallis distributions) are considered. It is observed that the possibility of changing the superdiffusive transport mechanism improves the flexibility of the model for describing different situations. The use of the model to describe the low (L) and high (H) confinement modes is also analyzed

  4. Evaluation of distribution patterns and decision of distribution coefficients of trace elements in high-purity aluminium by INAA

    International Nuclear Information System (INIS)

    Hayakawa, Yasuhiro; Suzuki, Shogo; Hirai, Shoji

    1986-01-01

    Recently, high-purity aluminium has been used in semiconductor devices and other applications. It is required that trace impurities be reduced and that their content be quantitatively evaluated. In this study, the distribution patterns of many trace impurities in 99.999% aluminium ingots, purified using a normal freezing method, were evaluated by INAA. The effective distribution coefficient k for each detected element was calculated using a theoretical distribution equation for the normal freezing method. As a result, the detected elements generally showed k < 1, whereas Hf showed k > 1. In particular, La, Sm, U and Th could be effectively purified, but Sc and Hf could scarcely be purified. Furthermore, it was found that slower freezing gave an effective distribution coefficient closer to the equilibrium distribution coefficient, and that the effective distribution coefficient became smaller with larger atomic radius. (author)

  5. Probabilistic Analysis of a Composite Crew Module

    Science.gov (United States)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
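
    For intuition, a first-order calculation on an assumed normal strength-minus-load margin shows why probabilities near 10^-11 are reached with FORM rather than brute-force sampling; all moments below are assumptions, not CCM values.

```python
import numpy as np
from scipy import stats

# Margin g = R - S with independent normal strength R and demand S.
# At Pf ~ 1e-11, plain Monte Carlo would need ~1e13 samples, which is why
# FORM and conditional sampling are used alongside it.
mu_R, sd_R = 1.00, 0.040     # normalized strength allowable (assumed)
mu_S, sd_S = 0.65, 0.035     # normalized peak failure index under abort loads (assumed)
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)   # first-order reliability index
print(f"reliability index beta = {beta:.2f}, Pf = {stats.norm.sf(beta):.2e}")
```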

  6. Use of the lognormal distribution for the coefficients of friction and wear

    International Nuclear Information System (INIS)

    Steele, Clint

    2008-01-01

    To predict the reliability of a system, an engineer might allocate a distribution to each input. This raises a question: how to select the correct distribution? Siddall put forward an evolutionary approach that was intended to utilise both the understanding of the engineer and available data. However, this method requires a subjective initial distribution based on the engineer's understanding of the variable or parameter. If the engineer's understanding is limited, the initial distribution will be misrepresentative of the actual distribution, and application of the method will likely fail. To provide some assistance, the coefficients of friction and wear are considered here. Basic tribology theory, dimensional issues and the central limit theorem are used to argue that the distribution for each of the coefficients will typically be like a lognormal distribution. Empirical evidence from other sources is cited to lend support to this argument. It is concluded that the distributions for the coefficients of friction and wear would typically be lognormal in nature. It is therefore recommended that the engineer, without data or evidence to suggest differently, should allocate a lognormal distribution to the coefficients of friction and wear

  7. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.
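
    The quantity being bounded can be illustrated with a simple bootstrap stand-in (the paper itself derives analytical bounds for normal populations); the sample sizes and population parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(13)
x = rng.normal(50, 5, 40)      # sample from population 1
y = rng.normal(30, 6, 35)      # sample from population 2

def cv(a):
    """Coefficient of variation: sample sd over sample mean."""
    return a.std(ddof=1) / a.mean()

ratios = np.array([
    cv(rng.choice(x, x.size)) / cv(rng.choice(y, y.size))   # resample with replacement
    for _ in range(10_000)
])
lo, hi = np.percentile(ratios, [2.5, 97.5])
print(f"CV1/CV2 = {cv(x)/cv(y):.3f}, 95% bootstrap CI ({lo:.3f}, {hi:.3f})")
```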

  8. Probabilistic evaluation of fatigue crack growth in SA 508 and SA 533 B steel

    International Nuclear Information System (INIS)

    Dufresne, J.; Rieunier, J.B.

    1982-07-01

    This paper describes the method used to select the law best representing fatigue crack growth, in view of its introduction into a probabilistic computer code. A model of the selected law (the Paris law) and the statistical distributions of the corresponding numerical coefficients are presented. Results of the computation are given for the case of a PWR pressure vessel with defects in the belt-line weld

  9. Measurement of distribution coefficients of U series radionuclides on soils under shallow land environment (2). pH dependence of distribution coefficients

    International Nuclear Information System (INIS)

    Sakamoto, Yoshiaki; Takebe, Shinichi; Ogawa, Hiromichi; Inagawa, Satoshi; Sasaki, Tomozou

    2001-01-01

    In order to study sorption behavior of U series radionuclides (Pb, Ra, Th, Ac, Pa and U) under aerated zone environment (loam-rain water system) and aquifer environment (sand-groundwater system) for safety assessment of U bearing waste, pH dependence of distribution coefficients of each element has been obtained. The pH dependence of distribution coefficients of Pb, Ra, Th, Ac and U was analyzed by model calculation based on aqueous speciation of each element and soil surface charge characteristics, which is composed of a cation exchange capacity and surface hydroxyl groups. From the model calculation, the sorption behavior of Pb, Ra, Th, Ac and U could be described by a combination of cation exchange reaction and surface-complexation model. (author)

  10. A probabilistic method for testing and estimating selection differences between populations.

    Science.gov (United States)

    He, Yungang; Wang, Minxian; Huang, Xin; Li, Ran; Xu, Hongyang; Xu, Shuhua; Jin, Li

    2015-12-01

    Human populations around the world encounter various environmental challenges and, consequently, develop genetic adaptations to different selection forces. Identifying the differences in natural selection between populations is critical for understanding the roles of specific genetic variants in evolutionary adaptation. Although numerous methods have been developed to detect genetic loci under recent directional selection, a probabilistic solution for testing and quantifying selection differences between populations is lacking. Here we report the development of a probabilistic method for testing and estimating selection differences between populations. By use of a probabilistic model of genetic drift and selection, we showed that logarithm odds ratios of allele frequencies provide estimates of the differences in selection coefficients between populations. The estimates approximate a normal distribution, and variance can be estimated using genome-wide variants. This allows us to quantify differences in selection coefficients and to determine the confidence intervals of the estimate. Our work also revealed the link between genetic association testing and hypothesis testing of selection differences. It therefore supplies a solution for hypothesis testing of selection differences. This method was applied to a genome-wide data analysis of Han and Tibetan populations. The results confirmed that both the EPAS1 and EGLN1 genes are under statistically different selection in Han and Tibetan populations. We further estimated differences in the selection coefficients for genetic variants involved in melanin formation and determined their confidence intervals between continental population groups. Application of the method to empirical data demonstrated the outstanding capability of this novel approach for testing and quantifying differences in natural selection. © 2015 He et al.; Published by Cold Spring Harbor Laboratory Press.
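
    The core estimator described above — the log odds ratio of allele frequencies as the selection-coefficient difference, with a normal approximation — can be sketched as follows. The allele counts are invented, and a simple Woolf-type standard error stands in for the paper's genome-wide variance estimate.

    ```python
    # Hedged sketch: log odds ratio of allele frequencies with a z-test.
    import numpy as np
    from scipy.stats import norm

    # invented allele counts (derived, ancestral) in populations 1 and 2
    a1, b1 = 180, 220
    a2, b2 = 90, 310

    log_or = np.log(a1 / b1) - np.log(a2 / b2)
    se = np.sqrt(1/a1 + 1/b1 + 1/a2 + 1/b2)   # standard error of a log odds ratio
    z = log_or / se
    p = 2 * norm.sf(abs(z))
    print(f"log OR = {log_or:.3f} +/- {se:.3f}, z = {z:.2f}, p = {p:.2e}")
    ```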

  11. Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow

    Science.gov (United States)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-12-01

    Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and temporal factors such as the season of the year, the day of the week or the time of day. For deterministic radial distribution load flow studies the load is taken as constant. But load varies continually and with a high degree of uncertainty, so there is a need to model a probable realistic load. Monte-Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and then solving a deterministic radial load flow with these values; the probabilistic solution is reconstructed from the deterministic data obtained for each simulation (see the sketch below). The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding its impact on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
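
    A minimal sketch of the Monte-Carlo idea, on an assumed two-bus feeder with an approximate voltage-drop expression standing in for a full radial load-flow solver; all per-unit values are invented.

    ```python
    # Hedged sketch: probabilistic load flow by sampling P, Q and solving a
    # simplified deterministic model for each draw.
    import numpy as np

    r, x = 0.02, 0.04                   # assumed feeder impedance (pu)
    mu_p, sd_p = 0.8, 0.12              # assumed active-power load (pu)
    mu_q, sd_q = 0.4, 0.06              # assumed reactive-power load (pu)

    rng = np.random.default_rng(3)
    p = rng.normal(mu_p, sd_p, 50_000)
    q = rng.normal(mu_q, sd_q, 50_000)

    # approximate voltage drop for one radial section with V_source = 1 pu
    v = 1.0 - (r * p + x * q)
    loss = r * (p**2 + q**2) / v**2     # approximate series I^2 R loss
    print(f"V2: mean {v.mean():.4f} pu, 5th pct {np.percentile(v, 5):.4f} pu")
    print(f"loss: mean {loss.mean():.4f} pu")
    ```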

  12. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions, based solely on their own state, that ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
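
    One way to make the Markov-chain idea concrete is the sketch below: a row-stochastic transition matrix is synthesized — here by a Metropolis rule on an assumed ring of bins, our simplification rather than necessarily the paper's synthesis method — so that its stationary distribution equals the desired swarm density, and independent agent decisions drive the swarm density to that target.

    ```python
    # Hedged sketch: probabilistic swarm guidance as a Markov chain whose
    # stationary distribution is the desired density.
    import numpy as np

    target = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # desired density over 5 bins
    n = len(target)
    P = np.zeros((n, n))
    for i in range(n):
        for j in ((i - 1) % n, (i + 1) % n):       # assumed ring adjacency
            P[i, j] = 0.5 * min(1.0, target[j] / target[i])  # Metropolis rule
        P[i, i] = 1.0 - P[i].sum()                 # stay put otherwise

    dist = np.full(n, 1.0 / n)                     # swarm starts uniform
    for _ in range(500):                           # agents decide independently
        dist = dist @ P
    print(np.round(dist, 3))                       # converges to the target
    ```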

  13. A probabilistic SSYST-3 analysis for a PWR-core during a large break LOCA

    International Nuclear Information System (INIS)

    Schubert, J.D.; Gulden, W.; Jacobs, G.; Meyder, R.; Sengpiel, W.

    1985-05-01

    This report demonstrates the SSYST-3 analysis and its application to a German PWR of 1300 MW. The report is concerned with the probabilistic analysis of a PWR core during a loss-of-coolant accident due to a large break. With the probabilistic analysis, the distribution functions of the maximum temperatures and cladding elongations occurring in the core can be calculated. Parameters like rod power, the thermohydraulic boundary conditions, the stored energy in the fuel rods and the heat transfer coefficient were found to be the most important. The expected value of core damage was determined to be 2.9% on the basis of response surfaces for cladding temperature and strain deduced from SSYST-3 single-rod results. (orig./HP) [de

  14. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
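
    As a small companion to the Markovian-arrivals material mentioned above, the sketch below checks numerically that exponential, history-independent waiting times yield Poisson-distributed counts per window; the rate is assumed.

    ```python
    # Hedged sketch: exponential inter-arrival times imply Poisson counts.
    import numpy as np

    rate, t_window, n_windows = 2.0, 1.0, 20_000
    rng = np.random.default_rng(4)

    counts = np.empty(n_windows, dtype=int)
    for i in range(n_windows):
        # accumulate exponential waiting times until the window closes
        t, k = rng.exponential(1 / rate), 0
        while t < t_window:
            t += rng.exponential(1 / rate)
            k += 1
        counts[i] = k
    print(f"mean count {counts.mean():.3f} (Poisson predicts {rate * t_window})")
    print(f"variance  {counts.var():.3f} (Poisson predicts {rate * t_window})")
    ```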

  15. Distribution coefficients of purine alkaloids in water-ammonium sulfate-alkyl acetate-dialkyl phthalate systems

    Science.gov (United States)

    Korenman, Ya. I.; Krivosheeva, O. A.; Mokshina, N. Ya.

    2012-12-01

    The distribution of purine alkaloids (caffeine, theobromine, theophylline) was studied in the systems alkyl acetate-dialkyl phthalate-salting-out agent (ammonium sulfate). The quantitative characteristics of the extraction-distribution coefficients (D) and the degree of extraction (R, %) are calculated. The relationships between the distribution coefficients of the alkaloids and the length of the hydrocarbon radical in the molecule of the alkyl acetate (dialkyl phthalate) are determined. The possibility of predicting the distribution coefficients is demonstrated.

  16. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  17. Probabilistic Q-function distributions in fermionic phase-space

    International Nuclear Information System (INIS)

    Rosales-Zárate, Laura E C; Drummond, P D

    2015-01-01

    We obtain a positive probability distribution or Q-function for an arbitrary fermionic many-body system. This is different to previous Q-function proposals, which were either restricted to a subspace of the overall Hilbert space, or used Grassmann methods that do not give probabilities. The fermionic Q-function obtained here is constructed using normally ordered Gaussian operators, which include both non-interacting thermal density matrices and BCS states. We prove that the Q-function exists for any density matrix, is real and positive, and has moments that correspond to Fermi operator moments. It is defined on a finite symmetric phase-space equivalent to the space of real, antisymmetric matrices. This has the natural SO(2M) symmetry expected for Majorana fermion operators. We show that there is a physical interpretation of the Q-function: it is the relative probability for observing a given Gaussian density matrix. The distribution has a uniform probability across the space at infinite temperature, while for pure states it has a maximum value on the phase-space boundary. The advantage of probabilistic representations is that they can be used for computational sampling without a sign problem. (fast track communication)

  18. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task specific operations.

  19. Energy distribution for Coefficients of Redundant Signal Representations of Music

    DEFF Research Database (Denmark)

    Endelt, Line Ørtoft; la Cour-Harbo, Anders

    2005-01-01

    In this paper we investigate how the energy is distributed in the coefficient vectors of various redundant signal representations of music signals. The representations are found using Basis Pursuit, Matching Pursuit, Alternating Projections, Best Orthogonal Basis and Method of Frames, with five different time-frequency dictionaries. We have applied these methods to music to examine their ability to express music signals in a sparse manner for a number of dictionaries and window lengths. The evaluation is based on the m-term approximation needed to represent 90%, 95%, 99% and 99.9% of the energy in the coefficients; the time consumption for finding the representations is also considered. The distribution of energy in the coefficients of the representations found using Basis Pursuit, Matching Pursuit, Alternating Projections and Best Orthogonal Basis depends mainly on the signal...

  20. Confidence bounds for normal and lognormal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill

    2003-01-01

    This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation to approximate methods. Approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...

  1. A probabilistic method for species sensitivity distributions taking into account the inherent uncertainty and variability of effects to estimate environmental risk.

    Science.gov (United States)

    Gottschalk, Fadri; Nowack, Bernd

    2013-01-01

    This article presents a method of probabilistically computing species sensitivity distributions (SSD) that is well-suited to cope with distinct data scarcity and variability. First, a probability distribution that reflects the uncertainty and variability of sensitivity is modeled for each species considered. These single species sensitivity distributions are then combined to create an SSD for a particular ecosystem. A probabilistic estimation of the risk is carried out by combining the probability of critical environmental concentrations with the probability of organisms being impacted negatively by these concentrations. To evaluate the performance of the method, we developed SSD and risk calculations for the aquatic environment exposed to triclosan. The case studies showed that the probabilistic results reflect the empirical information well, and the method provides a valuable alternative or supplement to more traditional methods for calculating SSDs based on averaging raw data and/or on using theoretical distributional forms. A comparison and evaluation with single SSD values (5th-percentile [HC5]) revealed the robustness of the proposed method. Copyright © 2012 SETAC.
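
    A minimal sketch of the two-step construction described above, with invented per-species summaries: draw one sensitivity per species from its own distribution, form an SSD realization, and accumulate the distribution of the 5th percentile (HC5).

    ```python
    # Hedged sketch: probabilistic SSD and the resulting HC5 distribution.
    import numpy as np

    # per-species log10 sensitivity summaries (assumed): mean and std
    species = [(1.2, 0.30), (1.8, 0.25), (0.9, 0.40), (2.1, 0.20), (1.5, 0.35)]
    rng = np.random.default_rng(5)
    n_draws = 10_000

    hc5 = np.empty(n_draws)
    for i in range(n_draws):
        # one sensitivity draw per species -> one realization of the SSD
        ssd = [rng.normal(mu, sd) for mu, sd in species]
        hc5[i] = np.percentile(ssd, 5)    # 5th percentile of this realization

    print(f"HC5 (log10 units): median {np.median(hc5):.2f}, "
          f"90% interval [{np.percentile(hc5, 5):.2f}, {np.percentile(hc5, 95):.2f}]")
    ```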

  2. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...

  3. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    Science.gov (United States)

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to the individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts first, which turned out to be of limited applicability due to highly overestimated migration results. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented in view of three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
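
    The Monte Carlo step described above can be sketched as follows. The short-time plane-sheet release estimate M_t/M_inf ≈ (2/L)·sqrt(Dt/π) is our stand-in for a full migration model, and all distributions and parameter values are assumed.

    ```python
    # Hedged sketch: sample D and layer thickness, propagate through a simple
    # diffusion estimate, and report the migrated fraction with uncertainty.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000
    t = 10 * 24 * 3600                           # assumed contact time: 10 days (s)
    D = rng.lognormal(np.log(1e-14), 0.5, n)     # assumed diffusion coeff (cm^2/s)
    L = rng.normal(100e-4, 10e-4, n)             # assumed layer thickness (cm)

    # short-time release from one face of a plane sheet, capped at depletion
    frac = np.minimum(2.0 * np.sqrt(D * t / np.pi) / L, 1.0)
    print(f"migrated fraction: median {np.median(frac):.3%}, "
          f"95th pct {np.percentile(frac, 95):.3%}")
    ```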

  4. Distribution coefficient of plutonium between sediment and seawater

    International Nuclear Information System (INIS)

    Duursma, E.K.; Parsi, P.

    1974-01-01

    Using plutonium-237 as a tracer, a series of experiments was conducted to determine the distribution coefficient of plutonium onto sediments under both oxic and anoxic conditions, where the plutonium was added to seawater in three different valence states: III, IV and VI

  5. DYNAMIC SOFTWARE TESTING MODELS WITH PROBABILISTIC PARAMETERS FOR FAULT DETECTION AND ERLANG DISTRIBUTION FOR FAULT RESOLUTION DURATION

    Directory of Open Access Journals (Sweden)

    A. D. Khomonenko

    2016-07-01

    Full Text Available Subject of Research. Software reliability and test planning models are studied taking into account the probabilistic nature of error detection and discovery. Modeling of software testing enables the planning of resources and final quality at early stages of project execution. Methods. Two dynamic models of test processes (strategies) are suggested for software testing, using an error detection probability for each software module. The Erlang distribution is used for approximating an arbitrary distribution of the fault resolution duration, and the exponential distribution is used for approximating fault detection. For each strategy, modified labeled graphs are built, along with differential equation systems and their numerical solutions. The latter make it possible to compute probabilistic characteristics of the test processes and states: state probabilities, distribution functions for fault detection and elimination, mathematical expectations of random variables, and the numbers of detected or fixed errors. Evaluation of Results. Probabilistic characteristics for software development projects were calculated using the suggested models. The strategies were compared by their quality indexes, and the debugging time required to achieve the specified quality goals was calculated. The calculation results are used for time and resource planning for new projects. Practical Relevance. The proposed models make it possible to use reliability estimates for each individual module. The Erlang approximation removes restrictions on the use of an arbitrary time distribution for the fault resolution duration. It improves the accuracy of software test process modeling and helps to take into account the viability (power) of the tests. With the use of these models we can search for ways to improve software reliability by generating tests which detect errors with the highest probability.
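
    The Erlang ingredient above is easy to illustrate: an Erlang(k, λ) duration is a sum of k independent exponential stages, which is what allows an arbitrary fault-resolution-time distribution to be approximated inside a Markovian state model. The parameters below are assumed.

    ```python
    # Hedged sketch: an Erlang(k, lam) duration as a sum of k exponential stages.
    import numpy as np

    k, lam = 3, 0.5                     # assumed shape (stages) and stage rate
    rng = np.random.default_rng(7)
    durations = rng.exponential(1 / lam, size=(100_000, k)).sum(axis=1)
    print(f"mean {durations.mean():.2f} (theory {k / lam}), "
          f"var {durations.var():.2f} (theory {k / lam**2})")
    ```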

  6. Self-similar optical pulses in competing cubic-quintic nonlinear media with distributed coefficients

    International Nuclear Information System (INIS)

    Zhang Jiefang; Tian Qing; Wang Yueyue; Dai Chaoqing; Wu Lei

    2010-01-01

    We present a systematic analysis of the self-similar propagation of optical pulses within the framework of the generalized cubic-quintic nonlinear Schroedinger equation with distributed coefficients. By appropriately choosing the relations between the distributed coefficients, we not only retrieve the exact self-similar solitonic solutions, but also find both the approximate self-similar Gaussian-Hermite solutions and compact solutions. Our analytical and numerical considerations reveal that proper choices of the distributed coefficients could make the unstable solitons stable and could restrict the nonlinear interaction between the neighboring solitons.

  7. On the size distribution of cities: an economic interpretation of the Pareto coefficient.

    Science.gov (United States)

    Suh, S H

    1987-01-01

    "Both the hierarchy and the stochastic models of size distribution of cities are analyzed in order to explain the Pareto coefficient by economic variables. In hierarchy models, it is found that the rate of variation in the productivity of cities and that in the probability of emergence of cities can explain the Pareto coefficient. In stochastic models, the productivity of cities is found to explain the Pareto coefficient. New city-size distribution functions, in which the Pareto coefficient is decomposed by economic variables, are estimated." excerpt

  8. Generation of pseudo-random numbers from given probabilistic distribution with the use of chaotic maps

    Science.gov (United States)

    Lawnik, Marcin

    2018-01-01

    The scope of the paper is the presentation of a new method of generating numbers from a given distribution. The method uses the inverse cumulative distribution function and a method of flattening of probabilistic distributions. On the grounds of these methods, a new construction of chaotic maps was derived which generates values from a given distribution. The analysis of the new method was conducted on the example of newly constructed chaotic recurrences, based on the Box-Muller transformation and the quantile function of the exponential distribution. The obtained results certify that the proposed method may be successfully applied for the construction of generators of pseudo-random numbers.
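
    A minimal sketch of the general idea, under substitutions of ours: the logistic map (rather than the paper's construction) supplies the chaotic orbit, a known conjugacy maps its iterates to uniform variates, and the exponential quantile function — which the paper does use — converts those to exponentially distributed values.

    ```python
    # Hedged sketch: chaotic iterates pushed through an inverse CDF.
    import numpy as np

    lam, n = 1.5, 100_000
    x, out = 0.1234567, np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)                  # logistic map, chaotic at r = 4
        u = 2.0 / np.pi * np.arcsin(np.sqrt(x))  # conjugacy to a uniform variate
        out[i] = -np.log1p(-u) / lam             # exponential quantile function
    print(f"sample mean {out.mean():.3f} (theory {1 / lam:.3f})")
    ```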

  9. Sediment pore water distribution coefficients of PCB congeners in enriched black carbon sediment

    International Nuclear Information System (INIS)

    Martinez, Andres; O'Sullivan, Colin; Reible, Danny; Hornbuckle, Keri C.

    2013-01-01

    More than 2300 sediment pore water distribution coefficients (K_PCBid) of 93 polychlorinated biphenyls (PCBs) were measured and modeled from sediments from Indiana Harbor and Ship Canal. K_PCBid values were calculated from previously reported bulk sediment values and newly analyzed pore water. PCBs in pore waters were measured using an SPME PDMS-fiber, and ∑PCB ranged from 41 to 1500 ng L⁻¹. The resulting K_PCBid values were ∼1 log unit lower in comparison to other reported values. A simple model for the K_PCBid consisted of the product of the organic carbon fraction and the octanol–water partition coefficient and provided an excellent prediction of the measured values, with a mean square error of 0.09 ± 0.06. Although black carbon content is very high in these sediments and was expected to play an important role in the distribution of PCBs, no improvement was obtained when a two-carbon model was used. Highlights: PCB sediment-pore water distribution coefficients were measured and modeled; the distribution coefficients were lower in comparison to other reported values; the organic carbon fraction times the K_OW yielded the best prediction model; the incorporation of black carbon into a model did not improve the results.
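
    The one-carbon model the study found sufficient is just the product of the organic-carbon fraction and the octanol–water partition coefficient; the sketch below evaluates it on invented numbers.

    ```python
    # Hedged sketch: the one-carbon prediction K_d = f_oc * K_OW in log space.
    import numpy as np

    f_oc = 0.05                    # assumed organic carbon fraction of sediment
    log_kow = 6.0                  # assumed log K_OW for a PCB congener
    log_kd_pred = np.log10(f_oc) + log_kow
    log_kd_meas = 4.6              # assumed measured log K_d
    print(f"predicted log K_d = {log_kd_pred:.2f}, "
          f"squared error = {(log_kd_pred - log_kd_meas)**2:.3f}")
    ```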

  10. Procedures and apparatus for measuring diffusion and distribution coefficients in compacted clays

    Energy Technology Data Exchange (ETDEWEB)

    Hume, H B

    1993-12-01

    Diffusion and distribution coefficients are needed to assess the migration of radionuclides through the compacted clay-based buffer and backfill materials proposed for use in a nuclear fuel waste disposal vault. This report describes the techniques used to measure these coefficients. Both steady-state and transient diffusion experiments are discussed. The procedures used to prepare the clay plug, assemble the cell, conduct the experiment and calculate the results are described. In addition, methods for obtaining distribution coefficients for radionuclides on both loose and compacted clays are discussed. (author). 18 refs., 3 tabs., 16 figs.

  11. Procedures and apparatus for measuring diffusion and distribution coefficients in compacted clays

    International Nuclear Information System (INIS)

    Hume, H.B.

    1993-12-01

    Diffusion and distribution coefficients are needed to assess the migration of radionuclides through the compacted clay-based buffer and backfill materials proposed for use in a nuclear fuel waste disposal vault. This report describes the techniques used to measure these coefficients. Both steady-state and transient diffusion experiments are discussed. The procedures used to prepare the clay plug, assemble the cell, conduct the experiment and calculate the results are described. In addition, methods for obtaining distribution coefficients for radionuclides on both loose and compacted clays are discussed. (author). 18 refs., 3 tabs., 16 figs

  12. The stochastic distribution of available coefficient of friction on quarry tiles for human locomotion.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    The available coefficient of friction (ACOF) for human locomotion is the maximum coefficient of friction that can be supported without a slip at the shoe and floor interface. A statistical model was introduced to estimate the probability of slip by comparing the ACOF with the required coefficient of friction, assuming that both coefficients have stochastic distributions. This paper presents an investigation of the stochastic distributions of the ACOF of quarry tiles under dry, water and glycerol conditions. One hundred friction measurements were performed on a walkway under the surface conditions of dry, water and 45% glycerol concentration. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF appears to fit the normal and log-normal distributions better than the Weibull distribution for the water and glycerol conditions. However, no match was found between the distribution of ACOF under the dry condition and any of the three continuous distributions evaluated. Based on limited data, a normal distribution might be more appropriate due to its simplicity, practicality and familiarity among the three distributions evaluated.
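
    The goodness-of-fit step above can be sketched with synthetic stand-in data: fit normal, log-normal and Weibull models and apply the Kolmogorov-Smirnov test to each (p-values are only approximate when the parameters are estimated from the same data).

    ```python
    # Hedged sketch: K-S goodness-of-fit screening of candidate distributions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    acof = rng.lognormal(np.log(0.35), 0.15, 100)   # synthetic stand-in ACOF data

    for name, dist in [("normal", stats.norm),
                       ("log-normal", stats.lognorm),
                       ("Weibull", stats.weibull_min)]:
        params = dist.fit(acof)                     # maximum-likelihood fit
        d, p = stats.kstest(acof, dist.cdf, args=params)
        print(f"{name:>10}: KS D = {d:.3f}, p = {p:.3f}")
    ```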

  13. Measurement and prediction of aromatic solute distribution coefficients for aqueous-organic solvent systems. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.R.; Luthy, R.G.

    1984-06-01

    Experimental and modeling activities were performed to assess techniques for measurement and prediction of distribution coefficients for aromatic solutes between water and immiscible organic solvents. Experiments were performed to measure distribution coefficients in both clean water and wastewater systems, and to assess treatment of a wastewater by solvent extraction. The theoretical portions of this investigation were directed towards development of techniques for prediction of solute-solvent/water distribution coefficients. Experiments were performed to assess treatment of a phenolic-laden coal conversion wastewater by solvent extraction. The results showed that solvent extraction for recovery of phenolic material offered several wastewater processing advantages. Distribution coefficients were measured in clean water and wastewater systems for aromatic solutes of varying functionality with different solvent types. It was found that distribution coefficients for these compounds in clean water systems were not statistically different from distribution coefficients determined in a complex coal conversion process wastewater. These and other aromatic solute distribution coefficient data were employed for evaluation of modeling techniques for prediction of solute-solvent/water distribution coefficients. Eight solvents were selected in order to represent various chemical classes: toluene and benzene (aromatics), hexane and heptane (alkanes), n-octanol (alcohols), n-butyl acetate (esters), diisopropyl ether (ethers), and methylisobutyl ketone (ketones). The aromatic solutes included: nonpolar compounds such as benzene, toluene and naphthalene, phenolic compounds such as phenol, cresol and catechol, nitrogenous aromatics such as aniline, pyridine and aminonaphthalene, and other aromatic solutes such as naphthol, quinolinol and halogenated compounds. 100 references, 20 figures, 34 tables.

  14. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic characteristics of the response, such as the expectation, covariance and correlation of the stress intensity factors, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be used to determine the probability of fracture or reliability.

  15. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  16. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  17. Probabilistic evaluation method of stability of ground and slope considering spatial randomness of soil properties

    International Nuclear Information System (INIS)

    Ohtori, Yasuki

    2004-01-01

    In the JEAG4601-1987 (Japan Electric Association Guide for earthquake resistance design), either the conventional deterministic method or the probabilistic method is used for evaluating the stability of ground foundations and surrounding slopes in nuclear power plants. The deterministic method, in which soil properties of 'mean ± coefficient × standard deviation' are adopted for the calculations, has generally been used in the design stage to date. On the other hand, the probabilistic method, in which the soil properties are assumed to have probabilistic distributions, is stated as a future method. The deterministic method facilitates the evaluation; however, it is necessary to clarify its relation with the probabilistic method. In this paper, the relationship between the deterministic and the probabilistic methods is investigated. To do that, a simple model that can take into account the dynamic effect of structures and a simplified method for accounting for spatial randomness are proposed and used for the studies. As a result of the studies, it is found that the strength of the soil is the most important factor for the stability of ground structures, and that the probability of falling below the safety factor evaluated by the deterministic method with soil properties of 'mean − 1.0 × standard deviation' is much lower. (author)

  18. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  19. Correlation Structures of Correlated Binomial Models and Implied Default Distribution

    OpenAIRE

    S. Mori; K. Kitsukawa; M. Hisakado

    2006-01-01

    We show how to analyze and interpret the correlation structures, the conditional expectation values and correlation coefficients of exchangeable Bernoulli random variables. We study implied default distributions for the iTraxx-CJ tranches and some popular probabilistic models, including the Gaussian copula model, Beta binomial distribution model and long-range Ising model. We interpret the differences in their profiles in terms of the correlation structures. The implied default distribution h...

  20. Probabilistic Performance Guarantees for Distributed Self-Assembly

    KAUST Repository

    Fox, Michael J.

    2015-04-01

    In distributed self-assembly, a multitude of agents seek to form copies of a particular structure, modeled here as a labeled graph. In the model, agents encounter each other in spontaneous pairwise interactions and decide whether or not to form or sever edges based on their two labels and a fixed set of local interaction rules described by a graph grammar. The objective is to converge on a graph with a maximum number of copies of a given target graph. Our main result is the introduction of a simple algorithm that achieves an asymptotically maximum yield in a probabilistic sense. Notably, agents do not need to update their labels except when forming or severing edges. This contrasts with certain existing approaches that exploit information propagating rules, effectively addressing the decision problem at the level of subgraphs as opposed to individual vertices. We are able to obey more stringent locality requirements while also providing smaller rule sets. The results can be improved upon if certain requirements on the labels are relaxed. We discuss limits of performance in self-assembly in terms of rule set characteristics and achievable maximum yield.

  1. The rate coefficients of unimolecular reactions in the systems with power-law distributions

    Science.gov (United States)

    Yin, Cangtao; Guo, Ran; Du, Jiulin

    2014-08-01

    The rate coefficient formulae of unimolecular reactions are generalized to systems with power-law distributions based on nonextensive statistics, and the power-law rate coefficients are derived in the high and low pressure limits, respectively. Numerical analyses are made of the rate coefficients as functions of the ν-parameter, the threshold energy, the temperature and the number of degrees of freedom. We show that the new rate coefficients depend strongly on a ν-parameter different from one (thus on the departure from a Boltzmann-Gibbs distribution). Two unimolecular reactions, CH3CO→CH3+CO and CH3NC→CH3CN, are taken as application examples to calculate their power-law rate coefficients, which, obtained with ν-parameters slightly different from one, are in exact agreement with all the experimental studies on these two reactions in the given temperature ranges.

  2. PROBABILISTIC FLOW DISTRIBUTION AS A REACTION TO THE STOCHASTICITY OF THE LOAD IN THE POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    A. M. Hashimov

    2016-01-01

    Full Text Available For the analysis and control of power systems, deterministic approaches, implemented in the form of well-known methods and models for calculating steady-state and transient modes, are mostly used in current practice. With these methods it is possible to obtain solutions only for fixed circuit parameters of the system scheme, and under the assumption that the active and reactive power consumption and generation at the nodal points of the network remain the same. In reality, the stochastic character of power consumption causes random fluctuations of the voltages at the nodes and of the power flows in the electric power lines of the power system. Such random fluctuations of operation can be estimated with the use of probabilistic simulation of the power flows. In the article, the results of research on the influence of the depth of random fluctuations of the system load power on the probability distribution of the voltage at the nodes, as well as on the flows of active and reactive power in the lines, are presented. Probabilistic modeling of the flows under stochastic load change is performed for different levels of fluctuation and for loading of the system up to peak load power. A test study to quantify the effect of the stochastic variability of loads on the probabilistic distribution parameters of the modes was carried out on the electrical network of a real power system. The results of the simulation of the probability flow distribution for these load fluctuations, represented in the form of discrete sample values of the active power obtained with the analytical Monte-Carlo method, were compared with real measurements of their values in the network under examination.

  3. Distribution coefficients for plutonium and americium on particulates in aquatic environments

    International Nuclear Information System (INIS)

    Sanchez, A.L.; Schell, W.R.; Sibley, T.H.

    1982-01-01

    The distribution coefficients of two transuranic elements, plutonium and americium, were measured experimentally in laboratory systems of selected freshwater, estuarine, and marine environments. Gamma-ray emitting isotopes of these radionuclides, 237Pu and 241Am, were used as tracers. The desorption Ksub(d) values were significantly greater than the sorption Ksub(d) values, suggesting some irreversibility in the sorption of these radionuclides onto sediments. The effects of pH and of sediment concentration on the distribution coefficients were also investigated. There were significant changes in the Ksub(d) values as these parameters were varied. Experiments using sterilized and nonsterilized samples for some of the sediment/water systems indicate possible bacterial effects on Ksub(d) values. (author)

  4. Measurement of the distribution coefficient between soil and Cesium-137

    International Nuclear Information System (INIS)

    Tejada V, S.; Hernandez P, M.

    1996-01-01

    The measurement of the distribution coefficient of Cs-137 is currently performed by the batch method between a radioisotope solution and soil collected from the Mexican Disposal Site, near the town of Maquixco, in the State of Mexico. The Kd values were obtained at an activity concentration of Cs-137 of 100 Bq. The solution is shaken for seven days at 25 °C, by which time the maximum amount of radionuclide is absorbed by the soil. The radionuclide remaining in solution is measured by gamma spectrometry. The results obtained from the batch method show that the distribution coefficients ranged from 144 to 660 ml/g for fine soil particles. This work is currently done as part of the site characterization studies for the disposal of low-level rad-waste. (authors). 10 refs., 2 tabs

  5. Estimation of the effective distribution coefficient from the solubility constant

    International Nuclear Information System (INIS)

    Wang, Yug-Yea; Yu, C.

    1994-01-01

    An updated version of RESRAD has been developed by Argonne National Laboratory for the US Department of Energy to derive site-specific soil guidelines for residual radioactive material. In this updated version, many new features have been added to the RESRAD code. One of the options is that a user can input a solubility constant to limit the leaching of contaminants. The leaching model used in the code requires the input of an empirical distribution coefficient, Kd, which represents the ratio of the solute concentration in soil to that in solution under equilibrium conditions. This paper describes the methodology developed to estimate an effective distribution coefficient, Kd, from the user-input solubility constant and the use of the effective Kd for predicting the leaching of contaminants.

  6. Estimates of the Sampling Distribution of Scalability Coefficient H

    Science.gov (United States)

    Van Onna, Marieke J. H.

    2004-01-01

    Coefficient "H" is used as an index of scalability in nonparametric item response theory (NIRT). It indicates the degree to which a set of items rank orders examinees. Theoretical sampling distributions, however, have only been derived asymptotically and only under restrictive conditions. Bootstrap methods offer an alternative possibility to…

  7. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  8. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  9. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Full Text Available The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of load-carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and of some kinds of equipment. The analysis models can be applied in other engineering fields.

  10. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete–continuous mixtures of generalized logit–normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over...
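
    A minimal sketch of the generalized logit-normal idea, under assumptions of ours: transform bounded observations y in (0, 1) with g(y) = ln(y^ν/(1 − y^ν)), treat the transformed series as normal, and map quantiles back through the inverse; the paper additionally models probability masses at the bounds and estimates parameters recursively.

    ```python
    # Hedged sketch: generalized logit transform for bounded wind-power data.
    import numpy as np

    nu = 1.3                                       # assumed shape parameter
    rng = np.random.default_rng(9)
    y = np.clip(rng.beta(2.0, 5.0, 5000), 1e-6, 1 - 1e-6)  # stand-in power data

    z = np.log(y**nu / (1.0 - y**nu))              # generalized logit transform
    mu, sd = z.mean(), z.std()

    # quantiles in the transformed space, mapped back to (0, 1)
    q_z = mu + sd * np.array([-1.645, 0.0, 1.645]) # ~5%, 50%, 95% if z ~ normal
    q_y = (1.0 / (1.0 + np.exp(-q_z)))**(1.0 / nu) # inverse transform
    print(np.round(q_y, 3))
    ```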

  11. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  12. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  13. A methodology to quantify the stochastic distribution of friction coefficient required for level walking.

    Science.gov (United States)

    Chang, Wen-Ruey; Chang, Chien-Chi; Matz, Simon; Lesch, Mary F

    2008-11-01

    The required friction coefficient is defined as the minimum friction needed at the shoe and floor interface to support human locomotion. The available friction is the maximum friction coefficient that can be supported without a slip at the shoe and floor interface. A statistical model was recently introduced to estimate the probability of slip and fall incidents by comparing the available friction with the required friction, assuming that both the available and required friction coefficients have stochastic distributions. This paper presents a methodology to investigate the stochastic distributions of the required friction coefficient for level walking. In this experiment, a walkway with a layout of three force plates was specially designed in order to capture a large number of successful strikes without causing fatigue in participants. The required coefficient of friction data of one participant, who repeatedly walked on this walkway under four different walking conditions, are presented as an example of the readiness of the methodology examined in this paper. The results of the Kolmogorov-Smirnov goodness-of-fit test indicated that the required friction coefficient generated from each foot and walking condition by this participant appears to fit the normal, log-normal or Weibull distributions, with few exceptions. Among these three distributions, the normal distribution appears to fit all the data generated with this participant. The average number of successful strikes per walk achieved with the three force plates in this experiment was 2.49, ranging from 2.14 to 2.95 across walking conditions. The methodology and layout of the experimental apparatus presented in this paper are suitable for application to a full-scale study.
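
    The statistical model this methodology feeds can be sketched in a few lines: with the available and required coefficients of friction treated as independent normal variables (parameter values assumed), the per-step slip probability is the probability that the required value exceeds the available one.

    ```python
    # Hedged sketch: P(slip) = P(required COF > available COF) for normals.
    from math import sqrt
    from scipy.stats import norm

    mu_a, sd_a = 0.40, 0.05    # assumed available friction (shoe/floor)
    mu_r, sd_r = 0.22, 0.04    # assumed required friction (gait)

    p_slip = norm.cdf((mu_r - mu_a) / sqrt(sd_a**2 + sd_r**2))
    print(f"per-step slip probability: {p_slip:.2e}")
    ```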

  14. Study of distribution coefficients of admixtures in tellurium

    International Nuclear Information System (INIS)

    Kuchar, L.; Drapala, J.; Kuchar, L. jr.

    1986-01-01

    Limit areas of tellurium-admixture binary systems were studied and the values determined of steady-state distribution coefficients of admixtures. A second order polynomial was used to express equations of solidus and liquidus curves for Te-Se, Te-S, Te-Hg systems; the curves are graphically represented. The most effective method for preparing high-purity tellurium is zonal melting with material removal. (M.D.). 4 figs., 4 tabs., 16 refs

  15. River Flow Prediction Using the Nearest Neighbor Probabilistic Ensemble Method

    Directory of Open Access Journals (Sweden)

    H. Sanikhani

    2016-02-01

    Full Text Available Introduction: In recent years, researchers have become interested in the probabilistic forecasting of hydrologic variables such as river flow. A probabilistic approach aims at quantifying the prediction reliability through a probability distribution function or a prediction interval for the unknown future value. The evaluation of the uncertainty associated with the forecast is seen as fundamental information, not only to correctly assess the prediction, but also to compare forecasts from different methods and to evaluate actions and decisions conditionally on the expected values. Several probabilistic approaches have been proposed in the literature, including (1) methods that use resampling techniques to assess parameter and model uncertainty, such as the Metropolis algorithm or the Generalized Likelihood Uncertainty Estimation (GLUE) methodology for an application to runoff prediction, (2) methods based on processing the forecast errors of past data to produce the probability distributions of future values, and (3) methods that evaluate how the uncertainty propagates from the rainfall forecast to the river discharge prediction, as in the Bayesian forecasting system. Materials and Methods: In this study, two different probabilistic methods are used for river flow prediction, and the uncertainty related to the forecast is quantified. One approach is based on linear predictors; in the other, nearest neighbors are used. The nonlinear probabilistic ensemble can be used for nonlinear time series analysis using locally linear predictors, while the NNPE utilizes a method adapted for one-step-ahead nearest neighbor forecasting. In this regard, daily river discharge (twelve years) from the Dizaj and Mashin stations on the Baranduz-Chay basin in West Azerbaijan province and the Zard-River basin in Khouzestan province were used, respectively. The first six years of data were applied for fitting the model. The next three years were used for calibration and the remaining three years were utilized for testing the models

  16. Activity risk coefficients for living generations

    International Nuclear Information System (INIS)

    Raicevic, J.; Merkle, M.; Ninkovic, M. M.

    1993-01-01

    This paper deals with the new concept of activity risk coefficients (ARCs), which are used in probabilistic risk assessment (PRA) computer codes for the calculation of the stochastic effects due to low-dose exposures. As an example, the ARC expression for cloudshine is derived. (author)

  17. Distribution of temperature coefficient density for muons in the atmosphere

    Directory of Open Access Journals (Sweden)

    Kuzmenko V.S.

    2017-12-01

    Full Text Available To date, several dozen new muon detectors have been built. When studying cosmic-ray intensity variations with these detectors, located deep in the atmosphere, it is necessary to calculate all characteristics, including the distribution of temperature coefficient density for muons in the atmosphere, taking into account their specific geometry. For this purpose, we calculate the density of the temperature coefficients of muon intensity in the atmosphere at various zenith angles of detection at sea level and at various depths underground, for different absorption ranges of primary protons and pions in the atmosphere.

  18. Critical electrode size in measurement of d33 coefficient of films via spatial distribution of piezoelectric displacement

    International Nuclear Information System (INIS)

    Wang Zhihong; Miao Jianmin

    2008-01-01

    Spatial distributions of the piezoelectric displacement response across the top electrode have been used in this paper to measure the piezoelectric coefficient d33 of films based on the converse piezoelectric effect. The technical details and features of a scanning laser Doppler vibrometer have been summarized and discussed for accurately obtaining the spatial displacement distributions. Three definitions, namely the apparent, the effective and the constrained piezoelectric coefficient d33 of films, have been clarified and used to better understand the fundamental phenomenon behind the measured displacement distributions. Finite element analysis reveals that both the apparent and the effective piezoelectric coefficients depend on the electrode radius of the test capacitor as well as on the film thickness. However, there exists a critical electrode size for the apparent piezoelectric coefficient and a critical test capacitor aspect ratio for the effective piezoelectric coefficient. Beyond their respective critical values, both coefficients converge to the constrained piezoelectric coefficient irrespective of film thickness. The finding of the critical electrode size makes it possible to consistently measure the constrained piezoelectric coefficient of films by using the spatial distributions of the piezoelectric displacement response, and it becomes the fundamental criterion of this measurement method.

  19. Standard test method for distribution coefficients of inorganic species by the batch method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the determination of distribution coefficients of chemical species to quantify uptake onto solid materials by a batch sorption technique. It is a laboratory method primarily intended to assess sorption of dissolved ionic species subject to migration through pores and interstices of site specific geomedia. It may also be applied to other materials such as manufactured adsorption media and construction materials. Application of the results to long-term field behavior is not addressed in this method. Distribution coefficients for radionuclides in selected geomedia are commonly determined for the purpose of assessing potential migratory behavior of contaminants in the subsurface of contaminated sites and waste disposal facilities. This test method is also applicable to studies for parametric studies of the variables and mechanisms which contribute to the measured distribution coefficient. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement a...

  20. Characterizing the topology of probabilistic biological networks.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Biological interactions are often uncertain events that may or may not take place with some probability. This uncertainty leads to a massive number of alternative interaction topologies for each such network. The existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. In this paper, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. Using our mathematical representation, we develop a method that can accurately describe the degree distribution of such networks. We also take one more step and extend our method to accurately compute the joint-degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. Our method works quickly even for entire protein-protein interaction (PPI) networks. It also helps us find an adequate mathematical model using MLE. We perform a comparative study of node-degree and joint-degree distributions in two types of biological networks: the classical deterministic networks and the more flexible probabilistic networks. Our results confirm that power-law and log-normal models best describe degree distributions for both probabilistic and deterministic networks. Moreover, the inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with a large number of interactions prefer to interact with those with a small number of interactions more frequently than expected. We also show that probabilistic networks are more robust for node-degree distribution computation than the deterministic ones.
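    The polynomial-time computation described above can be illustrated under a simplifying assumption that the incident edges of a node occur independently: its degree then follows a Poisson-binomial distribution, computable with a small dynamic program. This is a sketch of the idea, not the authors' implementation.

```python
from typing import List

def degree_distribution(edge_probs: List[float]) -> List[float]:
    """P(deg = k) for a node whose incident edges occur independently
    with the given probabilities (Poisson-binomial, O(n^2) dynamic program)."""
    dist = [1.0]  # distribution over degree 0 for the empty edge set
    for p in edge_probs:
        new = [0.0] * (len(dist) + 1)
        for k, w in enumerate(dist):
            new[k] += w * (1.0 - p)  # edge absent
            new[k + 1] += w * p      # edge present
        dist = new
    return dist

# A node with three uncertain interactions:
print(degree_distribution([0.9, 0.5, 0.1]))
```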

  1. Probabilistic structural analysis of aerospace components using NESSUS

    Science.gov (United States)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  2. Probabilistic Flexural Fatigue in Plain and Fiber-Reinforced Concrete.

    Science.gov (United States)

    Ríos, José D; Cifuentes, Héctor; Yu, Rena C; Ruiz, Gonzalo

    2017-07-07

    The objective of this work is two-fold. First, we attempt to fit the experimental data on the flexural fatigue of plain and fiber-reinforced concrete with a probabilistic model (Saucedo, Yu, Medeiros, Zhang and Ruiz, Int. J. Fatigue, 2013, 48, 308-318). This model was validated for compressive fatigue at various loading frequencies, but not for flexural fatigue. Since the model is probabilistic, it is not necessarily related to the specific mechanism of fatigue damage, but rather generically explains the fatigue distribution in concrete (plain or reinforced with fibers) for damage under compression, tension or flexion. In this work, more than 100 series of flexural fatigue tests in the literature are fit with excellent results. Since the distribution of monotonic tests was not available in the majority of cases, a two-step procedure is established to estimate the model parameters based solely on fatigue tests. The coefficient of regression was more than 0.90 except for particular cases where not all tests were strictly performed under the same loading conditions, which confirms the applicability of the model to flexural fatigue data analysis. Moreover, the model parameters are closely related to fatigue performance, which demonstrates the predictive capacity of the model. For instance, the scale parameter is related to flexural strength, which improves with the addition of fibers. Similarly, fiber increases the scattering of fatigue life, which is reflected by the decreasing shape parameter.
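    A minimal sketch of this kind of distribution fitting, using SciPy on synthetic fatigue lives; it is not the Saucedo et al. model, and the data are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic fatigue lives (cycles), standing in for one series of flexural
# fatigue tests run at a single stress level.
lives = rng.weibull(1.5, size=30) * 2.0e5 + 1.0e4

# Two-parameter fit (location fixed at zero) versus three-parameter fit:
shape2, _, scale2 = stats.weibull_min.fit(lives, floc=0)
shape3, loc3, scale3 = stats.weibull_min.fit(lives)
print(f"2-p Weibull: shape={shape2:.2f}, scale={scale2:.3g}")
print(f"3-p Weibull: shape={shape3:.2f}, loc={loc3:.3g}, scale={scale3:.3g}")
```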

  3. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
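    A toy Monte Carlo version of such a CDF computation, using the elastic Lamé solution for a thick-walled cylinder as a stand-in for the elastic-plastic NESSUS model; geometry, load statistics, and the stress level are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, r = 0.1, 0.2, 0.15            # inner/outer radii and evaluation radius (m)
p = rng.normal(80e6, 8e6, 100_000)  # internal pressure (Pa), normally distributed

# Elastic Lame radial stress at radius r (compressive, hence negative):
sigma_r = p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

level = -25e6
print("P(sigma_r <= -25 MPa) =", np.mean(sigma_r <= level))  # one point of the CDF
```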

  4. Study on the distribution coefficient during environmental impact evaluation in Chinese inland nuclear power plants

    International Nuclear Information System (INIS)

    Xu Haifeng; Shang Zhaorong; Chen Fangqiang

    2012-01-01

    The radionuclide distribution coefficient (K d) is an important parameter in river-sediment systems. This paper describes the main methods used at home and abroad for measuring the K d value and recent progress, and puts forward recommendations on ideas for carrying out distribution coefficient K d measurements in the environmental impact assessment of China's inland nuclear power plants. (authors)

  5. Empirical Sampling Distributions of Equating Coefficients for Graded and Nominal Response Instruments.

    Science.gov (United States)

    Baker, Frank B.

    1997-01-01

    Examined the sampling distributions of equating coefficients produced by the characteristic curve method for tests using graded and nominal response scoring using simulated data. For both models and across all three equating situations, the sampling distributions were generally bell-shaped and peaked, and occasionally had a small degree of…

  6. Probabilistic properties of the Curve Number

    Science.gov (United States)

    Rutkowska, Agnieszka; Banasik, Kazimierz; Kohnova, Silvia; Karabova, Beata

    2013-04-01

    The determination of the Curve Number (CN) is fundamental for the hydrological rainfall-runoff SCS-CN method which assesses the runoff volume in small catchments. The CN depends on geomorphologic and physiographic properties of the catchment and traditionally it is assumed to be constant for each catchment. Many practitioners and researchers observe, however, that the parameter is characterized by a variability. This sometimes causes inconsistency in river discharge prediction using the SCS-CN model. Hence probabilistic and statistical methods are advisable to investigate the CN as a random variable and to complement and improve the deterministic model. The results that will be presented contain determination of the probabilistic properties of the CNs for various Slovakian and Polish catchments using statistical methods. The detailed study concerns the description of empirical distributions (characteristics, QQ-plots and coefficients of goodness of fit, histograms), testing of statistical hypotheses about some theoretical distributions (Kolmogorov-Smirnov, Anderson-Darling, Cramer-von Mises, χ2, Shapiro-Wilk), construction of confidence intervals and comparisons among catchments. The relationship between confidence intervals and the ARC soil classification will also be examined. The comparison between the border values of the confidence intervals and the ARC I and ARC III conditions is crucial for further modeling. The study of the response of the catchment to stormy rainfall depth when the variability of the CN arises is also of special interest. ACKNOWLEDGMENTS The investigation described in the contribution was initiated by the first author's research visit to the Technical University of Bratislava in 2012 within an STSM of the COST Action ES0901. Data used here have been provided by research project no. N N305 396238 funded by the PL-Ministry of Science and Higher Education. The support provided by the organizations is gratefully acknowledged.
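    A small sketch of two of the statistical steps named above, a confidence interval for the mean CN and a Kolmogorov-Smirnov test against a fitted normal model, on invented CN values:

```python
import numpy as np
from scipy import stats

# Illustrative event-based CN values for one catchment (not project data):
cn = np.array([68, 74, 71, 80, 77, 69, 83, 72, 75, 79, 70, 76])

mean, se = cn.mean(), stats.sem(cn)
lo, hi = stats.t.interval(0.95, df=len(cn) - 1, loc=mean, scale=se)
print(f"mean CN = {mean:.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")

# Goodness of fit of a normal model to the sample:
print(stats.kstest(cn, "norm", args=(mean, cn.std(ddof=1))))
```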

  7. Developing Pavement Distress Deterioration Models for Pavement Management System Using Markovian Probabilistic Process

    Directory of Open Access Journals (Sweden)

    Promothes Saha

    2017-01-01

    Full Text Available In the state of Colorado, the Colorado Department of Transportation (CDOT) utilizes their pavement management system (PMS) to manage approximately 9,100 miles of interstate, highways, and low-volume roads. Three types of deterioration models are currently being used in the existing PMS: site-specific, family, and expert opinion curves. These curves are developed using deterministic techniques. In the deterministic technique, the uncertainties of pavement deterioration related to traffic and weather are not considered. Probabilistic models that take into account the uncertainties result in more accurate curves. In this study, probabilistic models using the discrete-time Markov process were developed for five distress indices: transverse, longitudinal, fatigue, rut, and ride indices, as a case study on low-volume roads. Regression techniques were used to develop the deterioration paths using the predicted distribution of indices estimated from the Markov process. Results indicated that the longitudinal, fatigue, and rut indices had very slow deterioration over time, whereas the transverse and ride indices showed faster deterioration. The developed deterioration models had a coefficient of determination (R2) above 0.84. As probabilistic models provide more accurate results, it is recommended that these models be used as the family curves in the CDOT PMS for low-volume roads.
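    A minimal sketch of a discrete-time Markov deterioration forecast of the kind described; the five condition states, the transition probabilities, and the index weights are hypothetical, not CDOT values.

```python
import numpy as np

# One-year transition matrix over five condition states (state 1 = best);
# each year a segment either stays put or drops one state.
P = np.array([
    [0.90, 0.10, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00, 0.00],
    [0.00, 0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
weights = np.array([100, 75, 50, 25, 0])     # condition index per state

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # brand-new pavement
for year in range(1, 11):
    state = state @ P                        # propagate the state distribution
    print(f"year {year:2d}: expected index = {state @ weights:5.1f}")
```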

  8. Probabilistic considerations on the effects of random soil properties on the stability of ground structures of nuclear power plants

    International Nuclear Information System (INIS)

    Ootori, Yasuki; Ishikawa, Hiroyuki; Takeda, Tomoyoshi

    2004-01-01

    In JEAG4601-1987 (Japan Electric Association Guide for earthquake resistance design), either the conventional deterministic method or the probabilistic method is used for evaluating the stability of ground foundations and surrounding slopes in nuclear power plants. The deterministic method, in which soil properties of 'mean ± coefficient × standard deviation' are adopted in the calculations, has generally been used in the design stage to date. On the other hand, the probabilistic method, in which the soil properties are assumed to have probabilistic distributions, is stated as a future method. The deterministic method facilitates the evaluation; however, it is necessary to clarify the relationship between the deterministic and probabilistic methods. In order to investigate the relationship, a simple model that can take into account the dynamic effect of structures, and a simplified method for taking spatial randomness into account, are proposed in this study. As a result, it is found that the shear strength of soil is the most important factor for the stability of grounds and slopes, and that the probability of falling below the safety factor evaluated by the deterministic method with soil properties of 'mean - 1.0 × standard deviation' is much lower. (author)

  9. Probabilistic analysis of glass elements with three-parameter Weibull distribution; Analisis probabilistico de elementos de vidrio recocido mediante una distribucion triparametrica Weibull

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, A.; Muniz-Calvente, M.; Fernandez, P.; Fernandez Cantel, A.; Lamela, M. J.

    2015-10-01

    Glass and ceramics present brittle behaviour, so a large scatter in test results is obtained. This dispersion is mainly due to the inevitable presence of micro-cracks on the surface, edge defects or internal defects, which must be taken into account using an appropriate failure criterion that is not deterministic but probabilistic. Among the existing probability distributions, the two- or three-parameter Weibull distribution is generally used in adjusting material resistance results, although the method of use thereof is not always correct. Firstly, in this work, a large experimental programme using annealed glass specimens of different dimensions, based on four-point bending and coaxial double ring tests, was performed. Then, the finite element models made for each type of test, the adjustment of the parameters of the three-parameter Weibull cumulative distribution function (cdf) (λ: location, β: shape, d: scale) for a certain failure criterion, and the calculation of the effective areas from the cumulative distribution function are presented. Summarizing, this work aims to generalize the use of the three-parameter Weibull function to structural glass elements with stress distributions not analytically described, allowing the proposed probabilistic model to be applied under general loading distributions. (Author)

  10. Effect of Variable Manning Coefficients on Tsunami Inundation

    Science.gov (United States)

    Barberopoulou, A.; Rees, D.

    2017-12-01

    Numerical simulations are commonly used to help estimate tsunami hazard, improve evacuation plans, issue or cancel tsunami warnings, and inform forecasting and hazard assessments, and have therefore become an integral part of hazard mitigation in the tsunami community. Many numerical codes exist for simulating tsunamis, most of which have undergone extensive benchmarking and testing. Tsunami hazard or risk assessments employ these codes following a deterministic or probabilistic approach. Depending on their scope, these studies may or may not consider uncertainty in the numerical simulations, the effects of tides, variable friction, or estimates of financial losses, none of which is necessarily trivial. Distributed Manning coefficients, the roughness coefficients used in hydraulic modeling, are commonly used in simulating both riverine and pluvial flood events; however, their use in tsunami hazard assessments is primarily part of limited-scope studies and, for the most part, not a standard practice. For this work, we investigate variations in Manning coefficients and their effects on tsunami inundation extent, pattern and financial loss. To assign Manning coefficients we use land use maps that come from the New Zealand Land Cover Database (LCDB) and more recent data from the Ministry of the Environment. More than 40 classes covering different types of land use are combined into major classes, such as cropland, grassland and wetland, representing common types of land use in New Zealand, each of which is assigned a unique Manning coefficient. By utilizing different data sources for variable Manning coefficients, we examine the impact of data sources and classification methodology on the accuracy of model outputs.
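    A minimal sketch of the land-cover-to-roughness assignment described; the class names and Manning n values are hypothetical placeholders, not LCDB classes.

```python
# Hypothetical land-cover classes mapped to Manning roughness coefficients:
MANNING_N = {
    "water": 0.025,
    "grassland": 0.035,
    "cropland": 0.040,
    "wetland": 0.060,
    "forest": 0.100,
    "urban": 0.120,
}

def manning_grid(land_cover, default=0.03):
    """Map a 2D grid of land-cover labels to a grid of Manning n values."""
    return [[MANNING_N.get(cell, default) for cell in row] for row in land_cover]

print(manning_grid([["urban", "grassland"], ["water", "forest"]]))
```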

  11. Evaluation of distribution coefficients and concentration ratios of (90)Sr and (137)Cs in the Techa River and the Miass River.

    Science.gov (United States)

    Shishkina, E A; Pryakhin, E A; Popova, I Ya; Osipov, D I; Tikhova, Yu; Andreyev, S S; Shaposhnikova, I A; Egoreichenkov, E A; Styazhkina, E V; Deryabina, L V; Tryapitsina, G A; Melnikov, V; Rudolfsen, G; Teien, H-C; Sneve, M K; Akleyev, A V

    2016-07-01

    Empirical data on the behavior of radionuclides in aquatic ecosystems are needed for radioecological modeling, which is commonly used for predicting transfer of radionuclides, estimating doses, and assessing possible adverse effects on species and communities. Preliminary studies of radioecological parameters, including distribution coefficients and concentration ratios for (90)Sr and (137)Cs, were not in full agreement with the default values used in the ERICA Tool and RESRAD BIOTA codes. The unique radiation situation in the Techa River, which was contaminated by long-lived radionuclides ((90)Sr and (137)Cs) in the middle of the last century, allows improved knowledge about these parameters for river systems. Therefore, the study was focused on the evaluation of radioecological parameters (distribution coefficients and concentration ratios for (90)Sr and (137)Cs) for the Techa River and the Miass River, which serves as a comparison waterbody. To achieve the aim, the current contamination of biotic and abiotic components of the river ecosystems was studied; distribution coefficients for (90)Sr and (137)Cs were calculated; and concentration ratios of (90)Sr and (137)Cs for three fish species (roach, perch and pike), gastropods and filamentous algae were evaluated. Study results were then compared with default values available for use in the well-known computer codes ERICA Tool and RESRAD BIOTA (when site-specific data are not available). We show that the concentration ratios of (137)Cs in whole fish bodies depend on the predominant type of nutrition (carnivorous or phytophagous). The results presented here are useful in the context of improving tools for assessing concentrations of radionuclides in biota, which could rely on a wider range of ecosystem information than the current versions of the ERICA and RESRAD codes. Further, the concentration ratios of (90)Sr are species-specific and strongly dependent on Ca(2+) concentration in

  12. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...

  13. Correlation Structures of Correlated Binomial Models and Implied Default Distribution

    Science.gov (United States)

    Mori, Shintaro; Kitsukawa, Kenji; Hisakado, Masato

    2008-11-01

    We show how to analyze and interpret the correlation structures, the conditional expectation values and correlation coefficients of exchangeable Bernoulli random variables. We study implied default distributions for the iTraxx-CJ tranches and some popular probabilistic models, including the Gaussian copula model, Beta binomial distribution model and long-range Ising model. We interpret the differences in their profiles in terms of the correlation structures. The implied default distribution has singular correlation structures, reflecting the credit market implications. We point out two possible origins of the singular behavior.
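    Of the models named, the Beta-binomial mixture of exchangeable Bernoulli variables has a closed-form loss distribution; a sketch with invented portfolio parameters:

```python
import math

def beta_binomial_pmf(n: int, a: float, b: float):
    """P(k defaults out of n) when the common default probability is
    Beta(a, b)-distributed (an exchangeable Bernoulli mixture)."""
    def logbeta(x, y):
        return math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)
    return [
        math.comb(n, k) * math.exp(logbeta(a + k, b + n - k) - logbeta(a, b))
        for k in range(n + 1)
    ]

pmf = beta_binomial_pmf(50, a=1.0, b=19.0)    # mean default probability 5%
print(round(sum(pmf), 6))                     # ~1.0, sanity check
print(sum(k * p for k, p in enumerate(pmf)))  # expected defaults ~2.5
```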

  14. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    In this paper we use a concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution. The latter property is potentially important for the development of hypothesis testing in linear models. Thirdly, it is very easy to simulate...

  15. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

    The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates...

  16. Determination of distribution coefficients of some natural radionuclides (U, Ra, Pb, Po) between different types of Syrian soils and their solutions

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Al-Hamwi, A.; Amin, Y.; Al-Akel, B.

    2009-11-01

    In this study, distribution coefficients of some natural radionuclides (226Ra, U, 210Pb and 210Po) between different types of soils in Syria and their solutions were determined. The distribution coefficient values ranged from 164-1933, 280-1722, 350-4749 and 101-117 l kg-1 for 226Ra, U, 210Pb and 210Po, respectively, at pH = 4.0, and from 207-6706, 673-2397, 149-2147 and 103-292 l kg-1, respectively, at pH = 5.5. In addition, the distribution coefficient values ranged from 167-1707, 126-1239, 44-1122 and 125-1475 l kg-1, respectively, at pH = 7.0. Moreover, the results showed that the 210Po distribution coefficients had their maximum values at pH 7, while the 210Pb distribution coefficients had their minimum values at the same pH, and the U distribution coefficients had their maximum values at pH 5.5. On the other hand, the effects of soil mineral content, CEC, ECE, pH and soluble ions on the distribution coefficients were investigated. In general, the results showed logarithmic relationships between the studied radionuclide activity in the soil and the distribution coefficients in all soil types (R2 ranged from 0.59 to 1.00 at pH 4.0). On the other hand, there were no relationships between the distribution coefficients and soil pH. (authors)

  17. A tiered approach for probabilistic ecological risk assessment of contaminated sites

    International Nuclear Information System (INIS)

    Zolezzi, M.; Nicolella, C.; Tarazona, J.V.

    2005-01-01

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.
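    A sketch of a distribution-based quotient obtained by Monte Carlo from two lognormal models, one for exposure and one for species sensitivity; all parameters are invented, not the paper's 1,2,4-TCB data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

exposure = rng.lognormal(mean=np.log(5.0), sigma=1.2, size=n)    # soil conc. (mg/kg)
thresholds = rng.lognormal(mean=np.log(40.0), sigma=0.8, size=n) # species sensitivity

dbq = exposure / thresholds
print("P(DBQ > 1) =", np.mean(dbq > 1.0))              # fraction of sampled cases at risk
print("95th percentile DBQ =", np.quantile(dbq, 0.95))
```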

  18. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a

  19. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a
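    The Gutenberg-Richter b-value estimation mentioned in both records above is commonly done with the Aki maximum-likelihood estimator; a sketch on a synthetic, unbinned catalog (not the thesis code):

```python
import numpy as np

def b_value(mags: np.ndarray, m_c: float) -> float:
    """Aki maximum-likelihood b-value for unbinned magnitudes at or above
    the completeness magnitude m_c: b = log10(e) / (mean(M) - m_c)."""
    m = mags[mags >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

rng = np.random.default_rng(7)
# Synthetic Gutenberg-Richter catalog with b = 1: magnitudes above m_c are
# exponentially distributed with scale log10(e) / b.
mags = 2.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
print(f"estimated b = {b_value(mags, m_c=2.0):.2f}")  # ~1.0
```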

  20. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.

  1. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    Science.gov (United States)

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery, and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided in this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and the Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking of cranial nerves. Probabilistic tracking with a gradual...
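    The Dice similarity coefficient used above as a label-overlap measure is straightforward to compute; a sketch on toy binary masks (not the study's imaging data):

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks:
    2 * |A intersect B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

tracked = np.zeros((8, 8), dtype=bool); tracked[2:6, 3:5] = True  # tracked fibers
truth = np.zeros((8, 8), dtype=bool); truth[3:7, 3:5] = True      # ground-truth nerve
print(f"Dice = {dice(tracked, truth):.2f}")  # -> 0.75
```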

  2. Probabilistic and Nonprobabilistic Sensitivity Analyses of Uncertain Parameters

    Directory of Open Access Journals (Sweden)

    Sheng-En Fang

    2014-01-01

    Full Text Available Parameter sensitivity analyses have been widely applied to industrial problems for evaluating parameter significance, effects on responses, uncertainty influence, and so forth. In the interest of simple implementation and computational efficiency, this study has developed two sensitivity analysis methods corresponding to the situations with or without sufficient probability information. The probabilistic method is established with the aid of a stochastic response surface, and the mathematical derivation proves that the coefficients of the first-order terms embody the parameter main effects on the response. Simultaneously, a nonprobabilistic method based on interval analysis is brought forward for the circumstance when the parameter probability distributions are unknown. The two methods have been verified against a numerical beam example, with their accuracy compared to that of a traditional variance-based method. The analysis results have demonstrated the reliability and accuracy of the developed methods, and their suitability for different situations has also been discussed.
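    A sketch of the probabilistic variant just described: sample the inputs, fit a first-order response surface, and read the main effects off its coefficients. The response function and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=(n, 3))  # standardized input samples
# Black-box response with known main effects 2.0, 0.5, 0.1 plus noise:
y = 2.0 * x[:, 0] + 0.5 * x[:, 1] + 0.1 * x[:, 2] + rng.normal(0.0, 0.2, n)

# First-order response surface y ~ b0 + b1*x1 + b2*x2 + b3*x3:
A = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("main-effect coefficients:", np.round(coef[1:], 3))  # ~[2.0, 0.5, 0.1]
```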

  3. Decision making by hybrid probabilistic-possibilistic utility theory

    Directory of Open Access Journals (Sweden)

    Pap Endre

    2009-01-01

    Full Text Available An approach to decision theory based upon nonprobabilistic uncertainty is presented. There is an axiomatization of the hybrid probabilistic-possibilistic mixtures based on a pair of triangular conorm and triangular norm satisfying the restricted distributivity law, and the corresponding non-additive S-measure. This is characterized by the families of operations involved in generalized mixtures, based upon a previous result on the characterization of the pair of continuous t-norm and t-conorm such that the former is restrictedly distributive over the latter. The obtained family of mixtures combines probabilistic and idempotent (possibilistic) mixtures via a threshold.

  4. Predicting cyclohexane/water distribution coefficients for the SAMPL5 challenge using MOSCED and the SMD solvation model

    Science.gov (United States)

    Diaz-Rodriguez, Sebastian; Bozada, Samantha M.; Phifer, Jeremy R.; Paluch, Andrew S.

    2016-11-01

    We present blind predictions using the solubility-parameter-based method MOSCED submitted for the SAMPL5 challenge on calculating cyclohexane/water distribution coefficients at 298 K. Reference data to parameterize MOSCED were generated with knowledge only of chemical structure by performing solvation free energy calculations using electronic structure calculations in the SMD continuum solvent. To maintain simplicity and use only a single method, we approximate the distribution coefficient with the partition coefficient of the neutral species. Over the final SAMPL5 set of 53 compounds, we achieved an average unsigned error of 2.2 ± 0.2 log units (ranking 15 out of 62 entries), the correlation coefficient (R) was 0.6 ± 0.1 (ranking 35), and 72 ± 6% of the predictions had the correct sign (ranking 30). While used here to predict cyclohexane/water distribution coefficients at 298 K, MOSCED is broadly applicable, allowing one to predict temperature-dependent infinite dilution activity coefficients in any solvent for which parameters exist, and provides a means by which an excess Gibbs free energy model may be parameterized to predict composition-dependent phase equilibrium.
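    Assuming the usual relation between solvation free energies and the partition coefficient of the neutral species, log P = (dG_water - dG_cyclohexane) / (RT ln 10), a sketch with hypothetical SMD-style energies:

```python
import math

R, T = 1.9872e-3, 298.15  # kcal/(mol*K), K

def log_p(dg_water: float, dg_cyclohexane: float) -> float:
    """Cyclohexane/water partition coefficient of the neutral species from
    solvation free energies given in kcal/mol."""
    return (dg_water - dg_cyclohexane) / (R * T * math.log(10))

# Hypothetical solvation free energies for one solute:
print(f"log D ~ log P = {log_p(-8.0, -6.5):.2f}")  # negative: prefers water
```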

  5. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a local, more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program, with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.

  6. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.

  7. The stochastic distribution of available coefficient of friction for human locomotion of five different floor surfaces.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2014-05-01

    The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions had a slightly better match with the normal and log-normal distributions than with the Weibull in only three out of 15 cases with a statistical significance. The results are far more complex than what had heretofore been published and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons based on their skewness and kurtosis values without a statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  8. Characterizing Topology of Probabilistic Biological Networks.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-09-06

    Biological interactions are often uncertain events that may or may not take place with some probability. Existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. Here, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. We develop a method that accurately describes the degree distribution of such networks. We also extend our method to accurately compute the joint degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. It also helps us find an adequate mathematical model using maximum likelihood estimation. Our results demonstrate that power law and log-normal models best describe degree distributions for probabilistic networks. The inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with a large number of interactions prefer to interact with those with a small number of interactions more frequently than expected.

  9. Distribution Coefficient Kd of Cesium in Soils from Areas in Perak

    International Nuclear Information System (INIS)

    Azian Hashim; Mohd Suhaimi Hamzah; Shamsiah Abdul Rahman; Nazaratul Ashifa Abdullah Salim; Md Suhaimi Elias; Shakirah Shukor; Muhd Azfar Azman; Siti Aminah Omar

    2015-01-01

    This paper reports on a study of the distribution coefficient, or Kd value, in soil collected from the west of Perak, namely Manjung, Setiawan and Lahat, at two different depths, using the lab batch method. Particle sizes were analyzed using the conventional technique known as the pipette method. The pH of the samples was 2-3. Determinations of cesium were performed using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). From the results, the distribution coefficient of cesium, the Kd value, was found to be influenced by the particle size of the soil. (author)

  10. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    Science.gov (United States)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.

  11. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  12. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don’t need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  13. On the Construction of Bivariate Exponential Distributions with an Arbitrary Correlation Coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution. The latter property is potentially important for the development of hypothesis testing in linear models. Finally, it is easy to simulate...

  14. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  15. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Full Text Available Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under a uniaxial mode of loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fit to the fatigue life predicted using the viscoelastic continuum damage approach. It was observed that fatigue damage was best described using the Weibull distribution when compared to the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution

  16. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Smith, Donald L.; Capote, Roberto

    2013-01-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue

  17. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, Gašper, E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Smith, Donald L., E-mail: donald.l.smith@anl.gov [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States); Capote, Roberto, E-mail: roberto.capotenoy@iaea.org [NAPC–Nuclear Data Section, International Atomic Energy Agency, PO Box 100, Vienna-A-1400 (Austria)

    2013-11-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue.
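    The transformation discussed in the two records above can be written in a few lines; here the relative uncertainties (coefficients of variation) fix the sigmas of the underlying normals. With 50% relative uncertainties, even full anti-correlation in normal space maps only to about -0.80 in lognormal space, which is the forbidden-region effect the paper describes.

```python
import math

def lognormal_corr(rho_n: float, cv1: float, cv2: float) -> float:
    """Correlation of two lognormal variables induced by the correlation
    rho_n of their underlying normals, given their relative uncertainties."""
    s1 = math.sqrt(math.log(1.0 + cv1**2))  # sigma of the underlying normal
    s2 = math.sqrt(math.log(1.0 + cv2**2))
    return (math.exp(rho_n * s1 * s2) - 1.0) / math.sqrt(
        (math.exp(s1**2) - 1.0) * (math.exp(s2**2) - 1.0))

print(f"{lognormal_corr(-1.0, 0.5, 0.5):.3f}")  # ~ -0.800, not -1
```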

  18. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  19. Adaptive Probabilistic Tracking Embedded in Smart Cameras for Distributed Surveillance in a 3D Model

    Directory of Open Access Journals (Sweden)

    Sven Fleck

    2006-12-01

    Full Text Available Tracking applications based on distributed and embedded sensor networks are emerging today, both in the fields of surveillance and industrial vision. Traditional centralized approaches have several drawbacks, due to limited communication bandwidth, computational requirements, and thus limited spatial camera resolution and frame rate. In this article, we present network-enabled smart cameras for probabilistic tracking. They are capable of tracking objects adaptively in real time and offer a very bandwidth-conservative approach, as the whole computation is performed embedded in each smart camera and only the tracking results are transmitted, which are on a higher level of abstraction. Based on this, we present a distributed surveillance system. The smart cameras' tracking results are embedded in an integrated 3D environment as live textures and can be viewed from arbitrary perspectives. Also a georeferenced live visualization embedded in Google Earth is presented.

  20. Adaptive Probabilistic Tracking Embedded in Smart Cameras for Distributed Surveillance in a 3D Model

    Directory of Open Access Journals (Sweden)

    Fleck Sven

    2007-01-01

    Full Text Available Tracking applications based on distributed and embedded sensor networks are emerging today, both in the fields of surveillance and industrial vision. Traditional centralized approaches have several drawbacks, due to limited communication bandwidth, computational requirements, and thus limited spatial camera resolution and frame rate. In this article, we present network-enabled smart cameras for probabilistic tracking. They are capable of tracking objects adaptively in real time and offer a very bandwidth-conservative approach, as the whole computation is performed embedded in each smart camera and only the tracking results are transmitted, which are on a higher level of abstraction. Based on this, we present a distributed surveillance system. The smart cameras' tracking results are embedded in an integrated 3D environment as live textures and can be viewed from arbitrary perspectives. Also a georeferenced live visualization embedded in Google Earth is presented.

  1. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches, abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM), to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  2. Engineering magnetic polariton system with distributed coefficients: Applications to soliton management

    International Nuclear Information System (INIS)

    Kuetche, Victor K.; Nguepjouo, Francis T.; Kofane, Timoleon C.

    2014-01-01

    In the wake of the recent design of a powerful method for generating higher-dimensional evolution systems with distributed coefficients, Kuetche (2014) [15], illustrated on the dynamics of the current-fed membrane of zero Young's modulus, we construct the general Lax representation of new higher-dimensional coupled evolution equations with varying coefficients. Discussing the physical meaning of these equations, we show that the coupled system describes the propagation of magnetic polaritons within saturated ferrites, resulting structurally from the fast near-adiabatic magnetization dynamics combined with Maxwell's equations. Accordingly, we address some practical issues of the nonautonomous soliton management underlying the fast remagnetization of data inputs within magnetic memory devices.

  3. Probabilistic inversion for chicken processing lines

    International Nuclear Information System (INIS)

    Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der

    2006-01-01

    We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARameter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities, in combination with probabilistic inversion, may be used for model validation and/or model criticism.
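    As a self-contained illustration of the first of these techniques, the sketch below runs plain iterative proportional fitting on a small joint probability table until its margins match prescribed targets. The table and target margins are invented for illustration; the paper's actual use pulls combined expert distributions back onto model parameters rather than matching simple marginals.

        import numpy as np

        # Iterative proportional fitting (IPF): alternately rescale a joint table
        # so its row and column margins match prescribed target distributions.
        P = np.full((3, 3), 1.0 / 9.0)             # initial joint distribution
        row_t = np.array([0.2, 0.5, 0.3])          # target row margins
        col_t = np.array([0.4, 0.4, 0.2])          # target column margins

        for _ in range(100):
            P *= (row_t / P.sum(axis=1))[:, None]  # enforce row margins
            P *= (col_t / P.sum(axis=0))[None, :]  # enforce column margins

        print(np.round(P, 4))
        print("rows:", np.round(P.sum(axis=1), 4), "cols:", np.round(P.sum(axis=0), 4))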

  4. From linear possibility distributions to a non-infinitesimal probabilistic semantics of conditional knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Benferhat, S.; Dubois, D.; Prade, H. [Universite Paul Sabatier, Toulouse (France)

    1996-12-31

    The authors have proposed in their previous works to view a set of default information of the form "generally, from α_i deduce β_i" as the family of possibility distributions satisfying constraints expressing that the situations where α_i ∧ β_i is true are more possible than the situations where α_i ∧ ¬β_i is true. This provides a representation theorem for default reasoning obeying the System P of postulates proposed by Lehmann et al., for which there also exists a semantics in terms of infinitesimal probabilities. This paper offers a detailed analysis of the structure of this family of possibility distributions by making use of two different orderings between them: the specificity ordering and the refinement ordering. It is shown that, from a representation point of view, it is sufficient to consider the subset of linear possibility distributions, which corresponds to all the possible completions of the default knowledge in agreement with the constraints. Surprisingly, it is also shown that a standard probabilistic semantics can be equivalently given to System P, without referring to infinitesimals, by using a special family of probability measures, here called acceptance functions, which has also been recently considered by Snow in that perspective.

  5. Probabilistic Teleportation of Multi-particle d-Level Quantum State

    Institute of Scientific and Technical Information of China (English)

    CAO Min; ZHU Shi-Qun

    2005-01-01

    The general scheme for teleportation of a multi-particle d-level quantum state is presented when M pairs of partially entangled particles are utilized as quantum channels. The probabilistic teleportation can be achieved with a successful probability of ∏_{N=0}^{M-1} (C_{N0})² d^M, which is determined by the smallest coefficient of each entangled channel.

  6. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  7. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  8. Stochastic distribution of the required coefficient of friction for level walking--an in-depth study.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    This study investigated the stochastic distribution of the required coefficient of friction (RCOF) which is a critical element for estimating slip probability. Fifty participants walked under four walking conditions. The results of the Kolmogorov-Smirnov two-sample test indicate that 76% of the RCOF data showed a difference in distribution between both feet for the same participant under each walking condition; the data from both feet were kept separate. The results of the Kolmogorov-Smirnov goodness-of-fit test indicate that most of the distribution of the RCOF appears to have a good match with the normal (85.5%), log-normal (84.5%) and Weibull distributions (81.5%). However, approximately 7.75% of the cases did not have a match with any of these distributions. It is reasonable to use the normal distribution for representation of the RCOF distribution due to its simplicity and familiarity, but each foot had a different distribution from the other foot in 76% of cases. The stochastic distribution of the required coefficient of friction (RCOF) was investigated for use in a statistical model to improve the estimate of slip probability in risk assessment. The results indicate that 85.5% of the distribution of the RCOF appears to have a good match with the normal distribution.
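    A minimal sketch of the goodness-of-fit screening described above, with synthetic RCOF values standing in for the study's measurements. Note that Kolmogorov-Smirnov p-values are only approximate when the candidate distribution's parameters are estimated from the same sample.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        rcof = rng.normal(loc=0.19, scale=0.03, size=200)  # hypothetical RCOF sample, one foot

        for label, dist in [("normal", stats.norm),
                            ("log-normal", stats.lognorm),
                            ("Weibull", stats.weibull_min)]:
            params = dist.fit(rcof)                            # maximum-likelihood fit
            d, p = stats.kstest(rcof, dist.name, args=params)  # K-S goodness of fit
            print(f"{label:10s} KS D = {d:.3f}, p = {p:.3f}")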

  9. Are distribution coefficients measured from batch experiments meaningful for quantifying retention in compacted material?

    Energy Technology Data Exchange (ETDEWEB)

    Goutelard, F.; Charles, Y.; Page, J. [CEA/DEN/DPC/SECR/L3MR batiment 450, 91191 Gif sur Yvette (France)

    2005-07-01

    Full text of publication follows: To quantify the ability of a clayey material to act as a barrier against radionuclide migration, reliable data on retention properties must be available. The most common method for determining the distribution coefficient, which quantifies radionuclide adsorption, is the batch technique applied to powdered solid. Are these data meaningful for highly compacted minerals? This question is still under debate in the literature [1,2]. The aim of the present study is to compare distribution coefficient (KD) values for Cs and Ni on compacted and dispersed forms of both Bentonite MX80 and Callovo-Oxfordian clayey material in a simulated site water. Firstly, classical batch sorption experiments are carried out on dispersed materials pre-conditioned with the simulated site water at pH 7.3. Radiotracers 137Cs and 58Ni are used to investigate constant-pH isotherm sorption. The bottleneck for measuring distribution coefficients on highly compacted material lies in careful monitoring of chemical conditions, because they are driven by diffusion processes. For this study, we have chosen to use in-diffusion experiments [3]. Sample size is optimized to reach, for a high retention level (300 mL/g), the steady state in a reasonable time (3 to 6 months). In order to describe the response surface of the compacted distribution coefficient on bentonite MX80, a two-variable Doehlert matrix has been chosen. In this experimental design, the two variables are density and the dispersed distribution coefficient. Bentonite is pre-conditioned before compaction to a density ranging from 1.2 to 1.85 kg/L. The pellet is confined in a cylindrical stainless steel filter (150 μL) closed at both ends. The cell is placed in a tightly closed bottle containing the working solution. After a re-equilibration period (at least 3 weeks), 133Cs and 59Ni stable isotopes are introduced for monitoring the KD level (between 150 mL/g and 330 mL/g). Radiotracers 137Cs and 58Ni

  10. Are distribution coefficients measured from batch experiments meaningful for quantifying retention in compacted material?

    International Nuclear Information System (INIS)

    Goutelard, F.; Charles, Y.; Page, J.

    2005-01-01

    Full text of publication follows: To quantify the ability of a clayey material to act as a barrier against radionuclide migration, reliable data on retention properties must be available. The most common method for determining the distribution coefficient, which quantifies radionuclide adsorption, is the batch technique applied to powdered solid. Are these data meaningful for highly compacted minerals? This question is still under debate in the literature [1,2]. The aim of the present study is to compare distribution coefficient (KD) values for Cs and Ni on compacted and dispersed forms of both Bentonite MX80 and Callovo-Oxfordian clayey material in a simulated site water. Firstly, classical batch sorption experiments are carried out on dispersed materials pre-conditioned with the simulated site water at pH 7.3. Radiotracers 137Cs and 58Ni are used to investigate constant-pH isotherm sorption. The bottleneck for measuring distribution coefficients on highly compacted material lies in careful monitoring of chemical conditions, because they are driven by diffusion processes. For this study, we have chosen to use in-diffusion experiments [3]. Sample size is optimized to reach, for a high retention level (300 mL/g), the steady state in a reasonable time (3 to 6 months). In order to describe the response surface of the compacted distribution coefficient on bentonite MX80, a two-variable Doehlert matrix has been chosen. In this experimental design, the two variables are density and the dispersed distribution coefficient. Bentonite is pre-conditioned before compaction to a density ranging from 1.2 to 1.85 kg/L. The pellet is confined in a cylindrical stainless steel filter (150 μL) closed at both ends. The cell is placed in a tightly closed bottle containing the working solution. After a re-equilibration period (at least 3 weeks), 133Cs and 59Ni stable isotopes are introduced for monitoring the KD level (between 150 mL/g and 330 mL/g). Radiotracers 137Cs and 58Ni are used to quantify the
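    For readers unfamiliar with the quantity being compared, the sketch below applies the standard batch-sorption expression Kd = (C0 - Ceq)/Ceq x V/m. All activities, volumes and masses are hypothetical, chosen only to land near the 150-330 mL/g range monitored above.

        # Batch distribution coefficient from a sorption experiment:
        # Kd = (C0 - Ceq)/Ceq * V/m [mL/g], with C0 and Ceq the initial and
        # equilibrium tracer activities, V the solution volume, m the dry solid mass.
        C0, Ceq = 1000.0, 52.0     # counts per mL (hypothetical 137Cs activities)
        V, m = 20.0, 1.0           # mL of solution, g of clay

        Kd = (C0 - Ceq) / Ceq * V / m
        print(f"Kd = {Kd:.0f} mL/g")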

  11. Probabilistic estimates of drought impacts on agricultural production

    Science.gov (United States)

    Madadgar, Shahrbanou; AghaKouchak, Amir; Farahmand, Alireza; Davis, Steven J.

    2017-08-01

    Increases in the severity and frequency of drought in a warming climate may negatively impact agricultural production and food security. Unlike previous studies that have estimated the agricultural impacts of climate conditions using single-crop yield distributions, we develop a multivariate probabilistic model that uses projected climatic conditions (e.g., precipitation amount or soil moisture) throughout a growing season to estimate the probability distribution of crop yields. We demonstrate the model by an analysis of the historical period 1980-2012, including the Millennium Drought in Australia (2001-2009). We find that precipitation and soil moisture deficits in dry growing seasons reduced the average annual yield of the five largest crops in Australia (wheat, broad beans, canola, lupine, and barley) by 25-45% relative to wet growing seasons. Our model can thus produce region- and crop-specific agricultural sensitivities to climate conditions and variability. Probabilistic estimates of yield may help decision-makers in government and business to quantitatively assess the vulnerability of agriculture to climate variations. We develop a multivariate probabilistic model that uses precipitation to estimate the probability distribution of crop yields. The proposed model shows how the probability distribution of crop yield changes in response to droughts. During Australia's Millennium Drought, precipitation and soil moisture deficits reduced the average annual yield of the five largest crops.

  12. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatter predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness - a unique feature of structural composites.
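    The Monte Carlo benchmark mentioned above can be pictured with a deliberately simplified micromechanics model. The sketch below propagates scatter in four primitive variables through a rule-of-mixtures estimate of the longitudinal modulus; PICAN and IPACS use much richer models, and the input statistics here are illustrative only.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        Ef = rng.normal(230e9, 10e9, n)            # fiber modulus [Pa]
        Em = rng.normal(3.5e9, 0.3e9, n)           # matrix modulus [Pa]
        Vf = rng.normal(0.60, 0.02, n)             # fiber volume ratio
        Vv = rng.normal(0.01, 0.005, n).clip(0)    # void volume ratio

        # Rule of mixtures, degraded by voids, as a stand-in for the micromechanics.
        E11 = (Vf * Ef + (1 - Vf) * Em) * (1 - Vv)

        print(f"mean {E11.mean() / 1e9:.1f} GPa, c.o.v. {E11.std() / E11.mean():.3f}")
        print("5th/95th percentiles [GPa]:", np.percentile(E11, [5, 95]) / 1e9)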

  13. A Random Riemannian Metric for Probabilistic Shortest-Path Tractography

    DEFF Research Database (Denmark)

    Hauberg, Søren; Schober, Michael; Liptrot, Matthew George

    2015-01-01

    of the diffusion tensor as a “random Riemannian metric”, where a geodesic is a distribution over tracts. We approximate this distribution with a Gaussian process and present a probabilistic numerics algorithm for computing the geodesic distribution. We demonstrate SPT improvements on data from the Human Connectome...

  14. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Andova, S.; Georgievska, S.; Trcka, N.

    2012-01-01

    A notion of branching bisimilarity for the alternating model of probabilistic systems, compatible with parallel composition, is defined. For a congruence result, an internal transition immediately followed by a non-trivial probability distribution is not considered inert. A weaker definition of

  15. Cooperation in an evolutionary prisoner’s dilemma game with probabilistic strategies

    International Nuclear Information System (INIS)

    Li Haihong; Dai Qionglin; Cheng Hongyan; Yang Junzhong

    2012-01-01

    Highlights: ► Probabilistic strategies are introduced in the PDG in place of the pure C/D strategies. ► The strategy patterns depend on interaction structures and updating rules. ► There exists an optimal increment of the probabilistic strategy. - Abstract: In this work, we investigate an evolutionary prisoner's dilemma game in structured populations with probabilistic strategies instead of the pure strategies of cooperation and defection. We explore the model in detail by considering different strategy update rules and different population structures. We find that the distribution of probabilistic strategy patterns depends on both the interaction structures and the updating rules. We also find that, when an individual updates her strategy by increasing or decreasing her probabilistic strategy by a certain amount towards that of her opponent, there exists an optimal increment of the probabilistic strategy at which the cooperator frequency reaches its maximum.
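    One plausible reading of the update rule described above (an individual moves her cooperation probability by a fixed increment toward a better-performing opponent) is sketched below on a ring of agents. The weak prisoner's dilemma payoffs, the topology and all parameter values are assumptions for illustration, not the paper's exact setup.

        import numpy as np

        rng = np.random.default_rng(2)
        N, steps, delta, b = 200, 20_000, 0.05, 1.4  # delta: strategy increment; b: temptation
        p = rng.random(N)                            # p[i] = probability that agent i cooperates

        def payoff(ci, cj):
            # weak prisoner's dilemma: R = 1, P = S = 0, T = b
            if ci and cj:
                return 1.0
            return b if (not ci and cj) else 0.0

        for _ in range(steps):
            i = int(rng.integers(N))
            j = (i + int(rng.choice([-1, 1]))) % N   # a ring neighbor
            ci, cj = rng.random() < p[i], rng.random() < p[j]
            if payoff(cj, ci) > payoff(ci, cj):      # neighbor did better:
                p[i] += np.clip(p[j] - p[i], -delta, delta)  # step toward her strategy

        print("mean cooperation probability:", round(p.mean(), 3))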

  16. Thermodynamic and Probabilistic Metabolic Control Analysis of Riboflavin (Vitamin B₂) Biosynthesis in Bacteria.

    Science.gov (United States)

    Birkenmeier, Markus; Mack, Matthias; Röder, Thorsten

    2015-10-01

    In this study, we applied a coupled in silico thermodynamic and probabilistic metabolic control analysis methodology to investigate the control mechanisms of the commercially relevant riboflavin biosynthetic pathway in bacteria. Under the investigated steady-state conditions, we found that several enzyme reactions of the pathway operate far from thermodynamic equilibrium (transformed Gibbs energies of reaction below about -17 kJ mol⁻¹). Using the obtained thermodynamic information and applying enzyme elasticity sampling, we calculated the distributions of the scaled concentration control coefficients (CCCs) and scaled flux control coefficients (FCCs). From the statistical analysis of the calculated distributions, we inferred that the control over the riboflavin producing flux is shared among several enzyme activities and mostly resides in the initial reactions of the pathway. More precisely, the guanosine triphosphate (GTP) cyclohydrolase II activity, and therefore the bifunctional RibA protein of Bacillus subtilis because it catalyzes this activity, appears to mainly control the riboflavin producing flux (mean FCCs = 0.45 and 0.55, respectively). The GTP cyclohydrolase II activity and RibA also exert a high positive control over the riboflavin concentration (mean CCCs = 2.43 and 2.91, respectively). This prediction is consistent with previous findings for microbial riboflavin overproducing strains.
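    The distance-from-equilibrium criterion quoted above can be checked with the standard relation dG' = dG'0 + RT ln Q. The sketch below uses placeholder reaction energetics and metabolite levels, not measured Bacillus subtilis values.

        import math

        R, T = 8.314e-3, 310.15   # kJ mol^-1 K^-1, physiological temperature [K]

        def dG_prime(dG0_prime, products, substrates):
            """Transformed Gibbs energy of reaction from concentrations [M]."""
            Q = math.prod(products) / math.prod(substrates)
            return dG0_prime + R * T * math.log(Q)

        dG = dG_prime(-25.0, products=[1e-4], substrates=[5e-4])   # hypothetical step
        verdict = "far from equilibrium" if dG < -17.0 else "near equilibrium"
        print(f"dG' = {dG:.1f} kJ/mol -> {verdict}")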

  17. Coefficients of distribution and accumulation of K, Rb, Cs and 137Cs in the intensive poultry breeding cycle

    International Nuclear Information System (INIS)

    Djuric, G.; Ajdacic, N.; Institut za Nuklearne Nauke Boris Kidric, Belgrade

    1984-01-01

    The concentrations of K, Rb and Cs and the activity level of Cs-137 in samples from the intensive poultry breeding cycle (feed, meat, eggs), under conditions of chronic alimentary contamination, are presented. Concentrations of Cs and Rb were determined by non-destructive neutron activation analysis, the concentration of K by atomic absorption flame photometry, and the activity of Cs-137 by gamma spectrometric analysis. On the basis of these results, coefficients of distribution and accumulation were calculated. The distribution coefficients of the analysed stable isotopes in meat have values close to 1, whereas for various parts of the egg these coefficients vary between 0.5 and 1.5. Significant differences in Cs-137 distribution among various parts of the egg were established. The values of the accumulation coefficients indicate that all analysed elements accumulate selectively in the meat of young birds (broilers), and Cs-137 accumulates in the egg white as well. (orig.)

  18. Spatial distribution of coefficients for determination of global radiation in Serbia

    Directory of Open Access Journals (Sweden)

    Nikolić Jugoslav L.

    2012-01-01

    Full Text Available The aim of this paper is to create spatial distributions of the coefficients needed for the indirect determination of global radiation, using all direct measurements of this shortwave radiation-balance component made in Serbia during the standard climate period (1961-1990). Based on the global radiation measurements recorded in the past and on routine measurements/observations of cloudiness and sunshine duration, maps of the spatial distribution of the coefficients required to calculate global radiation from sunshine/cloudiness at an arbitrary point on the territory of Serbia were produced. In addition, a specific verification of the proposed empirical formula was performed. This work lends itself to a wide range of practical applications, as direct measurements of global radiation are relatively rare and are not carried out in Serbia today. Significant application is possible in the domain of renewable energy sources. The method developed for determining global radiation is important from the standpoint of environmental protection, and it also has economic importance through applications in numerous commercial projects, as it requires neither special measurements nor additional financial investment.
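    The paper does not spell out its empirical formula here, but an Angström-Prescott-type relation, G/G0 = a + b*(S/S0), is the standard way to tie global radiation to relative sunshine duration. The sketch below fits such coefficients to made-up monthly records; mapping a and b over a territory is then a spatial interpolation task.

        import numpy as np

        # S/S0: relative sunshine duration; G/G0: clearness index (monthly, invented)
        s = np.array([0.31, 0.45, 0.52, 0.60, 0.66, 0.71, 0.74, 0.70, 0.58, 0.47, 0.33, 0.28])
        g = np.array([0.38, 0.44, 0.47, 0.52, 0.55, 0.58, 0.60, 0.57, 0.50, 0.45, 0.38, 0.36])

        b, a = np.polyfit(s, g, 1)            # least-squares line: g = a + b*s
        print(f"a = {a:.3f}, b = {b:.3f}")
        print(f"estimated G/G0 at S/S0 = 0.5: {a + b * 0.5:.3f}")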

  19. Probabilistic Capacity of a Grid connected Wind Farm

    DEFF Research Database (Denmark)

    Zhao, Menghua; Chen, Zhe; Blaabjerg, Frede

    2005-01-01

    This paper proposes a method to find the maximum acceptable wind power injection with regard to the thermal limits, steady-state stability limits and voltage limits of the grid system. The probabilistic wind power is introduced based on the probability distribution of wind speed. Based on Power Transfer Distribution Factors (PTDF) and voltage sensitivities, a predictor-corrector method is suggested to calculate the acceptable active power injection. This method is then combined with the probabilistic model of wind power to compute the allowable capacity of the wind farm. Finally, an example is presented to test the method. It is concluded that the proposed method is a feasible, fast and accurate approach to finding the size of a wind farm.
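    The role of the PTDFs can be illustrated with a toy thermal-limit check. The network data below are invented, and the paper's predictor-corrector additionally handles voltage and stability limits via sensitivities.

        import numpy as np

        # Each line's flow changes by PTDF * (injected MW); the largest acceptable
        # injection is the smallest margin-to-rating over all lines.
        ptdf = np.array([0.42, -0.31, 0.18, 0.07])    # MW of line flow per MW injected
        flow0 = np.array([55.0, -20.0, 34.0, 10.0])   # base-case flows [MW]
        limit = np.array([100.0, 80.0, 60.0, 40.0])   # thermal ratings [MW]

        margins = np.where(ptdf > 0, (limit - flow0) / ptdf, (-limit - flow0) / ptdf)
        print(f"max injection before a thermal limit binds: {margins.min():.1f} MW")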

  20. Probabilistic model to forecast earthquakes in the Zemmouri (Algeria) seismoactive area on the basis of moment magnitude scale distribution functions

    Science.gov (United States)

    Baddari, Kamel; Makdeche, Said; Bellalem, Fouzi

    2013-02-01

    Based on the moment magnitude scale, a probabilistic model was developed to predict the occurrences of strong earthquakes in the seismoactive area of Zemmouri, Algeria. Firstly, the distributions of earthquake magnitudes M_i were described using the distribution function F_0(m), which fits the magnitudes treated as independent random variables. Secondly, this distribution function F_0(m) was used to deduce the distribution functions G(x) and H(y) of the variables Y_i = log M_0,i and Z_i = M_0,i, where the (Y_i)_i and (Z_i)_i are independent. Thirdly, forecasts for the moments of future earthquakes in the studied area are given.
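    Under the usual moment-magnitude scaling log10 M0 = 1.5m + 9.1 (N·m), the distribution functions G and H follow from F_0 by a change of variables. The sketch below uses a truncated Gutenberg-Richter law as a stand-in for the fitted magnitude distribution.

        import numpy as np

        # Y = log10(M0) has CDF G(x) = F_0((x - 9.1)/1.5), and Z = M0 has
        # H(y) = F_0((log10(y) - 9.1)/1.5).
        beta, m_min, m_max = np.log(10.0), 4.0, 7.5

        def F0(m):
            m = np.clip(m, m_min, m_max)
            return (1 - np.exp(-beta * (m - m_min))) / (1 - np.exp(-beta * (m_max - m_min)))

        def G(x):   # CDF of Y = log10 of seismic moment
            return F0((x - 9.1) / 1.5)

        def H(y):   # CDF of Z = seismic moment in N*m
            return F0((np.log10(y) - 9.1) / 1.5)

        print(f"P(M0 <= 1e18 N*m) = {H(1e18):.4f}")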

  1. Ionization constants by curve fitting: determination of partition and distribution coefficients of acids and bases and their ions.

    Science.gov (United States)

    Clarke, F H; Cahoon, N M

    1987-08-01

    A convenient procedure has been developed for the determination of partition and distribution coefficients. The method involves the potentiometric titration of the compound, first in water and then in a rapidly stirred mixture of water and octanol. An automatic titrator is used, and the data is collected and analyzed by curve fitting on a microcomputer with 64 K of memory. The method is rapid and accurate for compounds with pKa values between 4 and 10. Partition coefficients can be measured for monoprotic and diprotic acids and bases. The partition coefficients of the neutral compound and its ion(s) can be determined by varying the ratio of octanol to water. Distribution coefficients calculated over a wide range of pH values are presented graphically as "distribution profiles". It is shown that subtraction of the titration curve of solvent alone from that of the compound in the solvent offers advantages for pKa determination by curve fitting for compounds of low aqueous solubility.
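    The 'distribution profile' idea translates directly into the textbook relation between log D, log P and pKa for a monoprotic acid. The compound parameters below are hypothetical.

        import numpy as np

        # Distribution profile of a monoprotic acid: log D(pH) combines the neutral
        # form (log P) and its anion (log P_ion), weighted by the ionization state.
        pKa, logP_neutral, logP_ion = 4.8, 3.1, 0.2   # hypothetical compound

        def logD(pH):
            frac_neutral = 1.0 / (1.0 + 10.0 ** (pH - pKa))
            return np.log10(frac_neutral * 10.0 ** logP_neutral
                            + (1.0 - frac_neutral) * 10.0 ** logP_ion)

        for pH in (2, 4, 6, 8, 10):
            print(f"pH {pH:>2}: log D = {logD(pH):+.2f}")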

  2. Probabilistic inhalation risk assessment due to radioactivity released from coal fired thermal power plants

    International Nuclear Information System (INIS)

    Tiwari, M.; Ajmal, P.Y.; Bhangare, R.C.; Sahu, S.K.; Pandit, G.G.

    2014-01-01

    This paper deals with the assessment of radiological risk to the general public living in the neighborhood of a 1000 MWe coal-based thermal power plant. We have used Monte Carlo simulation to characterize the uncertainty in inhalation risk due to radionuclides escaping from the stack of the plant. Monte Carlo simulation treats parameters as random variables bound to given probability distributions in order to evaluate the distribution of the resulting output. Risk assessment is the process that estimates the likelihood of occurrence of adverse effects to humans and ecological receptors as a result of exposure to hazardous chemical, radiation and/or biological agents. Quantitative risk characterization involves evaluating exposure estimates against a benchmark of toxicity, such as a cancer slope factor. Risk is calculated by multiplying the carcinogenic slope factor (SF) of the radionuclide by the dose an individual receives. The collective effective doses to the population living in the neighborhood of the plant were calculated using a Gaussian plume dispersion model. Monte Carlo analysis (MCA) is the most widely used probabilistic method in risk assessment; it treats any uncertain parameter as a random variable that obeys a given probability distribution. In MCA, computer simulations are used to combine the probability distributions associated with the dose and the SF in the risk equation, yielding a probability distribution for the risk.
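    A minimal version of the MCA step described above, assuming (purely for illustration) lognormal distributions for both the inhalation dose and the slope factor:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000
        dose = rng.lognormal(mean=np.log(2e-5), sigma=0.8, size=n)  # Sv/yr (hypothetical)
        sf = rng.lognormal(mean=np.log(5e-2), sigma=0.3, size=n)    # risk per Sv (hypothetical)

        risk = dose * sf    # risk = dose x slope factor, per sample
        print(f"mean risk {risk.mean():.2e}, 95th percentile {np.percentile(risk, 95):.2e}")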

  3. A Correction of Random Incidence Absorption Coefficients for the Angular Distribution of Acoustic Energy under Measurement Conditions

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2009-01-01

    Most acoustic measurements are based on an assumption of ideal conditions. One such ideal condition is a diffuse and reverberant field. In practice, a perfectly diffuse sound field cannot be achieved in a reverberation chamber. Uneven incident energy density under measurement conditions can cause discrepancies between the measured value and the theoretical random incidence absorption coefficient. Therefore the angular distribution of the incident acoustic energy onto an absorber sample should be taken into account. The angular distribution of the incident energy density was simulated using the beam tracing method for various room shapes and source positions. The averaged angular distribution is found to be similar to a Gaussian distribution. As a result, an angle-weighted absorption coefficient was proposed by considering the angular energy distribution to improve the agreement between...
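    The proposed angle weighting can be contrasted with the classical Paris (random-incidence) average in a few lines. The absorber model and the Gaussian width below are placeholders, not the paper's simulated distributions.

        import numpy as np

        theta = np.linspace(0.0, np.pi / 2, 2001)

        def alpha(th):
            # toy angle-dependent absorption coefficient of the sample
            return 0.9 * np.cos(th) ** 0.5

        # Classical random-incidence (Paris) average assumes uniform incident energy:
        paris = np.trapz(alpha(theta) * np.sin(2 * theta), theta)

        # Angle-weighted average with a Gaussian incident-energy distribution, here
        # centered at 45 degrees as a placeholder for the beam-tracing result:
        w = np.exp(-0.5 * ((theta - np.pi / 4) / (np.pi / 9)) ** 2)
        weighted = (np.trapz(alpha(theta) * w * np.sin(2 * theta), theta)
                    / np.trapz(w * np.sin(2 * theta), theta))

        print(f"Paris: {paris:.3f}, angle-weighted: {weighted:.3f}")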

  4. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
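    The Monte Carlo aggregation can be sketched end to end with a toy amplitude model built around Proudman resonance. The Poisson rate, parameter distributions and amplitude formula below are all stand-ins for the paper's fitted distributions and numerical hydrodynamic model.

        import numpy as np

        rng = np.random.default_rng(4)
        lam, years = 6.0, 10_000                 # Poisson rate of squall lines per year (invented)
        n = rng.poisson(lam * years)             # size of the synthetic catalog

        dp = rng.gamma(2.0, 1.5, n)              # pressure jump [hPa] (stand-in distribution)
        speed = rng.normal(22.0, 5.0, n)         # disturbance speed [m/s] (stand-in)

        # Toy amplitude model peaking near Proudman resonance, speed ~ sqrt(g*h):
        c = np.sqrt(9.81 * 50.0)                 # long-wave speed for 50 m depth
        amp = 0.01 * dp / np.maximum(np.abs(1 - (speed / c) ** 2), 0.05)

        for a in (0.1, 0.3, 0.5, 1.0):           # annualized exceedance rates (hazard curve)
            print(f"rate of amplitude > {a:.1f} m: {(amp > a).sum() / years:.3f} per year")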

  5. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique, and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in a symbolic computing environment. The response function method belongs to the third group, where interfacing of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response of engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
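    The kind of symbolic derivation meant here is easy to reproduce outside MAPLE. The sketch below uses sympy to derive second-order stochastic perturbation estimates of the mean and variance of a response u(b) = 1/b in a random parameter b; the response function is an invented example.

        import sympy as sp

        b, b0, s2 = sp.symbols("b b0 sigma2", positive=True)
        u = 1 / b                     # response, e.g. compliance ~ 1/(elastic modulus)

        du = sp.diff(u, b).subs(b, b0)
        d2u = sp.diff(u, b, 2).subs(b, b0)
        mean_u = u.subs(b, b0) + sp.Rational(1, 2) * d2u * s2   # E[u] to second order
        var_u = du**2 * s2                                      # Var[u] to first order

        print("E[u]   =", sp.simplify(mean_u))    # 1/b0 + sigma2/b0**3
        print("Var[u] =", sp.simplify(var_u))     # sigma2/b0**4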

  6. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of building the joint distribution from the dependence structure of parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.
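    The copula construction can be sketched generically: correlated Gaussian scores are mapped to uniforms and then through chosen marginals. The marginal families and the negative dependence assumed below are illustrative, not the fitted global values.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        rho = -0.6                                   # assumed dependence, sign illustrative
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
        u = stats.norm.cdf(z)                        # Gaussian copula: normals -> uniforms

        omega = stats.gamma(a=4.0, scale=0.6).ppf(u[:, 0])   # marginal for Budyko omega
        ndvi = stats.beta(a=4.0, b=2.0).ppf(u[:, 1])         # marginal for NDVI

        sel = ndvi > 0.7                             # condition on high vegetation cover
        print("omega IQR overall:     ", np.percentile(omega, [25, 75]))
        print("omega IQR | NDVI > 0.7:", np.percentile(omega[sel], [25, 75]))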

  7. Probabilistic Programming : A True Verification Challenge

    NARCIS (Netherlands)

    Katoen, Joost P.; Finkbeiner, Bernd; Pu, Geguang; Zhang, Lijun

    2015-01-01

    Probabilistic programs [6] are sequential programs, written in languages like C, Java, Scala, or ML, with two added constructs: (1) the ability to draw values at random from probability distributions, and (2) the ability to condition values of variables in a program through observations. For a
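    The two constructs can be demonstrated in a few lines of ordinary Python, using rejection sampling as the (inefficient but transparent) conditioning semantics; the coin model is an invented example.

        import random

        # Sample (construct 1), then condition through an observation (construct 2).
        def model():
            p = random.betavariate(1, 1)                 # draw a coin bias uniformly
            heads = sum(random.random() < p for _ in range(5))
            return p, heads == 4                         # observe: 4 heads out of 5

        accepted = [p for p, ok in (model() for _ in range(200_000)) if ok]
        print("posterior mean of p given 4/5 heads:",
              round(sum(accepted) / len(accepted), 3))   # analytic answer: 5/7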

  8. Estimation of the distribution coefficient by combined application of two different methods

    International Nuclear Information System (INIS)

    Vogl, G.; Gerstenbrand, F.

    1982-01-01

    A simple, non-invasive method is presented which permits determination of the rCBF and, in addition, of the distribution coefficient of the grey matter. The latter, which is closely correlated with cerebral metabolism, has so far only been determined in vitro. The new method will be a means to check its accuracy. (orig.) [de]

  9. Prodeto, a computer code for probabilistic fatigue design

    Energy Technology Data Exchange (ETDEWEB)

    Braam, H [ECN-Solar and Wind Energy, Petten (Netherlands); Christensen, C J; Thoegersen, M L [Risoe National Lab., Roskilde (Denmark); Ronold, K O [Det Norske Veritas, Hoevik (Norway)

    1999-03-01

    A computer code for structural reliability analyses of wind turbine rotor blades subjected to fatigue loading is presented. With pre-processors that can transform measured and theoretically predicted load series into load range distributions by rain-flow counting, and with a family of generic distribution models for parametric representation of these distributions, this computer program is available for carrying through probabilistic fatigue analyses of rotor blades. (au)
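    The pre-processing chain can be caricatured as follows: synthetic stress ranges stand in for rain-flow counted cycles, a generic Weibull model is fitted as the parametric representation, and a Basquin-type S-N curve converts ranges to expected Miner damage. All constants are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(9)
        ranges = rng.weibull(1.8, 5000) * 40.0   # stand-in for counted stress ranges [MPa]

        # Generic Weibull model as a parametric representation of the range distribution:
        k, loc, scale = stats.weibull_min.fit(ranges, floc=0.0)
        print(f"shape k = {k:.2f}, scale A = {scale:.1f} MPa")

        # Expected Miner damage per cycle under a Basquin S-N curve N = C * S**(-m):
        m_exp, C = 10.0, 1e22
        print(f"expected damage per cycle = {np.mean(ranges ** m_exp) / C:.2e}")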

  10. Probabilistic forecasting for extreme NO2 pollution episodes

    International Nuclear Information System (INIS)

    Aznarte, José L.

    2017-01-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. Contrary to the usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows the prediction of the full probability distribution, which in turn allows models to be built specifically for the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measures, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we show that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible manner of presenting probabilistic forecasts that maximizes their usefulness. - Highlights: • A new probabilistic forecasting system is presented to predict NO2 concentrations. • While predicting the full distribution, it also outperforms other point-forecasting models. • Forecasts show good properties and peak concentrations are properly predicted. • It forecasts the probability of exceedance of thresholds, key to decision makers. • The relative forecasting importance of the variables is obtained as a by-product.
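    A compact version of the approach, using statsmodels' QuantReg on synthetic data (the Madrid predictors are replaced by two made-up covariates):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n = 2000
        wind = rng.gamma(2.0, 2.0, n)                 # made-up meteorological covariate
        traffic = rng.uniform(0.0, 1.0, n)            # made-up activity covariate
        no2 = 40 - 3.0 * wind + 35 * traffic + rng.gumbel(0, 6, n)  # heavy upper tail

        X = sm.add_constant(np.column_stack([wind, traffic]))
        for q in (0.5, 0.9, 0.99):                    # median vs upper-tail models
            res = sm.QuantReg(no2, X).fit(q=q)
            print(f"q={q}: coefficients =", np.round(res.params, 2))

        # The exceedance probability of a threshold can be read off the family of
        # fitted quantiles: the smallest q whose prediction exceeds the threshold.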

  11. Apparent distribution coefficients of transuranium elements in UK coastal waters

    International Nuclear Information System (INIS)

    Kershaw, P.J.; Pentreath, R.J.; Harvey, B.R.; Lovett, M.B.; Boggis, S.J.

    1986-01-01

    The authorized inputs of low-level radioactive waste into the Irish Sea from the British Nuclear Fuels plc reprocessing plant at Sellafield may be used to advantage to study the distribution and behaviour of artificial radionuclides in the marine environment. Apparent distribution coefficients (Kd) for the transuranium elements Np, Pu, Am and Cm have been determined by the analysis of environmental samples collected from UK coastal waters. The sampling methodology for obtaining suspended sediment-seawater Kd values by filtration is described and critically evaluated. Artefacts may be introduced at the sample collection stage. Kd values have also been determined for seabed sediment-interstitial waters, and the precautions taken to preserve in-situ chemical conditions are described. Variations in Kd values are discussed in relation to distance from Sellafield, suspended load, redox conditions and oxidation state changes. (author)

  12. PlanJury: probabilistic plan evaluation revisited

    Science.gov (United States)

    Witte, M.; Sonke, J.-J.; van Herk, M.

    2014-03-01

    Purpose: Over a decade ago, the 'Van Herk margin recipe paper' introduced plan evaluation through DVH statistics based on population distributions of systematic and random errors. We extended this work for structures with correlated uncertainties (e.g. lymph nodes or parotid glands), and considered treatment plans containing multiple (overlapping) dose distributions (e.g. conventional lymph node and hypo-fractionated tumor doses) for which different image guidance protocols may lead to correlated errors. Methods: A command-line software tool 'PlanJury' was developed which reads 3D dose and structure data exported from a treatment planning system. Uncertainties are specified by standard deviations and correlation coefficients. Parameters control the DVH statistics to be computed: e.g. the probability of reaching a DVH constraint, or the dose absorbed at given confidence in a (combined) volume. Code was written in C++ and parallelized using OpenMP. Testing geometries were constructed using idealized spherical volumes and dose distributions. Results: Negligible stochastic noise could be attained within two minutes computation time for a single target. The confidence to properly cover both of two targets was 90% for two synchronously moving targets, but decreased by 7% if the targets moved independently. For two partially covered organs at risk the confidence of at least one organ below the mean dose threshold was 40% for synchronous motion, 36% for uncorrelated motion, but only 20% for either of the organs separately. Two abutting dose distributions ensuring 91% confidence of proper target dose for correlated motions led to 28% lower confidence for uncorrelated motions as relative displacements between the doses resulted in cold spots near the target. Conclusions: Probabilistic plan evaluation can efficiently be performed for complicated treatment planning situations, thus providing important plan quality information unavailable in conventional PTV based evaluations.
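    The population-based DVH statistic at the heart of such evaluations can be sketched in one dimension: sample a systematic setup error, shift the planned dose against the target, and count how often a minimum-dose constraint holds. The geometry, dose falloff and 3 mm standard deviation below are assumptions, not PlanJury internals.

        import numpy as np

        rng = np.random.default_rng(11)
        x = np.linspace(-40.0, 40.0, 801)                     # position [mm]
        dose = 60.0 / (1.0 + np.exp((np.abs(x) - 30.0) / 2))  # planned profile, 60 Gy plateau
        target = np.abs(x) <= 20.0                            # target of half-length 20 mm

        n, ok = 20_000, 0
        for _ in range(n):
            shift = rng.normal(0.0, 3.0)              # systematic setup error, Sigma = 3 mm
            d_seen = np.interp(x + shift, x, dose)    # dose received by the shifted anatomy
            ok += d_seen[target].min() >= 57.0        # constraint: D_min >= 95% of 60 Gy
        print("confidence of meeting the constraint:", ok / n)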

  13. Evaluation of Nonparametric Probabilistic Forecasts of Wind Power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg

    Predictions of wind power production for horizons up to 48-72 hours ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are not only provided with point predictions, which are estimates of the most likely outcome for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set...
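    Nonparametric probabilistic forecasts of this kind are typically issued and evaluated quantile by quantile; the standard scoring device is the pinball (quantile) loss, sketched below on synthetic data. This is a generic illustration, not the paper's evaluation framework.

        import numpy as np

        def pinball(y, q_pred, tau):
            """Quantile (pinball) loss of a predictive quantile q_pred at level tau."""
            d = y - q_pred
            return np.mean(np.maximum(tau * d, (tau - 1.0) * d))

        rng = np.random.default_rng(12)
        y = rng.weibull(2.0, 5000) * 10.0             # synthetic wind power outcomes
        for tau in (0.1, 0.5, 0.9):
            q = np.quantile(y, tau)                   # a climatological quantile forecast
            print(f"tau = {tau}: pinball loss = {pinball(y, q, tau):.3f}")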

  14. Probabilistic Mobility Models for Mobile and Wireless Networks

    DEFF Research Database (Denmark)

    Song, Lei; Godskesen, Jens Christian

    2010-01-01

    In this paper we present a probabilistic broadcast calculus for mobile and wireless networks whose connections are unreliable. In our calculus broadcasted messages can be lost with a certain probability, and due to mobility the connection probabilities may change. If a network broadcasts a message...... from a location it will evolve to a network distribution depending on whether nodes at other locations receive the message or not. Mobility of locations is not arbitrary but guarded by a probabilistic mobility function (PMF) and we also define the notion of a weak bisimulation given a PMF...

  15. Probabilistic pathway construction.

    Science.gov (United States)

    Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha

    2011-07-01

    Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    Science.gov (United States)

    Chamis, Christos C.

    2011-01-01

    A methodology to compute the probabilistically combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of the response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  17. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  18. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media

  19. Probabilistic-Stochastic Model of Distribution of Physical and Mechanical Properties of Soft Mineral Rocks

    Directory of Open Access Journals (Sweden)

    O.O. Sdvizhkova

    2017-12-01

    Full Text Available The physical and mechanical characteristics of soils and soft rocks obtained as a result of laboratory tests are important initial parameters for assessing the stability of natural and artificial slopes. Such properties of rocks as cohesion and the angle of internal friction are shaped by a number of natural and technogenic factors. At the same time, from the set of factors influencing the stability of the slope, the most significant ones are singled out, which to a greater extent determine the properties of the rocks. The more factors are taken into account in the geotechnical model, the more closely the properties of the rocks are described, which increases the accuracy of the scientific forecast of the landslide hazard of the slope. On the other hand, an increase in the number of factors involved in the model complicates it and reduces the reliability of geotechnical calculations. The aim of the work is to construct the statistical distributions of the studied physical and mechanical properties of soft rocks and to substantiate a probabilistic-statistical model. Based on the results of laboratory tests of rocks, the statistical distributions of the quantitative traits studied, the angle of internal friction φ and the cohesion, were constructed. It was established that the statistical distribution of the physical and mechanical properties of rocks is close to a uniform law.
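    The closeness-to-uniform claim can be examined with an ordinary goodness-of-fit test. The friction-angle sample below is synthetic, and the p-value is optimistic because the distribution bounds are estimated from the same data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        phi = rng.uniform(14.0, 22.0, 60)    # hypothetical friction-angle values [degrees]

        lo, hi = phi.min(), phi.max()
        d, p = stats.kstest(phi, "uniform", args=(lo, hi - lo))
        print(f"KS D = {d:.3f}, p = {p:.3f} -> uniform law "
              f"{'not rejected' if p > 0.05 else 'rejected'} at the 5% level")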

  20. Distribution of physicians and hospital beds based on Gini coefficient and Lorenz curve: A national survey

    Directory of Open Access Journals (Sweden)

    Satar Rezaei

    2016-06-01

    Full Text Available Introduction: Inequality is prevalent in all sectors, particularly in the distribution of and access to resources in the health sector. The aim of the current study was to investigate the distribution of physicians and hospital beds in Iran in 2001, 2006 and 2011. Methods: This retrospective, cross-sectional study evaluated the distribution of physicians and hospital beds in 2001, 2006 and 2011 using the Gini coefficient and the Lorenz curve. The required data, including the number of physicians (general practitioners and specialists), the number of hospital beds and the number of hospitalized patients, were obtained from the statistical yearbook of the Iranian Statistical Center (ISC). The data analysis was performed with DASP software. Results: The Gini coefficients for physicians and hospital beds based on population in 2001 were 0.19 and 0.16, and based on hospitalized patients were 0.48 and 0.37, respectively. In 2006, these values were 0.18 and 0.15 based on population, and 0.21 and 0.21 based on hospitalized patients, respectively. In 2011, the Gini coefficients were 0.16 and 0.13 based on population, and 0.47 and 0.37 based on hospitalized patients, respectively. Although the distribution had improved in 2011 compared with 2001 in terms of both population and number of hospitalized patients, there was more inequality in distribution based on the number of hospitalized patients than based on population. Conclusion: This study indicated that inequality in the distribution of physicians and hospital beds declined in 2011 compared with 2001. This distribution was based on population, so it is suggested that, in allocating resources, health policymakers also consider such need indices as the pattern of diseases, illness-prone areas, the number of inpatients, and mortality.
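    The Gini computation itself is short once the regional counts are sorted. The sketch below uses hypothetical provincial physician counts and omits the per-population and per-patient normalizations applied in the study.

        import numpy as np

        def gini(counts):
            """Gini coefficient of a resource distribution across regions."""
            x = np.sort(np.asarray(counts, dtype=float))
            n = x.size
            i = np.arange(1, n + 1)
            return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1.0) / n

        physicians = [120, 95, 300, 80, 240, 60, 150, 410, 70, 90]  # hypothetical provinces
        print(f"Gini = {gini(physicians):.3f}")   # 0 = perfect equality, 1 = maximal inequality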

  1. Research on sorption behavior of radionuclides under shallow land environment. Mechanism and standard methodologies for measurement of distribution coefficients of radionuclides

    International Nuclear Information System (INIS)

    Sakamoto, Yoshiaki; Tanaka, Tadao; Takebe, Shinichi; Nagao, Seiya; Ogawa, Hiromichi; Komiya, Tomokazu; Hagiwara, Shigeru

    2001-01-01

    This study consists of two categories of research work. One is research on the sorption mechanisms of long-half-life radionuclides (technetium-99, TRU elements and U-series radionuclides) on soils and rocks, including the development of a database of distribution coefficients of radionuclides. The database of distribution coefficients, with information about measurement conditions such as shaking method, soil characteristics and solution composition, has already been opened to the public (JAERI-DATABASE 20001003). The other is an investigation of a standard methodology for measuring the distribution coefficients of radionuclides on soils, rocks and engineering materials in Japan. (author)

  2. Meta-heuristic CRPS minimization for the calibration of short-range probabilistic forecasts

    Science.gov (United States)

    Mohammadi, Seyedeh Atefeh; Rahmani, Morteza; Azadi, Majid

    2016-08-01

    This paper deals with probabilistic short-range temperature forecasts over synoptic meteorological stations across Iran using non-homogeneous Gaussian regression (NGR). NGR creates a Gaussian forecast probability density function (PDF) from the ensemble output. The mean of the normal predictive PDF is a bias-corrected weighted average of the ensemble members, and its variance is a linear function of the raw ensemble variance. The coefficients for the mean and variance are estimated by minimizing the continuous ranked probability score (CRPS) over a training period. The CRPS is a scoring rule for distributional forecasts. In the paper of Gneiting et al. (Mon Weather Rev 133:1098-1118, 2005), the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is used to minimize the CRPS. Since BFGS is a conventional optimization method with its own limitations, we suggest using particle swarm optimization (PSO), a robust meta-heuristic method, to minimize the CRPS. The ensemble prediction system used in this study consists of nine different configurations of the Weather Research and Forecasting model for 48-h forecasts of temperature during autumn and winter of 2011 and 2012. The probabilistic forecasts were evaluated using several common verification scores, including the Brier score, attribute diagram and rank histogram. Results show that both BFGS and PSO find the optimal solution and yield the same evaluation scores, but PSO can do this with a feasible random first guess and much less computational complexity.
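    A minimal NGR calibration with a single bias-corrected ensemble-mean predictor can be written down directly, since the CRPS of a Gaussian has a closed form. BFGS is used below as the reference optimizer; a PSO swarm would simply minimize the same objective. The data are synthetic, and the single-predictor mean is a simplification of the member-weighted mean in the paper.

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(7)
        T, K = 500, 9                                  # training days, ensemble members
        truth = rng.normal(10.0, 4.0, T)
        ens = truth[:, None] + rng.normal(1.0, 2.0, (T, K))   # biased synthetic ensemble
        xbar, s2 = ens.mean(axis=1), ens.var(axis=1)

        def crps_gauss(y, mu, sig):
            # closed-form CRPS of a normal predictive distribution
            z = (y - mu) / sig
            return sig * (z * (2 * stats.norm.cdf(z) - 1) + 2 * stats.norm.pdf(z)
                          - 1 / np.sqrt(np.pi))

        def mean_crps(params):
            a, b, c, d = params
            sig = np.sqrt(np.maximum(c + d * s2, 1e-6))       # keep variance positive
            return crps_gauss(truth, a + b * xbar, sig).mean()

        res = optimize.minimize(mean_crps, x0=[0.0, 1.0, 1.0, 0.5], method="BFGS")
        print("a, b, c, d =", np.round(res.x, 3), " mean CRPS =", round(res.fun, 3))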

  3. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
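    The flavor of the moment-bounding step can be seen on a small four-interval example: mean bounds come from the endpoints, the variance maximum sits at a vertex of the box of intervals (enumerable for small problems), and the variance minimum is found by continuous optimization, which is the direction the paper's polynomial-time algorithms generalize. The intervals below are invented.

        import numpy as np
        from itertools import product
        from scipy.optimize import minimize

        intervals = [(2.0, 3.0), (2.5, 4.0), (1.0, 2.0), (3.0, 3.5)]
        lo = np.array([a for a, _ in intervals])
        hi = np.array([b for _, b in intervals])

        # Mean bounds are attained at the interval endpoints.
        print(f"mean bounds: [{lo.mean():.3f}, {hi.mean():.3f}]")

        # Variance is convex in the data vector, so its maximum over the box of
        # intervals sits at a vertex; its minimum is found by optimization inside.
        v_max = max(np.var(v) for v in product(*intervals))
        res = minimize(lambda xx: np.var(xx), x0=(lo + hi) / 2, bounds=intervals)
        print(f"variance bounds: [{res.fun:.4f}, {v_max:.4f}]")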

  4. Multiaxial probabilistic elastic-plastic constitutive simulations of soils

    Science.gov (United States)

    Sadrinezhad, Arezoo

    Fokker-Planck-Kolmogorov (FPK) equation approach has recently been developed to simulate elastic-plastic constitutive behaviors of materials with uncertain material properties. The FPK equation approach transforms the stochastic constitutive rate equation, which is a stochastic, nonlinear, ordinary differential equation (ODE) in the stress-pseudo time space, into a second-order accurate, deterministic, linear FPK partial differential equation (PDE) in the probability density of stress-pseudo time space. This approach does not suffer from the drawbacks of the traditional approaches, such as the Monte Carlo approach and the perturbation approach, for solving nonlinear ODEs with random coefficients. In this study, the existing one-dimensional FPK framework for probabilistic constitutive modeling of soils is extended to multiple dimensions. However, the multivariate FPK PDEs cannot be solved using traditional mathematical techniques such as finite difference techniques due to their high computational cost. Therefore, computationally efficient algorithms based on the Fourier spectral approach are developed for solving a class of FPK PDEs that arises in probabilistic elasto-plasticity. This class includes linear FPK PDEs in (stress) space and (pseudo) time - having space-independent but time-dependent, and both space- and time-dependent coefficients - with impulse initial conditions and reflecting boundary conditions. The solution algorithms rely on first mapping the stress space of the governing PDE onto the interval between 0 and 2π using the change of coordinates rule, followed by approximating the solution of the PDE in the 2π-periodic domain by a finite Fourier series in the stress space with unknown time-dependent solution coefficients. Finally, the time-dependent solution coefficients are obtained from the initial condition. The accuracy and efficiency of the developed algorithms are tested. The developed algorithms are used to simulate uniaxial and multiaxial, monotonic and cyclic
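
    The sketch below conveys the flavor of the Fourier spectral approach on a 2π-periodic domain for an advection-type FPK PDE, ∂P/∂t = -∂[v(s)P]/∂s, with a space-dependent drift. The drift v(s), the narrow-Gaussian stand-in for the impulse initial condition, the periodic (rather than reflecting) boundary treatment, and the RK4 time stepping are all illustrative assumptions, not the dissertation's algorithm.

        import numpy as np

        n = 256
        s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)  # mapped stress axis
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=2.0 * np.pi / n)  # wavenumbers
        v = 0.5 + 0.3 * np.sin(s)       # assumed space-dependent drift coefficient

        def rhs(P):                     # spectral evaluation of -d(v*P)/ds
            return -np.real(np.fft.ifft(1j * k * np.fft.fft(v * P)))

        P = np.exp(-((s - np.pi) ** 2) / (2.0 * 0.1 ** 2))  # Gaussian ~ impulse
        P /= P.sum() * (2.0 * np.pi / n)

        dt = 1e-3
        for _ in range(2000):           # classical RK4 time stepping
            k1 = rhs(P); k2 = rhs(P + 0.5 * dt * k1)
            k3 = rhs(P + 0.5 * dt * k2); k4 = rhs(P + dt * k3)
            P = P + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        # pure advection on a periodic domain conserves total probability
        print("total probability:", P.sum() * 2.0 * np.pi / n)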

  5. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    Science.gov (United States)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions of the flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation maps. Various numbers of uncertain variables can be considered in these models, and the variability of the outputs can be quantified. Probabilistic flood inundation maps for dam-break-induced floods can thus be developed, with consideration of output variability, using the commonly used HEC-RAS model. Different probabilistic flood inundation maps are discussed and compared. These maps are expected to provide new physical insight in support of evaluating the areas at risk of flooding from the reservoir.

  6. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  7. Effectiveness of Securities with Fuzzy Probabilistic Return

    Directory of Open Access Journals (Sweden)

    Krzysztof Piasecki

    2011-01-01

    Full Text Available The generalized fuzzy present value of a security is defined here as fuzzy valued utility of cash flow. The generalized fuzzy present value cannot depend on the value of future cash flow. There exists such a generalized fuzzy present value which is not a fuzzy present value in the sense given by some authors. If the present value is a fuzzy number and the future value is a random one, then the return rate is given as a probabilistic fuzzy subset on the real line. This kind of return rate is called a fuzzy probabilistic return. The main goal of this paper is to derive the family of effective securities with fuzzy probabilistic return. Achieving this goal requires the study of the basic parameters characterizing fuzzy probabilistic return. Therefore, fuzzy expected value and variance are determined for this case of return. These results are a starting point for constructing a three-dimensional image. The set of effective securities is introduced as the Pareto optimal set determined by the maximization of the expected return rate and minimization of the variance. Finally, the set of effective securities is distinguished as a fuzzy set. These results are obtained without the assumption that the distribution of future values is Gaussian. (original abstract)

  8. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational and rotational perturbations to both pre- and posttherapy ADC maps, then repeating the calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS with use of receiver operating characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker in

  9. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of tolerances in the geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are considered to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. The influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special purpose code based on the finite element approach. The analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  10. Probabilistic Modeling of the Renal Stone Formation Module

    Science.gov (United States)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the in-flight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high-salt diet and high concentrations of calcium in the blood (caused by bone depletion from unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously

  11. Quantum probabilistic logic programming

    Science.gov (United States)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  12. Unsteady Probabilistic Analysis of a Gas Turbine System

    Science.gov (United States)

    Brown, Marilyn

    2003-01-01

    In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.

  13. A novel method for the investigation of liquid/liquid distribution coefficients and interface permeabilities applied to the water-octanol-drug system.

    Science.gov (United States)

    Stein, Paul C; di Cagno, Massimiliano; Bauer-Brandl, Annette

    2011-09-01

    In this work a new, accurate and convenient technique for the measurement of distribution coefficients and membrane permeabilities based on nuclear magnetic resonance (NMR) is described. This method is a novel implementation of localized NMR spectroscopy and enables the simultaneous analysis of the drug content in the octanol and in the water phase without separation. For validation of the method, the distribution coefficients at pH = 7.4 of four active pharmaceutical ingredients (APIs), namely ibuprofen, ketoprofen, nadolol, and paracetamol (acetaminophen), were determined using a classical approach. These results were compared to the NMR experiments which are described in this work. For all substances, the respective distribution coefficients found with the two techniques coincided very well. Furthermore, the NMR experiments make it possible to follow the distribution of the drug between the phases as a function of position and time. Our results show that the technique, which is available on any modern NMR spectrometer, is well suited to the measurement of distribution coefficients. The experiments also present new insight into the dynamics of the water-octanol interface itself and permit measurement of the interface permeability.
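
    For reference, the distribution coefficient itself is simply the ratio of equilibrium drug concentrations in the two phases, which is what both techniques ultimately measure; a toy calculation with made-up numbers:

        import math

        c_octanol = 4.2e-3   # drug concentration in the octanol phase (mol/L)
        c_water = 1.1e-4     # drug concentration in the aqueous phase (mol/L)

        D = c_octanol / c_water
        print(f"D = {D:.1f}, log D = {math.log10(D):.2f}")  # log D at buffered pH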

  14. Estimation of soil-soil solution distribution coefficient of radiostrontium using soil properties.

    Science.gov (United States)

    Ishikawa, Nao K; Uchida, Shigeo; Tagami, Keiko

    2009-02-01

    We propose a new approach for estimation of soil-soil solution distribution coefficient (K(d)) of radiostrontium using some selected soil properties. We used 142 Japanese agricultural soil samples (35 Andosol, 25 Cambisol, 77 Fluvisol, and 5 others) for which Sr-K(d) values had been determined by a batch sorption test and listed in our database. Spearman's rank correlation test was carried out to investigate correlations between Sr-K(d) values and soil properties. Electrical conductivity and water soluble Ca had good correlations with Sr-K(d) values for all soil groups. Then, we found a high correlation between the ratio of exchangeable Ca to Ca concentration in water soluble fraction and Sr-K(d) values with correlation coefficient R=0.72. This pointed us toward a relatively easy way to estimate Sr-K(d) values.

  15. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Ko, Won Il; Choi, Jong Won; Kang, Chul Hyung; Lee, Jae Sol; Lee, Kun Jai

    1998-01-01

    A simple approach was described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., the change of distribution type for input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of assumed probabilistic distributions. It was indicated that Monte Carlo simulation by a Latin Hypercube Sampling technique and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. The change of distribution types of the input parameters showed that the values calculated by the deterministic method sat around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of the inputs, however, the values calculated by the deterministic method sat around the 85th percentile of the output distribution function. The results of the sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
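
    A hedged sketch of the described workflow: three-point cost estimates converted into assumed distributions (triangular here) and propagated with Latin Hypercube Sampling. The component names, the numbers, and the additive total-cost model are placeholders, not the paper's inputs.

        import numpy as np
        from scipy.stats import qmc, triang

        components = {          # (low, mode, high) three-estimate inputs, $/MWh
            "uranium_purchase": (1.5, 2.5, 4.5),
            "enrichment":       (1.0, 1.6, 2.4),
            "fabrication":      (0.5, 0.8, 1.2),
            "back_end":         (0.8, 1.3, 2.5),
        }

        n = 10_000
        sampler = qmc.LatinHypercube(d=len(components), seed=1)
        u = sampler.random(n)                  # uniform LHS points in [0, 1)^d

        total = np.zeros(n)
        for j, (low, mode, high) in enumerate(components.values()):
            c = (mode - low) / (high - low)    # scipy's triangular shape parameter
            total += triang(c, loc=low, scale=high - low).ppf(u[:, j])

        point_estimate = sum(m for _, m, _ in components.values())  # sum of modes
        print("mean total cost:", total.mean())
        # fraction of simulated totals below the deterministic-style estimate,
        # echoing the percentile comparison reported in the abstract
        print("P(total < point estimate):", (total < point_estimate).mean())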

  16. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  17. Determination of distribution coefficient (Kd's) of some artificial and naturally occurring radionuclide in fresh and marine coastal water sediment

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Mamish, S; Haleem, M. A.

    2004-12-01

    Distribution coefficients of artificial and natural radionuclides in fresh and marine water sediment are used in modeling radionuclide dispersion in water systems and in assessing the radiation risk and environmental impact of radioactive emissions due to routine operations of nuclear plants or to disposal and burial of radioactive waste in the environment. In the present work, distribution coefficients of uranium, lead, polonium and radium (naturally occurring radionuclides that may be emitted into the Syrian environment in relatively high concentrations by the phosphate and oil industries), as well as of caesium-137 and strontium-85, were determined in fresh water sediment (Euphrates River, Orontes River and Mzzerib Lake) and marine coastal water sediment (Lattakia, Tartous and Banias). Distribution coefficients were found to vary between (5.8-17.18)×10³, (2.2-8.11)×10³, (0.22-2.08)×10³, (0.16-0.19)×10³, (0.38-0.69)×10³ and 49-312 for polonium, lead, uranium, radium, cesium and strontium, respectively. Results indicated that most of the distribution coefficients measured in the present study were lower than the values reported in IAEA documents for marine coastal sediment. In addition, variations of Kd's with aqueous phase composition and with the sediment's elemental and mineralogical composition and total organic material content have been studied, and linear correlation coefficients for each isotope with the different parameters have been determined. The data reported in this study can be used to model the dispersion and transfer of radioactive contaminants in Syrian rivers, lakes and coastal waters, and to assess risks to the public due to discharges of the phosphate and oil industries into the Syrian environment. (Authors)
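
    For context, the batch sorption test behind such values reduces to a one-line formula, Kd = ((C0 - Ceq)/Ceq)·(V/m); the numbers below are illustrative only, not data from this study.

        C0 = 100.0    # initial activity concentration in solution (Bq/mL)
        Ceq = 4.0     # equilibrium activity concentration after contact (Bq/mL)
        V = 30.0      # solution volume (mL)
        m = 1.0       # sediment mass (g)

        Kd = (C0 - Ceq) / Ceq * V / m
        print(f"Kd = {Kd:.0f} mL/g")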

  18. Fine-Tuning Nonhomogeneous Regression for Probabilistic Precipitation Forecasts: Unanimous Predictions, Heavy Tails, and Link Functions

    DEFF Research Database (Denmark)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.

    2017-01-01

    This study applies the nonhomogeneous regression framework, a state-of-the-art ensemble postprocessing technique used to obtain automatically corrected weather forecasts, to predict a full forecast distribution, and improves its forecast performance with three statistical refinements: a novel split handling unanimous ensemble predictions, heavy-tailed response distributions, and link functions for the optimization of regression coefficients for the scale parameter. These three refinements are tested for 10 stations in a small area of the European Alps for lead times from +24 to +144 h and accumulation periods of 24 and 6 h. Together, they improve probabilistic forecasts for precipitation amounts as well as the probability of precipitation events over the default postprocessing method. The improvements are largest for the shorter accumulation periods and shorter lead times, where the information of unanimous ensemble predictions is more important.

  19. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  20. Probabilistically-Cued Patterns Trump Perfect Cues in Statistical Language Learning.

    Science.gov (United States)

    Lany, Jill; Gómez, Rebecca L

    2013-01-01

    Probabilistically-cued co-occurrence relationships between word categories are common in natural languages but difficult to acquire. For example, in English, determiner-noun and auxiliary-verb dependencies both involve co-occurrence relationships, but determiner-noun relationships are more reliably marked by correlated distributional and phonological cues, and appear to be learned more readily. We tested whether experience with co-occurrence relationships that are more reliable promotes learning those that are less reliable using an artificial language paradigm. Prior experience with deterministically-cued contingencies did not promote learning of less reliably-cued structure, nor did prior experience with relationships instantiated in the same vocabulary. In contrast, prior experience with probabilistically-cued co-occurrence relationships instantiated in different vocabulary did enhance learning. Thus, experience with co-occurrence relationships sharing underlying structure but not vocabulary may be an important factor in learning grammatical patterns. Furthermore, experience with probabilistically-cued co-occurrence relationships, despite their difficulty for naïve learners, lays an important foundation for learning novel probabilistic structure.

  1. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  2. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  3. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However, applying the CRPS to weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias, because of which minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated with the members in the forecasted empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, both for the application and for the theoretical guarantee to hold. As an application example, on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
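
    To make the bias issue concrete, the sketch below contrasts the plain empirical CRPS estimator for an ensemble with a bias-corrected ("fair") variant in the spirit of the paper's unbiased, cluster-based version; this equal-weight illustration is an assumption, not the authors' code.

        import numpy as np

        def crps_empirical(x, y):
            """Plain estimator: E|X - y| - 0.5 E|X - X'| over the M^2 pairs."""
            x = np.asarray(x)
            return (np.mean(np.abs(x - y))
                    - 0.5 * np.mean(np.abs(x[:, None] - x[None, :])))

        def crps_fair(x, y):
            """Bias-corrected variant: pairwise term averaged over M(M-1) pairs."""
            x = np.asarray(x)
            m = len(x)
            pairwise = np.sum(np.abs(x[:, None] - x[None, :])) / (m * (m - 1))
            return np.mean(np.abs(x - y)) - 0.5 * pairwise

        ens = np.random.default_rng(2).normal(0, 1, 10)   # 10-member ensemble
        print(crps_empirical(ens, 0.3), crps_fair(ens, 0.3))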

  4. Aluminum 7075-T6 fatigue data generation and probabilistic life prediction formulation

    OpenAIRE

    Kemna, John G.

    1998-01-01

    Approved for public release; distribution is unlimited. The life extension of aging fleet aircraft requires an assessment of the safe-life remaining after refurbishment. Risk can be estimated by conventional deterministic fatigue analysis coupled with a subjective factor of safety. Alternatively, risk can be quantitatively and objectively predicted by probabilistic analysis. In this investigation, a general probabilistic life formulation is specialized for constant amplitude, fully reverse...

  5. A Geometric Presentation of Probabilistic Satisfiability

    OpenAIRE

    Morales-Luna, Guillermo

    2010-01-01

    By considering probability distributions over the set of assignments, the expected truth values of propositional variables are extended through linear operators, and the expected truth values of the clauses in any given conjunctive form are also extended through linear maps. The probabilistic satisfiability problems are discussed in terms of the introduced linear extensions. The case of multiple truth values is also discussed.
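
    A tiny numerical illustration of the linear-extension view, with an assumed clause and a random distribution: a probability vector over the 2^n assignments induces expected truth values of variables and clauses as dot products with 0/1 indicator vectors.

        import itertools
        import numpy as np

        n = 3
        assignments = list(itertools.product([0, 1], repeat=n))  # all 2^n points

        rng = np.random.default_rng(3)
        p = rng.random(len(assignments))
        p /= p.sum()                        # distribution over the assignments

        # clause (x1 OR NOT x3) as a 0/1 indicator over assignments
        clause = np.array([a[0] or (1 - a[2]) for a in assignments])

        expected_x1 = np.dot(p, [a[0] for a in assignments])  # linear map, variable
        expected_clause = np.dot(p, clause)                   # linear map, clause
        print(expected_x1, expected_clause)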

  6. Probabilistic model for sterilization of food

    International Nuclear Information System (INIS)

    Chepurko, V.V.; Malinovskij, O.V.

    1986-01-01

    A probabilistic model for radiation sterilization is proposed, based on the following suppositions: (1) the initial contamination m of a volume unit of the sterilized product is described by the probability distribution q(m); (2) inactivation of the population of m microorganisms is approximated by a Bernoulli trial scheme; and (3) the contamination of each unit of the sterilized product is independent. The possibility of approximating q(m) by a Poisson distribution is demonstrated. Diagrams are presented that permit evaluation of the dose providing the required reliability of sterilization of food for chicken gnotobionts.
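
    Combining suppositions (1) and (2) gives a closed form: if the initial count is Poisson(λ) and each microorganism independently survives the dose with probability s, the probability that a unit is sterile is Σ_m q(m)(1-s)^m = exp(-λs). The single-hit survival curve s(dose) below is an assumed illustration, not the paper's model.

        import numpy as np

        lam = 50.0                    # mean initial contamination per unit (Poisson)

        def survival(dose, d0=2.0):   # assumed single-hit survival model
            return np.exp(-dose / d0)

        def p_sterile(dose):
            # sum_m Poisson(m; lam) * (1 - s)^m collapses to exp(-lam * s)
            return np.exp(-lam * survival(dose))

        for dose in (5, 10, 15, 20):
            print(dose, p_sterile(dose))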

  7. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
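
    A simplified sketch of the AMV idea for independent standard normal inputs, assuming an illustrative response function: build the first-order mean-value model, locate the most probable point implied by each probability level, and re-evaluate the exact response there. This is a schematic reading of the method, not the original implementation.

        import numpy as np
        from scipy.stats import norm

        def g(u):                     # assumed mildly nonlinear example response
            return 3.0 + u[0] + 0.5 * u[1] + 0.2 * u[0] ** 2

        def grad(f, u, h=1e-6):       # central-difference gradient
            return np.array([(f(u + h * e) - f(u - h * e)) / (2 * h)
                             for e in np.eye(len(u))])

        mu = np.zeros(2)
        a = grad(g, mu)
        sigma_z = np.linalg.norm(a)   # st.dev. of the linear (mean value) model
        unit = a / sigma_z

        for p in (0.05, 0.5, 0.95):
            beta = norm.ppf(p)
            u_mpp = beta * unit       # most probable point of the MV model
            z_mv = g(mu) + sigma_z * beta
            z_amv = g(u_mpp)          # AMV correction: exact g at the MPP
            print(f"p={p:.2f}  MV={z_mv:.3f}  AMV={z_amv:.3f}")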

  8. Probabilistic Fatigue Analysis of Jacket Support Structures for Offshore Wind Turbines Exemplified on Tubular Joints

    OpenAIRE

    Kelma, Sebastian; Schaumann, Peter

    2015-01-01

    The design of offshore wind turbines is usually based on the semi-probabilistic safety concept. Using probabilistic methods, the aim is to find an advanced structural design of OWTs in order to improve safety and reduce costs. The probabilistic design is exemplified on tubular joints of a jacket substructure. Loads and resistance are considered by their respective probability distributions. Time series of loads are generated by fully-coupled numerical simulation of the offshore wind turbine. ...

  9. Exploring the Potential Use of the Birnbaum-Saunders Distribution in Inventory Management

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2015-01-01

    Full Text Available Choosing a suitable demand distribution during lead time is an important issue in inventory models. Much research has explored the advantage of following a distributional assumption different from normality. The Birnbaum-Saunders (BS) distribution is a probabilistic model that has its genesis in engineering but is also being widely applied to other fields including business, industry, and management. We conduct numerical experiments using the R statistical software to assess the adequacy of the BS distribution, against the normal and gamma distributions, in light of the traditional lot size-reorder point inventory model, known as (Q, r). The BS distribution is well known to be robust to extreme values; indeed, results indicate that it is a more adequate assumption under higher values of the lead-time demand coefficient of variation, thus outperforming the gamma and the normal assumptions.
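
    A sketch of the comparison in Python rather than the paper's R, using scipy's 'fatiguelife' distribution (the Birnbaum-Saunders family): match a common mean and coefficient of variation across the three candidate distributions and compare the implied reorder points at a target service level. The demand figures are illustrative.

        import numpy as np
        from scipy.stats import norm, gamma, fatiguelife
        from scipy.optimize import brentq

        mean, cv = 100.0, 0.8        # lead-time demand mean and CV (assumed)
        service = 0.95               # target cycle service level

        def bs_cv(c):                # CV of fatiguelife(c) is scale-free
            return c * np.sqrt(1 + 1.25 * c * c) / (1 + 0.5 * c * c)

        c = brentq(lambda x: bs_cv(x) - cv, 1e-6, 10)
        scale = mean / (1 + 0.5 * c * c)   # mean of fatiguelife = scale*(1+c^2/2)

        dists = {
            "normal": norm(loc=mean, scale=cv * mean),
            "gamma": gamma(a=1 / cv**2, scale=mean * cv**2),
            "birnbaum-saunders": fatiguelife(c, scale=scale),
        }
        for name, d in dists.items():
            print(f"{name:>18}: reorder point r = {d.ppf(service):7.1f}")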

  10. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    Full Text Available The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems, including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
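
    A minimal illustration of backwards fixed point calculation for one of the quoted example classes, the expected running time of a probabilistic loop: a loop that repeats with probability 1-p satisfies T = 1 + (1-p)T, and iterating this expectation transformer converges to the least fixed point 1/p. The loop itself is an assumed toy, not one of the paper's case studies.

        p = 0.25
        t = 0.0
        for _ in range(200):          # iterate the expectation transformer
            t = 1.0 + (1.0 - p) * t
        print(t, 1.0 / p)             # converges towards the exact value 4.0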

  11. Distribution coefficient of radionuclides on rocks for performance assessment of high-level radioactive waste repository

    International Nuclear Information System (INIS)

    Shibutani, Tomoki; Shibata, Masahiro; Suyama, Tadahiro

    1999-11-01

    Distribution coefficients of radionuclides on rocks are selected for safety assessment in the 'Second Progress Report on Research and Development for the geological disposal of HLW in Japan (H12 Report)'. The categorized types of rock are granitic rocks (crystalline and acidic rocks), basaltic rocks (crystalline and basic rocks), psammitic rocks (neogene sedimentary (soft)), and tuffaceous-pelitic rocks (pre-neogene sedimentary rocks (hard)). The types of groundwater are FRHP (fresh reducing high-pH), FRLP (fresh reducing low-pH), SRHP (saline reducing high-pH), SRLP (saline reducing low-pH), MRNP (mixing reducing neutral-pH) and FOHP (fresh oxidizing high-pH) groundwater. The elements to be surveyed are Ni, Se, Zr, Nb, Tc, Pd, Sn, Cs, Sm, Pb, Ra, Ac, Th, Pa, U, Np, Pu, Am and Cm. Distribution coefficients are collected from literatures describing batch sorption experimental results, and are selected under consideration of conservativity. (author)

  12. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a
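
    Although the abstract is cut off, the described integration can be sketched as a two-dimensional Monte Carlo: sample an exposure and an individual critical dose, and estimate the probability that the former exceeds the latter. Both lognormal choices and all numbers below are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000
        exposure = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)  # mg/kg/day
        critical_dose = rng.lognormal(mean=np.log(5.0), sigma=0.6, size=n)

        p_effect = np.mean(exposure > critical_dose)
        print(f"P(random individual exposed above own critical dose) = "
              f"{p_effect:.2e}")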

  13. TU-F-18C-05: Evaluation of a Method to Calculate Patient-Oriented MGD Coefficients Using Estimates of Glandular Tissue Distribution

    International Nuclear Information System (INIS)

    Porras-Chaverri, M; Galavis, P; Bakic, P; Vetter, J

    2014-01-01

    Purpose: Evaluate mammographic mean glandular dose (MGD) coefficients for particular known tissue distributions using a novel formalism that incorporates the effect of the heterogeneous glandular tissue distribution, by comparing them with MGD coefficients derived from the corresponding anthropomorphic computer breast phantom. Methods: MGD coefficients were obtained using MCNP5 simulations with the currently used homogeneous assumption and the heterogeneously-layered breast (HLB) geometry and compared against those from the computer phantom (ground truth). The tissue distribution for the HLB geometry was estimated using glandularity map image pairs corrected for the presence of non-glandular fibrous tissue. Heterogeneity of tissue distribution was quantified using the glandular tissue distribution index, Idist. The phantom had 5 cm compressed breast thickness (MLO and CC views) and 29% whole breast glandular percentage. Results: Differences as high as 116% were found between the MGD coefficients with the homogeneous breast core assumption and those from the corresponding ground truth. Higher differences were found for cases with more heterogeneous distribution of glandular tissue. The Idist for all cases was in the [−0.8, +0.3] range. The use of the methods presented in this work results in better agreement with ground truth, with an improvement as high as 105 pp. The decrease in difference across all phantom cases was in the [9, 105] pp range, dependent on the distribution of glandular tissue, and was larger for the cases with the highest Idist values. Conclusion: Our results suggest that the use of corrected glandularity image pairs, as well as the HLB geometry, improves the estimates of MGD conversion coefficients by accounting for the distribution of glandular tissue within the breast. The accuracy of this approach with respect to ground truth is highly dependent on the particular glandular tissue distribution studied. Predrag Bakic discloses current

  14. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  15. Size Evolution and Stochastic Models: Explaining Ostracod Size through Probabilistic Distributions

    Science.gov (United States)

    Krawczyk, M.; Decker, S.; Heim, N. A.; Payne, J.

    2014-12-01

    The biovolume of animals has functioned as an important benchmark for measuring evolution throughout geologic time. In our project, we examined the observed average body size of ostracods over time in order to understand the mechanism of size evolution in these marine organisms. The body size of ostracods has varied since the beginning of the Ordovician, when the first true ostracods appeared. We created a stochastic branching model to generate possible evolutionary trees of ostracod size. Using stratigraphic ranges for ostracods compiled from over 750 genera in the Treatise on Invertebrate Paleontology, we calculated overall speciation and extinction rates for our model. At each timestep in our model, new lineages can evolve or existing lineages can become extinct. Newly evolved lineages are assigned sizes based on their parent genera. We parameterized our model to generate neutral and directional changes in ostracod size to compare with the observed data. New sizes were chosen via a normal distribution: the neutral model selected size differentials centered on zero, allowing an equal chance of larger or smaller ostracods at each speciation, whereas the directional model centered the distribution on a negative value, giving a larger chance of smaller ostracods. Our data strongly suggest that ostracod evolution has followed a model that directionally pushes mean ostracod size down, rather than a neutral model. Our model was able to match the magnitude of the size decrease, although it produced a constant linear decrease while the actual data show a much more rapid initial decrease followed by a constant size; this nuance of the observed trends suggests a more complex mechanism of size evolution. In conclusion, probabilistic methods can provide valuable insight into possible evolutionary mechanisms determining size evolution in ostracods.
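
    A compact sketch of a stochastic branching model of the kind described, with assumed speciation/extinction rates and drift magnitude: descendant log-sizes are drawn from a normal centered on zero (neutral) or on a negative value (directional).

        import numpy as np

        rng = np.random.default_rng(11)

        def simulate(drift, spec=0.08, ext=0.06, steps=400, sd=0.05):
            sizes = [0.0]                   # log biovolume of the founding lineage
            means = []
            for _ in range(steps):
                survivors = [s for s in sizes if rng.random() > ext]
                children = [s + rng.normal(drift, sd) for s in survivors
                            if rng.random() < spec]
                sizes = (survivors + children) or [0.0]  # re-seed if clade dies
                means.append(np.mean(sizes))
            return np.array(means)

        print("neutral     final mean log-size:", simulate(0.0)[-1])
        print("directional final mean log-size:", simulate(-0.01)[-1])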

  16. Probabilistic design of aluminum sheet drawing for reduced risk of wrinkling and fracture

    International Nuclear Information System (INIS)

    Zhang Wenfeng; Shivpuri, Rajiv

    2009-01-01

    Often, sheet drawing processes are designed to provide the geometry of the final part, and then the process parameters such as blank dimensions, blank holder forces (BHFs), press strokes and interface friction are designed and controlled to provide the greatest drawability (largest depth of draw without violating the wrinkling and thinning constraints). The exclusion of inherent process variations in this design can often lead to process designs that are unreliable and uncontrollable. In this paper, a general multi-criteria design approach is presented to quantify the uncertainties and to incorporate them into the response surface method (RSM) based model so as to conduct probabilistic optimization. A surrogate RSM model of the process mechanics is generated using FEM-based high-fidelity models and design of experiments (DOEs), and a simple linear weighted approach is used to formulate the objective function or the quality index (QI). To demonstrate this approach, deep drawing of an aluminum Hishida part is analyzed. With the predetermined blank shape, tooling design and fixed drawing depth, a probabilistic design (PD) is successfully carried out to find the optimal combination of BHF and friction coefficient under variation of material properties. The results show that with the probabilistic approach, the QI improved by 42% over the traditional deterministic design (DD). It also shows that by further reducing the variation of friction coefficient to 2%, the QI will improve further to 98.97%

  17. A probabilistic approach for RIA fuel failure criteria

    International Nuclear Information System (INIS)

    Carlo Vitanza, Dr.

    2008-01-01

    Substantial experimental data have been produced in support of the definition of the RIA safety limits for water reactor fuels at high burnup. Based on these data, fuel failure enthalpy limits can be derived by methods of varying complexity. However, regardless of sophistication, it is unlikely that any deterministic approach would result in perfect predictions of all failure and non-failure data obtained in RIA tests. Accordingly, a probabilistic approach is proposed in this paper where, in addition to a best estimate evaluation of the failure enthalpy, an RIA fuel failure probability distribution is defined within an enthalpy band surrounding the best estimate failure enthalpy. The band width and the failure probability distribution within this band are determined on the basis of the whole data set, including failure and non-failure data, and accounting for the actual scatter of the database. The present probabilistic approach can be used in conjunction with any deterministic model or correlation. For deterministic models or correlations having good prediction capability, the probability distribution will increase sharply within a narrow band around the best estimate value. For deterministic predictions of lower quality, the resulting probability distribution will instead be broad and coarse.

  18. Effects of calcium and magnesium on strontium distribution coefficients

    Science.gov (United States)

    Bunde, R.L.; Rosentreter, J.J.; Liszewski, M.J.; Hemming, C.H.; Welhan, J.

    1997-01-01

    The effects of calcium and magnesium on the distribution of strontium between a surficial sediment and simulated wastewater solutions were measured as part of an investigation to determine strontium transport properties of surficial sediment at the Idaho National Engineering Laboratory (INEL), Idaho. The investigation was conducted by the U.S. Geological Survey and Idaho State University, in cooperation with the U.S. Department of Energy. Batch experimental techniques were used to determine strontium linear sorption isotherms and distribution coefficients (K(d)'s) using simulated wastewater solutions prepared at pH 8.0±0.1 with variable concentrations of calcium and magnesium. Strontium linear sorption isotherm K(d)'s ranged from 12±1 to 85±3 ml/g, increasing as the concentration of calcium and magnesium decreased. The concentration of sorbed strontium and the percentage of strontium retained by the sediment were correlated to aqueous concentrations of strontium, calcium, and magnesium. The effect of these cation concentrations on strontium sorption was quantified using multivariate least-squares regression techniques. Analysis of data from these experiments indicates that increased concentrations of calcium and magnesium in wastewater discharged to waste disposal ponds at the INEL increases the availability of strontium for transport beneath the ponds by decreasing strontium sorption to the surficial sediment.

  19. An exponential distribution

    International Nuclear Information System (INIS)

    Anon

    2009-01-01

    In this presentation the author deals with the probabilistic evaluation of product life using the example of the exponential distribution. The exponential distribution is a special one-parameter case of the Weibull distribution.

  20. Simplified probabilistic approach to determine safety factors in deterministic flaw acceptance criteria

    International Nuclear Information System (INIS)

    Barthelet, B.; Ardillon, E.

    1997-01-01

    The flaw acceptance rules for nuclear components rely on deterministic criteria intended to ensure the safe operation of plants. The interest of having a reliable method for evaluating the safety margins and the integrity of components led Electricite de France to launch a study linking safety factors with the requested reliability. A simplified analytical probabilistic approach is developed to analyse the failure risk in fracture mechanics. Assuming lognormal distributions of the main random variables, it is possible, considering a simple linear elastic fracture mechanics model, to determine the failure probability as a function of the mean values and logarithmic standard deviations. The 'design' failure point can be calculated analytically. Partial safety factors on the main variables (stress, crack size, material toughness) are obtained in relation to reliability target values. The approach is generalized to elastic-plastic fracture mechanics (piping) by fitting J as a power-law function of stress, crack size and yield strength. The simplified approach is validated by detailed probabilistic computations with the PROBAN computer program. Assuming reasonable coefficients of variation (logarithmic standard deviations), the method helps to calibrate safety factors for different components, taking into account reliability target values in normal, emergency and faulted conditions. Statistical data on the mechanical properties of the main basic materials complement the study; the work involves laboratory results and manufacturing data. The results of this study are discussed within a working group of the French in-service inspection code RSE-M. (authors)
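
    The lognormal assumption makes the failure probability analytic: with an applied stress intensity K = Y·S·sqrt(πa) multiplicative in lognormal variables, the log margin ln(K_Ic) - ln(K) is normal, so P_f = Φ(-β). The sketch below uses illustrative numbers and a deterministic geometry factor Y; it is a reading of the simplified approach, not EDF's calibration.

        import numpy as np
        from scipy.stats import norm

        def lognorm_params(mean, cov):
            """(mu, sigma) of ln X from the mean and coefficient of variation."""
            sigma = np.sqrt(np.log(1 + cov**2))
            return np.log(mean) - 0.5 * sigma**2, sigma

        mu_S, s_S = lognorm_params(200.0, 0.10)   # stress, MPa (assumed)
        mu_a, s_a = lognorm_params(0.004, 0.30)   # crack depth, m (assumed)
        mu_K, s_K = lognorm_params(150.0, 0.15)   # toughness K_Ic, MPa*sqrt(m)
        Y = 1.12                                  # deterministic geometry factor

        # ln K_applied = ln Y + ln S + 0.5*ln(pi) + 0.5*ln a  (normal in ln-space)
        mu_app = np.log(Y) + mu_S + 0.5 * np.log(np.pi) + 0.5 * mu_a
        s_app = np.sqrt(s_S**2 + (0.5 * s_a) ** 2)

        beta = (mu_K - mu_app) / np.hypot(s_K, s_app)
        print(f"reliability index beta = {beta:.2f}, P_f = {norm.cdf(-beta):.2e}")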

  1. Effects of methamphetamine administration on information gathering during probabilistic reasoning in healthy humans.

    Science.gov (United States)

    Ermakova, Anna O; Ramachandra, Pranathi; Corlett, Philip R; Fletcher, Paul C; Murray, Graham K

    2014-01-01

    Jumping to conclusions (JTC) during probabilistic reasoning is a cognitive bias repeatedly demonstrated in people with schizophrenia and shown to be associated with delusions. Little is known about the neurochemical basis of probabilistic reasoning. We tested the hypothesis that catecholamines influence data gathering and probabilistic reasoning by administering intravenous methamphetamine, which is known to cause synaptic release of the catecholamines noradrenaline and dopamine, to healthy humans whilst they undertook a probabilistic inference task. Our study used a randomised, double-blind, cross-over design. Seventeen healthy volunteers on three visits were administered either placebo or methamphetamine or methamphetamine preceded by amisulpride. In all three conditions participants performed the "beads" task in which participants decide how much information to gather before making a probabilistic inference, and which measures the cognitive bias towards jumping to conclusions. Psychotic symptoms triggered by methamphetamine were assessed using Comprehensive Assessment of At-Risk Mental States (CAARMS). Methamphetamine induced mild psychotic symptoms, but there was no effect of drug administration on the number of draws to decision (DTD) on the beads task. DTD was a stable trait that was highly correlated within subjects across visits (intra-class correlation coefficients of 0.86 and 0.91 on two versions of the task). The less information was sampled in the placebo condition, the more psychotic-like symptoms the person had after the methamphetamine plus amisulpride condition (p = 0.028). Our results suggest that information gathering during probabilistic reasoning is a stable trait, not easily modified by dopaminergic or noradrenergic modulation.

  2. Probabilistic predictive modelling of carbon nanocomposites for medical implants design.

    Science.gov (United States)

    Chua, Matthew; Chui, Chee-Kong

    2015-04-01

    Modelling of the mechanical properties of carbon nanocomposites based on input variables like percentage weight of Carbon Nanotube (CNT) inclusions is important for the design of medical implants and other structural scaffolds. Current constitutive models for the mechanical properties of nanocomposites may not predict well due to differences in conditions, fabrication techniques and inconsistencies in reagent properties used across industries and laboratories. Furthermore, the mechanical properties of the designed products are not deterministic, but exist as a probabilistic range. A predictive model based on a modified probabilistic surface response algorithm is proposed in this paper to address this issue. Tensile testing of three groups of carbon nanocomposite samples with different CNT weight fractions displays scattered stress-strain curves, with the instantaneous stresses assumed to vary according to a normal distribution at a specific strain. From the probabilistic density function of the experimental data, a two-factor central composite design (CCD) experimental matrix, based on strain and CNT weight fraction inputs with their corresponding stress distributions, was established. Monte Carlo simulation was carried out on this design matrix to generate a predictive probabilistic polynomial equation. The equation and method were subsequently validated with more tensile experiments and Finite Element (FE) studies. The method was subsequently demonstrated in the design of an artificial tracheal implant. Our algorithm provides an effective way to accurately model the mechanical properties of implants of various compositions based on experimental data of samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. AN EMPIRICAL INVESTIGATION OF THE EFFECTS OF NONNORMALITY UPON THE SAMPLING DISTRIBUTION OF THE PRODUCT MOMENT CORRELATION COEFFICIENT.

    Science.gov (United States)

    HJELM, HOWARD; NORRIS, RAYMOND C.

    THE STUDY EMPIRICALLY DETERMINED THE EFFECTS OF NONNORMALITY UPON SOME SAMPLING DISTRIBUTIONS OF THE PRODUCT MOMENT CORRELATION COEFFICIENT (PMCC). SAMPLING DISTRIBUTIONS OF THE PMCC WERE OBTAINED BY DRAWING NUMEROUS SAMPLES FROM CONTROL AND EXPERIMENTAL POPULATIONS HAVING VARIOUS DEGREES OF NONNORMALITY AND BY CALCULATING CORRELATION COEFFICIENTS…

  4. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    Science.gov (United States)

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  5. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  6. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  7. A probabilistic approach to combining smart meter and electric vehicle charging data to investigate distribution network impacts

    International Nuclear Information System (INIS)

    Neaimeh, Myriam; Wardle, Robin; Jenkins, Andrew M.; Yi, Jialiang; Hill, Graeme; Lyons, Padraig F.; Hübner, Yvonne; Blythe, Phil T.; Taylor, Phil C.

    2015-01-01

    Highlights: • Working with unique datasets of EV charging and smart meter load demand. • Distribution networks are not a homogenous group with more capabilities to accommodate EVs than previously suggested. • Spatial and temporal diversity of EV charging demand alleviate the impacts on networks. • An extensive recharging infrastructure could enable connection of additional EVs on constrained distribution networks. • Electric utilities could increase the network capability to accommodate EVs by investing in recharging infrastructure. - Abstract: This work uses a probabilistic method to combine two unique datasets of real world electric vehicle charging profiles and residential smart meter load demand. The data was used to study the impact of the uptake of Electric Vehicles (EVs) on electricity distribution networks. Two real networks representing an urban and rural area, and a generic network representative of a heavily loaded UK distribution network were used. The findings show that distribution networks are not a homogeneous group with a variation of capabilities to accommodate EVs and there is a greater capability than previous studies have suggested. Consideration of the spatial and temporal diversity of EV charging demand has been demonstrated to reduce the estimated impacts on the distribution networks. It is suggested that distribution network operators could collaborate with new market players, such as charging infrastructure operators, to support the roll out of an extensive charging infrastructure in a way that makes the network more robust; create more opportunities for demand side management; and reduce planning uncertainties associated with the stochastic nature of EV charging demand.

  8. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
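
    To make the weight calculation concrete, here is a minimal Python sketch of Fellegi-Sunter-style match weights and the Bayes-theorem conversion to a posterior match probability. All m- and u-probabilities, field names, and the prior are hypothetical illustration values, not taken from the article.

      import math

      # Hypothetical m- and u-probabilities for two comparison fields:
      # m = P(fields agree | records truly match), u = P(fields agree | non-match)
      fields = {"surname": (0.95, 0.01), "birth_year": (0.90, 0.05)}

      def match_weight(agreements):
          """Sum of log2 agreement/disagreement weights over all fields."""
          w = 0.0
          for field, (m, u) in fields.items():
              if agreements[field]:
                  w += math.log2(m / u)               # agreement weight
              else:
                  w += math.log2((1 - m) / (1 - u))   # disagreement weight
          return w

      def posterior_match_prob(weight, prior):
          """Convert a total match weight to P(match | data) via Bayes theorem."""
          prior_odds = prior / (1 - prior)
          posterior_odds = prior_odds * 2 ** weight   # weights are log2 likelihood ratios
          return posterior_odds / (1 + posterior_odds)

      w = match_weight({"surname": True, "birth_year": False})
      print(w, posterior_match_prob(w, prior=0.001))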

  9. Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software

    Science.gov (United States)

    Hunter, George; Boisvert, Benjamin

    2013-01-01

    This document is the final report for the project entitled "Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software." This report consists of 17 sections which document the results of the several subtasks of this effort. The Probabilistic NAS Platform (PNP) is an air operations simulation platform developed and maintained by the Saab Sensis Corporation. The improvements made to the PNP simulation include the following: an airborne distributed separation assurance capability, a required time of arrival assignment and conformance capability, and a tactical and strategic weather avoidance capability.

  10. Probabilistic, sediment-geochemical parameterisation of the groundwater compartment of the Netherlands for spatially distributed, reactive transport modelling

    Science.gov (United States)

    Janssen, Gijs; Gunnink, Jan; van Vliet, Marielle; Goldberg, Tanya; Griffioen, Jasper

    2017-04-01

    Pollution of groundwater aquifers with contaminants such as nitrate is a common problem. Reactive transport models are useful to predict the fate of such contaminants and to characterise the efficiency of mitigating or preventive measures. Parameterisation of a groundwater transport model on reaction capacity is a necessary step in building the model. Two Dutch national programs are combined to establish a methodology for building a probabilistic model of the reaction capacity of the groundwater compartment at the national scale: the Geological Survey program and the NHI Netherlands Hydrological Instrument program. Reaction capacity is considered as a series of geochemical characteristics that control acid/base condition, redox condition and sorption capacity. Five primary reaction capacity variables are characterised: 1. pyrite, 2. non-pyrite reactive iron (oxides, siderite and glauconite), 3. clay fraction, 4. organic matter and 5. Ca-carbonate. Important reaction capacity variables that are determined by more than one solid compound are also deduced: 1. potential reduction capacity (PRC) by pyrite and organic matter, 2. cation-exchange capacity (CEC) by organic matter and clay content, 3. carbonate buffering upon pyrite oxidation (CPBO) by carbonate and pyrite. Statistical properties of these variables are established based on c. 16,000 sediment geochemical analyses. The first tens of meters are characterised based on 25 regions, using combinations of lithological class and geological formation as strata. Because of both less data and more geochemical uniformity, the deeper subsurface is characterised in a similar way based on 3 regions. The statistical data is used as input in an algorithm that probabilistically calculates the reaction capacity per grid cell, as sketched below. First, the cumulative frequency distribution (cfd) functions are calculated from the statistical data for the geochemical strata. Second, all voxel cells are classified into the geochemical strata. Third, the
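
    A minimal sketch of the probabilistic per-cell calculation described above: empirical cumulative frequency distributions per geochemical stratum, inverted to draw values per voxel. The stratum keys and the synthetic analysis values are assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical geochemical analyses per stratum (e.g. pyrite content, wt%);
      # in the study these would come from the ~16,000 sediment analyses.
      analyses = {
          ("sand", "FormationA"): rng.lognormal(-2.0, 0.8, 500),
          ("clay", "FormationA"): rng.lognormal(-1.0, 0.5, 500),
      }

      def sample_voxel(stratum, n=1):
          """Draw reaction-capacity values for a voxel by inverting the
          empirical cumulative frequency distribution of its stratum."""
          data = analyses[stratum]
          u = rng.uniform(0.0, 1.0, n)
          return np.quantile(data, u)   # empirical inverse CDF

      print(sample_voxel(("clay", "FormationA"), 3))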

  11. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic resistant buildings, but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
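
    The Green's function summation described above is linear, so a sketch is short. The stand-in waveforms and slip values below are hypothetical; in practice G would hold the pre-computed coastal waveforms for unit slip on each subfault.

      import numpy as np

      n_subfaults, n_samples = 4, 1000
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 3600.0, n_samples)   # time, seconds

      # Stand-in unit-slip subfault waveforms G[i, t] (hypothetical shapes).
      G = np.sin(np.outer(rng.uniform(0.002, 0.01, n_subfaults), t))

      def synthesize(slip):
          """Tsunami waveform for an arbitrary slip distribution: a slip-weighted
          sum of the stored unit-slip subfault waveforms (linearity assumption)."""
          return slip @ G

      wave = synthesize(np.array([1.0, 2.5, 0.5, 0.0]))   # slip in metres
      print(wave.max())   # peak amplitude that would feed the hazard integral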

  12. A hybrid of ant colony optimization and artificial bee colony algorithm for probabilistic optimal placement and sizing of distributed energy resources

    International Nuclear Information System (INIS)

    Kefayat, M.; Lashkar Ara, A.; Nabavi Niaki, S.A.

    2015-01-01

    Highlights: • A probabilistic optimization framework incorporating uncertainty is proposed. • A hybrid optimization approach combining ACO and ABC algorithms is proposed. • The problem deals with technical, environmental and economical aspects. • A fuzzy interactive approach is incorporated to solve the multi-objective problem. • Several strategies are implemented and compared with literature methods. - Abstract: In this paper, a hybrid configuration of ant colony optimization (ACO) with artificial bee colony (ABC) algorithm, called the hybrid ACO–ABC algorithm, is presented for optimal location and sizing of distributed energy resources (DERs) (i.e., gas turbine, fuel cell, and wind energy) on distribution systems. The proposed algorithm is a combined strategy based on discrete (location optimization) and continuous (size optimization) structures, to exploit the global and local search abilities of the ABC and ACO algorithms, respectively. Also, in the proposed algorithm, a multi-objective ABC is used to produce a set of non-dominated solutions which are stored in an external archive. The objectives consist of minimizing power losses, total emissions produced by substation and resources, and total electrical energy cost, and improving the voltage stability. In order to investigate the impact of the uncertainty in the output of the wind energy and the load demands, a probabilistic load flow is necessary. In this study, an efficient point estimate method (PEM) is employed to solve the optimization problem in a stochastic environment. The proposed algorithm is tested on the IEEE 33- and 69-bus distribution systems. The results demonstrate the potential and effectiveness of the proposed algorithm in comparison with those of other evolutionary optimization methods.

  13. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was developed. Based on the developed system, we developed a probabilistic internal-dose-assessment code by using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various
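
    A minimal Monte Carlo sketch of the uncertainty propagation described above, from a measured bioassay value to a committed dose distribution with the quoted percentiles. The distributions, intake retention fraction, and dose coefficient are illustrative assumptions, not values from the code being described.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # Hypothetical uncertainty model (all distribution choices are assumptions):
      measured = rng.normal(50.0, 5.0, n)               # bioassay activity, Bq
      irf = rng.lognormal(np.log(0.02), 0.3, n)         # intake retention fraction
      dose_coeff = rng.lognormal(np.log(2e-8), 0.4, n)  # Sv per Bq of intake

      intake = measured / irf        # propagated intake activity, Bq
      dose = intake * dose_coeff     # committed dose, Sv

      # The percentiles named in the abstract:
      print(np.percentile(dose, [2.5, 5, 50, 95, 97.5]))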

  14. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was developed. Based on the developed system, we developed a probabilistic internal-dose-assessment code by using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases

  15. A probabilistic approach to crack instability

    Science.gov (United States)

    Chudnovsky, A.; Kunin, B.

    1989-01-01

    A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.

  16. Distribution coefficients for chemical components of a coal-oil/water system

    Energy Technology Data Exchange (ETDEWEB)

    Picel, K C; Stamoudis, V C; Simmons, M S

    1988-09-01

    Distribution coefficients (K_D) were measured by equilibrating a coal oil comparative reference material (CRM-1) with water and then separating the oil and water phases. Aqueous phase concentrations were determined by direct analysis of this phase, while organic phase concentrations were determined from the original oil composition by difference. The log K_D values obtained for acidic and basic components were generally <3, while those for the neutral components ranged from 3 to 6. For aromatic hydrocarbons, strong correlations were observed between log K_D and log S_w (water solubility), and between log K_D and log K_ow (octanol/water partition coefficient). Alkylated benzenes had significantly higher K_D values than did unsubstituted aromatics of similar molecular weight. Examination of homologs revealed an increase of 0.307 log K_D units per additional carbon atom for polynuclear aromatic hydrocarbons having from 10 to 16 carbons. Alkyl substituent effects determined for various sets of homologs ranged from 0.391 to 0.466 log K_D units per -CH2- group added. 38 refs., 5 figs., 7 tabs.
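
    The reported homolog trend lends itself to a one-line estimator; a minimal sketch follows. The reference compound values below are hypothetical; only the 0.307 slope comes from the abstract.

      # Illustrative use of the reported homolog trend: log K_D increases by
      # ~0.307 units per additional carbon for 10-16 carbon PAHs.
      def log_kd_pah(n_carbons, log_kd_ref=4.0, n_ref=10, slope=0.307):
          """Estimate log K_D of a PAH homolog from a reference compound.
          log_kd_ref and n_ref are hypothetical reference values."""
          return log_kd_ref + slope * (n_carbons - n_ref)

      print(log_kd_pah(14))   # e.g. a C14 PAH -> 4.0 + 0.307*4 = 5.228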

  17. The Effect of Nonzero Autocorrelation Coefficients on the Distributions of Durbin-Watson Test Estimator: Three Autoregressive Models

    Directory of Open Access Journals (Sweden)

    Mei-Yu LEE

    2014-11-01

    Full Text Available This paper investigates the effect of nonzero autocorrelation coefficients on the sampling distributions of the Durbin-Watson test estimator in three time-series models that have different variance-covariance matrix assumptions. We show that the expected values and variances of the Durbin-Watson test estimator are slightly different, but that the skewness and kurtosis coefficients are considerably different, among the three models. The shapes of the four coefficients are similar between the Durbin-Watson model and our benchmark model, but not the same as those of the autoregressive model cut by one lagged period. Second, the large-sample case shows that the three models have the same expected values; however, the autoregressive model cut by one lagged period exhibits different shapes of the variance, skewness and kurtosis coefficients from the other two models. This implies that large samples lead to the same expected value, 2(1 – ρ0), whatever the variance-covariance matrix of the errors is assumed to be. Finally, comparing the two sample cases, the shape of each coefficient is almost the same; moreover, the autocorrelation coefficients are negatively related to the expected values, inverted-U related to the variances, cubic related to the skewness coefficients, and U related to the kurtosis coefficients.
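
    For reference, the Durbin-Watson statistic and the large-sample expectation quoted above can be written, in standard notation (restated here for clarity), as

        d = \frac{\sum_{t=2}^{T} (e_t - e_{t-1})^2}{\sum_{t=1}^{T} e_t^2},
        \qquad
        \lim_{T \to \infty} \mathbb{E}[d] = 2\,(1 - \rho_0),

    where e_t are the regression residuals and ρ0 is the first-order autocorrelation coefficient of the errors.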

  18. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being unnecessary. To this e...

  19. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probability distributions of load demand and distributed generation are derived from historical measurement data and incorporated into the system planning. Simulations with various feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that a better planning can be achieved.
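
    A minimal sketch of the planning idea: sample load and generation scenarios from empirical distributions of historical measurements instead of a single worst case. The synthetic histories below stand in for the Støvring measurement data, and the independent sampling ignores any load-generation correlation that a full study would model.

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical historical measurements (kW) for one feeder and one DG unit.
      load_hist = rng.gamma(9.0, 40.0, 8760)     # stand-in smart-meter history
      wind_hist = rng.weibull(2.0, 8760) * 300   # stand-in DG production history

      def sample_scenarios(history, n):
          """Draw planning scenarios from the empirical distribution of the
          measurements rather than from a single worst-case value."""
          return np.quantile(history, rng.uniform(0.0, 1.0, n))

      net = sample_scenarios(load_hist, 1000) - sample_scenarios(wind_hist, 1000)
      print(np.percentile(net, [5, 50, 95]))   # net demand range for planning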

  20. Correlation functions for the distribution coefficients of U(IV) and Pu(III) ions between aqueous nitric acid and 30% TBP in an aliphatic diluent

    International Nuclear Information System (INIS)

    Geldard, J.F.; Beyerlein, A.L.; Phillips, L.

    1985-01-01

    Distribution coefficient correlations for U(IV) and Pu(III) are obtained in terms of a modified form of the total nitrate ion salting strength that was successfully used to obtain distribution coefficient correlations for U(VI) and Pu(IV) in the earlier work of G.L. Richardson. The modification of salting strength was needed to account for the fact that the U(IV) distribution coefficients measured under conditions where U(VI) is present consistently fall below those obtained when it is absent. The correlations were incorporated into the mixer-settler computer model PUBG, and in the simulation of a 20-stage 1B partitioning contactor, calculated product stream concentrations were in excellent agreement with experiment. Earlier mixer-settler computer models, which failed to account for U(IV) distribution coefficients, predicted that U(IV) remained in the aqueous product stream, which is contrary to the experimental measurements

  1. Correlation functions for the distribution coefficients of U(IV) and Pu(III) ions between aqueous nitric acid and 30% TBP in an aliphatic diluent

    Energy Technology Data Exchange (ETDEWEB)

    Geldard, J.F.; Beyerlein, A.L.; Phillips, L.

    1985-09-01

    Distribution coefficient correlations for U(IV) and Pu(III) are obtained in terms of a modified form of the total nitrate ion salting strength that was successfully used to obtain distribution coefficient correlations for U(VI) and Pu(IV) in the earlier work of G.L. Richardson. The modification of salting strength was needed to account for the fact that the U(IV) distribution coefficients measured under conditions where U(VI) is present consistently fall below those obtained when it is absent. The correlations were incorporated into the mixer-settler computer model PUBG, and in the simulation of a 20-stage 1B partitioning contactor, calculated product stream concentrations were in excellent agreement with experiment. Earlier mixer-settler computer models, which failed to account for U(IV) distribution coefficients, predicted that U(IV) remained in the aqueous product stream, which is contrary to the experimental measurements.

  2. Exact and approximate probabilistic symbolic execution for nondeterministic programs

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Păsăreanu, Corina S.; Dwyer, Matthew B.

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also ... Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  3. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  4. Probabilistic cost estimating of nuclear power plant construction projects

    International Nuclear Information System (INIS)

    Finch, W.C.; Perry, L.W.; Postula, F.D.

    1978-01-01

    This paper shows how to identify and isolate cost accounts by developing probability trees down to component levels, as justified by value and cost uncertainty. Examples are given of the procedure for assessing uncertainty in all areas contributing to cost: design, factory equipment pricing, and field labor and materials. The method of combining these individual uncertainties is presented so that the cost risk can be developed for components, systems and the total plant construction project. Formats which enable management to use the probabilistic cost estimate information for business planning and risk control are illustrated. Topics considered include code estimate performance, cost allocation, uncertainty encoding, probabilistic cost distributions, and interpretation. Effective cost control of nuclear power plant construction projects requires insight into the areas of greatest cost uncertainty and a knowledge of the factors which can cause costs to vary from the single-value estimates. It is concluded that probabilistic cost estimating can provide the necessary assessment of uncertainties, both as to their causes and their consequences.
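
    A minimal Monte Carlo sketch of combining per-account cost uncertainties into a total-cost risk distribution, in the spirit of the probability trees described above. The accounts, distribution shapes, and figures are all hypothetical.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 50_000

      # Hypothetical cost accounts with uncertainty (all figures illustrative, M$):
      design = rng.triangular(40, 50, 75, n)                 # min, mode, max
      equipment = rng.lognormal(np.log(300), 0.15, n)        # factory pricing
      labor_materials = rng.lognormal(np.log(500), 0.25, n)  # field labor/materials

      # Combining the individual uncertainties gives the project cost risk.
      total = design + equipment + labor_materials
      print(np.percentile(total, [10, 50, 90]))   # cost-risk percentiles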

  5. Probabilistic finite element investigation of prestressing loss in nuclear containment wall segments

    International Nuclear Information System (INIS)

    Balomenos, Georgios P.; Pandey, Mahesh D.

    2017-01-01

    Highlights: • Probabilistic finite element framework for assessing concrete strain distribution. • Investigation of prestressing loss based on concrete strain distribution. • Application to 3D nuclear containment wall segments. • Use of ABAQUS with Python programming for Monte Carlo simulation. - Abstract: The main function of concrete containment structures is to prevent radioactive leakage to the environment in case of a loss of coolant accident (LOCA). The Canadian Standard CSA N287.6 (2011) proposes periodic inspections, i.e., pressure testing, in order to assess the strength and design criteria of the containment (proof test) and the leak tightness of the containment boundary (leakage rate test). During these tests, the concrete strains are measured and are expected to have a distribution due to several uncertainties. Therefore, this study aims to propose a probabilistic finite element analysis framework, and then investigates the relationship between the concrete strains and the prestressing loss, in order to examine the possibility of estimating the average prestressing loss during pressure testing inspections. The results indicate that the concrete strain measurements during the leakage rate test may provide information with respect to the prestressing loss of the bonded system. In addition, the demonstrated framework can be further used for the probabilistic finite element analysis of real-scale containments.

  6. Probabilistic finite element investigation of prestressing loss in nuclear containment wall segments

    Energy Technology Data Exchange (ETDEWEB)

    Balomenos, Georgios P., E-mail: gbalomen@uwaterloo.ca; Pandey, Mahesh D., E-mail: mdpandey@uwaterloo.ca

    2017-01-15

    Highlights: • Probabilistic finite element framework for assessing concrete strain distribution. • Investigation of prestressing loss based on concrete strain distribution. • Application to 3D nuclear containment wall segments. • Use of ABAQUS with Python programming for Monte Carlo simulation. - Abstract: The main function of concrete containment structures is to prevent radioactive leakage to the environment in case of a loss of coolant accident (LOCA). The Canadian Standard CSA N287.6 (2011) proposes periodic inspections, i.e., pressure testing, in order to assess the strength and design criteria of the containment (proof test) and the leak tightness of the containment boundary (leakage rate test). During these tests, the concrete strains are measured and are expected to have a distribution due to several uncertainties. Therefore, this study aims to propose a probabilistic finite element analysis framework, and then investigates the relationship between the concrete strains and the prestressing loss, in order to examine the possibility of estimating the average prestressing loss during pressure testing inspections. The results indicate that the concrete strain measurements during the leakage rate test may provide information with respect to the prestressing loss of the bonded system. In addition, the demonstrated framework can be further used for the probabilistic finite element analysis of real-scale containments.

  7. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    Science.gov (United States)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy for complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
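
    A minimal sketch of the collocation idea using a one-dimensional Hermite expansion: sample a standard normal variable, evaluate a stand-in model at the collocation points, fit the Hermite coefficients, and reuse the expansion as a cheap surrogate. The model and the parameter distribution are assumptions for illustration.

      import numpy as np
      from numpy.polynomial import hermite_e as He   # probabilists' Hermite basis

      rng = np.random.default_rng(5)

      def model(k):
          """Stand-in hydrological model response to an uncertain parameter k."""
          return np.exp(-k) + 0.1 * k**2

      # Represent the uncertain parameter as a function of xi ~ N(0, 1).
      xi = rng.standard_normal(200)   # collocation samples
      k = 1.0 + 0.3 * xi              # parameter with assumed mean and sd

      coeffs = He.hermefit(xi, model(k), deg=4)   # PCE coefficients

      # The fitted expansion is a cheap surrogate: evaluate it on fresh samples.
      xi_new = rng.standard_normal(100_000)
      y = He.hermeval(xi_new, coeffs)
      print(coeffs[0], y.mean(), y.std())   # c0 approximates the output mean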

  8. Deterministic and Probabilistic Serviceability Assessment of Footbridge Vibrations due to a Single Walker Crossing

    Directory of Open Access Journals (Sweden)

    Cristoforo Demartino

    2018-01-01

    Full Text Available This paper presents a numerical study on the deterministic and probabilistic serviceability assessment of footbridge vibrations due to a single walker crossing. The dynamic response of the footbridge is analyzed by means of modal analysis, considering only the first lateral and vertical modes. Single span footbridges with uniform mass distribution are considered, with different values of the span length, natural frequencies, mass, and structural damping and with different support conditions. The load induced by a single walker crossing the footbridge is modeled as a moving sinusoidal force either in the lateral or in the vertical direction. The variability of the characteristics of the load induced by walkers is modeled using probability distributions taken from the literature defining a Standard Population of walkers. Deterministic and probabilistic approaches were adopted to assess the peak response. Based on the results of the simulations, deterministic and probabilistic vibration serviceability assessment methods are proposed, not requiring numerical analyses. Finally, an example of the application of the proposed method to a truss steel footbridge is presented. The results highlight the advantages of the probabilistic procedure in terms of reliability quantification.

  9. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Cui, Mingjian [University of Texas at Dallas]; Feng, Cong [University of Texas at Dallas]; Wang, Zhenke [University of Texas at Dallas]; Zhang, Jie [University of Texas at Dallas]

    2018-02-01

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start-time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
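
    A minimal sketch of the error-modelling chain described above: fit a GMM to historical forecasting errors, build the CDF numerically, and generate error scenarios by inverse transform sampling. The error data and the point forecast are synthetic stand-ins.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(11)

      # Stand-in historical wind power forecasting errors (MW).
      errors = np.concatenate([rng.normal(-5, 3, 3000), rng.normal(8, 6, 1000)])

      gmm = GaussianMixture(n_components=2).fit(errors.reshape(-1, 1))

      # Numerical CDF of the fitted mixture on a dense grid.
      grid = np.linspace(errors.min() - 10, errors.max() + 10, 5000)
      pdf = np.exp(gmm.score_samples(grid.reshape(-1, 1)))
      cdf = np.cumsum(pdf)
      cdf /= cdf[-1]

      # Inverse transform sampling: uniform draws mapped through the CDF.
      scenarios = np.interp(rng.uniform(0, 1, 10_000), cdf, grid)

      forecast = 120.0   # MW, hypothetical point forecast for one interval
      print(np.percentile(forecast + scenarios, [10, 50, 90]))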

  10. Probabilistic Representations of Solutions to the Heat Equation

    Indian Academy of Sciences (India)

    In this paper we provide a new (probabilistic) proof of a classical result in partial differential equations, viz. if the initial condition is a tempered distribution, then the solution of the heat equation for the Laplacian is given by the convolution of the initial condition with the heat kernel (Gaussian density). Our results also extend the ...

  11. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  12. Monitoring device for local power peaking coefficients

    International Nuclear Information System (INIS)

    Mihashi, Ishi

    1987-01-01

    Purpose: To determine and monitor the local power peaking coefficients by a method that does not depend on the combination of fuel types. Constitution: Representative values for the local power distribution can be obtained by determining the corresponding burn-up degrees, based on the burn-up degree of each fuel assembly segment obtained from a power distribution monitor, and by interpolation and extrapolation of the void coefficients. The representative values are multiplied by compensation coefficients for the control rod effect and by coefficients compensating for the effect of adjacent fuel assemblies in a calculation device, to obtain representative values for the present local power distribution compensated for all of these effects. Further, the calculation device compares these representative values of the present local power distribution to obtain the target local power peaking coefficient as their maximum value. According to the present invention, since the local power peaking coefficients can be determined independently of the combination of fuel types, the amount of computation does not increase even if the number of fuel assembly combinations increases upon fuel exchange. (Kamimura, M.)

  13. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity

    Science.gov (United States)

    Pecevski, Dejan

    2016-01-01

    Abstract Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  14. Probabilistic Teleportation of an Arbitrary Two-Atom State in Cavity QED

    Institute of Scientific and Technical Information of China (English)

    LIU Jin-Ming

    2007-01-01

    We propose a scheme for the teleportation of an arbitrary two-atom state by using two pairs of two-atom nonmaximally entangled states as the quantum channel in cavity QED. It is shown that no matter whether the arbitrary two-atom pure state to be teleported is entangled or not, our teleportation scheme can always be probabilistically realized. The success probability of teleportation is determined by the smaller coefficients of the two initially entangled atom pairs.

  15. Mixing zone and drinking water intake dilution factor and wastewater generation distributions to enable probabilistic assessment of down-the-drain consumer product chemicals in the U.S.

    Science.gov (United States)

    Kapo, Katherine E; McDonough, Kathleen; Federle, Thomas; Dyer, Scott; Vamshi, Raghu

    2015-06-15

    Environmental exposure and associated ecological risk related to down-the-drain chemicals discharged by municipal wastewater treatment plants (WWTPs) are strongly influenced by in-stream dilution of receiving waters, which varies by geography, flow conditions and upstream wastewater inputs. The iSTREEM® model (American Cleaning Institute, Washington D.C.) was utilized to determine probabilistic distributions for no-decay and decay-based dilution factors in mean annual and low (7Q10) flow conditions. The dilution factors derived in this study are "combined" dilution factors, which account for both hydrologic dilution and cumulative upstream effluent contributions that will differ depending on the rate of in-stream decay due to biodegradation, volatilization, sorption, etc. for the chemical being evaluated. The median dilution factors estimated in this study (based on various in-stream decay rates from zero decay to a 1 h half-life) for WWTP mixing zones dominated by domestic wastewater flow ranged from 132 to 609 at mean flow and 5 to 25 at low flow, while median dilution factors at drinking water intakes (mean flow) ranged from 146 to 2×10^7 depending on the in-stream decay rate. WWTPs within the iSTREEM® model were used to generate a distribution of per capita wastewater generated in the U.S. The dilution factor and per capita wastewater generation distributions developed by this work can be used to conduct probabilistic exposure assessments for down-the-drain chemicals in influent wastewater, wastewater treatment plant mixing zones and at drinking water intakes in the conterminous U.S. In addition, evaluation of the types and abundance of U.S. wastewater treatment processes provided insight into treatment trends and the flow volume treated by each type of process. Moreover, removal efficiencies of chemicals can differ by treatment type. Hence, the availability of distributions for per capita wastewater production, treatment type, and dilution factors at a national

  16. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)

  17. Probabilistic estimation of residential air exchange rates for ...

    Science.gov (United States)

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER measurements. An algorithm for probabilistically estimating AER was developed based on the Lawrence Berkeley National Laboratory Infiltration model, utilizing housing characteristics and meteorological data with adjustment for window opening behavior. The algorithm was evaluated by comparing modeled and measured AERs in four US cities (Los Angeles, CA; Detroit, MI; Elizabeth, NJ; and Houston, TX), inputting study-specific data. The impact on the modeled AER of using publically available housing data representative of the region for each city was also assessed. Finally, modeled AER based on region-specific inputs was compared with those estimated using literature-based distributions. While modeled AERs were similar in magnitude to the measured AERs, they were consistently lower for all cities except Houston. AERs estimated using region-specific inputs were lower than those using study-specific inputs due to differences in window opening probabilities. The algorithm produced more spatially and temporally variable AERs compared with literature-based distributions, reflecting within- and between-city differences and helping reduce error in estimates of air pollutant exposure. Published in the Journal of
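
    A minimal sketch of a probabilistic AER calculation in the spirit of the LBNL infiltration model (flow driven by stack and wind effects, divided by house volume). The leakage areas, weather inputs, and the stack/wind coefficients are assumed illustrative values, and the window-opening adjustment is omitted.

      import numpy as np

      rng = np.random.default_rng(13)
      n = 10_000

      # Hypothetical housing and weather inputs (all values illustrative):
      leakage_area = rng.lognormal(np.log(0.05), 0.4, n)   # effective leakage area, m^2
      volume = 350.0                                       # house volume, m^3
      dT = np.abs(rng.normal(10.0, 5.0, n))                # indoor-outdoor temp diff, K
      wind = rng.weibull(2.0, n) * 4.0                     # wind speed, m/s

      Cs, Cw = 0.000145, 0.000104   # stack/wind coefficients (assumed typical values)

      # Infiltration flow from stack and wind effects, converted to AER.
      flow = leakage_area * np.sqrt(Cs * dT + Cw * wind**2)   # m^3/s
      aer = flow / volume * 3600.0                            # air changes per hour

      print(np.percentile(aer, [5, 50, 95]))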

  18. Carrier Mediated Distribution System (CAMDIS): a new approach for the measurement of octanol/water distribution coefficients.

    Science.gov (United States)

    Wagner, Bjoern; Fischer, Holger; Kansy, Manfred; Seelig, Anna; Assmus, Frauke

    2015-02-20

    Here we present a miniaturized assay, referred to as Carrier-Mediated Distribution System (CAMDIS), for fast and reliable measurement of octanol/water distribution coefficients, log D(oct). By introducing a filter support for octanol, phase separation from water is facilitated and the tendency of emulsion formation (emulsification) at the interface is reduced. A guideline for the best practice of CAMDIS is given, describing a strategy to manage drug adsorption at the filter-supported octanol/buffer interface. We validated the assay on a set of 52 structurally diverse drugs with known shake flask log D(oct) values. Excellent agreement with literature data (r² = 0.996, standard error of estimate, SEE = 0.111), high reproducibility (standard deviation, SD < 0.1 log D(oct) units), minimal sample consumption (10 μL of 100 μM DMSO stock solution) and a broad analytical range (log D(oct) range = -0.5 to 4.2) make CAMDIS a valuable tool for the high-throughput assessment of log D(oct). Copyright © 2014 Elsevier B.V. All rights reserved.
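
    A minimal sketch of how a shake-flask-style log D(oct) falls out of a mass balance on the aqueous phase. The helper function and its example volumes and concentrations are hypothetical, and the filter-adsorption corrections that CAMDIS introduces are not modelled.

      import math

      def log_d_oct(c_aq_initial, c_aq_final, v_aq, v_oct):
          """Distribution coefficient from aqueous-phase measurements, by mass
          balance: whatever left the water is assumed to be in the octanol.
          (Hypothetical helper; adsorption corrections are omitted.)"""
          c_oct = (c_aq_initial - c_aq_final) * v_aq / v_oct
          return math.log10(c_oct / c_aq_final)

      # e.g. 100 uM start, 2 uM left in 150 uL water against 5 uL octanol:
      print(log_d_oct(100.0, 2.0, 150e-6, 5e-6))   # ~3.2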

  19. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  20. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  1. A probabilistic model of RNA conformational space

    DEFF Research Database (Denmark)

    Frellsen, Jes; Moltke, Ida; Thiim, Martin

    2009-01-01

    The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling ... the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows efficient sampling of RNA conformations in continuous space, and with associated probabilities. We show that the model captures several key features of RNA structure, such as its rotameric nature and the distribution of the helix lengths. Furthermore, the model readily generates native-like 3-D ...

  2. Vertical random variability of the distribution coefficient in the soil and its effect on the migration of fallout radionuclides

    International Nuclear Information System (INIS)

    Bunzl, K.

    2002-01-01

    In the field, the distribution coefficient, K_d, for the sorption of a radionuclide by the soil cannot be expected to be constant. Even in a well defined soil horizon, K_d will vary stochastically in the horizontal as well as in the vertical direction around a mean value. While the horizontal random variability of K_d produces a pronounced tailing effect in the concentration depth profile of a fallout radionuclide, much less is known about the corresponding effect of the vertical random variability. To analyze this effect theoretically, the classical convection-dispersion model in combination with the random-walk particle method was applied. The concentration depth profile of a radionuclide was calculated one year after deposition, assuming constant values of the pore water velocity and the diffusion/dispersion coefficient, and either a constant distribution coefficient (K_d = 100 cm³ g⁻¹) or a vertical variability of K_d according to a log-normal distribution with a geometric mean of 100 cm³ g⁻¹ and a coefficient of variation CV = 0.53. The results show that these two concentration depth profiles are only slightly different: the location of the peak is shifted somewhat upwards, and the dispersion of the concentration depth profile is slightly larger. A substantial tailing effect of the concentration depth profile is not perceivable. Especially with respect to the location of the peak, a very good approximation of the concentration depth profile is obtained if the arithmetic mean of the K_d values (K_d = 113 cm³ g⁻¹) and a slightly increased dispersion coefficient are used in the analytical solution of the classical convection-dispersion equation with constant K_d. The evaluation of the observed concentration depth profile with the analytical solution of the classical convection-dispersion equation with constant parameters will, within the usual experimental limits, hardly reveal the presence of a log-normal random distribution of K_d in the vertical direction in
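
    A minimal sketch of the random-walk particle method with a per-particle lognormal K_d, using the retardation factor R = 1 + ρ_b K_d / θ. The soil constants (bulk density, water content, velocity, dispersion) are assumptions; only the K_d statistics follow the setup above.

      import numpy as np

      rng = np.random.default_rng(17)
      n = 20_000

      v, D = 20.0, 5.0          # pore water velocity (cm/yr), dispersion (cm^2/yr)
      rho_b, theta = 1.3, 0.3   # bulk density (g/cm^3), volumetric water content

      # Each particle carries its own K_d drawn from the lognormal field
      # (geometric mean 100 cm^3/g; CV = 0.53 gives sigma of log of ~0.5).
      kd = rng.lognormal(np.log(100.0), 0.5, n)
      R = 1.0 + rho_b * kd / theta            # retardation factor per particle

      dt, t_end = 0.001, 1.0                  # years
      depth = np.zeros(n)
      for _ in range(int(t_end / dt)):
          drift = v / R * dt                              # retarded advection
          jump = np.sqrt(2 * D / R * dt) * rng.standard_normal(n)  # dispersion
          depth += drift + jump

      print(np.mean(depth), np.percentile(depth, [5, 50, 95]))   # cm after 1 yr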

  3. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A

  4. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    In natural hazard risk assessment, a resolution mismatch between hazard data and aggregated exposure data is often observed. A possible solution to this issue is the disaggregation of exposure data to match the spatial resolution of hazard data. Disaggregation models available in the literature are usually deterministic and make use of auxiliary indicators, such as land cover, to spatially distribute exposures. As the dependence between auxiliary indicator and disaggregated number of exposures is generally imperfect, uncertainty arises in disaggregation. This paper therefore proposes a probabilistic disaggregation model that considers the uncertainty in the disaggregation, taking basis in the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton Bern, Switzerland, subject to flood risk. Thereby, the model is verified ...
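
    A minimal sketch of probabilistic disaggregation with an ordinary Dirichlet distribution centred on an auxiliary indicator (the paper uses the more general scaled Dirichlet). All counts, shares, and the concentration parameter are hypothetical.

      import numpy as np

      rng = np.random.default_rng(19)

      # Aggregated exposure: 1200 buildings known only at the municipality level.
      total_buildings = 1200

      # Auxiliary indicator per hazard-resolution cell (e.g. residential
      # land-cover share); values are hypothetical.
      land_cover_share = np.array([0.50, 0.30, 0.15, 0.05])

      # Instead of fixed proportional shares, draw cell shares from a Dirichlet
      # centred on the indicator; the concentration parameter controls how
      # imperfect the indicator is assumed to be.
      concentration = 50.0
      shares = rng.dirichlet(concentration * land_cover_share, size=1000)
      counts = shares * total_buildings

      print(counts.mean(axis=0))                   # approx proportional allocation
      print(np.percentile(counts[:, 0], [5, 95]))  # uncertainty for cell 1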

  5. Understanding onsets of rainfall in Southern Africa using temporal probabilistic modelling

    CSIR Research Space (South Africa)

    Cheruiyot, D

    2010-12-01

    Full Text Available This research investigates an alternative approach to automatically evolve the hidden temporal distribution of onset of rainfall directly from multivariate time series (MTS) data in the absence of domain experts. Temporal probabilistic modelling...

  6. Systems analysis approach to probabilistic modeling of fault trees

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then quantitatively analyzed probabilistically in terms of their failure mechanisms, including common cause/common mode effects and time-dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a state vector set of first order, linear, inhomogeneous differential equations describing the time-dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems.
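
    A minimal sketch of the state-vector formulation dP/dt = A P for a two-component parallel (AND-gate) system with repair and an absorbing failed state. The failure and repair rates are hypothetical.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Two-component parallel system: states = (both up, one down, both down).
      lam, mu = 1e-3, 1e-1   # failure and repair rates, per hour (illustrative)

      def dP(t, P):
          """State-vector ODEs dP/dt = A P for the Markov failure/repair model."""
          A = np.array([[-2 * lam,          mu, 0.0],
                        [ 2 * lam, -(lam + mu), 0.0],
                        [     0.0,         lam, 0.0]])   # last state is absorbing
          return A @ P

      sol = solve_ivp(dP, (0.0, 1000.0), [1.0, 0.0, 0.0], t_eval=[1000.0])
      print(sol.y[2, -1])   # probability the top event has occurred by t = 1000 h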

  7. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  8. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that the uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
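
    A minimal micro-scale sketch of the bagging-decision-tree idea: the spread of predictions across the bagged trees yields a loss distribution rather than a point estimate. The training data are synthetic, and this is not an implementation of BT-FLEMO.

      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(23)
      n = 2000

      # Synthetic training data: water depth (m), duration (h), building value
      # index -> relative loss in [0, 1] (all relationships invented).
      X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(1, 72, n),
                           rng.uniform(0.2, 1.0, n)])
      y = np.clip(0.2 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.05, n), 0, 1)

      model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200).fit(X, y)

      # Per-tree predictions give a predictive distribution of loss, not just a
      # point estimate as with a stage-damage function.
      x_new = np.array([[1.5, 24.0, 0.8]])
      per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
      print(per_tree.mean(), np.percentile(per_tree, [5, 95]))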

  9. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subject to harsh conditions in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation, with implicit consequences for the evolution of material properties, also interferes. This leads inevitably to scatter in the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods to predict the structural component response. This work opens the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting with the deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose, the structure of the deterministic CANTUP computer code was reviewed. The code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subjected to probabilistic evaluation. All the values of these properties obtained for all the values for
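
    A minimal sketch of the Monte Carlo step described above, with an illustrative response model standing in for CANTUP: the Young's modulus is sampled from a normal distribution with a 5% standard deviation and propagated to a derived quantity.

        import numpy as np

        rng = np.random.default_rng(42)
        E_nom = 200e9                                      # assumed nominal Young's modulus [Pa]
        E = rng.normal(E_nom, 0.05 * E_nom, size=10_000)   # 5% standard deviation, as in the study

        deflection = 1.0e-3 * (E_nom / E)  # placeholder model: deflection scales as 1/E
        print("mean deflection [m]:", deflection.mean())
        print("95% interval [m]:", np.percentile(deflection, [2.5, 97.5]))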

  10. Probabilistic safety analysis using microcomputer

    International Nuclear Information System (INIS)

    Futuro Filho, F.L.F.; Mendes, J.E.S.; Santos, M.J.P. dos

    1990-01-01

    The main steps in the execution of a Probabilistic Safety Assessment (PSA) are presented in this report, such as the study of the system description, the construction of event trees and fault trees, and the calculation of the overall unavailability of the systems. The use of microcomputers in performing some of these tasks is also presented, highlighting the main characteristics a software package needs to do the job adequately. A sample case of fault tree construction and calculation is presented, using the PSAPACK software distributed by the IAEA (International Atomic Energy Agency) for training purposes. (author)
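
    A minimal sketch of the fault-tree quantification step, assuming independent basic events with hypothetical unavailabilities (PSAPACK itself is not reproduced here):

        # An OR gate combines as 1 - prod(1 - p_i); an AND gate multiplies.
        def or_gate(*p):
            out = 1.0
            for pi in p:
                out *= (1.0 - pi)
            return 1.0 - out

        def and_gate(*p):
            out = 1.0
            for pi in p:
                out *= pi
            return out

        # Hypothetical system: top event = (pump A AND pump B) OR control failure.
        pump_a, pump_b, control = 1e-2, 1e-2, 1e-4
        top = or_gate(and_gate(pump_a, pump_b), control)
        print(f"system unavailability: {top:.2e}")   # ~2.0e-4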

  11. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  12. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  13. On S.N. Bernstein's derivation of Mendel's Law and 'rediscovery' of the Hardy-Weinberg distribution

    Directory of Open Access Journals (Sweden)

    Alan Stark

    2012-01-01

    Around 1923 the soon-to-be famous Soviet mathematician and probabilist Sergei N. Bernstein started to construct an axiomatic foundation for a theory of heredity. He began from the premise of stationarity (constancy of type proportions) from the first generation of offspring. This led him to derive the Mendelian coefficients of heredity. It appears that he had no direct influence on the subsequent development of population genetics. A basic assumption of Bernstein was that parents couple randomly to produce offspring. This paper shows that a simple model of non-random mating, which nevertheless embodies a feature of the Hardy-Weinberg Law, can produce Mendelian coefficients of heredity while maintaining the population distribution. How W. Johannsen's monograph influenced Bernstein is also discussed.
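
    A small numerical illustration of the stationarity premise under random mating (Bernstein's starting point): whatever the initial genotype proportions, random mating reproduces the Hardy-Weinberg proportions from the first offspring generation onward.

        import numpy as np

        def next_generation(geno):
            aa, ab, bb = geno                    # frequencies of AA, Aa, aa
            p = aa + 0.5 * ab                    # allele frequency of A among parents
            return np.array([p**2, 2*p*(1-p), (1-p)**2])

        geno = np.array([0.5, 0.2, 0.3])         # arbitrary, non-equilibrium start
        for gen in range(3):
            geno = next_generation(geno)
            print(gen + 1, geno)                 # identical from the first offspring generation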

  14. Estimation of the location parameter of distributions with known coefficient of variation by record values

    Directory of Open Access Journals (Sweden)

    N. K. Sajeevkumar

    2014-09-01

    In this article, we derive the Best Linear Unbiased Estimator (BLUE) of the location parameter of certain distributions with known coefficient of variation, based on record values. Efficiency comparisons of the proposed estimator with some of the usual estimators are also made. Finally, we present real-life data to illustrate the utility of the results developed in this article.
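
    A generic sketch of the BLUE construction, in a simplified common-mean version with placeholder moments: for observations sharing a location parameter and a known covariance structure V, the weights V^(-1)1 / (1'V^(-1)1) minimize the variance among linear unbiased estimators. In the article the moments come from the theory of record values; the numbers below are purely illustrative.

        import numpy as np

        V = np.array([[1.0, 0.5, 0.3],
                      [0.5, 1.2, 0.6],
                      [0.3, 0.6, 1.5]])          # assumed covariance structure of the records
        r = np.array([2.1, 3.4, 4.0])            # observed (upper) record values

        ones = np.ones(len(r))
        Vinv = np.linalg.inv(V)
        w = Vinv @ ones / (ones @ Vinv @ ones)   # BLUE weights, sum to 1
        print("BLUE of location:", w @ r, " weights:", w)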

  15. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews regarding their difficulties. These data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories. The first relates to difficulties in understanding the probabilistic problem; the second to choosing and using appropriate strategies for solving the problem; and the third to the computational process. Students thus still have difficulties in solving probabilistic problems, and are not yet able to apply their knowledge and abilities to them. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  16. Probabilistic graphical models to deal with age estimation of living persons.

    Science.gov (United States)

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of a posterior probability distribution over the chronological age of the person under investigation. Furthermore, this probability distribution can also be used to evaluate in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold of particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.
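
    A toy version of the Bayesian reasoning involved, with invented numbers: a prior over chronological age is updated by the probability of observing a given maturity stage at each age, and the posterior answers questions such as "is the person older than 18?".

        import numpy as np

        ages = np.arange(15, 26)
        prior = np.ones_like(ages, float) / len(ages)       # uniform prior over ages

        # Hypothetical likelihood P(stage = 'fused' | age): rises with age.
        lik = 1.0 / (1.0 + np.exp(-(ages - 21.0)))

        posterior = prior * lik
        posterior /= posterior.sum()

        print("P(age >= 18 | fused):", posterior[ages >= 18].sum())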

  17. Probabilistic wind power forecasting based on logarithmic transformation and boundary kernel

    International Nuclear Information System (INIS)

    Zhang, Yao; Wang, Jianxue; Luo, Xu

    2015-01-01

    Highlights: • Quantitative information on the uncertainty of wind power generation. • Kernel density estimator provides non-Gaussian predictive distributions. • Logarithmic transformation reduces the skewness of wind power density. • Boundary kernel method eliminates the density leakage near the boundary. - Abstract: Probabilistic wind power forecasting not only produces the expectation of wind power output, but also gives quantitative information on the associated uncertainty, which is essential for making better decisions about power system and market operations with the increasing penetration of wind power generation. This paper presents a novel kernel density estimator for probabilistic wind power forecasting, addressing two characteristics of wind power which have adverse impacts on forecast accuracy, namely, the heavily skewed and double-bounded nature of wind power density. A logarithmic transformation is used to reduce the skewness of the wind power density, which improves the effectiveness of the kernel density estimator in the transformed scale. The transformation partially relieves the boundary effect problem of the kernel density estimator caused by the double-bounded nature of wind power density. However, the case study shows that serious density leakage remains after the transformation. In order to solve this problem in the transformed scale, a boundary kernel method is employed to eliminate the density leakage at the bounds of the wind power distribution. The improvement of the proposed method over the standard kernel density estimator is demonstrated by short-term probabilistic forecasting results based on data from an actual wind farm. A detailed comparison is then carried out between the proposed method and some existing probabilistic forecasting methods.
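
    A rough sketch of the two ingredients, using a reflection correction as a stand-in for the paper's boundary kernel and synthetic skewed data in place of wind farm measurements:

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        cap = 1.0
        # Hypothetical normalized wind power sample in (0, cap), heavily right-skewed.
        x = np.clip(rng.beta(0.7, 3.0, size=2000), 1e-4, cap - 1e-4)

        y = np.log(x)                       # logarithmic transform reduces the skewness
        kde = gaussian_kde(y)

        def density(xq):
            yq = np.log(xq)
            # Back-transform: f_X(x) = f_Y(log x) / x (change of variables). Reflecting
            # about log(cap) stands in for the boundary-kernel correction against leakage.
            fy = kde(yq) + kde(2 * np.log(cap) - yq)
            return fy / xq

        print(density(np.linspace(0.01, 0.99, 5)))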

  18. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Vol. 369, No. 1 (2016), pp. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords: Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  19. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  20. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions ... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate ... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  1. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set was created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events, considering epistemic uncertainty in ground-motion modeling by using three suitable ground-motion prediction relationships applied with equal weight. Probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and is generally less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.
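
    The exceedance levels quoted above follow the standard Poisson relation between exceedance probability p in an exposure time t and the return period T, p = 1 - exp(-t/T); a two-line check:

        import math

        def return_period(p, t):
            return -t / math.log(1.0 - p)

        print(return_period(0.10, 50))   # ~475 years, the 10%-in-50-years level
        print(return_period(0.50, 50))   # ~72 years, the 50%-in-50-years level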

  2. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility between performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% power savings. An image-compression application is investigated using the memristive probabilistic adders with a performance and energy trade-off.

  3. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.
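
    One simple instance of pre-aggregation over probability distributions, assuming independent cell values represented as discrete distributions: a SUM aggregate is the convolution of the inputs, which can be computed once and reused.

        import numpy as np

        # Two uncertain cell values as discrete distributions over small supports.
        vals_a, probs_a = np.array([1, 2]), np.array([0.3, 0.7])
        vals_b, probs_b = np.array([0, 1, 2]), np.array([0.2, 0.5, 0.3])

        # Pre-aggregated SUM distribution: convolution of the two distributions.
        sums = {}
        for va, pa in zip(vals_a, probs_a):
            for vb, pb in zip(vals_b, probs_b):
                sums[va + vb] = sums.get(va + vb, 0.0) + pa * pb
        print(dict(sorted(sums.items())))   # probabilities sum to 1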

  4. Short-term Probabilistic Load Forecasting with the Consideration of Human Body Amenity

    Directory of Open Access Journals (Sweden)

    Ning Lu

    2013-02-01

    Load forecasting is the basis of power system planning and design. It is important for the economic operation and reliability assurance of power systems. However, the results of load forecasting given by most existing methods are deterministic. This study aims at probabilistic load forecasting. First, support vector machine regression is used to acquire the deterministic load forecast, with human body amenity taken into consideration. The probabilistic load forecast at a certain confidence level is then given after analysis of the error distribution law corresponding to a given heat index interval. The final simulation shows that this probabilistic forecasting method is easy to implement and can provide more information than deterministic forecasting results, and is thus helpful for decision-makers to make reasonable decisions.
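
    A compact sketch of the two-step scheme with synthetic data: a deterministic SVR forecast from a heat-index feature, plus an interval from the empirical error distribution (the paper conditions the error law on heat index intervals; a single global interval is used here).

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(7)
        heat_index = rng.uniform(20, 40, size=(300, 1))           # hypothetical amenity proxy
        load = 50 + 2.0 * heat_index[:, 0] + rng.normal(0, 3, size=300)

        svr = SVR(kernel="rbf", C=100.0).fit(heat_index, load)

        # Empirical residual quantiles give the interval around the point forecast.
        resid = load - svr.predict(heat_index)
        lo, hi = np.percentile(resid, [5, 95])

        point = svr.predict(np.array([[35.0]]))[0]
        print(f"point forecast: {point:.1f}, 90% interval: [{point + lo:.1f}, {point + hi:.1f}]")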

  5. Relation between distribution coefficient of radioactive strontium and solid-liquid distribution ratio of background stable strontium

    International Nuclear Information System (INIS)

    Igarashi, Toshifumi; Mahara, Yasunori; Okamura, Masaki; Ashikawa, Nobuo.

    1992-01-01

    Distribution coefficients (Kd) of nuclides, which are defined as the ratio of the adsorbed concentration to the solution concentration, are important in predicting nuclide migration in the subsurface environment. This study was undertaken to construct an effective method of determining the most pertinent Kd value for simulating in situ distribution phenomena between the solid and liquid phases, by using background stable isotopes. This paper describes the applicability of this method to Sr by carrying out a batch Sr adsorption experiment in which stable Sr coexisted with the radioactive isotope 85Sr, and by comparing the concentration distribution ratio of the background stable Sr with the Kd value obtained by the batch experiment. The results showed that the Kd of 85Sr (Kd85) agreed well with the Kd of the coexisting stable Sr (Kds), and that the two values decreased with an increase in the concentration of the stable Sr when sand was used as an adsorbent. In addition, the Kd85 corresponded to the ratio of the exchangeable solid-phase concentration of background stable Sr to the concentration of the background stable Sr in groundwater when the concentration of the coexisting stable Sr approached the background level. On the other hand, when powdered rock samples were used, the Kd85 did not agree with the Kds, and the concentration distribution ratio of the background stable Sr was greater than the Kd85. This discrepancy might be due to the disequilibrium resulting from grinding the rock matrices. This suggests that measurement of the background stable Sr distribution ratio between the solid and liquid phases can be an effective method of estimating the Kd of radioactive Sr when the groundwater is in satisfactory contact with the adsorption medium. (author)
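
    The Kd definition used throughout, in the form it takes for a batch test, with illustrative numbers:

        # Kd = S / C, with S the adsorbed concentration per unit mass of solid and
        # C the equilibrium solution concentration. All values are invented.
        c0 = 100.0        # initial solution concentration [Bq/mL]
        c_eq = 20.0       # equilibrium solution concentration [Bq/mL]
        v = 200.0         # solution volume [mL]
        m = 10.0          # solid mass [g]

        s = (c0 - c_eq) * v / m          # adsorbed amount per gram [Bq/g]
        kd = s / c_eq                    # [mL/g]
        print(f"Kd = {kd:.1f} mL/g")     # (100-20)*200/10 / 20 = 80 mL/g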

  6. A Numerical Procedure for Flow Distribution and Pressure Drops for U and Z Type Configurations Plate Heat Exchangers with Variable Coefficients

    International Nuclear Information System (INIS)

    López, R; Lecuona, A; Ventas, R; Vereda, C

    2012-01-01

    In plate heat exchangers it is important to determine the flow distribution and pressure drops, because they directly affect the performance of the heat exchanger. This work proposes an incompressible, one-dimensional, steady-state, discrete model allowing for variable overall momentum coefficients to determine these magnitudes. The model consists of a modified version of the Bajura and Jones model for dividing and combining flow manifolds. The numerical procedure is based on the finite-difference approximation approach proposed by Datta and Majumdar. A linear overall momentum coefficient distribution is used in the dividing manifold, but the model is not limited to linear distributions. Comparisons are made with experimental, numerical and analytical data, yielding good results.

  7. Caffeine and paraxanthine in aquatic systems: Global exposure distributions and probabilistic risk assessment.

    Science.gov (United States)

    Rodríguez-Gil, J L; Cáceres, N; Dafouz, R; Valcárcel, Y

    2018-01-15

    This study presents one of the most complete applications of probabilistic methodologies to the risk assessment of emerging contaminants. Caffeine, perhaps the most data-rich of these compounds, and its main metabolite (paraxanthine) were selected for this study. Information on a total of 29,132 individual caffeine and 7,442 paraxanthine samples was compiled, including samples where the compounds were not detected. The inclusion of non-detect samples (as censored data) in the estimation of environmental exposure distributions (EEDs) allowed for a realistic characterization of the global presence of these compounds in aquatic systems. EEDs were compared to species sensitivity distributions (SSDs), when possible, in order to calculate joint probability curves (JPCs) describing the risk to aquatic organisms. In this way, it was determined that unacceptable environmental risk (defined as 5% of species being potentially exposed to concentrations able to cause effects in >5% of cases) could be expected from chronic exposure to caffeine from effluent (28.4% of cases), surface water (6.7% of cases) and estuary water (5.4% of cases). Probabilities of exceedance of acute predicted no-effect concentrations (PNECs) for paraxanthine were higher than 5% for all assessed matrices except drinking water and ground water; however, no experimental effects data were available for paraxanthine, resulting in a precautionary deterministic hazard assessment for this compound. Given the chemical similarities between the two compounds, real effect thresholds, and thus risk, for paraxanthine would be expected to be close to those observed for caffeine. Based on the compiled data, negligible human health risk is expected from exposure to caffeine via drinking water or groundwater.
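
    A schematic of the EED-versus-SSD comparison with invented lognormal fits (the real study derives both distributions from the compiled samples):

        import numpy as np

        rng = np.random.default_rng(3)
        # Hypothetical fits: exposure distribution of caffeine in effluent and a
        # species sensitivity distribution of chronic effect concentrations [ug/L].
        exposure = rng.lognormal(mean=np.log(5.0), sigma=1.2, size=100_000)
        effect = rng.lognormal(mean=np.log(50.0), sigma=1.0, size=100_000)

        # Joint-probability summary: how often does a random exposure exceed the
        # 5th percentile of species sensitivity (the "5% of species affected" line)?
        hc5 = np.percentile(effect, 5)
        print("HC5:", hc5, " P(exposure > HC5):", (exposure > hc5).mean())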

  8. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cui, Mingjian [Univ. of Texas-Dallas, Richardson, TX (United States); Feng, Cong [Univ. of Texas-Dallas, Richardson, TX (United States); Wang, Zhenke [Univ. of Texas-Dallas, Richardson, TX (United States); Zhang, Jie [Univ. of Texas-Dallas, Richardson, TX (United States)

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to produce the baseline wind power forecast and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. Probabilistic forecasts of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
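
    A condensed sketch of the error-modeling and scenario-generation steps, with synthetic forecasting errors: fit a GMM, form the analytic mixture CDF, and apply inverse-transform sampling.

        import numpy as np
        from scipy.stats import norm
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)
        # Hypothetical wind power forecasting errors [MW], bimodal on purpose.
        errors = np.concatenate([rng.normal(-8, 3, 700), rng.normal(5, 2, 300)])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(errors.reshape(-1, 1))
        w = gmm.weights_
        mu = gmm.means_.ravel()
        sd = np.sqrt(gmm.covariances_.ravel())

        # Analytic mixture CDF on a grid, then numerical inversion for sampling.
        grid = np.linspace(errors.min() - 10, errors.max() + 10, 4000)
        cdf = sum(wi * norm.cdf(grid, mi, si) for wi, mi, si in zip(w, mu, sd))

        u = rng.uniform(size=10)                  # uniform draws
        scenarios = np.interp(u, cdf, grid)       # inverse-transform sampling
        print(scenarios)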

  9. Probabilistic Analysis of Gas Turbine Field Performance

    Science.gov (United States)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.

  10. Probabilistic SSME blades structural response under random pulse loading

    Science.gov (United States)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.
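
    A direct simulation of the random pulse model as described, with assumed parameter values: Poisson arrivals, zero-mean normal amplitudes, and three equally likely locations near the blade tip.

        import numpy as np

        rng = np.random.default_rng(11)
        rate = 0.5          # assumed mean arrival rate [impacts per unit time]
        T = 100.0           # simulated duration
        sigma = 2.0         # assumed amplitude standard deviation

        n = rng.poisson(rate * T)                  # number of impacts
        arrivals = np.sort(rng.uniform(0, T, n))   # Poisson arrivals are uniform given n
        amplitudes = rng.normal(0.0, sigma, n)     # zero-mean normal intensities
        locations = rng.integers(0, 3, n)          # 3 candidate points near the blade tip

        print(f"{n} impacts; first five:")
        for t, a, k in zip(arrivals[:5], amplitudes[:5], locations[:5]):
            print(f"  t={t:6.2f}  amplitude={a:+.2f}  location={k}")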

  11. Probabilistic Assessment of the Occurrence and Duration of Ice Accretion on Cables

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee; Georgakis, Christos Thomas; Faber, Michael Havbro

    2015-01-01

    This paper presents an operational framework for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. The framework utilizes the features of Bayesian Probabilistic Networks, and its performance is illustrated through a case study of the cable-stayed Oresund Bridge. The Bayesian Probabilistic Network model used for the estimation of the occurrence and duration probabilities is studied, and it is found to be robust with respect to changes in the choice of distribution types used to model the meteorological variables that influence the two icing...

  12. STUDY OF REFLECTION COEFFICIENT DISTRIBUTION FOR ANTI-REFLECTION COATINGS ON SMALL-RADIUS OPTICAL PARTS

    Directory of Open Access Journals (Sweden)

    L. A. Gubanova

    2015-03-01

    The paper deals with findings for the energy reflection coefficient distribution of an anti-reflection coating along the surface of optical elements with a very small radius (2-12 mm). The factors influencing the size of the surface area of the optical element in which the energy reflection coefficient is constant were identified. The main principles of theoretical models that describe the spectral characteristics of multilayer interference coatings were used to achieve these objectives. The relative size of the antireflection region is defined as the ratio of the radius of the surface area of the optical element where the reflection is less than a certain value to the element radius (ρ/r). The research shows that this ratio is constant across different curvature radii for optical elements made of the same material; its value is determined by the refractive index of the material (nm) from which the optical element is made and by the design of the antireflection coating. For single-layer coatings ρ/r = 0.5 when nm = 1.51, and ρ/r = 0.73 when nm = 1.75; for two-layer coatings ρ/r = 0.35 when nm = 1.51 and ρ/r = 0.41 when nm = 1.75. It is shown that as the refractive index of the substrate material increases, the area of minimum reflection coefficient increases. The paper considers single-layer, two-layer, three-layer and five-layer antireflection coating designs. The findings lead to the conclusion that coatings of equal thickness formed on an optical element surface with a small radius do not produce equal reflection over the entire surface, and a layer-thickness distribution must be sought that provides uniform reflection at all points of the spherical surface.

  13. Probabilistic Space Weather Forecasting: a Bayesian Perspective

    Science.gov (United States)

    Camporeale, E.; Chandorkar, M.; Borovsky, J.; Care', A.

    2017-12-01

    Most Space Weather forecasts, both at the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven machine learning algorithms to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to the finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of DST geomagnetic index forecasting, solar wind type classification, and the estimation of diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We have achieved excellent results in all of the standard metrics used to evaluate our models, at very modest computational cost.
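
    A minimal GP regression sketch in the spirit of the applications listed, using synthetic data in place of geomagnetic observations; the predictive standard deviation is the confidence information the authors argue for.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(2)
        X = rng.uniform(0, 10, size=(40, 1))                 # hypothetical driver values
        y = np.sin(X).ravel() + rng.normal(0, 0.2, size=40)  # noisy synthetic response

        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gp.fit(X, y)

        # The GP returns a full predictive distribution, not a point value.
        Xq = np.linspace(0, 10, 5).reshape(-1, 1)
        mean, std = gp.predict(Xq, return_std=True)
        for x, m, s in zip(Xq.ravel(), mean, std):
            print(f"x={x:4.1f}  forecast={m:+.2f} +/- {2*s:.2f} (95%)")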

  14. Simplified application of probabilistic safety analysis in nuclear power plants by means of artificial neural networks

    International Nuclear Information System (INIS)

    Oehmgen, T.; Knorr, J.

    2004-01-01

    Probabilistic safety analyses (PSA) are conducted to assess the balance of plant design in terms of technical safety and the administrative management of plant operation in nuclear power plants. In the evaluation presented in this article of the operating experience accumulated in two nuclear power plants, all failures are consistently traced back to the plant media and component levels, respectively, for the calculation of reliability coefficients. Moreover, the use of neural networks for probabilistic calculations is examined, and the results are verified on the basis of test examples. Calculations with neural networks are very easy to carry out as a kind of 'black box'. The system could, for instance, be used in plant maintenance. (orig.)

  15. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches...

  16. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    This article presents original probabilistic price forecasting meta-models (PPFMCP models), based on the aggregation of competitive predictors, for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be obtained directly from the expected value and variance associated with the ensemble for that hour, using three aggregation strategies of predictor forecasts corresponding to the three PPFMCP models. A reliability indicator (RI) and a loss function indicator (LI) are also introduced to give a measure of the uncertainty of the probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results showed that PPFMCP model 2, which uses aggregation by weight values according to daily ranks of predictors, was the best probabilistic meta-model from the point of view of mean absolute error, as well as of RI and LI. PPFMCP model 1, which uses the averaging of predictor forecasts, was the second-best meta-model. PPFMCP models allow risk-based decisions on price to be evaluated.
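
    The moment-matching step described above, in a few lines: given the ensemble's mean and variance for an hour (prices rescaled to [0, 1]), the Beta PDF parameters follow directly. The ensemble values below are invented.

        import numpy as np

        def beta_from_moments(mean, var):
            common = mean * (1.0 - mean) / var - 1.0   # must be > 0 for a valid Beta
            return mean * common, (1.0 - mean) * common

        forecasts = np.array([0.42, 0.45, 0.40, 0.47, 0.44, 0.43])   # hypothetical ensemble
        m, v = forecasts.mean(), forecasts.var(ddof=1)
        alpha, beta = beta_from_moments(m, v)
        print(f"alpha={alpha:.1f}, beta={beta:.1f}")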

  17. Probabilistic Sensitivity Amplification Control for Lower Extremity Exoskeleton

    Directory of Open Access Journals (Sweden)

    Likun Wang

    2018-03-01

    To achieve ideal force control of a functional autonomous exoskeleton, sensitivity amplification control is widely used in human strength augmentation applications. The original sensitivity amplification control aims to increase the closed-loop control system sensitivity based on positive feedback, without any sensors between the pilot and the exoskeleton. Thus, the measurement system can be greatly simplified. Nevertheless, the controller lacks the ability to reject disturbances and has little robustness to parameter variation. Consequently, a relatively precise dynamic model of the exoskeleton system is desired. Moreover, the human-robot interaction (HRI) cannot be interpreted merely as a particular part of the driven torque quantitatively. Therefore, a novel control methodology, so-called probabilistic sensitivity amplification control, is presented in this paper. The innovation of the proposed control algorithm is two-fold: distributed hidden-state identification based on sensor observations, and evolving learning of sensitivity factors to deal with the varying HRI. We verify the feasibility of the probabilistic sensitivity amplification control, in comparison with other state-of-the-art algorithms, through several experiments, i.e., distributed identification model learning and walking with a human subject. The experimental results show potential for practical application.

  18. Spectro-ellipsometric studies of sputtered amorphous Titanium dioxide thin films: simultaneous determination of refractive index, extinction coefficient, and void distribution

    CERN Document Server

    Lee, S I; Oh, S G

    1999-01-01

    Amorphous titanium dioxide thin films were deposited onto silicon substrates by RF magnetron sputtering, and the index of refraction, the extinction coefficient, and the void distribution of these films were simultaneously determined from analyses of their ellipsometric spectra. In particular, our novel strategy, which combines the merits of multi-sample fitting, a dual dispersion function, and grid search, proved successful in determining optical constants over a wide energy range, including the energy region where the extinction coefficient is large. Moreover, we found that the void distribution depends on the deposition conditions, such as the sputtering power, the substrate temperature, and the substrate surface.

  19. Probabilistic management of power distribution systems; Probabilistisch netbeheer

    Energy Technology Data Exchange (ETDEWEB)

    Roggen, M. (ed.)

    2005-07-01

    Managers of electricity networks must be able to answer the question of how their investments contribute to minimizing the consequences of power failures. A probabilistic approach weighs the risks, on the basis of which the most efficient method to prevent failures can be chosen. Use is made of a network calculation program (PowerFactory) coupled to a simulation model, with an Excel interface to carry out the stochastic calculations. [Translated from Dutch:] Since the beginning of this year, grids with a voltage level below 220 kV no longer have to satisfy the n-1 criterion in all cases, for example when the costs of an investment do not outweigh the societal interests. This gives network operators room to apply other criteria for design and maintenance. Network operators can benefit greatly from a probabilistic approach that makes all risks transparent, and thereby manageable, in advance. KEMA coupled PowerFactory to a probabilistic simulation model, in combination with an Excel interface, through which all kinds of variables can be entered to carry out stochastic calculations.

  20. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can also be generalized to qudits.

  1. SU-C-BRB-05: Determining the Adequacy of Auto-Contouring Via Probabilistic Assessment of Ensuing Treatment Plan Metrics in Comparison with Manual Contours

    International Nuclear Information System (INIS)

    Nourzadeh, H; Watkins, W; Siebers, J; Ahmad, M

    2016-01-01

    Purpose: To determine if auto-contour-based and manual-contour-based plans differ when evaluated with respect to probabilistic coverage metrics and biological model endpoints for prostate IMRT. Methods: Manual and auto-contours were created for 149 CT image sets acquired from 16 unique prostate patients. A single physician manually contoured all images. Auto-contouring was completed utilizing Pinnacle's Smart Probabilistic Image Contouring Engine (SPICE). For each CT, three different 78 Gy/39-fraction 7-beam IMRT plans were created: PD with drawn ROIs, PAS with auto-contoured ROIs, and PM with auto-contoured OARs and the manually drawn target. For each plan, 1000 virtual treatment simulations, with a different sampled systematic error per simulation and a different sampled random error per fraction, were performed using our in-house GPU-accelerated robustness analyzer tool, which reports the statistical probability of achieving dose-volume metrics, NTCP, TCP, and the probability of achieving the optimization criteria for both auto-contoured (AS) and manually drawn (D) ROIs. Metrics are reported for all possible cross-evaluation pairs of ROI types (AS, D) and planning scenarios (PD, PAS, PM). The Bhattacharyya coefficient (BC) is calculated to measure the PDF similarities for the dose-volume metrics, NTCP, TCP, and objectives with respect to the manually drawn contour evaluated on the base plan (D-PD). Results: We observe high BC values (BC≥0.94) for all OAR objectives. BC values of the max dose objective on the CTV also signify high resemblance (BC≥0.93) between the distributions. On the other hand, BC values for the CTV's D95 and Dmin objectives are small for AS-PM and AS-PD. NTCP distributions are similar across all evaluation pairs, while TCP distributions of AS-PM and AS-PD sustain variations of up to 6% compared to the other evaluated pairs. Conclusion: No significant probabilistic differences are observed in the metrics when auto-contoured OARs are used. The prostate auto-contour needs
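
    For reference, the Bhattacharyya coefficient between two discretized distributions reduces to a one-liner (the histograms below are synthetic):

        import numpy as np

        # BC = sum_i sqrt(p_i * q_i); equal to 1 for identical distributions.
        def bhattacharyya(p, q):
            p = np.asarray(p, float); q = np.asarray(q, float)
            p /= p.sum(); q /= q.sum()            # normalize to probability masses
            return np.sqrt(p * q).sum()

        p = np.histogram(np.random.default_rng(0).normal(0.0, 1, 5000), bins=30, range=(-5, 5))[0]
        q = np.histogram(np.random.default_rng(1).normal(0.2, 1, 5000), bins=30, range=(-5, 5))[0]
        print(f"BC = {bhattacharyya(p, q):.3f}")   # close to 1 for similar distributions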

  2. Classification of distribution coefficient data by mineral components and chemical forms

    International Nuclear Information System (INIS)

    Takeda, Seiji; Kimura, Hideo; Matsuzuru, Hideo

    1996-01-01

    The use of the distribution coefficient (Kd) in radionuclide transport models has been reported in a number of papers. However, Kd data cover a wide range even for a specific element. In this study the Kd data of neptunium, uranium and selenium, which are included in the sorption database (SDB, OECD/NEA) of radionuclides, were classified by solid phase and by the dominant species in solution. The aqueous species of these elements were estimated by a geochemical model. The Kd data classified by the analyzed speciation were tested by a nonparametric statistical method. The results of the tests proved that the Kd data of neptunium and uranium, which covered a wide range, were influenced by the supersaturation of Np(OH)4(s) or schoepite. The Kd data of neptunium could be classified by the dominant aqueous species NpO2+, NpO2CO3-, NpO2OH(aq) and Np(OH)4(aq). The Kd data of these four dominant species that are not equilibrated with supersaturated Np(OH)4(s) are less than 100 ml/g. The analyzed aqueous species of uranium were UO2(OH)2(aq) and UO2(CO3)n^(2-2n) (n=2,3) in the hexavalent state. It is suggested that the distribution coefficient of neptunium and uranium depends on the dominant aqueous species or on the charge of the species, i.e., cationic, anionic or nonionic forms. The dominant aqueous species of selenium are HSe-, HSeO3-, SeO3^2- and SeO4^2-. The result of the nonparametric statistical test shows that the Kd value of HSeO3- is higher than those of the other anionic forms. However, the influence of the species HSe-, SeO3^2- and SeO4^2- on Kd values is not clearly identified. Considering the dominant species, the Kd values of the elements lie within ranges of 1 to 2 orders of magnitude, in general narrower than those classified by mineral and rock types. (author)

  3. Novel probabilistic and distributed algorithms for guidance, control, and nonlinear estimation of large-scale multi-agent systems

    Science.gov (United States)

    Bandyopadhyay, Saptarshi

    Multi-agent systems are widely used for constructing a desired formation shape, exploring an area, surveillance, coverage, and other cooperative tasks. This dissertation introduces novel algorithms in the three main areas of shape formation, distributed estimation, and attitude control of large-scale multi-agent systems. In the first part of this dissertation, we address the problem of shape formation for thousands to millions of agents. Here, we present two novel algorithms for guiding a large-scale swarm of robotic systems into a desired formation shape in a distributed and scalable manner. These probabilistic swarm guidance algorithms adopt an Eulerian framework, where the physical space is partitioned into bins and the swarm's density distribution over each bin is controlled using tunable Markov chains. In the first algorithm - Probabilistic Swarm Guidance using Inhomogeneous Markov Chains (PSG-IMC) - each agent determines its bin transition probabilities using a time-inhomogeneous Markov chain that is constructed in real-time using feedback from the current swarm distribution. This PSG-IMC algorithm minimizes the expected cost of the transitions required to achieve and maintain the desired formation shape, even when agents are added to or removed from the swarm. The algorithm scales well with a large number of agents and complex formation shapes, and can also be adapted for area exploration applications. In the second algorithm - Probabilistic Swarm Guidance using Optimal Transport (PSG-OT) - each agent determines its bin transition probabilities by solving an optimal transport problem, which is recast as a linear program. In the presence of perfect feedback of the current swarm distribution, this algorithm minimizes the given cost function, guarantees faster convergence, reduces the number of transitions for achieving the desired formation, and is robust to disturbances or damages to the formation. We demonstrate the effectiveness of these two proposed swarm
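
    A toy Eulerian version of the Markov-chain guidance idea: bins, a transition matrix whose stationary distribution is the desired formation (built here with a simple Metropolis rule, not the PSG-IMC optimization), and the swarm density converging to the target.

        import numpy as np

        target = np.array([0.1, 0.2, 0.4, 0.3])      # desired swarm density over 4 bins

        # Metropolis-Hastings construction on a ring of bins: propose a neighbor,
        # accept with min(1, target_j / target_i); the chain's stationary
        # distribution is then exactly the target density.
        n = len(target)
        M = np.zeros((n, n))
        for i in range(n):
            for j in ((i - 1) % n, (i + 1) % n):
                M[i, j] = 0.5 * min(1.0, target[j] / target[i])
            M[i, i] = 1.0 - M[i].sum()

        density = np.array([1.0, 0.0, 0.0, 0.0])     # all agents start in bin 0
        for _ in range(200):
            density = density @ M
        print(density)                               # converges to the target density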

  4. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  5. Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction.

    Science.gov (United States)

    Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Random forests consisting of an ensemble of regression trees with equal weights are frequently used for the design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approach the prediction of the ensemble of probabilistic trees from two perspectives: as a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic data and the Cancer Cell Line Encyclopedia dataset, and illustrated that tree weights can be selected to reduce the average length of the CI without an increase in mean error.
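
    A schematic of the ensemble-as-mixture view, using an ordinary random forest with equal weights (the paper's probabilistic trees and weight optimization are not reproduced): per-tree predictions supply a mixture mean and a spread from which a CI can be read off.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(9)
        # Synthetic stand-in for drug-sensitivity data: features -> response.
        X = rng.normal(size=(400, 5))
        y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.3, size=400)

        rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

        x_new = rng.normal(size=(1, 5))
        per_tree = np.array([t.predict(x_new)[0] for t in rf.estimators_])
        mean, sd = per_tree.mean(), per_tree.std()
        print(f"prediction {mean:.2f}, 95% CI [{mean - 1.96*sd:.2f}, {mean + 1.96*sd:.2f}]")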

  6. Probabilistic safety analysis vs probabilistic fracture mechanics - relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: the result from PSA is a numerically expressed level of confidence in the system based on the state of current knowledge, and is thus not an objective measure of risk. It is important to carefully define the precise nature of the probabilistic statement and relate it to a well defined situation. Standardisation of PFM methods is necessary. PFM seems to be the only way to obtain estimates of the pipe break probability; service statistics are of doubtful value because of scarcity of data and statistical inhomogeneity. Collection of service data should be directed towards the occurrence of growing cracks.

  7. Probabilistic estimation of the constitutive parameters of polymers

    Directory of Open Access Journals (Sweden)

    Siviour C.R.

    2012-08-01

    The Mulliken-Boyce constitutive model predicts the dynamic response of crystalline polymers as a function of strain rate and temperature. This paper describes the Mulliken-Boyce model-based estimation of the constitutive parameters in a Bayesian probabilistic framework. Experimental data from dynamic mechanical analysis and dynamic compression of PVC samples over a wide range of strain rates are analyzed. Both experimental uncertainty and natural variations in the material properties are simultaneously considered as independent and joint distributions; the posterior probability distributions are shown and compared with prior estimates of the material constitutive parameters. Additionally, particular statistical distributions are shown to be effective at capturing the rate and temperature dependence of internal phase transitions in DMA data.

  8. Experimental determination of the distribution coefficient (Kd) of lead and barium in soils of semiarid region of Bahia, Brazil

    International Nuclear Information System (INIS)

    Santos, Mariana M.; Fernandes, Heloisa H.F; Pontedeiro, Elizabeth M.; Su, Jian

    2013-01-01

    To determine the concentration of heavy metals and other contaminants in soils, with a view to evaluating environmental impact, the use of the distribution coefficient (Kd) is required; it is defined as the ratio between the adsorbed concentration and the concentration in solution. The objective of this study was to determine these coefficients for lead and barium in soil collected in Caetité, in the state of Bahia, at two different depths. The importance of this determination lies in the fact that it was performed using a tropical soil. The Kd isotherms were obtained using the batch adsorption test method to determine the final concentrations. The first step was to determine the best soil:solution ratio, followed by the equilibration time, and finally the equilibrium concentration of the contaminant. The percentages of metal adsorbed by the soil and the amount of solute retained by the adsorbent were also calculated. With the values obtained in the experiments, and using the Mathematica 8.0 software, graphs of equilibrium concentration versus adsorbed quantity (C vs. S) were produced. Isotherms were also plotted for different Kd models (linear, Langmuir and Freundlich) in order to determine which adsorption model best fits the measured data and thus determine the distribution coefficient of each metal in the soils analyzed. The Freundlich isotherm best fitted the data for the two metals in both soils.

  9. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities, exposure duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often, a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each commonly used AF. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to their most likely range, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent in the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support for decision-making and improves transparency.
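
    A bare-bones sketch of the probabilistic AF framework, with invented distributions: each assessment factor is a random variable, Monte Carlo sampling propagates them, and the OEL candidate becomes a distribution rather than a single number.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        pod = 50.0                                        # hypothetical point of departure [mg/m3]

        # Assumed lognormal distributions for the assessment factors.
        af_interspecies = rng.lognormal(np.log(3.0), 0.4, n)
        af_intraspecies = rng.lognormal(np.log(3.0), 0.4, n)
        af_loael_to_noael = rng.lognormal(np.log(2.0), 0.3, n)

        oel = pod / (af_interspecies * af_intraspecies * af_loael_to_noael)
        print("median OEL:", np.round(np.median(oel), 2))
        print("5th-95th percentile:", np.round(np.percentile(oel, [5, 95]), 2))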

  11. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time, space, and processor bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  12. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.

  13. PROBABILISTIC ANALYSIS OF DEPRESSIVE EPISODES: APPLICATION OF RENEWAL THEORY UNDER UNIFORM PROBABILITY LAW.

    Directory of Open Access Journals (Sweden)

    Runjun Phookun

    2013-04-01

    The renewal process has been formulated on the basis of the hazard rate of the time between two consecutive occurrences of depressive episodes. The probabilistic analysis of depressive episodes can be performed under various forms of hazard rate, e.g., constant, linear, etc. In this paper we consider the particular form h(x) = (b - x)^(-1), where b is a constant and x is the time between two consecutive episodes of depression. As a result, the time between two consecutive occurrences of depressive episodes follows a uniform distribution on (a, b). The distribution of the range, i.e., the difference between the longest and the shortest occurrence time of a depressive episode, and the expected number of depressive episodes in a random interval of time are obtained for the distribution under consideration. If a closed-form expression for the waiting time distribution is not available, the Laplace transform is used for the probabilistic analysis. The hazard rate of occurrence and the expected number of depressive episodes are presented graphically.
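
    For completeness, a short derivation (standard survival-analysis algebra, not taken from the paper) of why this hazard rate yields a uniform inter-episode time distribution on (a, b):

    % Hazard h(x) = 1/(b - x) on (a, b) implies a uniform density 1/(b - a).
    \[
    S(x) = \exp\!\left(-\int_a^x \frac{dt}{b-t}\right)
         = \exp\!\left(-\ln\frac{b-a}{b-x}\right)
         = \frac{b-x}{b-a},
    \qquad
    f(x) = h(x)\,S(x) = \frac{1}{b-x}\cdot\frac{b-x}{b-a} = \frac{1}{b-a},
    \quad a < x < b.
    \]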

  14. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  15. Estimation of distribution coefficient for uranium in soil around a waste disposal site at Trombay

    International Nuclear Information System (INIS)

    Mishra, S.; Chaudhary, D.K.; Sandeep, P.; Pandit, G.G.

    2014-01-01

    Soil contamination arising from the disposal of waste of industrial origin is of major concern nowadays. There is a possibility of runoff as well as leaching of contaminants from such sites to nearby aquatic bodies through rain water. The distribution coefficient, Kd, in soil is an important parameter for predicting the migration of contaminants. However, it requires precise measurement, not only for the accurate prediction of contaminant transport but also for describing the sorption behavior in a particular environment. The variation of Kd values for a radionuclide is due to differences in geochemical conditions, soil materials, the nature of the water and the methods used for the measurements. For the present study, soil samples were collected near a waste disposal site at Trombay and the sorption of uranium was studied by measuring the distribution coefficient (Kd) by the laboratory batch method. In our earlier studies, we noticed a substantial effect of the ionic composition of ground water on the Kd values of uranium. In this study we used rain water as the sorption medium, and the measured Kd values were compared with previous values for different soil and water characteristics from different regions of India
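
    The laboratory batch calculation itself is a one-line formula; the sketch below shows it with illustrative values (not the Trombay measurements), assuming the standard definition Kd = ((C0 - Ce)/Ce) * (V/m).

    # Distribution coefficient (L/kg) from a batch sorption test.
    def batch_kd(c0, ce, volume_l, mass_kg):
        """c0, ce: initial and equilibrium solution concentrations (same units);
        volume_l: solution volume (L); mass_kg: dry soil mass (kg)."""
        return (c0 - ce) / ce * (volume_l / mass_kg)

    print(batch_kd(c0=100.0, ce=20.0, volume_l=0.05, mass_kg=0.005))  # -> 40.0 L/kg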

  16. E-Area LLWF Vadose Zone Model: Probabilistic Model for Estimating Subsided-Area Infiltration Rates

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-12-12

    A probabilistic model employing a Monte Carlo sampling technique was developed in Python to generate statistical distributions of the upslope-intact-area to subsided-area ratio (AreaUAi/AreaSAi) for closure cap subsidence scenarios that differ in assumed percent subsidence and the total number of intact plus subsided compartments. The plan is to use this model as a component in the probabilistic system model for the E-Area Performance Assessment (PA), contributing uncertainty in infiltration estimates.
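
    A heavily simplified sketch of such a sampling scheme, assuming n equal-area compartments that subside independently; the geometry and probabilities are hypothetical stand-ins for the report's actual model.

    # Monte Carlo distribution of the intact-area to subsided-area ratio.
    import numpy as np

    rng = np.random.default_rng(0)

    def area_ratio_samples(n_compartments=20, pct_subsided=0.10, n_draws=50_000):
        subsided = rng.random((n_draws, n_compartments)) < pct_subsided
        k = subsided.sum(axis=1)
        k = k[k > 0]                       # ratio undefined when nothing subsides
        return (n_compartments - k) / k    # intact area / subsided area

    print(np.percentile(area_ratio_samples(), [5, 50, 95]))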

  17. A Probabilistic Safety Assessment of a Pyro-processed Waste Repository

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Jeong, Jong Tae

    2012-01-01

    A GoldSim template program has been developed for the safety assessment of a hybrid-type repository system, called A-KRS, in which two kinds of pyro-processed radioactive wastes, low-level metal wastes and ceramic high-level wastes arising from the pyro-processing of PWR spent nuclear fuels, are disposed of. This program is ready for both deterministic and probabilistic total system performance assessment and is able to evaluate nuclide release from the repository and further transport into the geosphere and biosphere under various normal, disruptive natural and man-made events and scenarios. The A-KRS has been probabilistically assessed with 9 selected input parameters, each of which has its own statistical distribution, for a normal release and transport scenario associated with nuclide release and transport in and around the repository. Probabilistic dose exposure rates to the farming exposure group have been evaluated. The sensitivity of the result to the 9 selected parameters has also been investigated to see which parameters are most important to the exposure rates.

  18. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings on human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
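
    A minimal sketch of the sampling machinery: correlated three-parameter Weibull cost elements drawn through a Gaussian copula (a common way to induce a specified correlation matrix; the paper's exact construction may differ, and all parameter values are illustrative).

    # Monte Carlo total cost with correlated Weibull cost elements.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 100_000

    # (shape c, scale, location) per hypothetical cost element
    weibulls = [(1.8, 40.0, 100.0), (2.2, 25.0, 60.0), (1.5, 15.0, 30.0)]
    corr = np.array([[1.0, 0.5, 0.3],
                     [0.5, 1.0, 0.5],
                     [0.3, 0.5, 1.0]])

    z = rng.multivariate_normal(np.zeros(3), corr, size=n)
    u = stats.norm.cdf(z)                              # correlated uniforms
    costs = np.column_stack(
        [stats.weibull_min.ppf(u[:, i], c, loc=loc, scale=s)
         for i, (c, s, loc) in enumerate(weibulls)])
    total = costs.sum(axis=1)
    print("P50 / P80 total cost:", np.percentile(total, [50, 80]))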

  19. Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions

    Science.gov (United States)

    Padilla, Miguel A.; Divers, Jasmin

    2013-01-01

    The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.
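
    As an illustration of one of the three interval types, here is a sketch of the percentile bootstrap for omega, with loadings taken from a one-factor model fitted by scikit-learn's FactorAnalysis (the article's estimation details may differ; the item data are synthetic).

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    def omega(data):
        fa = FactorAnalysis(n_components=1).fit(data)
        lam = np.abs(fa.components_.ravel())   # factor loadings
        psi = fa.noise_variance_               # uniquenesses
        return lam.sum() ** 2 / (lam.sum() ** 2 + psi.sum())

    def percentile_boot_ci(data, n_boot=2000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        n = data.shape[0]
        boot = [omega(data[rng.integers(0, n, n)]) for _ in range(n_boot)]
        return np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])

    rng = np.random.default_rng(42)
    factor = rng.normal(size=(300, 1))                      # synthetic 5-item scale
    items = factor @ np.full((1, 5), 0.7) + rng.normal(scale=0.7, size=(300, 5))
    print("omega =", omega(items), "95% CI =", percentile_boot_ci(items))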

  20. Non-parametric probabilistic forecasts of wind power: required properties and evaluation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Nielsen, Henrik Aalborg; Møller, Jan Kloppenborg

    2007-01-01

    of a single or a set of quantile forecasts. The required and desirable properties of such probabilistic forecasts are defined and a framework for their evaluation is proposed. This framework is applied for evaluating the quality of two statistical methods producing full predictive distributions from point...

  1. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)

  2. A microfluidic platform for the rapid determination of distribution coefficients by gravity assisted droplet-based liquid-liquid extraction

    DEFF Research Database (Denmark)

    Poulsen, Carl Esben; Wootton, Robert C. R.; Wolff, Anders

    2015-01-01

    The determination of pharmacokinetic properties of drugs, such as the distribution coefficient, D, is a crucial measurement in pharmaceutical research. Surprisingly, the conventional (gold standard) technique used for D measurements, the shake-flask method, is antiquated and unsuitable...... for the testing of valuable and scarce drug candidates. Herein we present a simple micro fluidic platform for the determination of distribution coefficients using droplet-based liquid-liquid extraction. For simplicity, this platform makes use of gravity to enable phase separation for analysis and is 48 times...... the apparent acid dissociation constant, pK', as a proxy for inter-system comparison. Our platform determines a pK' value of 7.24 ± 0.15, compared to 7.25 ± 0.58 for the shake-flask method in our hands and 7.21 for the shake-flask method in literature. Devices are fabricated using injection moulding, the batch...

  3. Iterative principles of recognition in probabilistic neural networks

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Hora, Jan

    2008-01-01

    Vol. 21, No. 6 (2008), p. 838-846 ISSN 0893-6080 R&D Projects: GA MŠk 1M0572; GA ČR GA102/07/1594 Grant - others: GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords: Probabilistic neural networks * Distribution mixtures * EM algorithm * Recognition of numerals * Recurrent reasoning Subject RIV: IN - Informatics, Computer Science Impact factor: 2.656, year: 2008

  4. Evaluation of probabilistic forecasts with the scoringRules package

    Science.gov (United States)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

    Over the last decades, probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way, in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F, y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In coherence with decision-theoretical principles, they allow the comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules, such as the continuous ranked probability score, for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions, like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
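
    The package itself is written for R; purely as an illustration of the kind of closed-form expression it implements, here is the well-known analytical CRPS for a Gaussian predictive distribution, sketched in Python.

    # CRPS of the forecast N(mu, sigma^2) given an observation y.
    import numpy as np
    from scipy.stats import norm

    def crps_normal(mu, sigma, y):
        z = (y - mu) / sigma
        return sigma * (z * (2 * norm.cdf(z) - 1)
                        + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

    print(crps_normal(mu=20.0, sigma=2.0, y=21.5))  # smaller score = better forecast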

  5. Probabilistic Analysis of the Quality Calculus

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    2013-01-01

    We consider a fragment of the Quality Calculus, previously introduced for defensive programming of software components such that it becomes natural to plan for default behaviour in case the ideal behaviour fails due to unreliable communication. This paper develops a probabilistically based trust...... analysis supporting the Quality Calculus. It uses information about the probabilities that expected input will be absent in order to determine the trustworthiness of the data used for controlling the distributed system; the main challenge is to take accord of the stochastic dependency between some...

  6. CANDU type fuel behavior evaluation - a probabilistic approach

    International Nuclear Information System (INIS)

    Moscalu, D.R.; Horhoianu, G.; Popescu, I.A.; Olteanu, G.

    1995-01-01

    In order to realistically assess the behavior of fuel elements during in-reactor operation, probabilistic methods have recently been introduced in the analysis of fuel performance. The present paper summarizes the achievements in this field at the Institute for Nuclear Research (INR), pointing out some advantages of the utilized method in the evaluation of CANDU type fuel behavior in steady state conditions. The Response Surface Method (RSM) has been selected for investigating the effects of the variability in fuel element computer code inputs on the code outputs (fuel element performance parameters). A newly developed version of the probabilistic code APMESRA, based on RSM, is briefly presented. The examples of application include the analysis of the results of an in-reactor fuel element experiment and the investigation of the calculated performance parameter distribution for a new CANDU type extended burnup fuel element design. (author)

  7. Evaluation of distribution coefficients for the prediction of strontium and cesium migration in a uniform sand

    International Nuclear Information System (INIS)

    Reynolds, W.D.; Gillham, R.W.; Cherry, J.A.

    1982-01-01

    The validity of using a distribution coefficient (Kd) in the mathematical prediction of strontium and cesium transport through uniform saturated sand was investigated by comparing measured breakthrough curves with curves of simulations using the advection-dispersion and the advection equations. Values for Kd were determined by batch equilibration tests and, indirectly, by fitting the mathematical model to breakthrough data from column experiments. Although the advection-dispersion equation accurately represented the breakthrough curves for two nonreactive solutes (chloride and tritium), neither it nor the advection equation provided close representations of the strontium and cesium curves. The simulated breakthrough curves for strontium and cesium were nearly symmetrical, whereas the data curves were very asymmetrical, with long tails. Column experiments with different pore-water velocities indicated that the shape of the normalized breakthrough curves was not sensitive to velocity. This suggests that the asymmetry of the measured curves was the result of nonlinear partitioning of the cations between the solid and liquid phases, rather than nonequilibrium effects. The results indicate that the distribution coefficient, when used in advection-dispersion models for prediction of the migration of strontium and cesium in field situations, can result in significant error
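
    For context, in the linear-equilibrium formulation that the study puts to the test, Kd enters the advection-dispersion equation through the retardation factor R = 1 + (rho_b / theta) * Kd; the sketch below computes it for illustrative (not the study's) parameter values.

    # Retardation factor from a linear, equilibrium distribution coefficient.
    def retardation_factor(kd_l_per_kg, bulk_density_kg_per_l, porosity):
        return 1.0 + (bulk_density_kg_per_l / porosity) * kd_l_per_kg

    # Hypothetical uniform sand: rho_b = 1.6 kg/L, theta = 0.35, Kd = 5 L/kg
    print(retardation_factor(5.0, 1.6, 0.35))   # R ~ 23.9: strong retardation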

  8. Optimization of the kernel functions in a probabilistic neural network analyzing the local pattern distribution.

    Science.gov (United States)

    Galleske, I; Castellanos, J

    2002-05-01

    This article proposes a procedure for the automatic determination of the elements of the covariance matrix of the Gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, can be calculated by analyzing the local environment of each training pattern. Their combination forms the covariance matrix of each training pattern. This automation has two advantages: first, it frees the neural network designer from specifying the complete covariance matrix, and second, it results in a network with better generalization ability than the original model. A variation of the famous two-spiral problem and real-world examples from the UCI Machine Learning Repository show not only a classification rate better than that of the original probabilistic neural network, but also that this model can outperform other well-known classification techniques.

  9. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, design and siting are the two areas where most use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides, the review mainly covers the analysis of seismic, hydrological and external man-made events, as well as some aspects of the analysis of extreme meteorological events. Probabilistic methods are recommended in the design guides, but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given, and the concept of design basis as used in NUSS design guides is explained. (author)

  10. Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Rodriguez, Michael C.; Maeda, Yukiko

    2006-01-01

    The meta-analysis of coefficient alpha across many studies is becoming more common in psychology by a methodology labeled reliability generalization. Existing reliability generalization studies have not used the sampling distribution of coefficient alpha for precision weighting and other common meta-analytic procedures. A framework is provided for…

  11. Application of simulation techniques in the probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    De Ruyter van Steveninck, J.L.

    1995-03-01

    Monte Carlo simulation is applied to a fracture mechanics model in order to assess the applicability of this simulation technique in probabilistic fracture mechanics. By means of the fracture mechanics model, the brittle fracture of a steel container or pipe with defects can be predicted. By means of the Monte Carlo simulation, the uncertainty regarding failure can also be determined. Based on the variations in the fracture toughness and the defect dimensions, the distribution of the failure probability is determined. Attention is also paid to the impact of dependency between uncertain variables. Furthermore, the influence on the failure probability of the distributions applied for the uncertain variables and of non-destructive examination is analyzed. The Monte Carlo simulation results agree quite well with the results of other methods from probabilistic fracture mechanics. If an analytic expression can be found for the failure probability, it is possible to determine the variation of the failure probability in addition to an estimate of its value. It also appears that the dependency between the uncertain variables has a large impact on the failure probability. It is also concluded from the simulation that the failure probability strongly depends on the crack depth, and therefore on the distribution of the crack depth. 15 figs., 7 tabs., 12 refs
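
    A minimal sketch of such a simulation, assuming the standard brittle-fracture criterion K_I = Y * sigma * sqrt(pi * a) > K_Ic with illustrative distributions (the report's actual inputs are not reproduced here).

    # Monte Carlo estimate of a brittle-fracture failure probability.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 1_000_000

    a = rng.lognormal(mean=np.log(5e-3), sigma=0.8, size=n)   # crack depth, m
    k_ic = rng.normal(loc=120.0, scale=15.0, size=n)          # toughness, MPa*sqrt(m)
    stress, Y = 300.0, 1.12                                   # MPa, geometry factor

    k_i = Y * stress * np.sqrt(np.pi * a)                     # stress intensity
    print("P(failure) ~", np.mean(k_i > k_ic))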

  12. Field experiment determinations of distribution coefficients of actinide elements in alkaline lake environments

    International Nuclear Information System (INIS)

    Simpson, H.J.; Trier, R.M.; Li, Y.H.; Anderson, R.F.

    1984-01-01

    Radionuclide concentrations of a number of elements (Am, Pu, U, Pa, Th, Ac, Ra, Po, Pb, Cs, and Sr) have been measured in the water and sediments of a group of alkaline lakes in the western USA. These data demonstrate greatly enhanced soluble phase concentrations of elements with oxidation states of III, IV, V, and VI as the result of carbonate complexing. Dissolved concentrations of isotopes of U, Pa, and Th in a lake with pH = 10 and a total inorganic carbon concentration of 4 x 10^-1 moles/l were greater than those in sea water (pH = 8, ΣCO2 = 2 x 10^-3 moles/l) by orders of magnitude: for 233U and 238U (~10^2), for 231Pa, 228Th and 230Th (~10^3), and for 232Th (~10^5). Concentrations of fallout 239,240Pu in the more alkaline lakes were equivalent to effective distribution coefficients of ~10^3, about a factor of 10^2 lower than in most other natural lakes, rivers, estuaries and coastal marine waters. Measurements of radionuclides in natural systems are essential for assessment of the likely fate of radionuclides which may be released from high level waste repositories to ground water. Laboratory-scale experiments using tracer additions of radionuclides to mixtures of water and sediment yielded distribution coefficients which were significantly different from those derived from field measurements (10^1-10^2 lower for Po and Pu). Order-of-magnitude calculations from thermodynamic data of expected maximum U and Th concentrations, limited by pure-phase solubilities, suggest that carbonate complexing can enhance solubility by many orders of magnitude in natural waters, even at relatively low carbonate ion concentrations

  13. Bayesian Hierarchical Structure for Quantifying Population Variability to Inform Probabilistic Health Risk Assessments.

    Science.gov (United States)

    Shao, Kan; Allen, Bruce C; Wheeler, Matthew W

    2017-10-01

    Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability can hardly support probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that using the proposed hierarchical structure adequately characterizes variability across different populations. © 2016 Society for Risk Analysis.

  14. Learning on probabilistic manifolds in massive fusion databases: Application to confinement regime identification

    International Nuclear Information System (INIS)

    Verdoolaege, Geert; Van Oost, Guido

    2012-01-01

    Highlights: ► We present an integrated framework for pattern recognition in fusion data. ► We model measurement uncertainty through an appropriate probability distribution. ► We use the geodesic distance on probabilistic manifolds as a similarity measure. ► We apply the framework to confinement mode classification. ► The classification accuracy benefits from uncertainty information and its geometry. - Abstract: We present an integrated framework for (real-time) pattern recognition in fusion data. The main premise is the inherent probabilistic nature of measurements of plasma quantities. We propose the geodesic distance on probabilistic manifolds as a similarity measure between data points. Substructure induced by data dependencies may further reduce the dimensionality and redundancy of the data set. We present an application to confinement mode classification, showing the distinct advantage obtained by considering the measurement uncertainty and its geometry.

  15. Calculation of age-dependent dose conversion coefficients for radionuclides uniformly distributed in air

    International Nuclear Information System (INIS)

    Hung, Tran Van; Satoh, Daiki; Takahashi, Fumiaki; Tsuda, Shuichi; Endo, Akira; Saito, Kimiaki; Yamaguchi, Yasuhiro

    2005-02-01

    Age-dependent dose conversion coefficients for external exposure to photons emitted by radionuclides uniformly distributed in air were calculated. The size of the source region in the calculation was assumed to be effectively semi-infinite in extent. Firstly, organ doses were calculated with a series of age-specific MIRD-5 type phantoms using the MCNP code, a Monte Carlo transport code. The calculations were performed for mono-energetic photon sources of twelve energies from 10 keV to 5 MeV and for phantoms of newborn, 1, 5, 10 and 15 years, and adult. Then, the effective doses to the different age-phantoms from the mono-energetic photon sources were estimated based on the obtained organ doses. The calculated effective doses were used to interpolate the conversion coefficients of the effective doses for 160 radionuclides, which are important for dose assessment of nuclear facilities. In the calculation, energies and intensities of photons emitted from radionuclides were taken from DECDC, a recent compilation of decay data for radiation dosimetry developed at JAERI. The results are tabulated in the form of effective dose per unit concentration and time (Sv per Bq s m^-3). (author)
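
    The interpolation step described above is typically done on log-log axes; the sketch below shows that idea with placeholder numbers (the tabulated energies, coefficients and emission data are invented, not the JAERI values).

    # Log-log interpolation of mono-energetic dose coefficients to nuclide lines.
    import numpy as np

    energies  = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 5.0])             # MeV
    dose_coef = np.array([2e-18, 9e-18, 4e-17, 2e-16, 8e-16, 2e-15, 4e-15]) # Sv per Bq s m^-3 (hypothetical)

    def coef_for_nuclide(lines_mev, photons_per_decay):
        logc = np.interp(np.log(lines_mev), np.log(energies), np.log(dose_coef))
        return np.sum(np.asarray(photons_per_decay) * np.exp(logc))

    print(coef_for_nuclide([0.662], [0.851]))   # a Cs-137-like photon line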

  16. Probabilistic integrity assessment of pressure tubes in an operating pressurized heavy water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Young-Jin; Park, Heung-Bae [KEPCO E and C, 188 Gumi-dong, Bundang-gu, Seongnam-si, Gyeonggi-do 463-870 (Korea, Republic of); Lee, Jung-Min; Kim, Young-Jin [School of Mechanical Engineering, Sungkyunkwan University, 300 Chunchun-dong, Jangan-gu, Suwon-si, Gyeonggi-do 440-746 (Korea, Republic of); Ko, Han-Ok [Korea Institute of Nuclear Safety, 34 Gwahak-ro, Yuseong-gu, Daejeon-si 305-338 (Korea, Republic of); Chang, Yoon-Suk, E-mail: yschang@khu.ac.kr [Department of Nuclear Engineering, Kyung Hee University, 1 Seocheon-dong, Giheung-gu, Yongin-si, Gyeonggi-do 446-701 (Korea, Republic of)

    2012-02-15

    Even though pressure tubes are major components of a pressurized heavy water reactor (PHWR), only a small proportion of pressure tubes is sampled for inspection, due to limited inspection time and costs. Since the inspection scope and integrity evaluation have generally been treated with a deterministic approach, a set of conservative data was used instead of all known information related to in-service degradation mechanisms, because of inherent uncertainties in the examination. Recently, a probabilistic approach has been introduced so that pressure tube degradations identified in a sample of inspected pressure tubes are taken into account to address the balance of the uninspected ones in the reactor core. In the present paper, probabilistic integrity assessments of PHWR pressure tubes were carried out based on accumulated operating experience and enhanced technology. Parametric analyses were conducted on key variables that are periodically measured by the in-service inspection program, such as the deuterium uptake rate, the dimensional change rate of the pressure tube and the flaw size distribution. Subsequently, a methodology to select the optimum statistical distribution using a robust method based on a genetic algorithm was proposed and applied to the most influential variable to verify the reliability of the proposed method. Finally, the pros and cons of the alternative distributions, compared with the corresponding ones derived from the traditional method, as well as technical findings from the statistical assessment, were discussed to show the applicability to the probabilistic assessment of pressure tubes.

  17. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.

  18. A probabilistic approach of sum rules for heat polynomials

    International Nuclear Information System (INIS)

    Vignat, C; Lévêque, O

    2012-01-01

    In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)

  19. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Kinetic Parameters, Temperature Coefficients and Power Distribution

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008, the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine Brazilian Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods used for Research and Experimental Reactors”. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA and those implemented in MCNP by CNEA and IPEN. The data necessary for these validations would correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01, located in São Paulo, Brazil. On the Argentine side, the staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor, which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper, comparisons of calculated and experimental results for temperature coefficients, kinetic parameters and spatial fission-rate distributions are shown. (author)

  20. Optimization of radial systems with biomass fueled gas engine from a metaheuristic and probabilistic point of view

    International Nuclear Information System (INIS)

    Ruiz-Rodriguez, F.J.; Gomez-Gonzalez, M.; Jurado, F.

    2013-01-01

    Highlights: ► Loads and distributed generation production are modeled as random variables. ► Distribution system with biomass fueled gas engines. ► Random nature of lower heat value of biomass and load. ► The Cornish–Fisher expansion is used for approximating quantiles of a random variable. ► Computational cost is much lower than that required for Monte Carlo simulation. - Abstract: This paper shows that technical constraints must be considered in radial distribution networks, where voltage regulation is one of the primary problems to be dealt with in distributed generation systems based on biomass fueled engines. Loads and distributed generation production are modeled as random variables. Results prove that the proposed method can be applied to keep voltages within desired limits at all load buses of a distribution system with biomass fueled gas engines. To evaluate the performance of this distribution system, this paper develops a probabilistic model that takes into account the random nature of the lower heat value of biomass and of the load. The Cornish–Fisher expansion is used for approximating quantiles of a random variable. This work introduces a hybrid method that combines a new optimization method based on swarm intelligence with a probabilistic radial load flow. The reduction in computation time achieved by the more efficient probabilistic load flow, in comparison to Monte Carlo simulation, is demonstrated. Acceptable solutions are reached in a smaller number of iterations. Therefore, convergence is attained more rapidly and the computational cost is significantly lower than that required for Monte Carlo methods.
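
    The Cornish–Fisher step is the most self-contained part of the method; a sketch of the standard fourth-order expansion follows, with hypothetical summary statistics standing in for the cumulants a probabilistic load flow would produce.

    # Approximate a quantile from mean, std, skewness and excess kurtosis.
    import numpy as np
    from scipy.stats import norm

    def cornish_fisher_quantile(p, mean, std, skew, exkurt):
        z = norm.ppf(p)
        w = (z
             + (z**2 - 1) * skew / 6
             + (z**3 - 3*z) * exkurt / 24
             - (2*z**3 - 5*z) * skew**2 / 36)
        return mean + std * w

    # e.g. the 95th percentile of a bus voltage (per unit), hypothetical moments:
    print(cornish_fisher_quantile(0.95, mean=1.01, std=0.02, skew=-0.3, exkurt=0.4))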

  1. Activity coefficients of electrons and holes in semiconductors

    International Nuclear Information System (INIS)

    Orazem, M.E.; Newman, J.

    1984-01-01

    Dilute-solution transport equations with constant activity coefficients are commonly used to model semiconductors. These equations are consistent with a Boltzmann distribution and are invalid in regions where the species concentration is close to the respective site concentration. A more rigorous treatment of transport in a semiconductor requires activity coefficients which are functions of concentration. Expressions are presented for activity coefficients of electrons and holes in semiconductors for which conduction- and valence-band energy levels are given by the respective bandedge energy levels. These activity coefficients are functions of concentration and are thermodynamically consistent. The use of activity coefficients in macroscopic transport relationships allows a description of electron transport in a manner consistent with the Fermi-Dirac distribution

  2. Validation of seismic probabilistic risk assessments of nuclear power plants

    International Nuclear Information System (INIS)

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves

  3. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
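
    A sketch of the exact calculation behind such a test: under the reliability hypothesis, the number of observed events follows a Poisson-Binomial distribution with the stated forecast probabilities, whose PMF can be built by convolution (the forecasts below are hypothetical).

    import numpy as np

    def poisson_binomial_pmf(probs):
        pmf = np.array([1.0])
        for p in probs:
            pmf = np.convolve(pmf, [1 - p, p])    # add one Bernoulli trial
        return pmf

    def reliability_pvalue(probs, k_observed):
        pmf = poisson_binomial_pmf(probs)
        return pmf[pmf <= pmf[k_observed]].sum()  # probability-ordering p-value

    forecast_probs = [0.1, 0.4, 0.7, 0.2, 0.9, 0.5]   # stated event probabilities
    print(reliability_pvalue(forecast_probs, k_observed=5))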

  4. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As far as we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way of broadcasting noncommuting mixed states: probabilistic broadcasting (PB). We present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case where commuting mixed states can be broadcast exactly can be thought of as a special instance of PB with success ratio 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states

  5. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet...... Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 ‘Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions...... and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  6. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    Science.gov (United States)

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  7. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed

  8. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  9. A parameterization scheme for the x-ray linear attenuation coefficient and energy absorption coefficient.

    Science.gov (United States)

    Midgley, S M

    2004-01-21

    A novel parameterization of x-ray interaction cross-sections is developed, and employed to describe the x-ray linear attenuation coefficient and mass energy absorption coefficient for both elements and mixtures. The new parameterization scheme addresses the Z-dependence of elemental cross-sections (per electron) using a simple function of atomic number, Z. This obviates the need for a complicated mathematical formalism. Energy dependent coefficients describe the Z-direction curvature of the cross-sections. The composition dependent quantities are the electron density and statistical moments describing the elemental distribution. We show that it is possible to describe elemental cross-sections for the entire periodic table and at energies above the K-edge (from 6 keV to 125 MeV), with an accuracy of better than 2% using a parameterization containing not more than five coefficients. For the biologically important elements, fewer coefficients are required. At higher energies, the parameterization uses fewer coefficients, with only two coefficients needed at megavoltage energies.

  10. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  11. Application of probabilistic risk based optimization approaches in environmental restoration

    International Nuclear Information System (INIS)

    Goldammer, W.

    1995-01-01

    The paper presents a general approach to site-specific risk assessments and optimization procedures. In order to account for uncertainties in the assessment of the current situation and of future developments, optimization parameters are treated as probabilistic distributions. The assessments are performed within the framework of a cost-benefit analysis. Radiation hazards and conventional risks are treated within an integrated approach. Special consideration is given to the consequences of low probability events such as earthquakes or major floods. Risks and financial costs are combined into an overall figure of detriment, allowing one to distinguish between the benefits of available reclamation options. The probabilistic analysis uses a Monte Carlo simulation technique. The paper demonstrates the applicability of this approach in aiding reclamation planning, using an example from the German reclamation program for uranium mining and milling sites

  12. Predicting volume of distribution with decision tree-based regression methods using predicted tissue:plasma partition coefficients.

    Science.gov (United States)

    Freitas, Alex A; Limbu, Kriti; Ghafourian, Taravat

    2015-01-01

    Volume of distribution is an important pharmacokinetic property that indicates the extent of a drug's distribution in the body tissues. This paper addresses the problem of how to estimate the apparent volume of distribution at steady state (Vss) of chemical compounds in the human body using decision tree-based regression methods from the area of data mining (or machine learning). Hence, the pros and cons of several different types of decision tree-based regression methods have been discussed. The regression methods predict Vss using, as predictive features, both the compounds' molecular descriptors and the compounds' tissue:plasma partition coefficients (Kt:p), which are often used in physiologically-based pharmacokinetics. Therefore, this work has assessed whether the data mining-based prediction of Vss can be made more accurate by using as input not only the compounds' molecular descriptors but also (a subset of) their predicted Kt:p values. Comparison of the models that used only molecular descriptors, in particular, the Bagging decision tree (mean fold error of 2.33), with those employing predicted Kt:p values in addition to the molecular descriptors, such as the Bagging decision tree using adipose Kt:p (mean fold error of 2.29), indicated that the use of predicted Kt:p values as descriptors may be beneficial for accurate prediction of Vss using decision trees if prior feature selection is applied. Decision tree based models presented in this work have an accuracy that is reasonable and similar to the accuracy of reported Vss inter-species extrapolations in the literature. The estimation of Vss for new compounds in drug discovery will benefit from methods that are able to integrate large and varied sources of data and flexible non-linear data mining methods such as decision trees, which can produce interpretable models. Graphical Abstract: Decision trees for the prediction of tissue partition coefficient and volume of distribution of drugs.
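
    A toy sketch of the Bagging variant discussed above, using scikit-learn (version 1.2+ for the estimator keyword) with random placeholder data in place of real molecular descriptors and predicted Kt:p values.

    # Bagging decision-tree regression of log10(Vss) on descriptors + adipose Kt:p.
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    X_desc = rng.normal(size=(200, 10))            # molecular descriptors (synthetic)
    ktp_adipose = rng.lognormal(size=(200, 1))     # predicted adipose Kt:p (synthetic)
    X = np.hstack([X_desc, np.log10(ktp_adipose)])
    y = 0.4 * X[:, 0] - 0.3 * X[:, 5] + 0.5 * X[:, -1] + rng.normal(0.0, 0.3, 200)

    model = BaggingRegressor(estimator=DecisionTreeRegressor(), n_estimators=100)
    print("CV R^2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())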

  13. Calculation of the fuel temperature coefficient of reactivity considering non-uniform radial temperature distribution in the fuel rod

    Energy Technology Data Exchange (ETDEWEB)

    Pazirandeh, Ali [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Science and Research Branch; Hooshyar Mobaraki, Almas

    2017-07-15

    The safe operation of a reactor relies on feedback models. In this paper we discuss the influence of a non-uniform radial temperature distribution in the fuel rod on the fuel temperature coefficient of reactivity. The paper demonstrates that neutron-physics calculations of a reactor core with feedback should be based on the effective fuel temperature in order to obtain the correct fuel temperature feedback; using the volume-averaged temperature instead would result in underestimating the probable event. Fuel temperature changes in different zones of the core, and the consequent change of the reactivity coefficient, are an important parameter for the analysis of transient conditions. The restricting factors that compensate the inserted reactivity are the fuel temperature coefficient of reactivity and the effective delayed neutron fraction.

  14. Limits of the memory coefficient in measuring correlated bursts

    Science.gov (United States)

    Jo, Hang-Hyun; Hiraoka, Takayuki

    2018-03-01

    Temporal inhomogeneities in event sequences of natural and social phenomena have been characterized in terms of interevent times and correlations between interevent times. The inhomogeneities of interevent times have been extensively studied, while the correlations between interevent times, often called correlated bursts, are far from being fully understood. For measuring the correlated bursts, two relevant approaches were suggested, i.e., memory coefficient and burst size distribution. Here a burst size denotes the number of events in a bursty train detected for a given time window. Empirical analyses have revealed that the larger memory coefficient tends to be associated with the heavier tail of the burst size distribution. In particular, empirical findings in human activities appear inconsistent, such that the memory coefficient is close to 0, while burst size distributions follow a power law. In order to comprehend these observations, by assuming the conditional independence between consecutive interevent times, we derive the analytical form of the memory coefficient as a function of parameters describing interevent time and burst size distributions. Our analytical result can explain the general tendency of the larger memory coefficient being associated with the heavier tail of burst size distribution. We also find that the apparently inconsistent observations in human activities are compatible with each other, indicating that the memory coefficient has limits to measure the correlated bursts.
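
    The memory coefficient referred to above (in the form introduced by Goh and Barabasi) is simple to compute; a minimal sketch:

    # Memory coefficient: Pearson correlation of consecutive interevent times.
    import numpy as np

    def memory_coefficient(tau):
        tau = np.asarray(tau, dtype=float)
        x, y = tau[:-1], tau[1:]
        return np.mean((x - x.mean()) * (y - y.mean())) / (x.std() * y.std())

    print(memory_coefficient([1.0, 2.0, 1.5, 8.0, 0.5, 0.7, 3.0]))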

  15. Probabilistic methods for seasonal forecasting in a changing climate: Cox-type regression models

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.

    2010-01-01

    For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative

  16. Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach

    Directory of Open Access Journals (Sweden)

    Kazimierz Banasik

    2014-04-01

    The variability of the curve number (CN) and the retention parameter (S) of the Soil Conservation Service (SCS)-CN method in a small agricultural, lowland watershed (23.4 km2 to the gauging station) in central Poland has been assessed using the probabilistic approach: distribution fitting and confidence intervals (CIs). Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures identified the generalized extreme value (GEV), normal and generalized logistic (GLO) distributions for 100-CN, and the GLO, lognormal and GEV distributions for S. The characteristics estimated from the theoretical distributions (median, quantiles) were compared to the tabulated CN and to the antecedent runoff conditions of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed a good agreement between the tabulated CN and the median, and between the antecedent runoff conditions (ARCs) of Hawkins and Hjelmfelt, which certified a good calibration of the model. However, the division of the CN sample according to heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis proves that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
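
    The empirical CNs mentioned above follow from inverting the SCS-CN runoff equation Q = (P - 0.2S)^2 / (P + 0.8S) for each rainfall-runoff event; a sketch using Hawkins' closed-form inversion (event values are illustrative):

    # Empirical curve number from a single rainfall-runoff pair (P, Q in mm).
    import numpy as np

    def empirical_cn(p_mm, q_mm):
        s = 5.0 * (p_mm + 2.0 * q_mm - np.sqrt(4.0 * q_mm**2 + 5.0 * p_mm * q_mm))
        return 25400.0 / (s + 254.0)

    print(empirical_cn(p_mm=45.0, q_mm=8.0))   # -> CN of about 76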

  17. Probabilistic modeling of the flows and environmental risks of nano-silica

    International Nuclear Information System (INIS)

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-01-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053–3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg·y in the EU (0.19–12 mg/kg·y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. - Highlights: • We quantify the exposure of nano-silica to technical systems and the environment. • The median concentration in surface waters is predicted to be 0.12 μg/L in the EU. • Probabilistic species sensitivity distributions were computed for surface waters. • The risk assessment suggests that nano-silica poses no risk to aquatic organisms.

  18. Chaotic oscillations of the Klein-Gordon equation with distributed energy pumping and van der Pol boundary regulation and distributed time-varying coefficients

    Directory of Open Access Journals (Sweden)

    Bo Sun

    2014-09-01

    Full Text Available We consider the Klein-Gordon equation with variable coefficients, a van der Pol cubic nonlinearity in one of the boundary conditions, and a spatially distributed antidamping term. Using a variable-substitution technique together with the analogy with the one-dimensional wave equation, we prove that chaos occurs for a class of equations and boundary conditions when the system parameters enter a certain regime. Chaotic and nonchaotic profiles of solutions are illustrated by computer graphics.

  19. Attention as Inference: Selection Is Probabilistic; Responses Are All-or-None Samples

    Science.gov (United States)

    Vul, Edward; Hanus, Deborah; Kanwisher, Nancy

    2009-01-01

    Theories of probabilistic cognition postulate that internal representations are made up of multiple simultaneously held hypotheses, each with its own probability of being correct (henceforth, "probability distributions"). However, subjects make discrete responses and report the phenomenal contents of their mind to be all-or-none states rather than…

  20. Trait-Dependent Biogeography: (Re)Integrating Biology into Probabilistic Historical Biogeographical Models.

    Science.gov (United States)

    Sukumaran, Jeet; Knowles, L Lacey

    2018-04-20

    The development of process-based probabilistic models for historical biogeography has transformed the field by grounding it in modern statistical hypothesis testing. However, most of these models abstract away biological differences, reducing species to interchangeable lineages. We present here the case for reintegration of biology into probabilistic historical biogeographical models, allowing a broader range of questions about biogeographical processes beyond ancestral range estimation or simple correlation between a trait and a distribution pattern, as well as allowing us to assess how inferences about ancestral ranges themselves might be impacted by differential biological traits. We show how new approaches to inference might cope with the computational challenges resulting from the increased complexity of these trait-based historical biogeographical models. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof…

  2. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  3. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In…

  4. Monitoring device for local power peaking coefficient

    International Nuclear Information System (INIS)

    Mitsuhashi, Ishi

    1987-01-01

    Purpose: To monitor local power peaking coefficients obtained by a method that does not depend on the combination of fuel types. Method: A plurality of representative values for the local power distribution, determined by the nuclear constant calculation for one fuel assembly, are stored for each burn-up degree and void coefficient at every position and fuel type in the fuel assemblies. These representative values are then adjusted by a compensation coefficient accounting for the effect of adjacent segments and by a control rod compensation coefficient accounting for the effect of control rod insertion. The maximum value among them is selected to determine the local power peaking coefficient at each time and each segment, which is then monitored. With this system, the calculations and fitting work that depend on the combination of fuel types are no longer required, which also facilitates maintenance. (Horiuchi, T.)

  5. Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)

    Science.gov (United States)

    Rahmani, E.; Hense, A.

    2017-12-01

    Are there deterministic relations between phenological and climate parameters? The answer is surely `No'. This answer motivated us to address the problem with probabilistic methods. Thus, we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to a statistical approach named survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered an event. By event, in this research we mean the ripening date of wheat, and we assume only one event in this special case. By time, we mean the growing duration from sowing to ripening, the lifetime of the wheat, which is a function of GDD. More precisely, we perform a probabilistic forecast of wheat ripening, with probability values between 0 and 1. Here, the survivor function gives the probability that the not-yet-ripened wheat survives longer than a specific time, or survives to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to the GDD as a function of growth duration. Verification of the models is done using the CRPS skill score (CRPSS). Positive CRPSS values indicate the clear superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
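
    A minimal Python sketch of the survival idea under the stated assumptions: a normal distribution is fitted to ripening GDD totals at one station, and the survivor function gives the probability that the crop is not yet ripe. The station values below are placeholders, not the study's data.

        import numpy as np
        from scipy import stats

        # observed ripening GDD totals at one station (placeholder values)
        gdd_ripe = np.array([1480.0, 1510.0, 1525.0, 1560.0, 1495.0, 1540.0])
        mu, sigma = stats.norm.fit(gdd_ripe)

        def survivor(gdd):
            # probability that the wheat is not yet ripe after
            # accumulating `gdd` growing degree days
            return stats.norm.sf(gdd, loc=mu, scale=sigma)

        print(survivor(1500.0))        # P(not ripe by 1500 GDD)
        print(1.0 - survivor(1550.0))  # P(ripe by 1550 GDD)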

  6. Avalanche breakdown and the probabilistic nature of laser failure

    Energy Technology Data Exchange (ETDEWEB)

    Mitropol' skii, M M; Khote' enkov, V A; Khodakov, G S

    1976-01-01

    A study was made of the probabilistic aspects of the development of an electron avalanche arising under the influence of a powerful laser beam in a solid transparent dielectric. The distribution function of time and the relative fluctuation of the number of electrons were found. The width of the probability function of failure was determined as a function of intensity. The relative dispersion of the time of onset of breakdown can also be determined; its numerical value under identical conditions is ±7%. These results are similar to the experimentally determined dispersion from an earlier work. The data also show that, in spite of the clearly probabilistic nature of the development of an avalanche, the small width of the distribution makes the use of the threshold criterion for rupture of transparent dielectrics by laser radiation practically correct. The dependence of I_0 on pulse length agrees with experimental data, whereas I_0(V) is significantly weaker than the actually observed value. This disagreement can be explained by various imperfections in the structure of the crystals and by their contamination, the frequency of appearance of which in the focal volume is proportional to its size, and which were not considered in the theoretical treatment developed here.

  7. Probabilistic Power Flow Method Considering Continuous and Discrete Variables

    Directory of Open Access Journals (Sweden)

    Xuexia Zhang

    2017-04-01

    Full Text Available This paper proposes a probabilistic power flow (PPF) method considering continuous and discrete variables (continuous and discrete power flow, CDPF) for power systems. The proposed method—based on the cumulant method (CM) and multiple deterministic power flow (MDPF) calculations—can deal with continuous variables such as wind power generation (WPG) and loads, and discrete variables such as fuel cell generation (FCG). In this paper, continuous variables follow a normal distribution (loads) or a non-normal distribution (WPG), and discrete variables follow a binomial distribution (FCG). Through testing on IEEE 14-bus and IEEE 118-bus power systems, the proposed method (CDPF) has better accuracy compared with the CM, and higher efficiency compared with the Monte Carlo simulation method (MCSM).

  8. Up-gradient transport in a probabilistic transport model

    DEFF Research Database (Denmark)

    Gavnholt, J.; Juul Rasmussen, J.; Garcia, O.E.

    2005-01-01

    The transport of particles or heat against the driving gradient is studied by employing a probabilistic transport model with a characteristic particle step length that depends on the local concentration or heat gradient. When this gradient is larger than a prescribed critical value, the standard… These results supplement recent works by van Milligen [Phys. Plasmas 11, 3787 (2004)], which applied Levy distributed step sizes in the case of supercritical gradients to obtain the up-gradient transport. (c) 2005 American Institute of Physics.

  9. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Directory of Open Access Journals (Sweden)

    Mengmeng Ma

    2015-01-01

    Full Text Available To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed, and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Analysis and comparison of the results show that the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.
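
    For reference, a compact Python sketch of classical Dempster's rule shows the high-conflict behavior the paper sets out to repair; the dictionary-of-frozensets encoding and the numeric example are our own illustrative choices.

        from itertools import product

        def dempster_combine(m1, m2):
            # Combine two basic belief assignments (BBAs), given as dicts
            # mapping frozenset hypotheses to masses, with Dempster's rule.
            # Assumes total conflict < 1.
            fused, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    fused[inter] = fused.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb                 # mass on the empty set
            return {h: w / (1.0 - conflict) for h, w in fused.items()}, conflict

        m1 = {frozenset("A"): 0.9, frozenset("B"): 0.1}
        m2 = {frozenset("A"): 0.1, frozenset("B"): 0.9}
        print(dempster_combine(m1, m2))   # conflict 0.82; both hypotheses fuse to 0.5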

  10. Probabilistic Amplitude Shaping With Hard Decision Decoding and Staircase Codes

    Science.gov (United States)

    Sheikh, Alireza; Amat, Alexandre Graell i.; Liva, Gianluigi; Steiner, Fabian

    2018-05-01

    We consider probabilistic amplitude shaping (PAS) as a means of increasing the spectral efficiency of fiber-optic communication systems. In contrast to previous works in the literature, we consider probabilistic shaping with hard decision decoding (HDD). In particular, we apply the PAS recently introduced by Böcherer et al. to a coded modulation (CM) scheme with bit-wise HDD that uses a staircase code as the forward error correction code. We show that the CM scheme with PAS and staircase codes yields significant gains in spectral efficiency with respect to the baseline scheme using a staircase code and a standard constellation with uniformly distributed signal points. Using a single staircase code, the proposed scheme achieves performance within 0.57–1.44 dB of the corresponding achievable information rate for a wide range of spectral efficiencies.

  11. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper 'DNA computing, sticker systems and universality' (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings of the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  12. Probabilistic Load-Flow Analysis of Biomass-Fuelled Gas Engines with Electrical Vehicles in Distribution Systems

    Directory of Open Access Journals (Sweden)

    Francisco J. Ruiz-Rodríguez

    2017-10-01

    Full Text Available Feeding biomass-fueled gas engines (BFGEs) with olive tree pruning residues offers new opportunities to decrease fossil fuel use in road vehicles and electricity generation. BFGEs, coupled to radial distribution systems (RDSs), provide renewable energy and power that can feed electric vehicle (EV) charging stations. However, the combined impact of BFGEs and EVs on RDSs must be assessed to assure the technical constraint fulfilment. Because of the stochastic nature of source/load, it was decided that a probabilistic approach was the most viable option for this assessment. Consequently, this research developed an analytical technique to evaluate the technical constraint fulfilment in RDSs with this combined interaction. The proposed analytical technique (PAT) involved the calculation of cumulants and the linearization of load-flow equations, along with the application of the cumulant method and the Cornish-Fisher expansion. The uncertainties related to biomass stock and its heating value (HV) were important factors that were assessed for the first time. Application of the PAT in a Spanish RDS with BFGEs and EVs confirmed the feasibility of the proposal and its additional benefits. Specifically, BFGEs were found to clearly contribute to the voltage constraint fulfilment. The computational cost of the PAT was lower than that associated with Monte-Carlo simulations (MCSs).
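
    A hedged sketch of the Cornish-Fisher step: given the first four cumulants of a linearized load-flow output, the expansion approximates any quantile. The function and the numeric cumulants below are illustrative assumptions, not the paper's implementation.

        import numpy as np
        from scipy import stats

        def cornish_fisher_quantile(p, mean, var, k3, k4):
            # Approximate the p-quantile of an output variable from its
            # first four cumulants via the Cornish-Fisher expansion.
            z = stats.norm.ppf(p)
            s = np.sqrt(var)
            g1, g2 = k3 / s**3, k4 / s**4          # skewness, excess kurtosis
            w = (z + (z**2 - 1.0) * g1 / 6.0
                   + (z**3 - 3.0 * z) * g2 / 24.0
                   - (2.0 * z**3 - 5.0 * z) * g1**2 / 36.0)
            return mean + s * w

        # for a linearised load flow y = sum(a_i * x_i) with independent inputs,
        # cumulants propagate as kappa_n(y) = sum(a_i**n * kappa_n(x_i))
        print(cornish_fisher_quantile(0.95, mean=1.0, var=0.04, k3=0.004, k4=0.001))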

  13. Variable-coefficient higher-order nonlinear Schroedinger model in optical fibers: Variable-coefficient bilinear form, Baecklund transformation, brightons and symbolic computation

    International Nuclear Information System (INIS)

    Tian Bo; Gao Yitian; Zhu Hongwu

    2007-01-01

    Symbolically investigated in this Letter is a variable-coefficient higher-order nonlinear Schroedinger (vcHNLS) model for ultrafast signal-routing, fiber laser systems and optical communication systems with distributed dispersion and nonlinearity management. Of physical and optical interest, with the bilinear method extended, the vcHNLS model is transformed into a variable-coefficient bilinear form, and an auto-Baecklund transformation is then constructed. Constraints on the coefficient functions are analyzed. Variable-coefficient brightons, potentially observable in future optical-fiber experiments, are illustrated. Relevant properties and features are discussed as well. The Baecklund transformation and other results of this Letter will be of certain value to studies on inhomogeneous fiber media, the core of dispersion-managed brightons, fiber amplifiers, laser systems and optical communication links with distributed dispersion and nonlinearity management.

  14. Probabilistic design of fibre concrete structures

    Science.gov (United States)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis makes it possible to realistically predict structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, with or without prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties, such as the shape of the tensile softening branch, high toughness and ductility, are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties obtained from material tests is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented…
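
    A minimal sketch of the randomization idea in Python: sample material inputs, evaluate a response, and estimate the failure probability and reliability index. The closed-form response surface below merely stands in for a nonlinear finite element run, and all distributions and numbers are placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 100_000
        f_t  = rng.lognormal(np.log(4.0), 0.15, n)   # tensile strength [MPa]
        g_f  = rng.lognormal(np.log(5.0), 0.25, n)   # fracture energy [N/mm]
        load = rng.normal(150.0, 20.0, n)            # load effect [kN]

        # stand-in response surface replacing the nonlinear FE model
        resistance = 60.0 * f_t + 2.0 * g_f
        pf = np.mean(resistance < load)              # failure probability
        beta = -stats.norm.ppf(pf)                   # reliability index
        print(pf, beta)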

  15. Probabilistically modeling lava flows with MOLASSES

    Science.gov (United States)

    Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.

    2017-12-01

    Modeling lava flows through Cellular Automata methods enables a computationally inexpensive means to quickly forecast lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results, and MOLASSES outputs a probability map of inundation instead of a single determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the Positive Predictive Value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.
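
    Both goodness-of-fit measures reduce to simple set overlaps on boolean inundation rasters. A Python sketch, with our own function name and toy grids:

        import numpy as np

        def fit_metrics(simulated, observed):
            # boolean inundation rasters of identical shape
            sim = np.asarray(simulated, dtype=bool)
            obs = np.asarray(observed, dtype=bool)
            hits = np.logical_and(sim, obs).sum()
            sensitivity = hits / obs.sum()  # share of the real flow reproduced
            ppv = hits / sim.sum()          # share of the simulated area truly inundated
            return sensitivity, ppv

        sim = np.array([[1, 1, 0], [1, 0, 0]], dtype=bool)
        obs = np.array([[1, 0, 0], [1, 1, 0]], dtype=bool)
        print(fit_metrics(sim, obs))        # (0.667, 0.667)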

  16. Probabilistic Capacity Assessment of Lattice Transmission Towers under Strong Wind

    Directory of Open Access Journals (Sweden)

    Wei eZhang

    2015-10-01

    Full Text Available Serving as one key component of the most important lifeline infrastructure system, transmission towers are vulnerable to multiple natural hazards, including strong wind, and could pose severe threats to power system security, with possible blackouts under extreme weather conditions such as hurricanes, derechos, or winter storms. For the security and resiliency of the power system, it is important to ensure structural safety with enough capacity for all possible failure modes, such as structural stability. This study develops a probabilistic capacity assessment approach for transmission towers under strong wind loads. Because of the complicated structural details of lattice transmission towers, wind tunnel experiments are carried out to understand the complex interactions of wind and the lattice sections of the transmission tower, and drag coefficients and the dynamic amplification factor are obtained for different panels of the tower. The wind profile is generated, and the wind time histories are simulated as a summation of time-varying mean and fluctuating components. The capacity curve for the transmission towers is obtained from the incremental dynamic analysis (IDA) method. To consider the stochastic nature of the wind field, probabilistic capacity curves are generated by implementing the IDA analysis for different wind yaw angles and different randomly generated wind speed time histories. After building the limit state functions based on the maximum allowable drift-to-height ratio, the probabilities of failure are obtained based on the meteorological data at a given site. As transmission towers serve as the key nodes of the power network, the probabilistic capacity curves can be incorporated into the performance-based design of the power transmission network.

  17. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  18. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height, which are varied for the blade, and the inner radius, outer radius, and thickness, which are varied for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  19. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Probabilistic Models

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine Brazilian Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods used for Research and Experimental Reactors”. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA and those implemented in MCNP by CNEA and IPEN. The data needed for these validations correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01, located in São Paulo, Brazil. On the Argentine side, the staff of the group Reactor and Nuclear Power Studies (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper, comparisons of calculated and experimental results are shown for critical configurations, temperature coefficients, kinetic parameters, and fission rate spatial distributions evaluated with probabilistic models. (author)

  20. Online Monitoring of Water-Quality Anomaly in Water Distribution Systems Based on Probabilistic Principal Component Analysis by UV-Vis Absorption Spectroscopy

    Directory of Open Access Journals (Sweden)

    Dibo Hou

    2014-01-01

    Full Text Available This study proposes a probabilistic principal component analysis- (PPCA-) based method for online monitoring of water-quality contamination events by UV-Vis (ultraviolet-visible) spectroscopy. The purpose of this method is to achieve fast and sound protection against accidental and intentional contaminant injection into the water distribution system. The method first imposes a sliding window onto simultaneously updated online monitoring data collected by the automated spectrometer. The PPCA algorithm is then executed to reduce the large amount of spectrum data while retaining the necessary spectral information to the largest extent. Finally, a monitoring chart widely employed in the fault diagnosis field is used to search for potential anomaly events and to determine whether the current water quality is normal or abnormal. A small-scale water-pipe distribution network is tested to detect water contamination events. The tests demonstrate that the PPCA-based online monitoring model can achieve satisfactory results in terms of the ROC curve, with a low false alarm rate and a high probability of detecting water contamination events.
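
    A simplified sketch of the monitoring loop, using scikit-learn's standard PCA as a stand-in for PPCA and a squared-prediction-error (SPE) control limit as the monitoring chart; the window size, component count and quantile limit are illustrative assumptions, not the paper's settings.

        import numpy as np
        from sklearn.decomposition import PCA

        def spe_alarm(window, new_spectrum, n_components=3, alpha=0.99):
            # Fit PCA on the sliding window of recent normal spectra, then
            # flag the new spectrum when its squared prediction error
            # exceeds an empirical control limit from the window.
            pca = PCA(n_components=n_components).fit(window)
            recon = pca.inverse_transform(pca.transform(window))
            limit = np.quantile(((window - recon) ** 2).sum(axis=1), alpha)
            r = pca.inverse_transform(pca.transform(new_spectrum[None, :]))[0]
            spe = ((new_spectrum - r) ** 2).sum()
            return spe > limit, spe, limit

        rng = np.random.default_rng(0)
        window = rng.normal(size=(200, 50))            # 200 spectra x 50 wavelengths
        print(spe_alarm(window, rng.normal(size=50) + 3.0))   # shifted spectrum -> alarm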

  1. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba…

  2. Sorption distribution coefficients of uranium, thorium and radium of selected Malaysian peat soils

    International Nuclear Information System (INIS)

    Mohd Zaidi Ibrahim; Zalina Laili; Muhamat Omar; Phillip, Esther

    2010-01-01

    A study on the sorption of uranium, thorium and radium on Malaysian peat soils was conducted to determine their distribution coefficient (K_d) values. Batch studies were performed to investigate the influence of pH and the concentrations of radionuclides. The peat soil samples used in this study were collected from Bachok, Batu Pahat, Dalat, Hutan Melintang and Pekan. Peat samples from different locations have different chemical characteristics and K_d values. No correlation was found between chemical characteristics and the K_d values for radium and thorium, but the K_d value for uranium was found to be correlated with humic and organic content. The K_d value was found to be influenced by soluble humic substances, or humic substances leached out from peat soils. (author)
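
    For reference, the batch-sorption distribution coefficient follows directly from the mass balance K_d = (C0 - Ce)/Ce * V/m, where C0 and Ce are the initial and equilibrium solution concentrations, V the solution volume and m the soil mass. A one-function Python sketch with placeholder numbers:

        def distribution_coefficient(c0, ce, volume_l, mass_kg):
            # batch-sorption K_d = (C0 - Ce) / Ce * V / m   [L/kg]
            return (c0 - ce) / ce * volume_l / mass_kg

        # e.g. 30 mL of solution on 1 g of peat, 87.5% of the uranium sorbed:
        print(distribution_coefficient(c0=100.0, ce=12.5,
                                       volume_l=0.030, mass_kg=0.001))  # 210 L/kg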

  3. Evaluating and categorizing the reliability of distribution coefficient values in the sorption database

    International Nuclear Information System (INIS)

    Ochs, Michael; Saito, Yoshihiko; Kitamura, Akira; Shibata, Masahiro; Sasamoto, Hiroshi; Yui, Mikazu

    2007-03-01

    The Japan Atomic Energy Agency (JAEA) has developed a sorption database (JNC-SDB) for bentonite and rocks in order to assess the retardation properties of important radioactive elements in natural and engineered barriers in the H12 report. The database includes distribution coefficients (K_d) of important radionuclides, comprising about 20,000 data points. The SDB includes a great variety of K_d values and additional key information from many different sources in the literature. Accordingly, a classification guideline and classification system were developed in order to evaluate the reliability of each K_d value (Th, Pa, U, Np, Pu, Am, Cm, Cs, Ra, Se, Tc on bentonite). The reliability of 3740 K_d values is evaluated and categorized. (author)

  4. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing.

    Science.gov (United States)

    Pajak, Bozena; Fine, Alex B; Kleinschmidt, Dave F; Jaeger, T Florian

    2016-12-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa.

  5. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    Science.gov (United States)

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a better understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.

  8. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic…

  9. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  10. Moment Distributions of Phase Type

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2011-01-01

    Moment distributions of phase-type and matrix-exponential distributions are shown to remain within their respective classes. We provide a probabilistic phase-type representation for the former case and an alternative representation, with an analytically appealing form, for the latter. First order...

  11. Performing Probabilistic Risk Assessment Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

    The Reactor Analysis and Virtual control ENviroment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: (i) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (ii) perform both Monte-Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; (iii) facilitate input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  12. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework…

  13. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  14. Very-short-term wind power probabilistic forecasts by sparse vector autoregression

    DEFF Research Database (Denmark)

    Dowell, Jethro; Pinson, Pierre

    2016-01-01

    A spatio-temporal method for producing very-short-term parametric probabilistic wind power forecasts at a large number of locations is presented. Smart grids containing tens, or hundreds, of wind generators require skilled very-short-term forecasts to operate effectively, and spatial information is highly desirable. In addition, probabilistic forecasts are widely regarded as necessary for optimal power system management as they quantify the uncertainty associated with point forecasts. Here we work within a parametric framework based on the logit-normal distribution and forecast its parameters. The location parameter for multiple wind farms is modelled as a vector-valued spatio-temporal process, and the scale parameter is tracked by modified exponential smoothing. A state-of-the-art technique for fitting sparse vector autoregressive models is employed to model the location parameter and demonstrates…

  15. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2016-02-11

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to the different characteristics of wind and solar power plants, the parameters of the probabilistic distributions are further adjusted individually for each. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider their impact. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  16. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r…

  17. Using the Gini Coefficient for Bug Prediction in Eclipse

    NARCIS (Netherlands)

    Giger, E.; Pinzger, M.; Gall, H.C.

    2011-01-01

    The Gini coefficient is a prominent measure to quantify the inequality of a distribution. It is often used in the field of economics to describe how goods, e.g., wealth or farmland, are distributed among people. We use the Gini coefficient to measure code ownership by investigating how changes made to…
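
    As a reminder of the measure itself, a small Python sketch of the Gini coefficient applied to per-developer change counts; the function and the toy counts are our own illustration, not the study's data.

        import numpy as np

        def gini(x):
            # Gini coefficient of a non-negative sample, e.g. the number of
            # changes each developer contributed to a file
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            cum = np.cumsum(x)
            return (n + 1.0 - 2.0 * np.sum(cum / cum[-1])) / n

        print(gini([1, 1, 1, 1]))    # 0.0  -> ownership spread evenly
        print(gini([0, 0, 0, 10]))   # 0.75 -> one developer owns the file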

  18. All-possible-couplings approach to measuring probabilistic context.

    Directory of Open Access Journals (Sweden)

    Ehtibar N Dzhafarov

    Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible) values of inputs.

  19. A probabilistic risk assessment for field radiography based on expert judgment and opinion

    International Nuclear Information System (INIS)

    Jang, Han-Ki; Ryu, Hyung-Joon; Kim, Ji-Young; Lee, Jai-Ki; Cho, Kun-Woo

    2011-01-01

    A probabilistic approach was applied to assess the radiation risk associated with field radiography using gamma sources. The Delphi method, based on expert judgments and opinions, was used to characterize parameters affecting risk, which are inevitably subject to large uncertainties. A mathematical approach applying Bayesian inference was employed for data processing to improve the Delphi results. This process consists of three phases: (1) setting prior distributions, (2) constructing the likelihood functions and (3) deriving the posterior distributions based on the likelihood functions. The approach characterizes input parameters using Bayesian inference to provide improved risk estimates without intentional rejection of part of the data, demonstrating the utility of Bayesian updating of distributions of uncertain input parameters in PRA (Probabilistic Risk Assessment). The data analysis portion for PRA in field radiography is addressed to estimate the parameters used to determine the frequencies and consequences of the various events modeled. In this study, radiological risks for workers and for members of the public in the vicinity of the workplace are estimated for the field radiography system in Korea based on two-dimensional Monte Carlo analysis (2D MCA). (author)
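
    A minimal sketch of the two-dimensional Monte Carlo idea, with an outer loop over parameter uncertainty and an inner loop over inter-individual variability; the toy dose model and all distributions below are placeholders, not the study's model.

        import numpy as np

        rng = np.random.default_rng(7)
        n_unc, n_var = 200, 1000          # outer: uncertainty, inner: variability

        doses = np.empty((n_unc, n_var))
        for i in range(n_unc):
            # outer loop: one draw from the (posterior) parameter uncertainty
            lam = rng.gamma(20.0, 1.0 / 10.0)       # exposure frequency [1/y]
            mu_t = rng.normal(5.0, 0.8)             # mean exposure time [min]
            # inner loop: inter-individual variability given those parameters
            t = rng.lognormal(np.log(max(mu_t, 0.1)), 0.4, n_var)
            doses[i] = lam * t * 0.01               # toy dose model [mSv/y]

        # 95th variability percentile with a 90% uncertainty band
        p95 = np.percentile(doses, 95, axis=1)
        print(np.percentile(p95, [5, 50, 95]))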

  20. Application of probabilistic methods for sizing of safety factors in studies on defect harm fullness

    International Nuclear Information System (INIS)

    Ardillon, E.; Pitner, P.

    1996-01-01

    The design rules currently applied in nuclear engineering recommend the use of deterministic analysis methods. Probabilistic methods allow the uncertainties inherent in the input variables of the analytical model to be taken into account, using data provided by operational feedback, so as to better evaluate the link between the deterministic margins adopted and the actual risk level. In the elementary Resistance R/Loading L case where the variables are Gaussian, there is an explicit relation between the required safety level and the partial safety coefficients which affect each variable. In the complex case of a flawed pipe subjected to various modes of failure, where many random variables are not Gaussian, one can obtain implicit relations. These relations allow a certain flexibility when choosing the coefficients, which poses the problem of their optimum calibration. The choice of coefficients based upon the coordinates of the "most probable failure point" illustrates this approach. (authors). 7 refs., 5 figs., 2 tabs
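
    In the Gaussian R/L case the link between the reliability level and the partial coefficients is indeed explicit. A short Python sketch, with illustrative means and standard deviations, computes the reliability index, the failure probability, the design point and the implied partial coefficients:

        import numpy as np
        from scipy import stats

        mu_r, sd_r = 300.0, 30.0          # resistance R ~ N(mu_r, sd_r)
        mu_l, sd_l = 200.0, 40.0          # loading    L ~ N(mu_l, sd_l)

        norm_g = np.hypot(sd_r, sd_l)
        beta = (mu_r - mu_l) / norm_g     # reliability index
        pf = stats.norm.cdf(-beta)        # failure probability P(R < L)

        # design point ("most probable failure point"), where R* = L*
        r_star = mu_r - beta * sd_r**2 / norm_g
        l_star = mu_l + beta * sd_l**2 / norm_g
        gamma_r = mu_r / r_star           # partial coefficient on resistance
        gamma_l = l_star / mu_l           # partial coefficient on loading
        print(beta, pf, gamma_r, gamma_l) # 2.0, 0.0228, 1.136, 1.32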

  1. Effects of structural nonlinearity and foundation sliding on probabilistic response of a nuclear structure

    International Nuclear Information System (INIS)

    Hashemi, Alidad; Elkhoraibi, Tarek; Ostadan, Farhang

    2015-01-01

    Highlights: • Probabilistic SSI analysis including structural nonlinearity and sliding are shown. • Analysis is done for a soil and a rock site and probabilistic demands are obtained. • Structural drift ratios and in-structure response spectra are evaluated. • Structural nonlinearity significantly impacts local demands in the structure. • Sliding generally reduces seismic demands and can be accommodated in design. - Abstract: This paper examines the effects of structural nonlinearity and foundation sliding on the results of probabilistic structural analysis of a typical nuclear structure where structural nonlinearity, foundation sliding and soil-structure interaction (SSI) are explicitly included. The evaluation is carried out for a soil and a rock site at 10^4-, 10^5-, and 10^6-year return periods (1E-4, 1E-5, and 1E-6 hazard levels, respectively). The input motions at each considered hazard level are deaggregated into low frequency (LF) and high frequency (HF) motions and a sample size of 30 is used for uncertainty propagation. The statistical distributions of structural responses, including story drifts and in-structure response spectra (ISRS), as well as foundation sliding displacements, are examined. The probabilistic implementation of explicit structural nonlinearity and foundation sliding in combination with the SSI effects is demonstrated using nonlinear response history analysis (RHA) of the structure with the foundation motions obtained from elastic SSI analyses, which are applied as input to fixed-base inelastic analyses. This approach quantifies the expected structural nonlinearity and sliding for the particular structural configuration and provides a robust analytical basis for the estimation of the probabilistic distribution of selected demand parameters both at the design-level and beyond-design-level seismic input. For the subject structure, the inclusion of foundation sliding in the analysis is found to have reduced both…

  2. Probabilistic forecasting of the solar irradiance with recursive ARMA and GARCH models

    DEFF Research Database (Denmark)

    David, M.; Ramahatana, F.; Trombe, Pierre-Julien

    2016-01-01

    Forecasting of the solar irradiance is a key feature in order to increase the penetration rate of solar energy into the energy grids. Indeed, the anticipation of the fluctuations of the solar renewables allows a better management of the production means of electricity and a better operation… sky index show some similarities with that of financial time series. The aim of this paper is to assess the performances of a commonly used combination of two linear models (ARMA and GARCH) in econometrics in order to provide probabilistic forecasts of solar irradiance. In addition, a recursive… regarding the statistical distribution of the error, the reliability of the probabilistic forecasts stands in the same order of magnitude as other works done in the field of solar forecasting…
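
    A rough sketch of the ARMA-GARCH combination using the Python arch package (assuming it fits the intended model; the synthetic clear-sky-index series and the model orders are placeholders). In the recursive setting described, the model would be re-fitted as each new observation arrives.

        import numpy as np
        from scipy import stats
        from arch import arch_model

        rng = np.random.default_rng(3)
        k = 0.75 + 0.08 * rng.standard_normal(500)   # placeholder clear-sky index

        # AR(2) mean with GARCH(1,1) variance; re-fit at each step in practice
        res = arch_model(k, mean="AR", lags=2, vol="GARCH", p=1, q=1).fit(disp="off")
        f = res.forecast(horizon=1)
        mu = f.mean.iloc[-1, 0]
        sd = float(np.sqrt(f.variance.iloc[-1, 0]))
        print(stats.norm.interval(0.90, loc=mu, scale=sd))  # 90% predictive interval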

  3. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. Many methods were developed in the framework of this program. As interest in these studies increased and as suitable methods became available, Electricite de France undertook a probabilistic safety assessment of a nuclear power plant.

  4. Probabilistic assessments of fuel performance

    International Nuclear Information System (INIS)

    Kelppe, S.; Ranta-Puska, K.

    1998-01-01

The probabilistic Monte Carlo method, coupled with quasi-random sampling, is applied to fuel performance analyses. By using known distributions of fabrication parameters and real power histories in randomly selected combinations, and by making a large number of ENIGMA code calculations, one can estimate the state of the whole reactor fuel. Good statistics require thousands of runs. A sample case representing VVER-440 reactor fuel indicates relatively low fuel temperatures and mainly athermal fission gas release, if any. The rod internal pressure remains typically below 2.5 MPa, which leaves a large margin to the system pressure of 12 MPa. Gap conductance, an essential parameter in the accident evaluations, shows no decrease from its start-of-life value. (orig.)
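
    A minimal sketch of the quasi-random sampling idea, using a scrambled Sobol sequence over assumed fabrication-parameter bounds; run_enigma() is a hypothetical stand-in for the actual ENIGMA fuel-performance code, and all bounds and coefficients are illustrative.

```python
# Quasi-random (Sobol) sampling of fuel fabrication parameters for repeated
# fuel-performance code runs; bounds and the surrogate model are assumptions.
import numpy as np
from scipy.stats import qmc

sampler = qmc.Sobol(d=3, scramble=True, seed=42)
unit = sampler.random_base2(m=10)            # 2**10 = 1024 low-discrepancy points

# Assumed bounds: gap size (um), enrichment (%), pellet density (g/cm3)
l_bounds = [10.0, 3.4, 10.3]
u_bounds = [25.0, 3.8, 10.6]
params = qmc.scale(unit, l_bounds, u_bounds)

def run_enigma(gap_um, enrich_pct, density):  # hypothetical stand-in for a code run
    return 900.0 + 8.0 * gap_um - 60.0 * (density - 10.3)  # fuel temperature, K

temps = np.array([run_enigma(*p) for p in params])
print(f"mean fuel temperature {temps.mean():.0f} K, "
      f"95th percentile {np.percentile(temps, 95):.0f} K")
```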

  5. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  6. Probabilistic fracture finite elements

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-05-01

Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles of mechanical and structural components. The Probabilistic Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach for handling problems with uncertainties. As the PFEM provides a powerful computational tool for determining the first and second moments of random parameters, the second moment reliability method can easily be combined with the PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed in experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.

  7. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress taken as a random field, are given.
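
    The perturbation machinery described above amounts to a Taylor expansion of the response u about the mean values of the random inputs x_i; to first order (notation assumed here, not quoted from the paper), the output statistics are

```latex
\mathbb{E}[u] \approx u(\boldsymbol{\mu}), \qquad
\operatorname{Var}[u] \approx \sum_{i}\sum_{j}
\left.\frac{\partial u}{\partial x_i}\right|_{\boldsymbol{\mu}}
\left.\frac{\partial u}{\partial x_j}\right|_{\boldsymbol{\mu}}
\operatorname{Cov}(x_i, x_j).
```

    The designer supplies the means μ and the covariances; the response sensitivities are computed within the finite element solver.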

  8. Assignment of functional activations to probabilistic cytoarchitectonic areas revisited.

    Science.gov (United States)

    Eickhoff, Simon B; Paus, Tomas; Caspers, Svenja; Grosbras, Marie-Helene; Evans, Alan C; Zilles, Karl; Amunts, Katrin

    2007-07-01

Probabilistic cytoarchitectonic maps in standard reference space provide a powerful tool for the analysis of structure-function relationships in the human brain. While these microstructurally defined maps have already been used successfully in the analysis of somatosensory, motor or language functions, several conceptual issues in the analysis of structure-function relationships still demand further clarification. In this paper, we demonstrate the principal approaches for anatomical localisation of functional activations based on probabilistic cytoarchitectonic maps by exemplary analysis of an anterior parietal activation evoked by visual presentation of hand gestures. After consideration of the conceptual basis and implementation of volume or local maxima labelling, we comment on some potential interpretational difficulties, limitations and caveats that could be encountered. Extending and supplementing these methods, we then propose an additional approach for the quantification of structure-function correspondences based on distribution analysis. This approach relates the cytoarchitectonic probabilities observed at a particular functionally defined location to the areal-specific null distribution of probabilities across the whole brain (i.e., the full probability map). Importantly, this method avoids the need for a unique classification of voxels to a single cortical area and may increase the comparability between results obtained for different areas. Moreover, as distribution-based labelling quantifies the "central tendency" of an activation with respect to anatomical areas, it will, in combination with the established methods, allow an advanced characterisation of the anatomical substrates of functional activations. Finally, the advantages and disadvantages of the various methods are discussed, focussing on the question of which approach is most appropriate for a particular situation.

  9. Random matrix theory analysis of cross-correlations in the US stock market: Evidence from Pearson’s correlation coefficient and detrended cross-correlation coefficient

    Science.gov (United States)

    Wang, Gang-Jin; Xie, Chi; Chen, Shou; Yang, Jiao-Jiao; Yang, Ming-Yan

    2013-09-01

In this study, we first build two empirical cross-correlation matrices for the US stock market by two different methods, namely the Pearson's correlation coefficient and the detrended cross-correlation coefficient (DCCA coefficient). Then, combining the two matrices with the method of random matrix theory (RMT), we investigate the statistical properties of cross-correlations in the US stock market. We choose the daily closing prices of 462 constituent stocks of the S&P 500 index as the research objects and select the sample data from January 3, 2005 to August 31, 2012. In the empirical analysis, we examine the statistical properties of the cross-correlation coefficients, the distribution of eigenvalues, the distribution of eigenvector components, and the inverse participation ratio. From the two methods, we find some new results on the cross-correlations in the US stock market, which differ from the conclusions reached by previous studies. The empirical cross-correlation matrices constructed by the DCCA coefficient show several interesting properties at different time scales in the US stock market, which are useful for risk management and optimal portfolio selection, especially for the diversity of the asset portfolio. Finding the theoretical eigenvalue distribution of a completely random matrix R for the DCCA coefficient remains an interesting and meaningful task, because it does not obey the Marčenko-Pastur distribution.
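
    For the Pearson branch, the RMT null model is easy to reproduce: the sketch below builds a correlation matrix from pure noise of the study's cross-sectional dimension and checks its eigenvalues against the Marčenko-Pastur bounds (the sample length here is an illustrative assumption).

```python
# Compare empirical correlation-matrix eigenvalues with the Marcenko-Pastur
# bounds that hold for a purely random (i.i.d.) return matrix.
import numpy as np

rng = np.random.default_rng(1)
N, T = 462, 2000                       # stocks, observation length (T illustrative)
returns = rng.standard_normal((T, N))  # i.i.d. noise: the RMT null hypothesis

C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

Q = T / N
lam_min = (1 - np.sqrt(1 / Q)) ** 2    # Marcenko-Pastur lower bound (sigma^2 = 1)
lam_max = (1 + np.sqrt(1 / Q)) ** 2    # Marcenko-Pastur upper bound

outside = np.sum((eigvals < lam_min) | (eigvals > lam_max))
print(f"MP bounds [{lam_min:.3f}, {lam_max:.3f}]; eigenvalues outside: {outside}")
```

    For real returns, eigenvalues escaping these bounds carry genuine cross-correlation information; no analogous closed-form null is currently known for the DCCA coefficient matrix, which is the open problem the abstract points out.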

  10. Resummed coefficient function for the shape function

    OpenAIRE

    Aglietti, U.

    2001-01-01

    We present a leading evaluation of the resummed coefficient function for the shape function. It is also shown that the coefficient function is short-distance-dominated. Our results allow relating the shape function computed on the lattice to the physical QCD distributions.

  11. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  12. Correlation Coefficients: Appropriate Use and Interpretation.

    Science.gov (United States)

    Schober, Patrick; Boer, Christa; Schwarte, Lothar A

    2018-05-01

    Correlation in the broadest sense is a measure of an association between variables. In correlated data, the change in the magnitude of 1 variable is associated with a change in the magnitude of another variable, either in the same (positive correlation) or in the opposite (negative correlation) direction. Most often, the term correlation is used in the context of a linear relationship between 2 continuous variables and expressed as Pearson product-moment correlation. The Pearson correlation coefficient is typically used for jointly normally distributed data (data that follow a bivariate normal distribution). For nonnormally distributed continuous data, for ordinal data, or for data with relevant outliers, a Spearman rank correlation can be used as a measure of a monotonic association. Both correlation coefficients are scaled such that they range from -1 to +1, where 0 indicates that there is no linear or monotonic association, and the relationship gets stronger and ultimately approaches a straight line (Pearson correlation) or a constantly increasing or decreasing curve (Spearman correlation) as the coefficient approaches an absolute value of 1. Hypothesis tests and confidence intervals can be used to address the statistical significance of the results and to estimate the strength of the relationship in the population from which the data were sampled. The aim of this tutorial is to guide researchers and clinicians in the appropriate use and interpretation of correlation coefficients.
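
    A short illustration of the distinction drawn above, on synthetic data where the association is monotonic but strongly nonlinear, so the two coefficients diverge:

```python
# Pearson vs Spearman on a monotonic but nonlinear relationship.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(7)
x = rng.uniform(0, 3, 100)
y = np.exp(x) + rng.normal(0, 1, 100)   # monotonic, but far from linear

r, p_r = pearsonr(x, y)
rho, p_rho = spearmanr(x, y)
print(f"Pearson r    = {r:.2f} (p = {p_r:.1e})")    # dampened by the curvature
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.1e})")  # captures the monotonic link
```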

  13. Probabilistic methods for evaluation of erosion-corrosion wall thinning in french pressurized water reactors

    International Nuclear Information System (INIS)

    Ardillon, E.; Bouchacourt, M.

    1994-04-01

This paper describes the application of the probabilistic approach to a selected study section with known characteristics. The method is based on the physico-chemical model of erosion-corrosion, whose variables are probabilized. The three main aspects of the model, namely the thermohydraulic flow conditions, the chemistry of the fluid, and the geometry of the installation, are described. The study ultimately makes it possible to determine: - the evolution of the wall thinning distribution, using the power station's measurements; - the main parameters influencing the kinetics of wall thinning; - the evolution of the failure probability of the pipe in question. (authors). 10 figs., 7 refs

  14. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    International Nuclear Information System (INIS)

    Elkhoraibi, T.; Hashemi, A.; Ostadan, F.

    2014-01-01

    input motions obtained from Probabilistic Seismic Hazard Analysis (PSHA) and the site response analysis is conducted with simulated soil profiles and accompanying soil nonlinearity curves. The deterministic approach utilizes three strain-compatible soil profiles (Lower Bound (LB), Best Estimate (BE) and Upper Bound (UB)) determined based on the variation of strain-compatible soil profiles obtained from the probabilistic site response analysis and uses SSI analysis to determine a conservative estimate of the required response as the envelope of the SSI results from LB, BE and UB soil cases. In contrast, the probabilistic SSI analysis propagates the uncertainty in the soil and structural properties and provides rigorous estimates for the statistical distribution of the response parameters of interest. The engineering demand parameters considered are the story drifts and ISRS at key locations in the example structure. The results from the deterministic and probabilistic approaches, with and without ground motion incoherency effects, are compared and discussed. Recommendations are made regarding the efficient use of statistical methods in probabilistic SSI analysis and the use of such results in Integrated Soil-Structure Fragility Analysis (ISSFA) and performance-based design

  15. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    Energy Technology Data Exchange (ETDEWEB)

    Elkhoraibi, T., E-mail: telkhora@bechtel.com; Hashemi, A.; Ostadan, F.

    2014-04-01

    input motions obtained from Probabilistic Seismic Hazard Analysis (PSHA) and the site response analysis is conducted with simulated soil profiles and accompanying soil nonlinearity curves. The deterministic approach utilizes three strain-compatible soil profiles (Lower Bound (LB), Best Estimate (BE) and Upper Bound (UB)) determined based on the variation of strain-compatible soil profiles obtained from the probabilistic site response analysis and uses SSI analysis to determine a conservative estimate of the required response as the envelope of the SSI results from LB, BE and UB soil cases. In contrast, the probabilistic SSI analysis propagates the uncertainty in the soil and structural properties and provides rigorous estimates for the statistical distribution of the response parameters of interest. The engineering demand parameters considered are the story drifts and ISRS at key locations in the example structure. The results from the deterministic and probabilistic approaches, with and without ground motion incoherency effects, are compared and discussed. Recommendations are made regarding the efficient use of statistical methods in probabilistic SSI analysis and the use of such results in Integrated Soil-Structure Fragility Analysis (ISSFA) and performance-based design.

  16. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    Science.gov (United States)

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot take full advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables, according to its nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  17. A study on the weather sampling method for probabilistic consequence analysis

    International Nuclear Information System (INIS)

    Oh, Hae Cheol

    1996-02-01

The main task of a probabilistic accident consequence analysis model is to predict the radiological situation and to provide a reliable quantitative data base for making decisions on countermeasures. The magnitude of the accident consequences depends on the characteristics of the accident and the coincident weather. In probabilistic accident consequence analysis, it is necessary to repeat the atmospheric dispersion calculation with several hundred weather sequences to predict the full distribution of consequences which may occur following a postulated accidental release. It is desirable to select a representative sample of weather sequences from a meteorological record which is typical of the area over which the released radionuclides will disperse and which spans a sufficiently long period. The selection is done by means of sampling techniques applied to a full year of hourly weather data characteristic of the plant site. The weighted importance sampling method proposed in this study selects weather sequences in proportion to each bin size, to closely approximate the true frequency distribution of weather conditions at the site. The weighted importance sampling method results in substantially less sampling uncertainty than the previous technique, and can thus improve confidence in risk estimates.
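
    A minimal sketch of bin-proportional selection in the spirit of the weighted importance sampling described above; the bin definitions (stability class and rain flag) and the sample size are illustrative assumptions, not the study's binning scheme.

```python
# Bin-proportional (stratified) sampling of hourly weather sequences.
import numpy as np

rng = np.random.default_rng(3)
n_sequences = 8760                            # one start hour per hour of the year
stability = rng.integers(0, 6, n_sequences)   # Pasquill class A-F per sequence
rain = rng.random(n_sequences) < 0.1          # rain occurring during the sequence

# Bin each weather sequence by (stability class, rain flag)
bins = stability * 2 + rain.astype(int)
n_samples = 144
chosen = []
for b in np.unique(bins):
    members = np.flatnonzero(bins == b)
    quota = max(1, round(n_samples * len(members) / n_sequences))  # proportional
    chosen.extend(rng.choice(members, size=min(quota, len(members)), replace=False))

print(f"{len(chosen)} sequences sampled across {np.unique(bins).size} bins")
```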

  18. Supporting Keyword Search for Image Retrieval with Integration of Probabilistic Annotation

    Directory of Open Access Journals (Sweden)

    Tie Hua Zhou

    2015-05-01

The ever-increasing quantities of digital photo resources are annotated with enriching vocabularies to form semantic annotations. Photo-sharing social networks have boosted the need for efficient and intuitive querying to respond to user requirements in large-scale image collections. In order to help users formulate efficient and effective image retrieval, we present a novel integration of a probabilistic model into a keyword-query architecture that models the probability distribution of image annotations, allowing users to obtain satisfactory results from image retrieval via the integration of multiple annotations. We focus on the annotation integration step in order to specify the meaning of each image annotation, thus leading to the most representative annotations of the intent of a keyword search. For this demonstration, we show how a probabilistic model has been integrated with semantic annotations to allow users to intuitively define explicit and precise keyword queries in order to retrieve satisfactory image results distributed across heterogeneous large data sources. Our experiments on the SBU database (collected by Stony Brook University) show that (i) our integrated annotation contains higher-quality representatives and semantic matches; and (ii) annotation integration can indeed improve image search result quality.

  19. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants. It is a complement to the safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new, age-dependent probabilistic safety assessment, in which the failure rate generally becomes a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of the ageing effects, are developed. Several groups of components require their own unique models, e.g. operating components and stand-by components. The developed component-level models are inserted into the probabilistic safety assessment models so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of data needed to consider ageing makes the models, and consequently the results, highly uncertain. (author)
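
    A common way to introduce age dependence (the linear ageing model, a standard choice in the ageing-PSA literature and assumed here rather than quoted from the paper) replaces the constant failure rate λ0 with

```latex
\lambda(t) = \lambda_0 + \alpha t, \qquad
\bar{q} \approx \frac{\lambda(t)\, T_i}{2},
```

    where α is the ageing rate and the second expression is the resulting mean unavailability of a periodically tested stand-by component with surveillance test interval T_i.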

Study of the influence of the mineralogical composition of the soil of Goiania, Brazil, on the distribution coefficient of 137Cs

    International Nuclear Information System (INIS)

    Anon

    2000-01-01

Interactions between a soil and a solute in a liquid effluent depend on the characteristics of the solute as well as on the nature of the soil. The soil distribution coefficient Kd in unaltered natural systems can be estimated by summing the Kd's of its individual mineral components. In principle, this approach allows an evaluation of the distribution coefficient on the basis of the qualitative and quantitative identification of the mineral constituents of a given soil. The work reported here aimed at gathering preliminary data and at pointing out the relevance of this approach in studies of radionuclide sorption by soils. A sample of the Goiania soil was collected and analyzed by X-ray diffractometry. The mineral components thus identified were subsequently submitted to individual batch tests, using 137Cs as tracer. The mineral species tested were: kaolinite, gibbsite, goethite, hematite, muscovite, quartz and zirconite. The values obtained for Kd are presented and discussed. (author)
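
    In symbols (notation assumed here, consistent with the usual definition), the two quantities the abstract relies on are the distribution coefficient itself and its mineral-additivity estimate:

```latex
K_d = \frac{C_{\mathrm{solid}}}{C_{\mathrm{liquid}}},
\qquad
K_d^{\mathrm{soil}} \approx \sum_i f_i \, K_{d,i},
```

    where C_solid and C_liquid are the sorbed and dissolved tracer concentrations at equilibrium, f_i is the mass fraction of mineral i in the soil, and K_{d,i} its individually measured coefficient.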

  1. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in the design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues, and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. an aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  2. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

support vector machines. Building on the success of statistical EVT-based recognition methods such as PI-SVM and W-SVM on the open set problem, we present a new general supervised learning algorithm for multi-class classification and multi-class open set recognition called the Extreme Value Local Basis (EVLB). The design of this algorithm is motivated by the observation that extrema from known negative class distributions are the closest negative points to any positive sample during training, and thus should be used to define the parameters of a probabilistic decision model. In the EVLB, the kernel distribution for each positive training sample is estimated via an EVT distribution fit over the distances to the separating hyperplane between the positive training sample and the closest negative samples, with a subset of the overall positive training data retained to form a probabilistic decision boundary. Using this subset as a frame of reference, the probability of a sample at test time decreases as it moves away from the positive class. Possessing this property, the EVLB is well suited to open set recognition problems where samples from unknown or novel classes are encountered at test time. Our experimental evaluation shows that the EVLB provides a substantial improvement in scalability compared to standard radial basis function kernel machines, as well as to PI-SVM and W-SVM, with improved accuracy in many cases. We evaluate our algorithm on open set variations of the standard visual learning benchmarks, as well as with an open subset of classes from Caltech 256 and ImageNet. Our experiments show that PI-SVM, W-SVM and EVLB provide significant advances over the previous state-of-the-art solutions for the same tasks.

  3. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  4. An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2015-09-01

Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the great economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and to make decisions that satisfy the needs of all the stakeholders of the electricity market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods published in the relevant literature provide deterministic forecasts, even though great interest has recently been focused on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as the probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving-average model was used to determine the parameters of the mixture Weibull distribution. Numerical applications are also presented to provide evidence of the forecasting performance of the Bayesian-based approach.
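
    A minimal sketch of the two-component Weibull mixture used above as the wind-speed probability model; the mixture weight and shape/scale parameters are illustrative assumptions, not the paper's Bayesian posterior values.

```python
# Two-component Weibull mixture: density evaluation and sampling.
import numpy as np
from scipy.stats import weibull_min

w = 0.6                                  # weight of the first component (assumed)
c1, scale1 = 2.0, 6.0                    # shape and scale, component 1 (m/s)
c2, scale2 = 3.5, 11.0                   # shape and scale, component 2 (m/s)

def mixture_pdf(v):
    return (w * weibull_min.pdf(v, c1, scale=scale1)
            + (1 - w) * weibull_min.pdf(v, c2, scale=scale2))

# Sampling from the mixture: pick a component, then draw from it
rng = np.random.default_rng(5)
comp = rng.random(10_000) < w
v = np.where(comp,
             weibull_min.rvs(c1, scale=scale1, size=10_000, random_state=rng),
             weibull_min.rvs(c2, scale=scale2, size=10_000, random_state=rng))
print(f"mean simulated wind speed: {v.mean():.2f} m/s")
```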

  5. Performance of intraclass correlation coefficient (ICC) as a reliability index under various distributions in scale reliability studies.

    Science.gov (United States)

    Mehta, Shraddha; Bastero-Caballero, Rowena F; Sun, Yijun; Zhu, Ray; Murphy, Diane K; Hardas, Bhushan; Koch, Gary

    2018-04-29

Many published scale validation studies determine inter-rater reliability using the intra-class correlation coefficient (ICC). However, the use of this statistic must consider its advantages, limitations, and applicability. This paper evaluates how the interaction of subject distribution, sample size, and level of rater disagreement affects the ICC, and provides an approach for obtaining relevant ICC estimates under suboptimal conditions. Simulation results suggest that for a fixed number of subjects, the ICC from a convex subject distribution is smaller than the ICC for a uniform distribution, which in turn is smaller than the ICC for a concave distribution. The variance component estimates also show that the dissimilarity of ICC among distributions is attributable to the study design (i.e., distribution of subjects) component of subject variability and not to the scale quality component of rater error variability. The dependency of the ICC on the distribution of subjects makes it difficult to compare results across reliability studies. Hence, it is proposed that reliability studies be designed using a uniform distribution of subjects because of the standardization it provides for representing objective disagreement. In the absence of a uniform distribution, a sampling method is proposed to reduce the non-uniformity. In addition, as expected, high levels of disagreement result in low ICC, and when the type of distribution is fixed, any increase in the number of subjects beyond a moderately large specification such as n = 80 does not have a major impact on the ICC. Copyright © 2018 John Wiley & Sons, Ltd.
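
    A minimal sketch of the one-way random-effects ICC discussed above, computed from a subjects-by-raters score matrix using the Shrout-Fleiss ICC(1,1) formula; the simulation settings (rater error of 1 point, uniform subjects) are illustrative assumptions.

```python
# One-way random-effects ICC(1,1) from a subjects x raters score matrix.
import numpy as np

rng = np.random.default_rng(11)
n, k = 80, 3                                   # subjects, raters
true_score = rng.uniform(0, 10, size=(n, 1))   # uniform subject distribution
scores = true_score + rng.normal(0, 1.0, size=(n, k))  # additive rater error

grand = scores.mean()
row_means = scores.mean(axis=1)
ms_between = k * np.sum((row_means - grand) ** 2) / (n - 1)    # between-subject MS
ms_within = np.sum((scores - row_means[:, None]) ** 2) / (n * (k - 1))  # within MS

icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1,1) = {icc:.3f}")
```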

  6. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  7. Radionuclide adsorption distribution coefficients measured in Hanford sediments for the low level waste performance assessment project

    International Nuclear Information System (INIS)

    Kaplan, D.I.; Serne, R.J.; Owen, A.T.

    1996-08-01

Preliminary modeling efforts for the Hanford Site's Low Level Waste-Performance Assessment (LLW PA) identified 129I, 237Np, 79Se, 99Tc, and 234,235,238U as posing the greatest potential health hazard. It was also determined that the outcome of these simulations was very sensitive to the parameter describing the extent to which radionuclides sorb to the subsurface matrix, i.e., the distribution coefficient (Kd). The distribution coefficient is the ratio of the radionuclide concentration associated with the solid phase to that in the liquid phase. The objectives of this study were to (1) measure iodine, neptunium, technetium, and uranium Kd values using laboratory conditions similar to those expected at the LLW PA disposal site, and (2) evaluate the effect of selected environmental parameters, such as pH, ionic strength, moisture concentration, and radionuclide concentration, on the Kd values of selected radionuclides. It is the intent of these studies to develop technically defensible Kd values for the PA. The approach taken throughout was to measure the key radionuclide Kd values as a function of several environmental parameters likely to affect them. Such an approach provides technical defensibility by identifying the mechanisms responsible for trends in Kd values. Additionally, such studies provide valuable guidance regarding the range of Kd values likely to be encountered at the proposed disposal site.
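
    Kd values in batch experiments of this kind are typically computed from the depletion of tracer activity in solution; a minimal sketch with illustrative numbers (not Hanford data):

```python
# Batch-test distribution coefficient from measured solution activities:
# Kd = (C0 - Ceq) / Ceq * (V / m), in mL/g. All values are illustrative.

C0 = 1000.0    # initial tracer activity in solution (counts/mL)
Ceq = 850.0    # activity remaining in solution at equilibrium (counts/mL)
V = 30.0       # solution volume (mL)
m = 1.0        # sediment mass (g)

kd = (C0 - Ceq) / Ceq * (V / m)
print(f"Kd = {kd:.1f} mL/g")   # a low Kd, as reported for weakly sorbing nuclides
```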

  8. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  9. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  10. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  11. Free vibration analysis of straight-line beam regarded as distributed system by combining Wittrick-Williams algorithm and transfer dynamic stiffness coefficient method

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Myung Soo; Yang, Kyong Uk [Chonnam National University, Yeosu (Korea, Republic of); Kondou, Takahiro [Kyushu University, Fukuoka (Japan); Bonkobara, Yasuhiro [University of Miyazaki, Miyazaki (Japan)

    2016-03-15

    We developed a method for analyzing the free vibration of a structure regarded as a distributed system, by combining the Wittrick-Williams algorithm and the transfer dynamic stiffness coefficient method. A computational algorithm was formulated for analyzing the free vibration of a straight-line beam regarded as a distributed system, to explain the concept of the developed method. To verify the effectiveness of the developed method, the natural frequencies of straight-line beams were computed using the finite element method, transfer matrix method, transfer dynamic stiffness coefficient method, the exact solution, and the developed method. By comparing the computational results of the developed method with those of the other methods, we confirmed that the developed method exhibited superior performance over the other methods in terms of computational accuracy, cost and user convenience.
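
    For context, the closed-form distributed-system solution that such methods are benchmarked against exists for simple cases; for a uniform simply supported Euler-Bernoulli beam (a standard result, assumed here rather than quoted from the paper) the natural frequencies are

```latex
\omega_n = \left(\frac{n\pi}{L}\right)^{2}\sqrt{\frac{EI}{\rho A}},
\qquad n = 1, 2, 3, \dots,
```

    where EI is the bending stiffness, ρA the mass per unit length, and L the span; a numerical method is judged by how closely and cheaply it reproduces this spectrum.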

  12. Uncertainty propagation in probabilistic risk assessment: A comparative study

    International Nuclear Information System (INIS)

    Ahmed, S.; Metcalf, D.R.; Pegram, J.W.

    1982-01-01

Three uncertainty propagation techniques, namely the method of moments, discrete probability distribution (DPD), and Monte Carlo simulation, generally used in probabilistic risk assessment, are compared and conclusions drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate method is to propagate uncertainty in discrete form, either by the DPD method without sampling or by Monte Carlo. (orig.)
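
    The paper's central contrast can be reproduced in a few lines: for wide lognormal inputs, summarizing the output by its first two moments (plus a normal assumption) misstates the upper tail that Monte Carlo captures directly. The distribution parameters below are illustrative.

```python
# Method-of-moments vs Monte Carlo propagation for the sum of two wide
# lognormal basic-event unavailabilities.
import numpy as np

rng = np.random.default_rng(13)
mu, s = np.log(1e-3), 1.5                        # both events: wide lognormals

# Method of moments: exact mean/variance of the sum, then a normal assumption
m1 = np.exp(mu + s**2 / 2)                       # lognormal mean
v1 = (np.exp(s**2) - 1) * np.exp(2 * mu + s**2)  # lognormal variance
mean_sum, var_sum = 2 * m1, 2 * v1
p95_moments = mean_sum + 1.645 * np.sqrt(var_sum)

# Monte Carlo: empirical distribution of the sum
q = rng.lognormal(mu, s, 200_000) + rng.lognormal(mu, s, 200_000)
print(f"95th percentile, moments + normal: {p95_moments:.2e}")
print(f"95th percentile, Monte Carlo     : {np.percentile(q, 95):.2e}")
```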

  13. Development of RESRAD probabilistic computer codes for NRC decommissioning and license termination applications

    International Nuclear Information System (INIS)

    Chen, S. Y.; Yu, C.; Mo, T.; Trottier, C.

    2000-01-01

In 1999, the US Nuclear Regulatory Commission (NRC) tasked Argonne National Laboratory with modifying the existing RESRAD and RESRAD-BUILD codes to perform probabilistic, site-specific dose analysis for use with the NRC's Standard Review Plan for demonstrating compliance with the license termination rule. The RESRAD codes were developed by Argonne to support the US Department of Energy's (DOE's) cleanup efforts. Through more than a decade of application, the codes have already established a large user base in the nation and rigorous QA support. The primary objectives of the NRC task are to: (1) extend the codes' capabilities to include probabilistic analysis, and (2) develop parameter distribution functions and perform probabilistic analysis with the codes. The new codes also contain user-friendly features specially designed around a graphical user interface. In October 2000, the revised RESRAD (version 6.0) and RESRAD-BUILD (version 3.0), together with the user's guide and relevant parameter information, were completed and made available to the general public via the Internet.

  14. Probabilistic model for fatigue crack growth and fracture of welded joints in civil engineering structures

    NARCIS (Netherlands)

    Maljaars, J.; Steenbergen, H.M.G.M.; Vrouwenvelder, A.C.W.M.

    2012-01-01

    This paper presents a probabilistic assessment model for linear elastic fracture mechanics (LEFM). The model allows the determination of the failure probability of a structure subjected to fatigue loading. The distributions of the random variables for civil engineering structures are provided, and
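
    The crack-growth kernel of such an LEFM fatigue model is typically the Paris-Erdogan relation (a standard assumption; the paper's exact formulation may differ):

```latex
\frac{da}{dN} = C\,(\Delta K)^{m},
\qquad
\Delta K = Y(a)\,\Delta\sigma\,\sqrt{\pi a},
```

    with random material constants C and m, stress range Δσ, and geometry factor Y(a); the failure probability then follows from the joint distributions of these variables and the initial flaw size.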

  15. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  16. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  17. Solvent Extraction Batch Distribution Coefficients with Savannah River Site Dissolved Salt Cake

    International Nuclear Information System (INIS)

    Walker, D.D.

    2002-01-01

Researchers characterized high-level waste derived from dissolved salt cake from the Savannah River Site (SRS) tank farm and measured the cesium distribution coefficients, D(Cs), for the extraction, scrub, and stripping steps of the caustic-side solvent extraction (CSSX) flowsheet. The measurements used two SRS high-level waste samples derived entirely or in part from salt cake. The chemical compositions of both samples are reported. Dissolved salt cake waste contained less Cs-137 and more dianions than is typical of supernate samples. Extraction and scrub D(Cs) values for both samples exceeded process requirements and agreed well with model predictions. Strip D(Cs) values for the Tank 46F sample also met process requirements. However, strip D(Cs) values could not be calculated for the Tank 38H sample due to the poor material balance for Cs-137. Potential explanations for the poor material balance are discussed and additional work to determine the cause is described.

  18. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  19. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  20. Probabilistic prediction of expected ground condition and construction time and costs in road tunnels

    Directory of Open Access Journals (Sweden)

    A. Mahmoodzadeh

    2016-10-01

Ground condition and construction (excavation and support) time and costs are the key factors in decision-making during the planning and design phases of a tunnel project. An innovative methodology for the probabilistic estimation of ground condition and construction time and costs is proposed, which integrates a ground prediction approach based on a Markov process with time and cost variance analysis based on Monte Carlo (MC) simulation. The former provides a probabilistic description of the ground classification along the tunnel alignment according to the geological information revealed by the geological profile and boreholes. The latter provides a probabilistic description of the expected construction time and costs for each operation according to survey feedback from experts. An engineering application to the Hamro tunnel is then presented to demonstrate how the ground condition and the construction time and costs are estimated in a probabilistic way. For most items, in order to estimate the data needed for this methodology, a number of questionnaires were distributed among tunneling experts and the mean values of the responses were applied. These results make both the owners and the contractors aware of the risks they carry before construction, and are useful for both tendering and bidding.
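
    A minimal sketch combining the two ingredients named above: a Markov chain over three assumed ground classes along the alignment, and Monte Carlo sampling of per-metre advance time. The transition probabilities and time statistics are illustrative assumptions, not values from the Hamro tunnel study.

```python
# Markov ground-class prediction + Monte Carlo construction-time simulation.
import numpy as np

rng = np.random.default_rng(17)
P = np.array([[0.85, 0.12, 0.03],      # ground-class transition matrix
              [0.10, 0.80, 0.10],      # (rows: current class, cols: next class)
              [0.05, 0.15, 0.80]])
time_per_m = [(0.5, 0.1), (1.0, 0.2), (2.5, 0.5)]   # mean, sd of hours/m per class

n_runs, length = 500, 1000             # MC runs, tunnel length in 1 m steps
totals = np.empty(n_runs)
for r in range(n_runs):
    state, t = 0, 0.0
    for _ in range(length):
        state = rng.choice(3, p=P[state])          # Markov ground prediction
        mu, sd = time_per_m[state]
        t += max(rng.normal(mu, sd), 0.05)         # sampled advance time
    totals[r] = t

print(f"construction time: P10 = {np.percentile(totals, 10):.0f} h, "
      f"P90 = {np.percentile(totals, 90):.0f} h")
```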

  1. TensorFlow Distributions

    OpenAIRE

    Dillon, Joshua V.; Langmore, Ian; Tran, Dustin; Brevdo, Eugene; Vasudevan, Srinivas; Moore, Dave; Patton, Brian; Alemi, Alex; Hoffman, Matt; Saurous, Rif A.

    2017-01-01

    The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation. Building on two basic abstractions, it offers flexible building blocks for probabilistic computation. Distributions provide fast, numerically stable methods for generating samples and computing statistics, e.g., log density. Bijectors provide composable volume-tracking transformations with automatic caching. Together these enable...
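
    A small usage example of the two abstractions the abstract names, a Distribution and a Bijector; this is a sketch against the public TensorFlow Probability API, not code from the paper.

```python
# Distributions for sampling/log-density; Bijectors for volume-tracking
# transformations of a base distribution.
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# Distribution: fast sampling and numerically stable statistics
normal = tfd.Normal(loc=0.0, scale=1.0)
x = normal.sample(5)
print(normal.log_prob(x))

# Bijector: transform the base distribution through exp to get a log-normal
log_normal = tfd.TransformedDistribution(distribution=normal, bijector=tfb.Exp())
print(log_normal.log_prob(tf.constant([0.5, 1.0, 2.0])))
```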

  2. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation, and compiles probabilistic programs on-the-fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain-specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
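
    A minimal PyMC3 model, showing the direct-in-Python specification the abstract describes; the data and priors here are illustrative, not taken from the tutorial.

```python
# A small Bayesian model of the mean and spread of Gaussian data in PyMC3.
import numpy as np
import pymc3 as pm

data = np.random.default_rng(0).normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)        # prior on the mean
    sigma = pm.HalfNormal("sigma", sigma=5.0)       # prior on the spread
    obs = pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
    trace = pm.sample(1000, tune=1000, cores=1)     # NUTS (Hamiltonian MC)

print(pm.summary(trace))
```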

  3. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  4. A probabilistic analysis of cumulative carbon emissions and long-term planetary warming

    International Nuclear Information System (INIS)

    Fyke, Jeremy; Matthews, H Damon

    2015-01-01

Efforts to mitigate and adapt to long-term climate change could benefit greatly from probabilistic estimates of cumulative carbon emissions due to fossil fuel burning and resulting CO2-induced planetary warming. Here we demonstrate the use of a reduced-form model to project these variables. We performed simulations using a large-ensemble framework with parametric uncertainty sampled to produce distributions of future cumulative emissions and consequent planetary warming. A hind-cast ensemble of simulations captured 1980–2012 historical CO2 emissions trends and an ensemble of future projection simulations generated a distribution of emission scenarios that qualitatively resembled the suite of Representative and Extended Concentration Pathways. The resulting cumulative carbon emission and temperature change distributions are characterized by 5–95th percentile ranges of 0.96–4.9 teratonnes C (Tt C) and 1.4 °C–8.5 °C, respectively, with 50th percentiles at 3.1 Tt C and 4.7 °C. Within the wide range of policy-related parameter combinations that produced these distributions, we found that low-emission simulations were characterized by both high carbon prices and low costs of non-fossil fuel energy sources, suggesting the importance of these two policy levers in particular for avoiding dangerous levels of climate warming. With this analysis we demonstrate a probabilistic approach to the challenge of identifying strategies for limiting cumulative carbon emissions and assessing likelihoods of surpassing dangerous temperature thresholds. (letter)

  5. A PROBABILISTIC DEMAND APPLICATION IN THE AMERICAN CRACKER MARKET

    Directory of Open Access Journals (Sweden)

    Rutherford Cd. Johnson

    2016-07-01

Knowledge of the distribution of consumer buying strategies may permit producers to improve marketing strategies and to respond better to volatile market conditions. To investigate potential ways of gaining such knowledge, this study extends the work of Kahneman, Russell, and Thaler through the application of a probabilistic demand framework using Choice Wave theory, based on the Schrödinger Wave Equation of quantum mechanics. Probabilistic variability of response to health information and its potential influence on buying strategies is also investigated, extending the work of Clement, Johnson, Hu, Pagoulatos, and Debertin. In the present study, the domestic cracker market within fourteen U.S. metropolitan areas is segmented, using the Choice Wave probabilistic demand approach, into two statistically independent "Consumer Types" of seven metropolitan areas each. The two Consumer Types are shown to have statistically different elasticities from each other and from the combined market as a whole. This approach may provide not only improved marketing strategies through improved awareness of consumer preferences and buying strategies, especially in the volatile agricultural sector, but may also be useful in aiding producers of store-brand/private-label products in finding desirable markets in which to compete against national brands. The results also suggest that supply/producer-side strategies should take into account the ways in which information, whether under the direct control of the producer or not, may influence and change consumer buying strategies.

  6. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    Science.gov (United States)

    Abdi, Frank

    1996-01-01

A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in the development were: (1) utilization of the power of parallel processing and static/dynamic load-balancing optimization to make the complex simulation of the structure, material and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid and distributed workstation-type computers; and (5) market evaluation. The results of the Phase-2 effort provide a good basis for continuation and warrant a Phase-3 government and industry partnership.

  7. Adhesively bonded joints composed of pultruded adherends: Considerations at the upper tail of the material strength statistical distribution

    International Nuclear Information System (INIS)

    Vallee, T.; Keller, Th.; Fourestey, G.; Fournier, B.; Correia, J.R.

    2009-01-01

The Weibull distribution, used to describe the scaling of the strength of materials, has been verified on a wide range of materials and geometries; however, the quality of the fit tended to be poorer towards the upper tail. Based on a previously developed probabilistic strength prediction method for adhesively bonded joints composed of pultruded glass fiber-reinforced polymer (GFRP) adherends, where it was verified that a two-parameter Weibull probabilistic distribution could not accurately model the upper tail of a material strength distribution, different improved probabilistic distributions were compared to enhance the quality of the strength predictions. The following probabilistic distributions were examined: a two-parameter Weibull (as a reference), an m-fold Weibull, a Grafted Distribution, a Birnbaum-Saunders Distribution and a Generalized Lambda Distribution. The Generalized Lambda Distribution turned out to be the best analytical approximation for the strength data, providing a good fit to the experimental data and leading to more accurate joint strength predictions than the original two-parameter Weibull distribution. It was found that proper modeling of the upper tail leads to a noticeable increase in the quality of the predictions. (authors)

  8. Adhesively bonded joints composed of pultruded adherends: Considerations at the upper tail of the material strength statistical distribution

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, T.; Keller, Th. [Ecole Polytech Fed Lausanne, CCLab, CH-1015 Lausanne, (Switzerland); Fourestey, G. [Ecole Polytech Fed Lausanne, IACS, Chair Modeling and Sci Comp, CH-1015 Lausanne, (Switzerland); Fournier, B. [CEA SACLAY ENSMP, DEN, DANS, DMN, SRMA, LC2M, F-91191 Gif Sur Yvette, (France); Correia, J.R. [Univ Tecn Lisbon, Inst Super Tecn, Civil Engn and Architecture Dept, P-1049001 Lisbon, (Portugal)

    2009-07-01

The Weibull distribution, used to describe the scaling of the strength of materials, has been verified on a wide range of materials and geometries; however, the quality of the fit tended to be poorer towards the upper tail. Based on a previously developed probabilistic strength prediction method for adhesively bonded joints composed of pultruded glass fiber-reinforced polymer (GFRP) adherends, where it was verified that a two-parameter Weibull probabilistic distribution could not accurately model the upper tail of a material strength distribution, different improved probabilistic distributions were compared to enhance the quality of the strength predictions. The following probabilistic distributions were examined: a two-parameter Weibull (as a reference), an m-fold Weibull, a Grafted Distribution, a Birnbaum-Saunders Distribution and a Generalized Lambda Distribution. The Generalized Lambda Distribution turned out to be the best analytical approximation for the strength data, providing a good fit to the experimental data and leading to more accurate joint strength predictions than the original two-parameter Weibull distribution. It was found that proper modeling of the upper tail leads to a noticeable increase in the quality of the predictions. (authors)
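
    A minimal sketch of the kind of upper-tail check motivating the paper: fit a two-parameter Weibull by maximum likelihood and compare empirical against fitted quantiles above the 90th percentile. The synthetic tail distortion below is an illustrative assumption standing in for real strength data.

```python
# Two-parameter Weibull fit and an upper-tail quantile comparison.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(19)
strength = weibull_min.rvs(8.0, scale=100.0, size=500, random_state=rng)
strength *= 1 + 0.05 * (strength > np.percentile(strength, 90))  # distorted tail

shape, loc, scale = weibull_min.fit(strength, floc=0.0)  # two-parameter fit

p = np.array([0.90, 0.95, 0.99])
emp = np.quantile(strength, p)
fit = weibull_min.ppf(p, shape, loc=loc, scale=scale)
for pi, e, f in zip(p, emp, fit):
    print(f"q{pi:.2f}: empirical {e:.1f} vs fitted Weibull {f:.1f}")
```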

  9. Quantitative differentiation of breast lesions at 3T diffusion-weighted imaging (DWI) using the ratio of distributed diffusion coefficient (DDC).

    Science.gov (United States)

    Ertas, Gokhan; Onaygil, Can; Akin, Yasin; Kaya, Handan; Aribal, Erkin

    2016-12-01

To investigate the accuracy of diffusion coefficients and diffusion coefficient ratios of breast lesions and of glandular breast tissue from mono- and stretched-exponential models for quantitative diagnosis in diffusion-weighted magnetic resonance imaging (MRI). We analyzed 170 pathologically confirmed lesions (85 benign and 85 malignant) imaged using a 3.0T MR scanner. Small regions of interest (ROIs) focusing on the highest signal intensity were obtained for the lesions and for the glandular tissue of the contralateral breast. The apparent diffusion coefficient (ADC) and the distributed diffusion coefficient (DDC) were estimated by performing nonlinear fittings using mono- and stretched-exponential models, respectively. Coefficient ratios were calculated by dividing the lesion coefficient by the glandular tissue coefficient. The stretched-exponential model fits the data significantly better than the monoexponential model. The best diagnostic performance was achieved by the DDC ratio (area under the curve [AUC] = 0.93) when compared with lesion DDC, ADC ratio, and lesion ADC (AUC = 0.91, 0.90, 0.90), but with no statistically significant difference (P > 0.05). At optimal thresholds, the DDC ratio achieves 93% sensitivity, 80% specificity, and 87% overall diagnostic accuracy, while the ADC ratio yields 89% sensitivity, 78% specificity, and 83% overall diagnostic accuracy. The stretched-exponential model fits better with signal intensity measurements from both lesion and glandular tissue ROIs. Although the DDC ratio estimated using the model shows a higher diagnostic accuracy than the ADC ratio, lesion DDC, and ADC, the difference is not statistically significant. J. Magn. Reson. Imaging 2016;44:1633-1641. © 2016 International Society for Magnetic Resonance in Medicine.
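
    The two signal models compared in the study have standard forms in the DWI literature (notation assumed here): the monoexponential decay that defines the ADC, and the stretched-exponential decay that defines the DDC with heterogeneity exponent α:

```latex
\frac{S(b)}{S_0} = e^{-b\,\mathrm{ADC}},
\qquad
\frac{S(b)}{S_0} = \exp\!\left[-(b\cdot\mathrm{DDC})^{\alpha}\right],
\quad 0 < \alpha \le 1,
```

    where b is the diffusion-weighting factor and S(b) the measured signal; α < 1 captures intravoxel heterogeneity that a single-exponential fit cannot.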

  10. Probabilistic analysis of bladed turbine disks and the effect of mistuning

    Science.gov (United States)

    Shah, Ashwin; Nagpal, V. K.; Chamis, C. C.

    1990-01-01

    Probabilistic assessment of the maximum blade response on a mistuned rotor disk is performed using the computer code NESSUS. The uncertainties in natural frequency, excitation frequency, amplitude of excitation and damping have been included to obtain the cumulative distribution function (CDF) of blade responses. Advanced mean value first-order analysis is used to compute the CDF. The sensitivities of the different random variables are identified. The effect of the number of blades on the rotor on mistuning is evaluated. It is shown that the uncertainties associated with the forcing function parameters have a significant effect on the response distribution of the bladed rotor.
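
    A Monte Carlo sketch of the same idea (NESSUS itself uses the advanced mean value method): uncertain modal and forcing parameters are propagated through a simple single-degree-of-freedom forced-response model, and the CDF of the response amplitude is read off empirically. All distributions and values below are assumptions.

```python
# Illustrative Monte Carlo counterpart to the advanced mean value analysis:
# propagate uncertain frequency, forcing and damping through a 1-DOF model.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
fn = rng.normal(1000.0, 20.0, n)          # natural frequency, Hz (assumed)
fe = rng.normal(980.0, 15.0, n)           # excitation frequency, Hz (assumed)
F = rng.lognormal(np.log(50.0), 0.2, n)   # excitation amplitude, N (assumed)
zeta = rng.uniform(0.005, 0.02, n)        # damping ratio (assumed)

r = fe / fn
k = 1.0e6                                 # stiffness, N/m (fixed here)
amp = (F / k) / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

# Empirical CDF quantiles of the response amplitude.
q = np.quantile(amp, [0.5, 0.9, 0.99])
print("median / P90 / P99 amplitude [m]:", np.round(q, 6))
```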

  11. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable for this region.
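
    A sketch of the distribution-comparison step using scipy in place of Easyfit/Matlab; the inter-event data below are synthetic, not the catalogue used in the record.

```python
# Sketch: rank candidate distributions for inter-event times with the
# Kolmogorov-Smirnov statistic (synthetic data, illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
inter_event_times = rng.weibull(1.4, 80) * 12.0   # hypothetical years between M>=6.0 events

candidates = {
    "Weibull (2-p)": ("weibull_min", stats.weibull_min.fit(inter_event_times, floc=0)),
    "Weibull (3-p)": ("weibull_min", stats.weibull_min.fit(inter_event_times)),
    "Frechet":       ("invweibull",  stats.invweibull.fit(inter_event_times, floc=0)),
}
for name, (dist, params) in candidates.items():
    D, p = stats.kstest(inter_event_times, dist, args=params)
    print(f"{name:14s}  K-S D = {D:.3f}  p = {p:.3f}")
```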

  12. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address the respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  13. Distribution coefficients of amino acid, peptide and enzyme in respect to aqueous two phase system composed of dextran, polyethylene glycol and water; Dekisutoran+poriechiren gurikoru+mizu karanaru suiseinisokei ni taisuru aminosan, pepuchido oyobi koso no bunpai keisu

    Energy Technology Data Exchange (ETDEWEB)

    Iwai, Yoshio [Kyushu University, Fukuoka (Japan); Kakizaka, Keijiro; Shindo, Takashi; Ishida, Otetsu; Arai, Yasuhiko

    1999-01-05

    Distribution coefficients of five kinds of amino acids (aspartic acid, asparagine, methionine, cysteine and cytidine) and two kinds of peptides (glycylglycine and hexaglycine) were measured. These distribution coefficients are well correlated by the osmotic virial expansion. The interaction parameter in the osmotic virial expansion can be estimated from a hydrophilic-group parameter. The distribution coefficient of {alpha}-amylase was estimated by the osmotic virial expansion using the above-mentioned hydrophilic-group parameter, and the estimated value agreed substantially well with the measured value; for the distribution coefficient of {beta}-amylase, however, no agreement was found. (translated by NEDO)

  14. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  15. An employment of distribution coefficients for the valuation of the soils with regard to their radiological risks

    International Nuclear Information System (INIS)

    Carini, F.; Silva, S.; Fontana, P.

    1985-01-01

    As a preliminary step it has been demonstrated that distribution coefficients may prove useful for classifying soils for radioprotection purposes. We obtained transfer factors from soils, which differed in their principal chemical and physical features, into plants typical of the agriculture of the middle Po valley. By utilizing these factors as guiding indexes, it is possible, through cluster analysis, to rank soils with regard to their radiological risk starting only from the pedologic parameters.

  16. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Ferreira, Clécio S.; Genton, Marc G.

    2018-01-01

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down

  17. Stability analysis for discrete-time stochastic memristive neural networks with both leakage and probabilistic delays.

    Science.gov (United States)

    Liu, Hongjian; Wang, Zidong; Shen, Bo; Huang, Tingwen; Alsaadi, Fuad E

    2018-06-01

    This paper is concerned with the globally exponential stability problem for a class of discrete-time stochastic memristive neural networks (DSMNNs) with both leakage delays and probabilistic time-varying delays. For the probabilistic delays, a sequence of Bernoulli distributed random variables is utilized to determine within which intervals the time-varying delays fall at a certain time instant. The sector-bounded activation function is considered in the addressed DSMNN. By taking into account the state-dependent characteristics of the network parameters and choosing an appropriate Lyapunov-Krasovskii functional, some sufficient conditions are established under which the underlying DSMNN is globally exponentially stable in the mean square. The derived conditions are made dependent on both the leakage and the probabilistic delays, and are therefore less conservative than the traditional delay-independent criteria. A simulation example is given to show the effectiveness of the proposed stability criterion. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Bayesian parameter estimation in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Siu, Nathan O.; Kelly, Dana L.

    1998-01-01

    Bayesian statistical methods are widely used in probabilistic risk assessment (PRA) because of their ability to provide useful estimates of model parameters when data are sparse and because the subjective probability framework, from which these methods are derived, is a natural framework to address the decision problems motivating PRA. This paper presents a tutorial on Bayesian parameter estimation especially relevant to PRA. It summarizes the philosophy behind these methods, approaches for constructing likelihood functions and prior distributions, some simple but realistic examples, and a variety of cautions and lessons regarding practical applications. References are also provided for more in-depth coverage of various topics
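
    The workhorse example of such estimation is the conjugate gamma-Poisson update of a failure rate; a minimal sketch with illustrative numbers follows (the prior parameters are assumptions, not values from the tutorial).

```python
# Sketch of the standard conjugate update used in PRA for a failure rate:
# gamma prior + Poisson likelihood => gamma posterior (illustrative numbers).
from scipy import stats

a0, b0 = 0.5, 100.0            # assumed gamma prior (shape, rate in 1/h)
n_failures, hours = 2, 5.0e4   # hypothetical operating experience

a_post, b_post = a0 + n_failures, b0 + hours
posterior = stats.gamma(a=a_post, scale=1.0 / b_post)
print(f"posterior mean = {posterior.mean():.2e} /h")
print(f"90% credible interval = ({posterior.ppf(0.05):.2e}, {posterior.ppf(0.95):.2e}) /h")
```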

  19. Probabilistic Forecasts of Solar Irradiance by Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Iversen, Jan Emil Banning; Morales González, Juan Miguel; Møller, Jan Kloppenborg

    2014-01-01

    The uncertainty information included in probabilistic forecasts may be paramount for decision makers to efficiently make use of this uncertain and variable generation. In this paper, a stochastic differential equation framework for modeling the uncertainty associated with the solar irradiance point forecast is proposed. This modeling approach allows for characterizing both the interdependence structure of prediction errors of short-term solar irradiance and their predictive distribution. Three different stochastic differential equation models are first fitted to a training data set and subsequently evaluated on a one-year test set.
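
    A minimal Euler-Maruyama sketch of the general approach: simulate many paths of a mean-reverting error process around the point forecast and read predictive intervals from the path quantiles. The process and its parameters are assumptions, not the paper's fitted models.

```python
# Euler-Maruyama simulation of an Ornstein-Uhlenbeck-type forecast-error
# process; pointwise quantiles of the paths give predictive intervals.
import numpy as np

rng = np.random.default_rng(42)
T, dt = 24.0, 0.1                      # horizon in hours, time step
n = int(T / dt)
theta, sigma = 0.5, 0.15               # mean reversion and diffusion (assumed)

def one_path():
    x = 0.0                            # forecast error starts at zero
    path = np.empty(n)
    for i in range(n):
        x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        path[i] = x
    return path

paths = np.array([one_path() for _ in range(500)])
q10, q90 = np.quantile(paths, [0.1, 0.9], axis=0)
print(f"80% band half-width at horizon end ~ {(q90[-1] - q10[-1]) / 2:.3f}")
```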

  20. Comparison of the distribution coefficients of plutonium and other radionuclides in Lake Michigan to those in other systems

    International Nuclear Information System (INIS)

    Wahlgren, M.A.; Nelson, D.M.

    1976-01-01

    Filtration of Lake Michigan water samples has been carried out routinely since 1973, and some plutonium concentrations in the seston have been reported. During 1975 and 1976 a sufficient number of filter samples from various depths was obtained throughout the field seasons to establish whether or not a distribution coefficient also controls the uptake of plutonium by the formation of particulates and their settling from the surface waters. Samples from ANL station 5 (10 km SW of Grand Haven, Michigan, water depth 67 m) in the southern basin, and from the lower Great Lakes, have been analyzed for dry weight, ash weight, total organics (loss of weight on ignition), amorphous silica, calcite, and residual minerals. Distribution coefficients were calculated on the basis of each of these solid components, and self-consistent values across depth, season, and lake were observed only on the basis of the dry weight of seston. The findings strongly suggest that the uptake of fallout plutonium (including inputs of new fallout during the summer of 1975) is dominated by a surface coating process common to all seston particle types. An insufficient number of 137Cs analyses was obtained to correlate its uptake with a specific component of the seston, but its behavior is clearly different from that of plutonium.
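
    For reference, the distribution coefficient used throughout these records is just the ratio of the particle-bound to the dissolved activity concentration, as in this sketch with hypothetical numbers.

```python
# The distribution coefficient Kd relates solid-phase and solution-phase
# activity concentrations; values below are invented for illustration.
def kd(activity_per_kg_solid: float, activity_per_litre_water: float) -> float:
    """Kd in L/kg: (Bq/kg dry solid) / (Bq/L solution)."""
    return activity_per_kg_solid / activity_per_litre_water

# Hypothetical seston sample: 1.2e3 Bq/kg dry weight vs 4.0e-3 Bq/L in water.
print(f"Kd = {kd(1.2e3, 4.0e-3):.2e} L/kg")
```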

  1. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...

  2. Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    2003-11-01

    of correlation), and determining convergence may all affect calculated results. The report discusses the key issues for each stage and how these issues can be addressed in implementing probabilistic calculations and considered in reviews of such calculations. An important element of an assessment, whichever approach is adopted for undertaking calculations, is the communication of results. The report describes the use of outputs specific to probabilistic calculations, such as probability distribution and cumulative distribution functions, and also the application of general types of output for presenting probabilistic results. Illustrating the way in which a disposal system may evolve is an important part of assessments, and the report describes the issues that must be considered in using and interpreting risk and dose versus time plots

  3. On Continuous Distributions and Parameter Estimation in Probabilistic Logic Programs (Over continue verdelingen en het schatten van parameters in probabilistische logische programma's)

    OpenAIRE

    Gutmann, Bernd

    2011-01-01

    In the last decade remarkable progress has been made on combining statistical machine learning techniques, reasoning under uncertainty, and relational representations. The branch of Artificial Intelligence working on the synthesis of these three areas is known as statistical relational learning or probabilistic logic learning. ProbLog, one of the probabilistic frameworks developed, is an extension of the logic programming language Prolog with independent random variables that are defined by an...

  4. Temperature dependence of Kerr coefficient and quadratic polarized optical coefficient of a paraelectric Mn:Fe:KTN crystal

    Directory of Open Access Journals (Sweden)

    Qieni Lu

    2015-08-01

    Full Text Available We measure the temperature dependence of the Kerr coefficient and the quadratic polarized optical coefficient of a paraelectric Mn:Fe:KTN crystal simultaneously in this work, based on digital holographic interferometry (DHI). The spatial distribution of the field-induced refractive index change can also be visualized and estimated by numerically retrieving sequential phase maps of the Mn:Fe:KTN crystal from digital holograms recorded in different states. The refractive indices decrease with increasing temperature, and the quadratic polarized optical coefficient is insensitive to temperature. The experimental results suggest that the DHI method presented here is highly applicable both to visualizing the temporal and spatial behavior of the internal electric field and to accurately measuring the electro-optic coefficient of electro-optical media.

  5. Age-Specific Mortality and Fertility Rates for Probabilistic Population Projections

    OpenAIRE

    Ševčíková, Hana; Li, Nan; Kantorová, Vladimíra; Gerland, Patrick; Raftery, Adrian E.

    2015-01-01

    The United Nations released official probabilistic population projections (PPP) for all countries for the first time in July 2014. These were obtained by projecting the period total fertility rate (TFR) and life expectancy at birth ($e_0$) using Bayesian hierarchical models, yielding a large set of future trajectories of TFR and $e_0$ for all countries and future time periods to 2100, sampled from their joint predictive distribution. Each trajectory was then converted to age-specific mortalit...

  6. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  7. POISSON, Analysis Solution of Poisson Problems in Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1986-01-01

    1 - Description of program or function: Purpose of program: Analytic treatment of the two-stage Poisson problem in Probabilistic Risk Assessment. Input: estimated a-priori mean failure rate and error factor of the system considered (for calculation of the stage-1 prior), number of failures and operating times for similar systems (for calculation of the stage-2 prior). Output: a-posteriori probability distributions on linear and logarithmic time scales (on a specified time grid) and expectation values of failure rates and error factors are calculated for: - stage-1 a-priori distribution, - stage-1 a-posteriori distribution, - stage-2 a-priori distribution, - stage-2 a-posteriori distribution. 2 - Method of solution: Bayesian approach with a conjugate stage-1 prior, improved with experience from similar systems to yield the stage-2 prior, and a likelihood function from experience with the system under study (see documentation). 3 - Restrictions on the complexity of the problem: Up to 100 similar systems (including the system considered), arbitrary number of problems (failure types) with the same grid
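
    A compressed sketch of the two-stage scheme described above, using gamma-Poisson conjugacy throughout; the mapping from the a-priori mean and error factor to a gamma shape is simplified here, and all numbers are illustrative rather than taken from the code's documentation.

```python
# Two-stage gamma-Poisson sketch: stage-1 prior from an a-priori estimate,
# stage-2 prior from pooled similar-system evidence, then the final update.
from scipy import stats

# Stage 1: gamma(shape a1, rate b1) matching an assumed a-priori mean.
prior_mean = 1e-4          # /h, assumed a-priori failure rate
a1 = 0.8                   # shape loosely encoding the error factor (assumed)
b1 = a1 / prior_mean

# Stage 2: update with pooled (failures, hours) from similar systems.
similar = [(1, 2.0e4), (0, 3.5e4), (2, 1.8e4)]
a2 = a1 + sum(k for k, _ in similar)
b2 = b1 + sum(t for _, t in similar)

# Final posterior: update the stage-2 prior with the studied system's record.
k_own, t_own = 1, 4.0e4
post = stats.gamma(a=a2 + k_own, scale=1.0 / (b2 + t_own))
print(f"posterior mean failure rate = {post.mean():.2e} /h")
```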

  8. Are Parton Distributions Positive?

    CERN Document Server

    Forte, Stefano; Ridolfi, Giovanni; Altarelli, Guido; Forte, Stefano; Ridolfi, Giovanni

    1999-01-01

    We show that the naive positivity conditions on polarized parton distributions which follow from their probabilistic interpretation in the naive parton model are reproduced in perturbative QCD at the leading-log level if the quark and gluon distributions are defined in terms of physical processes. We show how these conditions are modified at the next-to-leading level, and discuss their phenomenological implications, in particular in view of the determination of the polarized gluon distribution.

  9. Are parton distributions positive?

    International Nuclear Information System (INIS)

    Forte, Stefano; Altarelli, Guido; Ridolfi, Giovanni

    1999-01-01

    We show that the naive positivity conditions on polarized parton distributions which follow from their probabilistic interpretation in the naive parton model are reproduced in perturbative QCD at the leading-log level if the quark and gluon distributions are defined in terms of physical processes. We show how these conditions are modified at the next-to-leading level, and discuss their phenomenological implications, in particular in view of the determination of the polarized gluon distribution.

  10. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    Science.gov (United States)

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
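
    The bootstrap step can be sketched as follows: model inputs are drawn by resampling observed data with replacement rather than from a theoretical distribution. The numbers are invented, and the outcome measure is simplified to cost per eradication rather than the paper's full decision model.

```python
# Sketch of a probabilistic sensitivity analysis whose inputs come from
# bootstrapping observed data (all values illustrative).
import numpy as np

rng = np.random.default_rng(7)
observed_costs = np.array([310., 290., 450., 380., 520., 295., 610., 405.])
observed_eradication = np.array([1, 1, 0, 1, 1, 1, 0, 1])  # 1 = eradicated

results = []
for _ in range(10_000):
    idx = rng.integers(0, len(observed_costs), len(observed_costs))
    mean_cost = observed_costs[idx].mean()
    eff = observed_eradication[idx].mean()
    results.append(mean_cost / eff if eff > 0 else np.inf)  # cost per eradication

results = np.array(results)
print(f"cost/eradication: median = {np.median(results):.0f}, "
      f"95% CI = ({np.percentile(results, 2.5):.0f}, {np.percentile(results, 97.5):.0f})")
```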

  11. Probabilistic modeling of material resistance to crack initiation due to hydrided region overloads in CANDU Zr-2.5%Nb pressure tubes

    International Nuclear Information System (INIS)

    Gutkin, L.; Scarth, D.A.

    2014-01-01

    Zr-2.5%Nb pressure tubes in CANDU nuclear reactors are susceptible to hydride-assisted cracking at the locations of stress concentration, such as in-service flaws. Probabilistic methodology is being developed to evaluate such flaws for crack initiation due to hydrided region overloads, which occur when the applied stress acting on a flaw with an existing hydrided region at its tip exceeds the stress at which the hydrided region is formed. As part of this development, probabilistic modeling of pressure tube material resistance to overload crack initiation has been performed on the basis of a set of test data specifically designed to study the effects of non-ratcheting hydride formation conditions and load reduction prior to hydride formation. In the modeling framework, the overload resistance is represented as a power-law function of the material resistance to initiation of delayed hydride cracking under constant loading, where both the overload crack initiation coefficient and the overload crack initiation exponent vary with the flaw geometry. In addition, the overload crack initiation coefficient varies with the extent of load reduction prior to hydride formation as well as the number of non-ratcheting hydride formation thermal cycles. (author)

  12. Probabilistic modeling of material resistance to crack initiation due to hydrided region overloads in CANDU Zr-2.5%Nb pressure tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gutkin, L.; Scarth, D.A. [Kinectrics Inc., Toronto, ON (Canada)

    2014-07-01

    Zr-2.5%Nb pressure tubes in CANDU nuclear reactors are susceptible to hydride-assisted cracking at the locations of stress concentration, such as in-service flaws. Probabilistic methodology is being developed to evaluate such flaws for crack initiation due to hydrided region overloads, which occur when the applied stress acting on a flaw with an existing hydrided region at its tip exceeds the stress at which the hydrided region is formed. As part of this development, probabilistic modeling of pressure tube material resistance to overload crack initiation has been performed on the basis of a set of test data specifically designed to study the effects of non-ratcheting hydride formation conditions and load reduction prior to hydride formation. In the modeling framework, the overload resistance is represented as a power-law function of the material resistance to initiation of delayed hydride cracking under constant loading, where both the overload crack initiation coefficient and the overload crack initiation exponent vary with the flaw geometry. In addition, the overload crack initiation coefficient varies with the extent of load reduction prior to hydride formation as well as the number of non-ratcheting hydride formation thermal cycles. (author)

  13. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    Full Text Available In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using the posterior distributions. The Markov Chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated, in addition to a qualitative visual comparison. Following this, potential developments and improvements of the prognostics-based metrics are discussed in detail.

  14. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  15. Distribution coefficient Kd in surface soils collected in Aomori prefecture

    International Nuclear Information System (INIS)

    Tsukada, Hirofumi; Hasegawa, Hidenao; Hisamatsu, Shun'ichi; Inaba, Jiro

    2000-01-01

    Soil-solution distribution coefficients (Kds), the ratios of an element's concentration in the soil solid phase to that in the solution phase, were determined for 32 elements in Andosols, Wet Andosols and Gleyed Andosols collected throughout Aomori Prefecture. A dried soil sample was mixed with a 10-fold amount of pure water in a PPCO centrifuge tube and then gently shaken for 24 h. The Kd values were obtained by measuring the element concentrations in the solid and solution phases (batch method). The Kd values in this work were up to three orders of magnitude higher than the IAEA reported values, and their 95% confidence intervals were within two orders of magnitude. Most Kd values decreased with increasing electrical conductivity of the solution phase. The Kd of Ca correlated well with that of Sr; however, the correlation between the Kds of K and Cs was poor. The Kd values were also determined by a second method: the soil solutions were separated from fresh soil samples by high-speed centrifugation, and the Kd values were calculated from the element concentrations in the solid phase and the soil solution (centrifugation method). The Kd values obtained by the centrifugation method agreed within one order of magnitude with those from the batch method, and the variation patterns across elements correlated well. (author)

  16. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  17. Probabilistic siting analysis of nuclear power plants emphasizing atmospheric dispersion of radioactive releases and radiation-induced health effects

    International Nuclear Information System (INIS)

    Savolainen, Ilkka

    1980-01-01

    A presentation is made of probabilistic evaluation schemes for nuclear power plant siting. Health effects attributable to ionizing radiation are reviewed for the purpose of assessing the numbers of the most important health effect cases in light-water reactor accidents. The atmospheric dispersion of radioactive releases from nuclear power plants is discussed, and an environmental consequence assessment model is presented in which the radioactive releases and their atmospheric dispersion are treated by probabilistic methods. In the model, the environmental effects arising from exposure to radiation are expressed as cumulative probability distributions and expectation values. The probabilistic environmental consequence assessment model has been applied to nuclear power plant site evaluation, including risk-benefit and cost-benefit analyses, and the comparison of various alternative sites. (author)

  18. Can model weighting improve probabilistic projections of climate change?

    Energy Technology Data Exchange (ETDEWEB)

    Raeisaenen, Jouni; Ylhaeisi, Jussi S. [Department of Physics, P.O. Box 48, University of Helsinki (Finland)

    2012-10-15

    Recently, Raeisaenen and co-authors proposed a weighting scheme in which the relationship between observable climate and climate change within a multi-model ensemble determines to what extent agreement with observations affects model weights in climate change projection. Within the Third Coupled Model Intercomparison Project (CMIP3) dataset, this scheme slightly improved the cross-validated accuracy of deterministic projections of temperature change. Here the same scheme is applied to probabilistic temperature change projection, under the strong limiting assumption that the CMIP3 ensemble spans the actual modeling uncertainty. Cross-validation suggests that probabilistic temperature change projections may also be improved by this weighting scheme. However, the improvement relative to uniform weighting is smaller in the tail-sensitive logarithmic score than in the continuous ranked probability score. The impact of the weighting on projection of real-world twenty-first century temperature change is modest in most parts of the world. However, in some areas mainly over the high-latitude oceans, the mean of the distribution is substantially changed and/or the distribution is considerably narrowed. The weights of individual models vary strongly with location, so that a model that receives nearly zero weight in some area may still get a large weight elsewhere. Although the details of this variation are method-specific, it suggests that the relative strengths of different models may be difficult to harness by weighting schemes that use spatially uniform model weights. (orig.)

  19. A probabilistic approach to Radiological Environmental Impact Assessment

    International Nuclear Information System (INIS)

    Avila, Rodolfo; Larsson, Carl-Magnus

    2001-01-01

    Since a radiological environmental impact assessment typically relies on limited data and poorly based extrapolation methods, the point estimates implied by a deterministic approach do not suffice. To be of practical use for risk management, it is necessary to quantify the uncertainty margins of the estimates as well. In this paper we discuss how to work out a probabilistic approach for dealing with uncertainties in assessments of the radiological risks to non-human biota from radioactive contamination. Possible strategies for deriving the relevant probability distribution functions from available empirical data and theoretical knowledge are outlined.

  20. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and into the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  1. Form coefficient of helical toroidal solenoids

    International Nuclear Information System (INIS)

    Amelin, V.Z.; Kunchenko, V.B.

    1982-01-01

    For toroidal solenoids with a continuous spiral coil, wound according to the laws of equi-inclined and simple cylindrical spirals with homogeneous, linearly increasing towards the coil periphery, and 'Bitter' distributions of current density, analytical expressions are obtained for the relationship between the power consumed and the magnetic field generated, together with form coefficients similar to the Fabry coefficient for cylindrical solenoids; the dependence of the form coefficient and the relative conductor volume of the solenoid on the number of turns of the screw line per circumvention of the large torus radius is also investigated. The analytical expressions for the form coefficients and the graphical material permit selection of the optimum geometry with respect to power consumption both for spiral (including 'force-free') and for conventional toroidal solenoids of magnetic systems in thermonuclear installations

  2. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  3. Probabilistic Modelling of Timber Material Properties

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber; Köhler, Jochen; Sørensen, John Dalsgaard

    2001-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis on the modeling of the effect of the different quality control and selection procedures used as means for grading of timber in the production line. It is shown how statistical models may be established on the basis of the same type of information which is normally collected as a part of the quality control procedures and, furthermore, how the efficiency of different control procedures may be compared. The tail behavior of the probability distributions of timber material characteristics plays an important role, and these distributions are modeled such that they may readily be applied in structural reliability analysis; the format appears to be appropriate for codification purposes of quality control and selection for grading procedures.

  4. Distribution functions to estimate radionuclide solid-liquid distribution coefficients in soils: the case of Cs

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez-Guinart, Oriol; Rigol, Anna; Vidal, Miquel [Analytical Chemistry department, Faculty of Chemistry, University of Barcelona, Mart i Franques 1-11, 08028, Barcelona (Spain)

    2014-07-01

    In the frame of the revision of the IAEA TRS 364 (Handbook of parameter values for the prediction of radionuclide transfer in temperate environments), a database of radionuclide solid-liquid distribution coefficients (K{sub d}) in soils was compiled with data coming from field and laboratory experiments, from references mostly from 1990 onwards, including data from reports, reviewed papers, and grey literature. The K{sub d} values were grouped for each radionuclide according to two criteria. The first criterion was based on the sand and clay mineral percentages referred to the mineral matter, and the organic matter (OM) content in the soil; this defined the 'texture/OM' criterion. The second criterion was to group soils regarding specific soil factors governing the radionuclide-soil interaction (the 'cofactor' criterion). The cofactors depended on the radionuclide considered. An advantage of using cofactors was that the variability of K{sub d} ranges for a given soil group decreased considerably compared with that observed when the classification was based solely on sand, clay and organic matter contents. The K{sub d} best estimates were defined as the calculated GM values, assuming that K{sub d} values were always log-normally distributed. Risk assessment models may require as input data for a given parameter either a single value (a best estimate) or a continuous function from which not only individual best estimates but also confidence ranges and data variability can be derived. In the case of the K{sub d} parameter, a suitable continuous function which contains the statistical parameters (e.g. arithmetic/geometric mean, arithmetic/geometric standard deviation, mode, etc.) that best explain the distribution of the K{sub d} values of a dataset is the Cumulative Distribution Function (CDF). To our knowledge, appropriate CDFs have not yet been proposed for radionuclide K{sub d} in soils. Therefore, the aim of this work is to create CDFs for
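
    A sketch of the record's premise, assuming log-normality: the geometric mean and geometric standard deviation of a K{sub d} dataset fully determine the CDF handed to an assessment model. The values below are hypothetical.

```python
# If Kd is log-normal, GM and GSD define the whole distribution, so the CDF
# can be built directly from them (hypothetical dataset, L/kg).
import numpy as np
from scipy import stats

kd_values = np.array([120., 340., 95., 510., 220., 180., 760., 60.])

log_kd = np.log(kd_values)
gm = np.exp(log_kd.mean())              # geometric mean = best estimate
gsd = np.exp(log_kd.std(ddof=1))        # geometric standard deviation

cdf = stats.lognorm(s=np.log(gsd), scale=gm)
print(f"GM = {gm:.0f} L/kg, GSD = {gsd:.2f}")
print(f"90% of soils expected below Kd = {cdf.ppf(0.90):.0f} L/kg")
```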

  5. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate closures. This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that accounts for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion...
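
    The conversion idea can be sketched with a Gaussian copula: draw correlated normals across look-ahead horizons, turn them into uniforms, and push each margin through the inverse CDF of its predictive distribution. The AR(1) correlation structure and beta margins below are assumptions for illustration, not the paper's fitted quantities.

```python
# Scenario generation sketch: Gaussian copula over horizons + assumed
# per-horizon predictive margins (beta distributions around a point forecast).
import numpy as np
from scipy import stats

H = 12                                   # look-ahead horizons
rho = 0.8                                # assumed lag-1 error correlation
cov = rho ** np.abs(np.subtract.outer(np.arange(H), np.arange(H)))

rng = np.random.default_rng(3)
z = rng.multivariate_normal(np.zeros(H), cov, size=100)   # 100 scenarios
u = stats.norm.cdf(z)                                     # uniforms with the right copula

point = np.linspace(0.6, 0.3, H)                          # normalized wind power forecast
a, b = 4.0 * point, 4.0 * (1 - point)                     # hypothetical beta margins
scenarios = stats.beta.ppf(u, a, b)                       # shape (100, H)
print(scenarios[:2].round(2))
```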

  6. PROBABILISTIC FINITE ELEMENT ANALYSIS OF A HEAVY DUTY RADIATOR UNDER INTERNAL PRESSURE LOADING

    Directory of Open Access Journals (Sweden)

    ROBIN ROY P.

    2017-09-01

    Full Text Available Engine cooling is vital to keeping the engine at its most efficient temperature across different vehicle speeds and operating road conditions. The radiator is one of the key components of the heavy-duty engine cooling system. A heavy-duty radiator is subjected to various kinds of loading such as pressure, thermal, vibration, internal erosion, external corrosion and creep. Pressure cycle durability is one of the most important characteristics in the design of a heavy-duty radiator. Current design methodologies involve designing the radiator using a nominal finite element approach which does not take into account the variations occurring in geometry, material and boundary conditions, leading to overly conservative and uneconomical radiator designs. A new approach is presented in this paper that integrates the traditional linear finite element method with a probabilistic approach to design a heavy-duty radiator by including the uncertainty in the computational model. As a first step, a nominal run is performed with the input design variables and the desired responses are extracted. A probabilistic finite element analysis is then performed to identify robust designs, which are validated for reliability. The probabilistic finite element analysis includes the uncertainty of material thickness and of dimensional and geometrical variations. A Gaussian distribution is employed to define the random variation and uncertainty. The Monte Carlo method is used to generate the random design points. The output response distributions of the random design points are post-processed using different statistical and probability techniques to find the robust design. This systematic virtual modelling and analysis of the data helps to find efficient and reliable robust designs.
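
    A sketch of the Monte Carlo step, with the finite element solve replaced by a closed-form thin-wall hoop-stress surrogate so the example stays self-contained; every distribution and value below is assumed, not taken from the paper.

```python
# Monte Carlo propagation of Gaussian input uncertainty through a thin-wall
# hoop-stress surrogate standing in for the FE solve (illustrative values).
import numpy as np

rng = np.random.default_rng(11)
n = 50_000
pressure = rng.normal(2.5e6, 1.5e5, n)        # Pa, internal pressure (assumed)
radius = rng.normal(0.015, 2e-4, n)           # m, tube radius (assumed)
thickness = rng.normal(8e-4, 4e-5, n)         # m, wall thickness (assumed)
strength = rng.normal(6.0e7, 4e6, n)          # Pa, allowable stress (assumed)

hoop_stress = pressure * radius / thickness   # thin-wall approximation
prob_fail = np.mean(hoop_stress > strength)
print(f"estimated failure probability = {prob_fail:.2e}")
```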

  7. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has shown to be an interesting area of study not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces and there are not too many results about constructing fuzzy topological structures starting from a fuzzy metric. Maybe, H/{sup o}hle was the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric /cite{Hohle78,Hohle82a}. His method can be directly translated to the context of fuzzy metrics and allows to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics /cite{RL}. On the other hand, other different fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton $[0,1]$-quasi-uniformity /cite{GGPV06}; a fuzzifiying uniformity /cite{YueShi10}, etc. The paper /cite{GGRLRo} gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton $[0,1]$-quasi-uniformity. In 2010, J. Guti/'errez Garc/'{/i}a, S. Romaguera and M. Sanchis /cite{GGRoSanchis10} proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i. e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow to obtain a factorization of some functors introduced in /cite{GGRoSanchis10}. (Author)

  8. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  9. MARATHON - a computer code for the probabilistic estimation of leak-before-break time in CANDU reactors

    International Nuclear Information System (INIS)

    Walker, J.R.

    1990-02-01

    The presence of high levels of moisture in the annulus gas system of a CANDU reactor indicates that a leaking crack may be present in a pressure tube. This will initiate the shutdown of the reactor to prevent the possibility of fuel channel damage. It is also desirable, however, to keep the reactor partially pressurized at hot shutdown for as long as it is necessary to unambiguously identify the leaking pressure tube. A premature full depressurization may cause an extended shutdown while the leaking tube is being located. However, fast fracture could occur during an excessively long hot shutdown period. A probabilistic methodology, together with an associated computer code (called MARATHON), has been developed to calculate the time from first leakage to unstable fracture in a probabilistic format. The methodology explicitly uses distributions of material properties and allows the risk associated with leak-before-break to be estimated. A model of the leak detection system is integrated into the methodology to calculate the time from leak detection to unstable fracture. The sensitivity of the risk to changing reactor conditions allows the optimization of reactor management after leak detection. In this report we describe the probabilistic model and give details of the quality assurance and verification of the MARATHON code. Examples of the use of MARATHON are given using preliminary material property distributions. These preliminary material property distributions indicate that the probability of unstable fracture is very low, and that ample time is available to locate the leaking tube

  10. Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety

    Science.gov (United States)

    Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent

    2016-04-01

    The purpose of this study is to lay the basis for a probabilistic flood hazard assessment (PFHA). Probabilistic assessment of external floods has become a topic of current interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability with each of them. They aim to identify all possible failure scenarios and to combine their probabilities, in order to cover all possible sources of risk, and they are based on the distributions of the initiators and/or of the variables characterizing these initiators. The PFHA can characterize, for example, the water level at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rains, sea water level, etc.) and screening of the phenomena relevant to the nuclear site, ii) identification and probabilization of the parameters associated with the selected flooding phenomena, iii) propagation of the probabilized parameters from the source to the point of interest in the site, iv) derivation of hazard curves and aggregation of the contributions of the flooding phenomena at the point of interest, taking the uncertainties into account. Within this framework, the PFHA methodology has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. Work is still in progress to take into account other flooding phenomena and to carry out more case studies.
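
    The aggregation step at the point of interest can be sketched as follows, assuming independent phenomena and purely illustrative hazard curves (not site data).

```python
# Combine per-phenomenon annual exceedance probabilities of water level at
# the point of interest, assuming independence between phenomena.
import numpy as np

levels = np.linspace(0.0, 3.0, 7)                 # m above grade (assumed)
p_rain = np.exp(-2.0 * levels) * 1e-2             # hypothetical hazard curve
p_surge = np.exp(-1.5 * levels) * 5e-3            # hypothetical hazard curve

p_total = 1.0 - (1.0 - p_rain) * (1.0 - p_surge)  # P(at least one exceeds)
for lvl, p in zip(levels, p_total):
    print(f"level {lvl:.1f} m : annual exceedance ~ {p:.2e}")
```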

  11. Probabilistic maximum-value wind prediction for offshore environments

    DEFF Research Database (Denmark)

    Staid, Andrea; Pinson, Pierre; Guikema, Seth D.

    2015-01-01

    Statistical models are developed to predict the full distribution of the maximum-value wind speeds in a 3 h interval. We take a detailed look at the performance of linear models, generalized additive models and multivariate adaptive regression splines models using meteorological covariates such as gust speed, wind speed, convective available potential energy, Charnock, mean sea-level pressure and temperature, as given by the European Center for Medium-Range Weather Forecasts forecasts. The models are trained to predict the mean value of maximum wind speed, and the residuals from training the models are used to develop the full probabilistic distribution of maximum wind speed. Knowledge of the maximum wind speed for an offshore location within a given period can inform decision-making regarding turbine operations, planned maintenance operations and power grid scheduling in order to improve safety and reliability...

  12. Probabilistic Cloning of Three Real States with Optimal Success Probabilities

    Science.gov (United States)

    Rui, Pin-shu

    2017-06-01

    We investigate the probabilistic quantum cloning (PQC) of three real states with average probability distribution. To get the analytic forms of the optimal success probabilities we assume that the three states have only two pairwise inner products. Based on the optimal success probabilities, we derive the explicit form of 1 → 2 PQC for cloning three real states. The unitary operation needed in the PQC process is worked out too. The optimal success probabilities are also generalized to the M → N PQC case.

  13. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by the NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study

  14. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    Science.gov (United States)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several senior Army and industry executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  15. Probabilistic induction of delayed health hazards in occupational radiation workers

    International Nuclear Information System (INIS)

    Mohamad, M.H.M.; Abdel-Ghani, A.H.

    2003-01-01

    Occupational radiation workers are periodically monitored for their personal occupational dose. Various types of radiation measurement devices are used, mostly film badges and thermoluminescent dosimeters. Several thousand occupational radiation workers were monitored over a period of seven years (Jan. 1995 - Dec. 2001). These included atomic energy personnel, nuclear materials personnel, staff of radiology departments (diagnostic, therapeutic and nuclear medicine) and industrial occupational workers handling industrial radiography equipment, besides other applications of radiation sources in industry. The probability of induction of health hazards in these radiation workers was assessed using the nominal probability coefficient adopted by the ICRP (1991) for both hereditary effects and cancer induction. In this treatise, the data procured are presented and discussed in the light of basic postulations of the probabilistic occurrence of radiation-induced delayed health effects.

  16. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with non-linear objective function and probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into equivalent deterministic one by using Chance Constrained programming. A numerical ...

  17. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum

  18. Evolution of Sr distribution coefficient as a function of time, incubation conditions and measurement technique

    International Nuclear Information System (INIS)

    Wang Guo; Staunton, Siobhan

    2005-01-01

    A thorough understanding of the dynamics of radiostrontium in soil is required to allow accurate long-term predictions of its mobility. We have followed the soil-solution distribution of 85Sr as a function of time under controlled conditions over 4 months and studied the effect of soil moisture content and organic matter amendments. Data have been compared to redox conditions and soil pH. To fuel the ongoing debate on the validity of distribution coefficient (Kd) values measured in dilute suspension, we have compared values obtained from the activity concentration in soil solution recovered by centrifugation to data obtained in suspension, with or without air-drying of the soil samples after incubation. The 85Sr adsorption properties of soil incubated without prior contamination were also measured. There is some time-dependent adsorption of Sr. This is partly due to changing soil composition caused by the decomposition of added organic matter and by anaerobic conditions induced by flooding. There is also a kinetic effect, but adsorption remains largely reversible. Most of the observed effects are lost when soil is suspended in electrolyte solution

  19. [Assessment of ecosystem in giant panda distribution area based on entropy method and coefficient of variation].

    Science.gov (United States)

    Yan, Zhi Gang; Li, Jun Qing

    2017-12-01

    The areas of the habitat and bamboo forest and the size of the wild giant panda population have greatly increased, while habitat fragmentation and local population isolation have also intensified in recent years. Accurate evaluation of the ecosystem status of the giant panda distribution area is important for giant panda conservation. The ecosystems of the distribution area and its six mountain ranges were subdivided into habitat and population subsystems based on hierarchical system theory. Using the panda distribution area as the study area and the three national surveys as time nodes, the evolution of the ecosystems was studied using the entropy method, the coefficient of variation, and correlation analysis. We found that, despite continuous improvement, differences existed in the evolution and present state of the ecosystems of the six mountain ranges, which could be divided into three groups. Ecosystems classified into the same group showed many commonalities, and the differences between groups were considerable. Problems of habitat fragmentation and local population isolation became more serious, resulting in ecosystem degradation. Individualized ecological protection measures should be formulated and implemented in accordance with the conditions in each mountain system to achieve the best results.
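
    The entropy-weight computation referred to in the title is short enough to show directly; the indicator matrix below is invented for illustration and is not the survey data.

```python
# Entropy weights: indicators with more dispersion across units carry more
# information and receive larger weights (hypothetical indicator matrix).
import numpy as np

# rows = mountain ranges, cols = normalized indicators (e.g. habitat area,
# bamboo cover, population size, fragmentation index) -- invented values
X = np.array([[0.8, 0.6, 0.7, 0.4],
              [0.5, 0.9, 0.6, 0.7],
              [0.3, 0.4, 0.8, 0.6],
              [0.9, 0.7, 0.5, 0.8],
              [0.4, 0.5, 0.9, 0.5],
              [0.6, 0.8, 0.4, 0.9]])

P = X / X.sum(axis=0)                          # share of each unit per indicator
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)     # in [0, 1]
weights = (1 - entropy) / (1 - entropy).sum()  # low entropy => high weight
cv = X.std(axis=0, ddof=1) / X.mean(axis=0)    # coefficient of variation per indicator
print("entropy weights:", weights.round(3))
print("coefficients of variation:", cv.round(3))
```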

  20. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate varied high-quality paths, which is desirable for games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method for generating strategic team AI pathfinding plans.
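
    A toy version of the generate-and-test loop, assuming a grid world, a biased random walk as the probabilistic path generator, and a simple fitness of goal distance plus path length; none of this is the paper's actual implementation.

```python
# Generate many randomized paths, score them with a fitness function, and
# keep the best few for variety (grid, moves and fitness are assumptions).
import random

GRID, START, GOAL = 10, (0, 0), (9, 9)
MOVES = [(1, 0), (0, 1), (-1, 0), (0, -1)]

def random_path(max_steps=150):
    pos, path = START, [START]
    for _ in range(max_steps):
        if pos == GOAL:
            break
        # Bias moves toward the goal while keeping randomness for variety.
        w = [4 if pos[0] < GOAL[0] else 1, 4 if pos[1] < GOAL[1] else 1, 1, 1]
        dx, dy = random.choices(MOVES, weights=w)[0]
        pos = (min(max(pos[0] + dx, 0), GRID - 1),
               min(max(pos[1] + dy, 0), GRID - 1))
        path.append(pos)
    return path

def fitness(path):
    # Hypothetical criteria: reward ending near the goal, penalize length.
    dist = abs(GOAL[0] - path[-1][0]) + abs(GOAL[1] - path[-1][1])
    return -10 * dist - len(path)

random.seed(5)
plans = sorted((random_path() for _ in range(2000)), key=fitness, reverse=True)[:3]
print([(len(p), p[-1]) for p in plans])
```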