WorldWideScience

Sample records for random variable representing

  1. Contextuality in canonical systems of random variables

    Science.gov (United States)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
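For two binary (Bernoulli) random variables with parameters p and q, the maximal coupling is explicit: the maximal agreement probability is min(p, q) + min(1-p, 1-q) = 1 - |p-q|. A minimal sketch of the construction (illustrative only, not the authors' code):

```python
# Minimal sketch: maximal coupling of two Bernoulli random variables.
# For X ~ Bernoulli(p) and Y ~ Bernoulli(q), the maximal coupling makes
# P(X = Y) as large as possible while preserving both marginals.

def maximal_coupling(p: float, q: float):
    """Return the joint pmf {(x, y): prob} of the maximal coupling."""
    joint = {(0, 0): min(1 - p, 1 - q),     # mass placed on agreement at 0
             (1, 1): min(p, q),             # mass placed on agreement at 1
             (0, 1): max(q - p, 0.0),       # leftover mass off the diagonal
             (1, 0): max(p - q, 0.0)}
    return joint

if __name__ == "__main__":
    joint = maximal_coupling(0.7, 0.4)
    p_equal = joint[(0, 0)] + joint[(1, 1)]
    print(joint)      # marginals are Bernoulli(0.7) and Bernoulli(0.4)
    print(p_equal)    # 0.7 = 1 - |0.7 - 0.4|
```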

  2. Designing neural networks that process mean values of random variables

    International Nuclear Information System (INIS)

    Barber, Michael J.; Clark, John W.

    2014-01-01

We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence.

  3. Designing neural networks that process mean values of random variables

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Michael J. [AIT Austrian Institute of Technology, Innovation Systems Department, 1220 Vienna (Austria); Clark, John W. [Department of Physics and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Centro de Ciências Matemáticas, Universidade de Madeira, 9000-390 Funchal (Portugal)

    2014-06-13

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence.
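The two records above describe the same work. As a toy illustration of the quantity such networks compute, the mean value of a variable in a Bayesian network, consider a hypothetical two-node network (all numbers invented for the example):

```python
# Toy illustration (not the authors' model): the mean of a binary variable
# in a two-node Bayesian network A -> B, obtained by exact marginalization.

p_a = 0.3                        # P(A = 1)
p_b_given_a = {0: 0.2, 1: 0.9}   # P(B = 1 | A = a)

# E[B] = sum over a of P(A = a) * P(B = 1 | A = a)
mean_b = (1 - p_a) * p_b_given_a[0] + p_a * p_b_given_a[1]
print(mean_b)  # 0.41, the mean value a "neural belief network" would encode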

  4. Strong Decomposition of Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.

    2007-01-01

A random variable X is strongly decomposable if X=Y+Z where Y=Φ(X) and Z=X-Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability of a discrete random variable.

  5. Random sets and random fuzzy sets as ill-perceived random variables an introduction for Ph.D. students and practitioners

    CERN Document Server

    Couso, Inés; Sánchez, Luciano

    2014-01-01

This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on their use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...

  6. Ordered random variables theory and applications

    CERN Document Server

    Shahbaz, Muhammad Qaiser; Hanif Shahbaz, Saman; Al-Zahrani, Bander M

    2016-01-01

Ordered random variables have attracted the attention of many authors. Their basic building block is order statistics, which have several applications in extreme value theory and ordered estimation. The general model for ordered random variables, known as generalized order statistics, was introduced by Kamps (1995).

  7. Contextuality is about identity of random variables

    International Nuclear Information System (INIS)

    Dzhafarov, Ehtibar N; Kujala, Janne V

    2014-01-01

    Contextual situations are those in which seemingly ‘the same’ random variable changes its identity depending on the conditions under which it is recorded. Such a change of identity is observed whenever the assumption that the variable is one and the same under different conditions leads to contradictions when one considers its joint distribution with other random variables (this is the essence of all Bell-type theorems). In our Contextuality-by-Default approach, instead of asking why or how the conditions force ‘one and the same’ random variable to change ‘its’ identity, any two random variables recorded under different conditions are considered different ‘automatically.’ They are never the same, nor are they jointly distributed, but one can always impose on them a joint distribution (probabilistic coupling). The special situations when there is a coupling in which these random variables are equal with probability 1 are considered noncontextual. Contextuality means that such couplings do not exist. We argue that the determination of the identity of random variables by conditions under which they are recorded is not a causal relationship and cannot violate laws of physics. (paper)

  8. A random number generator for continuous random variables

    Science.gov (United States)

    Guerra, V. M.; Tapia, R. A.; Thompson, J. R.

    1972-01-01

A FORTRAN 4 routine is given which may be used to generate random observations of a continuous real-valued random variable. Results for the normal distribution, covering F(x), X, E(Akima), and E(linear), are presented in tabular form.
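The generic method behind such routines is inverse-transform sampling from a tabulated CDF. The sketch below uses linear interpolation of the inverse CDF (the E(Akima) and E(linear) entries in the record suggest the routine compares Akima and linear interpolation; that reading is an assumption), with scipy used only to tabulate the normal CDF:

```python
# Generic inverse-transform sampling: if U ~ Uniform(0, 1) and F is a
# continuous CDF, then F^{-1}(U) has distribution F. Here F^{-1} is
# approximated by linear interpolation on a tabulated grid.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x_grid = np.linspace(-6.0, 6.0, 2001)
cdf_grid = norm.cdf(x_grid)                # tabulated F(x)

u = rng.uniform(size=100_000)
samples = np.interp(u, cdf_grid, x_grid)   # linear-interpolated F^{-1}(u)

print(samples.mean(), samples.std())       # close to 0 and 1
```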

  9. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  10. On Complex Random Variables

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2012-07-01

In this paper, it is shown that a complex multivariate random variable is complex multivariate normal if and only if all nondegenerate complex linear combinations of its components have a complex univariate normal distribution. The characteristic function has been derived, and simpler forms of some theorems have been given using this characterization theorem, without assuming that the variance-covariance matrix is Hermitian positive definite. Marginal distributions have also been given. In addition, a complex multivariate t-distribution has been defined and its density derived. A characterization of the complex multivariate t-distribution is given. A few possible uses of this distribution have been suggested.

  11. Polynomial chaos expansion with random and fuzzy variables

    Science.gov (United States)

    Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.

    2016-06-01

    A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.

  12. Generating variable and random schedules of reinforcement using Microsoft Excel macros.

    Science.gov (United States)

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.

  13. PaCAL: A Python Package for Arithmetic Computations with Random Variables

    Directory of Open Access Journals (Sweden)

Marcin Korzeń

    2014-05-01

In this paper we present PaCAL, a Python package for arithmetical computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated, allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed on each step but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements or estimation of error distributions in scientific computations.
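A minimal usage sketch in the spirit of the package (the class and method names NormalDistr, UniformDistr, mean, cdf follow PaCAL's documentation and should be verified against the installed version):

```python
# Minimal PaCAL sketch (API names per the package documentation; verify
# against the installed version). Random-variable arithmetic reads like
# ordinary Python arithmetic; results are computed numerically.
from pacal import NormalDistr, UniformDistr

X = NormalDistr(0, 1)
Y = UniformDistr(0, 2)
Z = X + Y                  # distribution of the sum, computed numerically

print(Z.mean())            # close to 1.0 = E[X] + E[Y]
print(Z.cdf(1.0))          # CDF of the sum evaluated at 1.0
```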

  14. Maximal Inequalities for Dependent Random Variables

    DEFF Research Database (Denmark)

Hoffmann-Jørgensen, Jørgen

    2016-01-01

Maximal inequalities play a crucial role in many probabilistic limit theorems; for instance, the law of large numbers, the law of the iterated logarithm, the martingale limit theorem and the central limit theorem. Let X_1, X_2, ... be random variables with partial sums S_k = X_1 + ... + X_k. Then a maximal inequality gives conditions ensuring that the maximal partial sum M_n = max_{1≤k≤n} S_k ...
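For orientation, a classical instance is Kolmogorov's maximal inequality, which a short simulation illustrates (illustrative parameters):

```python
# Monte Carlo illustration of a classical maximal inequality (Kolmogorov's):
# for independent zero-mean X_i,  P(max_k |S_k| >= lam) <= Var(S_n) / lam^2.
import numpy as np

rng = np.random.default_rng(1)
n, trials, lam = 100, 20_000, 10.0

x = rng.uniform(-1, 1, size=(trials, n))   # zero-mean increments, Var = 1/3
s = np.cumsum(x, axis=1)                   # partial sums S_1, ..., S_n
m = np.abs(s).max(axis=1)                  # maximal partial sum

print((m >= lam).mean(), "<=", (n / 3) / lam**2)   # empirical tail vs. bound
```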

  15. Benford's law and continuous dependent random variables

    Science.gov (United States)

    Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine

    2018-01-01

Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not about 11% of the time, as one would expect if all digits were equally likely, but rather about 30%. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grows, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates as well as introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the n! terms in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converge to Benford's Law.
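A minimal simulation of such a fragmentation process (binary splitting of a unit stick, which conserves total length) shows the leading digits of the piece lengths approaching Benford's law, P(d) = log10(1 + 1/d):

```python
# Split a unit stick at uniform points repeatedly and tally the first
# digits of the piece lengths; compare with Benford's law.
import numpy as np

rng = np.random.default_rng(2)
pieces = np.ones(1)
for _ in range(12):                          # 12 rounds of binary splitting
    cuts = rng.uniform(size=pieces.size)
    pieces = np.concatenate([pieces * cuts, pieces * (1 - cuts)])

first_digit = (pieces / 10.0 ** np.floor(np.log10(pieces))).astype(int)
empirical = np.bincount(first_digit, minlength=10)[1:] / pieces.size
benford = np.log10(1 + 1 / np.arange(1, 10))
print(np.round(empirical, 3))
print(np.round(benford, 3))
```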

  16. Hoeffding’s Inequality for Sums of Dependent Random Variables

    Czech Academy of Sciences Publication Activity Database

    Pelekis, Christos; Ramon, J.

    2017-01-01

Vol. 14, No. 6 (2017), Article No. 243. ISSN 1660-5446 Institutional support: RVO:67985807 Keywords: dependent random variables * Hoeffding’s inequality * k-wise independent random variables * martingale differences Subject RIV: BA - General Mathematics OECD field: Pure mathematics Impact factor: 0.868, year: 2016
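The record above is bibliographic only; for context, the classical independent-case Hoeffding bound that results of this kind extend can be checked numerically (a minimal sketch with illustrative numbers):

```python
# Numerical check of the classical (independent-case) Hoeffding bound:
# for i.i.d. X_i in [a, b],
#   P(|S_n - E[S_n]| >= t) <= 2 * exp(-2 t^2 / (n (b - a)^2)).
import numpy as np

rng = np.random.default_rng(3)
n, trials, t = 50, 100_000, 8.0
x = rng.uniform(0, 1, size=(trials, n))        # here a, b = 0, 1
dev = np.abs(x.sum(axis=1) - n * 0.5)
print((dev >= t).mean(), "<=", 2 * np.exp(-2 * t**2 / n))
```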

  17. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    Science.gov (United States)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

The lack of understanding in the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide

  18. On the product and ratio of Bessel random variables

    Directory of Open Access Journals (Sweden)

    Saralees Nadarajah

    2005-01-01

The distributions of products and ratios of random variables are of interest in many areas of the sciences. In this paper, the exact distributions of the product |XY| and the ratio |X/Y| are derived when X and Y are independent Bessel function random variables. An application of the results is provided by tabulating the associated percentage points.

  19. Reduction of the Random Variables of the Turbulent Wind Field

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2012-01-01

Applicability of the Probability Density Evolution Method (PDEM) for realizing evolution of the probability density for wind turbines has rather strict bounds on the basic number of the random variables involved in the model. The efficiency of most of the Advanced Monte Carlo (AMC) methods, i.e. Importance Sampling (IS) or Subset Simulation (SS), will be deteriorated on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand, efficiency of the AMC methods is closely dependent on the design points of the problem. Presence of many random variables may increase the number of the design points, hence affects...

  20. A Variable Impacts Measurement in Random Forest for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jae-Hee Hur

    2017-01-01

Recently, the importance of mobile cloud computing has increased. Mobile devices can collect personal data from various sensors within a short period of time, and sensor-based data consists of valuable information from users. Advanced computation power and data analysis technology based on cloud computing provide an opportunity to classify massive sensor data into given labels. The random forest algorithm is known as a black-box model whose internal process is hard to interpret. In this paper, we propose a method that analyzes the variable impact in the random forest algorithm to clarify which variable affects classification accuracy the most. We apply the Shapley value with random forest to analyze the variable impact. Under the assumption that every variable cooperates as a player in a cooperative game situation, the Shapley value fairly distributes the payoff among variables. Our proposed method calculates the relative contributions of the variables within the classification process. We analyze the influence of variables and list the priority of variables that affect classification accuracy. Our proposed method proves its suitability for data interpretation in black-box models like random forests, so the algorithm is applicable in mobile cloud computing environments.
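The paper's Shapley-value computation is not reproduced here; as a stand-in, scikit-learn's permutation importance gives a comparable per-variable impact ranking for a random forest (a built-in dataset stands in for sensor data):

```python
# Stand-in illustration (not the paper's Shapley-value procedure):
# measure per-variable impact on random-forest accuracy with
# scikit-learn's permutation importance.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
imp = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)

ranking = imp.importances_mean.argsort()[::-1]
print(ranking[:5])          # indices of the five most impactful variables
```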

  1. Exponential Inequalities for Positively Associated Random Variables and Applications

    Directory of Open Access Journals (Sweden)

    Yang Shanchao

    2008-01-01

We establish some exponential inequalities for positively associated random variables without the boundedness assumption. These inequalities improve the corresponding results obtained by Oliveira (2005). By one of the inequalities, we obtain the convergence rate for the case of geometrically decreasing covariances, which is close to the optimal achievable convergence rate for independent random variables under the Hartman-Wintner law of the iterated logarithm and improves the convergence rate derived by Oliveira (2005) for the above case.

  2. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

Background: Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results: Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion: We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
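The bias described above is easy to reproduce: with pure-noise predictors, impurity-based importances still prefer the feature with more categories (an illustrative simulation, not the paper's study design):

```python
# With labels independent of both features, an unbiased importance measure
# should give both features near-zero importance; impurity-based importances
# nevertheless favor the high-cardinality feature.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n),      # binary noise feature
    rng.integers(0, 100, n),    # 100-category noise feature
])
y = rng.integers(0, 2, n)       # labels independent of both features

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
print(rf.feature_importances_)  # the 100-category feature dominates
```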

  3. Development of a localized probabilistic sensitivity method to determine random variable regional importance

    International Nuclear Information System (INIS)

    Millwater, Harry; Singh, Gulshan; Cortina, Miguel

    2012-01-01

    There are many methods to identify the important variable out of a set of random variables, i.e., “inter-variable” importance; however, to date there are no comparable methods to identify the “region” of importance within a random variable, i.e., “intra-variable” importance. Knowledge of the critical region of an input random variable (tail, near-tail, and central region) can provide valuable information towards characterizing, understanding, and improving a model through additional modeling or testing. As a result, an intra-variable probabilistic sensitivity method was developed and demonstrated for independent random variables that computes the partial derivative of a probabilistic response with respect to a localized perturbation in the CDF values of each random variable. These sensitivities are then normalized in absolute value with respect to the largest sensitivity within a distribution to indicate the region of importance. The methodology is implemented using the Score Function kernel-based method such that existing samples can be used to compute sensitivities for negligible cost. Numerical examples demonstrate the accuracy of the method through comparisons with finite difference and numerical integration quadrature estimates. - Highlights: ► Probabilistic sensitivity methodology. ► Determines the “region” of importance within random variables such as left tail, near tail, center, right tail, etc. ► Uses the Score Function approach to reuse the samples, hence, negligible cost. ► No restrictions on the random variable types or limit states.
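Not the paper's localized, CDF-perturbation variant, but the basic score-function identity such sampling-based sensitivities build on: for X ~ N(mu, sigma^2), d/dmu E[g(X)] = E[g(X)(X - mu)/sigma^2], so the sensitivity reuses the very samples that estimate the response. A quick check (illustrative response function):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma = 1.0, 0.5
g = lambda x: x**3 + np.sin(x)           # arbitrary illustrative response

z = rng.standard_normal(1_000_000)
x = mu + sigma * z
score_estimate = np.mean(g(x) * (x - mu) / sigma**2)

h = 1e-3                                 # finite-difference check, common random numbers
fd = (g(mu + h + sigma * z).mean() - g(mu - h + sigma * z).mean()) / (2 * h)
print(score_estimate, fd)                # both close to about 4.23
```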

  4. New Results On the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2015-01-01

We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulants, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented.

5. Stochastic Optimal Estimation with Fuzzy Random Variables and Fuzzy Kalman Filtering

    Institute of Scientific and Technical Information of China (English)

    FENG Yu-hu

    2005-01-01

By constructing a mean-square performance index in the case of fuzzy random variables, the optimal estimation theorem for an unknown fuzzy state using fuzzy observation data is given. The state and output of a linear discrete-time dynamic fuzzy system with Gaussian noise are Gaussian fuzzy random variable sequences. An approach to fuzzy Kalman filtering is discussed. Fuzzy Kalman filtering contains two parts: a real-valued non-random recurrence equation and the standard Kalman filtering.

6. Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Dragana Č. Pavlović

    2013-01-01

The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1=R1/r1 and μ2=R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. Random variables R1 and R2, as well as random variables r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, the PDF of the maximum, the PDF of the minimum, and the product moments of an arbitrary number of ratios μi=Ri/ri, i=1,…,L are obtained. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, analytical expressions for the PDF of the minimum and the product moments of the μi, i=1,…,L are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to performance assessment of wireless systems.
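The simplest special case, an independent equal-parameter Rayleigh numerator and denominator, has the closed-form ratio density f(z) = 2z/(1 + z^2)^2, which a quick simulation confirms (the paper's contribution is the much harder correlated cases):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
ratio = rng.rayleigh(1.0, n) / rng.rayleigh(1.0, n)

counts, edges = np.histogram(ratio, bins=200, range=(0.0, 4.0))
centers = 0.5 * (edges[:-1] + edges[1:])
est_pdf = counts / (n * np.diff(edges))      # density estimate; tail mass beyond 4 excluded
analytic = 2 * centers / (1 + centers**2) ** 2
print(np.max(np.abs(est_pdf - analytic)))    # small: simulation matches the pdf
```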

7. Partial summations of stationary sequences of non-Gaussian random variables

    DEFF Research Database (Denmark)

    Mohr, Gunnar; Ditlevsen, Ove Dalager

    1996-01-01

The distribution of the sum of a finite number of identically distributed random variables is in many cases easily determined given that the variables are independent. The moments of any order of the sum can always be expressed by the moments of the single term without computational problems … of convergence of the distribution of a sum (or an integral) of mutually dependent random variables to the Gaussian distribution. The paper is closely related to the work in Ditlevsen et al. [Ditlevsen, O., Mohr, G. & Hoffmeyer, P. Integration of non-Gaussian fields. Prob. Engng Mech 11 (1996) 15-23]. … lognormal variables or polynomials of standard Gaussian variables. The dependency structure is induced by specifying the autocorrelation structure of the sequence of standard Gaussian variables. Particularly useful polynomials are the Winterstein approximations that distributionally fit with non…

8. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    Science.gov (United States)

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
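The article's recursion uses Stirling numbers of the first kind; the closely related textbook identity via Stirling numbers of the second kind, E[X^n] = sum over k of S(n,k) N(N-1)...(N-k+1) p^k for X ~ Binomial(N, p), is easy to verify numerically:

```python
from math import comb

def stirling2(n, k):
    """Stirling number of the second kind via its standard recurrence."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def falling(N, k):
    """Falling factorial N(N-1)...(N-k+1)."""
    out = 1
    for i in range(k):
        out *= N - i
    return out

def binom_raw_moment(n, N, p):
    return sum(stirling2(n, k) * falling(N, k) * p**k for k in range(n + 1))

# brute-force check against the definition of E[X^n]
N, p, n = 10, 0.3, 4
direct = sum(comb(N, x) * p**x * (1 - p) ** (N - x) * x**n for x in range(N + 1))
print(binom_raw_moment(n, N, p), direct)   # identical values
```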

9. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating to the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation. In this relation the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is derived from this relation
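The relation underlying the theorem, f_Y(y) = sum over roots x of g(x) = y of f_X(x)/|g'(x)|, can be checked by simulation; for Y = X^2 with X standard normal it gives the chi-square density with one degree of freedom (a sketch, not the paper's derivation):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
y = rng.standard_normal(n) ** 2              # Y = g(X) = X^2, X ~ N(0, 1)

counts, edges = np.histogram(y, bins=200, range=(0.1, 5.0))
centers = 0.5 * (edges[:-1] + edges[1:])
est_pdf = counts / (n * np.diff(edges))
analytic = np.exp(-centers / 2) / np.sqrt(2 * np.pi * centers)  # chi2(1) pdf
print(np.max(np.abs(est_pdf - analytic)))    # small deviation
```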

10. A review of instrumental variable estimators for Mendelian randomization.

    Science.gov (United States)

    Burgess, Stephen; Small, Dylan S; Thompson, Simon G

    2017-10-01

    Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
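A minimal simulation sketch of the ratio method described above (all numbers invented for illustration; the true causal effect is 0.5):

```python
# A genetic instrument G affects exposure X; an unobserved confounder U
# biases the naive regression of outcome Y on X, but the ratio of
# instrument-outcome to instrument-exposure associations recovers the effect.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
g = rng.binomial(2, 0.3, n)                 # genetic variant, 0/1/2 alleles
u = rng.standard_normal(n)                  # unobserved confounder
x = 0.4 * g + u + rng.standard_normal(n)    # exposure
y = 0.5 * x + u + rng.standard_normal(n)    # outcome; causal effect = 0.5

beta_gx = np.cov(g, x)[0, 1] / np.var(g)    # instrument-exposure association
beta_gy = np.cov(g, y)[0, 1] / np.var(g)    # instrument-outcome association
naive = np.cov(x, y)[0, 1] / np.var(x)      # confounded OLS slope

print(beta_gy / beta_gx)                    # ratio estimate, close to 0.5
print(naive)                                # biased upward by the confounder
```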

11. A Particle Swarm Optimization Algorithm with Variable Random Functions and Mutation

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xiao-Jun; YANG Chun-Hua; GUI Wei-Hua; DONG Tian-Xue

    2014-01-01

The convergence analysis of the standard particle swarm optimization (PSO) has shown that changing the random functions, the personal best and the group best has the potential to improve the performance of the PSO. In this paper, a novel strategy with variable random functions and polynomial mutation is introduced into the PSO, called the particle swarm optimization algorithm with variable random functions and mutation (PSO-RM). Random functions are adjusted with the density of the population so as to manipulate the weights of the cognition part and the social part. Mutation is executed on both the personal best particle and the group best particle to explore new areas. Experimental results have demonstrated the effectiveness of the strategy.

12. On mean square displacement behaviors of anomalous diffusions with variable and random orders

    International Nuclear Information System (INIS)

    Sun Hongguang; Chen Wen; Sheng Hu; Chen Yangquan

    2010-01-01

Mean square displacement (MSD) is used to characterize anomalous diffusion. Recently, models of anomalous diffusion with variable order and random order were proposed, but no MSD analysis has been given so far. The purpose of this Letter is to offer a concise derivation of MSD functions for the variable-order model and the random-order model. Numerical results are presented to illustrate the analytical results. In addition, we show how to establish a variable-random-order model for a given MSD function, which has clear application potential.

13. Limit theorems for multi-indexed sums of random variables

    CERN Document Server

    Klesov, Oleg

    2014-01-01

    Presenting the first unified treatment of limit theorems for multiple sums of independent random variables, this volume fills an important gap in the field. Several new results are introduced, even in the classical setting, as well as some new approaches that are simpler than those already established in the literature. In particular, new proofs of the strong law of large numbers and the Hajek-Renyi inequality are detailed. Applications of the described theory include Gibbs fields, spin glasses, polymer models, image analysis and random shapes. Limit theorems form the backbone of probability theory and statistical theory alike. The theory of multiple sums of random variables is a direct generalization of the classical study of limit theorems, whose importance and wide application in science is unquestionable. However, to date, the subject of multiple sums has only been treated in journals. The results described in this book will be of interest to advanced undergraduates, graduate students and researchers who ...

14. Compound Poisson Approximations for Sums of Random Variables

    OpenAIRE

    Serfozo, Richard F.

    1986-01-01

    We show that a sum of dependent random variables is approximately compound Poisson when the variables are rarely nonzero and, given they are nonzero, their conditional distributions are nearly identical. We give several upper bounds on the total-variation distance between the distribution of such a sum and a compound Poisson distribution. Included is an example for Markovian occurrences of a rare event. Our bounds are consistent with those that are known for Poisson approximations for sums of...

15. Randomized trial of intermittent or continuous amnioinfusion for variable decelerations.

    Science.gov (United States)

    Rinehart, B K; Terrone, D A; Barrow, J H; Isler, C M; Barrilleaux, P S; Roberts, W E

    2000-10-01

    To determine whether continuous or intermittent bolus amnioinfusion is more effective in relieving variable decelerations. Patients with repetitive variable decelerations were randomized to an intermittent bolus or continuous amnioinfusion. The intermittent bolus infusion group received boluses of 500 mL of normal saline, each over 30 minutes, with boluses repeated if variable decelerations recurred. The continuous infusion group received a bolus infusion of 500 mL of normal saline over 30 minutes and then 3 mL per minute until delivery occurred. The ability of the amnioinfusion to abolish variable decelerations was analyzed, as were maternal demographic and pregnancy outcome variables. Power analysis indicated that 64 patients would be required. Thirty-five patients were randomized to intermittent infusion and 30 to continuous infusion. There were no differences between groups in terms of maternal demographics, gestational age, delivery mode, neonatal outcome, median time to resolution of variable decelerations, or the number of times variable decelerations recurred. The median volume infused in the intermittent infusion group (500 mL) was significantly less than that in the continuous infusion group (905 mL, P =.003). Intermittent bolus amnioinfusion is as effective as continuous infusion in relieving variable decelerations in labor. Further investigation is necessary to determine whether either of these techniques is associated with increased occurrence of rare complications such as cord prolapse or uterine rupture.

16. RESEARCH OF THE LAW OF DISTRIBUTION OF THE RANDOM VARIABLE OF THE COMPRESSION

    Directory of Open Access Journals (Sweden)

    I. Sarayeva

    2011-01-01

In research on diagnosing modern automobile engines by means of mathematical statistics, the experimental data on the random variable of compression are analysed, and it is proved that the random variable of compression follows the normal law of distribution.

17. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

Achieving relatively high-accuracy short-term wind speed forecasting estimates is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) are employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.

18. Characteristics of quantum open systems: free random variables approach

    International Nuclear Information System (INIS)

    Gudowska-Nowak, E.; Papp, G.; Brickmann, J.

    1998-01-01

Random Matrix Theory provides an interesting tool for modelling a number of phenomena where noises (fluctuations) play a prominent role. Various applications range from the theory of mesoscopic systems in nuclear and atomic physics to biophysical models, like Hopfield-type models of neural networks and protein folding. Random Matrix Theory is also used to study dissipative systems with broken time-reversal invariance, providing a setup for analysis of dynamic processes in condensed, disordered media. In the paper we use Random Matrix Theory (RMT) within the formalism of Free Random Variables (alias Blue's functions), which allows one to characterize the spectral properties of non-Hermitian ''Hamiltonians''. The relevance of using the Blue's function method is discussed in connection with the application of non-Hermitian operators in various problems of physical chemistry. (author)

  1. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    Science.gov (United States)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t>0, we have that N_α(N_β(t)) is equal in distribution to Σ_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν∈(0,1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t>0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
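A quick numerical check of the first identity (a sketch; parameter values are arbitrary):

```python
# N_a(N_b(t)) equals in distribution the random sum of N_b(t) i.i.d.
# Poisson(a) variables: conditionally on N_b(t) = k, both are Poisson(a*k).
import numpy as np

rng = np.random.default_rng(9)
a, b, t, trials = 2.0, 3.0, 1.0, 100_000

inner = rng.poisson(b * t, trials)                        # N_b(t)
lhs = rng.poisson(a * inner)                              # outer process at N_b(t)
rhs = np.array([rng.poisson(a, k).sum() for k in inner])  # random sum of Poissons

print(lhs.mean(), rhs.mean())    # both ~ a*b*t = 6
print(lhs.var(), rhs.var())      # both ~ a*b*t + a^2*b*t = 18
```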

  2. A Geometrical Framework for Covariance Matrices of Continuous and Categorical Variables

    Science.gov (United States)

    Vernizzi, Graziano; Nakai, Miki

    2015-01-01

    It is well known that a categorical random variable can be represented geometrically by a simplex. Accordingly, several measures of association between categorical variables have been proposed and discussed in the literature. Moreover, the standard definitions of covariance and correlation coefficient for continuous random variables have been…

  3. Zero Distribution of System with Unknown Random Variables Case Study: Avoiding Collision Path

    Directory of Open Access Journals (Sweden)

    Parman Setyamartana

    2014-07-01

This paper presents a stochastic analysis of finding feasible trajectories for robotic-arm motion in an environment with obstacles. The unknown variables are the coefficients of the joint-angle polynomials chosen so that collision-free motion is achieved; ãk is the matrix consisting of these unknown feasible polynomial coefficients. The pattern of feasible polynomials in the obstacle environment appears random. This paper proposes to model this random pattern using random polynomials with unknown variables as coefficients. The behavior of the system is obtained from the zero distribution, which characterizes such random polynomials. Results show that the pattern of random polynomials for collision avoidance can be constructed from the zero distribution. The zero distribution acts as a building block of the system, with obstacles as the uncertainty factor. By a scale factor k within a given range, the random coefficient pattern can be predicted.

  4. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

This paper presents a methodology for screening insignificant random variables and ranking significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from the test-of-hypothesis, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant random variables.

  5. Assessing terpene content variability of whitebark pine in order to estimate representative sample size

    Directory of Open Access Journals (Sweden)

    Stefanović Milena

    2013-01-01

In studies of population variability, particular attention has to be paid to the selection of a representative sample. The aim of this study was to assess the size of a new representative sample on the basis of the variability of the chemical content of the initial sample, using the example of a whitebark pine population. Statistical analysis included the content of 19 characteristics (terpene hydrocarbons and their derivatives) of the initial sample of 10 elements (trees). It was determined that the new sample should contain 20 trees so that the mean value calculated from it represents the basic set with a probability higher than 95%. Determination of the lower limit of the representative sample size that guarantees a satisfactory reliability of generalization proved to be very important in order to achieve cost efficiency of the research. [Projects of the Ministry of Science of the Republic of Serbia, Nos. OI-173011, TR-37002 and III-43007]

  6. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series, with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
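A minimal sketch in the spirit of the study (simulated AR(1) data and invented settings, not the paper's datasets): a random forest fit on a few recent lagged values, used for a one-step forecast.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(10)
n, n_lags = 600, 3                          # few recent lags, as suggested above
series = np.zeros(n)
for i in range(1, n):
    series[i] = 0.8 * series[i - 1] + rng.standard_normal()

# rows of X are (y[t-3], y[t-2], y[t-1]); the target is y[t]
X = np.column_stack([series[lag:n - n_lags + lag] for lag in range(n_lags)])
y = series[n_lags:]

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X[:-1], y[:-1])                      # hold out the final observation
print(rf.predict(X[-1:]), y[-1])            # one-step forecast vs. actual
```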

  7. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    Science.gov (United States)

    Bancroft, Stacie L.; Bourret, Jason C.

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time.…

  8. CONVERGENCE OF THE FRACTIONAL PARTS OF THE RANDOM VARIABLES TO THE TRUNCATED EXPONENTIAL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Bogdan Gheorghe Munteanu

    2013-01-01

Using stochastic approximations, this paper studies the convergence in distribution of the fractional parts of sums of random variables to the truncated exponential distribution with parameter lambda. This is achieved by means of the Fourier-Stieltjes sequence (FSS) of the random variable.

  9. Output variability caused by random seeds in a multi-agent transport simulation model

    DEFF Research Database (Denmark)

    Paulsen, Mads; Rasmussen, Thomas Kjær; Nielsen, Otto Anker

    2018-01-01

Dynamic transport simulators are intended to support decision makers in transport-related issues, and as such it is valuable that the random variability of their outputs is as small as possible. In this study we analyse the output variability caused by random seeds of a multi-agent transport simulator (MATSim) when applied to a case study of Santiago de Chile. Results based on 100 different random seeds show that the relative accuracies of estimated link loads tend to increase with link load, but that relative errors of up to 10% do occur even for links with large volumes. Although

  10. Strong Laws of Large Numbers for Arrays of Rowwise NA and LNQD Random Variables

    Directory of Open Access Journals (Sweden)

    Jiangfeng Wang

    2011-01-01

Some strong laws of large numbers and strong convergence properties for arrays of rowwise negatively associated and linearly negative quadrant dependent random variables are obtained. The results obtained not only generalize the result of Hu and Taylor to negatively associated and linearly negative quadrant dependent random variables, but also improve it.

11. Extended q-Gaussian and q-exponential distributions from gamma random variables

    Science.gov (United States)

    Budini, Adrián A.

    2015-05-01

The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
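One concrete instance of the two-gamma representation (a sketch; the mapping to q in the comments follows the standard q-exponential parametrization, not necessarily the paper's): the ratio of a Gamma(1) and a Gamma(alpha) variable is Lomax distributed, i.e. a q-exponential with q = (alpha + 2)/(alpha + 1).

```python
# G1 ~ Gamma(1) (exponential) and G2 ~ Gamma(alpha), independent, same scale:
# G1/G2 has the Lomax pdf  f(x) = alpha * (1 + x)^(-(alpha + 1)).
import numpy as np

rng = np.random.default_rng(11)
alpha, n = 3.0, 1_000_000
ratio = rng.gamma(1.0, 1.0, n) / rng.gamma(alpha, 1.0, n)

counts, edges = np.histogram(ratio, bins=200, range=(0.0, 5.0))
centers = 0.5 * (edges[:-1] + edges[1:])
est_pdf = counts / (n * np.diff(edges))
analytic = alpha * (1 + centers) ** -(alpha + 1)
print(np.max(np.abs(est_pdf - analytic)))   # small: the densities agree
```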

  12. Reserves Represented by Random Walks

    International Nuclear Information System (INIS)

    Filipe, J A; Ferreira, M A M; Andrade, M

    2012-01-01

The reserves problem is studied through models based on random walks. Random walks are a classical particular case in the analysis of stochastic processes. They are used not only to study reserves evolution models, but also to build more complex systems and as instruments for the theoretical analysis of other kinds of systems. In this work, the main objective of studying reserves is to assess and help guarantee that pension funds remain sustainable. Since the use of these models for this goal is a classical approach in the study of pension funds, this work draws conclusions about the reserves problem. A concrete example is presented.

  13. Piecewise linearisation of the first order loss function for families of arbitrarily distributed random variables

    NARCIS (Netherlands)

    Rossi, R.; Hendrix, E.M.T.

    2014-01-01

We discuss the problem of computing optimal linearisation parameters for the first order loss function of a family of arbitrarily distributed random variables. We demonstrate that, in contrast to the problem in which parameters must be determined for the loss function of a single random variable,

14. An infinite-dimensional weak KAM theory via random variables

    KAUST Repository

    Gomes, Diogo A.; Nurbekyan, Levon

    2016-01-01

    We develop several aspects of the infinite-dimensional Weak KAM theory using a random variables' approach. We prove that the infinite-dimensional cell problem admits a viscosity solution that is a fixed point of the Lax-Oleinik semigroup. Furthermore, we show the existence of invariant minimizing measures and calibrated curves defined on R.

  16. Extensions of von Neumann's method for generating random variables

    International Nuclear Information System (INIS)

    Monahan, J.F.

    1979-01-01

    Von Neumann's method of generating random variables with the exponential distribution and Forsythe's method for obtaining distributions with densities of the form e^(-G(x)) are generalized to apply to certain power series representations. The flexibility of the power series methods is illustrated by algorithms for the Cauchy and geometric distributions.
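
    For reference, von Neumann's original comparison method for the Exp(1) distribution, which the report generalizes, can be sketched as follows (a standard textbook rendering, not the generalized power series algorithm): each round draws uniforms until the first ascent; an even break index accepts the round's first uniform, while an odd one adds 1 to the integer part and retries.

    ```python
    import random

    def von_neumann_exponential():
        """Sample from Exp(1) using only uniform draws and comparisons.

        In each round, P(U1 <= x and the round is accepted) = 1 - e^(-x),
        so rejected rounds contribute integer units to the result.
        """
        integer_part = 0
        while True:
            u1 = random.random()
            prev, k = u1, 1
            while True:
                u = random.random()
                k += 1
                if u > prev:          # first ascent breaks the descending run
                    break
                prev = u
            if k % 2 == 0:            # even break index: accept this round
                return integer_part + u1
            integer_part += 1         # odd: count the round, try again
    ```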

  17. Random and systematic spatial variability of 137Cs inventories at reference sites in South-Central Brazil

    Directory of Open Access Journals (Sweden)

    Correchel Vladia

    2005-01-01

    Full Text Available The precision of the 137Cs fallout redistribution technique for the evaluation of soil erosion rates is strongly dependent on the quality of an average inventory taken at a representative reference site. The knowledge of the sources and of the degree of variation of the 137Cs fallout spatial distribution plays an important role in its use. Four reference sites were selected in the South-Central region of Brazil and characterized in terms of soil chemical, physical and mineralogical aspects as well as the spatial variability of 137Cs inventories. Some important differences in the patterns of 137Cs depth distribution in the soil profiles of the different sites were found. They are probably associated with chemical, physical, mineralogical and biological differences of the soils, but many questions still remain open for future investigation, mainly those regarding the adsorption and dynamics of 137Cs ions in soil profiles under tropical conditions. The random spatial variability (inside each reference site) was higher than the systematic spatial variability (between reference sites), but their causes were not clearly identified as possible consequences of chemical, physical, mineralogical variability, and/or precipitation.

  18. How a dependent's variable non-randomness affects taper equation ...

    African Journals Online (AJOL)

    In order to apply the least squares method in regression analysis, the values of the dependent variable Y should be random. In an example of regression analysis linear and nonlinear taper equations, which estimate the diameter of the tree dhi at any height of the tree hi, were compared. For each tree the diameter at the ...

  19. SOERP, Statistics and 2. Order Error Propagation for Function of Random Variables

    International Nuclear Information System (INIS)

    Cox, N. D.; Miller, C. F.

    1985-01-01

    1 - Description of problem or function: SOERP computes second-order error propagation equations for the first four moments of a function of independently distributed random variables. SOERP was written for a rigorous second-order error propagation of any function which may be expanded in a multivariable Taylor series, the input variables being independently distributed. The required input consists of numbers directly related to the partial derivatives of the function, evaluated at the nominal values of the input variables, and the central moments of the input variables from the second through the eighth. 2 - Method of solution: The development of equations for computing the propagation of errors begins by expressing the function of random variables in a multivariable Taylor series expansion. The Taylor series expansion is then truncated, and statistical operations are applied to the series in order to obtain equations for the moments (about the origin) of the distribution of the computed value. If the Taylor series is truncated after second-order terms, the procedure produces second-order error propagation equations. 3 - Restrictions on the complexity of the problem: The maximum number of component variables allowed is 30. The IBM version will only process one set of input data per run.
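
    A minimal sketch of the propagation formulas SOERP implements (independent inputs, diagonal second derivatives only; mixed second-derivative terms are omitted for brevity, and the numbers below are illustrative rather than SOERP's actual input format):

    ```python
    import numpy as np

    def second_order_moments(f0, grad, hess_diag, var, mu3, mu4):
        """Second-order Taylor propagation of mean and variance of f(X1..Xn)
        for independent inputs, from the partial derivatives at the nominal
        point and the central moments of each input."""
        grad, hess_diag = np.asarray(grad), np.asarray(hess_diag)
        var, mu3, mu4 = np.asarray(var), np.asarray(mu3), np.asarray(mu4)
        mean = f0 + 0.5 * np.sum(hess_diag * var)
        variance = np.sum(grad**2 * var
                          + grad * hess_diag * mu3
                          + 0.25 * hess_diag**2 * (mu4 - var**2))
        return mean, variance

    # f(x, y) = x*y at nominal point (2, 3), unit-variance symmetric inputs:
    # the pure second derivatives vanish, leaving the first-order variance term.
    print(second_order_moments(6.0, [3.0, 2.0], [0.0, 0.0],
                               [1.0, 1.0], [0.0, 0.0], [3.0, 3.0]))
    ```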

  20. Higher order moments of a sum of random variables: remarks and applications.

    Directory of Open Access Journals (Sweden)

    Luisa Tibiletti

    1996-02-01

    Full Text Available The moments of a sum of random variables depend both on the pure moments of each random addendum and on the addenda's mixed moments. In this note we introduce a simple measure to evaluate the relative importance to attach to the latter. Once the pure moments are fixed, the functional relation between the random addenda leading to the extreme values is also provided. Applications to Finance, Decision Theory and Actuarial Sciences are also suggested.
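
    A quick numerical illustration of the point (not from the note itself): two pairs with identical marginal moments but different mixed moments give different moments for the sum:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(200_000)
    noise = rng.standard_normal(200_000)

    for label, y in [("independent", noise),
                     ("dependent  ", 0.8 * x + 0.6 * noise)]:
        # Both choices of y are standard normal marginally.
        s = x + y
        print(label, "Var(X+Y) =", round(s.var(), 2))
    # Roughly 2.0 vs 3.6: the mixed moment Cov(X, Y) drives the difference.
    ```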

  1. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Ahmed, Sajid

    2016-01-13

    Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.

  2. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Ahmed, Sajid; Alouini, Mohamed-Slim; Jardak, Seifallah

    2016-01-01

    Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.

  3. Respiratory variability preceding and following sighs: a resetter hypothesis.

    Science.gov (United States)

    Vlemincx, Elke; Van Diest, Ilse; Lehrer, Paul M; Aubert, André E; Van den Bergh, Omer

    2010-04-01

    Respiratory behavior is characterized by complex variability with structured and random components. Assuming that both a lack of variability and too much randomness represent suboptimal breathing regulation, we hypothesized that sighing acts as a resetter inducing structured variability. Spontaneous breathing was measured in healthy persons (N=42) during a 20-min period of quiet sitting using the LifeShirt® System. Four blocks of 10 breaths with a 50% window overlap were determined before and after spontaneous sighs. Total respiratory variability of minute ventilation was measured using the coefficient of variation, and structured (correlated) variability was quantified using autocorrelation. Towards a sigh, total variability gradually increased without concomitant changes in correlated variability, suggesting that randomness increased. After a sigh, correlated variability increased. No changes in variability were found in comparable epochs without intermediate sighs. We conclude that a sigh resets structured respiratory variability, enhancing information processing in the respiratory system.

  4. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics.

  5. Possibility/Necessity-Based Probabilistic Expectation Models for Linear Programming Problems with Discrete Fuzzy Random Variables

    Directory of Open Access Journals (Sweden)

    Hideki Katagiri

    2017-10-01

    Full Text Available This paper considers linear programming problems (LPPs) where the objective functions involve discrete fuzzy random variables (fuzzy set-valued discrete random variables). New decision making models, which are useful in fuzzy stochastic environments, are proposed based on both possibility theory and probability theory. In multi-objective cases, Pareto optimal solutions of the proposed models are newly defined. Computational algorithms for obtaining the Pareto optimal solutions of the proposed models are provided. It is shown that problems involving discrete fuzzy random variables can be transformed into deterministic nonlinear mathematical programming problems which can be solved through a conventional mathematical programming solver under practically reasonable assumptions. A numerical example of agriculture production problems is given to demonstrate the applicability of the proposed models to real-world problems in fuzzy stochastic environments.

  6. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables) or standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The more precision required, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied for health related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
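
    For a categorical outcome, the calculation described reduces to the standard formula n = z^2 * p(1-p) / d^2, where z is the confidence-level quantile, p the expected proportion, and d the margin of error. A small sketch (textbook formula, not code from the article):

    ```python
    from math import ceil
    from statistics import NormalDist

    def sample_size_proportion(p, margin, confidence=0.95, population=None):
        """n = z^2 * p * (1 - p) / d^2, with an optional
        finite-population correction n / (1 + (n - 1) / N)."""
        z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
        n = z**2 * p * (1 - p) / margin**2
        if population is not None:
            n = n / (1 + (n - 1) / population)
        return ceil(n)

    # Expected prevalence 30%, +/-5% precision, 95% confidence -> n = 323.
    print(sample_size_proportion(0.30, 0.05))
    ```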

  7. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    Science.gov (United States)

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  8. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters
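
    The family-of-distributions idea can be sketched with a generic Monte Carlo over the uncertain parameters (a simplified stand-in, not the paper's variance-based sensitivity analysis): sampling the distribution parameters and applying the law of total variance splits the overall variance of the random variable into a natural-variability part and a parameter-uncertainty part:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_outer = 100_000

    # X ~ Normal(mu, sigma), but sparse data leaves mu and sigma uncertain;
    # the hyper-distributions below are assumed purely for illustration.
    mus = rng.normal(10.0, 0.5, n_outer)       # uncertainty about mu
    sigmas = rng.uniform(1.8, 2.2, n_outer)    # uncertainty about sigma

    # Law of total variance: Var(X) = E[Var(X|params)] + Var(E[X|params]).
    variability = np.mean(sigmas**2)           # natural variability part
    param_uncertainty = np.var(mus)            # parameter-uncertainty part
    print(variability, param_uncertainty, variability + param_uncertainty)
    ```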

  9. Sums and Products of Jointly Distributed Random Variables: A Simplified Approach

    Science.gov (United States)

    Stein, Sheldon H.

    2005-01-01

    Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…

  10. Central limit theorem for the Banach-valued weakly dependent random variables

    International Nuclear Information System (INIS)

    Dmitrovskij, V.A.; Ermakov, S.V.; Ostrovskij, E.I.

    1983-01-01

    The central limit theorem (CLT) for the Banach-valued weakly dependent random variables is proved. In proving CLT convergence of finite-measured (i.e. cylindrical) distributions is established. A weak compactness of the family of measures generated by a certain sequence is confirmed. The continuity of the limiting field is checked

  11. Free random variables

    CERN Document Server

    Voiculescu, Dan; Nica, Alexandru

    1992-01-01

    This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.

  12. AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM

    International Nuclear Information System (INIS)

    Farrell, Sean A.; Murphy, Tara; Lo, Kitty K.

    2015-01-01

    In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.
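
    The training setup described maps naturally onto scikit-learn; a hedged sketch with stand-in data (the actual 3XMM feature set and 2XMMi-DR2 training labels are described in the paper):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Stand-in training table: rows are variable sources, columns are
    # timing/spectral features; y holds manually assigned class labels.
    rng = np.random.default_rng(3)
    X = rng.standard_normal((500, 8))
    y = rng.integers(0, 4, 500)

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```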

  13. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  14. Non-uniform approximations for sums of discrete m-dependent random variables

    OpenAIRE

    Vellaisamy, P.; Cekanavicius, V.

    2013-01-01

    Non-uniform estimates are obtained for Poisson, compound Poisson, translated Poisson, negative binomial and binomial approximations to sums of of m-dependent integer-valued random variables. Estimates for Wasserstein metric also follow easily from our results. The results are then exemplified by the approximation of Poisson binomial distribution, 2-runs and $m$-dependent $(k_1,k_2)$-events.

  15. Stable Graphical Model Estimation with Random Forests for Discrete, Continuous, and Mixed Variables

    OpenAIRE

    Fellinghauer, Bernd; Bühlmann, Peter; Ryffel, Martin; von Rhein, Michael; Reinhardt, Jan D.

    2011-01-01

    A conditional independence graph is a concise representation of pairwise conditional independence among many variables. Graphical Random Forests (GRaFo) are a novel method for estimating pairwise conditional independence relationships among mixed-type, i.e. continuous and discrete, variables. The number of edges is a tuning parameter in any graphical model estimator and there is no obvious number that constitutes a good choice. Stability Selection helps choosing this parameter with respect to...

  16. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant...... and irrelevant variables and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations......, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub Gaussianity of the error terms thereby generalizing the results...

  17. A cellular automata model of traffic flow with variable probability of randomization

    International Nuclear Information System (INIS)

    Zheng Wei-Fan; Zhang Ji-Ye

    2015-01-01

    Research on the stochastic behavior of traffic flow is important to understand the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of vehicles before him, and his decision-making process is related to the interactional potential. Compared with the traditional cellular automata model, the modeling is more suitable for the driver’s random decision-making process based on the vehicle and traffic situations in front of him in actual traffic. From the improved model, the fundamental diagram (flow–density relationship) is obtained, and the detailed high-density traffic phenomenon is reproduced through numerical simulation.
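
    The baseline being modified here is the Nagel-Schreckenberg update rule; the sketch below adds a hook for a gap-dependent randomization probability (the paper derives this probability from an interactional potential, so the placeholder function is purely illustrative):

    ```python
    import random

    V_MAX, P_BASE = 5, 0.3

    def p_random(gap):
        """Placeholder for the variable randomization probability; in the
        paper it derives from an interactional potential of the cars ahead."""
        return min(1.0, P_BASE * (V_MAX + 1) / (gap + 1))

    def step(positions, speeds, road_length):
        """One parallel CA update on a ring road: accelerate, brake to the
        gap, randomize with variable probability, then move."""
        n = len(positions)
        order = sorted(range(n), key=lambda i: positions[i])
        for idx, i in enumerate(order):
            ahead = order[(idx + 1) % n]
            gap = (positions[ahead] - positions[i] - 1) % road_length
            v = min(speeds[i] + 1, V_MAX, gap)             # accelerate, brake
            if v > 0 and random.random() < p_random(gap):  # variable dawdling
                v -= 1
            speeds[i] = v
        for i in range(n):
            positions[i] = (positions[i] + speeds[i]) % road_length
    ```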

  18. On bounds in Poisson approximation for distributions of independent negative-binomial distributed random variables.

    Science.gov (United States)

    Hung, Tran Loc; Giang, Le Truong

    2016-01-01

    Using the Stein-Chen method some upper bounds in Poisson approximation for distributions of row-wise triangular arrays of independent negative-binomial distributed random variables are established in this note.

  19. Multiobjective Two-Stage Stochastic Programming Problems with Interval Discrete Random Variables

    Directory of Open Access Journals (Sweden)

    S. K. Barik

    2012-01-01

    Full Text Available Most of the real-life decision-making problems have more than one conflicting and incommensurable objective function. In this paper, we present a multiobjective two-stage stochastic linear programming problem considering some parameters of the linear constraints as interval type discrete random variables with known probability distribution. Randomness of the discrete intervals is considered for the model parameters. Further, the concepts of best optimum and worst optimum solution are analyzed in two-stage stochastic programming. To solve the stated problem, first we remove the randomness of the problem and formulate an equivalent deterministic linear programming model with multiobjective interval coefficients. Then the deterministic multiobjective model is solved using the weighting method, where we apply the solution procedure of the interval linear programming technique. We obtain the upper and lower bounds of the objective function as the best and the worst values, respectively. This highlights the possible risk involved in the decision-making tool. A numerical example is presented to demonstrate the proposed solution procedure.

  20. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Jardak, Seifallah

    2014-09-01

    Correlated waveforms have a number of applications in different fields, such as radar and communication. It is very easy to generate correlated waveforms using infinite alphabets, but for some applications it is very challenging to use them in practice. Moreover, to generate infinite alphabet constant envelope correlated waveforms, the available research uses iterative algorithms, which are computationally very expensive. In this work, we propose simple novel methods to generate correlated waveforms using finite alphabet constant- and non-constant-envelope symbols. To generate finite alphabet waveforms, the proposed method maps the Gaussian random variables onto the phase-shift-keying, pulse-amplitude, and quadrature-amplitude modulation schemes. For such mapping, the probability density function of the Gaussian random variables is divided into M regions, where M is the number of alphabets in the corresponding modulation scheme. By exploiting the mapping function, the relationship between the cross-correlation of Gaussian and finite alphabet symbols is derived. To generate equiprobable symbols, the area of each region is kept the same. If the requirement is for each symbol to have its own unique probability, the proposed scheme allows that as well. Although the proposed scheme is general, the main focus of this paper is to generate finite alphabet waveforms for multiple-input multiple-output radar, where correlated waveforms are used to achieve desired beampatterns.
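
    The region-based mapping can be sketched as follows (an illustrative rendering with equiprobable regions and a PSK alphabet; the correlation-shaping part of the method is developed in the paper): cut the standard normal PDF at its M-quantiles and replace each Gaussian sample with the symbol of its region:

    ```python
    import numpy as np
    from scipy.stats import norm

    def gaussian_to_psk(samples, M=4):
        """Map Gaussian samples onto M-ary PSK symbols by slicing the
        standard normal PDF into M equiprobable regions (quantile edges)."""
        edges = norm.ppf(np.arange(1, M) / M)      # M-1 interior cut points
        regions = np.searchsorted(edges, samples)  # region index 0..M-1
        return np.exp(2j * np.pi * regions / M)    # unit-envelope symbols

    rng = np.random.default_rng(4)
    print(gaussian_to_psk(rng.standard_normal(10)))
    ```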

  1. Using randomized variable practice in the treatment of childhood apraxia of speech.

    Science.gov (United States)

    Skelton, Steven L; Hagopian, Aubrie Lynn

    2014-11-01

    The purpose of this study was to determine if randomized variable practice, a central component of concurrent treatment, would be effective and efficient in treating childhood apraxia of speech (CAS). Concurrent treatment is a treatment program that takes the speech task hierarchy and randomizes it so that all tasks are worked on in one session. Previous studies have shown the treatment program to be effective and efficient in treating phonological and articulation disorders. The program was adapted to be used with children with CAS. A research design of multiple baselines across participants was used. Probes of generalization to untaught words were administered every fifth session. Three children, ranging in age from 4 to 6 years old, were the participants. Data were collected as percent correct productions during baseline, treatment, and probes of generalization of target sounds to untaught words and three-word phrases. All participants showed an increase in correct productions during treatment and during probes. Effect sizes (standard mean difference) for treatment were 3.61-5.00, and for generalization probes, they were 3.15-8.51. The results obtained from this study suggest that randomized variable practice as used in concurrent treatment can be adapted for use in treating children with CAS. Replication of this study with other children presenting CAS will be needed to establish generality of the findings.

  2. Problems of variance reduction in the simulation of random variables

    International Nuclear Information System (INIS)

    Lessi, O.

    1987-01-01

    The definition of the uniform linear generator is given, and some of the most commonly used tests for evaluating the uniformity and independence of the resulting values are listed. The problem of calculating, through simulation, a moment W of a function of a random variable is considered. The Monte Carlo method enables the moment W to be estimated and the variance of the estimator to be obtained. Some techniques for constructing alternative estimators of W with reduced variance are introduced.
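
    One classic construction of the kind surveyed is antithetic variates (a textbook sketch, not tied to this report's specifics): pairing each uniform U with 1-U gives an unbiased estimator of W = E[f(U)] with reduced variance when f is monotone:

    ```python
    import numpy as np

    f = np.exp                       # estimate W = E[exp(U)] = e - 1
    rng = np.random.default_rng(5)
    u = rng.random(100_000)

    plain = f(u)                             # crude Monte Carlo estimator
    antithetic = 0.5 * (f(u) + f(1.0 - u))   # antithetic-variates estimator

    print("plain     :", plain.mean(), plain.var())
    print("antithetic:", antithetic.mean(), antithetic.var())
    # Both estimators have the same expectation; the antithetic one has a
    # much smaller variance because exp is monotone.
    ```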

  3. On the Distribution of Indefinite Quadratic Forms in Gaussian Random Variables

    KAUST Repository

    Al-Naffouri, Tareq Y.

    2015-10-30

    In this work, we propose a unified approach to evaluating the CDF and PDF of indefinite quadratic forms in Gaussian random variables. Such a quantity appears in many applications in communications, signal processing, information theory, and adaptive filtering. For example, this quantity appears in the mean-square-error (MSE) analysis of the normalized least-mean-square (NLMS) adaptive algorithm, and in the SINR associated with each beam in beamforming applications. The trick of the proposed approach is to replace inequalities that appear in the CDF calculation with unit step functions and to use a complex integral representation of the unit step function. Complex integration then allows us to evaluate the CDF in closed form for the zero-mean case and as a single-dimensional integral for the nonzero-mean case. Utilizing the saddle point technique allows us to closely approximate such integrals in the nonzero-mean case. We demonstrate how our approach can be extended to other scenarios, such as the joint distribution of quadratic forms and ratios of such forms, and to characterize quadratic forms in isotropically distributed random variables. We also evaluate the outage probability in multiuser beamforming using our approach to provide an application of indefinite forms in communications.

  4. An edgeworth expansion for a sum of M-Dependent random variables

    Directory of Open Access Journals (Sweden)

    Wan Soo Rhee

    1985-01-01

    Full Text Available Given a sequence X1, X2, ..., Xn of m-dependent random variables with moments of order 3+α (0 < α ≤ 1), we give an Edgeworth expansion of the distribution of S/σ (S = X1 + X2 + ... + Xn, σ² = ES²) under the assumption that E[exp(itS/σ)] is small away from the origin. The result is of the best possible order.

  5. Analysis of Secret Key Randomness Exploiting the Radio Channel Variability

    Directory of Open Access Journals (Sweden)

    Taghrid Mazloum

    2015-01-01

    Full Text Available A few years ago, physical layer based techniques have started to be considered as a way to improve security in wireless communications. A well known problem is the management of ciphering keys, both regarding the generation and distribution of these keys. A way to alleviate such difficulties is to use a common source of randomness for the legitimate terminals, not accessible to an eavesdropper. This is the case of the fading propagation channel, when exact or approximate reciprocity applies. Although this principle has been known for long, not so many works have evaluated the effect of radio channel properties in practical environments on the degree of randomness of the generated keys. To this end, we here investigate indoor radio channel measurements in different environments and settings at either 2.4625 GHz or 5.4 GHz band, of particular interest for WIFI related standards. Key bits are extracted by quantizing the complex channel coefficients and their randomness is evaluated using the NIST test suite. We then look at the impact of the carrier frequency, the channel variability in the space, time, and frequency degrees of freedom used to construct a long secret key, in relation to the nature of the radio environment such as the LOS/NLOS character.

  6. Analysis of linguistic terms of variables representing the wave of arterial diameter variation in radial arteries using fuzzy entropies

    International Nuclear Information System (INIS)

    Nuno Almirantearena, F; Introzzi, A; Clara, F; Burillo Lopez, P

    2007-01-01

    In this work we use 53 Arterial Diameter Variation (ADV) waves extracted from radial artery of normotense males, along with the values of variables that represent the ADV wave, obtained by means of multivariate analysis. Then, we specify the linguistic variables and the linguistic terms. The variables are fuzzified using triangular and trapezoidal fuzzy numbers. We analyze the fuzziness of the linguistic terms by applying discrete and continuous fuzzy entropies. Finally, we infer which variable presents the greatest disorder associated to the loss of arterial elasticity in radial artery

  7. A simulation study on estimating biomarker-treatment interaction effects in randomized trials with prognostic variables.

    Science.gov (United States)

    Haller, Bernhard; Ulm, Kurt

    2018-02-20

    To individualize treatment decisions based on patient characteristics, identification of an interaction between a biomarker and treatment is necessary. Often such potential interactions are analysed using data from randomized clinical trials intended for comparison of two treatments. Tests of interactions are often lacking statistical power and we investigated if and how a consideration of further prognostic variables can improve power and decrease the bias of estimated biomarker-treatment interactions in randomized clinical trials with time-to-event outcomes. A simulation study was performed to assess how prognostic factors affect the estimate of the biomarker-treatment interaction for a time-to-event outcome, when different approaches, like ignoring other prognostic factors, including all available covariates or using variable selection strategies, are applied. Different scenarios regarding the proportion of censored observations, the correlation structure between the covariate of interest and further potential prognostic variables, and the strength of the interaction were considered. The simulation study revealed that in a regression model for estimating a biomarker-treatment interaction, the probability of detecting a biomarker-treatment interaction can be increased by including prognostic variables that are associated with the outcome, and that the interaction estimate is biased when relevant prognostic variables are not considered. However, the probability of a false-positive finding increases if too many potential predictors are included or if variable selection is performed inadequately. We recommend undertaking an adequate literature search before data analysis to derive information about potential prognostic variables and to gain power for detecting true interaction effects and pre-specifying analyses to avoid selective reporting and increased false-positive rates.

  8. A Statistical and Spectral Model for Representing Noisy Sounds with Short-Time Sinusoids

    Directory of Open Access Journals (Sweden)

    Myriam Desainte-Catherine

    2005-07-01

    Full Text Available We propose an original model for noise analysis, transformation, and synthesis: the CNSS model. Noisy sounds are represented with short-time sinusoids whose frequencies and phases are random variables. This spectral and statistical model represents information about the spectral density of frequencies. This perceptually relevant property is modeled by three mathematical parameters that define the distribution of the frequencies. This model also represents the spectral envelope. The mathematical parameters are defined and the analysis algorithms to extract these parameters from sounds are introduced. Then algorithms for generating sounds from the parameters of the model are presented. Applications of this model include tools for composers, psychoacoustic experiments, and pedagogy.

  9. Application of a random network with a variable geometry of links to the kinetics of drug elimination in healthy and diseased livers

    Science.gov (United States)

    Chelminiak, P.; Dixon, J. M.; Tuszyński, J. A.; Marsh, R. E.

    2006-05-01

    This paper discusses an application of a random network with a variable number of links and traps to the elimination of drug molecules from the body by the liver. The nodes and links represent the transport vessels, and the traps represent liver cells with metabolic enzymes that eliminate drug molecules. By varying the number and configuration of links and nodes, different disease states of the liver related to vascular damage have been simulated, and the effects on the rate of elimination of a drug have been investigated. Results of numerical simulations show the prevalence of exponential decay curves with rates that depend on the concentration of links. In the case of fractal lattices at the percolation threshold, we find that the decay of the concentration is described by exponential functions for high trap concentrations but transitions to stretched exponential behavior at low trap concentrations.

  10. What variables are important in predicting bovine viral diarrhea virus? A random forest approach.

    Science.gov (United States)

    Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo

    2015-07-24

    Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of the disease associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve equal to 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: a) who inseminates the animals, b) number of neighboring farms that have cattle, and c) rectal palpation performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.

  11. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and the 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.

  12. A new reliability measure based on specified minimum distances before the locations of random variables in a finite interval

    International Nuclear Information System (INIS)

    Todinov, M.T.

    2004-01-01

    A new reliability measure is proposed and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the models proposed, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level and a minimum required availability. It is demonstrated that setting reliability requirements solely based on an availability target does not necessarily mean a low risk of premature failure. Even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level
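
    The probability addressed by the derived equations is also easy to estimate by simulation, which provides a sanity check (simulation sketch only; the paper's equations are exact, and the parameter values below are illustrative):

    ```python
    import numpy as np

    def prob_all_gaps_exceed(rate, interval, min_gap, n_trials=100_000):
        """Monte Carlo probability that all consecutive points of a
        homogeneous Poisson process on [0, interval] are >= min_gap apart."""
        rng = np.random.default_rng(6)
        ok = 0
        for _ in range(n_trials):
            n = rng.poisson(rate * interval)
            if n < 2:
                ok += 1                 # fewer than two points: no clustering
                continue
            pts = np.sort(rng.uniform(0.0, interval, n))
            if np.diff(pts).min() >= min_gap:
                ok += 1
        return ok / n_trials

    # Even a moderate number density leaves a substantial clustering chance.
    print("P(clustering) ~", 1 - prob_all_gaps_exceed(0.5, 10.0, 0.5))
    ```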

  13. Equivalent conditions of complete moment convergence for extended negatively dependent random variables

    Directory of Open Access Journals (Sweden)

    Qunying Wu

    2017-05-01

    Full Text Available Abstract In this paper, we study the equivalent conditions of complete moment convergence for sequences of identically distributed extended negatively dependent random variables. As a result, we extend and generalize some results of complete moment convergence obtained by Chow (Bull. Inst. Math. Acad. Sin. 16:177-201, 1988 and Li and Spătaru (J. Theor. Probab. 18:933-947, 2005 from the i.i.d. case to extended negatively dependent sequences.

  14. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  15. Generating Correlated QPSK Waveforms By Exploiting Real Gaussian Random Variables

    KAUST Repository

    Jardak, Seifallah

    2012-11-01

    The design of waveforms with specified auto- and cross-correlation properties has a number of applications in multiple-input multiple-output (MIMO) radar, one of them is the desired transmit beampattern design. In this work, an algorithm is proposed to generate quadrature phase shift- keying (QPSK) waveforms with required cross-correlation properties using real Gaussian random-variables (RV’s). This work can be considered as the extension of what was presented in [1] to generate BPSK waveforms. This work will be extended for the generation of correlated higher-order phase shift-keying (PSK) and quadrature amplitude modulation (QAM) schemes that can better approximate the desired beampattern.

  16. Generating Correlated QPSK Waveforms By Exploiting Real Gaussian Random Variables

    KAUST Repository

    Jardak, Seifallah; Ahmed, Sajid; Alouini, Mohamed-Slim

    2012-01-01

    The design of waveforms with specified auto- and cross-correlation properties has a number of applications in multiple-input multiple-output (MIMO) radar, one of them is the desired transmit beampattern design. In this work, an algorithm is proposed to generate quadrature phase shift- keying (QPSK) waveforms with required cross-correlation properties using real Gaussian random-variables (RV’s). This work can be considered as the extension of what was presented in [1] to generate BPSK waveforms. This work will be extended for the generation of correlated higher-order phase shift-keying (PSK) and quadrature amplitude modulation (QAM) schemes that can better approximate the desired beampattern.

  17. Spontaneous temporal changes and variability of peripheral nerve conduction analyzed using a random effects model

    DEFF Research Database (Denmark)

    Krøigård, Thomas; Gaist, David; Otto, Marit

    2014-01-01

    SUMMARY: The reproducibility of variables commonly included in studies of peripheral nerve conduction in healthy individuals has not previously been analyzed using a random effects regression model. We examined the temporal changes and variability of standard nerve conduction measures in the leg...... reexamined after 2 and 26 weeks. There was no change in the variables except for a minor decrease in sural nerve sensory action potential amplitude and a minor increase in tibial nerve minimal F-wave latency. Reproducibility was best for peroneal nerve distal motor latency and motor conduction velocity......, sural nerve sensory conduction velocity, and tibial nerve minimal F-wave latency. Between-subject variability was greater than within-subject variability. Sample sizes ranging from 21 to 128 would be required to show changes twice the magnitude of the spontaneous changes observed in this study. Nerve...

  18. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    Full Text Available The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.

  19. The Distribution of Minimum of Ratios of Two Random Variables and Its Application in Analysis of Multi-hop Systems

    Directory of Open Access Journals (Sweden)

    A. Stankovic

    2012-12-01

    Full Text Available The distributions of random variables are of interest in many areas of science. In this paper, given the importance of multi-hop transmission in contemporary wireless communications systems operating over fading channels in the presence of cochannel interference, the probability density functions (PDFs) of the minimum of an arbitrary number of ratios of Rayleigh, Rician, Nakagami-m, Weibull and α-µ random variables are derived. These expressions can be used to study the outage probability as an important multi-hop system performance measure.
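
    Such PDFs are straightforward to cross-check by simulation; a sketch for the Rayleigh case (the hop count and scale parameters are illustrative, not from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    hops, n = 3, 200_000          # illustrative 3-hop system

    # Per hop: desired-signal and interference envelopes, both Rayleigh.
    desired = rng.rayleigh(1.0, (n, hops))
    interference = rng.rayleigh(0.5, (n, hops))

    z = (desired / interference).min(axis=1)   # weakest-hop ratio
    print("outage probability ~", np.mean(z < 1.0))
    ```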

  20. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have 'hidden' advantages if the 'common good' produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of 'selfish' alleles. In other words, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  1. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. Of the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.

  2. Clinical Implications of Glucose Variability: Chronic Complications of Diabetes

    Directory of Open Access Journals (Sweden)

    Hye Seung Jung

    2015-06-01

    Full Text Available Glucose variability has been identified as a potential risk factor for diabetic complications; oxidative stress is widely regarded as the mechanism by which glycemic variability induces diabetic complications. However, there remains no generally accepted gold standard for assessing glucose variability. Representative indices for measuring intraday variability include calculation of the standard deviation along with the mean amplitude of glycemic excursions (MAGE. MAGE is used to measure major intraday excursions and is easily measured using continuous glucose monitoring systems. Despite a lack of randomized controlled trials, recent clinical data suggest that long-term glycemic variability, as determined by variability in hemoglobin A1c, may contribute to the development of microvascular complications. Intraday glycemic variability is also suggested to accelerate coronary artery disease in high-risk patients.
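
    MAGE is commonly computed as the mean of the excursion amplitudes between consecutive turning points of the glucose trace that exceed one standard deviation of the whole trace; a simplified sketch of that definition (published implementations differ in detail, and the toy trace is illustrative):

    ```python
    import numpy as np

    def mage(glucose):
        """Mean amplitude of glycemic excursions: average the peak-to-nadir
        swings between consecutive turning points that exceed 1 SD."""
        g = np.asarray(glucose, dtype=float)
        sd = g.std()
        d = np.diff(g)
        # Indices where the trace changes direction (local peaks/nadirs),
        # plus the endpoints of the trace.
        turns = [0] + [i + 1 for i in range(len(d) - 1)
                       if d[i] * d[i + 1] < 0] + [len(g) - 1]
        swings = np.abs(np.diff(g[turns]))
        valid = swings[swings > sd]
        return valid.mean() if valid.size else 0.0

    print(mage([90, 120, 95, 180, 110, 150, 100, 140]))   # toy trace
    ```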

  3. Physical activity, mindfulness meditation, or heart rate variability biofeedback for stress reduction: a randomized controlled trial

    NARCIS (Netherlands)

    van der Zwan, J.E.; de Vente, W.; Huizink, A.C.; Bögels, S.M.; de Bruin, E.I.

    2015-01-01

    In contemporary western societies stress is highly prevalent, therefore the need for stress-reducing methods is great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing

  4. An MGF-based unified framework to determine the joint statistics of partial sums of ordered random variables

    KAUST Repository

    Nam, Sungsik; Alouini, Mohamed-Slim; Yang, Hongchuan

    2010-01-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce an unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs

  5. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-05-02

    General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function and reduces the systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventilation. The PROtective VARiable ventilation trial ('PROVAR') is a single center, randomized controlled trial enrolling 50 patients scheduled for open abdominal surgery expected to last longer than 3 hours. PROVAR compares conventional (non-variable) lung protective ventilation (CV) with variable lung protective ventilation (VV) regarding pulmonary function and inflammatory response. The primary endpoint of the study is the forced vital capacity on the first postoperative day. Secondary endpoints include further lung function tests, plasma cytokine levels, spatial distribution of ventilation assessed by means of electrical impedance tomography, and postoperative pulmonary complications. We hypothesize that VV improves lung function and reduces the systemic inflammatory response compared to CV in patients receiving mechanical ventilation during general anesthesia for open abdominal surgery lasting longer than 3 hours. PROVAR is the first randomized controlled trial aiming at intra- and postoperative effects of VV on lung function. This study may help to define the role of VV during general anesthesia requiring mechanical ventilation. Clinicaltrials.gov NCT01683578 (registered on 3 September 2012).

  6. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  7. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik

    2014-08-01

    The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application on the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile. © 1991-2012 IEEE.
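
    A brute-force Monte Carlo check of such joint statistics is straightforward and makes a useful sanity test for the analytical framework. The sketch below assumes i.n.d. exponential branch powers (a stand-in for a non-uniform power delay profile) and an illustrative capture threshold; both choices are hypothetical, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      L, n_trials = 5, 100_000                     # diversity branches, Monte Carlo trials
      means = np.array([1.0, 0.8, 0.6, 0.4, 0.2])  # i.n.d. exponential branch powers (assumed)

      # draw i.n.d. exponentials and order each row, largest first
      x = rng.exponential(means, size=(n_trials, L))
      x_ordered = -np.sort(-x, axis=1)

      # partial sums: the k strongest branches (combined by a GSC receiver) vs. the rest
      k = 2
      s_top = x_ordered[:, :k].sum(axis=1)
      s_rest = x_ordered[:, k:].sum(axis=1)

      print("E[S_top], E[S_rest]:", s_top.mean(), s_rest.mean())
      print("corr(S_top, S_rest):", np.corrcoef(s_top, s_rest)[0, 1])
      # empirical capture probability with an assumed 0.8 threshold
      print("P(S_top/(S_top+S_rest) > 0.8):", (s_top / (s_top + s_rest) > 0.8).mean())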

  8. Residual and Past Entropy for Concomitants of Ordered Random Variables of Morgenstern Family

    Directory of Open Access Journals (Sweden)

    M. M. Mohie EL-Din

    2015-01-01

    Full Text Available For a system which is observed at time t, the residual and past entropies measure the uncertainty about the remaining and the past life of the distribution, respectively. In this paper, we present the residual and past entropy of the Morgenstern family based on the concomitants of the different types of generalized order statistics (gos) and give the linear transformation of such a model. Characterization results for these dynamic entropies for concomitants of ordered random variables are also considered.
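
    For reference, a sketch of the two quantities the abstract refers to, in the notation common in this literature (assuming $f$, $F$ and $\bar F = 1 - F$ are the density, distribution and survival functions of the lifetime $X$):

      % residual entropy: uncertainty about the remaining life given survival to t
      H(X;t) = -\int_t^{\infty} \frac{f(x)}{\bar F(t)} \ln\frac{f(x)}{\bar F(t)}\,\mathrm{d}x ,

      % past entropy: uncertainty about the past life given failure before t
      \bar H(X;t) = -\int_0^{t} \frac{f(x)}{F(t)} \ln\frac{f(x)}{F(t)}\,\mathrm{d}x .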

  9. Representing Degree Distributions, Clustering, and Homophily in Social Networks With Latent Cluster Random Effects Models.

    Science.gov (United States)

    Krivitsky, Pavel N; Handcock, Mark S; Raftery, Adrian E; Hoff, Peter D

    2009-07-01

    Social network data often involve transitivity, homophily on observed attributes, clustering, and heterogeneity of actor degrees. We propose a latent cluster random effects model to represent all of these features, and we describe a Bayesian estimation method for it. The model is applicable to both binary and non-binary network data. We illustrate the model using two real datasets. We also apply it to two simulated network datasets with the same, highly skewed, degree distribution, but very different network behavior: one unstructured and the other with transitivity and clustering. Models based on degree distributions, such as scale-free, preferential attachment and power-law models, cannot distinguish between these very different situations, but our model does.

  10. Competing order parameters in quenched random alloys: Fe$_{1-x}$Co$_x$Cl$_2$

    International Nuclear Information System (INIS)

    Wong, P.; Horn, P.M.; Birgeneau, R.J.; Safinya, C.R.; Shirane, G.

    1980-01-01

    A study is reported of the magnetic properties of the random alloy Fe$_{1-x}$Co$_x$Cl$_2$, which represents an archetypal example of a system with competing orthogonal spin anisotropies. Behavior similar to previous experiments and theoretical predictions is found, but with important qualitative and quantitative differences; in particular, the phase transition in one variable is drastically altered by the existence of long-range order in the other variable. It is hypothesized that this is due to microscopic random-field effects.

  11. On the strong law of large numbers for $\\varphi$-subgaussian random variables

    OpenAIRE

    Zajkowski, Krzysztof

    2016-01-01

    For $p\\ge 1$ let $\\varphi_p(x)=x^2/2$ if $|x|\\le 1$ and $\\varphi_p(x)=1/p|x|^p-1/p+1/2$ if $|x|>1$. For a random variable $\\xi$ let $\\tau_{\\varphi_p}(\\xi)$ denote $\\inf\\{a\\ge 0:\\;\\forall_{\\lambda\\in\\mathbb{R}}\\; \\ln\\mathbb{E}\\exp(\\lambda\\xi)\\le\\varphi_p(a\\lambda)\\}$; $\\tau_{\\varphi_p}$ is a norm in a space $Sub_{\\varphi_p}=\\{\\xi:\\;\\tau_{\\varphi_p}(\\xi)1$) there exist positive constants $c$ and $\\alpha$ such that for every natural number $n$ the following inequality $\\tau_{\\varphi_p}(\\sum_{i=1...

  12. Convolutions of Heavy Tailed Random Variables and Applications to Portfolio Diversification and MA(1) Time Series

    NARCIS (Netherlands)

    J.L. Geluk (Jaap); L. Peng (Liang); C.G. de Vries (Casper)

    1999-01-01

    The paper characterizes first and second order tail behavior of convolutions of i.i.d. heavy tailed random variables with support on the real line. The result is applied to the problem of risk diversification in portfolio analysis and to the estimation of the parameter in a MA(1) model.
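
    The standard first-order approximation for such convolutions, P(X1 + X2 > t) ≈ 2 P(X1 > t) for large t, is easy to verify numerically. A minimal sketch, assuming Pareto tails with an illustrative tail index (not a value from the paper):

      import numpy as np

      rng = np.random.default_rng(1)
      alpha, n = 1.5, 1_000_000          # assumed tail index and sample size
      x1 = rng.pareto(alpha, n) + 1.0    # classical Pareto: P(X > t) = t**(-alpha), t >= 1
      x2 = rng.pareto(alpha, n) + 1.0

      for t in (10.0, 30.0, 100.0):
          p_sum = np.mean(x1 + x2 > t)
          p_one = np.mean(x1 > t)
          # first-order heavy-tail asymptotics: the sum exceeds t when one summand does
          print(f"t={t:6.1f}  P(sum>t)={p_sum:.2e}  2*P(X>t)={2 * p_one:.2e}")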

  13. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
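
    A common way to realize such correlated sampling (not necessarily the authors' exact algorithm) is a Cholesky factorization of the target covariance, exponentiating the normal draws to obtain log-normal variables; note that in this minimal sketch the correlations are imposed on the underlying normals, and all numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(2)

      corr = np.array([[1.0, 0.6, 0.3],       # target correlation matrix (assumed)
                       [0.6, 1.0, 0.5],
                       [0.3, 0.5, 1.0]])
      mu    = np.array([0.0, 1.0, -0.5])      # normal means (assumed)
      sigma = np.array([1.0, 0.5, 2.0])       # normal standard deviations (assumed)

      cov = corr * np.outer(sigma, sigma)
      L = np.linalg.cholesky(cov)             # correlation imposed via the Cholesky factor

      z = rng.standard_normal((100_000, 3))
      normal_samples = mu + z @ L.T           # correlated multivariate normal draws
      lognormal_samples = np.exp(normal_samples)  # log-normal: exponentiate the normals

      print(np.corrcoef(normal_samples, rowvar=False).round(2))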

  14. Burnout in Customer Service Representatives

    Directory of Open Access Journals (Sweden)

    Tariq Jalees

    2008-09-01

    Full Text Available The purpose and aim of this research was to (1) identify the factors that contribute towards job burnout in sales service representatives, (2) determine the relationships of these factors, and (3) empirically test the relationships of the determinants relating to burnout in customer service representatives. Based on a literature survey, six different variables related to burnout were identified: (1) emotional exhaustion, (2) reduced personal accomplishment, (3) job induced tension, (4) job satisfaction, (5) workload, and (6) job satisfaction. Each of the variables contained 3 sub-variables. Five different hypotheses were developed and tested through techniques such as the Z-test, the F-test and regression analysis. The questionnaire administered for the study contained 15 questions including personal data. The subjects were customer sales service representatives of the Mobilink company in Karachi. The valid sample size was 98, drawn through a multi-cluster technique. Techniques such as measures of dispersion and measures of central tendency were used for analyzing the data. Regression, the Z-test, and the F-test were used for testing the developed hypotheses. According to the respondents' opinions, reduced personal accomplishment had a high rating with a mean of 3.75, and job induced tension had the lowest mean of 3.58. The standard deviation of respondents' opinions was highest for the depersonalization dimension and least for the workload dimension. This indicates that there is a high polarization of the respondents' opinions on the depersonalization dimension and least on the workload dimension. The skewnesses for all the dimensions were negative except for the determinants emotional exhaustion and workload. This indicates that the majority of respondents' opinions on all the dimensions were below the mean except in the case of emotional exhaustion and workload. Five hypotheses were developed and tested: (a) The hypothesis relating to low level of burnout in customers

  15. Simulation-based production planning for engineer-to-order systems with random yield

    NARCIS (Netherlands)

    Akcay, Alp; Martagan, Tugce

    2018-01-01

    We consider an engineer-to-order production system with unknown yield. We model the yield as a random variable which represents the percentage output obtained from one unit of production quantity. We develop a beta-regression model in which the mean value of the yield depends on the unique

  16. Simple, efficient estimators of treatment effects in randomized trials using generalized linear models to leverage baseline variables.

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J

    2010-04-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation.
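
    A small simulation illustrates the special case described above: a main-terms Poisson working model, deliberately misspecified, still recovers the marginal log rate ratio of a randomized treatment. All data-generating choices below are hypothetical stand-ins for a trial dataset.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 5000
      treat = rng.integers(0, 2, n)          # randomized treatment assignment
      baseline = rng.normal(size=n)          # baseline covariate

      # true outcome model is NOT log-linear in the covariate -> working model misspecified
      rate = np.exp(0.5 * treat + np.abs(baseline))
      y = rng.poisson(rate)

      # main-terms Poisson working model: intercept + treatment + baseline covariate
      X = sm.add_constant(np.column_stack([treat, baseline]))
      fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
      print("treatment coefficient (estimates marginal log rate ratio):", fit.params[1])

      # unadjusted reference: log of the ratio of sample mean outcomes (true value is 0.5)
      print("log(mean_treated / mean_control):",
            np.log(y[treat == 1].mean() / y[treat == 0].mean()))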

  17. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636

  18. The use of random amplified polymorphic DNA to evaluate the genetic variability of Ponkan mandarin (Citrus reticulata Blanco) accessions

    Directory of Open Access Journals (Sweden)

    Coletta Filho Helvécio Della

    2000-01-01

    Full Text Available RAPD analysis of 19 Ponkan mandarin accessions was performed using 25 random primers. Of 112 amplification products selected, only 32 were polymorphic across five accessions. The absence of genetic variability among the other 14 accessions suggested that they were either clonal propagations with different local names, or that they had undetectable genetic variability, such as point mutations which cannot be detected by RAPD.

  19. THE COVARIATION FUNCTION FOR SYMMETRIC α-STABLE RANDOM VARIABLES WITH FINITE FIRST MOMENTS

    Directory of Open Access Journals (Sweden)

    Dedi Rosadi

    2012-05-01

    Full Text Available In this paper, we discuss a generalized dependence measure which is designed to measure the dependence of two symmetric α-stable random variables with finite mean (1 < α ≤ 2) and contains the covariance function as the special case (when α = 2). We briefly discuss some basic properties of the function and consider several methods to estimate the function, and further investigate the numerical properties of the estimator using simulated data. We show how to apply this function to measure the dependence of some stock returns on the composite index LQ45 in the Indonesia Stock Exchange.

  20. A Method of Approximating Expectations of Functions of Sums of Independent Random Variables

    OpenAIRE

    Klass, Michael J.

    1981-01-01

    Let $X_1, X_2, \\cdots$ be a sequence of independent random variables with $S_n = \\sum^n_{i = 1} X_i$. Fix $\\alpha > 0$. Let $\\Phi(\\cdot)$ be a continuous, strictly increasing function on $\\lbrack 0, \\infty)$ such that $\\Phi(0) = 0$ and $\\Phi(cx) \\leq c^\\alpha\\Phi(x)$ for all $x > 0$ and all $c \\geq 2$. Suppose $a$ is a real number and $J$ is a finite nonempty subset of the positive integers. In this paper we are interested in approximating $E \\max_{j \\in J} \\Phi(|a + S_j|)$. We construct a nu...

  1. Effects of randomness on chaos and order of coupled logistic maps

    International Nuclear Information System (INIS)

    Savi, Marcelo A.

    2007-01-01

    Natural systems are essentially nonlinear, being neither completely ordered nor completely random. These nonlinearities are responsible for a great variety of possibilities that includes chaos. On this basis, the effect of randomness on chaos and order of nonlinear dynamical systems is an important feature to be understood. This Letter considers randomness as fluctuations and uncertainties due to noise and investigates its influence on the nonlinear dynamical behavior of coupled logistic maps. The noise effect is included by adding random variations either to parameters or to state variables. Besides, the coupling uncertainty is investigated by assuming tiny values for the connection parameters, representing the idea that all Nature is, in some sense, weakly connected. Results from numerical simulations show situations where noise alters the system's nonlinear dynamics.
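
    A minimal version of such an experiment adds noise to the map parameter and couples two logistic maps weakly. The diffusive coupling form and all parameter values below are illustrative assumptions, not those of the Letter.

      import numpy as np

      rng = np.random.default_rng(5)

      n_steps, r, eps = 2000, 3.8, 0.05   # chaotic parameter, weak coupling (assumed)
      noise = 1e-3                        # amplitude of random parameter fluctuations

      x = rng.random(2)                   # states of the two coupled logistic maps
      traj = np.empty((n_steps, 2))
      for t in range(n_steps):
          r_t = r + noise * rng.standard_normal(2)   # noise added to the parameter
          f = r_t * x * (1.0 - x)                    # logistic map f(x) = r x (1 - x)
          x = (1 - eps) * f + eps * f[::-1]          # weak diffusive coupling
          traj[t] = x

      # a crude synchronization diagnostic over the second half of the run
      print("mean |x1 - x2|:", np.abs(traj[1000:, 0] - traj[1000:, 1]).mean())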

  2. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient.

    Science.gov (United States)

    Plass-Johnson, Jeremiah G; Taylor, Marc H; Husain, Aidah A A; Teichberg, Mirta C; Ferse, Sebastian C A

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased

  3. Context-invariant quasi hidden variable (qHV) modelling of all joint von Neumann measurements for an arbitrary Hilbert space

    International Nuclear Information System (INIS)

    Loubenets, Elena R.

    2015-01-01

    We prove the existence for each Hilbert space of the two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations—via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations, but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper point also to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].

  4. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I. Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II. Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III. Distributions; Ide...

  5. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables, included in the original spectral representation (OSR) formula, could be effectively reduced to only two elementary random variables by introducing the random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully emerged through just several hundred of sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it could be able to implement dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses a good robustness.

  6. Uniformity transition for ray intensities in random media

    Science.gov (United States)

    Pradas, Marc; Pumir, Alain; Wilkinson, Michael

    2018-04-01

    This paper analyses a model for the intensity distribution for rays propagating without absorption in a random medium. The random medium is modelled as a dynamical map. After $N$ iterations, the intensity is modelled as a sum $S$ of $\mathcal{N}$ contributions from different trajectories, each of which is a product of $N$ independent identically distributed random variables $x_k$, representing successive focussing or de-focussing events. The number of ray trajectories reaching a given point is assumed to proliferate exponentially: $\mathcal{N}=\Lambda^N$, for some $\Lambda>1$. We investigate the probability distribution of $S$. We find a phase transition as parameters of the model are varied. There is a phase where the fluctuations of $S$ are suppressed as $N\to\infty$, and a phase where $S$ has large fluctuations, for which we provide a large deviation analysis.
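
    The model is easy to sample directly for small $N$: with $\mathcal{N} = \Lambda^N$ trajectories, each contributing a product of $N$ i.i.d. factors, the relative fluctuations of $S$ can be estimated by Monte Carlo. The log-normal choice for $x_k$ below is an assumption made for illustration only.

      import numpy as np

      rng = np.random.default_rng(6)

      N, Lam = 12, 2            # N map iterations, branching factor Lambda (assumed small)
      n_rays = Lam ** N         # trajectories proliferate as Lambda**N
      n_trials = 500

      S = np.empty(n_trials)
      for i in range(n_trials):
          # each trajectory contributes a product of N i.i.d. focussing factors x_k;
          # log-normal x_k with E[ln x] = 0 is a convenient assumed choice
          log_x = 0.5 * rng.standard_normal((n_rays, N))
          S[i] = np.exp(log_x.sum(axis=1)).sum()

      print("mean S:", S.mean(), "  relative std of S:", S.std() / S.mean())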

  7. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely.

    Science.gov (United States)

    Widaman, Keith F; Grimm, Kevin J; Early, Dawnté R; Robins, Richard W; Conger, Rand D

    2013-07-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group.
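
    The random-number device can be sketched in a few lines: the group missing a manifest variable gets pseudo-random normal deviates so that all groups present the same variables, and the multiple-group model must then be specified so these filler values load on nothing in that group. Variable names and sample sizes below are hypothetical.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(7)

      # group B never measured variable 'x3' (missing completely); both groups share x1, x2
      group_a = pd.DataFrame(rng.standard_normal((200, 3)), columns=["x1", "x2", "x3"])
      group_b = pd.DataFrame(rng.standard_normal((150, 2)), columns=["x1", "x2"])

      # fill the absent column with pseudo-random normal deviates so both groups
      # present identical manifest variables; the SEM specification must then treat
      # x3 as pure noise (no loadings, free unique variance) in group B
      group_b["x3"] = rng.standard_normal(len(group_b))
      print(group_b.head())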

  8. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effect in observational studies. Built on structural mean models, there has been considerable work recently developed for consistent estimation of causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158

  9. Subjective measures of household resilience to climate variability and change: insights from a nationally representative survey of Tanzania

    Directory of Open Access Journals (Sweden)

    Lindsey Jones

    2018-03-01

    Full Text Available Promoting household resilience to climate extremes has emerged as a key development priority. Yet tracking and evaluating resilience at this level remains a critical challenge. Most quantitative approaches rely on objective indicators and assessment frameworks, but these are not fully satisfactory. Much of the difficulty arises from a combination of conceptual ambiguities, challenges in selecting appropriate indicators, and in measuring the many intangible aspects that contribute to household resilience. More recently, subjective measures of resilience have been advocated in helping to overcome some of the limitations of traditional objective characterizations. However, few large-scale studies of quantitative subjective approaches to resilience measurement have been conducted. In this study, we address this gap by exploring perceived levels of household resilience to climate extremes in Tanzania and the utility of standardized subjective methods for its assessment. A nationally representative cross-sectional survey involving 1294 individuals was carried out by mobile phone in June 2015 among randomly selected adult respondents aged 18 and above. Factors that are most associated with resilience-related capacities are having had advance knowledge of a previous flood, and to a lesser extent, believing flooding to be a serious community problem. Somewhat surprisingly, though a small number of weak relationships are apparent, most socio-demographic variables do not exhibit statistically significant differences with regards to perceived resilience-related capacities. These findings may challenge traditional assumptions about what factors characterize household resilience, offering a motivation for studying both subjective and objective perspectives, and understanding better their relationship to one another. If further validated, subjective measures may offer potential as both a complement and alternative to traditional objective methods of resilience measurement.

  10. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    Science.gov (United States)

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.

  11. Convolutions of Heavy Tailed Random Variables and Applications to Portfolio Diversification and MA(1) Time Series

    OpenAIRE

    Geluk, Jaap; Peng, Liang; de Vries, Casper G.

    1999-01-01

    Suppose X1,X2 are independent random variables satisfying a second-order regular variation condition on the tail-sum and a balance condition on the tails. In this paper we give a description of the asymptotic behaviour as t → ∞ for P(X1 + X2 > t). The result is applied to the problem of risk diversification in portfolio analysis and to the estimation of the parameter in a MA(1) model.

  12. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
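
    A generic backward-elimination wrapper of the kind examined (not the authors' exact procedure) can be sketched with scikit-learn; note the paper's caution that accuracy must be validated on folds external to the selection loop, which this toy version does not do. Synthetic data stand in for the StreamCat predictors.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      # synthetic stand-in for the good/poor stream-condition data (assumed shapes)
      X, y = make_classification(n_samples=1000, n_features=50, n_informative=8,
                                 random_state=0)

      def backward_elimination(X, y, drop_frac=0.2, min_features=5):
          """Iteratively drop the least important fraction of predictors."""
          keep = np.arange(X.shape[1])
          best_keep, best_score = keep, -np.inf
          while len(keep) >= min_features:
              rf = RandomForestClassifier(n_estimators=300, random_state=0, n_jobs=-1)
              score = cross_val_score(rf, X[:, keep], y, cv=5).mean()
              if score > best_score:
                  best_keep, best_score = keep.copy(), score
              rf.fit(X[:, keep], y)
              order = np.argsort(rf.feature_importances_)   # least important first
              n_drop = max(1, int(drop_frac * len(keep)))
              keep = keep[order[n_drop:]]                   # retain the rest
          return best_keep, best_score

      subset, score = backward_elimination(X, y)
      print(len(subset), "predictors kept, CV accuracy ~", round(score, 3))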

  13. Drop Spreading with Random Viscosity

    Science.gov (United States)

    Xu, Feng; Jensen, Oliver

    2016-11-01

    Airway mucus acts as a barrier to protect the lung. However as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show for example how variability in the drop location is a non-monotonic function of the solute correlation length increases. Engineering and Physical Sciences Research Council.

  14. Vertical random variability of the distribution coefficient in the soil and its effect on the migration of fallout radionuclides

    International Nuclear Information System (INIS)

    Bunzl, K.

    2002-01-01

    In the field, the distribution coefficient, K_d, for the sorption of a radionuclide by the soil cannot be expected to be constant. Even in a well defined soil horizon, K_d will vary stochastically in horizontal as well as in vertical direction around a mean value. The horizontal random variability of K_d produces a pronounced tailing effect in the concentration depth profile of a fallout radionuclide; much less is known on the corresponding effect of the vertical random variability. To analyze this effect theoretically, the classical convection-dispersion model in combination with the random-walk particle method was applied. The concentration depth profile of a radionuclide was calculated one year after deposition assuming constant values of the pore water velocity, the diffusion/dispersion coefficient, and the distribution coefficient (K_d = 100 cm³ g⁻¹), and exhibiting a vertical variability for K_d according to a log-normal distribution with a geometric mean of 100 cm³ g⁻¹ and a coefficient of variation of CV = 0.53. The results show that these two concentration depth profiles are only slightly different: the location of the peak is shifted somewhat upwards, and the dispersion of the concentration depth profile is slightly larger. A substantial tailing effect of the concentration depth profile is not perceivable. Especially with respect to the location of the peak, a very good approximation of the concentration depth profile is obtained if the arithmetic mean of the K_d values (K_d = 113 cm³ g⁻¹) and a slightly increased dispersion coefficient are used in the analytical solution of the classical convection-dispersion equation with constant K_d. The evaluation of the observed concentration depth profile with the analytical solution of the classical convection-dispersion equation with constant parameters will, within the usual experimental limits, hardly reveal the presence of a log-normal random distribution of K_d in the vertical direction in
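
    A random-walk particle sketch of the retarded transport model, with a log-normal K_d matching the stated geometric mean and CV. The soil parameters are assumed for illustration, and redrawing K_d at each step is only a crude proxy for the vertical variability studied in the paper.

      import numpy as np

      rng = np.random.default_rng(13)

      # 1-D advection-dispersion random walk; retardation R = 1 + (rho_b/theta)*K_d
      n_p, n_steps, dt = 50_000, 365, 1.0   # particles, daily steps, one year
      v, D = 0.05, 0.01                     # pore-water velocity (cm/d), dispersion (cm^2/d)
      rho_b_over_theta = 4.0                # bulk density / water content (assumed)

      # log-normal K_d: geometric mean 100 cm^3/g, CV = 0.53 as in the study
      sigma_ln = np.sqrt(np.log(1 + 0.53 ** 2))
      z = np.zeros(n_p)                     # depth below surface (cm)
      for _ in range(n_steps):
          kd = rng.lognormal(np.log(100.0), sigma_ln, n_p)  # fresh K_d each depth step
          R = 1 + rho_b_over_theta * kd
          z += (v / R) * dt + np.sqrt(2 * (D / R) * dt) * rng.standard_normal(n_p)
          np.clip(z, 0.0, None, out=z)      # particles cannot leave through the surface

      print("median depth after one year ~", np.median(z), "cm")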

  15. An AUC-based permutation variable importance measure for random forests.

    Science.gov (United States)

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
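
    The idea of the AUC-based permutation VIM (scoring the drop in AUC rather than in error rate when a predictor is permuted) can be sketched as follows. Unlike the implementation in the party package, this toy version evaluates on a held-out set rather than on per-tree out-of-bag samples, and the imbalance level is an assumed example.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      # unbalanced two-class data (assumed 9:1 imbalance for illustration)
      X, y = make_classification(n_samples=3000, n_features=20, n_informative=5,
                                 weights=[0.9, 0.1], random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
      rng = np.random.default_rng(0)

      def auc_permutation_vim(rf, X, y, n_rep=5):
          """Average AUC loss when each predictor is permuted in turn."""
          base = roc_auc_score(y, rf.predict_proba(X)[:, 1])
          vim = np.zeros(X.shape[1])
          for j in range(X.shape[1]):
              for _ in range(n_rep):
                  Xp = X.copy()
                  Xp[:, j] = rng.permutation(Xp[:, j])
                  vim[j] += base - roc_auc_score(y, rf.predict_proba(Xp)[:, 1])
          return vim / n_rep

      print("top predictors:", np.argsort(auc_permutation_vim(rf, X_te, y_te))[::-1][:5])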

  16. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
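
    For the indicator variables Z_i of a k/N draw without replacement, the hypergeometric model gives E[Z_i] = k/N and Cov(Z_i, Z_j) = k(k-1)/(N(N-1)) - (k/N)² for i ≠ j. A quick simulation of a 6/49-style game (an assumed configuration, since exact rules vary by country) reproduces both values.

      import numpy as np

      rng = np.random.default_rng(8)
      N, k, n_draws = 49, 6, 100_000    # a 6/49-type lotto (assumed)

      # Z[d, i] = 1 if number i+1 is among the k drawn in draw d (no replacement)
      Z = np.zeros((n_draws, N))
      for d in range(n_draws):
          Z[d, rng.choice(N, size=k, replace=False)] = 1

      print("E[Z_1]  empirical vs theory:", Z[:, 0].mean(), k / N)
      cov_emp = np.cov(Z[:, 0], Z[:, 1])[0, 1]
      cov_th = k * (k - 1) / (N * (N - 1)) - (k / N) ** 2   # negative by construction
      print("Cov(Z_1,Z_2) empirical vs theory:", round(cov_emp, 5), round(cov_th, 5))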

  17. On a randomly imperfect spherical cap pressurized by a random ...

    African Journals Online (AJOL)

    In this paper, we investigate a dynamical system in a random setting of dual randomness in space and time variables, in which both the imperfection of the structure and the load function are considered random, each with a statistical zero mean. The auto-covariance of the load is correlated as an exponentially decaying ...

  18. How well can body size represent effects of the environment on demographic rates? Disentangling correlated explanatory variables.

    Science.gov (United States)

    Brooks, Mollie E; Mugabo, Marianne; Rodgers, Gwendolen M; Benton, Timothy G; Ozgul, Arpat

    2016-03-01

    Demographic rates are shaped by the interaction of past and current environments that individuals in a population experience. Past environments shape individual states via selection and plasticity, and fitness-related traits (e.g. individual size) are commonly used in demographic analyses to represent the effect of past environments on demographic rates. We quantified how well the size of individuals captures the effects of a population's past and current environments on demographic rates in a well-studied experimental system of soil mites. We decomposed these interrelated sources of variation with a novel method of multiple regression that is useful for understanding nonlinear relationships between responses and multicollinear explanatory variables. We graphically present the results using area-proportional Venn diagrams. Our novel method was developed by combining existing methods and expanding upon them. We showed that the strength of size as a proxy for the past environment varied widely among vital rates. For instance, in this organism with an income breeding life history, the environment had more effect on reproduction than individual size, but with substantial overlap indicating that size encompassed some of the effects of the past environment on fecundity. This demonstrates that the strength of size as a proxy for the past environment can vary widely among life-history processes within a species, and this variation should be taken into consideration in trait-based demographic or individual-based approaches that focus on phenotypic traits as state variables. Furthermore, the strength of a proxy will depend on what state variable(s) and what demographic rate is being examined; that is, different measures of body size (e.g. length, volume, mass, fat stores) will be better or worse proxies for various life-history processes. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.

  19. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.

  20. Genetic Variants Contribute to Gene Expression Variability in Humans

    Science.gov (United States)

    Hulse, Amanda M.; Cai, James J.

    2013-01-01

    Expression quantitative trait loci (eQTL) studies have established convincing relationships between genetic variants and gene expression. Most of these studies focused on the mean of gene expression level, but not the variance of gene expression level (i.e., gene expression variability). In the present study, we systematically explore genome-wide association between genetic variants and gene expression variability in humans. We adapt the double generalized linear model (dglm) to simultaneously fit the means and the variances of gene expression among the three possible genotypes of a biallelic SNP. The genomic loci showing significant association between the variances of gene expression and the genotypes are termed expression variability QTL (evQTL). Using a data set of gene expression in lymphoblastoid cell lines (LCLs) derived from 210 HapMap individuals, we identify cis-acting evQTL involving 218 distinct genes, among which 8 genes, ADCY1, CTNNA2, DAAM2, FERMT2, IL6, PLOD2, SNX7, and TNFRSF11B, are cross-validated using an extra expression data set of the same LCLs. We also identify ∼300 trans-acting evQTL between >13,000 common SNPs and 500 randomly selected representative genes. We employ two distinct scenarios, emphasizing single-SNP and multiple-SNP effects on expression variability, to explain the formation of evQTL. We argue that detecting evQTL may represent a novel method for effectively screening for genetic interactions, especially when the multiple-SNP influence on expression variability is implied. The implication of our results for revealing genetic mechanisms of gene expression variability is discussed. PMID:23150607

  1. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    Science.gov (United States)

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computation complexity.

  2. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Richards, James [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-01

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve demand over the evolution of many years or decades. Various CEM formulations are used to evaluate systems ranging in scale from states or utility service territories to national or multi-national systems. CEMs can be computationally complex, and to achieve acceptable solve times, key parameters are often estimated using simplified methods. In this paper, we focus on two of these key parameters associated with the integration of variable generation (VG) resources: capacity value and curtailment. We first discuss common modeling simplifications used in CEMs to estimate capacity value and curtailment, many of which are based on a representative subset of hours that can miss important tail events or which require assumptions about the load and resource distributions that may not match actual distributions. We then present an alternate approach that captures key elements of chronological operation over all hours of the year without the computationally intensive economic dispatch optimization typically employed within more detailed operational models. The updated methodology characterizes the (1) contribution of VG to system capacity during high load and net load hours, (2) the curtailment level of VG, and (3) the potential reductions in curtailments enabled through deployment of storage and more flexible operation of select thermal generators. We apply this alternate methodology to an existing CEM, the Regional Energy Deployment System (ReEDS). Results demonstrate that this alternate approach provides more accurate estimates of capacity value and curtailments by explicitly capturing system interactions across all hours of the year. This approach could be applied more broadly to CEMs at many different scales where hourly resource and load data is available, greatly improving the representation of challenges
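
    The two quantities at the heart of the method can be sketched over a synthetic 8760-hour year: the VG contribution during the top net-load hours, and the energy curtailed above an assumed must-run floor. All profiles and parameters below are invented stand-ins, not ReEDS inputs.

      import numpy as np

      rng = np.random.default_rng(9)
      H = 8760                                  # all hours of one year
      t = np.arange(H)

      # assumed hourly profiles: load with daily/seasonal shape, noisy solar-like VG
      load = (1000 + 200 * np.sin(2 * np.pi * t / 24)
              + 100 * np.sin(2 * np.pi * t / H) + 30 * rng.standard_normal(H))
      vg_cap = 300.0
      vg = vg_cap * np.clip(np.sin(2 * np.pi * t / 24 - np.pi / 2), 0, None) \
                  * rng.uniform(0.5, 1.0, H)

      net_load = load - vg

      # (1) capacity value: average VG output over the top-100 net-load hours
      top = np.argsort(net_load)[-100:]
      cap_value = vg[top].mean() / vg_cap

      # (2) curtailment: VG in excess of load minus an assumed must-run floor
      must_run = 600.0
      curtailed = np.clip(vg - (load - must_run), 0, None)
      print(f"capacity value ~ {cap_value:.2f} of nameplate; "
            f"curtailment ~ {curtailed.sum() / vg.sum():.1%} of VG energy")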

  3. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    Science.gov (United States)

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted.

  4. Comparison of variability in pork carcass composition and quality between barrows and gilts.

    Science.gov (United States)

    Overholt, M F; Arkfeld, E K; Mohrhauser, D A; King, D A; Wheeler, T L; Dilger, A C; Shackelford, S D; Boler, D D

    2016-10-01

    Pigs (n = 8,042) raised in 8 different barns representing 2 seasons (cold and hot) and 2 production focuses (lean growth and meat quality) were used to characterize variability of carcass composition and quality traits between barrows and gilts. Data were collected on 7,684 pigs at the abattoir. Carcass characteristics, subjective loin quality, and fresh ham face color (muscles) were measured on a targeted 100% of carcasses. Fresh belly characteristics, boneless loin weight, instrumental loin color, and ultimate loin pH measurements were collected from 50% of the carcasses each slaughter day. Adipose tissue iodine value (IV), 30-min loin pH, LM slice shear force, and fresh ham muscle characteristic measurements were recorded on 10% of carcasses each slaughter day. Data were analyzed using the MIXED procedure of SAS as a 1-way ANOVA in a randomized complete block design with 2 levels (barrows and gilts). Barn (block), marketing group, production focus, and season were random variables. A 2-variance model was fit using the REPEATED statement of the MIXED procedure, grouped by sex for analysis of least squares means. Homogeneity of variance was tested on raw data using Levene's test of the GLM procedure. Hot carcass weight of pigs (94.6 kg) in this study was similar to U.S. industry average HCW (93.1 kg). Therefore, these data are representative of typical U.S. pork carcasses. There was no difference (P ≥ 0.09) in variability of HCW or loin depth between barrow and gilt carcasses. Back fat depth and estimated carcass lean were more variable (P ≤ 0.0001) and IV was less variable (P = 0.05) in carcasses from barrows than in carcasses from gilts. Fresh belly weight and thickness were more variable (P ≤ 0.01) for bellies of barrows than bellies of gilts, but there was no difference in variability for belly length, width, or flop distance (P ≥ 0.06). Fresh loin subjective color was less variable ( ham traits. Overall, traits associated with carcass fatness, including

  5. On the fluctuations of sums of independent random variables.

    Science.gov (United States)

    Feller, W

    1969-07-01

    If $X_1, X_2, \ldots$ are independent random variables with zero expectation and finite variances, the cumulative sums $S_n$ are, on the average, of the order of magnitude $s_n$, where $s_n^2 = E(S_n^2)$. The occasional maxima of the ratios $S_n/s_n$ are surprisingly large and the problem is to estimate the extent of their probable fluctuations. Specifically, let $S_n^* = (S_n - b_n)/a_n$, where $\{a_n\}$ and $\{b_n\}$ are two numerical sequences. For any interval $I$, denote by $p(I)$ the probability that the event $S_n^* \in I$ occurs for infinitely many $n$. Under mild conditions on $\{a_n\}$ and $\{b_n\}$, it is shown that $p(I)$ equals 0 or 1 according as a certain series converges or diverges. To obtain the upper limit of $S_n/a_n$, one has to set $b_n = \pm\epsilon a_n$, but finer results are obtained with smaller $b_n$. No assumptions concerning the underlying distributions are made; the criteria explain structurally which features of $\{X_n\}$ affect the fluctuations, but for concrete results something about $P\{S_n > a_n\}$ must be known. For example, a complete solution is possible when the $X_n$ are normal, replacing the classical law of the iterated logarithm. Further concrete estimates may be obtained by combining the new criteria with some recently developed limit theorems.

  6. Representing general theoretical concepts in structural equation models: The role of composite variables

    Science.gov (United States)

    Grace, J.B.; Bollen, K.A.

    2008-01-01

    Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically-based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling heterogeneous concepts of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially-reduced-form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influence of suites of variables are often of interest. © Springer Science+Business Media, LLC 2007.

  7. MODELING THE TIME VARIABILITY OF SDSS STRIPE 82 QUASARS AS A DAMPED RANDOM WALK

    International Nuclear Information System (INIS)

    MacLeod, C. L.; Ivezic, Z.; Bullock, E.; Kimball, A.; Sesar, B.; Westman, D.; Brooks, K.; Gibson, R.; Becker, A. C.; Kochanek, C. S.; Kozlowski, S.; Kelly, B.; De Vries, W. H.

    2010-01-01

    We model the time variability of ∼9000 spectroscopically confirmed quasars in SDSS Stripe 82 as a damped random walk (DRW). Using 2.7 million photometric measurements collected over 10 yr, we confirm the results of Kelly et al. and Kozlowski et al. that this model can explain quasar light curves at an impressive fidelity level (0.01-0.02 mag). The DRW model provides a simple, fast (O(N) for N data points), and powerful statistical description of quasar light curves by a characteristic timescale (τ) and an asymptotic rms variability on long timescales (SF∞). We searched for correlations between these two variability parameters and physical parameters such as luminosity and black hole mass, and rest-frame wavelength. Our analysis shows SF∞ to increase with decreasing luminosity and rest-frame wavelength as observed previously, and without a correlation with redshift. We find a correlation between SF∞ and black hole mass with a power-law index of 0.18 ± 0.03, independent of the anti-correlation with luminosity. We find that τ increases with increasing wavelength with a power-law index of 0.17, remains nearly constant with redshift and luminosity, and increases with increasing black hole mass with a power-law index of 0.21 ± 0.07. The amplitude of variability is anti-correlated with the Eddington ratio, which suggests a scenario where optical fluctuations are tied to variations in the accretion rate. However, we find an additional dependence on luminosity and/or black hole mass that cannot be explained by the trend with Eddington ratio. The radio-loudest quasars have systematically larger variability amplitudes by about 30%, when corrected for the other observed trends, while the distribution of their characteristic timescale is indistinguishable from that of the full sample. We do not detect any statistically robust differences in the characteristic timescale and variability amplitude between the full sample and the small subsample of quasars detected
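
    A DRW is an Ornstein-Uhlenbeck process, so light curves with given τ and SF∞ can be simulated exactly and checked against the structure function SF(Δt) = SF∞ (1 - e^(-Δt/τ))^(1/2). Cadence and parameter values below are illustrative, not fitted values from the paper.

      import numpy as np

      rng = np.random.default_rng(10)

      tau, sf_inf = 200.0, 0.2    # characteristic timescale (days), asymptotic rms (mag)
      dt, n = 2.0, 5000           # cadence (days) and number of epochs (assumed)

      # exact OU (damped random walk) updates; stationary variance is SF_inf**2 / 2
      x = np.empty(n)
      x[0] = rng.normal(0.0, sf_inf / np.sqrt(2))
      a = np.exp(-dt / tau)
      for i in range(1, n):
          x[i] = a * x[i - 1] + rng.normal(0.0, (sf_inf / np.sqrt(2)) * np.sqrt(1 - a * a))

      # structure function at lag m*dt should follow SF_inf * sqrt(1 - exp(-m*dt/tau))
      for m in (1, 50, 500):
          sf_emp = np.sqrt(np.mean((x[m:] - x[:-m]) ** 2))
          sf_th = sf_inf * np.sqrt(1 - np.exp(-m * dt / tau))
          print(f"dt={m * dt:7.1f}  SF empirical={sf_emp:.3f}  model={sf_th:.3f}")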

  8. Contribution to the application of the random vibration theory to the seismic analysis of structures via state variables

    International Nuclear Information System (INIS)

    Maestrini, A.P.

    1979-04-01

    Several problems related to the application of random vibration theory by means of state variables are studied. The well-known equations that define the propagation of the mean and the variance for linear and non-linear systems are first presented. The Monte Carlo method is then used to determine the applicability of the hypothesis of a normally distributed output in the case of linear systems subjected to non-Gaussian excitations. Finally, attention is focused on the properties of linear filters and modulation functions proposed to simulate seismic excitations as non-stationary random processes. Acceleration spectra obtained by multiplying rms spectra by a constant factor are compared with design spectra suggested by several authors for various soil conditions. In every case, filter properties are given. (Author)

  9. Bottom-up and Top-down Input Augment the Variability of Cortical Neurons

    Science.gov (United States)

    Nassi, Jonathan J.; Kreiman, Gabriel; Born, Richard T.

    2016-01-01

    Neurons in the cerebral cortex respond inconsistently to a repeated sensory stimulus, yet they underlie our stable sensory experiences. Although the nature of this variability is unknown, its ubiquity has encouraged the general view that each cell produces random spike patterns that noisily represent its response rate. In contrast, here we show that reversibly inactivating distant sources of either bottom-up or top-down input to cortical visual areas in the alert primate reduces both the spike train irregularity and the trial-to-trial variability of single neurons. A simple model in which a fraction of the pre-synaptic input is silenced can reproduce this reduction in variability, provided that there exist temporal correlations primarily within, but not between, excitatory and inhibitory input pools. A large component of the variability of cortical neurons may therefore arise from synchronous input produced by signals arriving from multiple sources. PMID:27427459

  10. Convergence Analysis of Semi-Implicit Euler Methods for Solving Stochastic Age-Dependent Capital System with Variable Delays and Random Jump Magnitudes

    Directory of Open Access Journals (Sweden)

    Qinghui Du

    2014-01-01

    We consider semi-implicit Euler methods for a stochastic age-dependent capital system with variable delays and random jump magnitudes, and investigate the convergence of the numerical approximation. It is proved that the numerical approximate solutions converge to the analytical solutions in the mean-square sense under given conditions.
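
    The scheme itself is easy to sketch on a scalar test equation. The toy jump-diffusion below (with a linear drift treated implicitly, and diffusion and jump terms treated explicitly) merely stands in for the paper's age-dependent capital system; the equation, its coefficients, and the jump-magnitude distribution are all illustrative assumptions.

```python
# Semi-implicit (drift-implicit) Euler for dX = -lam*X dt + sig*X dW + J*X dN,
# where the jump magnitudes J are themselves random. Toy illustration only.
import numpy as np

rng = np.random.default_rng(0)
lam, sig, jump_rate, T, n = 2.0, 0.3, 0.5, 1.0, 1000
dt = T / n
x = np.empty(n + 1)
x[0] = 1.0
for k in range(n):
    dW = rng.normal(0.0, np.sqrt(dt))            # Brownian increment
    dN = rng.poisson(jump_rate * dt)             # number of jumps in the step
    J = rng.normal(0.1, 0.05)                    # random jump magnitude
    # Implicit only in the (linear) drift: solve
    #   x[k+1] = x[k] - lam*x[k+1]*dt + sig*x[k]*dW + J*x[k]*dN  for x[k+1].
    x[k + 1] = (x[k] + sig * x[k] * dW + J * x[k] * dN) / (1.0 + lam * dt)
```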

  11. Qualitatively Assessing Randomness in SVD Results

    Science.gov (United States)

    Lamb, K. W.; Miller, W. P.; Kalra, A.; Anderson, S.; Rodriguez, A.

    2012-12-01

    Singular Value Decomposition (SVD) is a powerful tool for identifying regions of significant co-variability between two spatially distributed datasets. SVD has been widely used in atmospheric research to define relationships between sea surface temperatures, geopotential height, wind, precipitation, and streamflow data for myriad regions across the globe. A typical application of SVD is to identify leading climate drivers (as observed in the wind or pressure data) for a particular hydrologic response variable such as precipitation, streamflow, or soil moisture. One can also investigate the lagged relationship between a climate variable and the hydrologic response variable using SVD. When performing these studies it is important to limit the spatial bounds of the climate variable to reduce the chance of random co-variance relationships being identified. On the other hand, a climate region that is too small may ignore climate signals which have more than a statistical relationship to a hydrologic response variable. The proposed research seeks to develop a qualitative method for identifying random co-variability relationships between two datasets. The research takes the heterogeneous correlation maps from several past results and compares them with correlation maps produced using purely random and quasi-random climate data. The comparison yields a methodology for determining whether a particular region on a correlation map can be explained by a physical mechanism or is simply statistical chance.
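
    The core computation described above fits in a few lines. In the sketch below, the two fields are synthetic stand-ins (random matrices) for, e.g., gridded climate anomalies and a network of streamflow gauges; the shapes and variable names are assumptions for illustration, not the study's data. With purely random inputs like these, the leading mode shows what co-variability looks like under the null hypothesis.

```python
# SVD of the cross-covariance between two anomaly fields sharing a time axis.
import numpy as np

rng = np.random.default_rng(0)
climate = rng.standard_normal((40, 500))   # 40 years x 500 grid cells
hydro = rng.standard_normal((40, 120))     # 40 years x 120 gauges

climate -= climate.mean(axis=0)            # remove temporal means
hydro -= hydro.mean(axis=0)
C = climate.T @ hydro / (climate.shape[0] - 1)   # cross-covariance matrix

U, s, Vt = np.linalg.svd(C, full_matrices=False)
scf = s**2 / np.sum(s**2)                  # squared covariance fraction
a = climate @ U[:, 0]                      # expansion coefficients, mode 1
b = hydro @ Vt[0, :]
print(f"mode 1: SCF = {scf[0]:.1%}, r(a,b) = {np.corrcoef(a, b)[0, 1]:.2f}")
```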

  12. Randomness and locality in quantum mechanics

    International Nuclear Information System (INIS)

    Bub, J.

    1976-01-01

    This paper considers the problem of representing the statistical states of a quantum mechanical system by measures on a classical probability space. The Kochen and Specker theorem proves the impossibility of embedding the possibility structure of a quantum mechanical system into a Boolean algebra. It is shown that a hidden variable theory involves a Boolean representation which is not an embedding, and that such a representation cannot recover the quantum statistics for sequential probabilities without introducing a randomization process for the hidden variables which is assumed to apply only on measurement. It is suggested that the relation of incompatibility is to be understood as a type of stochastic independence, and that the indeterminism of a quantum mechanical system is engendered by the existence of independent families of properties. Thus, the statistical relations reflect the possibility structure of the system: the probabilities are logical. The hidden variable thesis is influenced by the Copenhagen interpretation of quantum mechanics, i.e. by some version of the disturbance theory of measurement. Hence, the significance of the representation problem is missed, and the completeness of quantum mechanics is seen to turn on the possibility of recovering the quantum statistics by a hidden variable scheme which satisfies certain physically motivated conditions, such as locality. Bell's proof that no local hidden variable theory can reproduce the statistical relations of quantum mechanics is considered. (Auth.)

  13. Does Self-Selection Affect Samples’ Representativeness in Online Surveys? An Investigation in Online Video Game Research

    Science.gov (United States)

    van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-01-01

    Background The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Objective Our objective was to explore the representativeness of a self-selected sample of online gamers using online players’ virtual characters (avatars). Methods All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars’ characteristics were defined using various games’ scores, reported on the WoW’s official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. Results We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Conclusions Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted. PMID:25001007

  14. Some limit theorems for negatively associated random variables

    Indian Academy of Sciences (India)

    random sampling without replacement, and (i) joint distribution of ranks. ... wide applications in multivariate statistical analysis and system reliability, the ... strong law of large numbers for negatively associated sequences under the case where.

  15. A biorthogonal decomposition for the identification and simulation of non-stationary and non-Gaussian random fields

    Energy Technology Data Exchange (ETDEWEB)

    Zentner, I. [IMSIA, UMR EDF-ENSTA-CNRS-CEA 9219, Université Paris-Saclay, 828 Boulevard des Maréchaux, 91762 Palaiseau Cedex (France); Ferré, G., E-mail: gregoire.ferre@ponts.org [CERMICS – Ecole des Ponts ParisTech, 6 et 8 avenue Blaise Pascal, Cité Descartes, Champs sur Marne, 77455 Marne la Vallée Cedex 2 (France); Poirion, F. [Department of Structural Dynamics and Aeroelasticity, ONERA, BP 72, 29 avenue de la Division Leclerc, 92322 Chatillon Cedex (France); Benoit, M. [Institut de Recherche sur les Phénomènes Hors Equilibre (IRPHE), UMR 7342 (CNRS, Aix-Marseille Université, Ecole Centrale Marseille), 49 rue Frédéric Joliot-Curie, BP 146, 13384 Marseille Cedex 13 (France)

    2016-06-01

    In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields given a database is proposed. It is based on two successive biorthogonal decompositions aimed at representing spatio-temporal stochastic fields. The proposed double expansion makes it possible to build the model even in the case of large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).

  16. Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2008-03-01

    The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (of constructing a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorob'ev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and the “death of reality” which are typically linked to Bell-type inequalities in the physics literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we find that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts.

  17. A simplified method for random vibration analysis of structures with random parameters

    International Nuclear Information System (INIS)

    Ghienne, Martin; Blanzé, Claude

    2016-01-01

    Piezoelectric patches with adapted electrical circuits or viscoelastic dissipative materials are two solutions particularly suited to reducing vibration of light structures. To design these solutions accurately, it is necessary to describe precisely the dynamical behaviour of the structure. It may quickly become computationally intensive to describe this behaviour robustly for a structure with nonlinear phenomena, such as contact or friction for bolted structures, and uncertain variations of its parameters. The aim of this work is to propose a non-intrusive reduced stochastic method to characterize robustly the vibrational response of a structure with random parameters. Our goal is to characterize the eigenspace of linear systems with dynamic properties considered as random variables. This method is based on a separation of random aspects from deterministic aspects and allows us to estimate the first central moments of each random eigenfrequency with a single deterministic finite element computation. The method is applied to a frame with several Young's moduli modeled as random variables. This example could be expanded to a bolted structure including piezoelectric devices. The method needs to be enhanced when random eigenvalues are closely spaced. An indicator with no additional computational cost is proposed to characterize the "proximity" of two random eigenvalues. (paper)

  18. Are glucose levels, glucose variability and autonomic control influenced by inspiratory muscle exercise in patients with type 2 diabetes? Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Schein, Aso; Correa, Aps; Casali, Karina Rabello; Schaan, Beatriz D

    2016-01-20

    Physical exercise reduces glucose levels and glucose variability in patients with type 2 diabetes. Acute inspiratory muscle exercise has been shown to reduce these parameters in a small group of patients with type 2 diabetes, but these results have yet to be confirmed in a well-designed study. The aim of this study is to investigate the effect of acute inspiratory muscle exercise on glucose levels, glucose variability, and cardiovascular autonomic function in patients with type 2 diabetes. This study will use a randomized clinical trial crossover design. A total of 14 subjects will be recruited and randomly allocated to two groups to perform acute inspiratory muscle loading at 2% of maximal inspiratory pressure (PImax, placebo load) or 60% of PImax (experimental load). Inspiratory muscle training could be a novel exercise modality to be used to decrease glucose levels and glucose variability. ClinicalTrials.gov NCT02292810.

  19. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2010-01-01

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.

  20. Random volumes from matrices

    Energy Technology Data Exchange (ETDEWEB)

    Fukuma, Masafumi; Sugishita, Sotaro; Umeda, Naoya [Department of Physics, Kyoto University,Kitashirakawa Oiwake-cho, Kyoto 606-8502 (Japan)

    2015-07-17

    We propose a class of models which generate three-dimensional random volumes, where each configuration consists of triangles glued together along multiple hinges. The models have matrices as the dynamical variables and are characterized by semisimple associative algebras A. Although most of the diagrams represent configurations which are not manifolds, we show that the set of possible diagrams can be drastically reduced such that only (and all of the) three-dimensional manifolds with tetrahedral decompositions appear, by introducing a color structure and taking an appropriate large N limit. We examine the analytic properties when A is a matrix ring or a group ring, and show that the models with matrix ring have a novel strong-weak duality which interchanges the roles of triangles and hinges. We also give a brief comment on the relationship of our models with the colored tensor models.

  1. Bubble CPAP versus CPAP with variable flow in newborns with respiratory distress: a randomized controlled trial.

    Science.gov (United States)

    Yagui, Ana Cristina Zanon; Vale, Luciana Assis Pires Andrade; Haddad, Luciana Branco; Prado, Cristiane; Rossi, Felipe Souza; Deutsch, Alice D Agostini; Rebello, Celso Moura

    2011-01-01

    To evaluate the efficacy and safety of nasal continuous positive airway pressure (NCPAP) using devices with variable flow or bubble continuous positive airway pressure (CPAP) regarding CPAP failure, presence of air leaks, total CPAP and oxygen time, and length of intensive care unit and hospital stay in neonates with moderate respiratory distress (RD) and birth weight (BW) ≥ 1,500 g. Forty newborns requiring NCPAP were randomized into two study groups: variable flow group (VF) and continuous flow group (CF). The study was conducted between October 2008 and April 2010. Demographic data, CPAP failure, presence of air leaks, and total CPAP and oxygen time were recorded. Categorical outcomes were tested using the chi-square test or the Fisher's exact test. Continuous variables were analyzed using the Mann-Whitney test. The level of significance was set at p < 0.05. The two groups did not differ significantly regarding CPAP failure (21.1 and 20.0% for VF and CF, respectively; p = 1.000), air leak syndrome (10.5 and 5.0%, respectively; p = 0.605), total CPAP time (median: 22.0 h, interquartile range [IQR]: 8.00-31.00 h and median: 22.0 h, IQR: 6.00-32.00 h, respectively; p = 0.822), and total oxygen time (median: 24.00 h, IQR: 7.00-85.00 h and median: 21.00 h, IQR: 9.50-66.75 h, respectively; p = 0.779). In newborns with BW ≥ 1,500 g and moderate RD, the use of continuous flow NCPAP showed the same benefits as the use of variable flow NCPAP.

  2. The quotient of normal random variables and application to asset price fat tails

    Science.gov (United States)

    Caginalp, Carey; Caginalp, Gunduz

    2018-06-01

    The quotient of random variables with normal distributions is examined and proven to have power law decay, with density f(x) ≃ f₀x⁻², with the coefficient depending on the means and variances of the numerator and denominator and their correlation. We also obtain the conditional probability densities for each of the four quadrants given by the signs of the numerator and denominator for arbitrary correlation ρ ∈ [−1, 1). For ρ = −1 we obtain a particularly simple closed form solution for all x ∈ R. The results are applied to a basic issue in economics and finance, namely the density of relative price changes. Classical finance stipulates a normal distribution of relative price changes, though empirical studies suggest a power law at the tail end. By considering the supply and demand in a basic price change model, we prove that the relative price change has density that decays with an x⁻² power law. Various parameter limits are established.
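
    The x⁻² tail is easy to check by Monte Carlo (a sketch under assumed means and variances, not the authors' derivation): if the density decays like f₀x⁻², then the survival probability P(|Z| > x) decays like f₀/x, so x·P(|Z| > x) should level off at a constant.

```python
# Monte Carlo check of the x^-2 density tail of a ratio of normals.
# The means and standard deviations below are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
u = rng.normal(1.0, 1.0, n)      # numerator
v = rng.normal(2.0, 1.0, n)      # denominator
z = u / v

for x in (10.0, 100.0, 1000.0):
    p = np.mean(np.abs(z) > x)
    print(f"x = {x:6.0f}   P(|Z|>x) = {p:.2e}   x*P = {x * p:.3f}")
# x*P(|Z|>x) settling near a constant is the signature of an x^-2 tail.
```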

  3. European Randomized Study of Screening for Prostate Cancer Risk Calculator: External Validation, Variability, and Clinical Significance.

    Science.gov (United States)

    Gómez-Gómez, Enrique; Carrasco-Valiente, Julia; Blanca-Pedregosa, Ana; Barco-Sánchez, Beatriz; Fernandez-Rueda, Jose Luis; Molina-Abril, Helena; Valero-Rosa, Jose; Font-Ugalde, Pilar; Requena-Tapia, Maria José

    2017-04-01

    To externally validate the European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculator (RC) and to evaluate its variability between 2 consecutive prostate-specific antigen (PSA) values. We prospectively catalogued 1021 consecutive patients before prostate biopsy for suspicion of prostate cancer (PCa). The risk of PCa and significant PCa (Gleason score ≥7) for 749 patients was calculated according to the ERSPC-RC (digital rectal examination-based versions 3 and 4) for 2 consecutive PSA tests per patient. The calculators' predictions were analyzed using calibration plots and the area under the receiver operating characteristic curve (area under the curve). Cohen's kappa coefficient was used to compare ability and variability. Of 749 patients, PCa was detected in 251 (33.5%) and significant PCa in 133 (17.8%). Calibration plots showed an acceptable parallelism and similar discrimination ability for both PSA levels, with an area under the curve of 0.69 for PCa and 0.74 for significant PCa. Use of the ERSPC-RC would have avoided 226 (30.2%) unnecessary biopsies at the cost of missing 10 significant PCa cases. The variability of the RC was 16% for PCa and 20% for significant PCa, and higher variability was associated with a reduced risk of significant PCa. We can conclude that the performance of the ERSPC-RC in the present cohort shows a high degree of similarity between the 2 PSA levels; however, the RC variability value is associated with a decreased risk of significant PCa. The use of the ERSPC-RC in our cohort identifies a high number of unnecessary biopsies. Thus, the incorporation of the ERSPC-RC could help in the clinical decision to carry out a prostate biopsy. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.

    Science.gov (United States)

    Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco

    2005-02-01

    Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n_g ≈ 100-150 genes). In the present investigation, a large random European population sample (average n_g ≈ 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005) much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.

  5. Lower limits for distribution tails of randomly stopped sums

    NARCIS (Netherlands)

    Denisov, D.E.; Korshunov, D.A.; Foss, S.G.

    2008-01-01

    We study lower limits for the ratio $\overline{F^{*\tau}}(x)/\overline{F}(x)$ of tail distributions, where $F^{*\tau}$ is the distribution of a sum of a random number $\tau$ of independent identically distributed random variables having a common distribution $F$, and the random variable $\tau$ does not depend on the summands.

  6. Statistical conditional sampling for variable-resolution video compression.

    Directory of Open Access Journals (Sweden)

    Alexander Wong

    In this study, we investigate a variable-resolution approach to video compression based on conditional random fields (CRFs) and statistical conditional sampling, in order to further improve the compression rate while maintaining high-quality video. In the proposed approach, representative key-frames within a video shot are identified and stored at full resolution. The remaining frames within the video shot are stored and compressed at a reduced resolution. At the decompression stage, a region-based dictionary is constructed from the key-frames and used to restore the reduced-resolution frames to the original resolution via statistical conditional sampling. The sampling approach is based on the conditional probability of the CRF model, using the constructed dictionary. Experimental results show that the proposed variable-resolution approach via statistical conditional sampling has potential for improving compression rates when compared to compressing the video at full resolution, while achieving higher video quality when compared to compressing the video at reduced resolution.

  7. Sum of ratios of products for α-μ random variables in wireless multihop relaying and multiple scattering

    KAUST Repository

    Wang, Kezhi; Wang, Tian; Chen, Yunfei; Alouini, Mohamed-Slim

    2014-01-01

    The sum of ratios of products of independent α-μ random variables (RVs) is approximated by using the Generalized Gamma ratio approximation (GGRA), with the Gamma ratio approximation (GRA) as a special case. The proposed approximation is used to calculate the outage probability of the equal gain combining (EGC) or maximum ratio combining (MRC) receivers for wireless multihop relaying or multiple scattering systems in the presence of interference. Numerical results show that the newly derived approximation works very well, as verified by simulation, while GRA has a slightly worse performance than GGRA when the outage probability is below 0.1 but has a more simplified form.

  9. The mesoscopic conductance of disordered rings, its random matrix theory and the generalized variable range hopping picture

    International Nuclear Information System (INIS)

    Stotland, Alexander; Peer, Tal; Cohen, Doron; Budoyo, Rangga; Kottos, Tsampikos

    2008-01-01

    The calculation of the conductance of disordered rings requires a theory that goes beyond the Kubo-Drude formulation. Assuming 'mesoscopic' circumstances, the analysis of the electro-driven transitions shows similarities with a percolation problem in energy space. We argue that the texture and the sparsity of the perturbation matrix dictate the value of the conductance, and study its dependence on the disorder strength, ranging from the ballistic to the Anderson localization regime. An improved sparse random matrix model is introduced to capture the essential ingredients of the problem, and leads to a generalized variable range hopping picture. (fast track communication)

  10. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    Science.gov (United States)

    Galelli, S.; Castelletti, A.

    2013-07-01

    Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies - Marina catchment (Singapore) and Canning River (Western Australia) - representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of the input variables it provides can be given a physically meaningful interpretation.
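
    A minimal sketch of the Extra-Trees side of such an exercise, using scikit-learn's implementation; the synthetic predictors and target below simply stand in for catchment inputs (e.g. lagged rainfall and flow) and streamflow, which are not reproduced here.

```python
# Extra-Trees regression with variable importances on synthetic data.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 6))                     # 6 candidate inputs
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.standard_normal(2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = ExtraTreesRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test R^2:", round(model.score(X_te, y_te), 3))
print("relative importances:", model.feature_importances_.round(3))
```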

  11. Randomized Trial of a Lifestyle Physical Activity Intervention for Breast Cancer Survivors: Effects on Transtheoretical Model Variables.

    Science.gov (United States)

    Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen

    2018-01-01

    This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy (F = 9.55, p = .003) and perceived significantly fewer cons of exercise (F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.

  12. Variational Infinite Hidden Conditional Random Fields

    NARCIS (Netherlands)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin

    2015-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An infinite hidden conditional random field is a hidden conditional random field with a countably infinite number of hidden states.

  13. Allelic variability in species and stocks of Lake Superior ciscoes (Coregoninae)

    Science.gov (United States)

    Todd, Thomas N.

    1981-01-01

    Starch gel electrophoresis was used as a means of recognizing species and stocks in Lake Superior Coregonus. Allelic variability at isocitrate dehydrogenase and glycerol-3-phosphate dehydrogenase loci was recorded for samples of lake herring (Coregonus artedii), bloater (C. hoyi), kiyi (C. kiyi), and shortjaw cisco (C. zenithicus) from five Lake Superior localities. The observed frequencies of genotypes within each subsample did not differ significantly from those expected on the basis of random mating, and suggested that each subsample represented either a random sample from a larger randomly mating population or an independent and isolated subpopulation within which mating was random. Significant contingency χ² values for comparisons between both localities and species suggested that more than one randomly mating population occurred among the Lake Superior ciscoes, but did not reveal how many such populations there were. In contrast to the genetic results of this study, morphology seems to be a better descriptor of cisco stocks, and identification of cisco stocks and species will still have to be based on morphological criteria until more data are forthcoming. Where several species are sympatric, management should strive to preserve the least abundant. Failure to do so could result in the extinction or depletion of the rarer forms.
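
    The contingency test used above is a standard chi-square test of homogeneity; a minimal sketch with scipy is shown below. The allele counts are invented for illustration (two alleles at one locus in two localities), not the study's data.

```python
# Contingency chi-square test for allele-frequency differences.
from scipy.stats import chi2_contingency

counts = [[62, 38],   # locality A: allele 1, allele 2
          [41, 59]]   # locality B
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```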

  14. Use of latent variables representing psychological motivation to explore citizens’ intentions with respect to congestion charging reform in Jakarta

    Directory of Open Access Journals (Sweden)

    Sugiarto Sugiarto

    2015-01-01

    The aim of this paper is to investigate the intentions of Jakarta citizens with respect to the electronic road pricing (ERP) reform proposed by the city government. Utilizing data from a stated preference survey conducted in 2013, we construct six variables representing latent psychological motivations (appropriateness of ERP adoption; recognition that ERP can mitigate congestion and improve the environment; car dependency (CDC); awareness of the problems of cars in society; inhibition of freedom of movement caused by ERP; and doubts about the ability of ERP to mitigate congestion and environmental problems). A multiple-indicators multiple-causes (MIMIC) model is developed to investigate the effects of respondents' socio-demographics (causes) on the latent constructs in order to gain a better understanding of the relationship between respondents' intentions and the observed individual responses (indicators) obtained from the stated preference survey. The MIMIC model offers a good account of whether and how socio-demographic attributes and individual indicators predict the latent variables of the psychological motivation constructs. We then further verify the influences of the latent variables, combining them with levy rate patterns and daily mobility attributes, to investigate significant determining factors for social acceptance of the ERP proposal. Latent variable representations based on the generalized ordered response model are employed in our investigations to allow more flexibility in parameter estimation across outcomes. The results confirm that there is a strong correlation between latent psychological motivations and daily mobility attributes and the level of social acceptance of the ERP proposal. This empirical investigation demonstrates that the latent variables play a more substantial role in determining the scheme's acceptance. Moreover, elasticity measures show that latent attributes are more sensitive compared to levies and daily mobility

  15. Describing temporal variability of the mean Estonian precipitation series in climate time scale

    Science.gov (United States)

    Post, P.; Kärner, O.

    2009-04-01

    Applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from nearly stationary longer-range variability. This is an indication of the fact that several geophysical time series show a short-range non-stationary behaviour and a stationary behaviour over longer ranges (Davis et al., 1996). In order to model series like these, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model properly the long-range tendencies. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable on scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). Temporal variability of the obtained daily time series is examined by means of an autoregressive and integrated moving average (ARIMA) family model of the type (0,1,1). This model is applicable for simulating daily precipitation if an appropriate time step is selected that enables us to neglect the short-range non-stationary fluctuations. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability. Each ARIMA (0
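
    Fitting an ARIMA(0,1,1) model to an aggregated series is a one-liner in standard time-series libraries; the sketch below uses statsmodels on a synthetic 45-year series of 30-day precipitation totals, which merely stands in for the Estonian data.

```python
# ARIMA(0,1,1) fit to a synthetic 30-day-aggregated precipitation series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
precip = rng.gamma(shape=2.0, scale=30.0, size=540)   # 45 years x 12 steps

fit = ARIMA(precip, order=(0, 1, 1)).fit()
print(fit.params)       # the single MA(1) coefficient and innovation variance
```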

  16. Asymptotic distribution of products of sums of independent random ...

    Indian Academy of Sciences (India)

    integrable random variables (r.v.) are asymptotically log-normal. This fact ... the product of the partial sums of i.i.d. positive random variables as follows. .... Now define ..... by Henan Province Foundation and Frontier Technology Research Plan.

  17. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
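
    The structure the abstract describes (a linear congruential generator for uniforms, plus transforms to the other distributions) can be sketched as follows; the LCG constants are the common Numerical Recipes values, an assumption, since the report does not list the generator's actual constants.

```python
# Uniform LCG plus inverse-CDF and Box-Muller transforms, in the spirit of
# the sampling structure described above (not the BWIP code itself).
import math

class LCG:
    def __init__(self, seed=12345):
        self.state = seed
    def uniform(self):
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state / 2**32

def exponential(g, rate=1.0):
    return -math.log(1.0 - g.uniform()) / rate      # inverse-CDF transform

def lognormal_base10(g, mu=0.0, sigma=1.0):
    u1 = max(g.uniform(), 1e-12)                    # guard against log(0)
    u2 = g.uniform()
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return 10.0 ** (mu + sigma * z)                 # base-10 lognormal

g = LCG()
print([round(exponential(g), 3) for _ in range(5)])
```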

  18. Effect of the Young modulus variability on the mechanical behaviour of a nuclear containment vessel

    Energy Technology Data Exchange (ETDEWEB)

    Larrard, T. de, E-mail: delarrard@lmt.ens-cachan.f [LMT-ENS Cachan, CNRS/UPMC/PRES UniverSud Paris (France); Colliat, J.B.; Benboudjema, F. [LMT-ENS Cachan, CNRS/UPMC/PRES UniverSud Paris (France); Torrenti, J.M. [Universite Paris-Est, LCPC (France); Nahas, G. [IRSN/DSR/SAMS/BAGS, Fontenay-aux-Roses (France)

    2010-12-15

    This study aims at investigating the influence of the Young modulus variability on the mechanical behaviour of a nuclear containment vessel in the case of a loss-of-coolant accident, under the assumption of elastic behaviour. To achieve this investigation, the Monte Carlo method is carried out using a middleware which encapsulates the different components (random field generation, FE simulations) and enables parallelisation of the calculations. The main goal is to quantify the uncertainty propagation by comparing the maximal values of outputs of interest (orthoradial stress and Mazars equivalent strain) for each realisation of the considered random field with the ones obtained from a reference calculation using a uniform field (equal to the expected value of the random field). The Young modulus is supposed to be accurately represented by a weakly homogeneous random field, and realisations are provided through its truncated Karhunen-Loeve expansion. This study reveals that the expected value of the maximal equivalent strain in the structure is larger when considering the Young modulus spatial variability than the value obtained from a deterministic approach with a uniform Young modulus field. The influence of the correlation length is investigated too. Finally, it is shown that there is no correlation between the locations of the maximal values of equivalent strain and the locations where the Young modulus extreme values are observed for each realisation.
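
    A truncated Karhunen-Loeve expansion of the kind used to generate the realisations can be sketched numerically from a covariance matrix; the 1-D domain, squared-exponential covariance, and parameter values below are illustrative assumptions, not the study's containment-vessel model. Each Monte Carlo realisation then corresponds to a fresh draw of the vector ξ of standard normals.

```python
# Sampling a random Young modulus field via a truncated discrete KL expansion.
import numpy as np

x = np.linspace(0.0, 10.0, 200)               # 1-D domain (m)
mean_E, sigma_E, corr_len = 30e9, 3e9, 2.0    # mean (Pa), std (Pa), length (m)

# Covariance matrix and its spectral decomposition (discrete KL modes).
C = sigma_E**2 * np.exp(-((x[:, None] - x[None, :]) / corr_len) ** 2)
eigval, eigvec = np.linalg.eigh(C)
idx = np.argsort(eigval)[::-1][:20]           # keep the 20 dominant modes

rng = np.random.default_rng(0)
xi = rng.standard_normal(20)                  # independent standard normals
E_field = mean_E + eigvec[:, idx] @ (np.sqrt(eigval[idx]) * xi)
```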

  19. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item, the probability distributions of the time to failure and of the duration of down time, and the expectations of these random variables, are determined. Moreover, it is shown that the same theory applies to randomly checked items with an exponential life distribution, such as electronic items. The case of periodic renewals is treated as an example. (orig.)

  20. Variability in response to albuminuria-lowering drugs

    DEFF Research Database (Denmark)

    Petrykiv, Sergei I; de Zeeuw, Dick; Persson, Frederik

    2017-01-01

    AIMS: Albuminuria-lowering drugs have shown different effect sizes in different individuals. Since urine albumin levels are known to vary considerably from day to day, we questioned whether the between-individual variability in albuminuria response after therapy initiation reflects random variability or a true response variation to treatment. In addition, we questioned whether the response variability is drug dependent. METHODS: To determine whether the response to treatment is random or a true drug response, we correlated, in six clinical trials, the change in albuminuria during placebo or active treatment (on-treatment) with the change in albuminuria during wash-out (off-treatment). If these responses correlate during active treatment, it suggests that at least part of the response variability can be attributed to drug response variability. We tested this for enalapril, losartan

  1. Analysis and Computation of Acoustic and Elastic Wave Equations in Random Media

    KAUST Repository

    Motamed, Mohammad

    2014-01-06

    We propose stochastic collocation methods for solving the second order acoustic and elastic wave equations in heterogeneous random media and subject to deterministic boundary and initial conditions [1, 4]. We assume that the medium consists of non-overlapping sub-domains with smooth interfaces. In each sub-domain, the material coefficients are smooth and given or approximated by a finite number of random variables. One important example is wave propagation in multi-layered media with smooth interfaces. The numerical scheme consists of a finite difference or finite element method in the physical space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space. We provide a rigorous convergence analysis and demonstrate different types of convergence of the probability error with respect to the number of collocation points under some regularity assumptions on the data. In particular, we show that, unlike in elliptic and parabolic problems [2, 3], the solution to hyperbolic problems is not in general analytic with respect to the random variables. Therefore, the rate of convergence is only algebraic. A fast spectral rate of convergence is still possible for some quantities of interest and for the wave solutions with particular types of data. We also show that the semi-discrete solution is analytic with respect to the random variables with the radius of analyticity proportional to the grid/mesh size h. We therefore obtain an exponential rate of convergence which deteriorates as the product hp gets smaller, with p representing the polynomial degree in the stochastic space. We have shown that analytical results and numerical examples are consistent and that the stochastic collocation method may be a valid alternative to the more traditional Monte Carlo method. Here we focus on the stochastic acoustic wave equation. Similar results are obtained for stochastic elastic equations.
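
    In its simplest one-variable form, the collocation idea amounts to evaluating the deterministic solver at Gauss points and combining the results with quadrature weights. The sketch below uses a Gauss-Hermite rule for a standard normal random coefficient, with a placeholder function standing in for the PDE solve; it illustrates the mechanism only, not the paper's multi-dimensional tensor-product scheme.

```python
# One-dimensional stochastic collocation with a Gauss-Hermite rule.
import numpy as np

def solver(y):
    # Placeholder for a deterministic wave solve with random coefficient y.
    return np.exp(-0.5 * y) * np.sin(1.0 + 0.1 * y)

nodes, weights = np.polynomial.hermite.hermgauss(8)  # weight exp(-x^2)
y = np.sqrt(2.0) * nodes                             # rescale to N(0, 1)
w = weights / np.sqrt(np.pi)                         # weights now sum to 1

mean_qoi = np.sum(w * np.array([solver(yi) for yi in y]))
print("collocation estimate of E[QoI]:", mean_qoi)
```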

  2. Quantifying intrinsic and extrinsic variability in stochastic gene expression models.

    Science.gov (United States)

    Singh, Abhyudai; Soltani, Mohammad

    2013-01-01

    Genetically identical cell populations exhibit considerable intercellular variation in the level of a given protein or mRNA. Both intrinsic and extrinsic sources of noise drive this variability in gene expression. More specifically, extrinsic noise is the expression variability that arises from cell-to-cell differences in cell-specific factors such as enzyme levels, cell size and cell cycle stage. In contrast, intrinsic noise is the expression variability that is not accounted for by extrinsic noise, and typically arises from the inherent stochastic nature of biochemical processes. Two-color reporter experiments are employed to decompose expression variability into its intrinsic and extrinsic noise components. Analytical formulas for intrinsic and extrinsic noise are derived for a class of stochastic gene expression models, where variations in cell-specific factors cause fluctuations in model parameters, in particular, transcription and/or translation rate fluctuations. Assuming mRNA production occurs in random bursts, transcription rate is represented by either the burst frequency (how often the bursts occur) or the burst size (number of mRNAs produced in each burst). Our analysis shows that fluctuations in the transcription burst frequency enhance extrinsic noise but do not affect the intrinsic noise. On the contrary, fluctuations in the transcription burst size or mRNA translation rate dramatically increase both intrinsic and extrinsic noise components. Interestingly, simultaneous fluctuations in transcription and translation rates arising from randomness in ATP abundance can decrease intrinsic noise measured in a two-color reporter assay. Finally, we discuss how these formulas can be combined with single-cell gene expression data from two-color reporter experiments for estimating model parameters.
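
    For reference, the standard two-color estimators (in the Elowitz-Swain form) that such analytical formulas generalize are, with r and g the two reporter levels measured in the same cell:

```latex
\eta^2_{\mathrm{int}} = \frac{\langle (r-g)^2 \rangle}{2\langle r\rangle\langle g\rangle},
\qquad
\eta^2_{\mathrm{ext}} = \frac{\langle r g\rangle - \langle r\rangle\langle g\rangle}{\langle r\rangle\langle g\rangle},
\qquad
\eta^2_{\mathrm{tot}} = \eta^2_{\mathrm{int}} + \eta^2_{\mathrm{ext}}.
```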

  3. The effects of variable practice on locomotor adaptation to a novel asymmetric gait.

    Science.gov (United States)

    Hinkel-Lipsker, Jacob W; Hahn, Michael E

    2017-09-01

    Very little is known about the effects of specific practice on motor learning of predictive balance control during novel bipedal gait. This information could provide insight into how the direction and magnitude of predictive errors during acquisition of a novel gait task influence transfer of balance control, as well as yield a practice protocol for the restoration of balance for those with locomotor impairments. This study examined the effect of a variable practice paradigm on transfer of a novel asymmetric gait pattern in able-bodied individuals. Using a split-belt treadmill, one limb was driven at a constant velocity (constant limb) and the other underwent specific changes in velocity (variable limb) during practice according to one of three prescribed practice paradigms: serial, where the variable limb velocity increased linearly; random blocked, where the variable limb underwent random belt velocity changes every 20 strides; and random practice, where the variable limb underwent random step-to-step changes in velocity. Random practice showed the highest balance control variability during acquisition, compared to serial and random blocked practice, which demonstrated the best transfer of balance control on one transfer test. Both random and random blocked practice showed significantly less balance control variability during a second transfer test compared to serial practice. These results indicate that random blocked practice may be best for generalizability of balance control while learning a novel gait, perhaps indicating that individuals who underwent this practice paradigm were able to find the most optimal balance control solution during practice.

  4. Common characterization of variability and forecast errors of variable energy sources and their mitigation using reserves in power system integration studies

    Energy Technology Data Exchange (ETDEWEB)

    Menemenlis, N.; Huneault, M. [IREQ, Varennes, QC (Canada); Robitaille, A. [Dir. Plantif. de la Production Eolienne, Montreal, QC (Canada). HQ Production; Holttinen, H. [VTT Technical Research Centre of Finland, VTT (Finland)

    2012-07-01

    In this paper we define and characterize the two random variables, variability and forecast error, over which uncertainty in power system operations is characterized and mitigated. We show that the characterization of both these variables can be carried out with the same mathematical tools. Furthermore, this common characterization of random variables lends itself to a common methodology for the calculation of the non-contingency reserves required to mitigate their effects. A parallel comparison of these two variables demonstrates similar inherent statistical properties. They depend on imminent conditions, evolve with time and can be asymmetric. Correlation is an important factor when aggregating individual wind farm characteristics in forming the distribution of the total wind generation for imminent conditions. (orig.)

  5. Soil variability in engineering applications

    Science.gov (United States)

    Vessia, Giovanna

    2014-05-01

    Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity of physical and mechanical properties, which can be measured by field and laboratory testing. Heterogeneity concerns different values of litho-technical parameters pertaining to similar lithological units placed close to each other. By contrast, variability is inherent to the formation and evolution processes experienced by each geological unit (homogeneous geomaterials on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle, and the cohesion, among others. These spatial variations must be managed by engineering models to accomplish reliable design of structures and infrastructures. Matheron (1962) introduced Geostatistics as the most comprehensive tool to manage spatial correlation of parameter measures, used in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) developed the first pioneering attempts to describe and manage the inherent variability in geomaterials, although Terzaghi (1943) had already highlighted that spatial fluctuations of physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to fractal theory. In the same years, Vanmarcke (1983) proposed the Random Field Theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measured fluctuations of physical parameters are interpreted through the spatial variability structure, consisting of the correlation function and the scale of fluctuation. Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random Finite Element Method.

  6. Protecting chips against hold time violations due to variability

    CERN Document Server

    Neuberger, Gustavo; Reis, Ricardo

    2013-01-01

    With the development of Very-Deep Sub-Micron technologies, process variability is becoming increasingly important and is a very important issue in the design of complex circuits. Process variability is the statistical variation of process parameters, meaning that these parameters do not always have the same value, but become random variables, with a given mean value and standard deviation. This effect can lead to several issues in digital circuit design. The logical consequence of this parameter variation is that circuit characteristics, such as delay and power, also become random variables.

  7. A combinatorial and probabilistic study of initial and end heights of descents in samples of geometrically distributed random variables and in permutations

    Directory of Open Access Journals (Sweden)

    Helmut Prodinger

    2007-01-01

    In words generated by independent geometrically distributed random variables, we study the l-th descent, which is, roughly speaking, the l-th occurrence of a neighbouring pair ab with a > b. The value a is called the initial height, and b the end height. We study these two random variables (and some similar ones) by combinatorial and probabilistic tools. We find in all instances a generating function Ψ(v,u), where the coefficient of v^j u^i refers to the j-th descent (ascent), and i to the initial (end) height. From this, various conclusions can be drawn, in particular expected values. In the probabilistic part, a Markov chain model is used, which allows us to get explicit expressions for the heights of the second descent. In principle, one could go further, but the complexity of the results forbids it. This is extended to permutations of a large number of elements. Methods from q-analysis are used to simplify the expressions. This is the reason that we confine ourselves to the geometric distribution only. For general discrete distributions, no such tools are available.

  8. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p̂, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062
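
    For reference, the two transformations whose biases are studied are, in their standard forms applied to the estimated probability p̂:

```latex
\mathrm{logit}(\hat p) = \log\frac{\hat p}{1-\hat p},
\qquad
t(\hat p) = \arcsin\sqrt{\hat p}.
```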

  9. Southern hemisphere climate variability as represented by an ocean-atmosphere coupled model

    CSIR Research Space (South Africa)

    Beraki, A

    2012-09-01

    in the atmospheric circulation. The ability to predict these modes of climate variability on longer timescales is vital. Potential predictability is usually measured as a signal-to-noise contrast between the slowly evolving and chaotic components of the climate...

  10. On the representativeness of behavior observation samples in classrooms.

    Science.gov (United States)

    Tiger, Jeffrey H; Miller, Sarah J; Mevers, Joanna Lomas; Mintz, Joslyn Cynkus; Scheithauer, Mindy C; Alvarez, Jessica

    2013-01-01

    School consultants who rely on direct observation typically conduct observational samples (e.g., one 30-min observation per day) in the hope that the sample is representative of performance during the remainder of the day, but the representativeness of these samples is unclear. In the current study, we recorded the problem behavior of 3 referred students for 4 consecutive school days between 9:30 a.m. and 2:30 p.m. using duration recording in consecutive 10-min sessions. We then culled 10-min, 20-min, 30-min, and 60-min observations from the complete record and compared these observations to the true daily mean to assess their accuracy (i.e., how well individual observations represented the daily occurrence of target behaviors). The results indicated that when behavior occurred with low variability, the majority of brief observations were representative of the overall levels; however, when behavior occurred with greater variability, even 60-min observations did not accurately capture the true levels of behavior. © Society for the Experimental Analysis of Behavior.

  11. Precise lim sup behavior of probabilities of large deviations for sums of i.i.d. random variables

    Directory of Open Access Journals (Sweden)

    Andrew Rosalsky

    2004-12-01

    Full Text Available Let {X, X_n; n≥1} be a sequence of real-valued i.i.d. random variables and let S_n = X_1 + ... + X_n, n≥1. In this paper, we study the probabilities of large deviations of the form P(S_n > t n^{1/p}), P(S_n < -t n^{1/p}), and P(|S_n| > t n^{1/p}), where t > 0 and 0 < p < 2. For example, if φ is a regularly varying function with index α such that lim_{x→∞} P(|X| > x^{1/p})/φ(x) = 1, then for every t > 0, limsup_{n→∞} P(|S_n| > t n^{1/p})/(n φ(n)) = t^{pα}.

  12. MATERIAL SIGNATURE ORTHONORMAL MAPPING IN HYPERSPECTRAL UNMIXING TO ADDRESS ENDMEMBER VARIABILITY

    Directory of Open Access Journals (Sweden)

    Ali Jafari

    2016-03-01

    Full Text Available A new hyperspectral unmixing algorithm which considers endmember variability is presented. In the proposed algorithm, the endmembers are represented by correlated random vectors using the stochastic mixing model. Currently, there is no published theory for selecting the appropriate distribution for endmembers. The proposed algorithm first uses a linear transformation called material signature orthonormal mapping (MSOM), which transforms the endmembers into correlated Gaussian random vectors. The MSOM transformation reduces computational requirements by reducing the dimension, and improves discrimination of endmembers by orthonormalizing the endmember mean vectors. In the original spectral space, the automated endmember bundles (AEB) method extracts a set of spectra (endmember set) for each material. The mean vector and covariance matrix of each endmember are estimated directly from the endmember sets in the MSOM space. Second, a new maximum likelihood method, called NCM_ML, is proposed, which estimates abundances in the MSOM space using the normal compositional model (NCM). The proposed algorithm is evaluated and compared with other state-of-the-art unmixing algorithms using simulated and real hyperspectral images. Experimental results demonstrate that the proposed algorithm can unmix pixels composed of similar endmembers in hyperspectral images in the presence of spectral variability more accurately than previous methods.

  13. Dynamic Output Feedback Control for Nonlinear Networked Control Systems with Random Packet Dropout and Random Delay

    Directory of Open Access Journals (Sweden)

    Shuiqing Yu

    2013-01-01

    Full Text Available This paper investigates dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based on Lyapunov theory. The quantitative relationship between the dropout rate, the transition probability matrix, and the nonlinearity level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.

  14. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented, and guidelines on how to choose the different variables will be given. This is done by introducing...

  15. Use of single-representative reverse-engineered surface-models for RSA does not affect measurement accuracy and precision.

    Science.gov (United States)

    Seehaus, Frank; Schwarze, Michael; Flörkemeier, Thilo; von Lewinski, Gabriela; Kaptein, Bart L; Jakubowitz, Eike; Hurschler, Christof

    2016-05-01

    Implant migration can be accurately quantified by model-based Roentgen stereophotogrammetric analysis (RSA), using an implant surface model to locate the implant relative to the bone. In a clinical situation, a single reverse engineering (RE) model for each implant type and size is used. It is unclear to what extent the accuracy and precision of migration measurement are affected by implant manufacturing variability unaccounted for by a single representative model. Individual RE models were generated for five short-stem hip implants of the same type and size. Two phantom analyses and one clinical analysis were performed. "Accuracy-matched models": one stem was assessed, and the results from the original RE model were compared with those from randomly selected models. "Accuracy-random model": each of the five stems was assessed and analyzed using one randomly selected RE model. "Precision-clinical setting": implant migration was calculated for eight patients, and all five available RE models were applied to each case. For the two phantom experiments, the 95%CI of the bias ranged from -0.28 mm to 0.30 mm for translation and -2.3° to 2.5° for rotation. In the clinical setting, precision is less than 0.5 mm and 1.2° for translation and rotation, respectively, except for rotations about the proximodistal axis. Accurate and precise migration measurements with model-based RSA can be achieved and are not biased by using a single representative RE model. At least for implants similar in shape to the investigated short stem, individual models are not necessary. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:903-910, 2016.

  16. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented, and guidelines on how to choose the different variables will be given. This is done by introducing a new...

  17. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, R.; Brincker, Rune

    1998-01-01

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented, and guidelines on how to choose the different variables will be given. This is done by introducing a new...

  18. Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction.

    Science.gov (United States)

    Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Random forests consisting of an ensemble of regression trees with equal weights are frequently used for design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approached the prediction of the ensemble of probabilistic trees both as a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic and Cancer Cell Line Encyclopedia datasets and illustrated that tree weights can be selected to reduce the average length of the CI without an increase in mean error.
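
    As a sketch of the weighted-sum view (our simplification, not the authors' code), a CI for the ensemble prediction can be computed from per-tree means, variances, and weights; independence between trees is assumed here for brevity, whereas the paper treats the tree outputs as correlated random variables.

    ```python
    import numpy as np

    def ensemble_ci(means, variances, weights, z=1.96):
        """CI for a weighted ensemble prediction, treating it as a weighted
        sum of per-tree random outputs (independence assumed for brevity)."""
        w = np.asarray(weights, dtype=float)
        w /= w.sum()
        mu = np.dot(w, means)
        var = np.dot(w**2, variances)   # add covariance terms for correlated trees
        half = z * np.sqrt(var)
        return mu - half, mu + half

    lo, hi = ensemble_ci(means=[1.2, 0.9, 1.1],
                         variances=[0.04, 0.09, 0.05],
                         weights=[0.5, 0.2, 0.3])
    print(f"95% CI ~ ({lo:.3f}, {hi:.3f})")
    ```

    Shrinking the half-width by optimizing the weights, while holding mean error roughly constant, is the design knob the article exploits.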

  19. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  20. How to get rid of W: a latent variables approach to modelling spatially lagged variables

    NARCIS (Netherlands)

    Folmer, H.; Oud, J.

    2008-01-01

    In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are

  1. How to get rid of W : a latent variables approach to modelling spatially lagged variables

    NARCIS (Netherlands)

    Folmer, Henk; Oud, Johan

    2008-01-01

    In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are

  2. Representativeness and seasonality of major ion records derived from NEEM firn cores

    Directory of Open Access Journals (Sweden)

    G. Gfeller

    2014-10-01

    Full Text Available The seasonal and annual representativeness of ionic aerosol proxies (among others, calcium, sodium, ammonium and nitrate) in various firn cores in the vicinity of the NEEM drill site in northwest Greenland has been assessed. Seasonal representativeness is very high, as one core explains more than 60% of the variability within the area. The inter-annual representativeness, however, can be substantially lower (depending on the species), making replicate coring indispensable to derive the atmospheric variability of aerosol species. A single core at the NEEM site records only 30% of the inter-annual atmospheric variability in some species, while five replicate cores are already needed to cover approximately 70% of the inter-annual atmospheric variability in all species. The spatial representativeness is very high within 60 cm and rapidly decorrelates within 10 m, but does not diminish further within 3 km. We attribute this to wind reworking of the snow pack leading to sastrugi formation. Due to the high resolution and seasonal representativeness of the records, we can derive accurate seasonalities of the measured species for modern (AD 1990–2010) times as well as for pre-industrial (AD 1623–1750) times. Sodium and calcium show similar seasonality (peaking in February and March, respectively) for modern and pre-industrial times, whereas ammonium and nitrate are influenced by anthropogenic activities. Nitrate and ammonium both peak in May during modern times, whereas during pre-industrial times ammonium peaked during July–August and nitrate during June–July.

  3. Marketing norm perception among medical representatives in Indian pharmaceutical industry.

    Science.gov (United States)

    Nagashekhara, Molugulu; Agil, Syed Omar Syed; Ramasamy, Ravindran

    2012-03-01

    Study of marketing norm perception among medical representatives is an under-studied area that deserves further scrutiny in the pharmaceutical industry. The purpose of this study is to find out the perception of marketing norms among medical representatives. The research design is a quantitative, cross-sectional study with medical representatives as the unit of analysis. Data were collected from medical representatives (n=300) using simple random and cluster sampling with a structured questionnaire. Results indicate that there is no difference in the perception of marketing norms between male and female medical representatives. However, there is a difference in opinion between medical representatives of domestic and multinational companies. Educational background of medical representatives also makes a difference in opinion. Degree holders and multinational company medical representatives have a higher perception of marketing norms compared to their counterparts. The researchers strongly believe that mandatory training on marketing norms would be beneficial for decision making during dilemmas in the sales field.

  4. Variable Selection for Regression Models of Percentile Flows

    Science.gov (United States)

    Fouad, G.

    2017-12-01

    Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. Variables suffered from a high

  5. The North Atlantic Oscillation: variability and interactions with the North Atlantic ocean and Arctic sea ice

    Energy Technology Data Exchange (ETDEWEB)

    Jung, T

    2000-07-01

    The North Atlantic Oscillation (NAO) represents the dominant mode of atmospheric variability in the North Atlantic region and describes the strengthening and weakening of the midlatitude westerlies. In this study, variability of the NAO during wintertime and its relationship to the North Atlantic ocean and Arctic sea ice is investigated. For this purpose, observational data are analyzed along with integrations of models for the Atlantic ocean, Arctic sea ice, and the coupled global climate system. From a statistical point of view, the observed NAO index shows unusually high variance on interdecadal time scales during the 20th century. Variability on other time scales is consistent with realizations of random processes ('white noise'). Recurrence of wintertime NAO anomalies from winter to winter, with missing signals during the intervening non-winter seasons, is primarily associated with interdecadal variability of the NAO. This recurrence indicates that low-frequency changes of the NAO during the 20th century were in part externally forced. (orig.)

  7. Nasal Jet-CPAP (variable flow) versus Bubble-CPAP in preterm infants with respiratory distress: an open label, randomized controlled trial.

    Science.gov (United States)

    Bhatti, A; Khan, J; Murki, S; Sundaram, V; Saini, S S; Kumar, P

    2015-11-01

    To compare the failure rates between the Jet continuous positive airway pressure device (J-CPAP, variable flow) and the Bubble continuous positive airway pressure device (B-CPAP) in preterm infants with respiratory distress. Preterm newborns with respiratory distress were randomized to J-CPAP (a variable flow device) or B-CPAP (a continuous flow device). A standardized protocol was followed for titration, weaning and removal of CPAP. Pressure was monitored close to the nares in both devices every 6 hours, and settings were adjusted to provide the desired CPAP. The primary outcome was the CPAP failure rate within 72 h of life. Secondary outcomes were CPAP failure within 7 days of life, need for surfactant post-randomization, time to CPAP failure, duration of CPAP, and complications of prematurity. An intention-to-treat analysis was done. One hundred seventy neonates were randomized, 80 to J-CPAP and 90 to B-CPAP. CPAP failure rates within 72 h were similar in infants who received J-CPAP and in those who received B-CPAP (29 versus 21%; relative risk 1.4 (0.8 to 2.3), P=0.25). Mean (95% confidence interval) time to CPAP failure was 59 h (54 to 64) in the Jet CPAP group in comparison with 65 h (62 to 68) in the Bubble CPAP group (log-rank P=0.19). All other secondary outcomes were similar between the two groups. In preterm infants with respiratory distress starting within 6 h of life, CPAP failure rates were similar with Jet CPAP and Bubble CPAP.

  8. Random phenomena; Phenomenes aleatoires

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)

    1963-07-01

    This document gathers a set of lectures presented in 1962. The first one proposes a mathematical introduction to the analysis of random phenomena. The second one presents an axiomatic treatment of probability calculus. The third one gives an overview of one-dimensional random variables. The fourth one addresses random pairs and presents basic theorems regarding the algebra of mathematical expectations. The fifth lecture discusses some probability laws: the binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last one deals with the issues of stochastic convergence and asymptotic distributions.

  9. An MGF-based unified framework to determine the joint statistics of partial sums of ordered random variables

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs). With the proposed approach, we can systematically derive the joint statistics of any partial sums of ordered statistics, in terms of the moment generating function (MGF) and the probability density function (PDF). Our MGF-based approach applies not only when all the K ordered RVs are involved but also when only the Ks (Ks < K) best RVs are considered. In addition, we present closed-form expressions for the exponential RV special case. These results apply to the performance analysis of various wireless communication systems over fading channels. © 2006 IEEE.
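
    A quick Monte Carlo counterpart to such closed-form results (our illustration, not the paper's derivation): the empirical distribution of the sum of the Ks largest of K i.i.d. exponential RVs, the special case for which the paper gives closed-form expressions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def best_ks_sum(K=8, Ks=3, reps=200_000):
        """Empirical partial sum of the Ks largest of K i.i.d. unit-mean
        exponential RVs (e.g., generalized selection combining over fading)."""
        x = rng.exponential(1.0, size=(reps, K))
        x.sort(axis=1)
        return x[:, -Ks:].sum(axis=1)     # partial sum of the ordered RVs

    s = best_ks_sum()
    print(f"mean ~ {s.mean():.3f}, variance ~ {s.var():.3f}")
    ```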

  10. Gaussian random bridges and a geometric model for information equilibrium

    Science.gov (United States)

    Mengütürk, Levent Ali

    2018-03-01

    The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T,0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L²-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.
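
    The anticipative representation mentioned above is easy to visualize for the Brownian special case. The following sketch (our illustration; the Gaussian terminal value Z is an arbitrary choice) builds a path as the sum of a random variable scaled by t/T and an independent Brownian (T,0)-bridge.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def brownian_random_bridge(T=1.0, n=500):
        """One path of a Brownian random bridge via the anticipative
        representation: (t/T) * Z plus an independent Brownian (T,0)-bridge."""
        t = np.linspace(0.0, T, n + 1)
        dW = rng.normal(0.0, np.sqrt(T / n), n)
        W = np.concatenate(([0.0], np.cumsum(dW)))
        bridge = W - (t / T) * W[-1]      # pinned to 0 at both t = 0 and t = T
        Z = rng.normal()                  # random terminal value (assumed normal)
        return t, (t / T) * Z + bridge, Z

    t, X, Z = brownian_random_bridge()
    print(f"X(0) = {X[0]:.3f}, X(T) = {X[-1]:.3f} (pinned to Z = {Z:.3f})")
    ```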

  11. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  12. Some results on convergence rates for probabilities of moderate deviations for sums of random variables

    Directory of Open Access Journals (Sweden)

    Deli Li

    1992-01-01

    Full Text Available Let X, X_n, n≥1 be a sequence of i.i.d. real random variables, and S_n = X_1 + ... + X_n, n≥1. Convergence rates of moderate deviations are derived, i.e., the rate of convergence to zero of certain tail probabilities of the partial sums is determined. For example, we obtain equivalent conditions for the convergence of the series ∑_{n≥1} (ψ²(n)/n) P(|S_n| ≥ n^{1/2} φ(n)) only under the assumptions that EX = 0 and EX² = 1, where φ and ψ are taken from a broad class of functions. These results generalize and improve some recent results of Li (1991) and Gafurov (1982) and some previous work of Davis (1968). For b ∈ [0,1] and ϵ > 0, let λ_{ϵ,b} = ∑_{n≥3} ((log log n)^b / n) I(|S_n| ≥ ((2+ϵ) n log log n)^{1/2}). The behaviour of E λ_{ϵ,b} as ϵ↓0 is also studied.

  13. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  14. A study of probabilistic fatigue crack propagation models in Mg-Al-Zn alloys under different specimen thickness conditions by using the residual of a random variable

    International Nuclear Information System (INIS)

    Choi, Seon Soon

    2012-01-01

    The primary aim of this paper was to evaluate several probabilistic fatigue crack propagation models using the residual of a random variable, and to present the model fit for probabilistic fatigue behavior in Mg-Al-Zn alloys. The proposed probabilistic models are the probabilistic Paris-Erdogan model, probabilistic Walker model, probabilistic Forman model, and probabilistic modified Forman models. These models were prepared by applying a random variable to the empirical fatigue crack propagation models of the same names. The best models for describing fatigue crack propagation behavior in Mg-Al-Zn alloys were generally the probabilistic Paris-Erdogan and probabilistic Walker models. The probabilistic Forman model was a good model only for a specimen with a thickness of 9.45 mm.

  15. Atomoxetine could improve intra-individual variability in drug-naïve adults with attention-deficit/hyperactivity disorder comparably with methylphenidate: A head-to-head randomized clinical trial.

    Science.gov (United States)

    Ni, Hsing-Chang; Hwang Gu, Shoou-Lian; Lin, Hsiang-Yuan; Lin, Yu-Ju; Yang, Li-Kuang; Huang, Hui-Chun; Gau, Susan Shur-Fen

    2016-05-01

    Intra-individual variability in reaction time (IIV-RT) is common in individuals with attention-deficit/hyperactivity disorder (ADHD). It can be improved by stimulants. However, the effects of atomoxetine on IIV-RT are inconclusive. We aimed to investigate the effects of atomoxetine on IIV-RT, and directly compared its efficacy with methylphenidate in adults with ADHD. An 8-10 week, open-label, head-to-head, randomized clinical trial was conducted in 52 drug-naïve adults with ADHD, who were randomly assigned to two treatment groups: immediate-release methylphenidate (n=26) thrice daily (10-20 mg per dose) and atomoxetine once daily (n=26) (0.5-1.2 mg/kg/day). IIV-RT, derived from the Conners' continuous performance test (CCPT), was represented by the Gaussian (reaction time standard error, RTSE) and ex-Gaussian models (sigma and tau). Other neuropsychological functions, including response errors and mean reaction time, were also measured. Participants received CCPT assessments at baseline and week 8-10 (60.4±6.3 days). We found comparable improvements in CCPT performance between the immediate-release methylphenidate- and atomoxetine-treated groups. Both medications significantly improved IIV-RT in terms of reducing tau values, with comparable efficacy. In addition, both medications significantly improved inhibitory control by reducing commission errors. Our results provide evidence that atomoxetine can improve IIV-RT and inhibitory control, with efficacy comparable to immediate-release methylphenidate, in drug-naïve adults with ADHD. The shared and unique mechanisms underpinning these medication effects on IIV-RT await further investigation. © The Author(s) 2016.

  16. Ratio index variables or ANCOVA? Fisher's cats revisited.

    Science.gov (United States)

    Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S

    2010-01-01

    Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.

  17. Unit-specific calibration of Actigraph accelerometers in a mechanical setup - is it worth the effort? The effect on random output variation caused by technical inter-instrument variability in the laboratory and in the field

    DEFF Research Database (Denmark)

    Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L

    2008-01-01

    BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefore...

  18. Individualized Anemia Management Reduces Hemoglobin Variability in Hemodialysis Patients

    OpenAIRE

    Gaweda, Adam E.; Aronoff, George R.; Jacobs, Alfred A.; Rai, Shesh N.; Brier, Michael E.

    2013-01-01

    One-size-fits-all protocol-based approaches to anemia management with erythropoiesis-stimulating agents (ESAs) may result in undesired patterns of hemoglobin variability. In this single-center, double-blind, randomized controlled trial, we tested the hypothesis that individualized dosing of ESA improves hemoglobin variability over a standard population-based approach. We enrolled 62 hemodialysis patients and followed them over a 12-month period. Patients were randomly assigned to receive ESA ...

  19. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

    We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q1, q2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q1 > 1 and q2 > 1.
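
    The Monte Carlo side of such a comparison is straightforward to reproduce; the sketch below (ours, with arbitrary parameters) generates the eigenvalue spectrum of the sample covariance for N independent MA(1) series, the kind of spectrum the FRV result describes in the N, T → ∞ limit at fixed N/T.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def vma1_spectrum(N=200, T=800, a=0.3, reps=20):
        """Eigenvalues of the sample covariance (1/T) X X^T for N independent
        MA(1) series x_t = eps_t + a * eps_{t-1} with T observations each."""
        eigs = []
        for _ in range(reps):
            eps = rng.normal(size=(N, T + 1))
            X = eps[:, 1:] + a * eps[:, :-1]
            eigs.append(np.linalg.eigvalsh(X @ X.T / T))
        return np.concatenate(eigs)

    lam = vma1_spectrum()
    print(f"support ~ [{lam.min():.3f}, {lam.max():.3f}], mean ~ {lam.mean():.3f}")
    # the mean eigenvalue should sit near 1 + a^2, the variance of x_t
    ```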

  20. A Review on asymptotic normality of sums of associated random ...

    African Journals Online (AJOL)

    Association between random variables is a generalization of independence of these random variables. This concept is increasingly used across research fields in Statistics. In this paper, we provide a simple, clear and rigorous introduction to it. We will present the fundamental asymptotic ...

  1. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  2. Classical randomness in quantum measurements

    International Nuclear Information System (INIS)

    D'Ariano, Giacomo Mauro; Presti, Paoloplacido Lo; Perinotti, Paolo

    2005-01-01

    Similarly to quantum states, also quantum measurements can be 'mixed', corresponding to a random choice within an ensemble of measuring apparatuses. Such mixing is equivalent to a sort of hidden variable, which produces a noise of purely classical nature. It is then natural to ask which apparatuses are indecomposable, i.e. do not correspond to any random choice of apparatuses. This problem is interesting not only for foundations, but also for applications, since most optimization strategies give optimal apparatuses that are indecomposable. Mathematically the problem is posed by describing each measuring apparatus by a positive operator-valued measure (POVM), which gives the statistics of the outcomes for any input state. The POVMs form a convex set, and in this language the indecomposable apparatuses are represented by extremal points, the analogue of 'pure states' in the convex set of states. Differently from the case of states, however, indecomposable POVMs are not necessarily rank-one, e.g. von Neumann measurements. In this paper we give a complete classification of indecomposable apparatuses (for discrete spectrum) by providing different necessary and sufficient conditions for extremality of POVMs, along with a simple general algorithm for the decomposition of a POVM into extremals. As an interesting application, 'informationally complete' measurements are analysed in this respect. The convex set of POVMs is fully characterized by determining its border in terms of simple algebraic properties of the corresponding POVMs

  3. Sampling intraspecific variability in leaf functional traits: Practical suggestions to maximize collected information.

    Science.gov (United States)

    Petruzzellis, Francesco; Palandrani, Chiara; Savi, Tadeja; Alberti, Roberto; Nardini, Andrea; Bacaro, Giovanni

    2017-12-01

    The choice of the best sampling strategy to capture mean values of functional traits for a species/population, while maintaining information about traits' variability and minimizing the sampling size and effort, is an open issue in functional trait ecology. Intraspecific variability (ITV) of functional traits strongly influences sampling size and effort. However, while adequate information is available about intraspecific variability between individuals (ITV_BI) and among populations (ITV_POP), relatively few studies have analyzed intraspecific variability within individuals (ITV_WI). Here, we provide an analysis of ITV_WI of two foliar traits, namely specific leaf area (SLA) and osmotic potential (π), in a population of Quercus ilex L. We assessed the baseline ITV_WI level of variation between the two traits and provided the minimum and optimal sampling size in order to take into account ITV_WI, comparing sampling optimization outputs with those previously proposed in the literature. Different factors accounted for different amounts of variance of the two traits. SLA variance was mostly spread within individuals (43.4% of the total variance), while π variance was mainly spread between individuals (43.2%). Strategies that did not account for all the canopy strata produced mean values not representative of the sampled population. The minimum size to adequately capture the studied functional traits corresponded to 5 leaves taken randomly from 5 individuals, while the most accurate and feasible sampling size was 4 leaves taken randomly from 10 individuals. We demonstrate that the spatial structure of the canopy can significantly affect trait variability. Moreover, different strategies for different traits could be implemented during sampling surveys. We partially confirm sampling sizes previously proposed in the recent literature and encourage future analyses involving different traits.

  4. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities
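
    A minimal sketch of the likelihood being maximized (our reconstruction of the general setup, with a lognormal positive part and synthetic data; not the report's code): censored observations contribute a lumped probability of being either a true zero or a positive value below the detection limit L.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def nll(theta, x, L):
        """Negative log-likelihood: zero with prob. delta, else lognormal(mu,
        sigma); values at or below the detection limit L are left-censored."""
        delta, mu, sigma = theta
        obs = x[x > L]
        n_cens = np.sum(x <= L)
        p_cens = delta + (1 - delta) * norm.cdf((np.log(L) - mu) / sigma)
        ll = n_cens * np.log(p_cens) + obs.size * np.log(1 - delta)
        z = (np.log(obs) - mu) / sigma
        ll += np.sum(norm.logpdf(z) - np.log(sigma * obs))
        return -ll

    rng = np.random.default_rng(5)  # synthetic data: 20% zeros, lognormal(0, 1)
    x = np.where(rng.random(2000) < 0.2, 0.0, rng.lognormal(0.0, 1.0, 2000))
    res = minimize(nll, x0=(0.1, 0.0, 1.0), args=(x, 0.3), method="L-BFGS-B",
                   bounds=[(1e-6, 1 - 1e-6), (None, None), (1e-3, None)])
    print("delta, mu, sigma ~", np.round(res.x, 3))
    ```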

  5. A Model for Positively Correlated Count Variables

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    2010-01-01

    An α-permanental random field is, briefly speaking, a model for a collection of non-negative integer-valued random variables with positive associations. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their potential applications. The purpose of this paper is to summarize useful probabilistic results, study stochastic constructions and simulation techniques, and discuss some examples of α-permanental random fields. This should provide a useful basis for discussing the statistical aspects in future work.

  6. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  7. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  8. Variability in research ethics review of cluster randomized trials: a scenario-based survey in three countries

    Science.gov (United States)

    2014-01-01

    Background Cluster randomized trials (CRTs) present unique ethical challenges. In the absence of a uniform standard for their ethical design and conduct, problems such as variability in procedures and requirements by different research ethics committees will persist. We aimed to assess the need for ethics guidelines for CRTs among research ethics chairs internationally, investigate variability in procedures for research ethics review of CRTs within and among countries, and elicit research ethics chairs’ perspectives on specific ethical issues in CRTs, including the identification of research subjects. The proper identification of research subjects is a necessary requirement in the research ethics review process, to help ensure, on the one hand, that subjects are protected from harm and exploitation, and on the other, that reviews of CRTs are completed efficiently. Methods A web-based survey with closed- and open-ended questions was administered to research ethics chairs in Canada, the United States, and the United Kingdom. The survey presented three scenarios of CRTs involving cluster-level, professional-level, and individual-level interventions. For each scenario, a series of questions was posed with respect to the type of review required (full, expedited, or no review) and the identification of research subjects at cluster and individual levels. Results A total of 189 (35%) of 542 chairs responded. Overall, 144 (84%, 95% CI 79 to 90%) agreed or strongly agreed that there is a need for ethics guidelines for CRTs and 158 (92%, 95% CI 88 to 96%) agreed or strongly agreed that research ethics committees could be better informed about distinct ethical issues surrounding CRTs. There was considerable variability among research ethics chairs with respect to the type of review required, as well as the identification of research subjects. The cluster-cluster and professional-cluster scenarios produced the most disagreement. Conclusions Research ethics committees

  9. Fluxes all of the time? A primer on the temporal representativeness of FLUXNET

    Science.gov (United States)

    Chu, Housen; Baldocchi, Dennis D.; John, Ranjeet; Wolf, Sebastian; Reichstein, Markus

    2017-02-01

    FLUXNET, the global network of eddy covariance flux towers, provides the largest synthesized data set of CO2, H2O, and energy fluxes. To achieve the ultimate goal of providing flux information "everywhere and all of the time," studies have attempted to address the representativeness issue, i.e., whether measurements taken in a set of given locations and measurement periods can be extrapolated to a space- and time-explicit extent (e.g., terrestrial globe, 1982-2013 climatological baseline). This study focuses on the temporal representativeness of FLUXNET and tests whether site-specific measurement periods are sufficient to capture the natural variability of climatological and biological conditions. FLUXNET is unevenly representative across sites in terms of the measurement lengths and potentials of extrapolation in time. Similarity of driver conditions among years generally enables the extrapolation of flux information beyond measurement periods. Yet such extrapolation potentials are further constrained by site-specific variability of driver conditions. Several driver variables such as air temperature, diurnal temperature range, potential evapotranspiration, and normalized difference vegetation index had detectable trends and/or breakpoints within the baseline period, and flux measurements generally covered similar and biased conditions in those drivers. About 38% and 60% of FLUXNET sites adequately sampled the mean conditions and interannual variability of all driver conditions, respectively. For long-record sites (≥15 years) the percentages increased to 59% and 69%, respectively. However, the justification of temporal representativeness should not rely solely on the lengths of measurements. Whenever possible, site-specific consideration (e.g., trend, breakpoint, and interannual variability in drivers) should be taken into account.

  10. Attention Measures of Accuracy, Variability, and Fatigue Detect Early Response to Donepezil in Alzheimer's Disease: A Randomized, Double-blind, Placebo-Controlled Pilot Trial.

    Science.gov (United States)

    Vila-Castelar, Clara; Ly, Jenny J; Kaplan, Lillian; Van Dyk, Kathleen; Berger, Jeffrey T; Macina, Lucy O; Stewart, Jennifer L; Foldi, Nancy S

    2018-04-09

    Donepezil is widely used to treat Alzheimer's disease (AD), but detecting early response remains challenging for clinicians. Acetylcholine is known to directly modulate attention, particularly under high cognitive conditions, but no studies to date test whether measures of attention under high load can detect early effects of donepezil. We hypothesized that load-dependent attention tasks are sensitive to short-term treatment effects of donepezil, while global and other domain-specific cognitive measures are not. This longitudinal, randomized, double-blind, placebo-controlled pilot trial (ClinicalTrials.gov Identifier: NCT03073876) evaluated 23 participants newly diagnosed with AD initiating de novo donepezil treatment (5 mg). After baseline assessment, participants were randomized into Drug (n = 12) or Placebo (n = 11) groups, and retested after approximately 6 weeks. Cognitive assessment included: (a) attention tasks (Foreperiod Effect, Attentional Blink, and Covert Orienting tasks) measuring processing speed, top-down accuracy, orienting, intra-individual variability, and fatigue; (b) global measures (Alzheimer's Disease Assessment Scale-Cognitive Subscale, Mini-Mental Status Examination, Dementia Rating Scale); and (c) domain-specific measures (memory, language, visuospatial, and executive function). The Drug but not the Placebo group showed benefits of treatment at high-load measures by preserving top-down accuracy, improving intra-individual variability, and averting fatigue. In contrast, other global or cognitive domain-specific measures could not detect treatment effects over the same treatment interval. This pilot study suggests that attention measures targeting accuracy, variability, and fatigue under high-load conditions could be sensitive to short-term cholinergic treatment. Given the central role of acetylcholine in attentional function, load-dependent attentional measures may be valuable cognitive markers of early treatment response.

  11. Auditory detection of an increment in the rate of a random process

    International Nuclear Information System (INIS)

    Brown, W.S.; Emmerich, D.S.

    1994-01-01

    Recent experiments have presented listeners with complex tonal stimuli consisting of components with values (i.e., intensities or frequencies) randomly sampled from probability distributions [e.g., R. A. Lutfi, J. Acoust. Soc. Am. 86, 934--944 (1989)]. In the present experiment, brief tones were presented at intervals corresponding to the intensity of a random process. Specifically, the intervals between tones were randomly selected from exponential probability functions. Listeners were asked to decide whether tones presented during a defined observation interval represented a ''noise'' process alone or the ''noise'' with a ''signal'' process added to it. The number of tones occurring in any observation interval is a Poisson variable; receiver operating characteristics (ROCs) arising from Poisson processes have been considered by Egan [Signal Detection Theory and ROC Analysis (Academic, New York, 1975)]. Several sets of noise and signal intensities and observation interval durations were selected which were expected to yield equivalent performance. Rating ROCs were generated based on subjects' responses in a single-interval, yes--no task. The performance levels achieved by listeners and the effects of intensity and duration are compared to those predicted for an ideal observer
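
    The ideal-observer ROC for this task follows directly from the Poisson counts; a small sketch (our illustration, with made-up rates) computes hit and false-alarm probabilities for every counting criterion.

    ```python
    import numpy as np
    from scipy.stats import poisson

    def poisson_roc(lam_noise, lam_signal, max_count=40):
        """ROC points for deciding 'noise' vs. 'noise + signal' from the
        number of tones observed (a Poisson count under either hypothesis)."""
        ks = np.arange(max_count)
        fa = poisson.sf(ks - 1, lam_noise)                # P(count >= k | noise)
        hit = poisson.sf(ks - 1, lam_noise + lam_signal)  # P(count >= k | signal)
        return fa, hit

    fa, hit = poisson_roc(lam_noise=5.0, lam_signal=5.0)
    for k in (5, 8, 11):
        print(f"criterion k={k}: P(FA) = {fa[k]:.3f}, P(hit) = {hit[k]:.3f}")
    ```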

  12. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
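
    For reference, forward selection wrapped around a random forest can look like the following greedy sketch (a generic illustration assuming a descriptor matrix X and response y; it is not the benchmarked implementation).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    def forward_select(X, y, max_vars=10, cv=5, seed=0):
        """Greedily add the descriptor that most improves cross-validated R^2
        of a random forest; stop when no candidate improves the score."""
        chosen, remaining, best = [], list(range(X.shape[1])), -np.inf
        while remaining and len(chosen) < max_vars:
            scores = {}
            for j in remaining:
                rf = RandomForestRegressor(n_estimators=100, random_state=seed)
                scores[j] = cross_val_score(rf, X[:, chosen + [j]], y, cv=cv).mean()
            j_best = max(scores, key=scores.get)
            if scores[j_best] <= best:
                break                   # no remaining descriptor helps
            best = scores[j_best]
            chosen.append(j_best)
            remaining.remove(j_best)
        return chosen, best
    ```

    Each step refits one forest per remaining descriptor, so the cost grows quickly with the descriptor count; that cost is what motivates cheaper univariate filters, which the benchmark nevertheless found suboptimal.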

  13. Concentrated Hitting Times of Randomized Search Heuristics with Variable Drift

    DEFF Research Database (Denmark)

    Lehre, Per Kristian; Witt, Carsten

    2014-01-01

    Drift analysis is one of the state-of-the-art techniques for the runtime analysis of randomized search heuristics (RSHs) such as evolutionary algorithms (EAs), simulated annealing etc. The vast majority of existing drift theorems yield bounds on the expected value of the hitting time for a target...

  14. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

    The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.

  15. Marital status as a candidate moderator variable of male-female differences in sexual jealousy: the need for representative population samples.

    Science.gov (United States)

    Voracek, M

    2001-04-01

    Evolutionary psychological theories predict pronounced and universal male-female differences in sexual jealousy. Recent cross-cultural research, using the forced-choice jealousy items pioneered by Buss, et al., 1992, repeatedly found a large sex differential on these self-report measures: men significantly more often than women choose their mate's imagined sexual infidelity to be more distressing or upsetting to them than an imagined emotional infidelity. However, this body of evidence is solely based on undergraduate samples and does not take into account demographic factors. This study examined male-female differences in sexual jealousy in a community sample (N = 335, Eastern Austria). Within a logistic regression model, with other variables controlled for, marital status was a stronger predictor for sexual jealousy than respondents' sex. Contrary to previous research, the sex differential's effect size was only modest. These findings stress the pitfalls of prematurely generalizing evidence from undergraduate samples to the general population and the need for representative population samples in this research area.

  16. On lower limits and equivalences for distribution tails of randomly stopped sums

    NARCIS (Netherlands)

    Denisov, D.E.; Foss, S.G.; Korshunov, D.A.

    2008-01-01

    For the distribution F^{*t} of a random sum S_t = ξ_1 + ... + ξ_t of i.i.d. random variables with a common distribution F on the half-line [0, ∞), we study the limits of the ratios of tails as x → ∞ (here, t is a counting random variable which does not depend on {ξ_n}_{n≥1}). We also consider applications of the results

  17. Variability of fractal dimension of solar radio flux

    Science.gov (United States)

    Bhatt, Hitaishi; Sharma, Som Kumar; Trivedi, Rupal; Vats, Hari Om

    2018-04-01

    In the present communication, the variation of the fractal dimension of solar radio flux is reported. Solar radio flux observations on a day-to-day basis at 410, 1415, 2695, 4995, and 8800 MHz are used in this study. The data were recorded at Learmonth Solar Observatory, Australia, from 1988 to 2009, covering an epoch of two solar activity cycles (22 yr). The fractal dimension is calculated for the listed frequencies for this period. The fractal dimension, being a measure of randomness, represents the variability of solar radio flux at shorter time-scales. The contour plot of fractal dimension on a grid of years versus radio frequency suggests a high correlation with solar activity. The increase of fractal dimension with increasing frequency suggests that randomness increases towards the inner corona. This study also shows that the low frequencies are more affected by solar activity (at low frequency the fractal dimension difference between solar maximum and solar minimum is 0.42), whereas the higher frequencies are less affected (here the fractal dimension difference between solar maximum and solar minimum is 0.07). A good positive correlation is found between the fractal dimension averaged over all frequencies and the yearly averaged sunspot number (Pearson's coefficient is 0.87).
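
    The record does not state which estimator was used; Higuchi's method is one common choice for the fractal dimension of a daily flux series, sketched below for a white-noise input (whose dimension should come out near 2).

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=10):
        """Higuchi's fractal-dimension estimate for a 1-D time series
        (one standard estimator; an assumption, not the paper's stated method)."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        ks = np.arange(1, kmax + 1)
        L = np.empty(len(ks))
        for i, k in enumerate(ks):
            Lmk = []
            for m in range(k):
                idx = np.arange(m, N, k)        # subsampled series x[m], x[m+k], ...
                n = len(idx)
                curve = np.abs(np.diff(x[idx])).sum()
                Lmk.append(curve * (N - 1) / ((n - 1) * k * k))  # normalized length
            L[i] = np.mean(Lmk)
        # L(k) ~ k^(-D): the slope of log L against log(1/k) estimates D
        D, _ = np.polyfit(np.log(1.0 / ks), np.log(L), 1)
        return D

    rng = np.random.default_rng(0)
    print("white-noise FD ~", round(higuchi_fd(rng.normal(size=4000)), 2))  # near 2
    ```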

  18. Fuzzy randomness uncertainty in civil engineering and computational mechanics

    CERN Document Server

    Möller, Bernd

    2004-01-01

    This book, for the first time, provides a coherent, overall concept for taking account of uncertainty in the analysis, the safety assessment, and the design of structures. The reader is introduced to the problem of uncertainty modeling and familiarized with particular uncertainty models. For simultaneously considering stochastic and non-stochastic uncertainty the superordinated uncertainty model fuzzy randomness, which contains real valued random variables as well as fuzzy variables as special cases, is presented. For this purpose basic mathematical knowledge concerning the fuzzy set theory and the theory of fuzzy random variables is imparted. The body of the book comprises the appropriate quantification of uncertain structural parameters, the fuzzy and fuzzy probabilistic structural analysis, the fuzzy probabilistic safety assessment, and the fuzzy cluster structural design. The completely new algorithms are described in detail and illustrated by way of demonstrative examples.

  19. Fuzziness and randomness in an optimization framework

    International Nuclear Information System (INIS)

    Luhandjula, M.K.

    1994-03-01

    This paper presents a semi-infinite approach for linear programming in the presence of fuzzy random variable coefficients. As a byproduct a way for dealing with optimization problems including both fuzzy and random data is obtained. Numerical examples are provided for the sake of illustration. (author). 13 refs

  20. Large deviations of heavy-tailed random sums with applications in insurance and finance

    NARCIS (Netherlands)

    Kluppelberg, C; Mikosch, T

    We prove large deviation results for the random sum S(t) = Σ_{i=1}^{N(t)} X_i, t ≥ 0, where (N(t))_{t≥0} are non-negative integer-valued random variables and (X_n)_{n∈ℕ} are i.i.d. non-negative random variables with common distribution

  1. Bayesian methods for meta-analysis of causal relationships estimated using genetic instrumental variables

    DEFF Research Database (Denmark)

    Burgess, Stephen; Thompson, Simon G; Thompson, Grahame

    2010-01-01

    Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context o...

  2. A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data

    KAUST Repository

    Babuška, Ivo; Nobile, Fabio; Tempone, Raul

    2010-01-01

    This work proposes and analyzes a stochastic collocation method for solving elliptic partial differential equations with random coefficients and forcing terms. These input data are assumed to depend on a finite number of random variables. The method consists of a Galerkin approximation in space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space, and naturally leads to the solution of uncoupled deterministic problems as in the Monte Carlo approach. It treats easily a wide range of situations, such as input data that depend nonlinearly on the random variables, diffusivity coefficients with unbounded second moments, and random variables that are correlated or even unbounded. We provide a rigorous convergence analysis and demonstrate exponential convergence of the “probability error” with respect to the number of Gauss points in each direction of the probability space, under some regularity assumptions on the random input data. Numerical examples show the effectiveness of the method. Finally, we include a section with developments posterior to the original publication of this work. There we review sparse grid stochastic collocation methods, which are effective collocation strategies for problems that depend on a moderately large number of random variables.
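
    A minimal one-dimensional sketch of the collocation idea follows; the toy problem with a closed-form deterministic solve is an assumption for illustration, while the paper's method is the tensor-product version for PDEs with many random inputs. Each Gauss point yields an uncoupled deterministic problem, exactly as in the Monte Carlo approach.

        # Stochastic collocation sketch with one Gaussian random input:
        # solve -(a(Y) u'(x))' = 1 on (0,1), u(0)=u(1)=0 with a(Y) = exp(Y), Y ~ N(0,1).
        import numpy as np

        def u_mid(a):
            # deterministic solve for constant coefficient a: u(x) = x(1-x)/(2a)
            return 0.125 / a

        # Gauss-Hermite rule for E[g(Y)], Y ~ N(0,1): E[g] ~ sum_i w_i g(sqrt(2) x_i) / sqrt(pi)
        nodes, weights = np.polynomial.hermite.hermgauss(10)
        mean_u = (weights * u_mid(np.exp(np.sqrt(2.0) * nodes))).sum() / np.sqrt(np.pi)

        print(mean_u)               # collocation estimate of E[u(1/2)]
        print(0.125 * np.exp(0.5))  # exact value: E[exp(-Y)]/8 = e^{1/2}/8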

  3. A Note on the Tail Behavior of Randomly Weighted Sums with Convolution-Equivalently Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Yang Yang

    2013-01-01

    We investigate the asymptotic tail behavior of randomly weighted sums with convolution-equivalently distributed increments. Our result can be directly applied to a discrete-time insurance risk model with insurance and financial risks to derive the asymptotics for the finite-time ruin probability of the above risk model.

  4. A Randomized Controlled Trial of an Appearance-focused Intervention to Prevent Skin Cancer

    Science.gov (United States)

    Hillhouse, Joel; Turrisi, Rob; Stapleton, Jerod; Robinson, June

    2014-01-01

    BACKGROUND: Skin cancer represents a significant health threat, with over 1.3 million diagnoses, 8000 melanoma deaths, and more than $1 billion spent annually on skin cancer healthcare in the US. Despite findings from laboratory, case-control, and prospective studies that indicate a link between youthful indoor tanning (IT) and skin cancer, IT is increasing among US youth. Appearance-focused interventions represent a promising method to counteract these trends. METHODS: A total of 430 female indoor tanners were randomized into intervention or no-intervention control conditions. Intervention participants received an appearance-focused booklet based on decision-theoretical models of health behavior. Outcome variables included self-reports of IT behavior and intentions, as well as measures of cognitive mediating variables. RESULTS: Normative increases in springtime IT rates were significantly lower (i.e., over 35%) at 6-month follow-up in intervention versus control participants, with similar reductions in future intentions. Mediation analyses revealed 6 cognitive variables (IT attitudes, fashion attitudes, perceived susceptibility to skin cancer and skin damage, subjective norms, and image norms) that significantly mediated change in IT behavior. CONCLUSIONS: The appearance-focused intervention demonstrated strong effects on IT behavior and intentions in young indoor tanners. Appearance-focused approaches to skin cancer prevention need to present alternative behaviors as well as alter IT attitudes. Mediational results provide guides for strengthening future appearance-focused interventions directed at behaviors that increase risk of skin cancer. PMID:18937268

  6. A comparison of methods for representing random taste heterogeneity in discrete choice models

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Hess, Stephane

    2009-01-01

    This paper reports the findings of a systematic study using Monte Carlo experiments and a real dataset aimed at comparing the performance of various ways of specifying random taste heterogeneity in a discrete choice model. Specifically, the analysis compares the performance of two recent advanced...... distributions. Both approaches allow the researcher to increase the number of parameters as desired. The paper provides a range of evidence on the ability of the various approaches to recover various distributions from data. The two advanced approaches are comparable in terms of the likelihoods achieved...

  7. SU-E-J-176: Characterization of Inter-Fraction Breast Variability and the Implications On Delivered Dose

    Energy Technology Data Exchange (ETDEWEB)

    Sudhoff, M; Lamba, M; Kumar, N; Ward, A; Elson, H [University of Cincinnati, Cincinnati, OH (United States)

    2015-06-15

    Purpose: To systematically characterize inter-fraction breast variability and determine implications on delivered dose. Methods: Weekly port films were used to characterize breast setup variability. Five evenly spaced representative positions across the contour of each breast were chosen on the electronic port film in reference to the graticule, and window and level were set such that the skin surface of the breast was visible. Measurements from the skin surface to the treatment field edge were taken on each port film at each position and compared to the planning DRR, quantifying the variability. The systematic measurement technique was repeated for all port films for 20 recently treated breast cancer patients. Measured setup variability for each patient was modeled as a normal distribution. The distribution was randomly sampled from the model and applied as isocentric shifts in the treatment planning computer, representing setup variability for each fraction. Dose was calculated for each shifted fraction and summed to obtain DVHs and BEDs that modeled the dose with daily setup variability. Patients were categorized into relevant groupings chosen to investigate the rigorousness of immobilization types, treatment techniques, and inherent anatomical difficulties. Mean position differences and dosimetric differences were evaluated between planned and delivered doses. Results: The setup variability was found to follow a normal distribution, with mean position differences between the DRR and port film ranging from −8.6 to 3.5 mm and sigma ranging from 5.3 to 9.8 mm. Setup position was not found to be significantly different from zero. The mean seroma or whole-breast PTV dosimetric difference, calculated as BED, ranged from −0.23 to +1.13 Gy. Conclusion: A systematic technique to quantify and model setup variability was used to calculate the dose in 20 breast cancer patients including variable setup. No statistically significant PTV or OAR BED differences were found between

  8. Patient representatives' views on patient information in clinical cancer trials.

    Science.gov (United States)

    Dellson, Pia; Nilbert, Mef; Carlsson, Christina

    2016-02-01

    Patient enrolment into clinical trials is based on oral information and informed consent, which includes an information sheet and a consent certificate. The written information should be complete, but at the same time it risks being so complex that one may question whether a fully informed consent is possible to provide. We explored patient representatives' views and perceptions of the written trial information used in clinical cancer trials. Written patient information leaflets used in four clinical trials for colorectal cancer were used for the study. The trials included phase I-III trials, randomized and non-randomized, that evaluated chemotherapy/targeted therapy in the neoadjuvant, adjuvant and palliative settings. Data were collected through focus groups and were analysed using inductive content analysis. Two major themes emerged: emotional responses and cognitive responses. Subthemes related to the former included individual preferences and perceptions of effect, while subthemes related to the latter were comprehensibility and layout. Based on these observations, the patient representatives provided suggestions for improvement, which largely concerned the development of future simplified and more attractive informed consent forms. The emotional and cognitive responses to written patient information reported by patient representatives provide a basis for revised formats in future trials and add to the body of evidence that supports the use of plain language, structured text and illustrations to improve the informed consent process and thereby patient enrolment into clinical trials.

  9. The behaviour of random forest permutation-based variable importance measures under predictor correlation.

    Science.gov (United States)

    Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas

    2010-02-27

    Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and was unbiased under H0. Scaled VIMs were clearly biased under HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increased VIMs for correlated predictors may be considered a "bias" - because they do not directly reflect the coefficients in the generating model - or if it is a beneficial attribute of these VIMs is dependent on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.
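
    A small simulation in the spirit of this study (an sklearn-based sketch; the data-generating choices are assumptions, not the authors' design) shows the effect: a null predictor that is merely correlated with a causal one picks up unconditional permutation importance.

        # Illustrative simulation of permutation importance under predictor correlation.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(0)
        n = 2000
        x1 = rng.standard_normal(n)
        x2 = 0.9 * x1 + np.sqrt(1 - 0.9**2) * rng.standard_normal(n)  # corr(x1, x2) ~ 0.9
        x3 = rng.standard_normal(n)                                   # independent predictor
        y = x1 + x3 + rng.standard_normal(n)                          # x2 has no direct effect

        X = np.column_stack([x1, x2, x3])
        rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
        imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
        print(dict(zip(["x1", "x2", "x3"], imp.importances_mean.round(3))))
        # the correlated-but-null x2 typically receives a nonzero share of importance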

  10. Comparison of variance estimators for metaanalysis of instrumental variable estimates

    NARCIS (Netherlands)

    Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two

  11. Reticulocyte dynamic and hemoglobin variability in hemodialysis patients treated with Darbepoetin alfa and C.E.R.A.: a randomized controlled trial.

    Science.gov (United States)

    Forni, Valentina; Bianchi, Giorgia; Ogna, Adam; Salvadé, Igor; Vuistiner, Philippe; Burnier, Michel; Gabutti, Luca

    2013-07-22

    In a simulation based on a pharmacokinetic model we demonstrated that increasing the half-life of erythropoiesis stimulating agents (ESAs) or shortening their administration interval decreases hemoglobin variability. The benefit of reducing the administration interval was, however, lessened by the variability induced by more frequent dosage adjustments. The purpose of this study was to analyze the reticulocyte and hemoglobin kinetics and variability under different ESAs and administration intervals in a collective of chronic hemodialysis patients. The study was designed as an open-label, randomized, four-period cross-over investigation, including 30 patients under chronic hemodialysis at the regional hospital of Locarno (Switzerland) in February 2010 and lasting 2 years. Four subcutaneous treatment strategies (C.E.R.A. every 4 weeks [Q4W] and every 2 weeks [Q2W]; Darbepoetin alfa Q4W and Q2W) were compared with each other. The mean square successive difference of hemoglobin, reticulocyte count and ESA dose was used to quantify variability. We distinguished short- and long-term variability based on the weekly and monthly successive differences, respectively. No difference was found in the mean values of biological parameters (hemoglobin, reticulocytes, and ferritin) between the 4 strategies. ESA type did not affect hemoglobin and reticulocyte variability, but C.E.R.A. induced a more sustained reticulocyte response over time and increased the risk of hemoglobin overshooting (OR 2.7, p = 0.01). Shortening the administration interval lessened the amplitude of reticulocyte count fluctuations but resulted in more frequent ESA dose adjustments and in amplified reticulocyte and hemoglobin variability. The Q2W administration interval was however more favorable in terms of ESA dose, allowing a 38% C.E.R.A. dose reduction and no increase of Darbepoetin alfa. The reticulocyte dynamic was a more sensitive marker of time instability of the hemoglobin response under ESA therapy
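
    The variability metric named above, the mean square successive difference (MSSD), is a one-liner; the hemoglobin values below are invented for illustration.

        # Mean square successive difference of a series of measurements.
        import numpy as np

        def mssd(x):
            x = np.asarray(x, dtype=float)
            return np.mean(np.diff(x) ** 2)

        hb = np.array([11.2, 11.6, 12.1, 11.4, 10.9, 11.8, 12.3, 11.7])  # g/dL, made up
        print(mssd(hb))  # larger values indicate a less stable hemoglobin trajectory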

  12. Random effects coefficient of determination for mixed and meta-analysis models.

    Science.gov (United States)

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2012-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, [Formula: see text], that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If [Formula: see text] is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of [Formula: see text] apart from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects: the model can be estimated using the dummy variable approach. We derive explicit formulas for [Formula: see text] in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for a combination of 13 studies on tuberculosis vaccine.
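
    Since the journal formulas are elided here ("[Formula: see text]"), the sketch below shows one plausible reading for the random-intercept case: the proportion of conditional variance attributable to random effects, var(u)/(var(u) + var(e)). Treat this as an assumption rather than the paper's exact definition.

        # Random-intercept sketch: share of conditional variance due to random effects.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        groups = np.repeat(np.arange(30), 10)        # 30 clusters of 10 observations
        u = rng.normal(0.0, 1.0, 30)[groups]         # random intercepts, var(u) = 1
        x = rng.standard_normal(300)
        y = 1.0 + 0.5 * x + u + rng.normal(0.0, 1.0, 300)

        res = sm.MixedLM(y, sm.add_constant(x), groups=groups).fit()
        var_u = float(res.cov_re.iloc[0, 0])         # estimated random-effect variance
        var_e = res.scale                            # estimated residual variance
        print(var_u / (var_u + var_e))               # close to 0.5 here by construction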

  13. Random inbreeding, isonymy, and population isolates in Argentina.

    Science.gov (United States)

    Dipierri, José; Rodríguez-Larralde, Alvaro; Barrai, Italo; Camelo, Jorge López; Redomero, Esperanza Gutiérrez; Rodríguez, Concepción Alonso; Ramallo, Virginia; Bronberg, Rubén; Alfaro, Emma

    2014-07-01

    Population isolates are an important tool in identifying and mapping genes of Mendelian diseases and complex traits. The geographical identification of isolates represents a priority from a genetic and health care standpoint. The purpose of this study is to analyze the spatial distribution of consanguinity by random isonymy (F_ST) in Argentina and its relationship with the isolates previously identified in the country. F_ST was estimated from the surname distribution of 22.6 million electors registered for the year 2001 in the 24 provinces, 5 geographical regions, and 510 departments of the country. Statistically significant spatial clustering of F_ST was determined using the SaTScan V5.1 software. F_ST exhibited a marked regional and departmental variation, showing the highest values towards the North and West of Argentina. The clusters of high consanguinity by random isonymy followed the same distribution. Recognized Argentinean genetic isolates are mainly localized in the north of the country, in clusters of high inbreeding. Given the availability of listings of surnames in high-capacity storage devices for different countries, estimating F_ST from them can provide information on inbreeding for all levels of administrative subdivisions, to be used as a demographic variable for the identification of isolates within the country for public health purposes.
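
    For concreteness, a sketch of consanguinity by random isonymy from surname counts, using the unbiased isonymy estimator and the conventional F_ST = I/4 scaling of Crow and Mange (assumed here; the record does not print the formula):

        # Random isonymy I = sum n_i (n_i - 1) / (N (N - 1)) and F_ST = I / 4.
        import numpy as np

        def fst_from_surnames(counts):
            counts = np.asarray(counts, dtype=float)
            N = counts.sum()
            I = (counts * (counts - 1)).sum() / (N * (N - 1))  # unbiased random isonymy
            return I / 4.0

        print(fst_from_surnames([120, 80, 40, 10, 5, 5]))  # toy surname frequencies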

  14. Random graph states, maximal flow and Fuss-Catalan distributions

    International Nuclear Information System (INIS)

    Collins, Benoît; Nechita, Ion; Zyczkowski, Karol

    2010-01-01

    For any graph consisting of k vertices and m edges we construct an ensemble of random pure quantum states which describe a system composed of 2m subsystems. Each edge of the graph represents a bipartite, maximally entangled state. Each vertex represents a random unitary matrix generated according to the Haar measure, which describes the coupling between subsystems. Dividing all subsystems into two parts, one may study entanglement with respect to this partition. A general technique to derive an expression for the average entanglement entropy of random pure states associated with a given graph is presented. Our technique relies on Weingarten calculus and flow problems. We analyze the statistical properties of spectra of such random density matrices and show for which cases they are described by the free Poissonian (Marchenko-Pastur) distribution. We derive a discrete family of generalized, Fuss-Catalan distributions and explicitly construct graphs which lead to ensembles of random states characterized by these novel distributions of eigenvalues.

  15. Effects of short-term variability of meteorological variables on soil temperature in permafrost regions

    Science.gov (United States)

    Beer, Christian; Porada, Philipp; Ekici, Altug; Brakebusch, Matthias

    2018-03-01

    Effects of the short-term temporal variability of meteorological variables on soil temperature in northern high-latitude regions have been investigated. For this, a process-oriented land surface model has been driven using an artificially manipulated climate dataset. Short-term climate variability mainly impacts snow depth and the thermal diffusivity of lichens and bryophytes. These impacts of climate variability on insulating surface layers together substantially alter the heat exchange between atmosphere and soil. As a result, soil temperature is 0.1 to 0.8 °C higher when climate variability is reduced. Earth system models project warming of the Arctic region but also increasing variability of meteorological variables and more frequent extreme meteorological events. Therefore, our results show that projected future increases in permafrost temperature and active-layer thickness in response to climate change will be lower (i) when taking into account future changes in short-term variability of meteorological variables and (ii) when representing dynamic snow and lichen and bryophyte functions in land surface models.

  16. On the Wigner law in dilute random matrices

    Science.gov (United States)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
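
    A quick numerical check of the dilution claim (matrix size and dilution level are arbitrary choices, not from the paper):

        # Eigenvalues of a randomly diluted symmetric Gaussian matrix: after rescaling
        # by the dilution p, the spectrum still fills the semicircle support [-2, 2].
        import numpy as np

        rng = np.random.default_rng(0)
        N, p = 2000, 0.05
        A = rng.standard_normal((N, N))
        mask = rng.random((N, N)) < p                 # random dilution of the entries
        A = np.triu(A * mask, 1)
        A = A + A.T                                   # symmetric, zero diagonal
        eig = np.linalg.eigvalsh(A) / np.sqrt(N * p)  # entry variance is p
        print(eig.min(), eig.max())                   # close to -2 and 2 for large N*p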

  17. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...

  18. Perturbation Solutions for Random Linear Structural Systems subject to Random Excitation using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

    The paper deals with the first and second order statistical moments of the response of linear systems with random parameters subject to random excitation modelled as white-noise multiplied by an envelope function with random parameters. The method of analysis is basically a second order perturbation method using stochastic differential equations. The joint statistical moments entering the perturbation solution are determined by considering an augmented dynamic system with state variables made up of the displacement and velocity vector and their first and second derivatives with respect to the random parameters of the problem. Equations for partial derivatives are obtained from the partial differentiation of the equations of motion. The zero time-lag joint statistical moment equations for the augmented state vector are derived from the Itô differential formula. General formulation is given

  19. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    Science.gov (United States)

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  20. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    Science.gov (United States)

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.
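
    A minimal two-stage least squares sketch of this IV logic on simulated data follows; all structural coefficients are assumptions chosen so the true causal effect is 1.

        # Two-stage least squares: z = instrument, c = unmeasured confounder.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000
        z = rng.binomial(1, 0.5, n).astype(float)   # instrument (e.g. a genotype)
        c = rng.standard_normal(n)                  # unmeasured confounder
        x = 0.5 * z + c + rng.standard_normal(n)    # exposure, confounded by c
        y = 1.0 * x + c + rng.standard_normal(n)    # outcome; true causal effect = 1

        # stage 1: project the exposure on the instrument;
        # stage 2: regress the outcome on the projected exposure
        Z = np.column_stack([np.ones(n), z])
        x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        Xh = np.column_stack([np.ones(n), x_hat])
        beta = np.linalg.lstsq(Xh, y, rcond=None)[0]
        print(beta[1])   # near 1.0, while naive OLS of y on x is biased upward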

  1. Functional interpretation of representative soil spatial-temporal variability at the Central region of European territory of Russia

    Science.gov (United States)

    Vasenev, I.

    2012-04-01

    Essential spatial and temporal variability is a common feature of most natural and man-changed soils in the Central region of the European territory of Russia. The original spatial heterogeneity of forest and forest-steppe soils has been further complicated by a specific land-use history and soil successions of different directions due to environmental changes and human impacts. For demand-driven land-use planning and decision making, the quantitative analysis, modeling and functional-ecological interpretation of the spatial variability of representative soil cover patterns is an important and challenging task that receives increasing attention from the scientific community, private companies, and governmental and environmental bodies. On the basis of long-term different-scale soil mapping, key plot investigation, land quality and land-use evaluation, and modeling of soil-forming and degradation processes, a functional-ecological typology of the zonal set of elementary soil cover patterns (ESCP) has been developed for representative natural and man-transformed ecosystems of the forest, forest-steppe and steppe zones of the Central region of the European territory of Russia (ETR). The validation and ranking of the limiting factors of functional quality and ecological state have been made for the dominating and most dynamic components of the regional-typological ESCP forms, with application of local GIS, traditional regression kriging and correlation tree models. Development, zonal-regional differentiation and verification of the basic set of criteria and algorithms for logically formalized distinguishing of the most "stable" and "hot" areas in soil cover patterns make it possible to quantitatively assess the dominant elementary landscape, soil-forming and degradation processes in them. The received data essentially expand the known ranges of soil-forming process (SFP) rates in situ. In the case of mature forests, windthrow impacts and lateral processes make SFPs more active and complex both in

  2. Machine learning techniques to select variable stars

    Directory of Open Access Journals (Sweden)

    García-Varela Alejandro

    2017-01-01

    In order to perform a supervised classification of variable stars, we propose and evaluate a set of six features extracted from the magnitude density of the light curves. They are used to train automatic classification systems using state-of-the-art classifiers implemented in the R statistical computing environment. We find that random forests is the most successful method for selecting variable stars.

  3. Lifestyle factors and socioeconomic variables associated with abdominal obesity in Brazilian adolescents.

    Science.gov (United States)

    Moraes, Augusto César Ferreira de; Falcão, Mário Cícero

    2013-01-01

    Lifestyle variables have a key role in the development of abdominal obesity (AO). The objective of this study was to identify lifestyle factors and socioeconomic variables associated with AO in adolescents. This study carried out a school-based survey in the Brazilian city of Maringá in Paraná. The representative sample consisted of 991 adolescents (54.5% girls) from both public and private high schools, selected through multi-stage random sampling. AO was classified according to waist circumference value. The independent variables studied were: gender, age, socioeconomic level, parental and household characteristics, smoking, alcohol use, physical inactivity, sedentary behaviour and nutrition-related habits. Poisson regression with robust variance adjustment was used to analyse the associations. The analysis was stratified by sex. The prevalence of AO was 32.7% (girls = 36.3%, boys = 28.4%). In girls, excessive intake of fried foods was inversely associated with AO and excessive consumption of soda was positively associated. In boys, the results demonstrated a negative association with excessive consumption of sweets and soda. It is concluded that the prevalence of AO among adolescents was high in both sexes. AO is associated with different eating habits in females and males, and these relationships are mediated by familial contexts.

  4. Prediction of university student’s addictability based on some demographic variables, academic procrastination, and interpersonal variables

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Tavakoli

    2014-02-01

    Objectives: This study aimed to predict addictability among students based on demographic variables, academic procrastination, and interpersonal variables, and also to study the prevalence of addictability among these students. Method: The participants were 500 students (260 females, 240 males) selected through stratified random sampling from the students of Islamic Azad University, Abadan Branch. The participants were assessed with an individual-specification inventory, the Addiction Potential Scale, and the Aitken Procrastination Inventory. Findings: The findings showed a readiness for addiction in 23.6% of the students. Men showed higher addictability than women, but age was not a significant factor. Variables such as economic status, age, major, and academic procrastination predicted 13%, and among the interpersonal variables, having friends who use drugs and a dissociated family predicted 13.2% of the variance in addictability. Conclusion: This study has applied implications for addiction prevention.

  5. Comparison of structured and unstructured physical activity training on predicted VO2max and heart rate variability in adolescents - a randomized control trial.

    Science.gov (United States)

    Sharma, Vivek Kumar; Subramanian, Senthil Kumar; Radhakrishnan, Krishnakumar; Rajendran, Rajathi; Ravindran, Balasubramanian Sulur; Arunachalam, Vinayathan

    2017-05-01

    Physical inactivity contributes to many health issues. The WHO-recommended physical activity for adolescents encompasses aerobic, resistance, and bone-strengthening exercises aimed at achieving health-related physical fitness. Heart rate variability (HRV) and maximal aerobic capacity (VO2max) are considered noninvasive measures of cardiovascular health. The objective of this study is to compare the effect of structured and unstructured physical training on maximal aerobic capacity and HRV among adolescents. We designed a single-blinded, parallel, randomized active-controlled trial (Registration No. CTRI/2013/08/003897) to compare the physiological effects of 6 months of globally recommended structured physical activity (SPA) with that of unstructured physical activity (USPA) in healthy school-going adolescents. We recruited 439 healthy student volunteers (boys: 250, girls: 189) in the age group of 12-17 years. Randomization across the groups was done using an age- and gender-stratified randomization method, and the participants were divided into two groups: SPA (n=219, boys: 117, girls: 102) and USPA (n=220, boys: 119, girls: 101). Depending on their training status and gender, the participants in both the SPA and USPA groups were further subdivided into the following four sub-groups: SPA athlete boys (n=22) and girls (n=17), SPA nonathlete boys (n=95) and girls (n=85), USPA athlete boys (n=23) and girls (n=17), and USPA nonathlete boys (n=96) and girls (n=84). We recorded HRV, body fat percentage, and VO2max using the Rockport Walk Fitness test before and after the intervention. Maximal aerobic capacity and heart rate variability increased significantly, while heart rate, systolic blood pressure, diastolic blood pressure, and body fat percentage decreased significantly after both the SPA and USPA interventions. However, the improvement was greater with SPA than with USPA. SPA is more beneficial for improving cardiorespiratory fitness, HRV, and reducing body fat percentage in terms of

  6. A Note on the Correlated Random Coefficient Model

    DEFF Research Database (Denmark)

    Kolodziejczyk, Christophe

    In this note we derive the bias of the OLS estimator for a correlated random coefficient model with one random coefficient that is correlated with a binary variable. We provide set identification of the parameters of interest of the model. We also show how to reduce the bias of the estimator...

  7. Picturing and modelling catchments by representative hillslopes

    Science.gov (United States)

    Loritz, Ralf; Hassler, Sibylle; Jackisch, Conrad; Zehe, Erwin

    2016-04-01

    Hydrological modelling studies often start with a qualitative sketch of the hydrological processes of a catchment. These so-called perceptual models are often pictured as hillslopes and are generalizations displaying only the dominant and relevant processes of a catchment or hillslope. The problem with these models is that they are prone to being overly predetermined by the designer's background and experience. Moreover, it is difficult to know whether that picture is correct and contains enough complexity to represent the system under study. Nevertheless, because of their qualitative form, perceptual models are easy to understand and can be an excellent tool for multidisciplinary exchange between researchers with different backgrounds, helping to identify the dominant structures and processes in a catchment. In our study we explore whether a perceptual model built upon an intensive field campaign may serve as a blueprint for setting up representative hillslopes in a hydrological model to reproduce the functioning of two distinctly different catchments. We use a physically-based 2D hillslope model which has proven capable of being driven by measured soil-hydrological parameters. A key asset of our approach is that the model structure itself remains a picture of the perceptual model, which is benchmarked against a) geo-physical images of the subsurface and b) observed dynamics of discharge, distributed state variables and fluxes (soil moisture, matric potential and sap flow). Within this approach we are able to set up two behavioral model structures which allow the simulation of the most important hydrological fluxes and state variables in good accordance with available observations within the 19.4 km2 Colpach catchment and the 4.5 km2 Wollefsbach catchment in Luxembourg, without the necessity of calibration. This corroborates, contrary to widespread opinion, that a) lower mesoscale catchments may be modelled by representative hillslopes and b) physically

  8. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  9. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect

  10. How Far Is Quasar UV/Optical Variability from a Damped Random Walk at Low Frequency?

    Energy Technology Data Exchange (ETDEWEB)

    Guo Hengxiao; Wang Junxian; Cai Zhenyi; Sun Mouyuan, E-mail: hengxiaoguo@gmail.com, E-mail: jxw@ustc.edu.cn [CAS Key Laboratory for Research in Galaxies and Cosmology, Department of Astronomy, University of Science and Technology of China, Hefei 230026 (China)

    2017-10-01

    Studies have shown that UV/optical light curves of quasars can be described using the prevalent damped random walk (DRW) model, also known as the Ornstein–Uhlenbeck process. A white noise power spectral density (PSD) is expected at low frequency in this model; however, a direct observational constraint to the low-frequency PSD slope is difficult due to the limited lengths of the light curves available. Meanwhile, quasars show scatter in their DRW parameters that is too large to be attributed to uncertainties in the measurements and dependence on the variation of known physical factors. In this work we present simulations showing that, if the low-frequency PSD deviates from the DRW, the red noise leakage can naturally produce large scatter in the variation parameters measured from simulated light curves. The steeper the low-frequency PSD slope, the larger scatter we expect. Based on observations of SDSS Stripe 82 quasars, we find that the low-frequency PSD slope should be no steeper than −1.3. The actual slope could be flatter, which consequently requires that the quasar variabilities should be influenced by other unknown factors. We speculate that the magnetic field and/or metallicity could be such additional factors.
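
    The DRW is the Ornstein–Uhlenbeck process, so light curves can be simulated exactly with the conditional update below; the timescale and variance values are placeholders, not fits from the paper.

        # Exact discrete updates of a damped random walk (Ornstein-Uhlenbeck process)
        # with damping timescale tau and asymptotic variance sig2; illustrative only.
        import numpy as np

        def simulate_drw(t, tau=200.0, sig2=0.04, mean=0.0, seed=0):
            rng = np.random.default_rng(seed)
            x = np.empty(len(t))
            x[0] = mean + rng.normal(0.0, np.sqrt(sig2))
            for i in range(1, len(t)):
                a = np.exp(-(t[i] - t[i - 1]) / tau)   # decay over the time gap
                x[i] = mean + a * (x[i - 1] - mean) + rng.normal(0.0, np.sqrt(sig2 * (1 - a**2)))
            return x

        mag = simulate_drw(np.arange(0.0, 3000.0, 1.0))  # days; PSD is flat below 1/tau
        print(mag.std())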

  11. Analysis of the Spatial Variation of Network-Constrained Phenomena Represented by a Link Attribute Using a Hierarchical Bayesian Model

    Directory of Open Access Journals (Sweden)

    Zhensheng Wang

    2017-02-01

    Full Text Available The spatial variation of geographical phenomena is a classical problem in spatial data analysis and can provide insight into underlying processes. Traditional exploratory methods mostly depend on the planar distance assumption, but many spatial phenomena are constrained to a subset of Euclidean space. In this study, we apply a method based on a hierarchical Bayesian model to analyse the spatial variation of network-constrained phenomena represented by a link attribute in conjunction with two experiments based on a simplified hypothetical network and a complex road network in Shenzhen that includes 4212 urban facility points of interest (POIs for leisure activities. Then, the methods named local indicators of network-constrained clusters (LINCS are applied to explore local spatial patterns in the given network space. The proposed method is designed for phenomena that are represented by attribute values of network links and is capable of removing part of random variability resulting from small-sample estimation. The effects of spatial dependence and the base distribution are also considered in the proposed method, which could be applied in the fields of urban planning and safety research.

  12. An Undergraduate Research Experience on Studying Variable Stars

    Science.gov (United States)

    Amaral, A.; Percy, J. R.

    2016-06-01

    We describe and evaluate a summer undergraduate research project and experience by one of us (AA), under the supervision of the other (JP). The aim of the project was to sample current approaches to analyzing variable star data, and topics related to the study of Mira variable stars and their astrophysical importance. This project was done through the Summer Undergraduate Research Program (SURP) in astronomy at the University of Toronto. SURP allowed undergraduate students to explore and learn about many topics within astronomy and astrophysics, from instrumentation to cosmology. SURP introduced students to key skills which are essential for students hoping to pursue graduate studies in any scientific field. Variable stars proved to be an excellent topic for a research project. For beginners to independent research, it introduces key concepts in research such as critical thinking and problem solving, while illuminating previously learned topics in stellar physics. The focus of this summer project was to compare observations with structural and evolutionary models, including modelling the random walk behavior exhibited in the (O-C) diagrams of most Mira stars. We found that the random walk could be modelled by using random fluctuations of the period. This explanation agreed well with observations.
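
    The random-walk model mentioned at the end can be sketched in a few lines: perturb each cycle length independently and the (O-C) values execute a random walk (the period and scatter values below are invented for illustration).

        # (O-C) diagram from independent cycle-to-cycle period fluctuations.
        import numpy as np

        rng = np.random.default_rng(0)
        P0, sigma_P, n_cycles = 330.0, 2.0, 200            # days; illustrative values
        periods = P0 + rng.normal(0.0, sigma_P, n_cycles)  # each cycle's actual length
        t_max = np.cumsum(periods)                          # observed times of maxima
        o_minus_c = t_max - P0 * np.arange(1, n_cycles + 1)
        print(o_minus_c.std())  # grows with cycle count, the random-walk signature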

  13. Immediate versus early non-occlusal loading of dental implants placed flapless in partially edentulous patients: a 3-year randomized clinical trial.

    Science.gov (United States)

    Merli, Mauro; Moscatelli, Marco; Mariotti, Giorgia; Piemontese, Matteo; Nieri, Michele

    2012-02-01

    To compare immediate versus early non-occlusal loading of dental implants placed flapless in a 3-year, parallel group, randomized clinical trial. The study was conducted in a private dental clinic between July 2005 and July 2010. Patients 18 years or older were randomized to receive implants for fixed partial dentures in cases of partial edentulism. The test group was represented by immediate non-occlusal implant loading, whereas the control group was represented by early non-occlusal implant loading. The outcome variables were implant failure, complications and radiographic bone level at implant sites 3 years after loading, measured from the implant-abutment junction to the most coronal point of bone-to-implant contact. Randomization was computer-generated with allocation concealment by opaque sequentially numbered sealed envelopes, and the measurer was blinded to group assignment. Sixty patients were randomized: 30 to the immediately loaded group and 30 to the early loaded group. Four patients dropped out; however, the data of all patients were included in the analysis. No implant failure occurred. Two complications occurred in the control group and one in the test group. The mean bone level at 3 years was 1.91 mm for test group and 1.59 mm for control group. The adjusted difference in bone level was 0.26 mm (CI 95% -0.08 to 0.59, p = 0.1232). The null hypothesis of no difference in failure rates, complications and bone level between implants that were loaded immediately or early at 3 years cannot be rejected in this randomized clinical trial. © 2011 John Wiley & Sons A/S.

  14. Pulsating red variables

    International Nuclear Information System (INIS)

    Whitelock, P.A.

    1990-01-01

    The observational characteristics of pulsating red variables are reviewed with particular emphasis on the Miras. These variables represent the last stage in the evolution of stars on the Asymptotic Giant Branch (AGB). A large fraction of the IRAS sources in the Bulge are Mira variables and a subset of these are also OH/IR sources. Their periods range up to 720 days, though most are between 360 and 560 days. At a given period those stars with the highest pulsation amplitudes have the highest mass-loss rates; this is interpreted as evidence for a causal connection between mass-loss and pulsation. It is suggested that once an AGB star has become a Mira it will evolve with increasing pulsation amplitude and mass-loss, but with very little change of luminosity or logarithmic period. 26 refs

  15. Exponential gain of randomness certified by quantum contextuality

    Science.gov (United States)

    Um, Mark; Zhang, Junhua; Wang, Ye; Wang, Pengfei; Kim, Kihwan

    2017-04-01

    We demonstrate a protocol for exponential gain of randomness certified by quantum contextuality in a trapped-ion system. Genuine randomness can be produced by quantum principles and certified by quantum inequalities. Recently, randomness expansion protocols based on Bell-type inequalities and the Kochen-Specker (KS) theorem have been demonstrated. These schemes have been theoretically innovated to exponentially expand the randomness and to amplify the randomness from a weak initial random seed. Here, we report experimental evidence of such exponential expansion of randomness. In the experiment, we use three states of a 138Ba+ ion between a ground state and two quadrupole states. In the 138Ba+ ion system we do not have a detection loophole, and we apply a method to rule out certain hidden-variable models that obey a kind of extended noncontextuality.

  16. Stability and complexity of small random linear systems

    Science.gov (United States)

    Hastings, Harold

    2010-03-01

    We explore the stability of small random linear systems, typically involving 10-20 variables, motivated by the dynamics of the world trade network and the US and Canadian power grid.
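
    A sketch of the basic stability computation for such systems follows; the shifted Gaussian ensemble is an assumption for illustration, not the ensemble studied in the talk.

        # Stability of x' = A x is governed by the largest real part of the
        # eigenvalues of A; estimate P(stable) for small dense random systems.
        import numpy as np

        rng = np.random.default_rng(0)
        n, trials = 15, 2000
        stable = 0
        for _ in range(trials):
            A = rng.standard_normal((n, n)) / np.sqrt(n) - np.eye(n)  # shifted toward stability
            if np.linalg.eigvals(A).real.max() < 0:
                stable += 1
        print(stable / trials)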

  17. Least squares estimation in a simple random coefficient autoregressive model

    DEFF Research Database (Denmark)

    Johansen, S; Lange, T

    2013-01-01

    The question we discuss is whether a simple random coefficient autoregressive model with infinite variance can create the long swings, or persistence, which are observed in many macroeconomic variables. The model is defined by y_t = s_t ρ y_{t−1} + ε_t, t = 1, …, n, where s_t is an i.i.d. binary variable with p... we prove the curious result that [formula omitted]. The proof applies the notion of a tail index of sums of positive random variables with infinite variance to find the order of magnitude of [formula omitted] and [formula omitted] and hence the limit of [formula omitted]...
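
    The model itself is easy to simulate; the sketch below uses illustrative parameter values (ρ slightly above one, rare resets) to produce the long swings discussed. These values are assumptions, not taken from the paper.

        # Random coefficient AR(1): y_t = s_t * rho * y_{t-1} + eps_t,
        # with s_t i.i.d. Bernoulli(p); runs of s_t = 1 let |y| grow like rho^k
        # before a reset, producing persistent swings.
        import numpy as np

        rng = np.random.default_rng(0)
        n, rho, p = 2000, 1.05, 0.95
        s = rng.binomial(1, p, n)
        eps = rng.standard_normal(n)
        y = np.zeros(n)
        for t in range(1, n):
            y[t] = s[t] * rho * y[t - 1] + eps[t]
        print(np.abs(y).max())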

  18. Statistics of α-μ Random Variables and Their Applications in Wireless Multihop Relaying and Multiple Scattering Channels

    KAUST Repository

    Wang, Kezhi

    2015-06-01

    Exact results for the probability density function (PDF) and cumulative distribution function (CDF) of the sum of ratios of products (SRP) and the sum of products (SP) of independent α-μ random variables (RVs) are derived. They are in the form of one-dimensional integrals based on existing works on the products and ratios of α-μ RVs. In the derivation, a generalized Gamma (GG) ratio approximation (GGRA) is proposed to approximate the SRP. A Gamma ratio approximation (GRA) is proposed to approximate the SRP and the ratio of sums of products (RSP). A GG approximation (GGA) and a Gamma approximation (GA) are used to approximate the SP. The proposed results for the SRP can be used to calculate the outage probability (OP) for wireless multihop relaying systems or multiple scattering channels with interference. The proposed results for the SP can be used to calculate the OP for these systems without interference. In addition, the proposed approximate result for the RSP can be used to calculate the OP of the signal-to-interference ratio (SIR) in a multiple scattering system with interference.

  20. Softening in Random Networks of Non-Identical Beams.

    Science.gov (United States)

    Ban, Ehsan; Barocas, Victor H; Shephard, Mark S; Picu, Catalin R

    2016-02-01

    Random fiber networks are assemblies of elastic elements connected in random configurations. They are used as models for a broad range of fibrous materials including biopolymer gels and synthetic nonwovens. Although the mechanics of networks made from the same type of fibers has been studied extensively, the behavior of composite systems of fibers with different properties has received less attention. In this work we numerically and theoretically study random networks of beams and springs of different mechanical properties. We observe that the overall network stiffness decreases on average as the variability of fiber stiffness increases, at constant mean fiber stiffness. Numerical results and analytical arguments show that for small variabilities in fiber stiffness the amount of network softening scales linearly with the variance of the fiber stiffness distribution. This result holds for any beam structure and is expected to apply to a broad range of materials including cellular solids.

  1. Response variability in balanced cortical networks

    DEFF Research Database (Denmark)

    Lerchner, Alexander; Ursta, C.; Hertz, J.

    2006-01-01

    We study the spike statistics of neurons in a network with dynamically balanced excitation and inhibition. Our model, intended to represent a generic cortical column, comprises randomly connected excitatory and inhibitory leaky integrate-and-fire neurons, driven by excitatory input from an external...

  2. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

    In this paper, we construct protocols for two parties that do not trust each other to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication, and the entropy of the outcome.

  3. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions, because direct questioning leads to biased results. A new method is developed to measure latent variables using the RR technique. Within the RR technique, the probability of the true response is modeled by
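
    For context, a minimal numerical illustration of the classic Warner randomized-response design (the starting point for such latent-variable extensions, not the new method itself): each respondent answers the sensitive question with probability p and its negation otherwise, and the population proportion is recovered from the observed "yes" rate. All numbers below are made up.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_rr(pi_true, p, n):
            truth = rng.random(n) < pi_true      # true sensitive status
            direct = rng.random(n) < p           # which question was drawn
            # Warner's design: answer the sensitive question with prob p,
            # its negation with prob 1 - p.
            return np.where(direct, truth, ~truth)

        def estimate_pi(answers, p):
            lam = answers.mean()                     # observed "yes" proportion
            return (lam - (1 - p)) / (2 * p - 1)     # unbiased Warner estimator

        answers = simulate_rr(pi_true=0.20, p=0.75, n=10_000)
        print(estimate_pi(answers, p=0.75))          # should be close to 0.20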

  4. Informed decision-making with and for people with dementia - efficacy of the PRODECIDE education program for legal representatives: protocol of a randomized controlled trial (PRODECIDE-RCT).

    Science.gov (United States)

    Lühnen, Julia; Haastert, Burkhard; Mühlhauser, Ingrid; Richter, Tanja

    2017-09-15

    In Germany, the guardianship system provides adults who are no longer able to handle their own affairs with a court-appointed legal representative, for support without restriction of legal capacity. Although these representatives only rarely are qualified in healthcare, they nevertheless play decisive roles in the decision-making processes for people with dementia. Previously, we developed an education program (PRODECIDE) to address this shortcoming and tested it for feasibility. Typical autonomy-restricting decisions in the care of people with dementia - namely, using percutaneous endoscopic gastrostomy (PEG) or physical restraints (PR), or the prescription of antipsychotic drugs (AP) - were the subject areas trained. The training course aims to enhance the competency of legal representatives in informed decision-making. In this study, we will evaluate the efficacy of the PRODECIDE education program. A randomized controlled trial with a six-month follow-up will be conducted to compare the PRODECIDE education program with standard care, enrolling legal representatives (N = 216). The education program lasts 10 h and comprises four modules: A, decision-making processes and methods; and B, C and D, evidence-based knowledge about PEG, PR and AP, respectively. The primary outcome measure is knowledge, which is operationalized as the understanding of decision-making processes in healthcare affairs and the setting of realistic expectations about the benefits and harms of PEG, PR and AP in people with dementia. Secondary outcomes are sufficient and sustainable knowledge and the percentage of persons concerned who are affected by PEG, PR or AP. A qualitative process evaluation will be performed. Additionally, to support implementation, a concept for translating the educational contents into e-learning modules will be developed. The study results will show whether the efficacy of the education program could justify its implementation into the regular training curricula for legal representatives

  5. A data based random number generator for a multivariate distribution (using stochastic interpolation)

    Science.gov (United States)

    Thompson, J. R.; Taylor, M. S.

    1982-01-01

    Let X be a k-dimensional random variable serving as input for a system with output Y (not necessarily of dimension k). Given X, an outcome Y or a distribution of outcomes G(Y|X) may be obtained either explicitly or implicitly. The situation is considered in which there is a real-world data set {X_j, j = 1, ..., n} and a means of simulating an outcome Y. A method for empirical random number generation, based on the sample of observations of the random variable X without estimating the underlying density, is discussed.
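
    The flavor of such a data-based generator can be conveyed with a smoothed-bootstrap sketch: draw one of the observed vectors at random and perturb it with a small kernel, so that new variates resemble the sample without an explicit density estimate. This is a simplification of the stochastic interpolation scheme (which perturbs within neighborhoods of nearest neighbors); the bandwidth below is an arbitrary assumption.

        import numpy as np

        rng = np.random.default_rng(2)

        def smoothed_bootstrap(data, n_out, h=0.25):
            # data: (n, k) array of observed k-dimensional vectors.
            n, k = data.shape
            idx = rng.integers(0, n, size=n_out)          # resample observed points
            noise = rng.normal(0.0, h, size=(n_out, k))   # Gaussian smoothing kernel
            return data[idx] + noise * data.std(axis=0)   # noise scaled per coordinate

        observed = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=500)
        new_x = smoothed_bootstrap(observed, n_out=1000)
        print(new_x.mean(axis=0), np.corrcoef(new_x.T)[0, 1])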

  6. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
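
    One common approximation of random forest prediction uncertainty (not necessarily the estimator developed in the paper) is the spread of the individual tree predictions around the ensemble mean, which scikit-learn exposes through the estimators_ attribute:

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor

        X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
        rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

        # Per-tree predictions for a few points: shape (n_trees, n_points).
        tree_preds = np.stack([t.predict(X[:5]) for t in rf.estimators_])
        mean = tree_preds.mean(axis=0)   # ensemble prediction
        std = tree_preds.std(axis=0)     # crude per-point uncertainty proxy
        print(np.round(mean, 1), np.round(std, 1))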

  7. Quantifying the Variability of Internode Allometry within and between Trees for Pinus tabulaeformis Carr. Using a Multilevel Nonlinear Mixed-Effect Model

    Directory of Open Access Journals (Sweden)

    Jun Diao

    2014-11-01

    Allometric models of internodes are an important component of Functional-Structural Plant Models (FSPMs), which represent the shape of internodes in tree architecture and help our understanding of resource allocation in organisms. Constant allometry is always assumed in these models. In this paper, multilevel nonlinear mixed-effect models were used to characterize the variability of internode allometry, describing the relationship between the last internode length and biomass of Pinus tabulaeformis Carr. trees within the GreenLab framework. We demonstrated that there is significant variability in allometric relationships at the tree and different-order branch levels, and the variability decreases among levels from trees to first-order branches and, subsequently, to second-order branches. The variability was partially explained by the random effects of site characteristics, stand age, density, and topological position of the internode. Tree- and branch-level-specific allometric models are recommended because they produce unbiased and accurate internode length estimates. The model and method developed in this study are useful for understanding and describing the structure and functioning of trees.

  8. Impact of Flavonols on Cardiometabolic Biomarkers: A Meta-Analysis of Randomized Controlled Human Trials to Explore the Role of Inter-Individual Variability

    Science.gov (United States)

    Menezes, Regina; Rodriguez-Mateos, Ana; Kaltsatou, Antonia; González-Sarrías, Antonio; Greyling, Arno; Giannaki, Christoforos; Andres-Lacueva, Cristina; Milenkovic, Dragan; Gibney, Eileen R.; Dumont, Julie; Schär, Manuel; Garcia-Aloy, Mar; Palma-Duran, Susana Alejandra; Ruskovska, Tatjana; Maksimova, Viktorija; Combet, Emilie; Pinto, Paula

    2017-01-01

    Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as the difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08), and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to those with healthy and normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of origin and health status may influence the effect of flavonol intake on blood lipid levels. PMID:28208791
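
    For background on how such pooled differences in means arise, here is a minimal inverse-variance sketch of fixed-effects and DerSimonian-Laird random-effects pooling on fabricated study-level data; the numbers are illustrative, not taken from the 18 trials analyzed.

        import numpy as np

        # Fabricated per-study mean differences and standard errors (mmol/L).
        dm = np.array([-0.12, -0.05, -0.20, -0.08])
        se = np.array([0.05, 0.04, 0.09, 0.06])

        w = 1 / se**2                               # fixed-effect weights
        fixed = np.sum(w * dm) / np.sum(w)

        q = np.sum(w * (dm - fixed) ** 2)           # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(dm) - 1)) / c)    # DerSimonian-Laird tau^2

        w_re = 1 / (se**2 + tau2)                   # random-effects weights
        random_eff = np.sum(w_re * dm) / np.sum(w_re)
        print(round(fixed, 3), round(random_eff, 3))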

  9. Quantifying and mapping spatial variability in simulated forest plots

    Science.gov (United States)

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands of regularly spaced loblolly pine (Pinus taeda L.) plantations were developed. We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  10. Travel time variability and rational inattention

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Jiang, Gege

    2017-01-01

    This paper sets up a rational inattention model for the choice of departure time for a traveler facing random travel time. The traveler chooses how much information to acquire about the travel time outcome before choosing departure time. This reduces the cost of travel time variability compared...

  11. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    Science.gov (United States)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.

  12. [Sensitivity of four representative angular cephalometric measures].

    Science.gov (United States)

    Xü, T; Ahn, J; Baumrind, S

    2000-05-01

    We examined the sensitivity of four representative cephalometric angles to the detection of different vectors of craniofacial growth. Landmark coordinate data from a stratified random sample of 48 adolescent subjects were used to calculate conventional values for changes between the pretreatment and end-of-treatment lateral cephalograms. By modifying the end-of-treatment coordinate values appropriately, the angular changes could be recalculated to reflect three hypothetical situations: Case 1, what if there were no downward landmark displacement between timepoints? Case 2, what if there were no forward landmark displacement between timepoints? Case 3, what if there were no Nasion change? These questions were asked for four representative cephalometric angles: SNA, ANB, NAPg and UI-SN. For Case 1, the associations (r) between the baseline and the modified measures for the three angles were very highly significant (P < 0.001), with r² values no lower than 0.94. For Case 2, however, the associations were much weaker and no r value reached significance. These angular measurements are less sensitive for measuring downward landmark displacement than they are for measuring forward landmark displacement.

  13. Fracture fragility of HFIR vessel caused by random crack size or random toughness

    International Nuclear Information System (INIS)

    Chang, Shih-Jung; Proctor, L.D.

    1993-01-01

    This report discusses the probability of fracture (fracture fragility) versus a range of applied hoop stresses along the HFIR vessel, which is obtained as an estimate of its fracture capacity. Both the crack size and the fracture toughness are assumed to be random variables that follow given distribution functions. The possible hoop stress is based on the numerical solution of the vessel response obtained by applying a point pressure-pulse at the center of the fluid volume within the vessel. Both the fluid-structure interaction and radiation embrittlement are taken into consideration. Elastic fracture mechanics is used throughout the analysis. The probability of vessel fracture for a single crack caused by either a variable crack depth or a variable toughness is first derived. Then the probability of fracture with a multiple number of cracks is obtained. The probability of fracture is further extended to include different levels of confidence and variability. It therefore enables one to estimate the high-confidence, low-probability fracture capacity under accident loads.
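
    The core of such a fragility computation can be sketched as a Monte Carlo loop over the textbook linear-elastic criterion K_I = Y σ √(πa) > K_Ic, with random crack depth and random toughness; the distributions and constants below are placeholders, not the HFIR values.

        import numpy as np

        rng = np.random.default_rng(3)

        def fracture_fragility(sigma_mpa, n_mc=100_000):
            a = rng.lognormal(mean=np.log(0.003), sigma=0.5, size=n_mc)  # crack depth (m)
            k_ic = rng.normal(60.0, 8.0, size=n_mc)   # toughness (MPa*sqrt(m))
            y = 1.12                                  # geometry factor
            k_i = y * sigma_mpa * np.sqrt(np.pi * a)  # stress intensity factor
            return np.mean(k_i > k_ic)                # probability of fracture

        for s in (100, 200, 300, 400):                # applied hoop stresses (MPa)
            print(s, fracture_fragility(s))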

  14. Random Forest Variable Importance Spectral Indices Scheme for Burnt Forest Recovery Monitoring—Multilevel RF-VIMP

    Directory of Open Access Journals (Sweden)

    Sornkitja Boonprong

    2018-05-01

    Burnt forest recovery is normally monitored with a time-series analysis of satellite data because of its proficiency for large observation areas. Traditional methods, such as linear correlation plotting, have been proven to be effective, as forest recovery naturally increases with time. However, these methods are complicated and time consuming when increasing the number of observed parameters. In this work, we present a random forest variable importance (RF-VIMP) scheme called multilevel RF-VIMP to compare and assess the relationship between 36 spectral indices (parameters) of burnt boreal forest recovery in the Great Xing’an Mountain, China. Six Landsat images were acquired in the same month 0, 1, 4, 14, 16, and 20 years after a fire, and 39,380 fixed-location samples were then extracted to calculate the effectiveness of the 36 parameters. Consequently, the proposed method was applied to find correlations between the forest recovery indices. The experiment showed that the proposed method is suitable for explaining the efficacy of those spectral indices in terms of discrimination and trend analysis, and for showing the satellite data and forest succession dynamics when applied in a time series. The results suggest that the tasseled cap transformation wetness, brightness, and the shortwave infrared bands (both 1 and 2) perform better than other indices for both classification and monitoring.

  15. Collocation methods for uncertainty quantification in PDE models with random data

    KAUST Repository

    Nobile, Fabio

    2014-01-06

    In this talk we consider Partial Differential Equations (PDEs) whose input data are modeled as random fields to account for their intrinsic variability or our lack of knowledge. After parametrizing the input random fields by finitely many independent random variables, we exploit the high regularity of the solution of the PDE as a function of the input random variables and consider sparse polynomial approximations in probability (Polynomial Chaos expansion) by collocation methods. We first address interpolatory approximations, where the PDE is solved on a sparse grid of Gauss points in the probability space and the solutions thus obtained are interpolated by multivariate polynomials. We present recent results on optimized sparse grids in which the selection of points is based on a knapsack approach and relies on sharp estimates of the decay of the coefficients of the polynomial chaos expansion of the solution. Secondly, we consider regression approaches, where the PDE is evaluated on randomly chosen points in the probability space and a polynomial approximation is constructed by the least squares method. We present recent theoretical results on the stability and optimality of the approximation under suitable conditions between the number of sampling points and the dimension of the polynomial space. In particular, we show that for uniform random variables, the number of sampling points has to scale quadratically with the dimension of the polynomial space to maintain the stability and optimality of the approximation. Numerical results show that this condition is sharp in the univariate case but seems to be over-constraining in higher dimensions. The regression technique therefore seems to be attractive in higher dimensions.
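
    A one-dimensional illustration of the regression (discrete least-squares) approach: evaluate a cheap stand-in model at randomly sampled uniform inputs and fit a Legendre polynomial chaos basis by least squares. The quadratic oversampling rule discussed above is applied as an assumption for stability; the model function is arbitrary.

        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(4)

        def model(y):                  # stand-in for an expensive PDE solve
            return np.exp(0.5 * y) * np.sin(3 * y)

        p = 8                          # polynomial degree
        dim_poly = p + 1
        m = 2 * dim_poly**2            # samples scale quadratically with dimension
        y = rng.uniform(-1, 1, size=m)
        v = legendre.legvander(y, p)   # Legendre design matrix
        coef, *_ = np.linalg.lstsq(v, model(y), rcond=None)

        y_test = np.linspace(-1, 1, 5)
        print(np.max(np.abs(legendre.legval(y_test, coef) - model(y_test))))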

  16. Algebraic polynomials with random coefficients

    Directory of Open Access Journals (Sweden)

    K. Farahmand

    2002-01-01

    This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form $a_0(\omega) + a_1(\omega)\binom{n}{1}^{1/2}x + a_2(\omega)\binom{n}{2}^{1/2}x^2 + \dots + a_n(\omega)\binom{n}{n}^{1/2}x^n$ when n is large. The coefficients $\{a_j(\omega)\}_{j=0}^{n}$, $\omega \in \Omega$, are assumed to be a sequence of independent normally distributed random variables with means zero and variance one, each defined on a fixed probability space $(\Omega, \mathcal{A}, \mathrm{Pr})$. A special case of dependent coefficients is also studied.
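
    The expectation studied here can be probed empirically by drawing the coefficients and counting sign changes of the second derivative; the sketch below restricts attention to a finite grid on [-1, 1], so it is a rough undercount rather than the asymptotic value.

        import numpy as np
        from math import comb
        from numpy.polynomial import polynomial as P

        rng = np.random.default_rng(5)

        def count_inflections(n, x=np.linspace(-1, 1, 4001)):
            a = rng.normal(size=n + 1)                        # a_j ~ N(0, 1)
            c = a * np.sqrt([comb(n, j) for j in range(n + 1)])
            d2 = P.polyval(x, P.polyder(c, 2))                # second derivative on grid
            return np.count_nonzero(np.diff(np.sign(d2)))     # sign changes

        n = 40
        print(np.mean([count_inflections(n) for _ in range(200)]))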

  17. Assessing Use of Cognitive Heuristic Representativeness in Clinical Reasoning

    Science.gov (United States)

    Payne, Velma L.; Crowley, Rebecca S.

    2008-01-01

    We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors. PMID:18999140

  18. More randomness from the same data

    International Nuclear Information System (INIS)

    Bancal, Jean-Daniel; Sheridan, Lana; Scarani, Valerio

    2014-01-01

    Correlations that cannot be reproduced with local variables certify the generation of private randomness. Usually, the violation of a Bell inequality is used to quantify the amount of randomness produced. Here, we show how private randomness generated during a Bell test can be directly quantified from the observed correlations, without the need to process these data into an inequality. The frequency with which the different measurement settings are used during the Bell test can also be taken into account. This improved analysis turns out to be very relevant for Bell tests performed with a finite collection efficiency. In particular, applying our technique to the data of a recent experiment (Christensen et al 2013 Phys. Rev. Lett. 111 130406), we show that about twice as much randomness as previously reported can be potentially extracted from this setup. (paper)

  19. Stochastic stationary response of a variable-mass system with mass disturbance described by Poisson white noise

    Science.gov (United States)

    Qiao, Yan; Xu, Wei; Jia, Wantao; Han, Qun

    2017-05-01

    Variable-mass systems have received widespread attention and show prominent significance with the explosive development of micro- and nanotechnologies, so there is a growing need to study the influences of mass disturbances on systems. This paper is devoted to investigating the stochastic response of a variable-mass system subject to weakly random excitation, in which the mass disturbance is modeled as a Poisson white noise. Firstly, the original system is approximately replaced by the associated conservative system with small disturbance based on the Taylor expansion technique. Then the stationary response of the approximate system is obtained by applying the stochastic averaging method. Finally, a representative variable-mass oscillator is worked out to illustrate the effectiveness of the analytical solution by comparing with Monte Carlo simulation. The relative change of mean-square displacement is used to measure the influences of mass disturbance on system responses. Results reveal that the stochastic responses are more sensitive to mass disturbance for some system parameters. It is also found that the influences of Poisson white noise as the mass disturbance on system responses are significantly different from those of Gaussian white noise of the same intensity.

  20. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  1. Test of arbitrage pricing theory using macroeconomic variables

    African Journals Online (AJOL)

    Eyerusalem

    variables; namely, exchange rate, an index of industrial production, nominal money supply ... Key Words: Arbitrage Pricing, Macroeconomic variables, Stock Market ... or theoretical market indices, where sensitivity to changes in each factor is represented ... Ethiopian Journal of Economics, Volume XXI, No 1, April 2012.

  2. Employing a Multi-level Approach to Recruit a Representative Sample of Women with Recent Gestational Diabetes Mellitus into a Randomized Lifestyle Intervention Trial.

    Science.gov (United States)

    Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W

    2016-02-01

    The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional level), meso (mid or provider level), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69 %) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum, at a mean 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multilevel approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were

  3. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
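
    The AR-to-MA conversion mentioned here is mechanical once the AR coefficients are known: the MA (impulse-response) weights follow the recursion ψ_k = Σ_j φ_j ψ_{k-j} with ψ_0 = 1. A minimal sketch with a least-squares AR fit on synthetic data (not the FORTRAN implementation referenced above):

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic AR(2) series: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t.
        n = 2000
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = 0.6 * x[t-1] - 0.3 * x[t-2] + rng.normal()

        p = 2
        lags = np.column_stack([x[p-1-i:n-1-i] for i in range(p)])  # lagged regressors
        phi, *_ = np.linalg.lstsq(lags, x[p:], rcond=None)          # AR coefficients

        def ar_to_ma(phi, n_weights=6):
            psi = np.zeros(n_weights)
            psi[0] = 1.0
            for k in range(1, n_weights):
                psi[k] = sum(phi[j] * psi[k-1-j] for j in range(min(len(phi), k)))
            return psi

        print(np.round(phi, 2), np.round(ar_to_ma(phi), 3))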

  4. Strong result for real zeros of random algebraic polynomials

    Directory of Open Access Journals (Sweden)

    T. Uno

    2001-01-01

    An estimate is given for the lower bound of real zeros of random algebraic polynomials whose coefficients are non-identically distributed dependent Gaussian random variables. Moreover, our estimated measure of the exceptional set, which is independent of the degree of the polynomials, tends to zero as the degree of the polynomial tends to infinity.

  5. Infinite hidden conditional random fields for human behavior analysis.

    Science.gov (United States)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    2013-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs--chosen via cross-validation--for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time.

  6. Effect of random edge failure on the average path length

    Energy Technology Data Exchange (ETDEWEB)

    Guo Dongchao; Liang Mangui; Li Dandan; Jiang Zhongyuan, E-mail: mgliang58@gmail.com, E-mail: 08112070@bjtu.edu.cn [Institute of Information Science, Beijing Jiaotong University, 100044, Beijing (China)

    2011-10-14

    We study the effect of random removal of edges on the average path length (APL) in a large class of uncorrelated random networks in which vertices are characterized by hidden variables controlling the attachment of edges between pairs of vertices. A formula for approximating the APL of networks suffering random edge removal is derived first. Then, the formula is confirmed by simulations for classical ER (Erdős and Rényi) random graphs, BA (Barabási and Albert) networks, networks with exponential degree distributions as well as random networks with asymptotic power-law degree distributions with exponent α > 2. (paper)
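
    A quick simulation of the quantity studied here, using networkx (assumed available): remove a random fraction of edges from an Erdős-Rényi graph and recompute the APL on the giant component.

        import random
        import networkx as nx

        random.seed(0)

        def apl_after_removal(g, frac):
            h = g.copy()
            edges = list(h.edges())
            h.remove_edges_from(random.sample(edges, int(frac * len(edges))))
            giant = h.subgraph(max(nx.connected_components(h), key=len))
            return nx.average_shortest_path_length(giant)

        g = nx.erdos_renyi_graph(n=1000, p=0.01, seed=0)
        for f in (0.0, 0.2, 0.4):
            print(f, round(apl_after_removal(g, f), 3))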

  7. Randomized trials, generalizability, and meta-analysis: Graphical insights for binary outcomes

    Directory of Open Access Journals (Sweden)

    Kramer Barnett S

    2003-06-01

    Background: Randomized trials stochastically answer the question, "What would be the effect of treatment on outcome if one turned back the clock and switched treatments in the given population?" Generalizations to other subjects are reliable only if the particular trial is performed on a random sample of the target population. By considering an unobserved binary variable, we graphically investigate how randomized trials can also stochastically answer the question, "What would be the effect of treatment on outcome in a population with a possibly different distribution of an unobserved binary baseline variable that does not interact with treatment in its effect on outcome?" Method: For three different outcome measures, absolute difference (DIF), relative risk (RR), and odds ratio (OR), we constructed a modified BK-Plot under the assumption that treatment has the same effect on outcome if either all or no subjects had a given level of the unobserved binary variable. (A BK-Plot shows the effect of an unobserved binary covariate on a binary outcome in two treatment groups; it was originally developed to explain Simpson's paradox.) Results: For DIF and RR, but not OR, the BK-Plot shows that the estimated treatment effect is invariant to the fraction of subjects with an unobserved binary variable at a given level. Conclusion: The BK-Plot provides a simple method to understand generalizability in randomized trials. Meta-analyses of randomized trials with a binary outcome that are based on DIF or RR, but not OR, will avoid bias from an unobserved covariate that does not interact with treatment in its effect on outcome.

  8. Maximal Increments of Local Time of a Random Walk

    OpenAIRE

    Jain, Naresh C.; Pruitt, William E.

    1987-01-01

    Let $(S_j)$ be a lattice random walk, i.e., $S_j = X_1 + \cdots + X_j$, where $X_1, X_2, \ldots$ are independent random variables with values in $\mathbb{Z}$ and common nondegenerate distribution $F$. Let $\{t_n\}$ be a nondecreasing sequence of positive integers, $t_n \leq n$, and $L^\ast_n = \max_{0 \leq j \leq n - t_n}(L_{j+t_n} - L_j)$, where $L_n = \sum_{j=1}^{n} 1_{\{0\}}(S_j)$, the number of times zero is visited by the random walk by time $n$. Assuming that the random walk is recurrent and sa...

  9. Variable setpoint as a relaxing component in physiological control.

    Science.gov (United States)

    Risvoll, Geir B; Thorsen, Kristian; Ruoff, Peter; Drengstig, Tormod

    2017-09-01

    Setpoints in physiology have been a puzzle for decades, and especially the notion of fixed or variable setpoints has received much attention. In this paper, we show how previously presented homeostatic controller motifs, extended with saturable signaling kinetics, can be described as variable setpoint controllers. The benefit of a variable setpoint controller is that an observed change in the concentration of the regulated biochemical species (the controlled variable) is fully characterized, and is not considered a deviation from a fixed setpoint. The variation in this biochemical species originates from variation in the disturbances (the perturbation), and thereby in the biochemical species representing the controller (the manipulated variable). Thus, we define an operational space which is spanned out by the combined high and low levels of the variations in (1) the controlled variable, (2) the manipulated variable, and (3) the perturbation. From this operational space, we investigate whether and how it imposes constraints on the different motif parameters, in order for the motif to represent a mathematical model of the regulatory system. Further analysis of the controller's ability to compensate for disturbances reveals that a variable setpoint represents a relaxing component for the controller, in that the necessary control action is reduced compared to that of a fixed setpoint controller. Such a relaxing component might serve as an important property from an evolutionary point of view. Finally, we illustrate the principles using the renal sodium and aldosterone regulatory system, where we model the variation in plasma sodium as a function of salt intake. We show that the experimentally observed variations in plasma sodium can be interpreted as a variable setpoint regulatory system. © 2017 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society.

  10. Cryotherapy, Sensation, and Isometric-Force Variability

    Science.gov (United States)

    Denegar, Craig R.; Buckley, William E.; Newell, Karl M.

    2003-01-01

    Objective: To determine the changes in sensation of pressure, 2-point discrimination, and submaximal isometric-force production variability due to cryotherapy. Design and Setting: Sensation was assessed using a 2 × 2 × 2 × 3 repeated-measures factorial design, with treatment (ice immersion or control), limb (right or left), digit (finger or thumb), and sensation test time (baseline, posttreatment, or postisometric-force trials) as independent variables. Dependent variables were changes in sensation of pressure and 2-point discrimination. Isometric-force variability was tested with a 2 × 2 × 3 repeated-measures factorial design. Treatment condition (ice immersion or control), limb (right or left), and percentage (10, 25, or 40) of maximal voluntary isometric contraction (MVIC) were the independent variables. The dependent variables were the precision or variability (the standard deviation of mean isometric force) and the accuracy or targeting error (the root mean square error) of the isometric force for each percentage of MVIC. Subjects: Fifteen volunteer college students (8 men, 7 women; age = 22 ± 3 years; mass = 72 ± 21.9 kg; height = 183.4 ± 11.6 cm). Measurements: We measured sensation in the distal palmar aspect of the index finger and thumb. Sensation of pressure and 2-point discrimination were measured before treatment (baseline), after treatment (15 minutes of ice immersion or control), and at the completion of isometric testing (final). Variability (standard deviation of mean isometric force) of the submaximal isometric finger forces was measured by having the subjects exert a pinching force with the thumb and index finger for 30 seconds. Subjects performed the pinching task at the 3 submaximal levels of MVIC (10%, 25%, and 40%), with the order of trials assigned randomly. The subjects were given a target representing the submaximal percentage of MVIC and visual feedback of the force produced as they pinched the testing device. The force exerted

  11. Experimental Evaluation of Novel Master-Slave Configurations for Position Control under Random Network Delay and Variable Load for Teleoperation

    Directory of Open Access Journals (Sweden)

    Ahmet Kuzu

    2014-01-01

    This paper proposes two novel master-slave configurations that provide improvements in both the control and communication aspects of teleoperation systems to achieve an overall improved performance in position control. The proposed novel master-slave configurations integrate modular control and communication approaches, consisting of a delay regulator to address problems related to the variable network delay common to such systems, and a model tracking control that runs on the slave side for the compensation of uncertainties and model mismatch on the slave side. One of the configurations uses a sliding mode observer and the other one uses a modified Smith predictor scheme on the master side to ensure position transparency between the master and slave, while reference tracking of the slave is ensured by a proportional-derivative type controller in both configurations. Experiments conducted for the networked position control of a single-link arm under system uncertainties and randomly varying network delays demonstrate significant performance improvements with both configurations over the past literature.

  12. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  13. Financial management of a large multisite randomized clinical trial.

    Science.gov (United States)

    Sheffet, Alice J; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E; Longbottom, Mary E; Howard, Virginia J; Marler, John R; Brott, Thomas G

    2014-08-01

    The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years' funding ($21 112 866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2500 randomized participants at 40 sites. Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Projections of the original grant's fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant's fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2500 targeted sample size, 138 (5·5%) were randomized during the first five years and 1387 (55·5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13 845) of the projected per-patient costs ($152 992) of the fixed model. Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. © 2014 The Authors. International Journal of Stroke © 2014 World Stroke Organization.

  14. Prediction of N2O emission from local information with Random Forest

    International Nuclear Information System (INIS)

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2013-01-01

    Nitrous oxide is a potent greenhouse gas, with a global warming potential 298 times greater than that of CO2. In agricultural soils, N2O emissions are influenced by a large number of environmental characteristics and crop management techniques that are not systematically reported in experiments. Random Forest (RF) is a machine learning method that can handle missing data and ranks input variables on the basis of their importance. We aimed to predict N2O emission on the basis of local information, to rank environmental and crop management variables according to their influence on N2O emission, and to compare the performances of RF with several regression models. RF outperformed the regression models for predictive purposes, and this approach led to the identification of three important input variables: N fertilization, type of crop, and experiment duration. This method could be used in the future for prediction of N2O emissions from local information. -- Highlights: ► Random Forest gave more accurate N2O predictions than regression. ► Missing data were well handled by Random Forest. ► The most important factors were nitrogen rate, type of crop and experiment duration.
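
    For reference, the variable-importance ranking that this style of analysis relies on is exposed directly by scikit-learn's random forest; a generic sketch on synthetic data, not the authors' N2O dataset:

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor

        X, y = make_regression(n_samples=300, n_features=6, n_informative=3,
                               random_state=0)
        rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

        # Rank input variables by impurity-based importance.
        for i in np.argsort(rf.feature_importances_)[::-1]:
            print(f"feature_{i}: {rf.feature_importances_[i]:.3f}")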

  15. Dissociable effects of practice variability on learning motor and timing skills.

    Science.gov (United States)

    Caramiaux, Baptiste; Bevilacqua, Frédéric; Wanderley, Marcelo M; Palmer, Caroline

    2018-01-01

    Motor skill acquisition inherently depends on the way one practices the motor task. The amount of motor task variability during practice has been shown to foster transfer of the learned skill to other similar motor tasks. In addition, variability in a learning schedule, in which a task and its variations are interweaved during practice, has been shown to help the transfer of learning in motor skill acquisition. However, there is little evidence on how motor task variations and variability schedules during practice act on the acquisition of complex motor skills such as music performance, in which a performer learns both the right movements (motor skill) and the right time to perform them (timing skill). This study investigated the impact of rate (tempo) variability and the schedule of tempo change during practice on timing and motor skill acquisition. Complete novices, with no musical training, practiced a simple musical sequence on a piano keyboard at different rates. Each novice was assigned to one of four learning conditions designed to manipulate the amount of tempo variability across trials (large or small tempo set) and the schedule of tempo change (randomized or non-randomized order) during practice. At test, the novices performed the same musical sequence at a familiar tempo and at novel tempi (testing tempo transfer), as well as two novel (but related) sequences at a familiar tempo (testing spatial transfer). We found that practice conditions had little effect on learning and transfer performance of timing skill. Interestingly, practice conditions influenced motor skill learning (reduction of movement variability): lower temporal variability during practice facilitated transfer to new tempi and new sequences; non-randomized learning schedule improved transfer to new tempi and new sequences. Tempo (rate) and the sequence difficulty (spatial manipulation) affected performance variability in both timing and movement. These findings suggest that there is a

  16. [Variability of nuclear 18S-25S rDNA of Gentiana lutea L. in nature and in tissue culture in vitro].

    Science.gov (United States)

    Mel'nyk, V M; Spiridonova, K V; Andrieiev, I O; Strashniuk, N M; Kunakh, V A

    2004-01-01

    The 18S-25S rDNA sequences in the genomes of G. lutea plants from different natural populations and from tissue culture have been studied by the blot-hybridization method. It was shown that the ribosomal repeats are represented by variants that differ in size and in the presence of an additional HindIII restriction site. The genome of an individual plant usually possesses several variants of the rDNA repeats. Interpopulation variability in the quantitative ratio of these variants, and in the presence of particular ones, was shown. Modifications of the range of rDNA repeats not exceeding intraspecific variability were observed in callus tissues in comparison with the plants of the initial population. The non-randomness of genome modifications in the course of cell adaptation to in vitro conditions makes it possible, to some extent, to forecast these modifications in tissue culture.

  17. Nonlinear deterministic structures and the randomness of protein sequences

    CERN Document Server

    Huang Yan Zhao

    2003-01-01

    To clarify the randomness of protein sequences, we make a detailed analysis of a set of typical protein sequences representing each structural class by using a nonlinear prediction method. No deterministic structures are found in these protein sequences, which implies that they behave as random sequences. We also give an explanation for the controversial results obtained in previous investigations.

  18. Cartesian integration of Grassmann variables over invariant functions

    Energy Technology Data Exchange (ETDEWEB)

    Kieburg, Mario; Kohler, Heiner; Guhr, Thomas [Universitaet Duisburg-Essen, Duisburg (Germany)

    2009-07-01

    Supersymmetry plays an important role in field theory as well as in random matrix theory and mesoscopic physics. Anticommuting variables are the fundamental objects of supersymmetry. The integration over these variables is equivalent to the derivative. Recently [arxiv:0809.2674v1 [math-ph] (2008)], we constructed a differential operator which only acts on the ordinary part of the superspace consisting of ordinary and anticommuting variables. This operator is equivalent to the integration over all anticommuting variables of an invariant function. We present this operator and its applications for functions which are rotation invariant under the supergroups U(k₁/k₂) and UOSp(k₁/k₂).
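
    The equivalence between Grassmann integration and differentiation referred to here is the defining property of the Berezin integral: for a single anticommuting variable θ (with θ² = 0, so every function is linear in θ),

        \int \mathrm{d}\theta \,(a + b\theta) \;=\; b \;=\; \frac{\partial}{\partial\theta}\,(a + b\theta),
        \qquad \int \mathrm{d}\theta \, 1 = 0, \qquad \int \mathrm{d}\theta \,\theta = 1 .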

  19. Analytic regularity and collocation approximation for elliptic PDEs with random domain deformations

    KAUST Repository

    Castrillon, Julio

    2016-03-02

    In this work we consider the problem of approximating the statistics of a given Quantity of Interest (QoI) that depends on the solution of a linear elliptic PDE defined over a random domain parameterized by N random variables. The elliptic problem is remapped onto a corresponding PDE with a fixed deterministic domain. We show that the solution can be analytically extended to a well-defined region in $\mathbb{C}^N$ with respect to the random variables. A sparse grid stochastic collocation method is then used to compute the mean and variance of the QoI. Finally, convergence rates for the mean and variance of the QoI are derived and compared to those obtained in numerical experiments.

  20. A message-passing approach to random constraint satisfaction problems with growing domains

    International Nuclear Information System (INIS)

    Zhao, Chunyan; Zheng, Zhiming; Zhou, Haijun; Xu, Ke

    2011-01-01

    Message-passing algorithms based on belief propagation (BP) are implemented on a random constraint satisfaction problem (CSP) referred to as model RB, which is a prototype of hard random CSPs with growing domain size. In model RB, the number of candidate discrete values (the domain size) of each variable increases polynomially with the variable number N of the problem formula. Although the satisfiability threshold of model RB is exactly known, finding solutions for a single problem formula is quite challenging and attempts have been limited to cases of N ∼ 10². In this paper, we propose two different kinds of message-passing algorithms guided by BP for this problem. Numerical simulations demonstrate that these algorithms allow us to find a solution for random formulas of model RB with constraint tightness slightly less than p_cr, the threshold value for the satisfiability phase transition. To evaluate the performance of these algorithms, we also provide a local search algorithm (random walk) as a comparison. Besides this, the simulated time dependence of the problem size N and the entropy of the variables for growing domain size are discussed.

  1. Financial Management of a Large Multi-site Randomized Clinical Trial

    Science.gov (United States)

    Sheffet, Alice J.; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E.; Longbottom, Mary E.; Howard, Virginia J.; Marler, John R.; Brott, Thomas G.

    2014-01-01

    Background The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years’ funding ($21,112,866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2,500 randomized participants at 40 sites. Aims Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Methods Projections of the original grant’s fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant’s fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Results Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2,500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1,387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13,845) of the projected per-patient costs ($152,992) of the fixed model. Conclusions Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. PMID:24661748

  2. SDE based regression for random PDEs

    KAUST Repository

    Bayer, Christian

    2016-01-01

    A simulation based method for the numerical solution of PDE with random coefficients is presented. By the Feynman-Kac formula, the solution can be represented as conditional expectation of a functional of a corresponding stochastic differential equation driven by independent noise. A time discretization of the SDE for a set of points in the domain and a subsequent Monte Carlo regression lead to an approximation of the global solution of the random PDE. We provide an initial error and complexity analysis of the proposed method along with numerical examples illustrating its behaviour.

  3. SDE based regression for random PDEs

    KAUST Repository

    Bayer, Christian

    2016-01-06

    A simulation based method for the numerical solution of PDE with random coefficients is presented. By the Feynman-Kac formula, the solution can be represented as conditional expectation of a functional of a corresponding stochastic differential equation driven by independent noise. A time discretization of the SDE for a set of points in the domain and a subsequent Monte Carlo regression lead to an approximation of the global solution of the random PDE. We provide an initial error and complexity analysis of the proposed method along with numerical examples illustrating its behaviour.
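
    To make the Feynman-Kac step concrete, here is a toy sketch (not the authors' implementation) for the one-dimensional heat equation u_t = ½ u_xx with terminal condition g, whose solution u(t, x) = E[g(x + W_{T-t})] is estimated by simulating the driving Brownian motion with an Euler-Maruyama discretization; a regression over many points x would then assemble a global approximation.

        import numpy as np

        rng = np.random.default_rng(7)

        def feynman_kac(x0, t, T, g, n_paths=100_000, n_steps=100):
            dt = (T - t) / n_steps
            x = np.full(n_paths, x0, dtype=float)
            for _ in range(n_steps):          # Euler-Maruyama for dX = dW
                x += np.sqrt(dt) * rng.normal(size=n_paths)
            return g(x).mean()                # Monte Carlo conditional expectation

        g = lambda x: np.maximum(x, 0.0)      # example terminal condition
        print(feynman_kac(x0=0.2, t=0.0, T=1.0, g=g))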

  4. On a direct algorithm for the generation of log-normal pseudo-random numbers

    CERN Document Server

    Chamayou, J M F

    1976-01-01

    The random variable $\left(\prod_{i=1}^{n} X_i / X_{i+n}\right)^{1/\sqrt{2n}}$ is used to generate standard log-normal variables $\Lambda(0, 1)$, where the $X_i$ are independent uniform variables on $(0, 1)$. (8 refs).
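
    The construction is easy to verify numerically: with 2n independent uniforms, the logarithm of the statistic is a scaled sum of n i.i.d. differences of standard exponentials (mean 0, variance 2 each), so it is approximately N(0, 1) by the central limit theorem. A quick check:

        import numpy as np

        rng = np.random.default_rng(8)

        def lognormal_from_uniforms(n, size):
            x = rng.random(size=(size, 2 * n))            # X_1, ..., X_2n ~ U(0, 1)
            log_ratio = np.log(x[:, :n]) - np.log(x[:, n:])
            return np.exp(log_ratio.sum(axis=1) / np.sqrt(2 * n))

        z = np.log(lognormal_from_uniforms(n=20, size=100_000))
        print(z.mean(), z.var())   # should be close to 0 and 1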

  5. Dietary supplement use and smoking are important correlates of biomarkers of water-soluble vitamin status after adjusting for sociodemographic and lifestyle variables in a representative sample of U.S. adults.

    Science.gov (United States)

    Pfeiffer, Christine M; Sternberg, Maya R; Schleicher, Rosemary L; Rybak, Michael E

    2013-06-01

    Biochemical indicators of water-soluble vitamin (WSV) status were measured in a nationally representative sample of the U.S. population in NHANES 2003-2006. To examine whether demographic differentials in nutritional status were related to and confounded by certain variables, we assessed the association of sociodemographic (age, sex, race-ethnicity, education, income) and lifestyle (dietary supplement use, smoking, alcohol consumption, BMI, physical activity) variables with biomarkers of WSV status in adults (aged ≥ 20 y): serum and RBC folate, serum pyridoxal-5'-phosphate (PLP), serum 4-pyridoxic acid, serum total cobalamin (vitamin B-12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA), and serum ascorbic acid. Age (except for PLP) and smoking (except for MMA) were generally the strongest significant correlates of these biomarkers (|r| ≤ 0.43) and together with supplement use explained more of the variability compared with the other covariates in bivariate analysis. In multiple regression models, sociodemographic and lifestyle variables together explained from 7 (vitamin B-12) to 29% (tHcy) of the biomarker variability. We observed significant associations for most biomarkers (≥ 6 of 8) with age, sex, race-ethnicity, supplement use, smoking, and BMI and for some biomarkers with PIR (5 of 8), education (1 of 8), alcohol consumption (4 of 8), and physical activity (5 of 8). We noted large estimated percentage changes in biomarker concentrations between race-ethnic groups (from -24 to 20%), between supplement users and nonusers (from -12 to 104%), and between smokers and nonsmokers (from -28 to 8%). In summary, age, sex, and race-ethnic differentials in biomarker concentrations remained significant after adjusting for sociodemographic and lifestyle variables. Supplement use and smoking were important correlates of biomarkers of WSV status.

  6. Sources of variability in consonant perception of normal-hearing listeners

    DEFF Research Database (Denmark)

    Zaar, Johannes; Dau, Torsten

    2015-01-01

    Responses obtained in consonant perception experiments typically show a large variability across stimuli of the same phonetic identity. The present study investigated the influence of different potential sources of this response variability. It was distinguished between source-induced variability, referring to perceptual differences caused by acoustical differences in the speech tokens and/or the masking noise tokens, and receiver-related variability, referring to perceptual differences caused by within- and across-listener uncertainty. Consonant-vowel combinations consisting of 15 consonants ... between responses. The speech-induced variability across and within talkers and the across-listener variability were substantial and of similar magnitude. The noise-induced variability, obtained with time-shifted realizations of the same random process, was smaller but significantly larger than the amount ...

  7. Pharmaceutical representatives' beliefs and practices about their professional practice: a study in Sudan.

    Science.gov (United States)

    Idris, K M; Mustafa, A F; Yousif, M A

    2012-08-01

    Pharmaceutical representatives are an important promotional tool for pharmaceutical companies. This cross-sectional, exploratory study aimed to determine pharmaceutical representatives' beliefs and practices about their professional practice in Sudan. A random sample of 160 pharmaceutical representatives was interviewed using a pretested questionnaire. The majority were male (84.4%) and had received training in professional sales skills (86.3%) and about the products being promoted (82.5%). Only 65.6% agreed that they provided full and balanced information about products. Not providing balanced information was attributed by 23.1% to doctors' lack of time. However, 28.1% confessed that they sometimes felt like hiding unfavourable information, 21.9% were sometimes or always inclined to give untrue information to make sales, and 66.9% considered free gifts ethically acceptable. More attention needs to be paid to the dissemination of ethical codes of conduct and to training in the ethics of drug promotion for pharmaceutical representatives in Sudan.

  8. Analytic regularity and collocation approximation for elliptic PDEs with random domain deformations

    KAUST Repository

    Castrillon, Julio; Nobile, Fabio; Tempone, Raul

    2016-01-01

    In this work we consider the problem of approximating the statistics of a given Quantity of Interest (QoI) that depends on the solution of a linear elliptic PDE defined over a random domain parameterized by N random variables. The elliptic problem

  9. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy of estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of the probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
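
    As a quick illustration of the procedure this record describes (each population unit having an equal chance of selection), a minimal Python sketch follows; the sampling frame and sample size are hypothetical placeholders, and random.sample stands in for the random-number-table or Excel RAND() step.

```python
import random

# Hypothetical sampling frame of 1000 population units
population = [f"unit_{i:04d}" for i in range(1, 1001)]

random.seed(2010)  # for reproducibility only
sample = random.sample(population, k=100)  # every unit equally likely to be chosen
print(sample[:5])
```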

  10. Properties and simulation of α-permanental random fields

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    An α-permanental random field is briefly speaking a model for a collection of random variables with positive associations, where α is a positive number and the probability generating function is given in terms of a covariance or more general function so that density and moment expressions are given...... by certain α-permanents. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of  α-permanental random fields and their potential applications. The purpose of this paper is first to summarize useful probabilistic results using the simplest possible setting......, and second to study stochastic constructions and simulation techniques, which should provide a useful basis for discussing the statistical aspects in future work. The paper also discusses some examples of  α-permanental random fields....

  11. Control of variable speed variable pitch wind turbine based on a disturbance observer

    Science.gov (United States)

    Ren, Haijun; Lei, Xin

    2017-11-01

    In this paper, a novel sliding mode controller based on a disturbance observer (DOB) is developed and analyzed to optimize the efficiency of a variable speed variable pitch (VSVP) wind turbine. Because the VSVP system is highly nonlinear, the model is linearized to obtain a state-space model of the system. Then, a conventional sliding mode controller is designed and a DOB is added to estimate wind speed. The proposed control strategy can successfully deal with the random nature of wind speed, the nonlinearity of the VSVP system, parameter uncertainty and external disturbance. Adding the observer to the sliding mode controller greatly reduces the chattering produced by the sliding mode switching gain. The simulation results show that the proposed control system is effective and robust.

  12. A Generalized Random Regret Minimization Model

    NARCIS (Netherlands)

    Chorus, C.G.

    2013-01-01

    This paper presents, discusses and tests a generalized Random Regret Minimization (G-RRM) model. The G-RRM model is created by replacing a fixed constant in the attribute-specific regret functions of the RRM model by a regret-weight variable. Depending on the value of the regret-weights, the G-RRM ...

  13. Intergenerational Transmission of Marital Violence: Results From a Nationally Representative Sample of Men.

    Science.gov (United States)

    Murshid, Nadine Shaanta; Murshid, Navine

    2015-09-16

    The present study assesses the association between childhood exposure to parental violence and perpetration of marital violence as adults among a representative sample of 3,396 men in Bangladesh. We used secondary analysis of survey data from the nationally representative Bangladesh Demographic and Health Survey 2007 to examine factors associated with perpetration of marital violence among 3,396 ever-married men between the ages of 16 and 50 years. The outcome measure, marital violence perpetration, was measured using a modified Conflict Tactics Scale, and predictor variables included childhood exposure to parental violence, justification of marital violence, marital duration, religion, and demographic variables. Results indicate that marital violence perpetration is significantly associated with childhood exposure to marital violence, suggesting a cycle of violence that is maintained across generations. Implications for policy and practice are discussed. © The Author(s) 2015.

  14. A Composite Likelihood Inference in Latent Variable Models for Ordinal Longitudinal Responses

    Science.gov (United States)

    Vasdekis, Vassilis G. S.; Cagnone, Silvia; Moustaki, Irini

    2012-01-01

    The paper proposes a composite likelihood estimation approach that uses bivariate instead of multivariate marginal probabilities for ordinal longitudinal responses using a latent variable model. The model considers time-dependent latent variables and item-specific random effects to be accountable for the interdependencies of the multivariate…

  15. Assessing the use of cognitive heuristic representativeness in clinical reasoning.

    Science.gov (United States)

    Payne, Velma L; Crowley, Rebecca S; Crowley, Rebecca

    2008-11-06

    We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors.

  16. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
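
    The workflow in this record (a Random Forest trained on light-curve features, validated by 10-fold cross-validation, then applied to unknown sources for probabilistic classification) can be sketched as follows; the feature matrix and labels below are random placeholders for the real 2XMMi-DR2 feature table, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_train = rng.normal(size=(873, 12))    # placeholder: time-series/spectral features
y_train = rng.integers(0, 7, size=873)  # placeholder: 7 variability classes

clf = RandomForestClassifier(n_estimators=500, random_state=0)
print(cross_val_score(clf, X_train, y_train, cv=10).mean())  # 10-fold CV accuracy

clf.fit(X_train, y_train)
X_unknown = rng.normal(size=(411, 12))  # placeholder: unclassified sources
proba = clf.predict_proba(X_unknown)    # probabilistic class memberships
# classification margin: gap between the top two class probabilities
margin = np.sort(proba, axis=1)[:, -1] - np.sort(proba, axis=1)[:, -2]
```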

  17. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  18. Genetic variability of Artemisia capillaris (Wormwood capillary) by ...

    African Journals Online (AJOL)

    The genetic variability among individuals of Artemisia capillaris from the state of Terengganu, Malaysia was examined using the random amplified polymorphic DNA (RAPD) technique. The samples were collected from different regions of Terengganu State. Genomic DNA was extracted from the leaves of the sampled plants.

  19. Effects of variable transformations on errors in FORM results

    International Nuclear Information System (INIS)

    Qin Quan; Lin Daojin; Mei Gang; Chen Hao

    2006-01-01

    On the basis of studies on the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results and shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the design point locations in the standard normal space. The transformations of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors

  20. Arbitrary-step randomly delayed robust filter with application to boost phase tracking

    Science.gov (United States)

    Qin, Wutao; Wang, Xiaogang; Bai, Yuliang; Cui, Naigang

    2018-04-01

    Conventional filters such as the extended Kalman filter, unscented Kalman filter and cubature Kalman filter assume that the measurement is available in real time and that the measurement noise is Gaussian white noise. In practice, both assumptions can be invalid. To solve this problem, a novel algorithm is proposed in four steps. First, the measurement model is modified with Bernoulli random variables to describe the random delay. Then, the expressions for the predicted measurement and covariance are reformulated, which removes the restriction that the maximum delay must be one or two steps and the assumption that the Bernoulli random variables take the value one with equal probabilities. Next, the arbitrary-step randomly delayed high-degree cubature Kalman filter is derived based on the 5th-degree spherical-radial rule and the reformulated expressions. Finally, this filter is modified into the arbitrary-step randomly delayed high-degree cubature Huber-based filter using the Huber technique, which is essentially an M-estimator. The proposed filter is therefore robust not only to randomly delayed measurements but also to glint noise. The application to a boost phase tracking example demonstrates the superiority of the proposed algorithms.
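
    A small simulation helps to picture the measurement model in the first step: the filter receives a measurement whose age is random, driven by Bernoulli draws. The construction below is one plausible realization under stated assumptions; the paper's exact formulation of the delay probabilities is not reproduced here.

```python
import numpy as np

def randomly_delayed(meas, p, max_delay, rng):
    """At step k the filter receives meas[k - d_k]: the random delay d_k is
    built from independent Bernoulli(p) draws and capped at max_delay
    (and at k itself near the start of the sequence)."""
    z = np.empty_like(meas)
    for k in range(len(meas)):
        d = 0
        while d < min(max_delay, k) and rng.random() < p:
            d += 1
        z[k] = meas[k - d]
    return z

rng = np.random.default_rng(1)
truth = np.sin(0.1 * np.arange(200))
meas = truth + rng.normal(scale=0.05, size=200)  # Gaussian component of the noise
z = randomly_delayed(meas, p=0.3, max_delay=3, rng=rng)
```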

  1. Representing major soil variability at regional scale by constrained Latin Hypercube Sampling of remote sensing data

    NARCIS (Netherlands)

    Mulder, V.L.; Bruin, de S.; Schaepman, M.E.

    2013-01-01

    This paper presents a sparse, remote sensing-based sampling approach making use of conditioned Latin Hypercube Sampling (cLHS) to assess variability in soil properties at regional scale. The method optimizes the sampling scheme for a defined spatial population based on selected covariates, which are

  2. Spatiotemporal variability in wildfire patterns and analysis of the main drivers in Honduras using GIS and MODIS data

    Science.gov (United States)

    Valdez Vasquez, M. C.; Chen, C. F.

    2017-12-01

    Wildfires are unrestrained fires in areas of flammable vegetation and are among the most frequent disasters in Honduras during the dry season. During this period, anthropogenic activity combined with harsh climatic conditions, dry vegetation and topographical variables causes a large number of wildfires. For this reason, there is a need to identify the drivers of wildfires and the variations in wildfire susceptibility during the fire season. In this study, we combined the wildfire points for the 2010-2016 period, at 8-day intervals, with a series of variables using the random forest (RF) algorithm. In addition to the wildfire points, we randomly generated a similar number of background points to use as pseudo-absence data. To represent the human imprint, we included proximity to different types of roads, trails, settlements and agriculture sites. Other variables included are the Moderate Resolution Imaging Spectroradiometer (MODIS)-derived 8-day composites of land surface temperature (LST) and the normalized multi-band drought index (NMDI), derived from the MODIS surface reflectance data. We also included monthly average precipitation, solar radiation, and topographical variables. The exploratory analysis of the variables reveals that low precipitation combined with low NMDI and accessibility to non-paved roads were the major drivers of wildfires during the early months of the dry season. During April, which is the peak of the dry season, the relevant explanatory variables also included elevation and LST in addition to the proximity to paved and non-paved roads. During May, proximity to crops becomes relevant, in addition to the aforesaid variables. The average estimated area with high and very high wildfire susceptibility was 22% of the whole territory, located mainly in the central and eastern regions and drifting towards the northeast during May. We validated the results using the area under the receiver operating characteristic (ROC) curve (AUC

  3. Generating Realistic Labelled, Weighted Random Graphs

    Directory of Open Access Journals (Sweden)

    Michael Charles Davis

    2015-12-01

    Full Text Available Generative algorithms for random graphs have yielded insights into the structure and evolution of real-world networks. Most networks exhibit a well-known set of properties, such as heavy-tailed degree distributions, clustering and community formation. Usually, random graph models consider only structural information, but many real-world networks also have labelled vertices and weighted edges. In this paper, we present a generative model for random graphs with discrete vertex labels and numeric edge weights. The weights are represented as a set of Beta Mixture Models (BMMs with an arbitrary number of mixtures, which are learned from real-world networks. We propose a Bayesian Variational Inference (VI approach, which yields an accurate estimation while keeping computation times tractable. We compare our approach to state-of-the-art random labelled graph generators and an earlier approach based on Gaussian Mixture Models (GMMs. Our results allow us to draw conclusions about the contribution of vertex labels and edge weights to graph structure.

  4. Machine learning search for variable stars

    Science.gov (United States)

    Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis

    2018-04-01

    Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
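
    For readers unfamiliar with variability indices, the sketch below computes two features of the kind fed to such classifiers: a scatter index (reduced chi-squared about the weighted mean) and a simple point-to-point correlation index (lag-1 autocorrelation of residuals). These are generic textbook definitions, not necessarily among the 18 features used in the paper.

```python
import numpy as np

def variability_features(mag, err):
    """Two generic variability indices for a light curve (mag +/- err)."""
    w = 1.0 / err**2
    mbar = np.average(mag, weights=w)                          # weighted mean magnitude
    chi2_red = np.sum(w * (mag - mbar) ** 2) / (len(mag) - 1)  # scatter index
    r = mag - mbar
    lag1 = np.sum(r[:-1] * r[1:]) / np.sum(r**2)               # correlation index
    return chi2_red, lag1

rng = np.random.default_rng(5)
mag = 15.0 + 0.3 * np.sin(np.linspace(0, 20, 200)) + rng.normal(0, 0.05, 200)
err = np.full(200, 0.05)
print(variability_features(mag, err))
```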

  5. Genetic variability of cultivated cowpea in Benin assessed by random amplified polymorphic DNA

    NARCIS (Netherlands)

    Zannou, A.; Kossou, D.K.; Ahanchédé, A.; Zoundjihékpon, J.; Agbicodo, E.; Struik, P.C.; Sanni, A.

    2008-01-01

    Characterization of genetic diversity among cultivated cowpea [Vigna unguiculata (L.) Walp.] varieties is important to optimize the use of available genetic resources by farmers, local communities, researchers and breeders. Random amplified polymorphic DNA (RAPD) markers were used to evaluate the

  6. The 'emergent scaling' phenomenon and the dielectric properties of random resistor-capacitor networks

    CERN Document Server

    Bouamrane, R

    2003-01-01

    An efficient algorithm, based on the Frank-Lobb reduction scheme, for calculating the equivalent dielectric properties of very large random resistor-capacitor (R-C) networks has been developed. It has been used to investigate the network size and composition dependence of dielectric properties and their statistical variability. The dielectric properties of 256 samples of random networks containing: 512, 2048, 8192 and 32 768 components distributed randomly in the ratios 60% R-40% C, 50% R-50% C and 40% R-60% C have been computed. It has been found that these properties exhibit the anomalous power law dependences on frequency known as the 'universal dielectric response' (UDR). Attention is drawn to the contrast between frequency ranges across which percolation determines dielectric response, where considerable variability is found amongst the samples, and those across which power laws define response where very little variability is found between samples. It is concluded that the power law UDRs are emergent pr...

  7. An affordable proxy of representative national survey on radon concentration in dwellings: Design, organisation and preliminary evaluation of representativeness

    International Nuclear Information System (INIS)

    Antignani, Sara; Carelli, Vinicio; Cordedda, Carlo; Zonno, Fedele; Ampollini, Marco; Carpentieri, Carmela; Venoso, Gennaro; Bochicchio, Francesco

    2013-01-01

    Representative national surveys in dwellings are important to evaluate without bias the exposure of the general population to radon. In Italy, a representative national survey was conducted from 1989 to 1996, which involved about 5600 dwellings in 232 towns. Later on, some Regions carried out more detailed surveys, but a new national survey in dwellings is necessary in order to obtain a more thorough estimate of the radon concentration distribution over the Italian territory. The need to conduct this survey in an affordable way led to a new approach based on collaboration between the Istituto Superiore di Sanità and a national company with workplaces and employees' homes throughout the country. The intent is to carry out a proxy of a population-representative survey by measuring radon concentration in the homes of a random sample of the company employees. The realisation of this survey was affordable thanks to the availability of corporate e-mail for each employee, an intranet service, and the company's internal mail service. A dedicated web procedure and e-questionnaires made it possible to automatically manage contact with employees and to collect their data, which was both cost- and time-saving. Using this e-mail contact approach, 53% of contacted employees consented to participate in the survey. Passive radon concentration measuring devices were distributed to about 7000 dwellings, using about 14000 CR-39 detectors (two measured rooms per dwelling). In order to reduce costs, the devices were exposed for 12 months instead of two consecutive 6-month periods (as in the former national survey). A first check of the actual representativeness of the sample was performed by comparing the characteristics of dwellings and occupants in the sample with the corresponding data from the latest National Census. This was possible because the questions in the survey questionnaire were tailored to the categories adopted for the Census questionnaire. A preliminary

  8. A Comparison of the Prognostic Value of Early PSA Test-Based Variables Following External Beam Radiotherapy, With or Without Preceding Androgen Deprivation: Analysis of Data From the TROG 96.01 Randomized Trial

    International Nuclear Information System (INIS)

    Lamb, David S.; Denham, James W.; Joseph, David; Matthews, John; Atkinson, Chris; Spry, Nigel A.; Duchesne, Gillian; Ebert, Martin; Steigler, Allison; Delahunt, Brett; D'Este, Catherine

    2011-01-01

    Purpose: We sought to compare the prognostic value of early prostate-specific antigen (PSA) test-based variables for the 802 eligible patients treated in the Trans-Tasman Radiation Oncology Group 96.01 randomized trial. Methods and Materials: Patients in this trial had T2b, T2c, T3, and T4 N0 prostate cancer and were randomized to 0, 3, or 6 months of neoadjuvant androgen deprivation therapy (NADT) prior to and during radiation treatment at 66 Gy to the prostate and seminal vesicles. The early PSA test-based variables evaluated were the pretreatment initial PSA (iPSA) value, PSA values at 2 and 4 months into NADT, the PSA nadir (nPSA) value after radiation in all patients, and PSA response signatures in men receiving radiation. Comparisons of endpoints were made using Cox models of local progression-free survival, distant failure-free survival, biochemical failure-free survival, and prostate cancer-specific survival. Results: The nPSA value was a powerful predictor of all endpoints regardless of whether NADT was given before radiation. PSA response signatures also predicted all endpoints in men treated by radiation alone. iPSA and PSA results at 2 and 4 months into NADT predicted biochemical failure-free survival but not any of the clinical endpoints. nPSA values correlated with those of iPSA, Gleason grade, and T stage and were significantly higher in men receiving radiation alone than in those receiving NADT. Conclusions: The postradiation nPSA value is the strongest prognostic indicator of all early PSA-based variables. However, its use as a surrogate endpoint needs to take into account its dependence on pretreatment variables and treatment method.

  9. Blood Pressure Variability and Cognitive Function Among Older African Americans: Introducing a New Blood Pressure Variability Measure.

    Science.gov (United States)

    Tsang, Siny; Sperling, Scott A; Park, Moon Ho; Helenius, Ira M; Williams, Ishan C; Manning, Carol

    2017-09-01

    Although blood pressure (BP) variability has been reported to be associated with cognitive impairment, whether this relationship affects African Americans has been unclear. We sought correlations between systolic and diastolic BP variability and cognitive function in community-dwelling older African Americans, and introduced a new BP variability measure that can be applied to BP data collected in clinical practice. We assessed cognitive function in 94 cognitively normal older African Americans using the Mini-Mental State Examination (MMSE) and the Computer Assessment of Mild Cognitive Impairment (CAMCI). We used BP measurements taken at the patients' three most recent primary care clinic visits to generate three traditional BP variability indices, range, standard deviation, and coefficient of variation, plus a new index, random slope, which accounts for unequal BP measurement intervals within and across patients. MMSE scores did not correlate with any of the BP variability indices. Patients with greater diastolic BP variability were less accurate on the CAMCI verbal memory and incidental memory tasks. Results were similar across the four BP variability indices. In a sample of cognitively intact older African American adults, BP variability did not correlate with global cognitive function, as measured by the MMSE. However, higher diastolic BP variability correlated with poorer verbal and incidental memory. By accounting for differences in BP measurement intervals, our new BP variability index may help alert primary care physicians to patients at particular risk for cognitive decline.
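
    One plausible implementation of the indices named in this record is sketched below: the three traditional per-patient indices, plus a "random slope" computed as the patient-specific slope on time from a linear mixed-effects model, which naturally accommodates unequal measurement intervals. The data frame is synthetic and the exact model specification used in the study is an assumption.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({                           # synthetic: 3 clinic visits per patient
    "patient": np.repeat(np.arange(30), 3),
    "days": rng.uniform(0, 365, size=90),     # unequal intervals between visits
    "sbp": rng.normal(135, 12, size=90),      # systolic BP at each visit
})

# Traditional per-patient variability indices
trad = df.groupby("patient")["sbp"].agg(
    bp_range=lambda s: s.max() - s.min(),
    bp_sd="std",
    bp_cv=lambda s: s.std() / s.mean(),
)

# "Random slope" index: patient-specific slope on time from a mixed model
fit = smf.mixedlm("sbp ~ days", df, groups=df["patient"], re_formula="~days").fit()
random_slopes = {g: re["days"] for g, re in fit.random_effects.items()}
```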

  10. Return probabilities for the reflected random walk on N_0

    NARCIS (Netherlands)

    Essifi, R.; Peigné, M.

    2015-01-01

    Let \((Y_n)\) be a sequence of i.i.d. \(\mathbb{Z}\)-valued random variables with law \(\mu\). The reflected random walk \((X_n)\) is defined recursively by \(X_0 = x \in \mathbb{N}_0\), \(X_{n+1} = \vert X_n + Y_{n+1} \vert\). Under mild hypotheses on the law \(\mu\), it is proved that, for any \(y \in\) ...
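
    A direct simulation of the walk defined in this record, under an illustrative choice of increment law \(\mu\) (the theorem only requires mild hypotheses on \(\mu\)):

```python
import numpy as np

def reflected_walk(x0, n_steps, rng):
    """Simulate X_{k+1} = |X_k + Y_{k+1}| with i.i.d. integer increments Y_k.
    The increment law mu here is an illustrative lazy +/-1 distribution."""
    y = rng.choice([-1, 0, 1], size=n_steps, p=[0.4, 0.2, 0.4])
    path = np.empty(n_steps + 1, dtype=int)
    path[0] = x0
    for k, inc in enumerate(y):
        path[k + 1] = abs(path[k] + inc)
    return path

rng = np.random.default_rng(0)
path = reflected_walk(x0=5, n_steps=10_000, rng=rng)
return_times = np.flatnonzero(path == path[0])  # visits to the starting state
```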

  11. A randomized trial of high-dairy-protein, variable-carbohydrate diets and exercise on body composition in adults with obesity.

    Science.gov (United States)

    Parr, Evelyn B; Coffey, Vernon G; Cato, Louise E; Phillips, Stuart M; Burke, Louise M; Hawley, John A

    2016-05-01

    This study determined the effects of 16-week high-dairy-protein, variable-carbohydrate (CHO) diets and exercise training (EXT) on body composition in men and women with overweight/obesity. One hundred and eleven participants (age 47 ± 6 years, body mass 90.9 ± 11.7 kg, BMI 33 ± 4 kg/m(2) , values mean ± SD) were randomly stratified to diets with either: high dairy protein, moderate CHO (40% CHO: 30% protein: 30% fat; ∼4 dairy servings); high dairy protein, high CHO (55%: 30%: 15%; ∼4 dairy servings); or control (55%: 15%: 30%; ∼1 dairy serving). Energy restriction (500 kcal/day) was achieved through diet (∼250 kcal/day) and EXT (∼250 kcal/day). Body composition was measured using dual-energy X-ray absorptiometry before, midway, and upon completion of the intervention. Eighty-nine (25 M/64 F) of 115 participants completed the 16-week intervention, losing 7.7 ± 3.2 kg fat mass (P exercise stimulus. © 2016 The Obesity Society.

  12. Exploratory Spectroscopy of Magnetic Cataclysmic Variables Candidates and Other Variable Objects

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, A. S.; Palhares, M. S. [IP and D, Universidade do Vale do Paraíba, 12244-000, São José dos Campos, SP (Brazil); Rodrigues, C. V.; Cieslinski, D.; Jablonski, F. J. [Divisão de Astrofísica, Instituto Nacional de Pesquisas Espaciais, 12227-010, São José dos Campos, SP (Brazil); Silva, K. M. G. [Gemini Observatory, Casilla 603, La Serena (Chile); Almeida, L. A. [Instituto de Astronomia, Geofísica e Ciências Atmosféricas, Universidade de São Paulo, 05508-900, São Paulo, SP (Brazil); Rodríguez-Ardila, A., E-mail: alexandre@univap.br [Laboratório Nacional de Astrofísica LNA/MCTI, 37504-364, Itajubá MG (Brazil)

    2017-04-01

    The increasing number of synoptic surveys made by small robotic telescopes, such as the photometric Catalina Real-Time Transient Survey (CRTS), provides a unique opportunity to discover variable sources and improves the statistical samples of such classes of objects. Our goal is the discovery of magnetic Cataclysmic Variables (mCVs). These are rare objects that probe interesting accretion scenarios controlled by the white-dwarf magnetic field. In particular, improved statistics of mCVs would help to address open questions on their formation and evolution. We performed an optical spectroscopy survey to search for signatures of magnetic accretion in 45 variable objects selected mostly from the CRTS. In this sample, we found 32 CVs, 22 being mCV candidates, 13 of which were previously unreported as such. If the proposed classifications are confirmed, it would represent an increase of 4% in the number of known polars and 12% in the number of known IPs. A fraction of our initial sample was classified as extragalactic sources or other types of variable stars by the inspection of the identification spectra. Despite the inherent complexity in identifying a source as an mCV, variability-based selection, followed by spectroscopic snapshot observations, has proved to be an efficient strategy for their discoveries, being a relatively inexpensive approach in terms of telescope time.

  13. Temperatures and heating energy in New Zealand houses from a nationally representative study - HEEP

    Energy Technology Data Exchange (ETDEWEB)

    French, L.J.; Camilleri, M.J.; Isaacs, N.P.; Pollard, A.R. [BRANZ Ltd., Private Bag 50 908, Porirua City (New Zealand)

    2007-07-15

    The household energy end-use project (HEEP) has collected energy and temperature data from a randomly selected, nationally representative sample of about 400 houses throughout New Zealand. This database has been used to explore the drivers of indoor temperatures and heating energy. Initial analysis of the winter living room temperatures shows that heating type, climate and house age are the key drivers. On average, houses heated by solid fuel are the warmest, with houses heated by portable LPG and electric heaters the coldest. Over the three winter months, living rooms are below 20°C for 83% of the time - and the living room is typically the warmest room. Central heating is in only 5% of houses. Solid fuel is the dominant heating fuel in houses. The lack of air conditioning means that summer temperatures are affected by passive influences (e.g. house design, construction). Summer temperatures are strongly influenced by the house age and the local climate - together these variables explain 69% of the variation in daytime (9 a.m. to 5 p.m.) living room temperatures. In both summer and winter newer (post-1978) houses are warmer - this is beneficial in winter, but the high temperatures in summer are potentially uncomfortable. (author)

  14. Balancing treatment allocations by clinician or center in randomized trials allows unacceptable levels of treatment prediction.

    Science.gov (United States)

    Hills, Robert K; Gray, Richard; Wheatley, Keith

    2009-08-01

    Randomized controlled trials are the standard method for comparing treatments because they avoid the selection bias that might arise if clinicians were free to choose which treatment a patient would receive. In practice, allocation of treatments in randomized controlled trials is often not wholly random with various 'pseudo-randomization' methods, such as minimization or balanced blocks, used to ensure good balance between treatments within potentially important prognostic or predictive subgroups. These methods avoid selection bias so long as full concealment of the next treatment allocation is maintained. There is concern, however, that pseudo-random methods may allow clinicians to predict future treatment allocations from previous allocation history, particularly if allocations are balanced by clinician or center. We investigate here to what extent treatment prediction is possible. Using computer simulations of minimization and balanced block randomizations, the success rates of various prediction strategies were investigated for varying numbers of stratification variables, including the patient's clinician. Prediction rates for minimization and balanced block randomization typically exceed 60% when clinician is included as a stratification variable and, under certain circumstances, can exceed 80%. Increasing the number of clinicians and other stratification variables did not greatly reduce the prediction rates. Without clinician as a stratification variable, prediction rates are poor unless few clinicians participate. Prediction rates are unacceptably high when allocations are balanced by clinician or by center. This could easily lead to selection bias that might suggest spurious, or mask real, treatment effects. Unless treatment is blinded, randomization should not be balanced by clinician (or by center), and clinician-center effects should be allowed for instead by retrospectively stratified analyses.
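
    The kind of simulation this record describes can be reproduced in miniature: allocate patients by minimization on a single stratification factor (clinician) and let a "predictor" who tracks the allocation history always guess the currently under-allocated arm. The stratum counts and the deterministic-allocation probability are illustrative choices, not the paper's exact settings.

```python
import numpy as np

def prediction_rate(n_patients, n_clinicians, p_deterministic=1.0, seed=7):
    """Minimization on one factor (clinician): each patient goes to the arm
    that reduces imbalance within their clinician's stratum; a predictor
    following the same history guesses that arm too."""
    rng = np.random.default_rng(seed)
    counts = np.zeros((n_clinicians, 2), dtype=int)  # per-clinician arm counts
    correct = 0
    for _ in range(n_patients):
        c = rng.integers(n_clinicians)
        if counts[c, 0] == counts[c, 1]:
            guess, arm = rng.integers(2), rng.integers(2)  # balanced: coin flips
        else:
            guess = int(counts[c, 0] > counts[c, 1])       # under-allocated arm
            arm = guess if rng.random() < p_deterministic else 1 - guess
        correct += guess == arm
        counts[c, arm] += 1
    return correct / n_patients

print(prediction_rate(2000, n_clinicians=10))  # typically well above 50%
```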

  15. On the number of subgraphs of the Barabasi-Albert random graph

    Energy Technology Data Exchange (ETDEWEB)

    Ryabchenko, Aleksandr A; Samosvat, Egor A [Moscow Institute of Physics and Technology (State University), Dolgoprudnyi, Moscow Region (Russian Federation)

    2012-06-30

    We study a model of a random graph of the type of the Barabasi-Albert preferential attachment model. We develop a technique that makes it possible to estimate the mathematical expectation for a fairly wide class of random variables in the model under consideration. We use this technique to prove a theorem on the asymptotics of the mathematical expectation of the number of subgraphs isomorphic to a certain fixed graph in the random graphs of this model.
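
    Expectations like these can be checked empirically by Monte Carlo over preferential-attachment graphs; the sketch below estimates the expected triangle count with networkx's Barabasi-Albert generator, whose construction may differ in details from the model variant studied in the paper.

```python
import networkx as nx
import numpy as np

triangle_counts = []
for seed in range(20):                    # 20 Monte Carlo replicates
    g = nx.barabasi_albert_graph(n=2000, m=3, seed=seed)
    # nx.triangles counts per node; each triangle is counted at 3 nodes
    triangle_counts.append(sum(nx.triangles(g).values()) // 3)

print(np.mean(triangle_counts))           # estimate of E[#triangles]
```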

  16. A discrete stress-strength interference model based on universal generating function

    International Nuclear Information System (INIS)

    An Zongwen; Huang Hongzhong; Liu Yu

    2008-01-01

    Continuous stress-strength interference (SSI) model regards stress and strength as continuous random variables with known probability density function. This, to some extent, results in a limitation of its application. In this paper, stress and strength are treated as discrete random variables, and a discrete SSI model is presented by using the universal generating function (UGF) method. Finally, case studies demonstrate the validity of the discrete model in a variety of circumstances, in which stress and strength can be represented by continuous random variables, discrete random variables, or two groups of experimental data
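
    The discrete SSI computation reduces to composing the two u-functions and collecting the probability mass on the event strength > stress. A minimal sketch, with made-up PMFs standing in for discretized stress and strength data:

```python
from itertools import product

def reliability(strength_pmf, stress_pmf):
    """Discrete stress-strength interference: compose the two u-functions
    (PMFs as {value: probability}) and sum probability where strength > stress."""
    return sum(p_s * p_t
               for (s, p_s), (t, p_t) in product(strength_pmf.items(),
                                                 stress_pmf.items())
               if s > t)

strength = {9: 0.2, 10: 0.5, 11: 0.3}   # hypothetical discretized strength
stress = {8: 0.6, 10: 0.3, 12: 0.1}     # hypothetical discretized stress
print(reliability(strength, stress))    # -> 0.69
```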

  17. A cluster expansion approach to exponential random graph models

    International Nuclear Information System (INIS)

    Yin, Mei

    2012-01-01

    The exponential family of random graphs is among the most widely studied network models. We show that any exponential random graph model may alternatively be viewed as a lattice gas model with a finite Banach space norm. The system may then be treated using cluster expansion methods from statistical mechanics. In particular, we derive a convergent power series expansion for the limiting free energy in the case of small parameters. Since the free energy is the generating function for the expectations of other random variables, this characterizes the structure and behavior of the limiting network in this parameter region

  18. A Review of Spectral Methods for Variable Amplitude Fatigue Prediction and New Results

    Science.gov (United States)

    Larsen, Curtis E.; Irvine, Tom

    2013-01-01

    A comprehensive review of the available methods for estimating fatigue damage from variable amplitude loading is presented. The dependence of fatigue damage accumulation on the power spectral density (PSD) is investigated for random processes relevant to real structures such as in offshore or aerospace applications. Beginning with the Rayleigh (or narrow-band) approximation, attempts at improved approximations or corrections to the Rayleigh approximation are examined by comparison to rainflow analysis of time histories simulated from PSD functions representative of simple theoretical and real-world applications. Spectral methods investigated include corrections by Wirsching and Light, Ortiz and Chen, the Dirlik formula, and the Single-Moment method, among other more recently proposed methods. Good agreement is obtained between the spectral methods and the time-domain rainflow identification for most cases, with some limitations. Guidelines are given for using the several spectral methods to increase confidence in the damage estimate.
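
    As a reference point for the methods surveyed, the narrow-band (Rayleigh) damage estimate can be computed directly from spectral moments of the stress PSD. The sketch below assumes an S-N curve of the form N*S**b = A; the PSD and material constants are hypothetical.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import gamma

def rayleigh_damage_rate(f, psd, b, A):
    """Narrow-band fatigue damage rate from a one-sided stress PSD.
    m_n = integral of f**n * PSD df; nu0 = sqrt(m2/m0) is the zero up-crossing
    rate; Rayleigh-distributed amplitudes give E[S**b] in closed form."""
    m0 = trapezoid(psd, f)
    m2 = trapezoid(f**2 * psd, f)
    nu0 = np.sqrt(m2 / m0)
    return nu0 * (np.sqrt(2.0 * m0)) ** b * gamma(1.0 + b / 2.0) / A

f = np.linspace(1.0, 50.0, 500)             # Hz
psd = np.exp(-((f - 20.0) ** 2) / 10.0)     # hypothetical narrow-band PSD
print(rayleigh_damage_rate(f, psd, b=3.0, A=1e12))
```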

  19. Simulations of Chemotaxis and Random Motility in Finite Domains

    National Research Council Canada - National Science Library

    Jabbarzadeh, Ehsan; Abrams, Cameron F

    2005-01-01

    .... The model couples fully time-dependent finite-difference solution of a reaction-diffusion equation for the concentration field of a generic chemoattractant to biased random walks representing individual moving cells...

  20. Genetic variability of Amorphophallus muelleri Blume in Java based on Random Amplified Polymorphic DNA

    Directory of Open Access Journals (Sweden)

    DIYAH MARTANTI

    2008-10-01

    Full Text Available Amorphophallus muelleri Blume (Araceae) is valued for its glucomannan content for use in the food industry (healthy diet food), paper industry, pharmacy and cosmetics. The species is triploid (2n=3x=39) and the seed develops apomictically. The present research aimed to identify the genetic variability of six populations of A. muelleri from Java (consisting of 50 accessions) using random amplified polymorphic DNA (RAPD). The six populations of the species are, in East Java: (1) Silo-Jember, (2) Saradan-Madiun, (3) IPB (cultivated, from Saradan-Madiun), (4) Panti-Jember, (5) Probolinggo; and in Central Java: (6) Cilacap. The results showed that five RAPD primers generated 42 scorable bands, of which 29 (69.05%) were polymorphic. Band sizes varied from 300 bp to 1.5 kbp. The 50 accessions of A. muelleri were divided into two main clusters; some of them were grouped based on their populations, and some others were not. The range of individual genetic dissimilarity was from 0.02 to 0.36. Among the six populations investigated, the Saradan population showed the highest levels of genetic variation, with mean values of na = 1.500 ± 0.5061, ne = 1.3174 ± 0.3841, PLP = 50% and He = 0.1832 ± 0.2054, whereas the Silo-Jember population showed the lowest levels of genetic variation, with mean values na = 1.2619 ± 0.4450, ne = 1.1890 ± 0.3507, PLP = 26.19% and He = 0.1048 ± 0.1887. Efforts to conserve, domesticate, cultivate and genetically improve the species should be based on the genetic properties of each population and of individuals within populations; the Saradan population, which has the highest level of genetic variation, needs particular attention for its conservation.

  1. Random recurrence equations and ruin in a Markov-dependent stochastic economic environment

    DEFF Research Database (Denmark)

    Collamore, Jeffrey F.

    2009-01-01

    ... series models. Our results build upon work of Goldie, who has developed tail asymptotics applicable for independent sequences of random variables subject to a random recurrence equation. In contrast, we adopt a general approach based on the theory of Harris recurrent Markov chains and the associated ...

  2. Pseudo-random number generator for the Sigma 5 computer

    Science.gov (United States)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
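
    The generator described follows the classic Lehmer (multiplicative congruential) pattern. The record does not give the Sigma 5 constants, so the sketch below uses the well-known Park-Miller pair (modulus 2**31 - 1, a Mersenne prime, with primitive root 16807 = 7**5) as an illustrative stand-in.

```python
M = 2**31 - 1   # large prime modulus (Mersenne prime)
A = 16807       # primitive root modulo M

def lehmer(seed, n):
    """Multiplicative congruential generator: x <- A*x mod M."""
    x = seed
    out = []
    for _ in range(n):
        x = (A * x) % M
        out.append(x / M)   # uniform variate in (0, 1)
    return out

print(lehmer(seed=12345, n=3))
```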

  3. Time-variant random interval natural frequency analysis of structures

    Science.gov (United States)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the superiority of both methods in a way that dramatically reduces the computational cost. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under concrete intrinsic creep effects with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the embedded optimization strategy within the analysis procedure. Three numerical examples with a progressive relationship in terms of both structure type and uncertainty variables are demonstrated to justify the computational applicability, accuracy and efficiency of the proposed method.

  4. Uninformative variable elimination assisted by Gram-Schmidt Orthogonalization/successive projection algorithm for descriptor selection in QSAR

    DEFF Research Database (Denmark)

    Omidikia, Nematollah; Kompany-Zareh, Mohsen

    2013-01-01

    Employment of Uninformative Variable Elimination (UVE) as a robust variable selection method is reported in this study. Each regression coefficient represents the contribution of the corresponding variable in the established model, but in the presence of uninformative variables as well as collinearity, the reliability of the regression coefficient's magnitude is suspect. Successive Projection Algorithm (SPA) and Gram-Schmidt Orthogonalization (GSO) were implemented as pre-selection techniques for removing collinearity and redundancy among the variables in the model. Uninformative variable elimination ...

  5. Nonlinear Characteristics of Randomly Excited Transonic Flutter

    DEFF Research Database (Denmark)

    Christiansen, Lasse Engbo; Lehn-Schiøler, Tue; Mosekilde, Erik

    2002-01-01

    The paper describes the effects of random external excitations on the onset and dynamical characteristics of transonic flutter (i.e. large-amplitude, self-sustained oscillations) for a high aspect ratio wing. Wind tunnel experiments performed at the National Aerospace Laboratory (NAL) in Japan have ... When this model is extended by the introduction of nonlinear terms, it can reproduce the subcritical Hopf bifurcation. We hereafter consider the effects of subjecting simplified versions of the model to random external excitations representing the fluctuations present in the airflow. These models can ...

  6. Classification and prediction of port variables

    Energy Technology Data Exchange (ETDEWEB)

    Molina Serrano, B.

    2016-07-01

    Many variables are included in the planning and management of port terminals. They can be economic, social, environmental and institutional. Agents need to know the relationships between these variables in order to modify planning conditions. The use of Bayesian networks allows for classifying, predicting and diagnosing these variables. Bayesian networks allow for estimating the subsequent probability of unknown variables, based on known variables. At the planning level, this means that it is not necessary to know all variables because their relationships are known. The agent can obtain useful information about how port variables are connected, which can be interpreted as cause-effect relationships. Bayesian networks can be used to make optimal decisions by introducing possible actions and the utility of their results. In the proposed methodology, a database has been generated with more than 40 port variables. They have been classified into economic, social, environmental and institutional variables, in the same way that smart port studies in the Spanish Port System do. From this database, a network has been generated using a directed acyclic graph, which allows for identifying port variable relationships (parent-child relationships). The obtained network shows that economic variables are, in cause-effect terms, the cause of the other variable typologies. Economic variables play the parent role in most cases. Moreover, when environmental variables are known, the obtained network allows for estimating the subsequent probability of social variables. It is concluded that Bayesian networks allow for modeling uncertainty in a probabilistic way, even when the number of variables is high, as occurs in the planning and management of port terminals. (Author)

  7. Critical Behavior of the Annealed Ising Model on Random Regular Graphs

    Science.gov (United States)

    Can, Van Hao

    2017-11-01

    In Giardinà et al. (ALEA Lat Am J Probab Math Stat 13(1):121-161, 2016), the authors have defined an annealed Ising model on random graphs and proved limit theorems for the magnetization of this model on some random graphs including random 2-regular graphs. Then in Can (Annealed limit theorems for the Ising model on random regular graphs, arXiv:1701.08639, 2017), we generalized their results to the class of all random regular graphs. In this paper, we study the critical behavior of this model. In particular, we determine the critical exponents and prove a non-standard limit theorem stating that the magnetization scaled by n^{3/4} converges to a specific random variable, where n is the number of vertices of the random regular graphs.

  8. Multiple Imputation of Predictor Variables Using Generalized Additive Models

    NARCIS (Netherlands)

    de Jong, Roel; van Buuren, Stef; Spiess, Martin

    2016-01-01

    The sensitivity of multiple imputation methods to deviations from their distributional assumptions is investigated using simulations, where the parameters of scientific interest are the coefficients of a linear regression model, and values in predictor variables are missing at random. The ...

  9. Randomized central limit theorems: A unified theory.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  10. On grey levels in random CAPTCHA generation

    Science.gov (United States)

    Newton, Fraser; Kouritzin, Michael A.

    2011-06-01

    A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.

  11. Ergodicity for the Randomly Forced 2D Navier-Stokes Equations

    International Nuclear Information System (INIS)

    Kuksin, Sergei; Shirikyan, Armen

    2001-01-01

    We study space-periodic 2D Navier-Stokes equations perturbed by an unbounded random kick-force. It is assumed that Fourier coefficients of the kicks are independent random variables all of whose moments are bounded and that the distributions of the first N_0 coefficients (where N_0 is a sufficiently large integer) have positive densities against the Lebesgue measure. We treat the equation as a random dynamical system in the space of square integrable divergence-free vector fields. We prove that this dynamical system has a unique stationary measure and study its ergodic properties

  12. Interaction of random wave-current over uneven and porous bottoms

    International Nuclear Information System (INIS)

    Suo Yaohong; Zhang Zhonghua; Zhang Jiafan; Suo Xiaohong

    2009-01-01

    Starting from linear wave theory, applying Green's second identity, and considering wave-current interaction over porous bottoms and variable water depth, a comprehensive mild-slope equation model for wave-current interaction is developed. Then, to account for the effect of random waves, a model of the interaction between random waves and currents over uneven and porous bottoms is established using the method of Kubo et al. Finally, the characteristics of the random waves are discussed numerically from both the geometric-optics approximation and the target spectrum.

  13. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Faith; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the…

  14. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    Science.gov (United States)

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that estimates missing values and then applies variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset, based on the ordering of the data, as the research dataset. The proposed time-series forecasting model has three main foci. First, this study uses five imputation methods instead of directly deleting records with missing values. Second, we identify the key variables via factor analysis and then delete the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, for comparison with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model applied with variable selection has better forecasting performance than the listed models with full variables. In addition, the experiments show that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.
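
    The three foci map naturally onto a three-stage pipeline: impute, select variables, then fit a Random Forest. The sketch below uses simple scikit-learn analogues (mean imputation and univariate selection in place of the paper's five imputation methods and factor analysis); the data are synthetic placeholders.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(2000, 8)),
                 columns=[f"met_{i}" for i in range(8)])  # daily atmospheric data
X.iloc[rng.integers(0, 2000, 100), 2] = np.nan            # some missing values
y = 2.0 * X["met_0"] + rng.normal(size=2000)              # placeholder water level

model = make_pipeline(
    SimpleImputer(strategy="mean"),                           # 1: imputation
    SelectKBest(f_regression, k=5),                           # 2: variable selection
    RandomForestRegressor(n_estimators=300, random_state=0),  # 3: forecasting
)
model.fit(X, y)
print(model.score(X, y))  # in-sample fit; real use would hold out test data
```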

  15. Temporal changes in randomness of bird communities across Central Europe.

    Science.gov (United States)

    Renner, Swen C; Gossner, Martin M; Kahl, Tiemo; Kalko, Elisabeth K V; Weisser, Wolfgang W; Fischer, Markus; Allan, Eric

    2014-01-01

    Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high proportion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.
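
    The nugget estimation step lends itself to a short sketch: regress pairwise dissimilarity on distance and read off the intercept. The data below are synthetic and the linear fit is a simplification of whatever curve the authors fitted.

```python
# Sketch: estimate the 'nugget' (y-intercept) of a distance-decay curve.
# `dissim` holds pairwise compositional dissimilarities between patches,
# `dist` the corresponding spatial distances (synthetic data here).
import numpy as np

rng = np.random.default_rng(1)
dist = rng.uniform(0, 50, 500)                          # km between patch pairs
dissim = 0.6 + 0.005 * dist + rng.normal(0, 0.05, 500)  # toy relationship

# Linear fit; the intercept approximates dissimilarity at zero distance,
# interpreted in the paper as the degree of randomness in composition.
slope, intercept = np.polyfit(dist, dissim, 1)
print(f"nugget (randomness estimate): {intercept:.2f}")
```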

  16. Temporal changes in randomness of bird communities across Central Europe.

    Directory of Open Access Journals (Sweden)

    Swen C Renner

    Full Text Available Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high proportion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.

  17. (Non-) Gibbsianness and Phase Transitions in Random Lattice Spin Models

    NARCIS (Netherlands)

    Külske, C.

    1999-01-01

    We consider disordered lattice spin models with finite-volume Gibbs measures µ_Λ[η](dσ). Here σ denotes a lattice spin variable and η a lattice random variable with product distribution P describing the quenched disorder of the model. We ask: when will the joint measures lim_{Λ↑Z^d} P(dη)µ_Λ[η](dσ) be Gibbsian?

  18. Appraisal and Reliability of Variable Engagement Model Prediction ...

    African Journals Online (AJOL)

    The variable engagement model, which is based on the stress versus crack-opening-displacement relationship and describes the behaviour of randomly oriented steel-fibre composites subjected to uniaxial tension, has been evaluated so as to determine the safety indices associated with fibre pullout and with ...

  19. Stochastic process corrosion growth models for pipeline reliability

    International Nuclear Information System (INIS)

    Bazán, Felipe Alexander Vargas; Beck, André Teófilo

    2013-01-01

    Highlights: •Novel non-linear stochastic process corrosion growth model is proposed. •Corrosion rate modeled as random Poisson pulses. •Time to corrosion initiation and inherent time-variability properly represented. •Continuous corrosion growth histories obtained. •Model is shown to precisely fit actual corrosion data at two time points. -- Abstract: Linear random variable corrosion models are extensively employed in reliability analysis of pipelines. However, linear models grossly neglect well-known characteristics of the corrosion process. Herein, a non-linear model is proposed, where the corrosion rate is represented as a Poisson square wave process. The resulting model represents the inherent time-variability of corrosion growth, produces continuous growth and leads to mean growth at a less-than-one power of time. Different corrosion models are adjusted to the same set of actual corrosion data for two inspections. The proposed non-linear random process corrosion growth model leads to the best fit to the data, while better representing the problem physics.
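
    A minimal sketch of the central idea: corrosion rate as a Poisson square wave whose integral gives continuous depth growth. All parameter values and the exponential rate distribution are illustrative assumptions, not the paper's calibration.

```python
# Sketch: corrosion depth as the integral of a Poisson square wave rate.
# Jump times follow a Poisson process; between jumps the rate holds a
# random value. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
T, lam = 30.0, 0.5            # horizon (years), jump intensity (1/year)
t, depth, rate = 0.0, 0.0, rng.exponential(0.1)   # rate in mm/year
while t < T:
    dt = min(rng.exponential(1.0 / lam), T - t)   # time to next jump
    depth += rate * dt                            # continuous growth
    rate = rng.exponential(0.1)                   # new rate after jump
    t += dt
print(f"simulated corrosion depth after {T:.0f} years: {depth:.2f} mm")
```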

  20. Towards Representative Metallurgical Sampling and Gold Recovery Testwork Programmes

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    Full Text Available When developing a process flowsheet, the risks in achieving positive financial outcomes are minimised by ensuring representative metallurgical samples and high-quality testwork. The quality and type of samples used are as important as the testwork itself. The key characteristic required of any set of samples is that they represent a given domain and quantify its variability. There are those who think that stating a sample(s) is representative makes it representative, without justification. There is a need to consider both (1) in-situ and (2) testwork sub-sample representativity. Early ore/waste characterisation and domain definition are required, so that sampling and testwork protocols can be designed to suit the style of mineralisation in question. The Theory of Sampling (TOS) provides an insight into the causes and magnitude of errors that may occur during the sampling of particulate materials (e.g., broken rock) and is wholly applicable to metallurgical sampling. Quality assurance/quality control (QAQC) is critical throughout all programmes. Metallurgical sampling and testwork should be fully integrated into geometallurgical studies. Traditional metallurgical testwork is critical for plant design and is an inherent part of geometallurgy. In a geometallurgical study, multiple spatially distributed small-scale tests are used as proxies for process parameters. These will be validated against traditional testwork results. This paper focusses on sampling and testwork for gold recovery determination. It aims to provide the reader with the background to move towards the design, implementation and reporting of representative and fit-for-purpose sampling and testwork programmes. While the paper does not intend to provide a definitive commentary, it critically assesses the hard-rock sampling methods used and their optimal collection and preparation. The need for representative sampling and quality testwork to avoid financial and intangible losses is emphasised.

  1. Smooth conditional distribution function and quantiles under random censorship.

    Science.gov (United States)

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
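
    For orientation, a sketch of the Beran (1981) kernel-weighted Kaplan-Meier estimator that the proposed smooth estimators are compared against; the Gaussian kernel, bandwidth and synthetic data are assumptions for illustration.

```python
# Sketch: the Beran (1981) kernel-weighted Kaplan-Meier estimator of the
# conditional distribution F(y|x) under right censoring. Gaussian
# weights localise the estimate in the covariate x.
import numpy as np

def beran_cdf(x0, y0, X, Y, delta, h=0.5):
    """F(y0 | x0): X covariate, Y observed time, delta=1 if uncensored."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    w /= w.sum()
    order = np.argsort(Y)
    Y, delta, w = Y[order], delta[order], w[order]
    surv = 1.0
    for i in range(len(Y)):
        if Y[i] > y0:
            break
        at_risk = w[i:].sum()                 # weighted risk set at Y[i]
        if delta[i] and at_risk > 0:
            surv *= 1.0 - w[i] / at_risk      # product-limit step
    return 1.0 - surv

rng = np.random.default_rng(10)
X = rng.uniform(0, 1, 200)
T = rng.exponential(1 + X)                    # true lifetimes depend on X
C = rng.exponential(2.0, 200)                 # censoring times
Y, delta = np.minimum(T, C), (T <= C).astype(int)
print("F(1.0 | x=0.5) ~", round(beran_cdf(0.5, 1.0, X, Y, delta), 3))
```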

  2. Random amplified polymorphic DNA (RAPD) markers reveal genetic ...

    African Journals Online (AJOL)

    The present study evaluated genetic variability of superior bael genotypes collected from different parts of Andaman Islands, India using fruit characters and random amplified polymorphic DNA (RAPD) markers. Genomic DNA extracted from leaf material using cetyl trimethyl ammonium bromide (CTAB) method was ...

  3. Infinite conditional random fields for human behavior analysis

    NARCIS (Netherlands)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF

  4. Transform Domain Robust Variable Step Size Griffiths' Adaptive Algorithm for Noise Cancellation in ECG

    Science.gov (United States)

    Hegde, Veena; Deekshit, Ravishankar; Satyanarayana, P. S.

    2011-12-01

    The electrocardiogram (ECG) is widely used for diagnosis of heart diseases. Good quality ECG is utilized by physicians for interpretation and identification of physiological and pathological phenomena. However, in real situations, ECG recordings are often corrupted by artifacts or noise. Noise severely limits the utility of the recorded ECG and thus needs to be removed for better clinical evaluation. In the present paper a new noise cancellation technique is proposed for removal of random noise, such as muscle artifact, from the ECG signal. A transform-domain robust variable step size Griffiths' LMS algorithm (TVGLMS) is proposed for noise cancellation. For the TVGLMS, the robust variable step size has been achieved by using the Griffiths' gradient, which uses the cross-correlation between the desired signal contaminated with observation or random noise and the input. The algorithm is discrete cosine transform (DCT) based and uses the symmetric property of the signal to represent the signal in the frequency domain with fewer frequency coefficients than the discrete Fourier transform (DFT). The algorithm is implemented for an adaptive line enhancer (ALE) filter which extracts the ECG signal in a noisy environment using LMS filter adaptation. The proposed algorithm is found to have better convergence error/misadjustment than the ordinary transform-domain LMS (TLMS) algorithm, both in the presence of white and colored observation noise. The reduction in convergence error achieved by the new algorithm with desired-signal decomposition is found to be lower than that obtained without decomposition. The experimental results indicate that the proposed method is better than the traditional adaptive filter using the LMS algorithm in retaining the geometrical characteristics of the ECG signal.
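
    A rough sketch of the Griffiths-type gradient at the heart of the method: the update uses an estimate of the cross-correlation vector between the desired signal and the input instead of the raw error term. The DCT-domain processing and the robust variable step size of the actual TVGLMS algorithm are omitted; the running cross-correlation estimate and all constants are illustrative assumptions.

```python
# Sketch of a Griffiths-type LMS update (time domain, fixed step size).
# p_hat below is a running estimate of the cross-correlation E[d * x]
# between the noisy desired signal d and the reference input x.
import numpy as np

def griffiths_lms(x, d, order=8, mu=0.01):
    """Adaptive line enhancer: x = reference input, d = noisy signal."""
    w = np.zeros(order)
    y = np.zeros(len(d))
    p_hat = np.zeros(order)
    for n in range(order, len(d)):
        xn = x[n - order:n][::-1]             # most recent sample first
        p_hat = 0.99 * p_hat + 0.01 * d[n] * xn
        y[n] = w @ xn
        # Griffiths' gradient: p_hat replaces the raw error term d*x
        w = w + mu * (p_hat - xn * (xn @ w))
    return y

# toy usage: a sinusoidal "ECG-like" component buried in white noise
t = np.arange(2000) / 500.0
clean = np.sin(2 * np.pi * 1.2 * t)
noisy = clean + 0.5 * np.random.default_rng(2).normal(size=t.size)
enhanced = griffiths_lms(np.roll(noisy, 1), noisy)
print("corr(enhanced, clean):", np.corrcoef(enhanced[100:], clean[100:])[0, 1])
```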

  5. Psychotherapy integration under scrutiny: investigating the impact of integrating emotion-focused components into a CBT-based approach: a study protocol of a randomized controlled trial.

    Science.gov (United States)

    Babl, Anna; Grosse Holtforth, Martin; Heer, Sara; Lin, Mu; Stähli, Annabarbara; Holstein, Dominique; Belz, Martina; Egenolf, Yvonne; Frischknecht, Eveline; Ramseyer, Fabian; Regli, Daniel; Schmied, Emma; Flückiger, Christoph; Brodbeck, Jeannette; Berger, Thomas; Caspar, Franz

    2016-11-24

    This currently recruiting randomized controlled trial investigates the effects of integrating components of Emotion-Focused Therapy (EFT) into Psychological Therapy (PT), an integrative form of cognitive-behavioral therapy, in a manner that directly mirrors common integrative practice in the sense of assimilative integration. Aims of the study are to understand how both the existing therapy approach and the elements to be integrated are affected by the integration, and to clarify the role of emotional processing as a mediator of therapy outcome. A total of 130 adults with a diagnosed unipolar depressive, anxiety or adjustment disorder (seeking treatment at a psychotherapy outpatient clinic) are randomized to either treatment as usual (PT) with integrated emotion-focused components (TAU + EFT) or PT (TAU). Primary outcome variables are psychopathology and symptom severity at the end of therapy and at follow-up; secondary outcome variables are interpersonal problems, psychological wellbeing, quality of life, attainment of individual therapy goals, and emotional competency. Furthermore, process variables such as the quality of the therapeutic relationship are studied, as well as aptitude-treatment interactions. Variables are assessed at baseline, after 8 and 16 sessions, at the end of therapy (after 25 ± 3 sessions), and at 6-, 12- and 36-month follow-up. Underlying mechanisms of change are investigated. Statistical analyses will be conducted using the appropriate multilevel approaches, mainly two-level regression and growth analysis. The results of this study will indicate whether the integration of emotion-focused elements into treatment as usual increases the effectiveness of Psychological Therapy. If advantages are found, which may be limited to particular variables or subgroups of patients, recommendations for a systematic integration, and caveats if disadvantages are also detected, can be formulated. On a more abstract level, a cognitive

  6. Variable-bias coin tossing

    International Nuclear Information System (INIS)

    Colbeck, Roger; Kent, Adrian

    2006-01-01

    Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT

  7. Variable-bias coin tossing

    Science.gov (United States)

    Colbeck, Roger; Kent, Adrian

    2006-03-01

    Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT.

  8. Colloquium: Random matrices and chaos in nuclear spectra

    International Nuclear Information System (INIS)

    Papenbrock, T.; Weidenmueller, H. A.

    2007-01-01

    Chaos occurs in quantum systems if the statistical properties of the eigenvalue spectrum coincide with predictions of random-matrix theory. Chaos is a typical feature of atomic nuclei and other self-bound Fermi systems. How can the existence of chaos be reconciled with the known dynamical features of spherical nuclei? Such nuclei are described by the shell model (a mean-field theory) plus a residual interaction. The question is answered using a statistical approach (the two-body random ensemble): The matrix elements of the residual interaction are taken to be random variables. Chaos is shown to be a generic feature of the ensemble and some of its properties are displayed, emphasizing those which differ from standard random-matrix theory. In particular, the existence of correlations among spectra carrying different quantum numbers is demonstrated. These are subject to experimental verification.
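
    The random-matrix signature of chaos mentioned above can be illustrated with a short simulation: sample a GOE matrix and look at nearest-neighbour level spacings, which show the characteristic repulsion of the Wigner surmise. The crude mean-spacing unfolding below is an assumption for brevity.

```python
# Sketch: nearest-neighbour spacing statistics of a GOE random matrix,
# the random-matrix signature of quantum chaos discussed above.
import numpy as np

rng = np.random.default_rng(3)
N = 1000
A = rng.normal(size=(N, N))
H = (A + A.T) / 2                         # Gaussian Orthogonal Ensemble
eigs = np.linalg.eigvalsh(H)
s = np.diff(eigs[N // 4: 3 * N // 4])     # spacings from the bulk
s /= s.mean()                             # crude unfolding to unit mean
# Wigner surmise predicts P(s) = (pi/2) s exp(-pi s^2 / 4), i.e. few
# very small spacings ("level repulsion"), unlike Poissonian spectra.
print("fraction of spacings < 0.1:", (s < 0.1).mean())
```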

  9. A Bayesian Analysis of a Random Effects Small Business Loan Credit Scoring Model

    Directory of Open Access Journals (Sweden)

    Patrick J. Farrell

    2011-09-01

    Full Text Available One of the most important aspects of credit scoring is constructing a model that has low misclassification rates and is also flexible enough to allow for random variation. It is also well known that, when there are a large number of highly correlated variables, as is typical in studies involving questionnaire data, a method must be found to reduce the number of variables to those that have high predictive power. Here we propose a Bayesian multivariate logistic regression model with both fixed and random effects for small business loan credit scoring and a variable reduction method using Bayes factors. The method is illustrated on an interesting data set based on questionnaires sent to loan officers in Canadian banks and venture capital companies.

  10. Uniqueness conditions for finitely dependent random fields

    International Nuclear Information System (INIS)

    Dobrushin, R.L.; Pecherski, E.A.

    1981-01-01

    The authors consider a random field for which uniqueness holds, together with some additional conditions guaranteeing that the correlations between the variables of the field decrease rapidly enough with the distance between the values of the parameter. The main result of the paper states that in such a case uniqueness also holds for any other field with transition probabilities sufficiently close to those of the original field. They then apply this result to some 'degenerate' classes of random fields for which one can check this correlation-decay condition, and thus obtain some new conditions for uniqueness. (Auth.)

  11. Changes in Southern Hemisphere circulation variability in climate change modelling experiments

    International Nuclear Information System (INIS)

    Grainger, Simon; Frederiksen, Carsten; Zheng, Xiaogu

    2007-01-01

    Full text: The seasonal mean of a climate variable can be considered as a statistical random variable, consisting of signal and noise components (Madden 1976). The noise component consists of internal intraseasonal variability, and is not predictable on time-scales of a season or more ahead. The signal consists of slowly varying external and internal variability, and is potentially predictable on seasonal time-scales. The method of Zheng and Frederiksen (2004) has been applied to monthly time series of 500hPa geopotential height from models submitted to the Coupled Model Intercomparison Project (CMIP3) experiment to obtain covariance matrices of the intraseasonal and slow components of covariability for summer and winter. The Empirical Orthogonal Functions (EOFs) of the intraseasonal and slow covariance matrices for the second half of the 20th century are compared with those observed by Frederiksen and Zheng (2007). The leading EOF in summer and winter for both the intraseasonal and slow components of covariability is the Southern Annular Mode (see, e.g., Kiladis and Mo 1998). This is generally reproduced by the CMIP3 models, although with different variance amounts. The observed secondary intraseasonal covariability modes of wave-4 patterns in summer and wave-3 or blocking in winter are also generally seen in the models, although the actual spatial pattern differs. For the slow covariability, the models are less successful in reproducing the two observed ENSO modes, with generally only one of them being represented among the leading EOFs. However, most models reproduce the observed South Pacific wave pattern. The intraseasonal and slow covariance matrices of 500hPa geopotential height under three climate change scenarios are also analysed and compared with those found for the second half of the 20th century. Through aggregating the results from a number of CMIP3 models, a consensus estimate of the changes in Southern Hemisphere variability, and their
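
    The EOF decomposition underlying this analysis can be sketched as an SVD of the anomaly field; the synthetic (time, space) array stands in for the 500 hPa height data, and the separation into intraseasonal and slow covariance matrices is not reproduced.

```python
# Sketch: EOFs of a monthly 500 hPa height field via SVD of anomalies.
# `field` is a (time, space) array; a real analysis would first split
# the data into intraseasonal and slow components as in the paper.
import numpy as np

rng = np.random.default_rng(4)
field = rng.normal(size=(600, 144))        # synthetic: 50 years x 144 points
anom = field - field.mean(axis=0)          # remove the climatology
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt                                  # rows are spatial patterns (EOFs)
explained = S**2 / (S**2).sum()
print("variance explained by leading EOF:", round(explained[0], 3))
```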

  12. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Dhruba Das

    2015-04-01

    Full Text Available In this article, based on Zadeh's extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, following Baruah's Randomness-Fuzziness Consistency Principle. This principle leads to defining a normal law of fuzziness using two different laws of randomness. Two fuzzy queues, FM/M/1 and M/FM/1, are studied and the membership functions of their system characteristics constructed on the basis of the aforesaid principle. The former represents a queue with fuzzy exponential arrivals and an exponential service rate, while the latter represents a queue with an exponential arrival rate and a fuzzy exponential service rate.
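
    The parametric programming (alpha-cut) construction can be sketched for a single performance measure: at each alpha level, the cut of the fuzzy arrival rate is propagated through the crisp queueing formula. The triangular fuzzy number and all rates below are illustrative assumptions.

```python
# Sketch: alpha-cut (parametric programming) construction of the
# membership function of the mean waiting-related measure
# W = 1/(mu - lam) for an FM/M/1 queue with a triangular fuzzy
# arrival rate (a, b, c). Values are illustrative, with c < mu.
import numpy as np

a, b, c, mu = 3.0, 4.0, 5.0, 6.0

for alpha in np.linspace(0.0, 1.0, 5):
    lam_lo = a + alpha * (b - a)           # alpha-cut of the fuzzy rate
    lam_hi = c - alpha * (c - b)
    # W is increasing in lam, so the cut endpoints map directly
    w_lo, w_hi = 1.0 / (mu - lam_lo), 1.0 / (mu - lam_hi)
    print(f"alpha={alpha:.2f}: W in [{w_lo:.3f}, {w_hi:.3f}]")
```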

  13. Using Environmental Variables for Studying of the Quality of Sampling in Soil Mapping

    Directory of Open Access Journals (Sweden)

    A. Jafari

    2016-02-01

    profiles, which were then described, sampled, analyzed and classified according to the USDA soil classification system (16). The basic rationale is to set up a hypercube, the axes of which are the quantiles of rasters of environmental covariates, e.g., a digital elevation model. Sampling evaluation was made using the HELS algorithm. This algorithm was written based on the study of Carre et al., 2007 (3) and run in R. Results and Discussion: The covariate dataset is represented by elevation, slope and wetness index (Table 2). All data layers were interpolated to a common grid of 30 m resolution. The size of the raster layer is 421 by 711 grid cells. Each of the three covariates is divided into four quantiles (Table 2). The hypercube character space has 4³, i.e. 64 strata (Figure 5). The average number of grid cells within each stratum is therefore 4677 grid cells. The map of the covariate index (Figure 6) shows some patterns representative of the covariate variability. The values of the covariate index range between 0.0045 and 5.95. This means that some strata are very dense compared to others. This index allows us to explain whether a high or low relative weight of the sampling units (see below) is due to soil sampling or covariate density. The strata with the highest density are in the areas with high geomorphological diversity. This means that geomorphological processes can cause the diversity and variability, which is in line with the geomorphology map (Figure 2). Of the 64 strata, 30.4% represent under-sampling, 60.2% represent adequate sampling and 9.4% represent over-sampling. Regarding the covariate index, most of the under-sampling appears at a high covariate index, where soil covariates are highly variable; it is in fact difficult to collect field samples in these highly variable areas (Figure 7). Also, most of the over-sampling was observed in areas with a low covariate index (Figure 7). We calculated the weights of all the sampling units and show the results in Figure 8.
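
    The hypercube bookkeeping behind this evaluation is easy to sketch: quantile-bin each covariate, form the 4³ = 64 strata, and compare each stratum's share of samples with its share of grid cells. Everything below is synthetic, and the HELS weighting itself is simplified to a density ratio.

```python
# Sketch: hypercube stratification check in the spirit of the HELS idea.
# Divide each covariate into quartiles, form 4**3 = 64 strata, and compare
# the sample count per stratum with the stratum's areal density.
import numpy as np

rng = np.random.default_rng(5)
n_cells, n_samples = 300_000, 120
cov = rng.normal(size=(n_cells, 3))               # elevation, slope, wetness
samp = rng.choice(n_cells, n_samples, replace=False)

# assign every grid cell to one of the 64 hypercube strata
quart = np.array([np.searchsorted(np.quantile(cov[:, j], [.25, .5, .75]),
                                  cov[:, j]) for j in range(3)]).T
stratum = quart[:, 0] * 16 + quart[:, 1] * 4 + quart[:, 2]

area = np.bincount(stratum, minlength=64) / n_cells
taken = np.bincount(stratum[samp], minlength=64) / n_samples
ratio = np.divide(taken, area, out=np.zeros(64), where=area > 0)
print("under-sampled strata (ratio < 1):", (ratio < 1).sum())
```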

  14. Analysis of Modal Travel Time Variability Due to Mesoscale Ocean Structure

    National Research Council Canada - National Science Library

    Smith, Amy

    1997-01-01

    .... First, for an open ocean environment away from strong boundary currents, the effects of randomly phased linear baroclinic Rossby waves on acoustic travel time are shown to produce a variable overall...

  15. Dietary supplement use and smoking are important correlates of biomarkers of water-soluble vitamin status after adjusting for sociodemographic and lifestyle variables in a representative sample of US adults

    Science.gov (United States)

    Pfeiffer, Christine M.; Sternberg, Maya R.; Schleicher, Rosemary L.; Rybak, Michael E.

    2016-01-01

    Biochemical indicators of water-soluble vitamin (WSV) status have been measured in a nationally representative sample of the US population in NHANES 2003–2006. To examine whether demographic differentials in nutritional status were related to and confounded by certain variables, we assessed the association of sociodemographic (age, sex, race-ethnicity, education, income) and lifestyle variables (dietary supplement use, smoking, alcohol consumption, BMI, physical activity) with biomarkers of WSV status in adults (≥20 y): serum and RBC folate, serum pyridoxal-5′-phosphate (PLP), serum 4-pyridoxic acid, serum total cobalamin (B-12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA), and serum ascorbic acid. Age (except for PLP) and smoking (except for MMA) were generally the strongest significant correlates of these biomarkers (|r| ≤0.43) and together with supplement use explained more of the variability as compared to the other covariates in bivariate analysis. In multiple regression models, sociodemographic and lifestyle variables together explained from 7% (B-12) to 29% (tHcy) of the biomarker variability. We observed significant associations for most biomarkers (≥6 out of 8) with age, sex, race-ethnicity, supplement use, smoking, and BMI; and for some biomarkers with PIR (5/8), education (1/8), alcohol consumption (4/8), and physical activity (5/8). We noted large estimated percent changes in biomarker concentrations between race-ethnic groups (from −24% to 20%), between supplement users and nonusers (from −12% to 104%), and between smokers and nonsmokers (from −28% to 8%). In summary, age, sex, and race-ethnic differentials in biomarker concentrations remained significant after adjusting for sociodemographic and lifestyle variables. Supplement use and smoking were important correlates of biomarkers of WSV status. PMID:23576641

  16. Behavioral neurocardiac training in hypertension: a randomized, controlled trial.

    Science.gov (United States)

    Nolan, Robert P; Floras, John S; Harvey, Paula J; Kamath, Markad V; Picton, Peter E; Chessex, Caroline; Hiscock, Natalie; Powell, Jonathan; Catt, Michael; Hendrickx, Hilde; Talbot, Duncan; Chen, Maggie H

    2010-04-01

    It is not established whether behavioral interventions add benefit to pharmacological therapy for hypertension. We hypothesized that behavioral neurocardiac training (BNT) with heart rate variability biofeedback would reduce blood pressure further by modifying vagal heart rate modulation during reactivity and recovery from standardized cognitive tasks ("mental stress"). This randomized, controlled trial enrolled 65 patients with uncomplicated hypertension to BNT or active control (autogenic relaxation), with six 1-hour sessions over 2 months with home practice. Outcomes were analyzed with linear mixed models that adjusted for antihypertensive drugs. BNT reduced daytime and 24-hour systolic blood pressures (-2.4+/-0.9 mm Hg, P=0.009, and -2.1+/-0.9 mm Hg, P=0.03, respectively) and pulse pressures (-1.7+/-0.6 mm Hg, P=0.004, and -1.4+/-0.6 mm Hg, P=0.02, respectively). No effect was observed for controls (P>0.10 for all indices). BNT also increased RR-high-frequency power (0.15 to 0.40 Hz; P=0.01) and RR interval (P0.10). In contrast to relaxation therapy, BNT with heart rate variability biofeedback modestly lowers ambulatory blood pressure during wakefulness, and it augments tonic vagal heart rate modulation. It is unknown whether efficacy of this treatment can be improved with biofeedback of baroreflex gain. BNT, alone or as an adjunct to drug therapy, may represent a promising new intervention for hypertension.

  17. Random-Resistor-Random-Temperature Kirchhoff-Law-Johnson-Noise (RRRT-KLJN) Key Exchange

    Directory of Open Access Journals (Sweden)

    Kish Laszlo B.

    2016-03-01

    Full Text Available We introduce two new Kirchhoff-law-Johnson-noise (KLJN) secure key distribution schemes which are generalizations of the original KLJN scheme. The first of these, the Random-Resistor (RR-KLJN) scheme, uses random resistors with values chosen from a quasi-continuum set. It has been well known since the creation of the KLJN concept that such a system could work in cryptography, because Alice and Bob can calculate the unknown resistance value from measurements, but the RR-KLJN system has not been addressed in prior publications since it was considered impractical. The reason for discussing it now is the second scheme, the Random-Resistor Random-Temperature (RRRT-KLJN) key exchange, inspired by a recent paper of Vadai, Mingesz and Gingl, wherein security was shown to be maintained at non-zero power flow. In the RRRT-KLJN secure key exchange scheme, both the resistances and their temperatures are continuum random variables. We prove that the security of the RRRT-KLJN scheme can prevail at a non-zero power flow, and thus the physical law guaranteeing security is not the Second Law of Thermodynamics but the Fluctuation-Dissipation Theorem. Alice and Bob know their own resistances and temperatures and can calculate the resistance and temperature values at the other end of the communication channel from measured voltage, current and power-flow data in the wire. However, Eve cannot determine these values because, for her, there are four unknown quantities while she can set up only three equations. The RRRT-KLJN scheme has several advantages and makes all former attacks on the KLJN scheme invalid or incomplete.

  18. What proportion of people who try one cigarette become daily smokers? A meta analysis of representative surveys.

    Science.gov (United States)

    Birge, Max; Duffy, Stephen; Miler, Joanna Astrid; Hajek, Peter

    2017-11-04

    The 'conversion rate' from initial experimentation to daily smoking is a potentially important metric of smoking behavior, but estimates of it based on current representative data are lacking. The Global Health Data Exchange was searched for representative surveys conducted in English speaking, developed countries after year 2000 that included questions about ever trying a cigarette and ever smoking daily. The initial search identified 2776 surveys that were further screened for language, location, year, sample size, survey structure and representativeness. 44 surveys that passed the screening process were accessed and their codebooks were examined to see whether the two questions of interest were included. Eight datasets allowed extraction or estimation of relevant information. Survey quality was assessed with regards to response rates, sampling methods and data collection procedures. PRISMA guidelines were followed, with explicit rules for approaching derived variables and skip patterns. Proportions were pooled using random effects meta-analysis. The eight surveys used representative samples of the general adult population. Response rates varied from 45% to 88%. Survey methods were on par with the best practice in this field. Altogether 216,314 respondents were included of whom 60.3% (95%CI 51.3-69.3) ever tried a cigarette. Among those, 68.9% (95% CI 60.9-76.9%) progressed to daily smoking. Over two thirds of people who try one cigarette become, at least temporarily, daily smokers. The finding provides strong support for the current efforts to reduce cigarette experimentation among adolescents. The transition from trying the first cigarette through occasional to daily smoking usually implies that a recreational activity is turning into a compulsive need that has to be satisfied virtually continuously. The 'conversion rate' from initial experimentation to daily smoking is thus a potentially important metric of smoking behavior, but estimates of it based on
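
    The pooling step can be sketched with a DerSimonian-Laird random-effects meta-analysis of proportions on the logit scale; the numbers below are illustrative, not the eight surveys analysed in the paper.

```python
# Sketch: DerSimonian-Laird random-effects pooling of survey proportions
# on the logit scale, the style of meta-analysis described above.
import numpy as np

p = np.array([0.65, 0.72, 0.61, 0.70, 0.74])   # conversion proportions
n = np.array([8000, 12000, 5000, 20000, 9000]) # survey sample sizes

y = np.log(p / (1 - p))                        # logit transform
v = 1.0 / (n * p * (1 - p))                    # approximate variances
w = 1.0 / v
Q = np.sum(w * (y - np.sum(w * y) / w.sum())**2)   # heterogeneity statistic
tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1.0 / (v + tau2)                        # random-effects weights
pooled = np.sum(w_re * y) / w_re.sum()
print("pooled proportion:", round(1 / (1 + np.exp(-pooled)), 3))
```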

  19. A study on the representative sampling survey for the inspection of the clearance level for the radioisotope waste

    International Nuclear Information System (INIS)

    Hong Joo Ahn; Se Chul Sohn; Kwang Yong Jee; Ju Youl Kim; In Koo Lee

    2007-01-01

    Utilization facilities for radioisotopes (RI) are increasing annually in South Korea; the total number was 2,723 as of December 31, 2005. Inspection against a clearance level is an important problem for ensuring public confidence when releasing radioactive materials to the environment. Korean regulations for such a clearance are described in Notice No. 2001-30 of the Ministry of Science and Technology (MOST) and Notice No. 2002-67 of the Ministry of Commerce, Industry and Energy (MOCIE). Most unsealed sources in RI waste drums at a storage facility are low-level beta-emitters with short half-lives, so it is impossible to measure their inventories by nondestructive analysis. Furthermore, RI wastes generated by hospitals, educational and research institutes, and industry form heterogeneous, varied, irregular, and small-quantity waste streams. This study addresses a representative (master) sampling survey and analysis plan for RI wastes, because a complete enumeration of waste drums is impossible and undesirable in terms of cost and efficiency. Existing approaches to representative sampling include judgmental, simple random, stratified random, systematic grid, systematic random, composite, and adaptive sampling. A representative sampling plan may combine two or more of the above approaches depending on the type and distribution of a waste stream. Stratified random sampling (constrained randomization) proves adequate for the sampling design of RI waste with regard to half-life, surface dose, time of transfer to a storage facility, and type of waste. The developed sampling protocol includes estimating the number of drums within a waste stream, estimating the number of samples, and confirming the required number of samples. The statistical process control for the quality assurance plan includes control charts and an upper control limit (UCL) of 95% to determine whether a clearance level is met. (authors)

  20. A random walk model for evaluating clinical trials involving serial observations.

    Science.gov (United States)

    Hopper, J L; Young, G P

    1988-05-01

    For clinical trials where the variable of interest is ordered and categorical (for example, disease severity or a symptom scale), and where measurements are taken at intervals, it might be possible to achieve greater discrimination between the efficacy of treatments by modelling each patient's progress as a stochastic process. The random walk is a simple, easily interpreted model that can be fitted by maximum likelihood using a maximization routine, with inference based on standard likelihood theory. In general the model can allow for randomly censored data, incorporates measured prognostic factors, and inference is conditional on the (possibly non-random) allocation of patients. Tests of fit and of model assumptions are proposed, and applications to two therapeutic trials of gastroenterological disorders are presented. The model gave measures of the rate of, and variability in, improvement for patients under different treatments. A small simulation study suggested that the model is more powerful than considering the difference between initial and final scores, even when applied to data generated by a mechanism other than the random walk model assumed in the analysis. It thus provides a useful additional statistical method for evaluating clinical trials.
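
    The maximum-likelihood fitting of such a random walk reduces, in the simplest unbounded case, to counting transitions; the sketch below ignores boundary categories, censoring and prognostic factors, all of which the full model accommodates.

```python
# Sketch: ML fit of a simple random walk on ordinal scores. Each visit
# the score moves up with probability p, down with probability q, or
# stays with probability 1-p-q; for this multinomial model the ML
# estimates are just the observed transition fractions.
import numpy as np

scores = np.array([3, 3, 4, 3, 2, 2, 1, 1, 2, 1])   # one patient's path
steps = np.diff(scores)
up, down = (steps == 1).sum(), (steps == -1).sum()
n = len(steps)
p_hat, q_hat = up / n, down / n                     # ML estimates
print(f"p(up)={p_hat:.2f}, p(down)={q_hat:.2f}, p(stay)={1-p_hat-q_hat:.2f}")
```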

  1. Bayesian data fusion for spatial prediction of categorical variables in environmental sciences

    Science.gov (United States)

    Gengler, Sarah; Bogaert, Patrick

    2014-12-01

    First developed to predict continuous variables, Bayesian Maximum Entropy (BME) has become a complete framework in the context of space-time prediction since it has been extended to predict categorical variables and mixed random fields. This method proposes solutions to combine several sources of data whatever the nature of the information. However, the various attempts that were made for adapting the BME methodology to categorical variables and mixed random fields faced some limitations, such as a high computational burden. The main objective of this paper is to overcome this limitation by generalizing the Bayesian Data Fusion (BDF) theoretical framework to categorical variables, which is somehow a simplification of the BME method through the convenient conditional independence hypothesis. The BDF methodology for categorical variables is first described and then applied to a practical case study: the estimation of soil drainage classes using a soil map and point observations in the sandy area of Flanders around the city of Mechelen (Belgium). The BDF approach is compared to BME along with more classical approaches, such as Indicator CoKriging (ICK) and logistic regression. Estimators are compared using various indicators, namely the Percentage of Correctly Classified locations (PCC) and the Average Highest Probability (AHP). Although the BDF methodology for categorical variables is somehow a simplification of the BME approach, both methods lead to similar results and have strong advantages compared to ICK and logistic regression.
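
    The conditional independence hypothesis that distinguishes BDF can be sketched in a few lines: likelihoods from the separate sources multiply into the posterior. The three drainage classes and all probabilities below are illustrative assumptions.

```python
# Sketch: Bayesian Data Fusion of two information sources about a
# categorical variable (e.g. a drainage class) under the conditional
# independence assumption that distinguishes BDF from full BME.
import numpy as np

prior = np.array([0.5, 0.3, 0.2])          # P(class), 3 drainage classes
like_map = np.array([0.6, 0.3, 0.1])       # P(soil-map evidence | class)
like_obs = np.array([0.2, 0.5, 0.3])       # P(point-obs evidence | class)

# conditional independence: source likelihoods multiply
post = prior * like_map * like_obs
post /= post.sum()
print("fused posterior:", post.round(3))
```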

  2. Bayesian data fusion for spatial prediction of categorical variables in environmental sciences

    International Nuclear Information System (INIS)

    Gengler, Sarah; Bogaert, Patrick

    2014-01-01

    First developed to predict continuous variables, Bayesian Maximum Entropy (BME) has become a complete framework in the context of space-time prediction since it has been extended to predict categorical variables and mixed random fields. This method proposes solutions to combine several sources of data whatever the nature of the information. However, the various attempts that were made for adapting the BME methodology to categorical variables and mixed random fields faced some limitations, such as a high computational burden. The main objective of this paper is to overcome this limitation by generalizing the Bayesian Data Fusion (BDF) theoretical framework to categorical variables, which is somehow a simplification of the BME method through the convenient conditional independence hypothesis. The BDF methodology for categorical variables is first described and then applied to a practical case study: the estimation of soil drainage classes using a soil map and point observations in the sandy area of Flanders around the city of Mechelen (Belgium). The BDF approach is compared to BME along with more classical approaches, such as Indicator CoKriging (ICK) and logistic regression. Estimators are compared using various indicators, namely the Percentage of Correctly Classified locations (PCC) and the Average Highest Probability (AHP). Although the BDF methodology for categorical variables is somehow a simplification of the BME approach, both methods lead to similar results and have strong advantages compared to ICK and logistic regression.

  3. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    Science.gov (United States)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if in addition to the coordinate of the random walk in Z^d we introduce an extra variable that characterizes the position of the random walk inside the period then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove the convergence in the path space for the said random walk.

  4. Crack Propagation Test Results for Variable Amplitude Spectrum Loading in Surface Flawed D6ac Steel

    National Research Council Canada - National Science Library

    Wood, H

    1971-01-01

    .... All spectra used in the program represented the critical wing pivot locations for the F-111 aircraft and were applied in a randomized block sequence containing 58 layers representing 200 flight hours...

  5. Tight Bound on Randomness for Violating the CHSH Inequality

    OpenAIRE

    Teng, Yifeng; Yang, Shenghao; Wang, Siwei; Zhao, Mingfei

    2015-01-01

    Free will (or randomness) has been studied to achieve loophole-free Bell's inequality test and to provide device-independent quantum key distribution security proofs. The required randomness such that a local hidden variable model (LHVM) can violate the Clauser-Horne-Shimony-Holt (CHSH) inequality has been studied, but a tight bound has not been proved for a practical case that i) the device settings of the two parties in the Bell test are independent; and ii) the device settings of each part...

  6. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    Directory of Open Access Journals (Sweden)

    Jun-He Yang

    2017-01-01

    Full Text Available Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model, based on estimating missing values followed by variable selection, to forecast a reservoir's water level. The study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015; the two datasets were concatenated in temporal order into an integrated research dataset. The proposed forecasting model has three main steps. First, the study uses five imputation methods to handle missing values. Second, it identifies the key variables via factor analysis and then deletes the unimportant variables sequentially via the variable selection method. Finally, it uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed benchmark methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, the experiments show that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.

  7. Genetic variability in Sudanese Acacia senegal (L.) assessed by random amplified polymorphic DNA

    African Journals Online (AJOL)

    Habeballa, Rami S.; Hamza, Nada B.; El Gaali, Eisa I.

    2010-07-26

    Full Length Research Paper. Genetic variability in Sudanese Acacia senegal (L.) assessed by random amplified polymorphic DNA. Commission for Biotechnology and Genetic Engineering, National Centre for Research, Khartoum, Sudan.

  8. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending size, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer samples such populations as follows: drop k points (the sample size) at random onto the stick and record the corresponding numbers of visited fragments. We investigate the following sampling problems: (1) What is the sample size if sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments discovered, and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights (the sequence of their weights in order of appearance) is needed and studied.
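
    Question (2) invites a direct simulation: break the stick into fragments with i.i.d. weights and count the coupon-collector-style number of draws until every fragment has been visited. The exponential weights below are an illustrative choice.

```python
# Sketch: simulate the broken-stick sampling problem -- drop points on a
# stick split into n fragments with i.i.d. weights and count how many
# samples are needed before every fragment has been visited.
import numpy as np

rng = np.random.default_rng(6)
n = 10
w = rng.exponential(size=n)
w /= w.sum()                               # normalised fragment sizes
edges = np.cumsum(w)

visited, draws = set(), 0
while len(visited) < n:                    # sample until exhaustion
    visited.add(int(np.searchsorted(edges, rng.uniform())))
    draws += 1
print("samples needed to visit all fragments:", draws)
```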

  9. Understanding Solar Cycle Variability

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, R. H.; Schüssler, M., E-mail: cameron@mps.mpg.de [Max-Planck-Institut für Sonnensystemforschung, Justus-von-Liebig-Weg 3, D-37077 Göttingen (Germany)

    2017-07-10

    The level of solar magnetic activity, as exemplified by the number of sunspots and by energetic events in the corona, varies on a wide range of timescales. Most prominent is the 11-year solar cycle, which is significantly modulated on longer timescales. Drawing from dynamo theory, together with the empirical results of past solar activity and similar phenomena for solar-like stars, we show that the variability of the solar cycle can be essentially understood in terms of a weakly nonlinear limit cycle affected by random noise. In contrast to ad hoc “toy models” for the solar cycle, this leads to a generic normal-form model, whose parameters are all constrained by observations. The model reproduces the characteristics of the variable solar activity on timescales between decades and millennia, including the occurrence and statistics of extended periods of very low activity (grand minima). Comparison with results obtained with a Babcock–Leighton-type dynamo model confirms the validity of the normal-form approach.

  10. A random effects meta-analysis model with Box-Cox transformation.

    Science.gov (United States)

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise an overall distribution of observed treatment effect estimates, which is sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied for any kind of variables once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates are skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences on summary results, heterogeneity measures and prediction intervals from the normal random effects model. The
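
    The transformation step can be sketched with scipy's Box-Cox routine: normalise skewed effect estimates, summarise on the transformed scale, and back-transform the median, broadly in the spirit of the proposed summaries. The lognormal toy data and the omission of within-study sampling variances are simplifying assumptions.

```python
# Sketch: normalising skewed treatment-effect estimates with a Box-Cox
# transformation, then reporting a back-transformed median.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
effects = rng.lognormal(mean=0.2, sigma=0.6, size=30)   # skewed estimates

transformed, lam = stats.boxcox(effects)       # ML choice of lambda
med = np.median(transformed)
# invert the Box-Cox transform to summarise on the original scale
back = (lam * med + 1) ** (1 / lam) if lam != 0 else np.exp(med)
print(f"lambda={lam:.2f}, back-transformed median={back:.3f}")
```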

  11. A random effects meta-analysis model with Box-Cox transformation

    Directory of Open Access Journals (Sweden)

    Yusuke Yamaguchi

    2017-07-01

    Full Text Available Background: In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. Methods: We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise an overall distribution of observed treatment effect estimates, which is sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied for any kind of variables once the treatment effect estimate is defined from the variable. Results: A simulation study suggested that when the overall distribution of treatment effect estimates are skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences on summary results, heterogeneity measures and

  12. 14 CFR 1274.906 - Designation of New Technology Representative and Patent Representative.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Designation of New Technology... Conditions § 1274.906 Designation of New Technology Representative and Patent Representative. Designation of New Technology Representative and Patent Representative July 2002 (a) For purposes of administration...

  13. DATA COLLECTION METHOD FOR PEDESTRIAN MOVEMENT VARIABLES

    Directory of Open Access Journals (Sweden)

    Hajime Inamura

    2000-01-01

    Full Text Available The need for tools for the design and evaluation of pedestrian areas, subway stations, entrance halls, shopping malls, escape routes, stadiums, etc. leads to the necessity of a pedestrian model. One such approach is the Microscopic Pedestrian Simulation Model. To be able to develop and calibrate a microscopic pedestrian simulation model, a number of variables need to be considered. As the first step of model development, data were collected using video, and the coordinates of head paths were obtained through image processing. A number of variables can be gathered to describe the behavior of pedestrians from different points of view. This paper describes how to obtain, from video recording and simple image processing, variables that can represent the movement of pedestrians.

  14. Distribution of Schmidt-like eigenvalues for Gaussian ensembles of the random matrix theory

    Science.gov (United States)

    Pato, Mauricio P.; Oshanin, Gleb

    2013-03-01

    We study the probability distribution function P_n^(β)(w) of the Schmidt-like random variable w = x_1²/(∑_{j=1}^n x_j²/n), where x_j (j = 1, 2, …, n) are unordered eigenvalues of a given n × n β-Gaussian random matrix, β being the Dyson symmetry index. This variable, by definition, can be considered as a measure of how any individual (randomly chosen) eigenvalue deviates from the arithmetic mean value of all eigenvalues of a given random matrix, and its distribution is calculated with respect to the ensemble of such β-Gaussian random matrices. We show that in the asymptotic limit n → ∞ and for arbitrary β the distribution P_n^(β)(w) converges to the Marčenko-Pastur form, i.e. is defined as P_n^(β)(w) ∼ √((4 − w)/w) for w ∈ [0, 4] and equals zero outside of the support, despite the fact that formally w is defined on the interval [0, n]. Furthermore, for Gaussian unitary ensembles (β = 2) we present exact explicit expressions for P_n^(β=2)(w) which are valid for arbitrary n and analyse their behaviour.
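
    A quick numerical check of the limiting support is straightforward: sample GUE matrices, form w for a randomly chosen eigenvalue, and verify that the mass concentrates on [0, 4]. The matrix size and trial count below are arbitrary.

```python
# Sketch: sample the Schmidt-like variable w for the GUE (beta = 2) and
# check that its mass lies in the limiting support [0, 4] predicted by
# the Marchenko-Pastur form sqrt((4 - w)/w).
import numpy as np

rng = np.random.default_rng(8)
N, trials, ws = 50, 2000, []
for _ in range(trials):
    A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    H = (A + A.conj().T) / 2               # GUE matrix
    x = np.linalg.eigvalsh(H)
    j = rng.integers(N)                    # a randomly chosen eigenvalue
    ws.append(x[j]**2 / np.mean(x**2))     # w = x_j^2 / (sum x^2 / n)
ws = np.array(ws)
print("fraction of w inside [0, 4]:", (ws <= 4).mean())
```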

  15. Distribution of Schmidt-like eigenvalues for Gaussian ensembles of the random matrix theory

    International Nuclear Information System (INIS)

    Pato, Mauricio P; Oshanin, Gleb

    2013-01-01

    We study the probability distribution function P_n^(β)(w) of the Schmidt-like random variable w = x_1²/(∑_{j=1}^n x_j²/n), where x_j (j = 1, 2, …, n) are unordered eigenvalues of a given n × n β-Gaussian random matrix, β being the Dyson symmetry index. This variable, by definition, can be considered as a measure of how any individual (randomly chosen) eigenvalue deviates from the arithmetic mean value of all eigenvalues of a given random matrix, and its distribution is calculated with respect to the ensemble of such β-Gaussian random matrices. We show that in the asymptotic limit n → ∞ and for arbitrary β the distribution P_n^(β)(w) converges to the Marčenko–Pastur form, i.e. is defined as P_n^(β)(w) ∼ √((4 − w)/w) for w ∈ [0, 4] and equals zero outside of the support, despite the fact that formally w is defined on the interval [0, n]. Furthermore, for Gaussian unitary ensembles (β = 2) we present exact explicit expressions for P_n^(β=2)(w) which are valid for arbitrary n and analyse their behaviour. (paper)

  16. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
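
    The noncentrality-parameter view can be sketched through the effective sample size per arm, Σ mᵢ/(1 + (mᵢ − 1)ρ), a standard quantity for cluster designs; the cluster sizes and intracluster correlation below are illustrative, and this is a simplified stand-in for the paper's exact measure.

```python
# Sketch: relative efficiency of unequal vs equal cluster sizes through
# the effective sample size per arm under intracluster correlation rho;
# the noncentrality parameter of the t-test scales with this quantity.
import numpy as np

def eff_n(sizes, rho):
    sizes = np.asarray(sizes, float)
    return np.sum(sizes / (1 + (sizes - 1) * rho))

rho = 0.05
unequal = [5, 10, 20, 40, 75]               # variable cluster sizes
equal = [np.mean(unequal)] * len(unequal)   # same total, equal sizes

re = eff_n(unequal, rho) / eff_n(equal, rho)
print(f"relative efficiency (unequal vs equal): {re:.3f}")
# the required mean cluster size can then be inflated by roughly 1/re
# while keeping the number of clusters fixed, as the paper proposes
```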

  17. A generator for unique quantum random numbers based on vacuum states

    DEFF Research Database (Denmark)

    Gabriel, C.; Wittmann, C.; Sych, D.

    2010-01-01

    Random numbers are a valuable component in diverse applications that range from simulations(1) over gambling to cryptography(2,3). The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics(4-11). However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique(12-15). Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably…

  18. Variability of Actinobacteria, a minor component of rumen microflora.

    Science.gov (United States)

    Suľák, M; Sikorová, L; Jankuvová, J; Javorský, P; Pristaš, P

    2012-07-01

    Actinobacteria (Actinomycetes) are a significant and interesting group of gram-positive bacteria. They are regular, though infrequent, members of the microbial life in the rumen and represent up to 3% of total rumen bacteria; nevertheless, there is a considerable lack of information about the ecology and biology of rumen actinobacteria. During the characterization of the variability of rumen treponemas using a non-cultivation approach, we also noted the variability of rumen actinobacteria. Using Treponema-specific primers, a specific 16S rRNA gene library was prepared from cow and sheep rumen total DNA. About 10% of recombinant clones contained actinobacteria-like sequences. Phylogenetic analyses of the 11 clones obtained showed the high variability of actinobacteria in the ruminant digestive system. While some sequences are nearly identical to known sequences of actinobacteria, we also detected completely new clusters of actinobacteria-like sequences, probably representing a new, as yet undiscovered, group of rumen Actinobacteria. Further research will be necessary to understand their nature and functions in the rumen.

  19. The ASAS-SN Catalog of Variable Stars I: The Serendipitous Survey

    Science.gov (United States)

    Jayasinghe, T.; Kochanek, C. S.; Stanek, K. Z.; Shappee, B. J.; Holoien, T. W.-S.; Thompson, Todd A.; Prieto, J. L.; Dong, Subo; Pawlak, M.; Shields, J. V.; Pojmanski, G.; Otero, S.; Britt, C. A.; Will, D.

    2018-04-01

    The All-Sky Automated Survey for Supernovae (ASAS-SN) is the first optical survey to routinely monitor the whole sky with a cadence of ~2–3 days down to V ≲ 17 mag. ASAS-SN has monitored the whole sky since 2014, collecting ~100–500 epochs of observations per field. The V-band light curves for candidate variables identified during the search for supernovae are classified using a random forest classifier and visually verified. We present a catalog of 66,533 bright, new variable stars discovered during our search for supernovae, including 27,753 periodic variables and 38,780 irregular variables. V-band light curves for the ASAS-SN variables are available through the ASAS-SN variable stars database (https://asas-sn.osu.edu/variables). The database will begin to include the light curves of known variable stars in the near future, along with the results of a systematic, all-sky variability survey.

  20. Brain Tumor Segmentation Based on Random Forest

    Directory of Open Access Journals (Sweden)

    László Lefkovits

    2016-09-01

    In this article we present a discriminative model for tumor detection from multimodal MR images. The main part of the model is built around the random forest (RF) classifier. We created an optimization algorithm able to select the important features for reducing the dimensionality of the data. This method is also used to determine the training parameters used in the learning phase. The algorithm is based on random feature properties for evaluating the importance of each variable, the evolution of learning errors, and the proximities between instances. The detection performance obtained has been compared with that of the most recent systems, with similar results.

  1. Ecological and evolutionary impacts of changing climatic variability.

    Science.gov (United States)

    Vázquez, Diego P; Gianoli, Ernesto; Morris, William F; Bozinovic, Francisco

    2017-02-01

    While average temperature is likely to increase in most locations on Earth, many places will simultaneously experience higher variability in temperature, precipitation, and other climate variables. Although ecologists and evolutionary biologists widely recognize the potential impacts of changes in average climatic conditions, relatively little attention has been paid to the potential impacts of changes in climatic variability and extremes. We review the evidence on the impacts of increased climatic variability and extremes on physiological, ecological and evolutionary processes at multiple levels of biological organization, from individuals to populations and communities. Our review indicates that climatic variability can have profound influences on biological processes at multiple scales of organization. Responses to increased climatic variability and extremes are likely to be complex and cannot always be generalized, although our conceptual and methodological toolboxes allow us to make informed predictions about the likely consequences of such climatic changes. We conclude that climatic variability represents an important component of climate that deserves further attention. © 2015 Cambridge Philosophical Society.

  2. Latent variable modeling

    Institute of Scientific and Technical Information of China (English)

    蔡力

    2012-01-01

    A latent variable model, as the name suggests, is a statistical model that contains latent, that is, unobserved, variables. The roots of such models go back to Spearman's 1904 seminal work[1] on factor analysis, which is arguably the first well-articulated latent variable model to be widely used in psychology, mental health research, and allied disciplines. Because of the association of factor analysis with early studies of human intelligence, the fact that key variables in a statistical model are, on occasion, unobserved has been a point of lingering contention and controversy. The reader is assured, however, that a latent variable, defined in the broadest manner, is no more mysterious than an error term in a normal theory linear regression model or a random effect in a mixed model.

  3. Characterizing the Optical Variability of Bright Blazars: Variability-based Selection of Fermi Active Galactic Nuclei

    Science.gov (United States)

    Ruan, John J.; Anderson, Scott F.; MacLeod, Chelsea L.; Becker, Andrew C.; Burnett, T. H.; Davenport, James R. A.; Ivezić, Željko; Kochanek, Christopher S.; Plotkin, Richard M.; Sesar, Branimir; Stuart, J. Scott

    2012-11-01

    We investigate the use of optical photometric variability to select and identify blazars in large-scale time-domain surveys, in part to aid in the identification of blazar counterparts to the ~30% of γ-ray sources in the Fermi 2FGL catalog still lacking reliable associations. Using data from the optical LINEAR asteroid survey, we characterize the optical variability of blazars by fitting a damped random walk model to individual light curves with two main model parameters, the characteristic timescale of variability τ and the driving amplitude on short timescales σ̂. Imposing cuts on minimum τ and σ̂ allows for blazar selection with high efficiency E and completeness C. To test the efficacy of this approach, we apply this method to optically variable LINEAR objects that fall within the several-arcminute error ellipses of γ-ray sources in the Fermi 2FGL catalog. Despite the extreme stellar contamination at the shallow depth of the LINEAR survey, we are able to recover previously associated optical counterparts to Fermi active galactic nuclei with E ≥ 88% and C = 88% in Fermi 95% confidence error ellipses having semimajor axis r < 8'. We find that the suggested radio counterpart to Fermi source 2FGL J1649.6+5238 has optical variability consistent with other γ-ray blazars and is likely to be the γ-ray source. Our results suggest that the variability of the non-thermal jet emission in blazars is stochastic in nature, with unique variability properties due to the effects of relativistic beaming. After correcting for beaming, we estimate that the characteristic timescale of blazar variability is ~3 years in the rest frame of the jet, in contrast with the ~320 day disk flux timescale observed in quasars. The variability-based selection method presented will be useful for blazar identification in time-domain optical surveys and is also a probe of jet physics.
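
The damped random walk is the Ornstein–Uhlenbeck process, so the model is straightforward to simulate and fit. The sketch below (an illustration under standard OU assumptions, not the authors' pipeline) generates a light curve with damping timescale τ and driving amplitude σ̂, then recovers τ crudely from the autocorrelation function.

```python
# Hedged OU/damped-random-walk sketch; parameter names and values are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def simulate_drw(n, dt, tau, sigma_hat, mean_mag=17.0):
    var_inf = sigma_hat**2 * tau / 2                 # asymptotic variance
    a = np.exp(-dt / tau)
    x = np.empty(n)
    x[0] = mean_mag + rng.normal(0, np.sqrt(var_inf))
    for i in range(1, n):
        x[i] = mean_mag + a * (x[i - 1] - mean_mag) \
               + rng.normal(0, np.sqrt(var_inf * (1 - a**2)))
    return x

lc = simulate_drw(n=2000, dt=1.0, tau=100.0, sigma_hat=0.1)

# Crude tau estimate: lag where the autocorrelation first drops below 1/e.
xc = lc - lc.mean()
acf = np.correlate(xc, xc, mode="full")[len(xc) - 1:] / np.dot(xc, xc)
print("tau_hat ~", np.argmax(acf < np.exp(-1.0)))    # ~100 for this realization
```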

  4. Variable dead time counters: 2. A computer simulation

    International Nuclear Information System (INIS)

    Hooton, B.W.; Lees, E.W.

    1980-09-01

    A computer model has been developed to give a pulse train which simulates that generated by a variable dead time counter (VDC) used in safeguards determination of Pu mass. The model is applied to two algorithms generally used for VDC analysis. It is used to determine their limitations at high counting rates and to investigate the effects of random neutrons from (α,n) reactions. Both algorithms are found to be deficient for use with masses of 240Pu greater than 100 g, and one commonly used algorithm is shown, by use of the model and also by theory, to yield a result which is dependent on the random neutron intensity. (author)

  5. Randomness and variability of the neuronal activity described by the Ornstein-Uhlenbeck model

    Czech Academy of Sciences Publication Activity Database

    Košťál, Lubomír; Lánský, Petr; Zucca, Ch.

    2007-01-01

    Vol. 18, No. 1 (2007), pp. 63-75 ISSN 0954-898X R&D Projects: GA MŠk(CZ) LC554; GA AV ČR(CZ) 1ET400110401; GA AV ČR(CZ) KJB100110701 Grant - others: MIUR(IT) PRIN-Cofin 2005 Institutional research plan: CEZ:AV0Z50110509 Keywords: Ornstein-Uhlenbeck * entropy * randomness Subject RIV: FH - Neurology Impact factor: 1.385, year: 2007

  6. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Nuria eRuffini

    2015-08-01

    Context: Heart rate variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and is an indirect marker of autonomic nervous system (ANS) function. Objective: To investigate the influence of osteopathic manipulative treatment (OMT) on ANS activity through changes in high frequency, a heart rate variability index indicating parasympathetic activity, in healthy subjects, compared with sham therapy and a control group. Methods: Sixty-six healthy subjects, both male and female, were included in the present 3-armed randomized placebo-controlled within-subject cross-over single-blinded study. Participants were asymptomatic adults, both smokers and non-smokers, not on medications. At enrollment subjects were randomized into 3 groups: A, B, C. A standardized structural evaluation followed by a patient-need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. A standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, which acted as a time control. The trial was registered on clinicaltrials.gov, identifier: NCT01908920. Main Outcome Measures: HRV was calculated from electrocardiography before, during and after the intervention, for a total time of 25 minutes. Results: OMT engendered a statistically significant increase of parasympathetic activity, as shown by the high frequency rate (p<0.001), and a decrease of sympathetic activity, as revealed by the low frequency rate (p<0.01); results also showed a reduction of the low frequency/high frequency ratio (p<0.001) and of the detrended fluctuation scaling exponent (p<0.05). Conclusions: Findings suggest that OMT can influence ANS activity, increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy and a control group.

  7. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  8. Rye-Based Evening Meals Favorably Affected Glucose Regulation and Appetite Variables at the Following Breakfast; A Randomized Controlled Study in Healthy Subjects.

    Science.gov (United States)

    Sandberg, Jonna C; Björck, Inger M E; Nilsson, Anne C

    2016-01-01

    Whole grain has shown potential to prevent obesity, cardiovascular disease and type 2 diabetes. A possible mechanism could be related to colonic fermentation of specific indigestible carbohydrates, i.e. dietary fiber (DF). The aim of this study was to investigate effects on cardiometabolic risk factors and appetite regulation the next day when ingesting rye kernel bread rich in DF as an evening meal. Whole grain rye kernel test bread (RKB) or a white wheat flour based bread (reference product, WWB) was provided as a late evening meal to healthy young adults in a randomized cross-over design. The test products RKB and WWB were provided in two priming settings: as a single evening meal or as three consecutive evening meals prior to the experimental days. Test variables were measured in the morning, 10.5-13.5 hours after ingestion of RKB or WWB. The postprandial phase was analyzed for measures of glucose metabolism, inflammatory markers, appetite regulating hormones and short chain fatty acids (SCFA) in blood, hydrogen excretion in breath and subjective appetite ratings. With the exception of serum CRP, no significant differences in test variables were observed depending on the length of priming (P>0.05). The RKB evening meal increased plasma concentrations of PYY (0-120 min, P < …) and affected subjective appetite ratings during the whole experimental period (P < …); the improved appetite sensation could be beneficial in preventing obesity. These effects could possibly be mediated through colonic fermentation. ClinicalTrials.gov NCT02093481.

  9. On the number of subgraphs of the Barabási-Albert random graph

    International Nuclear Information System (INIS)

    Ryabchenko, Aleksandr A; Samosvat, Egor A

    2012-01-01

    We study a model of a random graph of the type of the Barabási-Albert preferential attachment model. We develop a technique that makes it possible to estimate the mathematical expectation for a fairly wide class of random variables in the model under consideration. We use this technique to prove a theorem on the asymptotics of the mathematical expectation of the number of subgraphs isomorphic to a certain fixed graph in the random graphs of this model.
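
An empirical counterpart to such expectation results is easy to set up. The sketch below (an illustration, not the paper's proof technique) averages the number of triangles, the simplest fixed subgraph, over samples of Barabási–Albert graphs using networkx.

```python
# Hedged empirical average of triangle counts in Barabasi-Albert graphs.
import networkx as nx
import numpy as np

def mean_triangle_count(n, m, samples=20, seed=0):
    counts = []
    for s in range(samples):
        g = nx.barabasi_albert_graph(n, m, seed=seed + s)
        counts.append(sum(nx.triangles(g).values()) // 3)  # each triangle counted 3x
    return np.mean(counts)

for n in (500, 1000, 2000):
    print(n, mean_triangle_count(n, m=3))   # slow growth with n, as theory suggests
```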

  10. A randomized, double-blind, crossover, placebo-controlled trial of 6 weeks benfotiamine treatment on postprandial vascular function and variables of autonomic nerve function in Type 2 diabetes.

    Science.gov (United States)

    Stirban, A; Pop, A; Tschoepe, D

    2013-10-01

    In a pilot study we suggested that benfotiamine, a thiamine prodrug, prevents postprandial endothelial dysfunction in people with Type 2 diabetes mellitus. The aim of this study was to test these effects in a larger population. In a double-blind, placebo-controlled, randomized, crossover study, 31 people with Type 2 diabetes received 900 mg/day benfotiamine or a placebo for 6 weeks (with a washout period of 6 weeks between). At the end of each treatment period, macrovascular and microvascular function were assessed, together with variables of autonomic nervous function in a fasting state, as well as 2, 4 and 6 h following a heated, mixed test meal. Participants had an impaired baseline flow-mediated dilatation (2.63 ± 2.49%). Compared with the fasting state, neither variable changed postprandially following the placebo treatment. The 6 weeks' treatment with high doses of benfotiamine did not alter this pattern, either in the fasting state or postprandially. Among a subgroup of patients with the highest flow-mediated dilatation, following placebo treatment there was a significant postprandial flow-mediated dilatation decrease, while this effect was attenuated by benfotiamine pretreatment. In people with Type 2 diabetes and markedly impaired fasting flow-mediated dilatation, a mixed test meal does not further deteriorate flow-mediated dilatation or variables of microvascular or autonomic nervous function. Because no significant deterioration of postprandial flow-mediated dilatation, microvascular or autonomic nervous function tests occurred after placebo treatment, a prevention of the postprandial deterioration of these variables with benfotiamine was not feasible. © 2013 The Authors. Diabetic Medicine © 2013 Diabetes UK.

  11. Test-retest reliability of jump execution variables using mechanography: a comparison of jump protocols.

    Science.gov (United States)

    Fitzgerald, John S; Johnson, LuAnn; Tomkinson, Grant; Stein, Jesse; Roemmich, James N

    2018-05-01

    Mechanography during the vertical jump may enhance screening and determining mechanistic causes underlying physical performance changes. Utility of jump mechanography for evaluation is limited by scant test-retest reliability data on force-time variables. This study examined the test-retest reliability of eight jump execution variables assessed from mechanography. Thirty-two women (mean±SD: age 20.8 ± 1.3 yr) and 16 men (age 22.1 ± 1.9 yr) attended a familiarization session and two testing sessions, all one week apart. Participants performed two variations of the squat jump with squat depth self-selected and controlled using a goniometer to 80° knee flexion. Test-retest reliability was quantified as the systematic error (using effect size between jumps), random error (using coefficients of variation), and test-retest correlations (using intra-class correlation coefficients). Overall, jump execution variables demonstrated acceptable reliability, evidenced by small systematic errors (mean±95%CI: 0.2 ± 0.07), moderate random errors (mean±95%CI: 17.8 ± 3.7%), and very strong test-retest correlations (range: 0.73-0.97). Differences in random errors between controlled and self-selected protocols were negligible (mean±95%CI: 1.3 ± 2.3%). Jump execution variables demonstrated acceptable reliability, with no meaningful differences between the controlled and self-selected jump protocols. To simplify testing, a self-selected jump protocol can be used to assess force-time variables with negligible impact on measurement error.
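
The three reliability quantities used above are simple to compute for two test sessions. The sketch below uses simulated stand-in data (not the study's measurements): systematic error as a standardized mean difference, random error as a typical-error coefficient of variation, and a simple two-way ICC for k = 2 sessions.

```python
# Hedged reliability sketch on simulated stand-in data.
import numpy as np

rng = np.random.default_rng(2)
true = rng.normal(30, 5, size=48)        # per-subject "true" jump variable
s1 = true + rng.normal(0, 2, size=48)    # session 1
s2 = true + rng.normal(0, 2, size=48)    # session 2

diff = s2 - s1
pooled_sd = np.sqrt((s1.var(ddof=1) + s2.var(ddof=1)) / 2)
effect_size = diff.mean() / pooled_sd    # systematic error

grand_mean = np.mean((s1 + s2) / 2)
cv = 100 * (diff.std(ddof=1) / np.sqrt(2)) / grand_mean   # typical error as %CV

data = np.stack([s1, s2], axis=1)
ms_b = 2 * data.mean(axis=1).var(ddof=1)                  # between-subject mean square
ms_w = ((data - data.mean(axis=1, keepdims=True))**2).sum() / len(data)
icc = (ms_b - ms_w) / (ms_b + ms_w)                       # simple ICC for k = 2 sessions
print(f"ES={effect_size:.2f}  CV={cv:.1f}%  ICC={icc:.2f}")
```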

  12. Rational Variability in Children's Causal Inferences: The Sampling Hypothesis

    Science.gov (United States)

    Denison, Stephanie; Bonawitz, Elizabeth; Gopnik, Alison; Griffiths, Thomas L.

    2013-01-01

    We present a proposal--"The Sampling Hypothesis"--suggesting that the variability in young children's responses may be part of a rational strategy for inductive inference. In particular, we argue that young learners may be randomly sampling from the set of possible hypotheses that explain the observed data, producing different hypotheses with…

  13. Hidden variables and locality in quantum theory

    International Nuclear Information System (INIS)

    Shiva, Vandana.

    1978-12-01

    The status of hidden variables in quantum theory has been debated since the 1920s. The author examines the no-hidden-variable theories of von Neumann, Kochen, Specker and Bell, and finds that they all share one basic assumption: averaging over the hidden variables should reproduce the quantum mechanical probabilities. Von Neumann also makes a linearity assumption, Kochen and Specker require the preservation of certain functional relations between magnitudes, and Bell proposes a locality condition. It has been assumed that the extrastatistical requirements are needed to serve as criteria of success for the introduction of hidden variables because the statistical condition is trivially satisfied, and that Bell's result is based on a locality condition that is physically motivated. The author shows that the requirement of weak locality, which is not physically motivated, is enough to give Bell's result. The proof of Bell's inequality works equally well for any pair of commuting magnitudes satisfying a condition called the degeneracy principle. None of the no-hidden-variable proofs apply to a class of hidden variable theories that are not phase-space reconstructions of quantum mechanics. The author discusses one of these theories, the Bohm-Bub theory, and finds that hidden variable theories that reproduce all the quantum statistics, for single and sequential measurements, must introduce a randomization process for the hidden variables after each measurement. The philosophical significance of this theory lies in the role it can play in solving the conceptual puzzles posed by quantum theory.

  14. Using mi impute chained to fit ANCOVA models in randomized trials with censored dependent and independent variables

    DEFF Research Database (Denmark)

    Andersen, Andreas; Rieckmann, Andreas

    2016-01-01

    In this article, we illustrate how to use mi impute chained with intreg to fit an analysis of covariance (ANCOVA) model to censored and nondetectable immunological concentrations measured in a randomized pretest–posttest design.

  15. After-School Multifamily Groups: A Randomized Controlled Trial Involving Low-Income, Urban, Latino Children

    Science.gov (United States)

    McDonald, Lynn; Moberg, D. Paul; Brown, Roger; Rodriguez-Espiricueta, Ismael; Flores, Nydia I.; Burke, Melissa P.; Coover, Gail

    2006-01-01

    This randomized controlled trial evaluated a culturally representative parent engagement strategy with Latino parents of elementary school children. Ten urban schools serving low-income children from mixed cultural backgrounds participated in a large study. Classrooms were randomly assigned either to an after-school, multifamily support…

  16. Entropy as a collective variable

    Science.gov (United States)

    Parrinello, Michele

    Sampling complex free energy surfaces that exhibit long-lived metastable states separated by kinetic bottlenecks is one of the most pressing issues in the atomistic simulation of matter. Not surprisingly, many solutions to this problem have been suggested. Many of them are based on the identification of appropriate collective variables that span the manifold of the slowly varying modes of the system. While much effort has been put into devising, and even constructing on the fly, appropriate collective variables, there is still a cogent need for simple, generic, physically transparent, and yet effective collective variables. Motivated by the physical observation that in many cases transitions between one metastable state and another result from a trade-off between enthalpy and entropy, we introduce collective variables that are able to represent these two physical properties in a simple way. We use these variables in the context of the recently introduced variationally enhanced sampling and apply them with success to the simulation of crystallization from the liquid and to conformational transitions in proteins.

  17. Variability in perceived satisfaction of reservoir management objectives

    Science.gov (United States)

    Owen, W.J.; Gates, T.K.; Flug, M.

    1997-01-01

    Fuzzy set theory provides a useful model to address imprecision in interpreting linguistically described objectives for reservoir management. Fuzzy membership functions can be used to represent degrees of objective satisfaction for different values of management variables. However, lack of background information, differing experiences and qualifications, and complex interactions of influencing factors can contribute to significant variability among membership functions derived from surveys of multiple experts. In the present study, probabilistic membership functions are used to model variability in experts' perceptions of satisfaction of objectives for hydropower generation, fish habitat, kayaking, rafting, and scenery preservation on the Green River through operations of Flaming Gorge Dam. Degree of variability in experts' perceptions differed among objectives but resulted in substantial uncertainty in estimation of optimal reservoir releases.

  18. Random variables in forest policy: A systematic sensitivity analysis using CGE models

    International Nuclear Information System (INIS)

    Alavalapati, J.R.R.

    1999-01-01

    Computable general equilibrium (CGE) models are extensively used to simulate the economic impacts of forest policies. Parameter values used in these models often play a central role in their outcome. Since econometric studies and best guesses are the main sources of these parameters, some randomness exists about the 'true' values of these parameters. Failure to incorporate this randomness into these models may limit the degree of confidence in the validity of the results. In this study, we conduct a systematic sensitivity analysis (SSA) to assess the economic impacts of: 1) a 1% increase in tax on Canadian lumber and wood products exports to the United States (US), and 2) a 1% decrease in technical change in the lumber and wood products and pulp and paper sectors of the US and Canada. We achieve this task by using an aggregated version of the global trade model developed by Hertel (1997) and the automated SSA procedure developed by Arndt and Pearson (1996). The estimated means and standard deviations suggest that certain impacts are more likely than others. For example, an increase in export tax is likely to cause a decrease in Canadian income, while an increase in US income is unlikely. On the other hand, a decrease in US welfare is likely, while an increase in Canadian welfare is unlikely, in response to an increase in tax. It is likely that income and welfare both fall in Canada and the US in response to a decrease in technical change in the lumber and wood products and pulp and paper sectors. 21 refs, 1 fig, 5 tabs

  19. The necessity of randomness in tests of Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Bednorz, Adam; Zielinski, Jakub

    2003-08-11

    The possibility that detectors may affect the input quantum entangled state is pointed out. It is suggested that experiments testing Bell inequalities should be repeated with more randomly oriented polarizers, both to close the communication loophole and to refute certain local-variable theories with a low efficiency bound.

  20. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    Science.gov (United States)

    Staley, James R.

    2017-01-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
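
A minimal sketch of the stratified LACE idea follows, on simulated data. The stratification uses the instrument-free component of the exposure (formed here with the known simulation coefficient for brevity; in a real analysis it would be estimated), and the per-stratum IV ratio estimate should rise with exposure for the quadratic relationship used.

```python
# Hedged stratified-LACE sketch on simulated data (not the authors' code).
import numpy as np

rng = np.random.default_rng(3)
n = 20000
g = rng.binomial(2, 0.3, n)                  # genetic instrument (allele count)
u = rng.normal(size=n)                       # unobserved confounder
x = 0.5 * g + u + rng.normal(size=n)         # exposure
y = 0.2 * x**2 + u + rng.normal(size=n)      # nonlinear outcome, dy/dx = 0.4x

x0 = x - 0.5 * g          # instrument-free exposure (true coefficient, for brevity)
idx = np.digitize(x0, np.quantile(x0, [0.25, 0.5, 0.75]))

for s in range(4):
    m = idx == s
    beta_xg = np.cov(x[m], g[m])[0, 1] / np.var(g[m])   # instrument-exposure slope
    beta_yg = np.cov(y[m], g[m])[0, 1] / np.var(g[m])   # instrument-outcome slope
    print(s, round(beta_yg / beta_xg, 3))               # LACE; increases across strata
```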

  1. Assessing the potential of random forest method for estimating solar radiation using air pollution index

    International Nuclear Information System (INIS)

    Sun, Huaiwei; Gui, Dongwei; Yan, Baowei; Liu, Yi; Liao, Weihong; Zhu, Yan; Lu, Chengwei; Zhao, Na

    2016-01-01

    Highlights: • Models based on random forests for daily solar radiation estimation are proposed. • Three sites with different air pollution index conditions are considered. • Performance of random forests is better than that of empirical methodologies. • Special attention is given to the use of the air pollution index. • The potential of the air pollution index is assessed by random forest models. - Abstract: Simulations of solar radiation have become increasingly common in recent years because of the rapid global development and deployment of solar energy technologies. The effect of air pollution on solar radiation is well known. However, few studies have attempted to evaluate the potential of the air pollution index in estimating solar radiation. In this study, meteorological data, solar radiation, and air pollution index data from three sites having different air pollution index conditions are used to develop random forest models. We propose different random forest models with and without considering air pollution index data, and then compare their respective performance with that of empirical methodologies. In addition, a variable importance approach based on random forest is applied in order to assess input variables. The results show that the performance of random forest models with air pollution index data is better than that of the empirical methodologies, generating 9.1–17.0% lower values of root-mean-square error in the fitting period and 2.0–17.4% lower values in the prediction period. Both the comparative results of the different random forest models and the variable importance ranking indicate that including air pollution index data improves the estimation of solar radiation. Also, although air pollution index values varied largely from season to season, the random forest models showed more robust performance across seasons than the other models. The findings can act as a guide in selecting the variables used to estimate daily solar
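
A hedged sketch of the with/without-API comparison is below, using scikit-learn and synthetic stand-in data (all variable names and coefficients are mine, not the study's): fit RandomForestRegressor with and without an air-pollution-index column, then compare test RMSE and feature importances.

```python
# Hedged with/without-API comparison on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 3000
tmax = rng.uniform(5, 35, n)                 # daily maximum temperature
sun = rng.uniform(0, 12, n)                  # sunshine duration, hours
api = rng.uniform(20, 300, n)                # air pollution index
y = 0.8 * sun + 0.1 * tmax - 0.02 * api + rng.normal(0, 0.5, n)  # "radiation"

X_full = np.column_stack([tmax, sun, api])
for name, X in [("with API", X_full), ("without API", X_full[:, :2])]:
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr, ytr)
    rmse = mean_squared_error(yte, rf.predict(Xte)) ** 0.5
    print(name, round(rmse, 3), np.round(rf.feature_importances_, 2))
```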

  2. Random numbers spring from alpha decay

    International Nuclear Information System (INIS)

    Frigerio, N.A.; Sanathanan, L.P.; Morley, M.; Clark, N.A.; Tyler, S.A.

    1980-05-01

    Congruential random number generators, which are widely used in Monte Carlo simulations, are deficient in that the numbers they generate are concentrated in a relatively small number of hyperplanes. While this deficiency may not be a limitation in small Monte Carlo studies involving a few variables, it introduces a significant bias in large simulations requiring high resolution. This bias was recognized and assessed during preparations for an accident analysis study of nuclear power plants. This report describes a random number device based on the radioactive decay of alpha particles from a 235U source in a high-resolution gas proportional counter. The signals were fed to a 4096-channel analyzer, and for each channel the frequency of signals registered in a 20,000-microsecond interval was recorded. The parity bits of these frequency counts (0 for an even count and 1 for an odd count) were then assembled in sequence to form 31-bit binary random numbers and transcribed to magnetic tape. This cycle was repeated as many times as were necessary to create 3 million random numbers. The frequency distribution of counts from the present device conforms to the Brockwell-Moyal distribution, which takes into account the dead time of the counter (both the dead time and the decay constant of the underlying Poisson process were estimated). Analysis of the count data and tests of randomness on a sample set of the 31-bit binary numbers indicate that this random number device is a highly reliable source of truly random numbers. Its use is, therefore, recommended in Monte Carlo simulations for which congruential pseudorandom number generators are found to be inadequate. 6 figures, 5 tables
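
The parity-bit construction is easy to mimic with Poisson counts standing in for the decay counts (this simplification ignores counter dead time, which the Brockwell-Moyal analysis above accounts for):

```python
# Hedged simulation of the parity-bit scheme (ignores counter dead time).
import numpy as np

rng = np.random.default_rng(5)

def parity_random_numbers(n_numbers, mean_count=50.0):
    counts = rng.poisson(mean_count, size=n_numbers * 31)  # counts per window
    bits = (counts & 1).reshape(n_numbers, 31)             # parity bits
    weights = 1 << np.arange(30, -1, -1)                   # most significant bit first
    return bits @ weights                                  # 31-bit integers

nums = parity_random_numbers(10)
print(nums)
print(np.mean([bin(int(v)).count("1") for v in nums]))    # ~15.5 set bits per number
```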

  3. Decentralized formation of random regular graphs for robust multi-agent networks

    KAUST Repository

    Yazicioglu, A. Yasin

    2014-12-15

    Multi-agent networks are often modeled via interaction graphs, where the nodes represent the agents and the edges denote direct interactions between the corresponding agents. Interaction graphs have significant impact on the robustness of networked systems. One family of robust graphs is the random regular graphs. In this paper, we present a locally applicable reconfiguration scheme to build random regular graphs through self-organization. For any connected initial graph, the proposed scheme maintains connectivity and the average degree while minimizing the degree differences and randomizing the links. As such, if the average degree of the initial graph is an integer, then connected regular graphs are realized uniformly at random as time goes to infinity.

  4. Emergence of an optimal search strategy from a simple random walk.

    Science.gov (United States)

    Sakiyama, Tomoko; Gunji, Yukio-Pegio

    2013-09-06

    In reports addressing animal foraging strategies, it has been stated that Lévy-like algorithms represent an optimal search strategy in an unknown environment, because of their super-diffusion properties and power-law-distributed step lengths. Here, starting with a simple random walk algorithm, which offers the agent a randomly determined direction at each time step with a fixed move length, we investigated how flexible exploration is achieved if an agent alters its randomly determined next step forward and the rule that controls its random movement based on its own directional moving experiences. We showed that our algorithm led to an effective food-searching performance compared with a simple random walk algorithm and exhibited super-diffusion properties, despite the uniform step lengths. Moreover, our algorithm exhibited a power-law distribution independent of uniform step lengths.

  5. The Effects of Point or Polygon Based Training Data on RandomForest Classification Accuracy of Wetlands

    Directory of Open Access Journals (Sweden)

    Jennifer Corcoran

    2015-04-01

    Wetlands are dynamic in space and time, providing varying ecosystem services. Field reference data for both training and assessment of wetland inventories in the State of Minnesota are typically collected as GPS points over wide geographical areas and at infrequent intervals. This status quo makes it difficult to keep updated maps of wetlands with adequate accuracy, efficiency, and consistency to monitor change. Furthermore, point reference data may not be representative of the prevailing land cover type for an area, due to point location or heterogeneity within the ecosystem of interest. In this research, we present techniques for training a land cover classification for two study sites in different ecoregions by implementing the RandomForest classifier in three ways: (1) field and photo-interpreted points; (2) a fixed window surrounding the points; and (3) image objects that intersect the points. Additional assessments are made to identify the key input variables. We conclude that the image object area training method is the most accurate, and the most important variables include: compound topographic index, summer season green and blue bands, and grid statistics from LiDAR point cloud data, especially those that relate to the height of the return.

  6. Lectures on random interfaces

    CERN Document Server

    Funaki, Tadahisa

    2016-01-01

    Interfaces are created to separate two distinct phases in a situation in which phase coexistence occurs. This book discusses randomly fluctuating interfaces in several different settings and from several points of view: discrete/continuum, microscopic/macroscopic, and static/dynamic theories. The following four topics in particular are dealt with in the book. Assuming that the interface is represented as a height function measured from a fixed-reference discretized hyperplane, the system is governed by the Hamiltonian of gradient of the height functions. This is a kind of effective interface model called ∇φ-interface model. The scaling limits are studied for Gaussian (or non-Gaussian) random fields with a pinning effect under a situation in which the rate functional of the corresponding large deviation principle has non-unique minimizers. Young diagrams determine decreasing interfaces, and their dynamics are introduced. The large-scale behavior of such dynamics is studied from the points of view of the hyd...

  7. CHARACTERIZING THE OPTICAL VARIABILITY OF BRIGHT BLAZARS: VARIABILITY-BASED SELECTION OF FERMI ACTIVE GALACTIC NUCLEI

    International Nuclear Information System (INIS)

    Ruan, John J.; Anderson, Scott F.; MacLeod, Chelsea L.; Becker, Andrew C.; Davenport, James R. A.; Ivezić, Željko; Burnett, T. H.; Kochanek, Christopher S.; Plotkin, Richard M.; Sesar, Branimir; Stuart, J. Scott

    2012-01-01

    We investigate the use of optical photometric variability to select and identify blazars in large-scale time-domain surveys, in part to aid in the identification of blazar counterparts to the ∼30% of γ-ray sources in the Fermi 2FGL catalog still lacking reliable associations. Using data from the optical LINEAR asteroid survey, we characterize the optical variability of blazars by fitting a damped random walk model to individual light curves with two main model parameters, the characteristic timescale of variability τ and the driving amplitude on short timescales σ̂. Imposing cuts on minimum τ and σ̂ allows for blazar selection with high efficiency E and completeness C. To test the efficacy of this approach, we apply this method to optically variable LINEAR objects that fall within the several-arcminute error ellipses of γ-ray sources in the Fermi 2FGL catalog. Despite the extreme stellar contamination at the shallow depth of the LINEAR survey, we are able to recover previously associated optical counterparts to Fermi active galactic nuclei with E ≥ 88% and C = 88% in Fermi 95% confidence error ellipses having semimajor axis r < 8'. We find that the suggested radio counterpart to Fermi source 2FGL J1649.6+5238 has optical variability consistent with other γ-ray blazars and is likely to be the γ-ray source. Our results suggest that the variability of the non-thermal jet emission in blazars is stochastic in nature, with unique variability properties due to the effects of relativistic beaming. After correcting for beaming, we estimate that the characteristic timescale of blazar variability is ∼3 years in the rest frame of the jet, in contrast with the ∼320 day disk flux timescale observed in quasars. The variability-based selection method presented will be useful for blazar identification in time-domain optical surveys and is also a probe of jet physics.

  8. The blocked-random effect in pictures and words.

    Science.gov (United States)

    Toglia, M P; Hinman, P J; Dayton, B S; Catalano, J F

    1997-06-01

    Picture and word recall was examined in conjunction with list organization. 60 subjects studied a list of 30 items, either words or their pictorial equivalents. The 30 words/pictures, members of five conceptual categories, each represented by six exemplars, were presented either blocked by category or in a random order. While pictures were recalled better than words and a standard blocked-random effect was observed, the interaction indicated that the recall advantage of a blocked presentation was restricted to the word lists. A similar pattern emerged for clustering. These findings are discussed in terms of limitations upon the pictorial superiority effect.

  9. A continuous-time random-walk approach to the Cole-Davidson dielectric response of dipolar liquids

    DEFF Research Database (Denmark)

    Szabat, B.; Langner, K. M.; Klösgen-Buchkremer, Beate Maria

    2004-01-01

    We show how the Cole-Davidson relaxation response, characteristic of alcoholic systems, can be derived within the framework of the continuous-time random walk (CTRW). Using the random-variable formalism, we indicate that the high-frequency power law of dielectric spectra is determined by the heavy...

  10. A continuous-time random-walk approach to the Cole-Davidson dielectric response of dipolar liquids

    DEFF Research Database (Denmark)

    Szabat, Bozena; Langner, Karol M.; Klösgen, Beate Maria

    2005-01-01

    We show how the Cole-Davidson relaxation response, characteristic of alcoholic systems, can be derived within the framework of the continuous-time random walk (CTRW). Using the random-variable formalism, we indicate that the high-frequency power law of dielectric spectra is determined by the heav...

  11. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood–bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated from the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
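
A toy version of the one-compartment model with random decay can make the convergence claim concrete. In the sketch below (my construction, not the report's model), the concentration evolves as c_{k+1} = a_k c_k with i.i.d. random retention factors a_k, and the ensemble-average concentration matches exp(−λk) with λ = −log E[a] exactly, since E[c_k] = (E[a])^k.

```python
# Hedged toy one-compartment model with random per-step retention.
import numpy as np

rng = np.random.default_rng(6)
steps, paths = 100, 20000
a = rng.uniform(0.9, 0.98, size=(paths, steps))   # random retention factors
c = np.cumprod(a, axis=1)                         # concentrations, c_0 = 1

lam = -np.log((0.9 + 0.98) / 2)                   # -log E[a] for the uniform draw
k = np.arange(1, steps + 1)
print(np.max(np.abs(c.mean(axis=0) - np.exp(-lam * k))))  # small: mean decays exponentially
```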

  12. Statistical theory of correlations in random packings of hard particles.

    Science.gov (United States)

    Jin, Yuliang; Puckett, James G; Makse, Hernán A

    2014-05-01

    A random packing of hard particles represents a fundamental model for granular matter. Despite its importance, analytical modeling of random packings remains difficult due to the existence of strong correlations which preclude the development of a simple theory. Here, we take inspiration from liquid theories for the n-particle angular correlation function to develop a formalism of random packings of hard particles from the bottom up. A progressive expansion into a shell of particles converges in the large layer limit under a Kirkwood-like approximation of higher-order correlations. We apply the formalism to hard disks and predict the density of two-dimensional random close packing (RCP), φ_RCP = 0.85 ± 0.01, and random loose packing (RLP), φ_RLP = 0.67 ± 0.01. Our theory also predicts a phase diagram and angular correlation functions that are in good agreement with experimental and numerical data.

  13. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 1. Theory

    Science.gov (United States)

    Graham, Wendy D.; Tankersley, Claude D.

    1994-05-01

    Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.

  14. Higher-dimensional cosmological model with variable gravitational ...

    Indian Academy of Sciences (India)

    variable G and bulk viscosity in Lyra geometry. Exact solutions for ... a comparative study of Robertson–Walker models with a constant deceleration ... where H is defined as H = (Ȧ/A) + (1/3)(Ḃ/B), and β₀, H₀ represent the present values of β ...

  15. Calculating radiotherapy margins based on Bayesian modelling of patient specific random errors

    International Nuclear Information System (INIS)

    Herschtal, A; Te Marvelde, L; Mengersen, K; Foroudi, F; Ball, D; Devereux, T; Pham, D; Greer, P B; Pichler, P; Eade, T; Kneebone, A; Bell, L; Caine, H; Hindson, B; Kron, T; Hosseinifard, Z

    2015-01-01

    Collected real-life clinical target volume (CTV) displacement data show that some patients undergoing external beam radiotherapy (EBRT) demonstrate significantly more fraction-to-fraction variability in their displacement (‘random error’) than others. This contrasts with the common assumption made by historical recipes for margin estimation for EBRT, that the random error is constant across patients. In this work we present statistical models of CTV displacements in which random errors are characterised by an inverse gamma (IG) distribution in order to assess the impact of random error variability on CTV-to-PTV margin widths, for eight real world patient cohorts from four institutions, and for different sites of malignancy. We considered a variety of clinical treatment requirements and penumbral widths. The eight cohorts consisted of a total of 874 patients and 27 391 treatment sessions. Compared to a traditional margin recipe that assumes constant random errors across patients, for a typical 4 mm penumbral width, the IG based margin model mandates that in order to satisfy the common clinical requirement that 90% of patients receive at least 95% of prescribed RT dose to the entire CTV, margins be increased by a median of 10% (range over the eight cohorts −19% to +35%). This substantially reduces the proportion of patients for whom margins are too small to satisfy clinical requirements. (paper)
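
A hedged sketch of the comparison follows, using a simplified one-dimensional van Herk-style recipe (margin = 2.5Σ + 0.7σ) with per-patient random-error SDs drawn from an inverse-gamma model; all parameter values are illustrative, not the paper's fitted ones.

```python
# Hedged margin comparison with inverse-gamma patient-level random errors.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
Sigma = 2.0                                    # systematic-error SD, mm (illustrative)
alpha, beta = 4.0, 12.0                        # IG shape/scale for sigma_p^2 (illustrative)
sigma_p = np.sqrt(stats.invgamma.rvs(alpha, scale=beta, size=100_000, random_state=rng))

per_patient = 2.5 * Sigma + 0.7 * sigma_p      # margin each patient would need
margin_ig = np.quantile(per_patient, 0.90)     # width covering 90% of patients
margin_const = 2.5 * Sigma + 0.7 * sigma_p.mean()   # constant-random-error recipe
print(margin_ig, margin_const, f"{100 * (margin_ig / margin_const - 1):.1f}% wider")
```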

  16. Online Synthesis for Operation Execution Time Variability on Digital Microfluidic Biochips

    DEFF Research Database (Denmark)

    Alistar, Mirela; Pop, Paul

    2014-01-01

    Previous approaches have assumed that each biochemical operation in an application is characterized by a worst-case execution time (wcet). However, during the execution of the application, due to variability and randomness in biochemical reactions, operations may finish earlier than their wcets. In this paper we propose an online synthesis strategy that re-synthesizes the application at runtime when operations experience variability in their execution time, thus obtaining shorter application execution times. The proposed strategy has been evaluated using several benchmarks.

  17. 14 CFR 1260.58 - Designation of new technology representative and patent representative.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 false Designation of new technology representative and patent representative. … Designation of New Technology … of this grant entitled "New Technology," the following named representatives are hereby designated by…

  18. Individualized anemia management reduces hemoglobin variability in hemodialysis patients.

    Science.gov (United States)

    Gaweda, Adam E; Aronoff, George R; Jacobs, Alfred A; Rai, Shesh N; Brier, Michael E

    2014-01-01

    One-size-fits-all protocol-based approaches to anemia management with erythropoiesis-stimulating agents (ESAs) may result in undesired patterns of hemoglobin variability. In this single-center, double-blind, randomized controlled trial, we tested the hypothesis that individualized dosing of ESA improves hemoglobin variability over a standard population-based approach. We enrolled 62 hemodialysis patients and followed them over a 12-month period. Patients were randomly assigned to receive ESA doses guided by the Smart Anemia Manager algorithm (treatment) or by a standard protocol (control). Dose recommendations, performed on a monthly basis, were validated by an expert physician anemia manager. The primary outcome was the percentage of hemoglobin concentrations between 10 and 12 g/dl over the follow-up period. A total of 258 of 356 (72.5%) hemoglobin concentrations were between 10 and 12 g/dl in the treatment group, compared with 208 of 336 (61.9%) in the control group; 42 (11.8%) hemoglobin concentrations were >12 g/dl in the treatment group compared with 46 (13.4%) in the control group. The median ESA dosage per patient was 2000 IU/wk in both groups. Five participants received 6 transfusions (21 U) in the treatment group, compared with 8 participants and 13 transfusions (31 U) in the control group. These results suggest that individualized ESA dosing decreases total hemoglobin variability compared with a population protocol-based approach. As hemoglobin levels are declining in hemodialysis patients, decreasing hemoglobin variability may help reduce the risk of transfusions in this population.

  19. Risk Gambling and Personality: Results from a Representative Swedish Sample.

    Science.gov (United States)

    Sundqvist, Kristina; Wennberg, Peter

    2015-12-01

    The association between personality and gambling has been explored previously. However, few studies are based on representative populations. This study aimed at examining the association between risk gambling and personality in a representative Swedish population. A random Swedish sample (N = 19,530) was screened for risk gambling using the Lie/Bet questionnaire. The study sample (N = 257) consisted of those screening positive on Lie/Bet and completing a postal questionnaire about gambling and personality (measured with the NODS-PERC and the HP5i respectively). Risk gambling was positively correlated with Negative Affectivity (a facet of Neuroticism) and Impulsivity (an inversely related facet of Conscientiousness), but all associations were weak. When taking age and gender into account, there were no differences in personality across game preference groups, though preferred game correlated with level of risk gambling. Risk gamblers scored lower than the population norm data with respect to Negative Affectivity, but risk gambling men scored higher on Impulsivity. The association between risk gambling and personality found in previous studies was corroborated in this study using a representative sample. We conclude that risk and problem gamblers should not be treated as a homogeneous group, and prevention and treatment interventions should be adapted according to differences in personality, preferred type of game and the risk potential of the games.

  20. ALOHA Random Access that Operates as a Rateless Code

    DEFF Research Database (Denmark)

    Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    Various applications of wireless Machine-to-Machine (M2M) communications have rekindled the research interest in random access protocols, suitable to support a large number of connected devices. Slotted ALOHA and its derivatives represent a simple solution for distributed random access in wireless networks. … the contention when the instantaneous throughput is maximized. The paper presents the related analysis, providing heuristic criteria for terminating the contention period and showing that very high throughputs can be achieved, even for a low number of contending users. The demonstrated results potentially have…

  1. Recent activities of the Seismology Division Early Career Representative(s)

    Science.gov (United States)

    Agius, Matthew; Van Noten, Koen; Ermert, Laura; Mai, P. Martin; Krawczyk, CharLotte

    2016-04-01

    The European Geosciences Union is a bottom-up organisation, in which its members are represented by their respective scientific divisions, committees and council. In recent years, EGU has embarked on a mission to reach out to its numerous 'younger' members by giving awards to outstanding young scientists and setting up Early Career Scientists (ECS) representatives. The division representative's role is to engage in discussions that concern students and early career scientists. Several meetings between all the division representatives are held throughout the year to discuss ideas and Union-wide issues. One important impact ECS representatives have had on EGU is the increased number of short courses and workshops run by ECS during the annual General Assembly. Another important contribution of ECS representatives was redefining 'Young Scientist' as 'Early Career Scientist', which avoids discrimination due to age. Since 2014, the Seismology Division has had its own ECS representative. In an effort to more effectively reach out to young seismologists, a blog and a social media page dedicated to seismology have been set up online. With this dedicated blog, we'd like to give more depth to the average browsing experience by enabling young researchers to explore various seismology topics in one place while making the field more exciting and accessible to the broader community. These pages are used to promote the latest research, especially of young seismologists, and to share interesting seismo-news. Over the months the pages have proved popular, with hundreds of views every week and an increasing number of followers. An online survey was conducted to learn more about the activities and needs of early career seismologists. We present the results from this survey and the work that has been carried out over the last two years, including detail of what has been achieved so far and what we would like the ECS representation for Seismology to achieve. Young seismologists are

  2. Investigating Organizational Alienation Behavior in Terms of Some Variables

    Science.gov (United States)

    Dagli, Abidin; Averbek, Emel

    2017-01-01

    The aim of this study is to detect the perceptions of public primary school teachers regarding organizational alienation behaviors in terms of some variables (gender, marital status and seniority). A survey model was used in this study. The research sample consists of 346 randomly selected teachers from 40 schools in the central district of Mardin,…

  3. Random isotropic one-dimensional XY-model

    Science.gov (United States)

    Gonçalves, L. L.; Vieira, A. P.

    1998-01-01

    The 1D isotropic s = 1/2 XY model (N sites), with random exchange interaction in a transverse random field, is considered. The random variables satisfy bimodal quenched distributions. The solution is obtained by using the Jordan-Wigner fermionization and a canonical transformation, reducing the problem to diagonalizing an N × N matrix, corresponding to a system of N noninteracting fermions. The calculations are performed numerically for N = 1000, and the field-induced magnetization at T = 0 is obtained by averaging the results for the different samples. For the dilute case, in the uniform field limit, the magnetization exhibits various discontinuities, which are the consequence of the existence of disconnected finite clusters distributed along the chain. Also in this limit, for finite exchange constants J_A and J_B, as the probability of J_A varies from one to zero, the saturation field is seen to vary from Γ_A to Γ_B, where Γ_A (Γ_B) is the value of the saturation field for the pure case with exchange constant equal to J_A (J_B).
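
The Jordan-Wigner reduction makes the numerics very light. The sketch below (one common sign convention; my construction, not the authors' code) builds the N × N one-particle matrix for bimodal random couplings and fields, diagonalizes it, and obtains the T = 0 transverse magnetization by filling the negative-energy modes.

```python
# Hedged free-fermion sketch of the random isotropic XY chain.
import numpy as np

rng = np.random.default_rng(8)
N = 1000
J = rng.choice([0.5, 1.0], size=N - 1)         # bimodal random exchange
Gamma = rng.choice([0.8, 1.2], size=N)         # bimodal random transverse field

A = np.diag(-Gamma) + np.diag(J / 2, 1) + np.diag(J / 2, -1)
eps = np.linalg.eigvalsh(A)                    # one-particle energies

n_occ = np.sum(eps < 0)                        # modes filled in the ground state
m_z = n_occ / N - 0.5                          # per-site transverse magnetization
print(m_z)
```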

  4. Replacement and inspection policies for products with random life cycle

    International Nuclear Information System (INIS)

    Yun, Won Young; Nakagawa, Toshio

    2010-01-01

    In this paper, we consider maintenance policies for products in which the economic life cycle of the product is a random variable. First, we study a periodic replacement policy with minimal repair. The system is minimally repaired at failure and is replaced by a new one at age T (the periodic replacement policy with minimal repair of Barlow and Hunter). The expected present value of the total maintenance cost of products with random life cycle is obtained and the optimal replacement interval minimizing the cost is found. Second, we consider an inspection policy for products with random life cycle to detect system failure. The expected total cost is obtained and the optimal inspection interval is found. Numerical examples are also included.
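
    The flavor of the first optimization can be illustrated with the classical Barlow-Hunter cost rate. The sketch below assumes a Weibull hazard and the usual infinite-horizon criterion, so it shows the baseline periodic-replacement problem rather than the paper's random-life-cycle cost; all cost and Weibull parameters are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Periodic replacement with minimal repair (Barlow-Hunter): replace at
    # age T, minimally repair in between. With a Weibull hazard, the expected
    # number of minimal repairs in [0, T] is H(T) = (T/eta)**beta, so the
    # long-run cost rate is C(T) = (c_r + c_m * H(T)) / T.
    c_r, c_m = 100.0, 20.0          # replacement / minimal-repair costs (assumed)
    beta, eta = 2.5, 10.0           # Weibull shape and scale (assumed)

    def cost_rate(T):
        return (c_r + c_m * (T / eta) ** beta) / T

    res = minimize_scalar(cost_rate, bounds=(0.1, 100.0), method='bounded')
    # Known closed form for beta > 1, as a consistency check:
    T_star = eta * (c_r / (c_m * (beta - 1))) ** (1.0 / beta)
    print(res.x, T_star, cost_rate(res.x))
    ```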

  5. Estimating overall exposure effects for the clustered and censored outcome using random effect Tobit regression models.

    Science.gov (United States)

    Wang, Wei; Griswold, Michael E

    2016-11-30

    The random effect Tobit model is a regression model that accommodates both left- and/or right-censoring and within-cluster dependence of the outcome variable. Regression coefficients of random effect Tobit models have conditional interpretations on a constructed latent dependent variable and do not provide inference of overall exposure effects on the original outcome scale. Marginalized random effects model (MREM) permits likelihood-based estimation of marginal mean parameters for the clustered data. For random effect Tobit models, we extend the MREM to marginalize over both the random effects and the normal space and boundary components of the censored response to estimate overall exposure effects at population level. We also extend the 'Average Predicted Value' method to estimate the model-predicted marginal means for each person under different exposure status in a designated reference group by integrating over the random effects and then use the calculated difference to assess the overall exposure effect. The maximum likelihood estimation is proposed utilizing a quasi-Newton optimization algorithm with Gauss-Hermite quadrature to approximate the integration of the random effects. We use these methods to carefully analyze two real datasets. Copyright © 2016 John Wiley & Sons, Ltd.
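
    The integration over the random effects can be made concrete with Gauss-Hermite quadrature. The sketch below computes the marginal (population-level) mean of a left-censored-at-zero outcome under a random-intercept Tobit model; the parameter names, censoring point and values are assumptions, and only the marginalization step is shown, not the full MREM likelihood machinery.

    ```python
    import numpy as np
    from scipy.stats import norm

    def marginal_tobit_mean(xb, sigma_e, tau, n_nodes=30):
        """E[Y] for Y = max(Y*, 0), Y* = xb + b + e, b ~ N(0, tau^2),
        e ~ N(0, sigma_e^2); the random intercept b is integrated out
        with Gauss-Hermite quadrature."""
        t, w = np.polynomial.hermite.hermgauss(n_nodes)  # nodes for e^{-t^2}
        mu = xb + np.sqrt(2.0) * tau * t                 # change of variables
        z = mu / sigma_e
        cond_mean = norm.cdf(z) * mu + sigma_e * norm.pdf(z)  # E[Y | b]
        return np.sum(w * cond_mean) / np.sqrt(np.pi)

    # Overall exposure effect as a difference in model-predicted marginal
    # means under exposed (xb = 1.0) vs unexposed (xb = 0.5) status.
    print(marginal_tobit_mean(1.0, 1.0, 0.5) - marginal_tobit_mean(0.5, 1.0, 0.5))
    ```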

  6. The Temporal Structure of State Self-Esteem Variability During Parent-Adolescent Interactions : More Than Random Fluctuations

    NARCIS (Netherlands)

    De Ruiter, Naomi M. P.; Den Hartigh, Ruud J. R.; Cox, Ralf F. A.; Van Geert, Paul L. C.; Kunnen, E. Saskia

    2015-01-01

    Research regarding the variability of state self-esteem (SSE) commonly focuses on the magnitude of variability. In this article we provide the first empirical test of the temporal structure of SSE as a real-time process during parent-adolescent interactions. We adopt a qualitative phenomenological

  7. Variability among Capsicum baccatum accessions from Goiás, Brazil, assessed by morphological traits and molecular markers.

    Science.gov (United States)

    Martinez, A L A; Araújo, J S P; Ragassi, C F; Buso, G S C; Reifschneider, F J B

    2017-07-06

    Capsicum peppers are native to the Americas, with Brazil being a significant diversity center. Capsicum baccatum accessions at Instituto Federal (IF) Goiano represent a portion of the species' genetic resources from central Brazil. We aimed to characterize a C. baccatum working collection comprising 27 accessions and 3 commercial cultivars using morphological traits and molecular markers to describe its genetic and morphological variability and verify the occurrence of duplicates. This set included 1 C. baccatum var. praetermissum and 29 C. baccatum var. pendulum with potential for use in breeding programs. Twenty-two morphological descriptors, 57 inter-simple sequence repeat (ISSR), and 34 random amplified polymorphic DNA (RAPD) markers were used. Genetic distance was calculated through the Jaccard similarity index and genetic variability through cluster analysis using the unweighted pair group method with arithmetic mean, resulting in dendrograms for both the morphological and the molecular analyses. Genetic variability was found among C. baccatum var. pendulum accessions, and the distinction between the two C. baccatum varieties was evident in both the morphological and molecular analyses. The 29 C. baccatum var. pendulum genotypes clustered into four groups according to fruit type in the morphological analysis. They formed seven groups in the molecular analysis, without a clear correspondence with morphology. No duplicates were found. The results describe the genetic and morphological variability, provide a detailed characterization of the genotypes, and rule out the presence of duplicates within the IF Goiano C. baccatum L. collection. This study will promote the use of this germplasm collection in C. baccatum breeding programs.

  8. Genetic Variability for Drought Adaptive Traits in A-511 Maize ...

    African Journals Online (AJOL)

    Drought causes considerable yield reduction in maize (Zea mays L.) grown in the moisture stressed areas of Ethiopia. Increased crop production through improvement is expected if the adapted local genotypes possess variability for drought adaptive traits. Randomly taken 196 S1 lines generated from Population A-511 ...

  9. Random matrix ensembles for PT-symmetric systems

    International Nuclear Information System (INIS)

    Graefe, Eva-Maria; Mudute-Ndumbe, Steve; Taylor, Matthew

    2015-01-01

    Recently much effort has been made towards the introduction of non-Hermitian random matrix models respecting PT-symmetry. Here we show that there is a one-to-one correspondence between complex PT-symmetric matrices and split-complex and split-quaternionic versions of Hermitian matrices. We introduce two new random matrix ensembles of (a) Gaussian split-complex Hermitian; and (b) Gaussian split-quaternionic Hermitian matrices, of arbitrary sizes. We conjecture that these ensembles represent universality classes for PT-symmetric matrices. For the case of 2 × 2 matrices we derive analytic expressions for the joint probability distributions of the eigenvalues, the one-level densities and the level spacings in the case of real eigenvalues. (fast track communication)

  10. General Exact Solution to the Problem of the Probability Density for Sums of Random Variables

    Science.gov (United States)

    Tribelsky, Michael I.

    2002-07-01

    The exact explicit expression for the probability density pN(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of pN(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts as well as that for the entire profile is obtained. A number of particular examples are considered in detail.
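
    For independent and identically distributed summands, p_N(x) can be checked numerically by repeated convolution. The sketch below uses exponential summands, a non-stable case where the exact N-fold density is a gamma density, so core, tail and crossover can all be inspected; the grid spacing and support are arbitrary choices, not taken from the paper.

    ```python
    import numpy as np
    from scipy.stats import gamma

    def density_of_sum(pdf_vals, dx, N):
        """Density of a sum of N i.i.d. summands via repeated direct
        convolution on a uniform grid (Riemann-sum approximation)."""
        p = pdf_vals.copy()
        for _ in range(N - 1):
            p = np.convolve(p, pdf_vals) * dx   # one more summand per pass
        return p

    dx = 0.01
    x = np.arange(0.0, 30.0, dx)
    p1 = np.exp(-x)                              # Exp(1) density on the grid
    p5 = density_of_sum(p1, dx, N=5)
    x5 = np.arange(len(p5)) * dx                 # support of the 5-fold sum
    print(np.max(np.abs(p5 - gamma.pdf(x5, a=5))))  # small discretization error
    ```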

  11. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to

  12. Examining impulse-variability in overarm throwing.

    Science.gov (United States)

    Urbin, M A; Stodden, David; Boros, Rhonda; Shannon, David

    2012-01-01

    The purpose of this study was to examine variability in overarm throwing velocity and spatial output error at various percentages of maximum, to test the prediction of an inverted-U function as predicted by impulse-variability theory and a speed-accuracy trade-off as predicted by Fitts' Law. Thirty subjects (16 skilled, 14 unskilled) were instructed to throw a tennis ball at seven percentages of their maximum velocity (40-100%) in random order (9 trials per condition) at a target 30 feet away. Throwing velocity was measured with a radar gun and interpreted as an index of overall systemic power output. Within-subject throwing velocity variability was examined using within-subjects repeated-measures ANOVAs (7 repeated conditions) with built-in polynomial contrasts. Spatial error was analyzed using mixed model regression. Results indicated a quadratic fit with variability in throwing velocity increasing from 40% up to 60%, where it peaked, and then decreasing at each subsequent interval to maximum (p < .001, η² = .555). There was no linear relationship between speed and accuracy. Overall, these data support the notion of an inverted-U function in overarm throwing velocity variability as both skilled and unskilled subjects approach maximum effort. However, these data do not support the notion of a speed-accuracy trade-off. The consistent demonstration of an inverted-U function associated with systemic power output variability indicates an enhanced capability to regulate aspects of force production and relative timing between segments as individuals approach maximum effort, even in a complex ballistic skill.

  13. Predictive coding of dynamical variables in balanced spiking networks.

    Science.gov (United States)

    Boerlin, Martin; Machens, Christian K; Denève, Sophie

    2013-01-01

    Two observations about the cortex have puzzled neuroscientists for a long time. First, neural responses are highly variable. Second, the level of excitation and inhibition received by each neuron is tightly balanced at all times. Here, we demonstrate that both properties are necessary consequences of neural networks that represent information efficiently in their spikes. We illustrate this insight with spiking networks that represent dynamical variables. Our approach is based on two assumptions: We assume that information about dynamical variables can be read out linearly from neural spike trains, and we assume that neurons only fire a spike if that improves the representation of the dynamical variables. Based on these assumptions, we derive a network of leaky integrate-and-fire neurons that is able to implement arbitrary linear dynamical systems. We show that the membrane voltage of the neurons is equivalent to a prediction error about a common population-level signal. Among other things, our approach allows us to construct an integrator network of spiking neurons that is robust against many perturbations. Most importantly, neural variability in our networks cannot be equated to noise. Despite exhibiting the same single unit properties as widely used population code models (e.g. tuning curves, Poisson distributed spike trains), balanced networks are orders of magnitude more reliable. Our approach suggests that spikes do matter when considering how the brain computes, and that the reliability of cortical representations could have been strongly underestimated.

  14. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures) and observables (dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
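
    The Łukasiewicz operations mentioned in this abstract are easy to state concretely. A minimal sketch: the truncated sum, the Łukasiewicz product and complementation act on [0, 1]-valued events over a finite sample space, and the probability of a "fractional" event is taken here as the mean of its membership values under a uniform measure (the uniform measure is an assumption for illustration).

    ```python
    import numpy as np

    def luk_neg(a):          # complement
        return 1.0 - a

    def luk_oplus(a, b):     # truncated (bounded) sum
        return np.minimum(1.0, a + b)

    def luk_odot(a, b):      # Łukasiewicz product (t-norm)
        return np.maximum(0.0, a + b - 1.0)

    def prob(a):             # probability of a fractional event (uniform measure)
        return float(np.mean(a))

    A = np.array([1.0, 1.0, 0.0, 0.0])   # classical (Boolean) event: an indicator
    B = np.array([0.5, 0.2, 0.9, 0.0])   # a genuine "fraction" of an event
    print(prob(A), prob(B), prob(luk_oplus(A, B)), prob(luk_odot(A, B)))
    ```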

  15. Exploratory Spectroscopy of Magnetic Cataclysmic Variables Candidates and Other Variable Objects

    Science.gov (United States)

    Oliveira, A. S.; Rodrigues, C. V.; Cieslinski, D.; Jablonski, F. J.; Silva, K. M. G.; Almeida, L. A.; Rodríguez-Ardila, A.; Palhares, M. S.

    2017-04-01

    The increasing number of synoptic surveys made by small robotic telescopes, such as the photometric Catalina Real-Time Transient Survey (CRTS), provides a unique opportunity to discover variable sources and improves the statistical samples of such classes of objects. Our goal is the discovery of magnetic Cataclysmic Variables (mCVs). These are rare objects that probe interesting accretion scenarios controlled by the white-dwarf magnetic field. In particular, improved statistics of mCVs would help to address open questions on their formation and evolution. We performed an optical spectroscopy survey to search for signatures of magnetic accretion in 45 variable objects selected mostly from the CRTS. In this sample, we found 32 CVs, 22 being mCV candidates, 13 of which were previously unreported as such. If the proposed classifications are confirmed, it would represent an increase of 4% in the number of known polars and 12% in the number of known IPs. A fraction of our initial sample was classified as extragalactic sources or other types of variable stars by the inspection of the identification spectra. Despite the inherent complexity in identifying a source as an mCV, variability-based selection, followed by spectroscopic snapshot observations, has proved to be an efficient strategy for their discoveries, being a relatively inexpensive approach in terms of telescope time. Based on observations obtained at the Observatório do Pico dos Dias/LNA, and at the Southern Astrophysical Research (SOAR) telescope, which is a joint project of the Ministério da Ciência, Tecnologia, e Inovação (MCTI) da República Federativa do Brasil, the U.S. National Optical Astronomy Observatory (NOAO), the University of North Carolina at Chapel Hill (UNC), and Michigan State University (MSU).

  16. Subquantum nonlocal correlations induced by the background random field

    Energy Technology Data Exchange (ETDEWEB)

    Khrennikov, Andrei, E-mail: Andrei.Khrennikov@lnu.s [International Center for Mathematical Modelling in Physics and Cognitive Sciences, Linnaeus University, Vaexjoe (Sweden); Institute of Information Security, Russian State University for Humanities, Moscow (Russian Federation)

    2011-10-15

    We developed a purely field model of microphenomena: prequantum classical statistical field theory (PCSFT). This model not only reproduces important probabilistic predictions of quantum mechanics (QM), including correlations for entangled systems, but also gives a possibility to go beyond QM, i.e. to make predictions of phenomena that could be observed at the subquantum level. In this paper, we discuss one such prediction: the existence of nonlocal correlations between prequantum random fields corresponding to all quantum systems. (In PCSFT, quantum systems are represented by classical Gaussian random fields and quantum observables by quadratic forms of these fields.) The source of these correlations is the common background field. Thus all prequantum random fields are 'entangled', but in the sense of classical signal theory. On the one hand, PCSFT demystifies quantum nonlocality by reducing it to nonlocal classical correlations based on the common random background. On the other hand, it demonstrates the total generality of such correlations. They exist even for distinguishable quantum systems in factorizable states (in PCSFT terminology, for Gaussian random fields with covariance operators corresponding to factorizable quantum states).

  17. Application of the random vibration approach in the seismic analysis of LMFBR structures - Benchmark calculations

    International Nuclear Information System (INIS)

    Preumont, A.; Shilab, S.; Cornaggia, L.; Reale, M.; Labbe, P.; Noe, H.

    1992-01-01

    This benchmark exercise is the continuation of the state-of-the-art review (EUR 11369 EN), which concluded that the random vibration approach could be an effective tool in the seismic analysis of nuclear power plants, with potential advantages over time history and response spectrum techniques. Compared to the latter, the random vibration method provides an accurate treatment of multisupport excitations and non-classical damping, as well as the combination of high-frequency modal components. Compared to the former, the random vibration method offers direct information on statistical variability (probability distribution) and cheaper computations. The disadvantages of the random vibration method are that it is based on stationary results and requires a power spectral density input instead of a response spectrum. A benchmark exercise comparing the three methods on one or several simple structures, covering the aspects mentioned above, has been carried out. The following aspects have been covered with the simplest possible models: (i) statistical variability, (ii) multisupport excitation, (iii) non-classical damping. The random vibration method is therefore concluded to be a reliable method of analysis. Its use is recommended, particularly for preliminary design, owing to its computational advantage over multiple time history analyses.

  18. Connectivity ranking of heterogeneous random conductivity models

    Science.gov (United States)

    Rizzo, C. B.; de Barros, F.

    2017-12-01

    To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state of the art provides several methods to generate 2D or 3D random K-fields, such as the classic multi-Gaussian fields, non-Gaussian fields, training image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with the early-time arrival of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte-Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity for each group of random fields and allow the fields to be ranked according to their minimum hydraulic resistance.
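
    A stripped-down version of such a graph-based connectivity computation is sketched below: Dijkstra's algorithm finds the least-resistance path from the left to the right boundary of a 2D conductivity field, with resistance accumulated as 1/K per visited cell. The cell-based weighting, the uncorrelated lognormal field and all parameter values are simplifying assumptions, not the authors' exact setup.

    ```python
    import heapq
    import numpy as np

    def min_hydraulic_resistance(K):
        """Minimum accumulated resistance (1/K per cell) over all paths
        connecting the left and right faces of a 2D conductivity grid."""
        ny, nx = K.shape
        R = 1.0 / K
        dist = np.full((ny, nx), np.inf)
        heap = []
        for i in range(ny):                      # whole left face is the source
            dist[i, 0] = R[i, 0]
            heapq.heappush(heap, (dist[i, 0], i, 0))
        while heap:
            d, i, j = heapq.heappop(heap)
            if d > dist[i, j]:
                continue
            if j == nx - 1:                      # first right-face pop is minimal
                return d
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < ny and 0 <= jj < nx and d + R[ii, jj] < dist[ii, jj]:
                    dist[ii, jj] = d + R[ii, jj]
                    heapq.heappush(heap, (dist[ii, jj], ii, jj))
        return np.inf

    # Monte-Carlo treatment over (here uncorrelated) lognormal K realizations.
    rng = np.random.default_rng(1)
    samples = [min_hydraulic_resistance(np.exp(rng.normal(0.0, 1.0, (32, 64))))
               for _ in range(20)]
    print(np.mean(samples), np.std(samples))
    ```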

  19. Environmental versus demographic variability in stochastic predator–prey models

    International Nuclear Information System (INIS)

    Dobramysl, U; Täuber, U C

    2013-01-01

    In contrast to the neutral population cycles of the deterministic mean-field Lotka–Volterra rate equations, including spatial structure and stochastic noise in models for predator–prey interactions yields complex spatio-temporal structures associated with long-lived erratic population oscillations. Environmental variability in the form of quenched spatial randomness in the predation rates results in more localized activity patches. Our previous study showed that population fluctuations in rare favorable regions in turn cause a remarkable increase in the asymptotic densities of both predators and prey. Very intriguing features are found when variable interaction rates are affixed to individual particles rather than lattice sites. Stochastic dynamics with demographic variability in conjunction with inheritable predation efficiencies generate non-trivial time evolution for the predation rate distributions, yet with overall essentially neutral optimization. (paper)

  1. Elementary representative sizes of soil attributes via attenuation of gamma rays and computerized tomography

    International Nuclear Information System (INIS)

    Borges, Jaqueline Aparecida Ribaski

    2015-01-01

    In this study, computed tomography (CT) and gamma-ray attenuation (GRA) techniques were used to investigate representative sample sizes for attributes related to soil structure. First, the representative elementary length (REL) for experimental measurements of the soil mass attenuation coefficient (μs) of samples from a sandy and a clayey soil was analyzed. The study was conducted with two radioactive sources (241Am and 137Cs), three collimators (2-4 mm diameters), and 14 sample thicknesses (x = 2-5 cm). From these analyses, it was possible to identify an ideal thickness range for each of the studied sources (2-4 cm and 12-15 cm for the 241Am and 137Cs sources, respectively). The application of these results in representative elementary area evaluations of clayey soil clods via CT indicated that experimental soil mass attenuation coefficient averages obtained for x > 4 cm with the 241Am source might induce the use of samples that are not large enough for soil bulk density evaluations. Subsequently, μCT images with a total volume of 39×39×33 mm³ and a spatial resolution of 60 μm were used for the morphological characterization of the macroporous system of a Rhodic Ferralsol with clayey texture under no-till (NT) and conventional till (CT) systems. Attributes such as macroporosity (MAP), number of macropores (NMAP), tortuosity (τ) and connectivity (C) of the pores were assessed. The degree of C was estimated based on the Euler-Poincaré characteristic (EPC). Since 3D images enable the study of these attributes in different sample volumes, the proposed study is ideal for the analysis of the representative elementary volume (REV). Usually, the selection of subvolumes for REV analysis occurs concentrically to a small volume or in adjacent positions. Here, we introduce a new method for selecting the positions of subvolumes, which are randomly chosen within the total image volume (random selection). It was observed that higher fluctuations in amplitude of each

  2. Resistance controllability and variability improvement in a TaOx-based resistive memory for multilevel storage application

    Energy Technology Data Exchange (ETDEWEB)

    Prakash, A., E-mail: amitknp@postech.ac.kr, E-mail: amit.knp02@gmail.com, E-mail: hwanghs@postech.ac.kr; Song, J.; Hwang, H., E-mail: amitknp@postech.ac.kr, E-mail: amit.knp02@gmail.com, E-mail: hwanghs@postech.ac.kr [Department of Materials Science and Engineering, Pohang University of Science and Technology (POSTECH), 77 Cheongam-ro, Nam-gu, Pohang, 790-784 (Korea, Republic of); Deleruyelle, D.; Bocquet, M. [Im2np, UMR CNRS 7334, Aix-Marseille Université, Marseille (France)

    2015-06-08

    In order to obtain reliable multilevel cell (MLC) characteristics, resistance controllability between the different resistance levels is required, especially in resistive random access memory (RRAM), which is prone to resistance variability mainly due to the intrinsic random nature of defect generation and filament formation. In this study, we have thoroughly investigated the multilevel resistance variability in a TaOx-based nanoscale (<30 nm) RRAM operated in MLC mode. It is found that the resistance variability not only depends on the conductive filament size but is also a strong function of the oxygen vacancy concentration in it. Based on the insights gained through experimental observations and simulation, it is suggested that forming a thinner but denser conductive filament may greatly improve the temporal resistance variability even at low operation current, despite the inherent stochastic nature of the resistance switching process.

  3. Reflectance variability of surface coatings reveals characteristic eigenvalue spectra

    Science.gov (United States)

    Medina, José M.; Díaz, José A.; Barros, Rui

    2012-10-01

    We have examined the trial-to-trial variability of the reflectance spectra of surface coatings containing effect pigments. Principal component analysis of reflectances was done at each detection angle separately. A method for classification of principal components is applied based on the eigenvalue spectra. It was found that the eigenvalue spectra follow characteristic power laws and depend on the detection angle. Three different subsets of principal components were examined to separate the relevant spectral features related to the pigments from other noise sources. Reconstruction of the reflectance spectra by taking only the first subset indicated that reflectance variability was higher at near-specular reflection, suggesting a correlation with the trial-to-trial deposition of effect pigments. Reconstruction by using the second subset indicates that variability was higher at short wavelengths. Finally, reconstruction by using only the third subset indicates that reflectance variability was not totally random as a function of the wavelength. The methods employed can be useful in the evaluation of color variability in industrial paint application processes.

  4. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  5. Local lattice relaxations in random metallic alloys: Effective tetrahedron model and supercell approach

    DEFF Research Database (Denmark)

    Ruban, Andrei; Simak, S.I.; Shallcross, S.

    2003-01-01

    We present a simple effective tetrahedron model for local lattice relaxation effects in random metallic alloys on simple primitive lattices. A comparison with direct ab initio calculations for supercells representing random Ni0.50Pt0.50 and Cu0.25Au0.75 alloys, as well as the dilute limit of Au-rich CuAu alloys, shows that the model yields a quantitatively accurate description of the relaxation energies in these systems. Finally, we discuss the bond length distribution in random alloys.

  6. The influence of climate variables on dengue in Singapore.

    Science.gov (United States)

    Pinto, Edna; Coelho, Micheline; Oliver, Leuda; Massad, Eduardo

    2011-12-01

    In this work we correlated dengue cases with climatic variables for the city of Singapore. This was done through a Poisson Regression Model (PRM) that considers dengue cases as the dependent variable and the climatic variables (rainfall, maximum and minimum temperature and relative humidity) as independent variables. We also used Principal Components Analysis (PCA) to choose the variables that influence the increase in the number of dengue cases in Singapore, where PC₁ (principal component 1) is represented by temperature and rainfall and PC₂ (principal component 2) is represented by relative humidity. We calculated the probability of occurrence of new dengue cases and the relative risk of occurrence of dengue cases influenced by each climatic variable. The months from July to September showed the highest probabilities of the occurrence of new cases of the disease throughout the year. This was based on an analysis of time series of maximum and minimum temperature. An interesting result was that for every 2-10°C of variation of the maximum temperature, there was an average increase of 22.2-184.6% in the number of dengue cases. For the minimum temperature, we observed that for the same variation, there was an average increase of 26.1-230.3% in the number of dengue cases from April to August. After correlation analysis, precipitation and relative humidity were excluded from the Poisson regression model because they did not correlate well with dengue cases. Additionally, the relative risk of disease occurrence under temperature variation ranged from 1.2 to 2.8 for maximum temperature and from 1.3 to 3.3 for minimum temperature. Therefore, the variable temperature (maximum and minimum) was the best predictor for the increased number of dengue cases in Singapore.
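
    The regression step can be sketched on synthetic data (all numbers below are invented, not the Singapore series): a Poisson GLM of monthly case counts on climate covariates, whose exponentiated coefficients are relative risks per unit change of each covariate.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 120                                        # ten years of monthly data
    tmax = rng.normal(31.0, 1.5, n)                # maximum temperature, deg C
    rain = rng.gamma(2.0, 80.0, n)                 # monthly rainfall, mm
    lam = np.exp(-8.0 + 0.35 * tmax + 0.001 * rain)
    cases = rng.poisson(lam)                       # synthetic dengue counts

    X = sm.add_constant(np.column_stack([tmax, rain]))
    fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
    print(fit.params)                              # log rate ratios
    print(np.exp(fit.params[1]))                   # relative risk per +1 deg C
    ```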

  7. On the concept of a linguistic variable

    International Nuclear Information System (INIS)

    Kerre, E.

    1996-01-01

    The concept of a linguistic variable plays a crucial role in the representation of imprecise knowledge in information sciences. A variable is called linguistic as soon as its values are linguistic terms rather than numerical ones. The power of daily communication and common sense reasoning lies in the use of such linguistic values. Even when exact numerical values are available, experts tend to transform these values into linguistic ones. A physician will usually translate a numerical measurement of a blood pressure into linguistic specifications such as normal, very high, too low... Zadeh has argued that the set of values for a linguistic variable assumes a more-or-less fixed structure. Starting from an atomic value and its antonym all remaining values are constructed using logical connectives on the one hand and linguistic hedges on the other hand. In this paper we will describe how to represent the value set of a linguistic variable in general and of linguistic hedges in particular

  8. Distributed Synchronization in Networks of Agent Systems With Nonlinearities and Random Switchings.

    Science.gov (United States)

    Tang, Yang; Gao, Huijun; Zou, Wei; Kurths, Jürgen

    2013-02-01

    In this paper, the distributed synchronization problem of networks of agent systems with controllers and nonlinearities subject to Bernoulli switchings is investigated. Controllers and adaptive updating laws injected in each vertex of networks depend on the state information of its neighborhood. Three sets of Bernoulli stochastic variables are introduced to describe the occurrence probabilities of distributed adaptive controllers, updating laws and nonlinearities, respectively. By the Lyapunov functions method, we show that the distributed synchronization of networks composed of agent systems with multiple randomly occurring nonlinearities, multiple randomly occurring controllers, and multiple randomly occurring updating laws can be achieved in mean square under certain criteria. The conditions derived in this paper can be solved by semi-definite programming. Moreover, by mathematical analysis, we find that the coupling strength, the probabilities of the Bernoulli stochastic variables, and the form of nonlinearities have great impacts on the convergence speed and the terminal control strength. The synchronization criteria and the observed phenomena are demonstrated by several numerical simulation examples. In addition, the advantage of distributed adaptive controllers over conventional adaptive controllers is illustrated.
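
    A toy version of the randomly occurring ingredients is sketched below: each agent's diffusive coupling and a weak intrinsic nonlinearity are switched on and off by independent Bernoulli variables at every step. The graph, probabilities and gains are invented, and the sketch makes no claim about the paper's adaptive laws or mean-square criteria.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n, steps, eps = 20, 500, 0.1
    p_ctrl, p_nl = 0.7, 0.5                        # Bernoulli occurrence probabilities

    A = (rng.random((n, n)) < 0.2).astype(float)   # random coupling graph
    A = np.triu(A, 1); A = A + A.T                 # undirected, no self-loops
    x = rng.normal(0.0, 5.0, n)                    # initial agent states

    for _ in range(steps):
        on_ctrl = rng.random(n) < p_ctrl           # randomly occurring controllers
        on_nl = rng.random(n) < p_nl               # randomly occurring nonlinearities
        coupling = A @ x - A.sum(axis=1) * x       # sum_j a_ij * (x_j - x_i)
        x = x + eps * (on_ctrl * coupling + 0.05 * on_nl * np.sin(x))

    print(np.std(x))   # disagreement; small values indicate synchronization
    ```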

  9. Variability, plot size and border effect in lettuce trials in protected environment

    Directory of Open Access Journals (Sweden)

    Daniel Santos

    2018-03-01

    Full Text Available The variability within rows of cultivation may reduce the accuracy of experiments conducted in a randomized complete block design if the rows are considered as blocks; however, little is known about this variability in protected environments. Thus, our aim was to study the variability of the fresh mass of lettuce shoots growing in a protected environment, and to verify the border effect and the size of the experimental unit in minimizing the productive variability. Data from two uniformity trials carried out in a greenhouse in the autumn and spring growing seasons were used. In the statistical analyses, cultivation rows parallel to the lateral openings of the greenhouse and columns perpendicular to these openings were considered. Different scenarios were simulated by excluding rows and columns to generate several border arrangements and also to use different sizes of the experimental unit. For each scenario, a homogeneity test of variances between the remaining rows and columns was performed, and the variance and coefficient of variation were calculated. There is variability among rows in trials with lettuce in plastic greenhouses, and the use of a border does not bring benefits in terms of reducing the coefficient of variation or minimizing the cases of heterogeneous variances among rows. In experiments with lettuce in a plastic greenhouse, the use of an experimental unit size greater than or equal to two plants provides homogeneity of variances among rows and columns and, therefore, allows the use of a completely randomized design.

  10. About the relationships among variables observed in the real world

    Science.gov (United States)

    Petkov, Boyan H.

    2018-06-01

    Since a stationary chaotic system is determined by nonlinear equations connecting its components, the appurtenance of two variables to such a system has been considered a sign of nontrivial relationships between them, possibly involving other quantities as well. These relationships could remain hidden from the approach usually employed in research analyses, which is based on the extent of the correlation that characterises the dependence of one variable on the other. The appurtenance to the same system can be hypothesized if the topological features of the attractors reconstructed from two time series representing the evolution of the corresponding variables are close to each other. However, the possibility that both attractors represent different systems with similar behaviour cannot be excluded. For that reason, an approach allowing the reconstruction of the attractor by using two time series jointly was proposed, and the conclusion about the common origin of the variables under study can be made if this attractor is topologically similar to those built separately from the two time series. In the present study, the features of the attractors were presented by the correlation dimension and the largest Lyapunov exponent, and the proposed algorithm has been tested on numerically generated sequences obtained from various maps. It is believed that this approach could be used to reveal connections among the variables observed in experiments or field measurements.
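
    The reconstruction machinery referred to here is standard: a scalar series is delay-embedded (Takens), and the correlation dimension is read off the small-r slope of the Grassberger-Procaccia correlation sum. The sketch below demonstrates the two ingredients on the logistic map; the series length and embedding parameters are illustrative choices, and the joint two-series reconstruction proposed in the abstract is not reproduced here.

    ```python
    import numpy as np

    def delay_embed(series, dim, tau):
        """Takens time-delay embedding of a scalar time series."""
        n = len(series) - (dim - 1) * tau
        return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])

    def correlation_sum(X, r):
        """Fraction of point pairs closer than r; the correlation dimension
        is the small-r slope of log C(r) versus log r."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        iu = np.triu_indices(len(X), k=1)
        return float(np.mean(d[iu] < r))

    x = np.empty(1000); x[0] = 0.4
    for t in range(999):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])       # chaotic logistic map
    X = delay_embed(x, dim=3, tau=1)
    for r in (0.05, 0.1, 0.2):
        print(r, correlation_sum(X, r))
    ```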

  11. Analysis of genetic diversity of Sclerotinia sclerotiorum from eggplant by mycelial compatibility, random amplification of polymorphic DNA (RAPD) and simple sequence repeat (SSR) analyses

    Directory of Open Access Journals (Sweden)

    Fatih Mehmet Tok

    2016-09-01

    Full Text Available The genetic diversity and pathogenicity/virulence among 60 eggplant Sclerotinia sclerotiorum isolates collected from six different geographic regions of Turkey were analysed using mycelial compatibility groupings (MCGs), random amplified polymorphic DNA (RAPD) and simple sequence repeat (SSR) polymorphism. By MCG tests, the isolates were classified into 22 groups. Out of the 22 MCGs, 36% were each represented by a single isolate. The isolates showed great variability in virulence regardless of MCG and geographic origin. Based on the results of the RAPD and SSR analyses, the 60 S. sclerotiorum isolates representing 22 MCGs were grouped into 2 and 3 distinct clusters, respectively. Analyses using RAPD and SSR markers illustrated that the cluster groupings and genetic distances of S. sclerotiorum populations from eggplant were not clearly related to MCG, geographical origin or virulence diversity. The patterns obtained revealed a high heterogeneity of genetic composition and suggested the occurrence of both clonal and sexual reproduction of S. sclerotiorum on eggplant in the areas surveyed.
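
    The distance-and-clustering half of such an analysis is sketched below: Jaccard distances are computed from a binary band-score matrix and fed to UPGMA (average-linkage) clustering. The marker matrix is invented; only the pipeline mirrors the abstract.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(3)
    markers = rng.integers(0, 2, size=(12, 91))        # isolates x marker bands

    d = pdist(markers.astype(bool), metric='jaccard')  # 1 - Jaccard similarity
    tree = linkage(d, method='average')                # UPGMA dendrogram
    groups = fcluster(tree, t=3, criterion='maxclust') # cut into 3 clusters
    print(groups)
    ```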

  12. Constructing Proxy Variables to Measure Adult Learners' Time Management Strategies in LMS

    Science.gov (United States)

    Jo, Il-Hyun; Kim, Dongho; Yoon, Meehyun

    2015-01-01

    This study describes the process of constructing proxy variables from recorded log data within a Learning Management System (LMS) to represent adult learners' time management strategies in an online course. Based on previous research, three variables of total login time, login frequency, and regularity of login interval were selected as…
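
    Under one plausible reading of the three proxies, they can be computed from raw session timestamps as follows. The regularity measure used here (standard deviation of the gaps between logins) is an assumption, since the abstract does not spell out its definition.

    ```python
    import numpy as np

    def time_management_proxies(login_times, logout_times):
        """Total login time, login frequency, and regularity of the login
        interval from per-session timestamps (hours from course start)."""
        login_times = np.sort(np.asarray(login_times, dtype=float))
        logout_times = np.sort(np.asarray(logout_times, dtype=float))
        total_time = float(np.sum(logout_times - login_times))  # hours online
        frequency = len(login_times)                            # session count
        gaps = np.diff(login_times)                             # between logins
        regularity = float(np.std(gaps)) if len(gaps) > 1 else 0.0
        return total_time, frequency, regularity

    print(time_management_proxies([0.0, 24.5, 49.0, 73.2],
                                  [1.5, 25.0, 50.1, 74.0]))
    ```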

  13. VARIABILITY OF OPTICAL COUNTERPARTS IN THE CHANDRA GALACTIC BULGE SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Britt, C. T.; Hynes, R. I.; Johnson, C. B.; Baldwin, A.; Collazzi, A.; Gossen, L. [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, LA 70803-4001 (United States); Jonker, P. G.; Torres, M. A. P. [SRON, Netherlands Institute for Space Research, Sorbonnelaan 2, 3584 CA Utrecht (Netherlands); Nelemans, G. [Department of Astrophysics, IMAPP, Radboud University Nijmegen, Heyendaalseweg 135, 6525 AJ, Nijmegen (Netherlands); Maccarone, T. [Department of Physics, Texas Tech University, Box 41051, Science Building, Lubbock, TX 79409-1051 (United States); Steeghs, D.; Greiss, S. [Astronomy and Astrophysics, Department of Physics, University of Warwick, Coventry, CV4 7AL (United Kingdom); Heinke, C. [Department of Physics, University of Alberta, CCIS 4-183, Edmonton, AB T6G 2E1 (Canada); Bassa, C. G. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Manchester M13 9PL (United Kingdom); Villar, A. [Department of Physics, Massachussettes Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139-4307 (United States); Gabb, M. [Department of Physics, Florida Atlantic University, 777 Glades Road, Boca Raton, FL 33431-0991 (United States)

    2014-09-01

    We present optical light curves of variable stars consistent with the positions of X-ray sources identified with the Chandra X-ray Observatory for the Chandra Galactic Bulge Survey (GBS). Using data from the Mosaic-II instrument on the Blanco 4 m Telescope at CTIO, we gathered time-resolved photometric data on timescales from ∼2 hr to 8 days over the three-quarters of the X-ray survey area containing sources from the initial GBS catalog. Among the light curve morphologies we identify are flickering in interacting binaries, eclipsing sources, dwarf nova outbursts, ellipsoidal variations, long period variables, spotted stars, and flare stars. Eighty-seven percent of X-ray sources have at least one potential optical counterpart. Twenty-seven percent of these candidate counterparts are detectably variable, a much greater fraction than expected for randomly selected field stars, which suggests that most of these variables are real counterparts. We discuss individual sources of interest, provide variability information on candidate counterparts, and discuss the characteristics of the variable population.

  14. Radiation Transport in Random Media With Large Fluctuations

    Science.gov (United States)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux, as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that the use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
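
    The Karhunen-Loève step can be sketched as a discrete eigendecomposition of the covariance matrix of the underlying Gaussian process, whose truncated expansion is then exponentiated to give lognormal cross-section realizations. The grid, covariance model and truncation order below are assumptions for illustration.

    ```python
    import numpy as np

    n, L, sigma2, corr_len = 200, 10.0, 0.5, 2.0
    x = np.linspace(0.0, L, n)
    C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

    evals, evecs = np.linalg.eigh(C)               # discrete KL modes
    order = np.argsort(evals)[::-1]
    evals = np.maximum(evals[order], 0.0)          # guard tiny negatives
    evecs = evecs[:, order]
    m = 30                                         # truncation order

    rng = np.random.default_rng(7)
    xi = rng.standard_normal(m)                    # independent N(0, 1) weights
    g = evecs[:, :m] @ (np.sqrt(evals[:m]) * xi)   # Gaussian-process realization
    mu_log = 0.0
    sigma_t = np.exp(mu_log + g)                   # lognormal cross section
    print(sigma_t.min(), sigma_t.mean(), sigma_t.max())
    ```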

  15. A general symplectic method for the response analysis of infinitely periodic structures subjected to random excitations

    Directory of Open Access Journals (Sweden)

    You-Wei Zhang

    Full Text Available A general symplectic method for the random response analysis of infinitely periodic structures subjected to stationary/non-stationary random excitations is developed using symplectic mathematics in conjunction with variable separation and the pseudo-excitation method (PEM). Starting from the equation of motion for a single loaded substructure, symplectic analysis is first used to eliminate the dependent degrees of freedom through condensation. A Fourier expansion of the condensed equation of motion is then applied to separate the variables of time and wave number, thus enabling the necessary recurrence scheme to be developed. The random response is finally determined by implementing PEM. The proposed method is justified by comparison with results available in the literature and is then applied to a more complicated time-dependent coupled system.

  16. Sound Effects for Children's Comprehension of Variably-Paced Television Programs.

    Science.gov (United States)

    Calvert, Sandra L.; Scott, M. Catherine

    In this study, children's selective attention to, and comprehension of, variably-paced television programs were examined as a function of sound effects. Sixty-four children, equally distributed by sex and by preschool and fourth grades, were randomly assigned to one of four treatment conditions which crossed two levels of sound effects (presence…

  17. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    Science.gov (United States)

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent with a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.

  18. Genetic variability of European honey bee, Apis mellifera in mid hills ...

    African Journals Online (AJOL)

    To assess the genetic variability of the European honey bee, A. mellifera, PCR was run separately with five random amplified polymorphic DNA (RAPD) primers, and the banding patterns were analysed to characterize the molecular profiles of honey bee genotypes collected from different locations.

  19. Neutron Transport in Finite Random Media with Pure-Triplet Scattering

    International Nuclear Information System (INIS)

    Sallaha, M.; Hendi, A.A.

    2008-01-01

    The solution of the one-speed neutron transport equation in a finite slab random medium with pure-triplet anisotropic scattering is studied. The stochastic medium is assumed to consist of two randomly mixed immiscible fluids. The cross section and the scattering kernel are treated as discrete random variables that obey Markovian statistics with exponential chord length distributions. The medium boundaries are considered to have specular reflectivities, with angular-dependent externally incident flux. The deterministic solution is obtained by using the Pomraning-Eddington approximation. Numerical results are calculated for the average reflectivity and average transmissivity for different values of the single scattering albedo, varying the parameters that characterize the random medium. Compared with the Monte Carlo results obtained by Adams et al. for the case of isotropic scattering, the present results show good agreement.

  20. How to Fully Represent Expert Information about Imprecise Properties in a Computer System – Random Sets, Fuzzy Sets, and Beyond: An Overview

    Science.gov (United States)

    Nguyen, Hung T.; Kreinovich, Vladik

    2014-01-01

    To help computers make better decisions, it is desirable to describe all our knowledge in computer-understandable terms. This is easy for knowledge described in terms of numerical values: we simply store the corresponding numbers in the computer. This is also easy for knowledge about precise (well-defined) properties which are either true or false for each object: we simply store the corresponding “true” and “false” values in the computer. The challenge is how to store information about imprecise properties. In this paper, we overview different ways to fully store the expert information about imprecise properties. We show that in the simplest case, when the only source of imprecision is disagreement between different experts, a natural way to store all the expert information is to use random sets; we also show how fuzzy sets naturally appear in such a random-set representation. We then show how the random-set representation can be extended to the general (“fuzzy”) case when, in addition to disagreements, experts are also unsure whether some objects satisfy certain properties or not. PMID:25386045
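
    The random-set representation and the fuzzy set it induces can be shown in a few lines: each expert contributes the set of objects they consider to satisfy the property, and the membership degree of an object is the fraction of expert sets containing it. The property, values and expert sets below are invented for illustration.

    ```python
    # Expert knowledge about an imprecise property ("warm", say) stored as a
    # random set: one crisp set per expert. Membership degree = fraction of
    # expert sets containing the value.
    experts = [
        {20, 21, 22, 23, 24, 25},
        {22, 23, 24, 25, 26, 27},
        {21, 22, 23, 24},
        {23, 24, 25, 26},
    ]

    def membership(value, expert_sets):
        """Fraction of experts whose set contains the value."""
        return sum(value in s for s in expert_sets) / len(expert_sets)

    for t in (20, 22, 23, 26, 30):
        print(t, membership(t, experts))
    ```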