WorldWideScience

Sample records for unknown random variable

  1. Zero Distribution of System with Unknown Random Variables Case Study: Avoiding Collision Path

    Directory of Open Access Journals (Sweden)

    Parman Setyamartana

    2014-07-01

    Full Text Available This paper presents a stochastic analysis for finding feasible trajectories of a robotic arm moving among obstacles. The unknown variables are the coefficients of the joint-angle polynomials for which collision-free motion is achieved; ãk is the matrix consisting of these unknown feasible polynomial coefficients. Because the pattern of feasible polynomials in an obstacle environment appears random, this paper proposes to model it as a random polynomial with unknown variables as coefficients. The behavior of the system is obtained from the zero distribution, which characterizes such a random polynomial. Results show that the pattern of collision-avoiding random polynomials can be constructed from the zero distribution; the zero distribution acts as a building block of the system, with obstacles as the uncertainty factor. Using a scale factor k, which has a bounded range, the pattern of the random coefficients can be predicted.
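
As an illustrative sketch (not the paper's ãk construction), the tendency of real zeros of random-coefficient polynomials to cluster near ±1 can be observed numerically; the quadratic case and the uniform coefficient range [-k, k] below are assumptions for the demo:

```python
import math
import random

def real_roots(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0 (empty list if none)."""
    if abs(a) < 1e-12:
        return [-c / b] if abs(b) > 1e-12 else []
    disc = b * b - 4 * a * c
    if disc < 0:
        return []
    s = math.sqrt(disc)
    return [(-b - s) / (2 * a), (-b + s) / (2 * a)]

def zero_distribution(n_samples=10_000, k=1.0, seed=0):
    """Collect the real zeros of random quadratics whose coefficients
    are drawn uniformly from [-k, k]."""
    rng = random.Random(seed)
    roots = []
    for _ in range(n_samples):
        a, b, c = (rng.uniform(-k, k) for _ in range(3))
        roots.extend(real_roots(a, b, c))
    return roots

roots = zero_distribution()
inside = sum(1 for r in roots if -1.0 <= r <= 1.0) / len(roots)
print(f"fraction of real zeros in [-1, 1]: {inside:.2f}")
```

By the symmetry (a, b, c) → (c, b, a), which maps roots to their reciprocals, roughly half of the real zeros fall inside [-1, 1], so the empirical zero distribution is balanced around |x| = 1.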

  2. Stochastic Optimal Estimation with Fuzzy Random Variables and Fuzzy Kalman Filtering

    Institute of Scientific and Technical Information of China (English)

    FENG Yu-hu

    2005-01-01

    By constructing a mean-square performance index for fuzzy random variables, an optimal estimation theorem for an unknown fuzzy state using fuzzy observation data is given. The state and output of a linear discrete-time dynamic fuzzy system with Gaussian noise are Gaussian fuzzy random variable sequences. An approach to fuzzy Kalman filtering is discussed. Fuzzy Kalman filtering consists of two parts: a real-valued non-random recurrence equation and the standard Kalman filter.
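
The "standard Kalman filtering" part of that recursion can be sketched for a scalar linear system; the model parameters below (a, h, q, r) and the constant hidden state are hypothetical:

```python
import random

def kalman_1d(zs, a=1.0, h=1.0, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Standard scalar Kalman filter for x_k = a*x_{k-1} + w, z_k = h*x_k + v."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        # predict
        x = a * x
        p = a * p * a + q
        # update
        k = p * h / (h * p * h + r)        # Kalman gain
        x = x + k * (z - h * x)
        p = (1 - k * h) * p
        estimates.append(x)
    return estimates

rng = random.Random(1)
true_state = 2.0                            # constant hidden state
zs = [true_state + rng.gauss(0, 0.5) for _ in range(200)]
est = kalman_1d(zs)
print(f"final estimate: {est[-1]:.2f}")     # should be near 2.0
```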

  3. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized under mixed time-delays. The considered impulsive effects can be synchronized under partly unknown transition probabilities. In addition, a multiple integral approach is proposed to strengthen the results for Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed for handling the coupled neural network with mixed delays, and the impulsive synchronization criteria are then made solvable as a set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Strong Decomposition of Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.

    2007-01-01

    A random variable X is strongly decomposable if X = Y + Z, where Y = Φ(X) and Z = X - Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability of a discrete random variable.
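
A minimal sketch of such a decomposition, assuming X uniform on [0, 1) and Φ extracting the first binary digit: Y = Φ(X) is then discrete (hence singular with respect to Lebesgue measure, matching the theorem) and independent of Z = X - Y:

```python
import random

def decompose(x):
    """Phi extracts the first binary digit of x in [0, 1): Y in {0, 0.5}."""
    y = 0.5 if x >= 0.5 else 0.0
    return y, x - y

rng = random.Random(42)
pairs = [decompose(rng.random()) for _ in range(100_000)]
ys = [y for y, _ in pairs]
zs = [z for _, z in pairs]

my = sum(ys) / len(ys)
mz = sum(zs) / len(zs)
cov = sum((y - my) * (z - mz) for y, z in pairs) / len(pairs)
print(f"E[Y]={my:.3f}  E[Z]={mz:.3f}  Cov(Y,Z)={cov:.5f}")
```

Here Y and Z are exactly independent by construction, so the empirical covariance is zero up to sampling noise.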

  5. Modal Parameter Identification from Responses of General Unknown Random Inputs

    DEFF Research Database (Denmark)

    Ibrahim, S. R.; Asmussen, J. C.; Brincker, Rune

    1996-01-01

    Modal parameter identification from ambient responses due to general unknown random inputs is investigated. Existing identification techniques, which are based on assumptions of white-noise and/or stationary random inputs, are utilized even though these input conditions are not satisfied. This is accomplished by adding, in cascade, a force-conversion system to the structural system under consideration. The input to the force-conversion system is white noise, and its output is the actual force(s) applied to the structure. The white-noise input(s) and the structural responses are then used...

  6. Ordered random variables theory and applications

    CERN Document Server

    Shahbaz, Muhammad Qaiser; Hanif Shahbaz, Saman; Al-Zahrani, Bander M

    2016-01-01

    Ordered random variables have attracted the attention of several authors. Their basic building block is order statistics, which have several applications in extreme value theory and ordered estimation. The general model for ordered random variables, known as generalized order statistics, was introduced relatively recently by Kamps (1995).

  7. Contextuality is about identity of random variables

    International Nuclear Information System (INIS)

    Dzhafarov, Ehtibar N; Kujala, Janne V

    2014-01-01

    Contextual situations are those in which seemingly ‘the same’ random variable changes its identity depending on the conditions under which it is recorded. Such a change of identity is observed whenever the assumption that the variable is one and the same under different conditions leads to contradictions when one considers its joint distribution with other random variables (this is the essence of all Bell-type theorems). In our Contextuality-by-Default approach, instead of asking why or how the conditions force ‘one and the same’ random variable to change ‘its’ identity, any two random variables recorded under different conditions are considered different ‘automatically.’ They are never the same, nor are they jointly distributed, but one can always impose on them a joint distribution (probabilistic coupling). The special situations when there is a coupling in which these random variables are equal with probability 1 are considered noncontextual. Contextuality means that such couplings do not exist. We argue that the determination of the identity of random variables by conditions under which they are recorded is not a causal relationship and cannot violate laws of physics. (paper)

  8. Contextuality in canonical systems of random variables

    Science.gov (United States)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
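
For two Bernoulli variables, the maximal coupling mentioned above can be written down explicitly; this sketch assumes a content-sharing pair with marginal parameters p and q, for which the maximal agreement probability is 1 - |p - q|:

```python
def maximal_coupling_bernoulli(p, q):
    """Joint pmf on {0,1}^2 with margins Bernoulli(p) and Bernoulli(q)
    that maximizes P(A = B); the maximum equals 1 - |p - q|."""
    p11 = min(p, q)
    p00 = min(1 - p, 1 - q)
    p10 = p - p11          # A = 1, B = 0
    p01 = q - p11          # A = 0, B = 1
    return {(0, 0): p00, (0, 1): p01, (1, 0): p10, (1, 1): p11}

joint = maximal_coupling_bernoulli(0.7, 0.4)
agree = joint[(0, 0)] + joint[(1, 1)]
print(f"max P(A=B) = {agree:.2f}")   # 1 - |0.7 - 0.4| = 0.70
```

A system of such pairs would be noncontextual in the sense above exactly when these maximal couplings are compatible with the observed joint distributions within contexts.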

  9. A random number generator for continuous random variables

    Science.gov (United States)

    Guerra, V. M.; Tapia, R. A.; Thompson, J. R.

    1972-01-01

    A FORTRAN 4 routine is given which may be used to generate random observations of a continuous real valued random variable. Normal distribution of F(x), X, E(akimas), and E(linear) is presented in tabular form.
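
A generic sketch of such a generator, using the inverse-transform method rather than the routine's own interpolation scheme (the exponential distribution below is an assumed example, not one from the report):

```python
import math
import random

def sample_from_inv_cdf(inv_cdf, n, seed=0):
    """Inverse-transform method: if U ~ Uniform(0,1), then F^{-1}(U) ~ F."""
    rng = random.Random(seed)
    return [inv_cdf(rng.random()) for _ in range(n)]

# Assumed example: Exponential(rate=2), with F^{-1}(u) = -ln(1 - u) / 2.
lam = 2.0
xs = sample_from_inv_cdf(lambda u: -math.log(1.0 - u) / lam, 50_000)
mean = sum(xs) / len(xs)
print(f"sample mean ≈ {mean:.3f} (theoretical mean: {1 / lam})")
```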

  10. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  11. Random neural Q-learning for obstacle avoidance of a mobile robot in unknown environments

    Directory of Open Access Journals (Sweden)

    Jing Yang

    2016-07-01

    Full Text Available The article presents a random neural Q-learning strategy for the obstacle avoidance problem of an autonomous mobile robot in unknown environments. In the proposed strategy, two independent modules, namely obstacle avoidance without considering the target and goal-seeking without considering obstacles, are first trained using the proposed random neural Q-learning algorithm to obtain their best control policies. The two trained modules are then combined via a switching function to realize obstacle avoidance in unknown environments. In the proposed algorithm, a single-hidden-layer feedforward network is used to approximate the Q-function and estimate the Q-value. The parameters of this network are modified using the recently proposed online sequential version of the extreme learning machine, in which the parameters of the hidden nodes are assigned randomly and the sample data can arrive one by one. Unlike the original online sequential extreme learning machine algorithm, however, the initial output weights are estimated subject to a quadratic inequality constraint to improve the convergence speed. Finally, simulation results demonstrate that the proposed random neural Q-learning strategy can successfully solve the obstacle avoidance problem and that it achieves higher learning efficiency and better generalization ability than Q-learning based on the back-propagation method.
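
As a much-simplified stand-in for the neural Q-learning described above (tabular rather than ELM-based, and a hypothetical one-dimensional corridor with a penalized obstacle instead of a mobile-robot environment), the underlying Q-update can be sketched as:

```python
import random

# Tiny 1-D corridor: states 0..5, goal at 5, penalized obstacle at 3.
# Actions: 0 = move left, 1 = move right. All numbers are hypothetical.
N, GOAL, OBSTACLE = 6, 5, 3

def step(s, a):
    s2 = max(0, min(N - 1, s + (1 if a == 1 else -1)))
    if s2 == GOAL:
        return s2, 10.0, True
    if s2 == OBSTACLE:
        return s2, -5.0, False   # crossing the obstacle costs a penalty
    return s2, -0.1, False

def q_learning(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N)]
    for _ in range(episodes):
        s = 0
        for _ in range(50):
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda a: q[s][a])
            s2, r, done = step(s, a)
            # Standard Q-learning update rule.
            q[s][a] += alpha * (r + gamma * max(q[s2]) * (not done) - q[s][a])
            s = s2
            if done:
                break
    return q

q = q_learning()
policy = [max((0, 1), key=lambda a: q[s][a]) for s in range(N)]
print("greedy policy (1 = move right):", policy)
```

The learned greedy policy moves right everywhere, accepting the one-time obstacle penalty because the discounted goal reward outweighs it.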

  12. Building a RAPPOR with the Unknown: Privacy-Preserving Learning of Associations and Data Dictionaries

    Directory of Open Access Journals (Sweden)

    Fanti Giulia

    2016-07-01

    Full Text Available Techniques based on randomized response enable the collection of potentially sensitive data from clients in a privacy-preserving manner with strong local differential privacy guarantees. A recent such technology, RAPPOR [12], enables estimation of the marginal frequencies of a set of strings via privacy-preserving crowdsourcing. However, this original estimation process relies on a known dictionary of possible strings; in practice, this dictionary can be extremely large and/or unknown. In this paper, we propose a novel decoding algorithm for the RAPPOR mechanism that enables the estimation of “unknown unknowns,” i.e., strings we do not know we should be estimating. To enable learning without explicit dictionary knowledge, we develop methodology for estimating the joint distribution of multiple variables collected with RAPPOR. Our contributions are not RAPPOR-specific, and can be generalized to other local differential privacy mechanisms for learning distributions of string-valued random variables.
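
The randomized-response idea underlying RAPPOR can be sketched in its basic (Warner) form, without RAPPOR's Bloom-filter encoding; the population proportion and flip probability below are assumptions for the demo:

```python
import random

def randomize(bit, p, rng):
    """Warner-style randomized response: report the true bit with
    probability p, the flipped bit otherwise."""
    return bit if rng.random() < p else 1 - bit

def estimate_frequency(reports, p):
    """Unbiased estimate of the true proportion pi of 1s:
    E[report] = p*pi + (1-p)*(1-pi), so pi = (r - (1-p)) / (2p - 1)."""
    r = sum(reports) / len(reports)
    return (r - (1 - p)) / (2 * p - 1)

rng = random.Random(7)
true_pi, p = 0.3, 0.75
bits = [1 if rng.random() < true_pi else 0 for _ in range(100_000)]
reports = [randomize(b, p, rng) for b in bits]
print(f"estimated pi ≈ {estimate_frequency(reports, p):.3f}")
```

No individual report reveals a client's true bit with certainty, yet the aggregate frequency is recovered accurately, which is the same trade-off RAPPOR exploits at the level of string frequencies.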

  13. On Complex Random Variables

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2012-07-01

    Full Text Available In this paper, it is shown that a complex multivariate random variable is a complex multivariate normal random variable of given dimensionality if and only if all nondegenerate complex linear combinations of it have a complex univariate normal distribution. The characteristic function has been derived, and simpler forms of some theorems have been given using this characterization theorem without assuming that the variance-covariance matrix of the vector is Hermitian positive definite. Marginal distributions have been given. In addition, a complex multivariate t-distribution has been defined and its density derived. A characterization of the complex multivariate t-distribution is given. A few possible uses of this distribution have been suggested.

  14. Polynomial chaos expansion with random and fuzzy variables

    Science.gov (United States)

    Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.

    2016-06-01

    A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.
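
A minimal sketch of a Legendre-basis PCE, assuming a single uniform variable on [-1, 1] and the toy response y = x² (not the paper's dynamical system); the projection uses a 5-point Gauss-Legendre rule:

```python
# 5-point Gauss-Legendre rule on [-1, 1] (nodes, weights).
NODES = [0.0, 0.5384693101056831, -0.5384693101056831,
         0.9061798459386640, -0.9061798459386640]
WEIGHTS = [0.5688888888888889, 0.4786286704993665, 0.4786286704993665,
           0.2369268850561891, 0.2369268850561891]

def legendre(k, x):
    """Legendre polynomial P_k(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    if k == 0:
        return p0
    for n in range(1, k):
        p0, p1 = p1, ((2 * n + 1) * x * p1 - n * p0) / (n + 1)
    return p1

def pce_coeffs(f, degree):
    """Project f onto P_0..P_degree: c_k = (2k+1)/2 * integral of f*P_k."""
    return [
        (2 * k + 1) / 2.0
        * sum(w * f(x) * legendre(k, x) for x, w in zip(NODES, WEIGHTS))
        for k in range(degree + 1)
    ]

# Response y = x^2 of a uniform variable on [-1, 1]:
c = pce_coeffs(lambda x: x * x, 2)
print([round(ck, 6) for ck in c])   # the mean of the response is c[0] = 1/3
```

When the variable is random, the coefficients yield the response moments (here mean 1/3); when it is fuzzy, the same expansion evaluated over the membership levels yields the response interval, which is the unifying point of the paper.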

  15. Generating variable and random schedules of reinforcement using Microsoft Excel macros.

    Science.gov (United States)

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
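
The same schedule values can be sketched in Python rather than Excel macros (the mean ratio below is an assumed example): random-ratio counts are geometrically distributed with p = 1/mean, while variable-ratio values can be a shuffled progression with the desired mean:

```python
import random

def random_ratio_schedule(mean_ratio, n, seed=0):
    """Random-ratio values: reinforcement is available with constant
    probability 1/mean_ratio per response, so each required response
    count is geometrically distributed with mean mean_ratio."""
    rng = random.Random(seed)
    p = 1.0 / mean_ratio
    values = []
    for _ in range(n):
        count = 1
        while rng.random() >= p:
            count += 1
        values.append(count)
    return values

def variable_ratio_schedule(mean_ratio, seed=0):
    """Variable-ratio values: a shuffled progression 1..2*mean_ratio-1,
    whose mean is exactly mean_ratio."""
    rng = random.Random(seed)
    values = list(range(1, 2 * mean_ratio))
    rng.shuffle(values)
    return values

rr = random_ratio_schedule(5, 10_000)
vr = variable_ratio_schedule(5)
print(f"random-ratio mean ≈ {sum(rr) / len(rr):.2f}, "
      f"variable-ratio mean = {sum(vr) / len(vr)}")
```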

  16. PaCAL: A Python Package for Arithmetic Computations with Random Variables

    Directory of Open Access Journals (Sweden)

    Marcin Korzeń

    2014-05-01

    Full Text Available In this paper we present PaCAL, a Python package for arithmetical computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed on each step but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements or estimation of error distributions in scientific computations.
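
The core operation behind such a package, arithmetic on independent random variables via convolution, can be sketched for the discrete case (PaCAL itself works numerically with continuous densities; the dice example is an assumption for the demo, not PaCAL's API):

```python
from collections import defaultdict

def convolve(pmf_x, pmf_y):
    """PMF of X + Y for independent discrete X and Y (a convolution)."""
    out = defaultdict(float)
    for x, px in pmf_x.items():
        for y, py in pmf_y.items():
            out[x + y] += px * py
    return dict(out)

die = {k: 1 / 6 for k in range(1, 7)}
two_dice = convolve(die, die)
print(f"P(sum = 7) = {two_dice[7]:.4f}")  # 6/36 ≈ 0.1667
```

For continuous distributions the same idea becomes an integral convolution of densities, which PaCAL evaluates numerically and interpolates.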

  17. Maximal Inequalities for Dependent Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jorgensen, Jorgen

    2016-01-01

    Maximal inequalities play a crucial role in many probabilistic limit theorems; for instance, the law of large numbers, the law of the iterated logarithm, the martingale limit theorem and the central limit theorem. Let X_1, X_2, ... be random variables with partial sums S_k = X_1 + ... + X_k. Then a maximal inequality gives conditions ensuring that the maximal partial sum M_n = max(S_1, ..., S_n) ...

  18. Benford's law and continuous dependent random variables

    Science.gov (United States)

    Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine

    2018-01-01

    Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not 11% of the time, as one would expect if all digits were equally likely, but rather 30%. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grow, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates as well as introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the n ! entries in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converges to Benford's Law.
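
The Benford behavior discussed above can be checked numerically on a classic example, the leading digits of 2^n (a standard illustration, not one of the paper's fragmentation processes):

```python
import math

def leading_digit(m):
    """First (base-10) digit of a positive integer."""
    return int(str(m)[0])

N = 1000
counts = [0] * 10
for n in range(1, N + 1):
    counts[leading_digit(2 ** n)] += 1

# Benford's Law predicts P(first digit = d) = log10(1 + 1/d).
for d in range(1, 10):
    print(f"digit {d}: observed {counts[d] / N:.3f}  "
          f"Benford {math.log10(1 + 1 / d):.3f}")
```

The observed frequency of leading digit 1 is close to log10(2) ≈ 0.301, matching the 30% figure quoted above.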

  19. Hoeffding’s Inequality for Sums of Dependent Random Variables

    Czech Academy of Sciences Publication Activity Database

    Pelekis, Christos; Ramon, J.

    2017-01-01

    Vol. 14, No. 6 (2017), Article No. 243. ISSN 1660-5446. Institutional support: RVO:67985807. Keywords: dependent random variables * Hoeffding's inequality * k-wise independent random variables * martingale differences. Subject RIV: BA - General Mathematics. OECD field: Pure mathematics. Impact factor: 0.868, year: 2016
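
For context, the classical Hoeffding inequality for independent [0, 1]-valued variables, which this record extends to dependent settings, can be checked numerically (the uniform samples and parameters below are assumptions for the demo):

```python
import math
import random

def hoeffding_bound(n, t):
    """Hoeffding: P(sample mean - mu >= t) <= exp(-2 n t^2)
    for n i.i.d. variables taking values in [0, 1]."""
    return math.exp(-2 * n * t * t)

rng = random.Random(3)
n, t, trials = 200, 0.1, 20_000
exceed = 0
for _ in range(trials):
    mean = sum(rng.random() for _ in range(n)) / n  # Uniform[0,1], mu = 0.5
    if mean - 0.5 >= t:
        exceed += 1
empirical = exceed / trials
print(f"empirical tail {empirical:.4f} <= bound {hoeffding_bound(n, t):.4f}")
```

The empirical tail probability sits well below the exponential bound, as the inequality guarantees.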

  20. On the product and ratio of Bessel random variables

    Directory of Open Access Journals (Sweden)

    Saralees Nadarajah

    2005-01-01

    Full Text Available The distributions of products and ratios of random variables are of interest in many areas of the sciences. In this paper, the exact distributions of the product |XY| and the ratio |X/Y| are derived when X and Y are independent Bessel function random variables. An application of the results is provided by tabulating the associated percentage points.

  21. Reduction of the Random Variables of the Turbulent Wind Field

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2012-01-01

    Applicability of the Probability Density Evolution Method (PDEM) for realizing the evolution of the probability density for wind turbines has rather strict bounds on the basic number of the random variables involved in the model. The efficiency of most Advanced Monte Carlo (AMC) methods, i.e. Importance Sampling (IS) or Subset Simulation (SS), deteriorates on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand, the efficiency of the AMC methods is closely dependent on the design points of the problem. The presence of many random variables may increase the number of design points, hence affects...

  22. Designing neural networks that process mean values of random variables

    International Nuclear Information System (INIS)

    Barber, Michael J.; Clark, John W.

    2014-01-01

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence

  23. Designing neural networks that process mean values of random variables

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Michael J. [AIT Austrian Institute of Technology, Innovation Systems Department, 1220 Vienna (Austria); Clark, John W. [Department of Physics and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Centro de Ciências Matemáticas, Universidade de Madeira, 9000-390 Funchal (Portugal)

    2014-06-13

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence.

  24. Perfusion-CT guided intravenous thrombolysis in patients with unknown-onset stroke: a randomized, double-blind, placebo-controlled, pilot feasibility trial.

    Science.gov (United States)

    Michel, Patrik; Ntaios, George; Reichhart, Marc; Schindler, Christian; Bogousslavsky, Julien; Maeder, Philip; Meuli, Reto; Wintermark, Max

    2012-06-01

    Patients with unknown stroke onset are generally excluded from acute recanalisation treatments. We designed a pilot study to assess feasibility of a trial of perfusion computed tomography (PCT)-guided thrombolysis in patients with ischemic tissue at risk of infarction and unknown stroke onset. Patients with a supratentorial stroke of unknown onset in the middle cerebral artery territory and significant volume of at-risk tissue on PCT were randomized to intravenous thrombolysis with alteplase (0.9 mg/kg) or placebo. Feasibility endpoints were randomization and blinded treatment of patients within 2 h after hospital arrival, and the correct application (estimation) of the perfusion imaging criteria. At baseline, there was a trend towards older age [69.5 (57-78) vs. 49 (44-78) years] in the thrombolysis group (n = 6) compared to placebo (n = 6). Regarding feasibility, hospital arrival to treatment delay was above the allowed 2 h in three patients (25%). There were two protocol violations (17%) regarding PCT, both underestimating the predicted infarct in patients randomized in the placebo group. No symptomatic hemorrhage or death occurred during the first 7 days. Three of the four (75%) and one of the five (20%) patients were recanalized in the thrombolysis and placebo group, respectively. The volume of non-infarcted at-risk tissue was 84 (44-206) cm(3) in the treatment arm and 29 (8-105) cm(3) in the placebo arm. This pilot study shows that a randomized PCT-guided thrombolysis trial in patients with stroke of unknown onset may be feasible if issues such as treatment delays and reliable identification of tissue at risk of infarction are resolved. Safety and efficiency of such an approach need to be established.

  25. Perfusion-CT guided intravenous thrombolysis in patients with unknown-onset stroke: a randomized, double-blind, placebo-controlled, pilot feasibility trial

    International Nuclear Information System (INIS)

    Michel, Patrik; Ntaios, George; Reichhart, Marc; Schindler, Christian; Bogousslavsky, Julien; Maeder, Philip; Meuli, Reto; Wintermark, Max

    2012-01-01

    Patients with unknown stroke onset are generally excluded from acute recanalisation treatments. We designed a pilot study to assess feasibility of a trial of perfusion computed tomography (PCT)-guided thrombolysis in patients with ischemic tissue at risk of infarction and unknown stroke onset. Patients with a supratentorial stroke of unknown onset in the middle cerebral artery territory and significant volume of at-risk tissue on PCT were randomized to intravenous thrombolysis with alteplase (0.9 mg/kg) or placebo. Feasibility endpoints were randomization and blinded treatment of patients within 2 h after hospital arrival, and the correct application (estimation) of the perfusion imaging criteria. At baseline, there was a trend towards older age [69.5 (57-78) vs. 49 (44-78) years] in the thrombolysis group (n = 6) compared to placebo (n = 6). Regarding feasibility, hospital arrival to treatment delay was above the allowed 2 h in three patients (25%). There were two protocol violations (17%) regarding PCT, both underestimating the predicted infarct in patients randomized in the placebo group. No symptomatic hemorrhage or death occurred during the first 7 days. Three of the four (75%) and one of the five (20%) patients were recanalized in the thrombolysis and placebo group, respectively. The volume of non-infarcted at-risk tissue was 84 (44-206) cm(3) in the treatment arm and 29 (8-105) cm(3) in the placebo arm. This pilot study shows that a randomized PCT-guided thrombolysis trial in patients with stroke of unknown onset may be feasible if issues such as treatment delays and reliable identification of tissue at risk of infarction are resolved. Safety and efficiency of such an approach need to be established. (orig.)

  26. A Variable Impacts Measurement in Random Forest for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jae-Hee Hur

    2017-01-01

    Full Text Available Recently, the importance of mobile cloud computing has increased. Mobile devices can collect personal data from various sensors within a short period of time, and sensor-based data consist of valuable information about users. Advanced computational power and data-analysis technology based on cloud computing provide an opportunity to classify massive sensor data into given labels. The random forest algorithm is known as a black-box model whose hidden internal process is hard to interpret. In this paper, we propose a method that analyzes variable impact in the random forest algorithm to clarify which variable affects classification accuracy the most. We apply the Shapley value with random forest to analyze variable impact. Under the assumption that every variable cooperates as a player in a cooperative game, the Shapley value fairly distributes the payoff among the variables. Our proposed method calculates the relative contributions of the variables within the classification process. We analyze the influence of the variables and rank them by their effect on classification accuracy. Our proposed method proves its suitability for data interpretation in a black-box model such as a random forest, so the algorithm is applicable in mobile cloud computing environments.
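
The Shapley value used above can be computed exactly for a small number of variables by averaging marginal contributions over all orderings; the "accuracy" payoffs below are hypothetical numbers, not results from the paper:

```python
from itertools import permutations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values: average marginal contribution of each player
    over all orderings of the players."""
    phi = {p: 0.0 for p in players}
    n_orders = factorial(len(players))
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: total / n_orders for p, total in phi.items()}

# Toy "payoff": classification accuracy achieved by a subset of features
# (hypothetical numbers for illustration).
accuracy = {frozenset(): 0.50, frozenset({"a"}): 0.70, frozenset({"b"}): 0.65,
            frozenset({"a", "b"}): 0.90}
phi = shapley_values(["a", "b"], lambda s: accuracy[s])
print(phi)
```

By the efficiency property, the values sum to the accuracy gain of the full feature set over the empty set, so they are a fair split of the model's total performance.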

  27. Exponential Inequalities for Positively Associated Random Variables and Applications

    Directory of Open Access Journals (Sweden)

    Yang Shanchao

    2008-01-01

    Full Text Available Abstract We establish some exponential inequalities for positively associated random variables without the boundedness assumption. These inequalities improve the corresponding results obtained by Oliveira (2005). By one of the inequalities, we obtain the convergence rate for the case of geometrically decreasing covariances, which is close to the optimal achievable convergence rate for independent random variables under the Hartman-Wintner law of the iterated logarithm and improves the convergence rate derived by Oliveira (2005) for this case.

  28. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Full Text Available Abstract Background: Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results: Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion: We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and...

  9. Development of a localized probabilistic sensitivity method to determine random variable regional importance

    International Nuclear Information System (INIS)

    Millwater, Harry; Singh, Gulshan; Cortina, Miguel

    2012-01-01

    There are many methods to identify the important variable out of a set of random variables, i.e., “inter-variable” importance; however, to date there are no comparable methods to identify the “region” of importance within a random variable, i.e., “intra-variable” importance. Knowledge of the critical region of an input random variable (tail, near-tail, and central region) can provide valuable information towards characterizing, understanding, and improving a model through additional modeling or testing. As a result, an intra-variable probabilistic sensitivity method was developed and demonstrated for independent random variables that computes the partial derivative of a probabilistic response with respect to a localized perturbation in the CDF values of each random variable. These sensitivities are then normalized in absolute value with respect to the largest sensitivity within a distribution to indicate the region of importance. The methodology is implemented using the Score Function kernel-based method such that existing samples can be used to compute sensitivities for negligible cost. Numerical examples demonstrate the accuracy of the method through comparisons with finite difference and numerical integration quadrature estimates. - Highlights: ► Probabilistic sensitivity methodology. ► Determines the “region” of importance within random variables such as left tail, near tail, center, right tail, etc. ► Uses the Score Function approach to reuse the samples, hence, negligible cost. ► No restrictions on the random variable types or limit states.
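
    The Score Function idea of reusing existing samples for sensitivities can be sketched in its classical (non-localized) form: the derivative of a failure probability with respect to a distribution parameter is an expectation of the indicator times the score, estimated from the same Monte Carlo sample. This is a simplified relative of the paper's method, not its regional CDF perturbation; threshold and sample size are illustrative.

```python
import math
import random

random.seed(1)

b = 1.0                      # "failure" when X > b, X ~ N(mu, 1) at mu = 0
N = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]

# Failure probability and its sensitivity to the mean mu, both from the
# SAME samples: d/dmu ln phi(x; mu, 1) = (x - mu) is the score function.
p_fail = sum(1 for x in xs if x > b) / N
dp_dmu = sum(x for x in xs if x > b) / N   # estimate of E[1{X>b} * score]

# Analytic references: P = 1 - Phi(b), dP/dmu = phi(b)
phi_b = math.exp(-b * b / 2) / math.sqrt(2 * math.pi)
P_exact = 0.5 * math.erfc(b / math.sqrt(2))

print(p_fail, P_exact)    # ~0.1587
print(dp_dmu, phi_b)      # ~0.2420
```

The same sample is reused for every sensitivity, which is the "negligible cost" property highlighted above.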

  10. New Results On the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2015-01-01

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented.
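
    The single-GG approximation of the sum can be sketched numerically by moment matching: sample the sum, estimate its kurtosis, and invert the known GG kurtosis formula for the shape factor. This Monte Carlo sketch stands in for the paper's Fox H-based analysis; sampling uses the gamma transform |X|^beta ~ Gamma(1/beta, 1), and all constants are illustrative.

```python
import math
import random

random.seed(2)

def gg_sample(beta, scale=1.0):
    """Generalized Gaussian draw via the gamma transform:
    |X/scale|**beta ~ Gamma(1/beta, 1)."""
    g = random.gammavariate(1.0 / beta, 1.0)
    return random.choice((-1.0, 1.0)) * scale * g ** (1.0 / beta)

def gg_kurtosis(beta):
    """Kurtosis E[X^4]/E[X^2]^2 of a generalized Gaussian with shape beta."""
    lg = math.lgamma
    return math.exp(lg(5.0 / beta) + lg(1.0 / beta) - 2.0 * lg(3.0 / beta))

def match_shape(kurt, lo=0.3, hi=20.0):
    """Invert gg_kurtosis by bisection (kurtosis is decreasing in beta)."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if gg_kurtosis(mid) > kurt:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sum of two independent GG(beta=2) variables, i.e. two Gaussians; the
# matched single-GG shape factor should come back close to 2.
N = 200_000
s = [gg_sample(2.0) + gg_sample(2.0) for _ in range(N)]
m2 = sum(x * x for x in s) / N
m4 = sum(x ** 4 for x in s) / N
beta_hat = match_shape(m4 / (m2 * m2))
print(beta_hat)   # close to 2
```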

  11. New Results on the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2016-01-06

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented [1].

  12. New Results on the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza; Alouini, Mohamed-Slim

    2016-01-01

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented [1].

  13. Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Dragana Č. Pavlović

    2013-01-01

    Full Text Available The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and PDF of maximum of ratios μ1=R1/r1 and μ2=R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. Random variables R1 and R2, as well as random variables r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, PDF of maximum, PDF of minimum, and product moments of arbitrary number of ratios μi=Ri/ri, i=1,…,L are obtained. Random variables in numerator, Ri, as well as random variables in denominator, ri, are exponentially correlated. To the best of the authors' knowledge, analytical expressions for the PDF of minimum and product moments of {μi}i=1L are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of presented theoretical results is illustrated with respect to performance assessment of wireless systems.
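
    For the simplest member of this family, the ratio of two independent Rayleigh variables with equal scale, the CDF is available in closed form, P(R1/R2 <= u) = u^2/(1 + u^2), which makes a quick Monte Carlo cross-check possible. This sketch covers only the independent case; the correlated case treated in the paper is more involved.

```python
import math
import random

random.seed(3)

def rayleigh(sigma=1.0):
    """Rayleigh draw via the inverse CDF."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - random.random()))

# Empirical CDF of mu = R1/R2 for iid Rayleigh R1, R2 versus u^2/(1 + u^2).
N = 100_000
ratios = [rayleigh() / rayleigh() for _ in range(N)]

def ecdf(u):
    return sum(1 for r in ratios if r <= u) / N

for u in (0.5, 1.0, 2.0):
    print(u, ecdf(u), u * u / (1 + u * u))
```

Note the median of the ratio is 1, as symmetry suggests.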

  14. Partial summations of stationary sequences of non-Gaussian random variables

    DEFF Research Database (Denmark)

    Mohr, Gunnar; Ditlevsen, Ove Dalager

    1996-01-01

    The distribution of the sum of a finite number of identically distributed random variables is in many cases easily determined given that the variables are independent. The moments of any order of the sum can always be expressed by the moments of the single term without computational problems… of convergence of the distribution of a sum (or an integral) of mutually dependent random variables to the Gaussian distribution. The paper is closely related to the work in Ditlevsen et al. [Ditlevsen, O., Mohr, G. & Hoffmeyer, P. Integration of non-Gaussian fields. Prob. Engng Mech 11 (1996) 15-23] (2)… lognormal variables or polynomials of standard Gaussian variables. The dependency structure is induced by specifying the autocorrelation structure of the sequence of standard Gaussian variables. Particularly useful polynomials are the Winterstein approximations that distributionally fit with non…

  15. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    Science.gov (United States)

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
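
    The abstract's recursion uses Stirling numbers of the first kind; a closely related classical identity uses Stirling numbers of the second kind, E[X^k] = sum_j S2(k, j) (n)_j p^j with (n)_j the falling factorial, and is easy to verify directly against the binomial pmf. The sketch below implements that identity (an assumption in place of the paper's exact derivation).

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def S2(k, j):
    """Stirling numbers of the second kind via the standard recursion."""
    if k == j:
        return 1
    if j == 0 or j > k:
        return 0
    return j * S2(k - 1, j) + S2(k - 1, j - 1)

def falling(n, j):
    """Falling factorial n * (n-1) * ... * (n-j+1)."""
    out = 1
    for i in range(j):
        out *= n - i
    return out

def binom_raw_moment(k, n, p):
    """E[X^k] for X ~ Binomial(n, p): sum_j S2(k,j) * (n)_j * p^j."""
    return sum(S2(k, j) * falling(n, j) * p ** j for j in range(k + 1))

# Cross-check against direct summation over the pmf
n, p = 12, 0.3
for k in range(1, 5):
    direct = sum(x ** k * comb(n, x) * p ** x * (1 - p) ** (n - x)
                 for x in range(n + 1))
    print(k, binom_raw_moment(k, n, p), direct)
```

Central moments follow by binomially expanding (X - np)^k in terms of the raw moments.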

  16. A review of instrumental variable estimators for Mendelian randomization.

    Science.gov (United States)

    Burgess, Stephen; Small, Dylan S; Thompson, Simon G

    2017-10-01

    Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
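
    The ratio method mentioned above is simple enough to demonstrate on simulated data: the Wald/ratio estimate beta_ZY / beta_ZX recovers the causal effect even when ordinary regression of outcome on exposure is biased by an unmeasured confounder. The data-generating model and all coefficients below are illustrative assumptions, not taken from the paper.

```python
import random

random.seed(4)

N = 100_000
c_true = 0.3   # causal effect of exposure X on outcome Y

Z, X, Y = [], [], []
for _ in range(N):
    z = sum(random.random() < 0.3 for _ in range(2))   # genotype: 0/1/2 copies
    u = random.gauss(0, 1)                             # unmeasured confounder
    x = 0.5 * z + u + random.gauss(0, 1)               # instrument affects X
    y = c_true * x + u + random.gauss(0, 1)            # Z affects Y only via X
    Z.append(z); X.append(x); Y.append(y)

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

# Ratio (Wald) estimator: beta_ZY / beta_ZX == cov(Z, Y) / cov(Z, X)
ratio_est = cov(Z, Y) / cov(Z, X)
# Naive regression of Y on X is biased by the confounder U
ols_est = cov(X, Y) / cov(X, X)

print(ratio_est)   # close to 0.3
print(ols_est)     # biased upward (~0.77 with these settings)
```

Two-stage least squares reduces to this ratio in the single-instrument, no-covariate case.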

  17. A Particle Swarm Optimization Algorithm with Variable Random Functions and Mutation

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xiao-Jun; YANG Chun-Hua; GUI Wei-Hua; DONG Tian-Xue

    2014-01-01

    The convergence analysis of the standard particle swarm optimization (PSO) has shown that changing the random functions, the personal best and the group best has the potential to improve the performance of the PSO. In this paper, a novel strategy with variable random functions and polynomial mutation is introduced into the PSO, called the particle swarm optimization algorithm with variable random functions and mutation (PSO-RM). Random functions are adjusted with the density of the population so as to manipulate the weights of the cognition and social parts. Mutation is executed on both the personal best particle and the group best particle to explore new areas. Experimental results have demonstrated the effectiveness of the strategy.
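
    For readers unfamiliar with the baseline being modified, here is a minimal global-best PSO on the sphere function, with a crude perturbation of the best position standing in for the paper's mutation step. This is a vanilla sketch under standard constants; it does not reproduce PSO-RM's density-adaptive random functions or polynomial mutation.

```python
import random

random.seed(5)

def sphere(p):
    return sum(x * x for x in p)

DIM, SWARM, ITERS = 2, 30, 200
w, c1, c2 = 0.72, 1.49, 1.49          # standard inertia / acceleration constants

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
pbest_f = [sphere(p) for p in pos]
g = min(range(SWARM), key=lambda i: pbest_f[i])
gbest, gbest_f = pbest[g][:], pbest_f[g]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()   # the "random functions"
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        f = sphere(pos[i])
        if f < pbest_f[i]:
            pbest[i], pbest_f[i] = pos[i][:], f
            if f < gbest_f:
                gbest, gbest_f = pos[i][:], f
    # crude stand-in for the paper's mutation step: perturb a copy of gbest
    cand = [x + random.gauss(0, 0.1) for x in gbest]
    if sphere(cand) < gbest_f:
        gbest, gbest_f = cand, sphere(cand)

print(gbest_f)   # near 0
```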

  18. On mean square displacement behaviors of anomalous diffusions with variable and random orders

    International Nuclear Information System (INIS)

    Sun Hongguang; Chen Wen; Sheng Hu; Chen Yangquan

    2010-01-01

    Mean square displacement (MSD) is used to characterize anomalous diffusion. Recently, models of anomalous diffusion with variable-order and random-order were proposed, but no MSD analysis has been given so far. The purpose of this Letter is to offer a concise derivation of MSD functions for the variable-order model and the random-order model. Numerical results are presented to illustrate the analytical results. In addition, we show how to establish a variable-random-order model for a given MSD function which has clear application potentials.

  19. Limit theorems for multi-indexed sums of random variables

    CERN Document Server

    Klesov, Oleg

    2014-01-01

    Presenting the first unified treatment of limit theorems for multiple sums of independent random variables, this volume fills an important gap in the field. Several new results are introduced, even in the classical setting, as well as some new approaches that are simpler than those already established in the literature. In particular, new proofs of the strong law of large numbers and the Hajek-Renyi inequality are detailed. Applications of the described theory include Gibbs fields, spin glasses, polymer models, image analysis and random shapes. Limit theorems form the backbone of probability theory and statistical theory alike. The theory of multiple sums of random variables is a direct generalization of the classical study of limit theorems, whose importance and wide application in science is unquestionable. However, to date, the subject of multiple sums has only been treated in journals. The results described in this book will be of interest to advanced undergraduates, graduate students and researchers who ...

  20. Compound Poisson Approximations for Sums of Random Variables

    OpenAIRE

    Serfozo, Richard F.

    1986-01-01

    We show that a sum of dependent random variables is approximately compound Poisson when the variables are rarely nonzero and, given they are nonzero, their conditional distributions are nearly identical. We give several upper bounds on the total-variation distance between the distribution of such a sum and a compound Poisson distribution. Included is an example for Markovian occurrences of a rare event. Our bounds are consistent with those that are known for Poisson approximations for sums of...

  1. Randomized trial of intermittent or continuous amnioinfusion for variable decelerations.

    Science.gov (United States)

    Rinehart, B K; Terrone, D A; Barrow, J H; Isler, C M; Barrilleaux, P S; Roberts, W E

    2000-10-01

    To determine whether continuous or intermittent bolus amnioinfusion is more effective in relieving variable decelerations. Patients with repetitive variable decelerations were randomized to an intermittent bolus or continuous amnioinfusion. The intermittent bolus infusion group received boluses of 500 mL of normal saline, each over 30 minutes, with boluses repeated if variable decelerations recurred. The continuous infusion group received a bolus infusion of 500 mL of normal saline over 30 minutes and then 3 mL per minute until delivery occurred. The ability of the amnioinfusion to abolish variable decelerations was analyzed, as were maternal demographic and pregnancy outcome variables. Power analysis indicated that 64 patients would be required. Thirty-five patients were randomized to intermittent infusion and 30 to continuous infusion. There were no differences between groups in terms of maternal demographics, gestational age, delivery mode, neonatal outcome, median time to resolution of variable decelerations, or the number of times variable decelerations recurred. The median volume infused in the intermittent infusion group (500 mL) was significantly less than that in the continuous infusion group (905 mL, P =.003). Intermittent bolus amnioinfusion is as effective as continuous infusion in relieving variable decelerations in labor. Further investigation is necessary to determine whether either of these techniques is associated with increased occurrence of rare complications such as cord prolapse or uterine rupture.

  2. RESEARCH OF THE LAW OF DISTRIBUTION OF THE RANDOM VARIABLE OF THE COMPRESSION

    Directory of Open Access Journals (Sweden)

    I. Sarayeva

    2011-01-01

    Full Text Available In diagnosing modern automobile engines by means of mathematical statistics, the experimental data on the random variable of compression are analysed, and it is proved that the random variable of compression follows the normal law of distribution.

  3. Characteristics of quantum open systems: free random variables approach

    International Nuclear Information System (INIS)

    Gudowska-Nowak, E.; Papp, G.; Brickmann, J.

    1998-01-01

    Random Matrix Theory provides an interesting tool for modelling a number of phenomena where noises (fluctuations) play a prominent role. Various applications range from the theory of mesoscopic systems in nuclear and atomic physics to biophysical models, like Hopfield-type models of neural networks and protein folding. Random Matrix Theory is also used to study dissipative systems with broken time-reversal invariance, providing a setup for analysis of dynamic processes in condensed, disordered media. In the paper we use the Random Matrix Theory (RMT) within the formalism of Free Random Variables (alias Blue's functions), which allows one to characterize spectral properties of non-Hermitean ''Hamiltonians''. The relevance of using the Blue's function method is discussed in connection with application of non-Hermitean operators in various problems of physical chemistry. (author)

  4. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of random forests is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
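
    The "lagged predictor variables" setup the abstract evaluates amounts to building a lag matrix and fitting any regressor to it. To keep the sketch dependency-free, a closed-form least-squares AR(1) fit stands in for the random forest; the series parameters are illustrative.

```python
import random

random.seed(6)

# Simulate an AR(1) series: x_t = 0.6 * x_{t-1} + noise
phi = 0.6
x = [0.0]
for _ in range(5000):
    x.append(phi * x[-1] + random.gauss(0, 1))

# One lagged predictor: pairs (x_{t-1}, x_t) -- a lag matrix with L = 1.
pairs = list(zip(x[:-1], x[1:]))

# Least-squares slope through the origin; a random forest could be fitted
# to the same lag matrix in place of this linear model.
num = sum(a * b for a, b in pairs)
den = sum(a * a for a, _ in pairs)
phi_hat = num / den

forecast = phi_hat * x[-1]        # one-step-ahead forecast
print(phi_hat)                    # close to 0.6
```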

  5. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    Science.gov (United States)

    Bancroft, Stacie L.; Bourret, Jason C.

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time.…
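
    The distinction drawn above is easy to make concrete: a random-ratio (RR) schedule reinforces each response with a constant probability 1/n, while a variable-ratio (VR) schedule draws from a list of fixed requirements averaging n. The sketch below mirrors what such spreadsheet macros compute, in Python rather than Excel; the VR requirement list is an illustrative choice.

```python
import random

random.seed(7)

def random_ratio_schedule(n, responses):
    """Simulate an RR-n schedule: each response is reinforced
    independently with probability 1/n."""
    return [random.random() < 1.0 / n for _ in range(responses)]

def variable_ratio_requirements(n, count):
    """A VR-n schedule as a shuffled list of fixed requirements
    averaging n responses per reinforcer (count divisible by 5)."""
    reqs = [n - 2, n - 1, n, n + 1, n + 2] * (count // 5)
    random.shuffle(reqs)
    return reqs

outcomes = random_ratio_schedule(5, 100_000)
reinforcers = sum(outcomes)
print(100_000 / reinforcers)   # mean responses per reinforcer, close to 5
```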

  6. CONVERGENCE OF THE FRACTIONAL PARTS OF THE RANDOM VARIABLES TO THE TRUNCATED EXPONENTIAL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Bogdan Gheorghe Munteanu

    2013-01-01

    Full Text Available Using stochastic approximations, this paper studies the convergence in distribution of the fractional parts of sums of random variables to the truncated exponential distribution with parameter lambda. This is achieved by means of the Fourier-Stieltjes sequence (FSS) of the random variable.
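
    A special case makes the limit distribution tangible: for a single Exp(lambda) variable the identity is exact, since P({X} <= x) = (1 - e^(-lambda x)) / (1 - e^(-lambda)) on [0, 1). The Monte Carlo check below illustrates only this exact single-variable case, not the paper's FSS treatment of sums.

```python
import math
import random

random.seed(8)

lam = 1.5
N = 100_000
fracs = [random.expovariate(lam) % 1.0 for _ in range(N)]   # fractional parts

def trunc_exp_cdf(x, lam):
    """CDF of the exponential distribution truncated to [0, 1)."""
    return (1.0 - math.exp(-lam * x)) / (1.0 - math.exp(-lam))

for x in (0.25, 0.5, 0.75):
    emp = sum(1 for f in fracs if f <= x) / N
    print(x, emp, trunc_exp_cdf(x, lam))
```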

  7. Random sets and random fuzzy sets as ill-perceived random variables an introduction for Ph.D. students and practitioners

    CERN Document Server

    Couso, Inés; Sánchez, Luciano

    2014-01-01

    This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on their use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...

  8. Output variability caused by random seeds in a multi-agent transport simulation model

    DEFF Research Database (Denmark)

    Paulsen, Mads; Rasmussen, Thomas Kjær; Nielsen, Otto Anker

    2018-01-01

    Dynamic transport simulators are intended to support decision makers in transport-related issues, and as such it is valuable that the random variability of their outputs is as small as possible. In this study we analyse the output variability caused by random seeds of a multi-agent transport simulator (MATSim) when applied to a case study of Santiago de Chile. Results based on 100 different random seeds show that the relative accuracies of estimated link loads tend to increase with link load, but that relative errors of up to 10 % do occur even for links with large volumes. Although...

  9. Strong Laws of Large Numbers for Arrays of Rowwise NA and LNQD Random Variables

    Directory of Open Access Journals (Sweden)

    Jiangfeng Wang

    2011-01-01

    Full Text Available Some strong laws of large numbers and strong convergence properties for arrays of rowwise negatively associated and linearly negative quadrant dependent random variables are obtained. The results obtained not only generalize the result of Hu and Taylor to negatively associated and linearly negative quadrant dependent random variables, but also improve it.

  10. A modification of the successive projections algorithm for spectral variable selection in the presence of unknown interferents.

    Science.gov (United States)

    Soares, Sófacles Figueredo Carreiro; Galvão, Roberto Kawakami Harrop; Araújo, Mário César Ugulino; da Silva, Edvan Cirino; Pereira, Claudete Fernandes; de Andrade, Stéfani Iury Evangelista; Leite, Flaviano Carvalho

    2011-03-09

    This work proposes a modification to the successive projections algorithm (SPA) aimed at selecting spectral variables for multiple linear regression (MLR) in the presence of unknown interferents not included in the calibration data set. The modified algorithm favours the selection of variables in which the effect of the interferent is less pronounced. The proposed procedure can be regarded as an adaptive modelling technique, because the spectral features of the samples to be analyzed are considered in the variable selection process. The advantages of this new approach are demonstrated in two analytical problems, namely (1) ultraviolet-visible spectrometric determination of tartrazine, allura red and sunset yellow in aqueous solutions under the interference of erythrosine, and (2) near-infrared spectrometric determination of ethanol in gasoline under the interference of toluene. In these case studies, the performance of conventional MLR-SPA models is substantially degraded by the presence of the interferent. This problem is circumvented by applying the proposed Adaptive MLR-SPA approach, which results in prediction errors smaller than those obtained by three other multivariate calibration techniques, namely stepwise regression, full-spectrum partial-least-squares (PLS) and PLS with variables selected by a genetic algorithm. An inspection of the variable selection results reveals that the Adaptive approach successfully avoids spectral regions in which the interference is more intense. Copyright © 2011 Elsevier B.V. All rights reserved.
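
    The core of the (unmodified) SPA is a greedy loop: pick the column with the largest norm, project it out of the remaining candidates, and repeat, so that nearly collinear variables are never selected together. A minimal sketch of that loop, with a toy three-column example rather than real spectra:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_out(v, u):
    """Component of v orthogonal to u (u assumed nonzero)."""
    c = dot(v, u) / dot(u, u)
    return [vi - c * ui for vi, ui in zip(v, u)]

def spa(columns, k):
    """Successive projections: greedily pick k columns, each time taking the
    one with the largest norm after projecting out those already chosen."""
    cols = [c[:] for c in columns]
    chosen = []
    for _ in range(k):
        i = max((j for j in range(len(cols)) if j not in chosen),
                key=lambda j: dot(cols[j], cols[j]))
        chosen.append(i)
        cols = [c if j in chosen else project_out(c, cols[i])
                for j, c in enumerate(cols)]
    return chosen

# Toy "wavelength" vectors: column 1 is just 2x column 0 (collinear),
# column 2 is orthogonal to both. SPA picks column 1 (largest norm) and
# then must skip the collinear column 0 in favour of column 2.
cols = [[1.0, 1.0, 0.0], [2.0, 2.0, 0.0], [0.0, 0.0, 1.0]]
print(spa(cols, 2))   # [1, 2]
```

The paper's adaptive modification additionally weights candidates by how strongly the interferent expresses in each spectral region, which this sketch does not attempt.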

  11. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60 % without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Extended q-Gaussian and q-exponential distributions from gamma random variables

    Science.gov (United States)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity parameter q. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.

  13. Piecewise linearisation of the first order loss function for families of arbitrarily distributed random variables

    NARCIS (Netherlands)

    Rossi, R.; Hendrix, E.M.T.

    2014-01-01

    We discuss the problem of computing optimal linearisation parameters for the first order loss function of a family of arbitrarily distributed random variables. We demonstrate that, in contrast to the problem in which parameters must be determined for the loss function of a single random variable,

  14. An infinite-dimensional weak KAM theory via random variables

    KAUST Repository

    Gomes, Diogo A.

    2016-08-31

    We develop several aspects of the infinite-dimensional Weak KAM theory using a random variables' approach. We prove that the infinite-dimensional cell problem admits a viscosity solution that is a fixed point of the Lax-Oleinik semigroup. Furthermore, we show the existence of invariant minimizing measures and calibrated curves defined on R.

  15. An infinite-dimensional weak KAM theory via random variables

    KAUST Repository

    Gomes, Diogo A.; Nurbekyan, Levon

    2016-01-01

    We develop several aspects of the infinite-dimensional Weak KAM theory using a random variables' approach. We prove that the infinite-dimensional cell problem admits a viscosity solution that is a fixed point of the Lax-Oleinik semigroup. Furthermore, we show the existence of invariant minimizing measures and calibrated curves defined on R.

  16. Extensions of von Neumann's method for generating random variables

    International Nuclear Information System (INIS)

    Monahan, J.F.

    1979-01-01

    Von Neumann's method of generating random variables with the exponential distribution and Forsythe's method for obtaining distributions with densities of the form e^(-G(x)) are generalized to apply to certain power series representations. The flexibility of the power series methods is illustrated by algorithms for the Cauchy and geometric distributions
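
    Von Neumann's original exponential generator, the starting point these extensions generalize, uses only uniform draws and comparisons: the length K of a decreasing run of uniforms starting at U satisfies P(K odd, U <= t) = 1 - e^(-t) on [0, 1], and each rejected attempt shifts the result by one unit. A direct implementation:

```python
import random

random.seed(9)

def vn_exponential():
    """Von Neumann's algorithm: an Exp(1) variate from uniforms only.
    Accept the first uniform of a decreasing run of odd length; every
    even-length (rejected) run shifts the result to the next unit interval."""
    shift = 0.0
    while True:
        u = first = random.random()
        k = 1
        while True:
            u2 = random.random()
            if u2 > u:            # the decreasing run ends here
                break
            u, k = u2, k + 1
        if k % 2 == 1:            # odd run length: accept
            return shift + first
        shift += 1.0              # even run length: reject and shift

samples = [vn_exponential() for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to 1, the mean of Exp(1)
```

Forsythe's method replaces the implicit e^(-t) by general e^(-G(t)) via the same run-length trick.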

  17. Multifocal, chronic osteomyelitis of unknown etiology

    International Nuclear Information System (INIS)

    Kozlowski, K.; Beluffi, G.; Feltham, C.; James, M.; Nespoli, L.; Tamaela, L.; Pavia Univ.; Municipal Hospital, Nelson; Medical School, Jakarta

    1985-01-01

    Four cases of multifocal osteomyelitis of unknown origin in childhood are reported. The variable clinical and radiographic appearances of the disease are illustrated and the diagnostic difficulties in the early stages of the disease are stressed. (orig.) [de

  18. Quantum circuits cannot control unknown operations

    International Nuclear Information System (INIS)

    Araújo, Mateus; Feix, Adrien; Costa, Fabio; Brukner, Časlav

    2014-01-01

    One of the essential building blocks of classical computer programs is the ‘if’ clause, which executes a subroutine depending on the value of a control variable. Similarly, several quantum algorithms rely on applying a unitary operation conditioned on the state of a control system. Here we show that this control cannot be performed by a quantum circuit if the unitary is completely unknown. The task remains impossible even if we allow the control to be done modulo a global phase. However, this no-go theorem does not prevent implementing quantum control of unknown unitaries in practice, as any physical implementation of an unknown unitary provides additional information that makes the control possible. We then argue that one should extend the quantum circuit formalism to capture this possibility in a straightforward way. This is done by allowing unknown unitaries to be applied to subspaces and not only to subsystems. (paper)

  19. How a dependent's variable non-randomness affects taper equation ...

    African Journals Online (AJOL)

    In order to apply the least squares method in regression analysis, the values of the dependent variable Y should be random. In an example of regression analysis linear and nonlinear taper equations, which estimate the diameter of the tree dhi at any height of the tree hi, were compared. For each tree the diameter at the ...

  20. SOERP, Statistics and 2. Order Error Propagation for Function of Random Variables

    International Nuclear Information System (INIS)

    Cox, N. D.; Miller, C. F.

    1985-01-01

    1 - Description of problem or function: SOERP computes second-order error propagation equations for the first four moments of a function of independently distributed random variables. SOERP was written for a rigorous second-order error propagation of any function which may be expanded in a multivariable Taylor series, the input variables being independently distributed. The required input consists of numbers directly related to the partial derivatives of the function, evaluated at the nominal values of the input variables and the central moments of the input variables from the second through the eighth. 2 - Method of solution: The development of equations for computing the propagation of errors begins by expressing the function of random variables in a multivariable Taylor series expansion. The Taylor series expansion is then truncated, and statistical operations are applied to the series in order to obtain equations for the moments (about the origin) of the distribution of the computed value. If the Taylor series is truncated after powers of two, the procedure produces second-order error propagation equations. 3 - Restrictions on the complexity of the problem: The maximum number of component variables allowed is 30. The IBM version will only process one set of input data per run
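
    The flavor of second-order propagation can be shown on the simplest nontrivial case, f(X, Y) = X*Y with independent inputs: the only surviving second-order Taylor term is the mixed partial, giving Var[f] ≈ mu_y^2 s_x^2 + mu_x^2 s_y^2 + s_x^2 s_y^2 (exact for a product of independent variables). This toy check is not SOERP itself, which handles general functions and moments up to the eighth order; the input moments are illustrative.

```python
import random

random.seed(10)

# Second-order propagation for f(X, Y) = X * Y, X and Y independent.
# At the means: f_x = mu_y, f_y = mu_x, f_xy = 1, pure second partials = 0, so
#   E[f]   ~= mu_x * mu_y
#   Var[f] ~= mu_y^2 s_x^2 + mu_x^2 s_y^2 + s_x^2 s_y^2
mx, sx = 3.0, 0.5
my, sy = 2.0, 0.4

mean_prop = mx * my
var_prop = my ** 2 * sx ** 2 + mx ** 2 * sy ** 2 + sx ** 2 * sy ** 2

# Monte Carlo cross-check with Gaussian inputs
N = 200_000
vals = [random.gauss(mx, sx) * random.gauss(my, sy) for _ in range(N)]
m = sum(vals) / N
v = sum((x - m) ** 2 for x in vals) / N
print(mean_prop, m)    # 6.0 vs ~6.0
print(var_prop, v)     # 2.48 vs ~2.48
```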

  1. Higher order moments of a sum of random variables: remarks and applications.

    Directory of Open Access Journals (Sweden)

    Luisa Tibiletti

    1996-02-01

    Full Text Available The moments of a sum of random variables depend on both the pure moments of each random addendum and on the addendum mixed moments. In this note we introduce a simple measure to evaluate the relative importance to attach to the latter. Once the pure moments are fixed, the functional relation between the random addenda leading to the extreme values is also provided. Applications to Finance, Decision Theory and Actuarial Sciences are also suggested.

  2. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Ahmed, Sajid

    2016-01-13

    Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.
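The equiprobable-region mapping that the abstract alludes to can be sketched with standard-library tools: partition the standard normal into M regions of equal probability and assign one M-PSK symbol per region. This is an illustrative reconstruction under our own assumptions; the exact mapping function used in the patent record may differ.

```python
import cmath
import math
import random
from collections import Counter
from statistics import NormalDist

# Sketch of the equiprobable-region idea: partition the standard normal into
# M regions of equal probability and map each region to one M-PSK symbol.
# The mapping below is illustrative; the record's exact construction may differ.
M = 4
nd = NormalDist()
thresholds = [nd.inv_cdf(k / M) for k in range(1, M)]   # normal M-quantiles

def to_psk(g: float) -> complex:
    """Return the PSK symbol for the region containing the Gaussian sample g."""
    region = sum(g > t for t in thresholds)             # region index 0..M-1
    return cmath.exp(1j * 2 * math.pi * region / M)

random.seed(0)
symbols = [to_psk(random.gauss(0.0, 1.0)) for _ in range(100_000)]
counts = Counter(complex(round(s.real, 6), round(s.imag, 6)) for s in symbols)
print({k: round(v / len(symbols), 3) for k, v in counts.items()})
# each of the 4 symbols occurs with probability close to 0.25
```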

  3. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Ahmed, Sajid; Alouini, Mohamed-Slim; Jardak, Seifallah

    2016-01-01

    Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.

  4. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Possibility/Necessity-Based Probabilistic Expectation Models for Linear Programming Problems with Discrete Fuzzy Random Variables

    Directory of Open Access Journals (Sweden)

    Hideki Katagiri

    2017-10-01

    This paper considers linear programming problems (LPPs) where the objective functions involve discrete fuzzy random variables (fuzzy set-valued discrete random variables). New decision making models, which are useful in fuzzy stochastic environments, are proposed based on both possibility theory and probability theory. In multi-objective cases, Pareto optimal solutions of the proposed models are newly defined. Computational algorithms for obtaining the Pareto optimal solutions of the proposed models are provided. It is shown that problems involving discrete fuzzy random variables can be transformed into deterministic nonlinear mathematical programming problems which can be solved through a conventional mathematical programming solver under practically reasonable assumptions. A numerical example of agriculture production problems is given to demonstrate the applicability of the proposed models to real-world problems in fuzzy stochastic environments.

  6. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and a 10⁻⁴ probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
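The under-estimation problem the report warns about is easy to reproduce. The sketch below (our own choice of distribution and sample size, not taken from the report) draws repeated 5-sample batches from a skewed lognormal and counts how often the sample variance falls below the true variance:

```python
import math
import random
import statistics

# Illustration of the under-estimation the report warns about: with only a few
# samples from a skewed distribution, the sample variance usually comes out
# below the true variance. Distribution and sample size are arbitrary choices.
random.seed(42)
mu, sigma = 0.0, 1.0
true_var = (math.exp(sigma**2) - 1) * math.exp(2 * mu + sigma**2)  # lognormal

n, trials = 5, 20_000
below = sum(
    statistics.variance(random.lognormvariate(mu, sigma) for _ in range(n)) < true_var
    for _ in range(trials)
)
print(below / trials)   # well above 0.5: the typical 5-sample estimate is too low
```

The sample variance is unbiased in expectation, but its distribution is so right-skewed here that most individual estimates are too low, which is exactly why conservative bounding strategies are needed.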

  7. Simulation-based production planning for engineer-to-order systems with random yield

    NARCIS (Netherlands)

    Akcay, Alp; Martagan, Tugce

    2018-01-01

    We consider an engineer-to-order production system with unknown yield. We model the yield as a random variable which represents the percentage output obtained from one unit of production quantity. We develop a beta-regression model in which the mean value of the yield depends on the unique

  8. Learning Unknown Structure in CRFs via Adaptive Gradient Projection Method

    Directory of Open Access Journals (Sweden)

    Wei Xue

    2016-08-01

    We study the problem of fitting probabilistic graphical models to given data when the structure is not known. More specifically, we focus on learning unknown structure in conditional random fields, that is, learning both the structure and the parameters of a conditional random field model simultaneously. To do this, we first formulate the learning problem as a convex minimization problem by adding an l_2-regularization to the node parameters and a group l_1-regularization to the edge parameters, and then propose a gradient-based projection method to solve it, which combines an adaptive stepsize selection strategy with a nonmonotone line search. Extensive simulation experiments are presented to show the performance of our approach in solving unknown structure learning problems.

  9. Random-Resistor-Random-Temperature Kirchhoff-Law-Johnson-Noise (RRRT-KLJN) Key Exchange

    Directory of Open Access Journals (Sweden)

    Kish Laszlo B.

    2016-03-01

    We introduce two new Kirchhoff-law-Johnson-noise (KLJN) secure key distribution schemes which are generalizations of the original KLJN scheme. The first of these, the Random-Resistor (RR) KLJN scheme, uses random resistors with values chosen from a quasi-continuum set. It has been well known since the creation of the KLJN concept that such a system could work in cryptography, because Alice and Bob can calculate the unknown resistance value from measurements, but the RR-KLJN system has not been addressed in prior publications since it was considered impractical. The reason for discussing it now is the second scheme, the Random-Resistor-Random-Temperature (RRRT) KLJN key exchange, inspired by a recent paper of Vadai, Mingesz and Gingl, wherein security was shown to be maintained at non-zero power flow. In the RRRT-KLJN secure key exchange scheme, both the resistances and their temperatures are continuum random variables. We prove that the security of the RRRT-KLJN scheme can prevail at a non-zero power flow, and thus the physical law guaranteeing security is not the Second Law of Thermodynamics but the Fluctuation-Dissipation Theorem. Alice and Bob know their own resistances and temperatures and can calculate the resistance and temperature values at the other end of the communication channel from measured voltage, current and power-flow data in the wire. However, Eve cannot determine these values because, for her, there are four unknown quantities while she can set up only three equations. The RRRT-KLJN scheme has several advantages and makes all former attacks on the KLJN scheme invalid or incomplete.

  10. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    Science.gov (United States)

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  11. Sums and Products of Jointly Distributed Random Variables: A Simplified Approach

    Science.gov (United States)

    Stein, Sheldon H.

    2005-01-01

    Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…
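The identities the article builds on are easy to verify numerically. The sketch below (our own toy construction, with X and Y made dependent through a shared Gaussian component) checks the variance-of-a-sum and expectation-of-a-product formulas:

```python
import random

# Numerical check of the identities the article builds on, using correlated
# X and Y constructed from a shared Gaussian component:
#   Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
#   E[XY]      = E[X] E[Y] + Cov(X, Y)
random.seed(7)
N = 200_000
xs, ys = [], []
for _ in range(N):
    z = random.gauss(0, 1)                      # shared component
    xs.append(2.0 + z + 0.5 * random.gauss(0, 1))
    ys.append(1.0 + z)

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

sums = [x + y for x, y in zip(xs, ys)]
lhs_var, rhs_var = cov(sums, sums), cov(xs, xs) + cov(ys, ys) + 2 * cov(xs, ys)
lhs_exy = mean([x * y for x, y in zip(xs, ys)])
rhs_exy = mean(xs) * mean(ys) + cov(xs, ys)
print(lhs_var, rhs_var)   # identical up to floating point
print(lhs_exy, rhs_exy)   # equal up to the n/(n-1) convention in cov
```

The variance identity holds exactly for sample moments (same denominator on both sides); the product identity differs only by the n/(n-1) factor in the sample covariance.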

  12. Central limit theorem for the Banach-valued weakly dependent random variables

    International Nuclear Information System (INIS)

    Dmitrovskij, V.A.; Ermakov, S.V.; Ostrovskij, E.I.

    1983-01-01

    The central limit theorem (CLT) for the Banach-valued weakly dependent random variables is proved. In proving the CLT, convergence of the finite-dimensional (i.e., cylindrical) distributions is established. A weak compactness of the family of measures generated by a certain sequence is confirmed. The continuity of the limiting field is checked.

  13. Free random variables

    CERN Document Server

    Voiculescu, Dan; Nica, Alexandru

    1992-01-01

    This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.

  14. Antimicrobial drugs for persistent diarrhoea of unknown or non-specific cause in children under six in low and middle income countries: systematic review of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Hart C Anthony

    2009-03-01

    Background A high proportion of children with persistent diarrhoea in middle and low income countries die. The best treatment is not clear. We conducted a systematic review to evaluate the effectiveness of antimicrobial drug treatment for persistent diarrhoea of unknown or non-specific cause. Methods We included randomized comparisons of antimicrobial drugs for the treatment of persistent diarrhoea of unknown or non-specific cause in children under the age of six years in low and middle income countries. We searched the electronic databases MEDLINE, EMBASE, LILACS, WEB OF SCIENCE, and the Cochrane Central Register of Controlled Trials (CENTRAL) to May 2008 for relevant randomized or quasi randomized controlled trials. We summarised the characteristics of the eligible trials, assessed their quality using standard criteria, and extracted relevant outcomes data. Where appropriate, we combined the results of different trials. Results Three trials from South East Asia and one from Guatemala were included, all were small, and three had adequate allocation concealment. Two were in patients with diarrhoea of unknown cause, and two were in patients in whom known bacterial or parasitological causes of diarrhoea had been excluded. No difference was demonstrated for oral gentamicin compared with placebo (presence of diarrhoea at 6 or 7 days; 2 trials, n = 151); and for metronidazole compared with placebo (presence of diarrhoea at 3, 5 and 7 days; 1 trial, n = 99). In one small trial, sulphamethoxazole-trimethoprim appeared better than placebo in relation to diarrhoea at seven days and total stool volume (n = 55). Conclusion There is little evidence as to whether or not antimicrobials help treat persistent diarrhoea in young children in low and middle income countries.

  15. Antimicrobial drugs for persistent diarrhoea of unknown or non-specific cause in children under six in low and middle income countries: systematic review of randomized controlled trials

    Science.gov (United States)

    2009-01-01

    Background A high proportion of children with persistent diarrhoea in middle and low income countries die. The best treatment is not clear. We conducted a systematic review to evaluate the effectiveness of antimicrobial drug treatment for persistent diarrhoea of unknown or non-specific cause. Methods We included randomized comparisons of antimicrobial drugs for the treatment of persistent diarrhoea of unknown or non-specific cause in children under the age of six years in low and middle income countries. We searched the electronic databases MEDLINE, EMBASE, LILACS, WEB OF SCIENCE, and the Cochrane Central Register of Controlled Trials (CENTRAL) to May 2008 for relevant randomized or quasi randomized controlled trials. We summarised the characteristics of the eligible trials, assessed their quality using standard criteria, and extracted relevant outcomes data. Where appropriate, we combined the results of different trials. Results Three trials from South East Asia and one from Guatemala were included, all were small, and three had adequate allocation concealment. Two were in patients with diarrhoea of unknown cause, and two were in patients in whom known bacterial or parasitological causes of diarrhoea had been excluded. No difference was demonstrated for oral gentamicin compared with placebo (presence of diarrhoea at 6 or 7 days; 2 trials, n = 151); and for metronidazole compared with placebo (presence of diarrhoea at 3, 5 and 7 days; 1 trial, n = 99). In one small trial, sulphamethoxazole-trimethoprim appeared better than placebo in relation to diarrhoea at seven days and total stool volume (n = 55). Conclusion There is little evidence as to whether or not antimicrobials help treat persistent diarrhoea in young children in low and middle income countries. PMID:19257885

  16. AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM

    International Nuclear Information System (INIS)

    Farrell, Sean A.; Murphy, Tara; Lo, Kitty K.

    2015-01-01

    In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.

  17. Non-uniform approximations for sums of discrete m-dependent random variables

    OpenAIRE

    Vellaisamy, P.; Cekanavicius, V.

    2013-01-01

    Non-uniform estimates are obtained for Poisson, compound Poisson, translated Poisson, negative binomial and binomial approximations to sums of m-dependent integer-valued random variables. Estimates for the Wasserstein metric also follow easily from our results. The results are then exemplified by the approximation of the Poisson binomial distribution, 2-runs, and m-dependent (k₁, k₂)-events.

  18. How Far Is Quasar UV/Optical Variability from a Damped Random Walk at Low Frequency?

    Energy Technology Data Exchange (ETDEWEB)

    Guo Hengxiao; Wang Junxian; Cai Zhenyi; Sun Mouyuan, E-mail: hengxiaoguo@gmail.com, E-mail: jxw@ustc.edu.cn [CAS Key Laboratory for Research in Galaxies and Cosmology, Department of Astronomy, University of Science and Technology of China, Hefei 230026 (China)

    2017-10-01

    Studies have shown that UV/optical light curves of quasars can be described using the prevalent damped random walk (DRW) model, also known as the Ornstein–Uhlenbeck process. A white noise power spectral density (PSD) is expected at low frequency in this model; however, a direct observational constraint to the low-frequency PSD slope is difficult due to the limited lengths of the light curves available. Meanwhile, quasars show scatter in their DRW parameters that is too large to be attributed to uncertainties in the measurements and dependence on the variation of known physical factors. In this work we present simulations showing that, if the low-frequency PSD deviates from the DRW, the red noise leakage can naturally produce large scatter in the variation parameters measured from simulated light curves. The steeper the low-frequency PSD slope, the larger scatter we expect. Based on observations of SDSS Stripe 82 quasars, we find that the low-frequency PSD slope should be no steeper than −1.3. The actual slope could be flatter, which consequently requires that the quasar variabilities should be influenced by other unknown factors. We speculate that the magnetic field and/or metallicity could be such additional factors.
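A damped random walk is straightforward to simulate with the exact discretization of the Ornstein-Uhlenbeck process; the sketch below (arbitrary parameters of our choosing, not the paper's) checks that the autocorrelation decays as exp(-Δt/τ):

```python
import math
import random

# Exact-discretization simulation of a damped random walk (the
# Ornstein-Uhlenbeck process the paper refers to). Parameters are arbitrary.
random.seed(3)
tau, sigma_inf = 50.0, 1.0          # damping timescale, asymptotic std dev
dt, n = 1.0, 200_000
a = math.exp(-dt / tau)
b = sigma_inf * math.sqrt(1.0 - a * a)

x, xs = 0.0, []
for _ in range(n):
    x = a * x + b * random.gauss(0.0, 1.0)
    xs.append(x)

def acf(v, k):
    """Sample autocorrelation at lag k."""
    m = sum(v) / len(v)
    num = sum((v[i] - m) * (v[i + k] - m) for i in range(len(v) - k))
    return num / sum((u - m) ** 2 for u in v)

# For a DRW the autocorrelation at lag k is exp(-k * dt / tau).
print(acf(xs, 10), math.exp(-10 * dt / tau))
```

The exponential autocorrelation is equivalent to the white-noise PSD at frequencies well below 1/τ that the paper's low-frequency test targets.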

  19. Stable Graphical Model Estimation with Random Forests for Discrete, Continuous, and Mixed Variables

    OpenAIRE

    Fellinghauer, Bernd; Bühlmann, Peter; Ryffel, Martin; von Rhein, Michael; Reinhardt, Jan D.

    2011-01-01

    A conditional independence graph is a concise representation of pairwise conditional independence among many variables. Graphical Random Forests (GRaFo) are a novel method for estimating pairwise conditional independence relationships among mixed-type, i.e. continuous and discrete, variables. The number of edges is a tuning parameter in any graphical model estimator and there is no obvious number that constitutes a good choice. Stability Selection helps choosing this parameter with respect to...

  20. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant and irrelevant variables, and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results…

  1. A cellular automata model of traffic flow with variable probability of randomization

    International Nuclear Information System (INIS)

    Zheng Wei-Fan; Zhang Ji-Ye

    2015-01-01

    Research on the stochastic behavior of traffic flow is important to understand the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of vehicles before him, and his decision-making process is related to the interactional potential. Compared with the traditional cellular automata model, the modeling is more suitable for the driver’s random decision-making process based on the vehicle and traffic situations in front of him in actual traffic. From the improved model, the fundamental diagram (flow–density relationship) is obtained, and the detailed high-density traffic phenomenon is reproduced through numerical simulation. (paper)
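The idea of a gap-dependent randomization probability can be sketched with a minimal Nagel-Schreckenberg automaton. The `p_random` rule below is a simple stand-in of our own for the paper's interactional potential, whose exact form differs:

```python
import random

# Minimal Nagel-Schreckenberg cellular automaton in which the randomization
# probability depends on the gap to the car ahead -- a simple stand-in for the
# "interactional potential" of the paper (whose exact form differs).
random.seed(0)
L, ncars, vmax, steps = 200, 40, 5, 300
speed = {c: 0 for c in random.sample(range(L), ncars)}   # position -> speed

def p_random(gap):
    """Shorter gap -> stronger interaction -> more likely random braking."""
    return 0.5 if gap <= 2 else 0.1

flow = 0
for _ in range(steps):
    cars = sorted(speed)
    new = {}
    for i, c in enumerate(cars):
        gap = (cars[(i + 1) % ncars] - c - 1) % L        # cells to car ahead
        v = min(speed[c] + 1, vmax, gap)                 # accelerate, no collision
        if v > 0 and random.random() < p_random(gap):
            v -= 1                                       # randomization step
        new[(c + v) % L] = v
        flow += v
    speed = new
print(flow / (steps * L))   # flow: vehicles passing per cell per time step
```

Sweeping the density (ncars / L) and recording this flow traces out the fundamental diagram discussed in the abstract.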

  2. On bounds in Poisson approximation for distributions of independent negative-binomial distributed random variables.

    Science.gov (United States)

    Hung, Tran Loc; Giang, Le Truong

    2016-01-01

    Using the Stein-Chen method some upper bounds in Poisson approximation for distributions of row-wise triangular arrays of independent negative-binomial distributed random variables are established in this note.

  3. Multiobjective Two-Stage Stochastic Programming Problems with Interval Discrete Random Variables

    Directory of Open Access Journals (Sweden)

    S. K. Barik

    2012-01-01

    Most real-life decision-making problems have more than one conflicting and incommensurable objective function. In this paper, we present a multiobjective two-stage stochastic linear programming problem considering some parameters of the linear constraints as interval-type discrete random variables with known probability distribution. Randomness of the discrete intervals is considered for the model parameters. Further, the concepts of best optimum and worst optimum solution are analyzed in two-stage stochastic programming. To solve the stated problem, first we remove the randomness of the problem and formulate an equivalent deterministic linear programming model with multiobjective interval coefficients. Then the deterministic multiobjective model is solved using the weighting method, where we apply the solution procedure of the interval linear programming technique. We obtain the upper and lower bounds of the objective function as the best and the worst value, respectively. It highlights the possible risk involved in the decision-making tool. A numerical example is presented to demonstrate the proposed solution procedure.

  4. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Jardak, Seifallah

    2014-09-01

    Correlated waveforms have a number of applications in different fields, such as radar and communication. It is very easy to generate correlated waveforms using infinite alphabets, but for some applications it is very challenging to use them in practice. Moreover, to generate infinite alphabet constant envelope correlated waveforms, the available research uses iterative algorithms, which are computationally very expensive. In this work, we propose simple novel methods to generate correlated waveforms using finite alphabet constant- and non-constant-envelope symbols. To generate finite alphabet waveforms, the proposed method maps the Gaussian random variables onto phase-shift-keying, pulse-amplitude, and quadrature-amplitude modulation schemes. For such mapping, the probability density function of the Gaussian random variables is divided into M regions, where M is the number of alphabets in the corresponding modulation scheme. By exploiting the mapping function, the relationship between the cross-correlation of Gaussian and finite alphabet symbols is derived. To generate equiprobable symbols, the area of each region is kept the same. If the requirement is to have each symbol with its own unique probability, the proposed scheme allows that as well. Although the proposed scheme is general, the main focus of this paper is to generate finite alphabet waveforms for multiple-input multiple-output radar, where correlated waveforms are used to achieve desired beampatterns. © 2014 IEEE.

  5. Using randomized variable practice in the treatment of childhood apraxia of speech.

    Science.gov (United States)

    Skelton, Steven L; Hagopian, Aubrie Lynn

    2014-11-01

    The purpose of this study was to determine if randomized variable practice, a central component of concurrent treatment, would be effective and efficient in treating childhood apraxia of speech (CAS). Concurrent treatment is a treatment program that takes the speech task hierarchy and randomizes it so that all tasks are worked on in one session. Previous studies have shown the treatment program to be effective and efficient in treating phonological and articulation disorders. The program was adapted to be used with children with CAS. A research design of multiple baselines across participants was used. Probes of generalization to untaught words were administered every fifth session. Three children, ranging in age from 4 to 6 years old, were the participants. Data were collected as percent correct productions during baseline, treatment, and probes of generalization of target sounds to untaught words and three-word phrases. All participants showed an increase in correct productions during treatment and during probes. Effect sizes (standard mean difference) for treatment were 3.61-5.00, and for generalization probes, they were 3.15-8.51. The results obtained from this study suggest that randomized variable practice as used in concurrent treatment can be adapted for use in treating children with CAS. Replication of this study with other children presenting CAS will be needed to establish generality of the findings.

  6. Chaos Synchronization Based on Unknown Input Proportional Multiple-Integral Fuzzy Observer

    Directory of Open Access Journals (Sweden)

    T. Youssef

    2013-01-01

    This paper presents an unknown input Proportional Multiple-Integral Observer (PIO) for synchronization of chaotic systems based on Takagi-Sugeno (TS) fuzzy chaotic models subject to unmeasurable decision variables and unknown input. In a secure communication configuration, this unknown input is regarded as a message encoded in the chaotic system and recovered by the proposed PIO. Both the states and outputs of the fuzzy chaotic models are subject to a polynomial unknown input whose kth derivative is zero. Using Lyapunov stability theory, sufficient design conditions for synchronization are proposed. The PIO gain matrices are obtained by resolving linear matrix inequality (LMI) constraints. Simulation results show, through two TS fuzzy chaotic models, the validity of the proposed method.

  7. Problems of variance reduction in the simulation of random variables

    International Nuclear Information System (INIS)

    Lessi, O.

    1987-01-01

    The definition of the uniform linear generator is given, and some of the most commonly used tests to evaluate the uniformity and the independence of the obtained determinations are listed. The problem of calculating, through simulation, some moment W of a function of a random variable is taken into account. The Monte Carlo method enables the moment W to be estimated and the estimator variance to be obtained. Some techniques for the construction of other estimators of W with a reduced variance are introduced.
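One classic technique in this family is antithetic variates. The sketch below (our own standard textbook example, not taken from the record) estimates E[exp(U)] for U ~ Uniform(0, 1), whose exact value is e - 1, pairing each draw u with 1 - u so the two estimates are negatively correlated:

```python
import math
import random
import statistics

# Antithetic variates, a classic variance-reduction technique of the kind the
# record surveys. Estimate E[exp(U)] for U ~ Uniform(0, 1); the exact value
# is e - 1. Pairing u with 1 - u gives negatively correlated estimates.
random.seed(11)
n = 20_000

plain = [math.exp(random.random()) for _ in range(2 * n)]        # 2n draws

anti = []
for _ in range(n):                                               # n pairs
    u = random.random()
    anti.append(0.5 * (math.exp(u) + math.exp(1.0 - u)))

print(statistics.mean(plain), statistics.mean(anti))     # both near e - 1
print(statistics.variance(plain), statistics.variance(anti))
# the per-pair antithetic variance is far below the per-sample plain variance
```

Both estimators use the same number of exp evaluations (2n), yet the antithetic one has a much smaller variance because Cov(exp(U), exp(1-U)) is negative.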

  8. On the Distribution of Indefinite Quadratic Forms in Gaussian Random Variables

    KAUST Repository

    Al-Naffouri, Tareq Y.

    2015-10-30

    © 2015 IEEE. In this work, we propose a unified approach to evaluating the CDF and PDF of indefinite quadratic forms in Gaussian random variables. Such a quantity appears in many applications in communications, signal processing, information theory, and adaptive filtering. For example, this quantity appears in the mean-square-error (MSE) analysis of the normalized least-mean-square (NLMS) adaptive algorithm, and in the SINR associated with each beam in beamforming applications. The trick of the proposed approach is to replace the inequalities that appear in the CDF calculation with unit step functions and to use a complex integral representation of the unit step function. Complex integration then allows us to evaluate the CDF in closed form for the zero-mean case and as a one-dimensional integral for the non-zero-mean case. Utilizing the saddle point technique allows us to closely approximate such integrals in the non-zero-mean case. We demonstrate how our approach can be extended to other scenarios, such as the joint distribution of quadratic forms and ratios of such forms, and to characterize quadratic forms in isotropically distributed random variables. We also evaluate the outage probability in multiuser beamforming using our approach, to provide an application of indefinite forms in communications.
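As a point of reference for the closed-form results the paper derives, the CDF of an indefinite quadratic form can always be estimated by brute-force Monte Carlo. The sketch below uses an arbitrary diagonal form of our own choosing with mixed-sign weights:

```python
import bisect
import random

# Monte Carlo sketch of the CDF of an indefinite quadratic form
# Q = sum_i lambda_i * X_i^2 with X_i iid N(0, 1) -- the quantity the paper
# evaluates in closed form. The weights (mixed signs) are arbitrary.
random.seed(5)
lam = [2.0, 1.0, -1.5]
n = 100_000

samples = sorted(
    sum(l * random.gauss(0.0, 1.0) ** 2 for l in lam) for _ in range(n)
)

def cdf(x):
    """Empirical CDF of Q."""
    return bisect.bisect_right(samples, x) / n

print(cdf(0.0), cdf(5.0))   # P(Q <= 0) and P(Q <= 5)
```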

  9. An edgeworth expansion for a sum of M-Dependent random variables

    Directory of Open Access Journals (Sweden)

    Wan Soo Rhee

    1985-01-01

    Given a sequence X₁, X₂, …, Xₙ of m-dependent random variables with moments of order 3 + α (0 < α ≦ 1), we give an Edgeworth expansion of the distribution of Sσ⁻¹ (where S = X₁ + X₂ + … + Xₙ and σ² = ES²) under the assumption that E[exp(itSσ⁻¹)] is small away from the origin. The result is of the best possible order.
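The flavor of such an expansion can be seen in the independent case, which is a special case of m-dependence with m = 0. Below is the standard one-term Edgeworth correction for a standardized sum of iid Exp(1) variables (skewness 2), checked against Monte Carlo; the setup is our own illustration, not the paper's:

```python
import bisect
import math
import random
from statistics import NormalDist

# One-term Edgeworth correction for the CDF of a standardized sum of n iid
# variables -- here Exp(1) summands (skewness 2), an independent-case
# illustration of the type of expansion the paper proves for m-dependent
# variables.
nd = NormalDist()
n, skew = 20, 2.0

def edgeworth_cdf(x):
    phi = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return nd.cdf(x) - phi * skew * (x * x - 1) / (6 * math.sqrt(n))

random.seed(9)
trials = 100_000
vals = sorted(
    (sum(random.expovariate(1.0) for _ in range(n)) - n) / math.sqrt(n)
    for _ in range(trials)
)

results = {}
for x in (-1.5, 0.0, 1.5):
    results[x] = bisect.bisect_right(vals, x) / trials
    print(x, results[x], edgeworth_cdf(x), nd.cdf(x))
# the Edgeworth value tracks the empirical CDF more closely than the plain
# normal approximation wherever the skewness term is non-negligible
```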

  10. Analysis of Secret Key Randomness Exploiting the Radio Channel Variability

    Directory of Open Access Journals (Sweden)

    Taghrid Mazloum

    2015-01-01

    Full Text Available A few years ago, physical-layer-based techniques started to be considered as a way to improve security in wireless communications. A well-known problem is the management of ciphering keys, regarding both the generation and the distribution of these keys. A way to alleviate such difficulties is to use a common source of randomness for the legitimate terminals that is not accessible to an eavesdropper. This is the case of the fading propagation channel, when exact or approximate reciprocity applies. Although this principle has long been known, few works have evaluated how radio channel properties in practical environments affect the degree of randomness of the generated keys. To this end, we investigate indoor radio channel measurements in different environments and settings in either the 2.4625 GHz or the 5.4 GHz band, of particular interest for WiFi-related standards. Key bits are extracted by quantizing the complex channel coefficients, and their randomness is evaluated using the NIST test suite. We then look at the impact of the carrier frequency and of the channel variability in the space, time, and frequency degrees of freedom used to construct a long secret key, in relation to the nature of the radio environment, such as its LOS/NLOS character.
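
A minimal sketch of the key-extraction step described above; sign quantization of the real and imaginary parts is one simple quantizer, and the simulated Rayleigh channel and sizes are assumptions, not the paper's measurement setup:

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated reciprocal Rayleigh-fading coefficients as observed by the
# legitimate terminals (illustrative stand-in for measured channels).
h = (rng.standard_normal(128) + 1j * rng.standard_normal(128)) / np.sqrt(2)

def quantize(coeffs):
    # One key bit per real and per imaginary part: sign quantization.
    bits = np.concatenate([(coeffs.real > 0), (coeffs.imag > 0)])
    return bits.astype(int)

key = quantize(h)
print(len(key), key[:8])
```

In practice the resulting bit stream would then be fed to the NIST test suite to assess its randomness.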

  11. A simulation study on estimating biomarker-treatment interaction effects in randomized trials with prognostic variables.

    Science.gov (United States)

    Haller, Bernhard; Ulm, Kurt

    2018-02-20

    To individualize treatment decisions based on patient characteristics, identification of an interaction between a biomarker and treatment is necessary. Often such potential interactions are analysed using data from randomized clinical trials intended for the comparison of two treatments. Tests of interaction often lack statistical power, and we investigated if and how the consideration of further prognostic variables can improve power and decrease the bias of estimated biomarker-treatment interactions in randomized clinical trials with time-to-event outcomes. A simulation study was performed to assess how prognostic factors affect the estimate of the biomarker-treatment interaction for a time-to-event outcome when different approaches, such as ignoring other prognostic factors, including all available covariates, or using variable selection strategies, are applied. Different scenarios regarding the proportion of censored observations, the correlation structure between the covariate of interest and further potential prognostic variables, and the strength of the interaction were considered. The simulation study revealed that, in a regression model for estimating a biomarker-treatment interaction, the probability of detecting the interaction can be increased by including prognostic variables that are associated with the outcome, and that the interaction estimate is biased when relevant prognostic variables are not considered. However, the probability of a false-positive finding increases if too many potential predictors are included or if variable selection is performed inadequately. We recommend undertaking an adequate literature search before data analysis to derive information about potential prognostic variables, so as to gain power for detecting true interaction effects, and pre-specifying analyses to avoid selective reporting and increased false-positive rates.

  12. Fitting Nonlinear Ordinary Differential Equation Models with Random Effects and Unknown Initial Conditions Using the Stochastic Approximation Expectation-Maximization (SAEM) Algorithm.

    Science.gov (United States)

    Chow, Sy-Miin; Lu, Zhaohua; Sherwood, Andrew; Zhu, Hongtu

    2016-03-01

    The past decade has seen an increased prevalence of irregularly spaced longitudinal data in the social sciences. Clearly lacking, however, are modeling tools that allow researchers to fit dynamic models to irregularly spaced data, particularly data that show nonlinearity and heterogeneity in dynamical structures. We consider the issue of fitting multivariate nonlinear differential equation models with random effects and unknown initial conditions to irregularly spaced data. A stochastic approximation expectation-maximization (SAEM) algorithm is proposed, and its performance is evaluated using a benchmark nonlinear dynamical systems model, namely, the Van der Pol oscillator equations. The empirical utility of the proposed technique is illustrated using a set of 24-h ambulatory cardiovascular data from 168 men and women. Pertinent methodological challenges and unresolved issues are discussed.
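
The benchmark model named above can be sketched numerically; the forward-Euler integrator, step size, and initial conditions below are illustrative choices, not part of the SAEM procedure itself:

```python
import numpy as np

# Forward-Euler integration of the Van der Pol oscillator
#   x'' - mu * (1 - x^2) * x' + x = 0,
# the benchmark nonlinear dynamical systems model named above.
mu, dt, steps = 1.0, 0.001, 20_000
x, v = 2.0, 0.0  # illustrative initial conditions
trajectory = np.empty(steps)
for i in range(steps):
    a = mu * (1.0 - x * x) * v - x   # acceleration from the ODE
    x, v = x + dt * v, v + dt * a
    trajectory[i] = x
print(trajectory.min(), trajectory.max())
```

The trajectory settles onto the well-known limit cycle of amplitude roughly 2, which makes the model a convenient benchmark for parameter-recovery studies.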

  13. A Neural Network Controller for Variable-Speed Variable-Pitch Wind Energy Conversion Systems Using Generalized Minimum Entropy Criterion

    Directory of Open Access Journals (Sweden)

    Mifeng Ren

    2014-01-01

    Full Text Available This paper considers the neural network controller design problem for variable pitch wind energy conversion systems (WECS with non-Gaussian wind speed disturbances in the stochastic distribution control framework. The approach here is used to directly model the unknown control law based on a fixed neural network (the number of layers and nodes in a neural network is fixed without the need to construct a separate model for the WECS. In order to characterize the randomness of the WECS, a generalized minimum entropy criterion is established to train connection weights of the neural network. For the train purpose, both kernel density estimation method and sliding window technique are adopted to estimate the PDF of tracking error and entropies. Due to the unknown process dynamics, the gradient of the objective function in a gradient-descent-type algorithm is estimated using an incremental perturbation method. The proposed approach is illustrated on a simulated WECS with non-Gaussian wind speed.

  14. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources, and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
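
The classification-margin idea used above for flagging anomalous sources can be sketched as follows; the probability vectors are made up, whereas a real pipeline would obtain them from the trained Random Forest:

```python
import numpy as np

# Per-source class-probability vectors (illustrative values). The
# classification margin is the gap between the two highest class
# probabilities; a small margin flags an ambiguous, potentially
# anomalous source.
proba = np.array([
    [0.90, 0.05, 0.05],   # confidently classified
    [0.40, 0.35, 0.25],   # ambiguous
    [0.55, 0.40, 0.05],
])
top2 = np.sort(proba, axis=1)[:, -2:]   # two largest probabilities
margin = top2[:, 1] - top2[:, 0]
anomalous = margin < 0.10               # assumed margin threshold
print(margin, anomalous)
```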

  15. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources, and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  16. Prevalence and Impact of Unknown Diabetes in the ICU.

    Science.gov (United States)

    Carpenter, David L; Gregg, Sara R; Xu, Kejun; Buchman, Timothy G; Coopersmith, Craig M

    2015-12-01

    Many patients with diabetes and their care providers are unaware of the presence of the disease. Dysglycemia, encompassing hyperglycemia, hypoglycemia, and glucose variability, is common in the ICU in patients with and without diabetes. The purpose of this study was to determine the impact of unknown diabetes on glycemic control in the ICU. Prospective observational study. Nine ICUs in an academic, tertiary hospital and a hybrid academic/community hospital. Hemoglobin A1c levels were ordered at all ICU admissions from March 1, 2011 to September 30, 2013. Electronic medical records were examined for a history of antihyperglycemic medications or an International Classification of Diseases, 9th Edition diagnosis of diabetes. Patients were categorized as having unknown diabetes (hemoglobin A1c > 6.5%, without a history of diabetes), no diabetes (hemoglobin A1c < 6.5%, without a history of diabetes), or known diabetes (with a documented history of diabetes). None. A total of 15,737 patients had a hemoglobin A1c and a medical record evaluable for the history of diabetes, and 5,635 patients had diabetes diagnosed by either medical history or an elevated hemoglobin A1c in the ICU. Of these, 1,460 patients had unknown diabetes, accounting for 26.0% of all patients with diabetes. This represented 41.0% of patients with a hemoglobin A1c > 6.5% and 9.3% of all ICU patients. Compared with patients without diabetes, patients with unknown diabetes had a higher likelihood of requiring an insulin infusion (44.3% vs 29.3%; p < 0.0001), a higher incidence of hyperglycemia (blood glucose > 180 mg/dL; p < 0.0001) and hypoglycemia (8.9% vs 2.5%; blood glucose < 70 mg/dL; p < 0.0001), higher glycemic variability (55.6 vs 28.8, average of patient SD of glucose; p < 0.0001), and increased mortality (13.8% vs 11.4%; p = 0.01). Patients with unknown diabetes represent a significant percentage of ICU admissions. Measurement of hemoglobin A1c at admission can prospectively identify a population that is not known to have diabetes but has significant challenges in glycemic control in the ICU.

  17. Nonlinear Estimation of Discrete-Time Signals Under Random Observation Delay

    International Nuclear Information System (INIS)

    Caballero-Aguila, R.; Jimenez-Lopez, J. D.; Hermoso-Carazo, A.; Linares-Perez, J.; Nakamori, S.

    2008-01-01

    This paper presents an approximation to the nonlinear least-squares estimation problem of discrete-time stochastic signals using nonlinear observations with additive white noise which can be randomly delayed by one sampling time. The observation delay is modelled by a sequence of independent Bernoulli random variables whose values, zero or one, indicate that the real observation arrives on time or it is delayed and, hence, the available measurement to estimate the signal is not up-to-date. Assuming that the state-space model generating the signal is unknown and only the covariance functions of the processes involved in the observation equation are ready for use, a filtering algorithm based on linear approximations of the real observations is proposed.
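
The randomly delayed observation model can be sketched as follows; the signal, noise level, and delay probability are illustrative assumptions, not the paper's covariance-based filter:

```python
import numpy as np

rng = np.random.default_rng(3)
# Observations y_k of a signal s_k, each independently delayed by one
# sampling time with probability p; gamma_k = 1 means "delayed", so the
# receiver actually measures the previous sample of the signal.
n, p = 200, 0.3
s = np.cumsum(rng.standard_normal(n))          # an arbitrary signal
noise = 0.1 * rng.standard_normal(n)
gamma = rng.random(n) < p                      # Bernoulli delay indicators
y = np.where(gamma, np.roll(s, 1), s) + noise  # delayed -> previous sample
y[0] = s[0] + noise[0]                         # no earlier sample exists
print(gamma.mean())
```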

  18. A new reliability measure based on specified minimum distances before the locations of random variables in a finite interval

    International Nuclear Information System (INIS)

    Todinov, M.T.

    2004-01-01

    A new reliability measure is proposed and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the models proposed, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level and a minimum required availability. It is demonstrated that setting reliability requirements solely based on an availability target does not necessarily mean a low risk of premature failure. Even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level
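
The probability of maintaining a specified minimum gap between random variables in a finite interval can be checked by simulation; for a fixed number of uniformly placed points the exact value is a standard result, which the sketch below reproduces (parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
# Probability that all gaps between n uniformly placed points on [0, L]
# are at least g. For the n - 1 interior spacings of uniform order
# statistics the exact value is (1 - (n - 1) * g / L) ** n
# whenever (n - 1) * g <= L.
L, g, n, trials = 10.0, 0.5, 5, 100_000
pts = np.sort(rng.random((trials, n)) * L, axis=1)
ok = np.all(np.diff(pts, axis=1) >= g, axis=1)
estimate = ok.mean()
exact = (1 - (n - 1) * g / L) ** n
print(estimate, exact)
```

With these parameters the clustering probability 1 − 0.8⁵ ≈ 0.67 is substantial, echoing the abstract's point that clustering should not be neglected even at moderate number densities.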

  19. Equivalent conditions of complete moment convergence for extended negatively dependent random variables

    Directory of Open Access Journals (Sweden)

    Qunying Wu

    2017-05-01

    Full Text Available Abstract In this paper, we study the equivalent conditions of complete moment convergence for sequences of identically distributed extended negatively dependent random variables. As a result, we extend and generalize some results of complete moment convergence obtained by Chow (Bull. Inst. Math. Acad. Sin. 16:177-201, 1988) and Li and Spătaru (J. Theor. Probab. 18:933-947, 2005) from the i.i.d. case to extended negatively dependent sequences.

  20. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  1. Generating Correlated QPSK Waveforms By Exploiting Real Gaussian Random Variables

    KAUST Repository

    Jardak, Seifallah

    2012-11-01

    The design of waveforms with specified auto- and cross-correlation properties has a number of applications in multiple-input multiple-output (MIMO) radar, one of which is transmit beampattern design. In this work, an algorithm is proposed to generate quadrature phase shift-keying (QPSK) waveforms with required cross-correlation properties using real Gaussian random variables (RVs). This work can be considered as an extension of what was presented in [1] to generate BPSK waveforms. It will be extended further to the generation of correlated higher-order phase shift-keying (PSK) and quadrature amplitude modulation (QAM) schemes that can better approximate the desired beampattern.
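
A hedged sketch of the underlying idea: correlated real Gaussians are hard-limited into QPSK symbols, and by the arcsine law the symbol streams inherit a controlled cross-correlation. The target correlation, sizes, and the arcsine-law check are illustrative, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(5)
# rho is an assumed target correlation for the underlying real
# Gaussians (not for the QPSK symbols themselves).
rho, n = 0.8, 50_000
cov = np.array([[1.0, rho], [rho, 1.0]])
Lc = np.linalg.cholesky(cov)
gi = Lc @ rng.standard_normal((2, n))   # correlated in-phase Gaussians
gq = Lc @ rng.standard_normal((2, n))   # correlated quadrature Gaussians
qpsk = (np.sign(gi) + 1j * np.sign(gq)) / np.sqrt(2)  # unit-energy QPSK
# Arcsine law: E[sign(g1) sign(g2)] = (2/pi) * arcsin(rho), so the
# cross-correlation of the two QPSK streams is (2/pi) * arcsin(rho).
xcorr = np.mean(qpsk[0] * np.conj(qpsk[1])).real
print(xcorr)
```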

  2. Generating Correlated QPSK Waveforms By Exploiting Real Gaussian Random Variables

    KAUST Repository

    Jardak, Seifallah; Ahmed, Sajid; Alouini, Mohamed-Slim

    2012-01-01

    The design of waveforms with specified auto- and cross-correlation properties has a number of applications in multiple-input multiple-output (MIMO) radar, one of which is transmit beampattern design. In this work, an algorithm is proposed to generate quadrature phase shift-keying (QPSK) waveforms with required cross-correlation properties using real Gaussian random variables (RVs). This work can be considered as an extension of what was presented in [1] to generate BPSK waveforms. It will be extended further to the generation of correlated higher-order phase shift-keying (PSK) and quadrature amplitude modulation (QAM) schemes that can better approximate the desired beampattern.

  3. Spontaneous temporal changes and variability of peripheral nerve conduction analyzed using a random effects model

    DEFF Research Database (Denmark)

    Krøigård, Thomas; Gaist, David; Otto, Marit

    2014-01-01

    SUMMARY: The reproducibility of variables commonly included in studies of peripheral nerve conduction in healthy individuals has not previously been analyzed using a random effects regression model. We examined the temporal changes and variability of standard nerve conduction measures in the leg … reexamined after 2 and 26 weeks. There was no change in the variables except for a minor decrease in sural nerve sensory action potential amplitude and a minor increase in tibial nerve minimal F-wave latency. Reproducibility was best for peroneal nerve distal motor latency and motor conduction velocity, sural nerve sensory conduction velocity, and tibial nerve minimal F-wave latency. Between-subject variability was greater than within-subject variability. Sample sizes ranging from 21 to 128 would be required to show changes twice the magnitude of the spontaneous changes observed in this study. Nerve…

  4. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    Full Text Available The weakest pre-expectation calculus has proved to be a mature theory for analyzing quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task, we present the technique of random variable abstraction (RVA) and postulate a sufficient condition for achieving exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems, including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.

  5. The Distribution of Minimum of Ratios of Two Random Variables and Its Application in Analysis of Multi-hop Systems

    Directory of Open Access Journals (Sweden)

    A. Stankovic

    2012-12-01

    Full Text Available The distributions of random variables are of interest in many areas of science. In this paper, given the importance of multi-hop transmission in contemporary wireless communication systems operating over fading channels in the presence of cochannel interference, the probability density functions (PDFs) of the minimum of an arbitrary number of ratios of Rayleigh, Rician, Nakagami-m, Weibull and α-µ random variables are derived. These expressions can be used to study the outage probability as an important multi-hop system performance measure. Various numerical results complement the proposed mathematical analysis.
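
The outage-probability use of such distributions can be sketched by simulation for the simplest (Rayleigh) case; the hop count, threshold, and unit scales are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
# Multi-hop link: the end-to-end quality is the minimum over hops of
# the ratio (desired Rayleigh envelope / interfering Rayleigh
# envelope); outage occurs when that minimum drops below a threshold t.
# For i.i.d. unit-scale Rayleighs, X^2/Y^2 is a ratio of exponentials,
# so P(X/Y < t) = t^2 / (1 + t^2) per hop and
# P(min over L hops < t) = 1 - (1 / (1 + t^2)) ** L.
hops, t, n = 3, 1.0, 200_000
x = rng.rayleigh(size=(n, hops))   # desired-signal envelopes
y = rng.rayleigh(size=(n, hops))   # interference envelopes
ratio_min = np.min(x / y, axis=1)
outage = np.mean(ratio_min < t)
exact = 1 - (1 / (1 + t ** 2)) ** hops
print(outage, exact)
```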

  6. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists’ advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.
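
A minimal illustration of the random-drift ingredient only: a neutral Moran model with fixed population size, whose fixation probability is exactly i/N. The paper's variable-capacity model is more elaborate; this sketch just shows the drift baseline against which any altruist advantage is measured:

```python
import numpy as np

rng = np.random.default_rng(7)
# Neutral Moran model: a population of fixed size N with i copies of an
# allele. Under pure drift the fixation probability is exactly i / N.
N, i0, runs = 20, 5, 2_000
fixed = 0
for _ in range(runs):
    i = i0
    while 0 < i < N:
        birth_is_a = rng.random() < i / N   # parent drawn by frequency
        death_is_a = rng.random() < i / N   # death drawn uniformly
        i += int(birth_is_a) - int(death_is_a)
    fixed += (i == N)
p_fix = fixed / runs
print(p_fix)  # close to i0 / N = 0.25
```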

  7. Robust Control for the Segway with Unknown Control Coefficient and Model Uncertainties

    Directory of Open Access Journals (Sweden)

    Byung Woo Kim

    2016-06-01

    Full Text Available The Segway, which is a popular vehicle nowadays, is an uncertain nonlinear system with an unknown time-varying control coefficient. Thus, we should consider both the unknown time-varying control coefficient and model uncertainties when designing the controller. Motivated by this observation, we propose a robust control for the Segway with an unknown control coefficient and model uncertainties. To deal with the time-varying unknown control coefficient, we employ the Nussbaum gain technique. We introduce an auxiliary variable to solve the underactuated problem. Owing to the prescribed performance control technique, the proposed controller does not require adaptive techniques, neural networks, or fuzzy logic to compensate for the uncertainties, and can therefore be simple. Using Lyapunov stability theory, we prove that all signals in the closed-loop system are bounded. Finally, we provide simulation results to demonstrate the effectiveness of the proposed control scheme.

  8. Accounting for unknown foster dams in the genetic evaluation of embryo transfer progeny.

    Science.gov (United States)

    Suárez, M J; Munilla, S; Cantet, R J C

    2015-02-01

    Animals born by embryo transfer (ET) are usually not included in the genetic evaluation of beef cattle for preweaning growth if the recipient dam is unknown. This is primarily to avoid potential bias in the estimation of the unknown age of dam. We present a method that allows including records of calves with unknown age of dam. Assumptions are as follows: (i) foster cows belong to the same breed being evaluated, (ii) there is no correlation between the breeding value (BV) of the calf and the maternal BV of the recipient cow, and (iii) cows of all ages are used as recipients. We examine the issue of bias for the fixed level of unknown age of dam (AOD) and propose an estimator of the effect based on classical measurement error theory (MEM) and a Bayesian approach. Using stochastic simulation under random mating or selection, the MEM estimating equations were compared with BLUP in two situations as follows: (i) full information (FI); (ii) missing AOD information on some dams. Predictions of breeding value (PBV) from the FI situation had the smallest empirical average bias followed by PBV obtained without taking measurement error into account. In turn, MEM displayed the highest bias, although the differences were small. On the other hand, MEM showed the smallest MSEP, for either random mating or selection, followed by FI, whereas ignoring measurement error produced the largest MSEP. As a consequence from the smallest MSEP with a relatively small bias, empirical accuracies of PBV were larger for MEM than those for full information, which in turn showed larger accuracies than the situation ignoring measurement error. It is concluded that MEM equations are a useful alternative for analysing weaning weight data when recipient cows are unknown, as it mitigates the effects of bias in AOD by decreasing MSEP. © 2014 Blackwell Verlag GmbH.

  9. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the importance of potentially influential parameters through the percent increase of the mean squared error: as this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the variables selected by the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. Of the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature, and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature, suggesting that these three variables have high predictive capability in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from the RF algorithm's predictions could be used to monitor the progression of malaria and assist in intervention and prevention efforts.
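
The percent-increase-in-MSE idea behind RF variable importance can be sketched with a permutation test; to stay dependency-free, the sketch below scores a plain least-squares fit rather than a random forest, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(8)
# Permuting an influential predictor should increase the mean squared
# error far more than permuting a pure-noise one; the percent increase
# is the importance score (in the spirit of the RF %IncMSE measure).
n = 2_000
X = rng.standard_normal((n, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n)  # col 2: noise

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
base_mse = np.mean((y - X @ beta) ** 2)

importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # break column j's link to y
    mse = np.mean((y - Xp @ beta) ** 2)
    importance.append(100.0 * (mse - base_mse) / base_mse)
print(importance)  # column 0 should dominate
```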

  10. Bottom-up and Top-down Input Augment the Variability of Cortical Neurons

    Science.gov (United States)

    Nassi, Jonathan J.; Kreiman, Gabriel; Born, Richard T.

    2016-01-01

    SUMMARY Neurons in the cerebral cortex respond inconsistently to a repeated sensory stimulus, yet they underlie our stable sensory experiences. Although the nature of this variability is unknown, its ubiquity has encouraged the general view that each cell produces random spike patterns that noisily represent its response rate. In contrast, here we show that reversibly inactivating distant sources of either bottom-up or top-down input to cortical visual areas in the alert primate reduces both the spike train irregularity and the trial-to-trial variability of single neurons. A simple model in which a fraction of the pre-synaptic input is silenced can reproduce this reduction in variability, provided that there exist temporal correlations primarily within, but not between, excitatory and inhibitory input pools. A large component of the variability of cortical neurons may therefore arise from synchronous input produced by signals arriving from multiple sources. PMID:27427459

  11. Physical activity, mindfulness meditation, or heart rate variability biofeedback for stress reduction: a randomized controlled trial

    NARCIS (Netherlands)

    van der Zwan, J.E.; de Vente, W.; Huizink, A.C.; Bögels, S.M.; de Bruin, E.I.

    2015-01-01

    In contemporary western societies stress is highly prevalent, therefore the need for stress-reducing methods is great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing

  12. An MGF-based unified framework to determine the joint statistics of partial sums of ordered random variables

    KAUST Repository

    Nam, Sungsik; Alouini, Mohamed-Slim; Yang, Hongchuan

    2010-01-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs).

  13. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-05-02

    General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function, and reduces the systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventilation. The PROtective VARiable ventilation trial ('PROVAR') is a single-center, randomized controlled trial enrolling 50 patients scheduled for open abdominal surgery expected to last longer than 3 hours. PROVAR compares conventional (non-variable) lung protective ventilation (CV) with variable lung protective ventilation (VV) regarding pulmonary function and inflammatory response. The primary endpoint of the study is the forced vital capacity on the first postoperative day. Secondary endpoints include further lung function tests, plasma cytokine levels, spatial distribution of ventilation assessed by means of electrical impedance tomography, and postoperative pulmonary complications. We hypothesize that VV improves lung function and reduces the systemic inflammatory response compared to CV in patients receiving mechanical ventilation during general anesthesia for open abdominal surgery lasting longer than 3 hours. PROVAR is the first randomized controlled trial aiming at intra- and postoperative effects of VV on lung function. This study may help to define the role of VV during general anesthesia requiring mechanical ventilation. Clinicaltrials.gov NCT01683578 (registered on September 3, 2012).

  14. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  15. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik

    2014-08-01

    The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application on the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile. © 1991-2012 IEEE.
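    A minimal Monte Carlo sketch of the quantity being characterized: the partial sum of the strongest branches when the branch statistics are independent but not identically distributed (i.n.d.). The branch means, combiner size, and trial count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical i.n.d. setting: branch SNRs are independent exponentials with
# different means (a non-uniform power delay profile).
branch_means = np.array([2.0, 1.5, 1.0, 0.5])   # assumed values
n_trials = 200_000
snr = rng.exponential(branch_means, size=(n_trials, branch_means.size))

# Order each realization (descending) and take the partial sum of the
# strongest Lc branches -- the quantity behind GSC-type receivers.
Lc = 2
ordered = np.sort(snr, axis=1)[:, ::-1]
partial_sum = ordered[:, :Lc].sum(axis=1)

print(round(partial_sum.mean(), 3))
```

    The paper derives such statistics analytically via an MGF-based framework; simulation like this is only a sanity check for those closed-form results.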

  16. Residual and Past Entropy for Concomitants of Ordered Random Variables of Morgenstern Family

    Directory of Open Access Journals (Sweden)

    M. M. Mohie EL-Din

    2015-01-01

    Full Text Available For a system observed at time t, the residual and past entropies measure the uncertainty about the remaining and the past life of the distribution, respectively. In this paper, we have presented the residual and past entropy of the Morgenstern family based on the concomitants of the different types of generalized order statistics (gos) and give the linear transformation of such a model. Characterization results for these dynamic entropies for concomitants of ordered random variables have been considered.

  17. Capturing heterogeneity in gene expression studies by surrogate variable analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey T Leek

    2007-09-01

    Full Text Available It has unambiguously been shown that genetic, environmental, demographic, and technical factors may have substantial effects on gene expression levels. In addition to the measured variable(s) of interest, there will tend to be sources of signal due to factors that are unknown, unmeasured, or too complicated to capture through simple models. We show that failing to incorporate these sources of heterogeneity into an analysis can have widespread and detrimental effects on the study. Not only can this reduce power or induce unwanted dependence across genes, but it can also introduce sources of spurious signal to many genes. This phenomenon is true even for well-designed, randomized studies. We introduce "surrogate variable analysis" (SVA) to overcome the problems caused by heterogeneity in expression studies. SVA can be applied in conjunction with standard analysis techniques to accurately capture the relationship between expression and any modeled variables of interest. We apply SVA to disease class, time course, and genetics of gene expression studies. We show that SVA increases the biological accuracy and reproducibility of analyses in genome-wide expression studies.
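    The core of the SVA idea, estimating hidden structure from the residuals left after fitting the modeled variables, can be sketched with a singular value decomposition. This toy simulation (sample sizes, effect patterns, and noise level are invented for illustration) is not the full SVA algorithm, which additionally assesses significance and iterates:

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 500, 40

group = np.repeat([0.0, 1.0], n_samples // 2)   # modeled variable of interest
batch = rng.standard_normal(n_samples)          # unmeasured source of signal

# Simulated expression: some genes respond to the group, many to the batch.
beta = np.zeros((n_genes, 1)); beta[:50] = 1.0
gamma = np.zeros((n_genes, 1)); gamma[100:300] = 1.0
E = (beta @ group[None, :] + gamma @ batch[None, :]
     + 0.5 * rng.standard_normal((n_genes, n_samples)))

# Step 1: regress out the modeled variable gene-by-gene (hat matrix H).
X = np.column_stack([np.ones(n_samples), group])
H = X @ np.linalg.pinv(X)
R = E - E @ H

# Step 2: the leading right singular vector of the residual matrix serves
# as a surrogate for the hidden factor.
sv = np.linalg.svd(R, full_matrices=False)[2][0]
corr = abs(np.corrcoef(sv, batch)[0, 1])
print(round(corr, 2))
```

    In this simulation the surrogate variable correlates strongly with the hidden batch factor, which is the property SVA exploits before refitting the analysis model.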

  18. On the strong law of large numbers for $\\varphi$-subgaussian random variables

    OpenAIRE

    Zajkowski, Krzysztof

    2016-01-01

    For $p\\ge 1$ let $\\varphi_p(x)=x^2/2$ if $|x|\\le 1$ and $\\varphi_p(x)=1/p|x|^p-1/p+1/2$ if $|x|>1$. For a random variable $\\xi$ let $\\tau_{\\varphi_p}(\\xi)$ denote $\\inf\\{a\\ge 0:\\;\\forall_{\\lambda\\in\\mathbb{R}}\\; \\ln\\mathbb{E}\\exp(\\lambda\\xi)\\le\\varphi_p(a\\lambda)\\}$; $\\tau_{\\varphi_p}$ is a norm in a space $Sub_{\\varphi_p}=\\{\\xi:\\;\\tau_{\\varphi_p}(\\xi)1$) there exist positive constants $c$ and $\\alpha$ such that for every natural number $n$ the following inequality $\\tau_{\\varphi_p}(\\sum_{i=1...

  19. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasting estimates is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
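    The selection idea, scoring each candidate input by how much predictive error rises when its link to the target is broken and keeping the top-ranked subset, can be sketched with permutation importance. For brevity this sketch uses a plain least-squares learner in place of a random forest, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Hypothetical candidate inputs (e.g. lagged wind speeds); only the first
# two actually drive the target.
X = rng.standard_normal((n, 6))
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

def holdout_error(Xs, y):
    """Hold-out MSE of a least-squares fit (stand-in for a forest's OOB error)."""
    half = len(y) // 2
    w, *_ = np.linalg.lstsq(Xs[:half], y[:half], rcond=None)
    return np.mean((Xs[half:] @ w - y[half:]) ** 2)

base = holdout_error(X, y)
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # break the feature-target link
    importance.append(holdout_error(Xp, y) - base)

ranking = np.argsort(importance)[::-1]
print(ranking[:2])
```

    A random forest adds value over this sketch by capturing nonlinear and interaction effects when scoring features; the screening logic is the same.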

  20. Convolutions of Heavy Tailed Random Variables and Applications to Portfolio Diversification and MA(1) Time Series

    NARCIS (Netherlands)

    J.L. Geluk (Jaap); L. Peng (Liang); C.G. de Vries (Casper)

    1999-01-01

    textabstractThe paper characterizes first and second order tail behavior of convolutions of i.i.d. heavy tailed random variables with support on the real line. The result is applied to the problem of risk diversification in portfolio analysis and to the estimation of the parameter in a MA(1) model.

  1. Simple, efficient estimators of treatment effects in randomized trials using generalized linear models to leverage baseline variables.

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J

    2010-04-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation.
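    A small simulation of the main claim (the data-generating model, effect size, and sample size below are invented): a main-terms Poisson working model is fit by Newton-Raphson to a randomized trial whose true outcome model is not log-linear, and the treatment coefficient is compared with the marginal log rate ratio it is claimed to estimate:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
A = rng.integers(0, 2, n)          # randomized treatment assignment
W = rng.standard_normal(n)         # baseline covariate
# True model is NOT log-linear in W, so the working model is misspecified.
rate = np.exp(0.3 * A + np.sin(W))
Y = rng.poisson(rate)

# Main-terms Poisson working model: log E[Y] = b0 + b1*A + b2*W,
# fit by Newton-Raphson on the Poisson log-likelihood.
X = np.column_stack([np.ones(n), A, W])
beta = np.zeros(3)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (Y - mu)
    hess = X.T @ (X * mu[:, None])
    beta += np.linalg.solve(hess, grad)

# Marginal log rate ratio estimated directly from the two arms.
marginal_log_rr = np.log(Y[A == 1].mean() / Y[A == 0].mean())
print(round(beta[1], 2), round(marginal_log_rr, 2))
```

    Despite the misspecification, the treatment coefficient tracks the marginal log rate ratio closely, which is the behavior the paper proves asymptotically.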

  2. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636

  3. The use of random amplified polymorphic DNA to evaluate the genetic variability of Ponkan mandarin (Citrus reticulata Blanco) accessions

    Directory of Open Access Journals (Sweden)

    Coletta Filho Helvécio Della

    2000-01-01

    Full Text Available RAPD analysis of 19 Ponkan mandarin accessions was performed using 25 random primers. Of 112 amplification products selected, only 32 were polymorphic across five accessions. The absence of genetic variability among the other 14 accessions suggested that they were either clonal propagations with different local names, or that they had undetectable genetic variability, such as point mutations which cannot be detected by RAPD.

  4. Allocating monitoring effort in the face of unknown unknowns

    Science.gov (United States)

    Wintle, B.A.; Runge, M.C.; Bekessy, S.A.

    2010-01-01

    There is a growing view that to make efficient use of resources, ecological monitoring should be hypothesis-driven and targeted to address specific management questions. 'Targeted' monitoring has been contrasted with other approaches in which a range of quantities are monitored in case they exhibit an alarming trend or provide ad hoc ecological insights. The second form of monitoring, described as surveillance, has been criticized because it does not usually aim to discern between competing hypotheses, and its benefits are harder to identify a priori. The alternative view is that the existence of surveillance data may enable rapid corroboration of emerging hypotheses or help to detect important 'unknown unknowns' that, if undetected, could lead to catastrophic outcomes or missed opportunities. We derive a model to evaluate and compare the efficiency of investments in surveillance and targeted monitoring. We find that a decision to invest in surveillance monitoring may be defensible if: (1) the surveillance design is more likely to discover or corroborate previously unknown phenomena than a targeted design and (2) the expected benefits (or avoided costs) arising from discovery are substantially higher than those arising from a well-planned targeted design. Our examination highlights the importance of being explicit about the objectives, costs and expected benefits of monitoring in a decision analytic framework. © 2010 Blackwell Publishing Ltd/CNRS.

  5. THE COVARIATION FUNCTION FOR SYMMETRIC &ALPHA;-STABLE RANDOM VARIABLES WITH FINITE FIRST MOMENTS

    Directory of Open Access Journals (Sweden)

    Dedi Rosadi

    2012-05-01

    Full Text Available In this paper, we discuss a generalized dependence measure which is designed to measure the dependence of two symmetric α-stable random variables with finite mean (1 < α ≤ 2) and contains the covariance function as a special case (when α = 2). We briefly discuss some basic properties of the function, consider several methods to estimate it, and further investigate the numerical properties of the estimator using simulated data. We show how to apply this function to measure the dependence of some stock returns on the composite index LQ45 in the Indonesia Stock Exchange.
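    A numerical sketch of one standard way to estimate such a covariation-type coefficient, via a fractional-lower-order-moment ratio (the generator, α, p, and the linear dependence below are illustrative assumptions, not the paper's data). In the simulated linear model X = aY + Z, the estimator should recover a:

```python
import numpy as np

rng = np.random.default_rng(4)

def sas(alpha, size, rng):
    """Standard symmetric alpha-stable samples via Chambers-Mallows-Stuck."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    E = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos((1 - alpha) * V) / E) ** ((1 - alpha) / alpha))

alpha, p, n = 1.7, 1.2, 200_000
Y = sas(alpha, n, rng)
Z = sas(alpha, n, rng)
a = 0.6
X = a * Y + Z                      # X and Y jointly SalphaS, dependent

# Covariation coefficient via a fractional lower order moment,
# lambda = E[X Y^<p-1>] / E[|Y|^p] with 1 <= p < alpha,
# where y^<q> = sign(y)|y|^q; for this linear model it equals a.
signed_pow = np.sign(Y) * np.abs(Y) ** (p - 1)
lam = np.mean(X * signed_pow) / np.mean(np.abs(Y) ** p)
print(round(lam, 2))
```

    Note that p must stay strictly below α so the moments involved exist; convergence is slower than in the Gaussian case because the summands have no finite variance.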

  6. A Method of Approximating Expectations of Functions of Sums of Independent Random Variables

    OpenAIRE

    Klass, Michael J.

    1981-01-01

    Let $X_1, X_2, \\cdots$ be a sequence of independent random variables with $S_n = \\sum^n_{i = 1} X_i$. Fix $\\alpha > 0$. Let $\\Phi(\\cdot)$ be a continuous, strictly increasing function on $\\lbrack 0, \\infty)$ such that $\\Phi(0) = 0$ and $\\Phi(cx) \\leq c^\\alpha\\Phi(x)$ for all $x > 0$ and all $c \\geq 2$. Suppose $a$ is a real number and $J$ is a finite nonempty subset of the positive integers. In this paper we are interested in approximating $E \\max_{j \\in J} \\Phi(|a + S_j|)$. We construct a nu...

  7. State, Parameter, and Unknown Input Estimation Problems in Active Automotive Safety Applications

    Science.gov (United States)

    Phanomchoeng, Gridsada

    A variety of driver assistance systems such as traction control, electronic stability control (ESC), rollover prevention and lane departure avoidance systems are being developed by automotive manufacturers to reduce driver burden, partially automate normal driving operations, and reduce accidents. The effectiveness of these driver assistance systems can be significantly enhanced if the real-time values of several vehicle parameters and state variables, namely tire-road friction coefficient, slip angle, roll angle, and rollover index, can be known. Since there are no inexpensive sensors available to measure these variables, it is necessary to estimate them. However, due to the significant nonlinear dynamics in a vehicle, due to unknown and changing plant parameters, and due to the presence of unknown input disturbances, the design of estimation algorithms for this application is challenging. This dissertation develops a new approach to observer design for nonlinear systems in which the nonlinearity has a globally (or locally) bounded Jacobian. The developed approach utilizes a modified version of the mean value theorem to express the nonlinearity in the estimation error dynamics as a convex combination of known matrices with time varying coefficients. The observer gains are then obtained by solving linear matrix inequalities (LMIs). A number of illustrative examples are presented to show that the developed approach is less conservative and more useful than the standard Lipschitz-assumption-based nonlinear observer. The developed nonlinear observer is utilized for estimation of slip angle, longitudinal vehicle velocity, and vehicle roll angle. In order to predict and prevent vehicle rollovers in tripped situations, it is necessary to estimate the vertical tire forces in the presence of unknown road disturbance inputs.
An approach to estimate unknown disturbance inputs in nonlinear systems using dynamic model inversion and a modified version of the mean value theorem is

  8. Financial Development and Economic Growth: Known Knowns, Known Unknowns, and Unknown Unknowns

    OpenAIRE

    Ugo Panizza

    2014-01-01

    This paper summarizes the main findings of the literature on the relationship between financial and economic development (the known knowns), points to directions for future research (the known unknowns), and then speculates on the third Rumsfeldian category. The known knowns section organizes the empirical literature on finance and growth into three strands: (i) the traditional literature which established the link between finance and growth; (ii) the new literature which qualified some of th...

  9. Heart rate variability during acute psychosocial stress: A randomized cross-over trial of verbal and non-verbal laboratory stressors.

    Science.gov (United States)

    Brugnera, Agostino; Zarbo, Cristina; Tarvainen, Mika P; Marchettini, Paolo; Adorni, Roberta; Compare, Angelo

    2018-05-01

    Acute psychosocial stress is typically investigated in laboratory settings using protocols with distinctive characteristics. For example, some tasks involve the action of speaking, which seems to alter Heart Rate Variability (HRV) through acute changes in respiration patterns. However, it is still unknown which task induces the strongest subjective and autonomic stress response. The present cross-over randomized trial sought to investigate the differences in perceived stress and in linear and non-linear analyses of HRV between three different verbal (Speech and Stroop) and non-verbal (Montreal Imaging Stress Task; MIST) stress tasks, in a sample of 60 healthy adults (51.7% females; mean age = 25.6 ± 3.83 years). Analyses were run controlling for respiration rates. Participants reported similar levels of perceived stress across the three tasks. However, MIST induced a stronger cardiovascular response than the Speech and Stroop tasks, even after controlling for respiration rates. Finally, women reported higher levels of perceived stress and lower HRV both at rest and in response to acute psychosocial stressors, compared to men. Taken together, our results suggest the presence of gender-related differences during psychophysiological experiments on stress. They also suggest that verbal activity masked the vagal withdrawal through altered respiration patterns imposed by speaking. Therefore, our findings support the use of a highly standardized math task, such as the MIST, as a valid and reliable alternative to verbal protocols during laboratory studies on stress. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Random forests for classification in ecology

    Science.gov (United States)

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.

  11. Geometry-based multiplication correction for passive neutron coincidence assay of materials with variable and unknown (α,n) neutron rates

    International Nuclear Information System (INIS)

    Langner, D.G.; Russo, P.A.

    1993-02-01

    We have studied the problem of assaying impure plutonium-bearing materials using passive neutron coincidence counting. We have developed a technique to analyze neutron coincidence data from impure plutonium samples that uses the bulk geometry of the sample to correct for multiplication in samples for which the (α,n) neutron production rate is unknown. This technique can be applied to any impure plutonium-bearing material whose matrix constituents are approximately constant, whose self-multiplication is low to moderate, whose plutonium isotopic composition is known and not substantially varying, and whose bulk geometry is measurable or can be derived. This technique requires a set of reference materials that have well-characterized plutonium contents. These reference materials are measured once to derive a calibration that is specific to the neutron detector and the material. The technique has been applied to molten salt extraction residues, PuF4 samples that have a variable salt matrix, and impure plutonium oxide samples. It is also applied to pure plutonium oxide samples for comparison. Assays accurate to 4% (1σ) were obtained for impure samples measured in a High-Level Neutron Coincidence Counter II. The effects on the technique of variations in neutron detector efficiency with energy and the effects of neutron capture in the sample are discussed

  12. Solving differential equations with unknown constitutive relations as recurrent neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Hagge, Tobias J.; Stinis, Panagiotis; Yeung, Enoch H.; Tartakovsky, Alexandre M.

    2017-12-08

    We solve a system of ordinary differential equations with an unknown functional form of a sink (reaction rate) term. We assume that the measurements (time series) of state variables are partially available, and use a recurrent neural network to “learn” the reaction rate from this data. This is achieved by including discretized ordinary differential equations as part of a recurrent neural network training problem. We extend TensorFlow’s recurrent neural network architecture to create a simple but scalable and effective solver for the unknown functions, and apply it to a fed-batch bioreactor simulation problem. Use of techniques from recent deep learning literature enables training of functions with behavior manifesting over thousands of time steps. Our networks are structurally similar to recurrent neural networks, but differ in purpose, and require modified training strategies.
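    A toy version of the approach (a scalar rate parameter stands in for the paper's neural network, and all constants are invented): a forward-Euler rollout of the ODE is embedded in a training loop, and the unknown sink term is recovered by gradient descent through the discretized dynamics:

```python
import numpy as np

# True dynamics dx/dt = -k_true * x generate the "measured" time series.
k_true, dt, steps = 0.8, 0.05, 100
t = np.arange(steps + 1) * dt
x_data = 2.0 * np.exp(-k_true * t)

k, lr = 0.1, 5e-4                    # initial guess and learning rate
for _ in range(2000):
    # forward: Euler rollout with the current estimate of the sink term
    x = np.empty_like(x_data)
    x[0] = x_data[0]
    for i in range(steps):
        x[i + 1] = x[i] - dt * k * x[i]
    # backward: adjoint pass computes d(loss)/dk for loss = sum((x - x_data)**2)
    grad_x = 2.0 * (x - x_data)
    adj, grad_k = 0.0, 0.0
    for i in reversed(range(steps)):
        adj += grad_x[i + 1]          # accumulate dL/dx[i+1]
        grad_k += adj * (-dt * x[i])  # direct dependence of x[i+1] on k
        adj *= 1.0 - dt * k           # propagate adjoint to x[i]
    k -= lr * grad_k
print(round(k, 3))
```

    The recovered rate carries a small forward-Euler discretization bias (the best-fit k satisfies 1 - k*dt = exp(-k_true*dt)); the paper's RNN formulation generalizes this loop to a learned functional form of the sink term.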

  13. Power maximization of variable-speed variable-pitch wind turbines using passive adaptive neural fault tolerant control

    Science.gov (United States)

    Habibi, Hamed; Rahimi Nohooji, Hamed; Howard, Ian

    2017-09-01

    Power maximization has always been a practical consideration in wind turbines. The question of how to achieve optimal power capture, especially when the system dynamics are nonlinear and the actuators are subject to unknown faults, is significant. This paper studies the control methodology for variable-speed variable-pitch wind turbines including the effects of uncertain nonlinear dynamics, system fault uncertainties, and unknown external disturbances. The nonlinear model of the wind turbine is presented, and the problem of maximizing extracted energy is formulated by designing the optimal desired states. With the known system, a model-based nonlinear controller is designed; then, to handle uncertainties, the unknown nonlinearities of the wind turbine are estimated by utilizing radial basis function neural networks. The adaptive neural fault tolerant control is designed passively to be robust to model uncertainties, disturbances including wind speed and model noise, and completely unknown actuator faults in both the generator torque and the pitch actuator. The Lyapunov direct method is employed to prove that the closed-loop system is uniformly bounded. Simulation studies are performed to verify the effectiveness of the proposed method.

  14. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.

    Science.gov (United States)

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-08-27

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.

  15. Performance study of LMS based adaptive algorithms for unknown system identification

    Energy Technology Data Exchange (ETDEWEB)

    Javed, Shazia; Ahmad, Noor Atinah [School of Mathematical Sciences, Universiti Sains Malaysia, 11800 Penang (Malaysia)

    2014-07-10

    Adaptive filtering techniques have gained much popularity in the modeling of unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS) and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and their sensitivity to the spectral properties of input signals. Main objective of this comparative study is to observe the effects of fast convergence rate of improved versions of LMS algorithms on their robustness and misalignment.
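    The ASI setup described above can be sketched for the NLMS member of the family (the filter length, step size, and noise level below are invented for illustration): the adaptive filter sees only the random input and the noisy output of the unknown system, and the misalignment measures how closely the weights approach the true impulse response:

```python
import numpy as np

rng = np.random.default_rng(5)

# Unknown FIR system to identify (hypothetical 8-tap impulse response).
h = np.array([0.8, -0.4, 0.2, 0.6, -0.1, 0.05, -0.3, 0.1])
M = h.size
n = 5000
x = rng.standard_normal(n)                                  # random input
d = np.convolve(x, h)[:n] + 0.01 * rng.standard_normal(n)   # noisy output

# Normalized LMS: the step is scaled by the instantaneous input power,
# which keeps convergence stable regardless of the input level.
w = np.zeros(M)
mu, eps = 0.5, 1e-8
for k in range(M, n):
    u = x[k - M + 1:k + 1][::-1]    # most recent M input samples
    e = d[k] - w @ u                # a-priori output error
    w += mu * e * u / (eps + u @ u)

misalignment = np.linalg.norm(w - h) / np.linalg.norm(h)
print(round(misalignment, 4))
```

    Plain LMS replaces the normalized update with `w += mu * e * u`, making its convergence speed sensitive to the input's spectral properties, which is exactly the trade-off the comparative study examines.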

  16. Performance study of LMS based adaptive algorithms for unknown system identification

    International Nuclear Information System (INIS)

    Javed, Shazia; Ahmad, Noor Atinah

    2014-01-01

    Adaptive filtering techniques have gained much popularity in the modeling of unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS) and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and their sensitivity to the spectral properties of input signals. Main objective of this comparative study is to observe the effects of fast convergence rate of improved versions of LMS algorithms on their robustness and misalignment

  17. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  18. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating to the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation. In this relation the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is derived from this relation
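    The relation the author describes can be written out explicitly; the following is a sketch in standard notation, not taken verbatim from the paper:

```latex
% Density of Y = f(X) as a delta-function average over the distribution of X:
P_Y(y) = \big\langle \delta\big(y - f(X)\big) \big\rangle
       = \int \delta\big(y - f(x)\big)\, P_X(x)\, dx .
% For monotone f, evaluating the delta function recovers the familiar
% random variable transformation rule as an immediate corollary:
P_Y(y) = P_X\big(f^{-1}(y)\big) \left| \frac{d f^{-1}(y)}{dy} \right| .
```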

  19. Designing towards the unknown

    DEFF Research Database (Denmark)

    Wilde, Danielle; Underwood, Jenny

    2018-01-01

    New materials with new capabilities demand new ways of approaching design. Destabilising existing methods is crucial to develop new methods. Yet, radical destabilisation—where outcomes remain unknown long enough that new discoveries become possible—is not easy in technology design where complex..., to design towards unknown outcomes, using unknown materials. The impossibility of this task is proving as useful as it is disruptive. At its most potent, it is destabilising expectations, aesthetics and processes. Keeping the researchers, collaborators and participants in a state of unknowing, is opening... the research potential to far-ranging possibilities. In this article we unpack the motivations driving the PKI project. We present our mixed-methodology, which entangles textile crafts, design interactions and materiality to shape an embodied enquiry. Our research outcomes are procedural and methodological...

  20. Recension: Mao - The Unknown Story

    DEFF Research Database (Denmark)

    Clausen, Søren

    2005-01-01

    A critical review, written for Sweden's leading China journal, of Jung Chang & Jon Halliday's sensational "Mao - the Unknown Story".

  1. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely.

    Science.gov (United States)

    Widaman, Keith F; Grimm, Kevin J; Early, Dawnté R; Robins, Richard W; Conger, Rand D

    2013-07-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group.
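    The proposed solution can be sketched numerically (the group size, covariance, and choice of missing variable are invented): pseudo-random standard-normal deviates are generated for the manifest variable that one group never measured, so every group's data matrix has the same number of columns; in the multiple-group model those columns then get zero loadings and a free, group-specific uniqueness so they carry no information about the latent variables:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical two-group setting: group B never measured variable 3.
n_b = 300
cov = [[1.0, 0.6], [0.6, 1.0]]
obs_b = rng.multivariate_normal([0.0, 0.0], cov, size=n_b)

# Augment with pseudo-random normal deviates for the missing variable.
filler = rng.standard_normal((n_b, 1))
aug_b = np.hstack([obs_b, filler])

# Sanity check: the filler column is (approximately) uncorrelated with the
# observed variables, so it cannot distort the latent-variable comparison.
r = np.corrcoef(aug_b, rowvar=False)
print(np.round(r[:2, 2], 3))
```

    The substantive work then happens in the model specification, which must respect the random nature of these values; the generation step itself is this simple.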

  2. Adaptive neural network output feedback control for stochastic nonlinear systems with unknown dead-zone and unmodeled dynamics.

    Science.gov (United States)

    Tong, Shaocheng; Wang, Tong; Li, Yongming; Zhang, Huaguang

    2014-06-01

    This paper discusses the problem of adaptive neural network output feedback control for a class of stochastic nonlinear strict-feedback systems. The concerned systems have certain characteristics, such as unknown nonlinear uncertainties, unknown dead-zones and unmodeled dynamics, and lack direct measurements of the state variables. In this paper, neural networks (NNs) are employed to approximate the unknown nonlinear uncertainties, and the dead-zone is represented as a time-varying system with a bounded disturbance. An NN state observer is designed to estimate the unmeasured states. Based on both the backstepping design technique and a stochastic small-gain theorem, a robust adaptive NN output feedback control scheme is developed. It is proved that all the variables involved in the closed-loop system are input-state-practically stable in probability, and also have robustness to the unmodeled dynamics. Meanwhile, the observer errors and the output of the system can be regulated to a small neighborhood of the origin by selecting appropriate design parameters. Simulation examples are also provided to illustrate the effectiveness of the proposed approach.

  3. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from a test-of-hypothesis, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
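
    A minimal sketch of one CDF-based, sampling-based sensitivity measure: compare the empirical CDFs of the output conditioned on each input being below vs above its median, and screen by an acceptance limit. The toy performance function and the 0.05 cutoff are illustrative assumptions, not the paper's exact measures.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Toy performance function: inputs 0 and 1 are significant, 2 and 3 are not.
x = rng.normal(size=(n, 4))
y = 3.0 * x[:, 0] + x[:, 1] + 0.5 * rng.normal(size=n)

def cdf_sensitivity(xi, y):
    """CDF-based measure: two-sample Kolmogorov-Smirnov distance between
    the output samples conditioned on the input lying below vs above its
    median. Insensitive inputs give a distance near zero."""
    hi = xi > np.median(xi)
    lo_sorted, hi_sorted = np.sort(y[~hi]), np.sort(y[hi])
    grid = np.sort(y)
    f_lo = np.searchsorted(lo_sorted, grid, side="right") / lo_sorted.size
    f_hi = np.searchsorted(hi_sorted, grid, side="right") / hi_sorted.size
    return np.max(np.abs(f_lo - f_hi))

scores = np.array([cdf_sensitivity(x[:, j], y) for j in range(4)])

# Acceptance limit playing the role of the test-of-hypothesis cutoff:
# for two samples of ~10,000 each, a KS distance above 0.05 is far
# beyond what random sampling alone would produce.
significant = scores > 0.05
print(scores.round(3), significant)
```

    Note that the same random samples are reused for every input, so the cost does not grow with the number of variables, which is the property the abstract highlights.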

  4. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, there has been considerable work recently on consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
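
    The concordance idea can be illustrated with a continuous-outcome analogue (an assumption made for simplicity; the paper's setting is a generalized linear model with GLS estimators): per-variant effects on the outcome should be proportional to per-variant effects on the exposure, with the causal effect as the slope. All simulated effect sizes below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 50_000, 10          # subjects, genetic variants used as instruments

g = rng.binomial(2, 0.3, size=(n, m)).astype(float)  # genotypes 0/1/2
u = rng.normal(size=n)                               # unmeasured confounder
alpha = rng.uniform(0.1, 0.3, size=m)                # variant effects on exposure
x = g @ alpha + u + rng.normal(size=n)               # exposure
beta_true = 0.5
y = beta_true * x - u + rng.normal(size=n)           # outcome, confounded by u

def marginal_slope(z, w):
    """Least-squares slope of w regressed on z."""
    zc = z - z.mean()
    return (zc @ w) / (zc @ zc)

# Per-variant instrumental effects on the exposure and on the outcome.
bx = np.array([marginal_slope(g[:, j], x) for j in range(m)])
by = np.array([marginal_slope(g[:, j], y) for j in range(m)])

# Concordance: if x causes y, then by ~ beta * bx across all variants,
# so the slope through the origin recovers the causal effect.
beta_hat = (bx @ by) / (bx @ bx)

# The naive regression of y on x is badly biased by the confounder.
beta_naive = marginal_slope(x, y)
print(round(beta_hat, 3), round(beta_naive, 3))
```

    In this simulation the concordance slope recovers the causal effect of 0.5 while the naive regression is driven toward zero by the confounder, which is the failure mode instrumental variables address.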

  5. Sliding Mode Observer-Based Current Sensor Fault Reconstruction and Unknown Load Disturbance Estimation for PMSM Driven System.

    Science.gov (United States)

    Zhao, Kaihui; Li, Peng; Zhang, Changfan; Li, Xiangfei; He, Jing; Lin, Yuliang

    2017-12-06

    This paper proposes a new scheme of reconstructing current sensor faults and estimating unknown load disturbance for a permanent magnet synchronous motor (PMSM)-driven system. First, the original PMSM system is transformed into two subsystems; the first subsystem has unknown system load disturbances, which are unrelated to sensor faults, and the second subsystem has sensor faults, but is free from unknown load disturbances. Introducing a new state variable, the augmented subsystem that has sensor faults can be transformed into having actuator faults. Second, two sliding mode observers (SMOs) are designed: the unknown load disturbance is estimated by the first SMO in the subsystem, which has unknown load disturbance, and the sensor faults can be reconstructed using the second SMO in the augmented subsystem, which has sensor faults. The gains of the proposed SMOs and their stability analysis are developed via the solution of linear matrix inequality (LMI). Finally, the effectiveness of the proposed scheme was verified by simulations and experiments. The results demonstrate that the proposed scheme can reconstruct current sensor faults and estimate unknown load disturbance for the PMSM-driven system.

  6. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    Science.gov (United States)

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.

  7. Convolutions of Heavy Tailed Random Variables and Applications to Portfolio Diversification and MA(1) Time Series

    OpenAIRE

    Geluk, Jaap; Peng, Liang; de Vries, Casper G.

    1999-01-01

    Suppose X1,X2 are independent random variables satisfying a second-order regular variation condition on the tail-sum and a balance condition on the tails. In this paper we give a description of the asymptotic behaviour as t → ∞ for P(X1 + X2 > t). The result is applied to the problem of risk diversification in portfolio analysis and to the estimation of the parameter in a MA(1) model.
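
    The first-order version of this asymptotic result, P(X1 + X2 > t) ~ P(X1 > t) + P(X2 > t) for regularly varying tails, is easy to check by simulation. The Pareto tail index and threshold below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2_000_000
alpha = 2.0   # tail index: P(X > t) = t**(-alpha) for t >= 1

# numpy's pareto() draws Lomax variates; adding 1 gives Pareto with x_m = 1.
x1 = rng.pareto(alpha, n) + 1.0
x2 = rng.pareto(alpha, n) + 1.0

t = 30.0
p_sum = np.mean(x1 + x2 > t)                    # P(X1 + X2 > t)
p_tails = np.mean(x1 > t) + np.mean(x2 > t)     # P(X1 > t) + P(X2 > t)
ratio = p_sum / p_tails

# For heavy (subexponential) tails the ratio approaches 1 as t grows:
# a large sum is almost always caused by a single large summand, which
# is why diversification gains are limited for heavy-tailed risks.
print(round(ratio, 3))
```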

  8. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. 
A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
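
    A dependency-free sketch of the backward elimination strategy discussed above. A linear least-squares scorer stands in for the random forest (an assumption made to keep the example self-contained); the elimination loop and the validation-based permutation importance are the point.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 600, 20
x = rng.normal(size=(n, p))
# Only the first three predictors carry signal.
y = 2.0 * x[:, 0] - 1.5 * x[:, 1] + 1.0 * x[:, 2] + 0.5 * rng.normal(size=n)

train, valid = slice(0, 400), slice(400, None)

def fit(cols):
    """Fit the stand-in model on the training half; return a predictor."""
    xt = np.column_stack([np.ones(400), x[train][:, cols]])
    coef, *_ = np.linalg.lstsq(xt, y[train], rcond=None)
    return lambda m: np.column_stack([np.ones(len(m)), m[:, cols]]) @ coef

def importance(cols):
    """Validation MSE, plus the MSE increase when each predictor is permuted."""
    pred = fit(cols)
    base = np.mean((y[valid] - pred(x[valid])) ** 2)
    imps = []
    for c in cols:
        xp = x[valid].copy()
        xp[:, c] = rng.permutation(xp[:, c])
        imps.append(np.mean((y[valid] - pred(xp)) ** 2) - base)
    return base, np.array(imps)

# Backward elimination: repeatedly drop the least important predictor,
# tracking validation error so the best subset can be chosen afterwards.
cols, history = list(range(p)), []
while cols:
    base, imps = importance(cols)
    history.append((base, list(cols)))
    if len(cols) == 1:
        break
    cols.pop(int(np.argmin(imps)))

best_mse, best_cols = min(history)
print(round(best_mse, 3), sorted(best_cols))
```

    Validating on data held out from the selection loop, as the abstract recommends, avoids the upwardly biased accuracy estimates that arise when the same out-of-bag samples steer the elimination.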

  9. Random and systematic spatial variability of 137Cs inventories at reference sites in South-Central Brazil

    Directory of Open Access Journals (Sweden)

    Correchel Vladia

    2005-01-01

    The precision of the 137Cs fallout redistribution technique for the evaluation of soil erosion rates is strongly dependent on the quality of an average inventory taken at a representative reference site. Knowledge of the sources and of the degree of variation of the 137Cs fallout spatial distribution plays an important role in its use. Four reference sites were selected in the South-Central region of Brazil and characterized in terms of soil chemical, physical and mineralogical aspects, as well as the spatial variability of 137Cs inventories. Some important differences in the patterns of 137Cs depth distribution in the soil profiles of the different sites were found. They are probably associated with chemical, physical, mineralogical and biological differences between the soils, but many questions still remain open for future investigation, mainly those regarding the adsorption and dynamics of 137Cs ions in soil profiles under tropical conditions. The random spatial variability (inside each reference site) was higher than the systematic spatial variability (between reference sites), but its causes were not clearly identified as possible consequences of chemical, physical or mineralogical variability, and/or precipitation.

  10. Vertical random variability of the distribution coefficient in the soil and its effect on the migration of fallout radionuclides

    International Nuclear Information System (INIS)

    Bunzl, K.

    2002-01-01

    In the field, the distribution coefficient, Kd, for the sorption of a radionuclide by the soil cannot be expected to be constant. Even in a well defined soil horizon, Kd will vary stochastically in the horizontal as well as the vertical direction around a mean value. While the horizontal random variability of Kd produces a pronounced tailing effect in the concentration depth profile of a fallout radionuclide, much less is known about the corresponding effect of the vertical random variability. To analyze this effect theoretically, the classical convection-dispersion model in combination with the random-walk particle method was applied. The concentration depth profile of a radionuclide was calculated one year after deposition assuming constant values of the pore water velocity and the diffusion/dispersion coefficient, and a distribution coefficient that is either constant (Kd = 100 cm³ g⁻¹) or exhibits a vertical variability according to a log-normal distribution with a geometric mean of 100 cm³ g⁻¹ and a coefficient of variation of CV = 0.53. The results show that these two concentration depth profiles are only slightly different: the location of the peak is shifted somewhat upwards, and the dispersion of the concentration depth profile is slightly larger. A substantial tailing effect of the concentration depth profile is not perceivable. Especially with respect to the location of the peak, a very good approximation of the concentration depth profile is obtained if the arithmetic mean of the Kd values (Kd = 113 cm³ g⁻¹) and a slightly increased dispersion coefficient are used in the analytical solution of the classical convection-dispersion equation with constant Kd. The evaluation of the observed concentration depth profile with the analytical solution of the classical convection-dispersion equation with constant parameters will, within the usual experimental limits, hardly reveal the presence of a log-normal random distribution of Kd in the vertical direction in
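
    The random-walk particle method with a depth-varying Kd can be sketched as follows. All physical parameter values here (velocity, dispersion, bulk density over water content, time step) are invented for illustration; only the log-normal Kd field with geometric mean 100 cm³/g and CV = 0.53 mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)
n_particles, n_steps, dt = 50_000, 365, 1.0  # particles, daily steps, [day]
v, disp = 2.0, 0.5          # pore-water velocity [cm/day], dispersion [cm^2/day]
rho_over_theta = 4.0        # bulk density / volumetric water content [g/cm^3]

def simulate(kd_cells):
    """Random-walk particle tracking for the convection-dispersion model:
    advection and dispersion of each particle are retarded by the local
    Kd of the 1-cm soil cell it currently occupies."""
    z = np.zeros(n_particles)
    for _ in range(n_steps):
        kd = kd_cells[np.clip(z.astype(int), 0, kd_cells.size - 1)]
        r = 1.0 + rho_over_theta * kd               # retardation factor
        z += (v / r) * dt + np.sqrt(2.0 * disp * dt / r) * rng.standard_normal(n_particles)
        z = np.abs(z)                               # reflect at the soil surface
    return z

cells = 200
kd_const = np.full(cells, 100.0)                     # constant Kd = 100 cm^3/g
sigma = np.sqrt(np.log(1.0 + 0.53 ** 2))             # log-normal with CV = 0.53
kd_rand = rng.lognormal(np.log(100.0), sigma, cells) # geometric mean 100 cm^3/g

z_const, z_rand = simulate(kd_const), simulate(kd_rand)
print(round(z_const.mean(), 2), round(z_rand.mean(), 2))
```

    With these draws the arithmetic mean of the random Kd field is around 113 cm³/g, which is consistent with the abstract's finding that the constant-Kd analytical solution with the arithmetic mean approximates the random-field profile well.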

  11. A survey of Type III restriction-modification systems reveals numerous, novel epigenetic regulators controlling phase-variable regulons; phasevarions

    Science.gov (United States)

    Atack, John M; Yang, Yuedong; Jennings, Michael P

    2018-01-01

    Abstract Many bacteria utilize simple DNA sequence repeats as a mechanism to randomly switch genes on and off. This process is called phase variation. Several phase-variable N6-adenine DNA-methyltransferases from Type III restriction-modification systems have been reported in bacterial pathogens. Random switching of DNA methyltransferases changes the global DNA methylation pattern, leading to changes in gene expression. These epigenetic regulatory systems are called phasevarions — phase-variable regulons. The extent of these phase-variable genes in the bacterial kingdom is unknown. Here, we interrogated a database of restriction-modification systems, REBASE, by searching for all simple DNA sequence repeats in mod genes that encode Type III N6-adenine DNA-methyltransferases. We report that 17.4% of Type III mod genes (662/3805) contain simple sequence repeats. Of these, only one-fifth have been previously identified. The newly discovered examples are widely distributed and include many examples in opportunistic pathogens as well as in environmental species. In many cases, multiple phasevarions exist in one genome, with examples of up to 4 independent phasevarions in some species. We found several new types of phase-variable mod genes, including the first example of a phase-variable methyltransferase in pathogenic Escherichia coli. Phasevarions are a common epigenetic regulation contingency strategy used by both pathogenic and non-pathogenic bacteria. PMID:29554328

  12. An AUC-based permutation variable importance measure for random forests.

    Science.gov (United States)

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
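
    The AUC-based permutation VIM can be illustrated without a random forest: compute the AUC via the Mann-Whitney statistic, then measure how much it drops when a predictor is permuted. The linear scorer below stands in for the forest's class votes (an assumption), and the class imbalance is built into the simulation.

```python
import numpy as np

rng = np.random.default_rng(11)
n, p = 4000, 5
x = rng.normal(size=(n, p))
# Unbalanced response (~12% positives), driven by the first predictor only.
prob = 1.0 / (1.0 + np.exp(-(-2.5 + 1.5 * x[:, 0])))
ybin = (rng.random(n) < prob).astype(float)

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    ranks = scores.argsort().argsort() + 1.0
    n_pos = labels.sum()
    n_neg = labels.size - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Linear scorer standing in for the random forest's class votes.
design = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(design, ybin, rcond=None)
base_auc = auc(design @ coef, ybin)

# AUC-based permutation VIM: drop in AUC when each predictor is permuted.
vim = []
for j in range(p):
    xp = x.copy()
    xp[:, j] = rng.permutation(xp[:, j])
    vim.append(base_auc - auc(np.column_stack([np.ones(n), xp]) @ coef, ybin))
print(round(base_auc, 3), np.round(vim, 3))
```

    Because the AUC is rank-based and conditions on class membership, this importance score is less distorted by the dominant class than an error-rate-based score, which is the motivation given in the abstract.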

  13. Kriging with Unknown Variance Components for Regional Ionospheric Reconstruction

    Directory of Open Access Journals (Sweden)

    Ling Huang

    2017-02-01

    Ionospheric delay is a critical issue that limits the accuracy of precise Global Navigation Satellite System (GNSS) positioning and navigation for single-frequency users, especially in mid- and low-latitude regions where variations in the ionosphere are larger. Kriging spatial interpolation techniques have recently been introduced to model the spatial correlation and variability of the ionosphere; they intrinsically assume that the ionosphere field is stochastically stationary but do not take random observational errors into account. In this paper, by treating the spatial statistical information on the ionosphere as prior knowledge and based on Total Electron Content (TEC) semivariogram analysis, we use Kriging techniques to spatially interpolate TEC values. By assuming that the stochastic models of both the ionospheric signals and the measurement errors are only known up to some unknown factors, we propose a new Kriging spatial interpolation method with unknown variance components for both the ionospheric signals and the TEC measurements. Variance component estimation has been integrated with Kriging to reconstruct regional ionospheric delays. The method has been applied to data from the Crustal Movement Observation Network of China (CMONOC) and compared with ordinary Kriging and with polynomial interpolations using spherical cap harmonic functions, polynomial functions and low-degree spherical harmonic functions. The statistics of the results indicate that the daily ionospheric variations during the experimental period characterized by the proposed approach are in good agreement with the other methods, ranging from 10 to 80 TEC Units (TECU, 1 TECU = 1 × 10¹⁶ electrons/m²) with an overall mean of 28.2 TECU. The proposed method can produce more appropriate estimations whose general TEC level is as smooth as ordinary Kriging but with a smaller standard deviation, around 3 TECU, than the others.
The residual results show that the interpolation precision of the
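
    Ordinary Kriging itself is compact enough to sketch. The exponential semivariogram, its sill and range, and the synthetic TEC surface below are all invented for illustration; the variance-component extension proposed in the paper is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(17)

# Synthetic TEC observations [TECU] at scattered station locations [deg].
pts = rng.uniform(0.0, 10.0, size=(15, 2))
tec = (30.0 + 2.0 * np.sin(pts[:, 0]) + 1.5 * np.cos(pts[:, 1])
       + 0.1 * rng.standard_normal(15))

def gamma(h, sill=4.0, corr_range=5.0):
    """Exponential semivariogram model (sill and range are assumed values)."""
    return sill * (1.0 - np.exp(-h / corr_range))

def ordinary_kriging(x0):
    """Solve the ordinary Kriging system; return prediction and variance."""
    m = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    a = np.empty((m + 1, m + 1))
    a[:m, :m] = gamma(d)
    a[m, :] = 1.0
    a[:, m] = 1.0
    a[m, m] = 0.0                      # Lagrange-multiplier row/column
    b = np.append(gamma(np.linalg.norm(pts - x0, axis=1)), 1.0)
    w = np.linalg.solve(a, b)
    pred = w[:m] @ tec
    kvar = w[:m] @ b[:m] + w[m]        # Kriging variance
    return pred, kvar

pred, kvar = ordinary_kriging(np.array([5.0, 5.0]))
print(round(pred, 2), round(kvar, 3))
```

    With a zero nugget the interpolator is exact: predicting at a station location reproduces the observation there with zero Kriging variance, while predictions away from the stations carry a positive variance.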

  14. Deconvoluting preferences and errors: a model for binomial panel data

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Nielsen, Søren Feodor

    2010-01-01

    In many stated choice experiments researchers observe the random variables Vt, Xt, and Yt = 1{U + δ⊤Xt + εt < Vt}, where δ is an unknown parameter and U and εt are unobservable random variables. We show that under weak assumptions the distributions of U and εt and also the unknown parameter...

  15. On a randomly imperfect spherical cap pressurized by a random ...

    African Journals Online (AJOL)

    In this paper, we investigate a dynamical system in a random setting of dual randomness in space and time variables, in which both the imperfection of the structure and the load function are considered random, each with a statistical zero mean. The auto-covariance of the load is correlated as an exponentially decaying ...

  16. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
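
    The r2VIM selection rule can be sketched end-to-end. Out-of-bag permutation importances from a linear stand-in model (an assumption; the method itself uses RF importance scores) are divided by the absolute value of the most negative importance in each run, which estimates the null variation, and only variables that stay relatively important in every run are kept.

```python
import numpy as np

rng = np.random.default_rng(13)
n, p = 4000, 30
x = rng.normal(size=(n, p))
y = 1.0 * x[:, 0] + 0.8 * x[:, 1] + rng.normal(size=n)   # two causal "SNPs"

def run_importances(seed):
    """One run: fit on a bootstrap sample, then compute out-of-bag
    permutation importances (MSE increase when a predictor is shuffled)."""
    r = np.random.default_rng(seed)
    idx = r.integers(0, n, n)
    oob = np.setdiff1d(np.arange(n), idx)
    coef, *_ = np.linalg.lstsq(
        np.column_stack([np.ones(idx.size), x[idx]]), y[idx], rcond=None)
    pred = lambda m: np.column_stack([np.ones(len(m)), m]) @ coef
    base = np.mean((y[oob] - pred(x[oob])) ** 2)
    imp = np.empty(p)
    for j in range(p):
        xp = x[oob].copy()
        xp[:, j] = r.permutation(xp[:, j])
        imp[j] = np.mean((y[oob] - pred(xp)) ** 2) - base
    return imp

runs = np.array([run_importances(s) for s in range(5)])

# r2VIM-style rule: scale each run by its most negative (null) score,
# then keep variables with relative importance >= 3 in every run.
rel = runs / np.abs(runs.min(axis=1, keepdims=True))
selected = np.where((rel >= 3.0).all(axis=0))[0]
print(selected)
```

    Requiring a large relative importance in all runs, rather than in any single run, is what controls the number of false positives under the null.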

  17. The unknown-unknowns: Revealing the hidden insights in massive biomedical data using combined artificial intelligence and knowledge networks

    Directory of Open Access Journals (Sweden)

    Chris Yoo

    2017-12-01

    Genomic data is estimated to be doubling every seven months, with over 2 trillion bases from whole genome sequence studies deposited in GenBank in just the last 15 years alone. Recent advances in compute and storage have enabled the use of artificial intelligence techniques in areas such as feature recognition in digital pathology and chemical synthesis for drug development. To apply A.I. productively to multidimensional data such as cellular processes and their dysregulation, the data must be transformed into a structured format, using prior knowledge to create contextual relationships and hierarchies upon which computational analysis can be performed. Here we present the organization of complex data into hypergraphs that facilitate the application of A.I. We provide an example use case of a hypergraph containing hundreds of biological data values and the results of several classes of A.I. algorithms applied in a popular compute cloud. While multiple, biologically insightful correlations between disease states, behavior, and molecular features were identified, the insights of scientific import were revealed only when exploration of the data included visualization of subgraphs of represented knowledge. The results suggest that while machine learning can identify known correlations and suggest testable ones, the greater probability of discovering unexpected relationships between seemingly independent variables (unknown-unknowns) requires a context-aware system: hypergraphs that impart biological meaning in nodes and edges. We discuss the implications of a combined hypergraph-A.I. analysis approach to multidimensional data and the pre-processing requirements for such a system.

  18. Contemporary group estimates adjusted for climatic effects provide a finer definition of the unknown environmental challenges experienced by growing pigs.

    Science.gov (United States)

    Guy, S Z Y; Li, L; Thomson, P C; Hermesch, S

    2017-12-01

    Environmental descriptors derived from mean performances of contemporary groups (CGs) are assumed to capture any known and unknown environmental challenges. The objective of this paper was to obtain a finer definition of the unknown challenges, by adjusting CG estimates for the known climatic effects of monthly maximum air temperature (MaxT), minimum air temperature (MinT) and monthly rainfall (Rain). As the unknown component could include infection challenges, these refined descriptors may help to better model varying responses of sire progeny to environmental infection challenges for the definition of disease resilience. Data were recorded from 1999 to 2013 at a piggery in south-east Queensland, Australia (n = 31,230). Firstly, CG estimates of average daily gain (ADG) and backfat (BF) were adjusted for MaxT, MinT and Rain, which were fitted as splines. In the models used to derive CG estimates for ADG, MaxT and MinT were significant variables. The models that contained these significant climatic variables had CG estimates with a lower variance compared to models without significant climatic variables. Variance component estimates were similar across all models, suggesting that these significant climatic variables accounted for some known environmental variation captured in CG estimates. No climatic variables were significant in the models used to derive the CG estimates for BF. These CG estimates were used to categorize environments. There was no observable sire by environment interaction (Sire×E) for ADG when using the environmental descriptors based on CG estimates on BF. For the environmental descriptors based on CG estimates of ADG, there was significant Sire×E only when MinT was included in the model (p = .01). Therefore, this new definition of the environment, preadjusted by MinT, increased the ability to detect Sire×E. While the unknown challenges captured in refined CG estimates need verification for infection challenges, this may provide a

  19. RNA-seq: technical variability and sampling

    Science.gov (United States)

    2011-01-01

    Background: RNA-seq is revolutionizing the way we study transcriptomes. mRNA can be surveyed without prior knowledge of gene transcripts. Alternative splicing of transcript isoforms and the identification of previously unknown exons are being reported. Initial reports of differences in exon usage and splicing between samples, as well as quantitative differences among samples, are beginning to surface. Biological variation has been reported to be larger than technical variation. In addition, technical variation has been reported to be in line with expectations due to random sampling. However, strategies for dealing with technical variation will differ depending on its magnitude. The size of the technical variance and the role of sampling are examined in this manuscript. Results: In this study three independent Solexa/Illumina experiments containing technical replicates are analyzed. When coverage is low, large disagreements between technical replicates are apparent. Exon detection between technical replicates is highly variable when the coverage is less than 5 reads per nucleotide, and estimates of gene expression are more likely to disagree when coverage is low, although large disagreements in the estimates of expression are observed at all levels of coverage. Conclusions: Technical variability is too high to ignore. Technical variability results in inconsistent detection of exons at low levels of coverage. Further, the estimates of the relative abundance of a transcript can substantially disagree, even when coverage levels are high. This may be due to the low sampling fraction and, if so, it will persist as an issue needing to be addressed in experimental design even as the next wave of technology produces larger numbers of reads. We provide practical recommendations for dealing with the technical variability, without dramatic cost increases. PMID:21645359

  20. Contemporary management of lymph node metastases from an unknown primary to the neck : II. A review of therapeutic options

    NARCIS (Netherlands)

    Strojan, Primoz; Ferlito, Alfio; Langendijk, Johannes A.; Corry, June; Woolgar, Julia A.; Rinaldo, Alessandra; Silver, Carl E.; Paleri, Vinidh; Fagan, Johannes J.; Pellitteri, Phillip K.; Haigentz, Missak; Suarez, Carlos; Robbins, K. Thomas; Rodrigo, Juan P.; Olsen, Kerry D.; Hinni, Michael L.; Werner, Jochen A.; Mondin, Vanni; Kowalski, Luiz P.; Devaney, Kenneth O.; de Bree, Remco; Takes, Robert P.; Wolf, Gregory T.; Shaha, Ashok R.; Genden, Eric M.; Barnes, Leon

    Although uncommon, cancer of an unknown primary (CUP) metastatic to cervical lymph nodes poses a range of dilemmas relating to optimal treatment. The ideal resolution would be a properly designed prospective randomized trial, but it is unlikely that this will ever be conducted in this group of

  1. Contemporary management of lymph node metastases from an unknown primary to the neck: II. a review of therapeutic options

    NARCIS (Netherlands)

    Strojan, P.; Ferlito, A.; Langendijk, J.A.; Corry, J.; Woolgar, J.A.; Rinaldo, A.; Silver, C.E.; Paleri, V.; Fagan, J.J.; Pellitteri, P.K.; Haigentz Jr., M.; Suarez, C.; Robbins, K.T.; Rodrigo, J.P.; Olsen, K.D.; Hinni, M.L.; Werner, J.A.; Mondin, V.; Kowalski, L.P.; Devaney, K.O.; Bree, R. de; Takes, R.P.; Wolf, G.T.; Shaha, A.R.; Genden, E.M.; Barnes, L.

    2013-01-01

    Although uncommon, cancer of an unknown primary (CUP) metastatic to cervical lymph nodes poses a range of dilemmas relating to optimal treatment. The ideal resolution would be a properly designed prospective randomized trial, but it is unlikely that this will ever be conducted in this group of

  2. Known knowns, known unknowns and unknown unknowns in prokaryotic transposition.

    Science.gov (United States)

    Siguier, Patricia; Gourbeyre, Edith; Chandler, Michael

    2017-08-01

    Although the phenomenon of transposition has been known for over 60 years, its overarching importance in modifying and streamlining genomes took some time to recognize. In spite of a robust understanding of transposition for some TE, there remain a number of important TE groups with potentially high genome impact and unknown transposition mechanisms, and yet others, only recently identified by bioinformatics, that have yet to be formally confirmed as mobile. Here, we point to some areas of limited understanding concerning well established important TE groups with DDE Tpases, address central gaps in our knowledge of characterised Tn with other types of Tpases and, finally, highlight new potentially mobile DNA species. The list is not exhaustive. Examples have been chosen to provide encouragement in the continued exploration of the considerable prokaryotic mobilome, especially in light of the current threat to public health posed by the spread of multiple antibiotic resistance (AbR). Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Use of Medicines with Unknown Fetal Risk among Parturient Women from the 2004 Pelotas Birth Cohort (Brazil

    Directory of Open Access Journals (Sweden)

    Andréa Dâmaso Bertoldi

    2012-01-01

    Background. To estimate exposure to medicines with unknown fetal risk during pregnancy and to analyze the maternal characteristics associated with it. Methods. A questionnaire was administered to 4,189 mothers of children belonging to the 2004 Pelotas (Brazil) birth cohort study, asking about the use of any medicine during gestation. We evaluated the associations between the use of medicines with unknown fetal risk and the independent variables through logistic regression models. Unknown fetal risk was defined as medicines for which studies in animals have revealed adverse effects on the fetus and no controlled studies in women are available, or for which studies in women and animals are not available. Results. Out of the 4,189 women, 52.5% used at least one medicine of unknown fetal risk. Use of these medicines was associated with white skin color, high schooling, high income, six or more antenatal care consultations, hospital admission during pregnancy, and morbidity during gestation. Conclusion. The use of unknown fetal risk medicines is high, suggesting that their use must be addressed with caution, with the aim of restricting them to cases in which the benefits are greater than the potential risks.

  4. On the fluctuations of sums of independent random variables.

    Science.gov (United States)

    Feller, W

    1969-07-01

    If X(1), X(2),... are independent random variables with zero expectation and finite variances, the cumulative sums S(n) are, on the average, of the order of magnitude s(n), where s(n)² = E(S(n)²). The occasional maxima of the ratios S(n)/s(n) are surprisingly large, and the problem is to estimate the extent of their probable fluctuations. Specifically, let S(n)* = (S(n) - b(n))/a(n), where {a(n)} and {b(n)} are two numerical sequences. For any interval I, denote by p(I) the probability that the event S(n)* ∈ I occurs for infinitely many n. Under mild conditions on {a(n)} and {b(n)}, it is shown that p(I) equals 0 or 1 according as a certain series converges or diverges. To obtain the upper limit of S(n)/a(n), one has to set b(n) = ±ε a(n), but finer results are obtained with smaller b(n). No assumptions concerning the underlying distributions are made; the criteria explain structurally which features of {X(n)} affect the fluctuations, but for concrete results something about P{S(n) > a(n)} must be known. For example, a complete solution is possible when the X(n) are normal, replacing the classical law of the iterated logarithm. Further concrete estimates may be obtained by combining the new criteria with some recently developed limit theorems.
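    As a hedged numerical illustration of the regime this abstract describes (not code from the paper): for i.i.d. standard normal summands the classical law of the iterated logarithm gives limsup S(n)/√(2n log log n) = 1 almost surely, so the normalized ratio stays of order one even over long runs. All parameter choices below are illustrative.

```python
import numpy as np

# Simulate cumulative sums of i.i.d. standard normals and check that the
# law-of-the-iterated-logarithm normalization keeps the ratio of order one.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
s = np.cumsum(x)
n = np.arange(1, s.size + 1)
mask = n >= 10  # avoid log(log(n)) <= 0 for very small n
ratio = np.abs(s[mask]) / np.sqrt(2.0 * n[mask] * np.log(np.log(n[mask])))
print(ratio.max())  # typically of order 1, despite S(n) wandering widely
```

    The same experiment with heavy-tailed summands (infinite variance) would break this normalization, which is why the abstract's criteria ask what must be known about P{S(n) > a(n)}.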

  5. Systemic treatment of cancer of unknown primary origin

    International Nuclear Information System (INIS)

    Reckova, M.

    2013-01-01

    Cancer of unknown primary origin (CUP) comprises a heterogeneous group of cancers with distinct biology and prognosis. There is, however, a specific group of patients with curable disease, or incurable disease with good prognosis. The main aim of treatment in patients with CUP is timely initiation of therapy in cases of curable disease. There is no known standard of care for CUP with poor prognosis, but platinum-based regimens are used most frequently. When a specific immunohistochemistry (IHC) or molecular gene expression profile is identified, treatment regimens are similar to those used in patients with a known primary tumor and a similar IHC or molecular profile. Currently, most data on patients with CUP come from phase II clinical trials. Well-designed phase III randomized clinical trials with translational research are therefore a priority, with the aim of improving our knowledge and personalizing the treatment of such a heterogeneous group of patients. (author)

  6. MODELING THE TIME VARIABILITY OF SDSS STRIPE 82 QUASARS AS A DAMPED RANDOM WALK

    International Nuclear Information System (INIS)

    MacLeod, C. L.; Ivezic, Z.; Bullock, E.; Kimball, A.; Sesar, B.; Westman, D.; Brooks, K.; Gibson, R.; Becker, A. C.; Kochanek, C. S.; Kozlowski, S.; Kelly, B.; De Vries, W. H.

    2010-01-01

    We model the time variability of ∼9000 spectroscopically confirmed quasars in SDSS Stripe 82 as a damped random walk (DRW). Using 2.7 million photometric measurements collected over 10 yr, we confirm the results of Kelly et al. and Kozlowski et al. that this model can explain quasar light curves at an impressive fidelity level (0.01-0.02 mag). The DRW model provides a simple, fast (O(N) for N data points), and powerful statistical description of quasar light curves by a characteristic timescale (τ) and an asymptotic rms variability on long timescales (SF∞). We searched for correlations between these two variability parameters and physical parameters such as luminosity, black hole mass, and rest-frame wavelength. Our analysis shows SF∞ to increase with decreasing luminosity and rest-frame wavelength, as observed previously, and without a correlation with redshift. We find a correlation between SF∞ and black hole mass with a power-law index of 0.18 ± 0.03, independent of the anti-correlation with luminosity. We find that τ increases with increasing wavelength with a power-law index of 0.17, remains nearly constant with redshift and luminosity, and increases with increasing black hole mass with a power-law index of 0.21 ± 0.07. The amplitude of variability is anti-correlated with the Eddington ratio, which suggests a scenario where optical fluctuations are tied to variations in the accretion rate. However, we find an additional dependence on luminosity and/or black hole mass that cannot be explained by the trend with Eddington ratio. The radio-loudest quasars have systematically larger variability amplitudes by about 30%, when corrected for the other observed trends, while the distribution of their characteristic timescale is indistinguishable from that of the full sample. We do not detect any statistically robust differences in the characteristic timescale and variability amplitude between the full sample and the small subsample of quasars detected
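    Mathematically, the DRW described above is an Ornstein–Uhlenbeck process with relaxation time τ and stationary standard deviation SF∞/√2. A minimal simulation sketch using the exact one-step transition (parameter values are illustrative, not taken from the paper):

```python
import numpy as np

# Damped random walk (Ornstein-Uhlenbeck process) with timescale tau and
# asymptotic rms variability sf_inf; stationary std deviation is sf_inf/sqrt(2).
def simulate_drw(n_steps, dt, tau, sf_inf, mean_mag=19.0, seed=1):
    rng = np.random.default_rng(seed)
    var_inf = sf_inf**2 / 2.0          # stationary variance
    decay = np.exp(-dt / tau)
    x = np.empty(n_steps)
    x[0] = mean_mag
    for i in range(1, n_steps):
        # exact conditional mean and variance of the OU transition over dt
        mu = mean_mag + (x[i - 1] - mean_mag) * decay
        var = var_inf * (1.0 - decay**2)
        x[i] = rng.normal(mu, np.sqrt(var))
    return x

lc = simulate_drw(n_steps=100_000, dt=1.0, tau=50.0, sf_inf=0.2)
print(lc.std())  # close to 0.2 / sqrt(2) ≈ 0.141
```

    Fitting τ and SF∞ to an observed light curve with irregular sampling requires the likelihood machinery of Kelly et al.; the sketch only demonstrates the forward model.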

  7. Contribution to the application of the random vibration theory to the seismic analysis of structures via state variables

    International Nuclear Information System (INIS)

    Maestrini, A.P.

    1979-04-01

    Several problems related to the application of the theory of random vibration by means of state variables are studied. The well-known equations that define the propagation of the mean and the variance for linear and non-linear systems are first presented. The Monte Carlo method is then used to determine the applicability of the hypothesis of a normally distributed output in the case of linear systems subjected to non-Gaussian excitations. Finally, attention is focused on the properties of linear filters and modulation functions proposed to simulate seismic excitations as nonstationary random processes. Acceleration spectra obtained by multiplying rms spectra by a constant factor are compared with design spectra suggested by several authors for various soil conditions. In every case, filter properties are given. (Author) [pt

  8. Random Access Performance of Distributed Sensors Attacked by Unknown Jammers

    Directory of Open Access Journals (Sweden)

    Dae-Kyo Jeong

    2017-11-01

    Full Text Available In this paper, we model and investigate the random access (RA) performance of sensor nodes (SNs) in a wireless sensor network (WSN). In the WSN, a central head sensor (HS) collects the information from distributed SNs, and jammers disturb the information transmission primarily by generating interference. Two jamming attacks are considered: power and code jamming. Power jammers (if they are friendly jammers) generate noise and, as a result, degrade the quality of the signal from SNs. Power jamming is equally harmful to all the SNs that are accessing the HS and simply induces denial of service (DoS) without any need to hack the HS or SNs. On the other hand, code jammers mimic legitimate SNs by sending fake signals and thus need to know certain system parameters that are used by the legitimate SNs. As a result of code jamming, the HS falsely allocates radio resources to SNs. Code jamming hence increases the failure probability of information message transmission and misleads the usage of radio resources. We present the probabilities of successful preamble transmission with power ramping according to the jammer type, and provide the resulting throughput and delay of information transmission by SNs. The effect of the two jamming attacks on RA performance is compared through numerical investigation. The results show that, compared to RA without jammers, power and code jamming degrade the throughput by up to 30.3% and 40.5%, respectively, and the delay performance by up to 40.1% and 65.6%, respectively.

  9. Convergence Analysis of Semi-Implicit Euler Methods for Solving Stochastic Age-Dependent Capital System with Variable Delays and Random Jump Magnitudes

    Directory of Open Access Journals (Sweden)

    Qinghui Du

    2014-01-01

    Full Text Available We consider semi-implicit Euler methods for stochastic age-dependent capital system with variable delays and random jump magnitudes, and investigate the convergence of the numerical approximation. It is proved that the numerical approximate solutions converge to the analytical solutions in the mean-square sense under given conditions.
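    The paper above analyzes mean-square convergence for a specialized age-dependent system with delays and jumps. As a hedged, generic illustration of what a mean-square convergence check looks like, the sketch below uses an explicit Euler-Maruyama scheme (not the paper's semi-implicit method) on geometric Brownian motion, where the exact solution is known in closed form:

```python
import numpy as np

# Mean-square error of Euler-Maruyama for dX = mu*X dt + sigma*X dW, compared
# against the exact solution X_T = X0 * exp((mu - sigma^2/2) T + sigma W_T).
def ms_error(dt, n_paths=2000, T=1.0, mu=0.05, sigma=0.2, x0=1.0, seed=2):
    rng = np.random.default_rng(seed)
    n = int(round(T / dt))
    dw = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))  # Brownian increments
    x = np.full(n_paths, x0)
    for k in range(n):
        x = x + mu * x * dt + sigma * x * dw[:, k]
    exact = x0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * dw.sum(axis=1))
    return np.sqrt(np.mean((x - exact) ** 2))  # root-mean-square error at T

print(ms_error(0.1), ms_error(0.0125))  # error shrinks as dt decreases
```

    The same experimental design (shrink dt, measure the mean-square gap to a reference solution) underlies convergence proofs like the one in the cited paper.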

  10. Qualitatively Assessing Randomness in SVD Results

    Science.gov (United States)

    Lamb, K. W.; Miller, W. P.; Kalra, A.; Anderson, S.; Rodriguez, A.

    2012-12-01

    Singular Value Decomposition (SVD) is a powerful tool for identifying regions of significant co-variability between two spatially distributed datasets. SVD has been widely used in atmospheric research to define relationships between sea surface temperatures, geopotential height, wind, precipitation, and streamflow data for myriad regions across the globe. A typical application of SVD is to identify leading climate drivers (as observed in the wind or pressure data) for a particular hydrologic response variable such as precipitation, streamflow, or soil moisture. One can also investigate the lagged relationship between a climate variable and the hydrologic response variable using SVD. When performing these studies it is important to limit the spatial bounds of the climate variable to reduce the chance of random co-variance relationships being identified. On the other hand, a climate region that is too small may ignore climate signals which have more than a statistical relationship to a hydrologic response variable. The proposed research seeks to develop a qualitative method for identifying random co-variability relationships between two datasets. The research takes the heterogeneous correlation maps from several past results and compares them with correlation maps produced using purely random and quasi-random climate data. The comparison yields a methodology for determining whether a particular region on a correlation map may be explained by a physical mechanism or is simply statistical chance.
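    The SVD analysis described above can be sketched on synthetic data: decompose the temporal cross-covariance between two gridded anomaly fields and inspect the squared covariance fraction of each mode. Field names, sizes, and the planted shared signal below are illustrative assumptions, not data from the study.

```python
import numpy as np

# SVD of the cross-covariance between two synthetic anomaly fields that share
# one common temporal signal (the stand-in for a "climate driver").
rng = np.random.default_rng(3)
nt, nx, ny = 120, 40, 25              # time steps, climate grid, hydrology grid
shared = rng.standard_normal(nt)      # common signal coupling the two fields
climate = np.outer(shared, rng.standard_normal(nx)) + 0.5 * rng.standard_normal((nt, nx))
hydro = np.outer(shared, rng.standard_normal(ny)) + 0.5 * rng.standard_normal((nt, ny))

# remove time means, then decompose the cross-covariance matrix
c = (climate - climate.mean(0)).T @ (hydro - hydro.mean(0)) / (nt - 1)
u, s, vt = np.linalg.svd(c, full_matrices=False)
scf = s**2 / np.sum(s**2)             # squared covariance fraction per mode
print(scf[0])                         # leading mode dominates for coupled fields
```

    Repeating the decomposition with `shared` replaced by independent noise for each field gives a flat `scf` spectrum, which is the kind of random-data baseline the abstract proposes for judging correlation maps.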

  11. Respiratory variability preceding and following sighs: a resetter hypothesis.

    Science.gov (United States)

    Vlemincx, Elke; Van Diest, Ilse; Lehrer, Paul M; Aubert, André E; Van den Bergh, Omer

    2010-04-01

    Respiratory behavior is characterized by complex variability with structured and random components. Assuming that both a lack of variability and too much randomness represent suboptimal breathing regulation, we hypothesized that sighing acts as a resetter inducing structured variability. Spontaneous breathing was measured in healthy persons (N=42) during a 20-min period of quiet sitting using the LifeShirt® System. Four blocks of 10 breaths with a 50% window overlap were determined before and after spontaneous sighs. Total respiratory variability of minute ventilation was measured using the coefficient of variation, and structured (correlated) variability was quantified using autocorrelation. Towards a sigh, total variability gradually increased without concomitant changes in correlated variability, suggesting that randomness increased. After a sigh, correlated variability increased. No changes in variability were found in comparable epochs without intermediate sighs. We conclude that a sigh resets structured respiratory variability, enhancing information processing in the respiratory system. Copyright © 2009 Elsevier B.V. All rights reserved.
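    The two measures named in the abstract, the coefficient of variation for total variability and autocorrelation for structured variability, can be sketched as follows (the breath series is invented for illustration; the study's windowing and preprocessing are not reproduced):

```python
import numpy as np

# Total variability: coefficient of variation (std / mean).
def coefficient_of_variation(x):
    x = np.asarray(x, dtype=float)
    return x.std() / x.mean()

# Structured (correlated) variability: lag-1 autocorrelation of the series.
def lag1_autocorrelation(x):
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return np.sum(d[:-1] * d[1:]) / np.sum(d * d)

minute_ventilation = [6.1, 6.4, 5.9, 6.2, 7.8, 6.0, 6.3]  # L/min, made up
print(coefficient_of_variation(minute_ventilation))
print(lag1_autocorrelation(minute_ventilation))
```

    High coefficient of variation with low autocorrelation corresponds to the "random" component in the abstract; high autocorrelation marks the structured component that a sigh is hypothesized to restore.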

  12. Some limit theorems for negatively associated random variables

    Indian Academy of Sciences (India)

    random sampling without replacement, and (i) joint distribution of ranks. ... wide applications in multivariate statistical analysis and system reliability, the ... strong law of large numbers for negatively associated sequences under the case where.

  13. An Evidence-Based Review Literature About Risk Indicators and Management of Unknown-Origin Xerostomia

    Directory of Open Access Journals (Sweden)

    Farzaneh Agha-hosseini

    2013-01-01

    Full Text Available Objective: This evidence-based article reviews risk indicators and management of unknown-origin xerostomia. Xerostomia and hyposalivation refer to different aspects of dry mouth. Xerostomia is a subjective sensation of dry mouth, whilst hyposalivation is defined as an objective assessment of reduced salivary flow rate. About 30% of the elderly (65 years and older) experience xerostomia and hyposalivation. Structural factors, functional factors, or both may lead to salivary gland dysfunction. Study Selection: The EBM literature search was conducted using the medical literature database MEDLINE via the PubMed and OvidMedline search engines. Results were limited to English language articles (1965 to present) including clinical trials (CT), randomized controlled trials (RCT), systematic reviews, and review articles. Case control or cohort studies were included for the etiology. Results: Neuropathic etiology such as localized oral alteration of thermal sensations, saliva composition change (for example, higher levels of K, Cl, Ca, IgA, amylase, PTH, and cortisol, and lower levels of estrogen and progesterone), smaller salivary gland size, and illnesses such as lichen planus are risk indicators for unknown-origin xerostomia. The management is palliative and preventative. Management of symptoms includes drug administration (systemic secretogogues, saliva substitutes, and bile secretion-stimulators), night guards, and diet and habit modifications. Other managements may be indicated to treat adverse effects. Conclusion: Neuropathic etiology, saliva composition change, smaller salivary gland size, and illnesses such as oral lichen planus can be suggestive causes of unknown-origin xerostomia. However, longitudinal studies will be important to elucidate the causes of unknown-origin xerostomia.

  14. Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2008-03-01

    Full Text Available The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (i.e., of constructing a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and the “death of reality” which are typically linked to Bell-type inequalities in the physics literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we find that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts.
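    The compatibility constraint the abstract discusses is concrete in the CHSH form of Bell's inequality: any family of ±1-valued random variables on a single Kolmogorov space satisfies |S| ≤ 2, while the quantum singlet correlation E(a, b) = -cos(a - b) reaches 2√2. A small sketch (angles are the standard optimal choice, not from this paper):

```python
import numpy as np

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
def chsh(E, a, ap, b, bp):
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
E_quantum = lambda x, y: -np.cos(x - y)   # singlet-state correlation
print(abs(chsh(E_quantum, a, ap, b, bp)))  # -> 2*sqrt(2) ≈ 2.828

# A hidden-variable model: deterministic +/-1 outcomes driven by one shared
# uniform lambda, i.e. all four correlations live on a single probability space.
lam = np.random.default_rng(7).uniform(0.0, 2.0 * np.pi, 200_000)
A = lambda t: np.sign(np.cos(t - lam))
E_lhv = lambda x, y: np.mean(A(x) * -A(y))
s_lhv = chsh(E_lhv, a, ap, b, bp)
print(abs(s_lhv))  # ≈ 2, the classical (Kolmogorov-compatible) bound
```

    The gap between 2 and 2√2 is exactly the "probabilistic incompatibility" of the four pairwise experiments: no single probability measure reproduces all four quantum correlations at once.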

  15. Chinese Unknown Word Recognition for PCFG-LA Parsing

    Directory of Open Access Journals (Sweden)

    Qiuping Huang

    2014-01-01

    Full Text Available This paper investigates the recognition of unknown words in Chinese parsing. Two methods are proposed to handle this problem. One is the modification of a character-based model: we model the emission probability of an unknown word using the first and last characters in the word, aiming to reduce the POS tag ambiguities of unknown words and improve parsing performance. In addition, a novel method using graph-based semisupervised learning (SSL) is proposed to improve the syntactic parsing of unknown words. Its goal is to discover additional lexical knowledge from a large amount of unlabeled data to help the syntactic parsing. The method propagates lexical emission probabilities to unknown words by building similarity graphs over the words of labeled and unlabeled data. The derived distributions are incorporated into the parsing process. The proposed methods are effective in dealing with unknown words to improve parsing. Empirical results on the Penn Chinese Treebank and the TCT Treebank reveal their effectiveness.

  16. Chaos, dynamical structure and climate variability

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, H.B. [Brookhaven National Lab., Upton, NY (United States). Dept. of Applied Science

    1995-09-01

    Deterministic chaos in dynamical systems offers a new paradigm for understanding irregular fluctuations. Techniques for identifying deterministic chaos from observed data, without recourse to mathematical models, are being developed. Powerful methods exist for reconstructing multidimensional phase space from an observed time series of a single scalar variable; these methods are invaluable when only a single scalar record of the dynamics is available. However, in some applications multiple concurrent time series may be available for consideration as phase space coordinates. Here the authors propose some basic analytical tools for such multichannel time series data, and illustrate them by applications to a simple synthetic model of chaos, to a low-order model of atmospheric circulation, and to two high-resolution paleoclimate proxy data series. The atmospheric circulation model, originally proposed by Lorenz, has 27 principal unknowns; they establish that the chaotic attractor can be embedded in a subspace of eight dimensions by exhibiting a specific subset of eight unknowns which pass multichannel tests for false nearest neighbors. They also show that one of the principal unknowns in the 27-variable model--the global mean sea surface temperature--is of no discernible usefulness in making short-term forecasts.

  17. A simplified method for random vibration analysis of structures with random parameters

    International Nuclear Information System (INIS)

    Ghienne, Martin; Blanzé, Claude

    2016-01-01

    Piezoelectric patches with adapted electrical circuits or viscoelastic dissipative materials are two solutions particularly well adapted to reducing the vibration of light structures. To accurately design these solutions, it is necessary to describe precisely the dynamical behaviour of the structure. It may quickly become computationally intensive to robustly describe this behaviour for a structure with nonlinear phenomena, such as contact or friction for bolted structures, and uncertain variations of its parameters. The aim of this work is to propose a non-intrusive reduced stochastic method to robustly characterize the vibrational response of a structure with random parameters. Our goal is to characterize the eigenspace of linear systems whose dynamic properties are considered as random variables. The method is based on a separation of random aspects from deterministic aspects and allows us to estimate the first central moments of each random eigenfrequency with a single deterministic finite element computation. The method is applied to a frame with several Young's moduli modeled as random variables. This example could be extended to a bolted structure including piezoelectric devices. The method needs to be enhanced when random eigenvalues are closely spaced. An indicator with no additional computational cost is proposed to characterize the "proximity" of two random eigenvalues. (paper)

  18. Are glucose levels, glucose variability and autonomic control influenced by inspiratory muscle exercise in patients with type 2 diabetes? Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Schein, Aso; Correa, Aps; Casali, Karina Rabello; Schaan, Beatriz D

    2016-01-20

    Physical exercise reduces glucose levels and glucose variability in patients with type 2 diabetes. Acute inspiratory muscle exercise has been shown to reduce these parameters in a small group of patients with type 2 diabetes, but these results have yet to be confirmed in a well-designed study. The aim of this study is to investigate the effect of acute inspiratory muscle exercise on glucose levels, glucose variability, and cardiovascular autonomic function in patients with type 2 diabetes. This study will use a randomized clinical trial crossover design. A total of 14 subjects will be recruited and randomly allocated to two groups to perform acute inspiratory muscle loading at 2 % of maximal inspiratory pressure (PImax, placebo load) or 60 % of PImax (experimental load). Inspiratory muscle training could be a novel exercise modality to be used to decrease glucose levels and glucose variability. ClinicalTrials.gov NCT02292810 .

  19. Bubble CPAP versus CPAP with variable flow in newborns with respiratory distress: a randomized controlled trial.

    Science.gov (United States)

    Yagui, Ana Cristina Zanon; Vale, Luciana Assis Pires Andrade; Haddad, Luciana Branco; Prado, Cristiane; Rossi, Felipe Souza; Deutsch, Alice D Agostini; Rebello, Celso Moura

    2011-01-01

    To evaluate the efficacy and safety of nasal continuous positive airway pressure (NCPAP) using devices with variable flow or bubble continuous positive airway pressure (CPAP) regarding CPAP failure, presence of air leaks, total CPAP and oxygen time, and length of intensive care unit and hospital stay in neonates with moderate respiratory distress (RD) and birth weight (BW) ≥ 1,500 g. Forty newborns requiring NCPAP were randomized into two study groups: a variable flow group (VF) and a continuous flow group (CF). The study was conducted between October 2008 and April 2010. Demographic data, CPAP failure, presence of air leaks, and total CPAP and oxygen time were recorded. Categorical outcomes were tested using the chi-square test or Fisher's exact test. Continuous variables were analyzed using the Mann-Whitney test. The level of significance was set at p < 0.05. The groups did not differ significantly in CPAP failure (21.1 and 20.0% for VF and CF, respectively; p = 1.000), air leak syndrome (10.5 and 5.0%, respectively; p = 0.605), total CPAP time (median: 22.0 h, interquartile range [IQR]: 8.00-31.00 h and median: 22.0 h, IQR: 6.00-32.00 h, respectively; p = 0.822), or total oxygen time (median: 24.00 h, IQR: 7.00-85.00 h and median: 21.00 h, IQR: 9.50-66.75 h, respectively; p = 0.779). In newborns with BW ≥ 1,500 g and moderate RD, the use of continuous flow NCPAP showed the same benefits as the use of variable flow NCPAP.

  20. The quotient of normal random variables and application to asset price fat tails

    Science.gov (United States)

    Caginalp, Carey; Caginalp, Gunduz

    2018-06-01

    The quotient of random variables with normal distributions is examined and proven to have power-law decay, with density f(x) ≃ f₀x⁻², with the coefficient depending on the means and variances of the numerator and denominator and their correlation. We also obtain the conditional probability densities for each of the four quadrants given by the signs of the numerator and denominator for arbitrary correlation ρ ∈ [-1, 1). For ρ = -1 we obtain a particularly simple closed-form solution for all x ∈ ℝ. The results are applied to a basic issue in economics and finance, namely the density of relative price changes. Classical finance stipulates a normal distribution of relative price changes, though empirical studies suggest a power law at the tail end. By considering the supply and demand in a basic price change model, we prove that the relative price change has a density that decays with an x⁻² power law. Various parameter limits are established.
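    The x⁻² density tail can be checked by Monte Carlo: it implies a survival function P(|X/Y| > x) falling off like 1/x, so x·P(|X/Y| > x) should be roughly flat across decades. The sketch below uses the simplest case, zero means and ρ = 0, where the quotient is standard Cauchy and the plateau is 2/π:

```python
import numpy as np

# Quotient of two independent standard normals (zero means, rho = 0).
# Density ~ f0 * x^-2 implies survival ~ c / x, so x * P(|Z| > x) is flat.
rng = np.random.default_rng(4)
n = 1_000_000
z = rng.standard_normal(n) / rng.standard_normal(n)
for x in (10.0, 100.0):
    print(x, x * np.mean(np.abs(z) > x))  # both ≈ 2/pi ≈ 0.64 in this case
```

    With nonzero means or correlation the plateau constant changes (that dependence is what the paper derives), but the x⁻² tail exponent persists.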

  1. European Randomized Study of Screening for Prostate Cancer Risk Calculator: External Validation, Variability, and Clinical Significance.

    Science.gov (United States)

    Gómez-Gómez, Enrique; Carrasco-Valiente, Julia; Blanca-Pedregosa, Ana; Barco-Sánchez, Beatriz; Fernandez-Rueda, Jose Luis; Molina-Abril, Helena; Valero-Rosa, Jose; Font-Ugalde, Pilar; Requena-Tapia, Maria José

    2017-04-01

    To externally validate the European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculator (RC) and to evaluate its variability between 2 consecutive prostate-specific antigen (PSA) values. We prospectively catalogued 1021 consecutive patients before prostate biopsy for suspicion of prostate cancer (PCa). The risk of PCa and significant PCa (Gleason score ≥7) for 749 patients was calculated according to the ERSPC-RC (digital rectal examination-based version 3 of 4) for 2 consecutive PSA tests per patient. The calculator's predictions were analyzed using calibration plots and the area under the receiver operating characteristic curve (AUC). Cohen's kappa coefficient was used to compare agreement and variability. Of 749 patients, PCa was detected in 251 (33.5%) and significant PCa in 133 (17.8%). Calibration plots showed acceptable parallelism and similar discrimination ability for both PSA levels, with an AUC of 0.69 for PCa and 0.74 for significant PCa. The ERSPC-RC identified 226 (30.2%) unnecessary biopsies at the cost of missing 10 significant PCa cases. The variability of the RC was 16% for PCa and 20% for significant PCa, and higher variability was associated with a reduced risk of significant PCa. We conclude that the performance of the ERSPC-RC in the present cohort shows high similarity between the 2 PSA levels; however, the RC variability value is associated with a decreased risk of significant PCa. The use of the ERSPC-RC in our cohort identified a high number of unnecessary biopsies. Thus, incorporating the ERSPC-RC could support the clinical decision of whether to carry out a prostate biopsy. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Known Unknowns in Judgment and Choice

    OpenAIRE

    Walters, Daniel

    2017-01-01

    This dissertation investigates how people make inferences about missing information. Whereas most prior literature focuses on how people process known information, I show that the extent to which people make inferences about missing information impacts judgments and choices. Specifically, I investigate how (1) awareness of known unknowns affects overconfidence in judgment in Chapter 1, (2) beliefs about the knowability of unknowns impacts investment strategies in Chapter 2, and (3) inferences...

  3. A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.

    Science.gov (United States)

    Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco

    2005-02-01

    Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n(g) approximately 100-150 genes). In the present investigation, a large random European population sample (average n(g) approximately 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005), much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.

  4. Lower limits for distribution tails of randomly stopped sums

    NARCIS (Netherlands)

    Denisov, D.E.; Korshunov, D.A.; Foss, S.G.

    2008-01-01

    We study lower limits for the ratio $\overline{F^{*\tau}}(x)/\overline{F}(x)$ of tail distributions, where $F^{*\tau}$ is the distribution of a sum of a random size $\tau$ of independent identically distributed random variables having a common distribution $F$, and the random variable $\tau$ does not
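    For heavy-tailed (subexponential) F, the benchmark that results of this kind refine is the classical asymptotic P(S_τ > x) ~ E[τ]·F̄(x). A hedged Monte Carlo sketch with Pareto summands and an independent geometric stopping time (all parameters illustrative):

```python
import numpy as np

# Randomly stopped sum S_tau = X_1 + ... + X_tau with Pareto(alpha) summands
# on [1, inf) and an independent geometric tau; for subexponential F,
# P(S_tau > x) ~ E[tau] * P(X_1 > x) as x -> infinity.
rng = np.random.default_rng(5)
alpha, p, n = 1.5, 1.0 / 3.0, 200_000    # tail index, geometric parameter
tau = rng.geometric(p, size=n)           # E[tau] = 1/p = 3
# rng.pareto draws Lomax variables on [0, inf); adding 1 per summand (i.e. +k
# for k summands) shifts each to Pareto on [1, inf) with survival x^-alpha.
totals = np.array([rng.pareto(alpha, k).sum() + k for k in tau])
x_tail = 50.0
empirical = np.mean(totals > x_tail)
asymptotic = (1.0 / p) * x_tail ** -alpha  # E[tau] * survival of one summand
print(empirical, asymptotic)               # same order of magnitude
```

    At finite x the empirical tail typically sits somewhat above the asymptotic ("one big jump") value; the cited work concerns lower limits of exactly this kind of ratio.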

  5. Quantifying variability in earthquake rupture models using multidimensional scaling: application to the 2011 Tohoku earthquake

    KAUST Repository

    Razafindrakoto, Hoby

    2015-04-22

    Finite-fault earthquake source inversion is an ill-posed inverse problem leading to non-unique solutions. In addition, various fault parametrizations and input data may have been used by different researchers for the same earthquake. Such variability leads to large intra-event variability in the inferred rupture models. One way to understand this problem is to develop robust metrics to quantify model variability. We propose a Multi Dimensional Scaling (MDS) approach to compare rupture models quantitatively. We consider normalized squared and grey-scale metrics that reflect the variability in the location, intensity and geometry of the source parameters. We test the approach on two-dimensional random fields generated using a von Kármán autocorrelation function and varying its spectral parameters. The spread of points in the MDS solution indicates different levels of model variability. We observe that the normalized squared metric is insensitive to variability of spectral parameters, whereas the grey-scale metric is sensitive to small-scale changes in geometry. From this benchmark, we formulate a similarity scale to rank the rupture models. As case studies, we examine inverted models from the Source Inversion Validation (SIV) exercise and published models of the 2011 Mw 9.0 Tohoku earthquake, allowing us to test our approach for a case with a known reference model and one with an unknown true solution. The normalized squared and grey-scale metrics are respectively sensitive to the overall intensity and the extension of the three classes of slip (very large, large, and low). Additionally, we observe that a three-dimensional MDS configuration is preferable for models with large variability. We also find that the models for the Tohoku earthquake derived from tsunami data and their corresponding predictions cluster with a systematic deviation from other models. We demonstrate the stability of the MDS point-cloud using a number of realizations and jackknife tests, for
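    The embedding step of the approach above can be sketched with textbook classical (Torgerson) MDS: given a matrix of pairwise model-to-model distances (here ordinary Euclidean distances on toy "models"; the paper's normalized squared and grey-scale metrics are not reproduced), double-center the squared distances and take the top eigenvectors.

```python
import numpy as np

# Classical (Torgerson) MDS: recover a k-dimensional point cloud whose
# pairwise distances match a given distance matrix as closely as possible.
def classical_mds(d, k=2):
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (d ** 2) @ j            # Gram matrix of centered coordinates
    w, v = np.linalg.eigh(b)
    idx = np.argsort(w)[::-1][:k]          # k largest eigenvalues
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# four toy "rupture models" with known 2-D coordinates; the MDS point cloud
# must reproduce their mutual distances (up to rotation/reflection)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
emb = classical_mds(d, k=2)
d_emb = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
print(np.allclose(d, d_emb))  # distances recovered exactly for Euclidean input
```

    In the paper's setting the spread of the resulting point cloud is what quantifies intra-event model variability, and clusters (e.g., the tsunami-derived Tohoku models) appear as groups of nearby points.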

  6. Quantifying variability in earthquake rupture models using multidimensional scaling: application to the 2011 Tohoku earthquake

    KAUST Repository

    Razafindrakoto, Hoby; Mai, Paul Martin; Genton, Marc G.; Zhang, Ling; Thingbaijam, Kiran Kumar

    2015-01-01

    Finite-fault earthquake source inversion is an ill-posed inverse problem leading to non-unique solutions. In addition, various fault parametrizations and input data may have been used by different researchers for the same earthquake. Such variability leads to large intra-event variability in the inferred rupture models. One way to understand this problem is to develop robust metrics to quantify model variability. We propose a Multi Dimensional Scaling (MDS) approach to compare rupture models quantitatively. We consider normalized squared and grey-scale metrics that reflect the variability in the location, intensity and geometry of the source parameters. We test the approach on two-dimensional random fields generated using a von Kármán autocorrelation function and varying its spectral parameters. The spread of points in the MDS solution indicates different levels of model variability. We observe that the normalized squared metric is insensitive to variability of spectral parameters, whereas the grey-scale metric is sensitive to small-scale changes in geometry. From this benchmark, we formulate a similarity scale to rank the rupture models. As case studies, we examine inverted models from the Source Inversion Validation (SIV) exercise and published models of the 2011 Mw 9.0 Tohoku earthquake, allowing us to test our approach for a case with a known reference model and one with an unknown true solution. The normalized squared and grey-scale metrics are respectively sensitive to the overall intensity and the extension of the three classes of slip (very large, large, and low). Additionally, we observe that a three-dimensional MDS configuration is preferable for models with large variability. We also find that the models for the Tohoku earthquake derived from tsunami data and their corresponding predictions cluster with a systematic deviation from other models. We demonstrate the stability of the MDS point-cloud using a number of realizations and jackknife tests, for

  7. Sum of ratios of products for α-μ random variables in wireless multihop relaying and multiple scattering

    KAUST Repository

    Wang, Kezhi; Wang, Tian; Chen, Yunfei; Alouini, Mohamed-Slim

    2014-01-01

    The sum of ratios of products of independent α-μ random variables (RVs) is approximated using the Generalized Gamma ratio approximation (GGRA), with the Gamma ratio approximation (GRA) as a special case. The proposed approximation is used to calculate the outage probability of equal gain combining (EGC) or maximum ratio combining (MRC) receivers in wireless multihop relaying or multiple scattering systems with interference. Numerical results show that the newly derived approximation, verified by simulation, works very well, while GRA performs slightly worse than GGRA when the outage probability is below 0.1 but has a simpler form.
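
    Absent the closed-form GGRA expression, the outage probability of such a sum of ratios can be checked by Monte Carlo, using the common construction in which (R/r̂)^α is Gamma-distributed (the fading parameters, threshold, and channel layout below are illustrative, not those of the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

def alpha_mu(alpha, mu, r_hat, size):
    """Sample alpha-mu RVs: (R / r_hat)**alpha ~ Gamma(mu, scale=1/mu)."""
    g = rng.gamma(shape=mu, scale=1.0 / mu, size=size)
    return r_hat * g ** (1.0 / alpha)

# Sum of two ratios (signal product / interferer), estimated by Monte Carlo
n = 200_000
z = (alpha_mu(2.0, 1.5, 1.0, n) * alpha_mu(2.0, 1.5, 1.0, n)) / alpha_mu(2.0, 2.0, 1.0, n) \
    + alpha_mu(2.0, 1.5, 1.0, n) / alpha_mu(2.0, 2.0, 1.0, n)
threshold = 0.5
p_out = float(np.mean(z < threshold))   # outage: the combined ratio falls below threshold
```

Such a simulation is what the approximation in the paper is validated against.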

  8. Sum of ratios of products for α-μ random variables in wireless multihop relaying and multiple scattering

    KAUST Repository

    Wang, Kezhi

    2014-09-01

    The sum of ratios of products of independent α-μ random variables (RVs) is approximated using the Generalized Gamma ratio approximation (GGRA), with the Gamma ratio approximation (GRA) as a special case. The proposed approximation is used to calculate the outage probability of equal gain combining (EGC) or maximum ratio combining (MRC) receivers in wireless multihop relaying or multiple scattering systems with interference. Numerical results show that the newly derived approximation, verified by simulation, works very well, while GRA performs slightly worse than GGRA when the outage probability is below 0.1 but has a simpler form.

  9. The mesoscopic conductance of disordered rings, its random matrix theory and the generalized variable range hopping picture

    International Nuclear Information System (INIS)

    Stotland, Alexander; Peer, Tal; Cohen, Doron; Budoyo, Rangga; Kottos, Tsampikos

    2008-01-01

    The calculation of the conductance of disordered rings requires a theory that goes beyond the Kubo-Drude formulation. Assuming 'mesoscopic' circumstances the analysis of the electro-driven transitions shows similarities with a percolation problem in energy space. We argue that the texture and the sparsity of the perturbation matrix dictate the value of the conductance, and study its dependence on the disorder strength, ranging from the ballistic to the Anderson localization regime. An improved sparse random matrix model is introduced to capture the essential ingredients of the problem, and leads to a generalized variable range hopping picture. (fast track communication)

  10. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    Science.gov (United States)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) equals in distribution the random sum ∑_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0,1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
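
    The distributional identity for the composed Poisson process can be verified numerically: conditionally on N_β(t) = n, the outer process contributes a Poisson variable with mean λ_α·n, which matches a random sum of n i.i.d. Poisson(λ_α) terms (the rates below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
lam_a, lam_b, t, n = 2.0, 3.0, 1.0, 50_000

# Left side: the outer Poisson process evaluated at the random time N_beta(t)
inner = rng.poisson(lam_b * t, size=n)
left = rng.poisson(lam_a * inner)                       # N_alpha(N_beta(t)) | N_beta(t)

# Right side: random sum of N_beta(t) iid Poisson(lam_a) variables
right = np.array([rng.poisson(lam_a, size=k).sum()
                  for k in rng.poisson(lam_b * t, size=n)])

# Both sides share mean lam_a * lam_b * t and variance lam_a*lam_b*t*(1 + lam_a)
```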

  11. A New Class of Particle Filters for Random Dynamic Systems with Unknown Statistics

    Directory of Open Access Journals (Sweden)

    Joaquín Míguez

    2004-11-01

    In recent years, particle filtering has become a powerful tool for tracking signals and time-varying parameters of random dynamic systems. These methods require a mathematical representation of the dynamics of the system evolution, together with assumptions of probabilistic models. In this paper, we present a new class of particle filtering methods that do not assume explicit mathematical forms of the probability distributions of the noise in the system. As a consequence, the proposed techniques are simpler, more robust, and more flexible than standard particle filters. Apart from the theoretical development of specific methods in the new class, we provide computer simulation results that demonstrate the performance of the algorithms in the problem of autonomous positioning of a vehicle in a 2-dimensional space.
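
    For contrast with the distribution-free methods proposed in the paper, a minimal standard bootstrap (SIR) particle filter, which does assume explicit Gaussian noise models, can be sketched on a 1-D tracking toy problem (all noise levels and sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy system: 1-D random-walk state observed in Gaussian noise
T = 50
x_true = np.cumsum(rng.normal(0.0, 0.5, T))
y = x_true + rng.normal(0.0, 1.0, T)

# Bootstrap (SIR) particle filter with N particles
N = 1000
particles = rng.normal(0.0, 1.0, N)
estimates = []
for t in range(T):
    particles = particles + rng.normal(0.0, 0.5, N)      # propagate through dynamics
    w = np.exp(-0.5 * (y[t] - particles) ** 2)           # Gaussian observation likelihood
    w /= w.sum()
    estimates.append(float(np.sum(w * particles)))       # weighted posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]    # multinomial resampling

rmse = float(np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))
```

The filter's RMSE should beat the raw observations (noise std 1.0); the methods in the paper replace the explicit likelihood step with cost-based weighting.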

  12. Growth Estimators and Confidence Intervals for the Mean of Negative Binomial Random Variables with Unknown Dispersion

    Directory of Open Access Journals (Sweden)

    David Shilane

    2013-01-01

    The negative binomial distribution becomes highly skewed under extreme dispersion. Even at moderately large sample sizes, the sample mean exhibits a heavy right tail. The standard normal approximation often does not provide adequate inferences about the data's expected value in this setting. In previous work, we have examined alternative methods of generating confidence intervals for the expected value. These methods were based upon Gamma and Chi Square approximations or tail probability bounds such as Bernstein's inequality. We now propose growth estimators of the negative binomial mean. Under high dispersion, zero values are likely to be overrepresented in the data. A growth estimator constructs a normal-style confidence interval by effectively removing a small, predetermined number of zeros from the data. We propose growth estimators based upon multiplicative adjustments of the sample mean and direct removal of zeros from the sample. These methods do not require estimating the nuisance dispersion parameter. We will demonstrate that the growth estimators' confidence intervals provide improved coverage over a wide range of parameter values and asymptotically converge to the sample mean. Interestingly, the proposed methods succeed despite adding both bias and variance to the normal approximation.
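
    A toy version of the direct-removal idea can be sketched as follows; the adjustment used here is a simplified stand-in, not the paper's exact growth estimator, and the distribution parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def growth_ci(x, c=1, z=1.96):
    """Normal-style CI after effectively removing up to c zeros (illustrative variant)."""
    x = np.asarray(x, dtype=float)
    n_eff = len(x) - min(c, int(np.sum(x == 0)))   # drop up to c zeros from the count
    mean = x.sum() / n_eff                          # the adjusted mean "grows" slightly
    se = x.std(ddof=1) / np.sqrt(n_eff)
    return mean - z * se, mean + z * se

# Highly dispersed negative binomial sample: many zeros, heavy right tail
x = rng.negative_binomial(0.3, 0.05, size=40)
lo, hi = growth_ci(x, c=1)
```

The interval is centred slightly above the sample mean, which is the intended upward adjustment under zero-inflation; as the sample size grows the effect of removing a fixed number of zeros vanishes, matching the asymptotic convergence noted in the abstract.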

  13. Randomized Trial of a Lifestyle Physical Activity Intervention for Breast Cancer Survivors: Effects on Transtheoretical Model Variables.

    Science.gov (United States)

    Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen

    2018-01-01

    This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy (F = 9.55, p = .003) and perceived significantly fewer cons of exercise (F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.

  14. Mobile assistant for unknown caller identification

    OpenAIRE

    Hribernik, Andraž

    2012-01-01

    The main motivation of this diploma thesis is the development of an Android application that helps its user find out who the owner of an unknown phone number is. The data sources for identifying an unknown phone number are freely available web sources. During the development of the prototype, data from different web sources were integrated, and the result of this integration is shown in the Android application. The data integration includes access to semi-structured data on the web portal of the Phone Directory of Slovenia, ...

  15. Variational Infinite Hidden Conditional Random Fields

    NARCIS (Netherlands)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin

    2015-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An Infinite hidden conditional random field is a hidden conditional random field with a countably infinite number of

  16. Use of implantable and external loop recorders in syncope with unknown causes

    Directory of Open Access Journals (Sweden)

    Kaoru Tanno

    2017-12-01

    The gold standard for diagnosing syncope is to elucidate the symptom-electrocardiogram (ECG) correlation. ECG recordings during syncope allow physicians to either confirm or exclude an arrhythmia as the mechanism of syncope. Many studies have investigated the use of the internal loop recorder (ILR), while few studies have used the external loop recorder (ELR) for patients with unexplained syncope. The aim of this review is to clarify the clinical usefulness of ILR and ELR in the diagnosis and management of patients with unexplained syncope. Many observational and four randomized control studies have shown that ILR for patients with unknown syncope is a useful tool for early diagnosis and improving diagnosis rate. ILR also provides important information on the mechanism of syncope and treatment strategy. However, there is no evidence of total mortality or quality of life improvements with ILR. The diagnostic yield of ELR in patients with syncope was similar to that with ILR within the same timeframe. Therefore, ELR could be considered for long-term ECG monitoring before a patient switches to using ILR. A systematic approach and selection of ECG monitoring tools reduces health care costs and improves the selection of patients for optimal treatment possibilities. Keywords: Internal loop recorder, External loop recorder, Unknown Syncope

  17. Asymptotic distribution of products of sums of independent random ...

    Indian Academy of Sciences (India)

    integrable random variables (r.v.) are asymptotically log-normal. This fact … the product of the partial sums of i.i.d. positive random variables as follows. … Now define … by Henan Province Foundation and Frontier Technology Research Plan.

  18. A Geometrical Framework for Covariance Matrices of Continuous and Categorical Variables

    Science.gov (United States)

    Vernizzi, Graziano; Nakai, Miki

    2015-01-01

    It is well known that a categorical random variable can be represented geometrically by a simplex. Accordingly, several measures of association between categorical variables have been proposed and discussed in the literature. Moreover, the standard definitions of covariance and correlation coefficient for continuous random variables have been…

  19. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
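
    The core of such a sampler, a linear congruential generator producing uniforms plus inverse-transform mappings to the other distributions, can be sketched as follows (the multiplier and increment are common textbook constants, not necessarily those used in BWIP-RANDOM-SAMPLING):

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator yielding uniforms strictly inside (0, 1)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield (state + 0.5) / m

gen = lcg(seed=12345)
uniforms = [next(gen) for _ in range(20_000)]

# Inverse-transform sampling from the same uniform stream:
exponential = [-2.0 * math.log(1.0 - u) for u in uniforms]   # exponential, mean 2.0
bernoulli = [1 if u < 0.3 else 0 for u in uniforms]          # Bernoulli, P(success) = 0.3
```

The other distributions in the record (normal, lognormal, log-uniform) follow the same pattern with their respective inverse CDFs or standard transforms; truncation amounts to rescaling the uniform into the CDF range of the truncated support.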

  20. MoCha: Molecular Characterization of Unknown Pathways.

    Science.gov (United States)

    Lobo, Daniel; Hammelman, Jennifer; Levin, Michael

    2016-04-01

    Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.

  1. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item the probability distributions of the time to failure and of the duration of down time and the expectations of these random variables are determined. Moreover, it is shown that the same theory applies to randomly checked items with exponential probability distribution of life such as electronic items. The case of periodic renewals is treated as an example. (orig.)

  2. Variability in response to albuminuria-lowering drugs

    DEFF Research Database (Denmark)

    Petrykiv, Sergei I; de Zeeuw, Dick; Persson, Frederik

    2017-01-01

    AIMS: Albuminuria-lowering drugs have shown different effect sizes in different individuals. Since urine albumin levels are known to vary considerably from day to day, we questioned whether the between-individual variability in albuminuria response after therapy initiation reflects random variability or a true response variation to treatment. In addition, we questioned whether the response variability is drug dependent. METHODS: To determine whether the response to treatment is random or a true drug response, we correlated, in six clinical trials, the change in albuminuria during placebo or active treatment (on-treatment) with the change in albuminuria during wash-out (off-treatment). If these responses correlate during active treatment, it suggests that at least part of the response variability can be attributed to drug response variability. We tested this for enalapril, losartan...

  3. Dynamic modelling and adaptive robust tracking control of a space robot with two-link flexible manipulators under unknown disturbances

    Science.gov (United States)

    Yang, Xinxin; Ge, Shuzhi Sam; He, Wei

    2018-04-01

    In this paper, both the closed-form dynamics and adaptive robust tracking control of a space robot with two-link flexible manipulators under unknown disturbances are developed. The dynamic model of the system is described with assumed modes approach and Lagrangian method. The flexible manipulators are represented as Euler-Bernoulli beams. Based on singular perturbation technique, the displacements/joint angles and flexible modes are modelled as slow and fast variables, respectively. A sliding mode control is designed for trajectories tracking of the slow subsystem under unknown but bounded disturbances, and an adaptive sliding mode control is derived for slow subsystem under unknown slowly time-varying disturbances. An optimal linear quadratic regulator method is proposed for the fast subsystem to damp out the vibrations of the flexible manipulators. Theoretical analysis validates the stability of the proposed composite controller. Numerical simulation results demonstrate the performance of the closed-loop flexible space robot system.

  4. Persistent Surveillance of Transient Events with Unknown Statistics

    Science.gov (United States)

    2016-12-18

    where H(m,k) is the Kullback-Leibler (KL) divergence between two Poisson distributed random variables with means m and k, and W is the Lambert W function. … greater than (1−ε)^(k−1), i.e., P(Var(λ̂_{i,k} | X^(1:k−1)) ≤ δ/(k−1) · Var(λ_i) | X^(1:k−1)) > (1−ε)^(k−1) for all stations i ∈ [n]. Theorem 5 (ξ-Bound on the …) … for all stations i ∈ [n]. Theorem 6 (∆-Bound on Policy Optimality). For any ξ_i ∈ R_+, i ∈ [n], given that 0 < |λ̂_{i,k} − λ_i| < ξ_i with probability as given in Theorem 5, let …
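
    For reference, the KL divergence between two Poisson distributions with means m and k, which the report's H(m,k) plausibly denotes, has a simple closed form that can be cross-checked against the defining series:

```python
import math

def poisson_kl(m, k):
    """D(Poisson(m) || Poisson(k)) = k - m + m*log(m/k), in closed form."""
    return k - m + m * math.log(m / k)

def poisson_kl_direct(m, k, terms=200):
    """Cross-check by summing sum_x p(x) * log(p(x)/q(x)) over the support."""
    def pmf(x, lam):
        # log-space evaluation avoids overflow in the factorial
        return math.exp(-lam + x * math.log(lam) - math.lgamma(x + 1))
    return sum(pmf(x, m) * math.log(pmf(x, m) / pmf(x, k)) for x in range(terms))
```

The closed form follows from E[X] = m under Poisson(m) applied to log(p/q) = (k − m) + x·log(m/k).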

  5. The effects of variable practice on locomotor adaptation to a novel asymmetric gait.

    Science.gov (United States)

    Hinkel-Lipsker, Jacob W; Hahn, Michael E

    2017-09-01

    Very little is known about the effects of specific practice on motor learning of predictive balance control during novel bipedal gait. This information could provide an insight into how the direction and magnitude of predictive errors during acquisition of a novel gait task influence transfer of balance control, as well as yield a practice protocol for the restoration of balance for those with locomotor impairments. This study examined the effect of a variable practice paradigm on transfer of a novel asymmetric gait pattern in able-bodied individuals. Using a split-belt treadmill, one limb was driven at a constant velocity (constant limb) and the other underwent specific changes in velocity (variable limb) during practice according to one of three prescribed practice paradigms: serial, where the variable limb velocity increased linearly; random blocked, where variable limb underwent random belt velocity changes every 20 strides; and random practice, where the variable limb underwent random step-to-step changes in velocity. Random practice showed the highest balance control variability during acquisition compared to serial and random blocked practice which demonstrated the best transfer of balance control on one transfer test. Both random and random blocked practices showed significantly less balance control variability during a second transfer test compared to serial practice. These results indicate that random blocked practice may be best for generalizability of balance control while learning a novel gait, perhaps, indicating that individuals who underwent this practice paradigm were able to find the most optimal balance control solution during practice.

  6. Common characterization of variability and forecast errors of variable energy sources and their mitigation using reserves in power system integration studies

    Energy Technology Data Exchange (ETDEWEB)

    Menemenlis, N.; Huneault, M. [IREQ, Varennes, QC (Canada); Robitaille, A. [Dir. Plantif. de la Production Eolienne, Montreal, QC (Canada). HQ Production; Holttinen, H. [VTT Technical Research Centre of Finland, VTT (Finland)

    2012-07-01

    In this paper we define and characterize the two random variables, variability and forecast error, through which uncertainty in power system operations is characterized and mitigated. We show that the characterization of both variables can be carried out with the same mathematical tools. Furthermore, this common characterization of random variables lends itself to a common methodology for the calculation of the non-contingency reserves required to mitigate their effects. A parallel comparison of these two variables demonstrates similar inherent statistical properties: they depend on imminent conditions, evolve with time and can be asymmetric. Correlation is an important factor when aggregating individual wind farm characteristics to form the distribution of total wind generation for imminent conditions. (orig.)

  7. Soil variability in engineering applications

    Science.gov (United States)

    Vessia, Giovanna

    2014-05-01

    Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity of physical and mechanical properties, which can be measured by field and laboratory testing. Heterogeneity refers to different values of litho-technical parameters in similar lithological units placed close to each other. Variability, by contrast, is inherent to the formation and evolution processes experienced by each geological unit (a homogeneous geomaterial on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle and the cohesion, among others. These spatial variations must be handled by engineering models to accomplish reliable design of structures and infrastructures. Matheron (1962) introduced Geostatistics as the most comprehensive tool for managing the spatial correlation of parameter measurements, used in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) developed the first pioneering attempts to describe and manage the inherent variability of geomaterials, although Terzaghi (1943) had already highlighted that spatial fluctuations of the physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to Fractal Theory. In the same years, Vanmarcke (1983) proposed the Random Field Theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measurement fluctuations of physical parameters are interpreted through a spatial variability structure consisting of the correlation function and the scale of fluctuation. Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random
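
    In the spirit of the random field description above, a one-dimensional Gaussian field with an exponential autocorrelation structure can be simulated by factorizing the correlation matrix; the depth grid, scale of fluctuation, and soil-parameter statistics below are assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Depth grid and exponential autocorrelation, theta = scale of fluctuation
z = np.linspace(0.0, 10.0, 200)                  # depth in metres (illustrative)
theta = 2.0
C = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)

# Correlated standard-normal fluctuations via Cholesky factorization of C
L = np.linalg.cholesky(C + 1e-10 * np.eye(z.size))   # jitter for numerical stability
mean_trend, sd = 30.0, 3.0                       # e.g. a friction angle in degrees (assumed)
field = mean_trend + sd * (L @ rng.standard_normal(z.size))
```

Neighbouring depths are strongly correlated for separations well below theta, which is exactly the "fluctuation about a mean trend" structure the abstract describes.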

  8. Metabolic Profiling of Adiponectin Levels in Adults: Mendelian Randomization Analysis.

    Science.gov (United States)

    Borges, Maria Carolina; Barros, Aluísio J D; Ferreira, Diana L Santos; Casas, Juan Pablo; Horta, Bernardo Lessa; Kivimaki, Mika; Kumari, Meena; Menon, Usha; Gaunt, Tom R; Ben-Shlomo, Yoav; Freitas, Deise F; Oliveira, Isabel O; Gentry-Maharaj, Aleksandra; Fourkala, Evangelia; Lawlor, Debbie A; Hingorani, Aroon D

    2017-12-01

    Adiponectin, a circulating adipocyte-derived protein, has insulin-sensitizing, anti-inflammatory, antiatherogenic, and cardiomyocyte-protective properties in animal models. However, the systemic effects of adiponectin in humans are unknown. Our aims were to define the metabolic profile associated with higher blood adiponectin concentration and investigate whether variation in adiponectin concentration affects the systemic metabolic profile. We applied multivariable regression in ≤5909 adults and Mendelian randomization (using cis-acting genetic variants in the vicinity of the adiponectin gene as instrumental variables) for analyzing the causal effect of adiponectin in the metabolic profile of ≤37 545 adults. Participants were largely European from 6 longitudinal studies and 1 genome-wide association consortium. In the multivariable regression analyses, higher circulating adiponectin was associated with higher high-density lipoprotein lipids and lower very-low-density lipoprotein lipids, glucose levels, branched-chain amino acids, and inflammatory markers. However, these findings were not supported by Mendelian randomization analyses for most metabolites. Findings were consistent between sexes and after excluding high-risk groups (defined by age and occurrence of previous cardiovascular event) and 1 study with admixed population. Our findings indicate that blood adiponectin concentration is more likely to be an epiphenomenon in the context of metabolic disease than a key determinant. © 2017 The Authors.

  9. Protecting chips against hold time violations due to variability

    CERN Document Server

    Neuberger, Gustavo; Reis, Ricardo

    2013-01-01

    With the development of Very-Deep Sub-Micron technologies, process variability is becoming increasingly important and is a very important issue in the design of complex circuits. Process variability is the statistical variation of process parameters, meaning that these parameters do not always have the same value, but become random variables, with a given mean value and standard deviation. This effect can lead to several issues in digital circuit design. The logical consequence of this parameter variation is that circuit characteristics, such as delay and power, also become random variables. Becaus

  10. Determination of the origin of unknown irradiated nuclear fuel.

    Science.gov (United States)

    Nicolaou, G

    2006-01-01

    An isotopic fingerprinting method is presented to determine the origin of unknown nuclear material with forensic importance. Spent nuclear fuel of known origin has been considered as the 'unknown' nuclear material in order to demonstrate the method and verify its prediction capabilities. The method compares, using factor analysis, the measured U, Pu isotopic compositions of the 'unknown' material with U, Pu isotopic compositions simulating well known spent fuels from a range of commercial nuclear power stations. Then, the 'unknown' fuel has the same origin as the commercial fuel with which it exhibits the highest similarity in U, Pu compositions.

  11. Determination of the origin of unknown irradiated nuclear fuel

    International Nuclear Information System (INIS)

    Nicolaou, G.

    2006-01-01

    An isotopic fingerprinting method is presented to determine the origin of unknown nuclear material with forensic importance. Spent nuclear fuel of known origin has been considered as the 'unknown' nuclear material in order to demonstrate the method and verify its prediction capabilities. The method compares, using factor analysis, the measured U, Pu isotopic compositions of the 'unknown' material with U, Pu isotopic compositions simulating well known spent fuels from a range of commercial nuclear power stations. Then, the 'unknown' fuel has the same origin as the commercial fuel with which it exhibits the highest similarity in U, Pu compositions

  12. Function analysis of unknown genes

    DEFF Research Database (Denmark)

    Rogowska-Wrzesinska, A.

    2002-01-01

    This thesis, entitled "Function analysis of unknown genes", presents the use of proteome analysis for the characterisation of yeast (Saccharomyces cerevisiae) genes and their products (proteins, especially those of unknown function). This study illustrates that proteome analysis can be used to describe different aspects of the molecular biology of the cell, to study changes that occur in the cell due to overexpression or deletion of a gene, and to identify various protein modifications. The biological questions and the results of the described studies show the diversity of the information that can … genes and proteins. It reports the first global proteome database collecting 36 yeast single-gene deletion mutants and selecting over 650 differences between the analysed mutants and the wild-type strain. The obtained results show that two-dimensional gel electrophoresis and mass spectrometry based proteome

  13. Variable lung protective mechanical ventilation decreases incidence of postoperative delirium and cognitive dysfunction during open abdominal surgery.

    Science.gov (United States)

    Wang, Ruichun; Chen, Junping; Wu, Guorong

    2015-01-01

    Postoperative cognitive dysfunction (POCD) is a subtle impairment of cognitive abilities that can manifest in different neuropsychological features in the early postoperative period. It has been shown that the use of mechanical ventilation (MV) increases the development of delirium and POCD. However, the impact of variable versus conventional lung protective mechanical ventilation on the incidence of POCD remained unknown, which was the aim of this study. 162 patients scheduled to undergo elective gastrointestinal tumor resection via laparotomy in Ningbo No. 2 hospital, with an expected duration >2 h, from June 2013 to June 2015 were enrolled in this study. Included patients were allocated to two lung protective MV schemes by random block randomization: a variable ventilation group (VV group, n=79) and a conventional ventilation group (CV group, n=83). The plasma levels of inflammatory cytokines, the characteristics of the surgical procedure, and the incidence of delirium and POCD were collected and compared. Postoperative delirium was detected in 36 of 162 patients (22.2%); 12 of these (16.5%) belonged to the VV group, while 24 (28.9%) were in the CV group (P=0.036). POCD on the seventh postoperative day was more frequent in the CV group (26/83, 31.3%) than in the VV group (14/79, 17.7%), with a statistically significant difference (P=0.045). The levels of inflammatory cytokines were all significantly higher in the CV group than in the VV group on the 1st postoperative day (P…). …protective MV decreased the incidence of postoperative delirium and POCD by reducing the systemic proinflammatory response.

  14. A combinatorial and probabilistic study of initial and end heights of descents in samples of geometrically distributed random variables and in permutations

    Directory of Open Access Journals (Sweden)

    Helmut Prodinger

    2007-01-01

    In words generated by independent geometrically distributed random variables, we study the l-th descent, which is, roughly speaking, the l-th occurrence of a neighbouring pair ab with a > b. The value a is called the initial height, and b the end height. We study these two random variables (and some similar ones) by combinatorial and probabilistic tools. We find in all instances a generating function Ψ(v,u), where the coefficient of v^j u^i refers to the j-th descent (ascent), and i to the initial (end) height. From this, various conclusions can be drawn, in particular expected values. In the probabilistic part, a Markov chain model is used, which allows us to obtain explicit expressions for the heights of the second descent. In principle, one could go further, but the complexity of the results forbids it. This is extended to permutations of a large number of elements. Methods from q-analysis are used to simplify the expressions. This is the reason that we confine ourselves to the geometric distribution only. For general discrete distributions, no such tools are available.
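
    The descent statistics are easy to examine by simulation; the paper treats the l-th descent analytically, whereas this sketch only samples the first descent in words of geometric letters with q = 1/2 (word length and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(11)

def first_descent(word):
    """Return (initial, end) heights of the first pair (a, b) with a > b, else None."""
    for a, b in zip(word, word[1:]):
        if a > b:
            return a, b
    return None

# Letters with P(X = k) = (1 - q) * q**(k - 1), k >= 1; numpy's geometric matches
# this with success probability p = 1 - q.
q = 0.5
pairs = []
while len(pairs) < 20_000:
    d = first_descent(rng.geometric(1.0 - q, size=30))
    if d is not None:                      # skip the rare non-decreasing word
        pairs.append(d)

init_h = np.array([a for a, _ in pairs])   # initial heights a
end_h = np.array([b for _, b in pairs])    # end heights b
```

Empirical means of `init_h` and `end_h` can then be compared with the expected values read off the generating function Ψ(v,u).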

  15. Variably protease-sensitive prionopathy in the UK: a retrospective review 1991-2008

    NARCIS (Netherlands)

    Head, M.W.; Yull, H.M.; Ritchie, D.L.; Langeveld, J.P.M.; Fletcher, N.A.; Knight, R.S.; Ironside, J.W.

    2013-01-01

    Variably protease-sensitive prionopathy is a newly described human prion disease of unknown aetiology lying outwith the hitherto recognized phenotypic spectrum of Creutzfeldt-Jakob disease. Two cases that conform to the variably protease-sensitive prionopathy phenotype have been identified

  16. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p̂, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062

  17. 9 CFR 2.133 - Certification for random source dogs and cats.

    Science.gov (United States)

    2010-01-01

    ..., DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Miscellaneous § 2.133 Certification for random source... of birth or, if unknown, then the approximate age; (iv) The color and any distinctive markings; and...

  18. Precise lim sup behavior of probabilities of large deviations for sums of i.i.d. random variables

    Directory of Open Access Journals (Sweden)

    Andrew Rosalsky

    2004-12-01

    Full Text Available Let {X, Xn; n≥1} be a sequence of real-valued i.i.d. random variables and let Sn = ∑i=1..n Xi, n≥1. In this paper, we study the probabilities of large deviations of the form P(Sn > tn^{1/p}), P(Sn < −tn^{1/p}), and P(|Sn| > tn^{1/p}), where t > 0 and 0 < p < 2. It is shown, for example, that if lim x→∞ x^{1/p}/ϕ(x) = 1, then for every t > 0, limsup n→∞ P(|Sn| > tn^{1/p})/(nϕ(n)) = t^{pα}.

  19. Organizational Conditions for Dealing with The Unknown Unknown Illustrated by how a Dutch water management authority is preparing for climate change

    NARCIS (Netherlands)

    Termeer, Catrien J. A. M.; van den Brink, Margo A.

    2013-01-01

    The central question of this article is the extent to which organizations, governmental authorities in particular, are able to deal with the unknown unknown. Drawing on Weick's work on sensemaking, we introduce seven organizational conditions that can facilitate organizations to be reliable under

  20. Dynamic Output Feedback Control for Nonlinear Networked Control Systems with Random Packet Dropout and Random Delay

    Directory of Open Access Journals (Sweden)

    Shuiqing Yu

    2013-01-01

    Full Text Available This paper investigates the dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based upon the Lyapunov theory. The quantitative relationship of the dropout rate, transition probability matrix, and nonlinear level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.
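    The dropout/delay model in this record lends itself to a quick sanity check. The sketch below (an illustration only, not the paper's controller design) models packet dropout and packet delay as two independent Bernoulli random variables, as in the abstract, and compares a Monte Carlo estimate of the on-time arrival rate against the closed-form value; all names and parameter values are hypothetical.

    ```python
    import random

    def simulate_network(n_steps, p_drop, p_delay, seed=0):
        """Simulate a link where packet dropout and packet delay are two
        independent Bernoulli random variables; return the fraction of
        packets that arrive on time (neither dropped nor delayed)."""
        rng = random.Random(seed)
        on_time = 0
        for _ in range(n_steps):
            dropped = rng.random() < p_drop
            delayed = rng.random() < p_delay
            if not dropped and not delayed:
                on_time += 1
        return on_time / n_steps

    # Under independence, P(on time) = (1 - p_drop) * (1 - p_delay).
    p_drop, p_delay = 0.1, 0.2
    exact = (1 - p_drop) * (1 - p_delay)            # 0.72
    estimate = simulate_network(100_000, p_drop, p_delay)
    ```

    The independence assumption is what lets the two effects be analyzed separately in the LMI conditions of the paper.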

  1. Unknown foundation determination for scour.

    Science.gov (United States)

    2012-04-01

    Unknown foundations affect about 9,000 bridges in Texas. For bridges over rivers, this creates a problem regarding scour decisions, as the calculated scour depth cannot be compared to the foundation depth, and a very conservative, costly approach m...

  2. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune

    Abstract This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing...

  3. Outpatient endometrial aspiration: an alternative to methotrexate for pregnancy of unknown location.

    Science.gov (United States)

    Insogna, Iris G; Farland, Leslie V; Missmer, Stacey A; Ginsburg, Elizabeth S; Brady, Paula C

    2017-08-01

    Pregnancies of unknown location with abnormal beta-human chorionic gonadotropin trends are frequently treated as presumed ectopic pregnancies with methotrexate. Preliminary data suggest that outpatient endometrial aspiration may be an effective tool to diagnose pregnancy location, while also sparing women exposure to methotrexate. The purpose of this study was to evaluate the utility of an endometrial sampling protocol for the diagnosis of pregnancies of unknown location after in vitro fertilization. A retrospective cohort study of 14,505 autologous fresh and frozen in vitro fertilization cycles from October 2007 to September 2015 was performed; 110 patients were diagnosed with pregnancy of unknown location, defined as a positive beta-human chorionic gonadotropin without ultrasound evidence of intrauterine or ectopic pregnancy and an abnormal beta-human chorionic gonadotropin trend. Failed intrauterine pregnancy was diagnosed in 46 patients (42%), and ectopic pregnancy was diagnosed in 64 patients (58%). Clinical variables including fresh or frozen embryo transfer, day of embryo transfer, serum beta-human chorionic gonadotropin at the time of sampling, endometrial thickness, and presence of an adnexal mass were not significantly different between patients with failed intrauterine pregnancy and those with ectopic pregnancy. In patients with failed intrauterine pregnancy, 100% demonstrated adequate postsampling beta-human chorionic gonadotropin declines; villi were identified in just 46% (n=21 patients). Patients with failed intrauterine pregnancy had a significantly shorter time to resolution (negative serum beta-human chorionic gonadotropin) after sampling compared with patients with ectopic pregnancy (12.6 vs 26.3 days). With this protocol, patients with a failed intrauterine pregnancy of unknown location are spared methotrexate, with a shorter time to pregnancy resolution than those who receive methotrexate. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Organizational conditions for dealing with the unknown unknown : illustrated by how a Dutch water management authority is preparing for climate change

    NARCIS (Netherlands)

    Termeer, C.J.A.M.; Brink, van den M.A.

    2013-01-01

    The central question of this article is the extent to which organizations, governmental authorities in particular, are able to deal with the unknown unknown. Drawing on Weick’s work on sensemaking, we introduce seven organizational conditions that can facilitate organizations to be reliable under

  5. A hybrid search algorithm for swarm robots searching in an unknown environment.

    Science.gov (United States)

    Li, Shoutao; Li, Lina; Lee, Gordon; Zhang, Hao

    2014-01-01

    This paper proposes a novel method to improve the efficiency of a swarm of robots searching in an unknown environment. The approach focuses on the process of feeding and individual coordination characteristics inspired by the foraging behavior in nature. A predatory strategy was used for searching; hence, this hybrid approach integrated a random search technique with a dynamic particle swarm optimization (DPSO) search algorithm. If a search robot could not find any target information, it used a random search algorithm for a global search. If the robot found any target information in a region, the DPSO search algorithm was used for a local search. This particle swarm optimization search algorithm is dynamic, as all the parameters in the algorithm are refreshed synchronously through a communication mechanism until the robots find the target position, after which the robots fall back to a random searching mode. Thus, in this searching strategy, the robots alternated between two searching algorithms until the whole area was covered. During the searching process, the robots used a local communication mechanism to share map information and DPSO parameters to reduce the communication burden and overcome hardware limitations. If the search area is very large, search efficiency may be greatly reduced if only one robot searches the entire region, given the limited resources available and time constraints. In this research we divided the entire search area into several subregions, selected a target utility function to determine which subregion should be searched first, and thereby reduced the residence time of the target to improve search efficiency.

  6. ENSO modulation of interannual variability of dust aerosols over the northwest Indian ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Banerjee, P.; PrasannaKumar, S.

    Mineral dust is known to affect many aspects of the climate of the north Indian Ocean (IO). However, what controls its interannual variability over this region is largely unknown. The authors study the mechanism controlling the interannual variability...

  7. What variables are important in predicting bovine viral diarrhea virus? A random forest approach.

    Science.gov (United States)

    Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo

    2015-07-24

    Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of the disease-associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve equal to 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: a) who inseminates the animals, b) the number of neighboring farms that have cattle, and c) whether rectal palpation is performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.
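    Random forest variable importance is typically computed by permutation: scramble one feature column and measure the resulting drop in predictive accuracy. The following minimal sketch illustrates the idea with a deterministic toy classifier and a cyclic shift standing in for a random shuffle; it is not the authors' RF pipeline, and the model and data are hypothetical.

    ```python
    # Permutation-style variable importance, sketched with a toy model.
    def model(row):
        # Trivial classifier: predicts purely from feature 0 (threshold at 0.5).
        return 1 if row[0] > 0.5 else 0

    def accuracy(X, y):
        return sum(model(r) == t for r, t in zip(X, y)) / len(y)

    def permutation_importance(X, y, feature):
        """Importance = accuracy drop when one feature column is cyclically
        shifted (a deterministic permutation standing in for a shuffle)."""
        base = accuracy(X, y)
        shifted = [row[:] for row in X]
        col = [row[feature] for row in X]
        col = col[1:] + col[:1]            # rotate the column by one position
        for row, v in zip(shifted, col):
            row[feature] = v
        return base - accuracy(shifted, y)

    # Feature 0 determines the label; feature 1 is pure noise.
    X = [[0.0, 0.3], [1.0, 0.9], [0.0, 0.1], [1.0, 0.7]]
    y = [0, 1, 0, 1]
    imp0 = permutation_importance(X, y, 0)   # large: permuting breaks the signal
    imp1 = permutation_importance(X, y, 1)   # zero: the model ignores this feature
    ```

    In a real RF analysis the same drop-in-accuracy logic is averaged over trees and out-of-bag samples.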

  8. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...

  9. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, R.; Brincker, Rune

    1998-01-01

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...

  10. The variable stars of NGC 1866

    International Nuclear Information System (INIS)

    Welch, D.L.; Cote, P.; Fischer, P.; Mateo, M.; Madore, B.F.

    1991-01-01

    A search has been conducted for new variables in the LMC cluster NGC 1866 using new multiepoch CCD photometry. Eight previously unknown Cepheid variables, most near the cluster core, are found. Of the new variables reported by Storm et al. (1988), only six of 10 appear to be Cepheids, and one of these is not a member. Periods and mean magnitudes and colors for sufficiently uncrowded variables are reported, as are one red giant variable of long period and one Cepheid that is a single-lined spectroscopic binary with a velocity semiamplitude ≥ 10.5 km/s. The variation of light-curve amplitude with position in the instability strip is reported, along with an apparently nonvariable star in the strip that is a radial velocity member. A true distance modulus of 18.57 ± 0.01 mag is obtained for the cluster. 36 refs
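    The quoted true distance modulus converts to a physical distance through the standard relation μ = 5 log₁₀(d / 10 pc). A quick check (illustrative only) that μ = 18.57 mag places the cluster at roughly the distance of the LMC:

    ```python
    def distance_from_modulus(mu):
        """Invert the distance modulus relation mu = 5*log10(d / 10 pc),
        returning d in parsecs."""
        return 10 ** (mu / 5 + 1)

    d_pc = distance_from_modulus(18.57)   # distance implied by mu = 18.57 mag
    d_kpc = d_pc / 1000.0                 # about 52 kpc, consistent with the LMC
    ```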

  11. Nasal Jet-CPAP (variable flow) versus Bubble-CPAP in preterm infants with respiratory distress: an open label, randomized controlled trial.

    Science.gov (United States)

    Bhatti, A; Khan, J; Murki, S; Sundaram, V; Saini, S S; Kumar, P

    2015-11-01

    To compare failure rates between a Jet continuous positive airway pressure device (J-CPAP, variable flow) and a Bubble continuous positive airway pressure device (B-CPAP) in preterm infants with respiratory distress. Preterm newborns with respiratory distress were randomized to receive J-CPAP (a variable flow device) or B-CPAP (a continuous flow device). A standardized protocol was followed for titration, weaning and removal of CPAP. Pressure was monitored close to the nares in both devices every 6 hours and settings were adjusted to provide the desired CPAP. The primary outcome was the CPAP failure rate within 72 h of life. Secondary outcomes were CPAP failure within 7 days of life, need for surfactant post-randomization, time to CPAP failure, duration of CPAP and complications of prematurity. An intention-to-treat analysis was done. One hundred seventy neonates were randomized, 80 to J-CPAP and 90 to B-CPAP. CPAP failure rates within 72 h were similar in infants who received J-CPAP and in those who received B-CPAP (29 versus 21%; relative risk 1.4 (0.8 to 2.3), P=0.25). Mean (95% confidence interval) time to CPAP failure was 59 h (54 to 64) in the Jet CPAP group in comparison with 65 h (62 to 68) in the Bubble CPAP group (log rank P=0.19). All other secondary outcomes were similar between the two groups. In preterm infants with respiratory distress starting within 6 h of life, CPAP failure rates were similar with Jet CPAP and Bubble CPAP.

  12. Random phenomena; Phenomenes aleatoires

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)

    1963-07-01

    This document gathers a set of lectures presented in 1962. The first proposes a mathematical introduction to the analysis of random phenomena. The second presents an axiomatic approach to probability calculus. The third gives an overview of one-dimensional random variables. The fourth addresses random pairs and presents basic theorems regarding the algebra of mathematical expectations. The fifth lecture discusses some probability laws: the binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last deals with the issues of stochastic convergence and asymptotic distributions.
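    The three probability laws named in the fifth lecture can be evaluated directly; a minimal sketch (function names are our own):

    ```python
    import math

    def binomial_pmf(k, n, p):
        """P(X = k) for X ~ Binomial(n, p)."""
        return math.comb(n, k) * p**k * (1 - p) ** (n - k)

    def poisson_pmf(k, lam):
        """P(X = k) for X ~ Poisson(lam)."""
        return math.exp(-lam) * lam**k / math.factorial(k)

    def gauss_pdf(x, mu=0.0, sigma=1.0):
        """Density of the Laplace-Gauss (normal) distribution."""
        return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

    p_binom = binomial_pmf(2, 4, 0.5)   # C(4,2)/16 = 0.375
    p_pois = poisson_pmf(0, 1.0)        # e^{-1}
    p_gauss = gauss_pdf(0.0)            # 1/sqrt(2*pi)
    ```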

  13. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, however y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function and y a distributed, space and/or time varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
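    The reconstruction described above can be sketched in its simplest setting: fit a polynomial by least squares to noise-free evaluations at random points, in which case the polynomial is recovered exactly. This is a toy illustration via the normal equations, not the authors' adaptive algorithm; the target polynomial and all names are hypothetical.

    ```python
    import random

    def fit_polynomial(xs, ys, degree):
        """Least-squares polynomial fit via the normal equations A^T A c = A^T y,
        solved with Gaussian elimination (fine for tiny systems like this)."""
        m = degree + 1
        A = [[x**j for j in range(m)] for x in xs]
        M = [[sum(A[i][r] * A[i][c] for i in range(len(xs))) for c in range(m)]
             for r in range(m)]
        b = [sum(A[i][r] * ys[i] for i in range(len(xs))) for r in range(m)]
        for col in range(m):                       # forward elimination with pivoting
            piv = max(range(col, m), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, m):
                f = M[r][col] / M[col][col]
                for c in range(col, m):
                    M[r][c] -= f * M[col][c]
                b[r] -= f * b[col]
        coeffs = [0.0] * m                         # back substitution
        for r in range(m - 1, -1, -1):
            s = b[r] - sum(M[r][c] * coeffs[c] for c in range(r + 1, m))
            coeffs[r] = s / M[r][r]
        return coeffs

    # Noise-free evaluations of u(y) = 1 + 2y + 3y^2 at random points
    # recover the polynomial exactly (the metamodel case described above).
    rng = random.Random(42)
    xs = [rng.uniform(-1, 1) for _ in range(20)]
    ys = [1 + 2 * x + 3 * x**2 for x in xs]
    coeffs = fit_polynomial(xs, ys, degree=2)
    ```

    In the noisy or high-dimensional case, the interesting questions are precisely the stability and convergence estimates the abstract refers to.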

  14. An MGF-based unified framework to determine the joint statistics of partial sums of ordered random variables

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs). With the proposed approach, we can systematically derive the joint statistics of any partial sums of ordered statistics, in terms of the moment generating function (MGF) and the probability density function (PDF). Our MGF-based approach applies not only when all the K ordered RVs are involved but also when only the Ks (Ks < K) best RVs are considered. In addition, we present closed-form expressions for the exponential RV special case. These results apply to the performance analysis of various wireless communication systems over fading channels. © 2006 IEEE.
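    For the exponential special case mentioned above, classical closed forms exist: by the memoryless property, the k-th smallest of K i.i.d. Exp(1) variables has mean Σ_{i=0}^{k-1} 1/(K−i). The sketch below (our own illustration, not the paper's MGF framework) verifies two consistency checks: the partial sum over all K ordered RVs has mean K·E[X] = K, and the maximum has mean equal to the harmonic number H_K.

    ```python
    from fractions import Fraction

    def mean_order_statistic(k, K):
        """Exact mean of the k-th smallest of K i.i.d. Exp(1) variables:
        E[X_(k)] = sum_{i=0}^{k-1} 1/(K - i), via the memoryless property."""
        return sum(Fraction(1, K - i) for i in range(k))

    K = 5
    means = [mean_order_statistic(k, K) for k in range(1, K + 1)]
    total = sum(means)       # partial sum over ALL ordered RVs: K * E[X] = K
    largest = means[-1]      # mean of the maximum: harmonic number H_K
    ```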

  15. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we...... review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...
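    Decompounding inverts the forward map from the summand distribution to the random-sum distribution. That forward map can be sketched with the Panjer recursion for a compound Poisson sum; this is an illustration under assumed integer-valued summands, not the nonparametric estimator of the paper.

    ```python
    import math

    def compound_poisson_pmf(lam, severity_pmf, n_max):
        """Panjer recursion for the pmf of S = X_1 + ... + X_N, where
        N ~ Poisson(lam) and X_i are i.i.d. on {1, 2, ...} with pmf
        severity_pmf. Returns g with g[k] = P(S = k), k = 0..n_max."""
        g = [0.0] * (n_max + 1)
        g[0] = math.exp(-lam)              # P(N = 0), since summands are >= 1
        for k in range(1, n_max + 1):
            g[k] = (lam / k) * sum(
                j * severity_pmf.get(j, 0.0) * g[k - j] for j in range(1, k + 1)
            )
        return g

    lam = 2.0
    severity = {1: 0.5, 2: 0.5}            # each summand is 1 or 2, equally likely
    g = compound_poisson_pmf(lam, severity, n_max=60)
    mean_S = sum(k * p for k, p in enumerate(g))   # Wald: E[S] = lam * E[X] = 3
    ```

    The inverse problem treated in the paper is to recover `severity` from observations of S.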

  16. Some results on convergence rates for probabilities of moderate deviations for sums of random variables

    Directory of Open Access Journals (Sweden)

    Deli Li

    1992-01-01

    Full Text Available Let X, Xn, n≥1 be a sequence of i.i.d. real random variables, and Sn = ∑k=1..n Xk, n≥1. Convergence rates of moderate deviations are derived, i.e., the rate of convergence to zero of certain tail probabilities of the partial sums is determined. For example, we obtain equivalent conditions for the convergence of the series ∑n≥1 (ψ²(n)/n) P(|Sn| ≥ √n φ(n)) only under the assumptions that EX=0 and EX²=1, where φ and ψ are taken from a broad class of functions. These results generalize and improve some recent results of Li (1991) and Gafurov (1982) and some previous work of Davis (1968). For b∈[0,1] and ϵ>0, let λϵ,b = ∑n≥3 ((log log n)^b / n) I(|Sn| ≥ √((2+ϵ) n log log n)). The behaviour of Eλϵ,b as ϵ↓0 is also studied.

  17. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
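    The transformation bias described here can be seen without any simulation by taking the exact expectation of the transformed estimate over the binomial pmf. This is a plain-binomial illustration with the usual 0.5 continuity correction, not the overdispersed model of the paper; all names are our own.

    ```python
    import math

    def logit(p):
        return math.log(p / (1 - p))

    def expected_logit(n, p):
        """Exact E[logit((X + 0.5) / (n + 1))] for X ~ Binomial(n, p),
        using a 0.5 continuity correction so X = 0 and X = n are finite."""
        total = 0.0
        for k in range(n + 1):
            pmf = math.comb(n, k) * p**k * (1 - p) ** (n - k)
            total += pmf * logit((k + 0.5) / (n + 1))
        return total

    n, p = 20, 0.1
    bias = expected_logit(n, p) - logit(p)   # nonzero: the log-odds transform is biased
    ```

    Because logit is nonlinear, E[logit(p̂)] ≠ logit(p) even in this idealized setting; overdispersion makes the effect larger, which is the paper's point.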

  18. A study of probabilistic fatigue crack propagation models in Mg Al Zn alloys under different specimen thickness conditions by using the residual of a random variable

    International Nuclear Information System (INIS)

    Choi, Seon Soon

    2012-01-01

    The primary aim of this paper was to evaluate several probabilistic fatigue crack propagation models using the residual of a random variable, and to present the model fit for probabilistic fatigue behavior in Mg Al Zn alloys. The proposed probabilistic models are the probabilistic Paris Erdogan model, probabilistic Walker model, probabilistic Forman model, and probabilistic modified Forman models. These models were prepared by applying a random variable to the empirical fatigue crack propagation models of the same names. The best models for describing fatigue crack propagation behavior in Mg Al Zn alloys were generally the probabilistic Paris Erdogan and probabilistic Walker models. The probabilistic Forman model was a good model only for a specimen with a thickness of 9.45 mm

  19. Variational Bayesian labeled multi-Bernoulli filter with unknown sensor noise statistics

    Directory of Open Access Journals (Sweden)

    Qiu Hao

    2016-10-01

    Full Text Available It is difficult to build an accurate model for measurement noise covariance in complex backgrounds. For scenarios with unknown sensor noise variances, an adaptive multi-target tracking algorithm based on labeled random finite sets and variational Bayesian (VB) approximation is proposed. The variational approximation technique is introduced into the labeled multi-Bernoulli (LMB) filter to jointly estimate the states of targets and the sensor noise variances. Simulation results show that the proposed method gives unbiased estimation of cardinality and has better performance than the VB probability hypothesis density (VB-PHD) filter and the VB cardinality balanced multi-target multi-Bernoulli (VB-CBMeMBer) filter in harsh situations. The simulations also confirm the robustness of the proposed method against time-varying noise variances. The computational complexity of the proposed method is higher than that of the VB-PHD and VB-CBMeMBer filters in extreme cases, while the mean execution times of the three methods are close when targets are well separated.

  20. Ratio index variables or ANCOVA? Fisher's cats revisited.

    Science.gov (United States)

    Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S

    2010-01-01

    Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
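    Fisher's ratio-variable pitfall can be reproduced on a tiny fixed dataset: two exactly uncorrelated numerators become correlated once each is divided by a shared denominator. The numbers below are hypothetical and chosen only to make the effect visible.

    ```python
    import math

    def pearson(a, b):
        """Pearson correlation of two equal-length sequences."""
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        return cov / math.sqrt(va * vb)

    # X and Z are exactly uncorrelated; W is a shared denominator.
    X = [1.0, -1.0, 1.0, -1.0]
    Z = [1.0, 1.0, -1.0, -1.0]
    W = [1.0, 2.0, 3.0, 4.0]
    r_raw = pearson(X, Z)                                  # 0 by construction
    r_ratio = pearson([x / w for x, w in zip(X, W)],
                      [z / w for z, w in zip(Z, W)])       # spuriously nonzero
    ```

    The shared, imperfectly correlated denominator manufactures an association where none exists, which is exactly why the article urges caution with ratio variables.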

  1. The Initial Regression Statistical Characteristics of Intervals Between Zeros of Random Processes

    Directory of Open Access Journals (Sweden)

    V. K. Hohlov

    2014-01-01

    Full Text Available The article substantiates the initial regression statistical characteristics of intervals between zeros of realizations of random processes and studies their properties, allowing the use of these features in autonomous information systems (AIS) of near location (NL). Coefficients of the initial regression (CIR) that minimize the residual sum of squares of multiple initial regression views are justified on the basis of vector representations associated with a random-vector notion of the analyzed signal parameters. It is shown that even with no covariance-based private CIR it is possible to predict one random variable through another with respect to the deterministic components. The paper studies the dependence of the CIR of interval sizes between zeros of a narrowband, wide-sense-stationary random process on its energy spectrum. Particular CIR for random processes with Gaussian and rectangular energy spectra are obtained. It is shown that the considered CIRs do not depend on the average frequency of the spectra, are determined by the relative bandwidth of the energy spectra, and depend weakly on the type of spectrum. CIR properties enable their use as an informative parameter when implementing temporal regression methods of signal processing, invariant to the average rate and variance of the input realizations. We consider estimates of the average energy spectrum frequency of a stationary random process obtained by calculating the length of the time interval corresponding to a specified number of intervals between zeros. It is shown that the relative variance in estimation of the average energy spectrum frequency of a stationary random process with increasing relative bandwidth ceases to depend on the process realization when processing more than ten intervals between zeros. The obtained results can be used in the AIS NL to solve the tasks of detection and signal recognition, when a decision is made under conditions of unknown mathematical expectations on a limited observation

  2. Paclitaxel/carboplatin with or without belinostat as empiric first-line treatment for patients with carcinoma of unknown primary site

    DEFF Research Database (Denmark)

    Hainsworth, John D; Daugaard, Gedske; Lesimple, Thierry

    2015-01-01

    BACKGROUND: The objective of this study was to evaluate the efficacy of belinostat, a histone deacetylase inhibitor, when added to paclitaxel/carboplatin in the empiric first-line treatment of patients with carcinoma of unknown primary site (CUP). METHODS: In this randomized phase 2 trial, previously untreated patients with CUP were randomized to receive belinostat plus paclitaxel/carboplatin (group A) or paclitaxel/carboplatin alone (group B) repeated every 21 days. Patients were re-evaluated every 2 cycles, and those without disease progression continued treatment for 6 cycles. CONCLUSIONS: The addition of belinostat to paclitaxel/carboplatin did not improve the PFS of patients with CUP who were receiving first-line therapy, although the patients who received belinostat had a higher investigator-assessed response rate. Future trials in CUP should focus on specific subsets, defined either...

  3. Unit-specific calibration of Actigraph accelerometers in a mechanical setup - is it worth the effort? The effect on random output variation caused by technical inter-instrument variability in the laboratory and in the field

    DEFF Research Database (Denmark)

    Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L

    2008-01-01

    BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefore...

  4. Individualized Anemia Management Reduces Hemoglobin Variability in Hemodialysis Patients

    OpenAIRE

    Gaweda, Adam E.; Aronoff, George R.; Jacobs, Alfred A.; Rai, Shesh N.; Brier, Michael E.

    2013-01-01

    One-size-fits-all protocol-based approaches to anemia management with erythropoiesis-stimulating agents (ESAs) may result in undesired patterns of hemoglobin variability. In this single-center, double-blind, randomized controlled trial, we tested the hypothesis that individualized dosing of ESA improves hemoglobin variability over a standard population-based approach. We enrolled 62 hemodialysis patients and followed them over a 12-month period. Patients were randomly assigned to receive ESA ...

  5. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

    We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q1, q2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q1 > 1 and q2 > 1.

  6. Classification of Unknown Thermocouple Types Using Similarity Factor Measurement

    Directory of Open Access Journals (Sweden)

    Seshu K. DAMARLA

    2011-01-01

    Full Text Available In contrast to classification using the PCA method, a new methodology is proposed for type identification of an unknown thermocouple. The new methodology is based on calculating the degree of similarity between two multivariate datasets using two types of similarity factors. One similarity factor is based on principal component analysis and the angles between the principal component subspaces, while the other is based on the Mahalanobis distance between the datasets. Datasets containing thermo-emfs against given temperature ranges, formed by experimentation for each type of thermocouple (e.g. J, K, S, T, R, E, B and N types), are considered as reference datasets. Datasets corresponding to the unknown type are captured. Similarity factors between the datasets, one being of the unknown type and the other being of each known type, are compared. The class of the unknown type is then allocated to the known type with the maximum similarity factor.

  7. A Review on asymptotic normality of sums of associated random ...

    African Journals Online (AJOL)

Association between random variables is a generalization of independence of these random variables. This concept is increasingly used across research fields in Statistics. In this paper, we give a simple, clear and rigorous introduction to it. We will present the fundamental asymptotic ...

  8. The Gifted and the Shadow of the Night: Dabrowski's Overexcitabilities and Their Correlation to Insomnia, Death Anxiety, and Fear of the Unknown

    Science.gov (United States)

    Harrison, Gregory E.; Van Haneghan, James P.

    2011-01-01

    Purportedly fear of the unknown, death anxiety, and insomnia are prevalent problems among some gifted individuals. The present study tested this assertion and examined the relationship of these variables to Dabrowski's (1967) overexcitabilities. The study involved 73 gifted and 143 typical middle and high school adolescents who were given a death…

  9. Secure self-calibrating quantum random-bit generator

    International Nuclear Information System (INIS)

    Fiorentino, M.; Santori, C.; Spillane, S. M.; Beausoleil, R. G.; Munro, W. J.

    2007-01-01

Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
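The min-entropy the authors bound has a simple empirical counterpart. The helper below is our own sketch, not the paper's tomographic estimator: it computes H_min = -log2(max_x p(x)) from an observed symbol sequence, the quantity a randomness extractor is sized against.

```python
import math
from collections import Counter

def min_entropy_per_symbol(seq):
    """Empirical min-entropy H_min = -log2(max_x p(x)): the worst-case
    unpredictability per symbol of the observed sequence."""
    counts = Counter(seq)
    p_max = max(counts.values()) / len(seq)
    return -math.log2(p_max)
```

An unbiased bit stream gives 1 bit per symbol, a constant stream gives 0, and a 75/25 biased stream gives -log2(0.75), about 0.415 bits, which is why biased hardware sources must be post-processed before cryptographic use.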

  10. A simulation-based goodness-of-fit test for random effects in generalized linear mixed models

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus

    2006-01-01

The goodness-of-fit of the distribution of random effects in a generalized linear mixed model is assessed using a conditional simulation of the random effects conditional on the observations. Provided that the specified joint model for random effects and observations is correct, the marginal distribution of the simulated random effects coincides with the assumed random effects distribution. In practice, the specified model depends on some unknown parameter which is replaced by an estimate. We obtain a correction for this by deriving the asymptotic distribution of the empirical distribution...

  12. Chronic kidney disease of unknown aetiology in Sri Lanka: is cadmium a likely cause?

    Directory of Open Access Journals (Sweden)

    Peiris-John Roshini J

    2011-07-01

Full Text Available Abstract Background The rising prevalence of chronic kidney disease (CKD) and subsequent end stage renal failure necessitating renal replacement therapy has profound consequences for affected individuals and health care resources. This community based study was conducted to identify potential predictors of microalbuminuria in a randomly selected sample of adults from the North Central Province (NCP) of Sri Lanka, where the burden of CKD is pronounced and the underlying cause still unknown. Methods Exposures to possible risk factors were determined in randomly recruited subjects (425 females and 461 males) from selected areas of the NCP of Sri Lanka using an interviewer administered questionnaire. Sulphosalicylic acid and the Light Dependent Resister microalbumin gel filtration method was used for initial screening for microalbuminuria and reconfirmed by the Micral strip test. Results Microalbuminuria was detected in 6.1% of the females and 8.5% of the males. Smoking (p Conclusions Hypertension, diabetes mellitus, UTI, and smoking are known risk factors for microalbuminuria. The association between microalbuminuria and consumption of well water suggests an environmental aetiology to CKD in NCP. The causative agent is yet to be identified. Investigations for cadmium as a potential causative agent need to be initiated.

  13. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  14. On synchronisation of a class of complex chaotic systems with complex unknown parameters via integral sliding mode control

    Science.gov (United States)

    Tirandaz, Hamed; Karami-Mollaee, Ali

    2018-06-01

Chaotic systems demonstrate complex behaviour in their state variables and their parameters, which generates challenges and consequences. This paper presents a new synchronisation scheme based on the integral sliding mode control (ISMC) method for a class of complex chaotic systems with complex unknown parameters. Synchronisation between corresponding states of a class of complex chaotic systems and also convergence of the errors of the system parameters to zero are studied. The designed feedback control vector and complex unknown parameter vector are analytically achieved based on Lyapunov stability theory. Moreover, the effectiveness of the proposed methodology is verified by synchronisation of the Chen complex system and the Lorenz complex system as the leader and the follower chaotic systems, respectively. In conclusion, some numerical simulations related to the synchronisation methodology are given to illustrate the effectiveness of the theoretical discussions.

  15. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
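The estimation problem described here can be mocked up numerically. The sketch below is an illustration with invented parameters and an assumed detection limit `L`: a lognormal with a point mass at zero is simulated, left-censored, and fitted by direct numerical maximum likelihood (via `scipy.optimize.minimize`) rather than by the report's closed-form route through the truncated distribution.

```python
import numpy as np
from math import erf, sqrt
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulated monitoring data: a true zero with probability pi0, otherwise a
# lognormal concentration; values below the detection limit L are only
# reported as "censored" (all parameter values here are invented).
pi0, mu, sigma, L, n = 0.2, 0.0, 1.0, 0.3, 5000
is_zero = rng.random(n) < pi0
y = np.where(is_zero, 0.0, rng.lognormal(mu, sigma, n))
detected = y[y >= L]
n_cens = n - detected.size
log_det = np.log(detected)

Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF

def negloglik(theta):
    m, log_s, logit_p = theta                      # unconstrained parameters
    s, p = np.exp(log_s), 1.0 / (1.0 + np.exp(-logit_p))
    # A censored observation is either a true zero or a lognormal value < L.
    cens = n_cens * np.log(p + (1 - p) * Phi((np.log(L) - m) / s))
    # A detected observation contributes P(not zero) times the lognormal pdf.
    det = (np.log(1 - p) - np.log(detected * s * np.sqrt(2 * np.pi))
           - (log_det - m) ** 2 / (2 * s**2)).sum()
    return -(cens + det)

res = minimize(negloglik, x0=[0.5, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 2000})
mu_hat = res.x[0]
sigma_hat = float(np.exp(res.x[1]))
pi_hat = float(1.0 / (1.0 + np.exp(-res.x[2])))
```

With 5000 observations the fit recovers the generating values closely, illustrating that the zero proportion and the lognormal parameters are jointly identifiable from the detected tail plus the censored count.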

  16. Fault diagnosis of an intelligent hydraulic pump based on a nonlinear unknown input observer

    Directory of Open Access Journals (Sweden)

    Zhonghai MA

    2018-02-01

Full Text Available Hydraulic piston pumps are commonly used in aircraft. In order to improve the viability of aircraft and energy efficiency, intelligent variable pressure pump systems have been used in aircraft hydraulic systems increasingly widely. Efficient fault diagnosis plays an important role in improving the reliability and performance of hydraulic systems. In this paper, a fault diagnosis method for an intelligent hydraulic pump system (IHPS) based on a nonlinear unknown input observer (NUIO) is proposed. Unlike a full-order Luenberger-type unknown input observer, the NUIO takes the nonlinear factors of the IHPS into account. Firstly, a new type of intelligent pump is presented, and its mathematical model is established to describe the IHPS. Taking into account the real-time requirements of the IHPS and the special structure of the pump, the mechanism of the intelligent pump and its failure modes are analyzed and two typical failure modes are identified. Furthermore, a NUIO for the IHPS is designed based on the output pressure and swashplate angle signals. With the residual error signals produced by the NUIO, failures occurring in the intelligent pump can be detected online in real time. Lastly, through analysis and simulation, it is confirmed that this diagnostic method can accurately diagnose and isolate those typical failure modes of the nonlinear IHPS. The method proposed in this paper is of great significance in improving the reliability of the IHPS. Keywords: Fault diagnosis, Hydraulic piston pump, Model-based, Nonlinear unknown input observer (NUIO), Residual error

  17. A Model for Positively Correlated Count Variables

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    2010-01-01

An α-permanental random field is, briefly speaking, a model for a collection of non-negative integer valued random variables with positive associations. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their potential applications. The purpose of this paper is to summarize useful probabilistic results, study stochastic constructions and simulation techniques, and discuss some examples of α-permanental random fields. This should provide a useful basis for discussing the statistical aspects in future work.

  18. [Comment on “Unknowns about climate variability render treaty targets premature”] Time to act is now

    Science.gov (United States)

    Dickinson, Robert E.

    While I can agree with several of Singer's points, I think his discussion distorts and confuses by ignoring the more important questions to be asked. What the United States can or should do about the buildup of greenhouse gases is much more of an ethical, moral, and economic issue than one that can be answered by science alone, and thus, is rightly being decided by political processes rather than by scientific committees. We do know much more about the question of climate change from greenhouse gases than we did 20 years ago when the issue first became of major concern. Indeed, it would take thousands of pages to put down in full all the details of what we now know; and such a description would also require hundreds of pages to say what we still don't know. The past Intergovernmental Panel on Climate Change (IPCC) reports have been carefully crafted, albeit heavily abbreviated, summaries of our current scientific understanding. It is fairly certain that in another 20 years our scientific understanding will be yet much more improved, but there will also still be many important unknowns.

  19. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  20. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

    Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purpose, a randomly censored real data set is discussed.
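For the randomly censored geometric model, the maximum likelihood estimator has a closed form: with d uncensored failures and observed times x_i, the likelihood is p^d (1-p)^(sum x_i - d), so p-hat = d / sum(x_i). A small simulation, with our own parameter choices, checks this:

```python
import numpy as np

rng = np.random.default_rng(2)
p_true, n = 0.3, 20000

# Lifetimes T ~ Geometric(p) on {1, 2, ...}; independent random censoring C.
T = rng.geometric(p_true, n)
C = rng.geometric(0.15, n)            # censoring distribution (arbitrary)
X = np.minimum(T, C)                  # observed time
delta = (T <= C).astype(int)          # 1 = failure observed, 0 = censored

# An uncensored x contributes p(1-p)^(x-1); a censored x contributes the
# survival probability (1-p)^x.  Hence L = p^d (1-p)^(sum(X)-d), which is
# maximised at p_hat = d / sum(X).
d = delta.sum()
p_hat = d / X.sum()
```

Setting the score of log L to zero gives d/p = (sum(X)-d)/(1-p), from which the closed form follows directly.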

  1. Classification and prediction of port variables

    Energy Technology Data Exchange (ETDEWEB)

    Molina Serrano, B.

    2016-07-01

Many variables are included in the planning and management of port terminals. They can be economic, social, environmental and institutional. The agent needs to know the relationships between these variables in order to modify planning conditions. The use of Bayesian networks allows these variables to be classified, predicted and diagnosed. Bayesian networks allow the posterior probability of unknown variables to be estimated from known variables. At the planning level, this means that it is not necessary to know all variables, because their relationships are known. The agent can obtain useful information about how port variables are connected, which can be interpreted as cause-effect relationships. Bayesian networks can also be used to make optimal decisions by introducing possible actions and the utility of their results. In the proposed methodology, a database has been generated with more than 40 port variables. They have been classified into economic, social, environmental and institutional variables, in the same way as the smart port studies in the Spanish Port System. From this database, a network has been generated using a directed acyclic graph which reveals the relationships between port variables (parent-child relationships). The obtained network shows that, in cause-effect terms, economic variables are the cause of the other variable typologies; economic variables play the parent role in most cases. Moreover, when environmental variables are known, the obtained network allows the posterior probability of social variables to be estimated. It is concluded that Bayesian networks allow uncertainty to be modelled in a probabilistic way, even when the number of variables is high, as occurs in the planning and management of port terminals. (Author)
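The kind of inference the abstract describes, estimating the posterior probability of unknown variables from known ones, can be illustrated with a toy network. The chain structure below (economic to environmental to social) mirrors the finding that economic variables act as parents; the binary variables and all conditional probabilities are invented for illustration.

```python
from itertools import product

# Hypothetical three-variable network: Economic -> Environmental -> Social.
# All probabilities are illustrative, not from the study.
P_econ = {True: 0.6, False: 0.4}
P_env = {True: 0.7, False: 0.2}       # P(env=True | econ)
P_soc = {True: 0.8, False: 0.3}       # P(soc=True | env)

def joint(econ, env, soc):
    p = P_econ[econ]
    p *= P_env[econ] if env else 1 - P_env[econ]
    p *= P_soc[env] if soc else 1 - P_soc[env]
    return p

def posterior(query, evidence):
    """P(query=True | evidence) by brute-force enumeration of the joint."""
    num = den = 0.0
    for e, v, s in product((True, False), repeat=3):
        world = {"econ": e, "env": v, "soc": s}
        if all(world[k] == val for k, val in evidence.items()):
            p = joint(e, v, s)
            den += p
            if world[query]:
                num += p
    return num / den
```

For example, observing the environmental variable updates belief about both its child (`posterior("soc", {"env": True})`) and its parent (`posterior("econ", {"env": True})`), which is exactly the diagnostic use the abstract describes.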

  2. Exploring multicollinearity using a random matrix theory approach.

    Science.gov (United States)

    Feher, Kristen; Whelan, James; Müller, Samuel

    2012-01-01

Clustering of gene expression data is often done with the latent aim of dimension reduction, by finding groups of genes that have a common response to potentially unknown stimuli. However, what is poorly understood to date is the behaviour of a low dimensional signal embedded in high dimensions. This paper introduces a multicollinear model which is based on random matrix theory results, and shows potential for the characterisation of a gene cluster's correlation matrix. This model projects a one dimensional signal into many dimensions and is based on the spiked covariance model, but rather characterises the behaviour of the corresponding correlation matrix. The eigenspectrum of the correlation matrix is empirically examined by simulation, under the addition of noise to the original signal. The simulation results are then used to propose a dimension estimation procedure of clusters from data. Moreover, the simulation results warn against considering pairwise correlations in isolation, as the model provides a mechanism whereby a pair of genes with 'low' correlation may simply be due to the interaction of high dimension and noise. Instead, collective information about all the variables is given by the eigenspectrum.
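The model's construction, a one-dimensional signal projected into p dimensions plus noise, is straightforward to simulate. The sketch below uses our own sizes and noise level: it builds such a "cluster", forms its correlation matrix, and exhibits the spiked eigenspectrum, one large eigenvalue carrying the signal above a bulk of small ones.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, noise = 500, 40, 0.5

# One-dimensional latent signal projected into p dimensions plus noise
# (a spiked-covariance-style construction; parameters are illustrative).
signal = rng.standard_normal((n, 1))
loadings = rng.standard_normal((1, p))
X = signal @ loadings + noise * rng.standard_normal((n, p))

R = np.corrcoef(X, rowvar=False)
eigs = np.sort(np.linalg.eigvalsh(R))[::-1]
```

Variables with small loadings can show weak pairwise correlations even though they belong to the cluster, which is the paper's warning: the eigenspectrum, not isolated pairwise correlations, reveals the shared one-dimensional signal.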

  3. Variability in research ethics review of cluster randomized trials: a scenario-based survey in three countries

    Science.gov (United States)

    2014-01-01

    Background Cluster randomized trials (CRTs) present unique ethical challenges. In the absence of a uniform standard for their ethical design and conduct, problems such as variability in procedures and requirements by different research ethics committees will persist. We aimed to assess the need for ethics guidelines for CRTs among research ethics chairs internationally, investigate variability in procedures for research ethics review of CRTs within and among countries, and elicit research ethics chairs’ perspectives on specific ethical issues in CRTs, including the identification of research subjects. The proper identification of research subjects is a necessary requirement in the research ethics review process, to help ensure, on the one hand, that subjects are protected from harm and exploitation, and on the other, that reviews of CRTs are completed efficiently. Methods A web-based survey with closed- and open-ended questions was administered to research ethics chairs in Canada, the United States, and the United Kingdom. The survey presented three scenarios of CRTs involving cluster-level, professional-level, and individual-level interventions. For each scenario, a series of questions was posed with respect to the type of review required (full, expedited, or no review) and the identification of research subjects at cluster and individual levels. Results A total of 189 (35%) of 542 chairs responded. Overall, 144 (84%, 95% CI 79 to 90%) agreed or strongly agreed that there is a need for ethics guidelines for CRTs and 158 (92%, 95% CI 88 to 96%) agreed or strongly agreed that research ethics committees could be better informed about distinct ethical issues surrounding CRTs. There was considerable variability among research ethics chairs with respect to the type of review required, as well as the identification of research subjects. The cluster-cluster and professional-cluster scenarios produced the most disagreement. Conclusions Research ethics committees

  4. Protocol for counterfactually transporting an unknown qubit

    Directory of Open Access Journals (Sweden)

    Hatim eSalih

    2016-01-01

    Full Text Available Quantum teleportation circumvents the uncertainty principle using dual channels: a quantum one consisting of previously-shared entanglement, and a classical one, together allowing the disembodied transport of an unknown quantum state over distance. It has recently been shown that a classical bit can be counterfactually communicated between two parties in empty space, Alice and Bob. Here, by using our dual version of the chained quantum Zeno effect to achieve a counterfactual CNOT gate, we propose a protocol for transporting an unknown qubit counterfactually, that is without any physical particles travelling between Alice and Bob—no classical channel and no previously-shared entanglement.

  5. Fever of unknown origin

    International Nuclear Information System (INIS)

    Misaki, Takashi; Matsui, Akira; Tanaka, Fumiko; Okuno, Yoshishige; Mitsumori, Michihide; Torizuka, Tatsurou; Dokoh, Shigeharu; Hayakawa, Katsumi; Shimbo, Shin-ichirou

    1990-01-01

Gallium-67 scintigraphy is a commonly performed imaging modality for detecting pyrogenic lesions in cases of long-standing unexplained fever. To re-evaluate the significance of gallium imaging in such cases, a retrospective review was made of 56 scans performed in febrile patients in whom sufficient clinical and laboratory findings were obtained. Gallium scans were true positive in 30 patients, false positive in 3, true negative in 19, and false negative in 4. In the true positive group, local inflammatory lesions were detected in 23 patients with final diagnoses of lung tuberculosis, urinary tract infection, and inflammatory joint disease. Abnormal gallium accumulation, as shown in the other 7 patients, provided clues to the diagnosis of generalized disorders, such as hematological malignancies (n=3), systemic autoimmune diseases (n=3), and severe infectious mononucleosis (n=1). In the false positive group, gallium imaging revealed intestinal excretion of gallium in 2 patients and physiological pulmonary hilar accumulation in one. In the true negative group of 19 patients, fever of unknown origin resolved spontaneously in 12 patients, and with antibiotics and corticosteroids in 2 and 5 patients, respectively. The four patients with false negative scans were finally diagnosed as having urinary tract infection (n=2), bacterial meningitis (n=1), and polyarteritis (n=1). Gallium imaging would remain the technique of choice in searching for the origin of unknown fever. It may also be useful for early diagnosis of systemic disease, as well as focal inflammation. (N.K.)

  6. Emergence of an optimal search strategy from a simple random walk.

    Science.gov (United States)

    Sakiyama, Tomoko; Gunji, Yukio-Pegio

    2013-09-06

    In reports addressing animal foraging strategies, it has been stated that Lévy-like algorithms represent an optimal search strategy in an unknown environment, because of their super-diffusion properties and power-law-distributed step lengths. Here, starting with a simple random walk algorithm, which offers the agent a randomly determined direction at each time step with a fixed move length, we investigated how flexible exploration is achieved if an agent alters its randomly determined next step forward and the rule that controls its random movement based on its own directional moving experiences. We showed that our algorithm led to an effective food-searching performance compared with a simple random walk algorithm and exhibited super-diffusion properties, despite the uniform step lengths. Moreover, our algorithm exhibited a power-law distribution independent of uniform step lengths.
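The baseline the authors start from, a walker that draws a fresh random direction at each step with fixed move length, exhibits normal diffusion: mean squared displacement grows linearly in time (exponent 1), whereas a super-diffusive search gives an exponent above 1. A quick numerical check of that baseline (our own simulation, not the paper's adaptive rule):

```python
import numpy as np

rng = np.random.default_rng(4)
n_walkers, n_steps = 2000, 400

# Simple random walk: a fresh uniform direction each step, fixed move
# length 1 (no adaptive rule is modelled here).
theta = rng.uniform(0.0, 2.0 * np.pi, (n_walkers, n_steps))
steps = np.stack([np.cos(theta), np.sin(theta)], axis=-1)
pos = np.cumsum(steps, axis=1)

# Mean squared displacement and its scaling exponent alpha: MSD ~ t^alpha.
msd = (pos**2).sum(axis=-1).mean(axis=0)
t = np.arange(1, n_steps + 1)
alpha = np.polyfit(np.log(t[10:]), np.log(msd[10:]), 1)[0]
```

Any modification that pushes the fitted exponent significantly above 1, as the paper reports for its experience-dependent rule, counts as super-diffusive despite the uniform step length.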

  7. Global Learning Spectral Archive- A new Way to deal with Unknown Urban Spectra -

    Science.gov (United States)

    Jilge, M.; Heiden, U.; Habermeyer, M.; Jürgens, C.

    2015-12-01

Rapid urbanization and the need to identify urban materials have challenged urban planners and the remote sensing community for years. Urban planners cannot keep information on urban materials up to date because fieldwork is time-intensive. Hyperspectral remote sensing can help by interpreting spectral signals to provide information on the materials present. However, the complexity of urban areas and the occurrence of diverse urban materials vary with regional and cultural factors as well as with the size of a city, which makes identification of surface materials a challenging analysis task. The various surface material identification approaches commonly use spectral libraries containing pure material spectra, derived from the field, the laboratory or the hyperspectral image itself. One of the requirements for successful image analysis is that all spectrally different surface materials are represented by the library. Currently, a universal library that is applicable in every urban area worldwide and accounts for all spectral variability does not exist and is unlikely to. In this study, the issue of unknown surface material spectra and the demand for an urban site-specific spectral library are tackled by the development of a learning spectral archive tool. Starting with an incomplete library of labelled image spectra from several German cities, surface materials of pure image pixels are identified in a hyperspectral image based on a similarity measure (e.g. SID-SAM). Additionally, unknown image spectra of urban objects are identified based on an object- and spectral-based rule set. The detected unknown surface material spectra are entered, with additional metadata such as regional occurrence, into the existing spectral library and are thus reusable for further studies. Our approach is suitable for pure surface material detection in urban hyperspectral images and is globally applicable because it takes incompleteness into account.

  8. Attention Measures of Accuracy, Variability, and Fatigue Detect Early Response to Donepezil in Alzheimer's Disease: A Randomized, Double-blind, Placebo-Controlled Pilot Trial.

    Science.gov (United States)

    Vila-Castelar, Clara; Ly, Jenny J; Kaplan, Lillian; Van Dyk, Kathleen; Berger, Jeffrey T; Macina, Lucy O; Stewart, Jennifer L; Foldi, Nancy S

    2018-04-09

Donepezil is widely used to treat Alzheimer's disease (AD), but detecting early response remains challenging for clinicians. Acetylcholine is known to directly modulate attention, particularly under high cognitive conditions, but no studies to date test whether measures of attention under high load can detect early effects of donepezil. We hypothesized that load-dependent attention tasks are sensitive to short-term treatment effects of donepezil, while global and other domain-specific cognitive measures are not. This longitudinal, randomized, double-blind, placebo-controlled pilot trial (ClinicalTrials.gov Identifier: NCT03073876) evaluated 23 participants newly diagnosed with AD initiating de novo donepezil treatment (5 mg). After baseline assessment, participants were randomized into Drug (n = 12) or Placebo (n = 11) groups, and retested after approximately 6 weeks. Cognitive assessment included: (a) attention tasks (Foreperiod Effect, Attentional Blink, and Covert Orienting tasks) measuring processing speed, top-down accuracy, orienting, intra-individual variability, and fatigue; (b) global measures (Alzheimer's Disease Assessment Scale-Cognitive Subscale, Mini-Mental Status Examination, Dementia Rating Scale); and (c) domain-specific measures (memory, language, visuospatial, and executive function). The Drug but not the Placebo group showed benefits of treatment at high-load measures by preserving top-down accuracy, improving intra-individual variability, and averting fatigue. In contrast, other global or cognitive domain-specific measures could not detect treatment effects over the same treatment interval. The pilot study suggests that attention measures targeting accuracy, variability, and fatigue under high-load conditions could be sensitive to short-term cholinergic treatment. Given the central role of acetylcholine in attentional function, load-dependent attentional measures may be valuable cognitive markers of early treatment response.

  9. Locally optimal control under unknown dynamics with learnt cost function: application to industrial robot positioning

    Science.gov (United States)

    Guérin, Joris; Gibaru, Olivier; Thiery, Stéphane; Nyiri, Eric

    2017-01-01

Recent Reinforcement Learning methods have made it possible to solve difficult, high-dimensional robotic tasks under unknown dynamics using iterative Linear Quadratic Gaussian control theory. These algorithms are based on building a local time-varying linear model of the dynamics from data gathered through interaction with the environment. In such tasks, the cost function is often expressed directly in terms of the state and control variables so that it can be locally quadratized to run the algorithm. If the cost is expressed in terms of other variables, a model is required to compute the cost function from the variables manipulated. We propose a method to learn the cost function directly from the data, in the same way as for the dynamics. This way, the cost function can be defined in terms of any measurable quantity and thus can be chosen more appropriately for the task to be carried out. With our method, any sensor information can be used to design the cost function. We demonstrate the efficiency of this method by simulating, with the V-REP software, the learning of a Cartesian positioning task on several industrial robots with different characteristics. The robots are controlled in joint space and no model is provided a priori. Our results are compared with another model-free technique, which consists of writing the cost function in terms of a state variable.

  10. An efficient Bayesian inference approach to inverse problems based on an adaptive sparse grid collocation method

    International Nuclear Information System (INIS)

    Ma Xiang; Zabaras, Nicholas

    2009-01-01

A new approach to modeling inverse problems using a Bayesian inference method is introduced. The Bayesian approach considers the unknown parameters as random variables and seeks the probabilistic distribution of the unknowns. By introducing the concept of the stochastic prior state space to the Bayesian formulation, we reformulate the deterministic forward problem as a stochastic one. The adaptive hierarchical sparse grid collocation (ASGC) method is used for constructing an interpolant to the solution of the forward model in this prior space which is large enough to capture all the variability/uncertainty in the posterior distribution of the unknown parameters. This solution can be considered as a function of the random unknowns and serves as a stochastic surrogate model for the likelihood calculation. Hierarchical Bayesian formulation is used to derive the posterior probability density function (PPDF). The spatial model is represented as a convolution of a smooth kernel and a Markov random field. The state space of the PPDF is explored using Markov chain Monte Carlo algorithms to obtain statistics of the unknowns. The likelihood calculation is performed by directly sampling the approximate stochastic solution obtained through the ASGC method. The technique is assessed on two nonlinear inverse problems: source inversion and permeability estimation in flow through porous media.
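The overall pipeline, replace the expensive forward model with a cheap surrogate over the prior support and then run MCMC against the surrogate likelihood, can be sketched on a toy problem. Everything below is illustrative: a monotone 1-D forward model, a linear interpolant standing in for the adaptive sparse grid collocation surrogate, and plain random-walk Metropolis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "expensive" forward model and one noisy observation.
G = lambda th: th**2
theta_true, noise = 0.8, 0.05
y_obs = G(theta_true) + noise * rng.standard_normal()

# Cheap surrogate: linear interpolation of G on a coarse grid over the
# prior support (a stand-in for the sparse-grid surrogate of the paper).
grid = np.linspace(0.0, 2.0, 33)
G_surr = lambda th: np.interp(th, grid, G(grid))

def log_post(th):                      # uniform prior on [0, 2]
    if not 0.0 <= th <= 2.0:
        return -np.inf
    return -0.5 * ((y_obs - G_surr(th)) / noise) ** 2

# Random-walk Metropolis sampling against the surrogate likelihood.
th, lp, chain = 1.0, log_post(1.0), []
for _ in range(20000):
    prop = th + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        th, lp = prop, lp_prop
    chain.append(th)
posterior = np.array(chain[5000:])     # discard burn-in
```

Because every MCMC step evaluates only the surrogate, the expensive model is called just once per grid node, which is the computational point of the surrogate-based formulation.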

  11. Behavioral neurocardiac training in hypertension: a randomized, controlled trial.

    Science.gov (United States)

    Nolan, Robert P; Floras, John S; Harvey, Paula J; Kamath, Markad V; Picton, Peter E; Chessex, Caroline; Hiscock, Natalie; Powell, Jonathan; Catt, Michael; Hendrickx, Hilde; Talbot, Duncan; Chen, Maggie H

    2010-04-01

    It is not established whether behavioral interventions add benefit to pharmacological therapy for hypertension. We hypothesized that behavioral neurocardiac training (BNT) with heart rate variability biofeedback would reduce blood pressure further by modifying vagal heart rate modulation during reactivity and recovery from standardized cognitive tasks ("mental stress"). This randomized, controlled trial enrolled 65 patients with uncomplicated hypertension to BNT or active control (autogenic relaxation), with six 1-hour sessions over 2 months with home practice. Outcomes were analyzed with linear mixed models that adjusted for antihypertensive drugs. BNT reduced daytime and 24-hour systolic blood pressures (-2.4+/-0.9 mm Hg, P=0.009, and -2.1+/-0.9 mm Hg, P=0.03, respectively) and pulse pressures (-1.7+/-0.6 mm Hg, P=0.004, and -1.4+/-0.6 mm Hg, P=0.02, respectively). No effect was observed for controls (P>0.10 for all indices). BNT also increased RR-high-frequency power (0.15 to 0.40 Hz; P=0.01) and RR interval (P0.10). In contrast to relaxation therapy, BNT with heart rate variability biofeedback modestly lowers ambulatory blood pressure during wakefulness, and it augments tonic vagal heart rate modulation. It is unknown whether efficacy of this treatment can be improved with biofeedback of baroreflex gain. BNT, alone or as an adjunct to drug therapy, may represent a promising new intervention for hypertension.

  12. Concentrated Hitting Times of Randomized Search Heuristics with Variable Drift

    DEFF Research Database (Denmark)

    Lehre, Per Kristian; Witt, Carsten

    2014-01-01

    Drift analysis is one of the state-of-the-art techniques for the runtime analysis of randomized search heuristics (RSHs) such as evolutionary algorithms (EAs), simulated annealing etc. The vast majority of existing drift theorems yield bounds on the expected value of the hitting time for a target...

  13. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

    The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.
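    The central phenomenon, the log-norm of a product of i.i.d. random matrices growing linearly at a deterministic rate (the top Lyapunov exponent), is easy to observe numerically. The family of 2×2 factors below is an arbitrary illustration, not taken from the book:

```python
import math
import random

random.seed(1)

def matmul(a, b):
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0],
             a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0],
             a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def random_factor():
    """I.i.d. factors: a uniform random rotation times a fixed stretch diag(2, 1/2)."""
    t = random.uniform(0.0, 2.0 * math.pi)
    c, s = math.cos(t), math.sin(t)
    return matmul([[c, -s], [s, c]], [[2.0, 0.0], [0.0, 0.5]])

def log_norm_growth(n):
    """(1/n) log ||M_n ... M_1 v||, computed with running renormalization."""
    v, total = [1.0, 0.0], 0.0
    for _ in range(n):
        m = random_factor()
        v = [m[0][0] * v[0] + m[0][1] * v[1], m[1][0] * v[0] + m[1][1] * v[1]]
        norm = math.hypot(v[0], v[1])
        total += math.log(norm)       # renormalize to avoid overflow
        v = [v[0] / norm, v[1] / norm]
    return total / n

gamma = log_norm_growth(200000)
# For this particular family the rate is exactly log(5/4): the rotation makes
# the direction entering the stretch uniform on the circle at every step.
```

    The analogue of the law of large numbers holds: `gamma` converges to a positive constant even though individual factors contract as often as they expand.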

  14. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, Milton; Manos, George C.

    1994-01-01

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective is to obtain an estimate of the free rocking response from the measured random response using the Random Decrement (RDD) Technique, and then estimate the coefficient of restitution from this free response estimate. In the paper this approach is investigated by simulating the response of a single degree...
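    The core of the Random Decrement technique, averaging segments of the measured random response that follow a triggering condition to recover an estimate proportional to the free decay, can be sketched as follows. The simulated structure (a discrete second-order resonator) and the trigger settings are illustrative assumptions, not the paper's rocking model:

```python
import math
import random

random.seed(2)

# Simulated response of a lightly damped resonator to unknown white noise
# (an illustrative stand-in for the measured random response).
r, w = 0.99, 0.2                       # pole radius (damping) and frequency
x = [0.0, 0.0]
for _ in range(200000):
    x.append(2 * r * math.cos(w) * x[-1] - r * r * x[-2] + random.gauss(0, 1))
x = x[2:]

# Random Decrement: average the segments that follow an upward level crossing.
sigma = math.sqrt(sum(v * v for v in x) / len(x))
level, n_lags = 1.5 * sigma, 300
segments = [x[i:i + n_lags]
            for i in range(1, len(x) - n_lags)
            if x[i] >= level > x[i - 1]]

rdd = [sum(seg[k] for seg in segments) / len(segments) for k in range(n_lags)]
# rdd now approximates the (scaled) free decay, starting near the trigger level;
# a damping parameter can be fitted to its decaying envelope.
```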

  15. On lower limits and equivalences for distribution tails of randomly stopped sums

    NARCIS (Netherlands)

    Denisov, D.E.; Foss, S.G.; Korshunov, D.A.

    2008-01-01

    For the distribution F^{*t} of a random sum S_t = ξ_1 + ⋯ + ξ_t of i.i.d. random variables with a common distribution F on the half-line [0, ∞), we study the limits of the ratios of tails as x → ∞ (here, t is a counting random variable which does not depend on {ξ_n}_{n≥1}). We also consider applications of the results

  16. Fuzzy randomness uncertainty in civil engineering and computational mechanics

    CERN Document Server

    Möller, Bernd

    2004-01-01

    This book, for the first time, provides a coherent, overall concept for taking account of uncertainty in the analysis, the safety assessment, and the design of structures. The reader is introduced to the problem of uncertainty modeling and familiarized with particular uncertainty models. For simultaneously considering stochastic and non-stochastic uncertainty the superordinated uncertainty model fuzzy randomness, which contains real valued random variables as well as fuzzy variables as special cases, is presented. For this purpose basic mathematical knowledge concerning the fuzzy set theory and the theory of fuzzy random variables is imparted. The body of the book comprises the appropriate quantification of uncertain structural parameters, the fuzzy and fuzzy probabilistic structural analysis, the fuzzy probabilistic safety assessment, and the fuzzy cluster structural design. The completely new algorithms are described in detail and illustrated by way of demonstrative examples.

  17. Fuzziness and randomness in an optimization framework

    International Nuclear Information System (INIS)

    Luhandjula, M.K.

    1994-03-01

    This paper presents a semi-infinite approach for linear programming in the presence of fuzzy random variable coefficients. As a byproduct a way for dealing with optimization problems including both fuzzy and random data is obtained. Numerical examples are provided for the sake of illustration. (author). 13 refs

  18. Randomization Does Not Help Much, Comparability Does

    Science.gov (United States)

    Saint-Mont, Uwe

    2015-01-01

    According to R.A. Fisher, randomization “relieves the experimenter from the anxiety of considering innumerable causes by which the data may be disturbed.” Since, in particular, it is said to control for known and unknown nuisance factors that may considerably challenge the validity of a result, it has become very popular. This contribution challenges the received view. First, looking for quantitative support, we study a number of straightforward, mathematically simple models. They all demonstrate that the optimism surrounding randomization is questionable: In small to medium-sized samples, random allocation of units to treatments typically yields a considerable imbalance between the groups, i.e., confounding due to randomization is the rule rather than the exception. In the second part of this contribution, the reasoning is extended to a number of traditional arguments in favour of randomization. This discussion is rather non-technical, and sometimes touches on the rather fundamental Frequentist/Bayesian debate. However, the result of this analysis turns out to be quite similar: While the contribution of randomization remains doubtful, comparability contributes much to a compelling conclusion. Summing up, classical experimentation based on sound background theory and the systematic construction of exchangeable groups seems to be advisable. PMID:26193621
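    The paper's claim about small-sample imbalance is easy to check by simulation. The sketch below randomly allocates units carrying a binary nuisance factor to two equal groups and measures the resulting imbalance in factor prevalence; all sample sizes and the 50% prevalence are arbitrary choices:

```python
import random
from statistics import mean

random.seed(3)

def imbalance(n):
    """Randomly allocate n units to two equal groups and return the
    absolute difference in prevalence of a binary nuisance factor."""
    factor = [random.random() < 0.5 for _ in range(n)]   # unobserved covariate
    idx = list(range(n))
    random.shuffle(idx)                                  # random allocation
    treat, ctrl = idx[: n // 2], idx[n // 2:]
    return abs(mean(factor[i] for i in treat) - mean(factor[i] for i in ctrl))

trials = 2000
small = mean(imbalance(20) for _ in range(trials))    # 10 vs 10 units
large = mean(imbalance(500) for _ in range(trials))   # 250 vs 250 units
```

    With these sizes the mean prevalence gap is roughly 0.18 for ten units per group versus about 0.04 for 250 per group, which is the article's point: randomization balances groups only on average, and in small samples a considerable imbalance is the rule rather than the exception.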

  19. A Note on the W-S Lower Bound of the MEE Estimation

    Directory of Open Access Journals (Sweden)

    Badong Chen

    2014-02-01

    Full Text Available The minimum error entropy (MEE) estimation is concerned with the estimation of a certain random variable (the unknown variable) based on another random variable (the observation), so that the entropy of the estimation error is minimized. This estimation method may outperform the well-known minimum mean square error (MMSE) estimation, especially in non-Gaussian situations. There is an important performance bound on MEE estimation, namely the W-S lower bound, which is computed as the conditional entropy of the unknown variable given the observation. Though it has been known in the literature for a considerable time, up to now there has been little study of this performance bound. In this paper, we reexamine the W-S lower bound. Some basic properties of the W-S lower bound are presented, and the characterization of the Gaussian distribution using the W-S lower bound is investigated.
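    The bound itself is simple to verify on a toy discrete example: the entropy of the estimation error X − g(Y) can never fall below H(X|Y), whatever estimator g is used, since H(X − g(Y)) ≥ H(X − g(Y) | Y) = H(X | Y). The joint distribution below is an arbitrary illustration:

```python
import math
from itertools import product

# Joint pmf of (X, Y) on {0,1}^2: an arbitrary correlated pair.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Conditional entropy H(X|Y): the W-S lower bound on the error entropy.
p_y = {y: sum(p[(x, y)] for x in (0, 1)) for y in (0, 1)}
h_x_given_y = -sum(p[(x, y)] * math.log2(p[(x, y)] / p_y[y])
                   for x, y in product((0, 1), repeat=2) if p[(x, y)] > 0)

def error_entropy(estimator):
    """Entropy of the error X - estimator(Y) under the joint pmf."""
    err = {}
    for (x, y), q in p.items():
        e = x - estimator(y)
        err[e] = err.get(e, 0.0) + q
    return entropy(err)

h_err_map = error_entropy(lambda y: y)     # use the observation
h_err_const = error_entropy(lambda y: 0)   # ignore the observation
```

    Both estimators respect the bound, and the better one sits closer to it; MEE estimation seeks the g that pushes the error entropy as near to H(X|Y) as possible.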

  20. Adaptive Control for Revolute Joints Robot Manipulator with Uncertain/Unknown Dynamic Parameters and in Presence of Disturbance in Control Input

    DEFF Research Database (Denmark)

    Seyed Sakha, Masoud; Shaker, Hamid Reza

    2017-01-01

    This paper presents an effective adaptive controller for a revolute-joint robot manipulator where the control input is accompanied by a random disturbance (with unknown PSD). It is clear that disturbance can compromise the overall performance of the system. To cope with this problem, a control technique is proposed which uses the concept of exponential practical stability. Unlike other counterparts, the proposed method does not need information such as the physical parameters of the robot and gravitational acceleration. The results show that the proposed controller achieves an excellent performance...

  1. Large deviations of heavy-tailed random sums with applications in insurance and finance

    NARCIS (Netherlands)

    Kluppelberg, C; Mikosch, T

    We prove large deviation results for the random sum S(t) = Σ_{i=1}^{N(t)} X_i, t ≥ 0, where (N(t))_{t≥0} are non-negative integer-valued random variables and (X_n)_{n∈ℕ} are i.i.d. non-negative random variables with common distribution

  2. Bayesian methods for meta-analysis of causal relationships estimated using genetic instrumental variables

    DEFF Research Database (Denmark)

    Burgess, Stephen; Thompson, Simon G; Thompson, Grahame

    2010-01-01

    Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context o...

  3. Quantum jointly assisted cloning of an unknown three-dimensional equatorial state

    Science.gov (United States)

    Ma, Peng-Cheng; Chen, Gui-Bin; Li, Xiao-Wei; Zhan, You-Bang

    2018-02-01

    We present two schemes for perfectly cloning an unknown single-qutrit equatorial state with assistance from two and N state preparers, respectively. In the first scheme, the sender wishes to teleport an unknown single-qutrit equatorial state from two state preparers to a remote receiver, and then to create a perfect copy of the unknown state at her location. The scheme consists of two stages. The first stage of the scheme requires the usual teleportation. In the second stage, to help the sender realize the quantum cloning, the two state preparers perform single-qutrit projective measurements on their own qutrits from the sender, after which the sender can acquire a perfect copy of the unknown state. It is shown that, only if the two state preparers collaborate with each other, the sender can create a copy of the unknown state by means of some appropriate unitary operations. In the second scheme, we generalize the jointly assisted cloning of the first scheme to the case of N state preparers. In the present schemes, the total probability of success for assisted cloning of a perfect copy of the unknown state can reach 1.

  4. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  5. Inter-track interference mitigation with two-dimensional variable equalizer for bit patterned media recording

    Directory of Open Access Journals (Sweden)

    Yao Wang

    2017-05-01

    Full Text Available The increased track density in bit patterned media recording (BPMR) causes increased inter-track interference (ITI), which degrades the bit error rate (BER) performance. In order to mitigate the effect of the ITI, signals from multiple tracks can be equalized by a 2D equalizer with a 1D target. Usually, the 2D fixed equalizer coefficients are obtained by using a pseudo-random bit sequence (PRBS) for training. In this study, a 2D variable equalizer is proposed, where various sets of 2D equalizer coefficients are predetermined and stored for different ITI patterns besides the usual PRBS training. For data detection, as the ITI patterns are unknown in the first global iteration, the main and adjacent tracks are equalized with the conventional 2D fixed equalizer, detected with the Bahl-Cocke-Jelinek-Raviv (BCJR) detector and decoded with the low-density parity-check (LDPC) decoder. Then, using the estimated bit information from the main and adjacent tracks, the ITI pattern for each island of the main track can be estimated and the corresponding 2D variable equalizers are used to better equalize the bits on the main track. This process is executed iteratively by feeding back the main track information. Simulation results indicate that for both single-track and two-track detection, the proposed 2D variable equalizer can achieve better BER and frame error rate (FER) compared to that with the 2D fixed equalizer.
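    The training step mentioned above, fitting fixed equalizer taps to a PRBS sent through a known ISI channel, can be sketched in one dimension (a deliberate simplification of the 2D BPMR setting; the channel taps, equalizer length, delay, and noise level are all arbitrary assumptions):

```python
import random

random.seed(4)

channel = [0.2, 1.0, 0.2]     # illustrative ISI read channel (1-D stand-in)

def read(bits):
    """Channel output: convolution of the ±1 bits with the channel plus noise."""
    return [sum(channel[k] * bits[i - k]
                for k in range(len(channel)) if i - k >= 0)
            + random.gauss(0, 0.05)
            for i in range(len(bits))]

bits = [random.choice((-1, 1)) for _ in range(20000)]   # PRBS-like training data
r = read(bits)

taps, mu, delay = [0.0] * 5, 0.01, 3    # 5-tap equalizer, overall delay 3
for i in range(4, len(r)):
    window = [r[i - k] for k in range(5)]
    y = sum(t * s for t, s in zip(taps, window))
    e = bits[i - delay] - y             # LMS error against the delayed bit
    for k in range(5):
        taps[k] += mu * e * window[k]   # stochastic-gradient tap update

# Decision error rate with the converged taps over the final 5000 samples.
errors = sum(
    1 for i in range(len(r) - 5000, len(r))
    if (1 if sum(t * r[i - k] for k, t in enumerate(taps)) > 0 else -1)
       != bits[i - delay]
)
ber = errors / 5000
```

    The proposed 2D variable scheme extends exactly this idea: instead of one tap set trained on the PRBS, several tap sets are pre-trained, one per estimated ITI pattern, and selected per island during the iterative detection.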

  6. A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data

    KAUST Repository

    Babuška, Ivo; Nobile, Fabio; Tempone, Raul

    2010-01-01

    This work proposes and analyzes a stochastic collocation method for solving elliptic partial differential equations with random coefficients and forcing terms. These input data are assumed to depend on a finite number of random variables. The method consists of a Galerkin approximation in space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space, and naturally leads to the solution of uncoupled deterministic problems as in the Monte Carlo approach. It treats easily a wide range of situations, such as input data that depend nonlinearly on the random variables, diffusivity coefficients with unbounded second moments, and random variables that are correlated or even unbounded. We provide a rigorous convergence analysis and demonstrate exponential convergence of the “probability error” with respect to the number of Gauss points in each direction of the probability space, under some regularity assumptions on the random input data. Numerical examples show the effectiveness of the method. Finally, we include a section with developments posterior to the original publication of this work. There we review sparse grid stochastic collocation methods, which are effective collocation strategies for problems that depend on a moderately large number of random variables.
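    The key idea, uncoupled deterministic solves at Gauss points of the probability space combined through the quadrature weights, reduces in one random dimension to Gauss quadrature of the solution against the input density. A minimal sketch with a scalar stand-in for the PDE solve and a uniform random input:

```python
import math

# 5-point Gauss-Legendre nodes and weights on [-1, 1].
nodes = [-0.9061798459386640, -0.5384693101056831, 0.0,
         0.5384693101056831, 0.9061798459386640]
weights = [0.23692688505618908, 0.47862867049936647, 0.5688888888888889,
           0.47862867049936647, 0.23692688505618908]

def solve_deterministic(y):
    """Stand-in for one uncoupled deterministic forward solve at input y."""
    return math.exp(y)

# Collocation: weight the uncoupled solves by the quadrature rule; the input
# Y ~ U(-1, 1) has density 1/2, hence the factor 0.5.
mean_u = 0.5 * sum(wt * solve_deterministic(y) for y, wt in zip(nodes, weights))

exact = math.sinh(1.0)   # E[exp(Y)] = (e - 1/e)/2 for Y ~ U(-1, 1)
```

    For this smooth input dependence, five collocation points already reproduce the exact mean to better than 1e-8, which is the exponential "probability error" convergence the analysis establishes; a Monte Carlo estimate of comparable accuracy would need an enormous number of solves.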

  7. A Note on the Tail Behavior of Randomly Weighted Sums with Convolution-Equivalently Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Yang Yang

    2013-01-01

    Full Text Available We investigate the tail asymptotic behavior of randomly weighted sums with convolution-equivalently distributed increments. Our result can be applied directly to a discrete-time insurance risk model with insurance and financial risks, yielding the asymptotics for the finite-time ruin probability of the above risk model.

  8. Iterative Selection of Unknown Weights in Direct Weight Optimization Identification

    Directory of Open Access Journals (Sweden)

    Xiao Xuan

    2014-01-01

    Full Text Available In direct weight optimization identification of a nonlinear system, we add linear terms in the input sequence to the original linear affine function so as to better approximate the nonlinear behavior. This paper derives, from theoretical analysis and engineering practice respectively, a detailed procedure for choosing the two classes of unknown weights in these additional linear terms, and clarifies the key relations between them. The theoretical analysis shows the auxiliary role the added unknown weights play in approximating the nonlinear system. The practical analysis shows how to transform the resulting complex optimization problem into a corresponding standard quadratic program, which can then be solved by a basic interior point method. Finally, the efficiency and feasibility of the proposed strategies are confirmed by the simulation results.

  9. IMPROVED VARIABLE STAR SEARCH IN LARGE PHOTOMETRIC DATA SETS: NEW VARIABLES IN CoRoT FIELD LRa02 DETECTED BY BEST II

    International Nuclear Information System (INIS)

    Fruth, T.; Cabrera, J.; Csizmadia, Sz.; Eigmüller, P.; Erikson, A.; Kirste, S.; Pasternacki, T.; Rauer, H.; Titz-Weider, R.; Kabath, P.; Chini, R.; Lemke, R.; Murphy, M.

    2012-01-01

    The CoRoT field LRa02 has been observed with the Berlin Exoplanet Search Telescope II (BEST II) during the southern summer 2007/2008. A first analysis of stellar variability led to the publication of 345 newly discovered variable stars. Now, a deeper analysis of this data set was used to optimize the variability search procedure. Several methods and parameters have been tested in order to improve the selection process compared to the widely used J index for variability ranking. This paper describes an empirical approach to treat systematic trends in photometric data based upon the analysis of variance statistics that can significantly decrease the rate of false detections. Finally, the process of reanalysis and method improvement has virtually doubled the number of variable stars compared to the first analysis by Kabath et al. A supplementary catalog of 272 previously unknown periodic variables plus 52 stars with suspected variability is presented. Improved ephemerides are given for 19 known variables in the field. In addition, the BEST II results are compared with CoRoT data and its automatic variability classification.

  10. Variability in Second Language Learning: The Roles of Individual Differences, Learning Conditions, and Linguistic Complexity

    Science.gov (United States)

    Tagarelli, Kaitlyn M.; Ruiz, Simón; Vega, José Luis Moreno; Rebuschat, Patrick

    2016-01-01

    Second language learning outcomes are highly variable, due to a variety of factors, including individual differences, exposure conditions, and linguistic complexity. However, exactly how these factors interact to influence language learning is unknown. This article examines the relationship between these three variables in language learners.…

  11. Optimal Inference for Instrumental Variables Regression with non-Gaussian Errors

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael

    This paper is concerned with inference on the coefficient on the endogenous regressor in a linear instrumental variables model with a single endogenous regressor, nonrandom exogenous regressors and instruments, and i.i.d. errors whose distribution is unknown. It is shown that under mild smoothness...

  12. Modal Analysis Based on the Random Decrement Transform

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    During the last years several papers utilizing the Random Decrement transform as a basis for extraction of modal parameters from the response of linear systems subjected to unknown ambient loads have been presented. Although the Random Decrement technique was developed in a decade starting from its introduction in 1968, the technique still seems attractive. This is probably due to the simplicity and speed of the algorithm and the fact that the theory of the technique has been extended by introducing statistical measures such as correlation functions or spectral densities. The purpose of this paper is to present a state-of-the-art description of the Random Decrement technique, where the statistical theory is outlined and examples are given. New results, such as estimation of frequency response functions and quality assessment, are also introduced. Special attention is given...

  13. Characterizing unknown systematics in large scale structure surveys

    International Nuclear Information System (INIS)

    Agarwal, Nishant; Ho, Shirley; Myers, Adam D.; Seo, Hee-Jong; Ross, Ashley J.; Bahcall, Neta; Brinkmann, Jonathan; Eisenstein, Daniel J.; Muna, Demitri; Palanque-Delabrouille, Nathalie; Yèche, Christophe; Pâris, Isabelle; Petitjean, Patrick; Schneider, Donald P.; Streblyanska, Alina; Weaver, Benjamin A.

    2014-01-01

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and as a result in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study

  14. Characterizing unknown systematics in large scale structure surveys

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Nishant; Ho, Shirley [McWilliams Center for Cosmology, Department of Physics, Carnegie Mellon University, Pittsburgh, PA 15213 (United States); Myers, Adam D. [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Seo, Hee-Jong [Berkeley Center for Cosmological Physics, LBL and Department of Physics, University of California, Berkeley, CA 94720 (United States); Ross, Ashley J. [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, PO1 3FX (United Kingdom); Bahcall, Neta [Princeton University Observatory, Peyton Hall, Princeton, NJ 08544 (United States); Brinkmann, Jonathan [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Eisenstein, Daniel J. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Muna, Demitri [Department of Astronomy, Ohio State University, Columbus, OH 43210 (United States); Palanque-Delabrouille, Nathalie; Yèche, Christophe [CEA, Centre de Saclay, Irfu/SPP, F-91191 Gif-sur-Yvette (France); Pâris, Isabelle [Departamento de Astronomía, Universidad de Chile, Casilla 36-D, Santiago (Chile); Petitjean, Patrick [Université Paris 6 et CNRS, Institut d' Astrophysique de Paris, 98bis blvd. Arago, 75014 Paris (France); Schneider, Donald P. [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States); Streblyanska, Alina [Instituto de Astrofisica de Canarias (IAC), E-38200 La Laguna, Tenerife (Spain); Weaver, Benjamin A., E-mail: nishanta@andrew.cmu.edu [Center for Cosmology and Particle Physics, New York University, New York, NY 10003 (United States)

    2014-04-01

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and as a result in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.

  15. Intravenous thrombolysis in ischemic stroke with unknown onset using CT perfusion.

    Science.gov (United States)

    Cortijo, E; García-Bermejo, P; Calleja, A I; Pérez-Fernández, S; Gómez, R; del Monte, J M; Reyes, J; Arenillas, J F

    2014-03-01

    Acute ischemic stroke patients with unclear onset time presenting >4.5 h from last-seen-normal (LSN) time are considered late patients and excluded from i.v. thrombolysis. We aimed to evaluate whether this subgroup of patients is different from patients presenting >4.5 h from a witnessed onset, in terms of eligibility and response to computed tomography perfusion (CTP)-guided i.v. thrombolysis. We prospectively studied consecutive acute non-lacunar middle cerebral artery (MCA) ischemic stroke patients presenting >4.5 h from LSN. All patients underwent multimodal CT and were considered eligible for i.v. thrombolysis according to CTP criteria. Two patient groups were established based on the knowledge of the stroke onset time. We compared the proportion of candidates suitable for intravenous thrombolysis between both groups, and their outcome after thrombolytic therapy. Among 147 MCA ischemic stroke patients presenting >4.5 h from LSN, stroke onset was witnessed in 74 and unknown in 73. Thirty-seven (50%) patients in the first group and 32 (44%) in the second met CTP criteria for thrombolysis (P = 0.7). Baseline variables were comparable between both groups with the exception of age, which was higher in the unclear onset group. The rates of early neurological improvement (54.1% vs 46.9%), 2-h MCA recanalization (43.5% vs 37%), symptomatic hemorrhagic transformation (3% vs 0%) and good 3-month functional outcome (62.2% vs 56.3%) did not differ significantly between both groups. Delayed stroke patients with unknown onset time were no different than patients >4.5 h regarding eligibility and response to CTP-based i.v. thrombolysis. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. The behaviour of random forest permutation-based variable importance measures under predictor correlation.

    Science.gov (United States)

    Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas

    2010-02-27

    Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and was unbiased under H0. Scaled VIMs were clearly biased under HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increased VIMs for correlated predictors may be considered a "bias" - because they do not directly reflect the coefficients in the generating model - or if it is a beneficial attribute of these VIMs is dependent on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.

  17. Synchronization of coupled different chaotic FitzHugh-Nagumo neurons with unknown parameters under communication-direction-dependent coupling.

    Science.gov (United States)

    Iqbal, Muhammad; Rehan, Muhammad; Khaliq, Abdul; Saeed-ur-Rehman; Hong, Keum-Shik

    2014-01-01

    This paper investigates the chaotic behavior and synchronization of two different coupled chaotic FitzHugh-Nagumo (FHN) neurons with unknown parameters under external electrical stimulation (EES). The coupled FHN neurons of different parameters admit unidirectional and bidirectional gap junctions in the medium between them. Dynamical properties, such as the increase in synchronization error as a consequence of the deviation of neuronal parameters for unlike neurons, the effect of difference in coupling strengths caused by the unidirectional gap junctions, and the impact of large time-delay due to separation of neurons, are studied in exploring the behavior of the coupled system. A novel integral-based nonlinear adaptive control scheme, to cope with the infeasibility of the recovery variable, for synchronization of two coupled delayed chaotic FHN neurons of different and unknown parameters under uncertain EES is derived. Further, to guarantee robust synchronization of different neurons against disturbances, the proposed control methodology is modified to achieve the uniformly ultimately bounded synchronization. The parametric estimation errors can be reduced by selecting suitable control parameters. The effectiveness of the proposed control scheme is illustrated via numerical simulations.
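    The underlying synchronization mechanism can be illustrated without the adaptive controller: two identical FHN neurons with a static bidirectional gap-junction coupling synchronize, while uncoupled neurons keep a persistent phase error. All parameter values below are illustrative (the paper treats unlike neurons with unknown parameters under EES, which is the harder problem its controller solves):

```python
def simulate(g, steps=150000, dt=0.01):
    """Two FHN neurons with bidirectional gap-junction coupling strength g;
    returns the mean |v1 - v2| over the second half of the run."""
    a, b, eps, I = 0.7, 0.8, 0.08, 0.5   # illustrative tonic-spiking parameters
    v1, w1, v2, w2 = -1.0, 1.0, 1.5, -0.5   # different initial conditions
    err_sum, count = 0.0, 0
    for k in range(steps):
        dv1 = v1 - v1 ** 3 / 3 - w1 + I + g * (v2 - v1)
        dw1 = eps * (v1 + a - b * w1)
        dv2 = v2 - v2 ** 3 / 3 - w2 + I + g * (v1 - v2)
        dw2 = eps * (v2 + a - b * w2)
        v1, w1 = v1 + dt * dv1, w1 + dt * dw1   # explicit Euler step
        v2, w2 = v2 + dt * dv2, w2 + dt * dw2
        if k >= steps // 2:                     # measure after the transient
            err_sum += abs(v1 - v2)
            count += 1
    return err_sum / count

sync_err_coupled = simulate(g=0.5)    # gap junction present
sync_err_uncoupled = simulate(g=0.0)  # no coupling: phase error persists
```

    With the diffusive coupling in the membrane potential, the synchronization error contracts along the whole limit cycle, whereas the uncoupled pair oscillates with a fixed phase offset; the paper's adaptive scheme is needed precisely when the two neurons' parameters differ and are unknown.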

  18. Robust Fault Detection for Switched Fuzzy Systems With Unknown Input.

    Science.gov (United States)

    Han, Jian; Zhang, Huaguang; Wang, Yingchun; Sun, Xun

    2017-10-03

    This paper investigates the fault detection problem for a class of switched nonlinear systems in the T-S fuzzy framework, with unknown inputs considered in the systems. A novel fault detection unknown input observer design method is proposed. Based on the proposed observer, the unknown input can be removed from the fault detection residual. The weighted H∞ performance level is considered to ensure robustness. In addition, the weighted H₋ performance level is introduced, which can increase the sensitivity of the proposed detection method. To verify the proposed scheme, a numerical simulation example and an electromechanical system simulation example are provided at the end of this paper.

  19. Comparison of variance estimators for metaanalysis of instrumental variable estimates

    NARCIS (Netherlands)

    Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two

  20. Potassium supplementation and heart rate : A meta-analysis of randomized controlled trials

    NARCIS (Netherlands)

    Gijsbers, L.; Moelenberg, F. J. M.; Bakker, S. J. L.; Geleijnse, J. M.

    Background and aims: Increasing the intake of potassium has been shown to lower blood pressure, but whether it also affects heart rate (HR) is largely unknown. We therefore assessed the effect of potassium supplementation on HR in a meta-analysis of randomized controlled trials. Methods and results:

  1. Reticulocyte dynamic and hemoglobin variability in hemodialysis patients treated with Darbepoetin alfa and C.E.R.A.: a randomized controlled trial.

    Science.gov (United States)

    Forni, Valentina; Bianchi, Giorgia; Ogna, Adam; Salvadé, Igor; Vuistiner, Philippe; Burnier, Michel; Gabutti, Luca

    2013-07-22

    In a simulation based on a pharmacokinetic model we demonstrated that increasing the erythropoiesis stimulating agents (ESAs) half-life or shortening their administration interval decreases hemoglobin variability. The benefit of reducing the administration interval was however lessened by the variability induced by more frequent dosage adjustments. The purpose of this study was to analyze the reticulocyte and hemoglobin kinetics and variability under different ESAs and administration intervals in a collective of chronic hemodialysis patients. The study was designed as an open-label, randomized, four-period cross-over investigation, including 30 patients under chronic hemodialysis at the regional hospital of Locarno (Switzerland) in February 2010 and lasting 2 years. Four subcutaneous treatment strategies (C.E.R.A. every 4 weeks Q4W and every 2 weeks Q2W, Darbepoetin alfa Q4W and Q2W) were compared with each other. The mean square successive difference of hemoglobin, reticulocyte count and ESAs dose was used to quantify variability. We distinguished a short- and a long-term variability based respectively on the weekly and monthly successive difference. No difference was found in the mean values of biological parameters (hemoglobin, reticulocytes, and ferritin) between the 4 strategies. ESAs type did not affect hemoglobin and reticulocyte variability, but C.E.R.A. induced a more sustained reticulocytes response over time and increased the risk of hemoglobin overshooting (OR 2.7, p = 0.01). Shortening the administration interval lessened the amplitude of reticulocyte count fluctuations but resulted in more frequent ESAs dose adjustments and in amplified reticulocyte and hemoglobin variability. The Q2W administration interval was however more favorable in terms of ESAs dose, allowing a 38% C.E.R.A. dose reduction, and no increase of Darbepoetin alfa. The reticulocyte dynamic was a more sensitive marker of time instability of the hemoglobin response under ESAs therapy
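    The variability index used in this trial, the mean square successive difference (MSSD), penalizes rapid swings that the plain variance around the mean would understate. A minimal sketch with hypothetical hemoglobin series (g/dL, values invented for illustration):

```python
def mssd(series):
    """Mean square successive difference: a short-term variability index."""
    diffs = [(b - a) ** 2 for a, b in zip(series, series[1:])]
    return sum(diffs) / len(diffs)

# Hypothetical monthly hemoglobin values: similar means, different swings.
stable = [11.0, 11.2, 10.9, 11.1, 11.0, 10.8]
swinging = [11.0, 12.4, 10.1, 12.0, 9.8, 11.5]

low, high = mssd(stable), mssd(swinging)
```

    Applying the same index to weekly versus monthly samples separates short-term from long-term variability, which is the distinction the study draws.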

  2. Off-Policy Actor-Critic Structure for Optimal Control of Unknown Systems With Disturbances.

    Science.gov (United States)

    Song, Ruizhuo; Lewis, Frank L; Wei, Qinglai; Zhang, Huaguang

    2016-05-01

    An optimal control method is developed for unknown continuous-time systems with unknown disturbances in this paper. The integral reinforcement learning (IRL) algorithm is presented to obtain the iterative control law. Off-policy learning is used to allow the dynamics to be completely unknown. Neural networks are used to construct the critic and action networks. It is shown that, if there are unknown disturbances, off-policy IRL may not converge or may be biased. To reduce the influence of unknown disturbances, a disturbance compensation controller is added. It is proven that the weight errors are uniformly ultimately bounded based on Lyapunov techniques. Convergence of the Hamiltonian function is also proven. A simulation study demonstrates the effectiveness of the proposed optimal control method for unknown systems with disturbances.

  3. The effects of sustained manual pressure stimulation according to Vojta Therapy on heart rate variability.

    Science.gov (United States)

    Opavsky, Jaroslav; Slachtova, Martina; Kutin, Miroslav; Hok, Pavel; Uhlir, Petr; Opavska, Hana; Hlustik, Petr

    2018-05-23

    The physiotherapeutic technique of Vojta reflex locomotion is often accompanied by various autonomic activity changes and unpleasant sensations. It is unknown whether these effects are specific to Vojta Therapy. Therefore, the aim of this study was to compare changes in cardiac autonomic control after Vojta reflex locomotion stimulation and after an appropriate sham stimulation. A total of 28 young healthy adults (20.4-25.7 years) were enrolled in this single-blind randomized cross-over study. Participants underwent two modes of 20-minute sustained manual pressure stimulation on the surface of the foot on two separate visits. One mode used manual pressure on the lateral heel, i.e., in a zone employed in Vojta Therapy (active stimulation). The other mode used pressure on the lateral ankle (control), in an area not included among the active zones used by Vojta Therapy and whose activation does not evoke manifestations of reflex locomotion. Autonomic nervous system activity was evaluated using spectral analysis of heart rate variability before and after the intervention. The active stimulation was perceived as more unpleasant than the control stimulation. Heart rate variability parameters demonstrated almost identical autonomic responses after both stimulation types, showing either a modest increase in parasympathetic activity or an increase in heart rate variability with similar contributions of parasympathetic and sympathetic activity. The results demonstrate changes of cardiac autonomic control after both active and control stimulation, without evidence for a significant difference between the two.

  4. Random effects coefficient of determination for mixed and meta-analysis models.

    Science.gov (United States)

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2012-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, [Formula: see text], that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If [Formula: see text] is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of [Formula: see text] away from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for [Formula: see text] in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine.
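For the random intercept special case, the idea of "proportion of conditional variance explained by random effects" reduces to an intraclass-correlation-like ratio of the random-intercept variance to the total conditional variance. The sketch below is this simplified ratio, not the paper's exact formula (which involves design-dependent terms):

```python
def random_effects_r2(sigma2_b, sigma2_e):
    """Illustrative ratio for a random-intercept model: share of the
    conditional variance due to the random intercept (sigma2_b) versus
    residual noise (sigma2_e). An ICC-like quantity; see the paper for
    the exact coefficient."""
    return sigma2_b / (sigma2_b + sigma2_e)

# Equal variance components: random effects explain half the variance.
print(random_effects_r2(2.0, 2.0))  # 0.5
# No random-intercept variance: coefficient is 0, plain regression suffices.
print(random_effects_r2(0.0, 1.0))  # 0.0
```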

  5. [Badminton--unknown sport].

    Science.gov (United States)

    Zekan-Petrinović, Lidija

    2007-01-01

    For a long time, badminton was considered to be only a slow and light game for children, a game that is played outdoors and is structurally undemanding. Today it is no longer an unknown and unrecognised sport, especially after it was included in the Olympic Games in 1992. Badminton is one of the oldest sports in the world. It is suitable for all ages (children and the elderly alike), for women and men, and even for handicapped persons. Beginners can start playing badminton matches early because the basics are learned quickly. As a recreational activity, badminton is very popular in Zagreb. In the last 10 years, a number of halls specialized for badminton, or offering badminton as one of the available sports activities, have been opened in Zagreb. At present, there are over 70 professional playgrounds for the training of top contestants, but also for citizens who play recreational badminton.

  6. Detection of viral sequence fragments of HIV-1 subfamilies yet unknown

    Directory of Open Access Journals (Sweden)

    Stanke Mario

    2011-04-01

    Abstract Background Methods of determining whether or not any particular HIV-1 sequence stems - completely or in part - from some unknown HIV-1 subtype are important for the design of vaccines and molecular detection systems, as well as for epidemiological monitoring. Nevertheless, only a single algorithm, the Branching Index (BI), has been developed for this task so far. Moving along the genome of a query sequence in a sliding window, the BI computes a ratio quantifying how closely the query sequence clusters with a subtype clade. In its current version, however, the BI does not provide predicted boundaries of unknown fragments. Results We have developed Unknown Subtype Finder (USF), an algorithm based on a probabilistic model, which automatically determines which parts of an input sequence originate from a subtype yet unknown. The underlying model is based on a simple profile hidden Markov model (pHMM) for each known subtype and an additional pHMM for an unknown subtype. The emission probabilities of the latter are estimated from the emission frequencies of the known subtypes by means of a (position-wise) probabilistic model for the emergence of new subtypes. We have applied USF to SIV and HIV-1 sequences formerly classified as having emerged from an unknown subtype. Moreover, we have evaluated its performance on artificial HIV-1 recombinants and non-recombinant HIV-1 sequences. The results have been compared with the corresponding results of the BI. Conclusions Our results demonstrate that USF is suitable for detecting segments in HIV-1 sequences stemming from yet unknown subtypes. Comparing USF with the BI shows that our algorithm performs as well as or better than the BI.

  7. Psychological profile: the problem of modeling the unknown criminal personality

    Directory of Open Access Journals (Sweden)

    Г. М. Гетьман

    2013-10-01

    The article investigates the problem of modeling an unknown person during the preparation of a criminal psychological profile. Some approaches to the concepts of "psychological profile" and "psychological portrait" are considered, and a delineation of these terms is proposed. We also consider the systematic steps in the development of the psychological profile of an unknown perpetrator.

  8. Multifocal chronic osteomyelitis of unknown etiology

    International Nuclear Information System (INIS)

    Kozlowski, K.; Masel, J.; Harbison, S.; Yu, J.; Royal Brisbane Children Hospital; Regional Hospital Bowral

    1983-01-01

    Five cases of chronic, inflammatory, multifocal bone lesions of unknown etiology are reported. Although bone biopsy confirmed osteomyelitis in each case, in none of them were organisms found in spite of an extensive work-up. The different clinical courses of the disease reflect different aetiologies in the respective cases. These cases present the changing aspects of osteomyelitis that have emerged since the introduction of antibiotics. (orig.)

  9. Post-operative therapy following transoral robotic surgery for unknown primary cancers of the head and neck.

    Science.gov (United States)

    Patel, Sapna A; Parvathaneni, Aarthi; Parvathaneni, Upendra; Houlton, Jeffrey J; Karni, Ron J; Liao, Jay J; Futran, Neal D; Méndez, Eduardo

    2017-09-01

    Our primary objective is to describe the post-operative management of patients with an unknown primary squamous cell carcinoma of the head and neck (HNSCC) treated with trans-oral robotic surgery (TORS). We conducted a retrospective multi-institutional case series including all patients diagnosed with an unknown primary HNSCC who underwent TORS to identify the primary site from January 1, 2010 to June 30, 2016. We excluded those with recurrent disease, ≤6 months of follow-up from TORS, a previous history of radiation therapy (RT) to the head and neck, or evidence of a primary tumor site based on previous biopsies. Our main outcome measure was receipt of post-operative therapy. The tumor was identified in 26/35 (74.3%) subjects. Post-TORS, 2 subjects did not receive adjuvant therapy due to favorable pathology. Volume reduction of RT mucosal site coverage was achieved in 12/26 (46.1%) subjects who had lateralizing tumors, i.e., those confined to the palatine tonsil or glossotonsillar sulcus. In addition, for 8/26 (30.1%), contralateral neck RT was also avoided. In 9 subjects, no primary was identified (pT0); four of these received RT to the involved ipsilateral neck nodal basin only, without pharyngeal mucosal irradiation. Surgical management of an unknown primary with TORS can lead to deintensification of adjuvant therapy, including avoidance of chemotherapy and reduction in RT doses and volume. There was no increase in short-term treatment failures. Treatment after TORS can vary significantly; thus we advocate adherence to NCCN guideline therapy post-TORS to avoid treatment-associated variability. Published by Elsevier Ltd.

  10. Bayesian source term determination with unknown covariance of measurements

    Science.gov (United States)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^(-1) (y - Mx) + x^T B^(-1) x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: first, a diagonal matrix, and second, a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
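For fixed R and B, the quadratic objective above has the closed-form minimizer x = (M^T R^(-1) M + B^(-1))^(-1) M^T R^(-1) y. The sketch below shows only this inner MAP step on synthetic data; the paper's contribution is estimating R and B as well, via variational Bayes, which is not reproduced here:

```python
import numpy as np

def map_source_term(M, y, R, B):
    """MAP estimate of x for y = Mx + noise, noise ~ N(0, R), prior
    x ~ N(0, B): solves (M^T R^-1 M + B^-1) x = M^T R^-1 y."""
    Ri = np.linalg.inv(R)
    A = M.T @ Ri @ M + np.linalg.inv(B)
    return np.linalg.solve(A, M.T @ Ri @ y)

# Synthetic illustration: 20 sensors, 5 source components, small noise.
rng = np.random.default_rng(0)
M = rng.normal(size=(20, 5))                  # stand-in SRS matrix
x_true = np.array([0.0, 2.0, 0.0, 0.0, 1.0])  # sparse true source
y = M @ x_true + 0.01 * rng.normal(size=20)
x_hat = map_source_term(M, y, R=0.01**2 * np.eye(20), B=10.0 * np.eye(5))
print(np.round(x_hat, 2))  # recovers x_true up to the noise level
```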

  11. On the Wigner law in dilute random matrices

    Science.gov (United States)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
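The stability of the semicircle law under dilution is easy to probe numerically: keep each entry of a symmetric Gaussian matrix with probability p, rescale by sqrt(Np), and inspect the spectrum. This is an illustrative experiment, not the paper's exact ensemble (which allows weakly dependent entries):

```python
import numpy as np

# Diluted symmetric random matrix: entries kept with probability p,
# normalised so the limiting spectrum has support close to [-2, 2]
# and second moment 1, as for the semicircle distribution.
rng = np.random.default_rng(1)
N, p = 400, 0.1
A = rng.normal(size=(N, N))
mask = rng.random((N, N)) < p
A = np.triu(A * mask, 1)
A = A + A.T                    # symmetric, zero diagonal
A /= np.sqrt(N * p)            # normalisation for an O(1) spectrum
eigs = np.linalg.eigvalsh(A)
# Second moment of the semicircle law on [-2, 2] is 1.
print(round(float(np.mean(eigs ** 2)), 2))
```

At finite N the spectral edges fluctuate around ±2, but the empirical second moment concentrates near the semicircle value.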

  12. Celiac Disease Presenting as Fever of Unknown Origin

    Directory of Open Access Journals (Sweden)

    Megan J. Cooney

    2013-01-01

    Celiac disease (CD) is a common autoimmune enteropathy that occurs, in affected individuals, with exposure to gluten in the diet and improves with removal of dietary gluten. Although CD is readily considered in patients with classical presentations of the disease, atypical manifestations may be the only presenting symptoms. We present a case of CD in a 16-year-old female presenting as fever of unknown origin, which has not been reported previously. The postulated mechanism for fever in CD and the importance of clinicians having a low threshold for considering CD in the differential diagnosis of fever of unknown origin and other enigmatic clinical presentations are discussed.

  13. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...

  14. Fast grasping of unknown objects using principal component analysis

    Science.gov (United States)

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

    Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined from the single-view partial point cloud. To cope with grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance in grasping a series of unknown objects. To minimize grasping uncertainty, the robot hardware's two 3D cameras can be utilized to complete the partial point cloud. As a result of utilizing the robot hardware, grasping reliability is greatly enhanced. Therefore, this research demonstrates practical significance for increasing grasping speed and thus robot efficiency in unpredictable environments.
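The principal axis of a partial point cloud, along which the abstract allocates grasp candidates, is the leading eigenvector of the cloud's covariance matrix. A minimal numpy sketch of that step only (the cloud here is synthetic; force-balance optimization is not shown):

```python
import numpy as np

def principal_axis(points):
    """First principal axis (unit vector) of a 3-D point cloud, via PCA
    on the centred coordinates: the eigenvector of the covariance
    matrix with the largest eigenvalue."""
    P = np.asarray(points, dtype=float)
    C = P - P.mean(axis=0)
    w, v = np.linalg.eigh(np.cov(C.T))
    return v[:, np.argmax(w)]

# Synthetic elongated cloud along x: the axis should be close to (±1, 0, 0).
rng = np.random.default_rng(2)
cloud = rng.normal(size=(500, 3)) * np.array([10.0, 1.0, 1.0])
axis = principal_axis(cloud)
print(np.round(np.abs(axis), 2))
```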

  15. Perturbation Solutions for Random Linear Structural Systems subject to Random Excitation using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

The paper deals with the first and second order statistical moments of the response of linear systems with random parameters subject to random excitation modelled as white-noise multiplied by an envelope function with random parameters. The method of analysis is basically a second order perturbation method using stochastic differential equations. The joint statistical moments entering the perturbation solution are determined by considering an augmented dynamic system with state variables made up of the displacement and velocity vector and their first and second derivatives with respect to the random parameters of the problem. Equations for partial derivatives are obtained from the partial differentiation of the equations of motion. The zero time-lag joint statistical moment equations for the augmented state vector are derived from the Itô differential formula. General formulation is given...

  16. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    Science.gov (United States)

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  17. The Chronic Kidney Disease Water Intake Trial: Protocol of a Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    William F. Clark

    2017-08-01

    Background: In observational studies, drinking more water associates with a slower rate of kidney function decline; whether the same is true in a randomized controlled trial is unknown. Objective: To examine the 1-year effect of a higher vs usual water intake on estimated glomerular filtration rate (eGFR) in patients with chronic kidney disease. Design: Parallel-group randomized controlled trial. Setting: Nine centers in Ontario, Canada. Enrollment and randomization occurred between May 2013 and May 2016; follow-up for the primary outcome will continue until June 2017. Participants: Adults (n = 631) with stage 3 chronic kidney disease (eGFR 30-60 mL/min/1.73 m²) and microalbuminuria. Intervention: The high water intake group was coached to increase their oral water intake by 1.0 to 1.5 L/day (depending on sex and weight), over and above usual consumed beverages, for a period of 1 year. The control group was coached to maintain their usual water intake during this time. Measures: Participants provided 24-hour urine samples at baseline and at 6 and 12 months after randomization; urine samples were analyzed for volume, creatinine, osmolality, and the albumin-to-creatinine ratio. Blood samples were obtained at baseline and at 3- to 6-month intervals after randomization, and analyzed for creatinine, copeptin, osmolality, and electrolytes. Other measures collected included health-related quality of life, blood pressure, body mass index, and diet. Primary outcome: The between-group change in eGFR from baseline (prerandomization) to 12 months after randomization. Secondary outcomes: Change in plasma copeptin concentration, 24-hour urine albumin-to-creatinine ratio, measured creatinine clearance, estimated 5-year risk of kidney failure (using the 4-variable Kidney Failure Risk Equation), and health-related quality of life. Planned analysis: The primary analysis will follow an intention-to-treat approach. The between-group change in eGFR will be compared using

  18. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    Science.gov (United States)

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.
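The two-stage logic described here (regress the exposure on the instrument, then regress the outcome on the fitted exposure) can be shown in a few lines on simulated data with an unmeasured confounder. All variable names and coefficients below are illustrative, not from the article:

```python
import numpy as np

def two_stage_least_squares(z, x, y):
    """2SLS with a single instrument z: stage 1 regresses exposure x
    on z; stage 2 regresses outcome y on the stage-1 fitted values.
    Returns the estimated causal effect of x on y."""
    z, x, y = (np.asarray(a, dtype=float) for a in (z, x, y))
    Z = np.column_stack([np.ones_like(z), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # stage 1
    X = np.column_stack([np.ones_like(x_hat), x_hat])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]     # stage 2 slope

# u confounds x and y; z shifts x but affects y only through x.
rng = np.random.default_rng(3)
n = 20000
z = rng.normal(size=n)                    # instrument
u = rng.normal(size=n)                    # unmeasured confounder
x = z + u + rng.normal(size=n)            # exposure
y = 2.0 * x + 3.0 * u + rng.normal(size=n)  # true effect of x is 2
print(round(two_stage_least_squares(z, x, y), 2))  # near 2.0
```

Naive OLS of y on x is biased upward here (toward 3) by the confounder; the instrument recovers the causal slope.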

  19. Machine learning techniques to select variable stars

    Directory of Open Access Journals (Sweden)

    García-Varela Alejandro

    2017-01-01

    In order to perform a supervised classification of variable stars, we propose and evaluate a set of six features extracted from the magnitude density of the light curves. They are used to train automatic classification systems using state-of-the-art classifiers implemented in the R statistical computing environment. We find that random forests is the most successful method to select variable stars.
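The feature-extraction step described here, summary statistics of the magnitude distribution of a light curve, is easy to sketch. The features below are illustrative stand-ins for the paper's six density-based features (the classification itself used random forests in R and is not reproduced):

```python
import numpy as np

def light_curve_features(mag):
    """Simple summary features of a light curve's magnitude
    distribution (illustrative, not the paper's exact feature set)."""
    m = np.asarray(mag, dtype=float)
    p = np.percentile(m, [5, 25, 50, 75, 95])
    return {
        "amplitude": float(p[4] - p[0]),   # robust peak-to-peak range
        "iqr": float(p[3] - p[1]),         # interquartile range
        "std": float(m.std()),
        "skew": float(np.mean(((m - m.mean()) / m.std()) ** 3)),
    }

# Synthetic periodic variable vs a constant star with photometric noise.
t = np.linspace(0, 10, 500)
rng = np.random.default_rng(4)
variable_star = 15.0 + 0.8 * np.sin(2 * np.pi * t / 2.5) + 0.02 * rng.normal(size=t.size)
constant_star = 15.0 + 0.02 * rng.normal(size=t.size)
f_var = light_curve_features(variable_star)
f_const = light_curve_features(constant_star)
print(f_var["amplitude"] > 5 * f_const["amplitude"])  # features separate the two
```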

  20. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters
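The decomposition underlying this approach is the law of total variance: sampling the uncertain distribution parameter in an outer loop and the natural variability in an inner loop splits the total variance into a parameter-uncertainty part and a variability part. A hedged double-loop Monte Carlo sketch with illustrative numbers (not the paper's variance-based sensitivity machinery):

```python
import numpy as np

# X ~ Normal(mu, 1), with the parameter mu itself uncertain:
# mu ~ Normal(0, 0.5). Outer loop samples mu (parameter uncertainty),
# inner loop samples X given mu (natural variability).
rng = np.random.default_rng(5)
mus = rng.normal(0.0, 0.5, size=2000)                 # outer loop
samples = rng.normal(mus[:, None], 1.0, size=(2000, 200))  # inner loop
total_var = samples.var()
param_part = samples.mean(axis=1).var()        # Var_mu E[X | mu]
variability_part = samples.var(axis=1).mean()  # E_mu Var[X | mu]
# Law of total variance: the two parts sum to the total variance;
# here roughly 0.25 (parameter) + 1.0 (variability).
print(round(float(param_part), 2), round(float(variability_part), 2))
```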

  1. Quantum key distribution with an unknown and untrusted source

    Science.gov (United States)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2009-03-01

    The security of a standard bi-directional "plug & play" quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is effectively controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion on this subject has been made previously. In this paper, we present the first quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard BB84 protocol, the weak+vacuum decoy state protocol, and the one-decoy decoy state protocol, with unknown and untrusted sources, is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A, 77:052327 (2008).

  2. Grasping Unknown Objects in an Early Cognitive Vision System

    DEFF Research Database (Denmark)

    Popovic, Mila

    2011-01-01

    Grasping of unknown objects presents an important and challenging part of robot manipulation. The growing area of service robotics depends upon the ability of robots to autonomously grasp and manipulate a wide range of objects in everyday environments. Simple, non task-specific grasps of unknown ... This thesis presents a system for robotic grasping of unknown objects using stereo vision. Grasps are defined based on contour and surface information provided by the Early Cognitive Vision System, which organizes visual information into a biologically motivated hierarchical representation. The contributions of the thesis are: the extension of the Early Cognitive Vision representation with a new type of feature hierarchy in the texture domain, the definition and evaluation of contour based grasping methods, the definition and evaluation of surface based grasping methods, the definition of a benchmark for testing and comparing vision-based grasping methods, and the creation of algorithms for bootstrapping a process of acquiring world understanding for artificial cognitive agents.

  3. Prediction of university student’s addictability based on some demographic variables, academic procrastination, and interpersonal variables

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Tavakoli

    2014-02-01

    Objectives: This study aimed to predict addictability among students based on demographic variables, academic procrastination, and interpersonal variables, and also to study the prevalence of addictability among these students. Method: The participants were 500 students (260 females, 240 males) selected through stratified random sampling from the students of the Islamic Azad University, Abadan Branch. The participants were assessed using an individual specifications inventory, the Addiction Potential Scale, and the Aitken Procrastination Inventory. Findings: The findings showed that 23.6% of students had a readiness for addiction. Men showed higher addictability than women, but no age difference was found. Variables such as economic status, age, major, and academic procrastination predicted 13% of the variance in addictability, and among the interpersonal variables, having friends who use drugs and a dissociated family predicted 13.2%. Conclusion: This study has applied implications for addiction prevention.

  4. 18F-FDG-PET/CT in fever of unknown origin

    DEFF Research Database (Denmark)

    Middelbo Buch-Olsen, Karen; Andersen, Rikke V; Hess, Søren

    2014-01-01

    OBJECTIVE: Fever of unknown origin continues to be a diagnostic challenge for clinicians. The aim of this study was to confirm whether (18)F-fluorodeoxyglucose ((18)F-FDG)-PET/computed tomography (CT) is a helpful tool in patients suffering from this condition. PATIENTS AND METHODS: Fifty-seven patients with fever of unknown origin were examined with (18)F-FDG-PET/CT as part of their diagnostic workup at the clinicians' discretion. The medical records were read retrospectively to establish the final diagnosis and evaluate the degree to which PET/CT contributed to the diagnosis. RESULTS: (18)F-FDG-PET/CT is a useful tool in the investigation of fever of unknown origin; it can reduce patient inconvenience and possibly costs to society if used earlier in the diagnostic process.

  5. Comparison of structured and unstructured physical activity training on predicted VO2max and heart rate variability in adolescents - a randomized control trial.

    Science.gov (United States)

    Sharma, Vivek Kumar; Subramanian, Senthil Kumar; Radhakrishnan, Krishnakumar; Rajendran, Rajathi; Ravindran, Balasubramanian Sulur; Arunachalam, Vinayathan

    2017-05-01

    Physical inactivity contributes to many health issues. The WHO-recommended physical activity for adolescents encompasses aerobic, resistance, and bone-strengthening exercises aimed at achieving health-related physical fitness. Heart rate variability (HRV) and maximal aerobic capacity (VO2max) are considered noninvasive measures of cardiovascular health. The objective of this study is to compare the effects of structured and unstructured physical training on maximal aerobic capacity and HRV among adolescents. We designed a single-blind, parallel, randomized active-controlled trial (Registration No. CTRI/2013/08/003897) to compare the physiological effects of 6 months of globally recommended structured physical activity (SPA) with those of unstructured physical activity (USPA) in healthy school-going adolescents. We recruited 439 healthy student volunteers (boys: 250, girls: 189) in the age group of 12-17 years. Randomization across the groups was done using an age- and gender-stratified randomization method, and the participants were divided into two groups: SPA (n=219, boys: 117, girls: 102) and USPA (n=220, boys: 119, girls: 101). Depending on their training status and gender, the participants in both the SPA and USPA groups were further subdivided into the following four subgroups: SPA athlete boys (n=22) and girls (n=17), SPA nonathlete boys (n=95) and girls (n=85), USPA athlete boys (n=23) and girls (n=17), and USPA nonathlete boys (n=96) and girls (n=84). We recorded HRV, body fat %, and VO2max using the Rockport Walk Fitness test before and after the intervention. Maximal aerobic capacity and heart rate variability increased significantly, while heart rate, systolic blood pressure, diastolic blood pressure, and body fat percentage decreased significantly after both the SPA and USPA interventions. However, the improvement was greater with SPA than with USPA. SPA is more beneficial for improving cardiorespiratory fitness, HRV, and reducing body fat percentage in terms of

  6. A Note on the Correlated Random Coefficient Model

    DEFF Research Database (Denmark)

    Kolodziejczyk, Christophe

    In this note we derive the bias of the OLS estimator for a correlated random coefficient model with one random coefficient, which is correlated with a binary variable. We provide set-identification of the parameters of interest of the model. We also show how to reduce the bias of the estimator...

  7. General anesthesia suppresses normal heart rate variability in humans

    Science.gov (United States)

    Matchett, Gerald; Wood, Philip

    2014-06-01

    The human heart normally exhibits robust beat-to-beat heart rate variability (HRV). The loss of this variability is associated with pathology, including disease states such as congestive heart failure (CHF). The effect of general anesthesia on intrinsic HRV is unknown. In this prospective, observational study we enrolled 100 human subjects having elective major surgical procedures under general anesthesia. We recorded continuous heart rate data via continuous electrocardiogram before, during, and after anesthesia, and we assessed HRV of the R-R intervals. We assessed HRV using several common metrics including Detrended Fluctuation Analysis (DFA), Multifractal Analysis, and Multiscale Entropy Analysis. Each of these analyses was done in each of the four clinical phases for each study subject over the course of 24 h: Before anesthesia, during anesthesia, early recovery, and late recovery. On average, we observed a loss of variability on the aforementioned metrics that appeared to correspond to the state of general anesthesia. Following the conclusion of anesthesia, most study subjects appeared to regain their normal HRV, although this did not occur immediately. The resumption of normal HRV was especially delayed on DFA. Qualitatively, the reduction in HRV under anesthesia appears similar to the reduction in HRV observed in CHF. These observations will need to be validated in future studies, and the broader clinical implications of these observations, if any, are unknown.

  8. Intraabdominal abscessus of unknown etiology

    Directory of Open Access Journals (Sweden)

    Čolović Radoje

    2012-01-01

    Full Text Available Introduction. Intraabdominal abscesses are in 98-99% of cases the result of secondary and in only 1-2% of primary peritonitis. They are easily and successfully diagnosed. Abdominal abscesses of unknown cause are extremely rare. Case Outline. The authors present a 68-year-old man, without significant data in his past history, who suddenly developed epigastric pain, nausea, vomiting and leukocytosis, which was treated with antibiotics, resulting in the alleviation of complaints and reduction of the white blood cell count. After five days ultrasonography showed an encapsulated collection of dense fluid in the epigastrium, confirmed by CT scan two days later. Upper endoscopy excluded ulcer and/or perforation of the stomach and duodenum. Under local anesthesia, through the upper part of the left rectal muscle, puncture followed by incision was done, and about 50 ml of dense pus was removed. Finger exploration showed no foreign body within the cavity. With drainage, the recovery was quick and uneventful. Preoperative and postoperative abdominal investigations found no cause of the abscess. Two and a half years after surgery the patient remained symptom-free with normal clinical, laboratory and ultrasonographic findings. Conclusion. The authors presented an intraabdominal abscess of unknown cause that was successfully treated with antibiotics, percutaneous puncture and drainage under local anesthesia. In spite of all diagnostic methods the cause of the abscess could not be found. Thus, such a possibility, although rare, should be taken into account.

  9. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many...... of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables...... analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect...

  10. Sources of variability of resting cerebral blood flow in healthy subjects

    DEFF Research Database (Denmark)

    Henriksen, Otto Mølby; Kruuse, Christina Rostrup; Olesen, Jes

    2013-01-01

    Measurements of cerebral blood flow (CBF) show large variability among healthy subjects. The aim of the present study was to investigate the relative effect of established factors influencing CBF on the variability of resting CBF. We retrospectively analyzed spontaneous variability in 430 CBF...... measurements acquired in 152 healthy, young subjects using (133)Xe single-photon emission computed tomography. Cerebral blood flow was correlated positively with both end-tidal expiratory PCO2 (PETCO2) and female gender and inversely with hematocrit (Hct). Between- and within-subject CO2 reactivity...... when Hct was also accounted for. The present study confirms large between-subject variability in CBF measurements and that gender, Hct, and PETCO2 explain only a small part of this variability. This implies that a large fraction of CBF variability may be due to unknown factors such as differences...

  11. Visual variability affects early verb learning.

    Science.gov (United States)

    Twomey, Katherine E; Lush, Lauren; Pearce, Ruth; Horst, Jessica S

    2014-09-01

    Research demonstrates that within-category visual variability facilitates noun learning; however, the effect of visual variability on verb learning is unknown. We habituated 24-month-old children to a novel verb paired with an animated star-shaped actor. Across multiple trials, children saw either a single action from an action category (identical actions condition, for example, travelling while repeatedly changing into a circle shape) or multiple actions from that action category (variable actions condition, for example, travelling while changing into a circle shape, then a square shape, then a triangle shape). Four test trials followed habituation. One paired the habituated verb with a new action from the habituated category (e.g., 'dacking' + pentagon shape) and one with a completely novel action (e.g., 'dacking' + leg movement). The others paired a new verb with a new same-category action (e.g., 'keefing' + pentagon shape), or a completely novel category action (e.g., 'keefing' + leg movement). Although all children discriminated novel verb/action pairs, children in the identical actions condition discriminated trials that included the completely novel verb, while children in the variable actions condition discriminated the out-of-category action. These data suggest that - as in noun learning - visual variability affects verb learning and children's ability to form action categories. © 2014 The British Psychological Society.

  12. Covariate adjustments in randomized controlled trials increased study power and reduced biasedness of effect size estimation.

    Science.gov (United States)

    Lee, Paul H

    2016-08-01

    This study aims to show that under several assumptions, in randomized controlled trials (RCTs), unadjusted, crude analysis will underestimate the Cohen's d effect size of the treatment, and an unbiased estimate of effect size can be obtained only by adjusting for all predictors of the outcome. Four simulations were performed to examine the effects of adjustment on the estimated effect size of the treatment and power of the analysis. In addition, we analyzed data from the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study (older adults aged 65-94), an RCT with three treatment arms and one control arm. We showed that (1) the number of unadjusted covariates was associated with the effect size of the treatment; (2) the biasedness of effect size estimation was minimized if all covariates were adjusted for; (3) the power of the statistical analysis slightly decreased with the number of adjusted noise variables; and (4) exhaustively searching the covariates and noise variables adjusted for can lead to exaggeration of the true effect size. Analysis of the ACTIVE study data showed that the effect sizes adjusting for covariates of all three treatments were 7.39-24.70% larger than their unadjusted counterparts, whereas the effect size would be elevated by at most 57.92% by exhaustively searching the variables adjusted for. All covariates of the outcome in RCTs should be adjusted for, and if the effect of a particular variable on the outcome is unknown, adjustment will do more good than harm. Copyright © 2016 Elsevier Inc. All rights reserved.
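The adjustment effect described in this abstract is easy to reproduce in a small simulation. The sketch below is illustrative only (simulated data, not the ACTIVE study, and not the authors' code): residualizing the outcome on a prognostic covariate shrinks the residual variance, so the estimated standardized effect size (Cohen's d) of a randomized treatment grows.

```python
# Illustrative sketch: covariate adjustment in a simulated RCT enlarges
# the estimated Cohen's d by removing covariate-induced outcome variance.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
treat = rng.integers(0, 2, n)          # randomized treatment assignment
x = rng.normal(size=n)                 # prognostic covariate, independent of treat
y = 0.3 * treat + 1.0 * x + rng.normal(size=n)

def cohens_d(y, g):
    a, b = y[g == 1], y[g == 0]
    sp = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)  # pooled SD
    return (a.mean() - b.mean()) / sp

d_crude = cohens_d(y, treat)
# adjust: residualize the outcome on the covariate before computing d
slope, intercept = np.polyfit(x, y, 1)
d_adj = cohens_d(y - slope * x, treat)
print(d_crude, d_adj)                  # adjusted d is noticeably larger
```

Here the crude estimate is diluted by the covariate's contribution to the pooled standard deviation, matching the abstract's claim that unadjusted analysis underestimates the treatment's effect size.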

  13. An Undergraduate Research Experience on Studying Variable Stars

    Science.gov (United States)

    Amaral, A.; Percy, J. R.

    2016-06-01

    We describe and evaluate a summer undergraduate research project and experience by one of us (AA), under the supervision of the other (JP). The aim of the project was to sample current approaches to analyzing variable star data, and topics related to the study of Mira variable stars and their astrophysical importance. This project was done through the Summer Undergraduate Research Program (SURP) in astronomy at the University of Toronto. SURP allowed undergraduate students to explore and learn about many topics within astronomy and astrophysics, from instrumentation to cosmology. SURP introduced students to key skills which are essential for students hoping to pursue graduate studies in any scientific field. Variable stars proved to be an excellent topic for a research project. For beginners to independent research, it introduces key concepts in research such as critical thinking and problem solving, while illuminating previously learned topics in stellar physics. The focus of this summer project was to compare observations with structural and evolutionary models, including modelling the random walk behavior exhibited in the (O-C) diagrams of most Mira stars. We found that the random walk could be modelled by using random fluctuations of the period. This explanation agreed well with observations.
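The random-walk explanation mentioned above can be illustrated with a toy simulation (hypothetical numbers, not fitted to any star): if the period fluctuates randomly from cycle to cycle, the (O-C) values, observed minus computed times of maximum, accumulate those fluctuations into a random walk.

```python
# Toy illustration: random per-cycle period fluctuations produce
# random-walk behavior in a variable star's (O-C) diagram.
import numpy as np

rng = np.random.default_rng(42)
P0, sigma_P, n_cycles = 300.0, 0.5, 200      # days; illustrative values
periods = P0 + rng.normal(0.0, sigma_P, n_cycles)
observed = np.cumsum(periods)                # actual times of maximum
computed = P0 * np.arange(1, n_cycles + 1)   # ephemeris with a constant period
o_minus_c = observed - computed              # accumulates the fluctuations
```

Because each cycle's deviation is summed, the (O-C) scatter grows roughly as the square root of the cycle count, the signature seen in many Mira stars.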

  14. Scheme for teleportation of unknown states of trapped ion

    Institute of Scientific and Technical Information of China (English)

    Chen Mei-Feng; Ma Song-She

    2008-01-01

    A scheme is presented for teleporting an unknown state in a trapped ion system. The scheme only requires a single laser beam. It allows the trap to be in any state with a few phonons, e.g. a thermal motion. Furthermore, it works in the regime where the Rabi frequency of the laser is on the order of the trap frequency. Thus, the teleportation speed is greatly increased, which is important for decreasing the decoherence effect. This idea can also be used to teleport an unknown ionic entangled state.

  15. Study of Randomness in AES Ciphertexts Produced by Randomly Generated S-Boxes and S-Boxes with Various Modulus and Additive Constant Polynomials

    Science.gov (United States)

    Das, Suman; Sadique Uz Zaman, J. K. M.; Ghosh, Ranjan

    2016-06-01

    In the Advanced Encryption Standard (AES), the standard S-Box is conventionally generated by using a particular irreducible polynomial {11B} in GF(2^8) as the modulus and a particular additive constant polynomial {63} in GF(2), though it can be generated with many other polynomials. In this paper, it has been shown that secured AES S-Boxes can be generated by using other selected modulus and additive polynomials, and also randomly, using a PRNG like BBS. A comparative study has been made of the randomness of the corresponding AES ciphertexts generated using these S-Boxes, by means of the NIST Test Suite coded for this paper. It has been found that besides the standard one, other moduli and additive constants are also able to generate equally or better random ciphertexts; the same is true for random S-Boxes. As these new types of S-Boxes are user-defined, hence unknown, they are able to prevent linear and differential cryptanalysis. Moreover, they act as additional key-inputs to AES, thus increasing the key-space.
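The construction this abstract varies can be sketched as follows, assuming the textbook AES S-Box recipe (multiplicative inverse in GF(2^8) modulo an irreducible polynomial, followed by an affine map with an additive constant); swapping `mod` or `c` yields alternative S-Boxes of the kind the paper studies.

```python
# Minimal sketch of AES S-Box generation with a configurable modulus
# polynomial and additive constant (defaults are the standard 0x11B, 0x63).
def gf_mul(a, b, mod=0x11B):
    """Multiply two bytes as polynomials over GF(2), reduced mod `mod`."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:          # degree reached 8: reduce
            a ^= mod
        b >>= 1
    return r

def gf_inv(a, mod=0x11B):
    """Brute-force multiplicative inverse in GF(2^8); AES maps 0 to 0."""
    if a == 0:
        return 0
    for x in range(1, 256):
        if gf_mul(a, x, mod) == 1:
            return x

def sbox(a, mod=0x11B, c=0x63):
    """Inverse in GF(2^8) followed by the AES affine transform with constant c."""
    x = gf_inv(a, mod)
    b = 0
    for i in range(8):
        bit = ((x >> i) ^ (x >> ((i + 4) % 8)) ^ (x >> ((i + 5) % 8))
               ^ (x >> ((i + 6) % 8)) ^ (x >> ((i + 7) % 8)) ^ (c >> i)) & 1
        b |= bit << i
    return b

print(hex(sbox(0x00)), hex(sbox(0x01)))   # standard S-Box values 0x63, 0x7c
```

With the default parameters this reproduces the standard FIPS-197 S-Box; any other irreducible degree-8 modulus or additive constant gives a different, structurally analogous S-Box.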

  16. One-day versus 3-day suprapubic catheterization after vaginal prolapse surgery : a prospective randomized trial

    NARCIS (Netherlands)

    Van der Steen, Annemarie; Detollenaere, Renee; Den Boon, Jan; Van Eijndhoven, Hugo

    For prolonged catheterization after vaginal prolapse surgery with anterior colporrhaphy, the optimal duration to prevent overdistention of the bladder remains unknown. We designed this study to determine the optimal length of catheterization. We conducted a prospective randomized trial in which 179

  17. Exponential gain of randomness certified by quantum contextuality

    Science.gov (United States)

    Um, Mark; Zhang, Junhua; Wang, Ye; Wang, Pengfei; Kim, Kihwan

    2017-04-01

    We demonstrate a protocol for exponential gain of randomness certified by quantum contextuality in a trapped-ion system. Genuine randomness can be produced by quantum principles and certified by quantum inequalities. Recently, randomness expansion protocols based on Bell-test inequalities and the Kochen-Specker (KS) theorem have been demonstrated. These schemes have been theoretically extended to exponentially expand randomness and to amplify the randomness from a weak initial random seed. Here, we report experimental evidence of such exponential expansion of randomness. In the experiment, we use three states of a 138Ba+ ion, comprising a ground state and two quadrupole states. In the 138Ba+ ion system there is no detection loophole, and we apply a method to rule out certain hidden-variable models that obey a kind of extended noncontextuality.

  18. Synchronization of Coupled Different Chaotic FitzHugh-Nagumo Neurons with Unknown Parameters under Communication-Direction-Dependent Coupling

    Directory of Open Access Journals (Sweden)

    Muhammad Iqbal

    2014-01-01

    Full Text Available This paper investigates the chaotic behavior and synchronization of two different coupled chaotic FitzHugh-Nagumo (FHN neurons with unknown parameters under external electrical stimulation (EES. The coupled FHN neurons of different parameters admit unidirectional and bidirectional gap junctions in the medium between them. Dynamical properties, such as the increase in synchronization error as a consequence of the deviation of neuronal parameters for unlike neurons, the effect of difference in coupling strengths caused by the unidirectional gap junctions, and the impact of large time-delay due to separation of neurons, are studied in exploring the behavior of the coupled system. A novel integral-based nonlinear adaptive control scheme, to cope with the infeasibility of the recovery variable, for synchronization of two coupled delayed chaotic FHN neurons of different and unknown parameters under uncertain EES is derived. Further, to guarantee robust synchronization of different neurons against disturbances, the proposed control methodology is modified to achieve the uniformly ultimately bounded synchronization. The parametric estimation errors can be reduced by selecting suitable control parameters. The effectiveness of the proposed control scheme is illustrated via numerical simulations.

  19. Decentralised output feedback control of Markovian jump interconnected systems with unknown interconnections

    Science.gov (United States)

    Li, Li-Wei; Yang, Guang-Hong

    2017-07-01

    The problem of decentralised output feedback control is addressed for Markovian jump interconnected systems with unknown interconnections and general transition rates (TRs) allowed to be unknown or known with uncertainties. A class of decentralised dynamic output feedback controllers are constructed, and a cyclic-small-gain condition is exploited to deal with the unknown interconnections so that the resultant closed-loop system is stochastically stable and satisfies an H∞ performance. With slack matrices to cope with the nonlinearities incurred by unknown and uncertain TRs in control synthesis, a novel controller design condition is developed in linear matrix inequality formalism. Compared with the existing works, the proposed approach leads to less conservatism. Finally, two examples are used to illustrate the effectiveness of the new results.

  20. Renal disease masquerading as pyrexia of unknown origin

    Directory of Open Access Journals (Sweden)

    D Korivi

    2013-01-01

    Full Text Available Pyrexia of unknown origin is a challenging clinical problem. Infections, malignancies, and connective tissue diseases form the major etiologies for this condition. We report a case of a 57-year-old diabetic male who presented with fever of unknown origin for several months. The course of investigations led to a kidney biopsy, which clinched the cause of his fever as well as the underlying diagnosis. The light microscopy findings of expansile storiform fibrosis with a dense inflammatory infiltrate suggested the diagnosis, which was confirmed by positive Immunoglobulin G4 staining of the dense lymphoplasmacytic infiltrate and elevated serum IgG4 concentrations. A course of steroids followed by mycophenolate mofetil as maintenance immunosuppression rendered the patient afebrile with improvement of renal function.

  1. Least squares estimation in a simple random coefficient autoregressive model

    DEFF Research Database (Denmark)

    Johansen, S; Lange, T

    2013-01-01

    The question we discuss is whether a simple random coefficient autoregressive model with infinite variance can create the long swings, or persistence, which are observed in many macroeconomic variables. The model is defined by yt=stρyt−1+εt, t=1,…,n, where st is an i.i.d. binary variable with p...... we prove the curious result that [formula omitted]. The proof applies the notion of a tail index of sums of positive random variables with infinite variance to find the order of magnitude of [formula omitted] and [formula omitted] and hence the limit of [formula omitted]...
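A minimal simulation of the model in this abstract, yt = st*rho*yt−1 + εt, can make the "long swings" mechanism concrete (the parameter values below are illustrative, not the authors'): long stretches with st = 1 let the autoregression build persistent swings, which the random resets at st = 0 then interrupt.

```python
# Simulate the random coefficient autoregression y_t = s_t*rho*y_{t-1} + eps_t
# with s_t an i.i.d. Bernoulli(p) switch (illustrative parameter values).
import numpy as np

def simulate_rca(n, rho=1.0, p=0.9, seed=1):
    rng = np.random.default_rng(seed)
    s = rng.random(n) < p                  # Bernoulli(p) switch variable
    eps = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = s[t] * rho * y[t - 1] + eps[t]
    return y

y = simulate_rca(5000)
```

With rho = 1 and p close to 1 the path behaves like a random walk between resets, which is the kind of persistence the abstract asks about.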

  2. Autonomous Flight in Unknown Indoor Environments

    OpenAIRE

    Bachrach, Abraham Galton; He, Ruijie; Roy, Nicholas

    2009-01-01

    This paper presents our solution for enabling a quadrotor helicopter, equipped with a laser rangefinder sensor, to autonomously explore and map unstructured and unknown indoor environments. While these capabilities are already commodities on ground vehicles, air vehicles seeking the same performance face unique challenges. In this paper, we describe the difficulties in achieving fully autonomous helicopter flight, highlighting the differences between ground and helicopter robots that make it ...

  3. Statistics of α-μ Random Variables and Their Applications in Wireless Multihop Relaying and Multiple Scattering Channels

    KAUST Repository

    Wang, Kezhi; Wang, Tian; Chen, Yunfei; Alouini, Mohamed-Slim

    2015-06-01

    Exact results for the probability density function (PDF) and cumulative distribution function (CDF) of the sum of ratios of products (SRP) and the sum of products (SP) of independent α-μ random variables (RVs) are derived. They are in the form of a 1-D integral based on existing works on the products and ratios of α-μ RVs. In the derivation, generalized Gamma (GG) ratio approximation (GGRA) is proposed to approximate SRP. Gamma ratio approximation (GRA) is proposed to approximate SRP and the ratio of sums of products (RSP). GG approximation (GGA) and Gamma approximation (GA) are used to approximate SP. The proposed results of the SRP can be used to calculate the outage probability (OP) for wireless multihop relaying systems or multiple scattering channels with interference. The proposed results of the SP can be used to calculate the OP for these systems without interference. In addition, the proposed approximate result of the RSP can be used to calculate the OP of the signal-to-interference ratio (SIR) in a multiple scattering system with interference. © 1967-2012 IEEE.
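Closed-form results of this kind can be cross-checked numerically. The sketch below is a Monte Carlo estimate, not the paper's derivation; it assumes the standard fact that an α-μ envelope can be sampled as a Gamma(μ) variate raised to the power 1/α (unit mean-power normalization assumed), and estimates the outage probability P(SP < threshold) for a sum of products.

```python
# Monte Carlo sketch: empirical CDF (outage probability) of a sum of
# products of independent alpha-mu variates.
import numpy as np

def alpha_mu(rng, alpha, mu, size):
    # An alpha-mu sample: Gamma(mu)-distributed power raised to 1/alpha.
    return rng.gamma(mu, 1.0 / mu, size) ** (1.0 / alpha)

def sp_outage(threshold, terms=2, factors=2, alpha=2.0, mu=1.5,
              n=200_000, seed=0):
    rng = np.random.default_rng(seed)
    prod = np.ones((terms, n))
    for _ in range(factors):              # each term is a product of factors
        for k in range(terms):
            prod[k] *= alpha_mu(rng, alpha, mu, n)
    s = prod.sum(axis=0)                  # the sum of products (SP)
    return float(np.mean(s < threshold))  # empirical P(SP < threshold)

p = sp_outage(0.5)
```

For a multihop relaying link, `factors` plays the role of the number of cascaded hops and `terms` the number of summed branches; the empirical CDF here can be compared against the paper's 1-D integral expressions.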
  5. High taxonomic variability despite stable functional structure across microbial communities.

    Science.gov (United States)

    Louca, Stilianos; Jacques, Saulo M S; Pires, Aliny P F; Leal, Juliana S; Srivastava, Diane S; Parfrey, Laura Wegener; Farjalla, Vinicius F; Doebeli, Michael

    2016-12-05

    Understanding the processes that are driving variation of natural microbial communities across space or time is a major challenge for ecologists. Environmental conditions strongly shape the metabolic function of microbial communities; however, other processes such as biotic interactions, random demographic drift or dispersal limitation may also influence community dynamics. The relative importance of these processes and their effects on community function remain largely unknown. To address this uncertainty, here we examined bacterial and archaeal communities in replicate 'miniature' aquatic ecosystems contained within the foliage of wild bromeliads. We used marker gene sequencing to infer the taxonomic composition within nine metabolic functional groups, and shotgun environmental DNA sequencing to estimate the relative abundances of these groups. We found that all of the bromeliads exhibited remarkably similar functional community structures, but that the taxonomic composition within individual functional groups was highly variable. Furthermore, using statistical analyses, we found that non-neutral processes, including environmental filtering and potentially biotic interactions, at least partly shaped the composition within functional groups and were more important than spatial dispersal limitation and demographic drift. Hence both the functional structure and taxonomic composition within functional groups of natural microbial communities may be shaped by non-neutral and roughly separate processes.

  6. Softening in Random Networks of Non-Identical Beams.

    Science.gov (United States)

    Ban, Ehsan; Barocas, Victor H; Shephard, Mark S; Picu, Catalin R

    2016-02-01

    Random fiber networks are assemblies of elastic elements connected in random configurations. They are used as models for a broad range of fibrous materials including biopolymer gels and synthetic nonwovens. Although the mechanics of networks made from the same type of fibers has been studied extensively, the behavior of composite systems of fibers with different properties has received less attention. In this work we numerically and theoretically study random networks of beams and springs of different mechanical properties. We observe that the overall network stiffness decreases on average as the variability of fiber stiffness increases, at constant mean fiber stiffness. Numerical results and analytical arguments show that for small variabilities in fiber stiffness the amount of network softening scales linearly with the variance of the fiber stiffness distribution. This result holds for any beam structure and is expected to apply to a broad range of materials including cellular solids.

  7. Designing towards the Unknown: Engaging with Material and Aesthetic Uncertainty

    Directory of Open Access Journals (Sweden)

    Danielle Wilde

    2017-12-01

    Full Text Available New materials with new capabilities demand new ways of approaching design. Destabilising existing methods is crucial to developing new methods. Yet radical destabilisation—where outcomes remain unknown long enough that new discoveries become possible—is not easy in technology design, where complex interdisciplinary teams with time and resource constraints need to deliver concrete outcomes on schedule. The Poetic Kinaesthetic Interface project (PKI engages with this problematic directly. In PKI we use unfolding processes—informed by participatory, speculative and critical design—in emergent actions, to design towards unknown outcomes, using unknown materials. The impossibility of this task is proving as useful as it is disruptive. At its most potent, it is destabilising expectations, aesthetics and processes. Keeping the researchers, collaborators and participants in a state of unknowing is opening the research potential to far-ranging possibilities. In this article we unpack the motivations driving the PKI project. We present our mixed methodology, which entangles textile crafts, design interactions and materiality to shape an embodied enquiry. Our research outcomes are procedural and methodological. PKI brings together diverse human, non-human, known and unknown actors to discover where the emergent assemblages might lead. Our approach is re-invigorating—as it demands re-envisioning of—the design process.

  8. Value of Bone marrow Examination in Pyrexia of unknown origin

    Directory of Open Access Journals (Sweden)

    A Jha

    2013-10-01

    Full Text Available Background: Pyrexia of unknown origin is a common diagnostic dilemma. A series of diagnostic modalities are required to arrive at a diagnosis. Bone marrow examination is one of the common tests implicated in the diagnosis, in combination with other diagnostic modalities. The present study attempted to explore the causes of pyrexia of unknown origin based on bone marrow morphological study. Materials and Methods: In a one-year prospective study conducted at Manipal Teaching Hospital, Pokhara, Nepal, bone marrow aspiration and biopsy were performed and evaluated morphologically in 57 patients fulfilling the criteria of classic pyrexia of unknown origin. Results: In 42% of cases a specific diagnosis could be made, and hematological neoplasm was the most common finding, followed by megaloblastic anemia, hypoplastic anemia and one case each of hemophagocytosis, malaria and tuberculosis. Acute leukemia was the most frequently encountered hematological malignancy, followed by multiple myeloma, chronic myeloid leukemia, essential thrombocythemia and myelodysplastic syndrome. Conclusion: Morphological examination of bone marrow has an important role in the diagnosis of pyrexia of unknown origin. However, the diagnostic yield can be increased if it is combined with other diagnostic modalities, including radiological, microbiological and serological tests. DOI: http://dx.doi.org/10.3126/jpn.v3i6.8991 Journal of Pathology of Nepal (2013) Vol. 3, 447-451

  9. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

    In this paper, we construct protocols for two parties that do not trust each other, to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication and the entropy of the outcome.

  10. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by

  11. Ecohydrological change and variability over western North America from the Last Glacial Maximum to the near term future: The known, the unknown and the known unknown

    Science.gov (United States)

    Seager, R.; Mankin, J. S.; Cook, B.; Scheff, J.; Smerdon, J. E.; Coats, S.; Williams, P.

    2017-12-01

    Hydroclimate variability and change in western North America from the last glacial to the near future is reviewed focusing on non-orbital-induced variations. The motivating factor is model projections of intensifying aridity in southwestern North America as a consequence of rising greenhouse gases. Future change will be considered in the context of changes in precipitation, evaporative demand and CO2 and from perspectives of surface hydroclimate and ecological response. Current earth system models project increasing aridity in terms of standard drought measures and declining soil moisture quite robustly as a consequence of declining precipitation (in the interior southwest) and rising evaporative demand (everywhere). However, the same models project rising net primary production, leaf area index and evapotranspiration as the effects on plants of increasing CO2 outweigh the effects of increased water stress. There are reasons to be cautious about the realism of this modeled response but it poses a future with a competition between ecosystems and humans (dependent on soil moisture and runoff) for available water. Looking back over the last millennium with the aid of near-hemispheric tree-ring drought reconstructions, we show that the driving role the tropical Pacific and tropical North Atlantic Oceans have in orchestrating the sequence of western droughts and pluvials extends to the Medieval megadroughts. These famed severe and extended droughts, still visible in the landscape today, were most likely caused by internal atmosphere-ocean variability including frequent, but not anomalously recurrent, La Niña conditions accompanied by persistent warm shifts in the tropical North Atlantic. Passing over early and mid-Holocene climates that were strongly influenced by orbital forcing, we turn to Last Glacial Maximum climates. According to lake records the LGM was moist across western North America while pollen records show a mix of wetter-looking and less lush ("drier

  12. Random sampling of the Central European bat fauna reveals the existence of numerous hitherto unknown adenoviruses.

    Science.gov (United States)

    Vidovszky, Márton; Kohl, Claudia; Boldogh, Sándor; Görföl, Tamás; Wibbelt, Gudrun; Kurth, Andreas; Harrach, Balázs

    2015-12-01

    From over 1250 extant species of the order Chiroptera, 25 and 28 are known to occur in Germany and Hungary, respectively. Close to 350 samples originating from 28 bat species (17 from Germany, 27 from Hungary) were screened for the presence of adenoviruses (AdVs) using a nested PCR that targets the DNA polymerase gene of AdVs. An additional PCR was designed and applied to amplify a fragment from the gene encoding the IVa2 protein of mastadenoviruses. All German samples originated from organs of bats found moribund or dead. The Hungarian samples were excrements collected from colonies of known bat species, throat or rectal swab samples, taken from live individuals that had been captured for faunistic surveys and migration studies, as well as internal organs of dead specimens. Overall, 51 samples (14.73%) were found positive. We detected 28 seemingly novel and six previously described bat AdVs by sequencing the PCR products. The positivity rate was the highest among the guano samples of bat colonies. In phylogeny reconstructions, the AdVs detected in bats clustered roughly, but not perfectly, according to the hosts' families (Vespertilionidae, Rhinolophidae, Hipposideridae, Phyllostomidae and Pteropodidae). In a few cases, identical sequences were derived from animals of closely related species. On the other hand, some bat species proved to harbour more than one type of AdV. The high prevalence of infection and the large number of chiropteran species worldwide make us hypothesise that hundreds of different yet unknown AdV types might circulate in bats.

  13. A data based random number generator for a multivariate distribution (using stochastic interpolation)

    Science.gov (United States)

    Thompson, J. R.; Taylor, M. S.

    1982-01-01

    Let X be a K-dimensional random variable serving as input for a system with output Y (not necessarily of dimension K). Given X, an outcome Y or a distribution of outcomes G(Y|X) may be obtained either explicitly or implicitly. The situation is considered in which there is a real-world data set {X_j}, j = 1, …, n, and a means of simulating an outcome Y. A method for empirical random number generation based on the sample of observations of the random variable X, without estimating the underlying density, is discussed.
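    The core idea, drawing new variates directly from a sample without fitting a density, can be sketched as a smoothed, neighbour-based bootstrap. The function below is an illustrative one-dimensional analogue of such a generator, not the authors' exact stochastic-interpolation algorithm; the neighbourhood size `m` is a hypothetical tuning parameter.

```python
import random

def empirical_variate(data, m=3, rng=random):
    """Draw a new variate from the empirical distribution of `data`
    without estimating a density: pick an observation at random and
    return a random convex combination of it and its m-1 nearest
    neighbours (a smoothed-bootstrap sketch)."""
    x = rng.choice(data)
    neighbours = sorted(data, key=lambda y: abs(y - x))[:m]
    w = [rng.random() for _ in neighbours]   # random weights, normalized below
    s = sum(w)
    return sum(wi / s * yi for wi, yi in zip(w, neighbours))

rng = random.Random(0)
sample = [rng.gauss(0.0, 1.0) for _ in range(200)]
draws = [empirical_variate(sample, m=3, rng=rng) for _ in range(50)]
# every draw is a convex combination of sample points, so it stays in range
print(all(min(sample) <= d <= max(sample) for d in draws))
```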

  14. Influences of granulocyte growth factor in uterine perfusion on pregnancy outcome of patients with failure of embryo implantation for unknown reason.

    Science.gov (United States)

    He, Jun; Liu, Juan; Zhou, Hua; Chen, Chao Jun

    2016-11-01

    To investigate the influence of granulocyte growth factor in uterine perfusion on the pregnancy outcome of patients with failure of embryo implantation for unknown reason. A total of 68 patients with failure of embryo implantation for unknown reason were enrolled in our hospital from November 2013 to February 2015 and divided at random into an observation group and a control group (34 patients in each group). Patients in the observation group received basic treatment with granulocyte growth factor in uterine perfusion on the next day, while patients in the control group received basic treatment with placebo. Endometrial preparation, adverse reactions and pregnancy outcomes were then compared between the two groups. Comparing the endometrial preparation and average endometrial thickness of patients in the control group (9.87±2.12) with those in the observation group (9.87±2.12), there was no significant difference (P>0.05). With uterine perfusion of granulocyte growth factor, patients with failure of embryo implantation can effectively improve clinical pregnancy rate and embryo implantation rate without severe complication. Therefore, treatment with granulocyte growth factor can improve the pregnancy outcome of these patients.

  15. Towards high-speed autonomous navigation of unknown environments

    Science.gov (United States)

    Richter, Charles; Roy, Nicholas

    2015-05-01

    In this paper, we summarize recent research enabling high-speed navigation in unknown environments for dynamic robots that perceive the world through onboard sensors. Many existing solutions to this problem guarantee safety by making the conservative assumption that any unknown portion of the map may contain an obstacle, and therefore constrain planned motions to lie entirely within known free space. In this work, we observe that safety constraints may significantly limit performance and that faster navigation is possible if the planner reasons about collision with unobserved obstacles probabilistically. Our overall approach is to use machine learning to approximate the expected costs of collision using the current state of the map and the planned trajectory. Our contribution is to demonstrate fast but safe planning using a learned function to predict future collision probabilities.

  16. Elimination of some unknown parameters and its effect on outlier detection

    Directory of Open Access Journals (Sweden)

    Serif Hekimoglu

    Full Text Available Outliers in an observation set badly affect all estimated unknown parameters and residuals, which is why outlier detection is of great importance for reliable estimation results. Tests for outliers (e.g. Baarda's and Pope's tests) are frequently used to detect outliers in geodetic applications. In order to reduce the computational time, elimination of some unknown parameters, which are not of interest, is sometimes performed. In this case, although the estimated unknown parameters and residuals do not change, the cofactor matrix of the residuals and the redundancies of the observations do change. In this study, the effects of the elimination of unknown parameters on tests for outliers have been investigated. We have proved that the redundancies in the initial functional model (IFM) are smaller than the ones in the reduced functional model (RFM) where elimination is performed. To show this situation, a horizontal control network was simulated and many experiments were performed. According to the simulation results, tests for outliers in the IFM are more reliable than the ones in the RFM.

  17. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...

  18. A method based on a separation of variables in magnetohydrodynamics (MHD); Une methode de separation des variables en magnetohydrodynamique

    Energy Technology Data Exchange (ETDEWEB)

    Cessenat, M.; Genta, P.

    1996-12-31

    We use a method based on a separation of variables for solving a system of first-order partial differential equations, in a very simple modelling of MHD. The method consists in introducing three unknown variables φ1, φ2, φ3 in addition to the time variable τ, and then searching for a solution which is separated with respect to φ1 and τ only. This is allowed by a very simple relation, called a 'metric separation equation', which governs the type of solutions with respect to time. The families of solutions for the system of equations thus obtained correspond to a radial evolution of the fluid. Solving the MHD equations is then reduced to finding the transverse component H_Σ of the magnetic field on the unit sphere Σ by solving a nonlinear partial differential equation on Σ. Thus we generalize ideas due to Courant-Friedrichs and to Sedov on dimensional analysis and self-similar solutions. (authors).

  19. Impact of Flavonols on Cardiometabolic Biomarkers: A Meta-Analysis of Randomized Controlled Human Trials to Explore the Role of Inter-Individual Variability

    Science.gov (United States)

    Menezes, Regina; Rodriguez-Mateos, Ana; Kaltsatou, Antonia; González-Sarrías, Antonio; Greyling, Arno; Giannaki, Christoforos; Andres-Lacueva, Cristina; Milenkovic, Dragan; Gibney, Eileen R.; Dumont, Julie; Schär, Manuel; Garcia-Aloy, Mar; Palma-Duran, Susana Alejandra; Ruskovska, Tatjana; Maksimova, Viktorija; Combet, Emilie; Pinto, Paula

    2017-01-01

    Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08), and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to healthy participants and normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of origin and health status may influence the effect of flavonol intake on blood lipid levels. PMID:28208791
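    The fixed-effect pooling behind a difference-in-means (DM) estimate is inverse-variance weighting of the per-trial effects. The sketch below uses made-up per-trial numbers purely for illustration; it is not the data behind the results above.

```python
import math

def fixed_effect_pool(effects, std_errs):
    """Inverse-variance fixed-effect pooling: returns the pooled
    difference in means and its 95% confidence interval."""
    weights = [1.0 / se**2 for se in std_errs]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# hypothetical per-trial mean differences (mmol/L) and standard errors
effects = [-0.12, -0.08, -0.15]
ses = [0.05, 0.04, 0.06]
dm, (lo, hi) = fixed_effect_pool(effects, ses)
print(round(dm, 3), round(lo, 3), round(hi, 3))
```

A random-effects model would additionally widen the weights by a between-trial variance component; the fixed-effect version above is the simplest case.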

  20. Quantifying and mapping spatial variability in simulated forest plots

    Science.gov (United States)

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands were developed of regularly-spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  1. Rendezvous with connectivity preservation for multi-robot systems with an unknown leader

    Science.gov (United States)

    Dong, Yi

    2018-02-01

    This paper studies the leader-following rendezvous problem with connectivity preservation for multi-agent systems composed of uncertain multi-robot systems subject to external disturbances and an unknown leader, both of which are generated by a so-called exosystem with parametric uncertainty. By combining internal model design, potential function techniques and adaptive control, two distributed control strategies are proposed to maintain the connectivity of the communication network, to achieve asymptotic tracking of all the followers to the output of the unknown leader system, and to reject unknown external disturbances. It is also worth mentioning that the uncertain parameters in the multi-robot systems and the exosystem are further allowed to belong to unknown and unbounded sets when applying the second, fully distributed control law, which contains a dynamic gain inspired by high-gain adaptive control and self-tuning regulators.

  2. Iron supplementation in HIV-infected Malawian children with anemia: a double-blind, randomized, controlled trial

    NARCIS (Netherlands)

    Esan, Michael O.; van Hensbroek, Michael Boele; Nkhoma, Ernest; Musicha, Crispin; White, Sarah A.; ter Kuile, Feiko O.; Phiri, Kamija S.

    2013-01-01

    It is unknown whether iron supplementation in human immunodeficiency virus (HIV)-infected children living in regions with high infection pressure is safe or beneficial. A 2-arm, double-blind, randomized, controlled trial was conducted to examine the effects of iron supplementation on hemoglobin, HIV

  3. Travel time variability and rational inattention

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Jiang, Gege

    2017-01-01

    This paper sets up a rational inattention model for the choice of departure time for a traveler facing random travel time. The traveler chooses how much information to acquire about the travel time out-come before choosing departure time. This reduces the cost of travel time variability compared...

  4. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    Science.gov (United States)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
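    The first step of the approach, discretizing fuzziness into nested intervals via α-cuts, can be illustrated for a triangular fuzzy number (an assumed membership shape; the brake parameters in the paper may use other shapes):

```python
def alpha_cut(left, peak, right, alpha):
    """alpha-cut of a triangular fuzzy number (left, peak, right):
    the interval of values whose membership is at least alpha."""
    assert 0.0 <= alpha <= 1.0
    return (left + alpha * (peak - left), right - alpha * (right - peak))

# discretize the fuzziness into nested intervals at several alpha levels,
# turning a fuzzy parameter into a family of interval parameters
cuts = {a / 4: alpha_cut(0.8, 1.0, 1.3, a / 4) for a in range(5)}
print(cuts[0.0])   # widest interval: the support of the fuzzy number
print(cuts[1.0])   # interval shrinks to the peak as alpha reaches 1
```

At each α-level the uncertain parameter is then an interval, which is the random-interval form the paper propagates through the stability analysis before recomposing the levels into fuzzy reliability measures.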

  5. Lod score curves for phase-unknown matings.

    Science.gov (United States)

    Hulbert-Shearon, T; Boehnke, M; Lange, K

    1996-01-01

    For a phase-unknown nuclear family, we show that the likelihood and lod score are unimodal, and we describe conditions under which the maximum occurs at recombination fraction theta = 0, theta = 1/2, and 0 < theta < 1/2. These simply stated necessary and sufficient conditions seem to have escaped the notice of previous statistical geneticists.
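    For a phase-unknown mating the likelihood averages over the two equally likely phases, and the lod score follows directly. The sketch below uses this standard textbook form with hypothetical counts (r = 2 recombinants out of n = 10 scored meioses under one phase); it is a numerical illustration of the unimodality claim, not the paper's proof.

```python
import math

def lod_phase_unknown(r, n, theta):
    """Lod score for a phase-unknown mating with n scored meioses,
    r recombinant under one phase (n - r under the other).
    The likelihood averages the two equally likely phases."""
    def lik(t):
        return 0.5 * (t**r * (1 - t)**(n - r) + t**(n - r) * (1 - t)**r)
    return math.log10(lik(theta) / lik(0.5))

# scan theta over (0, 1/2]; the curve is unimodal
thetas = [i / 100 for i in range(1, 51)]
scores = [lod_phase_unknown(2, 10, t) for t in thetas]
best = thetas[scores.index(max(scores))]
print(best)
```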

  6. Fracture fragility of HFIR vessel caused by random crack size or random toughness

    International Nuclear Information System (INIS)

    Chang, Shih-Jung; Proctor, L.D.

    1993-01-01

    This report discusses the probability of fracture (fracture fragility) versus a range of applied hoop stresses along the HFIR vessel, which is obtained as an estimate of its fracture capacity. Both the crack size and the fracture toughness are assumed to be random variables that follow given distribution functions. The possible hoop stress is based on the numerical solution of the vessel response obtained by applying a point pressure-pulse at the center of the fluid volume within the vessel. Both fluid-structure interaction and radiation embrittlement are taken into consideration. Elastic fracture mechanics is used throughout the analysis. The probability of vessel fracture for a single crack caused by either a variable crack depth or a variable toughness is first derived. Then the probability of fracture with multiple cracks is obtained. The probability of fracture is further extended to include different levels of confidence and variability. It therefore enables one to estimate the high-confidence and low-probability capacity accident load
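    A fragility curve of this kind can be sketched by Monte Carlo: at each applied stress, sample a random crack depth and a random toughness, and count how often the applied stress intensity exceeds the toughness. All distributions, constants, and the simple stress-intensity formula below are illustrative assumptions, not HFIR data or the report's method.

```python
import math
import random

def fracture_probability(stress_mpa, n=20000, seed=1):
    """Monte Carlo fracture fragility at one applied hoop stress:
    P(K_applied > K_Ic) with random crack depth and random toughness.
    Distributions and constants are hypothetical placeholders."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        a = rng.lognormvariate(math.log(0.005), 0.5)   # crack depth (m)
        k_ic = max(rng.gauss(60.0, 10.0), 1.0)         # toughness (MPa*sqrt(m))
        k_applied = 1.12 * stress_mpa * math.sqrt(math.pi * a)
        failures += k_applied > k_ic
    return failures / n

# the fragility curve rises with applied hoop stress
curve = [(s, fracture_probability(s)) for s in (100, 200, 300, 400)]
for s, p in curve:
    print(s, round(p, 3))
```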

  7. Random Forest Variable Importance Spectral Indices Scheme for Burnt Forest Recovery Monitoring—Multilevel RF-VIMP

    Directory of Open Access Journals (Sweden)

    Sornkitja Boonprong

    2018-05-01

    Full Text Available Burnt forest recovery is normally monitored with a time-series analysis of satellite data because of its proficiency for large observation areas. Traditional methods, such as linear correlation plotting, have been proven to be effective, as forest recovery naturally increases with time. However, these methods are complicated and time consuming when increasing the number of observed parameters. In this work, we present a random forest variable importance (RF-VIMP scheme called multilevel RF-VIMP to compare and assess the relationship between 36 spectral indices (parameters of burnt boreal forest recovery in the Great Xing’an Mountain, China. Six Landsat images were acquired in the same month 0, 1, 4, 14, 16, and 20 years after a fire, and 39,380 fixed-location samples were then extracted to calculate the effectiveness of the 36 parameters. Consequently, the proposed method was applied to find correlations between the forest recovery indices. The experiment showed that the proposed method is suitable for explaining the efficacy of those spectral indices in terms of discrimination and trend analysis, and for showing the satellite data and forest succession dynamics when applied in a time series. The results suggest that the tasseled cap transformation wetness, brightness, and the shortwave infrared bands (both 1 and 2 perform better than other indices for both classification and monitoring.

  8. Collocation methods for uncertainty quantification in PDE models with random data

    KAUST Repository

    Nobile, Fabio

    2014-01-06

    In this talk we consider Partial Differential Equations (PDEs) whose input data are modeled as random fields to account for their intrinsic variability or our lack of knowledge. After parametrizing the input random fields by finitely many independent random variables, we exploit the high regularity of the solution of the PDE as a function of the input random variables and consider sparse polynomial approximations in probability (Polynomial Chaos expansion) by collocation methods. We first address interpolatory approximations, where the PDE is solved on a sparse grid of Gauss points in the probability space and the solutions thus obtained are interpolated by multivariate polynomials. We present recent results on optimized sparse grids in which the selection of points is based on a knapsack approach and relies on sharp estimates of the decay of the coefficients of the polynomial chaos expansion of the solution. Secondly, we consider regression approaches, where the PDE is evaluated on randomly chosen points in the probability space and a polynomial approximation is constructed by the least squares method. We present recent theoretical results on the stability and optimality of the approximation under suitable conditions between the number of sampling points and the dimension of the polynomial space. In particular, we show that for uniform random variables, the number of sampling points has to scale quadratically with the dimension of the polynomial space to maintain the stability and optimality of the approximation. Numerical results show that this condition is sharp in the monovariate case but seems to be over-constraining in higher dimensions. The regression technique therefore seems attractive in higher dimensions.
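    The regression variant can be sketched with a single random variable: evaluate a stand-in "PDE solution" at randomly sampled points and fit a polynomial by least squares, oversampling well beyond the polynomial dimension in the spirit of the stability condition mentioned above. The response function u and all sizes below are hypothetical; the solver is a plain normal-equations implementation.

```python
import math
import random

def fit_polynomial(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations
    (Gaussian elimination with partial pivoting; fine at low degree)."""
    m = degree + 1
    A = [[sum(x**(i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x**i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):                      # forward elimination
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * m
    for i in reversed(range(m)):              # back substitution
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, m))) / A[i][i]
    return coeffs

rng = random.Random(42)
u = lambda y: math.exp(0.3 * y)               # stand-in smooth "PDE solution"
ys = [rng.uniform(-1, 1) for _ in range(60)]  # oversample: 60 points, dim 5
coeffs = fit_polynomial(ys, [u(y) for y in ys], degree=4)
err = max(abs(u(t / 50) - sum(c * (t / 50)**k for k, c in enumerate(coeffs)))
          for t in range(-50, 51))
print(err < 1e-4)
```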

  9. Adaptive Incentive Controls for Stackelberg Games with Unknown Cost Functionals.

    Science.gov (United States)

    1984-01-01

    AD-A161 885: Adaptive Incentive Controls for Stackelberg Games with Unknown Cost Functionals. Decision and Control Laboratory, University of Illinois at Urbana; Joint Services Electronics Program.

  10. M-MRAC Backstepping for Systems with Unknown Virtual Control Coefficients

    Science.gov (United States)

    Stepanyan, Vahram; Krishnakumar, Kalmanje

    2015-01-01

    The paper presents an over-parametrization free certainty equivalence state feedback backstepping adaptive control design method for systems of any relative degree with unmatched uncertainties and unknown virtual control coefficients. It uses a fast prediction model to estimate the unknown parameters, which is independent of the control design. It is shown that the system's input and output tracking errors can be systematically decreased by the proper choice of the design parameters. The benefits of the approach are demonstrated in numerical simulations.

  11. Algebraic polynomials with random coefficients

    Directory of Open Access Journals (Sweden)

    K. Farahmand

    2002-01-01

    Full Text Available This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form a0(ω) + a1(ω)C(n,1)^(1/2) x + a2(ω)C(n,2)^(1/2) x^2 + … + an(ω)C(n,n)^(1/2) x^n when n is large. The coefficients {aj(ω)}, j = 0, …, n, ω ∈ Ω, are assumed to be a sequence of independent normally distributed random variables with mean zero and variance one, each defined on a fixed probability space (Ω, A, Pr). A special case of dependent coefficients is also studied.
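    A crude numerical companion to this result: sample the coefficients, form the second derivative of the polynomial, and count its sign changes on a grid (inflection points are real zeros of the second derivative). The grid span and resolution below are ad-hoc choices, and sign-change counting only lower-bounds the true count; this is a Monte Carlo sanity check, not the paper's asymptotic analysis.

```python
import math
import random

def second_derivative_sign_changes(n, rng, grid=400, span=3.0):
    """Sample a_j ~ N(0,1), form P(x) = sum_j a_j * C(n,j)^(1/2) * x^j,
    and count sign changes of P'' on a uniform grid over [-span, span]."""
    a = [rng.gauss(0.0, 1.0) * math.sqrt(math.comb(n, j)) for j in range(n + 1)]
    def d2(x):
        return sum(j * (j - 1) * a[j] * x**(j - 2) for j in range(2, n + 1))
    xs = [-span + 2 * span * i / grid for i in range(grid + 1)]
    vals = [d2(x) for x in xs]
    return sum(1 for u, v in zip(vals, vals[1:]) if u * v < 0)

rng = random.Random(7)
n = 12
mean_count = sum(second_derivative_sign_changes(n, rng) for _ in range(200)) / 200
print(mean_count > 0)
```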

  12. Confidence Intervals for a Semiparametric Approach to Modeling Nonlinear Relations among Latent Variables

    Science.gov (United States)

    Pek, Jolynn; Losardo, Diane; Bauer, Daniel J.

    2011-01-01

    Compared to parametric models, nonparametric and semiparametric approaches to modeling nonlinearity between latent variables have the advantage of recovering global relationships of unknown functional form. Bauer (2005) proposed an indirect application of finite mixtures of structural equation models where latent components are estimated in the…

  13. More randomness from the same data

    International Nuclear Information System (INIS)

    Bancal, Jean-Daniel; Sheridan, Lana; Scarani, Valerio

    2014-01-01

    Correlations that cannot be reproduced with local variables certify the generation of private randomness. Usually, the violation of a Bell inequality is used to quantify the amount of randomness produced. Here, we show how private randomness generated during a Bell test can be directly quantified from the observed correlations, without the need to process these data into an inequality. The frequency with which the different measurement settings are used during the Bell test can also be taken into account. This improved analysis turns out to be very relevant for Bell tests performed with a finite collection efficiency. In particular, applying our technique to the data of a recent experiment (Christensen et al 2013 Phys. Rev. Lett. 111 130406), we show that about twice as much randomness as previously reported can be potentially extracted from this setup. (paper)

  14. Speeding up transmissions of unknown quantum information along Ising-type quantum channels

    International Nuclear Information System (INIS)

    Guo W J; Wei L F

    2017-01-01

    Quantum teleportation with entanglement channels and a series of two-qubit SWAP gates between nearest-neighbor qubits are usually utilized to achieve the transfer of an unknown quantum state from the sender to a distant receiver. In this paper, by simplifying the usual SWAP gates, we propose an approach to speed up the transmission of unknown quantum information, specifically including single-qubit unknown states and two-qubit unknown entangled ones, by a series of entangling and disentangling operations between remote qubits with distant interactions. The generic proposal is demonstrated specifically with experimentally existing Ising-type quantum channels without transverse interaction: liquid NMR molecules driven by global radio-frequency electromagnetic pulses, and capacitively coupled Josephson circuits driven by local microwave pulses. The proposal should be particularly useful for setting up connections between distant qubits in a quantum computing chip. (paper)

  15. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons that were intended to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  16. Renal Hemodynamic Effects of Serelaxin in Patients With Chronic Heart Failure A Randomized, Placebo-Controlled Study

    NARCIS (Netherlands)

    Voors, Adriaan A.; Dahlke, Marion; Meyer, Sven; Stepinska, Janina; Gottlieb, Stephen S.; Jones, Andrew; Zhang, Yiming; Laurent, Didier; Slart, Riemer H. J. A.; Navis, Gerjan J.

    2014-01-01

    Background-Serelaxin is a promising therapy for acute heart failure. The renal hemodynamic effects of serelaxin in patients with chronic heart failure are unknown. Methods and Results-In this double-blind, randomized, placebo-controlled, multicenter study, patients with New York Heart Association

  17. Identification of fractional-order systems with unknown initial values and structure

    Energy Technology Data Exchange (ETDEWEB)

    Du, Wei, E-mail: duwei0203@gmail.com [Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Miao, Qingying, E-mail: qymiao@sjtu.edu.cn [School of Continuing Education, Shanghai Jiao Tong University, Shanghai 200030 (China); Tong, Le, E-mail: tongle0328@gmail.com [Faculty of Applied Science and Textiles, The Hong Kong Polytechnic University, Hong Kong (China); Tang, Yang [Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China)

    2017-06-21

    In this paper, the identification problem of fractional-order chaotic systems is proposed and investigated via an evolutionary optimization approach. Different from other studies to date, this research focuses on the identification of fractional-order chaotic systems with not only unknown orders and parameters, but also unknown initial values and structure. A group of fractional-order chaotic systems, i.e., Lorenz, Lü, Chen, Rössler, Arneodo and Volta chaotic systems, is set as the system candidate pool. The identification problem of fractional-order chaotic systems in this research belongs in essence to mixed-integer nonlinear optimization. A powerful evolutionary algorithm called composite differential evolution (CoDE) is introduced for the identification problem presented in this paper. Extensive experiments are carried out to show that fractional-order chaotic systems with unknown initial values and structure can be successfully identified by means of CoDE. - Highlights: • Unknown initial values and structure are introduced in the identification of fractional-order chaotic systems; • Only a series of outputs is utilized in the identification of fractional-order chaotic systems; • CoDE is used for the identification problem and the results are satisfactory when compared with other DE variants.
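    The identification loop itself (propose candidate parameters, simulate, score against the observed output series) can be sketched with a plain DE/rand/1/bin; CoDE composes several such strategies and also searches over model structure and fractional orders, all of which is omitted here. The logistic map, its parameter regime, and every setting below are illustrative stand-ins, with the initial value treated as unknown alongside the parameter, as in the abstract.

```python
import random

def differential_evolution(cost, bounds, pop_size=20, f=0.7, cr=0.9,
                           gens=150, seed=3):
    """Minimal DE/rand/1/bin minimizer over box-constrained parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [cost(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)
            trial = [
                min(max(pop[r1][d] + f * (pop[r2][d] - pop[r3][d]),
                        bounds[d][0]), bounds[d][1])
                if (rng.random() < cr or d == jrand) else pop[i][d]
                for d in range(dim)
            ]
            c = cost(trial)
            if c <= costs[i]:                 # greedy selection
                pop[i], costs[i] = trial, c
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

def logistic_series(r, x0, steps=10):
    """Output series of the logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(steps):
        x = r * x * (1 - x)
        out.append(x)
    return out

observed = logistic_series(2.9, 0.2)          # "measurements", hidden r and x0
cost = lambda p: sum((a - b) ** 2
                     for a, b in zip(logistic_series(p[0], p[1]), observed))
# x0 searched over (0, 0.5) only: the map is symmetric in x0 <-> 1 - x0
(best_r, best_x0), err = differential_evolution(cost, [(2.0, 4.0), (0.0, 0.5)])
print(round(best_r, 3), round(best_x0, 3))
```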

  18. Strong result for real zeros of random algebraic polynomials

    Directory of Open Access Journals (Sweden)

    T. Uno

    2001-01-01

    Full Text Available An estimate is given for the lower bound of real zeros of random algebraic polynomials whose coefficients are non-identically distributed dependent Gaussian random variables. Moreover, our estimated measure of the exceptional set, which is independent of the degree of the polynomials, tends to zero as the degree of the polynomial tends to infinity.

  19. Effect of random edge failure on the average path length

    Energy Technology Data Exchange (ETDEWEB)

    Guo Dongchao; Liang Mangui; Li Dandan; Jiang Zhongyuan, E-mail: mgliang58@gmail.com, E-mail: 08112070@bjtu.edu.cn [Institute of Information Science, Beijing Jiaotong University, 100044, Beijing (China)

    2011-10-14

    We study the effect of random removal of edges on the average path length (APL) in a large class of uncorrelated random networks in which vertices are characterized by hidden variables controlling the attachment of edges between pairs of vertices. A formula for approximating the APL of networks suffering random edge removal is derived first. Then, the formula is confirmed by simulations for classical ER (Erdős and Rényi) random graphs, BA (Barabási and Albert) networks, networks with exponential degree distributions, as well as random networks with asymptotic power-law degree distributions with exponent α > 2. (paper)
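    The effect is easy to reproduce numerically: build a random graph, compute the APL by breadth-first search, randomly delete a fraction of the edges, and recompute. The ER graph size, edge probability, and removal fraction below are arbitrary choices for illustration, and the APL is averaged over still-connected pairs only; this is a simulation sketch, not the paper's analytic formula.

```python
import random
from collections import deque

def average_path_length(n, edges):
    """Mean shortest-path length over all connected ordered pairs (BFS)."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    total, pairs = 0, 0
    for s in range(n):
        dist = [-1] * n
        dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        reach = [d for d in dist if d > 0]
        total += sum(reach)
        pairs += len(reach)
    return total / pairs

rng = random.Random(5)
n, p = 200, 0.05                              # ER(n, p) random graph
edges = [(u, v) for u in range(n)
         for v in range(u + 1, n) if rng.random() < p]
apl_full = average_path_length(n, edges)
kept = [e for e in edges if rng.random() > 0.3]   # remove ~30% of edges
apl_damaged = average_path_length(n, kept)
print(apl_full < apl_damaged)
```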

  20. Variability search in M 31 using principal component analysis and the Hubble Source Catalogue

    Science.gov (United States)

    Moretti, M. I.; Hatzidimitriou, D.; Karampelas, A.; Sokolovsky, K. V.; Bonanos, A. Z.; Gavras, P.; Yang, M.

    2018-06-01

    Principal component analysis (PCA) is being extensively used in Astronomy but not yet exhaustively exploited for variability search. The aim of this work is to investigate the effectiveness of using the PCA as a method to search for variable stars in large photometric data sets. We apply PCA to variability indices computed for light curves of 18 152 stars in three fields in M 31 extracted from the Hubble Source Catalogue. The projection of the data into the principal components is used as a stellar variability detection and classification tool, capable of distinguishing between RR Lyrae stars, long-period variables (LPVs) and non-variables. This projection recovered more than 90 per cent of the known variables and revealed 38 previously unknown variable stars (about 30 per cent more), all LPVs except for one object of uncertain variability type. We conclude that this methodology can indeed successfully identify candidate variable stars.

  1. A random-matrix theory of the number sense.

    Science.gov (United States)

    Hannagan, T; Nieder, A; Viswanathan, P; Dehaene, S

    2017-02-19

    Number sense, a spontaneous ability to process approximate numbers, has been documented in human adults, infants and newborns, and many other animals. Species as distant as monkeys and crows exhibit very similar neurons tuned to specific numerosities. How number sense can emerge in the absence of learning or fine tuning is currently unknown. We introduce a random-matrix theory of self-organized neural states where numbers are coded by vectors of activation across multiple units, and where the vector codes for successive integers are obtained through multiplication by a fixed but random matrix. This cortical implementation of the 'von Mises' algorithm explains many otherwise disconnected observations ranging from neural tuning curves in monkeys to looking times in neonates and cortical numerotopy in adults. The theory clarifies the origin of Weber-Fechner's Law and yields a novel and empirically validated prediction of multi-peak number neurons. Random matrices constitute a novel mechanism for the emergence of brain states coding for quantity.This article is part of a discussion meeting issue 'The origins of numerical abilities'. © 2017 The Author(s).
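    The core mechanism, where the code for each successive integer is obtained by applying one fixed random matrix to the previous code, takes only a few lines. This toy version makes several assumptions not stated in the abstract (dimension, Gaussian entries scaled by 1/sqrt(dim), and renormalization of each code); it simply builds the codes and reports pairwise overlaps for inspection.

```python
import math
import random

def normalize(v):
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

def number_codes(n_max, dim=50, seed=0):
    """Code for integer k+1 = a fixed random matrix applied to the code
    for integer k, renormalized (a toy version of the scheme above)."""
    rng = random.Random(seed)
    M = [[rng.gauss(0, 1) / math.sqrt(dim) for _ in range(dim)]
         for _ in range(dim)]
    codes = [normalize([rng.gauss(0, 1) for _ in range(dim)])]
    for _ in range(n_max - 1):
        prev = codes[-1]
        codes.append(normalize([sum(m * x for m, x in zip(row, prev))
                                for row in M]))
    return codes

overlap = lambda u, v: sum(a * b for a, b in zip(u, v))
codes = number_codes(8)
# overlaps between codes for different integers, e.g. 1 vs 2 and 1 vs 8
print(round(overlap(codes[0], codes[1]), 3), round(overlap(codes[0], codes[7]), 3))
```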

  2. GPR random noise reduction using BPD and EMD

    Science.gov (United States)

    Ostoori, Roya; Goudarzi, Alireza; Oskooi, Behrooz

    2018-04-01

    Ground-penetrating radar (GPR) exploration is a high-frequency technology that images near-surface objects and structures accurately. The high-frequency antenna of the GPR system makes it a high-resolution method compared to other geophysical methods. The frequency range of recorded GPR data is so wide that recording random noise during acquisition is inevitable. This kind of noise comes from unknown sources, and its correlation with adjacent traces is nearly zero. This characteristic of random noise, together with the high accuracy of the GPR system, makes denoising essential for interpretable results. The main objective of this paper is to reduce GPR random noise using basis pursuit denoising (BPD) combined with empirical mode decomposition. Our results showed that empirical mode decomposition in combination with BPD provides satisfactory outputs, owing to the sifting process, compared to the time-domain implementation of the BPD method on both synthetic and real examples. Our results demonstrate that, because of the high computational costs, the BPD-empirical mode decomposition technique should only be used for heavily noisy signals.
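
Basis pursuit denoising itself (independent of the EMD stage) solves min over x of ½‖y − Ax‖² + λ‖x‖₁. A minimal ISTA (iterative soft-thresholding) sketch in numpy follows; the random dictionary, sparsity level, and λ are hypothetical illustration choices, not the paper's GPR setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dictionary and a sparse ground truth signal.
m, n = 64, 128
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=5, replace=False)] = rng.normal(size=5) * 3
y = A @ x_true + 0.01 * rng.normal(size=m)   # noisy "trace"

lam = 0.05
L = np.linalg.norm(A, 2) ** 2                # Lipschitz constant of the gradient

def soft(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
obj = []
for _ in range(300):
    grad = A.T @ (A @ x - y)                 # gradient of the quadratic term
    x = soft(x - grad / L, lam / L)          # proximal gradient step
    obj.append(0.5 * np.sum((y - A @ x) ** 2) + lam * np.sum(np.abs(x)))
```

ISTA with step size 1/L decreases the composite objective monotonically, which is a useful sanity check on any implementation.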

  3. Multidimensional procurement auctions with unknown weights

    DEFF Research Database (Denmark)

    Greve, Thomas

    This paper studies the consequences of holding a procurement auction when the principal chooses not to reveal its preferences. My paper extends the procurement auction model of Che (1993) to a situation where both the principal and the agents have private information. Thus, unknown parameters of bo...... gives rise to an analysis of a principal that cannot fully commit to the outcome induced by the scoring rule. Therefore, my results apply to contract theory and its problems of imperfect commitment....

  4. Randomized trials, generalizability, and meta-analysis: Graphical insights for binary outcomes

    Directory of Open Access Journals (Sweden)

    Kramer Barnett S

    2003-06-01

    Full Text Available Abstract Background Randomized trials stochastically answer the question: "What would be the effect of treatment on outcome if one turned back the clock and switched treatments in the given population?" Generalizations to other subjects are reliable only if the particular trial is performed on a random sample of the target population. By considering an unobserved binary variable, we graphically investigate how randomized trials can also stochastically answer the question, "What would be the effect of treatment on outcome in a population with a possibly different distribution of an unobserved binary baseline variable that does not interact with treatment in its effect on outcome?" Method For three different outcome measures, absolute difference (DIF), relative risk (RR), and odds ratio (OR), we constructed a modified BK-Plot under the assumption that treatment has the same effect on outcome if either all or no subjects had a given level of the unobserved binary variable. (A BK-Plot shows the effect of an unobserved binary covariate on a binary outcome in two treatment groups; it was originally developed to explain Simpson's paradox.) Results For DIF and RR, but not OR, the BK-Plot shows that the estimated treatment effect is invariant to the fraction of subjects with an unobserved binary variable at a given level. Conclusion The BK-Plot provides a simple method to understand generalizability in randomized trials. Meta-analyses of randomized trials with a binary outcome that are based on DIF or RR, but not OR, will avoid bias from an unobserved covariate that does not interact with treatment in its effect on outcome.
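
The invariance claim is easy to verify numerically. A stdlib-only sketch with hypothetical stratum risks: treatment acts identically in both strata of an unobserved binary covariate U, and we vary the fraction f of subjects with U = 1:

```python
def marginal(p_u1, p_u0, f):
    """Marginal event risk when a fraction f of subjects has U = 1."""
    return f * p_u1 + (1 - f) * p_u0

# Hypothetical control-group risks in the two strata of U.
c1, c0 = 0.40, 0.20

# (1) Constant risk difference d in both strata -> marginal DIF is d for any f.
d = 0.20
dif = [marginal(c1 + d, c0 + d, f) - marginal(c1, c0, f) for f in (0.2, 0.8)]

# (2) Constant relative risk r in both strata -> marginal RR is r for any f.
r = 1.5
rr = [marginal(r * c1, r * c0, f) / marginal(c1, c0, f) for f in (0.2, 0.8)]

# (3) Constant odds ratio in both strata -> the marginal OR *does* depend on f.
def apply_or(p, orat):
    odds_t = orat * p / (1 - p)
    return odds_t / (1 + odds_t)

def odds(p):
    return p / (1 - p)

orat = 2.0
ors = [odds(marginal(apply_or(c1, orat), apply_or(c0, orat), f))
       / odds(marginal(c1, c0, f)) for f in (0.2, 0.8)]
```

DIF and RR come out identical for f = 0.2 and f = 0.8, while the marginal OR shifts with f, which is the non-collapsibility the abstract describes.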

  5. Maximal Increments of Local Time of a Random Walk

    OpenAIRE

    Jain, Naresh C.; Pruitt, William E.

    1987-01-01

    Let $(S_j)$ be a lattice random walk, i.e., $S_j = X_1 + \cdots + X_j$, where $X_1, X_2, \ldots$ are independent random variables with values in $\mathbb{Z}$ and common nondegenerate distribution $F$. Let $\{t_n\}$ be a nondecreasing sequence of positive integers, $t_n \leq n$, and $L^\ast_n = \max_{0 \leq j \leq n - t_n}(L_{j+t_n} - L_j)$, where $L_n = \sum^n_{j=1} 1_{\{0\}}(S_j)$, the number of times zero is visited by the random walk by time $n$. Assuming that the random walk is recurrent and sa...

  6. Posterior variability of inclusion shape based on tomographic measurement data

    International Nuclear Information System (INIS)

    Watzenig, Daniel; Fox, Colin

    2008-01-01

    We treat the problem of recovering the unknown shape of a single inclusion with unknown constant permittivity in an otherwise uniform background material, from uncertain measurements of trans-capacitance at electrodes outside the material. The ubiquitous presence of measurement noise implies that the practical measurement process is probabilistic, and the inverse problem is naturally stated as statistical inference. Formulating the inverse problem in a Bayesian inferential framework requires accurately modelling the forward map, measurement noise, and specifying a prior distribution for the cross-sectional material distribution. Numerical implementation of the forward map is via the boundary element method (BEM) taking advantage of a piecewise constant representation. Summary statistics are calculated using MCMC sampling to characterize posterior variability for synthetic and measured data sets.
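
The posterior-summary step can be sketched generically. Below is a stdlib-only random-walk Metropolis sampler for a one-dimensional Gaussian toy posterior, a hypothetical stand-in for the inclusion-shape posterior (the BEM forward map is not modeled):

```python
import math
import random

random.seed(0)

def log_post(theta):
    """Toy log-posterior: Gaussian with mean 1.0 and std 0.5."""
    return -0.5 * ((theta - 1.0) / 0.5) ** 2

theta = 0.0
samples = []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.3)          # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                               # Metropolis accept step
    samples.append(theta)

burned = samples[5000:]                            # discard burn-in
post_mean = sum(burned) / len(burned)
post_var = sum((s - post_mean) ** 2 for s in burned) / len(burned)
```

The retained samples characterize posterior variability: here the sample mean and variance recover the known 1.0 and 0.25 of the toy target.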

  7. Inverse random source scattering for the Helmholtz equation in inhomogeneous media

    Science.gov (United States)

    Li, Ming; Chen, Chuchu; Li, Peijun

    2018-01-01

    This paper is concerned with an inverse random source scattering problem in an inhomogeneous background medium. The wave propagation is modeled by the stochastic Helmholtz equation with the source driven by additive white noise. The goal is to reconstruct the statistical properties of the random source such as the mean and variance from the boundary measurement of the radiated random wave field at multiple frequencies. Both the direct and inverse problems are considered. We show that the direct problem has a unique mild solution by a constructive proof. For the inverse problem, we derive Fredholm integral equations, which connect the boundary measurement of the radiated wave field with the unknown source function. A regularized block Kaczmarz method is developed to solve the ill-posed integral equations. Numerical experiments are included to demonstrate the effectiveness of the proposed method.
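
The Kaczmarz family of methods sweeps over the rows of a discretized linear system. A minimal randomized Kaczmarz sketch in numpy for a generic consistent system follows (hypothetical well-posed data, not the paper's Fredholm kernels or its regularized block variant):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical well-posed stand-in for a discretized integral equation.
m, n = 200, 50
A = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
b = A @ x_true                       # consistent right-hand side

row_norms = np.sum(A ** 2, axis=1)
probs = row_norms / row_norms.sum()  # sample rows with prob ∝ squared norm

x = np.zeros(n)
for _ in range(5000):
    i = rng.choice(m, p=probs)
    # Project the current iterate onto the hyperplane <a_i, x> = b_i.
    x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
```

Each iteration touches a single row, which is what makes Kaczmarz-type sweeps attractive for large ill-posed integral equations once regularization is added.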

  8. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, M.; Manos, G. C.

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective is to ...... of freedom system loaded by white noise, estimating the coefficient of restitution as explained, and comparing the estimates with the value used in the simulations. Several estimates for the coefficient of restitution are considered, and reasonable results are achieved....

  9. Experimental Evaluation of Novel Master-Slave Configurations for Position Control under Random Network Delay and Variable Load for Teleoperation

    Directory of Open Access Journals (Sweden)

    Ahmet Kuzu

    2014-01-01

    Full Text Available This paper proposes two novel master-slave configurations that provide improvements in both control and communication aspects of teleoperation systems to achieve an overall improved performance in position control. The proposed novel master-slave configurations integrate modular control and communication approaches, consisting of a delay regulator to address problems related to variable network delay common to such systems, and a model tracking control that runs on the slave side for the compensation of uncertainties and model mismatch on the slave side. One of the configurations uses a sliding mode observer and the other one uses a modified Smith predictor scheme on the master side to ensure position transparency between the master and slave, while reference tracking of the slave is ensured by a proportional-differentiator type controller in both configurations. Experiments conducted for the networked position control of a single-link arm under system uncertainties and randomly varying network delays demonstrate significant performance improvements with both configurations over the past literature.

  10. Clostridium difficile: A healthcare-associated infection of unknown ...

    African Journals Online (AJOL)

    Clostridium difficile: A healthcare-associated infection of unknown significance in adults in sub-Saharan Africa. ... Abstract. Background: Clostridium difficile infection (CDI) causes a high burden of disease in high-resource healthcare systems, with significant morbidity, mortality, and financial implications. CDI is a ...

  11. Severe scratcher-reaction: an unknown health hazard?

    Directory of Open Access Journals (Sweden)

    Carsten Sauer Mikkelsen

    2015-03-01

    Full Text Available Tattoos are well known to cause skin problems and the number of reported adverse reactions after tattooing has increased. Illegally imported tattoo ink is unrestrained and can contain unknown ingredients and contamination thereby posing a serious health hazard. We present a case illustrating the risk of pronounced phototoxic allergic reaction and other severe complications after using home kit tattoo ink.

  12. Pore-scale modeling of vapor transport in partially saturated capillary tube with variable area using chemical potential

    DEFF Research Database (Denmark)

    Addassi, Mouadh; Schreyer, Lynn; Johannesson, Björn

    2016-01-01

    Here we illustrate the usefulness of using the chemical potential as the primary unknown by modeling isothermal vapor transport through a partially saturated cylindrically symmetric capillary tube of variable cross-sectional area using a single equation. There are no fitting parameters, and the numerical solutions to the equation are compared with experimental results with excellent agreement. We demonstrate that isothermal vapor transport can be accurately modeled without modeling the details of the contact angle, microscale temperature fluctuations, or pressure fluctuations using a modification...

  13. Financial management of a large multisite randomized clinical trial.

    Science.gov (United States)

    Sheffet, Alice J; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E; Longbottom, Mary E; Howard, Virginia J; Marler, John R; Brott, Thomas G

    2014-08-01

    The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years' funding ($21 112 866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2500 randomized participants at 40 sites. Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Projections of the original grant's fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant's fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2500 targeted sample size, 138 (5·5%) were randomized during the first five years and 1387 (55·5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13 845) of the projected per-patient costs ($152 992) of the fixed model. Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. © 2014 The Authors. International Journal of Stroke © 2014 World Stroke Organization.

  14. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  15. Prediction of N2O emission from local information with Random Forest

    International Nuclear Information System (INIS)

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2013-01-01

    Nitrous oxide is a potent greenhouse gas, with a global warming potential 298 times greater than that of CO2. In agricultural soils, N2O emissions are influenced by a large number of environmental characteristics and crop management techniques that are not systematically reported in experiments. Random Forest (RF) is a machine learning method that can handle missing data and ranks input variables on the basis of their importance. We aimed to predict N2O emission on the basis of local information, to rank environmental and crop management variables according to their influence on N2O emission, and to compare the performances of RF with several regression models. RF outperformed the regression models for predictive purposes, and this approach led to the identification of three important input variables: N fertilization, type of crop, and experiment duration. This method could be used in the future for prediction of N2O emissions from local information. -- Highlights: ► Random Forest gave more accurate N2O predictions than regression. ► Missing data were well handled by Random Forest. ► The most important factors were nitrogen rate, type of crop and experiment duration. -- Random Forest, a machine learning method, outperformed the regression models for predicting N2O emissions and led to the identification of three important input variables
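
The variable-ranking idea can be sketched without a forest: permutation importance (the scheme RF popularized) scores a feature by how much prediction error grows when that feature is shuffled. A minimal numpy sketch on hypothetical data, with a least-squares model standing in for the fitted forest:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical emission data: only feature 0 ("N rate") drives the response.
n = 500
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n)

# Fit a simple least-squares model (stand-in for the fitted forest).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def mse(Xm):
    return np.mean((Xm @ coef - y) ** 2)

base = mse(X)
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])  # destroy feature j's information
    importance.append(mse(Xp) - base)     # error increase = importance score
```

Shuffling the informative feature inflates the error sharply while the noise features barely move, reproducing the kind of ranking reported in the abstract.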

  16. Dissociable effects of practice variability on learning motor and timing skills.

    Science.gov (United States)

    Caramiaux, Baptiste; Bevilacqua, Frédéric; Wanderley, Marcelo M; Palmer, Caroline

    2018-01-01

    Motor skill acquisition inherently depends on the way one practices the motor task. The amount of motor task variability during practice has been shown to foster transfer of the learned skill to other similar motor tasks. In addition, variability in a learning schedule, in which a task and its variations are interweaved during practice, has been shown to help the transfer of learning in motor skill acquisition. However, there is little evidence on how motor task variations and variability schedules during practice act on the acquisition of complex motor skills such as music performance, in which a performer learns both the right movements (motor skill) and the right time to perform them (timing skill). This study investigated the impact of rate (tempo) variability and the schedule of tempo change during practice on timing and motor skill acquisition. Complete novices, with no musical training, practiced a simple musical sequence on a piano keyboard at different rates. Each novice was assigned to one of four learning conditions designed to manipulate the amount of tempo variability across trials (large or small tempo set) and the schedule of tempo change (randomized or non-randomized order) during practice. At test, the novices performed the same musical sequence at a familiar tempo and at novel tempi (testing tempo transfer), as well as two novel (but related) sequences at a familiar tempo (testing spatial transfer). We found that practice conditions had little effect on learning and transfer performance of timing skill. Interestingly, practice conditions influenced motor skill learning (reduction of movement variability): lower temporal variability during practice facilitated transfer to new tempi and new sequences; non-randomized learning schedule improved transfer to new tempi and new sequences. Tempo (rate) and the sequence difficulty (spatial manipulation) affected performance variability in both timing and movement. These findings suggest that there is a

  17. Effects of assisted and variable mechanical ventilation on cardiorespiratory interactions in anesthetized pigs

    International Nuclear Information System (INIS)

    Beda, Alessandro; Güldner, Andreas; Carvalho, Nadja C; Franke, Susanne; Uhlig, Christopher; Koch, Thea; De Abreu, Marcelo Gama; Simpson, David M; Pelosi, Paolo

    2012-01-01

    The physiological importance of respiratory sinus arrhythmia (RSA) and cardioventilatory coupling (CVC) has not yet been fully elucidated, but these phenomena might contribute to improve ventilation/perfusion matching, with beneficial effects on gas exchange. Furthermore, decreased RSA amplitude has been suggested as an indicator of impaired autonomic control and poor clinical outcome, also during positive-pressure mechanical ventilation (MV). However, it is currently unknown how different modes of MV, including variable tidal volumes (VT), affect RSA and CVC during anesthesia. We compared the effects of pressure controlled (PCV) versus pressure assisted (PSV) ventilation, and of random variable versus constant VT, on RSA and CVC in eight anesthetized pigs. At comparable depth of anesthesia, global hemodynamics, and ventilation, RSA amplitude increased from 20 ms in PCV to 50 ms in PSV (p < 0.05). CVC was detected (using proportional Shannon entropy of the interval between each inspiration onset and the previous R-peak in ECG) in two animals in PCV and seven animals in PSV. Variable VT did not significantly influence these phenomena. Furthermore, heart period and systolic arterial pressure oscillations were in phase during PCV but in counter-phase during PSV. At the same depth of anesthesia in pigs, PSV increases RSA amplitude and CVC compared to PCV. Our data suggest that the central respiratory drive, but not the baroreflex or the mechano-electric feedback in the heart, is the main mechanism behind the RSA increase. Hence, differences in RSA and CVC between mechanically ventilated patients might reflect the difference in ventilation mode rather than autonomic impairment. Also, since gas exchange did not increase from PCV to PSV, it is questionable whether RSA has any significance in improving ventilation/perfusion matching during MV. (paper)

  18. Effects of assisted and variable mechanical ventilation on cardiorespiratory interactions in anesthetized pigs.

    Science.gov (United States)

    Beda, Alessandro; Güldner, Andreas; Simpson, David M; Carvalho, Nadja C; Franke, Susanne; Uhlig, Christopher; Koch, Thea; Pelosi, Paolo; de Abreu, Marcelo Gama

    2012-03-01

    The physiological importance of respiratory sinus arrhythmia (RSA) and cardioventilatory coupling (CVC) has not yet been fully elucidated, but these phenomena might contribute to improve ventilation/perfusion matching, with beneficial effects on gas exchange. Furthermore, decreased RSA amplitude has been suggested as an indicator of impaired autonomic control and poor clinical outcome, also during positive-pressure mechanical ventilation (MV). However, it is currently unknown how different modes of MV, including variable tidal volumes (V(T)), affect RSA and CVC during anesthesia. We compared the effects of pressure controlled (PCV) versus pressure assisted (PSV) ventilation, and of random variable versus constant V(T), on RSA and CVC in eight anesthetized pigs. At comparable depth of anesthesia, global hemodynamics, and ventilation, RSA amplitude increased from 20 ms in PCV to 50 ms in PSV (p < 0.05). CVC was detected (using proportional Shannon entropy of the interval between each inspiration onset and the previous R-peak in ECG) in two animals in PCV and seven animals in PSV. Variable V(T) did not significantly influence these phenomena. Furthermore, heart period and systolic arterial pressure oscillations were in phase during PCV but in counter-phase during PSV. At the same depth of anesthesia in pigs, PSV increases RSA amplitude and CVC compared to PCV. Our data suggest that the central respiratory drive, but not the baroreflex or the mechano-electric feedback in the heart, is the main mechanism behind the RSA increase. Hence, differences in RSA and CVC between mechanically ventilated patients might reflect the difference in ventilation mode rather than autonomic impairment. Also, since gas exchange did not increase from PCV to PSV, it is questionable whether RSA has any significance in improving ventilation/perfusion matching during MV.

  19. Clinical Implications of Glucose Variability: Chronic Complications of Diabetes

    Directory of Open Access Journals (Sweden)

    Hye Seung Jung

    2015-06-01

    Full Text Available Glucose variability has been identified as a potential risk factor for diabetic complications; oxidative stress is widely regarded as the mechanism by which glycemic variability induces diabetic complications. However, there remains no generally accepted gold standard for assessing glucose variability. Representative indices for measuring intraday variability include calculation of the standard deviation along with the mean amplitude of glycemic excursions (MAGE). MAGE is used to measure major intraday excursions and is easily measured using continuous glucose monitoring systems. Despite a lack of randomized controlled trials, recent clinical data suggest that long-term glycemic variability, as determined by variability in hemoglobin A1c, may contribute to the development of microvascular complications. Intraday glycemic variability is also suggested to accelerate coronary artery disease in high-risk patients.

  20. Row Reduced Echelon Form for Solving Fully Fuzzy System with Unknown Coefficients

    Directory of Open Access Journals (Sweden)

    Ghassan Malkawi

    2014-08-01

    Full Text Available This study proposes a new method for finding a feasible fuzzy solution in a positive Fully Fuzzy Linear System (FFLS), where the coefficients are unknown. The fully fuzzy system is transformed to a linear system in order to obtain the solution using row reduced echelon form; thereafter, the crisp solution is restricted to obtain the positive fuzzy solution. The fuzzy solution of the FFLS includes crisp intervals, assigning alternative values to the unknown entries of the fuzzy numbers. To illustrate the proposed method, numerical examples are solved, where the entries of the coefficients are unknown on the right or left hand side, to demonstrate the contributions of this study.
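
Row reduced echelon form itself is the classical workhorse here. A stdlib-only Gauss-Jordan sketch over exact fractions, applied to a generic crisp system (the fuzzy-interval restriction step of the paper is not modeled):

```python
from fractions import Fraction

def rref(M):
    """Return the row reduced echelon form of an augmented matrix."""
    M = [[Fraction(v) for v in row] for row in M]
    rows, cols = len(M), len(M[0])
    lead = 0
    for r in range(rows):
        if lead >= cols:
            break
        i = r
        while M[i][lead] == 0:              # find a pivot row for this column
            i += 1
            if i == rows:
                i, lead = r, lead + 1
                if lead == cols:
                    return M
        M[i], M[r] = M[r], M[i]             # swap the pivot row into place
        pivot = M[r][lead]
        M[r] = [v / pivot for v in M[r]]    # scale the pivot to 1
        for i in range(rows):
            if i != r and M[i][lead] != 0:  # eliminate the column elsewhere
                factor = M[i][lead]
                M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        lead += 1
    return M

# Solve  x + 2y = 5,  3x + 4y = 11  via the augmented matrix.
R = rref([[1, 2, 5], [3, 4, 11]])
solution = [row[-1] for row in R]           # x = 1, y = 2
```

Exact `Fraction` arithmetic avoids the round-off that would otherwise blur the crisp intervals the method builds on.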

  1. Vision-based autonomous grasping of unknown piled objects

    International Nuclear Information System (INIS)

    Johnson, R.K.

    1994-01-01

    Computer vision techniques have been used to develop a vision-based grasping capability for autonomously picking and placing unknown piled objects. This work is currently being applied to the problem of hazardous waste sorting in support of the Department of Energy's Mixed Waste Operations Program

  2. 48 CFR 52.222-49 - Service Contract Act-Place of Performance Unknown.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Service Contract Act-Place... Provisions and Clauses 52.222-49 Service Contract Act—Place of Performance Unknown. As prescribed in 22.1006(f), insert the following clause: Service Contract Act—Place of Performance Unknown (MAY 1989) (a...

  3. Cartesian integration of Grassmann variables over invariant functions

    Energy Technology Data Exchange (ETDEWEB)

    Kieburg, Mario; Kohler, Heiner; Guhr, Thomas [Universitaet Duisburg-Essen, Duisburg (Germany)

    2009-07-01

    Supersymmetry plays an important role in field theory as well as in random matrix theory and mesoscopic physics. Anticommuting variables are the fundamental objects of supersymmetry. The integration over these variables is equivalent to the derivative. Recently [arXiv:0809.2674v1 [math-ph] (2008)], we constructed a differential operator which only acts on the ordinary part of the superspace consisting of ordinary and anticommuting variables. This operator is equivalent to the integration over all anticommuting variables of an invariant function. We present this operator and its applications for functions which are rotation invariant under the supergroups U(k1/k2) and UOSp(k1/k2).

  4. Analytic regularity and collocation approximation for elliptic PDEs with random domain deformations

    KAUST Repository

    Castrillon, Julio

    2016-03-02

    In this work we consider the problem of approximating the statistics of a given Quantity of Interest (QoI) that depends on the solution of a linear elliptic PDE defined over a random domain parameterized by N random variables. The elliptic problem is remapped onto a corresponding PDE with a fixed deterministic domain. We show that the solution can be analytically extended to a well-defined region in C^N with respect to the random variables. A sparse grid stochastic collocation method is then used to compute the mean and variance of the QoI. Finally, convergence rates for the mean and variance of the QoI are derived and compared to those obtained in numerical experiments.

  5. A message-passing approach to random constraint satisfaction problems with growing domains

    International Nuclear Information System (INIS)

    Zhao, Chunyan; Zheng, Zhiming; Zhou, Haijun; Xu, Ke

    2011-01-01

    Message-passing algorithms based on belief propagation (BP) are implemented on a random constraint satisfaction problem (CSP) referred to as model RB, which is a prototype of hard random CSPs with growing domain size. In model RB, the number of candidate discrete values (the domain size) of each variable increases polynomially with the variable number N of the problem formula. Although the satisfiability threshold of model RB is exactly known, finding solutions for a single problem formula is quite challenging and attempts have been limited to cases of N ∼ 10^2. In this paper, we propose two different kinds of message-passing algorithms guided by BP for this problem. Numerical simulations demonstrate that these algorithms allow us to find a solution for random formulas of model RB with constraint tightness slightly less than p_cr, the threshold value for the satisfiability phase transition. To evaluate the performance of these algorithms, we also provide a local search algorithm (random walk) as a comparison. Besides this, the simulated time dependence of the problem size N and the entropy of the variables for growing domain size are discussed

  6. The GRB variability/peak luminosity correlation: new results

    International Nuclear Information System (INIS)

    Guidorzi, C.; Rossi, F.; Hurley, K.; Mundell, C.G.

    2005-01-01

    We test the correlation between time variability and isotropic-equivalent peak luminosity found by Reichart et al. (ApJ, 552 (2001) 57) using a set of 26 Gamma-Ray Bursts (GRBs) with known redshift. We confirm the correlation, though with a larger spread around the best-fit power law obtained by Reichart et al., which in turn no longer provides an acceptable description. In addition, we find no evidence for correlation between variability and beaming-corrected peak luminosity for a subset of 14 GRBs whose beaming angles have been taken from Ghirlanda et al. (ApJ, 616 (2004) 331). Finally, we investigate the possible connection for some GRBs between the location in the variability/peak luminosity space and some afterglow properties, such as the detectability in the optical band, by adding some GRBs whose redshifts, unknown from direct measurements, have been derived assuming the Amati et al. (A&A, 390 (2002) 81) relationship

  7. Carcinoma of Unknown Primary Treatment (PDQ®)—Patient Version

    Science.gov (United States)

    Carcinoma of unknown primary (CUP), treatment can include surgery, radiation therapy, chemotherapy, or hormone therapy. Get detailed information about the diagnosis and treatment of CUP in this expert-reviewed summary.

  8. Financial Management of a Large Multi-site Randomized Clinical Trial

    Science.gov (United States)

    Sheffet, Alice J.; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E.; Longbottom, Mary E.; Howard, Virginia J.; Marler, John R.; Brott, Thomas G.

    2014-01-01

    Background The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years’ funding ($21,112,866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2,500 randomized participants at 40 sites. Aims Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Methods Projections of the original grant’s fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant’s fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Results Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2,500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1,387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13,845) of the projected per-patient costs ($152,992) of the fixed model. Conclusions Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. PMID:24661748

  9. Melanoma of unknown origin: a case series.

    LENUS (Irish Health Repository)

    Kelly, J

    2010-12-01

    The natural history of metastatic melanoma involving lymph nodes, in the absence of a known primary site (cutaneous, ocular or mucosal) has, to date, been poorly defined; and the optimal management of this rare subtype of disease is therefore unclear. Melanomas of unknown primary site (MUP) are estimated to comprise between 3.7 and 6% of all melanomas (Anbari et al. in Cancer 79:1816-1821, 1997).

  10. A systematic review of variables associated with sleep paralysis

    OpenAIRE

    Denis, Dan; French, Christopher C.; Gregory, Alice M.

    2017-01-01

    Sleep paralysis is a relatively common but under-researched phenomenon. While the causes are unknown, a number of studies have investigated potential risk factors. In this article, we conducted a systematic review on the available literature regarding variables associated with both the frequency and intensity of sleep paralysis episodes. A total of 42 studies met the inclusion criteria. For each study, sample size, study site, sex and age of participants, sleep paralysis measure, and results ...

  11. Atmospheric turbulence profiling with unknown power spectral density

    Science.gov (United States)

    Helin, Tapio; Kindermann, Stefan; Lehtonen, Jonatan; Ramlau, Ronny

    2018-04-01

    Adaptive optics (AO) is a technology used in modern ground-based optical telescopes to compensate for the wavefront distortions caused by atmospheric turbulence. One method that allows information about the atmosphere to be retrieved from telescope data is so-called SLODAR, where the atmospheric turbulence profile is estimated from correlation data of Shack-Hartmann wavefront measurements. This approach relies on a layered Kolmogorov turbulence model. In this article, we propose a novel extension of the SLODAR concept by including a general non-Kolmogorov turbulence layer close to the ground with an unknown power spectral density. We prove that the joint estimation problem of the turbulence profile above ground simultaneously with the unknown power spectral density at the ground is ill-posed and propose three numerical reconstruction methods. We demonstrate by numerical simulations that our methods lead to substantial improvements in the turbulence profile reconstruction compared to the standard SLODAR-type approach. Our methods can also accurately locate local perturbations in non-Kolmogorov power spectral densities.

  12. Neurological Autoantibody Prevalence in Epilepsy of Unknown Etiology.

    Science.gov (United States)

    Dubey, Divyanshu; Alqallaf, Abdulradha; Hays, Ryan; Freeman, Matthew; Chen, Kevin; Ding, Kan; Agostini, Mark; Vernino, Steven

    2017-04-01

    Autoimmune epilepsy is an underrecognized condition, and its true incidence is unknown. Identifying patients with an underlying autoimmune origin is critical because these patients' condition may remain refractory to conventional antiseizure medications but may respond to immunotherapy. To determine the prevalence of neurological autoantibodies (Abs) among adult patients with epilepsy of unknown etiology. Consecutive patients presenting to neurology services with new-onset epilepsy or established epilepsy of unknown etiology were identified. Serum samples were tested for autoimmune encephalitis Abs as well as thyroperoxidase (TPO) and glutamic acid decarboxylase 65 (GAD65) Abs. An antibody prevalence in epilepsy (APE) score based on clinical characteristics was assigned prospectively. Data were collected from June 1, 2015, to June 1, 2016. Presence of neurological Abs. A score based on clinical characteristics was assigned to estimate the probability of seropositivity prior to antibody test results. Good seizure outcome was estimated on the basis of significant reduction of seizure frequency at the first follow-up or seizure freedom. Of the 127 patients (68 males and 59 females) enrolled in the study, 15 were subsequently excluded after identification of an alternative diagnosis. Serum Abs suggesting a potential autoimmune etiology were detected in 39 (34.8%) cases. More than 1 Ab was detected in 7 patients (6.3%): 3 (2.7%) had TPO-Ab and voltage-gated potassium channel complex (VGKCc) Ab, 2 (1.8%) had GAD65-Ab and VGKCc-Ab, 1 had TPO-Ab and GAD65-Ab, and 1 had anti-Hu Ab and GAD65-Ab. Thirty-two patients (28.6%) had a single Ab marker. Among 112 patients included in the study, 15 (13.4%) had TPO-Ab, 14 (12.5%) had GAD65-Ab, 12 (10.7%) had VGKCc (4 of whom were positive for leucine-rich glioma-inactivated protein 1 [LGI1] Ab), and 4 (3.6%) had N-methyl-D-aspartate receptor (NMDAR) Ab. Even after excluding TPO-Ab and low-titer GAD65-Ab, Abs strongly suggesting an

  13. On a direct algorithm for the generation of log-normal pseudo-random numbers

    CERN Document Server

    Chamayou, J M F

    1976-01-01

    The random variable (∏_{i=1}^{n} X_i / X_{i+n})^{1/√(2n)} is used to generate standard log-normal variables Λ(0, 1), where the X_i are independent uniform variables on (0, 1). (8 refs).
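    The construction above is simple enough to sketch in a few lines. A minimal illustration (the function name and the choice n = 16 are assumptions, not from the paper):

```python
import math
import random

def lognormal01(n=16, rng=random.random):
    """One draw approximating a standard log-normal variable Lambda(0, 1).

    Implements (prod_{i=1}^{n} X_i / X_{i+n})^(1 / sqrt(2n)) with the X_i
    i.i.d. uniform on (0, 1): each log X_i is a negative Exp(1) variable, so
    the scaled sum of the n log-ratios is approximately N(0, 1) by the
    central limit theorem, and its exponential is approximately log-normal.
    """
    x = [rng() for _ in range(2 * n)]
    s = sum(math.log(x[i]) - math.log(x[i + n]) for i in range(n))
    return math.exp(s / math.sqrt(2 * n))
```

    Taking logs of many draws and checking that their mean is near 0 and variance near 1 is a quick sanity check of the construction.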

  14. Sources of variability in consonant perception of normal-hearing listeners

    DEFF Research Database (Denmark)

    Zaar, Johannes; Dau, Torsten

    2015-01-01

    Responses obtained in consonant perception experiments typically show a large variability across stimuli of the same phonetic identity. The present study investigated the influence of different potential sources of this response variability. It was distinguished between source-induced variability, referring to perceptual differences caused by acoustical differences in the speech tokens and/or the masking noise tokens, and receiver-related variability, referring to perceptual differences caused by within- and across-listener uncertainty. Consonant-vowel combinations consisting of 15 consonants ... The speech-induced variability across and within talkers and the across-listener variability were substantial and of similar magnitude. The noise-induced variability, obtained with time-shifted realizations of the same random process, was smaller but significantly larger than the amount ... between responses.

  15. Offshore limit of coastal ocean variability identified from hydrography and altimeter data in the eastern Arabian Sea

    Digital Repository Service at National Institute of Oceanography (India)

    Antony, M.K.; Swamy, G.N.; Somayajulu, Y.K.

    In this communication, we describe a hitherto-unknown offshore limit to the coastal ocean variability signatures away from the continental shelf in the eastern Arabian Sea, based on hydrographic observations and satellite altimeter (TOPEX...

  16. Analytic regularity and collocation approximation for elliptic PDEs with random domain deformations

    KAUST Repository

    Castrillon, Julio; Nobile, Fabio; Tempone, Raul

    2016-01-01

    In this work we consider the problem of approximating the statistics of a given Quantity of Interest (QoI) that depends on the solution of a linear elliptic PDE defined over a random domain parameterized by N random variables. The elliptic problem

  17. Robust video watermarking via optimization algorithm for quantization of pseudo-random semi-global statistics

    Science.gov (United States)

    Kucukgoz, Mehmet; Harmanci, Oztan; Mihcak, Mehmet K.; Venkatesan, Ramarathnam

    2005-03-01

    In this paper, we propose a novel semi-blind video watermarking scheme, where we use pseudo-random robust semi-global features of video in the three dimensional wavelet transform domain. We design the watermark sequence via solving an optimization problem, such that the features of the mark-embedded video are the quantized versions of the features of the original video. The exact realizations of the algorithmic parameters are chosen pseudo-randomly via a secure pseudo-random number generator, whose seed is the secret key, that is known (resp. unknown) by the embedder and the receiver (resp. by the public). We experimentally show the robustness of our algorithm against several attacks, such as conventional signal processing modifications and adversarial estimation attacks.
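    The core idea of embedding by quantizing features can be illustrated with plain quantization index modulation (QIM) on a scalar feature. This is a simplified stand-in for the paper's optimization-based scheme, not its actual algorithm; the delta step size is illustrative:

```python
def qim_embed(feature, bit, delta=1.0):
    """Embed one watermark bit by quantizing a feature onto one of two
    interleaved lattices: multiples of delta for bit 0, multiples of
    delta shifted by delta/2 for bit 1."""
    offset = delta / 2 if bit else 0.0
    return round((feature - offset) / delta) * delta + offset

def qim_detect(feature, delta=1.0):
    """Decode the bit as whichever lattice the received feature is
    nearer to; correct for perturbations smaller than delta/4."""
    residual = feature - round(feature / delta) * delta
    return int(abs(residual) > delta / 4)
```

    The embedded feature survives small distortions because detection only asks which lattice is closer, which is the robustness property the quantization-based schemes rely on.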

  18. Properties and simulation of α-permanental random fields

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    An α-permanental random field is briefly speaking a model for a collection of random variables with positive associations, where α is a positive number and the probability generating function is given in terms of a covariance or more general function, so that density and moment expressions are given by certain α-permanents. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their potential applications. The purpose of this paper is first to summarize useful probabilistic results using the simplest possible setting, and second to study stochastic constructions and simulation techniques, which should provide a useful basis for discussing the statistical aspects in future work. The paper also discusses some examples of α-permanental random fields.

  19. Control of variable speed variable pitch wind turbine based on a disturbance observer

    Science.gov (United States)

    Ren, Haijun; Lei, Xin

    2017-11-01

    In this paper, a novel sliding mode controller based on disturbance observer (DOB) to optimize the efficiency of variable speed variable pitch (VSVP) wind turbine is developed and analyzed. Due to the highly nonlinearity of the VSVP system, the model is linearly processed to obtain the state space model of the system. Then, a conventional sliding mode controller is designed and a DOB is added to estimate wind speed. The proposed control strategy can successfully deal with the random nature of wind speed, the nonlinearity of VSVP system, the uncertainty of parameters and external disturbance. Via adding the observer to the sliding mode controller, it can greatly reduce the chattering produced by the sliding mode switching gain. The simulation results show that the proposed control system has the effectiveness and robustness.

  20. A Generalized Random Regret Minimization Model

    NARCIS (Netherlands)

    Chorus, C.G.

    2013-01-01

    This paper presents, discusses and tests a generalized Random Regret Minimization (G-RRM) model. The G-RRM model is created by replacing a fixed constant in the attribute-specific regret functions of the RRM model, by a regret-weight variable. Depending on the value of the regret-weights, the G-RRM

  1. Bilirubin as a potential causal factor in type 2 diabetes risk: a Mendelian randomization study

    Science.gov (United States)

    Abbasi, Ali; Deetman, Petronella E.; Corpeleijn, Eva; Gansevoort, Ron T.; Gans, Rijk O.B.; Hillege, Hans L.; van der Harst, Pim; Stolk, Ronald P.; Navis, Gerjan; Alizadeh, Behrooz Z.; Bakker, Stephan J.L.

    2014-01-01

    Circulating bilirubin, a natural antioxidant, is associated with decreased risk of type 2 diabetes (T2D), but the nature of the relationship remains unknown. We performed Mendelian randomization in a prospective cohort of 3,381 participants free of diabetes at baseline (aged 28-75 years; women, 52.6%). We used rs6742078, located in the UDP-glucuronosyltransferase (UGT1A1) locus, as an instrumental variable (IV) to study a potential causal effect of serum total bilirubin on T2D risk. T2D developed in a total of 210 (6.2%) participants during a median follow-up of 7.8 years. In adjusted analyses, rs6742078, which explained 19.5% of bilirubin variation, was strongly associated with total bilirubin (a 0.68-SD increase in bilirubin levels per T allele; P ...). Per 1-SD increase in bilirubin levels, we observed a 25% (OR 0.75 [95% CI, 0.62-0.92]; P = 0.004) lower risk of T2D. In Mendelian randomization analysis, the causal risk reduction for T2D was estimated to be 42% (causal OR per 1-SD increase in log-transformed bilirubin, IV estimation, 0.58 [95% CI, 0.39-0.84]; P = 0.005), which was comparable to the observational estimate (Durbin-Wu-Hausman chi-square test, P for difference = 0.19). These novel results provide evidence that elevated bilirubin is causally associated with risk of T2D and support its role as a protective determinant. PMID:25368098
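    The single-instrument Mendelian randomization step can be illustrated with the standard Wald ratio estimator, a common simplification of IV estimation (the numbers in the check below are made up for illustration, not taken from this study):

```python
import math

def wald_iv_or(beta_gene_outcome_log_or, beta_gene_exposure_sd):
    """Causal odds ratio per 1-SD increase in the exposure via the Wald
    ratio: the gene-outcome log-OR divided by the gene-exposure effect
    (in SD units per allele), exponentiated back to the OR scale."""
    return math.exp(beta_gene_outcome_log_or / beta_gene_exposure_sd)
```

    Because the variant shifts the exposure by less than 1 SD per allele, a modest per-allele outcome association scales up to a stronger per-SD causal estimate, which is why the causal OR can exceed the observational one in magnitude.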

  2. A Composite Likelihood Inference in Latent Variable Models for Ordinal Longitudinal Responses

    Science.gov (United States)

    Vasdekis, Vassilis G. S.; Cagnone, Silvia; Moustaki, Irini

    2012-01-01

    The paper proposes a composite likelihood estimation approach that uses bivariate instead of multivariate marginal probabilities for ordinal longitudinal responses using a latent variable model. The model considers time-dependent latent variables and item-specific random effects to be accountable for the interdependencies of the multivariate…

  3. Genetic variability of Artemisia capillaris (Wormwood capillary) by ...

    African Journals Online (AJOL)

    The genetic variability among individuals of Artemisia capillaris from the state of Terengganu, Malaysia, was examined using the random amplified polymorphic DNA (RAPD) technique. The samples were collected from different regions of Terengganu State. The genomic DNA was extracted from the leaves of the samples.

  4. Relationship of suicide rates with climate and economic variables in Europe during 2000-2012

    DEFF Research Database (Denmark)

    Fountoulakis, Konstantinos N; Chatzikosta, Isaia; Pastiadis, Konstantinos

    2016-01-01

    BACKGROUND: It is well known that suicide rates vary considerably among European countries and the reasons for this are unknown, although several theories have been proposed. The effect of economic variables has been extensively studied, but not that of climate. METHODS: Data from 29 European countries covering the years 2000-2012 and concerning male and female standardized suicide rates (according to the WHO), economic variables (according to the World Bank) and climate variables were gathered. The statistical analysis included cluster and principal component analysis and categorical regression. RESULTS: The derived models explained 62.4% of the variability of male suicide rates. Economic variables alone explained 26.9% and climate variables 37.6%. For females, the respective figures were 41.7, 11.5 and 28.1%. Male suicides correlated with high unemployment rate in the frame of high growth rate and high ...

  5. Arbitrary-step randomly delayed robust filter with application to boost phase tracking

    Science.gov (United States)

    Qin, Wutao; Wang, Xiaogang; Bai, Yuliang; Cui, Naigang

    2018-04-01

    Conventional filters such as the extended Kalman filter, unscented Kalman filter and cubature Kalman filter assume that the measurement is available in real time and that the measurement noise is Gaussian white noise. In practice, both assumptions are invalid. To solve this problem, a novel algorithm is proposed in four steps. First, the measurement model is modified with Bernoulli random variables to describe the random delay. Then, the expressions for the predicted measurement and covariance are reformulated, removing the restriction that the maximum delay must be one or two steps and the assumption that the probabilities of the Bernoulli random variables taking the value one are equal. Next, the arbitrary-step randomly delayed high-degree cubature Kalman filter is derived based on the 5th-degree spherical-radial rule and the reformulated expressions. Finally, this filter is modified into the arbitrary-step randomly delayed high-degree cubature Huber-based filter using the Huber technique, which is essentially an M-estimator. The proposed filter is therefore robust both to randomly delayed measurements and to glint noise. The application to a boost-phase tracking example demonstrates the superiority of the proposed algorithms.
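    The Bernoulli delay mechanism can be sketched as a simulation of the measurement stream a filter would actually receive. This toy function is an illustration only (names, p_delay and max_delay are assumptions; the paper handles the delay indicators analytically inside the filter rather than by simulation):

```python
import random

def delayed_stream(measurements, p_delay=0.3, max_delay=3, seed=1):
    """Simulate arbitrary-step randomly delayed measurements.

    At time k the sensor delivers y_{k-d_k}, where the delay d_k is built
    from independent Bernoulli(p_delay) draws: keep stepping back while a
    draw succeeds, up to max_delay steps (and never before time 0).
    """
    rng = random.Random(seed)
    out = []
    for k, _ in enumerate(measurements):
        d = 0
        while d < max_delay and d < k and rng.random() < p_delay:
            d += 1
        out.append(measurements[k - d])
    return out
```

    Feeding such a stream to a filter that assumes real-time measurements shows the mismatch the reformulated prediction equations are designed to remove.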

  6. Design of a DNA chip for detection of unknown genetically modified organisms (GMOs).

    Science.gov (United States)

    Nesvold, Håvard; Kristoffersen, Anja Bråthen; Holst-Jensen, Arne; Berdal, Knut G

    2005-05-01

    Unknown genetically modified organisms (GMOs) have not undergone a risk evaluation, and hence might pose a danger to health and environment. There are, today, no methods for detecting unknown GMOs. In this paper we propose a novel method intended as a first step in an approach for detecting unknown genetically modified (GM) material in a single plant. A model is designed where biological and combinatorial reduction rules are applied to a set of DNA chip probes containing all possible sequences of uniform length n, creating probes capable of detecting unknown GMOs. The model is theoretically tested for Arabidopsis thaliana Columbia, and the probabilities for detecting inserts and receiving false positives are assessed for various parameters for this organism. From a theoretical standpoint, the model looks very promising but should be tested further in the laboratory. The model and algorithms will be available upon request to the corresponding author.

  7. The Effect of Known-and-Unknown Word Combinations on Intentional Vocabulary Learning

    Science.gov (United States)

    Kasahara, Kiwamu

    2011-01-01

    The purpose of this study is to examine whether learning a known-and-unknown word combination is superior in terms of retention and retrieval of meaning to learning a single unknown word. The term "combination" in this study means a two-word collocation of a familiar word and a word that is new to the participants. Following the results of…

  8. High Precision Fast Projective Synchronization for Chaotic Systems with Unknown Parameters

    Science.gov (United States)

    Nian, Fuzhong; Wang, Xingyuan; Lin, Da; Niu, Yujun

    2013-08-01

    A high-precision fast projective synchronization method for chaotic systems with unknown parameters was proposed by introducing an optimal matrix. Numerical simulations indicate that the precision is improved by about three orders of magnitude compared with other common methods under the same software and hardware conditions. Moreover, when the average error is less than 10^-3, synchronization is about 6500 times faster than with common methods, requiring only 4 iterations. The unknown parameters were also identified rapidly. Theoretical analysis and proofs are also given.

  9. Machine learning search for variable stars

    Science.gov (United States)

    Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis

    2018-04-01

    Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
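    Of the classifiers listed, kNN is the simplest to sketch. The toy example below classifies 2-D feature vectors (think of two variability indices, e.g. scatter and lag-1 autocorrelation) rather than the paper's 18 features; the function and data are illustrative, not the authors' pipeline:

```python
import math

def knn_predict(train, labels, x, k=3):
    """Classify a light curve's feature vector by majority vote among
    its k nearest neighbours (Euclidean distance in index space)."""
    dists = sorted((math.dist(t, x), lab) for t, lab in zip(train, labels))
    top = [lab for _, lab in dists[:k]]
    return max(set(top), key=top.count)
```

    In practice the training set would be light curves already labelled by the traditional search, exactly as OGLE-II LMC photometry is used in the paper.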

  10. Context-invariant quasi hidden variable (qHV) modelling of all joint von Neumann measurements for an arbitrary Hilbert space

    International Nuclear Information System (INIS)

    Loubenets, Elena R.

    2015-01-01

    We prove the existence for each Hilbert space of two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations, but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper point also to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].

  11. Genetic variability of cultivated cowpea in Benin assessed by random amplified polymorphic DNA

    NARCIS (Netherlands)

    Zannou, A.; Kossou, D.K.; Ahanchédé, A.; Zoundjihékpon, J.; Agbicodo, E.; Struik, P.C.; Sanni, A.

    2008-01-01

    Characterization of genetic diversity among cultivated cowpea [Vigna unguiculata (L.) Walp.] varieties is important to optimize the use of available genetic resources by farmers, local communities, researchers and breeders. Random amplified polymorphic DNA (RAPD) markers were used to evaluate the

  12. Does epicatechin contribute to the acute vascular function effects of dark chocolate? A randomized, crossover study

    NARCIS (Netherlands)

    Dower, James I.; Geleijnse, Marianne; Kroon, Paul A.; Philo, Mark; Mensink, Marco; Kromhout, Daan; Hollman, Peter C.H.

    2016-01-01

    Scope: Cocoa, rich in flavan-3-ols, improves vascular function, but the contribution of specific flavan-3-ols is unknown. We compared the effects of pure epicatechin, a major cocoa flavan-3-ol, and chocolate. Methods and results: In a randomized crossover study, twenty healthy men (40-80 years)

  13. The 'emergent scaling' phenomenon and the dielectric properties of random resistor-capacitor networks

    CERN Document Server

    Bouamrane, R

    2003-01-01

    An efficient algorithm, based on the Frank-Lobb reduction scheme, for calculating the equivalent dielectric properties of very large random resistor-capacitor (R-C) networks has been developed. It has been used to investigate the network size and composition dependence of dielectric properties and their statistical variability. The dielectric properties of 256 samples of random networks containing: 512, 2048, 8192 and 32 768 components distributed randomly in the ratios 60% R-40% C, 50% R-50% C and 40% R-60% C have been computed. It has been found that these properties exhibit the anomalous power law dependences on frequency known as the 'universal dielectric response' (UDR). Attention is drawn to the contrast between frequency ranges across which percolation determines dielectric response, where considerable variability is found amongst the samples, and those across which power laws define response where very little variability is found between samples. It is concluded that the power law UDRs are emergent pr...

  14. Introduction to Bayesian statistics

    CERN Document Server

    Koch, Karl-Rudolf

    2007-01-01

    This book presents Bayes' theorem, the estimation of unknown parameters, the determination of confidence regions and the derivation of tests of hypotheses for the unknown parameters. It does so in a simple manner that is easy to comprehend. The book compares traditional and Bayesian methods with the rules of probability presented in a logical way allowing an intuitive understanding of random variables and their probability distributions to be formed.

  15. A Comparison of the Prognostic Value of Early PSA Test-Based Variables Following External Beam Radiotherapy, With or Without Preceding Androgen Deprivation: Analysis of Data From the TROG 96.01 Randomized Trial

    International Nuclear Information System (INIS)

    Lamb, David S.; Denham, James W.; Joseph, David; Matthews, John; Atkinson, Chris; Spry, Nigel A.; Duchesne, Gillian; Ebert, Martin; Steigler, Allison; Delahunt, Brett; D'Este, Catherine

    2011-01-01

    Purpose: We sought to compare the prognostic value of early prostate-specific antigen (PSA) test-based variables for the 802 eligible patients treated in the Trans-Tasman Radiation Oncology Group 96.01 randomized trial. Methods and Materials: Patients in this trial had T2b, T2c, T3, and T4 N0 prostate cancer and were randomized to 0, 3, or 6 months of neoadjuvant androgen deprivation therapy (NADT) prior to and during radiation treatment at 66 Gy to the prostate and seminal vesicles. The early PSA test-based variables evaluated were the pretreatment initial PSA (iPSA) value, PSA values at 2 and 4 months into NADT, the PSA nadir (nPSA) value after radiation in all patients, and PSA response signatures in men receiving radiation. Comparisons of endpoints were made using Cox models of local progression-free survival, distant failure-free survival, biochemical failure-free survival, and prostate cancer-specific survival. Results: The nPSA value was a powerful predictor of all endpoints regardless of whether NADT was given before radiation. PSA response signatures also predicted all endpoints in men treated by radiation alone. iPSA and PSA results at 2 and 4 months into NADT predicted biochemical failure-free survival but not any of the clinical endpoints. nPSA values correlated with those of iPSA, Gleason grade, and T stage and were significantly higher in men receiving radiation alone than in those receiving NADT. Conclusions: The postradiation nPSA value is the strongest prognostic indicator of all early PSA-based variables. However, its use as a surrogate endpoint needs to take into account its dependence on pretreatment variables and treatment method.

  16. Blood Pressure Variability and Cognitive Function Among Older African Americans: Introducing a New Blood Pressure Variability Measure.

    Science.gov (United States)

    Tsang, Siny; Sperling, Scott A; Park, Moon Ho; Helenius, Ira M; Williams, Ishan C; Manning, Carol

    2017-09-01

    Although blood pressure (BP) variability has been reported to be associated with cognitive impairment, whether this relationship affects African Americans has been unclear. We sought correlations between systolic and diastolic BP variability and cognitive function in community-dwelling older African Americans, and introduced a new BP variability measure that can be applied to BP data collected in clinical practice. We assessed cognitive function in 94 cognitively normal older African Americans using the Mini-Mental State Examination (MMSE) and the Computer Assessment of Mild Cognitive Impairment (CAMCI). We used BP measurements taken at the patients' three most recent primary care clinic visits to generate three traditional BP variability indices, range, standard deviation, and coefficient of variation, plus a new index, random slope, which accounts for unequal BP measurement intervals within and across patients. MMSE scores did not correlate with any of the BP variability indices. Patients with greater diastolic BP variability were less accurate on the CAMCI verbal memory and incidental memory tasks. Results were similar across the four BP variability indices. In a sample of cognitively intact older African American adults, BP variability did not correlate with global cognitive function, as measured by the MMSE. However, higher diastolic BP variability correlated with poorer verbal and incidental memory. By accounting for differences in BP measurement intervals, our new BP variability index may help alert primary care physicians to patients at particular risk for cognitive decline.
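    The three traditional indices, plus a single-patient simplification of the random-slope idea (an ordinary least-squares slope over actual visit days, standing in for the paper's mixed-model random slope that pools patients and handles unequal measurement intervals), can be computed as follows; the function and field names are illustrative:

```python
import statistics

def bp_variability(bp, days):
    """Visit-to-visit BP variability indices from irregularly spaced visits.

    bp   -- BP readings (e.g. diastolic, mmHg), one per clinic visit
    days -- days since the first visit, same length as bp
    Returns range, standard deviation, coefficient of variation, and an
    OLS slope of BP against time (mmHg per day) for this one patient.
    """
    mean = statistics.fmean(bp)
    sd = statistics.stdev(bp)
    dbar = statistics.fmean(days)
    num = sum((d - dbar) * (b - mean) for d, b in zip(days, bp))
    den = sum((d - dbar) ** 2 for d in days)
    return {
        "range": max(bp) - min(bp),
        "sd": sd,
        "cv": sd / mean,
        "slope_per_day": num / den,
    }
```

    Unlike range, SD and CV, the slope uses the actual visit dates, which is the property that motivates the random-slope index when visit intervals differ within and across patients.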

  17. Unknown quantum states: The quantum de Finetti representation

    International Nuclear Information System (INIS)

    Caves, Carlton M.; Fuchs, Christopher A.; Schack, Ruediger

    2002-01-01

    We present an elementary proof of the quantum de Finetti representation theorem, a quantum analog of de Finetti's classical theorem on exchangeable probability assignments. This contrasts with the original proof of Hudson and Moody [Z. Wahrschein. verw. Geb. 33, 343 (1976)], which relies on advanced mathematics and does not share the same potential for generalization. The classical de Finetti theorem provides an operational definition of the concept of an unknown probability in Bayesian probability theory, where probabilities are taken to be degrees of belief instead of objective states of nature. The quantum de Finetti theorem, in a closely analogous fashion, deals with exchangeable density-operator assignments and provides an operational definition of the concept of an "unknown quantum state" in quantum-state tomography. This result is especially important for information-based interpretations of quantum mechanics, where quantum states, like probabilities, are taken to be states of knowledge rather than states of nature. We further demonstrate that the theorem fails for real Hilbert spaces and discuss the significance of this point

  18. Return probabilities for the reflected random walk on N_0

    NARCIS (Netherlands)

    Essifi, R.; Peigné, M.

    2015-01-01

    Let \(Y_n\) be a sequence of i.i.d. \(\mathbb{Z}\)-valued random variables with law \(\mu\). The reflected random walk \((X_n)\) is defined recursively by \(X_0 = x \in \mathbb{N}_0\), \(X_{n+1} = |X_n + Y_{n+1}|\). Under mild hypotheses on the law \(\mu\), it is proved that, for any \(y \in
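    The recursion X_{n+1} = |X_n + Y_{n+1}| is easy to simulate. The symmetric ±1 increment law below is purely illustrative (the paper treats a general Z-valued law μ):

```python
import random

def reflected_walk(steps, x0=0, seed=42):
    """Simulate the reflected random walk X_{n+1} = |X_n + Y_{n+1}| on the
    non-negative integers, with illustrative i.i.d. increments Y_n = +/-1."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        y = rng.choice((-1, 1))
        x = abs(x + y)  # reflection at 0 keeps the walk in N_0
        path.append(x)
    return path
```

    With ±1 increments the reflection only acts at 0, where both increment signs send the walk to 1; for general μ the walk can jump below 0 from any state and be folded back.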

  19. A randomized trial of high-dairy-protein, variable-carbohydrate diets and exercise on body composition in adults with obesity.

    Science.gov (United States)

    Parr, Evelyn B; Coffey, Vernon G; Cato, Louise E; Phillips, Stuart M; Burke, Louise M; Hawley, John A

    2016-05-01

    This study determined the effects of 16-week high-dairy-protein, variable-carbohydrate (CHO) diets and exercise training (EXT) on body composition in men and women with overweight/obesity. One hundred and eleven participants (age 47 ± 6 years, body mass 90.9 ± 11.7 kg, BMI 33 ± 4 kg/m(2) , values mean ± SD) were randomly stratified to diets with either: high dairy protein, moderate CHO (40% CHO: 30% protein: 30% fat; ∼4 dairy servings); high dairy protein, high CHO (55%: 30%: 15%; ∼4 dairy servings); or control (55%: 15%: 30%; ∼1 dairy serving). Energy restriction (500 kcal/day) was achieved through diet (∼250 kcal/day) and EXT (∼250 kcal/day). Body composition was measured using dual-energy X-ray absorptiometry before, midway, and upon completion of the intervention. Eighty-nine (25 M/64 F) of 115 participants completed the 16-week intervention, losing 7.7 ± 3.2 kg fat mass (P ...) ... exercise stimulus. © 2016 The Obesity Society.

  20. Intrathecal immunoglobulin synthesis in patients with symptomatic epilepsy and epilepsy of unknown etiology ('cryptogenic').

    Science.gov (United States)

    Fauser, S; Soellner, C; Bien, C G; Tumani, H

    2017-09-01

    To compare the frequency of intrathecal immunoglobulin (Ig) synthesis in patients with symptomatic epilepsy and epilepsy of unknown etiology ('cryptogenic'). Patients with epileptic (n = 301) and non-epileptic (n = 10) seizures were retrospectively screened for autochthonous intrathecal Ig synthesis and oligoclonal bands (OCBs) in the cerebrospinal fluid. Intrathecal IgG/OCBs were detected in 8% of patients with epilepsies of unknown etiology, 5% of patients with first seizures of unknown cause and 0-4% of patients with epilepsy due to brain tumors, cerebrovascular disease or other etiologies. Intrathecal IgG/OCBs were not seen in patients with psychogenic seizures. Identical OCBs in serum and cerebrospinal fluid were more common in all patient groups (10-40% depending on underlying etiology). Intrathecal IgG synthesis/OCBs were observed slightly more frequently in patients with 'cryptogenic' epilepsy and with first seizures of unknown etiology than in other patient groups. However, this remained an infrequent finding and thus we could not confirm humoral immunity as a leading disease mechanism in patients with epilepsy in general or with unknown etiology in particular. © 2017 EAN.

  1. Random small interfering RNA library screen identifies siRNAs that induce human erythroleukemia cell differentiation.

    Science.gov (United States)

    Fan, Cuiqing; Xiong, Yuan; Zhu, Ning; Lu, Yabin; Zhang, Jiewen; Wang, Song; Liang, Zicai; Shen, Yan; Chen, Meihong

    2011-03-01

    Cancers are characterized by poor differentiation. Differentiation therapy is a strategy to alleviate malignant phenotypes by inducing cancer cell differentiation. Here we carried out a combinatorial high-throughput screen with a random siRNA library on human erythroleukemia K-562 cell differentiation. Two siRNAs screened from the library were validated to be able to induce erythroid differentiation to varying degrees, determined by CD235 and globin up-regulation, GATA-2 down-regulation, and cell growth inhibition. The screen we performed here is the first trial of screening cancer differentiation-inducing agents from a random siRNA library, demonstrating that a random siRNA library can be considered as a new resource in efforts to seek new therapeutic agents for cancers. As a random siRNA library has a broad coverage for the entire genome, including known/unknown genes and protein coding/non-coding sequences, screening using a random siRNA library can be expected to greatly augment the repertoire of therapeutic siRNAs for cancers.

  2. RBF neural network based H∞ synchronization for unknown chaotic ...

    Indian Academy of Sciences (India)

    Synchronization of unknown chaotic systems via a radial basis function neural network (RBFNN), with the controller attenuating the effect of disturbance to an H∞ norm constraint; the system is RBFNN H∞ synchronized if the synchronization error e(t) satisfies the H∞ performance bound. Keywords: unknown chaotic systems; linear matrix inequality (LMI); learning law.

  3. Fuzzy norm method for evaluating random vibration of airborne platform from limited PSD data

    Directory of Open Access Journals (Sweden)

    Wang Zhongyu

    2014-12-01

    Full Text Available For the random vibration of an airborne platform, accurate evaluation is key to ensuring normal operation of airborne equipment in flight. However, only limited power spectral density (PSD) data can be obtained at the flight-test stage, so conventional evaluation methods cannot be employed when the distribution characteristics and prior information are unknown. In this paper, the fuzzy norm method (FNM) is proposed, which combines the advantages of fuzzy theory and norm theory. The proposed method can extract system information from limited data without taking the probability distribution into account. Firstly, the FNM is employed to evaluate the variable interval and expanded uncertainty from limited PSD data, and the performance of FNM is demonstrated by the confidence level, reliability and computing accuracy of the expanded uncertainty. In addition, the optimal fuzzy parameters are discussed to meet the requirements of aviation standards and metrological practice. Finally, computer simulation is used to prove the adaptability of FNM. Compared with statistical methods, FNM is superior for evaluating expanded uncertainty from limited data. The results show that the reliability of calculation and evaluation exceeds 95%.

  4. A phase III randomized trial comparing glucocorticoid monotherapy versus glucocorticoid and rituximab in patients with autoimmune haemolytic anaemia

    DEFF Research Database (Denmark)

    Birgens, Henrik Sverre; Frederiksen, Henrik; Hasselbalch, Hans C

    2013-01-01

    The impact of first-line treatment with the anti-CD 20 chimeric monoclonal antibody rituximab in patients with warm-antibody reactive autoimmune haemolytic anaemia (WAIHA) is unknown. We report the first randomized study of 64 patients with newly diagnosed WAIHA who received prednisolone and ritu...

  5. Unifying parameter estimation and the Deutsch-Jozsa algorithm for continuous variables

    International Nuclear Information System (INIS)

    Zwierz, Marcin; Perez-Delgado, Carlos A.; Kok, Pieter

    2010-01-01

    We reveal a close relationship between quantum metrology and the Deutsch-Jozsa algorithm on continuous-variable quantum systems. We develop a general procedure, characterized by two parameters, that unifies parameter estimation and the Deutsch-Jozsa algorithm. Depending on which parameter we keep constant, the procedure implements either the parameter-estimation protocol or the Deutsch-Jozsa algorithm. The parameter-estimation part of the procedure attains the Heisenberg limit and is therefore optimal. Due to the use of approximate normalizable continuous-variable eigenstates, the Deutsch-Jozsa algorithm is probabilistic. The procedure estimates a value of an unknown parameter and solves the Deutsch-Jozsa problem without the use of any entanglement.

  6. Balancing treatment allocations by clinician or center in randomized trials allows unacceptable levels of treatment prediction.

    Science.gov (United States)

    Hills, Robert K; Gray, Richard; Wheatley, Keith

    2009-08-01

    Randomized controlled trials are the standard method for comparing treatments because they avoid the selection bias that might arise if clinicians were free to choose which treatment a patient would receive. In practice, allocation of treatments in randomized controlled trials is often not wholly random with various 'pseudo-randomization' methods, such as minimization or balanced blocks, used to ensure good balance between treatments within potentially important prognostic or predictive subgroups. These methods avoid selection bias so long as full concealment of the next treatment allocation is maintained. There is concern, however, that pseudo-random methods may allow clinicians to predict future treatment allocations from previous allocation history, particularly if allocations are balanced by clinician or center. We investigate here to what extent treatment prediction is possible. Using computer simulations of minimization and balanced block randomizations, the success rates of various prediction strategies were investigated for varying numbers of stratification variables, including the patient's clinician. Prediction rates for minimization and balanced block randomization typically exceed 60% when clinician is included as a stratification variable and, under certain circumstances, can exceed 80%. Increasing the number of clinicians and other stratification variables did not greatly reduce the prediction rates. Without clinician as a stratification variable, prediction rates are poor unless few clinicians participate. Prediction rates are unacceptably high when allocations are balanced by clinician or by center. This could easily lead to selection bias that might suggest spurious, or mask real, treatment effects. Unless treatment is blinded, randomization should not be balanced by clinician (or by center), and clinician-center effects should be allowed for instead by retrospectively stratified analyses. © 2009 Blackwell Publishing Asia Pty Ltd and Chinese
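
    The prediction mechanism is easy to reproduce in a toy simulation (a sketch under simplified assumptions, not the authors' code): with deterministic minimization balanced on clinician alone, a predictor who always guesses the currently under-represented arm for that clinician is right on every imbalanced allocation and half the time on ties, i.e. about 75% overall.

```python
import random

def prediction_rate(n_patients, n_clinicians, seed=0):
    """Toy minimization balanced on clinician only: each patient is assigned
    to the arm with fewer prior patients from the same clinician (ties broken
    at random), and a predictor guesses the under-represented arm."""
    rng = random.Random(seed)
    counts = [[0, 0] for _ in range(n_clinicians)]
    hits = 0
    for _ in range(n_patients):
        c = rng.randrange(n_clinicians)          # patient's clinician
        a, b = counts[c]
        guess = 0 if a < b else (1 if b < a else rng.randrange(2))
        arm = 0 if a < b else (1 if b < a else rng.randrange(2))
        hits += guess == arm
        counts[c][arm] += 1
    return hits / n_patients
```

    For example, prediction_rate(10000, 10) comes out near 0.75, consistent with the greater-than-60% rates reported above; note this sketch balances on clinician only, with no additional stratification variables.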

  7. On the number of subgraphs of the Barabasi-Albert random graph

    Energy Technology Data Exchange (ETDEWEB)

    Ryabchenko, Aleksandr A; Samosvat, Egor A [Moscow Institute of Physics and Technology (State University), Dolgoprudnyi, Moscow Region (Russian Federation)

    2012-06-30

    We study a model of a random graph of the type of the Barabasi-Albert preferential attachment model. We develop a technique that makes it possible to estimate the mathematical expectation for a fairly wide class of random variables in the model under consideration. We use this technique to prove a theorem on the asymptotics of the mathematical expectation of the number of subgraphs isomorphic to a certain fixed graph in the random graphs of this model.

  8. Content-Based Multimedia Retrieval in the Presence of Unknown User Preferences

    DEFF Research Database (Denmark)

    Beecks, Christian; Assent, Ira; Seidl, Thomas

    2011-01-01

    Content-based multimedia retrieval requires an appropriate similarity model which reflects user preferences. When these preferences are unknown or when the structure of the data collection is unclear, retrieving the most preferable objects the user has in mind is challenging, as the notion...... address the problem of content-based multimedia retrieval in the presence of unknown user preferences. Our idea consists in performing content-based retrieval by considering all possibilities in a family of similarity models simultaneously. To this end, we propose a novel content-based retrieval approach...

  9. A cluster expansion approach to exponential random graph models

    International Nuclear Information System (INIS)

    Yin, Mei

    2012-01-01

    The exponential family of random graphs are among the most widely studied network models. We show that any exponential random graph model may alternatively be viewed as a lattice gas model with a finite Banach space norm. The system may then be treated using cluster expansion methods from statistical mechanics. In particular, we derive a convergent power series expansion for the limiting free energy in the case of small parameters. Since the free energy is the generating function for the expectations of other random variables, this characterizes the structure and behavior of the limiting network in this parameter region

  10. Modeling climatic effects of anthropogenic CO2 emissions: Unknowns and uncertainties

    Science.gov (United States)

    Soon, W.; Baliunas, S.; Idso, S.; Kondratyev, K. Ya.; Posmentier, E. S.

    2001-12-01

    A likelihood of disastrous global environmental consequences has been surmised as a result of projected increases in anthropogenic greenhouse gas emissions. These estimates are based on computer climate modeling, a branch of science still in its infancy despite recent, substantial strides in knowledge. Because the expected anthropogenic climate forcings are relatively small compared to other background and forcing factors (internal and external), the credibility of the modeled global and regional responses rests on the validity of the models. We focus on this important question of climate model validation. Specifically, we review common deficiencies in general circulation model calculations of atmospheric temperature, surface temperature, precipitation and their spatial and temporal variability. These deficiencies arise from complex problems associated with parameterization of multiply-interacting climate components, forcings and feedbacks, involving especially clouds and oceans. We also review examples of expected climatic impacts from anthropogenic CO2 forcing. Given the host of uncertainties and unknowns in the difficult but important task of climate modeling, the unique attribution of observed current climate change to increased atmospheric CO2 concentration, including the relatively well-observed latest 20 years, is not possible. We further conclude that the incautious use of GCMs to make future climate projections from incomplete or unknown forcing scenarios is antithetical to the intrinsically heuristic value of models. Such uncritical application of climate models has led to the commonly-held but erroneous impression that modeling has proven or substantiated the hypothesis that CO2 added to the air has caused or will cause significant global warming. An assessment of the positive skills of GCMs and their use in suggesting a discernible human influence on global climate can be found in the joint World Meteorological Organisation and United Nations

  11. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables, included in the original spectral representation (OSR) formula, could be effectively reduced to only two elementary random variables by introducing the random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully emerged through just several hundred of sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it could be able to implement dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses a good robustness.
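
    For orientation, the classical spectral representation that the OSR formula builds on can be sketched in a few lines (illustrative code; the random-function constraint that compresses all phases into two elementary random variables is omitted here):

```python
import math
import random

def spectral_sample(S, omegas, domega, ts, seed=0):
    """One sample path of a zero-mean stationary process via the classical
    spectral representation X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k),
    with i.i.d. uniform random phases phi_k."""
    rng = random.Random(seed)
    phis = [rng.uniform(0.0, 2.0 * math.pi) for _ in omegas]
    return [sum(math.sqrt(2.0 * S(w) * domega) * math.cos(w * t + phi)
                for w, phi in zip(omegas, phis))
            for t in ts]

# Flat spectrum S = 1 over [0, 1): the process variance is sum_k S(w_k) dw = 1.
omegas = [k * 0.01 for k in range(100)]
samples = [spectral_sample(lambda w: 1.0, omegas, 0.01, [0.0], seed=s)[0]
           for s in range(2000)]
variance = sum(x * x for x in samples) / len(samples)
```

    Across many sample paths the empirical variance reproduces the integral of the spectrum, which is the basic consistency check for any spectral-representation simulator.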

  12. Genetic variability of Amorphophallus muelleri Blume in Java based on Random Amplified Polymorphic DNA

    Directory of Open Access Journals (Sweden)

    DIYAH MARTANTI

    2008-10-01

    Full Text Available Amorphophallus muelleri Blume (Araceae) is valued for its glucomannan content for use in the food industry (healthy diet food), paper industry, pharmacy and cosmetics. The species is triploid (2n=3x=39) and the seed develops apomictically. The present research aims to identify the genetic variability of six populations of A. muelleri from Java (50 accessions in total) using random amplified polymorphic DNA (RAPD). The six populations are, in East Java: (1) Silo-Jember, (2) Saradan-Madiun, (3) IPB (cultivated, from Saradan-Madiun), (4) Panti-Jember, (5) Probolinggo; and in Central Java: (6) Cilacap. The results showed that five RAPD primers generated 42 scorable bands, of which 29 (69.05%) were polymorphic. Band sizes varied from 300 bp to 1.5 kbp. The 50 accessions of A. muelleri were divided into two main clusters; some were grouped according to their populations and some were not. Individual genetic dissimilarity ranged from 0.02 to 0.36. Among the six populations investigated, the Saradan population showed the highest levels of genetic variation, with mean values na = 1.500 ± 0.5061, ne = 1.3174 ± 0.3841, PLP = 50% and He = 0.1832 ± 0.2054, whereas the Silo-Jember population showed the lowest levels, with mean values na = 1.2619 ± 0.4450, ne = 1.1890 ± 0.3507, PLP = 26.19% and He = 0.1048 ± 0.1887. Efforts to conserve, domesticate, cultivate and genetically improve the species should be based on the genetic properties of each population and of individuals within populations; the Saradan population, having the highest level of genetic variation, needs particular attention for its conservation.

  13. Random recurrence equations and ruin in a Markov-dependent stochastic economic environment

    DEFF Research Database (Denmark)

    Collamore, Jeffrey F.

    2009-01-01

    series models.  Our results build upon work of Goldie, who has developed tail asymptotics applicable for independent sequences of random variables subject to a random recurrence equation.  In contrast, we adopt a general approach based on the theory of Harris recurrent Markov chains and the associated...

  14. Time-variant random interval natural frequency analysis of structures

    Science.gov (United States)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method namely, unified interval Chebyshev-based random perturbation method, to tackle hybrid random interval structural natural frequency problem. In the proposed approach, random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation) and Chebyshev surrogate model strategy is incorporated to formulate the statistical information of natural frequency with regards to the interval inputs. The comprehensive analysis framework combines the superiority of both methods in a way that computational cost is dramatically reduced. This presented method is thus capable of investigating the day-to-day based time-variant natural frequency of structures accurately and efficiently under concrete intrinsic creep effect with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of natural frequency are captured through the embedded optimization strategy within the analysis procedure. Three particularly motivated numerical examples with progressive relationship in perspective of both structure type and uncertainty variables are demonstrated to justify the computational applicability, accuracy and efficiency of the proposed method.

  15. Critical Behavior of the Annealed Ising Model on Random Regular Graphs

    Science.gov (United States)

    Can, Van Hao

    2017-11-01

    In Giardinà et al. (ALEA Lat Am J Probab Math Stat 13(1):121-161, 2016), the authors have defined an annealed Ising model on random graphs and proved limit theorems for the magnetization of this model on some random graphs including random 2-regular graphs. Then in Can (Annealed limit theorems for the Ising model on random regular graphs, arXiv:1701.08639, 2017), we generalized their results to the class of all random regular graphs. In this paper, we study the critical behavior of this model. In particular, we determine the critical exponents and prove a non-standard limit theorem stating that the magnetization scaled by n^{3/4} converges to a specific random variable, with n the number of vertices of random regular graphs.

  16. Liability for Unknown Risks: A Law and Economics Perspective

    NARCIS (Netherlands)

    M.G. Faure (Michael); L.T. Visscher (Louis); F. Weber (Franziska)

    2017-01-01

    textabstractIn the law and economics literature liability is generally regarded as an instrument which provides potential tortfeasors with incentives for optimal care taking. The question, however, arises whether liability can still provide those incentives when risks are unknown. That is the

  17. Teleportation of Unknown Superpositions of Collective Atomic Coherent States

    Institute of Scientific and Technical Information of China (English)

    ZHENG ShiBiao

    2001-01-01

    We propose a scheme to teleport an unknown superposition of two atomic coherent states with different phases. Our scheme is based on resonant and dispersive atom-field interaction, and provides, for the first time, a possibility of teleporting macroscopic superposition states of many atoms.

  18. Adresse inconnue / Address unknown / Suchwiin Bulmyeong

    OpenAIRE

    Serge Gruzinski

    2005-01-01

    All Asian films speak of cultural mixing (métissage), even those that present themselves as vast historical frescoes lost in time. Borrowings from Hollywood and European traditions have continually enriched a cinema as old as that of the Western world. In Address Unknown (Adresse inconnue / Suchwiin Bulmyeong), the Korean filmmaker Kim Ki-duk explores the experience of métissage and the body of the mixed-race child at the border between North and South Korea. The son of a Black American GI and of a...

  19. Multiple Imputation of Predictor Variables Using Generalized Additive Models

    NARCIS (Netherlands)

    de Jong, Roel; van Buuren, Stef; Spiess, Martin

    2016-01-01

    The sensitivity of multiple imputation methods to deviations from their distributional assumptions is investigated using simulations, where the parameters of scientific interest are the coefficients of a linear regression model, and values in predictor variables are missing at random. The

  20. Randomized central limit theorems: A unified theory.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
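
    The difference between the two scaling schemes is easy to see numerically (a sketch with an arbitrary heavy-tailed scale law, not the paper's Poissonian construction):

```python
import math
import random

def scaled_aggregate(n, seed, random_scales=False):
    """Aggregate n i.i.d. +/-1 components. Deterministic scheme: one common
    scale 1/sqrt(n), as in the classic CLT. Stochastic scheme: each component
    carries its own heavy-tailed random scale (Pareto with infinite variance),
    pulling the aggregate out of the Gaussian universality class."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        scale = rng.paretovariate(1.5) if random_scales else 1.0
        total += scale * rng.choice((-1.0, 1.0))
    return total / math.sqrt(n)

deterministic = [scaled_aggregate(100, s) for s in range(2000)]
stochastic = [scaled_aggregate(100, s, random_scales=True) for s in range(2000)]

def rms(xs):
    """Root-mean-square spread of a sample."""
    return (sum(x * x for x in xs) / len(xs)) ** 0.5
```

    The deterministically scaled aggregates have rms spread near 1 (the Gaussian limit), while the randomly scaled aggregates show a much larger, sample-dominated spread, the signature of a non-Gaussian limit law.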

  1. On grey levels in random CAPTCHA generation

    Science.gov (United States)

    Newton, Fraser; Kouritzin, Michael A.

    2011-06-01

    A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.
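
    The noise-evolution step can be sketched as Gibbs sampling of an Ising-style binary field (illustrative; the paper's grey-level extension adds more states per site):

```python
import math
import random

def gibbs_field(width, height, sweeps, beta, seed=0):
    """Evolve a binary (+1/-1) field by Gibbs sampling: each site is resampled
    conditional on its 4-neighbourhood, which builds the spatial correlation a
    CAPTCHA background texture needs. beta sets the correlation strength."""
    rng = random.Random(seed)
    field = [[rng.choice((-1, 1)) for _ in range(width)] for _ in range(height)]
    for _ in range(sweeps):
        for i in range(height):
            for j in range(width):
                s = sum(field[y][x]
                        for y, x in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= y < height and 0 <= x < width)
                p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
                field[i][j] = 1 if rng.random() < p_up else -1
    return field
```

    At beta around 1.0, neighbouring sites agree far more often than chance, so the noise looks like correlated texture rather than salt-and-pepper.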

  2. Adaptive Tracking and Obstacle Avoidance Control for Mobile Robots with Unknown Sliding

    Directory of Open Access Journals (Sweden)

    Mingyue Cui

    2012-11-01

    Full Text Available An adaptive control approach is proposed for trajectory tracking and obstacle avoidance for mobile robots with consideration given to unknown sliding. A kinematic model of mobile robots is established in this paper, in which both longitudinal and lateral sliding are considered and processed as three time-varying parameters. A sliding model observer is introduced to estimate the sliding parameters online. A stable tracking control law for this nonholonomic system is proposed to compensate the unknown sliding effect. From Lyapunov-stability analysis, it is proved, regardless of unknown sliding, that tracking errors of the controlled closed-loop system are asymptotically stable, the tracking errors converge to zero outside the obstacle detection region and obstacle avoidance is guaranteed inside the obstacle detection region. The efficiency and robustness of the proposed control system are verified by simulation results.

  3. Ergodicity for the Randomly Forced 2D Navier-Stokes Equations

    International Nuclear Information System (INIS)

    Kuksin, Sergei; Shirikyan, Armen

    2001-01-01

    We study space-periodic 2D Navier-Stokes equations perturbed by an unbounded random kick-force. It is assumed that Fourier coefficients of the kicks are independent random variables all of whose moments are bounded and that the distributions of the first N_0 coefficients (where N_0 is a sufficiently large integer) have positive densities against the Lebesgue measure. We treat the equation as a random dynamical system in the space of square integrable divergence-free vector fields. We prove that this dynamical system has a unique stationary measure and study its ergodic properties.

  4. Interaction of random wave-current over uneven and porous bottoms

    International Nuclear Information System (INIS)

    Suo Yaohong; Zhang Zhonghua; Zhang Jiafan; Suo Xiaohong

    2009-01-01

    Starting from linear wave theory, applying Green's second identity, and considering wave-current interaction over porous bottoms and variable water depth, a comprehensive mild-slope equation model for wave-current interaction is developed. Then, to account for the effect of random waves, a model of the interaction between random waves and current over uneven and porous bottoms is established using the method of Kubo et al. Finally, the characteristics of the random waves are discussed numerically from both the geometric-optics approximation and the target spectrum.

  5. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Faith; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the…

  6. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    Science.gov (United States)

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that estimates missing values and then applies variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated into an integrated research dataset based on the ordering of the data. The proposed time-series forecasting model has three foci. First, this study uses five imputation methods to estimate missing values rather than directly deleting them. Second, we identify the key variables via factor analysis and then delete the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listing methods in terms of forecasting error. These experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listing model. In addition, this experiment shows that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.
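
    The first two stages can be sketched with the standard library (mean imputation standing in for the five imputation methods, and a correlation screen standing in for the factor-analysis step; the Random Forest itself would come from a library such as scikit-learn):

```python
import math

def mean_impute(rows):
    """Column-wise mean imputation: replace None entries by the column mean."""
    cols = list(zip(*rows))
    means = [sum(v for v in col if v is not None) /
             sum(1 for v in col if v is not None) for col in cols]
    return [[means[j] if v is None else v for j, v in enumerate(row)]
            for row in rows]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def top_predictors(rows, target, k):
    """Rank predictor columns by |correlation| with the target, keep top k."""
    cols = [list(c) for c in zip(*rows)]
    order = sorted(range(len(cols)),
                   key=lambda j: -abs(pearson(cols[j], target)))
    return order[:k]
```

    The surviving columns would then be fed to the forest regressor; dropping weak predictors before fitting is what the abstract describes as sequential deletion of unimportant variables.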

  7. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  8. Reducing variability in the output of pattern classifiers using histogram shaping

    International Nuclear Information System (INIS)

    Gupta, Shalini; Kan, Chih-Wen; Markey, Mia K.

    2010-01-01

    Purpose: The authors present a novel technique based on histogram shaping to reduce the variability in the output and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs. Methods: The authors identify different sources of variability in the output of linear pattern classifiers with identical ROC curves, which also result in classifiers with differently distributed outputs. They theoretically develop a novel technique based on the matching of the histograms of these differently distributed pattern classifier outputs to reduce the variability in their (sensitivity, specificity) pairs at fixed decision thresholds, and to reduce the variability in their actual output values. They empirically demonstrate the efficacy of the proposed technique by means of analyses on the simulated data and real world mammography data. Results: For the simulated data, with three different known sources of variability, and for the real world mammography data with unknown sources of variability, the proposed classifier output calibration technique significantly reduced the variability in the classifiers' (sensitivity, specificity) pairs at fixed decision thresholds. Furthermore, for classifiers with monotonically or approximately monotonically related output variables, the histogram shaping technique also significantly reduced the variability in their actual output values. Conclusions: Classifier output calibration based on histogram shaping can be successfully employed to reduce the variability in the output values and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs.
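
    The core operation, reshaping one classifier's output histogram onto a reference histogram, amounts to empirical quantile mapping; a sketch (not the authors' exact procedure):

```python
from bisect import bisect_right

def histogram_match(scores, reference):
    """Map each score to the reference value at the same empirical quantile,
    so the transformed outputs share the reference histogram. Rank order, and
    hence the ROC curve, is preserved."""
    src = sorted(scores)
    ref = sorted(reference)
    n, m = len(src), len(ref)
    out = []
    for s in scores:
        q = bisect_right(src, s) / n                  # empirical quantile of s
        idx = min(m - 1, max(0, round(q * m) - 1))    # matching reference rank
        out.append(ref[idx])
    return out
```

    Because the mapping is monotone, two classifiers with identical ROC curves but differently distributed outputs end up with comparable values, so a fixed decision threshold yields similar (sensitivity, specificity) pairs for both.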

  9. Multiple analysis of an unknown optical multilayer coating

    International Nuclear Information System (INIS)

    Dobrowolski, J.A.; Ho, F.C.; Waldorf, A.

    1985-01-01

    Results are given of the analysis at five different laboratories of an unknown optical multilayer coating. In all, eleven different analytical and laboratory techniques were applied to the problem. The multilayer nominally consisted of three dielectric and two metallic layers. It was demonstrated convincingly that with present day techniques it is possible to determine the basic structure of such a coating.

  10. Distributed Optimization Design of Continuous-Time Multiagent Systems With Unknown-Frequency Disturbances.

    Science.gov (United States)

    Wang, Xinghu; Hong, Yiguang; Yi, Peng; Ji, Haibo; Kang, Yu

    2017-05-24

    In this paper, a distributed optimization problem is studied for continuous-time multiagent systems with unknown-frequency disturbances. A distributed gradient-based control is proposed for the agents to achieve optimal consensus, in the semi-global sense, while estimating the unknown frequencies and rejecting the bounded disturbance. Based on convex optimization analysis and an adaptive internal model approach, the exact optimal solution can be obtained for the multiagent system disturbed by exogenous disturbances with uncertain parameters.

  11. Estimation of the false discovery proportion with unknown dependence.

    Science.gov (United States)

    Fan, Jianqing; Han, Xu

    2017-09-01

    Large-scale multiple testing with correlated test statistics arises frequently in many areas of scientific research. Incorporating correlation information in approximating the false discovery proportion has attracted increasing attention in recent years. When the covariance matrix of the test statistics is known, Fan, Han & Gu (2012) provided an accurate approximation of the False Discovery Proportion (FDP) under arbitrary dependence structure and some sparsity assumption. However, the covariance matrix is often unknown in many applications, and such dependence information has to be estimated before approximating FDP. The estimation accuracy can greatly affect FDP approximation. In the current paper, we aim to theoretically study the impact of unknown dependence on the testing procedure and establish a general framework such that FDP can be well approximated. The impacts of unknown dependence on approximating FDP arise in two major aspects: through estimating eigenvalues/eigenvectors and through estimating marginal variances. To address the challenges in these two aspects, we first develop general requirements on estimates of eigenvalues and eigenvectors for a good approximation of FDP. We then give conditions on the structures of covariance matrices that satisfy such requirements. Such dependence structures include banded/sparse covariance matrices and (conditional) sparse precision matrices. Within this framework, we also consider a special example to illustrate our method where data are sampled from an approximate factor model, which encompasses most practical situations. We provide a good approximation of FDP via exploiting this specific dependence structure. The results are further generalized to the situation where the multivariate normality assumption is relaxed. Our results are demonstrated by simulation studies and some real data applications.
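
    For intuition, the false discovery proportion under factor-induced dependence can be simulated directly. The toy sketch below (all parameters invented) computes the realized FDP of a simple thresholding rule when the z-statistics share one common factor, as in the approximate factor model mentioned above.

```python
# Realized FDP under one-factor dependence: z-statistics share a common
# factor, and FDP is the fraction of rejections that are true nulls.
import numpy as np

rng = np.random.default_rng(4)
p, n_signal = 2000, 100
mu = np.zeros(p)
mu[:n_signal] = 3.0                                # non-null means
b = 0.5                                            # common factor loading
z = mu + b * rng.normal() + np.sqrt(1 - b**2) * rng.normal(size=p)

threshold = 2.0                                    # illustrative cutoff
rejected = np.abs(z) > threshold
false_rej = rejected & (mu == 0)                   # rejections among true nulls
fdp = false_rej.sum() / max(rejected.sum(), 1)
print(f"rejections={rejected.sum()}, FDP={fdp:.2f}")
```

    Because the common factor shifts all null statistics together, the realized FDP fluctuates from realization to realization far more than under independence, which is why dependence information matters for approximating it.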

  12. Numerical method of identification of an unknown source term in a heat equation

    Directory of Open Access Journals (Sweden)

    Fatullayev Afet Golayoğlu

    2002-01-01

    Full Text Available A numerical procedure for an inverse problem of identification of an unknown source in a heat equation is presented. The approach of the proposed method is to approximate the unknown function by piecewise-linear segments (polygons), which are determined consecutively from the solution of a minimization problem based on the overspecified data. Numerical examples are presented.
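
    At one remove from the heat equation itself, the piecewise-linear recovery idea can be sketched as a least-squares fit of hat-function coefficients to overspecified noisy data; the target function, node count, and noise level below are invented for illustration.

```python
# Recover a piecewise-linear (polygonal) approximation of an unknown 1-D
# function from noisy "overspecified" observations by least squares.
import numpy as np

rng = np.random.default_rng(3)
true_f = lambda t: np.sin(np.pi * t)              # "unknown" source, for testing
t_obs = np.linspace(0, 1, 50)
data = true_f(t_obs) + 0.01 * rng.normal(size=50)

nodes = np.linspace(0, 1, 6)                      # breakpoints of the polygon
h = nodes[1] - nodes[0]
# Design matrix: hat (piecewise-linear) basis functions evaluated at t_obs
A = np.maximum(0.0, 1.0 - np.abs(t_obs[:, None] - nodes[None, :]) / h)
coef, *_ = np.linalg.lstsq(A, data, rcond=None)   # nodal values of the polygon

recovered = A @ coef
print(np.max(np.abs(recovered - true_f(t_obs)))) # small reconstruction error
```

    The paper determines the linear pieces consecutively from a minimization problem; the sketch solves for all nodal values at once, which conveys the same approximation idea in its simplest form.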

  13. (Non-) Gibbsianness and Phase Transitions in Random Lattice Spin Models

    NARCIS (Netherlands)

    Külske, C.

    1999-01-01

    We consider disordered lattice spin models with finite-volume Gibbs measures µΛ[η](dσ). Here σ denotes a lattice spin variable and η a lattice random variable with product distribution P describing the quenched disorder of the model. We ask: when will the joint measures limΛ↑Zd P(dη)µΛ[η](dσ) be

  14. Appraisal and Reliability of Variable Engagement Model Prediction ...

    African Journals Online (AJOL)

    The variable engagement model based on the stress - crack opening displacement relationship and, which describes the behaviour of randomly oriented steel fibres composite subjected to uniaxial tension has been evaluated so as to determine the safety indices associated when the fibres are subjected to pullout and with ...

  15. Metastatic cancer of unknown primary: two clinical cases

    International Nuclear Information System (INIS)

    Cawen, L; Cordoba, A.

    2010-01-01

    This work presents two clinical cases of metastatic cancer of unknown primary. The main techniques used for the diagnosis, treatment and monitoring of the different carcinomas are: electron microscopy, molecular biology and genetics, especially histopathological study, topographic survey, ultrasound, radiography, chemotherapy and radiotherapy.

  16. High-order sliding mode observer for fractional commensurate linear systems with unknown input

    KAUST Repository

    Belkhatir, Zehor; Laleg-Kirati, Taous-Meriem

    2017-01-01

    In this paper, a high-order sliding mode observer (HOSMO) is proposed for the joint estimation of the pseudo-state and the unknown input of fractional commensurate linear systems with single unknown input and a single output. The convergence of the proposed observer is proved using a Lyapunov-based approach. In addition, an enhanced variant of the proposed fractional-HOSMO is introduced to avoid the peaking phenomenon and thus to improve the estimation results in the transient phase. Simulation results are provided to illustrate the performance of the proposed fractional observer in both noise-free and noisy cases. The effect of the observer’s gains on the estimated pseudo-state and unknown input is also discussed.

  18. On reconstruction of an unknown polygonal cavity in a linearized elasticity with one measurement

    International Nuclear Information System (INIS)

    Ikehata, M; Itou, H

    2011-01-01

    In this paper we consider a reconstruction problem of an unknown polygonal cavity in a linearized elastic body. For this problem, an extraction formula of the convex hull of the unknown polygonal cavity is established by means of the enclosure method introduced by Ikehata. The advantages of our method are that it needs only a single set of boundary data and we do not require any a priori assumptions for the unknown polygonal cavity and any constraints on boundary data. The theoretical formula may have possibility of application in nondestructive evaluation.

  19. Smooth conditional distribution function and quantiles under random censorship.

    Science.gov (United States)

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).

  20. Random amplified polymorphic DNA (RAPD) markers reveal genetic ...

    African Journals Online (AJOL)

    The present study evaluated genetic variability of superior bael genotypes collected from different parts of Andaman Islands, India using fruit characters and random amplified polymorphic DNA (RAPD) markers. Genomic DNA extracted from leaf material using cetyl trimethyl ammonium bromide (CTAB) method was ...

  1. Infinite conditional random fields for human behavior analysis

    NARCIS (Netherlands)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF

  2. Variational data assimilation using targetted random walks

    KAUST Repository

    Cotter, S. L.

    2011-02-15

    The variational approach to data assimilation is a widely used methodology for both online prediction and for reanalysis. In either of these scenarios, it can be important to assess uncertainties in the assimilated state. Ideally, it is desirable to have complete information concerning the Bayesian posterior distribution for the unknown state given data. We show that complete computational probing of this posterior distribution is now within reach in the offline situation. We introduce a Markov chain Monte Carlo (MCMC) method which enables us to directly sample from the Bayesian posterior distribution on the unknown functions of interest given observations. Since we are aware that these methods are currently too computationally expensive to consider using in an online filtering scenario, we frame this in the context of offline reanalysis. Using a simple random walk-type MCMC method, we are able to characterize the posterior distribution using only evaluations of the forward model of the problem, and of the model and data mismatch. No adjoint model is required for the method we use; however, more sophisticated MCMC methods are available which exploit derivative information. For simplicity of exposition, we consider the problem of assimilating data, either Eulerian or Lagrangian, into a low Reynolds number flow in a two-dimensional periodic geometry. We will show that in many cases it is possible to recover the initial condition and model error (which we describe as unknown forcing to the model) from data, and that with increasing amounts of informative data, the uncertainty in our estimations reduces. © 2011 John Wiley & Sons, Ltd.
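
    The random walk-type MCMC method described needs only forward evaluations of the (log-)posterior, no adjoint or derivatives. A minimal sketch on a one-dimensional Gaussian posterior, standing in for the flow problem's posterior:

```python
# Random-walk Metropolis: propose a perturbed state, accept or reject based
# only on posterior density evaluations.
import numpy as np

def log_post(u):
    return -0.5 * (u - 2.0) ** 2          # toy Gaussian posterior: mean 2, sd 1

rng = np.random.default_rng(1)
u, step, chain = 0.0, 1.0, []
for _ in range(20000):
    prop = u + step * rng.normal()        # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(u):
        u = prop                          # accept; otherwise keep current state
    chain.append(u)

samples = np.array(chain[5000:])          # discard burn-in
print(samples.mean(), samples.std())      # should approach 2 and 1
```

    In the paper's setting, `u` is a discretized function (initial condition and forcing) and `log_post` costs one forward model run plus a data-mismatch evaluation, but the accept/reject loop is exactly this one.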

  3. Variable-bias coin tossing

    International Nuclear Information System (INIS)

    Colbeck, Roger; Kent, Adrian

    2006-01-01

    Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT

  5. Colloquium: Random matrices and chaos in nuclear spectra

    International Nuclear Information System (INIS)

    Papenbrock, T.; Weidenmueller, H. A.

    2007-01-01

    Chaos occurs in quantum systems if the statistical properties of the eigenvalue spectrum coincide with predictions of random-matrix theory. Chaos is a typical feature of atomic nuclei and other self-bound Fermi systems. How can the existence of chaos be reconciled with the known dynamical features of spherical nuclei? Such nuclei are described by the shell model (a mean-field theory) plus a residual interaction. The question is answered using a statistical approach (the two-body random ensemble): The matrix elements of the residual interaction are taken to be random variables. Chaos is shown to be a generic feature of the ensemble and some of its properties are displayed, emphasizing those which differ from standard random-matrix theory. In particular, the existence of correlations among spectra carrying different quantum numbers is demonstrated. These are subject to experimental verification
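
    The random-matrix signature of chaos, level repulsion in nearest-neighbour spacings, is easy to reproduce numerically. The sketch below compares a GOE spectrum with independent (Poisson-distributed) levels; the matrix size is an arbitrary choice.

```python
# Level repulsion in the Gaussian Orthogonal Ensemble (GOE): small spacings
# are rare, unlike for independent (Poisson) levels.
import numpy as np

rng = np.random.default_rng(6)
n = 400
a = rng.normal(size=(n, n))
goe = (a + a.T) / np.sqrt(2)                 # real symmetric GOE matrix
levels = np.sort(np.linalg.eigvalsh(goe))
mid = levels[n // 4 : 3 * n // 4]            # bulk of the spectrum
s = np.diff(mid)
s /= s.mean()                                # crude unfolding: mean spacing 1

poisson = rng.exponential(size=s.size)       # spacings of independent levels
# GOE shows far fewer very small spacings than Poisson (level repulsion)
print((s < 0.1).mean(), (poisson < 0.1).mean())
```

    A proper analysis would unfold the spectrum against the local level density (semicircle law); normalizing by the mean spacing is a rough shortcut that still makes the repulsion visible.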

  6. A causal role for right frontopolar cortex in directed, but not random, exploration

    OpenAIRE

    Zajkowski, Wojciech K; Kossut, Malgorzata; Wilson, Robert C

    2017-01-01

    The explore-exploit dilemma occurs anytime we must choose between exploring unknown options for information and exploiting known resources for reward. Previous work suggests that people use two different strategies to solve the explore-exploit dilemma: directed exploration, driven by information seeking, and random exploration, driven by decision noise. Here, we show that these two strategies rely on different neural systems. Using transcranial magnetic stimulation to inhibit the right fronto...

  7. A Bayesian Analysis of a Random Effects Small Business Loan Credit Scoring Model

    Directory of Open Access Journals (Sweden)

    Patrick J. Farrell

    2011-09-01

    Full Text Available One of the most important aspects of credit scoring is constructing a model that has low misclassification rates and is also flexible enough to allow for random variation. It is also well known that, when there are a large number of highly correlated variables, as is typical in studies involving questionnaire data, a method must be found to reduce the number of variables to those that have high predictive power. Here we propose a Bayesian multivariate logistic regression model with both fixed and random effects for small business loan credit scoring, and a variable reduction method using Bayes factors. The method is illustrated on an interesting data set based on questionnaires sent to loan officers in Canadian banks and venture capital companies.

  8. Extensive screening for primary tumor is redundant in melanoma of unknown primary

    DEFF Research Database (Denmark)

    Tos, Tina; Klyver, Helle; Drzewiecki, Krzysztof T

    2011-01-01

    For decades, patients in our institution with metastatic melanoma of unknown primary have been subjected to extensive examinations in search of the primary tumor. This retrospective study questions the results, and thus the feasibility, of these examinations. Of 103 patients diagnosed with unknown …, for patients referred with metastatic melanoma of unknown primary, we recommend that a detailed history is obtained, and a standard physical examination performed, in addition to a histopathological review and CT/PET for staging.

  9. Teleportation of an Unknown Atomic State via Adiabatic Passage

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    We propose a scheme for teleporting an unknown atomic state via adiabatic passage. Taking advantage of adiabatic passage, the atom has no probability of being excited, and thus atomic spontaneous emission is suppressed. We also show that the fidelity can reach 1 under a certain condition.

  10. Metastasis to neck from unknown primary tumor

    International Nuclear Information System (INIS)

    Jose, B.; Bosch, A.; Caldwell, W.L.; Frias, Z.

    1979-01-01

    The records of 54 consecutive patients who were irradiated for metastatic disease in the neck from an unknown primary tumor were reviewed. The overall survival results are comparable to those of other reported series. Patients with high or posterior cervical lymph node involvement were irradiated with fields including the nasopharynx and oropharynx. Patients with high neck nodes had a better survival rate than those with low neck nodes. The size of the neck tumors and the local control after treatment also have prognostic significance. (Auth.)

  11. Uniqueness conditions for finitely dependent random fields

    International Nuclear Information System (INIS)

    Dobrushin, R.L.; Pecherski, E.A.

    1981-01-01

    The authors consider a random field for which uniqueness holds, together with some additional conditions guaranteeing that the correlations between the variables of the field decrease rapidly enough with the distance between the values of the parameter. The main result of the paper states that in such a case uniqueness is true for any other field with transition probabilities sufficiently close to those of the original field. They then apply this result to some ''degenerate'' classes of random fields for which this correlation-decay condition can be checked, and thus obtain some new conditions of uniqueness. (Auth.)

  12. Quadrotor Control in the Presence of Unknown Mass Properties

    Science.gov (United States)

    Duivenvoorden, Rikky Ricardo Petrus Rufino

    Quadrotor UAVs are popular due to their mechanical simplicity, as well as their capability to hover and to take off and land vertically. As applications diversify, quadrotors are increasingly required to operate under unknown mass properties, for example as a multirole sensor platform or for package delivery operations. The work presented here consists of the derivation of a generalized quadrotor dynamic model without the typical simplifying assumptions on the first and second moments of mass. The maximum payload capacity of a quadrotor in hover, and the observability of the unknown mass properties, are discussed. A brief introduction to L1 adaptive control is provided, and three different L1 adaptive controllers were designed for the Parrot AR.Drone quadrotor. Their tracking and disturbance rejection performance was compared to the baseline nonlinear controller in experiments. Finally, the results of the combination of L1 adaptive control with iterative learning control are presented, showing high-performance trajectory tracking under uncertainty.

  13. Matrix- and tensor-based recommender systems for the discovery of currently unknown inorganic compounds

    Science.gov (United States)

    Seko, Atsuto; Hayashi, Hiroyuki; Kashima, Hisashi; Tanaka, Isao

    2018-01-01

    Chemically relevant compositions (CRCs) and atomic arrangements of inorganic compounds have been collected as inorganic crystal structure databases. Machine learning is a unique approach to search for currently unknown CRCs from vast candidates. Herein we propose matrix- and tensor-based recommender system approaches to predict currently unknown CRCs from database entries of CRCs. Firstly, the performance of the recommender system approaches to discover currently unknown CRCs is examined. A Tucker decomposition recommender system shows the best discovery rate of CRCs as the majority of the top 100 recommended ternary and quaternary compositions correspond to CRCs. Secondly, systematic density functional theory (DFT) calculations are performed to investigate the phase stability of the recommended compositions. The phase stability of the 27 compositions reveals that 23 currently unknown compounds are newly found to be stable. These results indicate that the recommender system has great potential to accelerate the discovery of new compounds.
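
    A heavily simplified sketch of the matrix-based recommender idea: factor a partially observed binary "known compound" matrix with a truncated SVD and use the reconstruction scores to rank unobserved entries. The data, the rank, and the 70% observation rate are invented assumptions; the paper's Tucker-decomposition variant is not reproduced here.

```python
# Low-rank "recommender" for a binary composition matrix: hidden-but-true
# entries should receive higher reconstruction scores than false entries.
import numpy as np

rng = np.random.default_rng(5)
# Ground truth: a low-rank pattern of "chemically relevant" pairs
u = rng.random((20, 2))
v = rng.random((30, 2))
truth = (u @ v.T) > 0.5
observed = truth & (rng.random(truth.shape) < 0.7)   # ~30% of true pairs hidden

# Rank-2 SVD reconstruction of the observed matrix scores every cell
U, s, Vt = np.linalg.svd(observed.astype(float), full_matrices=False)
scores = (U[:, :2] * s[:2]) @ Vt[:2]

hidden_true = truth & ~observed
mean_hidden = scores[hidden_true].mean()   # hidden-but-true cells score high
mean_false = scores[~truth].mean()         # never-true cells score low
print(mean_hidden, mean_false)
```

    Ranking the unobserved cells by `scores` is the discovery step: the top-ranked cells are the recommended candidate compositions to check (in the paper, by DFT phase-stability calculations).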

  14. Analysis of Modal Travel Time Variability Due to Mesoscale Ocean Structure

    National Research Council Canada - National Science Library

    Smith, Amy

    1997-01-01

    .... First, for an open ocean environment away from strong boundary currents, the effects of randomly phased linear baroclinic Rossby waves on acoustic travel time are shown to produce a variable overall...

  15. Using Variable Interval Reinforcement Schedules to Support Students in the Classroom: An Introduction with Illustrative Examples

    Science.gov (United States)

    Hulac, David; Benson, Nicholas; Nesmith, Matthew C.; Wollersheim Shervey, Sarah

    2016-01-01

    When behaviors are reinforced with a variable interval reinforcement schedule, reinforcement is available only after an unknown period of time. These types of reinforcement schedules are most useful for reinforcing slow and steady responding and for differentially reinforcing behaviors that are incompatible with some problematic behaviors. This…
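
    A variable-interval schedule is straightforward to simulate: reinforcement is "armed" after a random delay, and the first response after that moment is reinforced. All parameters below (a VI-30 s schedule, exponentially distributed response times) are illustrative assumptions.

```python
# Variable-interval (VI) schedule simulation: reinforcement becomes available
# after a random delay; the first response afterwards collects it.
import random

def simulate_vi(mean_interval, response_rate, duration, seed=0):
    rng = random.Random(seed)
    t, reinforcers = 0.0, 0
    next_available = rng.expovariate(1.0 / mean_interval)   # arm the schedule
    while t < duration:
        t += rng.expovariate(response_rate)   # time until the next response
        if t >= next_available:               # reinforcement was armed: deliver
            reinforcers += 1
            next_available = t + rng.expovariate(1.0 / mean_interval)
    return reinforcers

# A VI-30s schedule over one hour: the reinforcement rate is capped near
# one per 30 s no matter how fast the subject responds, which is why VI
# schedules sustain slow, steady responding.
print(simulate_vi(30.0, response_rate=0.5, duration=3600))
```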

  16. Adresse inconnue / Address unknown / Suchwiin Bulmyeong

    Directory of Open Access Journals (Sweden)

    Serge Gruzinski

    2005-03-01

    Full Text Available All Asian films speak of métissage, even those that present themselves as vast historical frescoes lost in time. Borrowings from the Hollywood and European traditions have never ceased to enrich a cinematography as old as that of the Western world. In Adresse inconnue (Address Unknown), the Korean filmmaker Kim Ki-duk explores the experience of métissage and the body of the métis at the border between North and South Korea. The son of a black American GI and …

  17. Optimal conclusive teleportation of a d-dimensional two-particle unknown quantum state

    Institute of Scientific and Technical Information of China (English)

    Yang Yu-Guang; Wen Qiao-Yan; Zhu Fu-Chen

    2006-01-01

    A conclusive teleportation protocol of a d-dimensional two-particle unknown quantum state using three d-dimensional particles in an arbitrary pure state is proposed. A sender teleports the unknown state conclusively to a receiver by using the positive operator valued measure (POVM) and introducing an ancillary qudit to perform the generalized Bell basis measurement. We calculate the optimal teleportation fidelity. We also discuss and analyse the reason why the information on the teleported state is lost in the course of the protocol.

  18. Random mutagenesis in Corynebacterium glutamicum ATCC 13032 using an IS6100-based transposon vector identified the last unknown gene in the histidine biosynthesis pathway

    Directory of Open Access Journals (Sweden)

    Gaigalat Lars

    2006-08-01

    Full Text Available Abstract Background Corynebacterium glutamicum, a Gram-positive bacterium of the class Actinobacteria, is an industrially relevant producer of amino acids. Several methods for the targeted genetic manipulation of this organism and rational strain improvement have been developed. An efficient transposon mutagenesis system for the completely sequenced type strain ATCC 13032 would significantly advance functional genome analysis in this bacterium. Results A comprehensive transposon mutant library comprising 10,080 independent clones was constructed by electrotransformation of the restriction-deficient derivative of strain ATCC 13032, C. glutamicum RES167, with an IS6100-containing non-replicative plasmid. Transposon mutants had stable cointegrates between the transposon vector and the chromosome. Altogether 172 transposon integration sites have been determined by sequencing of the chromosomal inserts, revealing that each integration occurred at a different locus. Statistical target site analyses revealed an apparent absence of a target site preference. From the library, auxotrophic mutants were obtained with a frequency of 2.9%. By auxanography analyses nearly two thirds of the auxotrophs were further characterized, including mutants with single, double and alternative nutritional requirements. In most cases the nutritional requirement observed could be correlated to the annotation of the mutated gene involved in the biosynthesis of an amino acid, a nucleotide or a vitamin. One notable exception was a clone mutagenized by transposition into the gene cg0910, which exhibited an auxotrophy for histidine. The protein sequence deduced from cg0910 showed high sequence similarities to inositol-1(or 4)-monophosphatases (EC 3.1.3.25). Subsequent genetic deletion of cg0910 delivered the same histidine-auxotrophic phenotype. Genetic complementation of the mutants as well as supplementation by histidinol suggests that cg0910 encodes the hitherto unknown

  19. Drop Spreading with Random Viscosity

    Science.gov (United States)

    Xu, Feng; Jensen, Oliver

    2016-11-01

    Airway mucus acts as a barrier to protect the lung. However, as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show, for example, how variability in the drop location is a non-monotonic function of the solute correlation length. This work was supported by the Engineering and Physical Sciences Research Council.
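
    A Gaussian random field with a prescribed correlation length, as assumed for the solute distribution, can be sampled by convolving white noise with a Gaussian kernel. Grid size, correlation length, and the Monte Carlo quantity below are arbitrary illustrative choices, not the study's parameters.

```python
# Sample 1-D Gaussian random fields with a squared-exponential correlation
# by filtering white noise, then Monte Carlo a simple field statistic.
import numpy as np

def gaussian_field(n, corr_len, rng):
    white = rng.normal(size=n)
    x = np.arange(-3 * corr_len, 3 * corr_len + 1)
    kernel = np.exp(-x**2 / (2 * corr_len**2))
    kernel /= np.sqrt(np.sum(kernel**2))      # unit variance after convolution
    return np.convolve(white, kernel, mode="same")

rng = np.random.default_rng(2)
field = gaussian_field(512, corr_len=20, rng=rng)

# Monte Carlo over realisations of, e.g., the field's spatial mean:
means = [gaussian_field(512, 20, rng).mean() for _ in range(200)]
print(np.std(means))
```

    Each realisation would seed one simulation of the spreading PDEs; collecting an output statistic over many realisations, as in the last two lines, is the Monte Carlo step the abstract describes.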

  20. Blind Identification of FIR Channels in the Presence of Unknown Noise

    Directory of Open Access Journals (Sweden)

    Kon Max Wong

    2007-01-01

    Full Text Available Blind channel identification techniques based on second-order statistics (SOS) of the received data have been a topic of active research in recent years. Among the most popular is the subspace method (SS) proposed by Moulines et al. (1995). It has good performance when the channel output is corrupted by white noise. However, when the channel noise is correlated and unknown, as is often encountered in practice, the performance of the SS method degrades severely. In this paper, we address the problem of estimating FIR channels in the presence of arbitrarily correlated noise whose covariance matrix is unknown. We propose several algorithms according to the different available system resources: (1) when only one receiving antenna is available, by upsampling the output, we develop the maximum a posteriori (MAP) algorithm, for which a simple criterion is obtained and an efficient implementation algorithm is developed; (2) when two receiving antennae are available, by upsampling both outputs and utilizing canonical correlation decomposition (CCD) to obtain the subspaces, we present two algorithms (CCD-SS and CCD-ML) to blindly estimate the channels. Our algorithms perform well in unknown noise environments and outperform existing methods proposed for similar scenarios.