WorldWideScience

Sample records for random variable representing

  1. Strong Decomposition of Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.

    2007-01-01

    A random variable X is strongly decomposable if X=Y+Z where Y=Φ(X) and Z=X-Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability of a discrete random variable.

  2. Students' Misconceptions about Random Variables

    Science.gov (United States)

    Kachapova, Farida; Kachapov, Ilias

    2012-01-01

    This article describes some misconceptions about random variables and related counter-examples, and makes suggestions about teaching initial topics on random variables in general form instead of doing it separately for discrete and continuous cases. The focus is on post-calculus probability courses. (Contains 2 figures.)

  3. Free random variables

    CERN Document Server

    Voiculescu, Dan; Nica, Alexandru

    1992-01-01

    This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.

  4. Symmetrization of binary random variables

    OpenAIRE

    Kagan, Abram; Mallows, Colin L.; Shepp, Larry A.; Vanderbei, Robert J.; Vardi, Yehuda

    1999-01-01

    A random variable Z is called an independent symmetrizer of a given random variable X if (a) it is independent of X and (b) the distribution of X + Z is symmetric about 0. In cases where the distribution of X is symmetric about its mean, it is easy to see that the constant random variable Z = -E[X] is a minimum-variance independent symmetrizer. Taking Z to have the same distribution as -X clearly produces a symmetric sum, but it may not be of minimum variance....

  5. Contextuality in canonical systems of random variables.

    Science.gov (United States)

    Dzhafarov, Ehtibar N; Cervantes, Víctor H; Kujala, Janne V

    2017-11-13

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).

  6. Contextuality in canonical systems of random variables

    Science.gov (United States)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
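
    A minimal sketch of the maximal-coupling notion used in the two records above, specialized to binary random variables (this illustrates only the coupling itself, not the authors' contextuality criterion; the function name and example probabilities are ours):

```python
# Maximal coupling of A ~ Bernoulli(p) and B ~ Bernoulli(q): the joint pmf that
# preserves both marginals while maximizing P(A = B) = 1 - |p - q|.
def maximal_coupling_binary(p, q):
    joint = {
        (1, 1): min(p, q),
        (0, 0): min(1 - p, 1 - q),
        (1, 0): max(p - q, 0.0),   # only one off-diagonal cell can be nonzero
        (0, 1): max(q - p, 0.0),
    }
    assert abs(sum(joint.values()) - 1.0) < 1e-12
    return joint

joint = maximal_coupling_binary(0.7, 0.4)
print(joint)                              # marginals are Bernoulli(0.7) and Bernoulli(0.4)
print(joint[(1, 1)] + joint[(0, 0)])      # P(A = B) = 1 - |0.7 - 0.4| = 0.7
```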

  7. Is the Aquarius sea surface salinity variability representative?

    Science.gov (United States)

    Carton, J.; Grodsky, S.

    2016-12-01

    The leading mode of the Aquarius monthly anomalous sea surface salinity (SSS) is evaluated within the 50S-50N belt, where SSS retrieval accuracy is higher. This mode accounts for about 18% of the variance and resembles the pattern of ENSO-induced anomalous rainfall. The leading mode of SSS variability deduced from a longer JAMSTEC analysis also accounts for about 17% of the variance, has a very similar spatial pattern, and shows an almost perfect correspondence of its temporal principal component to the SOI index. In that sense, the Aquarius SSS variability at low and middle latitudes is representative of SSS variability that may be obtained from longer records. This is explained by the fact that during the Aquarius period (2011-2015), the SOI index changed significantly from a La Nina toward an El Nino state, thus spanning a significant range of its characteristic variations. Multivariate EOF analysis of anomalous SSS and SST suggests that the ENSO-induced shift in the tropical Pacific rainfall produces negatively correlated variability of temperature and salinity, which is expected if the anomalous surface flux (stronger rainfall coincident with less downward radiation) drives the system. But anomalous SSS and SST are positively correlated in some areas, including the northwestern Atlantic shelf (north of the Gulf Stream) and the Pacific sector adjacent to the California peninsula. This positive correlation is indicative of an advection-driven regime that is analyzed separately.

  8. Probabilistic graphs using coupled random variables

    Science.gov (United States)

    Nelson, Kenric P.; Barbu, Madalina; Scannell, Brian J.

    2014-05-01

    Neural network design has utilized flexible nonlinear processes which can mimic biological systems, but has suffered from a lack of traceability in the resulting network. Graphical probabilistic models ground network design in probabilistic reasoning, but the restrictions reduce the expressive capability of each node making network designs complex. The ability to model coupled random variables using the calculus of nonextensive statistical mechanics provides a neural node design incorporating nonlinear coupling between input states while maintaining the rigor of probabilistic reasoning. A generalization of Bayes rule using the coupled product enables a single node to model correlation between hundreds of random variables. A coupled Markov random field is designed for the inferencing and classification of UCI's MLR `Multiple Features Data Set' such that thousands of linear correlation parameters can be replaced with a single coupling parameter with just a (3%, 4%) reduction in (classification, inference) performance.
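
    As a rough illustration of a "coupled" product, the sketch below uses the Tsallis q-product from nonextensive statistical mechanics as a stand-in; the exact coupled product and parameterization used by the authors may differ, and the Bayes-like update shown is our own toy construction, not theirs:

```python
import numpy as np

def q_product(x, y, q):
    """Tsallis q-product; reduces to the ordinary product x*y as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = np.maximum(x ** (1.0 - q) + y ** (1.0 - q) - 1.0, 0.0)
    return base ** (1.0 / (1.0 - q))

# Toy "coupled" Bayes-like update: combine a prior and a likelihood with the
# q-product instead of the ordinary product, then renormalize (illustration only).
prior      = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.2, 0.5, 0.3])
for q in (1.0, 0.5, 1.5):
    post = q_product(prior, likelihood, q)
    print(q, post / post.sum())
```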

  9. Maximal Inequalities for Dependent Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jorgensen, Jorgen

    2016-01-01

    Maximal inequalities play a crucial role in many probabilistic limit theorems; for instance, the law of large numbers, the law of the iterated logarithm, the martingale limit theorem and the central limit theorem. Let X_1, X_2, ... be random variables with partial sums S_k = X_1 + ... + X_k. Then a maximal inequality gives conditions ensuring that the maximal partial sum M_n = max_{1 ≤ k ≤ n} S_k ...

  10. Fast Generation of Discrete Random Variables

    Directory of Open Access Journals (Sweden)

    George Marsaglia

    2004-07-01

    Full Text Available We describe two methods and provide C programs for generating discrete random variables with functions that are simple and fast, averaging ten times as fast as published methods and more than five times as fast as the fastest of those. We provide general procedures for implementing the two methods, as well as specific procedures for three of the most important discrete distributions: Poisson, binomial and hypergeometric.
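
    The authors' C routines are not reproduced here, but the flavor of fast table-driven generation can be conveyed with Walker's alias method, a classic alternative that also draws each variate in O(1) time after a one-time setup:

```python
import random

def build_alias(p):
    """One-time setup for Walker's alias method given a probability vector p."""
    n = len(p)
    prob  = [x * n for x in p]                 # scaled probabilities
    alias = [0] * n
    small = [i for i, x in enumerate(prob) if x < 1.0]
    large = [i for i, x in enumerate(prob) if x >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l
        prob[l] -= 1.0 - prob[s]               # cell l absorbs the deficit of cell s
        (small if prob[l] < 1.0 else large).append(l)
    return prob, alias

def draw(prob, alias):
    i = random.randrange(len(prob))            # pick a column uniformly
    return i if random.random() < prob[i] else alias[i]

prob, alias = build_alias([0.1, 0.2, 0.3, 0.4])
samples = [draw(prob, alias) for _ in range(100_000)]
print([samples.count(k) / len(samples) for k in range(4)])   # ~ [0.1, 0.2, 0.3, 0.4]
```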

  11. Fractional calculus approach to the statistical characterization of random variables and vectors

    OpenAIRE

    Cottone, Giulio; Di Paola, Mario

    2015-01-01

    Fractional moments have been investigated by many authors to represent the density of univariate and bivariate random variables in different contexts. Fractional moments are indeed important when the density of the random variable has inverse power-law tails and, consequently, it lacks integer order moments. In this paper, starting from the Mellin transform of the characteristic function and by fractional calculus method we present a new perspective on the statistics of random variables. Intr...

  12. Fractional calculus approach to the statistical characterization of random variables and vectors

    Science.gov (United States)

    Cottone, Giulio; Di Paola, Mario; Metzler, Ralf

    2010-03-01

    Fractional moments have been investigated by many authors to represent the density of univariate and bivariate random variables in different contexts. Fractional moments are indeed important when the density of the random variable has inverse power-law tails and, consequently, it lacks integer order moments. In this paper, starting from the Mellin transform of the characteristic function and by fractional calculus method we present a new perspective on the statistics of random variables. Introducing the class of complex moments, that include both integer and fractional moments, we show that every random variable can be represented within this approach, even if its integer moments diverge. Applications to the statistical characterization of raw data and in the representation of both random variables and vectors are provided, showing that the good numerical convergence makes the proposed approach a good and reliable tool also for practical data analysis.
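
    A small numerical illustration of why fractional moments matter for heavy-tailed variables (this uses plain Monte Carlo, not the authors' Mellin-transform machinery; the Pareto example and tail index are our choices):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.5                                   # Pareto tail index: E[X] finite, E[X^2] infinite
u = 1.0 - rng.uniform(size=1_000_000)         # u in (0, 1]
x = u ** (-1.0 / alpha)                       # Pareto(alpha) samples, x >= 1

for gamma in (0.7, 1.0, 2.0):
    est = np.mean(x ** gamma)
    exact = alpha / (alpha - gamma) if gamma < alpha else float("inf")
    print(f"E|X|^{gamma}: sample estimate {est:.3f}, exact {exact}")
```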

  13. Random vectorial fields representing the local structure of turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Chevillard, Laurent [Laboratoire de Physique de l'ENS Lyon, CNRS, Universite de Lyon, 46 allee d'Italie, 69007 Lyon (France); Robert, Raoul [Institut Fourier, CNRS, Universite Grenoble 1, 100 rue des Mathematiques, BP 74, 38402 Saint-Martin d'Heres cedex (France); Vargas, Vincent, E-mail: laurent.chevillard@ens-lyon.fr [Ceremade, CNRS, Universite Paris-Dauphine, F-75016 Paris (France)

    2011-12-22

    We propose a method to build up a random homogeneous, isotropic and incompressible turbulent velocity field that mimics turbulence in the inertial range. The underlying Gaussian field is given by a modified Biot-Savart law. The long range correlated nature of turbulence is then incorporated heuristically using a non linear transformation inspired by the recent fluid deformation imposed by the Euler equations. The resulting velocity field shows a non vanishing mean energy transfer towards the small scales and realistic alignment properties of vorticity with the eigenframe of the deformation rate.

  14. A comparison of methods for representing sparsely sampled random quantities.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between 0.025 and .975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
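
    One of the simpler devices mentioned above, a two-sided normal tolerance interval, can be sketched as follows (a generic illustration using Howe's approximation for the k factor, not the report's specific implementation or test problems):

```python
import numpy as np
from scipy import stats

def normal_tolerance_interval(sample, coverage=0.95, confidence=0.95):
    """Interval intended to cover `coverage` of the population with the stated
    confidence, assuming approximately normal data."""
    n = len(sample)
    nu = n - 1
    z = stats.norm.ppf((1.0 + coverage) / 2.0)
    chi2 = stats.chi2.ppf(1.0 - confidence, nu)       # lower chi-square quantile
    k = z * np.sqrt(nu * (1.0 + 1.0 / n) / chi2)      # Howe's approximation
    m, s = np.mean(sample), np.std(sample, ddof=1)
    return m - k * s, m + k * s

rng = np.random.default_rng(1)
sparse_sample = rng.normal(loc=10.0, scale=2.0, size=15)
print(normal_tolerance_interval(sparse_sample))
```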

  15. The Common Information of N Dependent Random Variables

    CERN Document Server

    Liu, Wei; Chen, Biao

    2010-01-01

    This paper generalizes Wyner's definition of common information of a pair of random variables to that of $N$ random variables. We prove coding theorems that show the same operational meanings for the common information of two random variables generalize to that of $N$ random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to $N$ source sequences with $N$ decoders. We also establish a monotone property of Wyner's common information which is in contrast to other notions of the common information, specifically Shannon's mutual information and Gács and Körner's common randomness. Examples about the computation of Wyner's common information of $N$ random variables are also given.

  16. Children's Use of Variables and Variable Notation to Represent Their Algebraic Ideas

    Science.gov (United States)

    Brizuela, Bárbara M.; Blanton, Maria; Sawrey, Katharine; Newman-Owens, Ashley; Murphy Gardiner, Angela

    2015-01-01

    In this article, we analyze a first grade classroom episode and individual interviews with students who participated in that classroom event to provide evidence of the variety of understandings about variable and variable notation held by first grade children approximately six years of age. Our findings illustrate that given the opportunity,…

  17. Fuzzy random variables — I. definitions and theorems

    NARCIS (Netherlands)

    Kwakernaak, H.

    1978-01-01

    Fuzziness is discussed in the context of multivalued logic, and a corresponding view of fuzzy sets is given. Fuzzy random variables are introduced as random variables whose values are not real but fuzzy numbers, and subsequently redefined as a particular kind of fuzzy set. Expectations of fuzzy

  18. On complete moment convergence for nonstationary negatively associated random variables

    Directory of Open Access Journals (Sweden)

    Mi-Hwa Ko

    2016-05-01

    Full Text Available Abstract The purpose of this paper is to establish the complete moment convergence for nonstationary negatively associated random variables satisfying the weak mean domination condition. The result is an improvement of complete convergence in Marcinkiewicz-Zygmund-type SLLN for negatively associated random variables in Kuczmaszewska (Acta Math. Hung. 128:116-130, 2010).

  19. Characterizations of Distributions of Ratios of Certain Independent Random Variables

    Directory of Open Access Journals (Sweden)

    Hamedani G.G.

    2013-05-01

    Full Text Available Various characterizations of the distributions of the ratio of two independent gamma and exponential random variables as well as that of two independent Weibull random variables are presented. These characterizations are based on a simple relationship between two truncated moments; on the hazard function; and on functions of order statistics.

  20. On the product and ratio of Bessel random variables

    Directory of Open Access Journals (Sweden)

    Saralees Nadarajah

    2005-01-01

    Full Text Available The distributions of products and ratios of random variables are of interest in many areas of the sciences. In this paper, the exact distributions of the product |XY| and the ratio |X/Y| are derived when X and Y are independent Bessel function random variables. An application of the results is provided by tabulating the associated percentage points.

  1. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  2. Can we assess representativeness of cross-national surveys using the education variable?

    Directory of Open Access Journals (Sweden)

    Verena Ortmanns

    2016-12-01

    Full Text Available Achieving a representative sample is an important goal for every survey. High response rates are often referred to as an indicator of representativeness in survey methodology research. However, a low response rate does not necessarily imply low representativeness, so that alternative ways of assessing representativeness are needed in times where low response rates are almost ubiquitous. This study asks whether education, a socio-demographic variable covered by virtually every survey of individuals, is a good variable for assessing the representativeness of a realised survey sample. We examine this issue in two steps: Firstly, the distributions of the harmonised education variable in six official and academic cross-national surveys by country-year combination are compared with the respective education distributions in a high-quality reference dataset. Doing so, we identify many substantial inconsistencies. Secondly, we try to identify the sources of these inconsistencies, looking at both measurement errors in the education variables and errors of representation. Since in most instances, inconsistent measurement procedures can probably explain the observed inconsistencies, we conclude that the education variable as currently measured in cross-national surveys is, without further processing, unsuitable for assessing sample representativeness, and constructing nonresponse weights. The paper closes with recommendations for achieving a more comparable measurement of the education variable.

  3. Reduction of the Random Variables of the Turbulent Wind Field

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2012-01-01

    Applicability of the Probability Density Evolution Method (PDEM) for realizing the evolution of the probability density for wind turbines has rather strict bounds on the number of random variables involved in the model. The efficiency of most Advanced Monte Carlo (AMC) methods, i.e. Importance Sampling (IS) or Subset Simulation (SS), deteriorates on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand, the efficiency of the AMC methods is closely dependent on the design points of the problem. The presence of many random variables may increase the number of design points, hence affects...

  4. Exponential Inequalities for Positively Associated Random Variables and Applications

    Directory of Open Access Journals (Sweden)

    Yang Shanchao

    2008-01-01

    Full Text Available Abstract We establish some exponential inequalities for positively associated random variables without the boundedness assumption. These inequalities improve the corresponding results obtained by Oliveira (2005). By one of the inequalities, we obtain the convergence rate for the case of geometrically decreasing covariances, which is close to the optimal achievable convergence rate for independent random variables under the Hartman-Wintner law of the iterated logarithm, and improves the convergence rate derived by Oliveira (2005) for the above case.

  5. Separation metrics for real-valued random variables

    Directory of Open Access Journals (Sweden)

    Michael D. Taylor

    1984-01-01

    Full Text Available If W is a fixed, real-valued random variable, then there are simple and easily satisfied conditions under which the function dW, where dW(X,Y) = the probability that W “separates” the real-valued random variables X and Y, turns out to be a metric. The observation was suggested by work done in [1].
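
    Reading "W separates X and Y" as W falling between the two values (an assumption on our part; the paper's precise definition may differ), d_W can be estimated by simple Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
W = rng.normal(0.0, 1.0, n)                   # the fixed separating variable
X = rng.normal(-1.0, 1.0, n)
Y = rng.normal(+1.0, 1.0, n)

def d(W, A, B):
    """Estimate P(W lies between A and B)."""
    return np.mean((np.minimum(A, B) <= W) & (W < np.maximum(A, B)))

print(d(W, X, Y))        # positive: W frequently separates X and Y
print(d(W, X, X))        # 0: a variable is never separated from itself
```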

  6. Randomly weighted sums of subexponential random variables with application to ruin theory

    NARCIS (Netherlands)

    Tang, Q.; Tsitsiashvili, G.

    2003-01-01

    Let {X_k, 1 ≤ k ≤ n} be n independent and real-valued random variables with common subexponential distribution function, and let {θ_k, 1 ≤ k ≤ n} be n other random variables independent of {X_k, 1 ≤ k ≤ n} and satisfying a ≤ θ_k ≤ b for some 0 < a ≤ b < ∞ for all 1 ≤ k ≤ n. This paper proves that the asymptotic relations P ...

  7. Assessing terpene content variability of whitebark pine in order to estimate representative sample size

    Directory of Open Access Journals (Sweden)

    Stefanović Milena

    2013-01-01

    Full Text Available In studies of population variability, particular attention has to be paid to the selection of a representative sample. The aim of this study was to assess the size of a new representative sample on the basis of the variability of chemical content of the initial sample, using a whitebark pine population as the example. Statistical analysis included the content of 19 characteristics (terpene hydrocarbons and their derivatives) of the initial sample of 10 elements (trees). It was determined that the new sample should contain 20 trees so that the mean value calculated from it represents the basic set with a probability higher than 95%. Determination of the lower limit of the representative sample size that guarantees a satisfactory reliability of generalization proved to be very important in order to achieve cost efficiency of the research. [Project of the Ministry of Science of the Republic of Serbia, nos. OI-173011, TR-37002 and III-43007]
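
    A generic sample-size calculation of the kind alluded to above (not necessarily the authors' exact procedure): require that the half-width of the confidence interval for the mean not exceed a target margin E, which gives n ≈ (t·s/E)², iterated because t depends on n:

```python
import math
from scipy import stats

def required_sample_size(s, margin, confidence=0.95):
    """Smallest n such that the CI half-width t * s / sqrt(n) is <= margin."""
    z = stats.norm.ppf(1.0 - (1.0 - confidence) / 2.0)
    n = max(math.ceil((z * s / margin) ** 2), 2)          # normal-approximation start
    for _ in range(100):                                  # then iterate with the t quantile
        t = stats.t.ppf(1.0 - (1.0 - confidence) / 2.0, df=n - 1)
        n_new = max(math.ceil((t * s / margin) ** 2), 2)
        if n_new == n:
            break
        n = n_new
    return n

# e.g., pilot standard deviation 4.0 (arbitrary units) and desired margin 2.0
print(required_sample_size(s=4.0, margin=2.0))
```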

  8. Random sets and random fuzzy sets as ill-perceived random variables an introduction for Ph.D. students and practitioners

    CERN Document Server

    Couso, Inés; Sánchez, Luciano

    2014-01-01

    This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on its use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...

  9. Rates of profit as correlated sums of random variables

    Science.gov (United States)

    Greenblatt, R. E.

    2013-10-01

    Profit realization is the dominant feature of market-based economic systems, determining their dynamics to a large extent. Rather than attaining an equilibrium, profit rates vary widely across firms, and the variation persists over time. Differing definitions of profit result in differing empirical distributions. To study the statistical properties of profit rates, I used data from a publicly available database for the US Economy for 2009-2010 (Risk Management Association). For each of three profit rate measures, the sample space consists of 771 points. Each point represents aggregate data from a small number of US manufacturing firms of similar size and type (NAICS code of principal product). When comparing the empirical distributions of profit rates, significant ‘heavy tails’ were observed, corresponding principally to a number of firms with larger profit rates than would be expected from simple models. An apparently novel correlated sum of random variables statistical model was used to model the data. In the case of operating and net profit rates, a number of firms show negative profits (losses), ruling out simple gamma or lognormal distributions as complete models for these data.

  10. Representing Uncertainty in Graph Edges: An Evaluation of Paired Visual Variables.

    Science.gov (United States)

    Guo, Hua; Huang, Jeff; Laidlaw, David H

    2015-10-01

    When visualizing data with uncertainty, a common approach is to treat uncertainty as an additional dimension and encode it using a visual variable. The effectiveness of this approach depends on how the visual variables chosen for representing uncertainty and other attributes interact to influence the user's perception of each variable. We report a user study on the perception of graph edge attributes when uncertainty associated with each edge and the main edge attribute are visualized simultaneously using two separate visual variables. The study covers four visual variables that are commonly used for visualizing uncertainty on line graphical primitives: lightness, grain, fuzziness, and transparency. We select width, hue, and saturation for visualizing the main edge attribute and hypothesize that we can observe interference between the visual variable chosen to encode the main edge attribute and that to encode uncertainty, as suggested by the concept of dimensional integrality. Grouping the seven visual variables as color-based, focus-based, or geometry-based, we further hypothesize that the degree of interference is affected by the groups to which the two visual variables belong. We consider two further factors in the study: discriminability level for each visual variable as a factor intrinsic to the visual variables and graph-task type (visual search versus comparison) as a factor extrinsic to the visual variables. Our results show that the effectiveness of a visual variable in depicting uncertainty is strongly mediated by all the factors examined here. Focus-based visual variables (fuzziness, grain, and transparency) are robust to the choice of visual variables for encoding the main edge attribute, though fuzziness has stronger negative impact on the perception of width and transparency has stronger negative impact on the perception of hue than the other uncertainty visual variables. We found that interference between hue and lightness is much greater than that

  11. Study on genetic variability of Cassidula aurisfelis (snail) by random ...

    African Journals Online (AJOL)

    The genetic variability among individuals of Cassidula aurisfelis from Setiu Wetland, Terengganu Darul Iman was examined by using the random amplified polymorphic DNA (RAPD) technique. Ten oligonucleotide primers were screened and three primers were selected (OPA 02, OPA 04 and OPA 10) to amplify DNA from ...

  12. Variability in response to albuminuria lowering drugs : true or random?

    NARCIS (Netherlands)

    Petrykiv, Sergei I.; de Zeeuw, Dick; Persson, Frederik; Rossing, Peter; Gansevoort, Ron T.; Laverman, Gozewijn D.; Heerspink, Hiddo J. L.

    AIMS: Albuminuria-lowering drugs have shown different effect sizes in different individuals. Since urine albumin levels are known to vary considerably from day to day, we questioned whether the between-individual variability in albuminuria response after therapy initiation reflects a random

  13. How a dependent's variable non-randomness affects taper equation ...

    African Journals Online (AJOL)

    Regression results, for the two methods, were compared using the confidence interval estimates for the regression coefficients, the multicollinearity tests and Fit Index (FI) values as criteria. The comparison of results showed that randomness of the dependent variable (second method) did not improve the estimates, in any of ...

  14. Study on genetic variability of Cassidula aurisfelis (snail) by random ...

    African Journals Online (AJOL)

    PRECIOUS

    2009-11-16

    Nov 16, 2009 ... genetic variability is Random Amplified Polymorphic DNAs (RAPD) (Williams et al., 1990). The technique requires no prior knowledge of the genome and it needs ... quantity of DNA was measured by obtaining the absorbance read... 1994) and Numerical Taxonomy and Multivariate Analysis System.

  15. An infinite-dimensional weak KAM theory via random variables

    KAUST Repository

    Gomes, Diogo A.

    2016-08-31

    We develop several aspects of the infinite-dimensional Weak KAM theory using a random variables' approach. We prove that the infinite-dimensional cell problem admits a viscosity solution that is a fixed point of the Lax-Oleinik semigroup. Furthermore, we show the existence of invariant minimizing measures and calibrated curves defined on R.

  16. Limit theorems for multi-indexed sums of random variables

    CERN Document Server

    Klesov, Oleg

    2014-01-01

    Presenting the first unified treatment of limit theorems for multiple sums of independent random variables, this volume fills an important gap in the field. Several new results are introduced, even in the classical setting, as well as some new approaches that are simpler than those already established in the literature. In particular, new proofs of the strong law of large numbers and the Hajek-Renyi inequality are detailed. Applications of the described theory include Gibbs fields, spin glasses, polymer models, image analysis and random shapes. Limit theorems form the backbone of probability theory and statistical theory alike. The theory of multiple sums of random variables is a direct generalization of the classical study of limit theorems, whose importance and wide application in science is unquestionable. However, to date, the subject of multiple sums has only been treated in journals. The results described in this book will be of interest to advanced undergraduates, graduate students and researchers who ...

  17. A review of instrumental variable estimators for Mendelian randomization.

    Science.gov (United States)

    Burgess, Stephen; Small, Dylan S; Thompson, Simon G

    2017-10-01

    Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
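
    The simplest of the estimators reviewed above, the ratio (Wald) estimator with a single instrument, can be sketched on simulated data (the variable names and data-generating model below are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
U = rng.normal(size=n)                        # unmeasured confounder
Z = rng.binomial(1, 0.5, size=n)              # instrument, e.g. a genetic variant
X = 0.5 * Z + U + rng.normal(size=n)          # exposure, confounded by U
Y = 0.3 * X + U + rng.normal(size=n)          # outcome; true causal effect = 0.3

beta_ols = np.cov(X, Y)[0, 1] / np.var(X, ddof=1)        # biased by confounding
beta_iv  = np.cov(Z, Y)[0, 1] / np.cov(Z, X)[0, 1]       # ratio (Wald) estimator
print(beta_ols, beta_iv)                      # OLS overestimates; IV is close to 0.3
```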

  18. The Variability of the Order Burkholderiales Representatives in the Healthcare Units

    Directory of Open Access Journals (Sweden)

    Olga L. Voronina

    2015-01-01

    Full Text Available Background and Aim. The order Burkholderiales has become more abundant in healthcare units since the late 1970s; it is especially dangerous for intensive care unit patients and patients with chronic lung diseases. The goal of this investigation was to reveal the real variability of the order Burkholderiales representatives and to estimate their phylogenetic relationships. Methods. 16S rDNA and genes of the Burkholderia cenocepacia complex (Bcc) Multi Locus Sequence Typing (MLST) scheme were used for the bacteria detection. Results. A huge diversity of genome size and organization was revealed in the order Burkholderiales, which may prove the adaptability of this taxon's representatives. The following variability of the Burkholderiales in Russian healthcare units has been revealed: Burkholderiaceae (Burkholderia, Pandoraea, and Lautropia), Alcaligenaceae (Achromobacter), and Comamonadaceae (Variovorax). The Burkholderia genus was the most diverse and was represented by 5 species and 16 sequence types (ST). ST709 and 728 were transmissible and often encountered in cystic fibrosis patients and in hospitals. A. xylosoxidans was represented by 15 genotypes; strains of the first and second genotypes were the most numerous. Conclusions. The phylogenetic position of the genus Lautropia, with its smaller genome, is ambiguous. The Bcc MLST scheme is applicable to all Burkholderiales representatives for resolving epidemiological problems.

  19. The Variability of the Order Burkholderiales Representatives in the Healthcare Units.

    Science.gov (United States)

    Voronina, Olga L; Kunda, Marina S; Ryzhova, Natalia N; Aksenova, Ekaterina I; Semenov, Andrey N; Lasareva, Anna V; Amelina, Elena L; Chuchalin, Alexandr G; Lunin, Vladimir G; Gintsburg, Alexandr L

    2015-01-01

    The order Burkholderiales has become more abundant in healthcare units since the late 1970s; it is especially dangerous for intensive care unit patients and patients with chronic lung diseases. The goal of this investigation was to reveal the real variability of the order Burkholderiales representatives and to estimate their phylogenetic relationships. 16S rDNA and genes of the Burkholderia cenocepacia complex (Bcc) Multi Locus Sequence Typing (MLST) scheme were used for the bacteria detection. A huge diversity of genome size and organization was revealed in the order Burkholderiales, which may prove the adaptability of this taxon's representatives. The following variability of the Burkholderiales in Russian healthcare units has been revealed: Burkholderiaceae (Burkholderia, Pandoraea, and Lautropia), Alcaligenaceae (Achromobacter), and Comamonadaceae (Variovorax). The Burkholderia genus was the most diverse and was represented by 5 species and 16 sequence types (ST). ST709 and 728 were transmissible and often encountered in cystic fibrosis patients and in hospitals. A. xylosoxidans was represented by 15 genotypes; strains of the first and second genotypes were the most numerous. The phylogenetic position of the genus Lautropia, with its smaller genome, is ambiguous. The Bcc MLST scheme is applicable to all Burkholderiales representatives for resolving epidemiological problems.

  20. Instrumental variable analyses. Exploiting natural randomness to understand causal mechanisms.

    Science.gov (United States)

    Iwashyna, Theodore J; Kennedy, Edward H

    2013-06-01

    Instrumental variable analysis is a technique commonly used in the social sciences to provide evidence that a treatment causes an outcome, as contrasted with evidence that a treatment is merely associated with differences in an outcome. To extract such strong evidence from observational data, instrumental variable analysis exploits situations where some degree of randomness affects how patients are selected for a treatment. An instrumental variable is a characteristic of the world that leads some people to be more likely to get the specific treatment we want to study but does not otherwise change those patients' outcomes. This seminar explains, in nonmathematical language, the logic behind instrumental variable analyses, including several examples. It also provides three key questions that readers of instrumental variable analyses should ask to evaluate the quality of the evidence. (1) Does the instrumental variable lead to meaningful differences in the treatment being tested? (2) Other than through the specific treatment being tested, is there any other way the instrumental variable could influence the outcome? (3) Does anything cause patients to both receive the instrumental variable and receive the outcome?

  1. Southern hemisphere climate variability as represented by an ocean-atmosphere coupled model

    CSIR Research Space (South Africa)

    Beraki, A

    2012-09-01

    Full Text Available , 1996: Relationship of air temperature in New Zealand to regional anomalies in sea-surface temperature and atmospheric circulation. Int. J. Climatol., 16, 405?425. Beraki, A., D. DeWitt, W.A. Landman and O. Cobus, 2011: Ocean-Atmosphere Coupled... variability as represented by an ocean-atmosphere coupled model Asmerom Beraki1,2, Willem A. Landman2,3 and David DeWitt4 1South African Weather Service Pretoria, South Africa, asmerom.beraki@weahtersa.co.za 2Departement of Geography...

  2. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Ahmed, Sajid

    2016-01-13

    Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.

  3. Higher moments of Banach space valued random variables

    CERN Document Server

    Janson, Svante

    2015-01-01

    The authors define the k:th moment of a Banach space valued random variable as the expectation of its k:th tensor power; thus the moment (if it exists) is an element of a tensor power of the original Banach space. The authors study both the projective and injective tensor products, and their relation. Moreover, in order to be general and flexible, we study three different types of expectations: Bochner integrals, Pettis integrals and Dunford integrals.

  4. Entropy power inequality for a family of discrete random variables

    CERN Document Server

    Sharma, Naresh; Muthukrishnan, Siddharth

    2010-01-01

    It is known that the Entropy Power Inequality (EPI) always holds if the random variables have a density. Not much work has been done to identify discrete distributions for which the inequality holds with the differential entropy replaced by the discrete entropy. Harremoës and Vignat showed that it holds for the pair (B(m,p), B(n,p)), m, n ∈ ℕ (where B(n,p) is a binomial distribution with n trials each with success probability p) for p = 0.5. In this paper, we considerably expand the set of binomial distributions for which the inequality holds and, in particular, identify n_0(p) such that for all m, n ≥ n_0(p), the EPI holds for (B(m,p), B(n,p)). We further show that the EPI holds for discrete random variables that can be expressed as the sum of n independent identically distributed (IID) discrete random variables for large n.
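
    A quick numerical check of the discrete inequality discussed above, exp(2H(X+Y)) >= exp(2H(X)) + exp(2H(Y)) with discrete entropy H, for the binomial case p = 0.5 settled by Harremoës and Vignat (illustration only; it is the paper's results, not this snippet, that establish when the inequality holds):

```python
import numpy as np
from scipy import stats

def binom_entropy(n, p):
    pmf = stats.binom.pmf(np.arange(n + 1), n, p)
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log(pmf))          # discrete entropy in nats

m, n, p = 10, 12, 0.5
lhs = np.exp(2 * binom_entropy(m + n, p))      # X + Y ~ B(m + n, p) for a common p
rhs = np.exp(2 * binom_entropy(m, p)) + np.exp(2 * binom_entropy(n, p))
print(lhs, rhs, lhs >= rhs)                    # expected True for p = 0.5
```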

  5. Non-Shannon Information Inequalities in Four Random Variables

    CERN Document Server

    Dougherty, Randall; Zeger, Kenneth

    2011-01-01

    Any unconstrained information inequality in three or fewer random variables can be written as a linear combination of instances of Shannon's inequality I(A;B|C) >= 0 . Such inequalities are sometimes referred to as "Shannon" inequalities. In 1998, Zhang and Yeung gave the first example of a "non-Shannon" information inequality in four variables. Their technique was to add two auxiliary variables with special properties and then apply Shannon inequalities to the enlarged list. Here we will show that the Zhang-Yeung inequality can actually be derived from just one auxiliary variable. Then we use their same basic technique of adding auxiliary variables to give many other non-Shannon inequalities in four variables. Our list includes the inequalities found by Xu, Wang, and Sun, but it is by no means exhaustive. Furthermore, some of the inequalities obtained may be superseded by stronger inequalities that have yet to be found. Indeed, we show that the Zhang-Yeung inequality is one of those that is superseded. We al...

  6. Bipolar I and II disorders in a random and representative Australian population.

    Science.gov (United States)

    Goldney, Robert D; Fisher, Laura J; Grande, Eleonora Dal; Taylor, Anne W; Hawthorne, Graeme

    2005-08-01

    To assess the prevalence of bipolar I and II disorders in an Australian population. The Mood Disorder Questionnaire (MDQ) was administered to 3015 respondents in a random and representative sample in South Australia. Health status, quality of life and demographic data were also collected. There was a 2.5% lifetime prevalence of bipolar I and II disorders delineated by the MDQ. Those people had a significantly greater use of services and a poorer health status and quality of life than those who were MDQ-negative. These results in an Australian population are consistent with other international studies showing a greater prevalence of bipolar disorders than hitherto appreciated.

  7. Generation of correlated finite alphabet waveforms using gaussian random variables

    KAUST Repository

    Jardak, Seifallah

    2014-09-01

    Correlated waveforms have a number of applications in different fields, such as radar and communication. It is very easy to generate correlated waveforms using infinite alphabets, but for some applications it is very challenging to use them in practice. Moreover, to generate infinite alphabet constant envelope correlated waveforms, the available research uses iterative algorithms, which are computationally very expensive. In this work, we propose simple novel methods to generate correlated waveforms using finite alphabet constant and non-constant-envelope symbols. To generate finite alphabet waveforms, the proposed method maps the Gaussian random variables onto the phase-shift-keying, pulse-amplitude, and quadrature-amplitude modulation schemes. For such mapping, the probability density function of the Gaussian random variables is divided into M regions, where M is the number of alphabets in the corresponding modulation scheme. By exploiting the mapping function, the relationship between the cross-correlation of Gaussian and finite alphabet symbols is derived. To generate equiprobable symbols, the area of each region is kept the same. If the requirement is to have each symbol with its own unique probability, the proposed scheme allows that as well. Although the proposed scheme is general, the main focus of this paper is to generate finite alphabet waveforms for multiple-input multiple-output radar, where correlated waveforms are used to achieve desired beampatterns. © 2014 IEEE.
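
    The mapping step described above can be sketched as follows (illustrative only, not the authors' algorithm): partition the standard Gaussian density into M equiprobable regions via quantile thresholds and map each Gaussian sample to the index of its region, yielding equiprobable finite-alphabet symbols:

```python
import numpy as np
from scipy import stats

def gaussian_to_symbols(g, M):
    """Map Gaussian samples g to symbol indices 0..M-1 with equal probabilities."""
    thresholds = stats.norm.ppf(np.arange(1, M) / M)   # M-1 interior quantiles
    return np.searchsorted(thresholds, g)

rng = np.random.default_rng(4)
g = rng.normal(size=100_000)             # in practice these would be correlated Gaussians
idx = gaussian_to_symbols(g, M=4)
print(np.bincount(idx) / idx.size)       # ~ [0.25, 0.25, 0.25, 0.25]

# e.g. map indices to unit-modulus QPSK constellation points
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * idx))
```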

  8. Analysis of Secret Key Randomness Exploiting the Radio Channel Variability

    Directory of Open Access Journals (Sweden)

    Taghrid Mazloum

    2015-01-01

    Full Text Available A few years ago, physical-layer-based techniques started to be considered as a way to improve security in wireless communications. A well-known problem is the management of ciphering keys, both regarding the generation and the distribution of these keys. A way to alleviate such difficulties is to use a common source of randomness for the legitimate terminals, not accessible to an eavesdropper. This is the case of the fading propagation channel, when exact or approximate reciprocity applies. Although this principle has been known for a long time, few works have evaluated the effect of radio channel properties in practical environments on the degree of randomness of the generated keys. To this end, we here investigate indoor radio channel measurements in different environments and settings in either the 2.4625 GHz or the 5.4 GHz band, of particular interest for WIFI-related standards. Key bits are extracted by quantizing the complex channel coefficients and their randomness is evaluated using the NIST test suite. We then look at the impact of the carrier frequency and the channel variability in the space, time, and frequency degrees of freedom used to construct a long secret key, in relation to the nature of the radio environment, such as the LOS/NLOS character.

  9. Selection for altruism through random drift in variable size populations.

    Science.gov (United States)

    Houchmandzadeh, Bahram; Vallade, Marcel

    2012-05-10

    Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have 'hidden' advantages if the 'common good' produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency dependent fitness. Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of "selfish" alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  10. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists’ advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  11. A lower bound on the probability that a binomial random variable is exceeding its mean

    OpenAIRE

    Pelekis, Christos; Ramon, Jan

    2016-01-01

    We provide a lower bound on the probability that a binomial random variable is exceeding its mean. Our proof employs estimates on the mean absolute deviation and the tail conditional expectation of binomial random variables.
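
    The quantity studied in this record is easy to compute exactly for particular cases, which helps build intuition for the bound (a numerical illustration only; the bound itself is established in the paper):

```python
from math import ceil, floor
from scipy import stats

for n, p in [(10, 0.3), (50, 0.5), (200, 0.07)]:
    mean = n * p
    p_ge = stats.binom.sf(ceil(mean) - 1, n, p)   # P(X >= np)
    p_gt = stats.binom.sf(floor(mean), n, p)      # P(X > np)
    print(n, p, round(p_ge, 4), round(p_gt, 4))
```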

  12. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Richards, James [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-01

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve demand over the evolution of many years or decades. Various CEM formulations are used to evaluate systems ranging in scale from states or utility service territories to national or multi-national systems. CEMs can be computationally complex, and to achieve acceptable solve times, key parameters are often estimated using simplified methods. In this paper, we focus on two of these key parameters associated with the integration of variable generation (VG) resources: capacity value and curtailment. We first discuss common modeling simplifications used in CEMs to estimate capacity value and curtailment, many of which are based on a representative subset of hours that can miss important tail events or which require assumptions about the load and resource distributions that may not match actual distributions. We then present an alternate approach that captures key elements of chronological operation over all hours of the year without the computationally intensive economic dispatch optimization typically employed within more detailed operational models. The updated methodology characterizes the (1) contribution of VG to system capacity during high load and net load hours, (2) the curtailment level of VG, and (3) the potential reductions in curtailments enabled through deployment of storage and more flexible operation of select thermal generators. We apply this alternate methodology to an existing CEM, the Regional Energy Deployment System (ReEDS). Results demonstrate that this alternate approach provides more accurate estimates of capacity value and curtailments by explicitly capturing system interactions across all hours of the year. This approach could be applied more broadly to CEMs at many different scales where hourly resource and load data is available, greatly improving the representation of challenges

  13. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    Science.gov (United States)

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values. PMID:18595286
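
    The kind of schedule values the Excel macros produce can be sketched in a few lines (the article itself uses Excel/VBA macros; the Python below is our illustrative equivalent, and the particular constructions are assumptions, not the authors' formulas):

```python
import random

def variable_ratio(mean_ratio, n_values, spread=0.5):
    """Response requirements averaging roughly mean_ratio, drawn uniformly from
    +/- spread * mean_ratio around the mean (one simple construction)."""
    lo = max(1, round(mean_ratio * (1 - spread)))
    hi = round(mean_ratio * (1 + spread))
    return [random.randint(lo, hi) for _ in range(n_values)]

def random_interval(mean_interval, n_values):
    """Random-interval values: exponential waits give a constant probability per
    unit time that reinforcement becomes available."""
    return [random.expovariate(1.0 / mean_interval) for _ in range(n_values)]

random.seed(0)
print(variable_ratio(10, 5))                                # VR 10-style ratio requirements
print([round(t, 1) for t in random_interval(30.0, 5)])      # RI 30-s style intervals
```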

  14. A descriptive analysis of a representative sample of pediatric randomized controlled trials published in 2007

    Directory of Open Access Journals (Sweden)

    Thomson Denise

    2010-12-01

    Full Text Available Abstract Background Randomized controlled trials (RCTs) are the gold standard for trials assessing the effects of therapeutic interventions; therefore it is important to understand how they are conducted. Our objectives were to provide an overview of a representative sample of pediatric RCTs published in 2007 and assess the validity of their results. Methods We searched the Cochrane Central Register of Controlled Trials using a pediatric filter and randomly selected 300 RCTs published in 2007. We extracted data on trial characteristics; outcomes; methodological quality; reporting; and registration and protocol characteristics. Trial registration and protocol availability were determined for each study based on the publication, an Internet search and an author survey. Results Most studies (83%) were efficacy trials, 40% evaluated drugs, and 30% were placebo-controlled. Primary outcomes were specified in 41%; 43% reported on adverse events. At least one statistically significant outcome was reported in 77% of trials; 63% favored the treatment group. Trial registration was declared in 12% of publications and 23% were found through an Internet search. Risk of bias (ROB) was high in 59% of trials, unclear in 33%, and low in 8%. Registered trials were more likely to have low ROB than non-registered trials (16% vs. 5%; p = 0.008). Effect sizes tended to be larger for trials at high vs. low ROB (0.28, 95% CI 0.21-0.35 vs. 0.16, 95% CI 0.07-0.25). Among survey respondents (50% response rate), the most common reason for trial registration was a publication requirement and, for non-registration, a lack of familiarity with the process. Conclusions More than half of this random sample of pediatric RCTs published in 2007 was at high ROB and three quarters of trials were not registered. There is an urgent need to improve the design, conduct, and reporting of child health research.

  15. Generating Correlated QPSK Waveforms By Exploiting Real Gaussian Random Variables

    KAUST Repository

    Jardak, Seifallah

    2012-11-01

    The design of waveforms with specified auto- and cross-correlation properties has a number of applications in multiple-input multiple-output (MIMO) radar, one of which is the desired transmit beampattern design. In this work, an algorithm is proposed to generate quadrature phase-shift keying (QPSK) waveforms with required cross-correlation properties using real Gaussian random variables (RVs). This work can be considered as the extension of what was presented in [1] to generate BPSK waveforms. This work will be extended for the generation of correlated higher-order phase-shift keying (PSK) and quadrature amplitude modulation (QAM) schemes that can better approximate the desired beampattern.
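
    The underlying trick, sketched here as a simplified illustration rather than the paper's algorithm, is to pre-distort the covariance of real Gaussian variables so that their hard-limited (sign) versions have the desired correlation, using the arcsine law E[sgn(x)sgn(y)] = (2/π)·arcsin(ρ). The snippet handles only a real-valued target cross-correlation matrix R (chosen arbitrarily) and assumes that sin(πR/2) remains positive semidefinite.

```python
import numpy as np

rng = np.random.default_rng(0)

# Desired (real-valued) cross-correlation between the two QPSK waveforms.
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])

# Arcsine law: sign(Gaussian) pairs with correlation rho have correlation
# (2/pi)*arcsin(rho), so pre-distort the Gaussian covariance accordingly.
Sigma = np.sin(np.pi * R / 2.0)          # assumed to stay positive semidefinite
L = np.linalg.cholesky(Sigma)

N = 200_000                               # number of symbols per waveform
gI = L @ rng.standard_normal((2, N))      # Gaussians driving the in-phase bits
gQ = L @ rng.standard_normal((2, N))      # independent Gaussians for the quadrature bits

s = (np.sign(gI) + 1j * np.sign(gQ)) / np.sqrt(2.0)   # unit-energy QPSK symbols

# Empirical cross-correlation matrix of the generated QPSK waveforms.
R_hat = (s @ s.conj().T).real / N
print(np.round(R_hat, 3))                 # should be close to R
```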

  16. Determining the number of test fires needed to represent the variability present within 9mm Luger firearms.

    Science.gov (United States)

    Law, Eric F; Morris, Keith B; Jelsema, Casey M

    2017-07-01

    Many studies have been performed in recent years in the field of firearm examination with the goal of providing an objective method for comparisons of fired cartridge cases. No published research to support the number of test fires needed to represent the variability present within the impressions left on a cartridge case could be found. When a suspect firearm is submitted to a firearm examiner, typically two to four test fires are performed. The recovered cartridge cases are compared to each other to determine which characteristics from the firearm are reproducing, and then compared to any cartridge cases collected at a crime scene. The aim of this research was to determine the number of test fires examiners should perform when a suspect firearm is submitted to the lab to balance cartridge case acquisition time with performance accuracy. Each firearm in the IBIS® database at West Virginia University® is represented by approximately 100 fired cartridge case entries. Random samples of cartridge cases were taken separately from the breech face match score and firing pin match score lists. This subset was compared to the total match distribution of the firearm using a hybrid equivalence test to determine if the subset of similarity scores were statistically equivalent to the larger distribution of scores. For the sampled distribution to remain above 80% equivalent to the match distribution, a minimum of 15 cartridge cases should be used to model the match distribution, based on IBIS® scores. Thirty cartridge cases is a conservative estimate, allowing one to determine that the location and dispersion of the match and sampling distributions are equivalent with nearly 100% probability. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Instrumental variables and Mendelian randomization with invalid instruments

    Science.gov (United States)

    Kang, Hyunseung

    Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or an intervention on an outcome of interest. The IV method relies on having a valid instrument, a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially those that satisfy (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are only identified as long as less than 50% of instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present both in simulation studies as well as in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. We propose an estimator along with
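
    To make assumptions (A1)-(A3) concrete, the simulation below compares ordinary least squares with two-stage least squares (2SLS) when some candidate instruments violate (A2) by affecting the outcome directly. It is a toy illustration of the problem sisVIVE addresses, not an implementation of sisVIVE; all data-generating values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 50_000, 2.0                      # sample size and true causal effect

Z = rng.standard_normal((n, 6))            # 6 candidate instruments
u = rng.standard_normal(n)                 # unmeasured confounder
gamma = np.full(6, 0.5)                    # all instruments satisfy (A1)
alpha = np.array([0, 0, 0, 0, 0.8, 0.8])   # last two violate (A2): direct effect on Y

D = Z @ gamma + u + rng.standard_normal(n)             # exposure
Y = beta * D + Z @ alpha + u + rng.standard_normal(n)  # outcome

def two_stage_ls(Y, D, Z):
    """Basic 2SLS: project D onto the instruments, then regress Y on the projection."""
    D_hat = Z @ np.linalg.lstsq(Z, D, rcond=None)[0]
    return (D_hat @ Y) / (D_hat @ D)

ols = (D @ Y) / (D @ D)                    # confounded by u, hence biased
print("OLS                  :", round(ols, 3))
print("2SLS, all instruments:", round(two_stage_ls(Y, D, Z), 3))          # biased by invalid IVs
print("2SLS, valid only     :", round(two_stage_ls(Y, D, Z[:, :4]), 3))   # close to beta = 2
```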

  18. Representing general theoretical concepts in structural equation models: The role of composite variables

    Science.gov (United States)

    Grace, J.B.; Bollen, K.A.

    2008-01-01

    Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically-based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling heterogeneous concepts of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially-reduced-form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influence of suites of variables is often of interest. © Springer Science+Business Media, LLC 2007.

  19. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    Full Text Available The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
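
    The flavor of the iterative backwards fixed-point calculation for expected running time can be conveyed by a toy example that needs none of the paper's machinery: a symmetric random walk on {0, …, N} absorbed at both ends, whose expected number of steps from state i is the least fixed point of a simple expectation operator and equals i·(N − i). This sketch illustrates only the fixed-point iteration, not random variable abstraction itself.

```python
import numpy as np

N = 10                       # absorbing states 0 and N
v = np.zeros(N + 1)          # start the iteration from the bottom element (all zeros)

# Backwards fixed-point iteration of T(v)(i) = 1 + 0.5*v(i-1) + 0.5*v(i+1)
# for interior states; the absorbing states keep expected remaining time 0.
for _ in range(100_000):
    new = v.copy()
    new[1:N] = 1.0 + 0.5 * v[0:N - 1] + 0.5 * v[2:N + 1]
    if np.max(np.abs(new - v)) < 1e-10:
        v = new
        break
    v = new

exact = np.array([i * (N - i) for i in range(N + 1)], dtype=float)
print(np.round(v, 4))        # converges to the exact expected hitting times
print(exact)
```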

  20. Equivalent Conditions of Complete Convergence for Weighted Sums of Sequences of Negatively Dependent Random Variables

    Directory of Open Access Journals (Sweden)

    Mingle Guo

    2012-01-01

    Full Text Available The complete convergence for weighted sums of sequences of negatively dependent random variables is investigated. By applying moment inequality and truncation methods, the equivalent conditions of complete convergence for weighted sums of sequences of negatively dependent random variables are established. These results not only extend the corresponding results obtained by Li et al. (1995), Gut (1993), and Liang (2000) to sequences of negatively dependent random variables, but also improve them.

  1. A Family of Estimators of a Sensitive Variable Using Auxiliary Information in Stratified Random Sampling

    National Research Council Canada - National Science Library

    Nadia Mushtaq; Noor Ul Amin; Muhammad Hanif

    2017-01-01

    In this article, a combined general family of estimators is proposed for estimating finite population mean of a sensitive variable in stratified random sampling with non-sensitive auxiliary variable...

  2. Latent variable indirect response modeling of categorical endpoints representing change from baseline.

    Science.gov (United States)

    Hu, Chuanpu; Xu, Zhenhua; Mendelsohn, Alan M; Zhou, Honghui

    2013-02-01

    Accurate exposure-response modeling is important in drug development. Methods are still evolving in the use of mechanistic, e.g., indirect response (IDR) models to relate discrete endpoints, mostly of the ordered categorical form, to placebo/co-medication effect and drug exposure. When the discrete endpoint is derived using change-from-baseline measurements, a mechanistic exposure-response modeling approach requires adjustment to maintain appropriate interpretation. This manuscript describes a new modeling method that integrates a latent-variable representation of IDR models with standard logistic regression. The new method also extends to general link functions that cover probit regression or continuous clinical endpoint modeling. Compared to an earlier latent variable approach that constrained the baseline probability of response to be 0, placebo effect parameters in the new model formulation are more readily interpretable and can be separately estimated from placebo data, thus allowing convenient and robust model estimation. A general inherent connection of some latent variable representations with baseline-normalized standard IDR models is derived. For describing clinical response endpoints, Type I and Type III IDR models are shown to be equivalent, therefore there are only three identifiable IDR models. This approach was applied to data from two phase III clinical trials of intravenously administered golimumab for the treatment of rheumatoid arthritis, where 20, 50, and 70% improvement in the American College of Rheumatology disease severity criteria were used as efficacy endpoints. Likelihood profiling and visual predictive checks showed reasonable parameter estimation precision and model performance.

  3. Blind estimation of statistical properties of non-stationary random variables

    Science.gov (United States)

    Mansour, Ali; Mesleh, Raed; Aggoune, el-Hadi M.

    2014-12-01

    To identify or equalize wireless transmission channels, or alternatively to evaluate the performance of many wireless communication algorithms, coefficients or statistical properties of the used transmission channels are often assumed to be known or can be estimated at the receiver end. For most of the proposed algorithms, the knowledge of transmission channel statistical properties is essential to detect signals and retrieve data. To the best of our knowledge, most proposed approaches assume that transmission channels are static and can be modeled by stationary random variables (uniform, Gaussian, exponential, Weibull, Rayleigh, etc.). In the majority of sensor networks or cellular systems applications, transmitters and/or receivers are in motion. Therefore, the assumption of static transmission channels and the underlying assumptions may not be valid. In this case, coefficients and statistical properties change and therefore the stationary model falls short of making an accurate representation. In order to estimate the statistical properties (represented by the high-order statistics and probability density function, PDF) of dynamic channels, we firstly assume that the dynamic channels can be modeled by short-term stationary but long-term non-stationary random variable (RV), i.e., the RVs are stationary within unknown successive periods but they may suddenly change their statistical properties between two successive periods. Therefore, this manuscript proposes an algorithm to detect the transition phases of non-stationary random variables and introduces an indicator based on high-order statistics for non-stationary transmission which can be used to alter channel properties and initiate the estimation process. Additionally, PDF estimators based on kernel functions are also developed. The first part of the manuscript provides a brief introduction for unbiased estimators of the second and fourth-order cumulants. Then, the non-stationary indicators are formulated
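
    A minimal version of the two ingredients — a sliding higher-order statistic that flags a change in the channel statistics, followed by kernel PDF estimation on each stationary segment — can be sketched as follows. The signal model, window length and threshold below are arbitrary illustrative choices, not the detector proposed in the manuscript.

```python
import numpy as np
from scipy.stats import kurtosis, gaussian_kde

rng = np.random.default_rng(7)

# Piecewise-stationary sequence: Gaussian statistics, then Laplacian statistics.
x = np.concatenate([rng.normal(0, 1, 5000), rng.laplace(0, 1, 5000)])

# Sliding-window fourth-order statistic (excess kurtosis) as a change indicator:
# it stays near 0 on the Gaussian segment and near 3 on the Laplacian one.
win = 500
k = np.array([kurtosis(x[i:i + win]) for i in range(0, len(x) - win, 50)])
change = int(np.argmax(k > 1.5)) * 50 + win // 2     # crude threshold crossing
print("estimated transition near sample", change)     # true transition at 5000

# Kernel PDF estimates for the two detected stationary segments.
pdf_a = gaussian_kde(x[:change])
pdf_b = gaussian_kde(x[change:])
grid = np.linspace(-4, 4, 9)
print(np.round(pdf_a(grid), 3))
print(np.round(pdf_b(grid), 3))
```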

  4. Vagally-Mediated Heart Rate Variability and Indices of Wellbeing: Results of a Nationally Representative Study

    Science.gov (United States)

    Sloan, Richard P; Schwarz, Emilie; McKinley, Paula S; Weinstein, Maxine; Love, Gayle; Ryff, Carol; Mroczek, Daniel; Choo, Tse; Lee, Seonjoo; Seeman, Teresa

    2016-01-01

    Objective High frequency (HF) heart rate variability (HRV) has long been accepted as an index of cardiac vagal control. Recent studies report relationships between HF-HRV and indices of positive and negative affect, personality traits and wellbeing but these studies generally are based on small and selective samples. Method These relationships were examined using data from 967 participants in the second Midlife in the US (MIDUS II) study. Participants completed survey questionnaires on wellbeing and affect. HF-HRV was measured at rest. A hierarchical series of regression analyses examined relationships between these various indices and HF-HRV before and after adjustment for relevant demographic and biomedical factors. Results Significant inverse relationships were found only between indices of negative affect and HF-HRV. Relationships between indices of psychological and hedonic wellbeing and positive affect failed to reach significance. Conclusions These findings raise questions about relationships between cardiac parasympathetic modulation, emotion regulation, and indices of wellbeing. PMID:27570892

  5. A Novel Method for Increasing the Entropy of a Sequence of Independent, Discrete Random Variables

    Directory of Open Access Journals (Sweden)

    Mieczyslaw Jessa

    2015-10-01

    Full Text Available In this paper, we propose a novel method for increasing the entropy of a sequence of independent, discrete random variables with arbitrary distributions. The method uses an auxiliary table and a novel theorem that concerns the entropy of a sequence in which the elements are a bitwise exclusive-or sum of independent discrete random variables.
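
    The effect is easy to verify numerically: for independent X and Y, H(X ⊕ Y) ≥ H(X ⊕ Y | Y) = H(X), so the bitwise exclusive-or of two independent b-bit variables is never less entropic than either input. The distributions below are arbitrary examples; the auxiliary-table construction of the paper is not reproduced here.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

b = 4                                   # 4-bit symbols: values 0..15
rng = np.random.default_rng(3)

# Two arbitrary, fairly skewed (low-entropy) independent distributions.
px = rng.dirichlet(0.3 * np.ones(2**b))
py = rng.dirichlet(0.3 * np.ones(2**b))

# Distribution of Z = X xor Y for independent X, Y:
# P(Z = z) = sum over x of P(X = x) * P(Y = x xor z).
pz = np.zeros(2**b)
for z in range(2**b):
    for xv in range(2**b):
        pz[z] += px[xv] * py[xv ^ z]

print("H(X)       =", round(entropy(px), 3))
print("H(Y)       =", round(entropy(py), 3))
print("H(X xor Y) =", round(entropy(pz), 3))   # at least max(H(X), H(Y))
```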

  6. Fuzzy random variables — II. Algorithms and examples for the discrete case

    NARCIS (Netherlands)

    Kwakernaak, H.

    1979-01-01

    The results obtained in part I of the paper are specialized to the case of discrete fuzzy random variables. A more intuitive interpretation is given of the notion of fuzzy random variables. Algorithms are derived for determining expectations, fuzzy probabilities, fuzzy conditional expectations and

  7. Complete Moment Convergence and Mean Convergence for Arrays of Rowwise Extended Negatively Dependent Random Variables

    Directory of Open Access Journals (Sweden)

    Yongfeng Wu

    2014-01-01

    Full Text Available The authors first present a Rosenthal inequality for sequences of extended negatively dependent (END) random variables. By means of the Rosenthal inequality, the authors obtain some complete moment convergence and mean convergence results for arrays of rowwise END random variables. The results in this paper extend and improve the corresponding theorems by Hu and Taylor (1997).

  8. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    Science.gov (United States)

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
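
    The article's recursion runs through Stirling numbers of the first kind; a closely related classical identity, used in the check below, expresses the raw moments through Stirling numbers of the second kind, E[X^k] = Σ_j S(k, j)·n^(j)·p^j, because the factorial moments of a Binomial(n, p) variable are E[(X)_j] = n^(j)·p^j (with n^(j) the falling factorial). The sketch is a companion illustration, not the paper's derivation.

```python
from math import comb

def stirling2(k, j):
    """Stirling numbers of the second kind via the standard recurrence."""
    if k == j:
        return 1
    if j == 0 or j > k:
        return 0
    return j * stirling2(k - 1, j) + stirling2(k - 1, j - 1)

def falling(n, j):
    """Falling factorial n^(j) = n(n-1)...(n-j+1)."""
    out = 1
    for i in range(j):
        out *= (n - i)
    return out

def binom_raw_moment(n, p, k):
    """E[X^k] for X ~ Binomial(n, p) via Stirling numbers of the second kind."""
    return sum(stirling2(k, j) * falling(n, j) * p**j for j in range(k + 1))

def binom_raw_moment_direct(n, p, k):
    """Brute-force check: sum over the whole probability mass function."""
    return sum(x**k * comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))

n, p = 12, 0.3
for k in range(1, 6):
    print(k, round(binom_raw_moment(n, p, k), 6), round(binom_raw_moment_direct(n, p, k), 6))
```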

  9. CONVERGENCE OF THE FRACTIONAL PARTS OF THE RANDOM VARIABLES TO THE TRUNCATED EXPONENTIAL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Bogdan Gheorghe Munteanu

    2013-01-01

    Full Text Available Using stochastic approximations, this paper studies the convergence in distribution of the fractional parts of the sum of random variables to the truncated exponential distribution with parameter lambda. This is achieved by means of the Fourier-Stieltjes sequence (FSS) of the random variable.

  10. Strong Laws of Large Numbers for Arrays of Rowwise NA and LNQD Random Variables

    Directory of Open Access Journals (Sweden)

    Jiangfeng Wang

    2011-01-01

    Full Text Available Some strong laws of large numbers and strong convergence properties for arrays of rowwise negatively associated and linearly negative quadrant dependent random variables are obtained. The results obtained not only generalize the result of Hu and Taylor to negatively associated and linearly negative quadrant dependent random variables, but also improve it.

  11. A Family of Estimators of a Sensitive Variable Using Auxiliary Information in Stratified Random Sampling

    Directory of Open Access Journals (Sweden)

    Nadia Mushtaq

    2017-03-01

    Full Text Available In this article, a combined general family of estimators is proposed for estimating the finite population mean of a sensitive variable in stratified random sampling with a non-sensitive auxiliary variable based on the randomized response technique. Under the stratified random sampling without replacement scheme, the expressions of bias and mean square error (MSE) up to the first-order approximation are derived. Theoretical and empirical results through a simulation study show that the proposed class of estimators is more efficient than the existing estimators, i.e., the usual stratified random sample mean estimator and the Sousa et al. (2014) ratio and regression estimators of the sensitive variable in stratified sampling.

  12. A comparison of methods for representing random taste heterogeneity in discrete choice models

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Hess, Stephane

    2009-01-01

    This paper reports the findings of a systematic study using Monte Carlo experiments and a real dataset aimed at comparing the performance of various ways of specifying random taste heterogeneity in a discrete choice model. Specifically, the analysis compares the performance of two recent advanced...

  13. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    OpenAIRE

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to writ...

  14. A New Estimator For Population Mean Using Two Auxiliary Variables in Stratified random Sampling

    OpenAIRE

    Singh, Rajesh; Malik, Sachin

    2014-01-01

    In this paper, we suggest an estimator using two auxiliary variables in stratified random sampling. The proposed estimator has an improvement over the mean per unit estimator as well as some other considered estimators. Expressions for bias and MSE of the estimator are derived up to first degree of approximation. Moreover, these theoretical findings are supported by a numerical example with original data. Key words: Study variable, auxiliary variable, stratified random sampling, bias and mean squa...

  15. Concentrated Hitting Times of Randomized Search Heuristics with Variable Drift

    DEFF Research Database (Denmark)

    Lehre, Per Kristian; Witt, Carsten

    2014-01-01

    Drift analysis is one of the state-of-the-art techniques for the runtime analysis of randomized search heuristics (RSHs) such as evolutionary algorithms (EAs), simulated annealing etc. The vast majority of existing drift theorems yield bounds on the expected value of the hitting time for a target...

  16. Some limit theorems for negatively associated random variables

    Indian Academy of Sciences (India)

    Abstract. Let {Xn, n ≥ 1} be a sequence of negatively associated random variables. The aim of this paper is to establish some limit theorems for negatively associated sequences, which include the Lp-convergence theorem and the Marcinkiewicz–Zygmund strong law of large numbers. Furthermore, we consider the strong law of ...

  17. Local search methods based on variable focusing for random K -satisfiability

    Science.gov (United States)

    Lemoy, Rémi; Alava, Mikko; Aurell, Erik

    2015-01-01

    We introduce variable-focused local search algorithms for satisfiability problems. Usual approaches focus uniformly on unsatisfied clauses. The methods described here work by focusing on random variables in unsatisfied clauses. Variants are considered where variables are selected uniformly and randomly or by introducing a bias towards picking variables participating in several unsatisfied clauses. These are studied in the case of the random 3-SAT problem, together with an alternative energy definition, the number of variables in unsatisfied constraints. The variable-based focused Metropolis search (V-FMS) is found to be quite close in performance to the standard clause-based FMS at optimal noise. At infinite noise, instead, the threshold for the linearity of solution times with instance size is improved by preferentially picking variables in several UNSAT clauses. Consequences for algorithmic design are discussed.
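
    A stripped-down variable-focused search in the spirit of V-FMS (but much simpler than the algorithm studied in the paper) looks roughly as follows: sample a variable occurrence from the unsatisfied clauses, so that variables appearing in several unsatisfied clauses are picked more often, and accept the flip with a Metropolis-style noise rule. The instance size, noise value and acceptance rule are illustrative choices only.

```python
import random

random.seed(0)

def random_3sat(n_vars, n_clauses):
    """Each clause has 3 distinct variables with random signs (+v = v, -v = NOT v)."""
    return [tuple(v if random.random() < 0.5 else -v
                  for v in random.sample(range(1, n_vars + 1), 3))
            for _ in range(n_clauses)]

def unsat(clauses, assign):
    """Clauses not satisfied by the current assignment."""
    return [c for c in clauses if not any((lit > 0) == assign[abs(lit)] for lit in c)]

def variable_focused_search(clauses, n_vars, noise=0.3, max_flips=50_000):
    assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        bad = unsat(clauses, assign)
        if not bad:
            return assign                               # all clauses satisfied
        # Sample a literal occurrence from the unsatisfied clauses: variables that
        # appear in several unsatisfied clauses are proportionally more likely.
        v = abs(random.choice([lit for c in bad for lit in c]))
        assign[v] = not assign[v]                       # tentative flip
        delta = len(unsat(clauses, assign)) - len(bad)
        if delta > 0 and random.random() > noise ** delta:
            assign[v] = not assign[v]                   # reject the uphill move
    return None

clauses = random_3sat(n_vars=50, n_clauses=190)         # ratio 3.8, below the SAT threshold
solution = variable_focused_search(clauses, n_vars=50)
print("satisfied" if solution is not None else "no assignment found")
```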

  18. PaCAL: A Python Package for Arithmetic Computations with Random Variables

    Directory of Open Access Journals (Sweden)

    Marcin Korzeń

    2014-05-01

    Full Text Available In this paper we present PaCAL, a Python package for arithmetical computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed on each step but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements or estimation of error distributions in scientific computations.
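
    The numerical idea behind arithmetic on independent random variables — represent each density on a grid and convolve — can be illustrated without the package. The sketch below uses plain NumPy/SciPy (it is not PaCAL's API) to compute the density of the sum of a uniform and an exponential variable and checks its normalization and mean.

```python
import numpy as np
from scipy.stats import uniform, expon

# Represent each density on a common grid (both supports start at 0).
dx = 0.001
x = np.arange(0.0, 15.0, dx)
f_u = uniform(loc=0, scale=1).pdf(x)          # U ~ Uniform(0, 1)
f_e = expon(scale=2.0).pdf(x)                 # E ~ Exponential with mean 2

# Density of S = U + E for independent U and E: numerical convolution of the PDFs.
f_s = np.convolve(f_u, f_e)[: len(x)] * dx

print("integral of f_S:", round(float((f_s * dx).sum()), 4))      # ~ 1 (small truncation error)
print("mean of S      :", round(float((x * f_s * dx).sum()), 4))  # ~ 0.5 + 2.0 = 2.5
```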

  19. New Results on the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2016-01-06

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented [1].
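
    The single-GG approximation mentioned at the end — replace the sum by one generalized Gaussian whose scale and shape factor are chosen to match the sample variance and kurtosis — is easy to prototype with SciPy's gennorm distribution. This moment-matching sketch uses assumed shape factors and is not one of the three estimators analyzed in the paper.

```python
import numpy as np
from scipy.stats import gennorm, kurtosis
from scipy.special import gamma
from scipy.optimize import brentq

rng = np.random.default_rng(5)

# Two independent generalized Gaussian (GG) variables with shape factors beta1, beta2.
beta1, beta2 = 1.0, 1.5
s = gennorm.rvs(beta1, size=500_000, random_state=rng) + \
    gennorm.rvs(beta2, size=500_000, random_state=rng)

# Pearson kurtosis of a GG with shape b: Gamma(5/b)*Gamma(1/b) / Gamma(3/b)^2.
def gg_kurtosis(b):
    return gamma(5.0 / b) * gamma(1.0 / b) / gamma(3.0 / b) ** 2

# Match the shape factor to the sample kurtosis, then the scale to the variance
# (variance of gennorm(b, scale=a) is a^2 * Gamma(3/b) / Gamma(1/b)).
k_target = kurtosis(s, fisher=False)
beta_hat = brentq(lambda b: gg_kurtosis(b) - k_target, 0.3, 20.0)
alpha_hat = np.sqrt(s.var() * gamma(1.0 / beta_hat) / gamma(3.0 / beta_hat))

approx = gennorm(beta_hat, scale=alpha_hat)
q = [0.5, 0.9, 0.99]
print("beta_hat =", round(beta_hat, 3))
print("empirical quantiles :", np.round(np.quantile(s, q), 3))
print("GG approx quantiles :", np.round(approx.ppf(q), 3))
```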

  20. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  1. New Results On the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2015-01-01

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented.

  2. CONVERGENCE OF THE FRACTIONAL PARTS OF THE RANDOM VARIABLES TO THE TRUNCATED EXPONENTIAL DISTRIBUTION

    National Research Council Canada - National Science Library

    Bogdan Gheorghe Munteanu

    2013-01-01

    Using stochastic approximations, this paper studies the convergence in distribution of the fractional parts of the sum of random variables to the truncated exponential distribution with parameter lambda...

  3. On bounds in Poisson approximation for distributions of independent negative-binomial distributed random variables.

    Science.gov (United States)

    Hung, Tran Loc; Giang, Le Truong

    2016-01-01

    Using the Stein-Chen method some upper bounds in Poisson approximation for distributions of row-wise triangular arrays of independent negative-binomial distributed random variables are established in this note.

  4. Testing of hypothesis of two-dimensional random variables independence on the basis of algorithm of pattern recognition

    Science.gov (United States)

    Lapko, A. V.; Lapko, V. A.; Yuronen, E. A.

    2016-11-01

    A new technique for testing the hypothesis of independence of random variables is proposed. It is based on a nonparametric pattern recognition algorithm. The technique does not require partitioning the range of values of the random variables.

  5. The Sum and Difference of Two Lognormal Random Variables

    Directory of Open Access Journals (Sweden)

    C. F. Lo

    2012-01-01

    Full Text Available We have presented a new unified approach to model the dynamics of both the sum and difference of two correlated lognormal stochastic variables. By the Lie-Trotter operator splitting method, both the sum and difference are shown to follow a shifted lognormal stochastic process, and approximate probability distributions are determined in closed form. Illustrative numerical examples are presented to demonstrate the validity and accuracy of these approximate distributions. In terms of the approximate probability distributions, we have also obtained an analytical series expansion of the exact solutions, which can allow us to improve the approximation in a systematic manner. Moreover, we believe that this new approach can be extended to study both (1) the algebraic sum of N lognormals, and (2) the sum and difference of other correlated stochastic processes, for example, two correlated CEV processes, two correlated CIR processes, and two correlated lognormal processes with mean-reversion.

  6. Separating variability in healthcare practice patterns from random error.

    Science.gov (United States)

    Thomas, Laine E; Schulte, Phillip J

    2018-01-01

    Improving the quality of care that patients receive is a major focus of clinical research, particularly in the setting of cardiovascular hospitalization. Quality improvement studies seek to estimate and visualize the degree of variability in dichotomous treatment patterns and outcomes across different providers, whereby naive techniques either over-estimate or under-estimate the actual degree of variation. Various statistical methods have been proposed for similar applications including (1) the Gaussian hierarchical model, (2) the semi-parametric Bayesian hierarchical model with a Dirichlet process prior and (3) the non-parametric empirical Bayes approach of smoothing by roughening. Alternatively, we propose that a recently developed method for density estimation in the presence of measurement error, moment-adjusted imputation, can be adapted for this problem. The methods are compared by an extensive simulation study. In the present context, we find that the Bayesian methods are sensitive to the choice of prior and tuning parameters, whereas moment-adjusted imputation performs well with modest sample size requirements. The alternative approaches are applied to identify disparities in the receipt of early physician follow-up after myocardial infarction across 225 hospitals in the CRUSADE registry.

  7. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    Science.gov (United States)

    Bancroft, Stacie L.; Bourret, Jason C.

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time.…

  8. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    Science.gov (United States)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions $x_1 = f_1(U_1, \ldots, U_n), \ldots, x_n = f_n(U_1, \ldots, U_n)$ such that if $U_1, \ldots, U_n$ are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables $x_1, \ldots, x_n$ coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
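
    A classical realization of such functions is the conditional inverse-CDF (Rosenblatt-type) factorization, in which x1 is driven by U1 alone and x2 by (U1, U2), a special case of the general f_i(U_1, …, U_n) above. The two-dimensional joint law below is an arbitrary example chosen because both conditional CDFs invert in closed form.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000
u1, u2 = rng.random(n), rng.random(n)     # independent Uniform(0,1) inputs

# Target joint law: X1 ~ Exponential(rate 1), and given X1 = x1,
# X2 ~ Exponential(rate 1 + x1).  Each coordinate is obtained by inverting
# its (conditional) CDF at an independent uniform variate.
x1 = -np.log(1.0 - u1)                    # f1(U1): inverse CDF of Exp(1)
x2 = -np.log(1.0 - u2) / (1.0 + x1)       # f2(U1, U2): inverse conditional CDF

# Check against the analytic conditional mean E[X2 | X1 = x1] = 1/(1 + x1).
mask = x1 < 0.5
print("E[X2 | X1 < 0.5] simulated:", round(float(x2[mask].mean()), 4))
expected = (1.0 / (1.0 + x1[mask])).mean()    # average of the conditional means
print("E[X2 | X1 < 0.5] expected :", round(float(expected), 4))
```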

  9. A unifying approach in simulating the shot peening process using a 3D random representative volume finite element model

    Directory of Open Access Journals (Sweden)

    Dianyin HU

    2017-08-01

    Full Text Available Using a modified 3D random representative volume (RV) finite element model, the effects of model dimensions (impact region and interval between impact and representative regions), model shapes (rectangular, square, and circular), and peening-induced thermal softening on resultant critical quantities (residual stress, Almen intensity, coverage, and arc height) after shot peening are systematically examined. A new quantity, i.e., the interval between impact and representative regions, is introduced and its optimal value is first determined to eliminate any boundary effect on shot peening results. Then, model dimensions are respectively assessed for all model shapes to reflect the actual shot peening process, based on which shape-independent critical shot peening quantities are obtained. Further, it is found that thermal softening of the target material due to shot peening leads to variances of the surface residual stress and arc height, demonstrating the necessity of considering the thermal effect in a constitutive material model of shot peening. Our study clarifies some of the finite element modeling aspects and lays the groundwork for accurate modeling of the SP process.

  10. Possibility/Necessity-Based Probabilistic Expectation Models for Linear Programming Problems with Discrete Fuzzy Random Variables

    Directory of Open Access Journals (Sweden)

    Hideki Katagiri

    2017-10-01

    Full Text Available This paper considers linear programming problems (LPPs) where the objective functions involve discrete fuzzy random variables (fuzzy set-valued discrete random variables). New decision making models, which are useful in fuzzy stochastic environments, are proposed based on both possibility theory and probability theory. In multi-objective cases, Pareto optimal solutions of the proposed models are newly defined. Computational algorithms for obtaining the Pareto optimal solutions of the proposed models are provided. It is shown that problems involving discrete fuzzy random variables can be transformed into deterministic nonlinear mathematical programming problems which can be solved through a conventional mathematical programming solver under practically reasonable assumptions. A numerical example of agriculture production problems is given to demonstrate the applicability of the proposed models to real-world problems in fuzzy stochastic environments.

  11. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    Science.gov (United States)

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Physical Activity, Mindfulness Meditation, or Heart Rate Variability Biofeedback for Stress Reduction: A Randomized Controlled Trial

    OpenAIRE

    van der Zwan, Judith Esi; de Vente, Wieke; Huizink, Anja C.; Bögels, Susan M.; de Bruin, Esther I.

    2015-01-01

    In contemporary western societies stress is highly prevalent, therefore the need for stress-reducing methods is great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing stress and its related symptoms. We randomly allocated 126 participants to PA, MM, or HRV-BF upon enrollment, of whom 76 agreed to participate. The interventions consisted of psycho-education and a...

  13. APPROXIMATION TO OPTIMAL STOPPING RULES FOR GUMBEL RANDOM VARIABLES WITH UNKNOWN LOCATION AND SCALE PARAMETERS

    OpenAIRE

    Yeh, Tzu-Sheng; Lee, Shen-Ming

    2006-01-01

    An optimal stopping rule is a rule that stops the sampling process at a sample size n that maximizes the expected reward. In this paper we will study the approximation to optimal stopping rule for Gumbel random variables, because the Gumbel-type distribution is the most commonly referred to in discussions of extreme values. Let $X_1, X_2,\\cdots X_n,\\cdots$ be independent, identically distributed Gumbel random variables with unknown location and scale parameters,$\\alpha$ and $\\beta$. If we def...

  14. $\Phi$-moment inequalities for independent and freely independent random variables

    OpenAIRE

    Jiao, Yong; Sukochev, Fedor; Xie, Guangheng; Zanin, Dmitriy

    2016-01-01

    This paper is devoted to the study of $\Phi$-moments of sums of independent/freely independent random variables. More precisely, let $(f_k)_{k=1}^n$ be a sequence of positive (symmetrically distributed) independent random variables and let $\Phi$ be an Orlicz function with $\Delta_2$-condition. We provide an equivalent expression for the quantity $\mathbb{E}(\Phi(\sum_{k=1}^n f_k))$ in terms of the sum of disjoint copies of the sequence $(f_k)_{k=1}^n.$ We also prove an analogous result in the...

  15. Higher order moments of a sum of random variables: remarks and applications.

    Directory of Open Access Journals (Sweden)

    Luisa Tibiletti

    1996-02-01

    Full Text Available The moments of a sum of random variables depend on both the pure moments of each random addendum and on the addendum mixed moments. In this note we introduce a simple measure to evaluate the relative importance to attach to the latter. Once the pure moments are fixed, the functional relation between the random addenda leading to the extreme values is also provided. Applications to Finance, Decision Theory and Actuarial Sciences are also suggested.
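
    The dependence on mixed moments is already visible at third order: E[(X+Y)^3] = E[X^3] + 3E[X^2·Y] + 3E[X·Y^2] + E[Y^3], so two pairs with identical pure moments but different mixed moments yield different third moments of the sum. The small discrete example below merely verifies the expansion exactly; it does not implement the measure proposed in the note.

```python
import numpy as np

# A small joint probability mass function on pairs (x, y).
xs = np.array([0.0, 1.0, 2.0])
ys = np.array([0.0, 1.0, 2.0])
P = np.array([[0.10, 0.05, 0.05],
              [0.05, 0.30, 0.10],
              [0.05, 0.10, 0.20]])        # rows: x, columns: y; entries sum to 1

X, Y = np.meshgrid(xs, ys, indexing="ij")

def E(expr):
    """Exact expectation of an array-valued expression under the joint pmf."""
    return float((expr * P).sum())

lhs = E((X + Y) ** 3)
rhs = E(X**3) + 3 * E(X**2 * Y) + 3 * E(X * Y**2) + E(Y**3)
print(round(lhs, 6), "=", round(rhs, 6))   # pure moments alone would miss the mixed terms
```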

  16. An analysis of noise reduction in variable reluctance motors using pulse position randomization

    Science.gov (United States)

    Smoot, Melissa C.

    1994-05-01

    The design and implementation of a control system to introduce randomization into the control of a variable reluctance motor (VRM) is presented. The goal is to reduce noise generated by radial vibrations of the stator. Motor phase commutation angles are dithered by 1 or 2 mechanical degrees to investigate the effect of randomization on acoustic noise. VRM commutation points are varied using a uniform probability density function and a 4 state Markov chain among other methods. The theory of VRM and inverter operation and a derivation of the major source of acoustic noise are developed. The experimental results show the effects of randomization. Uniform dithering and Markov chain dithering both tend to spread the noise spectrum, reducing peak noise components. No clear evidence is found to determine which is the optimum randomization scheme. The benefit of commutation angle randomization in reducing VRM loudness as perceived by humans is found to be questionable.

  17. Saddlepoint approximations for the sum of independent non-identically distributed binomial random variables

    NARCIS (Netherlands)

    Eisinga, R.N.; Grotenhuis, H.F. te; Pelzer, B.J.

    2013-01-01

    We discuss saddlepoint approximations to the distribution of the sum of independent non-identically distributed binomial random variables. We examine the accuracy of the saddlepoint methods for a sum of 10 binomials with different sets of parameter values. The numerical results indicate that the
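
    A first-order saddlepoint approximation for this sum is short enough to sketch: with cumulant generating function K(t) = Σ n_i·log(1 − p_i + p_i·e^t), solve K′(ŝ) = x and use f(x) ≈ exp(K(ŝ) − ŝ·x) / sqrt(2π·K″(ŝ)). The parameter values below are arbitrary, and refinements such as lattice continuity corrections and renormalization are omitted.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import binom

n = np.array([5, 12, 8])          # arbitrary example: three binomial components
p = np.array([0.1, 0.5, 0.8])

def K(t):   return np.sum(n * np.log(1 - p + p * np.exp(t)))
def K1(t):  return np.sum(n * p * np.exp(t) / (1 - p + p * np.exp(t)))
def K2(t):  return np.sum(n * p * (1 - p) * np.exp(t) / (1 - p + p * np.exp(t)) ** 2)

def saddlepoint_pmf(x):
    s = brentq(lambda t: K1(t) - x, -40.0, 40.0)          # saddlepoint: K'(s) = x
    return np.exp(K(s) - s * x) / np.sqrt(2 * np.pi * K2(s))

# Exact pmf of the sum by convolving the component binomial pmfs.
exact = np.array([1.0])
for ni, pi in zip(n, p):
    exact = np.convolve(exact, binom.pmf(np.arange(ni + 1), ni, pi))

for x in (3, 8, 13, 18, 22):
    print(x, round(saddlepoint_pmf(x), 6), round(float(exact[x]), 6))
```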

  18. Physical activity, mindfulness meditation, or heart rate variability biofeedback for stress reduction: a randomized controlled trial

    NARCIS (Netherlands)

    van der Zwan, J.E.; de Vente, W.; Huizink, A.C.; Bögels, S.M.; de Bruin, E.I.

    2015-01-01

    In contemporary western societies stress is highly prevalent, therefore the need for stress-reducing methods is great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing

  19. Some Generalized Inequalities Involving Local Fractional Integrals and their Applications for Random Variables and Numerical Integration

    Directory of Open Access Journals (Sweden)

    Erden S.

    2016-12-01

    Full Text Available We establish generalized pre-Grüss inequality for local fractional integrals. Then, we obtain some inequalities involving generalized expectation, p−moment, variance and cumulative distribution function of random variable whose probability density function is bounded. Finally, some applications for generalized Ostrowski-Grüss inequality in numerical integration are given.

  20. Bounds for right tails of deterministic and stochastic sums of random variables

    NARCIS (Netherlands)

    Darkiewicz, G.; Deelstra, G.; Dhaene, J.; Hoedemakers, T.; Vanmaele, M.

    2009-01-01

    We investigate lower and upper bounds for right tails (stop-loss premiums) of deterministic and stochastic sums of nonindependent random variables. The bounds are derived using the concepts of comonotonicity, convex order, and conditioning. The performance of the presented approximations is

  1. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa.

    Science.gov (United States)

    Kapwata, Thandi; Gebreslasie, Michael T

    2016-11-16

    Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.
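
    The percent-increase-in-MSE ranking corresponds to permutation importance for a regression random forest, which is easy to reproduce with scikit-learn. The covariates below only mimic the study's variable names on synthetic data; they are not the Mpumalanga data, and the train/test split and forest settings are arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 2000
names = ["altitude", "NDVI", "temperature", "lag_rainfall", "rainfall", "humidity", "lag_temp"]
X = rng.standard_normal((n, len(names)))

# Synthetic incidence response driven mainly by the first three covariates.
y = -1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.7 * X[:, 2] + 0.3 * rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Permutation importance: increase in MSE when each predictor is shuffled.
imp = permutation_importance(rf, X_te, y_te, scoring="neg_mean_squared_error",
                             n_repeats=20, random_state=0)
for i in np.argsort(imp.importances_mean)[::-1]:
    print(f"{names[i]:<13} {imp.importances_mean[i]:.3f}")
```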

  2. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.

  3. Functional interpretation of representative soil spatial-temporal variability at the Central region of European territory of Russia

    Science.gov (United States)

    Vasenev, I.

    2012-04-01

    The essential spatial and temporal variability is mutual feature for most natural and man-changed soils at the Central region of European territory of Russia. The original spatial heterogeneity of forest and forest-steppe soils has been further complicated by a specific land-use history and different-direction soil successions due to environmental changes and human impacts. For demand-driven land-use planning and decision making the quantitative analysis, modeling and functional-ecological interpretation of representative soil cover patterns spatial variability is an important and challenging task that receives increasing attention from scientific society, private companies, governmental and environmental bodies. On basis of long-term different-scale soil mapping, key plot investigation, land quality and land-use evaluation, soil forming and degradation processes modeling, functional-ecological typology of the zonal set of elementary soil cover patterns (ESCP) has been done in representative natural and man transformed ecosystems of the forest, forest-steppe and steppe zones at the Central region of European territory of Russia (ETR). The validation and ranging of the limiting factors of functional quality and ecological state have been made for dominating and most dynamical components of ESCP regional-typological forms - with application of local GIS, traditional regression kriging and correlation tree models. Development, zonal-regional differentiation and verification of the basic set of criteria and algorithms for logically formalized distinguishing of the most "stable" & "hot" areas in soil cover patterns make it possible for quantitative assessment of dominating in them elementary landscape, soil-forming and degradation processes. The received data essentially expand known ranges of the soil forming processes (SFP) rate «in situ». In case of mature forests mutual for them the windthrow impacts and lateral processes make SFPs more active and complex both in

  4. Partial summations of stationary sequences of non-Gaussian random variables

    DEFF Research Database (Denmark)

    Mohr, Gunnar; Ditlevsen, Ove Dalager

    1996-01-01

    The distribution of the sum of a finite number of identically distributed random variables is in many cases easily determined given that the variables are independent. The moments of any order of the sum can always be expressed by the moments of the single term without computational problems. However, in the case of dependency between the terms even calculation of a few of the first moments of the sum presents serious computational problems. By use of computerized symbol manipulations it is practicable to obtain exact moments of partial sums of stationary sequences of mutually dependent ... of convergence of the distribution of a sum (or an integral) of mutually dependent random variables to the Gaussian distribution. The paper is closely related to the work in Ditlevsen et al. [Ditlevsen, O., Mohr, G. & Hoffmeyer, P. Integration of non-Gaussian fields. Prob. Engng Mech 11 (1996) 15-23].

  5. Gradual stiffness versus magnetic imaging-guided variable stiffness colonoscopes: A randomized noninferiority trial.

    Science.gov (United States)

    Garborg, Kjetil; Wiig, Håvard; Hasund, Audun; Matre, Jon; Holme, Øyvind; Noraberg, Geir; Løberg, Magnus; Kalager, Mette; Adami, Hans-Olov; Bretthauer, Michael

    2017-02-01

    Colonoscopes with gradual stiffness have recently been developed to enhance cecal intubation. We aimed to determine if the performance of gradual stiffness colonoscopes is noninferior to that of magnetic endoscopic imaging (MEI)-guided variable stiffness colonoscopes. Consecutive patients were randomized to screening colonoscopy with Fujifilm gradual stiffness or Olympus MEI-guided variable stiffness colonoscopes. The primary endpoint was cecal intubation rate (noninferiority limit 5%). Secondary endpoints included cecal intubation time. We estimated absolute risk differences with 95% confidence intervals (CIs). We enrolled 475 patients: 222 randomized to the gradual stiffness instrument, and 253 to the MEI-guided variable stiffness instrument. Cecal intubation rate was 91.7% in the gradual stiffness group versus 95.6% in the variable stiffness group. The adjusted absolute risk for cecal intubation failure was 4.3% higher in the gradual stiffness group than in the variable stiffness group (upper CI border 8.1%). Median cecal intubation time was 13 minutes in the gradual stiffness group and 10 minutes in the variable stiffness group (p < 0.001). The study is inconclusive with regard to noninferiority because the 95% CI for the difference in cecal intubation rate between the groups crosses the noninferiority margin. (ClinicalTrials.gov identifier: NCT01895504).

  6. Effects of Yoga on Heart Rate Variability and Mood in Women: A Randomized Controlled Trial.

    Science.gov (United States)

    Chu, I-Hua; Lin, Yuh-Jen; Wu, Wen-Lan; Chang, Yu-Kai; Lin, I-Mei

    2015-12-01

    To examine the effects of an 8-week yoga program on heart rate variability and mood in generally healthy women. Randomized controlled trial. Fifty-two healthy women were randomly assigned to a yoga group or a control group. Participants in the yoga group completed an 8-week yoga program, which comprised a 60-minute session twice a week. Each session consisted of breathing exercises, yoga pose practice, and supine meditation/relaxation. The control group was instructed not to engage in any yoga practice and to maintain their usual level of physical activity during the study. Participants' heart rate variability, perceived stress, depressive symptoms, and state and trait anxiety were assessed at baseline (week 0) and after the intervention (week 9). No measures of heart rate variability changed significantly in either the yoga or control group after intervention. State anxiety was reduced significantly in the yoga group but not in the control group. No significant changes were noted in perceived stress, depression, or trait anxiety in either group. An 8-week yoga program was not sufficient to improve heart rate variability. However, such a program appears to be effective in reducing state anxiety in generally healthy women. Future research should involve longer periods of yoga training, include heart rate variability measures both at rest and during yoga practice, and enroll women with higher levels of stress and trait anxiety.

  7. Random and systematic spatial variability of 137Cs inventories at reference sites in South-Central Brazil

    Directory of Open Access Journals (Sweden)

    Correchel Vladia

    2005-01-01

    Full Text Available The precision of the 137Cs fallout redistribution technique for the evaluation of soil erosion rates is strongly dependent on the quality of an average inventory taken at a representative reference site. The knowledge of the sources and of the degree of variation of the 137Cs fallout spatial distribution plays an important role on its use. Four reference sites were selected in the South-Central region of Brazil which were characterized in terms of soil chemical, physical and mineralogical aspects as well as the spatial variability of 137Cs inventories. Some important differences in the patterns of 137Cs depth distribution in the soil profiles of the different sites were found. They are probably associated to chemical, physical, mineralogical and biological differences of the soils but many questions still remain open for future investigation, mainly those regarding the adsorption and dynamics of the 137Cs ions in soil profiles under tropical conditions. The random spatial variability (inside each reference site) was higher than the systematic spatial variability (between reference sites), but their causes were not clearly identified as possible consequences of chemical, physical, mineralogical variability, and/or precipitation.

  8. Robust design with imprecise random variables and its application in hydrokinetic turbine optimization

    Science.gov (United States)

    Hu, Zhen; Du, Xiaoping; Kolekar, Nitin S.; Banerjee, Arindam

    2014-03-01

    In robust design, uncertainty is commonly modelled with precise probability distributions. In reality, the distribution types and distribution parameters may not always be available owing to limited data. This research develops a robust design methodology to accommodate the mixture of both precise and imprecise random variables. By incorporating the Taguchi quality loss function and the minimax regret criterion, the methodology mitigates the effects of not only uncertain parameters but also uncertainties in the models of the uncertain parameters. Hydrokinetic turbine systems are a relatively new alternative energy technology, and both precise and imprecise random variables exist in the design of such systems. The developed methodology is applied to the robust design optimization of a hydrokinetic turbine system. The results demonstrate the effectiveness of the proposed methodology.

  9. Multi-Variable, Multi-Layer Graphical Knowledge Unit for Storing and Representing Density Clusters of Multi-Dimensional Big Data

    Directory of Open Access Journals (Sweden)

    K. K. L. B. Adikaram

    2016-04-01

    Full Text Available A multi-variable visualization technique on a 2D bitmap for big data is introduced. If A and B are two data points that are represented using two similar shapes with m pixels, where each shape is colored with the RGB color (0, 0, k), then when A ∩ B ≠ ɸ, adding the colors of A ∩ B gives a higher color value (0, 0, 2k) and highlights the overlap as a high-density cluster, where RGB stands for Red, Green, Blue and k is the blue component. This is the hypothesis behind the single-variable graphical knowledge unit (GKU), which uses the entire bit range of a pixel for a single variable. Instead, the available bit range of a pixel can be split, so that a pixel can be used for representing multiple variables (multi-variables). However, this will limit the bit block for single variables and limit the amount of overlapping. Using k (>1) same-size bitmaps (multi-layers) will increase the number of bits per variable (BPV), where each (x, y) of an individual layer represents the same data point. Then, one pixel in a four-layer GKU is capable of showing more than four billion overlapping ones when BPV = 8 bits (2^(BPV × number of layers)). Then, the 32-bit pixel format allows the representation of a maximum of up to four dependent variables against one independent variable. Then, a four-layer GKU of width w and height h has the capacity of representing a maximum of 2^(BPV × number of layers) × m × w × h overlapping occurrences.

  10. Modified Exponential Type Estimator for Population Mean Using Auxiliary Variables in Stratified Random Sampling

    OpenAIRE

    Özel, Gamze

    2015-01-01

    In this paper, a new exponential type estimator is developed in the stratified random sampling for the population mean using auxiliary variable information. In order to evaluate efficiency of the introduced estimator, we first review some estimators and study the optimum property of the suggested strategy. To judge the merits of the suggested class of estimators over others under the optimal condition, simulation study and real data applications are conducted. The results show that the introduc...

  11. Equivalent conditions of complete moment convergence for extended negatively dependent random variables

    Directory of Open Access Journals (Sweden)

    Qunying Wu

    2017-05-01

    Full Text Available In this paper, we study the equivalent conditions of complete moment convergence for sequences of identically distributed extended negatively dependent random variables. As a result, we extend and generalize some results of complete moment convergence obtained by Chow (Bull. Inst. Math. Acad. Sin. 16:177-201, 1988) and Li and Spătaru (J. Theor. Probab. 18:933-947, 2005) from the i.i.d. case to extended negatively dependent sequences.

  12. An edgeworth expansion for a sum of M-Dependent random variables

    Directory of Open Access Journals (Sweden)

    Wan Soo Rhee

    1985-01-01

    Full Text Available Given a sequence X1, X2, …, Xn of m-dependent random variables with moments of order 3 + α (0 < α ≤ 1), we give an Edgeworth expansion of the distribution of Sσ⁻¹, where S = X1 + X2 + … + Xn and σ² = ES², under the assumption that E[exp(itSσ⁻¹)] is small away from the origin. The result is of the best possible order.

  13. Residual and Past Entropy for Concomitants of Ordered Random Variables of Morgenstern Family

    Directory of Open Access Journals (Sweden)

    M. M. Mohie EL-Din

    2015-01-01

    Full Text Available For a system observed at time t, the residual and past entropies measure the uncertainty about the remaining and the past life of the distribution, respectively. In this paper, we present the residual and past entropy of the Morgenstern family based on the concomitants of different types of generalized order statistics (gos) and give the linear transformation of such a model. Characterization results for these dynamic entropies for concomitants of ordered random variables are also considered.

  14. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
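
    The backward-elimination scheme discussed above can be sketched as follows. This is a generic illustration on synthetic data; the dataset, tree count and 10% drop rate are assumptions rather than the paper's settings. Note that, as the abstract warns, OOB accuracies recorded inside the elimination loop are themselves prone to selection bias, so an external cross-validation loop would be needed for honest error estimates.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-in for the stream-condition data: 1365 sites, 212 candidate predictors.
    X, y = make_classification(n_samples=1365, n_features=212, n_informative=15, random_state=0)
    cols = list(range(X.shape[1]))

    def fit_rf(columns):
        rf = RandomForestClassifier(n_estimators=200, oob_score=True, n_jobs=-1, random_state=0)
        rf.fit(X[:, columns], y)
        return rf

    history = []
    while len(cols) > 5:
        rf = fit_rf(cols)
        history.append((len(cols), rf.oob_score_))        # internal OOB accuracy at this model size
        n_drop = max(1, int(0.1 * len(cols)))             # drop the least important 10% each round
        keep = np.argsort(rf.feature_importances_)[n_drop:]
        cols = [cols[i] for i in keep]

    for size, oob in history:
        print(f"{size:4d} predictors  OOB accuracy = {oob:.3f}")
    ```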

  15. The effect of cluster size variability on statistical power in cluster-randomized trials.

    Directory of Open Access Journals (Sweden)

    Stephen A Lauer

    Full Text Available The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components to designing a successful CRT is calculating the proper sample size (i.e., the number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power.
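
    A minimal version of such a simulation is sketched below. All numbers (ICC, effect size, cluster counts) are placeholders, and the analysis model, an unweighted cluster-level t-test, is an assumption rather than the method used in the paper; the point is only to show how increasing the coefficient of variation of cluster size feeds through to estimated power.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def simulate_power(n_clusters=30, mean_size=50, cv_size=0.0, icc=0.05,
                       effect=0.25, sims=1000, alpha=0.05):
        """Estimate power of a two-arm CRT with a continuous outcome (total variance 1),
        analysed here with an unweighted cluster-level t-test; cv_size is the coefficient
        of variation of cluster size (0 means every cluster has exactly mean_size members)."""
        sig_b, sig_w = np.sqrt(icc), np.sqrt(1 - icc)
        hits = 0
        for _ in range(sims):
            if cv_size > 0:   # gamma-distributed sizes with the requested mean and CV
                sizes = np.maximum(2, rng.gamma((1 / cv_size) ** 2,
                                                mean_size * cv_size ** 2,
                                                n_clusters).astype(int))
            else:
                sizes = np.full(n_clusters, mean_size)
            arm = rng.permutation(np.repeat([0, 1], n_clusters // 2))
            means = np.array([(effect * arm[k] + rng.normal(0, sig_b)
                               + rng.normal(0, sig_w, sizes[k])).mean()
                              for k in range(n_clusters)])
            _, p = stats.ttest_ind(means[arm == 1], means[arm == 0])
            hits += p < alpha
        return hits / sims

    for cv in (0.0, 0.4, 0.8):   # increasing cluster-size variability
        print(f"CV of cluster size = {cv:.1f}  estimated power = {simulate_power(cv_size=cv):.2f}")
    ```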

  16. Network Mendelian randomization: using genetic variants as instrumental variables to investigate mediation in causal pathways.

    Science.gov (United States)

    Burgess, Stephen; Daniel, Rhian M; Butterworth, Adam S; Thompson, Simon G

    2015-04-01

    Mendelian randomization uses genetic variants, assumed to be instrumental variables for a particular exposure, to estimate the causal effect of that exposure on an outcome. If the instrumental variable criteria are satisfied, the resulting estimator is consistent even in the presence of unmeasured confounding and reverse causation. We extend the Mendelian randomization paradigm to investigate more complex networks of relationships between variables, in particular where some of the effect of an exposure on the outcome may operate through an intermediate variable (a mediator). If instrumental variables for the exposure and mediator are available, direct and indirect effects of the exposure on the outcome can be estimated, for example using either a regression-based method or structural equation models. The direction of effect between the exposure and a possible mediator can also be assessed. Methods are illustrated in an applied example considering causal relationships between body mass index, C-reactive protein and uric acid. These estimators are consistent in the presence of unmeasured confounding if, in addition to the instrumental variable assumptions, the effects of both the exposure on the mediator and the mediator on the outcome are homogeneous across individuals and linear without interactions. Nevertheless, a simulation study demonstrates that even considerable heterogeneity in these effects does not lead to bias in the estimates. These methods can be used to estimate direct and indirect causal effects in a mediation setting, and have potential for the investigation of more complex networks between multiple interrelated exposures and disease outcomes. © The Author 2014. Published by Oxford University Press on behalf of the International Epidemiological Association.
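
    The record does not include the paper's estimators, but the regression-based, two-stage idea can be illustrated on simulated data. Everything below (effect sizes, instrument strengths, the simple two-stage least squares helper) is a toy construction: instruments for the exposure and for the mediator are used to recover total, direct and indirect effects in a linear setting with an unmeasured confounder.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 20000
    g_x = rng.binomial(2, 0.3, n)          # instrument for the exposure (e.g. a BMI variant)
    g_m = rng.binomial(2, 0.3, n)          # instrument for the mediator (e.g. a CRP variant)
    u = rng.normal(size=n)                 # unmeasured confounder

    x = 0.5 * g_x + u + rng.normal(size=n)            # exposure
    m = 0.4 * x + 0.6 * g_m + u + rng.normal(size=n)  # mediator
    y = 0.3 * x + 0.5 * m + u + rng.normal(size=n)    # outcome; true direct 0.3, indirect 0.4*0.5

    def tsls(outcome, endog, instruments):
        """Two-stage least squares: replace endogenous regressors by first-stage fitted values."""
        Z = sm.add_constant(np.column_stack(instruments))
        fitted = [sm.OLS(e, Z).fit().fittedvalues for e in endog]
        X = sm.add_constant(np.column_stack(fitted))
        return sm.OLS(outcome, X).fit().params[1:]

    total = tsls(y, [x], [g_x])[0]                   # total causal effect of x on y
    x_on_m = tsls(m, [x], [g_x])[0]                  # causal effect of x on the mediator
    direct, m_on_y = tsls(y, [x, m], [g_x, g_m])     # direct effects of x and m on y
    print(f"total {total:.2f}  direct {direct:.2f}  indirect {x_on_m * m_on_y:.2f}")
    ```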

  17. Boundary-layer turbulent processes and mesoscale variability represented by numerical weather prediction models during the BLLAST campaign

    Science.gov (United States)

    Couvreux, Fleur; Bazile, Eric; Canut, Guylaine; Seity, Yann; Lothon, Marie; Lohou, Fabienne; Guichard, Françoise; Nilsson, Erik

    2016-07-01

    This study evaluates the ability of three operational models, with resolution varying from 2.5 to 16 km, to predict the boundary-layer turbulent processes and mesoscale variability observed during the Boundary Layer Late-Afternoon and Sunset Turbulence (BLLAST) field campaign. We analyse the representation of the vertical profiles of temperature and humidity and the time evolution of near-surface atmospheric variables and the radiative and turbulent fluxes over a total of 12 intensive observing periods (IOPs), each lasting 24 h. Special attention is paid to the evolution of the turbulent kinetic energy (TKE), which was sampled by a combination of independent instruments. For the first time, this variable, a central one in the turbulence scheme used in AROME and ARPEGE, is evaluated with observations. In general, the 24 h forecasts succeed in reproducing the variability from one day to another in terms of cloud cover, temperature and boundary-layer depth. However, they exhibit some systematic biases, in particular a cold bias within the daytime boundary layer for all models. An overestimation of the sensible heat flux is noted for two points in ARPEGE and is found to be partly related to an inaccurate simplification of surface characteristics. AROME shows a moist bias within the daytime boundary layer, which is consistent with overestimated latent heat fluxes. ECMWF presents a dry bias at 2 m above the surface and also overestimates the sensible heat flux. The high-resolution model AROME resolves the vertical structures better, in particular the strong daytime inversion and the thin evening stable boundary layer. This model is also able to capture some specific observed features, such as the orographically driven subsidence and a well-defined maximum of the water vapour mixing ratio that arises during the evening in the upper part of the residual layer due to fine-scale advection. The model reproduces the order of magnitude of spatial variability observed at

  18. Estimating Marginal Healthcare Costs Using Genetic Variants as Instrumental Variables: Mendelian Randomization in Economic Evaluation.

    Science.gov (United States)

    Dixon, Padraig; Davey Smith, George; von Hinke, Stephanie; Davies, Neil M; Hollingworth, William

    2016-11-01

    Accurate measurement of the marginal healthcare costs associated with different diseases and health conditions is important, especially for increasingly prevalent conditions such as obesity. However, existing observational study designs cannot identify the causal impact of disease on healthcare costs. This paper explores the possibilities for causal inference offered by Mendelian randomization, a form of instrumental variable analysis that uses genetic variation as a proxy for modifiable risk exposures, to estimate the effect of health conditions on cost. Well-conducted genome-wide association studies provide robust evidence of the associations of genetic variants with health conditions or disease risk factors. The subsequent causal effects of these health conditions on cost can be estimated using genetic variants as instruments for the health conditions. This is because the approximately random allocation of genotypes at conception means that many genetic variants are orthogonal to observable and unobservable confounders. Datasets with linked genotypic and resource use information obtained from electronic medical records or from routinely collected administrative data are now becoming available and will facilitate this form of analysis. We describe some of the methodological issues that arise in this type of analysis, which we illustrate by considering how Mendelian randomization could be used to estimate the causal impact of obesity, a complex trait, on healthcare costs. We describe some of the data sources that could be used for this type of analysis. We conclude by considering the challenges and opportunities offered by Mendelian randomization for economic evaluation.

  19. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus the number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of the response, and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
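
    The report's own class of methods is not spelled out in this record; as one generic example of the kind of conservative estimate it studies, a normal tolerance interval (Howe's approximation) bounds the central 95% of a response from a handful of samples with a stated confidence. The sample size, coverage and confidence below are arbitrary, and for strongly skewed or heavy-tailed sources, precisely the situation the report worries about, a normal-theory interval may still under-cover.

    ```python
    import numpy as np
    from scipy import stats

    def tolerance_interval(x, coverage=0.95, confidence=0.90):
        """Two-sided normal tolerance interval (Howe's approximation): with the stated
        confidence, the interval covers at least `coverage` of the sampled population."""
        n = len(x)
        z = stats.norm.ppf((1 + coverage) / 2)
        chi2 = stats.chi2.ppf(1 - confidence, n - 1)
        k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
        m, s = x.mean(), x.std(ddof=1)
        return m - k * s, m + k * s

    rng = np.random.default_rng(3)
    samples = rng.lognormal(mean=1.0, sigma=0.5, size=6)   # only six replicate observations
    print(tolerance_interval(samples))                     # conservative bound on the central 95%
    ```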

  20. Continuous-time random-walk model of transport in variably saturated heterogeneous porous media.

    Science.gov (United States)

    Zoia, Andrea; Néel, Marie-Christine; Cortis, Andrea

    2010-03-01

    We propose a unified physical framework for transport in variably saturated porous media. This approach allows fluid flow and solute migration to be treated as ensemble averages of fluid and solute particles, respectively. We consider the cases of homogeneous and heterogeneous porous materials. Within a fractal mobile-immobile continuous time random-walk framework, the heterogeneity will be characterized by algebraically decaying particle retention times. We derive the corresponding (nonlinear) continuum-limit partial differential equations and we compare their solutions to Monte Carlo simulation results. The proposed methodology is fairly general and can be used to track fluid and solute particle trajectories for a variety of initial and boundary conditions.
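
    A bare-bones Monte Carlo version of a continuous-time random walk with algebraically decaying retention times is sketched below; the waiting-time exponent, jump distribution and particle counts are illustrative choices, not the paper's fractal mobile-immobile parameterization.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def ctrw_positions(n_particles=5000, t_max=50.0, beta=0.7, jump_std=1.0):
        """Continuous-time random walk with Pareto-type (algebraically decaying, exponent beta)
        waiting times and Gaussian jumps; returns particle positions at time t_max."""
        pos = np.zeros(n_particles)
        t = np.zeros(n_particles)
        active = np.ones(n_particles, dtype=bool)
        while active.any():
            wait = rng.random(active.sum()) ** (-1.0 / beta) - 1.0   # P(W > w) = (1 + w)**(-beta)
            t[active] += wait
            still = t[active] <= t_max                 # particles whose next jump occurs before t_max
            idx = np.flatnonzero(active)
            pos[idx[still]] += rng.normal(0.0, jump_std, still.sum())
            active[idx[~still]] = False
        return pos

    x = ctrw_positions()
    print("plume spread (mean squared displacement) at t_max:", np.mean(x ** 2))
    ```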

  1. Spatial versus day-to-day within-lake variability in tropical floodplain lake CH4 emissions--developing optimized approaches to representative flux measurements.

    Directory of Open Access Journals (Sweden)

    Roberta B Peixoto

    Full Text Available Inland waters (lakes, rivers and reservoirs) are now understood to contribute large amounts of methane (CH4) to the atmosphere. However, fluxes are poorly constrained and there is a need for improved knowledge of spatiotemporal variability and of ways to optimize sampling efforts to yield representative emission estimates for different types of aquatic ecosystems. Low-latitude floodplain lakes and wetlands are among the highest-emitting environments, and here we provide a detailed investigation of spatial and day-to-day variability in a shallow floodplain lake in the Pantanal in Brazil over a five-day period. CH4 flux was dominated by frequent and ubiquitous ebullition. A strong but predictable spatial variability (decreasing flux with increasing distance to the shore or to littoral vegetation) was found, and this pattern can be addressed by sampling along transects from the shore to the center. Although no distinct day-to-day variability was found, a significant increase in flux was identified from measurement day 1 to measurement day 5, which was likely attributable to a simultaneous increase in temperature. Our study demonstrates that representative emission assessments require consideration of spatial variability, but also that spatial variability patterns are predictable for lakes of this type and may therefore be addressed through limited sampling efforts if designed properly (e.g., fewer chambers may be used if organized along transects). Such optimized assessments of spatial variability are beneficial by allowing more of the available sampling resources to focus on assessing temporal variability, thereby improving overall flux assessments.

  2. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, considerable work has recently been developed for consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
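
    The paper's generalized least squares estimators are not reproduced here, but the underlying concordance idea, that per-variant effects on the outcome should be proportional to per-variant effects on the exposure when the exposure is causal, can be sketched with an unweighted stand-in on simulated data. All effect sizes and the logistic outcome model below are made up.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n, J = 20000, 10
    G = rng.binomial(2, 0.3, size=(n, J))                   # J genetic variants used as instruments
    u = rng.normal(size=n)                                  # unmeasured confounder
    x = G @ rng.uniform(0.05, 0.15, J) + u + rng.normal(size=n)   # continuous exposure
    y = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.4 * x + u))))    # binary outcome (logistic model)

    gamma, Gamma = [], []
    for j in range(J):
        Zj = sm.add_constant(G[:, j])
        gamma.append(sm.OLS(x, Zj).fit().params[1])               # variant effect on the exposure
        Gamma.append(sm.Logit(y, Zj).fit(disp=0).params[1])       # variant effect on the outcome
    gamma, Gamma = np.asarray(gamma), np.asarray(Gamma)

    # Concordance check: outcome effects should be proportional to exposure effects if x is causal.
    fit = sm.OLS(Gamma, gamma).fit()       # slope through the origin estimates the causal log OR
    print("slope", fit.params[0], "p-value", fit.pvalues[0])
    ```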

  3. On the Distribution of Indefinite Quadratic Forms in Gaussian Random Variables

    KAUST Repository

    Al-Naffouri, Tareq Y.

    2015-10-30

    © 2015 IEEE. In this work, we propose a unified approach to evaluating the CDF and PDF of indefinite quadratic forms in Gaussian random variables. Such a quantity appears in many applications in communications, signal processing, information theory, and adaptive filtering. For example, it appears in the mean-square-error (MSE) analysis of the normalized least-mean-square (NLMS) adaptive algorithm, and in the SINR associated with each beam in beamforming applications. The trick of the proposed approach is to replace the inequalities that appear in the CDF calculation with unit step functions and to use a complex integral representation of the unit step function. Complex integration then allows us to evaluate the CDF in closed form for the zero-mean case and as a single-dimensional integral for the non-zero-mean case. Utilizing the saddle point technique allows us to closely approximate such integrals in the non-zero-mean case. We demonstrate how our approach can be extended to other scenarios, such as the joint distribution of quadratic forms and ratios of such forms, and to characterize quadratic forms in isotropically distributed random variables. We also evaluate the outage probability in multiuser beamforming using our approach, to provide an application of indefinite forms in communications.
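
    A brute-force reference point for the quantities the paper evaluates in closed form is easy to generate: draw Gaussian vectors with a chosen covariance, form the indefinite quadratic form, and estimate its CDF empirically. The matrices below are arbitrary examples, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    A = np.diag([2.0, 1.0, -1.0, -0.5])        # indefinite weight matrix (mixed-sign eigenvalues)
    L = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.3, 1.0, 0.0, 0.0],
                  [0.1, 0.2, 1.0, 0.0],
                  [0.0, 0.1, 0.3, 1.0]])
    Sigma = L @ L.T                            # covariance of the (zero-mean) Gaussian vector

    x = rng.multivariate_normal(np.zeros(4), Sigma, size=1_000_000)
    q = np.einsum('ij,jk,ik->i', x, A, x)      # quadratic form x^T A x for every sample

    print("P(q <= 1) ~", np.mean(q <= 1.0))    # Monte Carlo estimate of the CDF at threshold 1
    ```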

  4. Physical activity, mindfulness meditation, or heart rate variability biofeedback for stress reduction: a randomized controlled trial.

    Science.gov (United States)

    van der Zwan, Judith Esi; de Vente, Wieke; Huizink, Anja C; Bögels, Susan M; de Bruin, Esther I

    2015-12-01

    In contemporary western societies stress is highly prevalent; therefore, the need for stress-reducing methods is great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing stress and its related symptoms. We randomly allocated 126 participants to PA, MM, or HRV-BF upon enrollment, of whom 76 agreed to participate. The interventions consisted of psycho-education and an introduction to the specific intervention techniques, followed by 5 weeks of daily exercises at home. The PA exercises consisted of a vigorous-intensity activity of free choice. The MM exercises consisted of guided mindfulness meditation. The HRV-BF exercises consisted of slow breathing with a heart rate variability biofeedback device. Participants received daily reminders for their exercises and were contacted weekly to monitor their progress. They completed questionnaires prior to, directly after, and 6 weeks after the intervention. Results indicated an overall beneficial effect consisting of reduced stress, anxiety and depressive symptoms, and improved psychological well-being and sleep quality. No significant between-intervention effect was found, suggesting that PA, MM, and HRV-BF are equally effective in reducing stress and its related symptoms. These self-help interventions provide easily accessible help for people with stress complaints.

  5. Events of Borel Sets, Construction of Borel Sets and Random Variables for Stochastic Finance

    Directory of Open Access Journals (Sweden)

    Jaeger Peter

    2014-09-01

    Full Text Available We consider special events of Borel sets with the aim of proving that the set of the irrational numbers is an event of the Borel sets. The set of the natural numbers, the set of the integer numbers and the set of the rational numbers are countable, so we can use the literature [10] (pp. 78-81) as a basis for a similar construction of the proof. Next we prove that the Borel sets can be constructed from different families of sets [16] (pp. 9-10). The literature [16] (pp. 9-10) and [11] (pp. 11-12) gives an overview showing that other sets exist for this construction. Lastly, we define special functions as random variables for stochastic finance in discrete time. The relevant functions are implemented in the article [15], see [9] (p. 4). The aim is to construct events and random variables which can easily be used with a probability measure. See as an example theorems (10) and (14) in [20]. Then the formalization is more similar to the presentation used in the book [9]. As background, further literature is [3] (pp. 9-12), [13] (pp. 17-20), and [8] (pp. 32-35).

  6. Marcinkiewicz-type strong law of large numbers for double arrays of pairwise independent random variables

    Directory of Open Access Journals (Sweden)

    Dug Hun Hong

    1999-01-01

    Full Text Available Let {Xij} be a double sequence of pairwise independent random variables. If P{|Xmn| ≥ t} ≤ P{|X| ≥ t} for all nonnegative real numbers t and E|X|^p (log^+ |X|)^3 < ∞ for 1 < p < 2, then … random variables under the conditions E|X|^p (log^+ |X|)^{r+1} < ∞ and E|X|^p (log^+ |X|)^{r-1} < ∞, respectively, thus extending Choi and Sung's result [1] of the one-dimensional case.

  7. Multivariate non-normally distributed random variables in climate research – introduction to the copula approach

    Directory of Open Access Journals (Sweden)

    P. Friederichs

    2008-10-01

    Full Text Available Probability distributions of multivariate random variables are generally more complex than their univariate counterparts, owing to a possible nonlinear dependence between the random variables. One approach to this problem is the use of copulas, which have become popular over recent years, especially in fields like econometrics, finance, risk management, and insurance. Since this newly emerging field includes various practices, a controversial discussion, and a vast field of literature, it is difficult to get an overview. The aim of this paper is therefore to provide a brief overview of copulas for application in meteorology and climate research. We examine the advantages and disadvantages compared to alternative approaches such as mixture models, summarize the current problem of goodness-of-fit (GOF) tests for copulas, and discuss the connection with multivariate extremes. An application to station data shows the simplicity and the capabilities as well as the limitations of this approach. Observations of daily precipitation and temperature are fitted to a bivariate model and demonstrate that copulas are a valuable complement to the commonly used methods.
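
    As a minimal illustration of the copula idea for station data, the sketch below fits a Gaussian copula to synthetic precipitation/temperature pairs by transforming each margin to normal scores and estimating their correlation, then resamples dependent pairs with the original margins. The data generation, margins and sample sizes are invented; the paper's goodness-of-fit testing and extreme-value discussion are not covered.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    # Synthetic stand-ins for daily station data: skewed precipitation amounts and roughly
    # normal temperatures, made dependent through a shared latent factor.
    z = rng.normal(size=2000)
    e1, e2 = rng.normal(size=2000), rng.normal(size=2000)
    precip = stats.gamma(2.0).ppf(stats.norm.cdf(0.6 * z + 0.8 * e1))
    temp = 15 + 5 * (0.6 * z + 0.8 * e2)

    def normal_scores(x):
        """Map a sample to normal scores via its empirical ranks (probability integral transform)."""
        return stats.norm.ppf(stats.rankdata(x) / (len(x) + 1))

    # Gaussian copula fit: the dependence parameter is the correlation of the normal scores.
    rho = np.corrcoef(normal_scores(precip), normal_scores(temp))[0, 1]
    print("fitted Gaussian-copula correlation:", round(rho, 3))

    # Simulate new dependent pairs from the fitted copula, keeping the original margins.
    u = stats.norm.cdf(rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=2000))
    sim_precip = np.quantile(precip, u[:, 0])   # empirical inverse CDF of each margin
    sim_temp = np.quantile(temp, u[:, 1])
    ```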

  8. Aerobic and combined exercise sessions reduce glucose variability in type 2 diabetes: crossover randomized trial.

    Directory of Open Access Journals (Sweden)

    Franciele R Figueira

    Full Text Available To evaluate the effects of aerobic (AER) or aerobic plus resistance exercise (COMB) sessions on glucose levels and glucose variability in patients with type 2 diabetes. Additionally, we assessed conventional and non-conventional methods to analyze glucose variability derived from multiple measurements performed with a continuous glucose monitoring system (CGMS). Fourteen patients with type 2 diabetes (56±2 years) wore a CGMS during 3 days. Participants randomly performed AER and COMB sessions, both in the morning (24 h after CGMS placement), and at least 7 days apart. Glucose variability was evaluated by glucose standard deviation, glucose variance, mean amplitude of glycemic excursions (MAGE), and glucose coefficient of variation (conventional methods) as well as by spectral and symbolic analysis (non-conventional methods). Baseline fasting glycemia was 139±05 mg/dL and HbA1c 7.9±0.7%. Glucose levels decreased immediately after the AER and COMB protocols by ∼16%, which was sustained for approximately 3 hours. Comparing the two exercise modalities, responses over a 24-h period after the sessions were similar for glucose levels, glucose variance and glucose coefficient of variation. In the symbolic analysis, increases in the 0 V pattern (COMB, 67.0±7.1 vs. 76.0±6.3, P = 0.003) and decreases in the 1 V pattern (COMB, 29.1±5.3 vs. 21.5±5.1, P = 0.004) were observed only after the COMB session. Both AER and COMB exercise modalities reduce glucose levels similarly for a short period of time. The use of non-conventional analysis indicates reduction of glucose variability after a single session of combined exercises. Aerobic training, aerobic-resistance training and glucose profile (CGMS) in type 2 diabetes (CGMS exercise). ClinicalTrials.gov ID: NCT00887094.

  9. Heuristic methods using variable neighborhood random local search for the clustered traveling salesman problem

    Directory of Open Access Journals (Sweden)

    Mário Mestria

    2014-11-01

    Full Text Available In this paper, we propose new heuristic methods for solving the Clustered Traveling Salesman Problem (CTSP). The CTSP is a generalization of the Traveling Salesman Problem (TSP) in which the set of vertices is partitioned into disjoint clusters and the objective is to find a minimum-cost Hamiltonian cycle such that the vertices of each cluster are visited contiguously. We develop two Variable Neighborhood Random Descent heuristics with Iterated Local Search for solving the CTSP. The proposed heuristic methods were tested on instance types with data at different levels of granularity for the number of vertices and clusters. The computational results show that the heuristic methods outperform recent methods in the literature and are competitive with an exact algorithm using the Parallel CPLEX software.

  10. Understanding variability in hospital-specific costs of coronary artery bypass grafting represents an opportunity for standardizing care and improving resource use.

    Science.gov (United States)

    Kilic, Arman; Shah, Ashish S; Conte, John V; Mandal, Kaushik; Baumgartner, William A; Cameron, Duke E; Whitman, Glenn J R

    2014-01-01

    This study was undertaken to examine interhospital variability in the inpatient costs of coronary artery bypass grafting (CABG). The Nationwide Inpatient Sample was used to identify isolated CABGs performed between 2005 and 2008 in the United States. Charges for inpatient care were supplied by the data set, and hospital charge-to-cost ratios were used to derive inpatient costs for each patient, aggregated at the hospital level. Mixed-effect linear regression models were created to evaluate variability in costs between hospitals, adjusting for 34 patient, operative, complication, and hospital-related variables. A total of 633 hospitals performed isolated CABG in 183,973 patients. In unadjusted analysis, there was significant baseline variability in the average inpatient costs of CABG between hospitals (SD, $12,130), which was large relative to the average cost of performing CABG per hospital ($40,424). After risk adjustment, significant variability in average costs between hospitals persisted, with patient-related factors explaining only a fraction of the variation in costs compared with the hospital effect. There is a wide variation in the cost of performing CABG in the United States. We determined that individual hospital centers, independent of multiple patient- and outcome-specific factors, are drivers of these differences. Comparison of hospital-specific behavior with identification of the causes of cost discrepancies represents an opportunity for standardization of care and improvement in resource use. Copyright © 2014. Published by Mosby, Inc.

  11. Effects of Yoga on Heart Rate Variability and Depressive Symptoms in Women: A Randomized Controlled Trial.

    Science.gov (United States)

    Chu, I-Hua; Wu, Wen-Lan; Lin, I-Mei; Chang, Yu-Kai; Lin, Yuh-Jen; Yang, Pin-Chen

    2017-04-01

    The purpose of the study was to investigate the effects of a 12-week yoga program on heart rate variability (HRV) and depressive symptoms in depressed women. This was a randomized controlled trial. Twenty-six sedentary women scoring ≥14 on the Beck Depression Inventory-II were randomized to either the yoga or the control group. The yoga group completed a 12-week yoga program, which took place twice a week for 60 min per session and consisted of breathing exercises, yoga pose practice, and supine meditation/relaxation. The control group was instructed not to engage in any yoga practice and to maintain their usual level of physical activity during the course of the study. Participants' HRV, depressive symptoms, and perceived stress were assessed at baseline and post-test. The yoga group had a significant increase in high-frequency HRV and decreases in low-frequency HRV and low frequency/high frequency ratio after the intervention. The yoga group also reported significantly reduced depressive symptoms and perceived stress. No change was found in the control group. A 12-week yoga program was effective in increasing parasympathetic tone and reducing depressive symptoms and perceived stress in women with elevated depressive symptoms. Regular yoga practice may be recommended for women to cope with their depressive symptoms and stress and to improve their HRV.

  12. Resting heart rate variability after yogic training and swimming: A prospective randomized comparative trial.

    Science.gov (United States)

    Sawane, Manish Vinayak; Gupta, Shilpa Sharad

    2015-01-01

    Resting heart rate variability (HRV) is a measure of the modulation of the autonomic nervous system (ANS) at rest. Increased HRV achieved through exercise is good for cardiovascular health. However, prospective studies comparing the effects of yogic exercises with those of other endurance exercises such as walking, running, and swimming on resting HRV are conspicuous by their absence. The study was designed to assess and compare the effects of yogic training and swimming on resting HRV in normal healthy young volunteers. It was conducted in the Department of Physiology of a medical college as a prospective randomized comparative trial. One hundred sedentary volunteers were randomly assigned to either the yoga or the swimming group. Baseline recordings of digital electrocardiograms were done for all the subjects in cohorts of 10. After yoga training or swimming for 12 weeks, resting HRV was evaluated again. The percentage change in each parameter with yoga and swimming was compared using the unpaired t-test for data with normal distribution and the Mann-Whitney U test for data without normal distribution. Most of the HRV parameters improved significantly with both modalities of exercise. However, some of the HRV parameters showed statistically better improvement with yoga than with swimming. Practicing yoga thus seems to be the mode of exercise with better improvement in autonomic function, as suggested by resting HRV.

  13. A general instrumental variable framework for regression analysis with outcome missing not at random.

    Science.gov (United States)

    Tchetgen Tchetgen, Eric J; Wirth, Kathleen E

    2017-02-23

    The instrumental variable (IV) design is a well-known approach for unbiased evaluation of causal effects in the presence of unobserved confounding. In this article, we study the IV approach to account for selection bias in regression analysis with an outcome missing not at random. In such a setting, a valid IV is a variable which (i) predicts the nonresponse process, and (ii) is independent of the outcome in the underlying population. We show that under the additional assumption (iii) that the IV is independent of the magnitude of selection bias due to nonresponse, the population regression in view is nonparametrically identified. For point estimation under (i)-(iii), we propose a simple complete-case analysis which modifies the regression of primary interest by carefully incorporating the IV to account for selection bias. The approach is developed for the identity, log and logit link functions. For inferences about the marginal mean of a binary outcome assuming (i) and (ii) only, we describe novel and approximately sharp bounds which, unlike Robins-Manski bounds, are smooth in model parameters, therefore allowing a straightforward approach to accounting for uncertainty due to sampling variability. These bounds provide a more honest account of uncertainty and allow one to assess the extent to which a violation of the key identifying condition (iii) might affect inferences. For illustration, the methods are used to account for selection bias induced by HIV testing nonparticipation in the evaluation of HIV prevalence in the Zambian Demographic and Health Surveys. © 2017, The International Biometric Society.

  14. Among ten sociodemographic and lifestyle variables, smoking is strongly associated with biomarkers of acrylamide exposure in a representative sample of the US population1,2,3

    Science.gov (United States)

    Vesper, Hubert W.; Sternberg, Maya R.; Frame, Tunde; Pfeiffer, Christine M.

    2016-01-01

    Hemoglobin adducts of acrylamide (HbAA) and glycidamide (HbGA) have been measured as biomarkers of acrylamide exposure and metabolism in a nationally representative sample of the US population in the NHANES 2003–2004. We assessed the association of sociodemographic (age, sex, race-ethnicity, education, and income) and lifestyle variables (smoking, alcohol consumption, BMI, physical activity, and dietary supplement use) with these biomarkers in US adults (≥20 y). We used bivariate and multiple regression models and assessed the magnitude of an estimated change in biomarker level with change in a covariable for 2 biomarkers of acrylamide exposure. Smoking was strongly and significantly correlated with HbAA and HbGA levels (rs=0.51 and 0.42, respectively), with biomarker concentrations being 126% and 101% higher in smokers compared to nonsmokers after adjusting for sociodemographic and lifestyle covariates. Age was moderately and significantly correlated with both biomarkers (rs=−0.21 and −0.22, respectively). BMI (rs=−0.11) and alcohol consumption (rs=0.13) were weakly yet significantly correlated with HbAA levels only. The estimated percent change in biomarker concentration was ≤20% for all variables other than smoking after adjusting for sociodemographic and lifestyle covariates. Using multiple regression models, the sociodemographic variables explained 9% and 7%, while the sociodemographic and lifestyle variables together explained 46% and 25% of the variability in HbAA and HbGA, respectively, showing the importance of considering and adequately controlling for these variables in future studies. Our findings will be useful in the design and analysis of future studies that assess and evaluate exposure to acrylamide and its metabolism to glycidamide. PMID:23596166

  15. Among 10 sociodemographic and lifestyle variables, smoking is strongly associated with biomarkers of acrylamide exposure in a representative sample of the U.S. Population.

    Science.gov (United States)

    Vesper, Hubert W; Sternberg, Maya R; Frame, Tunde; Pfeiffer, Christine M

    2013-06-01

    Hemoglobin adducts of acrylamide (HbAA) and glycidamide (HbGA) have been measured as biomarkers of acrylamide exposure and metabolism in a nationally representative sample of the U.S. population in the NHANES 2003-2004. We assessed the association of sociodemographic (age, sex, race-ethnicity, education, and income) and lifestyle (smoking, alcohol consumption, BMI, physical activity, and dietary supplement use) variables with these biomarkers in U.S. adults (aged ≥ 20 y). We used bivariate and multiple regression models and assessed the magnitude of an estimated change in biomarker concentration with change in a covariable for 2 biomarkers of acrylamide exposure. Smoking was strongly and significantly correlated with HbAA and HbGA concentrations (rs = 0.51 and 0.42, respectively), with biomarker concentrations being 126 and 101% higher in smokers compared with nonsmokers after adjusting for sociodemographic and lifestyle covariates. Age was moderately and significantly correlated with both biomarkers (rs = -0.21 and -0.22, respectively). BMI (rs = -0.11) and alcohol consumption (rs = 0.13) were weakly yet significantly correlated with HbAA concentrations only. The estimated percentage change in biomarker concentration was ≤ 20% for all variables other than smoking after adjusting for sociodemographic and lifestyle covariates. Using multiple regression models, the sociodemographic variables explained 9 and 7% whereas the sociodemographic and lifestyle variables together explained 46 and 25% of the variability in HbAA and HbGA, respectively, showing the importance of considering and adequately controlling for these variables in future studies. Our findings will be useful in the design and analysis of future studies that assess and evaluate exposure to acrylamide and its metabolism to glycidamide.

  16. Investigation of the reduction in uncertainty due to soil variability when conditioning a random field using Kriging

    NARCIS (Netherlands)

    Lloret Cabot, M.; Hicks, M.A.; Van den Eijnden, A.P.

    2012-01-01

    Spatial variability of soil properties is inherent in soil deposits, whether as a result of natural geological processes or engineering construction. It is therefore important to account for soil variability in geotechnical design in order to represent more realistically a soil’s in situ state. This

  17. A yoga & exercise randomized controlled trial for vasomotor symptoms: Effects on heart rate variability.

    Science.gov (United States)

    Jones, Salene M W; Guthrie, Katherine A; Reed, Susan D; Landis, Carol A; Sternfeld, Barbara; LaCroix, Andrea Z; Dunn, Andrea; Burr, Robert L; Newton, Katherine M

    2016-06-01

    Heart rate variability (HRV) reflects the integration of the parasympathetic nervous system with the rest of the body. Studies of the effects of yoga and exercise on HRV have been mixed but suggest that exercise increases HRV. We conducted a secondary analysis of the effect of yoga and exercise on HRV based on a randomized clinical trial of treatments for vasomotor symptoms in peri/post-menopausal women. Randomized clinical trial of behavioral interventions in women with vasomotor symptoms (n=335), 40-62 years old, from three clinical study sites. Twelve weeks of a yoga program designed specifically for mid-life women, or a supervised aerobic exercise-training program with specific intensity and energy-expenditure goals, were compared to a usual activity group. Time and frequency domain HRV was measured at baseline and at 12 weeks for 15 min using Holter monitors. Women had a median of 7.6 vasomotor symptoms per 24 h. Time and frequency domain HRV measures did not change significantly in either of the intervention groups compared to the change in the usual activity group. HRV results did not differ when the analyses were restricted to post-menopausal women. Although yoga and exercise have been shown to increase parasympathetic-mediated HRV in other populations, neither intervention increased HRV in middle-aged women with vasomotor symptoms. Mixed results in previous research may be due to sample differences. Yoga and exercise likely improve short-term health in middle-aged women through mechanisms other than HRV. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. How Far Is Quasar UV/Optical Variability from a Damped Random Walk at Low Frequency?

    Science.gov (United States)

    Guo, Hengxiao; Wang, Junxian; Cai, Zhenyi; Sun, Mouyuan

    2017-10-01

    Studies have shown that UV/optical light curves of quasars can be described using the prevalent damped random walk (DRW) model, also known as the Ornstein-Uhlenbeck process. A white noise power spectral density (PSD) is expected at low frequency in this model; however, a direct observational constraint to the low-frequency PSD slope is difficult due to the limited lengths of the light curves available. Meanwhile, quasars show scatter in their DRW parameters that is too large to be attributed to uncertainties in the measurements and dependence on the variation of known physical factors. In this work we present simulations showing that, if the low-frequency PSD deviates from the DRW, the red noise leakage can naturally produce large scatter in the variation parameters measured from simulated light curves. The steeper the low-frequency PSD slope, the larger scatter we expect. Based on observations of SDSS Stripe 82 quasars, we find that the low-frequency PSD slope should be no steeper than -1.3. The actual slope could be flatter, which consequently requires that the quasar variabilities should be influenced by other unknown factors. We speculate that the magnetic field and/or metallicity could be such additional factors.

  19. Some Bounds on the Deviation Probability for Sums of Nonnegative Random Variables Using Upper Polynomials, Moment and Probability Generating Functions

    OpenAIRE

    From, Steven G.

    2010-01-01

    We present several new bounds for certain sums of deviation probabilities involving sums of nonnegative random variables. These are based upon upper bounds for the moment generating functions of the sums. We compare these new bounds to those of Maurer [2], Bernstein [4], Pinelis [16], and Bentkus [3]. We also briefly discuss the infinitely divisible distributions case.

  20. LONG-TERM VARIABILITY OF BRONCHIAL RESPONSIVENESS TO HISTAMINE IN A RANDOM-POPULATION SAMPLE OF ADULTS

    NARCIS (Netherlands)

    RIJCKEN, B; SCHOUTEN, JP; WEISS, ST; ROSNER, B; DEVRIES, K; VANDERLENDE, R

    1993-01-01

    Long-term variability of bronchial responsiveness has been studied in a random population sample of adults. During a follow-up period of 18 yr, 2,216 subjects contributed 5,012 observations to the analyses. Each subject could have as many as seven observations. Bronchial responsiveness was assessed

  1. Flux tower in a mixed forest: spatial representativeness of seasonal footprints and the influence of land cover variability on the flux measurement

    Science.gov (United States)

    Kim, J.; Schaaf, C.; Hwang, T.

    2015-12-01

    Flux tower measurements using eddy-covariance techniques are used as the primary data for calibration and validation of remote sensing estimates and ecosystem models. Therefore, understanding the characteristics of the land surface contributing to the flux, the so-called footprint, is critical for upscaling tower fluxes to the regional landscape. This is especially true for towers located in heterogeneous ecosystems such as mixed forests. Here we (1) estimated the seasonal footprints of a flux tower, the EMS tower (US-Ha1) in the Long Term Ecological Research (LTER) Harvard Forest, from 1992 to 2008 using a footprint climatology. The Harvard Forest is a temperate mixed-species ecosystem composed of deciduous stands (red oak and red maple) and evergreen coniferous stands (eastern hemlock and white pine). The heterogeneity of the landscape is primarily driven by the phenology of the deciduous stands, which are not uniformly distributed over the forest and around the tower. The overall prevailing footprints are known to lie toward the southwest and northwest, but there was profound interannual variability in the extents and orientations of the seasonal footprints. Furthermore, we (2) examined whether the vegetation density variation within the tower footprint in each season could adequately represent the vegetation density characteristics at the scales of moderate-resolution remote sensing estimates and ecosystem models (i.e., 1.0 km and 1.5 km). The footprints were found to cover enough area to be representative of the 1.0 km scale but not the 1.5 km scale. Finally, we (3) investigated the influence of interannual variations in land cover variability within the footprints on the seasonal flux measurements from 1999 to 2008, and found that almost half of the interannual anomalies in the summertime GPP flux can be explained by the coniferous stand fraction within the footprint.

  2. Informed decision-making with and for people with dementia - efficacy of the PRODECIDE education program for legal representatives: protocol of a randomized controlled trial (PRODECIDE-RCT).

    Science.gov (United States)

    Lühnen, Julia; Haastert, Burkhard; Mühlhauser, Ingrid; Richter, Tanja

    2017-09-15

    In Germany, the guardianship system provides adults who are no longer able to handle their own affairs with a court-appointed legal representative, for support without restriction of legal capacity. Although these representatives are only rarely qualified in healthcare, they nevertheless play decisive roles in the decision-making processes for people with dementia. Previously, we developed an education program (PRODECIDE) to address this shortcoming and tested it for feasibility. Typical autonomy-restricting decisions in the care of people with dementia, namely the use of percutaneous endoscopic gastrostomy (PEG), physical restraints (PR) or the prescription of antipsychotic drugs (AP), were the subject areas trained. The training course aims to enhance the competency of legal representatives in informed decision-making. In this study, we will evaluate the efficacy of the PRODECIDE education program. A randomized controlled trial with a six-month follow-up will be conducted to compare the PRODECIDE education program with standard care, enrolling legal representatives (N = 216). The education program lasts 10 h and comprises four modules: A, decision-making processes and methods; and B, C and D, evidence-based knowledge about PEG, PR and AP, respectively. The primary outcome measure is knowledge, operationalized as the understanding of decision-making processes in healthcare affairs and the setting of realistic expectations about the benefits and harms of PEG, PR and AP in people with dementia. Secondary outcomes are sufficient and sustainable knowledge and the percentage of persons concerned affected by PEG, PR or AP. A qualitative process evaluation will be performed. Additionally, to support implementation, a concept for translating the educational contents into e-learning modules will be developed. The study results will show whether the efficacy of the education program could justify its implementation into the regular training curricula for legal representatives

  3. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-05-02

    General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function and reduces the systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventilation. The PROtective VARiable ventilation trial ('PROVAR') is a single-center, randomized controlled trial enrolling 50 patients scheduled for open abdominal surgery expected to last longer than 3 hours. PROVAR compares conventional (non-variable) lung protective ventilation (CV) with variable lung protective ventilation (VV) regarding pulmonary function and inflammatory response. The primary endpoint of the study is the forced vital capacity on the first postoperative day. Secondary endpoints include further lung function tests, plasma cytokine levels, spatial distribution of ventilation assessed by means of electrical impedance tomography and postoperative pulmonary complications. We hypothesize that VV improves lung function and reduces the systemic inflammatory response compared to CV in patients receiving mechanical ventilation during general anesthesia for open abdominal surgery lasting longer than 3 hours. PROVAR is the first randomized controlled trial aiming at intra- and postoperative effects of VV on lung function. This study may help to define the role of VV during general anesthesia requiring mechanical ventilation. Clinicaltrials.gov NCT01683578 (registered on 3 September 2012).

  4. Rationale and study design of ViPS - variable pressure support for weaning from mechanical ventilation: study protocol for an international multicenter randomized controlled open trial.

    Science.gov (United States)

    Kiss, Thomas; Güldner, Andreas; Bluth, Thomas; Uhlig, Christopher; Spieth, Peter Markus; Markstaller, Klaus; Ullrich, Roman; Jaber, Samir; Santos, Jose Alberto; Mancebo, Jordi; Camporota, Luigi; Beale, Richard; Schettino, Guilherme; Saddy, Felipe; Vallverdú, Immaculada; Wiedemann, Bärbel; Koch, Thea; Schultz, Marcus Josephus; Pelosi, Paolo; de Abreu, Marcelo Gama

    2013-10-31

    In pressure support ventilation (PSV), a non-variable level of pressure support is delivered by the ventilator when triggered by the patient. In contrast, variable PSV delivers a level of pressure support that varies in a random fashion, introducing more physiological variability to the respiratory pattern. Experimental studies show that variable PSV improves gas exchange, reduces lung inflammation and the mean pressure support, compared to non-variable PSV. Thus, it can theoretically shorten weaning from the mechanical ventilator. The ViPS (variable pressure support) trial is an international investigator-initiated multicenter randomized controlled open trial comparing variable vs. non-variable PSV. Adult patients on controlled mechanical ventilation for more than 24 hours who are ready to be weaned are eligible for the study. The randomization sequence is blocked per center and performed using a web-based platform. Patients are randomly assigned to one of the two groups: variable PSV or non-variable PSV. In non-variable PSV, breath-by-breath pressure support is kept constant and targeted to achieve a tidal volume of 6 to 8 ml/kg. In variable PSV, the mean pressure support level over a specific time period is targeted at the same mean tidal volume as non-variable PSV, but individual levels vary randomly breath-by-breath. The primary endpoint of the trial is the time to successful weaning, defined as the time from randomization to successful extubation. ViPS is the first randomized controlled trial investigating whether variable, compared to non-variable PSV, shortens the duration of weaning from mechanical ventilation in a mixed population of critically ill patients. This trial aims to determine the role of variable PSV in the intensive care unit. clinicaltrials.gov NCT01769053.

  5. Perspective: Is Random Monoallelic Expression a Contributor to Phenotypic Variability of Autosomal Dominant Disorders?

    National Research Council Canada - National Science Library

    Baoheng Gui; Jesse Slone; Taosheng Huang

    2017-01-01

    Several factors have been proposed as contributors to interfamilial and intrafamilial phenotypic variability in autosomal dominant disorders, including allelic variation, modifier genes, environmental...

  6. Genetic variability of cultivated cowpea in Benin assessed by random amplified polymorphic DNA

    NARCIS (Netherlands)

    Zannou, A.; Kossou, D.K.; Ahanchédé, A.; Zoundjihékpon, J.; Agbicodo, E.; Struik, P.C.; Sanni, A.

    2008-01-01

    Characterization of genetic diversity among cultivated cowpea [Vigna unguiculata (L.) Walp.] varieties is important to optimize the use of available genetic resources by farmers, local communities, researchers and breeders. Random amplified polymorphic DNA (RAPD) markers were used to evaluate the

  7. Testing in a Random Effects Panel Data Model with Spatially Correlated Error Components and Spatially Lagged Dependent Variables

    Directory of Open Access Journals (Sweden)

    Ming He

    2015-11-01

    Full Text Available We propose a random effects panel data model with both spatially correlated error components and spatially lagged dependent variables. We focus on diagnostic testing procedures and derive Lagrange multiplier (LM) test statistics for a variety of hypotheses within this model. We first construct the joint LM test for both the individual random effects and the two spatial effects (spatial error correlation and spatial lag dependence). We then provide LM tests for the individual random effects and for the two spatial effects separately. In addition, in order to guard against local model misspecification, we derive locally adjusted (robust) LM tests based on the Bera and Yoon principle (Bera and Yoon, 1993). We conduct a small Monte Carlo simulation to show the good finite-sample performance of these LM test statistics and revisit the cigarette demand example in Baltagi and Levin (1992) to illustrate our testing procedures.

  8. Heart rate variability biofeedback in patients with alcohol dependence: a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Penzlin AI

    2015-10-01

    Full Text Available Ana Isabel Penzlin,1 Timo Siepmann,2 Ben Min-Woo Illigens,3 Kerstin Weidner,4 Martin Siepmann4 1Institute of Clinical Pharmacology, 2Department of Neurology, University Hospital Carl Gustav Carus, Technische Universität Dresden, Dresden, Saxony, Germany; 3Department of Neurology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA; 4Department of Psychotherapy and Psychosomatic Medicine, University Hospital Carl Gustav Carus, Technische Universität Dresden, Dresden, Saxony, Germany Background and objective: In patients with alcohol dependence, ethyl-toxic damage of vasomotor and cardiac autonomic nerve fibers leads to autonomic imbalance with neurovascular and cardiac dysfunction, the latter resulting in reduced heart rate variability (HRV). Autonomic imbalance is linked to increased craving and cardiovascular mortality. In this study, we sought to assess the effects of HRV biofeedback training on HRV, vasomotor function, craving, and anxiety. Methods: We conducted a randomized controlled study in 48 patients (14 females, ages 25–59 years) undergoing inpatient rehabilitation treatment. In the treatment group, patients (n=24) attended six sessions of HRV biofeedback over 2 weeks in addition to standard rehabilitative care, whereas, in the control group, subjects received standard care only. Psychometric testing for craving (Obsessive Compulsive Drinking Scale), anxiety (Symptom Checklist-90-Revised), HRV assessment using coefficient of variation of R-R intervals (CVNN) analysis, and vasomotor function assessment using laser Doppler flowmetry were performed at baseline, immediately after completion of the treatment or control period, and 3 and 6 weeks afterward (follow-ups 1 and 2). Results: Psychometric testing showed decreased craving in the biofeedback group immediately postintervention (OCDS scores: 8.6±7.9 post-biofeedback versus 13.7±11.0 baseline [mean ± standard deviation], P<0.05), whereas craving was unchanged at

  9. Variability in fusarium head blight epidemics in relation to global climate fluctuations as represented by the El Niño-Southern Oscillation and other atmospheric patterns.

    Science.gov (United States)

    Kriss, A B; Paul, P A; Madden, L V

    2012-01-01

    Cross-spectral analysis was used to characterize the relationship between climate variability, represented by atmospheric patterns, and annual fluctuations of Fusarium head blight (FHB) disease intensity in wheat. Time series investigated were the Oceanic Niño Index (ONI), which is a measure of the El Niño-Southern Oscillation (ENSO), the Pacific-North American (PNA) pattern and the North Atlantic Oscillation (NAO), which are known to have strong influences on the Northern Hemisphere climate, and FHB disease intensity observations in Ohio from 1965 to 2010 and in Indiana from 1973 to 2008. For each climate variable, mean climate index values for the boreal winter (December to February) and spring (March to May) were utilized. The spectral density of each time series and the (squared) coherency of each pair of FHB-climate-index series were estimated. Significance for coherency was determined by a nonparametric permutation procedure. Results showed that winter and spring ONI were significantly coherent with FHB in Ohio, with a period of about 5.1 years (as well as for some adjacent periods). The estimated phase-shift distribution indicated that there was a generally negative relation between the two series, with high values of FHB (an indication of a major epidemic) estimated to occur about 1 year following low values of ONI (indication of a La Niña); equivalently, low values of FHB were estimated to occur about 1 year after high values of ONI (El Niño). There was also limited evidence that winter ONI had significant coherency with FHB in Indiana. At periods between 2 and 7 years, the PNA and NAO indices were coherent with FHB in both Ohio and Indiana, although results for phase shift and period depended on the specific location, climate index, and time span used in calculating the climate index. Differences in results for Ohio and Indiana were expected because the FHB disease series for the two states were not similar. Results suggest that global climate indices
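
    As an illustration of the kind of spectral calculation described above, the magnitude-squared coherency between two annual series can be estimated with standard tools. The sketch below uses synthetic data and scipy.signal.coherence; it is not the authors' FHB and ONI series, and it omits their permutation-based significance test.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
n_years = 46                        # e.g., annual observations over 1965-2010
t = np.arange(n_years)

# Synthetic climate index and disease series sharing a ~5-year cycle plus noise.
climate = np.sin(2 * np.pi * t / 5.1) + 0.5 * rng.standard_normal(n_years)
disease = -np.roll(climate, 1) + 0.7 * rng.standard_normal(n_years)  # lagged, inverted

# Magnitude-squared coherence; fs = 1 sample/year, so frequency is in cycles/year.
freqs, coh = coherence(climate, disease, fs=1.0, nperseg=23)
for f, c in zip(freqs, coh):
    if f > 0:
        print(f"period {1.0/f:5.1f} yr  coherence {c:.2f}")
```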

  10. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    , we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub Gaussianity of the error terms thereby generalizing the results...

  11. Development of a 3-D Variable-Direction Anisotropy program, VDA-3D, to represent normal and tangential fluxes, in 3-D groundwater flow modeling

    Science.gov (United States)

    Umari, A. M.; Kipp, K. L.

    2013-12-01

    A computer program, VDA-3D, for groundwater flow simulation with a 3-dimensional anisotropic hydraulic conductivity tensor [K] has been developed, which represents normal fluxes with the Kxx, Kyy, Kzz components of [K], and tangential fluxes with the Kxy, Kxz, Kyz components. The need to simulate tangential fluxes occurs when the principal directions of the hydraulic conductivity tensor are not aligned with the model coordinates. Off-diagonal components of the conductivity tensor relate Darcy flux components to head gradient components that do not point in the same direction as the flux components. The program for 3-Dimensional Variable-Direction Anisotropy (VDA-3D) is based on a method developed by Edwards and Rogers (1998) and is an extension to 3 dimensions of the 2-dimensional Layer Variable-Direction Anisotropy (LVDA) package developed by Anderman and others (2002) for the USGS MODFLOW groundwater modeling program. The Edwards method is based on the traditional mass balance of water for a finite-difference-discretization cell of aquifer material, and enforces continuity of water flux across each of the 6 cell faces. VDA-3D is used to apply the Edwards method to a set of 1-D, 2-D, and 3-D test problems, some homogeneous, one with heterogeneity between two zones of the grid, and one with heterogeneity from cell to cell; each problem has boundary conditions of either constant head or constant flux. One test problem with constant head boundaries uses distributions of sources and sinks that are calculated to represent a problem with a given analytic solution. A second program has been written to implement an alternate method to simulate tangential fluxes, developed by Li and others (2010) and referred to as the Lzgh method. Like VDA-3D, the Lzgh method formulates the finite difference discretization of the flow equation for a medium with heterogeneous anisotropic hydraulic conductivity. In the Lzgh method, the conductivity is not required to be uniform over each
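
    The role of the off-diagonal conductivity components can be seen directly from Darcy's law, q = -K · ∇h: when Kxy, Kxz, or Kyz are nonzero, a head gradient along one axis produces flux components along the other axes as well. A small illustrative calculation follows, with hypothetical tensor values unrelated to the VDA-3D test problems.

```python
import numpy as np

# Full symmetric hydraulic conductivity tensor [K] (m/day); off-diagonal terms
# arise when the principal directions are not aligned with the grid axes.
K = np.array([[10.0, 2.0, 0.5],
              [ 2.0, 5.0, 1.0],
              [ 0.5, 1.0, 1.0]])

grad_h = np.array([0.01, 0.0, 0.0])   # head gradient purely in the x-direction

q = -K @ grad_h                       # Darcy flux vector (m/day)
print("flux:", q)                     # y and z components are nonzero: tangential flux
```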

  12. Employing a Multi-level Approach to Recruit a Representative Sample of Women with Recent Gestational Diabetes Mellitus into a Randomized Lifestyle Intervention Trial.

    Science.gov (United States)

    Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W

    2016-02-01

    The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional level), meso (mid or provider level), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69 %) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum, at a mean 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multilevel approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were

  13. Marginal Distributions of Random Vectors Generated by Affine Transformations of Independent Two-Piece Normal Variables

    Directory of Open Access Journals (Sweden)

    Maximiano Pinheiro

    2012-01-01

    Marginal probability density and cumulative distribution functions are presented for multidimensional variables defined by nonsingular affine transformations of vectors of independent two-piece normal variables, the most important subclass of Ferreira and Steel's general multivariate skewed distributions. The marginal functions are obtained by first expressing the joint density as a mixture of Arellano-Valle and Azzalini's unified skew-normal densities and then using the property of closure under marginalization of the latter class.
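
    For readers wanting to experiment numerically, a two-piece normal variate can be simulated by drawing from one of two half-normals with different scales, and the marginals of an affine transform of a vector of such independent variates can then be inspected by Monte Carlo. The sketch below is only a simulation check, not the closed-form marginal densities derived in the paper; all parameter values are made up.

```python
import numpy as np

rng = np.random.default_rng(42)

def two_piece_normal(size, loc=0.0, sigma_left=1.0, sigma_right=2.0, rng=rng):
    """Draw from a two-piece (split) normal: scale sigma_left below loc
    (chosen with probability sigma_left/(sigma_left+sigma_right)), sigma_right above."""
    p_left = sigma_left / (sigma_left + sigma_right)
    left = rng.random(size) < p_left
    half = np.abs(rng.standard_normal(size))
    return np.where(left, loc - sigma_left * half, loc + sigma_right * half)

n = 100_000
# Vector of independent two-piece normal variates with different asymmetries.
x = np.column_stack([two_piece_normal(n, 0.0, 1.0, 2.0),
                     two_piece_normal(n, 0.0, 0.5, 1.5)])

A = np.array([[1.0, 0.3],
              [-0.2, 1.0]])          # nonsingular affine transformation
b = np.array([0.5, -1.0])
y = x @ A.T + b

# Empirical summary of the first marginal of y (skewed, as expected).
print("mean:", y[:, 0].mean(), "std:", y[:, 0].std(), "skewness sign:",
      np.sign(((y[:, 0] - y[:, 0].mean()) ** 3).mean()))
```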

  14. Marginal Distributions of Random Vectors Generated by Affine Transformations of Independent Two-Piece Normal Variables

    OpenAIRE

    Maximiano Pinheiro

    2012-01-01

    Marginal probability density and cumulative distribution functions are presented for multidimensional variables defined by nonsingular affine transformations of vectors of independent two-piece normal variables, the most important subclass of Ferreira and Steel's general multivariate skewed distributions. The marginal functions are obtained by first expressing the joint density as a mixture of Arellano-Valle and Azzalini's unified skew-normal densities and then using the property of closure u...

  15. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  16. Rationale and study design of ViPS - variable pressure support for weaning from mechanical ventilation: study protocol for an international multicenter randomized controlled open trial

    NARCIS (Netherlands)

    Kiss, Thomas; Güldner, Andreas; Bluth, Thomas; Uhlig, Christopher; Spieth, Peter Markus; Markstaller, Klaus; Ullrich, Roman; Jaber, Samir; Santos, Jose Alberto; Mancebo, Jordi; Camporota, Luigi; Beale, Richard; Schettino, Guilherme; Saddy, Felipe; Vallverdú, Immaculada; Wiedemann, Bärbel; Koch, Thea; Schultz, Marcus Josephus; Pelosi, Paolo; de Abreu, Marcelo Gama

    2013-01-01

    In pressure support ventilation (PSV), a non-variable level of pressure support is delivered by the ventilator when triggered by the patient. In contrast, variable PSV delivers a level of pressure support that varies in a random fashion, introducing more physiological variability to the respiratory

  17. Using instrumental variables to disentangle treatment and placebo effects in blinded and unblinded randomized clinical trials influenced by unmeasured confounders

    Science.gov (United States)

    Chaibub Neto, Elias

    2016-11-01

    Clinical trials traditionally employ blinding as a design mechanism to reduce the influence of placebo effects. In practice, however, it can be difficult or impossible to blind study participants and unblinded trials are common in medical research. Here we show how instrumental variables can be used to quantify and disentangle treatment and placebo effects in randomized clinical trials comparing control and active treatments in the presence of confounders. The key idea is to use randomization to separately manipulate treatment assignment and psychological encouragement conversations/interactions that increase the participants’ desire for improved symptoms. The proposed approach is able to improve the estimation of treatment effects in blinded studies and, most importantly, opens the doors to account for placebo effects in unblinded trials.
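
    A bare-bones illustration of the estimation idea, using randomized encouragement as an instrument for the received "dose" of a mediator, is sketched below with two-stage least squares written out explicitly in numpy. The data are simulated and the variable names are invented; this is not the author's model, only the generic instrumental-variable mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Unmeasured confounder affects both the mediator and the outcome.
u = rng.standard_normal(n)
encouragement = rng.integers(0, 2, n)          # randomized instrument
mediator = 0.8 * encouragement + 0.5 * u + rng.standard_normal(n)
outcome = 1.5 * mediator + 1.0 * u + rng.standard_normal(n)   # true effect = 1.5

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X_naive = np.column_stack([np.ones(n), mediator])
print("naive OLS estimate:", ols(X_naive, outcome)[1])        # biased upward by u

# Two-stage least squares: project the mediator on the instrument, then regress.
Z = np.column_stack([np.ones(n), encouragement])
mediator_hat = Z @ ols(Z, mediator)
X_2sls = np.column_stack([np.ones(n), mediator_hat])
print("2SLS estimate:", ols(X_2sls, outcome)[1])              # close to 1.5
```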

  18. On The Distribution Of Mixed Sum Of Independent Random Variables One Of Them Associated With Srivastava's Polynomials And H-Function

    Directory of Open Access Journals (Sweden)

    Singh Jagdev

    2014-07-01

    In this paper, we obtain the distribution of the mixed sum of two independent random variables with different probability density functions: one with a probability density function defined on a finite range, and the other with a probability density function defined on an infinite range and associated with a product of Srivastava's polynomials and the H-function. We use the Laplace transform and its inverse to obtain our main result. The result obtained here is quite general in nature and is capable of yielding a large number of corresponding new and known results merely by specializing the parameters involved therein. To illustrate, some special cases of our main result are also given.
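
    Although the paper works analytically through Laplace transforms, the basic fact it relies on (the density of a sum of two independent random variables is the convolution of their densities) is easy to check numerically. The sketch below uses two generic densities, a uniform on a finite range and an exponential on an infinite range; it is not the Srivastava-polynomial/H-function case treated in the paper.

```python
import numpy as np

dx = 0.001
x = np.arange(0, 20, dx)

# f: density on a finite range (uniform on [0, 2]); g: density on an infinite range.
f = np.where(x <= 2.0, 0.5, 0.0)
g = np.exp(-x)                       # exponential(1) density on [0, inf)

# Density of the sum X + Y is the convolution (f * g)(s).
h = np.convolve(f, g)[: len(x)] * dx

print("integral of h:", h.sum() * dx)           # ~1, so h is a proper density
print("mean of X + Y:", (x * h).sum() * dx)     # ~ 1 (uniform mean) + 1 (exp mean) = 2
```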

  19. The mesoscopic conductance of disordered rings, its random matrix theory and the generalized variable range hopping picture

    Energy Technology Data Exchange (ETDEWEB)

    Stotland, Alexander; Peer, Tal; Cohen, Doron [Department of Physics, Ben-Gurion University, Beer-Sheva 84005 (Israel); Budoyo, Rangga; Kottos, Tsampikos [Department of Physics, Wesleyan University, Middletown, CT 06459 (United States)

    2008-07-11

    The calculation of the conductance of disordered rings requires a theory that goes beyond the Kubo-Drude formulation. Assuming 'mesoscopic' circumstances the analysis of the electro-driven transitions shows similarities with a percolation problem in energy space. We argue that the texture and the sparsity of the perturbation matrix dictate the value of the conductance, and study its dependence on the disorder strength, ranging from the ballistic to the Anderson localization regime. An improved sparse random matrix model is introduced to capture the essential ingredients of the problem, and leads to a generalized variable range hopping picture. (fast track communication)

  20. Sum of ratios of products for α-μ random variables in wireless multihop relaying and multiple scattering

    KAUST Repository

    Wang, Kezhi

    2014-09-01

    The sum of ratios of products of independent α-μ random variables (RVs) is approximated by using the Generalized Gamma ratio approximation (GGRA), with the Gamma ratio approximation (GRA) as a special case. The proposed approximation is used to calculate the outage probability of equal gain combining (EGC) or maximum ratio combining (MRC) receivers for wireless multihop relaying or multiple scattering systems in the presence of interference. Numerical results show that the newly derived approximation works very well, as verified by simulation, while GRA performs slightly worse than GGRA when the outage probability is below 0.1 but has a more simplified form.
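
    For intuition, the outage probability that such approximations target can also be estimated by brute-force Monte Carlo. In the sketch below an α-μ variate is generated through its power, which follows a gamma-type law (R = G^(1/α) with G gamma-distributed), and the outage probability of a sum of ratios of products is estimated for made-up parameters. This is a simulation illustration only, not the GGRA/GRA formulas of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def alpha_mu(size, alpha, mu, r_hat=1.0, rng=rng):
    """Draw α-μ envelopes: R**alpha is gamma distributed with shape mu."""
    g = rng.gamma(shape=mu, scale=r_hat**alpha / mu, size=size)
    return g ** (1.0 / alpha)

n = 200_000
hops = 3                             # product of per-hop gains (multihop relaying)
signal = np.prod(alpha_mu((n, hops), alpha=2.0, mu=1.5), axis=1)
interf = np.prod(alpha_mu((n, hops), alpha=2.0, mu=1.0, r_hat=0.3), axis=1)

# Sum of two such ratio terms, e.g., combining two branches with interference.
branch2_s = np.prod(alpha_mu((n, hops), alpha=2.0, mu=1.5), axis=1)
branch2_i = np.prod(alpha_mu((n, hops), alpha=2.0, mu=1.0, r_hat=0.3), axis=1)
combined = signal / interf + branch2_s / branch2_i

threshold = 2.0
print("estimated outage probability:", np.mean(combined < threshold))
```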

  1. Impact of different privacy conditions and incentives on survey response rate, participant representativeness, and disclosure of sensitive information: a randomized controlled trial.

    Science.gov (United States)

    Murdoch, Maureen; Simon, Alisha Baines; Polusny, Melissa Anderson; Bangerter, Ann Kay; Grill, Joseph Patrick; Noorbaloochi, Siamak; Partin, Melissa Ruth

    2014-07-16

    Anonymous survey methods appear to promote greater disclosure of sensitive or stigmatizing information compared to non-anonymous methods. Higher disclosure rates have traditionally been interpreted as being more accurate than lower rates. We examined the impact of 3 increasingly private mailed survey conditions, ranging from potentially identifiable to completely anonymous, on survey response and on respondents' representativeness of the underlying sampling frame, completeness in answering sensitive survey items, and disclosure of sensitive information. We also examined the impact of 2 incentives ($10 versus $20) on these outcomes. A 3×2 factorial, randomized controlled trial of 324 representatively selected, male Gulf War I era veterans who had applied for United States Department of Veterans Affairs (VA) disability benefits. Men were asked about past sexual assault experiences, childhood abuse, combat, other traumas, mental health symptoms, and sexual orientation. We used a novel technique, the pre-merged questionnaire, to link anonymous responses to administrative data. Response rates ranged from 56.0% to 63.3% across privacy conditions (p = 0.49) and from 52.8% to 68.1% across incentives (p = 0.007). Respondents' characteristics differed by privacy and by incentive assignments, with completely anonymous respondents and $20 respondents appearing least different from their non-respondent counterparts. Survey completeness did not differ by privacy or by incentive. No clear pattern of disclosing sensitive information by privacy condition or by incentive emerged. For example, although all respondents came from the same sampling frame, estimates of sexual abuse ranged from 13.6% to 33.3% across privacy conditions, with the highest estimate coming from the intermediate privacy condition (p = 0.007). Greater privacy and larger incentives do not necessarily result in higher disclosure rates of sensitive information than lesser privacy and lower incentives. Furthermore

  2. A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.

    Science.gov (United States)

    Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco

    2005-02-01

    Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n(g) approximately 100-150 genes). In the present investigation, a large random European population sample (average n(g) approximately 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005), much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.

  3. A new mean estimator using auxiliary variables for randomized response models

    Science.gov (United States)

    Ozgul, Nilgun; Cingi, Hulya

    2013-10-01

    Randomized response models are commonly used in surveys dealing with sensitive questions such as abortion, alcoholism, sexual orientation, drug taking, annual income, and tax evasion, to ensure interviewee anonymity and to reduce nonresponse rates and biased responses. Starting from the pioneering work of Warner [7], many versions of RRM have been developed that can deal with quantitative responses. In this study, a new mean estimator is suggested for RRM involving quantitative responses. The mean square error is derived, and a simulation study is performed to show the efficiency of the proposed estimator relative to other existing estimators in RRM.
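
    One common quantitative randomized response scheme adds a scrambling variable with known distribution to the true answer, so the population mean can be recovered from the scrambled responses. A toy simulation of an additive scrambling model and its unbiased mean estimator is given below; this generic scheme is for illustration only and is not the estimator proposed in the paper, which also exploits auxiliary variables.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000

true_income = rng.lognormal(mean=10.0, sigma=0.5, size=N)   # sensitive variable

# Each respondent adds scrambling noise S with known mean and variance,
# so the interviewer never sees the true value.
scramble_mean, scramble_sd = 50.0, 500.0
reported = true_income + rng.normal(scramble_mean, scramble_sd, size=N)

# Unbiased estimator of the population mean under additive scrambling.
estimate = reported.mean() - scramble_mean
print("true mean    :", true_income.mean())
print("RRM estimate :", estimate)
```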

  4. Spontaneous temporal changes and variability of peripheral nerve conduction analyzed using a random effects model

    DEFF Research Database (Denmark)

    Krøigård, Thomas; Gaist, David; Otto, Marit

    2014-01-01

    . Peroneal nerve distal motor latency, motor conduction velocity, and compound motor action potential amplitude; sural nerve sensory action potential amplitude and sensory conduction velocity; and tibial nerve minimal F-wave latency were examined in 51 healthy subjects, aged 40 to 67 years. They were...... reexamined after 2 and 26 weeks. There was no change in the variables except for a minor decrease in sural nerve sensory action potential amplitude and a minor increase in tibial nerve minimal F-wave latency. Reproducibility was best for peroneal nerve distal motor latency and motor conduction velocity...

  5. Variability of Fiber Elastic Moduli in Composite Random Fiber Networks Makes the Network Softer

    Science.gov (United States)

    Ban, Ehsan; Picu, Catalin

    2015-03-01

    Athermal fiber networks are assemblies of beams or trusses. They have been used to model the mechanics of fibrous materials such as biopolymer gels and synthetic nonwovens. The elasticity of these networks has been studied in terms of various microstructural parameters such as the stiffness of their constituent fibers. In this work we investigate the elasticity of composite fiber networks made from fibers with moduli sampled from a distribution function. We use finite element simulations to study networks made by 3D Voronoi and Delaunay tessellations. The resulting data collapse to power laws, showing that variability in fiber stiffness makes fiber networks softer. We also support the findings by analytical arguments. Finally, we apply these results to a network with curved fibers to explain the dependence of the network's modulus on the variation of its structural parameters.
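
    A one-dimensional caricature of why stiffness variability softens a network: for elements loaded in series, the effective stiffness is the harmonic mean of the element stiffnesses, which is always below the arithmetic mean and drops as the spread grows. The snippet below illustrates this with lognormally distributed stiffnesses; it is a heuristic analogy, not the 3D Voronoi/Delaunay finite element model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)
n_fibers = 10_000

for spread in (0.0, 0.3, 0.6, 1.0):
    # Sample fiber stiffnesses with fixed mean but increasing variability.
    k = np.exp(rng.normal(0.0, spread, n_fibers))
    k /= k.mean()                              # normalize so the mean stiffness is 1
    k_series = 1.0 / np.mean(1.0 / k)          # harmonic mean: compliances add in series
    print(f"spread {spread:.1f}: mean k = {k.mean():.2f}, series stiffness = {k_series:.3f}")
```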

  6. Randomized Trial of a Lifestyle Physical Activity Intervention for Breast Cancer Survivors: Effects on Transtheoretical Model Variables.

    Science.gov (United States)

    Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen

    2018-01-01

    This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy ( F = 9.55, p = .003) and perceived significantly fewer cons of exercise ( F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.

  7. Combining fixed effects and instrumental variable approaches for estimating the effect of psychosocial job quality on mental health: evidence from 13 waves of a nationally representative cohort study.

    Science.gov (United States)

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis

    2017-06-23

    Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13 260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40). In the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference.

  8. Representing dispositions

    Directory of Open Access Journals (Sweden)

    Röhl Johannes

    2011-08-01

    Dispositions and tendencies feature significantly in the biomedical domain and therefore in representations of knowledge of that domain. They are not only important for specific applications like an infectious disease ontology, but also as part of a general strategy for modelling knowledge about molecular interactions. But the task of representing dispositions in some formal ontological systems is fraught with several problems, which are partly due to the fact that Description Logics can only deal well with binary relations. The paper will discuss some of the results of the philosophical debate about dispositions, in order to see whether the formal relations needed to represent dispositions can be broken down to binary relations. Finally, we will discuss problems arising from the possibility of the absence of realizations, of multi-track or multi-trigger dispositions and offer suggestions on how to deal with them.

  9. An MGF-based Unified Framework to Determine the Joint Statistics of Partial Sums of Ordered Random Variables

    CERN Document Server

    Nam, Sung Sik; Yang, Hong-Chuan

    2010-01-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs). With the proposed approach, we can systematically derive the joint statistics of any partial sums of ordered statistics, in terms of the moment generating function (MGF) and the probability density function (PDF). Our MGF-based approach applies not only when all the K ordered RVs are involved but also when only the Ks (Ks < K) best RVs are considered. In addition, we present the closed-form expressions for the exponential RV special case. These results apply to the performance analysis of various wireless communication systems over fading channels.

  10. An MGF-based unified framework to determine the joint statistics of partial sums of ordered random variables

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs). With the proposed approach, we can systematically derive the joint statistics of any partial sums of ordered statistics, in terms of the moment generating function (MGF) and the probability density function (PDF). Our MGF-based approach applies not only when all the K ordered RVs are involved but also when only the Ks (Ks < K) best RVs are considered. In addition, we present the closed-form expressions for the exponential RV special case. These results apply to the performance analysis of various wireless communication systems over fading channels. © 2006 IEEE.
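
    The quantities treated by this framework, such as the sum of the Ks largest of K branch SNRs used in generalized selection combining, are easy to sanity-check by simulation. Below is a Monte Carlo sketch for the exponential special case; it is illustrative only, whereas the papers give exact MGF/PDF expressions.

```python
import numpy as np

rng = np.random.default_rng(5)
K, Ks = 8, 3                         # keep the 3 strongest of 8 branches (GSC)
n = 500_000

# i.i.d. exponential branch SNRs; sort each row in descending order.
snr = rng.exponential(scale=1.0, size=(n, K))
ordered = np.sort(snr, axis=1)[:, ::-1]

partial_sum = ordered[:, :Ks].sum(axis=1)      # sum of the Ks largest order statistics

print("mean of partial sum  :", partial_sum.mean())
print("P(partial sum < 1.0) :", np.mean(partial_sum < 1.0))   # outage-style metric
```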

  11. The effects of yoga on psychosocial variables and exercise adherence: a randomized, controlled pilot study.

    Science.gov (United States)

    Bryan, Stephanie; Pinto Zipp, Genevieve; Parasher, Raju

    2012-01-01

    Physical inactivity is a serious issue for the American public. Because of conditions that result from inactivity, individuals incur close to $1 trillion USD in health-care costs, and approximately 250 000 premature deaths occur per year. Researchers have linked engaging in yoga to improved overall fitness, including improved muscular strength, muscular endurance, flexibility, and balance. Researchers have not yet investigated the impact of yoga on exercise adherence. The research team assessed the effects of 10 weeks of yoga classes held twice a week on exercise adherence in previously sedentary adults. The research team designed a randomized controlled pilot trial. The team collected data from the intervention (yoga) and control groups at baseline, midpoint, and posttest (posttest 1) and also collected data pertaining to exercise adherence for the yoga group at 5 weeks posttest (posttest 2). The pilot took place in a yoga studio in central New Jersey in the United States. The pretesting occurred at the yoga studio for all participants. Midpoint testing and posttesting occurred at the studio for the yoga group and by mail for the control group. Participants were 27 adults (mean age 51 y) who had been physically inactive for a period of at least 6 months prior to the study. Interventions The intervention group (yoga group) received hour-long hatha yoga classes that met twice a week for 10 weeks. The control group did not participate in classes during the research study; however, they were offered complimentary post research classes. Outcome Measures The study's primary outcome measure was exercise adherence as measured by the 7-day Physical Activity Recall. The secondary measures included (1) exercise self-efficacy as measured by the Multidimensional Self-Efficacy for Exercise Scale, (2) general well-being as measured by the General Well-Being Schedule, (3) exercise-group cohesion as measured by the Group Environment Questionnaire (GEQ), (4) acute feeling response

  12. Status inconsistency and mental health: A random effects and instrumental variables analysis using 14 annual waves of cohort data.

    Science.gov (United States)

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Petrie, Dennis

    2017-09-01

    Status inconsistency refers to a discrepancy between the position a person holds in one domain of their social environment compared to their position in another domain, for example, the experience of being overeducated for a job, or not using your skills in your job. We sought to assess the relationship between status inconsistency and mental health using 14 annual waves of cohort data. We used two approaches to measuring status inconsistency: (a) being overeducated for your job (objective measure); and (b) not using your skills in your job (subjective measure). We implemented a number of methodological approaches to assess the robustness of our findings, including instrumental variable, random effects, and fixed effects analysis. Mental health was assessed using the Mental Health Inventory-5. The random effects analysis indicates that only the subjective measure of status inconsistency was associated with a slight decrease in mental health (β = -1.57, 95% CI: -1.78 to -1.36), a finding relevant to the relationship between social determinants (such as work and education) and health outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Representing Development

    DEFF Research Database (Denmark)

    Representing Development presents the different social representations that have formed the idea of development in Western thinking over the past three centuries. Offering an acute perspective on the current state of developmental science and providing constructive insights into future pathways...... and development, addressing their contemporary enactments and reflecting on future theoretical and empirical directions. The first section of the book provides an historical account of early representations of development that, having come from life science, has shaped the way in which developmental science has...... approached development. Section two focuses upon the contemporary issues of developmental psychology, neuroscience and developmental science at large. The final section offers a series of commentaries pointing to the questions opened by the previous chapters, looking to outline the future lines...

  14. Analysis of genetic variability within Argulus japonicus from representatives of Africa, Middle East, and Asia revealed by sequences of three mitochondrial DNA genes.

    Science.gov (United States)

    Wadeh, Hicham; Alsarakibi, Muhamd; Li, Guoqing

    2010-08-01

    This study investigated the genetic variability within the fish louse Argulus japonicus (Crustacea: Branchiura) from Africa, the Middle East, and Asia by polymerase chain reaction in three mitochondrial DNA (mtDNA) regions, namely, cytochrome c oxidase subunit 1 (cox1) and NADH dehydrogenase subunits 1 and 4 (nad1 and nad4). Six different sequences from a portion of the cox1 gene (pcox1) and a portion of the nad1 and nad4 genes (pnad1 and pnad4) for ten adult specimens from infected fish in China, Egypt, and Syria were amplified separately from individual specimens, and the amplicons were subjected to direct sequencing. A + T percentages were 68.8-69% for pcox1, 77.1-77.6% for pnad1, and 60.4-60.9% for pnad4. Among all the collected parasites, A. japonicus sequence variations were 0.0-1.9% for cox1, 0.0-2.3% for nad1, and 0.0-0.8% for nad4. In rivers, sequence variations among all individuals were 0.4-0.8% for cox1, 1.0-2.3% for nad1, and 0.4-0.8% for nad4, while sequence variations among all the collected parasites in fish farms were 0.6-1.9% for cox1, 0.0-1.7% for nad1, and 0.2-0.6% for nad4. The nad1 gene was the most variable among the selected markers, while nad4 was more conserved than cox1. All isolates of A. japonicus were sister to Argulus americanus in the phylogenetic tree and they grouped together in one sub-clade, while isolates from China and Egypt fish farms were closely clustered together. However, moderate genetic drift and slight mutation could be observed among A. japonicus individuals. These findings demonstrated the convenience and attributes of the three selected mtDNA sequences for population genetic studies of A. japonicus, where nad1 is a new and reliable marker to detect sequence variation among A. japonicus individuals.

  15. Large Intra-subject Variability in Caffeine Pharmacokinetics: Randomized Cross-over Study of Single Caffeine Product.

    Science.gov (United States)

    Hammami, Muhammad M; Alvi, Syed N

    2017-09-01

    Background: Average bioequivalence has been criticized for not adequately addressing individual variations. The importance of subjects' blinding in bioequivalence studies has not been well studied. We explored the extent of intra-subject pharmacokinetic variability and the effect of drug-ingestion unawareness in subjects taking a single caffeine product. Methods: A single-dose randomized cross-over design was used to compare pharmacokinetics of 200 mg caffeine, described as caffeine (overt) or as placebo (covert). Maximum concentration (Cmax), time to Cmax (Tmax), area under the concentration-time curve to the last measured concentration (AUCT), extrapolated to infinity (AUCI), or to Tmax of overt caffeine (AUCOverttmax), and Cmax/AUCI were calculated blindly using a standard non-compartmental method. Percentages of individual covert/overt ratios that are outside the ±25% range were determined. The covert-vs-overt effect on caffeine pharmacokinetics was evaluated by 90% confidence interval (CI) and the 80.00-125.00% bioequivalence range. Results: 32 healthy subjects (6% females, mean (SD) age 33.3 (7.2) years) participated in the study (28 analysed). Out of the 28 individual covert/overt ratios, 23% were outside the ±25% range for AUCT, 30% for AUCI, 20% for AUCOverttmax, 30% for Cmax, and 43% for Tmax. There was no significant covert-vs-overt difference in any of the pharmacokinetic parameters studied. Further, the 90% CIs for AUCT, AUCI, Cmax, AUCOverttmax, and Cmax/AUCI were all within the 80.00-125.00% bioequivalence range with mean absolute deviation of covert/overt ratios of 3.31%, 6.29%, 1.43%, 1.87%, and 5.19%, respectively. Conclusions: Large intra-subject variability in main caffeine pharmacokinetic parameters was noted when comparing an oral caffeine product to itself. Subjects' blinding may not be important in average bioequivalence studies. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Statistics of α-μ Random Variables and Their Applications in Wireless Multihop Relaying and Multiple Scattering Channels

    KAUST Repository

    Wang, Kezhi

    2015-06-01

    Exact results for the probability density function (PDF) and cumulative distribution function (CDF) of the sum of ratios of products (SRP) and the sum of products (SP) of independent α-μ random variables (RVs) are derived. They are in the form of 1-D integrals based on existing work on the products and ratios of α-μ RVs. In the derivation, generalized Gamma (GG) ratio approximation (GGRA) is proposed to approximate SRP. Gamma ratio approximation (GRA) is proposed to approximate SRP and the ratio of sums of products (RSP). GG approximation (GGA) and Gamma approximation (GA) are used to approximate SP. The proposed results of the SRP can be used to calculate the outage probability (OP) for wireless multihop relaying systems or multiple scattering channels with interference. The proposed results of the SP can be used to calculate the OP for these systems without interference. In addition, the proposed approximate result of the RSP can be used to calculate the OP of the signal-to-interference ratio (SIR) in a multiple scattering system with interference. © 1967-2012 IEEE.

  17. Experimental Evaluation of Novel Master-Slave Configurations for Position Control under Random Network Delay and Variable Load for Teleoperation

    Directory of Open Access Journals (Sweden)

    Ahmet Kuzu

    2014-01-01

    This paper proposes two novel master-slave configurations that provide improvements in both control and communication aspects of teleoperation systems to achieve an overall improved performance in position control. The proposed novel master-slave configurations integrate modular control and communication approaches, consisting of a delay regulator to address problems related to variable network delay common to such systems, and a model tracking control that runs on the slave side for the compensation of uncertainties and model mismatch on the slave side. One of the configurations uses a sliding mode observer and the other one uses a modified Smith predictor scheme on the master side to ensure position transparency between the master and slave, while reference tracking of the slave is ensured by a proportional-differentiator type controller in both configurations. Experiments conducted for the networked position control of a single-link arm under system uncertainties and randomly varying network delays demonstrate significant performance improvements with both configurations over the past literature.

  18. Post traumatic stress symptoms and heart rate variability in Bihar flood survivors following yoga: a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Joshi Meesha

    2010-03-01

    Background: An earlier study showed that a week of yoga practice was useful in stress management after a natural calamity. Due to heavy rain and a rift on the banks of the Kosi river, in the state of Bihar in north India, there were floods with loss of life and property. A week of yoga practice was given to the survivors a month after the event and the effect was assessed. Methods: Twenty-two volunteers (group average age ± S.D., 31.5 ± 7.5 years; all of them were males) were randomly assigned to two groups, yoga and a non-yoga wait-list control group. The yoga group practiced yoga for an hour daily while the control group continued with their routine activities. Both groups' heart rate variability, breath rate, and four symptoms of emotional distress assessed using visual analog scales were measured on the first and eighth day of the program. Results: There was a significant decrease in sadness in the yoga group. Conclusions: A week of yoga can reduce feelings of sadness and possibly prevent an increase in anxiety in flood survivors a month after the calamity. Trial Registration: Clinical Trials Registry of India: CTRI/2009/091/000285

  19. Community representatives: representing the "community"?

    Science.gov (United States)

    Jewkes, R; Murcott, A

    1998-04-01

    This paper takes as its starting point the apparent disjunction between the assumptions of the self-evidence of the meaning of community in major international declarations and strategies which promote community participation and the observation that meanings of "community" are a subject of extensive debate in literatures of social analysis and to some extent health. Given that the word's meaning is not agreed, those working to promote "community participation" in health are forced to adjudicate on competing meanings in order to operationalise the notion. This raises questions about how this is done and what are the implications of particular choices for what may be achieved by the participating "community". This paper presents the findings of an empirical study which examined the manner in which ideas of "community" are operationalised by people engaged in encouraging community participation in health promotion in the context of the selection of members for health for all steering groups in healthy cities projects in the United Kingdom. It argues that the demands of the role of the "community representative" are such that particular interpretations of "community" achieve ascendance. The paper explores the consequences of the interpretation of "community" as part of the "voluntary sector" and argues that this may compromise one of the stated desired outcomes of community participation i.e. extending democracy in health decision-making.

  20. Convergence Analysis of Semi-Implicit Euler Methods for Solving Stochastic Age-Dependent Capital System with Variable Delays and Random Jump Magnitudes

    Directory of Open Access Journals (Sweden)

    Qinghui Du

    2014-01-01

    We consider semi-implicit Euler methods for stochastic age-dependent capital system with variable delays and random jump magnitudes, and investigate the convergence of the numerical approximation. It is proved that the numerical approximate solutions converge to the analytical solutions in the mean-square sense under given conditions.
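
    To give a concrete picture of the kind of scheme being analyzed, the sketch below applies a drift-implicit (semi-implicit) Euler step to a simple scalar jump-diffusion with a discrete delay: the linear drift is evaluated at the new time level and solved for, while the diffusion, delay and jump terms are kept explicit. This toy equation and its coefficients are invented for illustration and are far simpler than the stochastic age-dependent capital system studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

a, b, c = -0.5, 0.2, 0.1     # drift, diffusion, delay-feedback coefficients
lam = 0.5                    # jump intensity (Poisson)
tau, T, dt = 0.5, 5.0, 0.01
steps, lag = int(T / dt), int(tau / dt)

x = np.empty(steps + 1)
x[0] = 1.0
history = 1.0                # constant pre-history on [-tau, 0]

for n in range(steps):
    x_delay = x[n - lag] if n >= lag else history
    dW = rng.normal(0.0, np.sqrt(dt))
    dN = rng.poisson(lam * dt)
    jump = dN * rng.normal(0.0, 0.1) * x[n]        # jump with random magnitude
    rhs = x[n] + c * x_delay * dt + b * x[n] * dW + jump
    x[n + 1] = rhs / (1.0 - a * dt)                # drift term a*x taken implicitly

print("X(T) ≈", x[-1])
```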

  1. Effects of Yoga on Stress, Stress Adaption, and Heart Rate Variability Among Mental Health Professionals--A Randomized Controlled Trial.

    Science.gov (United States)

    Lin, Shu-Ling; Huang, Ching-Ya; Shiu, Shau-Ping; Yeh, Shu-Hui

    2015-08-01

    Mental health professionals experiencing work-related stress may experience burnout, leading to a negative impact on their organization and patients. The aim of this study was to examine the effects of yoga classes on work-related stress, stress adaptation, and autonomic nerve activity among mental health professionals. A randomized controlled trial was used, which compared the outcomes between the experimental (e.g., yoga program) and the control groups (e.g., no yoga exercise) for 12 weeks. Work-related stress and stress adaptation were assessed before and after the program. Heart rate variability (HRV) was measured at baseline, midpoint through the weekly yoga classes (6 weeks), and postintervention (after 12 weeks of yoga classes). The results showed that the mental health professionals in the yoga group experienced a significant reduction in work-related stress (t = -6.225). Comparing the yoga and control groups, we found the yoga group significantly decreased work-related stress (t = -3.216, p = .002), but there was no significant change in stress adaptation (p = .084). While controlling for the pretest scores of work-related stress, participants in the yoga group, but not the control group, revealed a significant increase in autonomic nerve activity at the midpoint (6 weeks) test (t = -2.799, p = .007), and at posttest (12 weeks; t = -2.099, p = .040). Because mental health professionals experienced a reduction in work-related stress and an increase in autonomic nerve activity in a weekly yoga program for 12 weeks, clinicians, administrators, and educators should offer yoga classes as a strategy to help health professionals reduce their work-related stress and balance autonomic nerve activities. © 2015 The Authors. Worldviews on Evidence-Based Nursing published by Wiley Periodicals, Inc. on behalf of Society for Worldviews on Evidence-Based Nursing.

  2. Night-to-Night Sleep Variability in Older Adults With Chronic Insomnia: Mediators and Moderators in a Randomized Controlled Trial of Brief Behavioral Therapy (BBT-I).

    Science.gov (United States)

    Chan, Wai Sze; Williams, Jacob; Dautovich, Natalie D; McNamara, Joseph P H; Stripling, Ashley; Dzierzewski, Joseph M; Berry, Richard B; McCoy, Karin J M; McCrae, Christina S

    2017-11-15

    Sleep variability is a clinically significant variable in understanding and treating insomnia in older adults. The current study examined changes in sleep variability in the course of brief behavioral therapy for insomnia (BBT-I) in older adults who had chronic insomnia. Additionally, the current study examined the mediating mechanisms underlying reductions of sleep variability and the moderating effects of baseline sleep variability on treatment responsiveness. Sixty-two elderly participants were randomly assigned to either BBT-I or self-monitoring and attention control (SMAC). Sleep was assessed by sleep diaries and actigraphy from baseline to posttreatment and at 3-month follow-up. Mixed models were used to examine changes in sleep variability (within-person standard deviations of weekly sleep parameters) and the hypothesized mediation and moderation effects. Variabilities in sleep diary-assessed sleep onset latency (SOL) and actigraphy-assessed total sleep time (TST) significantly decreased in BBT-I compared to SMAC (Pseudo R(2) = .12, .27; P = .018, .008). These effects were mediated by reductions in bedtime and wake time variability and time in bed. Significant time × group × baseline sleep variability interactions on sleep outcomes indicated that participants who had higher baseline sleep variability were more responsive to BBT-I; their actigraphy-assessed TST, SOL, and sleep efficiency improved to a greater degree (Pseudo R(2) = .15 to .66). These findings indicate that BBT-I reduces night-to-night sleep variability in older adults who have chronic insomnia. Increased consistency in bedtime and wake time and decreased time in bed mediate reductions of sleep variability. Baseline sleep variability may serve as a marker of high treatment responsiveness to BBT-I. ClinicalTrials.gov, Identifier: NCT02967185.

  3. Random functions and turbulence

    CERN Document Server

    Panchev, S

    1971-01-01

    International Series of Monographs in Natural Philosophy, Volume 32: Random Functions and Turbulence focuses on the use of random functions as mathematical methods. The manuscript first offers information on the elements of the theory of random functions. Topics include determination of statistical moments by characteristic functions; functional transformations of random variables; multidimensional random variables with spherical symmetry; and random variables and distribution functions. The book then discusses random processes and random fields, including stationarity and ergodicity of random

  4. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial

    NARCIS (Netherlands)

    Spieth, Peter M.; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J.; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-01-01

    General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary

  5. Variable versus conventional lung protective mechanical ventilation during open abdominal surgery: study protocol for a randomized controlled trial

    OpenAIRE

    Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J.; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo

    2014-01-01

    Background General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function and reduces systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventila...

  6. Effect of an office worksite-based yoga program on heart rate variability: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Chang Dennis

    2011-07-01

    Background: Chronic work-related stress is a significant and independent risk factor for cardiovascular and metabolic diseases and associated mortality, particularly when compounded by a sedentary work environment. Heart rate variability (HRV) provides an estimate of parasympathetic and sympathetic autonomic control, and can serve as a marker of physiological stress. Hatha yoga is a physically demanding practice that can help to reduce stress; however, time constraints incurred by work and family life may limit participation. The purpose of the present study is to determine if a 10-week, worksite-based yoga program delivered during lunch hour can improve resting HRV and related physical and psychological parameters in sedentary office workers. Methods and design: This is a parallel-arm RCT that will compare the outcomes of participants assigned to the experimental treatment group (yoga) to those assigned to a no-treatment control group. Participants randomized to the experimental condition will engage in a 10-week yoga program delivered at their place of work. The yoga sessions will be group-based, prescribed three times per week during lunch hour, and will be led by an experienced yoga instructor. The program will involve teaching beginner students safely and progressively over 10 weeks a yoga sequence that incorporates asanas (poses and postures), vinyasa (exercises), pranayama (breathing control) and meditation. The primary outcome of this study is the high frequency (HF) spectral power component of HRV (measured in absolute units; i.e. ms2), a measure of parasympathetic autonomic control. Secondary outcomes include additional frequency and time domains of HRV, and measures of physical functioning and psychological health status. Measures will be collected prior to and following the intervention period, and at 6 months follow-up to determine the effect of intervention withdrawal. Discussion: This study will determine the effect of worksite

  7. Effect of an office worksite-based yoga program on heart rate variability: outcomes of a randomized controlled trial

    Science.gov (United States)

    2013-01-01

    Background: Chronic work-related stress is an independent risk factor for cardiometabolic diseases and associated mortality, particularly when compounded by a sedentary work environment. The purpose of this study was to determine if an office worksite-based hatha yoga program could improve physiological stress, evaluated via heart rate variability (HRV), and associated health-related outcomes in a cohort of office workers. Methods: Thirty-seven adults employed in university-based office positions were randomized upon the completion of baseline testing to an experimental or control group. The experimental group completed a 10-week yoga program prescribed three sessions per week during lunch hour (50 min per session). An experienced instructor led the sessions, which emphasized asanas (postures) and vinyasa (exercises). The primary outcome was the high frequency (HF) power component of HRV. Secondary outcomes included additional HRV parameters, musculoskeletal fitness (i.e. push-up, side-bridge, and sit & reach tests) and psychological indices (i.e. state and trait anxiety, quality of life and job satisfaction). Results: All measures of HRV failed to change in the experimental group versus the control group, except that the experimental group significantly increased LF:HF (p = 0.04) and reduced pNN50 (p = 0.04) versus control, contrary to our hypotheses. Flexibility, evaluated via the sit & reach test, increased in the experimental group versus the control group. Comparing participants with high adherence to the yoga sessions (n = 11) to control (n = 19) yielded the same findings, except that the high adherers also reduced state anxiety (p = 0.02) and RMSSD (p = 0.05), and tended to improve the push-up test (p = 0.07) versus control. Conclusions: A 10-week hatha yoga intervention delivered at the office worksite during lunch hour did not improve HF power or other HRV parameters. However, improvements in flexibility, state anxiety and musculoskeletal fitness were noted with high adherence

  8. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik

    2014-08-01

    The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application on the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile. © 1991-2012 IEEE.

  9. Random Initialisation of the Spectral Variables: an Alternate Approach for Initiating Multivariate Curve Resolution Alternating Least Square (MCR-ALS) Analysis.

    Science.gov (United States)

    Kumar, Keshav

    2017-06-23

    Multivariate curve resolution alternating least square (MCR-ALS) analysis is the most commonly used curve resolution technique. The MCR-ALS model is fitted using the alternating least squares (ALS) algorithm, which needs initialisation of either the contribution profiles or the spectral profiles of each factor. The contribution profiles can be initialised using evolving factor analysis; however, in principle, this approach requires that the data belong to a sequential process. The initialisation of the spectral profiles is usually carried out using a pure variable approach such as the SIMPLISMA algorithm; this approach demands that each factor have pure variables in the data set. Despite these limitations, the existing approaches have been quite successful for initiating the MCR-ALS analysis. However, the present work proposes an alternate approach for the initialisation of the spectral variables by generating random values within the limits spanned by the maxima and minima of each spectral variable of the data set. The proposed approach does not require that there be pure variables for each component of the multicomponent system or that the concentration direction follow a sequential process. The proposed approach is successfully validated using excitation-emission matrix fluorescence data sets acquired for certain fluorophores with significant spectral overlap. The calculated contribution and spectral profiles of these fluorophores are found to correlate well with the experimental results. In summary, the present work proposes an alternate way to initiate the MCR-ALS analysis.
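
    The proposed initialisation amounts to drawing, for each factor, a random spectrum whose value at every spectral channel lies between the minimum and maximum observed at that channel. A minimal sketch of that step is shown below; the subsequent ALS iterations are omitted, and the array shapes, names and toy data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spectral_init(D, n_components, rng=rng):
    """Random initial spectra for MCR-ALS.

    D            : (n_samples, n_channels) data matrix.
    n_components : number of factors to resolve.
    Returns an (n_components, n_channels) matrix whose entries are drawn
    uniformly between the per-channel minimum and maximum of D.
    """
    lo = D.min(axis=0)
    hi = D.max(axis=0)
    return lo + rng.random((n_components, D.shape[1])) * (hi - lo)

# Toy data: 3 overlapping Gaussian "spectra" mixed with random concentrations.
channels = np.linspace(0, 100, 200)
pure = np.exp(-0.5 * ((channels[None, :] - np.array([[30.0], [50.0], [65.0]])) / 8.0) ** 2)
C_true = rng.random((40, 3))
D = C_true @ pure + 0.01 * rng.standard_normal((40, 200))

S0 = random_spectral_init(D, n_components=3)
print("initial spectra shape:", S0.shape)       # S0 would seed the ALS iterations
```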

  10. PEF variability, bronchial responsiveness and their relation to allergy markers in a random population (20-70 yr)

    NARCIS (Netherlands)

    Boezen, HM; Postma, DS; Schouten, JP; Kerstjens, HAM; Rijcken, B

    We investigated the coherence of bronchial hyperresponsiveness (BHR) and peak expiratory flow (PEF) variability in their relation to allergy markers and respiratory symptoms in 399 subjects (20-70 yr). Bronchial hyperresponsiveness to methacholine was defined by both the provocative dose causing a

  11. A randomized clinical trial comparing the effect of basal insulin and inhaled mealtime insulin on glucose variability and oxidative stress

    NARCIS (Netherlands)

    Siegelaar, S. E.; Kulik, W.; van Lenthe, H.; Mukherjee, R.; Hoekstra, J. B. L.; DeVries, J. H.

    2009-01-01

    To assess the effect of three times daily mealtime inhaled insulin therapy compared with once daily basal insulin glargine therapy on 72-h glucose profiles, glucose variability and oxidative stress in type 2 diabetes patients. In an inpatient crossover study, 40 subjects with type 2 diabetes were

  12. Chinese Herbal Medicine Combined with Conventional Therapy for Blood Pressure Variability in Hypertension Patients: A Systematic Review of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Zhuo Chen

    2015-01-01

    Objective. The aim of this systematic review is to evaluate the effect of Chinese medicine combined with conventional therapy on blood pressure variability (BPV) in hypertension patients. Methods. All randomized clinical trials (RCTs) comparing Chinese medicine with no intervention or placebo on the basis of conventional therapy were included. Data extraction, analyses, and quality assessment were performed according to the Cochrane standards. Results. We included 13 RCTs and assessed risk of bias for all the trials. Chinese medicine has a significant effect in lowering blood pressure (BP), reducing BPV in the form of standard deviation (SD) or coefficient of variability (CV), improving the nighttime BP decrease rate, and reversing abnormal rhythm of BP. Conclusions. Chinese medicine was safe and showed beneficial effects on BPV in hypertension patients. However, more rigorous trials with high quality are warranted to provide a high level of evidence before recommending Chinese medicine as an alternative or complementary medicine to improve BPV in hypertension patients.

  13. Tai Chi training may reduce dual task gait variability, a potential mediator of fall risk, in healthy older adults: cross-sectional and randomized trial studies

    Directory of Open Access Journals (Sweden)

    Peter M Wayne

    2015-06-01

    BACKGROUND: Tai Chi (TC) exercise improves balance and reduces falls in older, health-impaired adults. TC's impact on dual task (DT) gait parameters predictive of falls, especially in healthy active older adults, however, is unknown. PURPOSE: To compare differences in usual and DT gait between long-term TC-expert practitioners and age-/gender-matched TC-naïve adults, and to determine the effects of short-term TC training on gait in healthy, non-sedentary older adults. METHODS: A cross-sectional study compared gait in healthy TC-naïve and TC-expert (24.5±12 yrs experience) older adults. TC-naïve adults then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Gait speed and stride time variability (%) were assessed during 90 sec trials of undisturbed and cognitive DT (serial subtractions) conditions. RESULTS: During DT, gait speed decreased (p<0.003) and stride time variability increased (p<0.004) in all groups. Cross-sectional comparisons indicated that stride time variability was lower in the TC-expert vs. TC-naïve group, significantly so during DT (2.11% vs. 2.55%; p=0.027); in contrast, gait speed during both undisturbed and DT conditions did not differ between groups. Longitudinal analyses of TC-naïve adults randomized to 6 months of TC training or usual care identified improvement in DT gait speed in both groups. A small improvement in DT stride time variability (effect size = 0.2) was estimated with TC training, but no significant differences between groups were observed. Potentially important improvements after TC training could not be excluded in this small study. CONCLUSIONS: In healthy active older adults, positive effects of short- and long-term TC were observed only under cognitively challenging DT conditions and only for stride time variability. DT stride variability offers a potentially sensitive metric for monitoring TC's impact on fall risk with healthy older adults.

  14. Comparison of structured and unstructured physical activity training on predicted VO2max and heart rate variability in adolescents - a randomized control trial.

    Science.gov (United States)

    Sharma, Vivek Kumar; Subramanian, Senthil Kumar; Radhakrishnan, Krishnakumar; Rajendran, Rajathi; Ravindran, Balasubramanian Sulur; Arunachalam, Vinayathan

    2017-05-01

    Physical inactivity contributes to many health issues. The WHO-recommended physical activity for adolescents encompasses aerobic, resistance, and bone strengthening exercises aimed at achieving health-related physical fitness. Heart rate variability (HRV) and maximal aerobic capacity (VO2max) are considered noninvasive measures of cardiovascular health. The objective of this study is to compare the effect of structured and unstructured physical training on maximal aerobic capacity and HRV among adolescents. We designed a single blinded, parallel, randomized active-controlled trial (Registration No. CTRI/2013/08/003897) to compare the physiological effects of 6 months of globally recommended structured physical activity (SPA) with that of unstructured physical activity (USPA) in healthy school-going adolescents. We recruited 439 healthy student volunteers (boys: 250, girls: 189) in the age group of 12-17 years. Randomization across the groups was done using an age- and gender-stratified randomization method, and the participants were divided into two groups: SPA (n=219, boys: 117, girls: 102) and USPA (n=220, boys: 119, girls: 101). Depending on their training status and gender the participants in both SPA and USPA groups were further subdivided into the following four sub-groups: SPA athlete boys (n=22) and girls (n=17), SPA nonathlete boys (n=95) and girls (n=85), USPA athlete boys (n=23) and girls (n=17), and USPA nonathlete boys (n=96) and girls (n=84). We recorded HRV, body fat%, and VO2max using the Rockport Walk Fitness test before and after the intervention. Maximum aerobic capacity and heart rate variability increased significantly while heart rate, systolic blood pressure, diastolic blood pressure, and body fat percentage decreased significantly after both the SPA and USPA interventions. However, the improvement was greater in SPA than in USPA. SPA is more beneficial for improving cardiorespiratory fitness and HRV, and for reducing body fat percentage.

  15. Acute Effects of Caffeine on Heart Rate Variability, Blood Pressure and Tidal Volume in Paraplegic and Tetraplegic Compared to Able-Bodied Individuals: A Randomized, Blinded Trial.

    Science.gov (United States)

    Flueck, Joelle Leonie; Schaufelberger, Fabienne; Lienert, Martina; Schäfer Olstad, Daniela; Wilhelm, Matthias; Perret, Claudio

    2016-01-01

    Caffeine increases sympathetic nerve activity in healthy individuals. Such modulation of nervous system activity can be tracked by assessing the heart rate variability. This study aimed to investigate the influence of caffeine on time- and frequency-domain heart rate variability parameters, blood pressure and tidal volume in paraplegic and tetraplegic compared to able-bodied participants. Heart rate variability was measured in supine and sitting position pre and post ingestion of either placebo or 6 mg caffeine in 12 able-bodied, 9 paraplegic and 7 tetraplegic participants in a placebo-controlled, randomized and double-blind study design. Metronomic breathing was applied (0.25 Hz) and tidal volume was recorded during heart rate variability assessment. Blood pressure, plasma caffeine and epinephrine concentrations were analyzed pre and post ingestion. Most parameters of heart rate variability did not significantly change post caffeine ingestion compared to placebo. Tidal volume significantly increased post caffeine ingestion in able-bodied (p = 0.021) and paraplegic (p = 0.036) but not in tetraplegic participants (p = 0.34). Systolic and diastolic blood pressure increased significantly post caffeine in able-bodied (systolic: p = 0.003; diastolic: p = 0.021) and tetraplegic (systolic: p = 0.043; diastolic: p = 0.042) but not in paraplegic participants (systolic: p = 0.09; diastolic: p = 0.33). Plasma caffeine concentrations were significantly increased post caffeine ingestion in all three groups of participants. The effect of caffeine on the autonomic nervous system seems to depend on the level of lesion and the extent of the impairment. Therefore, tetraplegic participants may be less influenced by caffeine ingestion. ClinicalTrials.gov NCT02083328.

  16. Critical points of quadratic renormalizations of random variables and phase transitions of disordered polymer models on diamond lattices.

    Science.gov (United States)

    Monthus, Cécile; Garel, Thomas

    2008-02-01

    We study the wetting transition and the directed polymer delocalization transition on diamond hierarchical lattices. These two phase transitions with frozen disorder correspond to the critical points of quadratic renormalizations of the partition function. (These exact renormalizations on diamond lattices can also be considered as approximate Migdal-Kadanoff renormalizations for hypercubic lattices.) In terms of the rescaled partition function z = Z/Z_typ, we find that the critical point corresponds to a fixed point distribution with a power-law tail P_c(z) ~ Phi(ln z)/z^(1+mu) as z -> +infinity [up to some subleading logarithmic correction Phi(ln z)], so that all moments z^n with n > mu diverge. For the wetting transition, the first moment diverges, <z> = +infinity (case 0 < mu < 1), whereas for the directed polymer transition, the second moment diverges, <z^2> = +infinity (case 1 < mu < 2). The critical fixed point distribution coincides with the transfer matrix describing a directed polymer on the Cayley tree, but the random weights determined by the fixed point distribution P_c(z) are broadly distributed. This induces some changes in the traveling wave solutions with respect to the usual case of more narrow distributions.
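
    The statement that all moments z^n with n > mu diverge for such a power-law tail can be illustrated numerically with any heavy-tailed stand-in. The sketch below uses a Pareto distribution with tail exponent mu = 1.5 (an assumption, not the paper's fixed-point distribution): the first empirical moment settles down as the sample grows, while the second keeps drifting.

```python
import numpy as np

# Illustration only: Pareto tail P(z) ~ 1/z^(1+mu) with mu = 1.5, so the
# n = 1 moment is finite but the n = 2 moment diverges.
rng = np.random.default_rng(0)
mu = 1.5
for n_samples in (10**3, 10**5, 10**7):
    z = 1.0 + rng.pareto(mu, n_samples)    # samples with tail exponent mu
    m1 = np.mean(z)                        # n=1 < mu: converges
    m2 = np.mean(z**2)                     # n=2 > mu: no convergence
    print(f"N={n_samples:>9}: <z>={m1:8.3f}   <z^2>={m2:12.1f}")
```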

  17. Effect of Attention Training on Attention Bias Variability and PTSD Symptoms: Randomized Controlled Trials in Israeli and U.S. Combat Veterans.

    Science.gov (United States)

    Badura-Brack, Amy S; Naim, Reut; Ryan, Tara J; Levy, Ofir; Abend, Rany; Khanna, Maya M; McDermott, Timothy J; Pine, Daniel S; Bar-Haim, Yair

    2015-12-01

    Attention allocation to threat is perturbed in patients with posttraumatic stress disorder (PTSD), with some studies indicating excess attention to threat and others indicating fluctuations between threat vigilance and threat avoidance. The authors tested the efficacy of two alternative computerized protocols, attention bias modification and attention control training, for rectifying threat attendance patterns and reducing PTSD symptoms. Two randomized controlled trials compared the efficacy of attention bias modification and attention control training for PTSD: one in Israel Defense Forces veterans and one in U.S. military veterans. Both utilized variants of the dot-probe task, with attention bias modification designed to shift attention away from threat and attention control training balancing attention allocation between threat and neutral stimuli. PTSD symptoms, attention bias, and attention bias variability were measured before and after treatment. Both studies indicated significant symptom improvement after treatment, favoring attention control training. Additionally, both studies found that attention control training, but not attention bias modification, significantly reduced attention bias variability. Finally, a combined analysis of the two samples suggested that reductions in attention bias variability partially mediated improvement in PTSD symptoms. Attention control training may address aberrant fluctuations in attention allocation in PTSD, thereby reducing PTSD symptoms. Further study of treatment efficacy and its underlying neurocognitive mechanisms is warranted.

  18. Predictive modeling of groundwater nitrate pollution using Random Forest and multisource variables related to intrinsic and specific vulnerability: a case study in an agricultural setting (Southern Spain).

    Science.gov (United States)

    Rodriguez-Galiano, Victor; Mendes, Maria Paula; Garcia-Soldado, Maria Jose; Chica-Olmo, Mario; Ribeiro, Luis

    2014-04-01

    Watershed management decisions need robust methods that allow an accurate predictive modeling of pollutant occurrences. Random Forest (RF) is a powerful machine learning, data-driven method that is rarely used in water resources studies, and thus has not been evaluated thoroughly in this field compared to more conventional pattern recognition techniques. Key advantages of RF include: its non-parametric nature; high predictive accuracy; and capability to determine variable importance. This last characteristic can be used to better understand the individual role and the combined effect of explanatory variables in both protecting and exposing groundwater from and to a pollutant. In this paper, the performance of RF regression for predictive modeling of nitrate pollution is explored, based on intrinsic and specific vulnerability assessment of the Vega de Granada aquifer. The applicability of this machine learning technique is demonstrated in an agriculture-dominated area where nitrate concentrations in groundwater can exceed the trigger value of 50 mg/L at many locations. A comprehensive GIS database of twenty-four parameters related to intrinsic hydrogeologic properties, driving forces, remotely sensed variables and physical-chemical variables measured in situ was used as input to build different predictive models of nitrate pollution. RF measures of importance were also used to define the most significant predictors of nitrate pollution in groundwater, allowing the establishment of the pollution sources (pressures). The potential of RF for generating a vulnerability map to nitrate pollution is assessed considering multiple criteria related to variations in the algorithm parameters and the accuracy of the maps. The performance of RF is also evaluated in comparison to the logistic regression (LR) method using different efficiency measures to ensure their generalization ability. Prediction results show the ability of RF to build accurate models.
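
    For readers unfamiliar with the method, the following is a minimal scikit-learn sketch of RF regression with out-of-bag validation and variable importances. The predictors and the response model are synthetic stand-ins with hypothetical names, not the study's GIS database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-ins for GIS predictors (names in comments are hypothetical).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0, 1, n),   # e.g. fertiliser load
    rng.uniform(0, 1, n),   # e.g. depth to water table
    rng.uniform(0, 1, n),   # e.g. remotely sensed vegetation index
])
nitrate = 60 * X[:, 0] - 25 * X[:, 1] + 5 * rng.standard_normal(n)   # toy response, mg/L

rf = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
rf.fit(X, nitrate)
print("OOB R^2:", round(rf.oob_score_, 3))
print("variable importances:", np.round(rf.feature_importances_, 3))
```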

  19. Dealing with randomness and vagueness in business and management sciences: the fuzzy-probabilistic approach as a tool for the study of statistical relationships between imprecise variables

    Directory of Open Access Journals (Sweden)

    Fabrizio Maturo

    2016-06-01

    In practical applications relating to business and management sciences, there are many variables that, by their own nature, are better described by a pair of ordered values (i.e. financial data). Summarizing such a measurement with a single value entails a loss of information; thus, in these situations, data are better described by interval values rather than by single values. Interval arithmetic studies and analyzes this type of imprecision; however, if the intervals have no sharp boundaries, fuzzy set theory is the most suitable instrument. Moreover, fuzzy regression models are able to overcome some typical limitations of classical regression because they do not need the same strong assumptions. In this paper, we present a review of the main methods introduced in the literature on this topic and introduce some recent developments regarding the concept of randomness in fuzzy regression.
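
    As a concrete, if simplified, illustration of regression on interval-valued data, the sketch below fits one least-squares line to interval midpoints and another to interval half-widths (a center-and-range style approach). It is a toy example on invented data, not one of the fuzzy regression methods reviewed in the paper.

```python
import numpy as np

# Center-and-range style fit on synthetic interval data (toy illustration).
rng = np.random.default_rng(0)
n = 40
x_mid = rng.uniform(0, 10, n)                       # predictor interval midpoints
x_rad = rng.uniform(0.1, 0.5, n)                    # predictor interval half-widths
y_mid = 2.0 * x_mid + 1.0 + rng.normal(0, 0.5, n)   # response midpoints
y_rad = 0.3 * x_rad + 0.05 + rng.normal(0, 0.02, n) # response half-widths

A_mid = np.column_stack([np.ones(n), x_mid])
A_rad = np.column_stack([np.ones(n), x_rad])
beta_mid, *_ = np.linalg.lstsq(A_mid, y_mid, rcond=None)
beta_rad, *_ = np.linalg.lstsq(A_rad, y_rad, rcond=None)

# Predicted interval for a new interval-valued observation [4.5, 5.5]
new_mid, new_rad = 5.0, 0.5
centre = beta_mid[0] + beta_mid[1] * new_mid
radius = abs(beta_rad[0] + beta_rad[1] * new_rad)
print(f"predicted interval: [{centre - radius:.2f}, {centre + radius:.2f}]")
```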

  20. Nonvolatile Static Random Access Memory Using Resistive Switching Devices: Variable-Transconductance Metal-Oxide-Semiconductor Field-Effect-Transistor Approach

    Science.gov (United States)

    Yamamoto, Shuu'ichirou; Shuto, Yusuke; Sugahara, Satoshi

    2010-04-01

    In this paper, we present a variable-transconductance (gm) metal-oxide-semiconductor field-effect-transistor (VGm-MOSFET) architecture using a nonpolar resistive switching device (RSD) for nonvolatile bistable circuit applications. The architecture can be achieved by connecting an RSD to the source terminal of an ordinary MOSFET. The current drive capability of the VGm-MOSFET can be modified by resistance states of the connected RSD, which is a very useful function for nonvolatile bistable circuits, such as nonvolatile static random access memory (NV-SRAM) and nonvolatile flip-flop (NV-FF). NV-SRAM can be easily configured by connecting two additional VGm-MOSFETs to the storage nodes of a standard SRAM cell. Using our developed SPICE macromodel for nonpolar RSDs, successful circuit operations of the proposed NV-SRAM cell were confirmed.

  1. Impact of Flavonols on Cardiometabolic Biomarkers: A Meta-Analysis of Randomized Controlled Human Trials to Explore the Role of Inter-Individual Variability

    Science.gov (United States)

    Menezes, Regina; Rodriguez-Mateos, Ana; Kaltsatou, Antonia; González-Sarrías, Antonio; Greyling, Arno; Giannaki, Christoforos; Andres-Lacueva, Cristina; Milenkovic, Dragan; Gibney, Eileen R.; Dumont, Julie; Schär, Manuel; Garcia-Aloy, Mar; Palma-Duran, Susana Alejandra; Ruskovska, Tatjana; Maksimova, Viktorija; Combet, Emilie; Pinto, Paula

    2017-01-01

    Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08), and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to healthy participants and those with normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of origin and health status may influence the effect of flavonol intake on blood lipid levels. PMID:28208791
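
    The pooling step behind a reported difference in means (DM) and its confidence interval can be sketched with a fixed-effect, inverse-variance model. The per-trial numbers below are invented for illustration and are not the trials analysed in this meta-analysis.

```python
import numpy as np

# Fixed-effect, inverse-variance pooling of per-trial differences in means
# (toy numbers, not the trials analysed in the paper).
dm = np.array([-0.12, -0.08, -0.15, -0.05])   # per-trial DM (e.g. LDL, mmol/L)
se = np.array([0.06, 0.05, 0.07, 0.04])       # per-trial standard errors

w = 1.0 / se**2                                # inverse-variance weights
dm_pooled = np.sum(w * dm) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
lo, hi = dm_pooled - 1.96 * se_pooled, dm_pooled + 1.96 * se_pooled
print(f"pooled DM = {dm_pooled:.3f} mmol/L, 95% CI ({lo:.3f}, {hi:.3f})")
```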

  2. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Nuria eRuffini

    2015-08-01

    Context: Heart Rate Variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and it is an indirect marker of the autonomic nervous system (ANS) function. Objective: To investigate the influence of osteopathic manipulative treatment (OMT) on ANS activity through changes of High Frequency, a heart rate variability index indicating the parasympathetic activity, in healthy subjects, compared with sham therapy and control group. Methods: Sixty-six healthy subjects, both male and female, were included in the present 3-armed randomized placebo controlled within subject cross-over single blinded study. Participants were asymptomatic adults, both smokers and non-smokers and not on medications. At enrollment subjects were randomized in 3 groups: A, B, C. Standardized structural evaluation followed by a patient need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. Standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, acting as a time-control. The trial was registered on clinicaltrials.gov, identifier: NCT01908920. Main Outcomes Measures: HRV was calculated from electrocardiography before, during and after the intervention, for a total amount of time of 25 minutes. Results: OMT engendered a statistically significant increase of parasympathetic activity, as shown by High Frequency rate (p<0.001), and decrease of sympathetic activity, as revealed by Low Frequency rate (p<0.01); results also showed a reduction of Low Frequency/High Frequency ratio (p<0.001) and Detrended fluctuation scaling exponent (p<0.05). Conclusions: Findings suggested that OMT can influence ANS activity, increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy and control group.

  3. Differentials in variables associated with past history of artificial abortion and current contraception by age: Results of a randomized national survey in Japan.

    Science.gov (United States)

    Kojo, Takao; Ae, Ryusuke; Tsuboi, Satoshi; Nakamura, Yosikazu; Kitamura, Kunio

    2017-03-01

    This study analyzes differentials in the variables associated with the experience of artificial abortion (abortion) and use of contraception by age among women in Japan. The 2010 National Lifestyle and Attitudes Towards Sexual Behavior Survey was distributed to 2693 men and women aged 16-49 selected from the Japanese population using a two-stage random sampling procedure. From the 1540 respondents, we selected 700 women who reported having had sexual intercourse at least once. We used logistic regression to analyze how social and demographic factors were associated with the experience of abortion and contraceptive use. The abortion rate according to the survey was 19.3%. Of the 700 women in the sample, 6.9% had experienced two or more abortions. Logistic regression revealed that, although significant variables depended on age, a high level of education and discussions about contraceptive use with partners were negatively associated with the experience of abortion. Self-injury, approval of abortion and first sexual intercourse between the age of 10 and 19 were positively associated with the experience of abortion. Marriage, smoking and first sexual intercourse between the age of 10 and 19 were negatively associated with contraceptive use. Higher education and discussion of contraception with partners were positively associated with contraceptive use. To prevent unwanted pregnancy and abortion, social support and sexual education should be age-appropriate. It is vital to educate young people of the importance of discussing contraceptive use with their partners. © 2016 Japan Society of Obstetrics and Gynecology.

  4. Non-inferiority of insulin glargine versus insulin detemir on blood glucose variability in type 1 diabetes patients: a multicenter, randomized, crossover study.

    Science.gov (United States)

    Renard, Eric; Dubois-Laforgue, Danièle; Guerci, Bruno

    2011-12-01

    This study compared the effects of insulin glargine and insulin detemir on blood glucose variability under clinical practice conditions in patients with type 1 diabetes (T1D) using glulisine as the mealtime insulin. This was a multicenter, crossover trial in 88 randomized T1D patients: 54 men and 34 women, 46.8±13.7 years old, with a duration of diabetes of 18±9 years and hemoglobin A1c level of 7.1±0.7%. The per-protocol population included 78 patients: 44 received glargine/detemir and 34 detemir/glargine in the first/second 16-week period, respectively. The primary end point was the coefficient of variation (CV) of fasting blood glucose (FBG). Secondary end points included variability of pre-dinner blood glucose, mean amplitude of glycemic excursions, mean of daily differences, and doses and number of daily insulin injections. The non-inferiority criterion was an insulin glargine/insulin detemir FBG CV ratio with a 95% confidence interval (CI) upper limit ≤1.25. The non-inferiority criterion was satisfied with a mean value of 1.016 (95% CI=0.970-1.065). Intention-to-treat analysis confirmed the non-inferiority with a 95% CI upper limit=1.062. No significant differences were found on secondary objectives, but there was a trend to higher doses and number of daily injections with insulin detemir. A total of eight (four glargine and four detemir) patients reported nine serious adverse events (including one severe episode of hypoglycemia). None of them was considered as related to basal insulins. Serious adverse events led to treatment discontinuation in two patients of the detemir group and none in the glargine group. In T1D patients under clinical practice conditions, insulin glargine was non-inferior to insulin detemir regarding blood glucose variability, as assessed by CV of FBG.

  5. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial.

    Science.gov (United States)

    Ruffini, Nuria; D'Alessandro, Giandomenico; Mariani, Nicolò; Pollastrelli, Alberto; Cardinali, Lucia; Cerritelli, Francesco

    2015-01-01

    Heart Rate Variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and it is an indirect marker of the autonomic nervous system (ANS) function. To investigate the influence of osteopathic manipulative treatment (OMT) on cardiac autonomic modulation in healthy subjects, compared with sham therapy and control group. Sixty-six healthy subjects, both male and female, were included in the present 3-armed randomized placebo controlled within subject cross-over single blinded study. Participants were asymptomatic adults (26.7 ± 8.4 y, 51% male, BMI 18.5 ± 4.8), both smokers and non-smokers and not on medications. At enrollment subjects were randomized in three groups: A, B, C. Standardized structural evaluation followed by a patient need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. Standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, acting as a time-control. The trial was registered on clinicaltrials.gov, identifier: NCT01908920. HRV was calculated from electrocardiography before, during and after the intervention, for a total amount of time of 25 min and considering frequency domain as well as linear and non-linear methods as outcome measures. OMT engendered a statistically significant increase of parasympathetic activity, as shown by High Frequency power (p < 0.001), and a decrease of sympathetic activity, as revealed by Low Frequency power (p < 0.01). Findings suggested that OMT can influence ANS activity, increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy and control group.

  6. The effects of weight loss and salt reduction on visit-to-visit blood pressure variability: results from a multicenter randomized controlled trial.

    Science.gov (United States)

    Diaz, Keith M; Muntner, Paul; Levitan, Emily B; Brown, Michael D; Babbitt, Dianne M; Shimbo, Daichi

    2014-04-01

    As evidence suggests visit-to-visit variability (VVV) of blood pressure (BP) is associated with cardiovascular events and mortality, there is increasing interest in identifying interventions that reduce VVV of BP. We investigated the effects of weight loss and sodium reduction, alone or in combination, on VVV of BP in participants enrolled in phase II of the Trials of Hypertension Prevention. BP readings were taken at 6-month intervals for 36 months in 1820 participants with high-normal DBP who were randomized to weight loss, sodium reduction, combination (weight loss and sodium reduction), or usual care groups. VVV of BP was defined as the SD of BP across six follow-up visits. VVV of SBP was not significantly different between participants randomized to the weight loss (7.2 ± 3.1  mmHg), sodium reduction (7.1 ± 3.0  mmHg), or combined (6.9 ± 2.9  mmHg) intervention groups vs. the usual care group (6.9 ± 2.9  mmHg). In a fully adjusted model, no difference (0.0 ± 0.2  mmHg) in VVV of SBP was present between individuals who successfully maintained their weight loss vs. individuals who did not lose weight during follow-up (P = 0.93). Also, those who maintained a reduced sodium intake throughout follow-up did not have lower VVV of SBP compared to those who did not reduce their sodium intake (0.1 ± 0.3  mmHg; P = 0.77). Results were similar for VVV of DBP. These findings suggest that weight loss and sodium reduction may not be effective interventions for lowering VVV of BP in individuals with high-normal DBP.

  7. Effects of Person-Centered Physical Therapy on Fatigue-Related Variables in Persons With Rheumatoid Arthritis: A Randomized Controlled Trial.

    Science.gov (United States)

    Feldthusen, Caroline; Dean, Elizabeth; Forsblad-d'Elia, Helena; Mannerkorpi, Kaisa

    2016-01-01

    To examine effects of person-centered physical therapy on fatigue and related variables in persons with rheumatoid arthritis (RA). Randomized controlled trial. Hospital outpatient rheumatology clinic. Persons with RA aged 20 to 65 years (N=70): intervention group (n=36) and reference group (n=34). The 12-week intervention, with 6-month follow-up, focused on partnership between participant and physical therapist and tailored health-enhancing physical activity and balancing life activities. The reference group continued with regular activities; both groups received usual health care. Primary outcome was general fatigue (visual analog scale). Secondary outcomes included multidimensional fatigue (Bristol Rheumatoid Arthritis Fatigue Multi-Dimensional Questionnaire) and fatigue-related variables (ie, disease, health, function). At posttest, general fatigue improved more in the intervention group than in the reference group (P=.042). Improvement in median general fatigue reached minimal clinically important differences between and within groups at posttest and follow-up. Improvement was also observed for anxiety (P=.0099), and trends toward improvements were observed for most multidimensional aspects of fatigue (P=.023-.048), leg strength/endurance (P=.024), and physical activity (P=.023). Compared with the reference group at follow-up, improvement in the intervention group was observed for leg strength/endurance (P=.001), and the trends toward improvements persisted for physical (P=.041) and living-related (P=.031) aspects of fatigue, physical activity (P=.019), anxiety (P=.015), self-rated health (P=.010), and self-efficacy (P=.046). Person-centered physical therapy focused on health-enhancing physical activity and balancing life activities showed significant benefits on fatigue in persons with RA. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. Does long term sport rock climbing training affect on echocardiography and heart rate variability in sedentary adults? A randomized, and controlled study.

    Directory of Open Access Journals (Sweden)

    Aras Dicle

    2016-03-01

    ABSTRACT: Regular physical activity can cause some long-term effects on the human body. The purpose of this research was to examine the effect of sport rock climbing (SRC) training at 70% HRmax level on echocardiography (ECHO) and heart rate variability (HRV) for one hour a day, three days a week, over an eight-week period. A total of 19 adults participated in this study voluntarily. The subjects were randomly divided into two groups, experimental (EG) and control (CG). While the EG performed climbing training using the top-rope method for 60 minutes a day, three days a week for 8 weeks and didn't join any other physical activity programs, the CG didn't train or take part in any physical activity during the course of the study. The same measurements were repeated at the end of eight weeks. According to the findings, no significant change was observed in any of the ECHO and HRV parameters. However, an improvement was seen in some HRV parameters [average heart rate (HRave), standard deviation of all NN intervals (SDNN), standard deviation of the averages of NN intervals in all five-minute segments of the entire recording (SDANN), percent of differences between adjacent NN intervals that are greater than 50 ms (PNN50), square root of the mean of the sum of the squares of differences between adjacent NN intervals (RMSSD)] in the EG. An exercise program based on SRC should last longer than eight weeks in order to produce statistically significant changes and an observable improvement in heart structure and functions. Keywords: Echocardiography, heart rate variability, sport rock climbing

  9. A randomized trial of high-dairy-protein, variable-carbohydrate diets and exercise on body composition in adults with obesity.

    Science.gov (United States)

    Parr, Evelyn B; Coffey, Vernon G; Cato, Louise E; Phillips, Stuart M; Burke, Louise M; Hawley, John A

    2016-05-01

    This study determined the effects of 16-week high-dairy-protein, variable-carbohydrate (CHO) diets and exercise training (EXT) on body composition in men and women with overweight/obesity. One hundred and eleven participants (age 47 ± 6 years, body mass 90.9 ± 11.7 kg, BMI 33 ± 4 kg/m(2), values mean ± SD) were randomly stratified to diets with either: high dairy protein, moderate CHO (40% CHO: 30% protein: 30% fat; ∼4 dairy servings); high dairy protein, high CHO (55%: 30%: 15%; ∼4 dairy servings); or control (55%: 15%: 30%; ∼1 dairy serving). Energy restriction (500 kcal/day) was achieved through diet (∼250 kcal/day) and EXT (∼250 kcal/day). Body composition was measured using dual-energy X-ray absorptiometry before, midway, and upon completion of the intervention. Eighty-nine (25 M/64 F) of 115 participants completed the 16-week intervention, losing 7.7 ± 3.2 kg of fat mass (P < 0.001), with no difference in the change in body composition (fat mass or lean mass) between groups. Compared to a healthy control diet, energy-restricted high-protein diets containing different proportions of fat and CHO confer no advantage to weight loss or change in body composition in the presence of an appropriate exercise stimulus. © 2016 The Obesity Society.

  10. New Closed-Form Results on Ordered Statistics of Partial Sums of Gamma Random Variables and its Application to Performance Evaluation in the Presence of Nakagami Fading

    KAUST Repository

    Nam, Sung Sik

    2017-06-19

    Complex wireless transmission systems require multi-dimensional joint statistical techniques for performance evaluation. Here, we first present exact closed-form results on the order statistics of arbitrary partial sums of Gamma random variables, together with closed-form results for the core functions specialized to independent and identically distributed Nakagami-m fading channels, based on a moment generating function-based unified analytical framework. Both of these exact closed-form results have not previously been published in the literature. In addition, a feasible application example in which our newly derived closed-form results can be applied is presented. In particular, we analyze the outage performance of finger replacement schemes over Nakagami fading channels as an application of our method. Note that these analysis results are directly applicable to several applications, such as millimeter-wave communication systems in which an antenna diversity scheme operates using a finger replacement-like combining scheme, and other fading scenarios. Note also that the statistical results can provide potential solutions for ordered statistics in any other research topics based on Gamma distributions or other advanced wireless communications research topics in the presence of Nakagami fading.
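
    A Monte Carlo counterpart of the ordered partial-sum statistic is a useful sanity check on such closed-form results. The sketch below draws i.i.d. Gamma branch SNRs (the Nakagami-m power model), keeps the Lc largest, and estimates an outage-style probability for their sum; all parameters are assumptions, and this is not the paper's moment generating function framework.

```python
import numpy as np

# Monte Carlo estimate of an ordered partial-sum outage probability for a
# GSC-style receiver over Nakagami-m fading (Gamma-distributed branch SNRs).
rng = np.random.default_rng(0)
L, Lc, m, mean_snr = 6, 3, 2.0, 1.0           # assumed parameters
n_trials = 500_000

snr = rng.gamma(shape=m, scale=mean_snr / m, size=(n_trials, L))   # Gamma(m, Omega/m)
top_sum = np.sort(snr, axis=1)[:, -Lc:].sum(axis=1)                # sum of Lc largest

gamma_th = 1.5                                 # outage threshold (assumed)
print("outage probability ~=", np.mean(top_sum < gamma_th))
```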

  11. Tai Chi Training may Reduce Dual Task Gait Variability, a Potential Mediator of Fall Risk, in Healthy Older Adults: Cross-Sectional and Randomized Trial Studies.

    Science.gov (United States)

    Wayne, Peter M; Hausdorff, Jeffrey M; Lough, Matthew; Gow, Brian J; Lipsitz, Lewis; Novak, Vera; Macklin, Eric A; Peng, Chung-Kang; Manor, Brad

    2015-01-01

    Tai Chi (TC) exercise improves balance and reduces falls in older, health-impaired adults. TC's impact on dual task (DT) gait parameters predictive of falls, especially in healthy active older adults, however, is unknown. To compare differences in usual and DT gait between long-term TC-expert practitioners and age-/gender-matched TC-naïve adults, and to determine the effects of short-term TC training on gait in healthy, non-sedentary older adults. A cross-sectional study compared gait in healthy TC-naïve and TC-expert (24.5 ± 12 years experience) older adults. TC-naïve adults then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Gait speed and stride time variability (Coefficient of Variation %) were assessed during 90 s trials of undisturbed and cognitive DT (serial subtractions) conditions. During DT, gait speed decreased (p < 0.003) and stride time variability increased (p < 0.004) in all groups. DT stride time variability offers a potentially sensitive metric for monitoring TC's impact on fall risk with healthy older adults.

  12. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial

    Science.gov (United States)

    Ruffini, Nuria; D'Alessandro, Giandomenico; Mariani, Nicolò; Pollastrelli, Alberto; Cardinali, Lucia; Cerritelli, Francesco

    2015-01-01

    Context: Heart Rate Variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and it is an indirect marker of the autonomic nervous system (ANS) function. Objective: To investigate the influence of osteopathic manipulative treatment (OMT) on cardiac autonomic modulation in healthy subjects, compared with sham therapy and control group. Methods: Sixty-six healthy subjects, both male and female, were included in the present 3-armed randomized placebo controlled within subject cross-over single blinded study. Participants were asymptomatic adults (26.7 ± 8.4 y, 51% male, BMI 18.5 ± 4.8), both smokers and non-smokers and not on medications. At enrollment subjects were randomized in three groups: A, B, C. Standardized structural evaluation followed by a patient need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. Standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, acting as a time-control. The trial was registered on clinicaltrials.gov, identifier: NCT01908920. Main Outcomes Measures: HRV was calculated from electrocardiography before, during and after the intervention, for a total amount of time of 25 min and considering frequency domain as well as linear and non-linear methods as outcome measures. Results: OMT engendered a statistically significant increase of parasympathetic activity, as shown by High Frequency power (p < 0.001), expressed in normalized and absolute units, and possibly a decrease of sympathetic activity, as revealed by Low Frequency power (p < 0.01); results also showed a reduction of Low Frequency/High Frequency ratio (p < 0.001) and Detrended fluctuation scaling exponent (p < 0.05). Conclusions: Findings suggested that OMT can influence ANS activity, increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy and control group.

  13. Chronic kidney disease in the type 2 diabetic patients: prevalence and associated variables in a random sample of 2642 patients of a Mediterranean area

    Directory of Open Access Journals (Sweden)

    Coll-de-Tuero Gabriel

    2012-08-01

    Background: Kidney disease is associated with an increased total mortality and cardiovascular morbimortality in the general population and in patients with Type 2 diabetes. The aim of this study is to determine the prevalence of kidney disease and different types of renal disease in patients with type 2 diabetes (T2DM). Methods: Cross-sectional study in a random sample of 2,642 T2DM patients cared for in primary care during 2007. Studied variables: demographic and clinical characteristics, pharmacological treatments and T2DM complications (diabetic foot, retinopathy, coronary heart disease and stroke). Variables of renal function were defined as follows: 1) Microalbuminuria: albumin excretion rate ≥ 30 mg/g or 3.5 mg/mmol; 2) Macroalbuminuria: albumin excretion rate ≥ 300 mg/g or 35 mg/mmol; 3) Kidney disease (KD): glomerular filtration rate <60 ml/min/1.73 m2 according to Modification of Diet in Renal Disease and/or the presence of albuminuria; 4) Renal impairment (RI): glomerular filtration rate <60 ml/min/1.73 m2; 5) Nonalbuminuric RI: glomerular filtration rate <60 ml/min/1.73 m2 without albuminuria; and 6) Diabetic nephropathy (DN): macroalbuminuria or microalbuminuria plus diabetic retinopathy. Results: The prevalence of different types of renal disease in patients was: 34.1% KD, 22.9% RI, 19.5% albuminuria and 16.4% diabetic nephropathy (DN). The prevalence of albuminuria without RI (13.5%) and nonalbuminuric RI (14.7%) was similar. After adjusting for age, BMI, cholesterol, blood pressure and macrovascular disease, RI was significantly associated with the female gender (OR 2.20; CI 95% 1.86–2.59), microvascular disease (OR 2.14; CI 95% 1.8–2.54) and insulin treatment (OR 1.82; CI 95% 1.39–2.38), and inversely associated with HbA1c (OR 0.85 for every 1% increase; CI 95% 0.80–0.91). Albuminuria without RI was inversely associated with the female gender (OR 0.27; CI 95% 0.21–0.35) and duration of diabetes (OR 0.94 per year; CI 95% 0.91–0.97), and directly associated with HbA1c (OR 1.19 for every 1% increase).

  14. Rye-Based Evening Meals Favorably Affected Glucose Regulation and Appetite Variables at the Following Breakfast; A Randomized Controlled Study in Healthy Subjects.

    Directory of Open Access Journals (Sweden)

    Jonna C Sandberg

    Whole grain has shown potential to prevent obesity, cardiovascular disease and type 2 diabetes. A possible mechanism could be related to colonic fermentation of specific indigestible carbohydrates, i.e. dietary fiber (DF). The aim of this study was to investigate effects on cardiometabolic risk factors and appetite regulation the next day when ingesting rye kernel bread rich in DF as an evening meal. Whole grain rye kernel test bread (RKB) or a white wheat flour based bread (reference product, WWB) was provided as late evening meals to healthy young adults in a randomized cross-over design. The test products RKB and WWB were provided in two priming settings: as a single evening meal or as three consecutive evening meals prior to the experimental days. Test variables were measured in the morning, 10.5-13.5 hours after ingestion of RKB or WWB. The postprandial phase was analyzed for measures of glucose metabolism, inflammatory markers, appetite regulating hormones and short chain fatty acids (SCFA) in blood, hydrogen excretion in breath and subjective appetite ratings. With the exception of serum CRP, no significant differences in test variables were observed depending on length of priming (P>0.05). The RKB evening meal increased plasma concentrations of PYY (0-120 min, P<0.001), GLP-1 (0-90 min, P<0.05) and fasting SCFA (acetate and butyrate, P<0.05; propionate, P=0.05), compared to WWB. Moreover, RKB decreased blood glucose (0-120 min, P=0.001), serum insulin response (0-120 min, P<0.05) and fasting FFA concentrations (P<0.05). Additionally, RKB improved subjective appetite ratings during the whole experimental period (P<0.05) and increased breath hydrogen excretion (P<0.001), indicating increased colonic fermentation activity. The results indicate that the RKB evening meal has an anti-diabetic potential and that the increased release of satiety hormones and improvements of appetite sensation could be beneficial in preventing obesity.

  15. Effects of additional repeated sprint training during preseason on performance, heart rate variability, and stress symptoms in futsal players: a randomized controlled trial.

    Science.gov (United States)

    Soares-Caldeira, Lúcio F; de Souza, Eberton A; de Freitas, Victor H; de Moraes, Solange M F; Leicht, Anthony S; Nakamura, Fábio Y

    2014-10-01

    The aim of this study was to investigate whether supplementing regular preseason futsal training with weekly sessions of repeated sprints (RS) training would have positive effects on repeated sprint ability (RSA) and field test performance. Thirteen players from a professional futsal team (22.6 ± 6.7 years, 72.8 ± 8.7 kg, 173.2 ± 6.2 cm) were divided randomly into 2 groups (AddT: n = 6 and normal training group: n = 7). Both groups performed an RSA test, Yo-Yo intermittent recovery test level 1 (YoYo IR1), squat (SJ) and countermovement jumps (CMJ), body composition, and heart rate variability (HRV) measures at rest before and after 4 weeks of preseason training. Athletes' weekly stress symptoms and well-being were recorded using the Daily Analysis of Life Demands for Athletes questionnaire and a subjective well-being rating scale, respectively. The daily training load (arbitrary units) was assessed using the session rating of perceived exertion method. After the preseason training, there were no significant changes for body composition, SJ, CMJ, and RSAbest. The YoYo IR1, RSAmean, RSAworst, and RSAdecrement were significantly improved for both groups (p ≤ 0.05). The HRV parameters improved significantly within both groups (p ≤ 0.05) except for high frequency (HF, absolute and normalized units, [n.u.]), low frequency (LF) (n.u.), and the LF/HF ratio. A moderate effect size for the AddT group was observed for resting heart rate and several HRV measures. Training load and psychometric responses were similar between both groups. Additional RS training resulted in slightly greater positive changes for vagal-related HRV with similar improvements in performance and training stress during the preseason training in futsal players.

  16. Pilot Randomized Study of a Gratitude Journaling Intervention on Heart Rate Variability and Inflammatory Biomarkers in Patients With Stage B Heart Failure.

    Science.gov (United States)

    Redwine, Laura S; Henry, Brook L; Pung, Meredith A; Wilson, Kathleen; Chinh, Kelly; Knight, Brian; Jain, Shamini; Rutledge, Thomas; Greenberg, Barry; Maisel, Alan; Mills, Paul J

    2016-01-01

    Stage B, asymptomatic heart failure (HF) presents a therapeutic window for attenuating disease progression and development of HF symptoms, and improving quality of life. Gratitude, the practice of appreciating positive life features, is highly related to quality of life, leading to development of promising clinical interventions. However, few gratitude studies have investigated objective measures of physical health; most relied on self-report measures. We conducted a pilot study in Stage B HF patients to examine whether gratitude journaling improved biomarkers related to HF prognosis. Patients (n = 70; mean [standard deviation] age = 66.2 [7.6] years) were randomized to an 8-week gratitude journaling intervention or treatment as usual. Baseline (T1) assessments included the six-item Gratitude Questionnaire, resting heart rate variability (HRV), and an inflammatory biomarker index. At T2 (midintervention), the six-item Gratitude Questionnaire was measured. At T3 (postintervention), T1 measures were repeated but also included a gratitude journaling task. The gratitude intervention was associated with improved trait gratitude scores (F = 6.0, p = .017, η = 0.10), reduced inflammatory biomarker index score over time (F = 9.7, p = .004, η = 0.21), and increased parasympathetic HRV responses during the gratitude journaling task (F = 4.2, p = .036, η = 0.15), compared with treatment as usual. However, there were no resting preintervention to postintervention group differences in HRV (p values > .10). Gratitude journaling may improve biomarkers related to HF morbidity, such as reduced inflammation; large-scale studies with active control conditions are needed to confirm these findings. ClinicalTrials.gov identifier: NCT01615094.

  17. List of Accredited Representatives

    Data.gov (United States)

    Department of Veterans Affairs — VA accreditation is for the sole purpose of providing representation services to claimants before VA and does not imply that a representative is qualified to provide...

  18. The prognostic value of long-term visit-to-visit blood pressure variability on stroke in real-world practice: a dynamic cohort study in a large representative sample of Chinese hypertensive population.

    Science.gov (United States)

    Yu, Jin-ming; Kong, Qun-yu; Schoenhagen, Paul; Shen, Tian; He, Yu-song; Wang, Ji-wei; Zhao, Yan-ping; Shi, Dan-ni; Zhong, Bao-liang

    2014-12-20

    The prognostic significance of long-term visit-to-visit blood pressure variability (BPV) has not yet been validated in "real world" hypertensive patients. The aim of the current study is to explore the prognostic value of BPV on stroke in hypertensive patients. This was a dynamic prospective cohort study based on electronic medical records in Shanghai, China. Hypertensive patients (N=122,636) without history of stroke at baseline were followed up from 2005 to 2011. The cohort comprised 4522 stroke patients and 118,114 non-stroke patients during a mean follow-up duration of 48 months. BPV was measured by standard deviation (SD) and the coefficient of variation (CV) of blood pressure. The visit-to-visit variability of both systolic blood pressure (SBP) and diastolic blood pressure (DBP) was independently associated with the occurrence of stroke [SD: the hazard ratios (95% confidence intervals) of SBP and DBP were 1.042 (1.021 to 1.064) and 1.052 (1.040 to 1.065); CV: the hazard ratios (95% confidence intervals) of SBP and DBP were 1.183 (1.010 to 1.356) and 1.151 (1.005 to 1.317), respectively]. The hazard ratio values increased along with an increase of the BPV levels of SBP and DBP. The increment effect remained significant after controlling for the blood pressure control status of subjects. Increased BPV of both SBP and DBP, independent of the average blood pressure, is a predictor of stroke among community hypertensive patients in real-world clinical practice. The risk of stroke increased along with increased BPV. Stabilizing BPV might be a therapeutic target in hypertension. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
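
    For reference, the two visit-to-visit variability metrics used in the study, the SD and CV of blood pressure across follow-up visits, reduce to a couple of lines of code; the readings below are invented purely for illustration.

```python
import numpy as np

# Visit-to-visit variability metrics used in the study, on invented readings.
sbp_visits = np.array([148.0, 152.0, 141.0, 160.0, 155.0, 146.0])   # SBP per visit, mmHg
sd = sbp_visits.std(ddof=1)                  # standard deviation across visits
cv = 100.0 * sd / sbp_visits.mean()          # coefficient of variation, %
print(f"SD = {sd:.1f} mmHg, CV = {cv:.1f}%")
```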

  19. [Advance directives. Representatives' opinions].

    Science.gov (United States)

    Busquets I Font, J M; Hernando Robles, P; Font I Canals, R; Diestre Ortin, G; Quintana, S

    The use and usefulness of Advance Directives have led to considerable controversy about their validity and effectiveness. These areas are unexplored in our country from the perspective of representatives. To determine the opinion of the representatives appointed in a registered Statement of Advance Directives (SAD) on the use of this document. Telephone survey of the representatives of 146 deceased persons who, since February 2012, had registered a SAD document. More than two-thirds (98) of respondents recalled that the SAD was consulted, with 86 (58.9%) saying that their opinion as representative was consulted, and 120 (82.1%) believing that the patient's will was respected. Of those interviewed, 102 (69.9%) believe that patients who had previously planned their care using a SAD had a good death, with 33 (22.4%) saying it could have been better, and 10 (6.9%) believing they suffered greatly. The SADs were mostly respected and consulted, and possibly this is related to the fact that most of the representatives declare that the death of those they represented was perceived as comfortable. It would be desirable to conduct further studies aimed at health personnel in order to know their perceptions regarding the use of Advance Directives in the process of dying. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  20. On Representative Social Capital

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.

    2004-01-01

    This paper analyzes data for a random sample drawn from the Dutch population who reveal their propensity to invest and reward investments in building up social capital by means of an economic experiment. We find substantial heterogeneity and asymmetries in the propensity to invest and in the

  1. Effect of a 16-week Bikram yoga program on heart rate variability and associated cardiovascular disease risk factors in stressed and sedentary adults: A randomized controlled trial.

    Science.gov (United States)

    Hewett, Zoe L; Pumpa, Kate L; Smith, Caroline A; Fahey, Paul P; Cheema, Birinder S

    2017-04-21

    Chronic activation of the stress-response can contribute to cardiovascular disease risk, particularly in sedentary individuals. This study investigated the effect of a Bikram yoga intervention on the high frequency power component of heart rate variability (HRV) and associated cardiovascular disease (CVD) risk factors (i.e. additional domains of HRV, hemodynamic, hematologic, anthropometric and body composition outcome measures) in stressed and sedentary adults. Eligible adults were randomized to an experimental group (n = 29) or a no treatment control group (n = 34). Experimental group participants were instructed to attend three to five supervised Bikram yoga classes per week for 16 weeks at local studios. Outcome measures were assessed at baseline (week 0) and completion (week 17). Sixty-three adults (37.2 ± 10.8 years, 79% women) were included in the intention-to-treat analysis. The experimental group attended 27 ± 18 classes. Analyses of covariance revealed no significant change in the high-frequency component of HRV (p = 0.912, partial η 2 = 0.000) or in any secondary outcome measure between groups over time. However, regression analyses revealed that higher attendance in the experimental group was associated with significant reductions in diastolic blood pressure (p = 0.039; partial η 2 = 0.154), body fat percentage (p = 0.001, partial η 2 = 0.379), fat mass (p = 0.003, partial η 2 = 0.294) and body mass index (p = 0.05, partial η 2 = 0.139). A 16-week Bikram yoga program did not increase the high frequency power component of HRV or any other CVD risk factors investigated. As revealed by post hoc analyses, low adherence likely contributed to the null effects. Future studies are required to address barriers to adherence to better elucidate the dose-response effects of Bikram yoga practice as a medium to lower stress-related CVD risk. Retrospectively registered with Australia New Zealand Clinical Trials Registry ACTRN

  2. Representing vision and blindness.

    Science.gov (United States)

    Ray, Patrick L; Cox, Alexander P; Jensen, Mark; Allen, Travis; Duncan, William; Diehl, Alexander D

    2016-01-01

    There have been relatively few attempts to represent vision or blindness ontologically. This is unsurprising as the related phenomena of sight and blindness are difficult to represent ontologically for a variety of reasons. Blindness has escaped ontological capture at least in part because: blindness or the employment of the term 'blindness' seems to vary from context to context, blindness can present in a myriad of types and degrees, and there is no precedent for representing complex phenomena such as blindness. We explore current attempts to represent vision or blindness, and show how these attempts fail at representing subtypes of blindness (viz., color blindness, flash blindness, and inattentional blindness). We examine the results found through a review of current attempts and identify where they have failed. By analyzing our test cases of different types of blindness along with the strengths and weaknesses of previous attempts, we have identified the general features of blindness and vision. We propose an ontological solution to represent vision and blindness, which capitalizes on resources afforded to one who utilizes the Basic Formal Ontology as an upper-level ontology. The solution we propose here involves specifying the trigger conditions of a disposition as well as the processes that realize that disposition. Once these are specified we can characterize vision as a function that is realized by certain (in this case) biological processes under a range of triggering conditions. When the range of conditions under which the processes can be realized are reduced beyond a certain threshold, we are able to say that blindness is present. We characterize vision as a function that is realized as a seeing process and blindness as a reduction in the conditions under which the sight function is realized. This solution is desirable because it leverages current features of a major upper-level ontology, accurately captures the phenomenon of blindness, and can be

  3. Representing distance, consuming distance

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    Title: Representing Distance, Consuming Distance Abstract: Distance is a condition for corporeal and virtual mobilities, for desired and actual travel, but yet it has received relatively little attention as a theoretical entity in its own right. Understandings of and assumptions about distance...... to mobility and its social context. Such an understanding can be approached through representations, as distance is being represented in various ways, most noticeably in maps and through the notions of space and Otherness. The question this talk subsequently asks is whether these representations of distance...

  4. Prevalence of Diabetes, Glycosuria and Related Variables Among a ...

    African Journals Online (AJOL)

    A representative community of Cape Coloured people was surveyed in order to assess the prevalence of diabetes and related variables. The randomly selected sample consisted of 1 534 persons over the age of 10 years, of whom 63% were persuaded to undergo screening by blood-sugar level and testing for glycosuria ...

  5. Non-stationary random vibration analysis of structures under multiple correlated normal random excitations

    Science.gov (United States)

    Li, Yanbin; Mulani, Sameer B.; Kapania, Rakesh K.; Fei, Qingguo; Wu, Shaoqing

    2017-07-01

    An algorithm that integrates Karhunen-Loeve expansion (KLE) and the finite element method (FEM) is proposed to perform non-stationary random vibration analysis of structures under excitations, represented by multiple random processes that are correlated in both time and spatial domains. In KLE, the auto-covariance functions of random excitations are discretized using orthogonal basis functions. The KLE for multiple correlated random excitations relies on expansions in terms of correlated sets of random variables reflecting the cross-covariance of the random processes. During the response calculations, the eigenfunctions of KLE used to represent excitations are applied as forcing functions to the structure. The proposed algorithm is applied to a 2DOF system, a 2D cantilever beam and a 3D aircraft wing under both stationary and non-stationary correlated random excitations. Two methods are adopted to obtain the structural responses: a) the modal method and b) the direct method. Both the methods provide the statistics of the dynamic response with sufficient accuracy. The structural responses under the same type of correlated random excitations are bounded by the response obtained by perfectly correlated and uncorrelated random excitations. The structural response increases with a decrease in the correlation length and with an increase in the correlation magnitude. The proposed methodology can be applied for the analysis of any complex structure under any type of random excitation.
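
    A minimal numerical sketch of the discrete Karhunen-Loeve step described above, assuming a single scalar excitation with an exponential auto-covariance on a uniform time grid (the kernel, its parameters and the truncation order are illustrative choices, not values from the paper):

      import numpy as np

      # Hypothetical auto-covariance C(t1, t2) = sigma^2 * exp(-|t1 - t2| / lc)
      sigma, lc = 1.0, 0.2                 # assumed standard deviation and correlation length
      t = np.linspace(0.0, 1.0, 200)       # uniform time grid
      dt = t[1] - t[0]
      C = sigma**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / lc)

      # Discrete KLE: eigen-decomposition of the covariance matrix weighted by the grid spacing
      lam, phi = np.linalg.eigh(C * dt)
      order = np.argsort(lam)[::-1]        # sort modes by decreasing energy
      lam, phi = lam[order], phi[:, order]

      # Truncate to M modes and synthesize one realization of the excitation,
      # which can then be applied as a forcing function in the FEM response calculation
      M = 20
      xi = np.random.standard_normal(M)    # uncorrelated standard normal random variables
      f = phi[:, :M] @ (np.sqrt(lam[:M]) * xi)

    For several excitations correlated in space, the same decomposition would be applied to the assembled block covariance matrix, which is what produces the correlated sets of random variables mentioned in the abstract.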

  6. Non-superiority of Kakkonto, a Japanese herbal medicine, to a representative multiple cold medicine with respect to anti-aggravation effects on the common cold: a randomized controlled trial.

    Science.gov (United States)

    Okabayashi, Satoe; Goto, Masashi; Kawamura, Takashi; Watanabe, Hidetsuna; Kimura, Akira; Uruma, Reiko; Takahashi, Yuko; Taneichi, Setsuko; Musashi, Manabu; Miyaki, Koichi

    2014-01-01

    Kakkonto, a Japanese herbal medicine, is frequently used to treat the common cold not only with a physician's prescription, but also in self-medication situations. This study aimed to examine whether Kakkonto prevents the aggravation of cold symptoms if taken at an early stage of illness compared with a well-selected Western-style multiple cold medicine. This study was a multicenter, active drug-controlled, randomized trial. Adults 18 to 65 years of age who felt a touch of cold symptoms and visited 15 outpatient healthcare facilities within 48 hours of symptoms onset were enrolled. The participants were randomly assigned to two groups: one treated with Kakkonto (Kakkonto Extract-A, 6 g/day) (n=209) and one treated with a Western-style multiple cold medicine (Pabron Gold-A, 3.6 g/day) (n=198) for at most four days. The primary outcome of this study was the aggravation of cold, nasal, throat or bronchial symptoms, scored as moderate or severe and lasting for at least two days within five days after entry into the study. Among the 410 enrollees, 340 (168 in the Kakkonto group and 172 in the Pabron group) were included in the analyses. The proportion of participants whose colds were aggravated was 22.6% in the Kakkonto group and 25.0% in the Pabron group (p=0.66). The overall severity of the cold symptoms was not significantly different between the groups. No harmful adverse events occurred in either group. Kakkonto did not significantly prevent the progression of cold symptoms, even when prescribed at an early stage of the disease.

  7. Unit-specific calibration of Actigraph accelerometers in a mechanical setup - is it worth the effort? The effect on random output variation caused by technical inter-instrument variability in the laboratory and in the field

    DEFF Research Database (Denmark)

    Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L

    2008-01-01

    BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefore...... during free living conditions. RESULTS: Calibration reduced inter-instrument variability considerably in the mechanical setup, both in the MTI instruments (raw SD between units = 195 counts*min-1 vs. calibrated SD between units = 65 counts*min-1) and in the CSA instruments (raw SD between units = 343 counts......*min-1 vs. calibrated SD between units = 67 counts*min-1). However, the effect of applying the derived calibration to children's and adolescents' free living physical activity data did not alter the coefficient of variation (CV) (children: CV raw = 30.2% vs. CV calibrated = 30.4%, adolescents: CV raw = 36

  8. Representing variability in a family of MRI scanners

    NARCIS (Netherlands)

    Jaring, M; Krikhaar, RL; Bosch, J

    Promoting software reuse is probably the most promising approach to the cost-effective development and evolution of quality software. An example of reuse is the successful adoption of software product families in industry. In a product family context, software architects anticipate product variation

  9. Effects of hyperthermic baths on depression, sleep and heart rate variability in patients with depressive disorder: a randomized clinical pilot trial.

    Science.gov (United States)

    Naumann, Johannes; Grebe, Julian; Kaifel, Sonja; Weinert, Tomas; Sadaghiani, Catharina; Huber, Roman

    2017-03-28

    Despite advances in the treatment of depression, one-third of depressed patients fail to respond to conventional antidepressant medication. There is a need for more effective treatments with fewer side effects. The primary aim of this study was to determine whether hyperthermic baths reduce depressive symptoms in adults with depressive disorder. Randomized, two-arm placebo-controlled, 8-week pilot trial. Medically stable outpatients with confirmed depressive disorder (ICD-10: F32/F33) who were moderately depressed as determined by the 17-item Hamilton Scale for Depression (HAM-D) score ≥18 were randomly assigned to 2 hyperthermic baths (40 °C) per week for 4 weeks or a sham intervention with green light and follow-up after 4 weeks. Main outcome measure was the change in HAM-D total score from baseline (T0) to the 2-week time point (T1). A total of 36 patients were randomized (hyperthermic baths, n = 17; sham condition, n = 19). The intention-to-treat analysis showed a significant (P = .037) difference in the change in HAM-D total score with 3.14 points after 4 interventions (T1) in favour of the hyperthermic bath group compared to the placebo group. This pilot study suggests that hyperthermic baths do have generalized efficacy in depressed patients. DRKS00004803 at drks-neu.uniklinik-freiburg.de, German Clinical Trials Register (registration date 2016-02-02), retrospectively registered.

  10. Clinical Implications of Glucose Variability: Chronic Complications of Diabetes

    Directory of Open Access Journals (Sweden)

    Hye Seung Jung

    2015-06-01

    Full Text Available Glucose variability has been identified as a potential risk factor for diabetic complications; oxidative stress is widely regarded as the mechanism by which glycemic variability induces diabetic complications. However, there remains no generally accepted gold standard for assessing glucose variability. Representative indices for measuring intraday variability include calculation of the standard deviation along with the mean amplitude of glycemic excursions (MAGE). MAGE is used to measure major intraday excursions and is easily measured using continuous glucose monitoring systems. Despite a lack of randomized controlled trials, recent clinical data suggest that long-term glycemic variability, as determined by variability in hemoglobin A1c, may contribute to the development of microvascular complications. Intraday glycemic variability is also suggested to accelerate coronary artery disease in high-risk patients.
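
    As a rough illustration of the two intraday indices named above, the following sketch computes the standard deviation and a simplified MAGE (mean of peak-to-nadir swings exceeding one SD) from a short, entirely hypothetical day of CGM readings; clinical MAGE algorithms differ in detail:

      import numpy as np

      # Hypothetical continuous glucose monitoring readings over one day (mg/dL)
      glucose = np.array([95, 110, 160, 210, 180, 130, 90, 85, 120, 175, 220, 160, 110, 100], float)

      sd = glucose.std(ddof=1)                     # simplest intraday variability index

      # Simplified MAGE: mean of peak-to-nadir swings larger than one SD
      turning = [glucose[0], glucose[1]]           # running list of turning points
      for g in glucose[2:]:
          if (turning[-1] - turning[-2]) * (g - turning[-1]) < 0:
              turning.append(g)                    # direction changed: new turning point
          else:
              turning[-1] = g                      # same direction: extend the current swing
      swings = np.abs(np.diff(turning))
      mage = swings[swings > sd].mean() if np.any(swings > sd) else 0.0
      print(f"SD = {sd:.1f} mg/dL, simplified MAGE = {mage:.1f} mg/dL")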

  11. Demographic variables, design characteristics, and effect sizes of randomized, placebo-controlled, monotherapy trials of major depressive disorder and bipolar depression.

    Science.gov (United States)

    Papakostas, George I; Martinson, Max A; Fava, Maurizio; Iovieno, Nadia

    2016-05-01

    The aim of this work is to compare the efficacy of pharmacologic agents for the treatment of major depressive disorder (MDD) and bipolar depression. MEDLINE/PubMed databases were searched for studies published in English between January 1980 and September 2014 by cross-referencing the search term placebo with each of the antidepressant agents identified and with bipolar. The search was supplemented by manual bibliography review. We selected double-blind, randomized, placebo-controlled trials of antidepressant monotherapies for the treatment of MDD and of oral drug monotherapies for the treatment of bipolar depression. 196 trials in MDD and 19 trials in bipolar depression were found eligible for inclusion in our analysis. Data were extracted by one of the authors and checked for accuracy by a second one. Data extracted included year of publication, number of patients randomized, probability of receiving placebo, duration of the trial, baseline symptom severity, dosing schedule, study completion rates, and clinical response rates. Response rates for drug versus placebo in trials of MDD and bipolar depression were 52.7% versus 37.5% and 54.7% versus 40.5%, respectively. The random-effects meta-analysis indicated that drug therapy was more effective than placebo in both MDD (risk ratio for response = 1.373) and bipolar depression (risk ratio = 1.257), with a difference in treatment effect between MDD and bipolar depression trials in favor of MDD (P = .008). Although a statistically significantly greater treatment effect size was noted in MDD relative to bipolar depression studies, the absolute magnitude of the difference was numerically small. Therefore, the present study suggests no clinically significant differences in the overall short-term efficacy of pharmacologic monotherapies for MDD and bipolar depression. © Copyright 2016 Physicians Postgraduate Press, Inc.
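
    For orientation, crude risk ratios computed directly from the overall response rates quoted above come out close to, though not identical with, the pooled random-effects estimates reported in the paper (1.373 and 1.257), which weight individual trials rather than aggregating raw rates:

      # Crude drug-vs-placebo risk ratios from the pooled response rates in the abstract
      rr_mdd = 0.527 / 0.375       # ~1.41 in major depressive disorder trials
      rr_bipolar = 0.547 / 0.405   # ~1.35 in bipolar depression trials
      print(round(rr_mdd, 2), round(rr_bipolar, 2))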

  12. Effect of a 16-week Bikram yoga program on heart rate variability and associated cardiovascular disease risk factors in stressed and sedentary adults: A randomized controlled trial

    OpenAIRE

    Hewett, Zoe L.; Pumpa, Kate L.; Smith, Caroline A.; Fahey, Paul P.; Birinder S. Cheema

    2017-01-01

    Background Chronic activation of the stress-response can contribute to cardiovascular disease risk, particularly in sedentary individuals. This study investigated the effect of a Bikram yoga intervention on the high frequency power component of heart rate variability (HRV) and associated cardiovascular disease (CVD) risk factors (i.e. additional domains of HRV, hemodynamic, hematologic, anthropometric and body composition outcome measures) in stressed and sedentary adults. Methods Eligible ad...

  13. Variations of high frequency parameter of heart rate variability following osteopathic manipulative treatment in healthy subjects compared to control group and sham therapy: randomized controlled trial

    OpenAIRE

    Nuria eRuffini; Giandomenico eD'alessandro; Nicolò eMariani; Alberto ePollastrelli; Lucia eCardinali; Francesco eCerritelli

    2015-01-01

    Context: Heart Rate Variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and it is an indirect marker of the autonomic nervous system (ANS) function. Objective: To investigate the influence of osteopathic manipulative treatment (OMT) on cardiac autonomic modulation in healthy subjects, compared with sham therapy and control group. Methods: Sixty-six healthy subjects, both male and female, were included in the present 3-ar...

  14. No effect of short-term amino acid supplementation on variables related to skeletal muscle damage in 100 km ultra-runners - a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rosemann Thomas

    2011-04-01

    Full Text Available Abstract Background The purpose of this study was to investigate the effect of short-term supplementation of amino acids before and during a 100 km ultra-marathon on variables of skeletal muscle damage and muscle soreness. We hypothesized that the supplementation of amino acids before and during an ultra-marathon would lead to a reduction in the variables of skeletal muscle damage, a decrease in muscle soreness and an improved performance. Methods Twenty-eight experienced male ultra-runners were divided into two groups, one with amino acid supplementation and the other as a control group. The amino acid group was supplemented a total of 52.5 g of an amino acid concentrate before and during the 100 km ultra-marathon. Pre- and post-race, creatine kinase, urea and myoglobin were determined. At the same time, the athletes were asked for subjective feelings of muscle soreness. Results Race time was not different between the groups when controlled for personal best time in a 100 km ultra-marathon. The increases in creatine kinase, urea and myoglobin were not different in both groups. Subjective feelings of skeletal muscle soreness were not different between the groups. Conclusions We concluded that short-term supplementation of amino acids before and during a 100 km ultra-marathon had no effect on variables of skeletal muscle damage and muscle soreness.

  15. Effects of 16 Weeks of Concurrent Training on Resting Heart Rate Variability and Cardiorespiratory Fitness in People Living With HIV/AIDS Using Antiretroviral Therapy: A Randomized Clinical Trial.

    Science.gov (United States)

    Pedro, Rafael E; Guariglia, Débora A; Okuno, Nilo M; Deminice, Rafael; Peres, Sidney B; Moraes, Solange M F

    2016-12-01

    Pedro, RE, Guariglia, DA, Okuno, NM, Deminice, R, Peres, SB, and Moraes, SMF. Effects of 16 weeks of concurrent training on resting heart rate variability and cardiorespiratory fitness in people living with HIV/AIDS using antiretroviral therapy: a randomized clinical trial. J Strength Cond Res 30(12): 3494-3502, 2016-The study evaluated the effects of concurrent training on resting heart rate variability (HRVrest) and cardiorespiratory fitness in people living with HIV/AIDS undergoing antiretroviral therapy (ART). Fifty-eight participants were randomized into 2 groups (control and training group); however, only 33 were analyzed. The variables studied were HRVrest indices, submaximal values of oxygen uptake (V[Combining Dot Above]O2sub) and heart rate (HR5min), peak speed (Vpeak), and peak oxygen uptake (V[Combining Dot Above]O2peak). The training group performed concurrent training (15-20 minutes of aerobic exercise plus 40 minutes of resistance exercise), 3 times per week, for 16 weeks. Posttraining V[Combining Dot Above]O2peak and Vpeak increased, and HR5min decreased. Resting heart rate variability indices did not present statistical differences posttraining; however, the magnitude-based inferences demonstrated a "possibly positive effect" for high frequency (HF) and low frequency (LF) plus high frequency (LF + HF) and a "likely positive effect" for R-Rmean posttraining. In conclusion, concurrent training was effective at improving cardiorespiratory fitness and endurance performance. Moreover, it led to probably a positive effect on HF and a likely positive effect on R-Rmean in people living with HIV/AIDS undergoing ART.

  16. Can atorvastatin with metformin change the natural history of prostate cancer as characterized by molecular, metabolomic, imaging and pathological variables? A randomized controlled trial protocol.

    Science.gov (United States)

    Roberts, Matthew J; Yaxley, John W; Coughlin, Geoffrey D; Gianduzzo, Troy R J; Esler, Rachel C; Dunglison, Nigel T; Chambers, Suzanne K; Medcraft, Robyn J; Chow, Clement W K; Schirra, Horst Joachim; Richards, Renee S; Kienzle, Nicholas; Lu, Macy; Brereton, Ian; Samaratunga, Hema; Perry-Keene, Joanna; Payton, Diane; Oyama, Chikara; Doi, Suhail A; Lavin, Martin F; Gardiner, Robert A

    2016-09-01

    Atorvastatin and metformin are known energy restricting mimetic agents that act synergistically to produce molecular and metabolic changes in advanced prostate cancer (PCa). This trial seeks to determine whether these drugs favourably alter selected parameters in men with clinically-localized, aggressive PCa. This prospective phase II randomized, controlled window trial is recruiting men with clinically significant PCa, confirmed by biopsy following multiparametric MRI and intending to undergo radical prostatectomy. Ethical approval was granted by the Royal Brisbane and Women's Hospital Human and The University of Queensland Medical Research Ethics Committees. Participants are being randomized into four groups: metformin with placebo; atorvastatin with placebo; metformin with atorvastatin; or placebo alone. Capsules are consumed for 8weeks, a duration selected as the most appropriate period in which histological and biochemical changes may be observed while allowing prompt treatment with curative intent of clinically significant PCa. At recruitment and prior to RP, participants provide blood, urine and seminal fluid. A subset of participants will undergo 7Tesla magnetic resonance spectroscopy to compare metabolites in-vivo with those in seminal fluid and biopsied tissue. The primary end point is biochemical evolution, defined using biomarkers (serum prostate specific antigen; PCA3 and citrate in seminal fluid and prostatic tissue). Standard pathological assessment will be undertaken. This study is designed to assess the potential synergistic action of metformin and atorvastatin on PCa tumour biology. The results may determine simple methods of tumour modulation to reduce disease progression. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Effect of a single session of transcranial direct-current stimulation on balance and spatiotemporal gait variables in children with cerebral palsy: A randomized sham-controlled study.

    Science.gov (United States)

    Grecco, Luanda A C; Duarte, Natália A C; Zanon, Nelci; Galli, Manuela; Fregni, Felipe; Oliveira, Claudia S

    2014-10-10

    Background: Transcranial direct-current stimulation (tDCS) has been widely studied with the aim of enhancing local synaptic efficacy and modulating the electrical activity of the cortex in patients with neurological disorders. Objective: The purpose of the present study was to determine the effect of a single session of tDCS regarding immediate changes in spatiotemporal gait and oscillations of the center of pressure (30 seconds) in children with cerebral palsy (CP). Method: A randomized controlled trial with a blinded evaluator was conducted involving 20 children with CP between six and ten years of age. Gait and balance were evaluated three times: Evaluation 1 (before the stimulation), Evaluation 2 (immediately after stimulation), and Evaluation 3 (20 minutes after the stimulation). The protocol consisted of a 20-minute session of tDCS applied to the primary motor cortex at an intensity of 1 mA. The participants were randomly allocated to two groups: experimental group - anodal stimulation of the primary motor cortex; and control group - placebo transcranial stimulation. Results: Significant reductions were found in the experimental group regarding oscillations during standing in the anteroposterior and mediolateral directions with eyes open and eyes closed in comparison with the control group (p<0.05). In the intra-group analysis, the experimental group exhibited significant improvements in gait velocity, cadence, and oscillation in the center of pressure during standing (p<0.05). No significant differences were found in the control group among the different evaluations. Conclusion: A single session of tDCS applied to the primary motor cortex promotes positive changes in static balance and gait velocity in children with cerebral palsy.

  18. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables)/standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) from the study. The more the precision required, the greater is the required sample size. Sampling Techniques: The probability sampling techniques applied for health related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
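
    A minimal worked example of the usual sample-size formula for estimating a single proportion, using assumed values for the factors listed above (confidence level, expected proportion and required precision):

      import math

      z = 1.96    # z-value for a 95% confidence level
      p = 0.30    # expected proportion of the outcome variable (assumed)
      d = 0.05    # required precision (margin of error)

      n = math.ceil(z**2 * p * (1 - p) / d**2)   # about 323 participants
      # For a small, finite study population N, the correction n / (1 + n / N) is often applied.
      print(n)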

  19. Effect of a single session of transcranial direct-current stimulation on balance and spatiotemporal gait variables in children with cerebral palsy: A randomized sham-controlled study

    Directory of Open Access Journals (Sweden)

    Luanda A. C. Grecco

    2014-10-01

    Full Text Available Background: Transcranial direct-current stimulation (tDCS) has been widely studied with the aim of enhancing local synaptic efficacy and modulating the electrical activity of the cortex in patients with neurological disorders. Objective: The purpose of the present study was to determine the effect of a single session of tDCS regarding immediate changes in spatiotemporal gait and oscillations of the center of pressure (30 seconds) in children with cerebral palsy (CP). Method: A randomized controlled trial with a blinded evaluator was conducted involving 20 children with CP between six and ten years of age. Gait and balance were evaluated three times: Evaluation 1 (before the stimulation), Evaluation 2 (immediately after stimulation), and Evaluation 3 (20 minutes after the stimulation). The protocol consisted of a 20-minute session of tDCS applied to the primary motor cortex at an intensity of 1 mA. The participants were randomly allocated to two groups: experimental group - anodal stimulation of the primary motor cortex; and control group - placebo transcranial stimulation. Results: Significant reductions were found in the experimental group regarding oscillations during standing in the anteroposterior and mediolateral directions with eyes open and eyes closed in comparison with the control group (p<0.05). In the intra-group analysis, the experimental group exhibited significant improvements in gait velocity, cadence, and oscillation in the center of pressure during standing (p<0.05). No significant differences were found in the control group among the different evaluations. Conclusion: A single session of tDCS applied to the primary motor cortex promotes positive changes in static balance and gait velocity in children with cerebral palsy.

  20. Analysis of genetic variability in soursop Annona muricata L populations from Central Java and East Java based on random amplified polymorphic DNA RAPD marker

    Directory of Open Access Journals (Sweden)

    Suratman Suratman

    2014-08-01

    Full Text Available The objective of this research was to determine the genetic variability of soursop (Annona muricata L.) populations from Central Java and East Java based on RAPD markers. Leaves of 40 individuals were collected from 4 soursop populations in Central Java and East Java: Sukoharjo and Karanganyar (Central Java), and Ngawi and Pacitan (East Java). Genomic DNA was extracted from the leaves by the CTAB extraction procedure with some modifications. A total of 15 RAPD primers were purchased from a commercial source and tested to find specific diagnostic markers for each individual by RAPD-PCR. The measurement of soursop population genetic distance was based on similarity coefficients, using the Group Average Clustering / Unweighted Pair Group Method with Arithmetic mean (UPGMA) of the NTSYS program version 2.02i. Results showed that each soursop population collected from the different localities seemed to have variability in RAPD profiles when using different primers. Four polymorphic RAPD primers were selected from the 15 RAPD primers, namely A18, A20, P10 and P11. A total of 58 bands were produced, varying from 9 to 20 bands per primer. The selected four RAPD primers produced 57 polymorphic bands, and polymorphism for each primer ranged from 95% to 100%. The dendrogram indicated that the four soursop populations tend to segregate into two separate clades: the sample collected from Sukoharjo formed a separate cluster, while the samples collected from Ngawi, Pacitan and Karanganyar grouped together in another cluster, diverging from the Sukoharjo population.
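
    A small sketch of the UPGMA (average-linkage) clustering step on a pairwise similarity matrix, analogous to the NTSYS analysis described above; the similarity values below are hypothetical, not the study's data:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, dendrogram
      from scipy.spatial.distance import squareform

      populations = ["Sukoharjo", "Karanganyar", "Ngawi", "Pacitan"]
      similarity = np.array([            # hypothetical similarity coefficients
          [1.00, 0.62, 0.60, 0.58],
          [0.62, 1.00, 0.81, 0.78],
          [0.60, 0.81, 1.00, 0.83],
          [0.58, 0.78, 0.83, 1.00],
      ])
      distance = 1.0 - similarity        # UPGMA operates on dissimilarities
      Z = linkage(squareform(distance, checks=False), method="average")  # average linkage = UPGMA
      dendrogram(Z, labels=populations)  # plotting requires matplotlib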

  1. The role of the immunological background of mice in the genetic variability of Schistosoma mansoni as detected by random amplification of polymorphic DNA.

    Science.gov (United States)

    Cossa-Moiane, I L; Mendes, T; Ferreira, T M; Mauricio, I; Calado, M; Afonso, A; Belo, S

    2015-11-01

    Schistosomiasis is a parasitic disease caused by flatworms of the genus Schistosoma. Among the Schistosoma species known to infect humans, S. mansoni is the most frequent cause of intestinal schistosomiasis in sub-Saharan Africa and South America: the World Health Organization estimates that about 200,000 deaths per year result from schistosomiasis in sub-Saharan Africa alone. The Schistosoma life cycle requires two different hosts: a snail as intermediate host and a mammal as definitive host. People become infected when they come into contact with water contaminated with free-living larvae (e.g. when swimming, fishing, washing). Although S. mansoni has mechanisms for escaping the host immune system, only a minority of infecting larvae develop into adults, suggesting that strain selection occurs at the host level. To test this hypothesis, we compared the Belo Horizonte (BH) strain of S. mansoni recovered from definitive hosts with different immunological backgrounds using random amplification of polymorphic DNA-polymerase chain reaction (RAPD-PCR). Schistosoma mansoni DNA profiles of worms obtained from wild-type (CD1 and C57BL/6J) and mutant (Jα18- / - and TGFβRIIdn) mice were analysed. Four primers produced polymorphic profiles, which can therefore potentially be used as reference biomarkers. All male worms were genetically distinct from females isolated from the same host, with female worms showing more specific fragments than males. Of the four host-derived schistosome populations, female and male adults recovered from TGFβRIIdn mice showed RAPD-PCR profiles that were most similar to each other. Altogether, these data indicate that host immunological backgrounds can influence the genetic diversity of parasite populations.

  2. Post-exercise recovery of biological, clinical and metabolic variables after different temperatures and durations of cold water immersion: a randomized clinical trial.

    Science.gov (United States)

    Vanderlei, Franciele M; de Albuquerque, Maíra C; de Almeida, Aline C; Machado, Aryane F; Netto, Jayme; Pastre, Carlos M

    2017-10-01

    Cold water immersion (CWI) is a commonly used recuperative strategy. However there is a lack of standardization of protocols considering the duration and temperature of application of the technique and the stress model. Therefore it is important to study the issue of dose response in a specific stress model. Thus the objective was to analyze and compare the effects of CWI during intense post-exercise recovery using different durations and temperatures of immersion. One hundred and five male individuals were divided into five groups: one control group (CG) and four recovery groups (G1: 5' at 9±1 °C; G2: 5' at 14±1 °C; G3: 15' at 9±1 °C; G4: 15' at 14±1 °C). The volunteers were submitted to an exhaustion protocol that consisted of a jump program and the Wingate Test. Immediately after the exhaustion protocol, the volunteers were directed to a tank with water and ice, where they were immersed for the recovery procedure, during which blood samples were collected for later lactate and creatine kinase (CK) analysis. Variables were collected prior to the exercise and 24, 48, 72, and 96 hours after its completion. For the CK concentration, 15 minutes at 14 °C was the best intervention option, considering the values at 72 hours after exercise, while for the moment of peak lactate an advantage was observed for immersion for 5 minutes at 14 °C. Regarding the perception of recovery, CWI for 5 minutes at 14 °C performed better long-term, from the time of the intervention to 96 hours post-exercise. For pain, no form of immersion responded better than the CG at the immediately post-intervention moment. There were no differences in behavior between the CWI intervention groups for the outcomes studied.

  3. Heart Rate Variability and Hemodynamic Change in the Superior Mesenteric Artery by Acupuncture Stimulation of Lower Limb Points: A Randomized Crossover Trial

    Directory of Open Access Journals (Sweden)

    Soichiro Kaneko

    2013-01-01

    Full Text Available Objective. We investigated the relationship between superior mesenteric artery blood flow volume (SMA BFV) and autonomic nerve activity in acupuncture stimulation of lower limb points through heart rate variability (HRV) evaluations. Methods. Twenty-six healthy volunteers underwent crossover applications of bilateral manual acupuncture stimulation at ST36 or LR3 or no stimulation. Heart rate, blood pressure, cardiac index, systemic vascular resistance index, SMA BFV, and HRV at rest and 30 min after the intervention were analyzed. Results. SMA BFV showed a significant increase after ST36 stimulation (0% to 14.1% ± 23.4%, P=0.007); very low frequency (VLF), high frequency (HF), low frequency (LF), and LF/HF were significantly greater than those at rest (0% to 479.4% ± 1185.6%, P=0.045; 0% to 78.9% ± 197.6%, P=0.048; 0% to 123.9% ± 217.1%, P=0.006; 0% to 71.5% ± 171.1%, P=0.039). Changes in HF and LF also differed significantly from those resulting from LR3 stimulation (HF: 78.9% ± 197.6% versus −18.2% ± 35.8%, P=0.015; LF: 123.9% ± 217.1% versus 10.6% ± 70.6%, P=0.013). Conclusion. Increased vagus nerve activity after ST36 stimulation resulted in increased SMA BFV. This partly explains the mechanism of acupuncture-induced BFV changes.

  4. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  5. THE EFFECT OF HORMONE THERAPY ON MEAN BLOOD PRESSURE AND VISIT-TO-VISIT BLOOD PRESSURE VARIABILITY IN POSTMENOPAUSAL WOMEN: RESULTS FROM THE WOMEN’S HEALTH INITIATIVE RANDOMIZED CONTROLLED TRIALS

    Science.gov (United States)

    Shimbo, Daichi; Wang, Lu; Lamonte, Michael J.; Allison, Matthew; Wellenius, Gregory A.; Bavry, Anthony A.; Martin, Lisa W.; Aragaki, Aaron; Newman, Jonathan D.; Swica, Yael; Rossouw, Jacques E.; Manson, JoAnn E.; Wassertheil-Smoller, Sylvia

    2014-01-01

    Objectives Mean and visit-to-visit variability (VVV) of blood pressure are associated with an increased cardiovascular disease risk. We examined the effect of hormone therapy on mean and VVV of blood pressure in postmenopausal women from the Women’s Health Initiative (WHI) randomized controlled trials. Methods Blood pressure was measured at baseline and annually in the two WHI hormone therapy trials in which 10,739 and 16,608 postmenopausal women were randomized to conjugated equine estrogens (CEE, 0.625 mg/day) or placebo, and CEE plus medroxyprogesterone acetate (MPA, 2.5 mg/day) or placebo, respectively. Results At the first annual visit (Year 1), mean systolic blood pressure was 1.04 mmHg (95% CI 0.58, 1.50) and 1.35 mmHg (95% CI 0.99, 1.72) higher in the CEE and CEE+MPA arms respectively compared to corresponding placebos. These effects remained stable after Year 1. CEE also increased VVV of systolic blood pressure (ratio of VVV in CEE vs. placebo, 1.03); blood pressure increased at Year 1, and the differences in the CEE and CEE+MPA arms vs. placebos also continued to increase after Year 1. Further, both CEE and CEE+MPA significantly increased VVV of systolic blood pressure (ratio of VVV in CEE vs. placebo, 1.04). PMID:24991872

  6. Competitive Facility Location with Fuzzy Random Demands

    Science.gov (United States)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2010-10-01

    This paper proposes a new location problem of competitive facilities, e.g. shops, with uncertainty and vagueness including demands for the facilities in a plane. By representing the demands for facilities as fuzzy random variables, the location problem can be formulated as a fuzzy random programming problem. For solving the fuzzy random programming problem, first the α-level sets for fuzzy numbers are used for transforming it to a stochastic programming problem, and secondly, by using their expectations and variances, it can be reformulated to a deterministic programming problem. After showing that one of their optimal solutions can be found by solving 0-1 programming problems, their solution method is proposed by improving the tabu search algorithm with strategic oscillation. The efficiency of the proposed method is shown by applying it to numerical examples of the facility location problems.

  7. A double-blind, placebo-controlled, randomized trial of the effects of dark chocolate and cocoa on variables associated with neuropsychological functioning and cardiovascular health: clinical findings from a sample of healthy, cognitively intact older adults.

    Science.gov (United States)

    Crews, W David; Harrison, David W; Wright, James W

    2008-04-01

    In recent years, there has been increased interest in the potential health-related benefits of antioxidant- and phytochemical-rich dark chocolate and cocoa. The objective of the study was to examine the short-term (6 wk) effects of dark chocolate and cocoa on variables associated with neuropsychological functioning and cardiovascular health in healthy older adults. A double-blind, placebo-controlled, fixed-dose, parallel-group clinical trial was used. Participants (n = 101) were randomly assigned to receive a 37-g dark chocolate bar and 8 ounces (237 mL) of an artificially sweetened cocoa beverage or similar placebo products each day for 6 wk. No significant group (dark chocolate and cocoa or placebo)-by-trial (baseline, midpoint, and end-of-treatment assessments) interactions were found for the neuropsychological, hematological, or blood pressure variables examined. In contrast, the midpoint and end-of-treatment mean pulse rate assessments in the dark chocolate and cocoa group were significantly higher than those at baseline and significantly higher than the midpoint and end-of-treatment rates in the control group. Results of a follow-up questionnaire item on the treatment products that participants believed they had consumed during the trial showed that more than half of the participants in both groups correctly identified the products that they had ingested during the experiment. This investigation failed to support the predicted beneficial effects of short-term dark chocolate and cocoa consumption on any of the neuropsychological or cardiovascular health-related variables included in this research. Consumption of dark chocolate and cocoa was, however, associated with significantly higher pulse rates at 3- and 6-wk treatment assessments.

  8. Marketing norm perception among medical representatives in Indian pharmaceutical industry.

    Science.gov (United States)

    Nagashekhara, Molugulu; Agil, Syed Omar Syed; Ramasamy, Ravindran

    2012-03-01

    Study of marketing norm perception among medical representatives is an under-explored component that deserves further attention in the pharmaceutical industry. The purpose of this study is to find out the perception of marketing norms among medical representatives. The research design is a quantitative, cross-sectional study with medical representatives as the unit of analysis. Data were collected from medical representatives (n=300) through simple random and cluster sampling, using a structured questionnaire. Results indicate that there is no difference in the perception of marketing norms between male and female medical representatives. However, there is a difference in opinion between medical representatives of domestic and multinational companies. The educational background of medical representatives also shows a difference in opinion. Degree holders and multinational company medical representatives have a higher perception of marketing norms compared with their counterparts. The researchers strongly believe that mandatory training on marketing norms is beneficial in the decision-making process during dilemmas in the sales field.

  9. Socioeconomic Representativeness and the Draft

    Science.gov (United States)

    1980-06-01

    responses. There is virtually no way to determine who was in the Armed Forces during the final year of the survey. The cohort for this study was...variable. The figures correspond somewhat with the education variable. They indicate that those who had a lower IQ value also tended to have a lesser

  10. Homotopy Measures for Representative Trajectories

    NARCIS (Netherlands)

    Chambers, Erin W.; Kostitsyna, Irina; Löffler, Maarten; Staals, Frank

    2016-01-01

    An important task in trajectory analysis is defining a meaningful representative for a cluster of similar trajectories. Formally defining and computing such a representative r is a challenging problem. We propose and discuss two new definitions, both of which use only the geometry of the input

  11. Spatio-temporal representativeness of euphotic depth in situ sampling in transitional coastal waters

    Science.gov (United States)

    Luhtala, Hanna; Tolvanen, Harri

    2016-06-01

    In dynamic coastal waters, the representativeness of spot sampling is limited to the measurement time and place due to local heterogeneity and irregular water property fluctuations. We assessed the representativeness of in situ sampling by analysing spot-sampled depth profiles of photosynthetically active radiation (PAR) in dynamic coastal archipelago waters in the south-western Finnish coast of the Baltic Sea. First, we assessed the role of spatio-temporality within the underwater light dynamics. As a part of this approach, an anomaly detection procedure was tested on a dataset including a large archipelago area and extensive temporal coverage throughout the ice-free season. The results suggest that euphotic depth variability should be treated as a spatio-temporal process rather than considering spatial and temporal dimensions separately. Second, we assessed the representativeness of spot sampling through statistical analysis of comparative data from spatially denser sampling on three test sites on two optically different occasions. The datasets revealed variability in different dimensions and scales. The suitability of a dataset to reveal wanted phenomena can usually be improved by careful planning and by clearly defining the data sampling objectives beforehand. Nonetheless, conducting a sufficient in situ sampling in dynamic coastal area is still challenging: detecting the general patterns at all the relevant dimensions is complicated by the randomness effect, which reduces the reliability of spot samples on a more detailed scale. Our results indicate that good representativeness of a euphotic depth sampling location is not a stable feature in a highly dynamic environment.

  12. Geometric max stability of Pareto random variables

    OpenAIRE

    Juozulynaitė, Gintarė

    2010-01-01

    In this work I examined the geometric max stability of univariate and bivariate Pareto random variables. I proved that the univariate Pareto distribution is geometrically max stable when alpha = 1, but it is not geometrically max stable when alpha is not equal to 1. Using the criterion of geometric max stability for bivariate Pareto random variables, I proved that the bivariate Pareto distribution function is not geometrically max stable when the components of the vector are independent (when alpha = 1, beta = 1 and alpha not equal to 1...
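
    A short worked computation (a standard argument, stated here for the unit-scale Pareto law F(x) = 1 - 1/x on x >= 1 and a geometric sample size N with success probability p; these are assumptions of this sketch rather than details quoted from the thesis) showing why the case alpha = 1 is geometrically max stable:

      P\Big(\max_{i \le N} X_i \le x\Big)
        = \sum_{k \ge 1} p(1-p)^{k-1} F(x)^k
        = \frac{p F(x)}{1 - (1-p) F(x)}
        = 1 - \frac{1}{p(x-1) + 1}
        = F\big(p(x-1) + 1\big),

    so the geometric maximum is again Pareto up to an affine change of variable, which is exactly geometric max stability; this is consistent with the thesis' finding that the property fails when alpha is not equal to 1.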

  13. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani

  14. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or...... presented cases of variography either solved the initial problems or served to understand the reasons and causes behind the specific process structures revealed in the variograms. Process Analytical Technologies (PAT) are not complete without process TOS....

  15. Representative mass reduction in sampling

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Harry Kim; Dahl, Casper Kierulf

    2004-01-01

    dividers, the Boerner Divider, the "spoon method", alternate/fractional shoveling and grab sampling. Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant relative differences). Grab sampling, the overwhelmingly...... always be representative in the full Theory of Sampling (TOS) sense. This survey also allows empirical verification of the merits of the famous "Gy's formula" for order-of-magnitude estimation of the Fundamental Sampling Error (FSE).

  16. Drop Spreading with Random Viscosity

    Science.gov (United States)

    Xu, Feng; Jensen, Oliver

    2016-11-01

    Airway mucus acts as a barrier to protect the lung. However, as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show for example how variability in the drop location is a non-monotonic function of the solute correlation length. Supported by the Engineering and Physical Sciences Research Council.
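
    A minimal sketch of generating the kind of one-dimensional Gaussian random field used here as the initial solute (and hence viscosity) distribution, via a Cholesky factor of an assumed squared-exponential covariance; the kernel, correlation length and the linear concentration-viscosity law below are illustrative assumptions:

      import numpy as np

      x = np.linspace(0.0, 10.0, 256)       # spatial grid for the film (arbitrary units)
      ell, sigma = 1.0, 0.1                 # assumed correlation length and field std. dev.

      # Squared-exponential covariance with a small jitter for numerical stability
      C = sigma**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)
      L = np.linalg.cholesky(C + 1e-8 * np.eye(x.size))

      c0 = 1.0 + L @ np.random.standard_normal(x.size)   # one realization of the solute field
      mu = 1.0 + 0.5 * (c0 - 1.0)                         # hypothetical linear viscosity law

    Repeating this draw inside a Monte Carlo loop and evolving the lubrication equations for each sample gives the variability in drop position and width that the low-order weak-disorder model is then compared against.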

  17. Competitive Facility Location with Random Demands

    Science.gov (United States)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2009-10-01

    This paper proposes a new location problem of competitive facilities, e.g. shops and stores, with uncertain demands in the plane. By representing the demands for facilities as random variables, the location problem is formulated to a stochastic programming problem, and for finding its solution, three deterministic programming problems: expectation maximizing problem, probability maximizing problem, and satisfying level maximizing problem are considered. After showing that one of their optimal solutions can be found by solving 0-1 programming problems, their solution method is proposed by improving the tabu search algorithm with strategic vibration. Efficiency of the solution method is shown by applying to numerical examples of the facility location problems.

  18. Marc Treib: Representing Landscape Architecture

    DEFF Research Database (Denmark)

    Braae, Ellen Marie

    2008-01-01

    The editor of Representing Landscape Architecture, Marc Treib, argues that there is good reason to evaluate the standard practices of representation that landscape architects have been using for so long. In the rush to the promised land of computer design these practices are now in danger of being...... left by the wayside. The 14 often both fitting and well crafted contributions of this publication offer an approach to how landscape architecture has been and is currently represented; in the design study, in presentation, in criticism, and in the creation of landscape architecture....

  19. A Model for Positively Correlated Count Variables

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    2010-01-01

    An α-permanental random field is briefly speaking a model for a collection of non-negative integer valued random variables with positive associations. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their poten......

  20. Modeling Shared Variables in VHDL

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1994-01-01

    A set of concurrent processes communicating through shared variables is an often used model for hardware systems. This paper presents three modeling techniques for representing such shared variables in VHDL, depending on the acceptable constraints on accesses to the variables. Also a set...... of guidelines for handling atomic updates of multiple shared variables is given. 1 Introduction It is often desirable to partition a computational system into discrete functional units which cooperates to....

  1. Factors Related to Instructional Leadership Perception and Effect of Instructional Leadership on Organizational Variables: A Meta-Analysis

    Science.gov (United States)

    Sisman, Mehmet

    2016-01-01

    In this meta-analysis, the effects of teacher characteristics on instructional leadership perceptions and some organizational variables are tested. Findings from a total of 67 independent studies are gathered in the meta-analysis, which represents a population of 36,756. According to the findings of this meta-analysis performed by using random effects…

  2. Rationales of a Shift towards Knowledge Economy in Jordan from the Viewpoint of Educational Experts and Relationship with Some Variables

    Science.gov (United States)

    Al Zboon, Mohammad Saleem; Al Ahmad, Suliman Diab Ali; Al Zboon, Saleem Odeh

    2009-01-01

    The purpose of the present study was to identify the rationales underlying a shift towards a knowledge economy in education as perceived by educational experts in Jordan, and their relationship with some variables. The stratified random sample (n = 90) consisted of educational experts representing faculty members in the Jordanian universities and top leaders…

  3. Representing videos in tangible products

    Science.gov (United States)

    Fageth, Reiner; Weiting, Ralf

    2014-03-01

    Videos can be taken with nearly every camera: digital point-and-shoot cameras, DSLRs, smartphones and, increasingly, so-called action cameras mounted on sports devices. The inclusion of videos, by generating QR codes and relevant pictures out of the video stream via a software implementation, was the content of last year's paper. This year we present first data about what content is displayed and how users represent their videos in printed products, e.g. CEWE PHOTOBOOKS and greeting cards. We report the share of the different video formats used, the number of images extracted from the video in order to represent it, the positions in the book, and the different design strategies compared to regular books.

  4. Representative shuttle evaporative heat sink

    Science.gov (United States)

    Hixon, C. W.

    1978-01-01

    The design, fabrication, and testing of a representative shuttle evaporative heat sink (RSEHS) system which vaporizes an expendable fluid to provide cooling for the shuttle heat transport fluid loop is reported. The optimized RSEHS minimum weight design meets or exceeds the shuttle flash evaporator system requirements. A cold trap which cryo-pumps flash evaporator exhaust water from the CSD vacuum chamber test facility to prevent water contamination of the chamber pumping equipment is also described.

  5. Representativeness Uncertainty in Chemical Data Assimilation Highlight Mixing Barriers

    Science.gov (United States)

    Lary, David John

    2003-01-01

    When performing chemical data assimilation the observational, representativeness, and theoretical uncertainties have very different characteristics. In this study we have accurately characterized the representativeness uncertainty by studying the probability distribution function (PDF) of the observations. The average deviation has been used as a measure of the width of the PDF and of the variability (representativeness uncertainty) for the grid cell. It turns out that for long-lived tracers such as N2O and CH4 the representativeness uncertainty is markedly different from the observational uncertainty and clearly delineates mixing barriers such as the polar vortex edge, the tropical pipe and the tropopause.
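
    As a concrete reading of the width measure mentioned above, the representativeness uncertainty of a grid cell can be estimated as the average (absolute) deviation of the observations falling in that cell; the values below are purely illustrative:

      import numpy as np

      # Hypothetical N2O observations (ppbv) falling within one model grid cell
      obs = np.array([318.2, 317.9, 319.1, 316.4, 318.8])
      avg_dev = np.mean(np.abs(obs - obs.mean()))   # average deviation = width of the cell's PDF
      print(f"representativeness uncertainty ~ {avg_dev:.2f} ppbv")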

  6. Ashtekar variables

    Science.gov (United States)

    Ashtekar, Abhay

    2015-05-01

    In the spirit of Scholarpedia, this invited article is addressed to students and younger researchers. It provides the motivation and background material, a summary of the main physical ideas, mathematical structures and results, and an outline of applications of the connection variables for general relativity. These variables underlie both the canonical/Hamiltonian and the spinfoam/path integral approaches in loop quantum gravity.

  7. Variability Bugs:

    DEFF Research Database (Denmark)

    Melo, Jean

    2017-01-01

    be exploited. Variability bugs are not confined to any particular type of bug, error-prone feature, or location. In addition to introducing an exponential number of program variants, variability increases the complexity of bugs due to unintended feature interactions, hidden features, combinations of layers...... and bug finding, but not terribly so. This is positive and consistent with the existence of highly-configurable software systems with hundreds, even thousands, of features, testifying that developers in the trenches are able to deal with variability.......Many modern software systems are highly configurable. They embrace variability to increase adaptability and to lower cost. To implement configurable software, developers often use the C preprocessor (CPP), which is a well-known technique, mainly in industry, to deal with variability in code...

  8. Representing culture in interstellar messages

    Science.gov (United States)

    Vakoch, Douglas A.

    2008-09-01

    As scholars involved with the Search for Extraterrestrial Intelligence (SETI) have contemplated how we might portray humankind in any messages sent to civilizations beyond Earth, one of the challenges they face is adequately representing the diversity of human cultures. For example, in a 2003 workshop in Paris sponsored by the SETI Institute, the International Academy of Astronautics (IAA) SETI Permanent Study Group, the International Society for the Arts, Sciences and Technology (ISAST), and the John Templeton Foundation, a varied group of artists, scientists, and scholars from the humanities considered how to encode notions of altruism in interstellar messages. Though the group represented 10 countries, most were from Europe and North America, leading to the group's recommendation that subsequent discussions on the topic should include more globally representative perspectives. As a result, the IAA Study Group on Interstellar Message Construction and the SETI Institute sponsored a follow-up workshop in Santa Fe, New Mexico, USA in February 2005. The Santa Fe workshop brought together scholars from a range of disciplines including anthropology, archaeology, chemistry, communication science, philosophy, and psychology. Participants included scholars familiar with interstellar message design as well as specialists in cross-cultural research who had participated in the Symposium on Altruism in Cross-cultural Perspective, held just prior to the workshop during the annual conference of the Society for Cross-cultural Research. The workshop included discussion of how cultural understandings of altruism can complement and critique the more biologically based models of altruism proposed for interstellar messages at the 2003 Paris workshop. This paper, written by the chair of both the Paris and Santa Fe workshops, will explore the challenges of communicating concepts of altruism that draw on both biological and cultural models.

  9. Conspicuous Waste and Representativeness Heuristic

    Directory of Open Access Journals (Sweden)

    Tatiana M. Shishkina

    2017-12-01

    Full Text Available The article deals with the similarities between conspicuous waste and the representativeness heuristic. Conspicuous waste is analyzed according to Veblen's classic interpretation as a strategy to increase social status through conspicuous consumption and conspicuous leisure. In “The Theory of the Leisure Class” Veblen introduced two different types of utility – conspicuous and functional. The article focuses on the possible benefits of analyzing conspicuous utility not only in terms of institutional economic theory, but also in terms of behavioral economics. To this end, the representativeness heuristic is considered, on the one hand, as a way to optimize the decision-making process, which allows it to be examined in comparison with Simon's procedural rationality. On the other hand, it is also analyzed as a cognitive bias within the Kahneman and Tversky approach. The article provides an analysis of the patterns in the deviations from the rational behavior strategy that can be observed in the case of conspicuous waste, both in modern market economies in the form of conspicuous consumption and in archaic economies in the form of gift-exchange. The article also focuses on marketing strategies for advertising luxury consumption. It highlights the impact of symbolic capital (in Bourdieu's interpretation) on the social and symbolic payments that actors get from the act of conspicuous waste. This makes it possible to analyze conspicuous consumption both as a rational way to obtain a particular kind of payment and, at the same time, as a form of institutionalized cognitive bias.

  10. absolutely regular random sequences

    Directory of Open Access Journals (Sweden)

    Michel Harel

    1996-01-01

    Full Text Available In this paper, the central limit theorems for the density estimator and for the integrated square error are proved for the case when the underlying sequence of random variables is nonstationary. Applications to Markov processes and ARMA processes are provided.
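
    As a rough illustration of the quantities named in this record, and not of the paper's nonstationary setting, the sketch below computes a Gaussian kernel density estimate from an i.i.d. sample and approximates its integrated square error against the known true density. All names and parameter choices are illustrative.

    import numpy as np

    def kde(x_grid, sample, h):
        """Gaussian kernel density estimate evaluated on x_grid."""
        u = (x_grid[:, None] - sample[None, :]) / h
        return np.exp(-0.5 * u**2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

    rng = np.random.default_rng(0)
    sample = rng.normal(size=500)                        # i.i.d. stand-in for the dependent sequence
    grid = np.linspace(-4, 4, 801)
    true_density = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)

    h = 1.06 * sample.std() * len(sample) ** (-1 / 5)    # Silverman's rule-of-thumb bandwidth
    f_hat = kde(grid, sample, h)

    # Integrated square error, approximated by a Riemann sum over the grid
    ise = np.sum((f_hat - true_density) ** 2) * (grid[1] - grid[0])
    print(f"bandwidth h = {h:.3f}, ISE = {ise:.5f}")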

  11. Effects of Upper and Lower Cervical Spinal Manipulative Therapy on Blood Pressure and Heart Rate Variability in Volunteers and Patients With Neck Pain: A Randomized Controlled, Cross-Over, Preliminary Study.

    Science.gov (United States)

    Win, Ni Ni; Jorgensen, Anna Maria S; Chen, Yu Sui; Haneline, Michael T

    2015-03-01

    The aims of this study were to examine autonomic nervous system responses by using heart rate variability analysis (HRV), hemodynamic parameters and numeric pain scale (NPS) when either upper (C1 and C2) or lower (C6 and C7) cervical segments were manipulated in volunteers, and whether such responses would be altered in acute mechanical neck pain patients after spinal manipulative therapy (SMT). A randomized controlled, cross-over, preliminary study was conducted on 10 asymptomatic normotensive volunteers and 10 normotensive patients complaining of acute neck pain. HRV, blood pressure (BP) and heart rate (HR), and NPS were recorded after upper cervical and lower cervical segments SMT in volunteer and patient groups. The standard deviation of average normal-to-normal R-R intervals (SDNN) increased (83.54 ± 22 vs. 105.41 ± 20; P = .02) after upper cervical SMT. The normalized unit of high frequency (nuHF), which shows parasympathetic activity, was predominant (40.18 ± 9 vs. 46.08 ± 14) after upper cervical SMT (P = .03) with a significant decrease (109 ± 10 vs. 98 ± 5) in systolic BP (P = .002). Low frequency to high frequency (LF/HF) ratio, which shows predominance of sympathetic activity, increased (1.05 ± 0.7 vs. 1.51 ± 0.5; P = .02) after lower cervical SMT in the healthy volunteer group. However, there was an increase in SDNN (70.48 ± 18 vs. 90.23 ± 20; P = .02 and 75.19 ± 16 vs 97.52 ± 22; P = .01), a decrease in LF/HF ratio (1.33 ± 0.3 vs. 0.81 ± 0.2; P = .001 and 1.22 ± 0.4 vs. 0.86 ± 0.3; P = .02), which was associated with decreased systolic BP (105 ± 10 vs. 95 ± 9; P = .01 and 102 ± 9 vs. 91 ± 10; P = .02) and NPS scores (3 ± 1 vs. 0; P = .01 and 3 ± 1 vs. 1 ± 1; P = .03) following both upper and lower cervical SMT in the patient group. The baseline HR was 67 ± 9 vs 64 ± 5 (upper cervical) and 65 ± 7 vs 69 ± 11 (lower cervical) in both the healthy volunteer and patient groups. Upper cervical SMT enhances dominance of

  12. Short-term effects of a hypocaloric diet with low glycemic index and low glycemic load on body adiposity, metabolic variables, ghrelin, leptin, and pregnancy rate in overweight and obese infertile women: a randomized controlled trial.

    Science.gov (United States)

    Becker, Geórgia F; Passos, Eduardo P; Moulin, Cileide C

    2015-12-01

    Obesity is related to hormonal disorders that affect the reproductive system. Low-glycemic index (LGI) diets seem to exert a positive effect on weight loss and on metabolic changes that result from obesity. We investigated the effects of a hypocaloric diet with an LGI and low glycemic load on anthropometric and metabolic variables, ghrelin and leptin concentrations, and the pregnancy rate in overweight and obese infertile women who were undergoing in vitro fertilization (IVF). The study was a randomized block-design controlled trial in which we analyzed 26 overweight or obese infertile women. Patients were assigned to a hypocaloric LGI-diet group or a control group and followed the protocol for 12 wk. Body weight, body mass index (BMI), percentage of body fat, glucose, insulin, homeostasis model assessment of insulin resistance, serum lipids, reproductive hormones, leptin, acylated ghrelin, number of oocytes retrieved in the IVF cycle, and pregnancy rate were determined. There were greater reductions in body mass, BMI, percentage of body fat, waist:hip ratio, and leptin in the LGI-diet group than in the control group (P < 0.05). Despite a change of 18% in mean values, there was no significant increase in acylated ghrelin concentrations in the LGI group compared with the control group (P = 0.215). The LGI-diet group had 85.4% more oocytes retrieved than did the control group (7.75 ± 1.44 and 4.18 ± 0.87, respectively; P = 0.039) in the IVF cycle. Three patients (21.4%) in the LGI group experienced a spontaneous pregnancy during the follow-up, which generated 3 live births. The hypocaloric LGI diet promoted a decrease in BMI, percentage of body fat, and leptin concentrations, which improved oocyte development and pregnancy rate. These results support the clinical recommendation to advise overweight and obese women to lose weight through a balanced diet before being submitted for treatment with assisted reproduction technologies. A hypocaloric diet combined with LGI

  13. Random regression models

    African Journals Online (AJOL)

    zlukovi

    modelled as a quadratic regression, nested within parity. The previous lactation length was ... This proportion was mainly covered by linear and quadratic coefficients. Results suggest that RRM could ... The multiple trait models in scalar notation are presented by equations (1, 2), while equation (3) represents the random ...

  14. Want change? Call your representative

    Science.gov (United States)

    Fischhoff, Ilya R.

    2011-07-01

    During my tenure as an AGU Congressional Science Fellow, which began in September 2010 and continues until November 2011, my time has been shared between working with the U.S. House of Representatives Natural Resource Committee Democratic staff and in the office of Rep. Ed Markey (D-Mass., ranking Democrat on the committee). I appreciate getting to work with staff, fellows, and interns who inspire me, make me laugh, and know their issues cold. Much of my work on the committee is related to fish, wildlife, oceans, lands, and water issues and is directly related to my background in ecology and evolutionary biology (I studied zebra ecology and behavior in Kenya). My assignments have included asking the Environmental Protection Agency (EPA) about why it has not changed the allowed usage of certain pesticides that the National Marine Fisheries Service has found to jeopardize the recovery of endangered Pacific salmon; helping to identify research needs and management options to combat the swiftly spreading and catastrophic white nose syndrome in North American bats; and inquiring as to whether a captive-ape welfare bill, if passed without amendment, could thwart development of a vaccine to stop the Ebola virus from continuing to cause mass mortality in endangered wild apes.

  15. Representing Performance in Ethnomusicological Studies

    Directory of Open Access Journals (Sweden)

    Marco Lutzu

    2016-02-01

    Full Text Available Back in the 1970s a number of ethnomusicologists started to elaborate a theoretical reflection on performance as a central issue in the study of music making. This forced them to develop other ways of visualizing music for their analytical purposes. This article deals with how performance has been represented in ethnomusicological studies. I shall discuss how the graphic rendition of a sound recording is simply the mirror of what a scholar perceives, or the consequence of his/her will to emphasise a specific aspect, mediated through the possibilities offered by (and the limits of) the Western semiographic system. After presenting a series of examples on how various scholars chose to graphically visualize musical performance, this paper shows how the contemplation of the strategies used to visualize performance in ethnomusicological studies can be a fruitful way of reflecting upon various topics, namely: (1) the impassable limits of score transcription for understanding music as a performative phenomenon; (2) the analysis of the graphic solutions adopted by the ethnomusicologist as a way to better understand their idea of what the performance is; (3) the role played by technology in promoting new analytical approaches and methodologies; (4) analysis in ethnomusicology as an "artisanal process".

  16. The German version of the Perceived Stress Scale - psychometric characteristics in a representative German community sample.

    Science.gov (United States)

    Klein, Eva M; Brähler, Elmar; Dreier, Michael; Reinecke, Leonard; Müller, Kai W; Schmutzer, Gabriele; Wölfling, Klaus; Beutel, Manfred E

    2016-05-23

    The Perceived Stress Scale (PSS; Cohen, J Health Soc Behav 24:385-96, 1983) is a widely used and well-established self-report scale measuring perceived stress. However, the German version of the PSS-10 has not yet been validated. Thus, the purposes of this representative study were to psychometrically evaluate the PSS-10, and to provide norm values for the German population. The PSS-10 and standardized scales of depression, anxiety, fatigue, procrastination and life satisfaction were administered to a representative, randomly selected German community sample consisting of 1315 female and 1148 male participants in the age range from 14 to 90 years. The results demonstrated a good internal consistency and construct validity. Perceived stress was consistently associated with depression, anxiety, fatigue, procrastination and reduced life satisfaction. Confirmatory factor analysis revealed a bi-dimensional structure with two related latent factors. Regarding demographic variables, women reported a higher level of stress than men. Perceived stress decreased with higher education, income and employment status. Older and married participants felt less stressed than younger and unmarried participants. The PSS-10 is a reliable, valid and economic instrument for assessing perceived stress. As psychological stress is associated with an increased risk of diseases, identifying subpopulations with higher levels of stress is essential. Due to the dependency of the perceived stress level on demographic variables, particularly age and sex, differentiated norm values are needed, which are provided in this paper.

  17. Importance of randomness in biological networks: A random matrix ...

    Indian Academy of Sciences (India)

    2015-01-29

    Jan 29, 2015 ... We show that, in spite of the huge differences that these interaction networks, representing real-world systems, possess from random matrix models, the spectral properties of the underlying matrices of these networks follow random matrix theory, bringing them into the same universality class. We further demonstrate ...

  18. Forecasting Using Random Subspace Methods

    NARCIS (Netherlands)

    T. Boot (Tom); D. Nibbering (Didier)

    2016-01-01

    Random subspace methods are a novel approach to obtain accurate forecasts in high-dimensional regression settings. We provide a theoretical justification of the use of random subspace methods and show their usefulness when forecasting monthly macroeconomic variables. We focus on two

  19. Complex variables

    CERN Document Server

    Fisher, Stephen D

    1999-01-01

    The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic

  20. Multi-Agent Methods for the Configuration of Random Nanocomputers

    Science.gov (United States)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.
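
    The record does not give the configuration algorithm in detail. The sketch below is only a hypothetical illustration of the general idea of one simple learning agent per binary configuration variable, each nudging its action probability using a shared global reinforcement signal; the error function, update rule, and all parameters here are stand-ins, not the authors' method.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 40                                     # number of configuration bits
    target = rng.integers(0, 2, n)             # hidden "correct" configuration (stand-in error model)

    def error(cfg):
        # Stand-in for the chip's measured error: fraction of mismatched bits.
        return np.mean(cfg != target)

    p = np.full(n, 0.5)                        # each agent's probability of setting its bit to 1
    lr = 0.1
    best_err = 1.0
    for step in range(2000):
        cfg = (rng.random(n) < p).astype(int)  # every agent acts independently
        e = error(cfg)
        reward = 1.0 - e                       # shared global reinforcement signal
        # Reward-inaction style update: pull each agent toward its chosen action,
        # more strongly when the whole configuration scored well.
        p += lr * reward * (cfg - p)
        p = np.clip(p, 0.01, 0.99)             # keep a little exploration
        best_err = min(best_err, e)

    print(f"best error seen: {best_err:.3f}")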

  1. The Two Sides of the Representative Coin

    Directory of Open Access Journals (Sweden)

    Keith Sutherland

    2011-12-01

    Full Text Available In Federalist 10 James Madison drew a functional distinction between “parties” (advocates for factional interests) and “judgment” (decision-making for the public good) and warned of the corrupting effect of combining both functions in a “single body of men.” This paper argues that one way of overcoming “Madisonian corruption” would be by restricting political parties to an advocacy role, reserving the judgment function to an allotted (randomly selected) microcosm of the whole citizenry, who would determine the outcome of parliamentary debates by secret ballot—a division of labour suggested by James Fishkin’s experiments in deliberative polling. The paper then defends this radical constitutional proposal against Bernard Manin’s (1997) claim that an allotted microcosm could not possibly fulfil the “consent” requirement of Natural Right theory. Not only does the proposal challenge Manin’s thesis, but a 28th Amendment implementing it would finally reconcile the competing visions that have bedevilled representative democracy since the Constitutional Convention of 1787.

  2. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    Science.gov (United States)

    Richie, Megan; Josephson, S Andrew

    2018-01-01

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained

  3. The effects of a fat loss supplement on resting metabolic rate and hemodynamic variables in resistance trained males: a randomized, double-blind, placebo-controlled, cross-over trial.

    Science.gov (United States)

    Campbell, Bill I; Colquhoun, Ryan J; Zito, Gina; Martinez, Nic; Kendall, Kristina; Buchanan, Laura; Lehn, Matt; Johnson, Mallory; St Louis, Courtney; Smith, Yasmin; Cloer, Brad

    2016-01-01

    While it is known that dietary supplements containing a combination of thermogenic ingredients can increase resting metabolic rate (RMR), the magnitude can vary based on the active ingredient and/or combination of active ingredients. The purpose of this study was to examine the effects of a commercially available thermogenic fat loss supplement on RMR and hemodynamic variables in healthy, resistance trained males. Ten resistance-trained male participants (29 ± 9 years; 178 ± 4 cm; 85.7 ± 11 kg, and BMI = 26.8 ± 3.7) volunteered to participate in this randomized, double-blind, placebo-controlled cross-over study. Participants underwent two testing sessions separated by at least 24 h. On their first visit, participants arrived at the laboratory after an overnight fast and a 24-h avoidance of exercise, and underwent a baseline RMR, HR, and BP assessment. Next, each participant ingested a thermogenic fat loss supplement (TFLS) or a placebo (PLA) and repeated the RMR, HR, and BP assessments at 60, 120, and 180 min post-ingestion. During the second visit the alternative supplement was ingested and the assessments were repeated in the exact same manner. Data were analyzed via a 2-factor [2x4] within-subjects repeated measures analysis of variance (ANOVA). Post-hoc tests were analyzed via paired samples t-tests. The criterion for significance was set at p ≤ 0.05. A significant main effect for time relative to raw RMR data (p = 0.014) was observed. Post-hoc analysis revealed that the TFLS significantly increased RMR at 60-min, 120-min, and 180-min post ingestion (p < 0.05). Specifically, RMR was increased by 7.8 % (from 1,906 to 2,057 kcal), 6.9 % (from 1,906 to 2,037 kcal), and 9.1 % (from 1,906 to 2,081 kcal) in the TFLS, while the PLA treatment increased RMR by 3.3 % (from 1,919 to 1,981 kcal), 3.1 % (from 1,919 to 1,978 kcal), and 2.1 % (from 1,919 to 1,959 kcal) above baseline at 60, 120, and 180-min post ingestion

  4. Marginalization in Random Nonlinear Neural Networks

    Science.gov (United States)

    Vasudeva Raju, Rajkumar; Pitkow, Xaq

    2015-03-01

    Computations involved in tasks like causal reasoning in the brain require a type of probabilistic inference known as marginalization. Marginalization corresponds to averaging over irrelevant variables to obtain the probability of the variables of interest. This is a fundamental operation that arises whenever input stimuli depend on several variables, but only some are task-relevant. Animals often exhibit behavior consistent with marginalizing over some variables, but the neural substrate of this computation is unknown. It has been previously shown (Beck et al. 2011) that marginalization can be performed optimally by a deterministic nonlinear network that implements a quadratic interaction of neural activity with divisive normalization. We show that a simpler network can perform essentially the same computation. These Random Nonlinear Networks (RNN) are feedforward networks with one hidden layer, sigmoidal activation functions, and normally-distributed weights connecting the input and hidden layers. We train the output weights connecting the hidden units to an output population, such that the output model accurately represents a desired marginal probability distribution without significant information loss compared to optimal marginalization. Simulations for the case of linear coordinate transformations show that the RNN model has good marginalization performance, except for highly uncertain inputs that have low amplitude population responses. Behavioral experiments, based on these results, could then be used to identify if this model does indeed explain how the brain performs marginalization.
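
    A minimal sketch of the random-network ingredient described above, assuming fixed, normally distributed input-to-hidden weights, sigmoidal hidden units, and trained output weights (here fitted by ridge regression on a toy regression target). The population-coding and marginalization specifics of the study are not reproduced; the task and all parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)

    # Toy task: learn a nonlinear function of two inputs (stand-in for the
    # population-coded marginalization targets in the study).
    X = rng.uniform(-1, 1, size=(2000, 2))
    y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])

    n_hidden = 300
    W_in = rng.normal(0, 1.5, size=(2, n_hidden))      # fixed random input weights
    b = rng.normal(0, 1.0, size=n_hidden)

    def hidden(X):
        return 1.0 / (1.0 + np.exp(-(X @ W_in + b)))   # sigmoidal activations

    # Train only the hidden-to-output weights, by ridge regression.
    H = hidden(X)
    lam = 1e-3
    W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

    X_test = rng.uniform(-1, 1, size=(500, 2))
    y_test = np.sin(3 * X_test[:, 0]) * np.cos(2 * X_test[:, 1])
    pred = hidden(X_test) @ W_out
    print("test RMSE:", np.sqrt(np.mean((pred - y_test) ** 2)))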

  5. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    Science.gov (United States)

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted.

  6. Complex variables

    CERN Document Server

    Flanigan, Francis J

    2010-01-01

    A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer noting its originality wrote: "A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation." Not so here; Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion

  7. Complex variables

    CERN Document Server

    Taylor, Joseph L

    2011-01-01

    The text covers a broad spectrum between basic and advanced complex variables on the one hand and between theoretical and applied or computational material on the other hand. With careful selection of the emphasis put on the various sections, examples, and exercises, the book can be used in a one- or two-semester course for undergraduate mathematics majors, a one-semester course for engineering or physics majors, or a one-semester course for first-year mathematics graduate students. It has been tested in all three settings at the University of Utah. The exposition is clear, concise, and lively

  8. Can we apply the Mendelian randomization methodology without considering epigenetic effects?

    Directory of Open Access Journals (Sweden)

    Karmaus Wilfried

    2009-05-01

    Full Text Available Abstract Introduction Instrumental variable (IV) methods have been used in econometrics for several decades now, but have only recently been introduced into the epidemiologic research frameworks. Similarly, Mendelian randomization studies, which use the IV methodology for analysis and inference in epidemiology, were introduced into the epidemiologist's toolbox only in the last decade. Analysis Mendelian randomization studies using instrumental variables (IVs) have the potential to avoid some of the limitations of observational epidemiology (confounding, reverse causality, regression dilution bias) for making causal inferences. Certain limitations of randomized controlled trials, such as problems with generalizability, feasibility and ethics for some exposures, and high costs, also make the use of Mendelian randomization in observational studies attractive. Unlike conventional randomized controlled trials (RCTs), Mendelian randomization studies can be conducted in a representative sample without imposing any exclusion criteria or requiring volunteers to be amenable to random treatment allocation. Within the last decade, epigenetics has gained recognition as an independent field of study, and appears to be the new direction for future research into the genetics of complex diseases. Although previous articles have addressed some of the limitations of Mendelian randomization (such as the lack of suitable genetic variants, unreliable associations, population stratification, linkage disequilibrium (LD), pleiotropy, developmental canalization, the need for large sample sizes and some potential problems with binary outcomes), none has directly characterized the impact of epigenetics on Mendelian randomization. The possibility of epigenetic effects (non-Mendelian, heritable changes in gene expression not accompanied by alterations in DNA sequence) could alter the core instrumental variable assumptions of Mendelian randomization. This paper applies conceptual

  9. Variational Infinite Hidden Conditional Random Fields

    NARCIS (Netherlands)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin

    2015-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An Infinite hidden conditional random field is a hidden conditional random field with a countably infinite number of

  10. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  11. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  12. Variational Infinite Hidden Conditional Random Fields.

    Science.gov (United States)

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin

    2015-09-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An Infinite hidden conditional random field is a hidden conditional random field with a countably infinite number of hidden states, which rids us not only of the necessity to specify a priori a fixed number of hidden states available but also of the problem of overfitting. Markov chain Monte Carlo (MCMC) sampling algorithms are often employed for inference in such models. However, convergence of such algorithms is rather difficult to verify, and as the complexity of the task at hand increases the computational cost of such algorithms often becomes prohibitive. These limitations can be overcome by variational techniques. In this paper, we present a generalized framework for infinite HCRF models, and a novel variational inference approach on a model based on coupled Dirichlet Process Mixtures, the HCRF-DPM. We show that the variational HCRF-DPM is able to converge to a correct number of represented hidden states, and performs as well as the best parametric HCRFs-chosen via cross-validation-for the difficult tasks of recognizing instances of agreement, disagreement, and pain in audiovisual sequences.

  13. Linear and Nonlinear Heart Rate Variability Indexes in Clinical Practice

    Directory of Open Access Journals (Sweden)

    Buccelletti Francesco

    2012-01-01

    Full Text Available Biological organisms have intrinsic control systems that act in response to internal and external stimuli maintaining homeostasis. Human heart rate is not regular and varies in time and such variability, also known as heart rate variability (HRV), is not random. HRV depends upon the organism's physiologic and/or pathologic state. Physicians are always interested in predicting a patient's risk of developing major and life-threatening complications. Understanding biological signal behavior helps to characterize the patient's state and might represent a step toward better care. The main advantage of signals such as HRV indexes is that they can be calculated in real time in a noninvasive manner, while all current biomarkers used in clinical practice are discrete and imply blood sample analysis. In this paper HRV linear and nonlinear indexes are reviewed and data from real patients are provided to show how these indexes might be used in clinical practice.
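
    For concreteness, the sketch below computes two standard linear (time-domain) HRV indexes, SDNN and RMSSD, from a synthetic series of R-R intervals. The nonlinear indexes discussed in the paper are not shown, and the data are made up.

    import numpy as np

    def sdnn(rr_ms):
        """Standard deviation of normal-to-normal R-R intervals (ms)."""
        return np.std(rr_ms, ddof=1)

    def rmssd(rr_ms):
        """Root mean square of successive R-R interval differences (ms)."""
        return np.sqrt(np.mean(np.diff(rr_ms) ** 2))

    rng = np.random.default_rng(3)
    # Synthetic R-R series: roughly 70 bpm with some beat-to-beat variability.
    rr = 857 + rng.normal(0, 40, size=300)

    print(f"SDNN  = {sdnn(rr):.1f} ms")
    print(f"RMSSD = {rmssd(rr):.1f} ms")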

  14. Random thoughts

    Science.gov (United States)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  15. Comparison of variance estimators for metaanalysis of instrumental variable estimates

    NARCIS (Netherlands)

    Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two

  16. Transformation of independent variables in polynomial regression ...

    African Journals Online (AJOL)

    In representing a relationship between a response and a number of independent variables, it is preferable when possible to work with a simple functional form in transformed variables rather than with a more complicated form in the original variables. In this paper, it is shown that linear transformations applied to ...

  17. VT Biodiversity Project - Representative Landscapes boundary lines

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) This coverage represents the results of an analysis of landscape diversity in Vermont. Polygons in the dataset represent as much as possible (in a...

  18. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters...

  19. The technology acceptance puzzle. Results of a representative survey in Lower Saxony.

    Science.gov (United States)

    Künemund, Harald; Tanschus, Nele Marie

    2014-12-01

    It is widely taken for granted that the interest in technology decreases with increasing age. Many studies and especially large-scale surveys seem to confirm declining technology acceptance; however, it is argued that composition effects (e.g. increasing proportions of women among the older age groups), cohort effects (e.g. experience with different technologies during the lifetime) and various living and health conditions (e.g. living alone, having children in the neighborhood and experience of falls) have to be taken into account and that these factors will have different impacts on the acceptance of different scenarios of assistive technologies. The analyses are based on data from a self-administered questionnaire (n = 2032, a representative random sample of individuals aged 50 years and above in Lower Saxony, Germany). The survey briefly introduced four scenarios of ambient assisted living (AAL) technologies. Multinomial logistic regression was used to explore the correlations of acceptance and the independent variables mentioned. The results show that the simple assumption of an age effect, i.e. technology acceptance generally declines with increasing age, is misleading. An answer to the question whether older people will make use of assistive technologies in the future should consider specific scenarios and also various socioeconomic variables.

  20. 37 CFR 3.61 - Domestic representative.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Domestic representative. 3.61... COMMERCE GENERAL ASSIGNMENT, RECORDING AND RIGHTS OF ASSIGNEE Domestic Representative § 3.61 Domestic... is not domiciled in the United States, the assignee may designate a domestic representative in a...

  1. Universal randomness

    Energy Technology Data Exchange (ETDEWEB)

    Dotsenko, Viktor S [Landau Institute for Theoretical Physics, Russian Academy of Sciences, Moscow (Russian Federation)

    2011-03-31

    In the last two decades, it has been established that a single universal probability distribution function, known as the Tracy-Widom (TW) distribution, in many cases provides a macroscopic-level description of the statistical properties of microscopically different systems, including both purely mathematical ones, such as increasing subsequences in random permutations, and quite physical ones, such as directed polymers in random media or polynuclear crystal growth. In the first part of this review, we use a number of models to examine this phenomenon at a simple qualitative level and then consider the exact solution for one-dimensional directed polymers in a random environment, showing that free energy fluctuations in such a system are described by the universal TW distribution. The second part provides detailed appendix material containing the necessary mathematical background for the first part. (reviews of topical problems)

  2. Random triangles

    OpenAIRE

    Matula, Dominik

    2013-01-01

    The author summarizes some previous results concerning random triangles. He describes the Gaussian triangle and random triangles whose vertices lie in a unit n-dimensional ball, in a rectangle or in a general bounded convex set. In the second part, the author deals with an inscribed triangle in a triangle - let ABC be an equilateral triangle and let M, N, O be three points, each lying on one side of ABC. We call MNO an inscribed triangle (in an equilateral triangle). The median triangle ...

  3. Random matrices

    CERN Document Server

    Mehta, Madan Lal

    1990-01-01

    Since the publication of Random Matrices (Academic Press, 1967) so many new results have emerged both in theory and in applications, that this edition is almost completely revised to reflect the developments. For example, the theory of matrices with quaternion elements was developed to compute certain multiple integrals, and the inverse scattering theory was used to derive asymptotic results. The discovery of Selberg's 1944 paper on a multiple integral also gave rise to hundreds of recent publications. This book presents a coherent and detailed analytical treatment of random matrices, leading

  4. Machine learning techniques to select variable stars

    Science.gov (United States)

    García-Varela, Alejandro; Pérez, Muriel; Sabogal, Beatriz; Quiroz, Adolfo

    2017-09-01

    In order to perform a supervised classification of variable stars, we propose and evaluate a set of six features extracted from the magnitude density of the light curves. They are used to train automatic classification systems using state-of-the-art classifiers implemented in the R statistical computing environment. We find that random forests is the most successful method to select variables.
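
    The study works in R with six features extracted from the light curves; the snippet below is a hypothetical Python/scikit-learn analogue on synthetic features, shown only to illustrate the train-and-evaluate pattern for a random forest classifier. The feature construction and labels are made up.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(4)
    n = 1000
    # Six synthetic light-curve features; "variable" stars (label 1) get a shifted mean.
    labels = rng.integers(0, 2, n)
    features = rng.normal(0, 1, size=(n, 6)) + 0.8 * labels[:, None]

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))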

  5. Time Series Analysis: A New Methodology for Comparing the Temporal Variability of Air Temperature

    Directory of Open Access Journals (Sweden)

    Piia Post

    2013-01-01

    Full Text Available Temporal variability of three different temperature time series was compared by the use of statistical modeling of time series. The three temperature time series represent the same physical process, but are at different levels of spatial averaging: temperatures from point measurements, from regional Baltan65+, and from global ERA-40 reanalyses. The first-order integrated moving average model IMA(0, 1, 1) is used to compare the temporal variability of the time series. The applied IMA(0, 1, 1) model is divisible into a sum of a random walk and a white noise component, where the variances for both white noises (one of them serving as a generator of the random walk) are computable from the parameters of the fitted model. This approach enables us to compare the models fitted independently to the original and restored series using two new parameters. This operation adds a certain new method to the analysis of nonstationary series.
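
    A sketch of the decomposition idea, assuming the standard equivalence between a local-level model (random walk plus white noise) and an IMA(0,1,1) with a non-positive MA coefficient: fit the IMA model and back out the two noise variances from the fitted MA coefficient and innovation variance. The simulated series and all parameter values are illustrative, not the paper's temperature data.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(5)
    T = 2000
    sigma_rw, sigma_wn = 0.3, 1.0
    level = np.cumsum(rng.normal(0, sigma_rw, T))        # random walk component
    y = level + rng.normal(0, sigma_wn, T)               # observed series = walk + white noise

    res = ARIMA(y, order=(0, 1, 1), trend="n").fit()
    params = np.asarray(res.params)                      # [ma.L1, sigma2] when no trend term is included
    theta, sigma2_eps = params[0], params[1]

    # For an IMA(0,1,1) with theta <= 0, the implied decomposition is:
    #   white-noise variance  = -theta * sigma2_eps
    #   random-walk variance  = (1 + theta)**2 * sigma2_eps
    print("estimated white-noise sd :", np.sqrt(-theta * sigma2_eps))
    print("estimated random-walk sd :", np.sqrt((1 + theta) ** 2 * sigma2_eps))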

  6. Represented Speech in Qualitative Health Research

    DEFF Research Database (Denmark)

    Musaeus, Peter

    2017-01-01

    Represented speech refers to speech where we reference somebody. Represented speech is an important phenomenon in everyday conversation, health care communication, and qualitative research. This case will draw first from a case study on physicians’ workplace learning and second from a case study on nurses’ apprenticeship learning. The aim of the case is to guide the qualitative researcher to use their own and others’ voices in the interview and to be sensitive to represented speech in everyday conversation. Moreover, reported speech matters to health professionals who aim to represent the voice of their patients. Qualitative researchers and students might learn to encourage interviewees to elaborate different voices or perspectives. Qualitative researchers working with natural speech might pay attention to how people talk and use represented speech. Finally, represented speech might be relevant...

  7. The predictive value of transthoracic echocardiographic variables for sinus rhythm maintenance after electrical cardioversion of atrial fibrillation. Results from the CAPRAF study, a prospective, randomized, placebo-controlled study.

    Science.gov (United States)

    Grundvold, Irene; Tveit, Arnljot; Smith, Pål; Seljeflot, Ingebjørg; Abdelnoor, Michael; Arnesen, Harald

    2008-01-01

    The recurrence rate of atrial fibrillation after electrical cardioversion is disappointingly high. The aim of the present study was to prospectively investigate if standard echocardiographic variables on the day of cardioversion could predict sinus rhythm maintenance. Transthoracic echocardiographic examination was performed within 4 h after cardioversion for all the patients in the CAPRAF (Candesartan in the Prevention of Relapsing Atrial Fibrillation) study. Cardioversion was successful for 137 patients not given specific antiarrhythmic therapy, and only 41 (30%) maintained sinus rhythm at 6-month follow-up. There were significantly (p = 0.05) lower transmitral A wave velocities in the group with relapsing atrial fibrillation compared with the group with sinus rhythm at 6-month follow-up. All patients with the lowest A wave velocities had an early recurrence of atrial fibrillation. There were no differences between the groups regarding atrial dimensions or left ventricular function. The use of the angiotensin II receptor antagonist candesartan had no influence on the echocardiographic variables, nor on the recurrence rate of atrial fibrillation after cardioversion. Transthoracic echocardiographic examination performed a short time after electrical cardioversion of atrial fibrillation showed that only A wave peak velocities were significantly predictive of sinus rhythm maintenance 6 months after the procedure. (c) 2008 S. Karger AG, Basel.

  8. Nonparametric estimation in random sum models

    Directory of Open Access Journals (Sweden)

    Hassan S. Bakouch

    2013-05-01

    Full Text Available Let X1,X2,…,XN be independent, identically distributed, non-negative, integer-valued random variables and let N be a non-negative, integer-valued random variable independent of X1,X2,…,XN. In this paper, we consider two nonparametric estimation problems for the random sum variable. The first is the estimation of the means of Xi and N based on second-moment assumptions on the distributions of Xi and N. The second is the nonparametric estimation of the distribution of Xi given a parametric model for the distribution of N. Some asymptotic properties of the proposed estimators are discussed.
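
    As a simulation-based illustration of the random-sum setup (not the paper's estimators), the sketch below draws compound sums and checks the moment identities E[S] = E[N]E[X] and Var(S) = E[N]Var(X) + Var(N)E[X]^2. The Poisson choices for both N and X are purely for convenience.

    import numpy as np

    rng = np.random.default_rng(6)
    n_sums = 20_000
    lam, mu = 4.0, 2.5                     # N ~ Poisson(lam), each summand X ~ Poisson(mu)

    N = rng.poisson(lam, n_sums)
    S = np.array([rng.poisson(mu, n).sum() for n in N])   # random sums S = X1 + ... + XN

    EX, VarX = mu, mu
    EN, VarN = lam, lam
    print("empirical  E[S] :", S.mean(), "   theory:", EN * EX)
    print("empirical Var[S]:", S.var(),  "   theory:", EN * VarX + VarN * EX**2)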

  9. Biological variation of thromboelastrography variables in 10 clinically healthy horses.

    Science.gov (United States)

    Scruggs, Jennifer L; Flatland, Bente; McCormick, Karen A; Reed, Ann

    2016-01-01

    To assess the utility of population-based reference intervals (PRIs) for interpreting thromboelastography (TEG) variables in horses using biological variation data. Prospective cohort biologic variation study conducted over a 5-week period. Veterinary teaching hospital and research facility. Ten clinically healthy horses randomly selected from a veterinary school research and teaching herd. Horse health was determined using physical examination, CBC, and biochemical and coagulation profiles prior to the start of the study. Subsequently, once weekly blood sampling for TEG testing was performed for 5 weeks. The 4 TEG variables reaction time (R), clot formation time (K), angle, and maximum amplitude (MA) were measured, and coefficient of variation representing within- and between-horse biological variation (CVi and CVg, respectively) and coefficient of variation representing analytical variation (CVa) were calculated using a nested ANOVA after removing outlier data. The CVi, CVg, and CVa for R were 26.8%, 5.2%, and 5.9%; for K were 31.0%, 0.0%, and 5.9%; for angle were 9.4%, 6.2%, and 21.7%; and for MA were 3.4%, 4.1%, and 4.4%, respectively. Index of individuality (IOI) was then calculated for each variable using the formula {(CVi² + CVa²)/CVg²}¹/², that is, √(CVi² + CVa²)/CVg. IOI for R was 5.3, for angle was 3.8, and for MA was 1.4; IOI was not assessed for K. PRIs are appropriate for the TEG variables R, angle, and MA when interpreting results from individual horses based on calculated IOI values equal to or greater than 1.4. PRIs are likely appropriate when interpreting K, but IOI could not be calculated for this variable. ©Veterinary Emergency and Critical Care Society 2015.
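
    Applying the index-of-individuality formula to the reported coefficients of variation reproduces the stated IOI values (about 5.3 for R, 3.8 for angle, and 1.4 for MA); a minimal check:

    import math

    def index_of_individuality(cv_i, cv_a, cv_g):
        """IOI = sqrt(CVi^2 + CVa^2) / CVg, with all CVs in percent."""
        return math.sqrt(cv_i**2 + cv_a**2) / cv_g

    # (CVi, CVa, CVg) as reported for R, angle, and MA
    for name, cvs in {"R": (26.8, 5.9, 5.2),
                      "angle": (9.4, 21.7, 6.2),
                      "MA": (3.4, 4.4, 4.1)}.items():
        print(name, round(index_of_individuality(*cvs), 1))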

  10. Representing a public that does not exist

    DEFF Research Database (Denmark)

    Korsgaard, Morten Timmermann

    In her seminal essay ‘The Crisis in Education’ Hannah Arendt presents teachers with the challenge of having to represent a world that is becoming out of joint. This means that the teacher stands as a representative of a world that is becoming increasingly inexplicable and incomprehensible, and th...

  11. Representing Animal-Others in Educational Research

    Science.gov (United States)

    Kuhl, Gail J.

    2011-01-01

    This paper encourages environmental and humane education scholars to consider the ethical implications of how nonhuman animals are represented in research. I argue that research representations of animals can work to either break down processes of "othering," or reinforce them. I explore various options for representing other animals, including…

  12. Speech production variability in fricatives of children and adults: results of functional data analysis.

    Science.gov (United States)

    Koenig, Laura L; Lucero, Jorge C; Perlman, Elizabeth

    2008-11-01

    This study investigates token-to-token variability in fricative production of 5 year olds, 10 year olds, and adults. Previous studies have reported higher intrasubject variability in children than adults, in speech as well as nonspeech tasks, but authors have disagreed on the causes and implications of this finding. The current work assessed the characteristics of age-related variability across articulators (larynx and tongue) as well as in temporal versus spatial domains. Oral airflow signals, which reflect changes in both laryngeal and supralaryngeal apertures, were obtained for multiple productions of /h s z/. The data were processed using functional data analysis, which provides a means of obtaining relatively independent indices of amplitude and temporal (phasing) variability. Consistent with past work, both temporal and amplitude variabilities were higher in children than adults, but the temporal indices were generally less adultlike than the amplitude indices for both groups of children. Quantitative and qualitative analyses showed considerable speaker- and consonant-specific patterns of variability. The data indicate that variability in /s/ may represent laryngeal as well as supralaryngeal control and further that a simple random noise factor, higher in children than in adults, is insufficient to explain developmental differences in speech production variability.

  13. Random Constraint Satisfaction Problems

    Directory of Open Access Journals (Sweden)

    Amin Coja-Oghlan

    2009-11-01

    Full Text Available Random instances of constraint satisfaction problems such as k-SAT provide challenging benchmarks. If there are m constraints over n variables there is typically a large range of densities r=m/n where solutions are known to exist with probability close to one due to non-constructive arguments. However, no algorithms are known to find solutions efficiently with a non-vanishing probability at even much lower densities. This fact appears to be related to a phase transition in the set of all solutions. The goal of this extended abstract is to provide a perspective on this phenomenon, and on the computational challenge that it poses.
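
    To make the density parameter concrete, the sketch below generates random 3-SAT instances at density r = m/n and estimates the satisfiability probability by brute force for a small n. The instance sizes are far below those used in real benchmarks and are chosen only so that exhaustive checking stays cheap.

    import itertools
    import random

    random.seed(0)

    def random_ksat(n, m, k=3, rng=random):
        """Generate m random k-clauses over n Boolean variables (no repeated variable in a clause)."""
        inst = []
        for _ in range(m):
            vars_ = rng.sample(range(n), k)
            inst.append([(v, rng.random() < 0.5) for v in vars_])   # (variable index, sign)
        return inst

    def satisfiable(inst, n):
        # Brute-force check over all 2**n assignments (feasible only for small n).
        for bits in itertools.product([False, True], repeat=n):
            if all(any(bits[v] == sign for v, sign in clause) for clause in inst):
                return True
        return False

    n, trials = 12, 100
    for r in (2.0, 4.0, 6.0):
        m = int(r * n)
        sat = sum(satisfiable(random_ksat(n, m), n) for _ in range(trials))
        print(f"r = {r:.1f}: empirical P(sat) ~ {sat / trials:.2f}")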

  14. Random Cell Identifiers Assignment

    Directory of Open Access Journals (Sweden)

    Robert Bestak

    2012-01-01

    Full Text Available Despite integration of advanced functions that enable Femto Access Points (FAPs) to be deployed in a plug-and-play manner, the femtocell concept still leaves several open issues to be resolved. One of them is the assignment of Physical Cell Identifiers (PCIs) to FAPs. This paper analyses a random-based assignment algorithm in LTE systems operating in diverse femtocell scenarios. The performance of the algorithm is evaluated by comparing the number of confusions for various femtocell densities, PCI ranges and knowledge of vicinity. Simulation results show that better knowledge of vicinity can significantly reduce the number of confusion events.
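
    The record does not spell out the confusion metric or the topology model; the sketch below is a hypothetical simulation in which FAPs are dropped uniformly at random, PCIs are assigned uniformly at random, and a confusion is counted whenever a cell has two neighbours sharing the same PCI. All parameters (cell counts, PCI range, neighbourhood radius) are illustrative.

    import numpy as np

    rng = np.random.default_rng(7)

    def count_confusions(n_cells, pci_range, radius):
        """Confusion: a cell has two neighbours (within `radius`) that share the same PCI."""
        pos = rng.random((n_cells, 2))                 # FAPs dropped uniformly in a unit square
        pci = rng.integers(0, pci_range, n_cells)      # random PCI assignment
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        neighbours = (d < radius) & ~np.eye(n_cells, dtype=bool)
        confusions = 0
        for i in range(n_cells):
            nb = np.where(neighbours[i])[0]
            confusions += len(nb) - len(set(pci[nb]))  # duplicated PCIs among cell i's neighbours
        return confusions

    for n_cells in (50, 100, 200):
        print(n_cells, "cells:", count_confusions(n_cells, pci_range=30, radius=0.15), "confusions")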

  15. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by

  16. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  17. Random walks in a random environment

    Indian Academy of Sciences (India)


    Abstract. Random walks as well as diffusions in random media are considered. Methods are developed that allow one to establish large deviation results for both the 'quenched' and the 'averaged' case. Keywords. Large deviations; random walks in a random environment. 1. Introduction. A random walk on Zd is a stochastic ...

  18. Use of single-representative reverse-engineered surface-models for RSA does not affect measurement accuracy and precision.

    Science.gov (United States)

    Seehaus, Frank; Schwarze, Michael; Flörkemeier, Thilo; von Lewinski, Gabriela; Kaptein, Bart L; Jakubowitz, Eike; Hurschler, Christof

    2016-05-01

    Implant migration can be accurately quantified by model-based Roentgen stereophotogrammetric analysis (RSA), using an implant surface model to locate the implant relative to the bone. In a clinical situation, a single reverse engineering (RE) model for each implant type and size is used. It is unclear to what extent the accuracy and precision of migration measurement is affected by implant manufacturing variability unaccounted for by a single representative model. Individual RE models were generated for five short-stem hip implants of the same type and size. Two phantom analyses and one clinical analysis were performed: "Accuracy-matched models": one stem was assessed, and the results from the original RE model were compared with randomly selected models. "Accuracy-random model": each of the five stems was assessed and analyzed using one randomly selected RE model. "Precision-clinical setting": implant migration was calculated for eight patients, and all five available RE models were applied to each case. For the two phantom experiments, the 95%CI of the bias ranged from -0.28 mm to 0.30 mm for translation and -2.3° to 2.5° for rotation. In the clinical setting, precision is less than 0.5 mm and 1.2° for translation and rotation, respectively, except for rotations about the proximodistal axis. Accurate and precise migration measurement with model-based RSA can be achieved and is not biased by using a single representative RE model. At least for implants similar in shape to the investigated short stem, individual models are not necessary. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:903-910, 2016.

  19. REFractions: The Representing Equivalent Fractions Game

    Science.gov (United States)

    Tucker, Stephen I.

    2014-01-01

    Stephen Tucker presents a fractions game that addresses a range of fraction concepts including equivalence and computation. The REFractions game also improves students' fluency with representing, comparing and adding fractions.

  20. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data......) that fully cover all practical aspects of sampling and provides a handy “toolbox” for samplers, engineers, laboratory and scientific personnel....

  1. Enhancing policy innovation by redesigning representative democracy

    DEFF Research Database (Denmark)

    Sørensen, Eva

    2016-01-01

    Policy innovation is a key aspect of public innovation, which has been largely overlooked. Political leadership, competition and collaboration are key drivers of policy innovation. It is a barrier in traditional models of representative democracy that they provide weak conditions for collaboration. Two Danish case studies indicate that collaboration between politicians and relevant and affected stakeholders can promote policy innovation, but also that a redesign of representative democracy is needed in order to establish a productive combination of political leadership, competition and collaboration.

  2. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve load over many years or decades. CEMs can be computationally complex and are often forced to estimate key parameters using simplified methods to achieve acceptable solve times or for other reasons. In this paper, we discuss one of these parameters -- capacity value (CV). We first provide a high-level motivation for and overview of CV. We next describe existing modeling simplifications and an alternate approach for estimating CV that utilizes hourly '8760' data of load and VG resources. We then apply this 8760 method to an established CEM, the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) model (Eurek et al. 2016). While this alternative approach for CV is not itself novel, it contributes to the broader CEM community by (1) demonstrating how a simplified 8760 hourly method, which can be easily implemented in other power sector models when data is available, more accurately captures CV trends than a statistical method within the ReEDS CEM, and (2) providing a flexible modeling framework from which other 8760-based system elements (e.g., demand response, storage, and transmission) can be added to further capture important dynamic interactions, such as curtailment.
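
    The record does not include the ReEDS implementation details, but the core of an 8760-based capacity value estimate can be sketched as follows: take paired hourly load and variable-generation (VG) profiles and average the VG output over the highest-load hours. The profiles, the number of peak hours considered, and the helper name capacity_value_8760 below are all hypothetical; this is a minimal illustration, not NREL's method.

```python
import numpy as np

def capacity_value_8760(load, vg_gen, vg_capacity, top_hours=100):
    """Capacity value approximated as the mean VG output, as a fraction of
    nameplate capacity, during the top-load hours of an 8760-hour year."""
    load = np.asarray(load)
    vg_gen = np.asarray(vg_gen)
    peak_idx = np.argsort(load)[-top_hours:]      # hours with the highest load
    return vg_gen[peak_idx].mean() / vg_capacity  # capacity credit in [0, 1]

# Hypothetical hourly profiles for one year (8760 values each)
rng = np.random.default_rng(0)
hours = np.arange(8760)
load = 1000 + 200 * np.sin(2 * np.pi * hours / 24) + 50 * rng.standard_normal(8760)
solar = 100 * np.clip(np.sin(2 * np.pi * hours / 24), 0, None)   # a 100 MW plant

print(f"approximate solar capacity value: {capacity_value_8760(load, solar, 100.0):.2f}")
```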

  3. Effects of low-dose versus placebo or conventional-dose postmenopausal hormone therapy on variables related to cardiovascular risk: a systematic review and meta-analyses of randomized clinical trials.

    Science.gov (United States)

    Casanova, Gislaine; Bossardi Ramos, Ramon; Ziegelmann, Patrícia; Spritzer, Poli Mara

    2015-03-01

    Hormone therapy (HT), the most efficient treatment for menopausal symptoms, might have deleterious cardiovascular (CV) effects. This study aimed to evaluate the effects of low-dose estrogen HT on CV risk factors vs conventional-dose HT and placebo in postmenopausal women with no established CV disease. MEDLINE, Cochrane Central, and EMBASE were searched for trials published in 1990-2013; a hand search of reference lists of selected articles was performed; and ClinicalTrials.gov was searched for unpublished trials. A total of 11,418 records were initially identified in the search for randomized controlled trials of healthy postmenopausal women comparing low-dose HT to placebo or conventional-dose HT. Data were independently extracted by two investigators. Disagreements were resolved by a third author. Twenty-eight trials (3,360 patients) were included. Low-dose HT vs placebo or conventional-dose HT did not affect weight, body mass index (BMI), blood pressure, C-reactive protein (CRP), or high-density lipoprotein cholesterol (HDL-C). Low-dose HT was associated with lower levels of total cholesterol (-12.16 mg/dL; 95% confidence interval [CI], -17.41 to -6.92) and low-density lipoprotein cholesterol (LDL-C) (-12.16 mg/dL; 95% CI, -16.55 to -7.77) vs placebo. Compared with conventional-dose HT, low-dose HT was associated with higher total cholesterol (5.05 mg/dL; 95% CI, 0.88 to 9.21) and LDL-C (4.49 mg/dL; 95% CI, 0.59 to 8.39). Low-dose HT was not associated with differences in triglycerides vs placebo. Oral, low-dose HT was associated with lower triglycerides vs conventional-dose HT (-14.09 mg/dL; 95% CI, -24.2 to -3.93). In this population of apparently healthy postmenopausal women, the effect of low-dose HT did not differ from that of placebo or conventional-dose HT regarding weight, BMI, blood pressure, CRP, or HDL-C. In contrast, low-dose HT was associated with a better lipid profile vs placebo, and induced higher total and LDL cholesterol and lower triglycerides vs conventional-dose HT.

  4. Effects of neuromuscular electrical stimulation, laser therapy and LED therapy on the masticatory system and the impact on sleep variables in cerebral palsy patients: a randomized, five arms clinical trial

    Directory of Open Access Journals (Sweden)

    Giannasi Lilian

    2012-05-01

    Full Text Available Abstract Background Few studies demonstrate the effectiveness of therapies for oral rehabilitation of patients with cerebral palsy (CP), given the difficulties in chewing, swallowing and speech, besides the intellectual, sensory and social limitations. Due to upper airway obstruction, they are also vulnerable to sleep disorders. This study aims to assess sleep variables, through polysomnography, and masticatory dynamics, using electromyography, before and after neuromuscular electrical stimulation, associated or not with low-power laser (gallium-aluminum arsenide, λ = 780 nm) and LED (λ = 660 nm) irradiation in CP patients. Methods/design 50 patients with CP, of both genders, aged between 19 and 60 years, will be enrolled in this study. The inclusion criteria are: voluntary participation, patient with hemiparetic, quadriparetic or diparetic CP, with ability to understand and respond to verbal commands. The exclusion criteria are: patients undergoing or having undergone orthodontic, functional maxillary orthopedic or botulinum toxin treatment. Polysomnographic and surface electromyographic exams of the masseter, temporalis and suprahyoid muscles will be carried out on the whole sample. A questionnaire assessing oral characteristics will be applied. The sample will be divided into 5 treatment groups: Group 1: neuromuscular electrical stimulation; Group 2: laser therapy; Group 3: LED therapy; Group 4: neuromuscular electrical stimulation and laser therapy; and Group 5: neuromuscular electrical stimulation and LED therapy. All patients will be treated for 8 consecutive weeks. After treatment, polysomnographic and electromyographic exams will be collected again. Discussion This paper describes a five-arm clinical trial assessing sleep quality and masticatory function in patients with CP under non-invasive therapies. Trial registration The protocol for this study is registered with the Brazilian Registry of Clinical Trials - ReBEC RBR-994XFS Descriptors Cerebral Palsy

  5. Classification and prediction of port variables

    Energy Technology Data Exchange (ETDEWEB)

    Molina Serrano, B.

    2016-07-01

    Many variables are included in the planning and management of port terminals. They can be economic, social, environmental and institutional. The agent needs to know the relationships between these variables in order to modify planning conditions. The use of Bayesian networks allows these variables to be classified, predicted and diagnosed. Bayesian networks allow the posterior probability of unknown variables to be estimated, based on known variables. At the planning level, this means that it is not necessary to know all variables because their relationships are known. The agent can obtain useful information about how port variables are connected, which can be interpreted as cause-effect relationships. Bayesian networks can also be used to make optimal decisions by introducing possible actions and the utility of their results. In the proposed methodology, a database has been generated with more than 40 port variables. They have been classified into economic, social, environmental and institutional variables, in the same way as the smart port studies in the Spanish port system. From this database, a network has been built as a directed acyclic graph, which reveals the relationships between port variables (parent-child relationships). The obtained network shows that economic variables are, in cause-effect terms, the cause of the remaining variable typologies; economic variables play the parent role in most cases. Moreover, when environmental variables are known, the obtained network allows the posterior probability of social variables to be estimated. It is concluded that Bayesian networks allow uncertainty to be modeled in a probabilistic way, even when the number of variables is high, as occurs in the planning and management of port terminals. (Author)
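
    As a rough illustration of the kind of inference described above (estimating the posterior probability of a social variable once an environmental variable is observed), the following sketch enumerates a three-node toy network with an economic variable as a common parent. The variable names, states and probabilities are hypothetical and unrelated to the actual 40-variable port network.

```python
# Toy discrete Bayesian network with a common cause:
#   Economic -> Environmental,  Economic -> Social.
# Observing the environmental variable updates the social variable
# through the shared economic parent. All numbers are hypothetical.
p_econ = {"strong": 0.6, "weak": 0.4}
p_env_given_econ = {"strong": {"good": 0.7, "poor": 0.3},
                    "weak":   {"good": 0.2, "poor": 0.8}}
p_soc_given_econ = {"strong": {"high": 0.8, "low": 0.2},
                    "weak":   {"high": 0.3, "low": 0.7}}

def posterior_social(env_obs):
    """P(Social | Environmental = env_obs), marginalising over Economic."""
    joint = {soc: sum(p_econ[e] * p_env_given_econ[e][env_obs] * p_soc_given_econ[e][soc]
                      for e in p_econ)
             for soc in ("high", "low")}
    z = sum(joint.values())
    return {s: round(v / z, 3) for s, v in joint.items()}

print(posterior_social("good"))   # belief in 'high' social performance rises
print(posterior_social("poor"))   # and falls when the environment is observed poor
```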

  6. Random tensors

    CERN Document Server

    Gurau, Razvan

    2017-01-01

    Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in Physics confirming long standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce today the need to find an underlying common framework of the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....

  7. Efficiently representing the integer factorization problem using binary decision diagrams

    OpenAIRE

    Skidmore, David

    2017-01-01

    Let p be a prime positive integer and let α be a positive integer greater than 1. A method is given to reduce the problem of finding a nontrivial factorization of α to the problem of finding a solution to a system of modulo p polynomial congruences where each variable in the system is constrained to the set {0,...,p − 1}. In the case that p = 2 it is shown that each polynomial in the system can be represented by an ordered binary decision diagram with size less than 20.25·log2(α)^3 + 16.5·log2(α...

  8. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

    The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.

  9. Fatigue durability under random cyclic loading

    Science.gov (United States)

    Volkov, S. S.; Struzhanov, V. V.

    2017-12-01

    Methods for predicting the durability of metals have been developed while taking into account the effect of random loads acting during the service life of structures. Methods of the theory of functions of random variables are applied. Normal distribution and the Rayleigh distribution are used to estimate the random loads from the experimental histogram. A numerical example illustrates the influence of the statistical parameters of the load distribution on the predicted number of cycles before failure.
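
    A minimal Monte Carlo sketch in the spirit of the approach above might draw random stress amplitudes from a Rayleigh distribution and accumulate damage with Miner's linear rule against a Basquin S-N curve; varying the distribution's scale parameter then shows how the load statistics shift the predicted number of cycles before failure. The S-N constants and load scales below are hypothetical stand-ins, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def cycles_to_failure(scale, m=3.0, C=1e12, batch=10_000):
    """Monte Carlo estimate of cycles to failure under random Rayleigh-distributed
    stress amplitudes, combining Miner's linear damage rule with a Basquin
    S-N curve N(S) = C * S**(-m). Material constants here are hypothetical."""
    damage, cycles = 0.0, 0
    while damage < 1.0:
        s = rng.rayleigh(scale, batch)            # random stress amplitudes (MPa)
        damage += np.sum(s ** m / C)              # per-cycle damage 1/N(S) = S**m / C
        cycles += batch
    return cycles

for scale in (40.0, 50.0, 60.0):                  # scale parameter of the load distribution
    print(f"scale {scale:5.1f} MPa  ->  ~{cycles_to_failure(scale):,} cycles to failure")
```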

  10. Quantifying randomness in real networks

    Science.gov (United States)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
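
    The simplest member of the dk-series (d = 1) fixes only the degree sequence. A small sketch of that comparison, using networkx's configuration model on a stand-in network, is shown below; collapsing multi-edges and self-loops makes the randomized degree sequence only approximately preserved, and the software released with the paper handles the higher-order dk constraints that this sketch does not.

```python
import networkx as nx

# Small built-in network standing in for the paper's much larger datasets.
G_real = nx.karate_club_graph()
deg_seq = [d for _, d in G_real.degree()]

def dk1_random(deg_seq, seed=0):
    """Degree-preserving ('1k') random graph: same degree sequence, otherwise random."""
    G = nx.configuration_model(deg_seq, seed=seed)
    G = nx.Graph(G)                               # collapse parallel edges
    G.remove_edges_from(nx.selfloop_edges(G))     # drop self-loops
    return G

G_rand = dk1_random(deg_seq)
print("average clustering, real network:", round(nx.average_clustering(G_real), 3))
print("average clustering, 1k-random:   ", round(nx.average_clustering(G_rand), 3))
```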

  11. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
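
    For a lotto k/N draw without replacement, each drawn number is uniform on {1, ..., N}, so E[X_i] = (N+1)/2, Var(X_i) = (N^2 - 1)/12 and Cov(X_i, X_j) = -(N+1)/12 for i ≠ j. The sketch below illustrates the audit idea with simulated draws standing in for a historical sample of a hypothetical 6/49 game; a real audit would compare official draw records against these theoretical moments.

```python
import numpy as np

rng = np.random.default_rng(2)
N, k, n_draws = 49, 6, 5000          # lotto 6/49; n_draws plays the historical sample

# Theoretical moments for numbers drawn without replacement from {1, ..., N}
mean_th = (N + 1) / 2
var_th = (N ** 2 - 1) / 12
cov_th = -(N + 1) / 12               # covariance between two distinct drawn numbers

draws = np.array([rng.choice(np.arange(1, N + 1), size=k, replace=False)
                  for _ in range(n_draws)])
cov_emp = np.cov(draws.T)            # k x k empirical covariance matrix
off_diag = cov_emp[~np.eye(k, dtype=bool)].mean()

print(f"mean: theory {mean_th:.2f}, sample {draws.mean():.2f}")
print(f"var : theory {var_th:.2f}, sample {np.diag(cov_emp).mean():.2f}")
print(f"cov : theory {cov_th:.2f}, sample {off_diag:.2f}")
```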

  12. HIV- and AIDS-related stigma: psychosocial aspects in a representative Spanish sample.

    Science.gov (United States)

    Fuster, Maria J; Molero, Fernando; de Montes, Lorena Gil; Agirrezabal, Arrate; Vitoria, Amaia

    2013-01-01

    This study evaluates the prevalence of HIV stigma in Spain and analyzes some variables that may affect its existence. In 2008, we conducted a computer-assisted telephone survey of 1607 people, representative of the Spanish population. Two-wave random stratified sampling was performed, first selecting the home and then the person, depending on the rates of age and sex. About 50% of the population feels discomfort about potential contact with people with HIV and tries to avoid it and 20% advocate discriminatory policies involving physical or social segregation of people with HIV. The belief that HIV is easily transmitted through social contact (15%) and blaming people with HIV for their disease (19.3%) are associated with stigmatization. Degree of proximity to people with HIV, political ideology, educational level, and age are also associated with the degree of stigmatization. According to these results, we suggest that, in order to reduce stigma, we need to modify the erroneous beliefs about the transmission pathways, decrease attributions of blame to people with HIV, and increase contact with them. These interventions should particularly target older people, people with a low educational level, and people with a more conservative political ideology.

  13. 14 CFR 1260.58 - Designation of new technology representative and patent representative.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Designation of new technology... of new technology representative and patent representative. Designation of New Technology... of this grant entitled “New Technology,” the following named representatives are hereby designated by...

  14. 48 CFR 1852.227-72 - Designation of new technology representative and patent representative.

    Science.gov (United States)

    2010-10-01

    ... CONTRACT CLAUSES Texts of Provisions and Clauses 1852.227-72 Designation of new technology representative... of New Technology Representative and Patent Representative (JUL 1997) (a) For purposes of administration of the clause of this contract entitled “New Technology” or “Patent Rights—Retention by the...

  15. Representing object colour in language comprehension.

    Science.gov (United States)

    Connell, Louise

    2007-03-01

    Embodied theories of cognition hold that mentally representing something red engages the neural subsystems that respond to environmental perception of that colour. This paper examines whether implicit perceptual information on object colour is represented during sentence comprehension even though doing so does not necessarily facilitate task performance. After reading a sentence that implied a particular colour for a given object, participants were presented with a picture of the object that either matched or mismatched the implied colour. When asked if the pictured object was mentioned in the preceding sentence, people's responses were faster when the colours mismatched than when they matched, suggesting that object colour is represented differently to other object properties such as shape and orientation. A distinction between stable and unstable embodied representations is proposed to allow embodied theories to account for these findings.

  16. Agrotourism-Representative Issues And Pro Arguments

    Directory of Open Access Journals (Sweden)

    Ramona Ciolac

    2016-05-01

    Full Text Available Many governments in the European Union recognize that agrotourism and rural tourism represent one way to sustain agriculture, and that in the coming years rural tourism and agrotourism will become representative elements of rural areas; the arguments for this representativeness are the purpose of this paper. Rural areas offer great opportunities for the development of agrotourism, and practicing it is necessary in the current period. In the majority of rural settlements the defining features are multiple: the quality of the landscape and the warmth of the inhabitants, works of art and folk craft, traditional occupations, costumes, customs, traditions, cuisine, resources, etc. To these is added the awareness, by small farmers, of the need to diversify agricultural activity, both on and outside the farm, by engaging in other activities of a non-agricultural character, of which agrotourism is one of the most widespread.

  17. Representative galaxy age-metallicity relationships

    Science.gov (United States)

    Piatti, Andrés E.; Aparicio, Antonio; Hidalgo, Sebastián L.

    2017-07-01

    The ongoing surveys of galaxies and those for the next generation of telescopes will demand the execution of high-CPU consuming machine codes for recovering detailed star formation histories (SFHs) and hence age-metallicity relationships (AMRs). We present here an expeditive method which provides quick-look AMRs on the basis of representative ages and metallicities obtained from colour-magnitude diagram (CMD) analyses. We have tested its performance by generating synthetic CMDs for a wide variety of galaxy SFHs. The representative AMRs turn out to be reliable down to a magnitude limit with a photometric completeness factor higher than ˜85 per cent, and trace the chemical evolution history for any stellar population (represented by a mean age and an intrinsic age spread) with a total mass within ˜40 per cent of the more massive stellar population in the galaxy.

  18. Comparison of variability in pork carcass composition and quality between barrows and gilts.

    Science.gov (United States)

    Overholt, M F; Arkfeld, E K; Mohrhauser, D A; King, D A; Wheeler, T L; Dilger, A C; Shackelford, S D; Boler, D D

    2016-10-01

    Pigs (n = 8,042) raised in 8 different barns representing 2 seasons (cold and hot) and 2 production focuses (lean growth and meat quality) were used to characterize variability of carcass composition and quality traits between barrows and gilts. Data were collected on 7,684 pigs at the abattoir. Carcass characteristics, subjective loin quality, and fresh ham face color (muscles) were measured on a targeted 100% of carcasses. Fresh belly characteristics, boneless loin weight, instrumental loin color, and ultimate loin pH measurements were collected from 50% of the carcasses each slaughter day. Adipose tissue iodine value (IV), 30-min loin pH, LM slice shear force, and fresh ham muscle characteristic measurements were recorded on 10% of carcasses each slaughter day. Data were analyzed using the MIXED procedure of SAS as a 1-way ANOVA in a randomized complete block design with 2 levels (barrows and gilts). Barn (block), marketing group, production focus, and season were random variables. A 2-variance model was fit using the REPEATED statement of the MIXED procedure, grouped by sex for analysis of least squares means. Homogeneity of variance was tested on raw data using Levene's test of the GLM procedure. Hot carcass weight of pigs (94.6 kg) in this study was similar to the U.S. industry average HCW (93.1 kg); therefore, these data are representative of typical U.S. pork carcasses. There was no difference (P ≥ 0.09) in variability of HCW or loin depth between barrow and gilt carcasses. Back fat depth and estimated carcass lean were more variable (P ≤ 0.0001) and IV was less variable (P = 0.05) in carcasses from barrows than in carcasses from gilts. Fresh belly weight and thickness were more variable (P ≤ 0.01) for bellies of barrows than bellies of gilts, but there was no difference in variability for belly length, width, or flop distance (P ≥ 0.06). Fresh loin subjective color was less variable in loins from barrows than from gilts, but there were no differences (P ≥ 0.08) in variability for any...

  19. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as those in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
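
    The paper's extractor for big sources is not reproduced in this record; as a much simpler point of comparison, the classical von Neumann extractor below removes a known, fixed bias from independent bits. It illustrates what randomness "extraction" means, but it relies on exactly the kind of statistical assumption about the source that the paper's method is designed to avoid.

```python
import random

def von_neumann_extract(bits):
    """Classic von Neumann extractor: maps pairs 01 -> 0, 10 -> 1 and discards
    00/11 pairs. Output bits are unbiased provided the input bits are
    independent with a fixed (possibly unknown) bias."""
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

# Hypothetical biased source: independent bits, 70% ones
random.seed(3)
raw = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]
ext = von_neumann_extract(raw)
print(len(ext), sum(ext) / len(ext))   # roughly 0.5 despite the biased input
```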

  20. Risk Gambling and Personality: Results from a Representative Swedish Sample.

    Science.gov (United States)

    Sundqvist, Kristina; Wennberg, Peter

    2015-12-01

    The association between personality and gambling has been explored previously. However, few studies are based on representative populations. This study aimed at examining the association between risk gambling and personality in a representative Swedish population. A random Swedish sample (N = 19,530) was screened for risk gambling using the Lie/Bet questionnaire. The study sample (N = 257) consisted of those screening positive on Lie/Bet and completing a postal questionnaire about gambling and personality (measured with the NODS-PERC and the HP5i respectively). Risk gambling was positively correlated with Negative Affectivity (a facet of Neuroticism) and Impulsivity (an inversely related facet of Conscientiousness), but all associations were weak. When taking age and gender into account, there were no differences in personality across game preference groups, though preferred game correlated with level of risk gambling. Risk gamblers scored lower than the population norm data with respect to Negative Affectivity, but risk gambling men scored higher on Impulsivity. The association between risk gambling and personality found in previous studies was corroborated in this study using a representative sample. We conclude that risk and problem gamblers should not be treated as a homogeneous group, and prevention and treatment interventions should be adapted according to differences in personality, preferred type of game and the risk potential of the games.

  1. Representing Context in Hypermedia Data Models

    DEFF Research Database (Denmark)

    Hansen, Frank Allan

    2005-01-01

    As computers and software systems move beyond the desktop and into the physical environments we live and work in, the systems are required to adapt to these environments and the activities taking place within them. Making applications context-aware and representing context information alongside application data can be a challenging task. This paper describes how digital context traditionally has been represented in hypermedia data models and how this representation can scale to also represent physical context. The HyCon framework and data model, designed for the development of mobile context...

  2. On the Inception of Financial Representative Bubbles

    Directory of Open Access Journals (Sweden)

    Massimiliano Ferrara

    2017-11-01

    Full Text Available In this work, we aim to formalize the inception of representative bubbles giving the condition under which they may arise. We will find that representative bubbles may start at any time, depending on the definition of a behavioral component. This result is at odds with the theory of classic rational bubbles, which are those models that rely on the fulfillment of the transversality condition by which a bubble in a financial asset can arise just at its first trade. This means that a classic rational bubble (differently from our model) cannot follow a cycle, since if a bubble exists, it will burst by definition and never arise again.

  3. Atmospheric variables as driving variables of agricultural and forest ecosystems

    Directory of Open Access Journals (Sweden)

    Luigi Mariani

    Full Text Available Atmospheric variables, which represent meteorology when seen in their instantaneous behavior or climatology when seen in their long-term behavior, can be considered among the main driving variables of agricultural and forest ecosystems. In other words, meteo-climatic variables determine the productivity, quality and territorial specificity of agroforestry production. On the basis of this premise, some significant examples are presented in order to describe how different modeling approaches (empirical and mechanistic) can improve our description of phenomena and the rationality of our approach to agro-ecosystem management. The need for a strict linkage between agrometeorology and the other physical and biological sciences concerned with agro-forestry ecosystems is also discussed.

  4. Attributes Heeded When Representing an Osmosis Problem.

    Science.gov (United States)

    Zuckerman, June Trop

    Eighteen high school science students were involved in a study to determine what attributes in the problem statement they need when representing a typical osmosis problem. In order to realize this goal students were asked to solve problems aloud and to explain their answers. Included as a part of the results are the attributes that the students…

  5. Adjustment Following Disability: Representative Case Studies.

    Science.gov (United States)

    Heinemann, Allen W.; Shontz, Franklin C.

    1984-01-01

    Examined adjustment following physical disability using the representative case method with two persons with quadriplegia. Results highlighted the importance of previously established coping styles as well as the role of the environment in adjustment. Willingness to mourn aided in later growth. (JAC)

  6. A Framework for Representing Moving Objects

    DEFF Research Database (Denmark)

    Becker, Ludger; Blunck, Henrik; Hinrichs, Klaus

    2004-01-01

    We present a framework for representing the trajectories of moving objects and the time-varying results of operations on moving objects. This framework supports the realization of discrete data models of moving objects databases, which incorporate representations of moving objects based on non...

  7. A survey of pharmaceutical company representative interactions ...

    African Journals Online (AJOL)

    Objectives: To examine the frequency of pharmaceutical company representative (PCR) interactions with doctors in Libya and review possible associations between these interactions and the personal and practice setting characteristics of doctors. Method: An anonymous survey questionnaire was circulated to 1,000 Libyan ...

  8. 42 CFR 405.910 - Appointed representatives.

    Science.gov (United States)

    2010-10-01

    ... number; (6) Include the appointed representative's professional status or relationship to the party; (7... suppliers. A provider or supplier that furnished the items or services to a beneficiary that are the subject... supplier may not charge the beneficiary any fee associated with the representation. If a provider or...

  9. Algebraic polynomials with random coefficients

    Directory of Open Access Journals (Sweden)

    K. Farahmand

    2002-01-01

    Full Text Available This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form a_0(ω) + a_1(ω) C(n,1)^{1/2} x + a_2(ω) C(n,2)^{1/2} x^2 + … + a_n(ω) C(n,n)^{1/2} x^n when n is large, where C(n,j) denotes the binomial coefficient. The coefficients {a_j(ω)}_{j=0}^{n}, ω ∈ Ω, are assumed to be a sequence of independent normally distributed random variables with mean zero and variance one, each defined on a fixed probability space (A, Ω, Pr). A special case of dependent coefficients is also studied.
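
    The asymptotic value itself is not quoted in this record, but the quantity in question is straightforward to estimate numerically: sample the Gaussian coefficients, form the polynomial with the binomial-coefficient weights, and count the real roots of its second derivative. The sketch below does this by Monte Carlo for a hypothetical degree n = 30.

```python
import numpy as np
from math import comb
from numpy.polynomial import Polynomial

rng = np.random.default_rng(4)

def inflection_count(n):
    """Number of real roots of P''(x) for one realization of the random polynomial
    P(x) = sum_j a_j * C(n, j)**0.5 * x**j with independent a_j ~ N(0, 1)."""
    coeffs = rng.standard_normal(n + 1) * np.sqrt([comb(n, j) for j in range(n + 1)])
    roots = Polynomial(coeffs).deriv(2).roots()
    return int(np.sum(np.abs(roots.imag) < 1e-8))   # keep (numerically) real roots

n, trials = 30, 2000
estimate = np.mean([inflection_count(n) for _ in range(trials)])
print(f"Monte Carlo estimate of E[#inflection points] for n = {n}: {estimate:.2f}")
```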

  10. A random matrix theory of decoherence

    Energy Technology Data Exchange (ETDEWEB)

    Gorin, T [Departamento de FIsica, Universidad de Guadalajara, Blvd Marcelino GarcIa Barragan y Calzada OlImpica, Guadalajara CP 44840, JalIsco (Mexico); Pineda, C [Institut fuer Physik und Astronomie, University of Potsdam, 14476 Potsdam (Germany); Kohler, H [Fachbereich Physik, Universitaet Duisburg-Essen, D-47057 Duisburg (Germany); Seligman, T H [Instituto de Ciencias FIsicas, Universidad Nacional Autonoma de Mexico (Mexico)], E-mail: thomas.gorin@red.cucei.udg.mx, E-mail: carlospgmat03@gmail.com

    2008-11-15

    Random matrix theory is used to represent generic loss of coherence of a fixed central system coupled to a quantum-chaotic environment, represented by a random matrix ensemble, via random interactions. We study the average density matrix arising from the ensemble induced, in contrast to previous studies where the average values of purity, concurrence and entropy were considered; we further discuss when one or the other approach is relevant. The two approaches agree in the limit of large environments. Analytic results for the average density matrix and its purity are presented in linear response approximation. The two-qubit system is analysed, mainly numerically, in more detail.

  11. Random fixed points and random differential inclusions

    Directory of Open Access Journals (Sweden)

    Nikolaos S. Papageorgiou

    1988-01-01

    Full Text Available In this paper, first, we study random best approximations to random sets, using fixed point techniques, obtaining this way stochastic analogues of earlier deterministic results by Browder-Petryshyn, KyFan and Reich. Then we prove two fixed point theorems for random multifunctions with stochastic domain that satisfy certain tangential conditions. Finally we consider a random differential inclusion with upper semicontinuous orientor field and establish the existence of random solutions.

  12. The Two Sides of the Representative Coin

    OpenAIRE

    Keith Sutherland

    2011-01-01

    In Federalist 10 James Madison drew a functional distinction between “parties” (advocates for factional interests) and “judgment” (decision-making for the public good) and warned of the corrupting effect of combining both functions in a “single body of men.” This paper argues that one way of overcoming “Madisonian corruption” would be by restricting political parties to an advocacy role, reserving the judgment function to an allotted (randomly-selected) microcosm of the whole citizenry, who w...

  13. Variable-bias coin tossing

    Science.gov (United States)

    Colbeck, Roger; Kent, Adrian

    2006-03-01

    Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT.

  14. Representing Boolean Functions by Decision Trees

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    A Boolean or discrete function can be represented by a decision tree. A compact form of decision tree named binary decision diagram or branching program is widely known in logic design [2, 40]. This representation is equivalent to other forms, and in some cases it is more compact than values table or even the formula [44]. Representing a function in the form of decision tree allows applying graph algorithms for various transformations [10]. Decision trees and branching programs are used for effective hardware [15] and software [5] implementation of functions. For the implementation to be effective, the function representation should have minimal time and space complexity. The average depth of decision tree characterizes the expected computing time, and the number of nodes in branching program characterizes the number of functional elements required for implementation. Often these two criteria are incompatible, i.e. there is no solution that is optimal on both time and space complexity. © Springer-Verlag Berlin Heidelberg 2011.
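
    As a toy illustration of representing a Boolean function by a decision tree (not the optimization algorithms discussed in the book), the sketch below builds a tree for the 3-variable majority function from its truth table, splitting on variables in a fixed order and collapsing constant subfunctions into leaves.

```python
from itertools import product

def build_tree(rows, var=0):
    """Build a decision tree for a Boolean function given as a truth table
    (a dict mapping each input tuple to 0/1). Splits on variables in fixed
    order and stops early whenever the remaining subfunction is constant."""
    values = set(rows.values())
    if len(values) == 1:                     # constant subfunction -> leaf
        return values.pop()
    lo = {k: v for k, v in rows.items() if k[var] == 0}
    hi = {k: v for k, v in rows.items() if k[var] == 1}
    return (var, build_tree(lo, var + 1), build_tree(hi, var + 1))

def evaluate(tree, x):
    """Follow the tree from the root to a leaf for the input tuple x."""
    while isinstance(tree, tuple):
        var, lo, hi = tree
        tree = hi if x[var] else lo
    return tree

majority = {bits: int(sum(bits) >= 2) for bits in product((0, 1), repeat=3)}
tree = build_tree(majority)
print(tree)                                              # nested (variable, low, high) tuples
print([evaluate(tree, x) for x in product((0, 1), repeat=3)])
```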

  15. Meeting staff representatives of the European Agencies

    CERN Multimedia

    Staff Association

    2014-01-01

      The AASC (Assembly of Agency Staff Committee) held its 27th Meeting of the specialized European Agencies on 26 and 27 May on the premises of the OHIM (Office for Harmonization in the Internal Market) in Alicante, Spain. Two representatives of the CERN Staff Association, in charge of External Relations, attended as observers. This participation is a useful complement to regular contacts we have with FICSA (Federation of International Civil Servants' Associations), which groups staff associations of the UN Agencies, and the annual CSAIO conferences (Conference of Staff Associations of International Organizations), where each Autumn representatives of international organizations based in Europe meet to discuss themes of common interest to better promote and defend the rights of the international civil servants. All these meetings allow us to remain informed on items that are directly or indirectly related to employment and social conditions of our colleagues in other international and Europ...

  16. Climate Change and the Representative Agent

    Energy Technology Data Exchange (ETDEWEB)

    Howarth, R.B. [Environmental Studies Program, Dartmouth College, Hanover, New Hampshire 03755 (United States)

    2000-02-01

    The artifice of an infinitely-lived representative agent is commonly invoked to balance the present costs and future benefits of climate stabilization policies. Since actual economies are populated by overlapping generations of finite-lived persons, this approach begs important questions of welfare aggregation. This paper compares the results of representative agent and overlapping generations models that are numerically calibrated based on standard assumptions regarding climate economy interactions. Under two social choice rules - Pareto efficiency and classical utilitarianism - the models generate closely similar simulation results. In the absence of policies to redistribute income between present and future generations, efficient rates of carbon dioxide emissions abatement rise from 15 to 20% between the years 2000 and 2105. Under classical utilitarianism, in contrast, optimal control rates rise from 48 to 79% this same period. 23 refs.

  17. Data structures and apparatuses for representing knowledge

    Science.gov (United States)

    Hohimer, Ryan E; Thomson, Judi R; Harvey, William J; Paulson, Patrick R; Whiting, Mark A; Tratz, Stephen C; Chappell, Alan R; Butner, Robert S

    2014-02-18

    Data structures and apparatuses to represent knowledge are disclosed. The processes can comprise labeling elements in a knowledge signature according to concepts in an ontology and populating the elements with confidence values. The data structures can comprise knowledge signatures stored on computer-readable media. The knowledge signatures comprise a matrix structure having elements labeled according to concepts in an ontology, wherein the value of the element represents a confidence that the concept is present in an information space. The apparatus can comprise a knowledge representation unit having at least one ontology stored on a computer-readable medium, at least one data-receiving device, and a processor configured to generate knowledge signatures by comparing datasets obtained by the data-receiving devices to the ontologies.

  18. Dynamic Output Feedback Control for Nonlinear Networked Control Systems with Random Packet Dropout and Random Delay

    Directory of Open Access Journals (Sweden)

    Shuiqing Yu

    2013-01-01

    Full Text Available This paper investigates the dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based upon the Lyapunov theory. The quantitative relationship of the dropout rate, transition probability matrix, and nonlinear level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.
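
    The paper's observer-based controller and LMI conditions are not reproduced here; the sketch below only illustrates the modeling idea of a Bernoulli packet-dropout random variable in a feedback loop, using a scalar plant with a static gain and zero input whenever the control packet is lost. The plant parameters and gain are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(dropout_prob, steps=200, a=1.3, b=1.0, K=1.0, x0=1.0):
    """Scalar open-loop-unstable plant x[k+1] = a*x[k] + b*u[k] controlled over
    a lossy network. The packet dropout is a Bernoulli random variable: with
    probability dropout_prob the control packet is lost and zero input is applied.
    (The static gain K is a simplified stand-in for the paper's observer-based
    dynamic output feedback designed via linear matrix inequalities.)"""
    x = x0
    for _ in range(steps):
        u = -K * x if rng.random() > dropout_prob else 0.0   # lost packet -> no actuation
        x = a * x + b * u
    return abs(x)

for p in (0.0, 0.5, 0.8, 0.95):
    print(f"dropout probability {p:.2f}:  |x[200]| ~ {simulate(p):.3e}")
```

    For low dropout rates the state decays; for sufficiently frequent dropouts the unstable open-loop dynamics dominate and the state grows, which is the trade-off the dropout-rate/stability conditions in such papers quantify.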

  19. A representative density profile of the North Greenland snowpack

    Directory of Open Access Journals (Sweden)

    C. F. Schaller

    2016-09-01

    Full Text Available Along a traverse through North Greenland in May 2015 we collected snow cores up to 2 m depth and analyzed their density and water isotopic composition. A new sampling technique and an adapted algorithm for comparing data sets from different sites and aligning stratigraphic features are presented. We find good agreement of the density layering in the snowpack over hundreds of kilometers, which allows the construction of a representative density profile. The results are supported by an empirical statistical density model, which is used to generate sets of random profiles and validate the applied methods. Furthermore we are able to calculate annual accumulation rates, align melt layers and observe isotopic temperatures in the area back to 2010. Distinct relations of δ18O with both accumulation rate and density are deduced. Inter alia the depths of the 2012 melt layers and high-resolution densities are provided for applications in remote sensing.

  20. Machine learning techniques to select variable stars

    Directory of Open Access Journals (Sweden)

    García-Varela Alejandro

    2017-01-01

    Full Text Available In order to perform a supervised classification of variable stars, we propose and evaluate a set of six features extracted from the magnitude density of the light curves. They are used to train automatic classification systems using state-of-the-art classifiers implemented in the R statistical computing environment. We find that random forests is the most successful method to select variables.
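
    A minimal sketch of such a pipeline, with synthetic light curves and three stand-in summary features in place of the paper's six magnitude-density features, could look as follows (scikit-learn assumed available):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

def features(mag):
    """Simple summary features of a light curve's magnitude distribution
    (stand-ins for the paper's six magnitude-density features)."""
    return [mag.std(), mag.max() - mag.min(), np.abs(np.diff(mag)).mean()]

# Synthetic light curves: constant stars (noise only) vs. sinusoidal variables.
t = np.linspace(0, 10, 200)
X, y = [], []
for _ in range(300):
    noise = rng.normal(0, 0.02, t.size)
    if rng.random() < 0.5:
        mag = 15.0 + noise                                    # non-variable star
        y.append(0)
    else:
        mag = 15.0 + 0.3 * np.sin(2 * np.pi * t / rng.uniform(0.5, 3)) + noise
        y.append(1)                                           # variable star
    X.append(features(mag))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())
```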

  1. VARIABLE SELECTION FOR CENSORED QUANTILE REGRESION

    OpenAIRE

    Wang, Huixia Judy; Zhou, Jianhui; Li, Yi

    2013-01-01

    Quantile regression has emerged as a powerful tool in survival analysis as it directly links the quantiles of patients’ survival times to their demographic and genomic profiles, facilitating the identification of important prognostic factors. In view of limited work on variable selection in the context, we develop a new adaptive-lasso-based variable selection procedure for quantile regression with censored outcomes. To account for random censoring for data with multivariate covariates, we emp...

  2. Input variable selection and calibration data selection for storm water quality regression models.

    Science.gov (United States)

    Sun, Siao; Bertrand-Krajewski, Jean-Luc

    2013-01-01

    Storm water quality models are useful tools in storm water management. Interest has been growing in analyzing existing data for developing models for urban storm water quality evaluations. It is important to select appropriate model inputs when many candidate explanatory variables are available. Model calibration and verification are essential steps in any storm water quality modeling. This study investigates input variable selection and calibration data selection in storm water quality regression models. The two selection problems interact with each other, so a procedure is developed to address them in sequence. The procedure first selects model input variables using a cross-validation method. An appropriate number of variables is identified as model inputs to ensure that a model is neither overfitted nor underfitted. Based on the input selection results, calibration data selection is then studied. Uncertainty of model performance due to calibration data selection is investigated with a random selection method. A cluster-based approach is applied to enhance model calibration practice, based on the principle of selecting representative data for calibration. The comparison between results from the cluster selection method and random selection shows that the former can significantly improve the performance of calibrated models. It is found that the information content of the calibration data is important in addition to its size.
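
    The cluster-based calibration selection can be sketched in a few lines: cluster the candidate events in the space of explanatory variables and keep the event nearest each cluster centre for calibration. The event descriptors and the cluster count below are hypothetical; the paper's exact clustering choices are not given in this record.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Hypothetical storm events described by explanatory variables
# (e.g. rainfall depth, intensity, antecedent dry period), scaled to [0, 1].
events = rng.random((60, 3))

def representative_events(X, n_cal=10, seed=0):
    """Cluster candidate events and pick the event closest to each cluster
    centre as the calibration set; the remaining events are kept for verification."""
    km = KMeans(n_clusters=n_cal, n_init=10, random_state=seed).fit(X)
    cal_idx = [int(np.argmin(np.linalg.norm(X - c, axis=1))) for c in km.cluster_centers_]
    ver_idx = [i for i in range(len(X)) if i not in cal_idx]
    return cal_idx, ver_idx

cal, ver = representative_events(events)
print("calibration events:", sorted(cal))
print("verification events:", len(ver))
```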

  3. Representing melancholy: figurative art and feminism

    OpenAIRE

    Reading, Christina

    2015-01-01

    Re-presentations of women's melancholic subjectivity by women figurative artists from different historical moments, canonical images of melancholy and theoretical accounts of melancholy are brought together to address the question: 'What aspects of women's experience of melancholy have women figurative artists chosen to represent historically and contemporaneously, and further what is the importance of these artworks for understanding the nature of women's melancholic subjectivity today? ...

  4. Using semantics for representing experimental protocols.

    Science.gov (United States)

    Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar

    2017-11-13

    An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is the need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the "SMART Protocols ontology", an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?". Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experiences, bringing together the views of researchers managing protocols in their laboratory work. Website: https://smartprotocols.github.io/ .

  5. Simulating Ability: Representing Skills in Games

    OpenAIRE

    Hetland, Magnus Lie

    2013-01-01

    Throughout the history of games, representing the abilities of the various agents acting on behalf of the players has been a central concern. With increasingly sophisticated games emerging, these simulations have become more realistic, but the underlying mechanisms are still, to a large extent, of an ad hoc nature. This paper proposes using a logistic model from psychometrics as a unified mechanism for task resolution in simulation-oriented games.

  6. Diversity and representativeness: two key factors

    CERN Multimedia

    Staff Association

    2013-01-01

    In the past few weeks many of you have filled out the questionnaire for preparing the upcoming Five-yearly review. Similarly, Staff Association members have elected their delegates to the Staff Council for the coming two years. Once again we would like to thank all those who have taken the time and effort to make their voice heard on these two occasions. Elections to the Staff Council Below we publish the new Staff Council with its forty delegates who will represent in 2014 and 2015 all CERN staff in the discussions with Management and Member States in the various bodies and committees. Therefore it is important that the Staff Council represents as far as possible the diversity of the CERN population. By construction, the election process with its electoral colleges and two-step voting procedure guarantees that all Departments, even the small ones, and the various staff categories are correctly represented. Figure 1 shows the participation rate in the elections. The average rate is just above 52 %, with ...

  7. A Review on asymptotic normality of sums of associated random ...

    African Journals Online (AJOL)

    Association between random variables is a generalization of independence of these random variables. This concept is more and more commonly used in current research fields in Statistics. In this paper, we proceed to a simple, clear and rigorous introduction to it. We will present the fundamental asymptotic ...

  8. Random walk of motor planning in task-irrelevant dimensions

    NARCIS (Netherlands)

    van Beers, R.J.; Brenner, E.; Smeets, J.B.J.

    2013-01-01

    The movements that we make are variable. It is well established that at least a part of this variability is caused by noise in central motor planning. Here, we studied how the random effects of planning noise translate into changes in motor planning. Are the random effects independently added to a

  9. Spatial representativeness of ground-based solar radiation measurements

    Science.gov (United States)

    Zyta Hakuba, Maria; Folini, Doris; Wild, Martin

    2013-04-01

    The validation of gridded surface solar radiation (SSR) data, i.e., satellite-derived or climate model calculated, relies on the comparison with ground-based in-situ measurements. Detached from any modeling or temporal averaging biases, the question remains how representative a point measurement is for a larger-scale grid cell. In the present study, we make extensive use of high-resolution (0.03°) SSR data from the Satellite Application Facility on climate monitoring (CM SAF) to study in detail: 1) the spatial variability in SSR over Europe, 2) the sub-grid variability within an example grid of 1° resolution, 3) the representativeness of 143 surface sites (BSRN and GEBA) for their corresponding 1° grid cells, and 4) the point-centered and grid-independent surface sites' representativeness for larger-grid cells up to 3°. These analyses are done on a climatological annual mean basis over the period 2001-2005. Annually, the spatial variability as given in the CM SAF data set is largest in regions of sudden changes in weather conditions and topography, e.g., in Northern Spain, the Alpine region, the Carpathians, and Adriatic coast. The 1° sub-grid variability (mean absolute deviation from grid cell mean, relative to grid cell mean, RMAD) is on average 1.64 % (2.43 Wm-2) over European land, with maximum RMAD of up to 10% in Northern Spain. The surface sites' (GEBA and BSRN) representativeness for larger-grid cells is highly dependent on region and grid size. The difference between the CM SAF value at the GEBA site's location and the grid cell mean (calculated from CM SAF data) can vary from almost 0% to more than 10% for a 1° grid cell, and up to 15% for a 3° grid cell. On average, this spatial sampling error is below 5% even for grid cells of 3° resolution. We show that the latitudinal shift of a point relative to the larger-grid cell center may account for a spatial sampling error of up to +-1.81 Wm-2 (for a maximum distance of +-0.5° within 1° grid cell
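
    The sub-grid variability measure used above (relative mean absolute deviation from the grid-cell mean) and the point-versus-cell sampling error are both one-liners once a high-resolution field is available. Below is a sketch with a synthetic 1° cell sampled at roughly 0.03° resolution; the numbers are purely illustrative, not CM SAF data.

```python
import numpy as np

def rmad(cell):
    """Relative mean absolute deviation (%) of sub-grid values within one
    coarse grid cell: mean |value - cell mean| / cell mean * 100."""
    m = cell.mean()
    return 100.0 * np.abs(cell - m).mean() / m

# Hypothetical 1-degree cell sampled at ~0.03 degrees (roughly 33 x 33 pixels)
rng = np.random.default_rng(8)
ssr_highres = 180.0 + 8.0 * rng.standard_normal((33, 33))   # surface solar radiation, W m-2

print(f"sub-grid RMAD: {rmad(ssr_highres):.2f} %")

# Spatial sampling error of a single station located at pixel (i, j):
i, j = 5, 20
print(f"point minus cell mean: {ssr_highres[i, j] - ssr_highres.mean():.2f} W m-2")
```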

  10. Soil variability in engineering applications

    Science.gov (United States)

    Vessia, Giovanna

    2014-05-01

    Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity of physical and mechanical properties, which can be measured by field and laboratory testing. Heterogeneity concerns different values of litho-technical parameters pertaining to similar lithological units placed close to each other. Variability, on the contrary, is inherent to the formation and evolution processes experienced by each geological unit (homogeneous geomaterials on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle, the cohesion, among others. These spatial variations must be managed by engineering models to accomplish reliable design of structures and infrastructures. Matheron (1962) introduced Geostatistics as the most comprehensive tool to manage spatial correlation of parameter measures used in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) developed the first pioneering attempts to describe and manage the inherent variability in geomaterials, although Terzaghi (1943) had already highlighted that spatial fluctuations of physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to Fractal Theory. In the same years, Vanmarcke (1983) proposed the Random Field Theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measurement fluctuations of physical parameters are interpreted through the spatial variability structure, consisting of the correlation function and the scale of fluctuation. Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random

  11. Proposal for a short version of the Beck Hopelessness Scale based on a national representative survey in Hungary.

    Science.gov (United States)

    Perczel Forintos, Dóra; Rózsa, Sándor; Pilling, János; Kopp, Mária

    2013-12-01

    In our study we assessed the frequency of reported hopelessness and suicide attempts in the national representative survey Hungarostudy 2002. The randomly selected sample consisted of 14,000 individuals over the age of 18. We created a short version of the widely used Beck Hopelessness Scale for screening purposes in suicide prevention. The short version of the BHS consists of four items and has high internal consistency (Cronbach's alpha = 0.85). Moreover, we conducted an investigation into psychological, somatic, sociological and socio-economic as well as cultural variables that show a positive or negative correlation with hopelessness and important predictors of suicide. The following psychological variables showing a positive correlation with hopelessness were identified: dysfunctional attitudes, exhaustion, psychological distress, hostility, lack of life goals and inability to cope emotionally. Sense of coherence, social support, perceived self-efficiency, subjective well-being and problem-solving coping showed a negative correlation with hopelessness. Concerning the relationship between hopelessness and suicide attempts, we found that participants who attempted suicide in the last year scored higher (mean = 4.86) than participants who attempted suicide more than 3 years ago (mean = 3.57). These results indicate that applying the short version of the BHS could be very useful in general practice and in psychiatric care.
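
    For reference, the internal-consistency statistic quoted above can be computed directly from an item-score matrix. The sketch below applies the standard formula to simulated responses to a hypothetical 4-item dichotomous screen; it reproduces the formula only, not the Hungarostudy data or its reported alpha of 0.85.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses to a 4-item screen (0 = no, 1 = yes), driven by a latent trait
rng = np.random.default_rng(9)
latent = rng.random(500)                                   # underlying hopelessness level
scores = (rng.random((500, 4)) < latent[:, None]).astype(int)
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```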

  12. Variable importance in latent variable regression models

    NARCIS (Netherlands)

    Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.

    2014-01-01

    The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable

  13. Classifying variability modeling techniques

    NARCIS (Netherlands)

    Sinnema, Marco; Deelstra, Sybren

    Variability modeling is important for managing variability in software product families, especially during product derivation. In the past few years, several variability modeling techniques have been developed, each using its own concepts to model the variability provided by a product family. The

  14. Representing Simple Geometry Types in NetCDF-CF

    Science.gov (United States)

    Blodgett, D. L.; Koziol, B. W.; Whiteaker, T. L.; Simons, R.

    2016-12-01

    The Climate and Forecast (CF) metadata convention is well-suited for representing gridded and point-based observational datasets. However, CF currently has no accepted mechanism for representing simple geometry types such as lines and polygons. Lack of support for simple geometries within CF has unintentionally excluded a broad set of geoscientific data types from NetCDF-CF data encodings. For example, hydrologic datasets often contain polygon watershed catchments and polyline stream reaches in addition to point sampling stations and water management infrastructure. The latter has an associated CF specification. In the interest of supporting all simple geometry types within CF, a working group was formed following an EarthCube workshop on Advancing NetCDF-CF [1] to draft a CF specification for simple geometries: points, lines, polygons, and their associated multi-geometry representations [2]. The draft also includes parametric geometry types such as circles and ellipses. This presentation will provide an overview of the scope and content of the proposed specification focusing on mechanisms for representing coordinate arrays using variable length or continuous ragged arrays, capturing multi-geometries, and accounting for type-specific geometry artifacts such as polygon holes/interiors, node ordering, etc. The concepts contained in the specification proposal will be described with a use case representing streamflow in rivers and evapotranspiration from HUC12 watersheds. We will also introduce Python and R reference implementations developed alongside the technical specification. These in-development, open source Python and R libraries convert between commonly used GIS software objects (i.e. GEOS-based primitives) and their associated simple geometry CF representation. [1] http://www.unidata.ucar.edu/events/2016CFWorkshop/[2] https://github.com/bekozi/netCDF-CF-simple-geometry
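
    A minimal sketch of the ragged-array idea mentioned above, written with the netCDF4-python library: polygon nodes are concatenated into one coordinate array and a node_count variable records how many nodes belong to each geometry. This loosely follows the draft encoding and omits the CF attribute conventions (geometry container variable, coordinate reference system, etc.) that the full specification defines.

```python
import numpy as np
from netCDF4 import Dataset

# Two hypothetical watershed polygons stored as contiguous ragged arrays.
polys = [[(0, 0), (2, 0), (2, 1), (0, 1)],
         [(3, 3), (5, 3), (4, 5)]]
x = np.array([p[0] for poly in polys for p in poly], dtype="f8")
y = np.array([p[1] for poly in polys for p in poly], dtype="f8")
node_count = np.array([len(poly) for poly in polys], dtype="i4")

with Dataset("geometries.nc", "w") as nc:
    nc.createDimension("node", x.size)          # all nodes of all geometries
    nc.createDimension("geometry", node_count.size)
    vx = nc.createVariable("x", "f8", ("node",))
    vy = nc.createVariable("y", "f8", ("node",))
    vc = nc.createVariable("node_count", "i4", ("geometry",))
    vx[:], vy[:], vc[:] = x, y, node_count
    vc.long_name = "number of nodes per polygon"
```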

  15. Asymptotics of sums of lognormal random variables with Gaussian copula

    DEFF Research Database (Denmark)

    Asmussen, Søren; Rojas-Nandayapa, Leonardo

    2008-01-01

    Let (Y1, ..., Yn) have a joint n-dimensional Gaussian distribution with a general mean vector and a general covariance matrix, and let Xi = eYi, Sn = X1 + ⋯ + Xn. The asymptotics of P (Sn > x) as x → ∞ are shown to be the same as for the independent case with the same lognormal marginals. In part...
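
    For reference, the benchmark in the independent case with identical lognormal marginals (Xi = e^{Yi}, Yi ~ N(mu, sigma^2)) is the classical subexponential single-big-jump asymptotic
    \[
    P(S_n > x) \;\sim\; n\,P(X_1 > x) \;=\; n\,\overline{\Phi}\!\left(\frac{\ln x - \mu}{\sigma}\right), \qquad x \to \infty,
    \]
    where \(\overline{\Phi}\) denotes the standard normal tail. The record's result states that imposing a Gaussian copula between the Yi leaves the first-order tail behaviour the same as in this independent case.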

  16. Stable limits for sums of dependent infinite variance random variables

    DEFF Research Database (Denmark)

    Bartkiewicz, Katarzyna; Jakubowski, Adam; Mikosch, Thomas

    2011-01-01

    The aim of this paper is to provide conditions which ensure that the affinely transformed partial sums of a strictly stationary process converge in distribution to an infinite variance stable distribution. Conditions for this convergence to hold are known in the literature. However, most of these...

  17. Some limit theorems for negatively associated random variables

    Indian Academy of Sciences (India)

    Author Affiliations. Yu Miao1 Wenfei Xu1 Shanshan Chen1 Andre Adler2. College of Mathematics and Information Science, Henan Normal University, Henan Province, 453007, China; Department of Applied Mathematics IIT, Chicago 60657, USA ...

  18. Sparse estimation for structural variability

    Directory of Open Access Journals (Sweden)

    Singh Rohit

    2011-04-01

    Full Text Available Abstract Background Proteins are dynamic molecules that exhibit a wide range of motions; often these conformational changes are important for protein function. Determining biologically relevant conformational changes, or true variability, efficiently is challenging due to the noise present in structure data. Results In this paper we present a novel approach to elucidate conformational variability in structures solved using X-ray crystallography. We first infer an ensemble to represent the experimental data and then formulate the identification of truly variable members of the ensemble (as opposed to those that vary only due to noise) as a sparse estimation problem. Our results indicate that the algorithm is able to accurately distinguish genuine conformational changes from variability due to noise. We validate our predictions for structures in the Protein Data Bank by comparing with NMR experiments, as well as on synthetic data. In addition to improved performance over existing methods, the algorithm is robust to the levels of noise present in real data. In the case of Human Ubiquitin-conjugating enzyme Ubc9, variability identified by the algorithm corresponds to functionally important residues implicated by mutagenesis experiments. Our algorithm is also general enough to be integrated into state-of-the-art software tools for structure-inference.

  19. Entropy as a collective variable

    Science.gov (United States)

    Parrinello, Michele

    Sampling complex free energy surfaces that exhibit long lived metastable states separated by kinetic bottlenecks is one of the most pressing issues in the atomistic simulations of matter. Not surprisingly many solutions to this problem have been suggested. Many of them are based on the identification of appropriate collective variables that span the manifold of the slow varying modes of the system. While much effort has been put in devising and even constructing on the fly appropriate collective variables, there is still a cogent need for introducing simple, generic, physically transparent, and yet effective collective variables. Motivated by the physical observation that in many cases transitions between one metastable state and another result from a trade off between enthalpy and entropy, we introduce appropriate collective variables that are able to represent in a simple way these two physical properties. We use these variables in the context of the recently introduced variationally enhanced sampling and apply them with success to the simulation of crystallization from the liquid and to conformational transitions in proteins. (Department of Chemistry and Applied Biosciences, ETH Zurich, and Facoltà di Informatica, Istituto di Scienze Computazionali, Università della Svizzera Italiana, Via G. Buffi 13, 6900 Lugano, Switzerland.)

  20. Random matrix theory within superstatistics

    OpenAIRE

    Abul-Magd, A. Y.

    2005-01-01

    We propose a generalization of the random matrix theory following the basic prescription of the recently suggested concept of superstatistics. Spectral characteristics of systems with mixed regular-chaotic dynamics are expressed as weighted averages of the corresponding quantities in the standard theory assuming that the mean level spacing itself is a stochastic variable. We illustrate the method by calculating the level density, the nearest-neighbor-spacing distributions and the two-level co...

  1. Analyzing Walksat on random formulas

    OpenAIRE

    Coja-Oghlan, Amin; Frieze, Alan

    2011-01-01

    Let F be a uniformly distributed random k-SAT formula with n variables and m clauses. We prove that the Walksat algorithm from Papadimitriou (FOCS 1991)/Schöning (FOCS 1999) finds a satisfying assignment of F in polynomial time w.h.p. if m/n < ρ·2^k/k for a certain constant ρ > 0. This is an improvement by a factor of Θ(k) over the best previous analysis of Walksat from Coja-Oghlan, Feige, Frieze, Krivelevich, Vilenchik (SODA 2009).
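
    The random-walk variant of Walksat analyzed in the Papadimitriou/Schöning line of work is simple enough to sketch: start from a uniformly random assignment and, while some clause is unsatisfied, pick an unsatisfied clause at random and flip a random variable in it. The snippet below is a minimal sketch on a tiny hypothetical instance, not the authors' analysis code.

```python
import random

def walksat(clauses, n_vars, max_flips=100000, seed=0):
    """Pure random-walk Walksat sketch: clauses are lists of nonzero ints,
    where literal v means variable |v| is True if v > 0 and False otherwise."""
    rng = random.Random(seed)
    # assignment[0] is unused so variables can be indexed 1..n_vars
    assignment = [rng.choice([False, True]) for _ in range(n_vars + 1)]

    def satisfied(clause):
        return any((lit > 0) == assignment[abs(lit)] for lit in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return assignment[1:]          # satisfying assignment found
        clause = rng.choice(unsat)         # random unsatisfied clause
        var = abs(rng.choice(clause))      # flip a random variable in it
        assignment[var] = not assignment[var]
    return None                            # give up after max_flips

# Tiny hypothetical 3-SAT instance: (x1 v ~x2 v x3) & (~x1 v x2 v ~x3)
print(walksat([[1, -2, 3], [-1, 2, -3]], n_vars=3))
```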

  2. Detecting Random, Partially Random, and Nonrandom Minnesota Multiphasic Personality Inventory--Adolescent Protocols

    Science.gov (United States)

    Pinsoneault, Terry B.

    2005-01-01

    The ability of the Minnesota Multiphasic Personality Inventory-Adolescent (MMPI-A; J. N. Butcher et al., 1992) validity scales to detect random, partially random, and nonrandom MMPI-A protocols was investigated. Investigations included the Variable Response Inconsistency scale (VRIN), F, several potentially useful new F and VRIN subscales, and…

  3. Lectures on random interfaces

    CERN Document Server

    Funaki, Tadahisa

    2016-01-01

    Interfaces are created to separate two distinct phases in a situation in which phase coexistence occurs. This book discusses randomly fluctuating interfaces in several different settings and from several points of view: discrete/continuum, microscopic/macroscopic, and static/dynamic theories. The following four topics in particular are dealt with in the book. Assuming that the interface is represented as a height function measured from a fixed-reference discretized hyperplane, the system is governed by the Hamiltonian of gradient of the height functions. This is a kind of effective interface model called ∇φ-interface model. The scaling limits are studied for Gaussian (or non-Gaussian) random fields with a pinning effect under a situation in which the rate functional of the corresponding large deviation principle has non-unique minimizers. Young diagrams determine decreasing interfaces, and their dynamics are introduced. The large-scale behavior of such dynamics is studied from the points of view of the hyd...

  4. Identifying Optimal Models to Represent Biochemical Systems

    Science.gov (United States)

    Apri, Mochamad; de Gee, Maarten; van Mourik, Simon; Molenaar, Jaap

    2014-01-01

    Biochemical systems involving a high number of components with intricate interactions often lead to complex models containing a large number of parameters. Although a large model could describe in detail the mechanisms that underlie the system, its very large size may hinder us in understanding the key elements of the system. Also in terms of parameter identification, large models are often problematic. Therefore, a reduced model may be preferred to represent the system. Yet, in order to efficaciously replace the large model, the reduced model should have the same ability as the large model to produce reliable predictions for a broad set of testable experimental conditions. We present a novel method to extract an “optimal” reduced model from a large model to represent biochemical systems by combining a reduction method and a model discrimination method. The former assures that the reduced model contains only those components that are important to produce the dynamics observed in given experiments, whereas the latter ensures that the reduced model gives a good prediction for any feasible experimental conditions that are relevant to answer questions at hand. These two techniques are applied iteratively. The method reveals the biological core of a model mathematically, indicating the processes that are likely to be responsible for certain behavior. We demonstrate the algorithm on two realistic model examples. We show that in both cases the core is substantially smaller than the full model. PMID:24416170

  5. Genetic variability among Andrographis paniculata in Chhattisgarh ...

    African Journals Online (AJOL)

    Preeti minz

    2013-09-25

    Sep 25, 2013 ... Random amplified polymorphic DNA (RAPD) markers were used to estimate the genetic variability and ... Twenty-four (24) plants were collected from five districts of different places of Chhattisgarh region. ... used for genetic diversity study in wild species of approaching value as it is quick, unswerving and.

  6. Telegrapher's equation with variable propagation speeds

    OpenAIRE

    Masoliver, Jaume, 1951-; Weiss, George H. (George Herbert), 1930-

    1994-01-01

    All derivations of the one-dimensional telegrapher's equation, based on the persistent random walk model, assume a constant speed of signal propagation. We generalize the model here to allow for a variable propagation speed and study several limiting cases in detail. We also show the connections of this model with anomalous diffusion behavior and with inertial dichotomous processes.
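
    For orientation, the constant-speed telegrapher's equation obtained from the persistent random walk (direction-reversal rate \(\lambda\), speed \(v\)) is
    \[
    \frac{\partial^2 p}{\partial t^2} + 2\lambda\,\frac{\partial p}{\partial t} = v^2\,\frac{\partial^2 p}{\partial x^2};
    \]
    the paper develops the corresponding equation when the constant \(v\) is replaced by a position-dependent speed \(v(x)\).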

  7. Genetic parameters of variability, correlation and pathcoefficient ...

    African Journals Online (AJOL)

    All the 21 genotypes along with 64 hybrids were evaluated for nine traits in a randomized block design over five replications. Genetic variability, character association and path-coefficient analysis were studied. Grain yield was kept as a dependant character and the results were analyzed. Analysis of variance revealed ...

  8. Travel time variability and rational inattention

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Jiang, Gege

    2017-01-01

    This paper sets up a rational inattention model for the choice of departure time for a traveler facing random travel time. The traveler chooses how much information to acquire about the travel time outcome before choosing departure time. This reduces the cost of travel time variability compared...

  9. [Use of regression analysis in the study of the association between different mercury exposure routes and some functional variables].

    Science.gov (United States)

    Decarli, A; Parrinello, G; Cortesi, I; Lucchini, R

    2002-01-01

    The "Mercury Multicentric Project" data-set was analysed with the aim to identify a group of biological variables associated with different types of Hg exposure (occupational exposure, fish dietary intake, exposure due to amalgam restorations). The distribution of socio-demographic, biological and surrogate outcome variables was rather different among the collaborating Units. Mixed linear models (MLM) were used to overcome the problems related to the heterogeneity of variances among Units. MLM are a generalization of the standard linear models, the generalization being that the data are permitted to exhibit correlation and non constant variability. MLM therefore provide with the flexibility of modelling not only the means of the dependent variable, but also their variances and co-variances as well. This allows to represent the total variability of the dependent variable as a sum of two components: the first attributable to the Units and the other one to the random error. A set of biological variables significantly associated with at least one of the Hg exposure variables or with the HgU/creatinine ratio (as surrogate variable for Hg exposure) was identified by means of MLM. This set includes beta 2-MG, sIL8, CD4+, sPRL, FT dominant, BAMT, and some variables related to neurological behaviour. An extension of this analysis will be performed with a structural equations approach in order to study the dose-response relationships among the various variables according to a hierarchical path defined on biological basis. One of the possible simplified general models including all the selected variables is described.

  10. In vitro characterization of representative clinical South African Staphylococcus aureus isolates from various clonal lineages

    NARCIS (Netherlands)

    Oosthuysen, W F; Orth, H; Lombard, C. J.; Sinha, B; Wasserman, E

    Data concerning the virulence and pathogenesis of South African strains of Staphylococcus aureus are limited. We investigated host-pathogen interactions of randomly selected clinical S. aureus isolates representing various clones. We characterized the ability of isolates to adhere to fibronectin,

  11. 77 FR 72205 - Testing and Labeling Pertaining to Product Certification Regarding Representative Samples for...

    Science.gov (United States)

    2012-12-05

    ... design or manufacturing process, including the sourcing of component parts,'' and the ``testing of random... representative of the product for mechanical tests. For example, if a bicycle handlebar sample is manufactured... concerning untested units versus tested units may be met by a range of probability-based sampling designs...

  12. Frequency of lucid dreaming in a representative German sample.

    Science.gov (United States)

    Schredl, Michael; Erlacher, Daniel

    2011-02-01

    Lucid dreams occur when a person is aware that he is dreaming while he is dreaming. In a representative sample of German adults (N = 919), 51% of the participants reported that they had experienced a lucid dream at least once. Lucid dream recall was significantly higher in women and negatively correlated with age. However, these effects might be explained by the frequency of dream recall, as there was a correlation of .57 between frequency of dream recall and frequency of lucid dreams. Other sociodemographic variables like education, marital status, or monthly income were not related to lucid dream frequency. Given the relatively high prevalence of lucid dreaming reported in the present study, research on lucid dreams might be pursued in the sleep laboratory to expand the knowledge about sleep, dreaming, and consciousness processes in general.

  13. Representing vegetation processes in hydrometeorological simulations using the WRF model

    DEFF Research Database (Denmark)

    Nielsen, Joakim Refslund

    For accurate predictions of weather and climate, it is important that the land surface and its processes are well represented. In a mesoscale model the land surface processes are calculated in a land surface model (LSM). These processes include exchanges of energy, water and momentum between the land surface components, such as vegetation and soil, and their interactions with the atmosphere. The land surface processes are complex and vary in time and space. Significant effort by the land surface community has therefore been invested in improving the LSMs over the recent decades. However, improvements are still needed in the representation of the land surface variability and of some key land surface processes. This thesis explores two possibilities for improving the near-surface model predictions using the mesoscale Weather Research and Forecasting (WRF) model. In the first approach, data from satellite...

  14. Representing glaciers in a regional climate model

    Energy Technology Data Exchange (ETDEWEB)

    Kotlarski, Sven [Max Planck Institute for Meteorology, Hamburg (Germany); ETH Zurich, Institute for Atmospheric and Climate Science, Zurich (Switzerland); Jacob, Daniela; Podzun, Ralf [Max Planck Institute for Meteorology, Hamburg (Germany); Paul, Frank [University of Zurich, Department of Geography, Zurich (Switzerland)

    2010-01-15

    A glacier parameterization scheme has been developed and implemented into the regional climate model REMO. The new scheme interactively simulates the mass balance as well as changes of the areal extent of glaciers on a subgrid scale. The temporal evolution and the general magnitude of the simulated glacier mass balance in the European Alps are in good accordance with observations for the period 1958-1980, but the strong mass loss towards the end of the twentieth century is systematically underestimated. The simulated decrease of glacier area in the Alps between 1958 and 2003 ranges from -17.1 to -23.6%. The results indicate that observed glacier mass balances can be approximately reproduced within a regional climate model based on simplified concepts of glacier-climate interaction. However, realistic results can only be achieved by explicitly accounting for the subgrid variability of atmospheric parameters within a climate model grid box. (orig.)

  15. Towards a representative periphytic diatom sample

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The need to acquire a representative periphytic diatom sample for river water quality monitoring has been recognised in the development of existing diatom indices, important in the development and employment of diatom monitoring tools for the Water Framework Directive. In this study, a nested design with replication is employed to investigate the magnitude of variation in diatom biomass, composition and Trophic Diatom Index at varying scales within a small chalk river. The study shows that the use of artificial substrates may not result in diatom communities that are typical of the surrounding natural substrates. Periphytic diatom biomass and composition varies between artificial and natural substrates, riffles and glides and between two stretches of the river channel. The study also highlights the existence of high variation in diatom frustule frequency and biovolume at the individual replicate scale which may have implications for the use of diatoms in routine monitoring.

  16. Using resource graphs to represent conceptual change

    Directory of Open Access Journals (Sweden)

    Michael C. Wittmann

    2006-08-01

    Full Text Available We introduce resource graphs, a representation of linked ideas used when reasoning about specific contexts in physics. Our model is consistent with previous descriptions of coordination classes and resources. It represents mesoscopic scales that are neither knowledge-in-pieces nor large-scale concepts. We use resource graphs to describe several forms of conceptual change: incremental, cascade, wholesale, and dual construction. For each, we give evidence from the physics education research literature to show examples of each form of conceptual change. Where possible, we compare our representation to models used by other researchers. Building on our representation, we analyze another form of conceptual change, differentiation, and suggest several experimental studies that would help understand the differences between reform-based curricula.

  17. The French method (of representing noise annoyance)

    Science.gov (United States)

    Collet, F.; Delol, J.

    1980-01-01

    The psophic index used in France for noise exposure from aircraft globally represents the annoyance with the following hypotheses: (1) the global annoyance is a function of the number of aircraft overflights of each type but does not depend on the overflight time; (2) an aircraft flying at night is considered to be just as annoying as 10 aircraft of the same type passing overhead during the day; and (3) annoyance is only a function of the peak noise levels. Overall, the psophic index appears statistically as good a representation of the average annoyance as methods used in other countries; however, it does seem to reflect poorly the annoyance caused by light aircraft. Noise maps produced for Orly, Roissy, and the area around Paris are described. The range of applications and limitations of the psophic index are discussed.
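
    One schematic way to write an index consistent with hypotheses (1)-(3), given here only as an illustration and not as the official French definition, is
    \[
    I \;=\; \overline{L}_{\max} \;+\; 10\,\log_{10}\!\big(N_{\text{day}} + 10\,N_{\text{night}}\big) \;-\; C,
    \]
    where \(\overline{L}_{\max}\) is an average of the peak noise levels of the aircraft types involved, \(N_{\text{day}}\) and \(N_{\text{night}}\) are the numbers of day and night overflights, and \(C\) is a calibration constant.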

  18. Endeavors to Represent the Non-Representational

    DEFF Research Database (Denmark)

    Nørgård, Rikke Toft

    and findings emerged through the development of a conceptual understanding of the non-representational and pre-linguistic nature and structure of corporeal-locomotive gameplay. Through the effort of trying to think and talk about games as corporeal-locomotive activities and experiences it quickly became...... the corporeal-locomotive dimension of gameplay where hands and bodies where moving to the (kin)aesthetic rhythms of the game’s choreography. Consequently, I found myself barred from ‘meaningfully’ communicating the expressive, sensuous and (kin)aesthetic meaning and significance of corporeal-locomotive gameplay...... of this hazardous, messy and meticulous endeavor to represent the non-representational nature of corporeal-locomotive gameplay activity and experience. Furthermore, the article points towards the importance of letting the expressive research field or subject dictate the method, rather than letting the method...

  19. Protecting chips against hold time violations due to variability

    CERN Document Server

    Neuberger, Gustavo; Reis, Ricardo

    2013-01-01

    With the development of Very-Deep Sub-Micron technologies, process variability is becoming increasingly important and is a major issue in the design of complex circuits. Process variability is the statistical variation of process parameters, meaning that these parameters do not always have the same value, but become random variables with a given mean value and standard deviation. This effect can lead to several issues in digital circuit design. The logical consequence of this parameter variation is that circuit characteristics, such as delay and power, also become random variables. Becaus...

  20. Solution Methods for Structures with Random Properties Subject to Random Excitation

    DEFF Research Database (Denmark)

    Köylüoglu, H. U.; Nielsen, Søren R. K.; Cakmak, A. S.

    This paper deals with the lower order statistical moments of the response of structures with random stiffness and random damping properties subject to random excitation. The arising stochastic differential equations (SDE) with random coefficients are solved by two methods, a second order perturbation approach and a Markovian method. The second order perturbation approach is grounded on the total probability theorem and can be compactly written. Moreover, the problem to be solved is independent of the dimension of the random variables involved. The Markovian approach suggests transforming the SDE with random coefficients and deterministic initial conditions to an equivalent nonlinear SDE with deterministic coefficients and random initial conditions. In both methods, the statistical moment equations are used. The hierarchy of statistical moments in the Markovian approach is closed...
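
    As a generic illustration of a second order perturbation approach (not necessarily the paper's exact formulation), write the random parameter vector as \(\theta = \bar\theta + \Delta\theta\) with \(\mathbb{E}[\Delta\theta]=0\) and expand the response:
    \[
    X(\theta) \;\approx\; X(\bar\theta) \;+\; \sum_i \frac{\partial X}{\partial \theta_i}\Big|_{\bar\theta}\,\Delta\theta_i \;+\; \frac{1}{2}\sum_{i,j}\frac{\partial^2 X}{\partial \theta_i\,\partial \theta_j}\Big|_{\bar\theta}\,\Delta\theta_i\,\Delta\theta_j ,
    \]
    so that, taking expectations of the zero-mean perturbations,
    \[
    \mathbb{E}[X] \;\approx\; X(\bar\theta) \;+\; \frac{1}{2}\sum_{i,j}\frac{\partial^2 X}{\partial \theta_i\,\partial \theta_j}\Big|_{\bar\theta}\,\operatorname{Cov}(\theta_i,\theta_j),
    \]
    with higher-order moments obtained along the same lines.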

  1. Potential volcanic impacts on future climate variability

    Science.gov (United States)

    Bethke, Ingo; Outten, Stephen; Otterå, Odd Helge; Hawkins, Ed; Wagner, Sebastian; Sigl, Michael; Thorne, Peter

    2017-11-01

    Volcanic activity plays a strong role in modulating climate variability. Most model projections of the twenty-first century, however, under-sample future volcanic effects by not representing the range of plausible eruption scenarios. Here, we explore how sixty possible volcanic futures, consistent with ice-core records, impact climate variability projections of the Norwegian Earth System Model (NorESM) under RCP4.5. The inclusion of volcanic forcing enhances climate variability on annual-to-decadal timescales. Although decades with negative global temperature trends become ~50% more commonplace with volcanic activity, these are unlikely to be able to mitigate long-term anthropogenic warming. Volcanic activity also impacts probabilistic projections of global radiation, sea level, ocean circulation, and sea-ice variability, the local-scale effects of which are detectable when quantifying the time of emergence. These results highlight the importance and feasibility of representing volcanic uncertainty in future climate assessments.

  2. An analysis of spatial representativeness of air temperature monitoring stations

    Science.gov (United States)

    Liu, Suhua; Su, Hongbo; Tian, Jing; Wang, Weizhen

    2017-04-01

    Surface air temperature is an essential variable for monitoring the atmosphere, and it is generally acquired at meteorological stations that can provide information about only a small area within an r m radius (r-neighborhood) of the station, which is called the representable radius. In studies on a local scale, ground-based observations of surface air temperatures obtained from scattered stations are usually interpolated using a variety of methods without ascertaining their effectiveness. Thus, it is necessary to evaluate the spatial representativeness of ground-based observations of surface air temperature before conducting studies on a local scale. The present study used remote sensing data to estimate the spatial distribution of surface air temperature using the advection-energy balance for air temperature (ADEBAT) model. Two target stations in the study area were selected to conduct an analysis of spatial representativeness. The results showed that one station (AWS 7) had a representable radius of about 400 m with a possible error of less than 1 K, while the other station (AWS 16) had a representable radius of about 250 m. The representable radius was large when the heterogeneity of land cover around the station was small.

  3. In search of a representative sample of residential building work.

    Science.gov (United States)

    Lobb, Brenda; Woods, Gregory R

    2012-09-01

    Most research investigating injuries in construction work is limited by reliance on work samples unrepresentative of the multiple, variable-cycle tasks involved, resulting in incomplete characterisation of ergonomic exposures. In this case study, a participatory approach was used including hierarchical task analysis and site observations of a typical team of house builders in New Zealand, over several working days, to obtain a representative work sample. The builders' work consisted of 14 goal-defined jobs using varying subsets of 15 task types, each taking from less than 1 s to more than 1 h and performed in a variety of postures. Task type and duration varied within and between participants and days, although all participants spent at least 25% of the time moving from place to place, mostly carrying materials, and more than half the time either reaching up or bending down to work. This research has provided a description of residential building work based on a work sample more nearly representative than those previously published and has demonstrated a simple, low-cost but robust field observation method that can provide a valid basis for further study of hazard exposures. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Good Random Multi-Triangulation of Surfaces.

    Science.gov (United States)

    de Medeiros Filho, Esdras; Siqueira, Marcelo

    2017-05-12

    We introduce the Hierarchical Poisson Disk Sampling Multi-Triangulation (HPDS-MT) of surfaces, a novel structure that combines the power of multi-triangulation (MT) with the benefits of Hierarchical Poisson Disk Sampling (HPDS). MT is a general framework for representing surfaces through variable resolution triangle meshes, while HPDS is a well-spaced random distribution with blue noise characteristics. The distinguishing feature of the HPDS-MT is its ability to extract adaptive meshes whose triangles are guaranteed to have good shape quality. The key idea behind the HPDS-MT is a preprocessed hierarchy of points, which is used in the construction of a MT via incremental simplification. In addition to proving theoretical properties on the shape quality of the triangle meshes extracted by the HPDS-MT, we provide an implementation that computes the HPDS-MT with high accuracy. Our results confirm the theoretical guarantees and outperform similar methods. We also prove that the Hausdorff distance between the original surface and any (extracted) adaptive mesh is bounded by the sampling distribution of the radii of Poisson-disks over the surface. Finally, we illustrate the advantages of the HPDS-MT in some typical problems of geometry processing.
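
    For intuition about the "well-spaced random distribution" ingredient, a naive dart-throwing Poisson disk sampler in the plane is sketched below; the paper's HPDS is hierarchical and defined on surfaces, which this sketch does not attempt.

```python
import random

def poisson_disk_sample(width, height, radius, n_candidates=2000, seed=1):
    """Naive dart-throwing Poisson disk sampling on a rectangle: accept a
    candidate only if it stays at least `radius` away from every accepted point."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_candidates):
        p = (rng.uniform(0, width), rng.uniform(0, height))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= radius ** 2 for q in points):
            points.append(p)
    return points

samples = poisson_disk_sample(10.0, 10.0, radius=1.0)
print(len(samples), "well-spaced points")
```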

  5. COVAMOF : A framework for modeling variability in software product families

    NARCIS (Netherlands)

    Sinnema, M; Deelstra, S; Nijhuis, J; Bosch, J; Nord, RL

    2004-01-01

    A key aspect of variability management in software product families is the explicit representation of the variability. Experiences at several industrial software development companies have shown that a software variability model should do four things: (1) uniformly represent variation points as

  6. Combining SNPs in latent variables to improve genomic prediction

    DEFF Research Database (Denmark)

    Heuven, Henri C M; Rosa, G J M; Janss, Luc

    The objective of this study was to develop and test hierarchical genomic models with latent variables that represent parts of the genomic values. An interaction model and a chromosome model were compared with a model based on variable selection in a simulated and real dataset. The program Bayz... Keywords: Hierarchical genetic model; Predictive value; Gibbs sampling; Variable selection.

  7. Construct Relevant and Irrelevant Variables in Math Problem Solving Assessment

    Science.gov (United States)

    Birk, Lisa E.

    2013-01-01

    In this study, I examined the relation between various construct relevant and irrelevant variables and a math problem solving assessment. I used independent performance measures representing the variables of mathematics content knowledge, general ability, and reading fluency. Non-performance variables included gender, socioeconomic status,…

  8. Beyond distance and direction: the brain represents target locations non-metrically.

    Science.gov (United States)

    Thaler, Lore; Goodale, Melvyn A

    2010-03-23

    In their day-to-day activities human beings are constantly generating behavior, such as pointing, grasping or verbal reports, on the basis of visible target locations. The question arises how the brain represents target locations. One possibility is that the brain represents them metrically, i.e. in terms of distance and direction. Another equally plausible possibility is that the brain represents locations non-metrically, using for example ordered geometry or topology. Here we report two experiments that were designed to test if the brain represents locations metrically or non-metrically. We measured accuracy and variability of visually guided reach-to-point movements (Experiment 1) and probe-stimulus adjustments (Experiment 2). The specific procedure of informing subjects about the relevant response on each trial enabled us to dissociate the use of non-metric target location from the use of metric distance and direction in head/eye-centered, hand-centered and externally defined (allocentric) coordinates. The behavioral data show that subjects' responses are least variable when they can direct their response at a visible target location, the only condition that permitted the use of non-metric information about target location in our experiments. Data from Experiments 1 and 2 correspond well quantitatively. Response variability in non-metric conditions cannot be predicted based on response variability in metric conditions. We conclude that the brain uses non-metric geometrical structure to represent locations.

  9. Representativeness and seasonality of major ion records derived from NEEM firn cores

    Science.gov (United States)

    Gfeller, G.; Fischer, H.; Bigler, M.; Schüpbach, S.; Leuenberger, D.; Mini, O.

    2014-10-01

    The seasonal and annual representativeness of ionic aerosol proxies (among others, calcium, sodium, ammonium and nitrate) in various firn cores in the vicinity of the NEEM drill site in northwest Greenland have been assessed. Seasonal representativeness is very high as one core explains more than 60% of the variability within the area. The inter-annual representativeness, however, can be substantially lower (depending on the species) making replicate coring indispensable to derive the atmospheric variability of aerosol species. A single core at the NEEM site records only 30% of the inter-annual atmospheric variability in some species, while five replicate cores are already needed to cover approximately 70% of the inter-annual atmospheric variability in all species. The spatial representativeness is very high within 60 cm, rapidly decorrelates within 10 m but does not diminish further within 3 km. We attribute this to wind reworking of the snow pack leading to sastrugi formation. Due to the high resolution and seasonal representativeness of the records we can derive accurate seasonalities of the measured species for modern (AD 1990-2010) times as well as for pre-industrial (AD 1623-1750) times. Sodium and calcium show similar seasonality (peaking in February and March respectively) for modern and pre-industrial times, whereas ammonium and nitrate are influenced by anthropogenic activities. Nitrate and ammonium both peak in May during modern times, whereas during pre-industrial times ammonium peaked during July-August and nitrate during June-July.

  10. Representativeness and seasonality of major ion records derived from NEEM firn cores

    Directory of Open Access Journals (Sweden)

    G. Gfeller

    2014-10-01

    Full Text Available The seasonal and annual representativeness of ionic aerosol proxies (among others, calcium, sodium, ammonium and nitrate) in various firn cores in the vicinity of the NEEM drill site in northwest Greenland have been assessed. Seasonal representativeness is very high as one core explains more than 60% of the variability within the area. The inter-annual representativeness, however, can be substantially lower (depending on the species) making replicate coring indispensable to derive the atmospheric variability of aerosol species. A single core at the NEEM site records only 30% of the inter-annual atmospheric variability in some species, while five replicate cores are already needed to cover approximately 70% of the inter-annual atmospheric variability in all species. The spatial representativeness is very high within 60 cm, rapidly decorrelates within 10 m but does not diminish further within 3 km. We attribute this to wind reworking of the snow pack leading to sastrugi formation. Due to the high resolution and seasonal representativeness of the records we can derive accurate seasonalities of the measured species for modern (AD 1990–2010) times as well as for pre-industrial (AD 1623–1750) times. Sodium and calcium show similar seasonality (peaking in February and March respectively) for modern and pre-industrial times, whereas ammonium and nitrate are influenced by anthropogenic activities. Nitrate and ammonium both peak in May during modern times, whereas during pre-industrial times ammonium peaked during July–August and nitrate during June–July.

  11. Model parameters for representative wetland plant functional groups

    Science.gov (United States)

    Williams, Amber S.; Kiniry, James R.; Mushet, David M.; Smith, Loren M.; McMurry, Scott T.; Attebury, Kelly; Lang, Megan; McCarty, Gregory W.; Shaffer, Jill A.; Effland, William R.; Johnson, Mari-Vaughn V.

    2017-01-01

    Wetlands provide a wide variety of ecosystem services including water quality remediation, biodiversity refugia, groundwater recharge, and floodwater storage. Realistic estimation of ecosystem service benefits associated with wetlands requires reasonable simulation of the hydrology of each site and realistic simulation of the upland and wetland plant growth cycles. Objectives of this study were to quantify leaf area index (LAI), light extinction coefficient (k), and plant nitrogen (N), phosphorus (P), and potassium (K) concentrations in natural stands of representative plant species for some major plant functional groups in the United States. Functional groups in this study were based on these parameters and plant growth types to enable process-based modeling. We collected data at four locations representing some of the main wetland regions of the United States. At each site, we collected on-the-ground measurements of fraction of light intercepted, LAI, and dry matter within the 2013–2015 growing seasons. Maximum LAI and k variables showed noticeable variations among sites and years, while overall averages and functional group averages give useful estimates for multisite simulation modeling. Variation within each species gives an indication of what can be expected in such natural ecosystems. For P and K, the concentrations from highest to lowest were spikerush (Eleocharis macrostachya), reed canary grass (Phalaris arundinacea), smartweed (Polygonum spp.), cattail (Typha spp.), and hardstem bulrush (Schoenoplectus acutus). Spikerush had the highest N concentration, followed by smartweed, bulrush, reed canary grass, and then cattail. These parameters will be useful for the actual wetland species measured and for the wetland plant functional groups they represent. These parameters and the associated process-based models offer promise as valuable tools for evaluating environmental benefits of wetlands and for evaluating impacts of various agronomic practices in

  12. Types of biological variables.

    Science.gov (United States)

    Mayya, Shreemathi S; Monteiro, Ashma D; Ganapathy, Sachit

    2017-06-01

    Identification and description of variables used in any study is a necessary component in biomedical research. Statistical analyses rely on the type of variables that are involved in the study. In this short article, we introduce the different types of biological variables. A researcher has to be familiar with the type of variable he/she is dealing with in his/her research to decide about appropriate graphs/diagrams, summary measures and statistical analysis.

  13. Variable mechanical ventilation

    OpenAIRE

    Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini Jr.,Luiz Alberto; Friedman, Gilberto

    2017-01-01

    Objective To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Methods Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". Results A total of 36 studies were selected. ...

  14. A biorthogonal decomposition for the identification and simulation of non-stationary and non-Gaussian random fields

    Energy Technology Data Exchange (ETDEWEB)

    Zentner, I. [IMSIA, UMR EDF-ENSTA-CNRS-CEA 9219, Université Paris-Saclay, 828 Boulevard des Maréchaux, 91762 Palaiseau Cedex (France); Ferré, G., E-mail: gregoire.ferre@ponts.org [CERMICS – Ecole des Ponts ParisTech, 6 et 8 avenue Blaise Pascal, Cité Descartes, Champs sur Marne, 77455 Marne la Vallée Cedex 2 (France); Poirion, F. [Department of Structural Dynamics and Aeroelasticity, ONERA, BP 72, 29 avenue de la Division Leclerc, 92322 Chatillon Cedex (France); Benoit, M. [Institut de Recherche sur les Phénomènes Hors Equilibre (IRPHE), UMR 7342 (CNRS, Aix-Marseille Université, Ecole Centrale Marseille), 49 rue Frédéric Joliot-Curie, BP 146, 13384 Marseille Cedex 13 (France)

    2016-06-01

    In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields given a database is proposed. It is based on two successive biorthogonal decompositions aiming at representing spatio–temporal stochastic fields. The proposed double expansion allows to build the model even in the case of large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).
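
    A much-simplified sketch of the underlying idea, one orthogonal decomposition of realization-by-space data plus kernel-based resampling of the random coefficients, is shown below; the paper's method is a double, spatio-temporal expansion, and all names and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical database: 50 realizations of a field sampled at 200 spatial points.
n_samples, n_space = 50, 200
x = np.linspace(0, 1, n_space)
data = np.array([np.sin(2 * np.pi * (x - rng.uniform())) * rng.lognormal()
                 for _ in range(n_samples)])

# Biorthogonal (POD-like) decomposition via SVD: spatial modes + random coefficients.
mean = data.mean(axis=0)
U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)
coeffs = U * s              # one row of random coefficients per realization
modes = Vt                  # orthonormal spatial modes

# Gaussian-kernel resampling of the leading coefficients to generate new realizations.
k = 3                       # number of retained modes
bandwidth = 0.5 * coeffs[:, :k].std(axis=0)
new_coeffs = (coeffs[rng.integers(0, n_samples, size=10), :k]
              + rng.normal(scale=bandwidth, size=(10, k)))
synthetic = mean + new_coeffs @ modes[:k]
print(synthetic.shape)      # (10, 200) synthetic non-Gaussian realizations
```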

  15. Biologically-variable rhythmic auditory cues are superior to isochronous cues in fostering natural gait variability in Parkinson's disease.

    Science.gov (United States)

    Dotov, D G; Bayard, S; Cochen de Cock, V; Geny, C; Driss, V; Garrigue, G; Bardy, B; Dalla Bella, S

    2017-01-01

    Rhythmic auditory cueing improves certain gait symptoms of Parkinson's disease (PD). Cues are typically stimuli or beats with a fixed inter-beat interval. We show that isochronous cueing has an unwanted side-effect in that it exacerbates one of the motor symptoms characteristic of advanced PD. Whereas the parameters of the stride cycle of healthy walkers and early patients possess a persistent correlation in time, or long-range correlation (LRC), isochronous cueing renders stride-to-stride variability random. Random stride cycle variability is also associated with reduced gait stability and lack of flexibility. To investigate how to prevent patients from acquiring a random stride cycle pattern, we tested rhythmic cueing which mimics the properties of variability found in healthy gait (biological variability). PD patients (n=19) and age-matched healthy participants (n=19) walked with three rhythmic cueing stimuli: isochronous, with random variability, and with biological variability (LRC). Synchronization was not instructed. The persistent correlation in gait was preserved only with stimuli with biological variability, equally for patients and controls (p'scycle. Notably, the individual's tendency to synchronize steps with beats determined the amount of negative effects of isochronous and random cues (p'sgait dynamics during cueing. The beneficial effects of biological variability provide useful guidelines for improving existing cueing treatments. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Representing Water Scarcity in Future Agricultural Assessments

    Science.gov (United States)

    Winter, Jonathan M.; Lopez, Jose R.; Ruane, Alexander C.; Young, Charles A.; Scanlon, Bridget R.; Rosenzweig, Cynthia

    2017-01-01

    Globally, irrigated agriculture is both essential for food production and the largest user of water. A major challenge for hydrologic and agricultural research communities is assessing the sustainability of irrigated croplands under climate variability and change. Simulations of irrigated croplands generally lack key interactions between water supply, water distribution, and agricultural water demand. In this article, we explore the critical interface between water resources and agriculture by motivating, developing, and illustrating the application of an integrated modeling framework to advance simulations of irrigated croplands. We motivate the framework by examining historical dynamics of irrigation water withdrawals in the United States and quantitatively reviewing previous modeling studies of irrigated croplands with a focus on representations of water supply, agricultural water demand, and impacts on crop yields when water demand exceeds water supply. We then describe the integrated modeling framework for simulating irrigated croplands, which links trends and scenarios with water supply, water allocation, and agricultural water demand. Finally, we provide examples of efforts that leverage the framework to improve simulations of irrigated croplands as well as identify opportunities for interventions that increase agricultural productivity, resiliency, and sustainability.

  17. Latent variable theory

    NARCIS (Netherlands)

    Borsboom, D.

    2008-01-01

    This paper formulates a metatheoretical framework for latent variable modeling. It does so by spelling out the difference between observed and latent variables. This difference is argued to be purely epistemic in nature: We treat a variable as observed when the inference from data structure to

  18. Amplification variable factor amplifier

    NARCIS (Netherlands)

    Akitsugu, Oshita; Nauta, Bram

    2007-01-01

    PROBLEM TO BE SOLVED: To provide an amplification factor variable amplifier capable of achieving temperature compensation of an amplification factor over a wide variable amplification factor range. ; SOLUTION: A Gilbert type amplification factor variable amplifier 11 amplifies an input signal and

  19. Amplification variable factor amplifier

    NARCIS (Netherlands)

    Akitsugu, O.; Nauta, Bram

    2006-01-01

    PROBLEM TO BE SOLVED: To provide an amplification factor variable amplifier capable of achieving temperature compensation of an amplification factor over a wide variable amplification factor range. ; SOLUTION: A Gilbert type amplification factor variable amplifier 11 amplifies an input signal and

  20. Representing the effects of stratosphere–troposphere ...

    Science.gov (United States)

    Downward transport of ozone (O3) from the stratosphere can be a significant contributor to tropospheric O3 background levels. However, this process often is not well represented in current regional models. In this study, we develop a seasonally and spatially varying potential vorticity (PV)-based function to parameterize upper tropospheric and/or lower stratospheric (UTLS) O3 in a chemistry transport model. This dynamic O3–PV function is developed based on 21-year ozonesonde records from World Ozone and Ultraviolet Radiation Data Centre (WOUDC) with corresponding PV values from a 21-year Weather Research and Forecasting (WRF) simulation across the Northern Hemisphere from 1990 to 2010. The result suggests strong spatial and seasonal variations of O3 ∕ PV ratios, which exhibit large values in the upper layers and in high-latitude regions, with highest values in spring and the lowest values in autumn over an annual cycle. The newly developed O3 ∕ PV function was then applied in the Community Multiscale Air Quality (CMAQ) model for an annual simulation of the year 2006. The simulated UTLS O3 agrees much better with observations in both magnitude and seasonality after the implementation of the new parameterization. Considerable impacts on surface O3 model performance were found in the comparison with observations from three observational networks, i.e., EMEP, CASTNET and WDCGG. With the new parameterization, the negative bias in spring is reduced from
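
    A toy sketch of the general idea, scaling PV by a seasonally and latitudinally varying ratio, is given below; the ratio values and latitude-band structure are hypothetical and much cruder than the ozonesonde-derived function described in the abstract.

```python
import numpy as np

def o3_from_pv(pv_pvu, month, lat_deg, ratio_table):
    """Illustrative PV-based UTLS ozone estimate: O3 [ppb] = ratio(month, band) * PV [PVU].
    `ratio_table` is a hypothetical lookup of O3/PV ratios (ppb per PVU) indexed by
    month (0-11) and latitude band."""
    band = int(min(abs(lat_deg) // 30, 2))  # 0: tropics, 1: mid-latitudes, 2: high latitudes
    return ratio_table[month, band] * pv_pvu

# Hypothetical ratios: larger toward high latitudes, peaking in spring (roughly
# the seasonality described in the abstract).
months = np.arange(12)
seasonal = 35 + 15 * np.cos(2 * np.pi * (months - 3) / 12)   # maximum near April
ratio_table = np.outer(seasonal, [0.6, 1.0, 1.4])             # scaled by latitude band

print(o3_from_pv(pv_pvu=4.0, month=3, lat_deg=55.0, ratio_table=ratio_table))
```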

  1. Multicriteria analysis of ontologically represented information

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2014-11-01

    Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we have considered the Analytical Hierarchy Process (AHP), which is well suited for the hierarchical data structures (e.g., such that have been formulated in terms of ontologies). However, due to its well-known shortcomings, we have decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
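
    Of the candidate methods, TOPSIS is compact enough to sketch; the snippet below is a generic textbook-style implementation with hypothetical criteria, not the decision model actually deployed in the AiG system.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS sketch: rank alternatives (rows) on criteria (columns).
    `benefit[j]` is True if larger values of criterion j are better."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)

    # Vector-normalize each criterion, then weight it.
    v = w * m / np.linalg.norm(m, axis=0)

    # Ideal and anti-ideal points depend on whether a criterion is benefit or cost.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))

    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    return d_minus / (d_plus + d_minus)     # closeness to the ideal, higher is better

# Three hypothetical software candidates scored on cost (lower better),
# expected runtime (lower better) and expert-rated fit (higher better).
scores = topsis([[100, 20, 7], [250, 12, 9], [180, 15, 8]],
                weights=[0.2, 0.4, 0.4],
                benefit=[False, False, True])
print(scores.argsort()[::-1])   # candidate indices, best first
```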

  2. The underground electromagnetic pulse: Four representative models

    Energy Technology Data Exchange (ETDEWEB)

    Wouters, L.F.

    1989-06-01

    I describe four phenomenological models by which an underground nuclear explosion may generate electromagnetic pulses: Compton current asymmetry (or "Compton dipole"); uphole conductor currents (or "casing currents"); diamagnetic cavity plasma (or "magnetic bubble"); and large-scale ground motion (or "magneto-acoustic wave"). I outline the corresponding analytic exercises and summarize the principal results of the computations. I used a 10-kt contained explosion as the fiducial case. Each analytic sequence developed an equivalent source dipole and calculated signal waveforms at representative ground-surface locations. As a comparative summary, the Compton dipole generates a peak source current moment of about 12,000 A·m in the submicrosecond time domain. The casing-current source model obtains an equivalent peak moment of about 2 × 10⁵ A·m in the 10- to 30-μs domain. The magnetic bubble produces a magnetic dipole moment of about 7 × 10⁶ A·m², characterized by a 30-ms time structure. Finally, the magneto-acoustic wave corresponds to a magnetic dipole moment of about 600 A·m², with a waveform showing 0.5-s periodicities. 8 refs., 35 figs., 7 tabs.

  3. Intraspecific chromosome variability

    Directory of Open Access Journals (Sweden)

    N Dubinin

    2010-12-01

    Full Text Available (Editorial preface.) The publication is presented in order to remind us of one of the dramatic pages of the history of genetics. It re-opens for the contemporary reader a comprehensive work marking the priority change from plant cytogenetics to animal cytogenetics, led by wide population studies which were conducted on Drosophila polytene chromosomes. The year of the publication (1937) became the point of irretrievable branching between the directions of Old World and New World genetics connected with the problems of chromosome variability and its significance for the evolution of the species. The famous book of T. Dobzhansky (1937) was published by Columbia University in the US under the title “Genetics and the origin of species”, and in the shadow of this American ‘skybuilding’ all other works grew dim. It is remarkable that both Dobzhansky and Dubinin come to similar conclusions about the role of chromosomes in speciation. This is not surprising given that they both might be considered as representatives of the Russian genetic school, by their birth and education. Interestingly, Dobzhansky never referred to the full paper of Dubinin et al. (1937), though a previous short communication in Nature (1936) was included together with all former papers on the related subject. In full, the original publication printed in the Biological Journal in Moscow comprised 47 pages, including 41 pages of Russian text accompanied by 16 figures, a table and a reference list, plus 6 pages of English summary. This final part in English is now reproduced in the authors' version, with the only addition being the reference list in the originally printed form.

  4. Gaussian random bridges and a geometric model for information equilibrium

    Science.gov (United States)

    Mengütürk, Levent Ali

    2018-03-01

    The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T, 0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.
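
    In that spirit, a minimal simulation sketch of a "random variable plus Gaussian bridge" information process (with a hypothetical binary signal and a standard Brownian bridge vanishing at 0 and T) could look as follows; the paper's GRB construction is more general than this special case.

```python
import numpy as np

rng = np.random.default_rng(42)

T, n_steps = 1.0, 500
t = np.linspace(0.0, T, n_steps + 1)

# Signal: a random variable revealed only at time T (hypothetical binary payoff here).
Z = rng.choice([0.0, 1.0])

# Standard Brownian bridge on [0, T]: beta_0 = beta_T = 0.
dW = rng.normal(scale=np.sqrt(T / n_steps), size=n_steps)
W = np.concatenate([[0.0], np.cumsum(dW)])
bridge = W - (t / T) * W[-1]

# Anticipative "signal + Gaussian bridge" information process (sigma = information flow rate).
sigma = 1.0
xi = sigma * t * Z + bridge
print(xi[0], xi[-1])   # starts at 0, ends at sigma * T * Z (the bridge vanishes at both ends)
```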

  5. Statistical conditional sampling for variable-resolution video compression.

    Directory of Open Access Journals (Sweden)

    Alexander Wong

    Full Text Available In this study, we investigate a variable-resolution approach to video compression based on Conditional Random Field and statistical conditional sampling in order to further improve compression rate while maintaining high-quality video. In the proposed approach, representative key-frames within a video shot are identified and stored at full resolution. The remaining frames within the video shot are stored and compressed at a reduced resolution. At the decompression stage, a region-based dictionary is constructed from the key-frames and used to restore the reduced resolution frames to the original resolution via statistical conditional sampling. The sampling approach is based on the conditional probability of the CRF modeling by use of the constructed dictionary. Experimental results show that the proposed variable-resolution approach via statistical conditional sampling has potential for improving compression rates when compared to compressing the video at full resolution, while achieving higher video quality when compared to compressing the video at reduced resolution.

  6. Variability in prefrontal hemodynamic response during exposure to repeated self-selected music excerpts, a near-infrared spectroscopy study.

    Science.gov (United States)

    Moghimi, Saba; Schudlo, Larissa; Chau, Tom; Guerguerian, Anne-Marie

    2015-01-01

    Music-induced brain activity modulations in areas involved in emotion regulation may be useful in achieving therapeutic outcomes. Clinical applications of music may involve prolonged or repeated exposures to music. However, the variability of the observed brain activity patterns in repeated exposures to music is not well understood. We hypothesized that multiple exposures to the same music would elicit more consistent activity patterns than exposure to different music. In this study, the temporal and spatial variability of cerebral prefrontal hemodynamic response was investigated across multiple exposures to self-selected musical excerpts in 10 healthy adults. The hemodynamic changes were measured using prefrontal cortex near infrared spectroscopy and represented by instantaneous phase values. Based on spatial and temporal characteristics of these observed hemodynamic changes, we defined a consistency index to represent variability across these domains. The consistency index across repeated exposures to the same piece of music was compared to the consistency index corresponding to prefrontal activity from randomly matched non-identical musical excerpts. Consistency indexes were significantly different for identical versus non-identical musical excerpts when comparing a subset of repetitions. When all four exposures were compared, no significant difference was observed between the consistency indexes of randomly matched non-identical musical excerpts and the consistency index corresponding to repetitions of the same musical excerpts. This observation suggests the existence of only partial consistency between repeated exposures to the same musical excerpt, which may stem from the role of the prefrontal cortex in regulating other cognitive and emotional processes.
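
    As a minimal illustration of the phase-based representation (not the study's full spatial-temporal consistency index), instantaneous phase can be extracted from a slow hemodynamic signal with the Hilbert transform, and a phase-locking-style consistency number computed between two repetitions; the signals below are synthetic.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(7)

# Hypothetical slow hemodynamic signal (arbitrary units) sampled at 10 Hz for 60 s.
fs, duration = 10.0, 60.0
t = np.arange(0, duration, 1.0 / fs)
signal = np.sin(2 * np.pi * 0.08 * t) + 0.3 * rng.normal(size=t.size)

# Instantaneous phase via the analytic signal (Hilbert transform).
phase = np.angle(hilbert(signal))

# A simple consistency measure between two repetitions: mean resultant length of
# their phase difference (1 = perfectly locked, 0 = no consistent relation).
phase2 = np.angle(hilbert(np.roll(signal, 5)))
consistency = np.abs(np.mean(np.exp(1j * (phase - phase2))))
print(round(consistency, 3))
```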

  7. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability, (i) the RGG is connected, or (ii) a giant component exists in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
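
    A small simulation sketch of the push broadcast model described above; the use of networkx and the graph parameters are illustrative assumptions:

    import random
    import networkx as nx

    random.seed(1)
    n, radius = 500, 0.1                        # RGG parameters (illustrative)
    G = nx.random_geometric_graph(n, radius, seed=1)

    informed = {0}                              # start with one informed node
    rounds = 0
    while len(informed) < len(G) and rounds < 10 * n:
        newly = set()
        for u in informed:                      # every informed node pushes
            nbrs = list(G[u])
            if nbrs:
                newly.add(random.choice(nbrs))  # to one uniformly random neighbor
        informed |= newly
        rounds += 1

    print(f"informed {len(informed)} of {n} nodes in {rounds} rounds")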

  8. Variable mechanical ventilation

    Science.gov (United States)

    Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini Jr., Luiz Alberto; Friedman, Gilberto

    2017-01-01

    Objective: To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Methods: Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". Results: A total of 36 studies were selected. Of these, 24 were original studies, including 21 experimental studies and three clinical studies. Conclusion: Several experimental studies reported the beneficial effects of distinct variable ventilation strategies on lung function using different models of lung injury and healthy lungs. Variable ventilation seems to be a viable strategy for improving gas exchange and respiratory mechanics and preventing lung injury associated with mechanical ventilation. However, further clinical studies are necessary to assess the potential of variable ventilation strategies for the clinical improvement of patients undergoing mechanical ventilation. PMID:28444076

  9. Variable mechanical ventilation.

    Science.gov (United States)

    Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini, Luiz Alberto; Friedman, Gilberto

    2017-01-01

    To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". A total of 36 studies were selected. Of these, 24 were original studies, including 21 experimental studies and three clinical studies. Several experimental studies reported the beneficial effects of distinct variable ventilation strategies on lung function using different models of lung injury and healthy lungs. Variable ventilation seems to be a viable strategy for improving gas exchange and respiratory mechanics and preventing lung injury associated with mechanical ventilation. However, further clinical studies are necessary to assess the potential of variable ventilation strategies for the clinical improvement of patients undergoing mechanical ventilation.

  10. Ergodicity of Random Walks on Random DFA

    OpenAIRE

    Balle, Borja

    2013-01-01

    Given a DFA we consider the random walk that starts at the initial state and at each time step moves to a new state by taking a random transition from the current state. This paper shows that for typical DFA this random walk induces an ergodic Markov chain. The notion of typical DFA is formalized by showing that ergodicity holds with high probability when a DFA is sampled uniformly at random from the set of all automata with a fixed number of states. We also show the same result applies to DF...
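
    A sketch of the construction described above, assuming a binary alphabet and uniformly sampled transitions; ergodicity is checked on the part of the chain reachable from the start state via a primitivity test (Wielandt bound):

    import numpy as np

    rng = np.random.default_rng(0)
    n_states, alphabet = 50, 2

    # Random DFA: for each state and each symbol, a uniformly random target state.
    delta = rng.integers(0, n_states, size=(n_states, alphabet))

    # Random walk: from the current state, take a uniformly random transition.
    # Its transition matrix averages the deterministic per-symbol transition matrices.
    P = np.zeros((n_states, n_states))
    for s in range(n_states):
        for a in range(alphabet):
            P[s, delta[s, a]] += 1.0 / alphabet

    # Restrict to states reachable from the initial state 0.
    reachable, frontier = {0}, [0]
    while frontier:
        s = frontier.pop()
        for a in range(alphabet):
            t = int(delta[s, a])
            if t not in reachable:
                reachable.add(t)
                frontier.append(t)
    idx = sorted(reachable)
    Q = P[np.ix_(idx, idx)]

    # A stochastic matrix is ergodic (irreducible and aperiodic) iff it is primitive,
    # i.e. Q**k has all entries positive for k = (m - 1)**2 + 1 (Wielandt bound).
    m = len(idx)
    print("ergodic on reachable part:",
          (np.linalg.matrix_power(Q, (m - 1) ** 2 + 1) > 0).all())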

  11. [Intervention to reduce adolescents sexual risk behaviors: a randomized controlled trial].

    Science.gov (United States)

    Gallegos, Esther C; Villarruel, Antonia M; Loveland-Cherry, Carol; Ronis, David L; Yan Zhou, Ms

    2008-01-01

    To test the efficacy of a behavioral intervention designed to decrease risky sexual behaviors for HIV/AIDS and unplanned pregnancies in Mexican adolescents. Randomized controlled trial with four follow-ups; 832 adolescents recruited from high schools, aged 14-17, were randomly assigned to the experimental or control group. The six-hour intervention used active learning strategies and was delivered in two sessions on two consecutive Saturdays. The study was carried out in Monterrey, Mexico, 2002-2005. GEE analysis indicated no differences in intentions to have sexual relations between the two conditions; however, the experimental group had higher intentions to use condoms and contraceptives (mean differences 0.15 and 0.16, 95% CI) in the next three months, as compared with the control group. Theoretical variables, such as control beliefs, were significant mediators of the intervention. The behavioral intervention represents an important effort in promoting safe sexual behaviors among Mexican adolescents.

  12. Random fractional Fourier transform.

    Science.gov (United States)

    Liu, Zhengjun; Liu, Shutian

    2007-08-01

    We propose a novel random fractional Fourier transform by randomizing the transform kernel function of the conventional fractional Fourier transform. The random fractional Fourier transform inherits the excellent mathematical properties from the fractional Fourier transform and can be easily implemented in optics. As a primary application the random fractional Fourier transform can be directly used in optical image encryption and decryption. The double phase encoding image encryption schemes can thus be modeled with cascaded random fractional Fourier transformers.

  13. Representative Atmospheric Plume Development for Elevated Releases

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lowrey, Justin D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McIntyre, Justin I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miley, Harry S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prichard, Andrew W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-02-01

    An atmospheric explosion of a low-yield nuclear device will produce a large number of radioactive isotopes, some of which can be measured with airborne detection systems. However, properly equipped aircraft may not arrive in the region where an explosion occurred for a number of hours after the event. Atmospheric conditions will have caused the radioactive plume to move and diffuse before the aircraft arrives. The science behind predicting atmospheric plume movement has advanced enough that the location of the maximum concentrations in the plume can be determined reasonably accurately in real time, or near real time. Given the assumption that an aircraft can follow a plume, this study addresses the amount of atmospheric dilution expected to occur in a representative plume as a function of time past the release event. The approach models atmospheric transport of hypothetical releases from a single location for every day in a year using the publicly available HYSPLIT code. The effective dilution factors for the point of maximum concentration in an elevated plume based on a release of a non-decaying, non-depositing tracer can vary by orders of magnitude depending on the day of the release, even for the same number of hours after the release event. However, the median of the dilution factors based on releases for 365 consecutive days at one site follows a power law relationship in time, as shown in Figure S-1. The relationship is good enough to provide a general rule of thumb for estimating typical future dilution factors in a plume starting at the same point. However, the coefficients of the power law function may vary for different release point locations. Radioactive decay causes the effective dilution factors to decrease more quickly with the time past the release event than the dilution factors based on a non-decaying tracer. An analytical expression for the dilution factors of isotopes with different half-lives can be developed given the power law expression
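
    A sketch of the power-law fit mentioned above: regressing the logarithm of the median dilution factor on the logarithm of time after release gives the power-law coefficients. The numbers below are synthetic placeholders, not HYSPLIT output:

    import numpy as np

    # Hours after release and synthetic median dilution factors (illustrative only).
    hours = np.array([6.0, 12.0, 24.0, 48.0, 96.0])
    median_df = np.array([2.0e-7, 8.0e-8, 3.0e-8, 1.1e-8, 4.0e-9])

    # Fit D(t) = a * t**b by linear regression in log-log space.
    b, log_a = np.polyfit(np.log(hours), np.log(median_df), 1)
    a = np.exp(log_a)
    print(f"D(t) ~ {a:.3e} * t**{b:.2f}")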

  14. HOW TO REPRESENT THE GENETIC CODE?

    Directory of Open Access Journals (Sweden)

    N.S. Santos-Magalhães

    2004-05-01

    Full Text Available The advent of molecular genetics comprises a true revolution of far-reaching consequences for humankind, which evolved into a specialized branch of modern-day Biochemistry. The analysis of specific genomic information is gaining wide-ranging interest because of its significance to the early diagnosis of disease and the discovery of modern drugs. In order to take advantage of a wide assortment of signal processing (SP) algorithms, the primary step of modern genomic SP involves converting symbolic DNA sequences into complex-valued signals. How to represent the genetic code? Despite being extensively known, the DNA mapping into proteins is one of the relevant discoveries of genetics. The genetic code (GC) is revisited in this work, addressing other descriptions for it, which can be worthy for genomic SP. Three original representations are discussed. The inner-to-outer map builds on the unbalanced role of nucleotides of a codon. A two-dimensional Gray genetic representation is offered as a structured map that can help interpret DNA spectrograms or scalograms. These are among the powerful visual tools for genome analysis, which depend on the choice of the genetic mapping. Finally, the world-chart for the GC is investigated. Evoking the cyclic structure of the genetic mapping, it can be folded joining the left-right borders and the top-bottom frontiers. As a result, the GC can be drawn on the surface of a sphere resembling a world-map. Eight parallels of latitude are required (four in each hemisphere) as well as four meridians of longitude associated to four corresponding anti-meridians. The tropic circles are at 11.25°, 33.75°, 56.25°, and 78.75° (North and South). Starting from an arbitrary Greenwich meridian, the meridians of longitude can be plotted at 22.5°, 67.5°, 112.5°, and 157.5° (East and West). Each triplet is assigned to a single point on the surface that we named the Nirenberg-Kohama Earth. Despite being valuable, usual representations for the GC can be
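
    As a simple illustration of the symbolic-DNA-to-complex-signal step mentioned above, the sketch below uses one common mapping of the four nucleotides to complex values; this is an illustrative choice, not the inner-to-outer or Gray representations proposed in the paper:

    import numpy as np

    # One common nucleotide-to-complex mapping (an illustrative choice only).
    MAP = {"A": 1 + 1j, "C": -1 + 1j, "G": -1 - 1j, "T": 1 - 1j}

    def dna_to_signal(seq):
        """Convert a symbolic DNA string into a complex-valued signal."""
        return np.array([MAP[base] for base in seq.upper()])

    signal = dna_to_signal("ATGCGTAC")
    spectrum = np.fft.fft(signal)      # ready for spectrogram-style analysis
    print(signal)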

  15. TECHNICAL BASIS DOCUMENT FOR THE ABOVE GROUND TANK FAILURE REPRESENTATIVE ACCIDENT & ASSOCIATED REPRESENTED HAZARDOUS CONDITIONS

    Energy Technology Data Exchange (ETDEWEB)

    MARCHESE, A.R.

    2005-03-03

    This document analyzes aboveground tank failure accident scenarios for the Demonstration Bulk Vitrification system (DBVS). The radiological and toxicological consequences are determined for a range of aboveground tank failure accident scenarios to determine the representative accident for DBVS. Based on the consequence results and accident frequency evaluations, risk bins are determined and control decisions are made.

  16. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection in high-dimensional problems and in settings such as HIV/AIDS that involve many competing risks.

  17. Impact of Flavonols on Cardiometabolic Biomarkers:  A Meta‐Analysis of Randomized Controlled Human  Trials to Explore the Role of Inter‐Individual  Variability

    Directory of Open Access Journals (Sweden)

    Regina Menezes

    2017-02-01

    Full Text Available Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08) and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to healthy participants and those with normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk, however, country of

  18. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  19. Sampling intraspecific variability in leaf functional traits: Practical suggestions to maximize collected information.

    Science.gov (United States)

    Petruzzellis, Francesco; Palandrani, Chiara; Savi, Tadeja; Alberti, Roberto; Nardini, Andrea; Bacaro, Giovanni

    2017-12-01

    The choice of the best sampling strategy to capture mean values of functional traits for a species/population, while maintaining information about trait variability and minimizing the sampling size and effort, is an open issue in functional trait ecology. Intraspecific variability (ITV) of functional traits strongly influences sampling size and effort. However, while adequate information is available about intraspecific variability between individuals (ITVBI) and among populations (ITVPOP), relatively few studies have analyzed intraspecific variability within individuals (ITVWI). Here, we provide an analysis of ITVWI of two foliar traits, namely specific leaf area (SLA) and osmotic potential (π), in a population of Quercus ilex L. We assessed the baseline ITVWI level of variation between the two traits and provided the minimum and optimal sampling size in order to take into account ITVWI, comparing sampling optimization outputs with those previously proposed in the literature. Different factors accounted for different amounts of variance in the two traits. SLA variance was mostly spread within individuals (43.4% of the total variance), while π variance was mainly spread between individuals (43.2%). Strategies that did not account for all the canopy strata produced mean values not representative of the sampled population. The minimum size to adequately capture the studied functional traits corresponded to 5 leaves taken randomly from 5 individuals, while the most accurate and feasible sampling size was 4 leaves taken randomly from 10 individuals. We demonstrate that the spatial structure of the canopy can significantly affect trait variability. Moreover, different strategies for different traits could be implemented during sampling surveys. We partially confirm sampling sizes previously proposed in the recent literature and encourage future analysis involving different traits.

  20. Variability of snow depth at the plot scale: implications for mean depth estimation and sampling strategies

    Directory of Open Access Journals (Sweden)

    J. I. López-Moreno

    2011-08-01

    Full Text Available Snow depth variability over small distances can affect the representativeness of depth samples taken at the local scale, which are often used to assess the spatial distribution of snow at regional and basin scales. To assess spatial variability at the plot scale, intensive snow depth sampling was conducted during January and April 2009 in 15 plots in the Rio Ésera Valley, central Spanish Pyrenees Mountains. Each plot (10 × 10 m; 100 m²) was subdivided into a grid of 1 m² squares; sampling at the corners of each square yielded a set of 121 data points that provided an accurate measure of snow depth in the plot (considered as ground truth). The spatial variability of snow depth was then assessed using sampling locations randomly selected within each plot. The plots were highly variable, with coefficients of variation up to 0.25. This indicates that to improve the representativeness of snow depth sampling in a given plot the snow depth measurements should be increased in number and averaged when spatial heterogeneity is substantial.

    Snow depth distributions were simulated at the same plot scale under varying levels of standard deviation and spatial autocorrelation, to enable the effect of each factor on snowpack representativeness to be established. The results showed that the snow depth estimation error increased markedly as the standard deviation increased. The results indicated that, in general, at least five snow depth measurements should be taken in each plot to ensure that the estimation error is <10 %; this applied even under highly heterogeneous conditions. In terms of the spatial configuration of the measurements, the sampling strategy did not affect the snow depth estimate in the absence of spatial autocorrelation. However, with high spatial autocorrelation a smaller error was obtained when the distance between measurements was greater.
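
    A Monte Carlo sketch of the sample-size question above; the plot values are synthetic, drawn with a coefficient of variation of 0.25 as reported for the most variable plots:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic 11 x 11 grid of snow depths for one plot (mean 1 m, CV = 0.25).
    truth = rng.normal(loc=1.0, scale=0.25, size=(11, 11))
    true_mean = truth.mean()

    # Relative error of the plot-mean estimate for n randomly placed probes.
    for n in (3, 5, 10, 20):
        errs = []
        for _ in range(5000):
            sample = rng.choice(truth.ravel(), size=n, replace=False)
            errs.append(abs(sample.mean() - true_mean) / true_mean)
        print(f"n = {n:2d}: mean relative error = {100 * np.mean(errs):.1f} %")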

  1. Generalised extreme value statistics and sum of correlated variables

    OpenAIRE

    Bertin, Eric; Clusel, Maxime

    2006-01-01

    To appear in J.Phys.A; We show that generalised extreme value statistics (the statistics of the k-th largest value among a large set of random variables) can be mapped onto a problem of random sums. This allows us to identify classes of non-identical and (generally) correlated random variables with a sum distributed according to one of the three (k-dependent) asymptotic distributions of extreme value statistics, namely the Gumbel, Frechet and Weibull distributions. These classes, as well as t...
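
    A quick numerical illustration of the classical limit being generalised here: maxima (the k = 1 case) of many i.i.d. exponential variables follow a Gumbel law after centring. This is a textbook special case, not the correlated-sum construction of the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 1000, 5000

    # Maximum (k = 1 largest value) of n i.i.d. exponential random variables.
    maxima = rng.exponential(size=(trials, n)).max(axis=1)

    # After centring by log(n), the maxima follow a standard Gumbel distribution,
    # whose mean is the Euler-Mascheroni constant (about 0.5772).
    centred = maxima - np.log(n)
    print("sample mean:", centred.mean(), " Gumbel mean: 0.5772")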

  2. Generating Gamma and Cauchy Random Variables: An Extension to the Naval Postgraduate School Random Number Package

    Science.gov (United States)

    1975-04-01

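    The report referenced above concerns routines for generating Gamma and Cauchy variates. A minimal sketch of two standard generators, assuming inverse-CDF sampling for the Cauchy and the Marsaglia-Tsang method for the Gamma (not necessarily the algorithms of the Naval Postgraduate School package):

    import math
    import random

    def cauchy(loc=0.0, scale=1.0):
        """Inverse-CDF method for the Cauchy distribution."""
        u = random.random()
        return loc + scale * math.tan(math.pi * (u - 0.5))

    def gamma(shape):
        """Marsaglia-Tsang squeeze method for Gamma(shape, 1), shape >= 1."""
        d = shape - 1.0 / 3.0
        c = 1.0 / math.sqrt(9.0 * d)
        while True:
            x = random.gauss(0.0, 1.0)
            v = (1.0 + c * x) ** 3
            if v <= 0.0:
                continue
            u = random.random()
            if math.log(u) < 0.5 * x * x + d - d * v + d * math.log(v):
                return d * v

    print(cauchy(), gamma(2.5))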

  3. Phenotypic variability of cat-eye syndrome

    NARCIS (Netherlands)

    Berends, MJW; Tan-Sindhunata, G; Leegte, B; Van Essen, AJ

    2001-01-01

    Cat-Eye syndrome (CES) is a disorder with a variable pattern of multiple congenital anomalies, of which coloboma of the iris and anal atresia are the best known. CES is cytogenetically characterised by the presence of an extra bisatellited marker chromosome, which represents an inverted dicentric

  4. THE STRUCTURE OF HAPPINESS REPRESENTATION FOR RUSSIAN AND AMERICAN REPRESENTATIVES

    Directory of Open Access Journals (Sweden)

    S. Yu. Zhdanova

    2017-01-01

    Full Text Available Introduction. The emotional state of students has a direct impact on their ability and readiness to cope with challenges in their studies, and underpins the success and effectiveness of the educational process. In this regard, the search for methods and the definition of the tasks of psychological diagnostics come into focus. Above all, against the background of growing international and interethnic mobility, teachers should consider the mentality and value attitudes of representatives of various cultures, including their understanding of happiness and personal well-being. Russian psychology has recently been developing in the direction of positive psychology, the focus of which is happiness and the positive functioning of the individual. Modern research reveals significant differences in indicators of happiness and satisfaction with life between representatives of different cultures. However, the diagnostic tools used in such studies are based primarily on the model of the image of happiness developed in American psychology. In this connection, the question arises as to what extent the image of happiness in American culture correlates with the image of happiness in Russian culture. The aim of this work is to study the representation of happiness among representatives of American and Russian culture, and to identify invariant and variable components in the structure of that representation. Methodology and research methods. The study included several stages. At the first stage, a theoretical analysis was conducted and an ontology of the subject area "Psychology of Happiness" was developed. At the second stage, an empirical study of the representations of American and Russian respondents was carried out. The main method of data collection was a narrative interview; a method of early personal memories was used to obtain the narrative of happiness. Subsequent processing of verbal

  5. Random walks in a random environment

    Indian Academy of Sciences (India)

    Random Walks in a Random Environment. S R S Varadhan. Invited Articles, Volume 114, Issue 4. Author Affiliations: S R S Varadhan, Department of Mathematics, Courant Institute of Mathematical Sciences, New York University, NY 10012, USA

  6. The nebular variables

    CERN Document Server

    Glasby, John S

    1974-01-01

    The Nebular Variables focuses on the nebular variables and their characteristics. Discussions are organized by type of nebular variable, namely, RW Aurigae stars, T Orionis stars, T Tauri stars, and peculiar nebular objects. Topics range from light variations of the stars to their spectroscopic and physical characteristics, spatial distribution, interaction with nebulosity, and evolutionary features. This volume is divided into four sections and consists of 25 chapters, the first of which provides general information on nebular variables, including their stellar associations and their classifi

  7. Random walks on random Koch curves

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, S; Hoffmann, K H [Institut fuer Physik, Technische Universitaet, D-09107 Chemnitz (Germany); Essex, C [Department of Applied Mathematics, University of Western Ontario, London, ON N6A 5B7 (Canada)

    2009-06-05

    Diffusion processes in porous materials are often modeled as random walks on fractals. In order to capture the randomness of the materials random fractals are employed, which no longer show the deterministic self-similarity of regular fractals. Finding a continuum differential equation describing the diffusion on such fractals has been a long-standing goal, and we address the question of whether the concepts developed for regular fractals are still applicable. We use the random Koch curve as a convenient example as it provides certain technical advantages by its separation of time and space features. While some of the concepts developed for regular fractals can be used unaltered, others have to be modified. Based on the concept of fibers, we introduce ensemble-averaged density functions which produce a differentiable estimate of probability explicitly and compare it to random walk data.

  8. Sexual desire in a nationally representative Danish population.

    Science.gov (United States)

    Eplov, Lene; Giraldi, Annamaria; Davidsen, Michael; Garde, Karin; Kamper-Jørgensen, Finn

    2007-01-01

    There are only a few studies on the frequency of sexual desire in the general population, whereas studies investigating the frequency of disordered sexual desire are more common. The aim of this study was to describe the frequency of sexual desire in a representative sample of the adult Danish population and to analyze the relationships between a number of relevant variables and sexual desire. The study population (N = 10,458, response rate 84.8%) answered a questionnaire with questions on sexual matters. The representativity of the population was examined. The frequency of self-reported sexual desire and decrease in sexual desire over a 5-year period was calculated for the two genders across age cohorts. Multiple logistic regression analysis was used to analyze the relationship between potential determinants and sexual desire. The frequency of self-reported sexual desire and decrease in sexual desire was examined. Factors of importance for sexual desire were tested using two outcome measures: (i) often having sexual desire; and (ii) seldom having sexual desire. A significant association between gender and sexual desire was found in all age groups, as men had a significantly higher level of sexual desire than women. In both genders, the frequency of sexual desire was significantly reduced with increasing age. Among the 45- to 66-year-olds, 57% of the men and 47% of the women reported no change in the level of sexual desire over the past 5 years. In general terms, factors related to seldom having sexual desire were age and social, psychological, and physical distress in both genders. This study shows that overall, men have a higher level of sexual desire than women; sexual desire decreases with increasing age; and social, psychological, or physical distress are associated with low level of sexual desire in both genders.

  9. Representing the dosimetric impact of deformable image registration errors

    Science.gov (United States)

    Vickress, Jason; Battista, Jerry; Barnett, Rob; Yartsev, Slav

    2017-09-01

    Deformable image registration (DIR) is emerging as a tool in radiation therapy for calculating the cumulative dose distribution across multiple fractions of treatment. Unfortunately, due to the variable nature of DIR algorithms and dependence of performance on image quality, registration errors can result in dose accumulation errors. In this study, landmarked images were used to characterize the DIR error throughout an image space and determine its impact on dosimetric analysis. Ten thoracic 4DCT images with 300 landmarks per image study matching the end-inspiration and end-expiration phases were obtained from ‘dir-labs’. DIR was performed using commercial software MIM Maestro. The range of dose uncertainty (RDU) was calculated at each landmark pair as the maximum and minimum of the doses within a sphere around the landmark in the end-expiration phase. The radius of the sphere was defined by a measure of DIR error which included either the actual DIR error, mean DIR error per study, constant errors of 2 or 5 mm, inverse consistency error, transitivity error or the distance discordance metric (DDM). The RDUs were evaluated using the magnitude of dose uncertainty (MDU) and inclusion rate (IR) of actual error lying within the predicted RDU. The RDU was calculated for 300 landmark pairs on each 4DCT study for all measures of DIR error. The most representative RDU was determined using the actual DIR error with a MDU of 2.5 Gy and IR of 97%. Across all other measures of DIR error, the DDM was most predictive with a MDU of 2.5 Gy and IR of 86%, closest to the actual DIR error. The proposed method represents the range of dosimetric uncertainty of DIR error using either landmarks at specific voxels or measures of registration accuracy throughout the volume.
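
    A sketch of the range-of-dose-uncertainty computation described above; the dose grid, landmark position and error radius are synthetic, and isotropic 1 mm voxels are assumed:

    import numpy as np

    rng = np.random.default_rng(0)
    dose = rng.uniform(0.0, 60.0, size=(50, 50, 50))   # synthetic dose grid, Gy

    def range_of_dose_uncertainty(dose, landmark, radius_mm, spacing_mm=1.0):
        """Min and max dose within a sphere of radius_mm around the landmark voxel."""
        zz, yy, xx = np.indices(dose.shape)
        dist = spacing_mm * np.sqrt((zz - landmark[0]) ** 2 +
                                    (yy - landmark[1]) ** 2 +
                                    (xx - landmark[2]) ** 2)
        in_sphere = dose[dist <= radius_mm]
        return in_sphere.min(), in_sphere.max()

    lo, hi = range_of_dose_uncertainty(dose, landmark=(25, 25, 25), radius_mm=3.0)
    print(f"RDU at landmark: {lo:.1f} Gy to {hi:.1f} Gy")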

  10. A strategy that iteratively retains informative variables for selecting optimal variable subset in multivariate calibration.

    Science.gov (United States)

    Yun, Yong-Huan; Wang, Wei-Ting; Tan, Min-Li; Liang, Yi-Zeng; Li, Hong-Dong; Cao, Dong-Sheng; Lu, Hong-Mei; Xu, Qing-Song

    2014-01-07

    Nowadays, with the high dimensionality of datasets, the creation of effective methods that can select an optimal variable subset is a great challenge. In this study, a strategy that considers possible interaction effects among variables through random combinations was proposed, called iteratively retaining informative variables (IRIV). Moreover, the variables are classified into four categories: strongly informative, weakly informative, uninformative and interfering variables. On this basis, IRIV retains both the strongly and weakly informative variables in every iterative round until no uninformative and interfering variables remain. Three datasets were employed to investigate the performance of IRIV coupled with partial least squares (PLS). The results show that IRIV is a good alternative variable selection strategy when compared with three outstanding and frequently used variable selection methods such as genetic algorithm-PLS, Monte Carlo uninformative variable elimination by PLS (MC-UVE-PLS) and competitive adaptive reweighted sampling (CARS). The MATLAB source code of IRIV can be freely downloaded for academic research at the website: http://code.google.com/p/multivariate-calibration/downloads/list. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Registration Appointment and Services for Representatives Management Information

    Data.gov (United States)

    Social Security Administration — A new internet/intranet application that collects all representative information and establishes the relationship between the claimant and the representative. Allow...

  12. The weighted random graph model

    Science.gov (United States)

    Garlaschelli, Diego

    2009-07-01

    We introduce the weighted random graph (WRG) model, which represents the weighted counterpart of the Erdos-Renyi random graph and provides fundamental insights into more complicated weighted networks. We find analytically that the WRG is characterized by a geometric weight distribution, a binomial degree distribution and a negative binomial strength distribution. We also characterize exactly the percolation phase transitions associated with edge removal and with the appearance of weighted subgraphs of any order and intensity. We find that even this completely null model displays a percolation behaviour similar to what is observed in real weighted networks, implying that edge removal cannot be used to detect community structure empirically. By contrast, the analysis of clustering successfully reveals different patterns between the WRG and real networks.
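
    A sketch of sampling from the WRG model as described above, drawing a geometric weight for every node pair; the parameters are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100, 0.1          # number of nodes and WRG parameter (illustrative)

    # In the WRG each pair (i, j) gets weight w with probability p**w * (1 - p),
    # i.e. a geometric distribution on {0, 1, 2, ...}; w = 0 means "no edge".
    W = np.zeros((n, n), dtype=int)
    iu = np.triu_indices(n, k=1)
    W[iu] = rng.geometric(1.0 - p, size=len(iu[0])) - 1   # shift support to start at 0
    W = W + W.T

    degrees = (W > 0).sum(axis=1)    # binomial in the WRG
    strengths = W.sum(axis=1)        # negative binomial in the WRG
    print("mean degree:", degrees.mean(), " mean strength:", strengths.mean())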

  13. Metabolic variability in micro-populations.

    Directory of Open Access Journals (Sweden)

    Yuval Elhanati

    Full Text Available Biological cells in a population are variable in practically every property. Much is known about how variability of single cells is reflected in the statistical properties of infinitely large populations; however, many biologically relevant situations entail finite times and intermediate-sized populations. The statistical properties of an ensemble of finite populations then come into focus, raising questions concerning inter-population variability and dependence on initial conditions. Recent technologies of microfluidic and microdroplet-based population growth realize these situations and make them immediately relevant for experiments and biotechnological application. We here study the statistical properties, arising from metabolic variability of single cells, in an ensemble of micro-populations grown to saturation in a finite environment such as a micro-droplet. We develop a discrete stochastic model for this growth process, describing the possible histories as a random walk in a phenotypic space with an absorbing boundary. Using a mapping to Polya's Urn, a classic problem of probability theory, we find that distributions approach a limiting inoculum-dependent form after a large number of divisions. Thus, population size and structure are random variables whose mean, variance and in general their distribution can reflect initial conditions after many generations of growth. Implications of our results to experiments and to biotechnology are discussed.

  14. Metabolic variability in micro-populations.

    Science.gov (United States)

    Elhanati, Yuval; Brenner, Naama

    2012-01-01

    Biological cells in a population are variable in practically every property. Much is known about how variability of single cells is reflected in the statistical properties of infinitely large populations; however, many biologically relevant situations entail finite times and intermediate-sized populations. The statistical properties of an ensemble of finite populations then come into focus, raising questions concerning inter-population variability and dependence on initial conditions. Recent technologies of microfluidic and microdroplet-based population growth realize these situations and make them immediately relevant for experiments and biotechnological application. We here study the statistical properties, arising from metabolic variability of single cells, in an ensemble of micro-populations grown to saturation in a finite environment such as a micro-droplet. We develop a discrete stochastic model for this growth process, describing the possible histories as a random walk in a phenotypic space with an absorbing boundary. Using a mapping to Polya's Urn, a classic problem of probability theory, we find that distributions approach a limiting inoculum-dependent form after a large number of divisions. Thus, population size and structure are random variables whose mean, variance and in general their distribution can reflect initial conditions after many generations of growth. Implications of our results to experiments and to biotechnology are discussed.
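
    A sketch of the Polya urn mapping mentioned in both records: the long-run fraction of one type converges, but its limit is random and its distribution depends on the initial composition (the inoculum). The values below are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)

    def polya_fraction(red0, blue0, draws=2000):
        """Final fraction of red balls after 'draws' reinforcement steps."""
        red, blue = red0, blue0
        for _ in range(draws):
            if rng.random() < red / (red + blue):
                red += 1      # the drawn ball is returned together with a copy
            else:
                blue += 1
        return red / (red + blue)

    for inoculum in [(1, 1), (5, 1)]:
        fracs = [polya_fraction(*inoculum) for _ in range(2000)]
        print(inoculum, "mean", round(np.mean(fracs), 3), "sd", round(np.std(fracs), 3))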

  15. Analysis and Computation of Acoustic and Elastic Wave Equations in Random Media

    KAUST Repository

    Motamed, Mohammad

    2014-01-06

    We propose stochastic collocation methods for solving the second order acoustic and elastic wave equations in heterogeneous random media and subject to deterministic boundary and initial conditions [1, 4]. We assume that the medium consists of non-overlapping sub-domains with smooth interfaces. In each sub-domain, the material coefficients are smooth and given or approximated by a finite number of random variables. One important example is wave propagation in multi-layered media with smooth interfaces. The numerical scheme consists of a finite difference or finite element method in the physical space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space. We provide a rigorous convergence analysis and demonstrate different types of convergence of the probability error with respect to the number of collocation points under some regularity assumptions on the data. In particular, we show that, unlike in elliptic and parabolic problems [2, 3], the solution to hyperbolic problems is not in general analytic with respect to the random variables. Therefore, the rate of convergence is only algebraic. A fast spectral rate of convergence is still possible for some quantities of interest and for the wave solutions with particular types of data. We also show that the semi-discrete solution is analytic with respect to the random variables with the radius of analyticity proportional to the grid/mesh size h. We therefore obtain an exponential rate of convergence which deteriorates as the quantity h p gets smaller, with p representing the polynomial degree in the stochastic space. We have shown that analytical results and numerical examples are consistent and that the stochastic collocation method may be a valid alternative to the more traditional Monte Carlo method. Here we focus on the stochastic acoustic wave equation. Similar results are obtained for stochastic elastic equations.
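
    A one-dimensional cartoon of the collocation idea, assuming a toy ODE with a single uniform random coefficient and Gauss-Legendre nodes; this illustrates collocation in the probability space only, not the wave-equation solver of the paper:

    import numpy as np

    # Toy problem: u'(t) = -k(y) u(t), u(0) = 1, with k(y) = 1 + 0.5 y, y ~ U(-1, 1).
    # Quantity of interest: E[u(T)] at T = 1.
    T = 1.0

    def qoi(y):
        return np.exp(-(1.0 + 0.5 * y) * T)     # exact solution of the toy ODE

    # Stochastic collocation: evaluate the QoI at Gauss-Legendre nodes in y and
    # combine with the quadrature weights (factor 1/2 is the U(-1, 1) density).
    nodes, weights = np.polynomial.legendre.leggauss(5)
    collocation = 0.5 * np.sum(weights * qoi(nodes))

    # Monte Carlo reference for comparison.
    rng = np.random.default_rng(0)
    mc = qoi(rng.uniform(-1.0, 1.0, 100000)).mean()

    print("collocation:", collocation, " Monte Carlo:", mc)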

  16. INTER-EXAMINER VARIABILITY

    African Journals Online (AJOL)

    Background: The traditional clinical examination has fallen into disfavour on account of considerable inter-examiner variability. The OSCE is gaining popularity as it is perceived to be less prone to this. Objective: To establish whether inter-examiner variability is still a significant factor for the undergraduate orthopaedic ...

  17. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  18. Microinertia and internal variables

    CERN Document Server

    Berezovski, A

    2015-01-01

    The origin of microinertia of micromorphic theories is investigated from the point of view of non-equilibrium thermodynamics. In the framework of dual internal variables microinertia stems from a thermodynamic equation of state related to the internal variable with the properties of mechanical momentum.

  19. Variable volume combustor

    Science.gov (United States)

    Ostebee, Heath Michael; Ziminsky, Willy Steve; Johnson, Thomas Edward; Keener, Christopher Paul

    2017-01-17

    The present application provides a variable volume combustor for use with a gas turbine engine. The variable volume combustor may include a liner, a number of micro-mixer fuel nozzles positioned within the liner, and a linear actuator so as to maneuver the micro-mixer fuel nozzles axially along the liner.

  20. Long-term results of breast conserving surgery vs. mastectomy for early stage invasive breast cancer: 20-year follow-up of the Danish randomized DBCG-82TM protocol

    DEFF Research Database (Denmark)

    Blichert-Toft, M.; Nielsen, M.; During, M.

    2008-01-01

    outcome with BCS, and no evidence of disseminated disease. The patients accrued were grouped into three subsets: correctly randomized, suspicion of randomization error, and declining randomization. The main analyses focus on the subgroup of 793 correctly randomized patients representing 70...

  1. A Generalized Random Regret Minimization Model

    NARCIS (Netherlands)

    Chorus, C.G.

    2013-01-01

    This paper presents, discusses and tests a generalized Random Regret Minimization (G-RRM) model. The G-RRM model is created by replacing a fixed constant in the attribute-specific regret functions of the RRM model, by a regret-weight variable. Depending on the value of the regret-weights, the G-RRM
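
    A sketch of the regret functions involved, assuming the classical RRM attribute regret ln(1 + exp(beta (x_j - x_i))) with the fixed constant 1 replaced by a regret-weight gamma; variable names and data are illustrative:

    import numpy as np

    def grrm_probabilities(X, beta, gamma):
        """Choice probabilities of a G-RRM-style model.

        X     : (n_alternatives, n_attributes) attribute matrix
        beta  : (n_attributes,) taste parameters
        gamma : regret-weight replacing the fixed constant of classical RRM
        """
        n = X.shape[0]
        regret = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if j != i:
                    regret[i] += np.sum(np.log(gamma + np.exp(beta * (X[j] - X[i]))))
        expnr = np.exp(-regret)            # logit over negative regrets
        return expnr / expnr.sum()

    X = np.array([[2.0, 30.0], [3.0, 20.0], [4.0, 10.0]])   # e.g. cost, travel time
    beta = np.array([-0.8, -0.05])
    print(grrm_probabilities(X, beta, gamma=1.0))   # gamma = 1 recovers classical RRM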

  2. Prevalence and psychosocial risk factors associated with internet addiction in a nationally representative sample of college students in Taiwan.

    Science.gov (United States)

    Lin, Min-Pei; Ko, Huei-Chen; Wu, Jo Yung-Wei

    2011-12-01

    The aim of this study was to examine the prevalence of Internet addiction in a nationally representative sample of college students and to identify any associated psychosocial risk factors. The present study was constructed using a cross-sectional design with 3,616 participants. Participants were surveyed during the middle of the spring and fall semesters and recruited from colleges around Taiwan using stratified and cluster random sampling methods. Associations between Internet addiction and psychosocial risk factors were examined using stepwise logistic regression analysis. The prevalence of Internet addiction was found to be 15.3 percent (95 percent confidence interval, 14.1 percent to 16.5 percent). More depressive symptoms, higher positive outcome expectancy of Internet use, higher Internet usage time, lower refusal self-efficacy of Internet use, higher impulsivity, lower satisfaction with academic performance, being male, and insecure attachment style were positively correlated with Internet addiction. The prevalence of Internet addiction among college students in Taiwan was high, and the variables mentioned were independently predictive in the logistic regression analysis. This study can be used as a reference for policy making regarding the design of Internet addiction prevention programs and can also aid in the development of strategies designed to help Internet-addicted college students.

  3. Instrumental variable methods for causal inference.

    Science.gov (United States)

    Baiocchi, Michael; Cheng, Jing; Small, Dylan S

    2014-06-15

    A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment, and instead, an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration, which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. Copyright © 2014 John Wiley & Sons, Ltd.
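
    A minimal instrumental variables sketch on synthetic data (a Wald/two-stage least squares ratio): the instrument z affects the treatment d but affects the outcome y only through d, while the unmeasured confounder u affects both:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    u = rng.normal(size=n)                           # unmeasured confounder
    z = rng.binomial(1, 0.5, size=n).astype(float)   # valid instrument
    d = 0.8 * z + 0.9 * u + rng.normal(size=n)       # treatment
    y = 2.0 * d - 1.5 * u + rng.normal(size=n)       # outcome; true effect = 2.0

    # Naive OLS of y on d is biased by the confounder u.
    ols = np.cov(y, d)[0, 1] / np.var(d, ddof=1)

    # Instrumental variables (Wald ratio): cov(y, z) / cov(d, z).
    iv = np.cov(y, z)[0, 1] / np.cov(d, z)[0, 1]

    print(f"naive OLS: {ols:.2f}   IV estimate: {iv:.2f}   true effect: 2.00")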

  4. Random inbreeding, isonymy, and population isolates in Argentina.

    Science.gov (United States)

    Dipierri, José; Rodríguez-Larralde, Alvaro; Barrai, Italo; Camelo, Jorge López; Redomero, Esperanza Gutiérrez; Rodríguez, Concepción Alonso; Ramallo, Virginia; Bronberg, Rubén; Alfaro, Emma

    2014-07-01

    Population isolates are an important tool in identifying and mapping genes of Mendelian diseases and complex traits. The geographical identification of isolates represents a priority from a genetic and health care standpoint. The purpose of this study is to analyze the spatial distribution of consanguinity by random isonymy (FST) in Argentina and its relationship with the isolates previously identified in the country. FST was estimated from the surname distribution of 22.6 million electors registered for the year 2001 in the 24 provinces, 5 geographical regions, and 510 departments of the country. Statistically significant spatial clustering of FST was determined using the SaTScan V5.1 software. FST exhibited a marked regional and departmental variation, showing the highest values towards the North and West of Argentina. The clusters of high consanguinity by random isonymy followed the same distribution. Recognized Argentinean genetic isolates are mainly localized in the north of the country, in clusters of high inbreeding. Given the availability of listings of surnames in high-capacity storage devices for different countries, estimating FST from them can provide information on inbreeding for all levels of administrative subdivisions, to be used as a demographic variable for the identification of isolates within the country for public health purposes.
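
    A sketch of consanguinity estimation from surname frequencies, assuming the classical Crow-Mange relation in which random isonymy I is the sum of squared surname frequencies and the random inbreeding component is approximately I/4; the surname counts are illustrative:

    import numpy as np

    # Illustrative surname counts for one department (surname -> number of electors).
    counts = np.array([1200, 800, 650, 400, 150, 90, 60, 30, 20, 10], dtype=float)
    p = counts / counts.sum()

    # Random isonymy and the Crow-Mange style estimate of random inbreeding.
    I = np.sum(p ** 2)
    F_st = I / 4.0
    print(f"random isonymy I = {I:.4f}, random inbreeding FST = {F_st:.5f}")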

  5. Fuzzy conditional random fields for temporal data mining

    Science.gov (United States)

    Nurma Yulita, Intan; Setiawan Abdullah, Atje

    2017-10-01

    Temporal data mining is one of the interesting problems in computer science, and it has been applied in a wide variety of fields. What distinguishes temporal data mining from ordinary data mining is the use of a time variable. Therefore, the method used must be capable of processing time variables. Compared with other methods, conditional random fields have advantages in processing time variables. The method is an undirected graphical model that has been widely applied for segmenting and labelling sequence data in various domains. In this study, we proposed the use of fuzzy logic in conditional random fields to overcome problems of uncertainty. The experiment compared Fuzzy Conditional Random Fields, Conditional Random Fields, and Hidden Markov Models. The result showed that the accuracy of Fuzzy Conditional Random Fields is the best.

  6. The North Atlantic Oscillation: variability and interactions with the North Atlantic ocean and Artic sea ice

    Energy Technology Data Exchange (ETDEWEB)

    Jung, T.

    2000-07-01

    The North Atlantic Oscillation (NAO) represents the dominant mode of atmospheric variability in the North Atlantic region and describes the strengthening and weakening of the midlatitude westerlies. In this study, variability of the NAO during wintertime and its relationship to the North Atlantic ocean and Arctic sea ice is investigated. For this purpose, observational data are analyzed along with integrations of models for the Atlantic ocean, Arctic sea ice, and the coupled global climate system. From a statistical point of view, the observed NAO index shows unusually high variance on interdecadal time scales during the 20th century. Variability on other time scales is consistent with realizations of random processes (''white noise''). Recurrence of wintertime NAO anomalies from winter to winter, with missing signals during the intervening non-winter seasons, is primarily associated with interdecadal variability of the NAO. This recurrence indicates that low-frequency changes of the NAO during the 20th century were in part externally forced. (orig.)

  7. 48 CFR 1830.7002-3 - Representative investment calculations.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Representative investment... Representative investment calculations. (a) The calculation of the representative investment requires... accounting period, the contractor shall either: (1) Determine a representative investment for the cost...

  8. 20 CFR 266.7 - Accountability of a representative payee.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Accountability of a representative payee. 266.7 Section 266.7 Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD RETIREMENT ACT REPRESENTATIVE PAYMENT § 266.7 Accountability of a representative payee. (a) A representative...

  9. Random Response Forest for Privacy-Preserving Classification

    Directory of Open Access Journals (Sweden)

    Gábor Szűcs

    2013-01-01

    Full Text Available The paper deals with classification in privacy-preserving data mining. An algorithm, the Random Response Forest, is introduced constructing many binary decision trees, as an extension of Random Forest for privacy-preserving problems. Random Response Forest uses the Random Response idea among the anonymization methods, which instead of generalization keeps the original data, but mixes them. An anonymity metric is defined for undistinguishability of two mixed sets of data. This metric, the binary anonymity, is investigated and taken into consideration for optimal coding of the binary variables. The accuracy of Random Response Forest is presented at the end of the paper.
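
    A sketch of the randomized-response idea underlying the forest: each binary value is kept with probability q and flipped otherwise, yet the true proportion can still be recovered in aggregate; q and the data are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    n, q = 10000, 0.7                      # q = probability of reporting the true value
    truth = rng.binomial(1, 0.30, size=n)  # sensitive binary attribute, 30% prevalence

    # Random response: keep the value with probability q, flip it otherwise.
    keep = rng.random(n) < q
    reported = np.where(keep, truth, 1 - truth)

    # Unbiased estimate of the true prevalence from the mixed (anonymized) answers:
    # E[reported] = q * pi + (1 - q) * (1 - pi)  =>  pi = (mean - (1 - q)) / (2q - 1).
    pi_hat = (reported.mean() - (1 - q)) / (2 * q - 1)
    print(f"reported mean: {reported.mean():.3f}  recovered prevalence: {pi_hat:.3f}")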

  10. New specifications for exponential random graph models

    NARCIS (Netherlands)

    Snijders, Tom A. B.; Pattison, Philippa E.; Robins, Garry L.; Handcock, Mark S.; Stolzenberg, RM

    2006-01-01

    The most promising class of statistical models for expressing structural properties of social networks observed at one moment in time is the class of exponential random graph models (ERGMs), also known as p* models. The strong point of these models is that they can represent a variety of structural

  11. Dynamics of excitable nodes on random graphs

    Indian Academy of Sciences (India)

    ogy and dynamics of excitable nodes on Erdős–Rényi (ER) [16] random graphs. Our focus is on rhythmic dynamics, namely periodic solutions, in this representative model. Since the network topology plays an important role, the question of how different growth rules. DOI: 10.1007/s12043-011-0180-6; ePublication: 31 ...

  12. Random matrices, random processes and integrable systems

    CERN Document Server

    2011-01-01

    This book explores the remarkable connections between two domains that, a priori, seem unrelated: Random matrices (together with associated random processes) and integrable systems. The relations between random matrix models and the theory of classical integrable systems have long been studied. These appear mainly in the deformation theory, when parameters characterizing the measures or the domain of localization of the eigenvalues are varied. The resulting differential equations determining the partition function and correlation functions are, remarkably, of the same type as certain equations appearing in the theory of integrable systems. They may be analyzed effectively through methods based upon the Riemann-Hilbert problem of analytic function theory and by related approaches to the study of nonlinear asymptotics in the large N limit. Associated with studies of matrix models are certain stochastic processes, the "Dyson processes", and their continuum diffusion limits, which govern the spectrum in random ma...

  13. The effects of variable practice on locomotor adaptation to a novel asymmetric gait.

    Science.gov (United States)

    Hinkel-Lipsker, Jacob W; Hahn, Michael E

    2017-06-24

    Very little is known about the effects of specific practice on motor learning of predictive balance control during novel bipedal gait. This information could provide an insight into how the direction and magnitude of predictive errors during acquisition of a novel gait task influence transfer of balance control, as well as yield a practice protocol for the restoration of balance for those with locomotor impairments. This study examined the effect of a variable practice paradigm on transfer of a novel asymmetric gait pattern in able-bodied individuals. Using a split-belt treadmill, one limb was driven at a constant velocity (constant limb) and the other underwent specific changes in velocity (variable limb) during practice according to one of three prescribed practice paradigms: serial, where the variable limb velocity increased linearly; random blocked, where variable limb underwent random belt velocity changes every 20 strides; and random practice, where the variable limb underwent random step-to-step changes in velocity. Random practice showed the highest balance control variability during acquisition compared to serial and random blocked practice which demonstrated the best transfer of balance control on one transfer test. Both random and random blocked practices showed significantly less balance control variability during a second transfer test compared to serial practice. These results indicate that random blocked practice may be best for generalizability of balance control while learning a novel gait, perhaps, indicating that individuals who underwent this practice paradigm were able to find the most optimal balance control solution during practice.

  14. Rapidly variable relativistic absorption

    Science.gov (United States)

    Parker, M.; Pinto, C.; Fabian, A.; Lohfink, A.; Buisson, D.; Alston, W.; Jiang, J.

    2017-10-01

    I will present results from the 1.5Ms XMM-Newton observing campaign on the most X-ray variable AGN, IRAS 13224-3809. We find a series of nine absorption lines with a velocity of 0.24c from an ultra-fast outflow. For the first time, we are able to see extremely rapid variability of the UFO features, and can link this to the X-ray variability from the inner accretion disk. We find a clear flux dependence of the outflow features, suggesting that the wind is ionized by increasing X-ray emission.

  15. An Integrated Method to Analyze Farm Vulnerability to Climatic and Economic Variability According to Farm Configurations and Farmers' Adaptations.

    Science.gov (United States)

    Martin, Guillaume; Magne, Marie-Angélina; Cristobal, Magali San

    2017-01-01

    The need to adapt to decrease farm vulnerability to adverse contextual events has been extensively discussed on a theoretical basis. We developed an integrated and operational method to assess farm vulnerability to multiple and interacting contextual changes and explain how this vulnerability can best be reduced according to farm configurations and farmers' technical adaptations over time. Our method considers farm vulnerability as a function of the raw measurements of vulnerability variables (e.g., economic efficiency of production), the slope of the linear regression of these measurements over time, and the residuals of this linear regression. The last two are extracted from linear mixed models considering a random regression coefficient (an intercept common to all farms), a global trend (a slope common to all farms), a random deviation from the general mean for each farm, and a random deviation from the general trend for each farm. Among all possible combinations, the lowest farm vulnerability is obtained through a combination of high values of measurements, a stable or increasing trend and low variability for all vulnerability variables considered. Our method enables relating the measurements, trends and residuals of vulnerability variables to explanatory variables that illustrate farm exposure to climatic and economic variability, initial farm configurations and farmers' technical adaptations over time. We applied our method to 19 cattle (beef, dairy, and mixed) farms over the period 2008-2013. Selected vulnerability variables, i.e., farm productivity and economic efficiency, varied greatly among cattle farms and across years, with means ranging from 43.0 to 270.0 kg protein/ha and 29.4-66.0% efficiency, respectively. No farm had a high level, stable or increasing trend and low residuals for both farm productivity and economic efficiency of production. Thus, the least vulnerable farms represented a compromise among measurement value, trend, and variability of

  16. An Integrated Method to Analyze Farm Vulnerability to Climatic and Economic Variability According to Farm Configurations and Farmers’ Adaptations

    Science.gov (United States)

    Martin, Guillaume; Magne, Marie-Angélina; Cristobal, Magali San

    2017-01-01

    The need to adapt to decrease farm vulnerability to adverse contextual events has been extensively discussed on a theoretical basis. We developed an integrated and operational method to assess farm vulnerability to multiple and interacting contextual changes and explain how this vulnerability can best be reduced according to farm configurations and farmers’ technical adaptations over time. Our method considers farm vulnerability as a function of the raw measurements of vulnerability variables (e.g., economic efficiency of production), the slope of the linear regression of these measurements over time, and the residuals of this linear regression. The last two are extracted from linear mixed models considering a random regression coefficient (an intercept common to all farms), a global trend (a slope common to all farms), a random deviation from the general mean for each farm, and a random deviation from the general trend for each farm. Among all possible combinations, the lowest farm vulnerability is obtained through a combination of high values of measurements, a stable or increasing trend and low variability for all vulnerability variables considered. Our method enables relating the measurements, trends and residuals of vulnerability variables to explanatory variables that illustrate farm exposure to climatic and economic variability, initial farm configurations and farmers’ technical adaptations over time. We applied our method to 19 cattle (beef, dairy, and mixed) farms over the period 2008–2013. Selected vulnerability variables, i.e., farm productivity and economic efficiency, varied greatly among cattle farms and across years, with means ranging from 43.0 to 270.0 kg protein/ha and 29.4–66.0% efficiency, respectively. No farm had a high level, stable or increasing trend and low residuals for both farm productivity and economic efficiency of production. Thus, the least vulnerable farms represented a compromise among measurement value, trend, and
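
    A sketch of the linear mixed model described in both records, with a common intercept and trend plus per-farm random deviations from each; the use of statsmodels and the data-frame columns are illustrative assumptions:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)

    # Synthetic panel: 19 farms observed over six years (columns are hypothetical).
    farms = np.repeat(np.arange(19), 6)
    year = np.tile(np.arange(6), 19)
    farm_offset = np.repeat(rng.normal(0, 5, 19), 6)
    efficiency = 45 + 1.0 * year + farm_offset + rng.normal(0, 3, size=farms.size)
    df = pd.DataFrame({"farm": farms, "year": year, "efficiency": efficiency})

    # Random intercept and random slope (deviation from the global trend) per farm.
    model = smf.mixedlm("efficiency ~ year", df, groups=df["farm"], re_formula="~year")
    fit = model.fit()

    # Common intercept and trend, plus per-farm deviations used to rank vulnerability.
    print(fit.params[["Intercept", "year"]])
    print(fit.random_effects[0])        # deviations for the first farm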

  17. Current status of the HIBMC and results of representative diseases

    Science.gov (United States)

    Murakami, Masao; Demizu, Yusuke; Niwa, Yasue; Miyawaki, Daisuke; Terashima, Kazuki; Arimura, Takeshi; Mima, Masayuki; Nagayama, Shinichi; Maeda, Takuya; Baba, Masashi; Akagi, Takashi; Hishikawa, Yoshio; Abe, Mitsuyuki

    2009-07-01

    Proton radiotherapy (PRT) has spread since 1990, when 250 MeV proton beams with a rotating gantry were developed for medical use. Carbon-ion radiotherapy (CRT), which has both physical and biological advantages, is available at 4 facilities in the world, and HIBMC is the only facility able to use both particles. From April 2001 to December 2008, 2486 patients were treated: 2030 with PRT and 456 with CRT. Treatment of the head and neck (H&N: 405 patients), the lung (245), the liver (371), and prostatic carcinoma (1059) was a major focus. The 2-year local control rate is 72% in H&N (n = 163; T1: 9, T2: 18, T3: 36, T4: 79; malignant melanoma 48, adenoid cystic carcinoma 35, squamous cell carcinoma (SCC) 32, adenocarcinoma 14, others 34), 88% in lung (n = 116; T1: 59, T2: 42, T3: 4, T4: 6; SCC 30, adenocarcinoma 59, others 27), and 89% in liver cancer (n = 153; proton: 130, carbon: 23). Biochemical disease-free 3-year survival of 291 prostate cancer patients is 100% in the 9 patients with an initial prostate-specific antigen (PSA) level of 20 ng/ml. These results are excellent, comparable or superior to those of surgery. Particle therapy is thus a sophisticated form of radiotherapy; the main obstacle to its wider adoption is the high cost of construction and maintenance. Facilities at which both proton and carbon-ion beams can be used, including the HIBMC, have to investigate their differential use. We have started a randomized clinical trial comparing the two ion beams, and biological examinations in a project aiming at the development of laser-driven proton radiotherapy. We report on the current status of the HIBMC and the results for representative diseases.

  18. DATA COLLECTION METHOD FOR PEDESTRIAN MOVEMENT VARIABLES

    Directory of Open Access Journals (Sweden)

    Hajime Inamura

    2000-01-01

    Full Text Available The need for tools to design and evaluate pedestrian areas, subway stations, entrance halls, shopping malls, escape routes, stadiums, etc. leads to the necessity of a pedestrian model. One such approach is the Microscopic Pedestrian Simulation Model. To develop and calibrate a microscopic pedestrian simulation model, a number of variables need to be considered. As the first step of model development, data were collected using video, and the coordinates of each head path were extracted through image processing. Several variables can be gathered to describe pedestrian behavior from different points of view. This paper describes how to obtain, from video recording and simple image processing, variables that can represent the movement of pedestrians.
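
    A minimal sketch of how movement variables could be derived from the head-path coordinates mentioned above, assuming hypothetical (x, y) positions in metres tracked at a fixed frame rate; it is not the paper's image-processing pipeline.

        import numpy as np

        fps = 10.0                                          # assumed video frame rate
        head_path = np.array([[0.00, 0.00], [0.12, 0.01],   # hypothetical tracked positions (m)
                              [0.26, 0.04], [0.41, 0.09]])

        dt = 1.0 / fps
        step = np.diff(head_path, axis=0)                   # displacement between frames
        speed = np.linalg.norm(step, axis=1) / dt           # instantaneous speed (m/s)
        acceleration = np.diff(speed) / dt                  # change in speed (m/s^2)
        heading = np.degrees(np.arctan2(step[:, 1], step[:, 0]))  # walking direction (deg)
        print(speed, acceleration, heading)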

  19. Eternity Variables to Simulate Specifications

    NARCIS (Netherlands)

    Hesselink, WH; Boiten, EA; Moller, B

    2002-01-01

    Simulation of specifications is introduced as a unification and generalization of refinement mappings, history variables, forward simulations, prophecy variables, and backward simulations. Eternity variables are introduced as a more powerful alternative for prophecy variables and backward

  20. Stochastic finite element method for random harmonic analysis of composite plates with uncertain modal damping parameters

    Science.gov (United States)

    Sepahvand, K.

    2017-07-01

    Damping parameters of fiber-reinforced composites possess significant uncertainty due to the structural complexity of such materials. Considering the parameters as random variables, this paper uses the generalized polynomial chaos (gPC) expansion to capture the uncertainty in the damping and frequency response function of composite plate structures. A spectral stochastic finite element formulation for damped vibration analysis of laminate plates is employed. Experimental modal data for samples of plates are used to identify and realize the range and probability distributions of the uncertain damping parameters. The constructed gPC expansions for the uncertain parameters are used as inputs to a deterministic finite element model to realize random frequency responses at a small number of collocation points generated in the random space. The realizations are then employed to estimate the unknown deterministic functions of the gPC expansion approximating the responses. Employing the modal superposition method to solve the harmonic analysis problem yields an efficient sparse gPC expansion representing the responses. The results show that, while the responses are influenced by the damping uncertainties in the mid and high frequency ranges, the impact on low frequency modes can be safely ignored. Using only a few random collocation points, the method also shows very good agreement with sampling-based Monte Carlo simulations using a large number of realizations. Because the deterministic finite element model serves as a black-box solver, the procedure can be adopted efficiently, in terms of computational time, for complex structural systems with uncertain parameters.
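
    A minimal sketch of the non-intrusive gPC/collocation idea described above, with a toy single-degree-of-freedom frequency response standing in for the finite element solver and a single Gaussian damping ratio; the damping statistics, expansion order and toy solver are illustrative assumptions.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermegauss, hermevander

        def frf_amplitude(zeta, omega=1.0, omega_n=1.2):
            # Toy "black-box solver": |H(i*omega)| of a SDOF oscillator with damping ratio zeta.
            return 1.0 / np.sqrt((omega_n**2 - omega**2)**2 + (2.0 * zeta * omega_n * omega)**2)

        mu, sigma, order = 0.03, 0.005, 3                  # assumed damping mean/std and gPC order
        xi, _ = hermegauss(order + 1)                      # collocation points in standard normal space
        zeta = mu + sigma * xi                             # mapped to physical damping values

        y = np.array([frf_amplitude(z) for z in zeta])     # deterministic solves at the collocation points
        basis = hermevander(xi, order)                     # probabilists' Hermite polynomials He_0..He_3
        coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)

        # Orthogonality of He_n under N(0,1) gives the response moments directly.
        norms = np.array([factorial(n) for n in range(order + 1)], dtype=float)
        mean, variance = coeffs[0], float(np.sum(coeffs[1:]**2 * norms[1:]))
        print(mean, variance)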

  1. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. Search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B....../150) of the studies were imbalanced at the 0.05 level of probability for the two treatments and 13.3% (20/150) imbalanced at the 0.01 level in the randomization. It is suggested that there may exist misunderstanding of the concept and the misuse of randomization based on the review....
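
    One plausible reading of the imbalance check described above, sketched with hypothetical group sizes: under true 1:1 randomization, the number allocated to one arm follows Binomial(n, 0.5), so an exact binomial test flags implausible imbalance. This is an illustrative assumption, not the review's actual procedure or data.

        from scipy.stats import binomtest

        n_treatment, n_control = 83, 37                    # hypothetical arm sizes from one trial
        result = binomtest(n_treatment, n_treatment + n_control, p=0.5)
        print(result.pvalue)                               # small p-value: imbalance unlikely under 1:1 randomization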

  2. Variable-Rate Premiums

    Data.gov (United States)

    Pension Benefit Guaranty Corporation — These interest rates are used to value vested benefits for variable rate premium purposes as described in PBGC's regulation on Premium Rates (29 CFR Part 4006) and...

  3. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  4. Variable Attitude Test Stand

    Data.gov (United States)

    Federal Laboratory Consortium — The Variable Attitude Test Stand designed and built for testing of the V-22 tilt rotor aircraft propulsion system, is used to evaluate the effect of aircraft flight...

  5. A Note on the Correlated Random Coefficient Model

    DEFF Research Database (Denmark)

    Kolodziejczyk, Christophe

    In this note we derive the bias of the OLS estimator for a correlated random coefficient model with one random coefficient, but which is correlated with a binary variable. We provide set-identification to the parameters of interest of the model. We also show how to reduce the bias of the estimator...
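
    A small simulation of the situation the note analyses, under toy assumed distributions: when the individual coefficient on a binary variable is correlated with that variable, OLS recovers the effect among the treated rather than the population-average effect.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000
        u = rng.normal(size=n)
        beta_i = 1.0 + u                                   # heterogeneous (random) coefficient, mean 1
        d = (u + rng.normal(size=n) > 0).astype(float)     # binary regressor correlated with beta_i
        y = 0.5 + beta_i * d + rng.normal(size=n)

        X = np.column_stack([np.ones(n), d])
        ols_slope = np.linalg.lstsq(X, y, rcond=None)[0][1]
        print(ols_slope, beta_i.mean(), beta_i[d == 1].mean())
        # The OLS slope tracks E[beta_i | D = 1], which here exceeds the mean effect E[beta_i].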

  6. Stability in random Boolean cellular automata on the integer lattice

    NARCIS (Netherlands)

    A.C. Fey (Anne); L. van Driel; F.M. Dekking

    2010-01-01

    We consider random boolean cellular automata on the integer lattice, i.e., the cells are identified with the integers from 1 to $N$. The behaviour of the automaton is mainly determined by the support of the random variable that selects one of the sixteen possible Boolean rules,
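
    A minimal sketch of a random Boolean cellular automaton in the spirit of the record above: each cell draws one of the sixteen two-input Boolean rules (here uniformly, and with a wrap-around left/right neighbourhood, both assumptions) and applies it repeatedly to its neighbours' states.

        import numpy as np

        rng = np.random.default_rng(2)
        n_cells, n_steps = 32, 10

        # A two-input Boolean rule is a truth table over (left, right) in {0,1}^2,
        # i.e. 4 output bits, giving 16 possible rules; assign one per cell at random.
        rules = rng.integers(0, 2, size=(n_cells, 4))
        state = rng.integers(0, 2, size=n_cells)

        for _ in range(n_steps):
            left, right = np.roll(state, 1), np.roll(state, -1)
            idx = 2 * left + right                         # index into each cell's truth table
            state = rules[np.arange(n_cells), idx]
            print("".join(map(str, state)))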

  7. Coarsening at random: characterizations, conjectures and counter-examples

    NARCIS (Netherlands)

    Gill, R.D.; Laan, M.J. van der; Robins, J.M.

    1997-01-01

    The notion of coarsening at random (CAR) was introduced by Heitjan and Rubin to describe the most general form of randomly grouped, censored or missing data for which the coarsening mechanism can be ignored when making likelihood-based inference about the parameters of the distribution of the variable

  8. Geographic Information Systems to Assess External Validity in Randomized Trials.

    Science.gov (United States)

    Savoca, Margaret R; Ludwig, David A; Jones, Stedman T; Jason Clodfelter, K; Sloop, Joseph B; Bollhalter, Linda Y; Bertoni, Alain G

    2017-08-01

    To support claims that RCTs can reduce health disparities (i.e., are translational), it is imperative that methodologies exist to evaluate the tenability of external validity in RCTs when probabilistic sampling of participants is not employed. Typically, attempts at establishing post hoc external validity are limited to a few comparisons across convenience variables, which must be available in both sample and population. A Type 2 diabetes RCT was used as an example of a method that uses a geographic information system to assess external validity in the absence of an a priori probabilistic community-wide diabetes risk sampling strategy. A geographic information system, 2009-2013 county death certificate records, and 2013-2014 electronic medical records were used to identify community-wide diabetes prevalence. Color-coded diabetes density maps provided visual representation of these densities. A chi-square goodness-of-fit analysis tested the degree to which the distribution of RCT participants varied across density classes compared to what would be expected, given simple random sampling of the county population. Analyses were conducted in 2016. Diabetes prevalence areas as represented by death certificate and electronic medical records were distributed similarly. The simple random sample model was not a good fit for death certificate records (chi-square, 17.63; p=0.0001) or electronic medical record data (chi-square, 28.92; p<0.0001). Generally, RCT participants were oversampled in high-diabetes density areas. Location is a highly reliable "principal variable" associated with health disparities. It serves as a directly measurable proxy for high-risk underserved communities, thus offering an effective and practical approach for examining external validity of RCTs. Copyright © 2017 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
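
    A minimal sketch of the goodness-of-fit comparison described above, with hypothetical counts: observed RCT participants per diabetes-density class are tested against the counts expected if participants had been a simple random sample of the county population.

        import numpy as np
        from scipy.stats import chisquare

        observed = np.array([12, 25, 48, 65])              # participants per density class (hypothetical)
        county_share = np.array([0.35, 0.30, 0.20, 0.15])  # county population share per class (hypothetical)
        expected = county_share * observed.sum()

        stat, p = chisquare(observed, f_exp=expected)
        print(stat, p)    # a small p-value indicates sampling departed from the county-wide distribution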

  9. Calculus of one variable

    CERN Document Server

    Grossman, Stanley I

    1986-01-01

    Calculus of One Variable, Second Edition presents the essential topics in the study of the techniques and theorems of calculus.The book provides a comprehensive introduction to calculus. It contains examples, exercises, the history and development of calculus, and various applications. Some of the topics discussed in the text include the concept of limits, one-variable theory, the derivatives of all six trigonometric functions, exponential and logarithmic functions, and infinite series.This textbook is intended for use by college students.

  10. SLiMFinder: a probabilistic method for identifying over-represented, convergently evolved, short linear motifs in proteins.

    Directory of Open Access Journals (Sweden)

    Richard J Edwards

    Full Text Available BACKGROUND: Short linear motifs (SLiMs in proteins are functional microdomains of fundamental importance in many biological systems. SLiMs typically consist of a 3 to 10 amino acid stretch of the primary protein sequence, of which as few as two sites may be important for activity, making identification of novel SLiMs extremely difficult. In particular, it can be very difficult to distinguish a randomly recurring "motif" from a truly over-represented one. Incorporating ambiguous amino acid positions and/or variable-length wildcard spacers between defined residues further complicates the matter. METHODOLOGY/PRINCIPAL FINDINGS: In this paper we present two algorithms. SLiMBuild identifies convergently evolved, short motifs in a dataset of proteins. Motifs are built by combining dimers into longer patterns, retaining only those motifs occurring in a sufficient number of unrelated proteins. Motifs with fixed amino acid positions are identified and then combined to incorporate amino acid ambiguity and variable-length wildcard spacers. The algorithm is computationally efficient compared to alternatives, particularly when datasets include homologous proteins, and provides great flexibility in the nature of motifs returned. The SLiMChance algorithm estimates the probability of returned motifs arising by chance, correcting for the size and composition of the dataset, and assigns a significance value to each motif. These algorithms are implemented in a software package, SLiMFinder. SLiMFinder default settings identify known SLiMs with 100% specificity, and have a low false discovery rate on random test data. CONCLUSIONS/SIGNIFICANCE: The efficiency of SLiMBuild and low false discovery rate of SLiMChance make SLiMFinder highly suited to high throughput motif discovery and individual high quality analyses alike. Examples of such analyses on real biological data, and how SLiMFinder results can help direct future discoveries, are provided. SLiMFinder is freely

  11. A Randomized Controlled Trial of an Appearance-focused Intervention to Prevent Skin Cancer

    Science.gov (United States)

    Hillhouse, Joel; Turrisi, Rob; Stapleton, Jerod; Robinson, June

    2014-01-01

    BACKGROUND Skin cancer represents a significant health threat with over 1.3 million diagnoses, 8000 melanoma deaths, and more than $1 billion spent annually for skin cancer healthcare in the US. Despite findings from laboratory, case-control, and prospective studies that indicate a link between youthful indoor tanning (IT) and skin cancer, IT is increasing among US youth. Appearance-focused interventions represent a promising method to counteract these trends. METHODS A total of 430 female indoor tanners were randomized into intervention or no intervention control conditions. Intervention participants received an appearance-focused booklet based on decision-theoretical models of health behavior. Outcome variables included self-reports of IT behavior and intentions, as well as measures of cognitive mediating variables. RESULTS Normative increases in springtime IT rates were significantly lower (ie, over 35%) at 6-month follow-up in intervention versus control participants with similar reductions in future intentions. Mediation analyses revealed 6 cognitive variables (IT attitudes, fashion attitudes, perceived susceptibility to skin cancer and skin damage, subjective norms, and image norms) that significantly mediated change in IT behavior. CONCLUSIONS The appearance-focused intervention demonstrated strong effects on IT behavior and intentions in young indoor tanners. Appearance-focused approaches to skin cancer prevention need to present alternative behaviors as well as alter IT attitudes. Mediational results provide guides for strengthening future appearance-focused interventions directed at behaviors that increase risk of skin cancer. PMID:18937268

  12. Variability in response to albuminuria-lowering drugs

    DEFF Research Database (Denmark)

    Petrykiv, Sergei I; de Zeeuw, Dick; Persson, Frederik

    2017-01-01

    AIMS: Albuminuria-lowering drugs have shown different effect size in different individuals. Since urine albumin levels are known to vary considerably from day-to-day, we questioned whether the between-individual variability in albuminuria response after therapy initiation reflects a random...... variability or a true response variation to treatment. In addition, we questioned whether the response variability is drug dependent. METHODS: To determine whether the response to treatment is random or a true drug response, we correlated in six clinical trials the change in albuminuria during placebo...... or active treatment (on-treatment) with the change in albuminuria during wash-out (off-treatment). If these responses correlate during active treatment, it suggests that at least part of the response variability can be attributed to drug response variability. We tested this for enalapril, losartan...

  13. Color and Variability Characteristics of Point Sources in the Faint Sky Variability Survey

    OpenAIRE

    Huber, M.E.; Everett, M. E.; Howell, S. B.

    2006-01-01

    We present an analysis of the color and variability characteristics for point sources in the Faint Sky Variability Survey (FSVS). The FSVS cataloged ~23 square degrees in BVI filters from ~16--24 mag to investigate variability in faint sources at moderate to high Galactic latitudes. Point source completeness is found to be >83% for a selected representative sample (V=17.5--22.0 mag, B-V=0.0--1.5) containing both photometric B, V detections and 80% of the time-sampled V data available compared...

  14. MATERIAL SIGNATURE ORTHONORMAL MAPPING IN HYPERSPECTRAL UNMIXING TO ADDRESS ENDMEMBER VARIABILITY

    Directory of Open Access Journals (Sweden)

    Ali Jafari

    2016-03-01

    Full Text Available A new hyperspectral unmixing algorithm which considers endmember variability is presented. In the proposed algorithm, the endmembers are represented by correlated random vectors using the stochastic mixing model. Currently, there is no published theory for selecting the appropriate distribution for endmembers. The proposed algorithm first uses a linear transformation called material signature orthonormal mapping (MSOM), which transforms the endmembers into correlated Gaussian random vectors. The MSOM transformation reduces computational requirements by reducing the dimension and improves discrimination of endmembers by orthonormalizing the endmember mean vectors. In the original spectral space, the automated endmember bundles (AEB) method extracts a set of spectra (an endmember set) for each material. The mean vector and covariance matrix of each endmember are estimated directly from the endmember sets in the MSOM space. Second, a new maximum likelihood method, called NCM_ML, is proposed which estimates abundances in the MSOM space using the normal compositional model (NCM). The proposed algorithm is evaluated and compared with other state-of-the-art unmixing algorithms using simulated and real hyperspectral images. Experimental results demonstrate that the proposed unmixing algorithm can unmix pixels composed of similar endmembers in hyperspectral images in the presence of spectral variability more accurately than previous methods.

  15. Generating Realistic Labelled, Weighted Random Graphs

    CERN Document Server

    Davis, Michael Charles; Liu, Weiru; Miller, Paul; Hunter, Ruth; Kee, Frank

    2015-01-01

    Generative algorithms for random graphs have yielded insights into the structure and evolution of real-world networks. Most networks exhibit a well-known set of properties, such as heavy-tailed degree distributions, clustering and community formation. Usually, random graph models consider only structural information, but many real-world networks also have labelled vertices and weighted edges. In this paper, we present a generative model for random graphs with discrete vertex labels and numeric edge weights. The weights are represented as a set of Beta Mixture Models (BMMs) with an arbitrary number of mixtures, which are learned from real-world networks. We propose a Bayesian Variational Inference (VI) approach, which yields an accurate estimation while keeping computation times tractable. We compare our approach to state-of-the-art random labelled graph generators and an earlier approach based on Gaussian Mixture Models (GMMs). Our results allow us to draw conclusions about the contribution of vertex labels a...
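
    A minimal sketch of a labelled, weighted random graph in the spirit described above: an Erdős–Rényi backbone with discrete vertex labels and edge weights drawn from a two-component Beta mixture. The backbone, label set and mixture parameters are illustrative assumptions, not the learned BMMs or generative model of the paper.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(3)
        labels = ["A", "B", "C"]
        beta_mixture = [(0.7, 2.0, 5.0), (0.3, 8.0, 2.0)]  # (component weight, alpha, beta)

        def sample_edge_weight():
            pis = [w for w, _, _ in beta_mixture]
            _, a, b = beta_mixture[rng.choice(len(beta_mixture), p=pis)]
            return float(rng.beta(a, b))

        g = nx.gnp_random_graph(20, 0.15, seed=3)
        for v in g.nodes:
            g.nodes[v]["label"] = str(rng.choice(labels))
        for u, v in g.edges:
            g.edges[u, v]["weight"] = sample_edge_weight()
        print(g.number_of_edges(), list(g.edges(data=True))[:3])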

  16. VARIABLE SELECTION FOR CENSORED QUANTILE REGRESSION.

    Science.gov (United States)

    Wang, Huixia Judy; Zhou, Jianhui; Li, Yi

    2013-01-01

    Quantile regression has emerged as a powerful tool in survival analysis as it directly links the quantiles of patients' survival times to their demographic and genomic profiles, facilitating the identification of important prognostic factors. In view of the limited work on variable selection in this context, we develop a new adaptive-lasso-based variable selection procedure for quantile regression with censored outcomes. To account for random censoring for data with multivariate covariates, we employ the ideas of redistribution-of-mass and effective dimension reduction. Asymptotically our procedure enjoys model selection consistency, that is, identifying the true model with probability tending to one. Moreover, as opposed to the existing methods, our new proposal requires fewer assumptions, leading to more accurate variable selection. The analysis of a real cancer clinical trial demonstrates that our procedure can identify and distinguish important factors associated with patient sub-populations characterized by short or long survivals, which is of particular interest to oncologists.

  17. Nonlinear deterministic structures and the randomness of protein sequences

    CERN Document Server

    Huang Yan Zhao

    2003-01-01

    To clarify the randomness of protein sequences, we make a detailed analysis of a set of typical protein sequences representing each structural class using a nonlinear prediction method. No deterministic structures are found in these protein sequences, which implies that they behave as random sequences. We also give an explanation for the controversial results obtained in previous investigations.

  18. Representing Autonomous Systems Self-Confidence through Competency Boundaries

    Science.gov (United States)

    2015-01-01

    probabilistic road mapping and rapidly exploring random trees (RRT) are investigated for scenarios that could result in mission failure due to incomplete... For RRTs, random nodes are generated and a branch is formed from the nearest node already on the tree and extends out towards the new random node. The algorithm builds local paths from the new branches until a global solution is found. RRT* is an extension of RRT that continuously
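
    A minimal 2-D sketch of the RRT growth rule quoted above: sample a random point, find the nearest node already on the tree, and extend a short branch towards the sample. Obstacle checking and the RRT* rewiring step are omitted, and all parameters are illustrative.

        import math
        import random

        def rrt(start, goal, step=0.5, iters=2000, goal_tol=0.5, lo=0.0, hi=10.0):
            nodes, parent = [start], {0: None}
            for _ in range(iters):
                sample = (random.uniform(lo, hi), random.uniform(lo, hi))
                i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
                d = math.dist(nodes[i], sample)
                if d > step:                               # extend a short branch towards the sample
                    x0, y0 = nodes[i]
                    sample = (x0 + step * (sample[0] - x0) / d, y0 + step * (sample[1] - y0) / d)
                nodes.append(sample)
                parent[len(nodes) - 1] = i
                if math.dist(sample, goal) < goal_tol:     # global solution found; walk back to the root
                    path, j = [], len(nodes) - 1
                    while j is not None:
                        path.append(nodes[j])
                        j = parent[j]
                    return path[::-1]
            return None

        print(rrt((0.0, 0.0), (9.0, 9.0)))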

  19. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.

  20. Blocked randomization with randomly selected block sizes.

    Science.gov (United States)

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
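
    A minimal sketch of the scheme described above: assignments are generated block by block, with each block's size drawn at random from a set of multiples of the number of arms (the sizes shown are illustrative assumptions) and treatments balanced within every block.

        import random

        def blocked_randomization(n_participants, arms=("treatment", "control"),
                                  block_sizes=(4, 6, 8), seed=42):
            rng = random.Random(seed)
            schedule = []
            while len(schedule) < n_participants:
                size = rng.choice(block_sizes)             # randomly selected block size
                block = list(arms) * (size // len(arms))   # equal numbers of each arm per block
                rng.shuffle(block)                         # unpredictable order within the block
                schedule.extend(block)
            return schedule[:n_participants]

        print(blocked_randomization(10))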