Reserves Represented by Random Walks
International Nuclear Information System (INIS)
Filipe, J A; Ferreira, M A M; Andrade, M
2012-01-01
The reserves problem is studied through models based on random walks, a classical special case in the analysis of stochastic processes. Random walks are not used only to study the evolution of reserves; they also serve as building blocks for more complex systems and as theoretical instruments for analysing other kinds of systems. In this work, the main objective of studying the reserves is to assess, and help guarantee, that pension funds remain sustainable. The use of such models for this purpose is a classical approach in the study of pension funds, and this work draws conclusions on the reserves problem. A concrete example is presented.
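The random-walk reserve model described above can be sketched in a few lines. The fragment below is an illustrative toy, not the authors' model: reserves follow a Gaussian random walk whose drift stands for net contributions minus benefits, and sustainability is probed by estimating the probability of ruin over a finite horizon. All parameter values are invented for the example.

```python
import random

def ruin_probability(r0, drift, sigma, horizon, n_paths, seed=0):
    """Estimate the probability that reserves ever drop below zero
    within `horizon` steps, for the Gaussian random walk
    R_{t+1} = R_t + drift + sigma * N(0, 1)."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_paths):
        r = r0
        for _ in range(horizon):
            r += drift + sigma * rng.gauss(0.0, 1.0)
            if r < 0:
                ruins += 1
                break
    return ruins / n_paths

# A positive drift (contributions exceeding benefits on average)
# keeps the ruin probability low for a well-funded scheme.
p = ruin_probability(r0=100.0, drift=1.0, sigma=5.0, horizon=200, n_paths=2000)
```

Varying `r0` and `drift` shows the qualitative trade-off the abstract alludes to: a fund that starts low or drifts downward is almost surely ruined.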
Directory of Open Access Journals (Sweden)
Anwer Khurshid
2012-07-01
Full Text Available In this paper, it is shown that a complex multivariate random variable is a complex multivariate normal random variable if and only if all nondegenerate complex linear combinations of it have a complex univariate normal distribution. Its characteristic function has been derived, and simpler forms of some theorems have been given using this characterization theorem without assuming that the variance-covariance matrix of the vector is Hermitian positive definite. Marginal distributions have also been given. In addition, a complex multivariate t-distribution has been defined and its density derived. A characterization of the complex multivariate t-distribution is given, and a few possible uses of this distribution have been suggested.
Strong Decomposition of Random Variables
DEFF Research Database (Denmark)
Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.
2007-01-01
A random variable X is strongly decomposable if X = Y + Z, where Y = Φ(X) and Z = X − Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability of a discrete random variable.
Free Random Variables
Voiculescu, Dan; Nica, Alexandru
1992-01-01
This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.
Contextuality in canonical systems of random variables
Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.
2017-10-01
Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
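The maximal coupling used in the abstract above has a very simple form for binary random variables: two Bernoulli variables with success probabilities p and q can be made to coincide with probability at most 1 minus their total-variation distance. The sketch below is an illustration of that notion, not code from the cited work.

```python
def maximal_coupling_agreement(p, q):
    """Largest achievable P(X = Y) over all couplings of two Bernoulli
    random variables with success probabilities p and q; equals
    1 - |p - q|, i.e. one minus the total-variation distance."""
    return min(p, q) + min(1.0 - p, 1.0 - q)

# Two dichotomizations of 'the same' content recorded in different
# contexts: identical marginals admit a coupling with agreement 1.
agree_same = maximal_coupling_agreement(0.5, 0.5)   # -> 1.0
agree_diff = maximal_coupling_agreement(0.3, 0.5)   # -> 0.8
```

Contextuality analyses then ask whether these pairwise maximal couplings can coexist with the observed joint distributions within each context; the function above only computes the per-pair bound.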
Ordered random variables theory and applications
Shahbaz, Muhammad Qaiser; Hanif Shahbaz, Saman; Al-Zahrani, Bander M
2016-01-01
Ordered random variables have attracted the attention of several authors. Their basic building block is order statistics, which has several applications in extreme value theory and ordered estimation. The general model for ordered random variables, known as generalized order statistics, was introduced relatively recently by Kamps (1995).
A random number generator for continuous random variables
Guerra, V. M.; Tapia, R. A.; Thompson, J. R.
1972-01-01
A FORTRAN IV routine is given which may be used to generate random observations of a continuous real-valued random variable. Results for the normal distribution, comparing F(x), X, E(Akima), and E(linear), are presented in tabular form.
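The original FORTRAN routine is not reproduced in this record, but the standard technique it evokes, inverse-transform sampling from a tabulated CDF with interpolation, can be sketched as follows. This is a modern stand-in under that assumption, not the cited routine.

```python
import bisect
import random

def sample_from_table(xs, cdf, rng):
    """Draw one observation by inverse-transform sampling with linear
    interpolation on a tabulated CDF (xs strictly increasing,
    cdf[0] == 0.0 and cdf[-1] == 1.0)."""
    u = rng.random()
    i = bisect.bisect_left(cdf, u)
    if i == 0:
        return xs[0]
    # Linear interpolation between the bracketing table points.
    t = (u - cdf[i - 1]) / (cdf[i] - cdf[i - 1])
    return xs[i - 1] + t * (xs[i] - xs[i - 1])

rng = random.Random(1)
# Tabulated CDF of the uniform distribution on [0, 10].
xs = [0.0, 2.5, 5.0, 7.5, 10.0]
cdf = [0.0, 0.25, 0.5, 0.75, 1.0]
draws = [sample_from_table(xs, cdf, rng) for _ in range(5000)]
```

With a finer table, the same routine approximates any continuous distribution whose CDF can be tabulated.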
Contextuality is about identity of random variables
International Nuclear Information System (INIS)
Dzhafarov, Ehtibar N; Kujala, Janne V
2014-01-01
Contextual situations are those in which seemingly ‘the same’ random variable changes its identity depending on the conditions under which it is recorded. Such a change of identity is observed whenever the assumption that the variable is one and the same under different conditions leads to contradictions when one considers its joint distribution with other random variables (this is the essence of all Bell-type theorems). In our Contextuality-by-Default approach, instead of asking why or how the conditions force ‘one and the same’ random variable to change ‘its’ identity, any two random variables recorded under different conditions are considered different ‘automatically.’ They are never the same, nor are they jointly distributed, but one can always impose on them a joint distribution (probabilistic coupling). The special situations when there is a coupling in which these random variables are equal with probability 1 are considered noncontextual. Contextuality means that such couplings do not exist. We argue that the determination of the identity of random variables by conditions under which they are recorded is not a causal relationship and cannot violate laws of physics. (paper)
Designing neural networks that process mean values of random variables
International Nuclear Information System (INIS)
Barber, Michael J.; Clark, John W.
2014-01-01
We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence
Designing neural networks that process mean values of random variables
Energy Technology Data Exchange (ETDEWEB)
Barber, Michael J. [AIT Austrian Institute of Technology, Innovation Systems Department, 1220 Vienna (Austria); Clark, John W. [Department of Physics and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Centro de Ciências Matemáticas, Universidade de Madeira, 9000-390 Funchal (Portugal)
2014-06-13
We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence.
Benford's law and continuous dependent random variables
Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine
2018-01-01
Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not 11% of the time, as one would expect if all digits were equally likely, but rather about 30% of the time. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grows, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates, as well as by introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the n! entries in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converge to Benford's Law.
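The fragmentation mechanism in the abstract can be simulated directly: repeatedly split a conserved length at uniform random points and tabulate the leading digits of the pieces. This toy simulation (not the authors' proofs) shows the pieces drifting toward the Benford frequency log10(2) ≈ 0.301 for leading digit 1.

```python
import math
import random

def leading_digit(x):
    """First significant (base-10) digit of a positive number."""
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

def fragment(total, n_levels, rng):
    """Split a conserved length at uniform random points, n_levels
    times, and return the resulting (dependent) piece lengths."""
    pieces = [total]
    for _ in range(n_levels):
        new = []
        for p in pieces:
            u = rng.uniform(0.0, 1.0)
            new.extend([p * u, p * (1.0 - u)])
        pieces = new
    return pieces

rng = random.Random(7)
pieces = fragment(1.0, 12, rng)          # 2**12 = 4096 dependent pieces
counts = [0] * 10
for p in pieces:
    counts[leading_digit(p)] += 1
freq_1 = counts[1] / len(pieces)         # Benford predicts log10(2)
```

Note the pieces always sum to the conserved total, which is exactly the dependence the paper's Fourier-analytic tools are built to control.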
Maximal Inequalities for Dependent Random Variables
DEFF Research Database (Denmark)
Hoffmann-Jorgensen, Jorgen
2016-01-01
Maximal inequalities play a crucial role in many probabilistic limit theorems; for instance, the law of large numbers, the law of the iterated logarithm, the martingale limit theorem and the central limit theorem. Let X_1, X_2, ... be random variables with partial sums S_k = X_1 + ... + X_k. Then a maximal inequality gives conditions ensuring that the maximal partial sum M_n = max_{1 ≤ k ≤ n} S_k ...
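The prototype of such results, for independent variables, is Kolmogorov's maximal inequality: P(max_k |S_k| ≥ λ) ≤ Var(S_n)/λ². The Monte Carlo check below (an illustration, not from the cited work) compares the empirical tail of the running maximum with that bound for standard normal steps.

```python
import random

def max_partial_sum_tail(n, lam, n_paths, seed=0):
    """Monte Carlo estimate of P(max_k |S_k| >= lam), where S_k is the
    sum of k independent standard normal steps, together with
    Kolmogorov's bound Var(S_n) / lam**2."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_paths):
        s = 0.0
        for _ in range(n):
            s += rng.gauss(0.0, 1.0)
            if abs(s) >= lam:
                hits += 1
                break
    prob = hits / n_paths
    bound = n / lam ** 2          # Var(S_n) = n for unit-variance steps
    return prob, bound

prob, bound = max_partial_sum_tail(n=50, lam=20.0, n_paths=4000)
```

The bound is loose here; the point of maximal inequalities is that they hold uniformly over the whole partial-sum path, not that they are tight.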
Probability, random variables, and random processes theory and signal processing applications
Shynk, John J
2012-01-01
Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app
On the product and ratio of Bessel random variables
Directory of Open Access Journals (Sweden)
Saralees Nadarajah
2005-01-01
Full Text Available The distributions of products and ratios of random variables are of interest in many areas of the sciences. In this paper, the exact distributions of the product |XY| and the ratio |X/Y| are derived when X and Y are independent Bessel function random variables. An application of the results is provided by tabulating the associated percentage points.
Hoeffding’s Inequality for Sums of Dependent Random Variables
Czech Academy of Sciences Publication Activity Database
Pelekis, Christos; Ramon, J.
2017-01-01
Vol. 14, No. 6 (2017), Article No. 243. ISSN 1660-5446. Institutional support: RVO:67985807. Keywords: dependent random variables * Hoeffding's inequality * k-wise independent random variables * martingale differences. Subject RIV: BA - General Mathematics. OECD field: Pure mathematics. Impact factor: 0.868, year: 2016
Reduction of the Random Variables of the Turbulent Wind Field
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Nielsen, Søren R.K.
2012-01-01
Applicability of the Probability Density Evolution Method (PDEM) for realizing the evolution of the probability density for wind turbines places rather strict bounds on the number of random variables involved in the model. The efficiency of most Advanced Monte Carlo (AMC) methods, e.g. Importance Sampling (IS) or Subset Simulation (SS), deteriorates on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand, the efficiency of the AMC methods depends closely on the design points of the problem. The presence of many random variables may increase the number of design points, hence affects...
Polynomial chaos expansion with random and fuzzy variables
Jacquelin, E.; Friswell, M. I.; Adhikari, S.; Dessombz, O.; Sinou, J.-J.
2016-06-01
A dynamical uncertain system is studied in this paper. Two kinds of uncertainties are addressed, where the uncertain parameters are described through random variables and/or fuzzy variables. A general framework is proposed to deal with both kinds of uncertainty using a polynomial chaos expansion (PCE). It is shown that fuzzy variables may be expanded in terms of polynomial chaos when Legendre polynomials are used. The components of the PCE are a solution of an equation that does not depend on the nature of uncertainty. Once this equation is solved, the post-processing of the data gives the moments of the random response when the uncertainties are random or gives the response interval when the variables are fuzzy. With the PCE approach, it is also possible to deal with mixed uncertainty, when some parameters are random and others are fuzzy. The results provide a fuzzy description of the response statistical moments.
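The Legendre-based expansion mentioned above can be illustrated for the purely random case: project a response function of a uniform variable on [-1, 1] onto Legendre polynomials and read off the statistical moments from the coefficients. This is a minimal sketch of standard polynomial chaos, not the authors' mixed random/fuzzy solver; the test function exp(x) is an arbitrary choice.

```python
import math

def legendre(k, x):
    """Legendre polynomial P_k(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    if k == 0:
        return p0
    for n in range(1, k):
        p0, p1 = p1, ((2 * n + 1) * x * p1 - n * p0) / (n + 1)
    return p1

def pce_coefficients(f, order, n_quad=2000):
    """Project f onto Legendre polynomials for a uniform variable on
    [-1, 1], using midpoint quadrature:
    c_k = (2k+1)/2 * integral of f(x) * P_k(x) over [-1, 1]."""
    h = 2.0 / n_quad
    xs = [-1.0 + (i + 0.5) * h for i in range(n_quad)]
    coeffs = []
    for k in range(order + 1):
        integral = sum(f(x) * legendre(k, x) for x in xs) * h
        coeffs.append((2 * k + 1) / 2.0 * integral)
    return coeffs

# PCE of the response y = exp(x) for x uniform on [-1, 1].
c = pce_coefficients(math.exp, order=6)
mean = c[0]                                   # exact value: sinh(1)
var = sum(c[k] ** 2 / (2 * k + 1) for k in range(1, len(c)))
```

The orthogonality relation E[P_j P_k] = δ_jk/(2k+1) under the uniform density is what turns the coefficient list into moments: the mean is c_0 and the variance is the weighted sum of squared higher coefficients.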
Exponential Inequalities for Positively Associated Random Variables and Applications
Directory of Open Access Journals (Sweden)
Yang Shanchao
2008-01-01
Full Text Available We establish some exponential inequalities for positively associated random variables without the boundedness assumption. These inequalities improve the corresponding results obtained by Oliveira (2005). By one of the inequalities, we obtain the convergence rate for the case of geometrically decreasing covariances, which is close to the optimal achievable convergence rate for independent random variables under the Hartman-Wintner law of the iterated logarithm and improves the convergence rate derived by Oliveira (2005) for that case.
Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments
Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.
2015-12-01
The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
Couso, Inés; Sánchez, Luciano
2014-01-01
This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on their use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...
Directory of Open Access Journals (Sweden)
Stefanović Milena
2013-01-01
Full Text Available In studies of population variability, particular attention has to be paid to the selection of a representative sample. The aim of this study was to assess the size of a new representative sample on the basis of the variability of the chemical content of the initial sample, using a whitebark pine population as an example. The statistical analysis included the content of 19 characteristics (terpene hydrocarbons and their derivatives) of the initial sample of 10 elements (trees). It was determined that the new sample should contain 20 trees so that the mean value calculated from it represents the basic set with a probability higher than 95%. Determining the lower limit of the representative sample size that guarantees satisfactory reliability of generalization proved to be very important for achieving cost efficiency of the research. [Project of the Serbian Ministry of Science, Nos. OI-173011, TR-37002 and III-43007]
Compound Poisson Approximations for Sums of Random Variables
Serfozo, Richard F.
1986-01-01
We show that a sum of dependent random variables is approximately compound Poisson when the variables are rarely nonzero and, given they are nonzero, their conditional distributions are nearly identical. We give several upper bounds on the total-variation distance between the distribution of such a sum and a compound Poisson distribution. Included is an example for Markovian occurrences of a rare event. Our bounds are consistent with those that are known for Poisson approximations for sums of...
Randomized trial of intermittent or continuous amnioinfusion for variable decelerations.
Rinehart, B K; Terrone, D A; Barrow, J H; Isler, C M; Barrilleaux, P S; Roberts, W E
2000-10-01
To determine whether continuous or intermittent bolus amnioinfusion is more effective in relieving variable decelerations. Patients with repetitive variable decelerations were randomized to an intermittent bolus or continuous amnioinfusion. The intermittent bolus infusion group received boluses of 500 mL of normal saline, each over 30 minutes, with boluses repeated if variable decelerations recurred. The continuous infusion group received a bolus infusion of 500 mL of normal saline over 30 minutes and then 3 mL per minute until delivery occurred. The ability of the amnioinfusion to abolish variable decelerations was analyzed, as were maternal demographic and pregnancy outcome variables. Power analysis indicated that 64 patients would be required. Thirty-five patients were randomized to intermittent infusion and 30 to continuous infusion. There were no differences between groups in terms of maternal demographics, gestational age, delivery mode, neonatal outcome, median time to resolution of variable decelerations, or the number of times variable decelerations recurred. The median volume infused in the intermittent infusion group (500 mL) was significantly less than that in the continuous infusion group (905 mL, P =.003). Intermittent bolus amnioinfusion is as effective as continuous infusion in relieving variable decelerations in labor. Further investigation is necessary to determine whether either of these techniques is associated with increased occurrence of rare complications such as cord prolapse or uterine rupture.
How a dependent's variable non-randomness affects taper equation ...
African Journals Online (AJOL)
In order to apply the least squares method in regression analysis, the values of the dependent variable Y should be random. In an example of regression analysis linear and nonlinear taper equations, which estimate the diameter of the tree dhi at any height of the tree hi, were compared. For each tree the diameter at the ...
An infinite-dimensional weak KAM theory via random variables
Gomes, Diogo A.; Nurbekyan, Levon
2016-01-01
We develop several aspects of the infinite-dimensional Weak KAM theory using a random variables' approach. We prove that the infinite-dimensional cell problem admits a viscosity solution that is a fixed point of the Lax-Oleinik semigroup. Furthermore, we show the existence of invariant minimizing measures and calibrated curves defined on R.
An infinite-dimensional weak KAM theory via random variables
Gomes, Diogo A.
2016-08-31
We develop several aspects of the infinite-dimensional Weak KAM theory using a random variables' approach. We prove that the infinite-dimensional cell problem admits a viscosity solution that is a fixed point of the Lax-Oleinik semigroup. Furthermore, we show the existence of invariant minimizing measures and calibrated curves defined on R.
Extensions of von Neumann's method for generating random variables
International Nuclear Information System (INIS)
Monahan, J.F.
1979-01-01
Von Neumann's method of generating random variables with the exponential distribution and Forsythe's method for obtaining distributions with densities of the form e^(-G(x)) are generalized to apply to certain power-series representations. The flexibility of the power-series methods is illustrated by algorithms for the Cauchy and geometric distributions.
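Von Neumann's original exponential generator, the starting point of the paper, uses only uniform draws and comparisons: given a first uniform x, draw further uniforms while they decrease, and accept x (plus an integer offset) when the run ends at an even index. The sketch below implements that classical scheme, not the paper's power-series generalization.

```python
import random

def von_neumann_exponential(rng):
    """Sample from Exp(1) using only uniform draws and comparisons
    (von Neumann's scheme based on runs of decreasing uniforms)."""
    a = 0.0
    while True:
        x = rng.random()          # candidate fractional part, index 1
        prev = x
        n = 1
        while True:
            u = rng.random()
            n += 1
            if u > prev:          # first rise ends the decreasing run
                break
            prev = u
        if n % 2 == 0:            # run ended at an even index: accept
            return a + x
        a += 1.0                  # odd index: shift by one and restart

rng = random.Random(42)
draws = [von_neumann_exponential(rng) for _ in range(20000)]
mean = sum(draws) / len(draws)    # Exp(1) has mean 1
```

The acceptance probability given x is 1 - x + x²/2! - x³/3! + ... = e^(-x), which is exactly the kind of alternating power series the paper's generalization exploits for other densities.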
Limit theorems for multi-indexed sums of random variables
Klesov, Oleg
2014-01-01
Presenting the first unified treatment of limit theorems for multiple sums of independent random variables, this volume fills an important gap in the field. Several new results are introduced, even in the classical setting, as well as some new approaches that are simpler than those already established in the literature. In particular, new proofs of the strong law of large numbers and the Hajek-Renyi inequality are detailed. Applications of the described theory include Gibbs fields, spin glasses, polymer models, image analysis and random shapes. Limit theorems form the backbone of probability theory and statistical theory alike. The theory of multiple sums of random variables is a direct generalization of the classical study of limit theorems, whose importance and wide application in science is unquestionable. However, to date, the subject of multiple sums has only been treated in journals. The results described in this book will be of interest to advanced undergraduates, graduate students and researchers who ...
A review of instrumental variable estimators for Mendelian randomization.
Burgess, Stephen; Small, Dylan S; Thompson, Simon G
2017-10-01
Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
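The ratio (Wald) method named in the review has a one-line core: divide the instrument-outcome association by the instrument-exposure association. The simulation below is a hypothetical illustration (invented parameter values, simple Gaussian confounding) showing the ratio estimator recovering a causal effect that the naive regression overstates.

```python
import random

def simulate_and_estimate(n, beta, seed=0):
    """Simulate a confounded exposure-outcome pair with a valid
    instrument G, then compare the naive regression slope with the
    instrumental-variable ratio estimate cov(G, Y) / cov(G, X)."""
    rng = random.Random(seed)
    g = [rng.choice([0, 1, 2]) for _ in range(n)]   # e.g. an allele count
    u = [rng.gauss(0, 1) for _ in range(n)]         # unobserved confounder
    x = [0.5 * gi + ui + rng.gauss(0, 1) for gi, ui in zip(g, u)]
    y = [beta * xi + ui + rng.gauss(0, 1) for xi, ui in zip(x, u)]

    def cov(a, b):
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

    naive = cov(x, y) / cov(x, x)     # biased upward by the confounder
    iv = cov(g, y) / cov(g, x)        # ratio (Wald) estimator
    return naive, iv

naive, iv = simulate_and_estimate(n=20000, beta=0.3, seed=3)
```

The instrument validity assumptions listed in the abstract are built into the simulation by construction; with real data they are untestable and motivate the paper's discussion of weak-instrument bias.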
Characteristics of quantum open systems: free random variables approach
International Nuclear Information System (INIS)
Gudowska-Nowak, E.; Papp, G.; Brickmann, J.
1998-01-01
Random Matrix Theory provides an interesting tool for modelling a number of phenomena where noises (fluctuations) play a prominent role. Various applications range from the theory of mesoscopic systems in nuclear and atomic physics to biophysical models, like Hopfield-type models of neural networks and protein folding. Random Matrix Theory is also used to study dissipative systems with broken time-reversal invariance, providing a setup for analysis of dynamic processes in condensed, disordered media. In the paper we use Random Matrix Theory (RMT) within the formalism of Free Random Variables (alias Blue's functions), which allows one to characterize spectral properties of non-Hermitian ''Hamiltonians''. The relevance of using the Blue's function method is discussed in connection with applications of non-Hermitian operators to various problems of physical chemistry. (author)
Generation of correlated finite alphabet waveforms using gaussian random variables
Ahmed, Sajid
2016-01-13
Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.
Variable Selection in Time Series Forecasting Using Random Forests
Directory of Open Access Journals (Sweden)
Hristos Tyralis
2017-10-01
Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests (RF) in one-step forecasting using two large datasets of short time series, with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
Generation of correlated finite alphabet waveforms using gaussian random variables
Ahmed, Sajid; Alouini, Mohamed-Slim; Jardak, Seifallah
2016-01-01
Various examples of methods and systems are provided for generation of correlated finite alphabet waveforms using Gaussian random variables in, e.g., radar and communication applications. In one example, a method includes mapping an input signal comprising Gaussian random variables (RVs) onto finite-alphabet non-constant-envelope (FANCE) symbols using a predetermined mapping function, and transmitting FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The FANCE waveforms can be based upon the mapping of the Gaussian RVs onto the FANCE symbols. In another example, a system includes a memory unit that can store a plurality of digital bit streams corresponding to FANCE symbols and a front end unit that can transmit FANCE waveforms through a uniform linear array of antenna elements to obtain a corresponding beampattern. The system can include a processing unit that can encode the input signal and/or determine the mapping function.
Problems of variance reduction in the simulation of random variables
International Nuclear Information System (INIS)
Lessi, O.
1987-01-01
The definition of the uniform linear generator is given, and some of the most widely used tests to evaluate the uniformity and independence of the obtained determinations are listed. The problem of calculating, through simulation, some moment W of a function of a random variable is considered. The Monte Carlo method enables the moment W to be estimated and the estimator variance to be obtained. Some techniques for the construction of other estimators of W with reduced variance are introduced.
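One standard reduced-variance estimator of the kind discussed above is antithetic variates: evaluate the integrand at the pair (U, 1-U) and average, which cuts the variance sharply whenever the function is monotone. The sketch below (an illustration of the technique, not code from the report) compares it with plain Monte Carlo for W = E[e^U].

```python
import math
import random

def sample_var(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def mc_estimates(f, n, rng):
    """Estimate W = E[f(U)], U uniform on (0, 1), by plain Monte Carlo
    and by antithetic variates; both use n evaluations of f."""
    plain = [f(rng.random()) for _ in range(n)]
    anti = []
    for _ in range(n // 2):
        u = rng.random()
        anti.append(0.5 * (f(u) + f(1.0 - u)))   # antithetic pair (U, 1-U)
    plain_mean = sum(plain) / n
    anti_mean = sum(anti) / len(anti)
    # Estimator variances: sample variance over the number of terms.
    return plain_mean, sample_var(plain) / n, anti_mean, sample_var(anti) / (n // 2)

rng = random.Random(5)
m1, v1, m2, v2 = mc_estimates(math.exp, 20000, rng)   # W = e - 1
```

For this integrand the negative correlation between e^U and e^(1-U) makes the antithetic estimator roughly an order of magnitude more precise for the same number of function evaluations.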
Generation of correlated finite alphabet waveforms using gaussian random variables
Jardak, Seifallah
2014-09-01
Correlated waveforms have a number of applications in different fields, such as radar and communication. It is very easy to generate correlated waveforms using infinite alphabets, but for some of the applications, it is very challenging to use them in practice. Moreover, to generate infinite alphabet constant envelope correlated waveforms, the available research uses iterative algorithms, which are computationally very expensive. In this work, we propose simple novel methods to generate correlated waveforms using finite alphabet constant and non-constant-envelope symbols. To generate finite alphabet waveforms, the proposed method maps the Gaussian random variables onto the phase-shift-keying, pulse-amplitude, and quadrature-amplitude modulation schemes. For such mapping, the probability-density-function of Gaussian random variables is divided into M regions, where M is the number of alphabets in the corresponding modulation scheme. By exploiting the mapping function, the relationship between the cross-correlation of Gaussian and finite alphabet symbols is derived. To generate equiprobable symbols, the area of each region is kept the same. If the requirement is to have each symbol with its own unique probability, the proposed scheme allows for that as well. Although the proposed scheme is general, the main focus of this paper is to generate finite alphabet waveforms for multiple-input multiple-output radar, where correlated waveforms are used to achieve desired beampatterns. © 2014 IEEE.
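The core mapping step can be sketched as follows, assuming a standard normal input and M = 4 equiprobable regions (the thresholds and integer symbol labels are illustrative, not the paper's exact construction):

```python
import random
from statistics import NormalDist

# Split the standard normal CDF into M equiprobable regions and map
# each Gaussian sample to the symbol of the region it falls in.

M = 4
nd = NormalDist()
# Region boundaries at the 1/M, 2/M, 3/M quantiles of N(0, 1).
thresholds = [nd.inv_cdf(k / M) for k in range(1, M)]

def map_symbol(g):
    """Return the index of the equiprobable region containing g."""
    for i, t in enumerate(thresholds):
        if g < t:
            return i
    return M - 1

random.seed(1)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
counts = [0] * M
for g in samples:
    counts[map_symbol(g)] += 1
freqs = [c / len(samples) for c in counts]
# Each symbol should occur with probability ~1/M = 0.25.
```

Unequal symbol probabilities, which the paper also supports, would correspond to choosing unequal quantile boundaries.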
Analysis of Secret Key Randomness Exploiting the Radio Channel Variability
Directory of Open Access Journals (Sweden)
Taghrid Mazloum
2015-01-01
Full Text Available A few years ago, physical-layer-based techniques started to be considered as a way to improve security in wireless communications. A well known problem is the management of ciphering keys, regarding both the generation and the distribution of these keys. A way to alleviate such difficulties is to use a common source of randomness for the legitimate terminals, not accessible to an eavesdropper. This is the case of the fading propagation channel, when exact or approximate reciprocity applies. Although this principle has been known for a long time, not many works have evaluated the effect of radio channel properties in practical environments on the degree of randomness of the generated keys. To this end, we here investigate indoor radio channel measurements in different environments and settings, in either the 2.4625 GHz or 5.4 GHz band, of particular interest for Wi-Fi-related standards. Key bits are extracted by quantizing the complex channel coefficients, and their randomness is evaluated using the NIST test suite. We then look at the impact of the carrier frequency and of the channel variability in the space, time, and frequency degrees of freedom used to construct a long secret key, in relation to the nature of the radio environment, such as its LOS/NLOS character.
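A toy version of the bit-extraction step (the i.i.d. complex Gaussian channel model and the sign quantizer are simplifying assumptions; real measurements and the full NIST suite are not reproduced here):

```python
import random, math

# Quantize complex channel coefficients by the sign of their real and
# imaginary parts (2 key bits per coefficient), then compute the NIST
# monobit (frequency) test statistic on the resulting bit stream.

random.seed(7)
n_coeff = 5000
bits = []
for _ in range(n_coeff):
    re = random.gauss(0.0, 1.0)   # i.i.d. complex Gaussian fading
    im = random.gauss(0.0, 1.0)
    bits.append(1 if re >= 0 else 0)
    bits.append(1 if im >= 0 else 0)

# Monobit statistic: s_obs = |sum of (+/-1) bits| / sqrt(n); the
# stream passes at the 1% level roughly when s_obs < 2.576.
s = sum(2 * b - 1 for b in bits)
s_obs = abs(s) / math.sqrt(len(bits))
```

With real channel samples, correlation across space, time, and frequency would reduce the effective randomness, which is exactly what the study measures.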
Selection for altruism through random drift in variable size populations
Directory of Open Access Journals (Sweden)
Houchmandzadeh Bahram
2012-05-01
Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of an altruistic gene can be higher than that of a selfish one, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.
Generating variable and random schedules of reinforcement using Microsoft Excel macros.
Bancroft, Stacie L; Bourret, Jason C
2008-01-01
Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across numbers of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
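The same kinds of schedule values can be generated outside Excel; a sketch in Python rather than the article's macros (the mean values and sample sizes are illustrative):

```python
import random

random.seed(42)

def random_ratio(mean_ratio, n):
    """Random-ratio values: reinforce each response with constant
    probability 1/mean_ratio, so required counts are geometric."""
    p = 1.0 / mean_ratio
    values = []
    for _ in range(n):
        count = 1
        while random.random() >= p:
            count += 1
        values.append(count)
    return values

def random_interval(mean_seconds, n):
    """Random-interval values: exponential waiting times give a
    constant probability of reinforcement per unit time."""
    return [random.expovariate(1.0 / mean_seconds) for _ in range(n)]

rr = random_ratio(5, 1000)        # RR 5 schedule values
ri = random_interval(30.0, 1000)  # RI 30-s schedule values
```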
Krivitsky, Pavel N; Handcock, Mark S; Raftery, Adrian E; Hoff, Peter D
2009-07-01
Social network data often involve transitivity, homophily on observed attributes, clustering, and heterogeneity of actor degrees. We propose a latent cluster random effects model to represent all of these features, and we describe a Bayesian estimation method for it. The model is applicable to both binary and non-binary network data. We illustrate the model using two real datasets. We also apply it to two simulated network datasets with the same, highly skewed, degree distribution, but very different network behavior: one unstructured and the other with transitivity and clustering. Models based on degree distributions, such as scale-free, preferential attachment and power-law models, cannot distinguish between these very different situations, but our model does.
Energy Technology Data Exchange (ETDEWEB)
Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Richards, James [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-08-01
Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve demand over the evolution of many years or decades. Various CEM formulations are used to evaluate systems ranging in scale from states or utility service territories to national or multi-national systems. CEMs can be computationally complex, and to achieve acceptable solve times, key parameters are often estimated using simplified methods. In this paper, we focus on two of these key parameters associated with the integration of variable generation (VG) resources: capacity value and curtailment. We first discuss common modeling simplifications used in CEMs to estimate capacity value and curtailment, many of which are based on a representative subset of hours that can miss important tail events or which require assumptions about the load and resource distributions that may not match actual distributions. We then present an alternate approach that captures key elements of chronological operation over all hours of the year without the computationally intensive economic dispatch optimization typically employed within more detailed operational models. The updated methodology characterizes the (1) contribution of VG to system capacity during high load and net load hours, (2) the curtailment level of VG, and (3) the potential reductions in curtailments enabled through deployment of storage and more flexible operation of select thermal generators. We apply this alternate methodology to an existing CEM, the Regional Energy Deployment System (ReEDS). Results demonstrate that this alternate approach provides more accurate estimates of capacity value and curtailments by explicitly capturing system interactions across all hours of the year. This approach could be applied more broadly to CEMs at many different scales where hourly resource and load data is available, greatly improving the representation of challenges
Generating Correlated QPSK Waveforms By Exploiting Real Gaussian Random Variables
Jardak, Seifallah; Ahmed, Sajid; Alouini, Mohamed-Slim
2012-01-01
The design of waveforms with specified auto- and cross-correlation properties has a number of applications in multiple-input multiple-output (MIMO) radar, one of which is transmit beampattern design. In this work, an algorithm is proposed to generate quadrature phase-shift keying (QPSK) waveforms with required cross-correlation properties using real Gaussian random variables (RVs). This work can be considered as the extension of what was presented in [1] to generate BPSK waveforms. This work will be extended for the generation of correlated higher-order phase-shift keying (PSK) and quadrature amplitude modulation (QAM) schemes that can better approximate the desired beampattern.
Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models
Directory of Open Access Journals (Sweden)
Hui Wang
2017-10-01
Full Text Available Achieving relatively high-accuracy short-term wind speed forecasting estimates is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected, and random forests (RF is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
Jeong, Chan-Seok; Kim, Dongsup
2016-02-24
Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computational complexity.
Automatic Probabilistic Program Verification through Random Variable Abstraction
Directory of Open Access Journals (Sweden)
Damián Barsotti
2010-06-01
Full Text Available The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA), and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
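A toy instance of iterative backwards fixed-point calculation, applied to the expected iteration count of the loop "while flip(p): skip" (this illustrates the fixed-point idea only, not the paper's abstract-interpretation machinery):

```python
# The expected number of loop iterations x satisfies the backwards
# equation x = p * (1 + x), whose least fixed point is p / (1 - p).
# Iterating the operator from 0 converges to that fixed point.

def expected_iterations(p, tol=1e-12):
    """Iterate the backwards operator x -> p * (1 + x) from 0."""
    x = 0.0
    while True:
        nxt = p * (1.0 + x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt

est = expected_iterations(0.5)   # exact value: 0.5 / 0.5 = 1.0
```

The contraction factor is p, so convergence slows as the loop's continuation probability approaches 1.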
International Nuclear Information System (INIS)
Nuno Almirantearena, F; Introzzi, A; Clara, F; Burillo Lopez, P
2007-01-01
In this work we use 53 Arterial Diameter Variation (ADV) waves extracted from the radial artery of normotensive males, along with the values of variables that represent the ADV wave, obtained by means of multivariate analysis. Then, we specify the linguistic variables and the linguistic terms. The variables are fuzzified using triangular and trapezoidal fuzzy numbers. We analyze the fuzziness of the linguistic terms by applying discrete and continuous fuzzy entropies. Finally, we infer which variable presents the greatest disorder associated with the loss of arterial elasticity in the radial artery.
Grace, J.B.; Bollen, K.A.
2008-01-01
Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically-based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling heterogeneous concepts of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially-reduced-form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influence of suites of variables is often of interest. © Springer Science+Business Media, LLC 2007.
On the fluctuations of sums of independent random variables.
Feller, W
1969-07-01
If X(1), X(2),... are independent random variables with zero expectation and finite variances, the cumulative sums S(n) are, on the average, of the order of magnitude s(n), where s(n)² = E(S(n)²). The occasional maxima of the ratios S(n)/s(n) are surprisingly large, and the problem is to estimate the extent of their probable fluctuations. Specifically, let S(n)* = (S(n) − b(n))/a(n), where {a(n)} and {b(n)} are numerical sequences. For any interval I, denote by p(I) the probability that the event S(n)* ∈ I occurs for infinitely many n. Under mild conditions on {a(n)} and {b(n)}, it is shown that p(I) equals 0 or 1 according as a certain series converges or diverges. To obtain the upper limit of S(n)/a(n), one has to set b(n) = ±ε·a(n), but finer results are obtained with smaller b(n). No assumptions concerning the underlying distributions are made; the criteria explain structurally which features of {X(n)} affect the fluctuations, but for concrete results something about P{S(n) > a(n)} must be known. For example, a complete solution is possible when the X(n) are normal, replacing the classical law of the iterated logarithm. Further concrete estimates may be obtained by combining the new criteria with some recently developed limit theorems.
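For reference, the classical result the normal case replaces reads, in standard notation:

```latex
% Law of the iterated logarithm: for i.i.d. X_i with
% E[X_i] = 0 and E[X_i^2] = 1, writing S_n = X_1 + \cdots + X_n,
\limsup_{n \to \infty} \frac{S_n}{\sqrt{2 n \log \log n}} = 1
\quad \text{almost surely.}
```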
Directory of Open Access Journals (Sweden)
Lindsey Jones
2018-03-01
Full Text Available Promoting household resilience to climate extremes has emerged as a key development priority. Yet tracking and evaluating resilience at this level remains a critical challenge. Most quantitative approaches rely on objective indicators and assessment frameworks, but these are not fully satisfactory. Much of the difficulty arises from a combination of conceptual ambiguities, challenges in selecting appropriate indicators, and in measuring the many intangible aspects that contribute to household resilience. More recently, subjective measures of resilience have been advocated in helping to overcome some of the limitations of traditional objective characterizations. However, few large-scale studies of quantitative subjective approaches to resilience measurement have been conducted. In this study, we address this gap by exploring perceived levels of household resilience to climate extremes in Tanzania and the utility of standardized subjective methods for its assessment. A nationally representative cross-sectional survey involving 1294 individuals was carried out by mobile phone in June 2015 among randomly selected adult respondents aged 18 and above. Factors that are most associated with resilience-related capacities are having had advance knowledge of a previous flood, and to a lesser extent, believing flooding to be a serious community problem. Somewhat surprisingly, though a small number of weak relationships are apparent, most socio-demographic variables do not exhibit statistically significant differences with regards to perceived resilience-related capacities. These findings may challenge traditional assumptions about what factors characterize household resilience, offering a motivation for studying both subjective and objective perspectives, and understanding better their relationship to one another. If further validated, subjective measures may offer potential as both a complement and alternative to traditional objective methods of resilience
Southern hemisphere climate variability as represented by an ocean-atmosphere coupled model
CSIR Research Space (South Africa)
Beraki, A
2012-09-01
Full Text Available in the atmospheric circulation. The ability to predict these modes of climate variability on longer timescales is vital. Potential predictability is usually measured as a signal-to-noise contrast between the slowly evolving and chaotic components of the climate...
Mulder, V.L.; Bruin, de S.; Schaepman, M.E.
2013-01-01
This paper presents a sparse, remote sensing-based sampling approach making use of conditioned Latin Hypercube Sampling (cLHS) to assess variability in soil properties at regional scale. The method optimizes the sampling scheme for a defined spatial population based on selected covariates, which are
Some limit theorems for negatively associated random variables
Indian Academy of Sciences (India)
random sampling without replacement, and (i) joint distribution of ranks. ... wide applications in multivariate statistical analysis and system reliability, the ... strong law of large numbers for negatively associated sequences under the case where.
Rossi, R.; Hendrix, E.M.T.
2014-01-01
We discuss the problem of computing optimal linearisation parameters for the first order loss function of a family of arbitrarily distributed random variables. We demonstrate that, in contrast to the problem in which parameters must be determined for the loss function of a single random variable,
Strong Laws of Large Numbers for Arrays of Rowwise NA and LNQD Random Variables
Directory of Open Access Journals (Sweden)
Jiangfeng Wang
2011-01-01
Full Text Available Some strong laws of large numbers and strong convergence properties for arrays of rowwise negatively associated and linearly negative quadrant dependent random variables are obtained. The results obtained not only generalize the result of Hu and Taylor to negatively associated and linearly negative quadrant dependent random variables, but also improve it.
RESEARCH OF THE LAW OF DISTRIBUTION OF THE RANDOM VARIABLE OF THE COMPRESSION
Directory of Open Access Journals (Sweden)
I. Sarayeva
2011-01-01
Full Text Available In the diagnosis of modern automobile engines by means of methods of mathematical statistics, the experimental data on the random variable of compression are analysed, and it is proved that the random variable of compression follows the normal law of distribution.
Directory of Open Access Journals (Sweden)
Bogdan Gheorghe Munteanu
2013-01-01
Full Text Available Using stochastic approximations, this paper studies the convergence in distribution of the fractional parts of sums of random variables to the truncated exponential distribution with parameter lambda. This is made feasible by means of the Fourier-Stieltjes sequence (FSS) of the random variable.
Raw and Central Moments of Binomial Random Variables via Stirling Numbers
Griffiths, Martin
2013-01-01
We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
International Nuclear Information System (INIS)
Millwater, Harry; Singh, Gulshan; Cortina, Miguel
2012-01-01
There are many methods to identify the important variable out of a set of random variables, i.e., “inter-variable” importance; however, to date there are no comparable methods to identify the “region” of importance within a random variable, i.e., “intra-variable” importance. Knowledge of the critical region of an input random variable (tail, near-tail, and central region) can provide valuable information towards characterizing, understanding, and improving a model through additional modeling or testing. As a result, an intra-variable probabilistic sensitivity method was developed and demonstrated for independent random variables that computes the partial derivative of a probabilistic response with respect to a localized perturbation in the CDF values of each random variable. These sensitivities are then normalized in absolute value with respect to the largest sensitivity within a distribution to indicate the region of importance. The methodology is implemented using the Score Function kernel-based method such that existing samples can be used to compute sensitivities for negligible cost. Numerical examples demonstrate the accuracy of the method through comparisons with finite difference and numerical integration quadrature estimates. - Highlights: ► Probabilistic sensitivity methodology. ► Determines the “region” of importance within random variables such as left tail, near tail, center, right tail, etc. ► Uses the Score Function approach to reuse the samples, hence, negligible cost. ► No restrictions on the random variable types or limit states.
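A crude sample-reuse illustration of intra-variable importance (a simplified proxy, not the paper's Score Function estimator; the response g, threshold, and region boundaries are invented for the example):

```python
import random
from statistics import NormalDist

# Reuse existing samples of X ~ N(0, 1) to see which CDF region of X
# drives a probabilistic response P(g(X) > t), here with g(x) = x.

random.seed(3)
nd = NormalDist()
xs = [random.gauss(0.0, 1.0) for _ in range(50_000)]

def region(u):
    """Classify a CDF value into left tail, center, or right tail."""
    if u < 0.1:
        return "left tail"
    if u < 0.9:
        return "center"
    return "right tail"

t = 2.0                           # failure threshold
shares = {"left tail": 0, "center": 0, "right tail": 0}
failures = 0
for x in xs:
    if x > t:                     # failure event g(x) > t
        failures += 1
        shares[region(nd.cdf(x))] += 1
# All failures come from the right tail (x > 2 implies CDF(x) > 0.977),
# flagging it as the critical region of X for this response.
```

The paper's method instead differentiates the response probability with respect to localized CDF perturbations, reusing the same samples at negligible cost.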
Stochastic Optimal Estimation with Fuzzy Random Variables and Fuzzy Kalman Filtering
Institute of Scientific and Technical Information of China (English)
FENG Yu-hu
2005-01-01
By constructing a mean-square performance index in the case of fuzzy random variables, the optimal estimation theorem for an unknown fuzzy state using fuzzy observation data is given. The state and output of a linear discrete-time dynamic fuzzy system with Gaussian noise are Gaussian fuzzy random variable sequences. An approach to fuzzy Kalman filtering is discussed. Fuzzy Kalman filtering contains two parts: a real-valued non-random recurrence equation and standard Kalman filtering.
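The standard (crisp) Kalman-filtering component can be sketched for a scalar random-walk state; the noise variances and horizon below are illustrative assumptions, not values from the paper:

```python
import random

# Standard scalar Kalman filter recurrence: predict, then update with
# the Kalman gain. The posterior variance recursion is data-independent
# and converges to a steady-state value.

random.seed(11)
q, r = 0.01, 0.25               # process / measurement noise variances
x_true, x_est, p_var = 0.0, 0.0, 1.0

for _ in range(200):
    x_true += random.gauss(0.0, q ** 0.5)       # simulate true state
    z = x_true + random.gauss(0.0, r ** 0.5)    # noisy measurement
    p_var += q                                  # predict step
    k = p_var / (p_var + r)                     # Kalman gain
    x_est += k * (z - x_est)                    # update estimate
    p_var *= (1.0 - k)                          # update variance

# p_var approaches the steady-state posterior variance
# (-q + sqrt(q^2 + 4*q*r)) / 2 ~= 0.0452.
```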
Bias in random forest variable importance measures: Illustrations, sources and a solution
Directory of Open Access Journals (Sweden)
Hothorn Torsten
2007-01-01
Full Text Available Abstract Background Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
Concentrated Hitting Times of Randomized Search Heuristics with Variable Drift
DEFF Research Database (Denmark)
Lehre, Per Kristian; Witt, Carsten
2014-01-01
Drift analysis is one of the state-of-the-art techniques for the runtime analysis of randomized search heuristics (RSHs) such as evolutionary algorithms (EAs), simulated annealing etc. The vast majority of existing drift theorems yield bounds on the expected value of the hitting time for a target...
Kärcher, Bernd; Burkhardt, Ulrike; Ponater, Michael; Frömming, Christine
2010-11-09
Estimates of the global radiative forcing by line-shaped contrails differ mainly due to the large uncertainty in contrail optical depth. Most contrails are optically thin so that their radiative forcing is roughly proportional to their optical depth and increases with contrail coverage. In recent assessments, the best estimate of mean contrail radiative forcing was significantly reduced, because global climate model simulations pointed at lower optical depth values than earlier studies. We revise these estimates by comparing the probability distribution of contrail optical depth diagnosed with a climate model with the distribution derived from a microphysical, cloud-scale model constrained by satellite observations over the United States. By assuming that the optical depth distribution from the cloud model is more realistic than that from the climate model, and by taking the difference between the observed and simulated optical depth over the United States as globally representative, we quantify uncertainties in the climate model's diagnostic contrail parameterization. Revising the climate model results accordingly increases the global mean radiative forcing estimate for line-shaped contrails by a factor of 3.3, from 3.5 mW/m² to 11.6 mW/m² for the year 1992. Furthermore, the satellite observations and the cloud model point at higher global mean optical depth of detectable contrails than often assumed in radiative transfer (off-line) studies. Therefore, we correct estimates of contrail radiative forcing from off-line studies as well. We suggest that the global net radiative forcing of line-shaped persistent contrails is in the range 8-20 mW/m² for the air traffic in the year 2000.
PaCAL: A Python Package for Arithmetic Computations with Random Variables
Directory of Open Access Journals (Sweden)
Marcin Korzeń
2014-05-01
Full Text Available In this paper we present PaCAL, a Python package for arithmetical computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated, allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed on each step but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements or estimation of error distributions in scientific computations.
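The underlying numeric idea — that the density of a sum of independent random variables is the convolution of their densities, computed on a grid — can be sketched directly (this is a toy illustration, not the PaCAL API):

```python
# Discrete approximation of the density of U1 + U2 for independent
# Uniform(0, 1) variables: convolve the two densities on a grid.
# The exact result is the triangular density on [0, 2] with peak 1.

def convolve_densities(f, g, step):
    """Discrete convolution of two equal-length density grids."""
    n = len(f)
    h = [0.0] * (2 * n - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            h[i + j] += fi * gj * step
    return h

step = 0.01
u = [1.0] * 100                        # Uniform(0, 1) density on [0, 1)
s = convolve_densities(u, u, step)     # density of U1 + U2

total = sum(s) * step                  # should integrate to ~1
peak = max(s)                          # should be ~1, near x = 1
```

PaCAL performs such operations numerically with interpolation so that arbitrary arithmetic on random variables composes; the grid convolution above is the simplest instance of that idea.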
Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables
Directory of Open Access Journals (Sweden)
Dragana Č. Pavlović
2013-01-01
Full Text Available The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1=R1/r1 and μ2=R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. Random variables R1 and R2, as well as random variables r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, PDF of maximum, PDF of minimum, and product moments of an arbitrary number of ratios μi=Ri/ri, i=1,…,L are obtained. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, the analytical expressions for the PDF of minimum and product moments of {μi}i=1L are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to performance assessment of wireless systems.
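As a quick plausibility check of such ratio statistics, the distribution of μ = R/r can be sampled directly (a Monte Carlo sketch with arbitrarily chosen scale parameters; for simplicity the variables here are independent rather than correlated, unlike in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Independent Rayleigh samples (the scale parameters are arbitrary choices).
R = rng.rayleigh(scale=2.0, size=n)   # numerator
r = rng.rayleigh(scale=1.0, size=n)   # denominator
mu = R / r

# The ratio of two independent Rayleigh variables is heavy-tailed,
# so the median is a more stable summary than the mean.
median = np.median(mu)
print(round(median, 2))
```

Since the squared standardized Rayleigh variables are exponential, the median of the standardized ratio is 1, and the median of μ here is the ratio of the scales, 2.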
A comparison of methods for representing random taste heterogeneity in discrete choice models
DEFF Research Database (Denmark)
Fosgerau, Mogens; Hess, Stephane
2009-01-01
This paper reports the findings of a systematic study using Monte Carlo experiments and a real dataset aimed at comparing the performance of various ways of specifying random taste heterogeneity in a discrete choice model. Specifically, the analysis compares the performance of two recent advanced...... distributions. Both approaches allow the researcher to increase the number of parameters as desired. The paper provides a range of evidence on the ability of the various approaches to recover various distributions from data. The two advanced approaches are comparable in terms of the likelihoods achieved...
On mean square displacement behaviors of anomalous diffusions with variable and random orders
International Nuclear Information System (INIS)
Sun Hongguang; Chen Wen; Sheng Hu; Chen Yangquan
2010-01-01
Mean square displacement (MSD) is used to characterize anomalous diffusion. Recently, models of anomalous diffusion with variable order and random order were proposed, but no MSD analysis has been given so far. The purpose of this Letter is to offer a concise derivation of the MSD functions for the variable-order model and the random-order model. Numerical results are presented to illustrate the analytical results. In addition, we show how to establish a variable-random-order model for a given MSD function, which has clear application potential.
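A baseline for comparison: for ordinary Brownian diffusion the MSD grows linearly in time. The following sketch (a plain random walk, not the Letter's variable- or random-order fractional model) recovers the unit log-log slope:

```python
import numpy as np

rng = np.random.default_rng(1)
n_walkers, n_steps = 2000, 200

# Standard random walk: i.i.d. unit-variance Gaussian increments.
steps = rng.standard_normal((n_walkers, n_steps))
paths = np.cumsum(steps, axis=1)

# MSD(t) = <x(t)^2>, averaged over walkers.
msd = (paths ** 2).mean(axis=0)
t = np.arange(1, n_steps + 1)

# Normal diffusion: MSD(t) ~ t, i.e. slope 1 in a log-log fit.
# Anomalous diffusion would show MSD(t) ~ t^alpha with alpha != 1.
slope = np.polyfit(np.log(t), np.log(msd), 1)[0]
print(round(slope, 1))
```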
New Results On the Sum of Two Generalized Gaussian Random Variables
Soury, Hamza
2015-01-01
We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented.
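The plausibility of approximating the sum by a single GG variable with a different shape factor can be seen by Monte Carlo (a pure-NumPy sketch using the standard Gamma-based GG sampler; the shape β and scale α values are arbitrary choices, and this is not the paper's Fox H-based computation):

```python
import numpy as np

rng = np.random.default_rng(2)

def gg_samples(beta, alpha, size, rng):
    """Generalized Gaussian samples with density ~ exp(-|x/alpha|**beta),
    using the fact that |X|**beta / alpha**beta ~ Gamma(1/beta, 1)."""
    g = rng.gamma(1.0 / beta, 1.0, size)
    signs = rng.choice([-1.0, 1.0], size)
    return signs * alpha * g ** (1.0 / beta)

def excess_kurtosis(x):
    x = x - x.mean()
    return (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0

n = 400_000
x = gg_samples(beta=1.0, alpha=1.0, size=n, rng=rng)  # beta = 1 is Laplace
y = gg_samples(beta=1.0, alpha=1.0, size=n, rng=rng)

# Summing two i.i.d. GG variables pulls the shape toward Gaussian:
# the excess kurtosis of a Laplace is 3; of the sum of two, 3/2.
k1 = excess_kurtosis(x)
k2 = excess_kurtosis(x + y)
print(round(k1, 1), round(k2, 1))
```

The drop in kurtosis is what the single-GG approximation absorbs into a larger shape factor.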
Hung, Tran Loc; Giang, Le Truong
2016-01-01
Using the Stein-Chen method some upper bounds in Poisson approximation for distributions of row-wise triangular arrays of independent negative-binomial distributed random variables are established in this note.
Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...
Zero Distribution of System with Unknown Random Variables Case Study: Avoiding Collision Path
Directory of Open Access Journals (Sweden)
Parman Setyamartana
2014-07-01
Full Text Available This paper presents a stochastic analysis of finding feasible trajectories for a robotic arm moving among obstacles. The unknown variables are the coefficients of the polynomial joint angles for which collision-free motion is achieved. ãk is the matrix consisting of these unknown feasible polynomial coefficients. The pattern of feasible polynomials in the obstacle environment appears random. This paper proposes to model this randomness using a random polynomial with unknown variables as coefficients. The behavior of the system is obtained from the zero distribution, which characterizes such a random polynomial. Results show that the pattern of random polynomials for collision avoidance can be constructed from the zero distribution. The zero distribution acts as a building block of the system, with obstacles as the uncertainty factor. Through a scale factor k, which has a range, the random coefficient pattern can be predicted.
Saunders, C; Byrne, C D; Guthrie, B; Lindsay, R S; McKnight, J A; Philip, S; Sattar, N; Walker, J J; Wild, S H
2013-03-01
To describe the proportion of people with Type 2 diabetes living in Scotland who meet eligibility criteria for inclusion in several large randomized controlled trials of glycaemic control, to inform physicians and guideline developers about the generalizability of trial results. A literature review was performed to identify large trials assessing the impact of glycaemic control on risk of macrovascular disease. Inclusion and exclusion criteria from each trial were applied to data on the population of people with a diagnosis of Type 2 diabetes living in Scotland in 2008 (n = 180,590) in a population-based cross-sectional study, and the number and proportion of people eligible for each trial were determined. Seven trials were identified. The proportion of people with Type 2 diabetes who met the eligibility criteria for the trials ranged from 3.5 to 50.7%. Trial participants were younger at age of diagnosis of diabetes and at time of trial recruitment than in the Scottish study population. The application of upper age criteria excluded the largest proportion of patients, with up to 39% of people with Type 2 diabetes ineligible for a trial with the most stringent criteria based on age alone. We found that many of the large trials of glycaemic control among people with Type 2 diabetes have limited external validity when applied to a population-based cohort of people with Type 2 diabetes. In particular, the age distribution of trial participants often does not reflect that of people with Type 2 diabetes in a contemporary British population. © 2012 The Authors. Diabetic Medicine © 2012 Diabetes UK.
What variables are important in predicting bovine viral diarrhea virus? A random forest approach.
Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo
2015-07-24
Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of the disease-associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve of 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: a) who inseminates the animals, b) number of neighboring farms that have cattle and c) rectal palpation performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting a satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.
Output variability caused by random seeds in a multi-agent transport simulation model
DEFF Research Database (Denmark)
Paulsen, Mads; Rasmussen, Thomas Kjær; Nielsen, Otto Anker
2018-01-01
Dynamic transport simulators are intended to support decision makers in transport-related issues, and as such it is valuable that the random variability of their outputs is as small as possible. In this study we analyse the output variability caused by random seeds of a multi-agent transport...... simulator (MATSim) when applied to a case study of Santiago de Chile. Results based on 100 different random seeds show that the relative accuracies of estimated link loads tend to increase with link load, but that relative errors of up to 10 % do occur even for links with large volumes. Although
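The phenomenon described — output spread driven purely by the random seed — can be illustrated with a toy stand-in simulator (purely hypothetical; MATSim itself is a large Java framework and is not used here):

```python
import random

def toy_link_load(seed, n_agents=1000, p_use_link=0.3):
    """Stand-in 'simulator': each agent independently uses a link
    with probability p_use_link; only the seed varies between runs."""
    rng = random.Random(seed)
    return sum(rng.random() < p_use_link for _ in range(n_agents))

# Re-run the same scenario under 100 different seeds, as in the study.
loads = [toy_link_load(seed) for seed in range(100)]
mean = sum(loads) / len(loads)
rel_spread = (max(loads) - min(loads)) / mean   # seed-induced relative spread
print(round(mean), round(rel_spread, 2))
```

Even this trivially simple model shows a nonzero seed-induced spread around the mean load, which is the quantity the study measures for MATSim link volumes.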
A Variable Impacts Measurement in Random Forest for Mobile Cloud Computing
Directory of Open Access Journals (Sweden)
Jae-Hee Hur
2017-01-01
Full Text Available Recently, the importance of mobile cloud computing has increased. Mobile devices can collect personal data from various sensors within a short period of time, and sensor-based data consist of valuable information from users. Advanced computation power and data analysis technology based on cloud computing provide an opportunity to classify massive sensor data into given labels. The random forest algorithm is known as a black-box model, whose hidden internal process is hard to interpret. In this paper, we propose a method that analyzes the variable impact in the random forest algorithm to clarify which variable affects classification accuracy the most. We apply the Shapley Value with random forest to analyze the variable impact. Under the assumption that every variable cooperates as a player in a cooperative game situation, the Shapley Value fairly distributes the payoff among the variables. Our proposed method calculates the relative contributions of the variables within the classification process. In this paper, we analyze the influence of variables and list the priority of variables that affect the classification accuracy result. Our proposed method proves its suitability for data interpretation in black-box models like random forests, making the algorithm applicable in mobile cloud computing environments.
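The Shapley value the authors build on can be computed exactly for small games by averaging each player's marginal contribution over all orderings (a generic sketch with made-up accuracy numbers, not the paper's RF pipeline):

```python
from itertools import permutations

def shapley_values(players, payoff):
    """Exact Shapley values: average marginal contribution of each
    player over all orderings in which the coalition can form."""
    values = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = payoff(frozenset(coalition))
            coalition.add(p)
            values[p] += payoff(frozenset(coalition)) - before
    return {p: v / len(orders) for p, v in values.items()}

# Toy "variable impact" game: the payoff of a coalition of variables is
# the classification accuracy they reach (numbers are illustrative only).
accuracy = {frozenset(): 0.50,
            frozenset({'a'}): 0.70, frozenset({'b'}): 0.60,
            frozenset({'a', 'b'}): 0.90}

impacts = shapley_values(['a', 'b'], accuracy.__getitem__)
print(impacts)
```

By the efficiency property, the impacts sum to the accuracy gain of using all variables over none (here 0.40), which is what makes the attribution "fair".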
Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros
Bancroft, Stacie L.; Bourret, Jason C.
2008-01-01
Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time.…
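The two schedule types can be sketched in a few lines (a generic illustration; the article itself provides Excel macros, and the uniform draw used for the variable-ratio schedule is one arbitrary choice among many):

```python
import random

rng = random.Random(0)

def variable_ratio(mean_ratio, n_trials, rng):
    """VR schedule: required response counts vary around the mean ratio."""
    return [rng.randint(1, 2 * mean_ratio - 1) for _ in range(n_trials)]

def random_ratio(mean_ratio, n_responses, rng):
    """RR schedule: each response is reinforced with constant
    probability 1/mean_ratio, independent of responses so far."""
    return [rng.random() < 1.0 / mean_ratio for _ in range(n_responses)]

vr = variable_ratio(5, 10_000, rng)
rr = random_ratio(5, 10_000, rng)
vr_mean = sum(vr) / len(vr)    # ~5 responses per reinforcer on average
rr_rate = sum(rr) / len(rr)    # ~1/5 reinforcement probability per response
print(round(vr_mean, 1), round(rr_rate, 2))
```

Both arrangements average one reinforcer per five responses; the difference is that the RR probability is constant on every response, while the VR requirement is drawn per trial.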
Partial summations of stationary sequences of non-Gaussian random variables
DEFF Research Database (Denmark)
Mohr, Gunnar; Ditlevsen, Ove Dalager
1996-01-01
The distribution of the sum of a finite number of identically distributed random variables is in many cases easily determined given that the variables are independent. The moments of any order of the sum can always be expressed by the moments of the single term without computational problems...... of convergence of the distribution of a sum (or an integral) of mutually dependent random variables to the Gaussian distribution. The paper is closely related to the work in Ditlevsen el al. [Ditlevsen, O., Mohr, G. & Hoffmeyer, P. Integration of non-Gaussian fields. Prob. Engng Mech 11 (1996) 15-23](2)....... lognormal variables or polynomials of standard Gaussian variables. The dependency structure is induced by specifying the autocorrelation structure of the sequence of standard Gaussian variables. Particularly useful polynomials are the Winterstein approximations that distributionally fit with non...
Brooks, Mollie E; Mugabo, Marianne; Rodgers, Gwendolen M; Benton, Timothy G; Ozgul, Arpat
2016-03-01
Demographic rates are shaped by the interaction of past and current environments that individuals in a population experience. Past environments shape individual states via selection and plasticity, and fitness-related traits (e.g. individual size) are commonly used in demographic analyses to represent the effect of past environments on demographic rates. We quantified how well the size of individuals captures the effects of a population's past and current environments on demographic rates in a well-studied experimental system of soil mites. We decomposed these interrelated sources of variation with a novel method of multiple regression that is useful for understanding nonlinear relationships between responses and multicollinear explanatory variables. We graphically present the results using area-proportional Venn diagrams. Our novel method was developed by combining existing methods and expanding upon them. We showed that the strength of size as a proxy for the past environment varied widely among vital rates. For instance, in this organism with an income breeding life history, the environment had more effect on reproduction than individual size, but with substantial overlap indicating that size encompassed some of the effects of the past environment on fecundity. This demonstrates that the strength of size as a proxy for the past environment can vary widely among life-history processes within a species, and this variation should be taken into consideration in trait-based demographic or individual-based approaches that focus on phenotypic traits as state variables. Furthermore, the strength of a proxy will depend on what state variable(s) and what demographic rate is being examined; that is, different measures of body size (e.g. length, volume, mass, fat stores) will be better or worse proxies for various life-history processes. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.
Directory of Open Access Journals (Sweden)
Sugiarto Sugiarto
2015-01-01
Full Text Available The aim of this paper is to investigate the intentions of Jakarta citizens with respect to the electronic road pricing (ERP) reform proposed by the city government. Utilizing data from a stated preference survey conducted in 2013, we construct six variables representing latent psychological motivations (appropriateness of ERP adoption; recognition that ERP can mitigate congestion and improve the environment; car dependency (CDC); awareness of the problems of cars in society; inhibition of freedom of movement caused by ERP; and doubts about the ability of ERP to mitigate congestion and environmental problems). A multiple-indicators multiple-causes (MIMIC) model is developed to investigate the effects of respondents' socio-demographics (causes) on the latent constructs in order to gain better understanding of the relationship between respondents' intentions and the observed individual responses (indicators) obtained from the stated preference survey. The MIMIC model offers a good account of whether and how socio-demographic attributes and individual indicators predict the latent variables of the psychological motivation constructs. We then further verify the influences of the latent variables, combining them with levy rate patterns and daily mobility attributes to investigate significant determining factors for social acceptance of the ERP proposal. A latent variable representation based on the generalized ordered response model is employed in our investigations to allow more flexibility in parameter estimation across outcomes. The results confirm that there is a strong correlation between latent psychological motivations and daily mobility attributes and the level of social acceptance of the ERP proposal. This empirical investigation demonstrates that the latent variables play a more substantial role in determining the scheme's acceptance. Moreover, elasticity measures show that latent attributes are more sensitive compared to levies and daily mobility
A Particle Swarm Optimization Algorithm with Variable Random Functions and Mutation
Institute of Scientific and Technical Information of China (English)
ZHOU Xiao-Jun; YANG Chun-Hua; GUI Wei-Hua; DONG Tian-Xue
2014-01-01
The convergence analysis of the standard particle swarm optimization (PSO) has shown that changing the random functions, personal best and group best has the potential to improve the performance of the PSO. In this paper, a novel strategy with variable random functions and polynomial mutation is introduced into the PSO, called particle swarm optimization with variable random functions and mutation (PSO-RM). Random functions are adjusted with the density of the population so as to manipulate the weights of the cognition part and the social part. Mutation is executed on both the personal best particle and the group best particle to explore new areas. Experimental results have demonstrated the effectiveness of the strategy.
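For context, a minimal standard PSO — the baseline that PSO-RM modifies — can be written as follows (a sketch minimizing the sphere function; the parameter values are conventional choices, not the paper's):

```python
import random

rng = random.Random(0)
dim, n_particles, iters = 5, 30, 200
w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive and social weights

def sphere(x):
    return sum(v * v for v in x)

pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            # r1, r2 are the "random functions" that PSO-RM makes variable.
            r1, r2 = rng.random(), rng.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]

print(sphere(gbest))
```

The r1, r2 draws weight the cognitive and social pulls on each particle; PSO-RM adapts these draws to the population density and adds polynomial mutation of the best particles.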
Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut
2017-05-01
This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
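The parametric-bootstrap idea can be sketched as a simple percentile interval for the median of a ratio (a simplified illustration with independent normals and no calibration step, unlike the authors' procedure):

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed sample of a ratio of two (independent, for simplicity) normals.
x = rng.normal(10.0, 1.0, 200)
y = rng.normal(5.0, 0.5, 200)
ratio = x / y

# Parametric bootstrap: refit the normals, resample, recompute the median.
mx, sx, my, sy = x.mean(), x.std(ddof=1), y.mean(), y.std(ddof=1)
boot_medians = np.array([
    np.median(rng.normal(mx, sx, 200) / rng.normal(my, sy, 200))
    for _ in range(2000)
])

# Percentile interval for the median of the ratio (true median ~ 10/5 = 2).
lower, upper = np.percentile(boot_medians, [2.5, 97.5])
print(round(lower, 2), round(upper, 2))
```

The paper's tolerance limits additionally bootstrap-calibrate the coverage, which this percentile sketch omits.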
Directory of Open Access Journals (Sweden)
Hideki Katagiri
2017-10-01
Full Text Available This paper considers linear programming problems (LPPs where the objective functions involve discrete fuzzy random variables (fuzzy set-valued discrete random variables. New decision making models, which are useful in fuzzy stochastic environments, are proposed based on both possibility theory and probability theory. In multi-objective cases, Pareto optimal solutions of the proposed models are newly defined. Computational algorithms for obtaining the Pareto optimal solutions of the proposed models are provided. It is shown that problems involving discrete fuzzy random variables can be transformed into deterministic nonlinear mathematical programming problems which can be solved through a conventional mathematical programming solver under practically reasonable assumptions. A numerical example of agriculture production problems is given to demonstrate the applicability of the proposed models to real-world problems in fuzzy stochastic environments.
Higher order moments of a sum of random variables: remarks and applications.
Directory of Open Access Journals (Sweden)
Luisa Tibiletti
1996-02-01
Full Text Available The moments of a sum of random variables depend both on the pure moments of each random addend and on the mixed moments of the addends. In this note we introduce a simple measure to evaluate the relative importance to attach to the latter. Once the pure moments are fixed, the functional relation between the random addends leading to the extreme values is also provided. Applications to Finance, Decision Theory and Actuarial Sciences are also suggested.
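The note's starting point is easy to verify numerically: already for the second moment, Var(X + Y) = Var X + Var Y + 2 Cov(X, Y), so the mixed moment (here the covariance) matters whenever the addends are dependent:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Correlated addends: X and Y share the common component z.
z = rng.standard_normal(n)
x = z + 0.5 * rng.standard_normal(n)
y = z + 0.5 * rng.standard_normal(n)

# Variance of the sum vs. pure moments plus the mixed moment.
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
print(round(lhs, 2), round(rhs, 2))
```

Higher-order moments of the sum involve mixed moments of correspondingly higher order (e.g. E[X²Y], E[XY²] for the third moment), which is exactly the contribution the note's measure weighs.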
Stable Graphical Model Estimation with Random Forests for Discrete, Continuous, and Mixed Variables
Fellinghauer, Bernd; Bühlmann, Peter; Ryffel, Martin; von Rhein, Michael; Reinhardt, Jan D.
2011-01-01
A conditional independence graph is a concise representation of pairwise conditional independence among many variables. Graphical Random Forests (GRaFo) are a novel method for estimating pairwise conditional independence relationships among mixed-type, i.e. continuous and discrete, variables. The number of edges is a tuning parameter in any graphical model estimator, and there is no obvious number that constitutes a good choice. Stability Selection helps to choose this parameter with respect to...
Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients
Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako
2012-01-01
Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…
van der Zwan, J.E.; de Vente, W.; Huizink, A.C.; Bögels, S.M.; de Bruin, E.I.
2015-01-01
In contemporary western societies stress is highly prevalent, therefore the need for stress-reducing methods is great. This randomized controlled trial compared the efficacy of self-help physical activity (PA), mindfulness meditation (MM), and heart rate variability biofeedback (HRV-BF) in reducing
Sums and Products of Jointly Distributed Random Variables: A Simplified Approach
Stein, Sheldon H.
2005-01-01
Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…
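These theorems can be checked by exact enumeration over a small joint distribution, here two independent fair dice (exact arithmetic with fractions, so the identities hold with equality): E[X+Y] = E[X]+E[Y], E[XY] = E[X]E[Y] under independence, and Var(X+Y) = Var X + Var Y.

```python
from fractions import Fraction
from itertools import product

faces = range(1, 7)
p = Fraction(1, 36)   # joint probability of each (x, y) pair (independence)

# Expectation of any function of the pair, by exact enumeration.
E = lambda f: sum(p * f(x, y) for x, y in product(faces, faces))
var = lambda f, m: E(lambda x, y: (f(x, y) - m) ** 2)

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
exy = E(lambda x, y: x * y)

assert E(lambda x, y: x + y) == ex + ey                      # linearity
assert exy == ex * ey                                        # independence
assert var(lambda x, y: x + y, ex + ey) == \
       var(lambda x, y: x, ex) + var(lambda x, y: y, ey)     # variances add
print(ex + ey)   # 7
```

Dropping the independence assumption breaks the second and third identities but not the first, which is the distinction the theorems hinge on.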
Central limit theorem for the Banach-valued weakly dependent random variables
International Nuclear Information System (INIS)
Dmitrovskij, V.A.; Ermakov, S.V.; Ostrovskij, E.I.
1983-01-01
The central limit theorem (CLT) for Banach-valued weakly dependent random variables is proved. In proving the CLT, convergence of finite-dimensional (i.e. cylindrical) distributions is established. A weak compactness of the family of measures generated by a certain sequence is confirmed. The continuity of the limiting field is checked.
Non-uniform approximations for sums of discrete m-dependent random variables
Vellaisamy, P.; Cekanavicius, V.
2013-01-01
Non-uniform estimates are obtained for Poisson, compound Poisson, translated Poisson, negative binomial and binomial approximations to sums of m-dependent integer-valued random variables. Estimates for the Wasserstein metric also follow easily from our results. The results are then exemplified by the approximation of the Poisson binomial distribution, 2-runs and $m$-dependent $(k_1,k_2)$-events.
J.L. Geluk (Jaap); L. Peng (Liang); C.G. de Vries (Casper)
1999-01-01
The paper characterizes first and second order tail behavior of convolutions of i.i.d. heavy tailed random variables with support on the real line. The result is applied to the problem of risk diversification in portfolio analysis and to the estimation of the parameter in a MA(1) model.
Directory of Open Access Journals (Sweden)
Thandi Kapwata
2016-11-01
Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps using the variables selected by the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.
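The percent-increase-of-MSE measure can be sketched with a plain least-squares model standing in for the forest: permute one predictor and record how much the prediction error rises (an illustration of the idea only; RF implementations compute it on out-of-bag data, and the predictor names below are synthetic stand-ins):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000

# Synthetic stand-ins for environmental predictors.
altitude = rng.normal(size=n)
ndvi = rng.normal(size=n)
noise = rng.normal(size=n)               # an irrelevant predictor
X = np.column_stack([altitude, ndvi, noise])

# Response depends strongly on "altitude", weakly on "ndvi".
y = 3.0 * altitude + 1.0 * ndvi + rng.normal(scale=0.5, size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
base_mse = np.mean((y - X @ beta) ** 2)

imp = {}
for name, j in [("altitude", 0), ("ndvi", 1), ("noise", 2)]:
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])  # break predictor-response link
    mse = np.mean((y - Xp @ beta) ** 2)
    imp[name] = 100 * (mse - base_mse) / base_mse   # %IncMSE
print(imp)
```

The irrelevant predictor shows a near-zero %IncMSE, while the strongest predictor dominates, mirroring how altitude was ranked in the study.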
DEFF Research Database (Denmark)
Krøigård, Thomas; Gaist, David; Otto, Marit
2014-01-01
SUMMARY: The reproducibility of variables commonly included in studies of peripheral nerve conduction in healthy individuals has not previously been analyzed using a random effects regression model. We examined the temporal changes and variability of standard nerve conduction measures in the leg...... reexamined after 2 and 26 weeks. There was no change in the variables except for a minor decrease in sural nerve sensory action potential amplitude and a minor increase in tibial nerve minimal F-wave latency. Reproducibility was best for peroneal nerve distal motor latency and motor conduction velocity......, sural nerve sensory conduction velocity, and tibial nerve minimal F-wave latency. Between-subject variability was greater than within-subject variability. Sample sizes ranging from 21 to 128 would be required to show changes twice the magnitude of the spontaneous changes observed in this study. Nerve...
SOERP, Statistics and 2. Order Error Propagation for Function of Random Variables
International Nuclear Information System (INIS)
Cox, N. D.; Miller, C. F.
1985-01-01
1 - Description of problem or function: SOERP computes second-order error propagation equations for the first four moments of a function of independently distributed random variables. SOERP was written for a rigorous second-order error propagation of any function which may be expanded in a multivariable Taylor series, the input variables being independently distributed. The required input consists of numbers directly related to the partial derivatives of the function, evaluated at the nominal values of the input variables, and the central moments of the input variables from the second through the eighth.

2 - Method of solution: The development of equations for computing the propagation of errors begins by expressing the function of random variables in a multivariable Taylor series expansion. The Taylor series expansion is then truncated, and statistical operations are applied to the series in order to obtain equations for the moments (about the origin) of the distribution of the computed value. If the Taylor series is truncated after powers of two, the procedure produces second-order error propagation equations.

3 - Restrictions on the complexity of the problem: The maximum number of component variables allowed is 30. The IBM version will only process one set of input data per run.
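The second-order propagation that SOERP implements can be checked on a case where truncation after second order is exact: for Y = f(X) = X² with X ~ N(μ, σ²), the Taylor formulas E[Y] ≈ f(μ) + ½f″(μ)σ² and Var Y ≈ f′(μ)²σ² + ½f″(μ)²σ⁴ reproduce the exact moments (an independent sketch of the method, not the SOERP code):

```python
mu, sigma = 3.0, 0.5

f = lambda x: x ** 2
fp = lambda x: 2 * x          # first derivative
fpp = 2.0                     # second derivative (constant for f = x^2)

# Second-order Taylor propagation of mean and variance (the sigma**4 term
# in the variance uses the Gaussian fourth central moment 3*sigma**4).
mean_taylor = f(mu) + 0.5 * fpp * sigma ** 2
var_taylor = fp(mu) ** 2 * sigma ** 2 + 0.5 * fpp ** 2 * sigma ** 4

# Exact moments of X**2 for X ~ N(mu, sigma**2).
mean_exact = mu ** 2 + sigma ** 2
var_exact = 4 * mu ** 2 * sigma ** 2 + 2 * sigma ** 4

print(mean_taylor == mean_exact, var_taylor == var_exact)
```

For non-quadratic functions the truncated series is only approximate, which is why SOERP also takes higher central moments (up to the eighth) of the inputs.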
Vasenev, I.
2012-04-01
Essential spatial and temporal variability is a common feature of most natural and man-changed soils in the Central region of the European territory of Russia. The original spatial heterogeneity of forest and forest-steppe soils has been further complicated by a specific land-use history and soil successions running in different directions due to environmental changes and human impacts. For demand-driven land-use planning and decision making, the quantitative analysis, modeling and functional-ecological interpretation of the spatial variability of representative soil cover patterns is an important and challenging task that receives increasing attention from the scientific community, private companies, and governmental and environmental bodies. On the basis of long-term different-scale soil mapping, key plot investigation, land quality and land-use evaluation, and modeling of soil-forming and degradation processes, a functional-ecological typology of the zonal set of elementary soil cover patterns (ESCP) has been developed for representative natural and man-transformed ecosystems of the forest, forest-steppe and steppe zones in the Central region of the European territory of Russia (ETR). The validation and ranking of the limiting factors of functional quality and ecological state have been carried out for the dominant and most dynamic components of the regional-typological ESCP forms, with application of local GIS, traditional regression kriging and correlation tree models. Development, zonal-regional differentiation and verification of the basic set of criteria and algorithms for logically formalized distinguishing of the most "stable" and "hot" areas in soil cover patterns make possible a quantitative assessment of the elementary landscape, soil-forming and degradation processes dominating in them. The data obtained essentially expand the known ranges of the rates of soil-forming processes (SFP) «in situ». In the case of mature forests, the windthrow impacts and lateral processes common to them make SFPs more active and complex both in
Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models
DEFF Research Database (Denmark)
Kock, Anders Bredahl
This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular, we show that the Bridge estimator is oracle efficient: it can correctly distinguish between relevant and irrelevant variables, and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results...
Directory of Open Access Journals (Sweden)
Correchel Vladia
2005-01-01
Full Text Available The precision of the 137Cs fallout redistribution technique for the evaluation of soil erosion rates is strongly dependent on the quality of an average inventory taken at a representative reference site. Knowledge of the sources and of the degree of variation of the 137Cs fallout spatial distribution plays an important role in its use. Four reference sites were selected in the South-Central region of Brazil and characterized in terms of soil chemical, physical and mineralogical aspects as well as the spatial variability of 137Cs inventories. Some important differences in the patterns of 137Cs depth distribution in the soil profiles of the different sites were found. They are probably associated with chemical, physical, mineralogical and biological differences of the soils, but many questions still remain open for future investigation, mainly those regarding the adsorption and dynamics of 137Cs ions in soil profiles under tropical conditions. The random spatial variability (inside each reference site) was higher than the systematic spatial variability (between reference sites), but their causes were not clearly identified as possible consequences of chemical, physical, mineralogical variability, and/or precipitation.
Residual and Past Entropy for Concomitants of Ordered Random Variables of Morgenstern Family
Directory of Open Access Journals (Sweden)
M. M. Mohie EL-Din
2015-01-01
Full Text Available For a system which is observed at time t, the residual and past entropies measure the uncertainty about the remaining and the past life of the distribution, respectively. In this paper, we present the residual and past entropy of the Morgenstern family based on the concomitants of the different types of generalized order statistics (gos) and give the linear transformation of such a model. Characterization results for these dynamic entropies for concomitants of ordered random variables have been considered.
Directory of Open Access Journals (Sweden)
Qunying Wu
2017-05-01
Full Text Available In this paper, we study the equivalent conditions of complete moment convergence for sequences of identically distributed extended negatively dependent random variables. As a result, we extend and generalize some results of complete moment convergence obtained by Chow (Bull. Inst. Math. Acad. Sin. 16:177-201, 1988) and Li and Spătaru (J. Theor. Probab. 18:933-947, 2005) from the i.i.d. case to extended negatively dependent sequences.
An edgeworth expansion for a sum of M-Dependent random variables
Directory of Open Access Journals (Sweden)
Wan Soo Rhee
1985-01-01
Full Text Available Given a sequence X1, X2, …, Xn of m-dependent random variables with moments of order 3+α (0 < α ≤ 1), we give an Edgeworth expansion of the distribution of Sσ⁻¹ (where S = X1 + X2 + … + Xn and σ² = ES²) under the assumption that E[exp(itSσ⁻¹)] is small away from the origin. The result is of the best possible order.
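The m-dependent setting of this abstract is more general, but the shape of an Edgeworth correction is easiest to see in the classical i.i.d. case, where the one-term expansion is Φ(x) − φ(x)·γ·(x² − 1)/(6√n), with γ the skewness of a single summand. A minimal sketch (assuming SciPy is available), checked against a sum of exponentials whose distribution is known exactly:

```python
import numpy as np
from scipy import stats

def edgeworth_cdf(x, skew, n):
    """One-term Edgeworth approximation to the CDF of the standardized sum
    (S_n - n*mu) / (sigma * sqrt(n)) for i.i.d. summands with skewness `skew`."""
    return stats.norm.cdf(x) - stats.norm.pdf(x) * skew * (x**2 - 1) / (6 * np.sqrt(n))

# Exact benchmark: S_n = sum of n Exp(1) variables is Gamma(n, 1),
# with mean n, variance n, and per-summand skewness 2.
n, x = 50, 0.5
exact = stats.gamma.cdf(n + x * np.sqrt(n), a=n)   # exact standardized CDF at x
normal = stats.norm.cdf(x)                         # plain CLT approximation
edge = edgeworth_cdf(x, skew=2.0, n=n)             # skewness-corrected value
```

The skewness term moves the normal approximation noticeably closer to the exact Gamma CDF, which is the sense in which the expansion improves on the CLT rate.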
Geluk, Jaap; Peng, Liang; de Vries, Casper G.
1999-01-01
Suppose X1,X2 are independent random variables satisfying a second-order regular variation condition on the tail-sum and a balance condition on the tails. In this paper we give a description of the asymptotic behaviour as t → ∞ for P(X1 + X2 > t). The result is applied to the problem of risk diversification in portfolio analysis and to the estimation of the parameter in a MA(1) model.
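For such regularly varying (hence subexponential) tails, the first-order content of the asymptotic is P(X1 + X2 > t) ≈ P(X1 > t) + P(X2 > t). A hedged Monte Carlo sketch with i.i.d. Pareto tails (the tail index α = 2 and the threshold are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
alpha, t, n = 2.0, 20.0, 2_000_000

# Pareto samples with P(X > s) = s**(-alpha) for s >= 1
x1 = 1.0 + rng.pareto(alpha, n)
x2 = 1.0 + rng.pareto(alpha, n)

p_sum = np.mean(x1 + x2 > t)     # Monte Carlo estimate of P(X1 + X2 > t)
p_tails = 2 * t ** (-alpha)      # first-order tail-sum approximation
ratio = p_sum / p_tails
```

At finite t the ratio sits somewhat above 1 because of second-order terms; controlling that remainder is exactly what the paper's second-order regular variation condition is for.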
A cellular automata model of traffic flow with variable probability of randomization
International Nuclear Information System (INIS)
Zheng Wei-Fan; Zhang Ji-Ye
2015-01-01
Research on the stochastic behavior of traffic flow is important to understand the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of vehicles before him, and his decision-making process is related to the interactional potential. Compared with the traditional cellular automata model, the modeling is more suitable for the driver’s random decision-making process based on the vehicle and traffic situations in front of him in actual traffic. From the improved model, the fundamental diagram (flow–density relationship) is obtained, and the detailed high-density traffic phenomenon is reproduced through numerical simulation. (paper)
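A minimal sketch of this kind of model: a Nagel–Schreckenberg-style ring road in which the randomization probability depends on the gap to the vehicle ahead. The gap threshold and the two probabilities below are illustrative assumptions standing in for the paper's interactional potential, not its actual formulation.

```python
import numpy as np

def nasch_flow(density, p_free=0.1, p_jam=0.5, vmax=5,
               cells=1000, steps=2000, warmup=500, seed=0):
    """Mean flow (vehicles per cell per step) on a ring road with a
    gap-dependent randomization probability."""
    rng = np.random.default_rng(seed)
    n_cars = int(density * cells)
    pos = np.sort(rng.choice(cells, size=n_cars, replace=False))
    vel = np.zeros(n_cars, dtype=int)
    flow = 0.0
    for step in range(steps):
        gaps = (np.roll(pos, -1) - pos - 1) % cells   # empty cells ahead
        vel = np.minimum(vel + 1, vmax)               # 1. accelerate
        vel = np.minimum(vel, gaps)                   # 2. brake to the gap
        p = np.where(gaps < vmax, p_jam, p_free)      # 3. variable randomization
        vel = np.maximum(vel - (rng.random(n_cars) < p), 0)
        pos = (pos + vel) % cells                     # 4. move (parallel update)
        if step >= warmup:
            flow += vel.sum() / cells
    return flow / (steps - warmup)

flow_free = nasch_flow(0.10)   # free-flow branch
flow_jam = nasch_flow(0.80)    # congested branch
```

Sweeping `density` over (0, 1) traces out a fundamental diagram: flow rises with density on the free-flow branch and falls again on the congested branch.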
Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H
2017-07-01
Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
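A hedged scikit-learn sketch of the backward-elimination idea examined here, on synthetic data (the data set, forest size, and stopping point are assumptions; the paper's StreamCat analysis is not reproduced):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the stream-condition data: 5 informative predictors
# (placed first because shuffle=False) plus 25 noise predictors.
X, y = make_classification(n_samples=600, n_features=30, n_informative=5,
                           n_redundant=0, shuffle=False, random_state=0)
features = list(range(X.shape[1]))

def oob_accuracy(cols):
    """Fit an RF on the given columns; return OOB accuracy and importances."""
    rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                                random_state=0, n_jobs=-1)
    rf.fit(X[:, cols], y)
    return rf.oob_score_, rf.feature_importances_

# Backward elimination: repeatedly drop the least important predictor.
trace = []
while len(features) > 5:
    score, importances = oob_accuracy(features)
    trace.append((len(features), score))
    features.pop(int(np.argmin(importances)))

final_score, _ = oob_accuracy(features)
```

Comparing `trace` against a cross-validation held external to the selection loop, as the paper does, is what exposes the optimistic bias of selecting on the same out-of-bag scores.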
AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM
International Nuclear Information System (INIS)
Farrell, Sean A.; Murphy, Tara; Lo, Kitty K.
2015-01-01
In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.
Haller, Bernhard; Ulm, Kurt
2018-02-20
To individualize treatment decisions based on patient characteristics, identification of an interaction between a biomarker and treatment is necessary. Often such potential interactions are analysed using data from randomized clinical trials intended for comparison of two treatments. Tests of interactions are often lacking statistical power and we investigated if and how a consideration of further prognostic variables can improve power and decrease the bias of estimated biomarker-treatment interactions in randomized clinical trials with time-to-event outcomes. A simulation study was performed to assess how prognostic factors affect the estimate of the biomarker-treatment interaction for a time-to-event outcome, when different approaches, like ignoring other prognostic factors, including all available covariates or using variable selection strategies, are applied. Different scenarios regarding the proportion of censored observations, the correlation structure between the covariate of interest and further potential prognostic variables, and the strength of the interaction were considered. The simulation study revealed that in a regression model for estimating a biomarker-treatment interaction, the probability of detecting a biomarker-treatment interaction can be increased by including prognostic variables that are associated with the outcome, and that the interaction estimate is biased when relevant prognostic variables are not considered. However, the probability of a false-positive finding increases if too many potential predictors are included or if variable selection is performed inadequately. We recommend undertaking an adequate literature search before data analysis to derive information about potential prognostic variables and to gain power for detecting true interaction effects and pre-specifying analyses to avoid selective reporting and increased false-positive rates.
Extended q-Gaussian and q-exponential distributions from gamma random variables
Budini, Adrián A.
2015-05-01
The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape indices determine the complexity parameter q. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
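The gamma-to-beta change of variables mentioned at the end of the abstract is easy to check numerically: if G1 ~ Gamma(a, θ) and G2 ~ Gamma(b, θ) are independent with the same scale θ, then G1/(G1 + G2) ~ Beta(a, b). A sketch of that relation (the paper's q-Gaussian construction itself is not reproduced here; a, b are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, n = 2.5, 4.0, 1_000_000

g1 = rng.gamma(a, 1.0, n)   # same scale parameter for both gammas
g2 = rng.gamma(b, 1.0, n)
beta = g1 / (g1 + g2)       # should be distributed as Beta(a, b)

mean_theory = a / (a + b)
var_theory = a * b / ((a + b) ** 2 * (a + b + 1))
mean_err = abs(beta.mean() - mean_theory)
var_err = abs(beta.var() - var_theory)
```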
Using randomized variable practice in the treatment of childhood apraxia of speech.
Skelton, Steven L; Hagopian, Aubrie Lynn
2014-11-01
The purpose of this study was to determine if randomized variable practice, a central component of concurrent treatment, would be effective and efficient in treating childhood apraxia of speech (CAS). Concurrent treatment is a treatment program that takes the speech task hierarchy and randomizes it so that all tasks are worked on in one session. Previous studies have shown the treatment program to be effective and efficient in treating phonological and articulation disorders. The program was adapted to be used with children with CAS. A research design of multiple baselines across participants was used. Probes of generalization to untaught words were administered every fifth session. Three children, ranging in age from 4 to 6 years old, were the participants. Data were collected as percent correct productions during baseline, treatment, and probes of generalization of target sounds to untaught words and three-word phrases. All participants showed an increase in correct productions during treatment and during probes. Effect sizes (standard mean difference) for treatment were 3.61-5.00, and for generalization probes, they were 3.15-8.51. The results obtained from this study suggest that randomized variable practice as used in concurrent treatment can be adapted for use in treating children with CAS. Replication of this study with other children presenting CAS will be needed to establish generality of the findings.
Plass-Johnson, Jeremiah G; Taylor, Marc H; Husain, Aidah A A; Teichberg, Mirta C; Ferse, Sebastian C A
2016-01-01
Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased
Energy Technology Data Exchange (ETDEWEB)
Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-11-01
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^{-4} probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
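The report's specific methods are not reproduced here, but one distribution-free device in this spirit is easy to state: with n i.i.d. samples, the sample maximum exceeds the true 95th percentile with probability exactly 1 − 0.95ⁿ, whatever the source distribution's shape. A quick Monte Carlo check on a deliberately skewed distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, trials = 10, 20_000

# True 95th percentile of a lognormal(0, 1) source distribution.
q95 = np.exp(stats.norm.ppf(0.95))

# Coverage of "use the sample max as a conservative P95 estimate".
samples = rng.lognormal(0.0, 1.0, size=(trials, n))
coverage = np.mean(samples.max(axis=1) >= q95)
coverage_theory = 1 - 0.95 ** n      # distribution-free, about 0.40 for n = 10
```

The low coverage at n = 10 is exactly the sparse-sample problem the report addresses: naive sample-based estimates bound tail quantities far less reliably than one might hope.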
THE COVARIATION FUNCTION FOR SYMMETRIC α-STABLE RANDOM VARIABLES WITH FINITE FIRST MOMENTS
Directory of Open Access Journals (Sweden)
Dedi Rosadi
2012-05-01
Full Text Available In this paper, we discuss a generalized dependence measure which is designed to measure the dependence of two symmetric α-stable random variables with finite mean (1 < α ≤ 2) and which contains the covariance function as a special case (when α = 2). We briefly discuss some basic properties of the function and consider several methods to estimate it, and we further investigate the numerical properties of the estimator using simulated data. We show how to apply this function to measure the dependence of some stock returns on the composite index LQ45 in the Indonesia Stock Exchange.
A Method of Approximating Expectations of Functions of Sums of Independent Random Variables
Klass, Michael J.
1981-01-01
Let $X_1, X_2, \cdots$ be a sequence of independent random variables with $S_n = \sum^n_{i=1} X_i$. Fix $\alpha > 0$. Let $\Phi(\cdot)$ be a continuous, strictly increasing function on $[0, \infty)$ such that $\Phi(0) = 0$ and $\Phi(cx) \leq c^\alpha \Phi(x)$ for all $x > 0$ and all $c \geq 2$. Suppose $a$ is a real number and $J$ is a finite nonempty subset of the positive integers. In this paper we are interested in approximating $E \max_{j \in J} \Phi(|a + S_j|)$. We construct a nu...
Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C
2018-04-01
A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.
Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li
2014-01-01
Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Building on structural mean models, considerable work has recently been developed for consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
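A minimal numpy sketch of the core instrumental-variable logic in the linear case (the model, coefficients, and sample size are illustrative; the paper's generalized least squares concordance tests are not reproduced): with an unmeasured confounder U, ordinary least squares is biased, while the ratio of instrument-outcome to instrument-exposure covariances recovers the causal slope.

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta_true = 50_000, 1.0

z = rng.normal(size=n)                  # instrument (e.g. a genetic variant score)
u = rng.normal(size=n)                  # unmeasured confounder
x = z + u + rng.normal(size=n)          # exposure, confounded by u
y = beta_true * x + u + rng.normal(size=n)

beta_ols = np.cov(x, y)[0, 1] / np.var(x)          # biased: converges to 4/3 here
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]  # Wald/2SLS ratio: consistent
```

Here OLS converges to 4/3 rather than the true slope of 1, while the IV ratio is consistent; a weak instrument inflates the variance of that ratio, which is the identification issue the abstract mentions.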
Nam, Sungsik; Alouini, Mohamed-Slim; Yang, Hongchuan
2010-01-01
Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs).
On the Distribution of Indefinite Quadratic Forms in Gaussian Random Variables
Al-Naffouri, Tareq Y.
2015-10-30
© 2015 IEEE. In this work, we propose a unified approach to evaluating the CDF and PDF of indefinite quadratic forms in Gaussian random variables. Such a quantity appears in many applications in communications, signal processing, information theory, and adaptive filtering. For example, this quantity appears in the mean-square-error (MSE) analysis of the normalized least-mean-square (NLMS) adaptive algorithm, and the SINR associated with each beam in beamforming applications. The trick of the proposed approach is to replace inequalities that appear in the CDF calculation with unit step functions and to use the complex integral representation of the unit step function. Complex integration then allows us to evaluate the CDF in closed form for the zero mean case and as a single-dimensional integral for the non-zero mean case. Utilizing the saddle point technique allows us to closely approximate such integrals in the non-zero mean case. We demonstrate how our approach can be extended to other scenarios such as the joint distribution of quadratic forms and ratios of such forms, and to characterize quadratic forms in isotropically distributed random variables. We also evaluate the outage probability in multiuser beamforming using our approach to provide an application of indefinite forms in communications.
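The closed-form and saddle-point machinery of the paper is not reproduced here, but any such result must agree with brute-force evaluation. A Monte Carlo sketch of the CDF of x'Ax, sanity-checked against the chi-square case A = I (the dimension and thresholds are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, dim = 500_000, 4

x = rng.standard_normal((n, dim))   # zero-mean i.i.d. Gaussian vectors

def mc_cdf(A, t):
    """Monte Carlo estimate of P(x'Ax <= t)."""
    q = np.einsum('ni,ij,nj->n', x, A, x)
    return np.mean(q <= t)

# Sanity check: A = I gives a chi-square with `dim` degrees of freedom.
t = 5.0
approx = mc_cdf(np.eye(dim), t)
exact = stats.chi2.cdf(t, df=dim)

# An indefinite form (mixed-sign eigenvalues), as in the paper's setting;
# by symmetry P(Q <= 0) should be exactly 1/2 here.
A_indef = np.diag([1.0, 1.0, -1.0, -1.0])
p_indef = mc_cdf(A_indef, 0.0)
```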
Multiobjective Two-Stage Stochastic Programming Problems with Interval Discrete Random Variables
Directory of Open Access Journals (Sweden)
S. K. Barik
2012-01-01
Full Text Available Most real-life decision-making problems have more than one conflicting and incommensurable objective function. In this paper, we present a multiobjective two-stage stochastic linear programming problem considering some parameters of the linear constraints as interval type discrete random variables with known probability distribution. Randomness of the discrete intervals is considered for the model parameters. Further, the concepts of best optimum and worst optimum solution are analyzed in two-stage stochastic programming. To solve the stated problem, first we remove the randomness of the problem and formulate an equivalent deterministic linear programming model with multiobjective interval coefficients. Then the deterministic multiobjective model is solved using the weighting method, where we apply the solution procedure of the interval linear programming technique. We obtain the upper and lower bounds of the objective function as the best and the worst value, respectively. This highlights the possible risk involved in the decision-making tool. A numerical example is presented to demonstrate the proposed solution procedure.
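A minimal single-objective relative of the model described here, assuming point-valued (rather than interval) discrete scenarios: removing the randomness yields a deterministic-equivalent LP over the first-stage decision and one recourse variable per scenario. A sketch with scipy.optimize.linprog; all numbers are illustrative:

```python
from scipy.optimize import linprog

# First stage: buy x at unit cost 1. Second stage: for each equally likely
# demand scenario d_s in {1, 3}, the shortfall y_s >= d_s - x costs 2 per unit.
# Deterministic equivalent over variables [x, y1, y2]:
#   min  x + 0.5*2*y1 + 0.5*2*y2   s.t.  x + y_s >= d_s,  x, y_s >= 0
c = [1.0, 1.0, 1.0]
A_ub = [[-1.0, -1.0, 0.0],    # -(x + y1) <= -1
        [-1.0, 0.0, -1.0]]    # -(x + y2) <= -3
b_ub = [-1.0, -3.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
```

The optimal expected cost is 3. The paper's interval coefficients would turn this single LP into a pair of LPs whose optima give the best-optimum and worst-optimum bounds.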
Gómez-Gómez, Enrique; Carrasco-Valiente, Julia; Blanca-Pedregosa, Ana; Barco-Sánchez, Beatriz; Fernandez-Rueda, Jose Luis; Molina-Abril, Helena; Valero-Rosa, Jose; Font-Ugalde, Pilar; Requena-Tapia, Maria José
2017-04-01
To externally validate the European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculator (RC) and to evaluate its variability between 2 consecutive prostate-specific antigen (PSA) values. We prospectively catalogued 1021 consecutive patients before prostate biopsy for suspicion of prostate cancer (PCa). The risk of PCa and significant PCa (Gleason score ≥7) from 749 patients was calculated according to ERSPC-RC (digital rectal examination-based version 3 of 4) for 2 consecutive PSA tests per patient. The calculators' predictions were analyzed using calibration plots and the area under the receiver operating characteristic curve (area under the curve). Cohen kappa coefficient was used to compare the ability and variability. Of 749 patients, PCa was detected in 251 (33.5%) and significant PCa was detected in 133 (17.8%). Calibration plots showed an acceptable parallelism and similar discrimination ability for both PSA levels with an area under the curve of 0.69 for PCa and 0.74 for significant PCa. The ERSPC showed 226 (30.2%) unnecessary biopsies with the loss of 10 significant PCa. The variability of the RC was 16% for PCa and 20% for significant PCa, and a higher variability was associated with a reduced risk of significant PCa. We can conclude that the performance of the ERSPC-RC in the present cohort shows a high similitude between the 2 PSA levels; however, the RC variability value is associated with a decreased risk of significant PCa. The use of the ERSPC in our cohort detects a high number of unnecessary biopsies. Thus, the incorporation of ERSPC-RC could help the clinical decision to carry out a prostate biopsy. Copyright © 2016 Elsevier Inc. All rights reserved.
Usui, Harunobu; Nishida, Yusuke
2017-01-01
The very low-frequency (VLF) band of heart rate variability (HRV) has different characteristics compared with other HRV components. Here we investigated differences in HRV changes after a mental stress task. After the task, the high-frequency (HF) band and ratio of high- to low-frequency bands (LF/HF) immediately returned to baseline. We evaluated the characteristics of VLF band changes after a mental stress task. We hypothesized that the VLF band decreases during the Stroop color word task and there would be a delayed recovery for 2 h after the task (i.e., the VLF change would exhibit a "slow recovery"). Nineteen healthy, young subjects were instructed to rest for 10 min, followed by a Stroop color word task for 20 min. After the task, the subjects were instructed to rest for 120 min. For all subjects, R-R interval data were collected; analysis was performed for VLF, HF, and LF/HF ratio. HRV during the rest time and each 15-min interval of the recovery time were compared. An analysis of the covariance was performed to adjust for the HF band and LF/HF ratio as confounding variables of the VLF component. HF and VLF bands significantly decreased and the LF/HF ratio significantly increased during the task compared with those during rest time. During recovery, the VLF band was significantly decreased compared with the rest time. After the task, the HF band and LF/HF ratio immediately returned to baseline and were not significantly different from the resting values. After adjusting for HF and LF/HF ratio, the VLF band had significantly decreased compared with that during rest. The VLF band is the "slow recovery" component and the HF band and LF/HF ratio are the "quick recovery" components of HRV. This VLF characteristic may clarify the unexplained association of the VLF band in cardiovascular disease prevention.
Parajuli, A.; Nadeau, D.; Anctil, F.; Parent, A. C.; Bouchard, B.; Jutras, S.
2017-12-01
In snow-fed catchments, it is crucial to monitor and to model snow water equivalent (SWE), particularly to simulate the melt water runoff. However, the distribution of SWE can be highly heterogeneous, particularly within forested environments, mainly because of the large variability in snow depths. Although the boreal forest is the dominant land cover in Canada and in a few other northern countries, very few studies have quantified the spatiotemporal variability of snow depths and snowpack dynamics within this biome. The objective of this paper is to fill this research gap, through a detailed monitoring of snowpack dynamics at nine locations within a 3.57 km2 experimental forested catchment in southern Quebec, Canada (47°N, 71°W). The catchment receives 6 m of snow annually on average and is predominantly covered with balsam fir stands with some traces of spruce and white birch. In this study, we used a network of nine so-called "snow profiling stations", providing automated snow depth and snowpack temperature profile measurements, as well as three contrasting sites (juvenile, sapling and open areas) where sublimation rates were directly measured with flux towers. In addition, a total of 1401 manual snow samples supported by 20 snow pit measurements were collected throughout the winter of 2017. This paper presents some preliminary analyses of this unique dataset. Simple empirical relations relating SWE to easy-to-determine proxies, such as snow depth and snow temperature, are tested. Then, binary regression trees and multiple regression analysis are used to model SWE using topographic characteristics (slope, aspect, elevation), forest features (tree height, tree diameter, forest density and gap fraction) and meteorological forcing (solar radiation, wind speed, snowpack temperature profile, air temperature, humidity). An analysis of sublimation rates comparing open area, saplings and juvenile forest is also presented in this paper.
Ramírez, Cristian; Young, Ashley; James, Bryony; Aguilera, José M
2010-10-01
Quantitative analysis of food structure is commonly obtained by image analysis of a small portion of the material that may not be representative of the whole sample. In order to quantify structural parameters (air cells) of 2 types of bread (bread and bagel), the concept of representative volume element (RVE) was employed. The RVE for bread, bagel, and gelatin-gel (used as control) was obtained from the relationship between sample size and the coefficient of variation, calculated from the apparent Young's modulus measured on 25 replicates. The RVE was obtained when the coefficient of variation for different sample sizes converged to a constant value. In the 2 types of bread tested, the coefficient of variation tended to decrease as the sample size increased, while in the homogeneous gelatin-gel it remained constant at around 2.3% to 2.4%. The RVE turned out to be cubes with sides of 45 mm for bread, 20 mm for bagels, and 10 mm for gelatin-gel (the smallest sample tested). Quantitative image analysis as well as visual observation demonstrated that bread presented the largest dispersion of air-cell sizes. Moreover, both the ratio of maximum air-cell area to image area and of maximum air-cell height to image height were greater for bread (values of 0.05 and 0.30, respectively) than for bagels (0.03 and 0.20, respectively). Therefore, the size and the size variation of the air cells present in the structure determined the size of the RVE. It was concluded that the RVE is highly dependent on the heterogeneity of the structure of these types of baked products.
Sloan, Richard P; Schwarz, Emilie; McKinley, Paula S; Weinstein, Maxine; Love, Gayle; Ryff, Carol; Mroczek, Daniel; Choo, Tse-Hwei; Lee, Seonjoo; Seeman, Teresa
2017-01-01
High frequency (HF) heart rate variability (HRV) has long been accepted as an index of cardiac vagal control. Recent studies report relationships between HF-HRV and indices of positive and negative affect, personality traits and well-being but these studies generally are based on small and selective samples. These relationships were examined using data from 967 participants in the second Midlife in the U.S. (MIDUS II) study. Participants completed survey questionnaires on well-being and affect. HF-HRV was measured at rest. A hierarchical series of regression analyses examined relationships between these various indices and HF-HRV before and after adjustment for relevant demographic and biomedical factors. Significant inverse relationships were found only between indices of negative affect and HF-HRV. Relationships between indices of psychological and hedonic well-being and positive affect failed to reach significance. These findings raise questions about relationships between cardiac parasympathetic modulation, emotion regulation, and indices of well-being. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Yagui, Ana Cristina Zanon; Vale, Luciana Assis Pires Andrade; Haddad, Luciana Branco; Prado, Cristiane; Rossi, Felipe Souza; Deutsch, Alice D Agostini; Rebello, Celso Moura
2011-01-01
To evaluate the efficacy and safety of nasal continuous positive airway pressure (NCPAP) using devices with variable flow or bubble continuous positive airway pressure (CPAP) regarding CPAP failure, presence of air leaks, total CPAP and oxygen time, and length of intensive care unit and hospital stay in neonates with moderate respiratory distress (RD) and birth weight (BW) ≥ 1,500 g. Forty newborns requiring NCPAP were randomized into two study groups: variable flow group (VF) and continuous flow group (CF). The study was conducted between October 2008 and April 2010. Demographic data, CPAP failure, presence of air leaks, and total CPAP and oxygen time were recorded. Categorical outcomes were tested using the chi-square test or Fisher's exact test. Continuous variables were analyzed using the Mann-Whitney test. The level of significance was set at p < 0.05. There were no significant differences between the groups in CPAP failure (21.1 and 20.0% for VF and CF, respectively; p = 1.000), air leak syndrome (10.5 and 5.0%, respectively; p = 0.605), total CPAP time (median: 22.0 h, interquartile range [IQR]: 8.00-31.00 h and median: 22.0 h, IQR: 6.00-32.00 h, respectively; p = 0.822), or total oxygen time (median: 24.00 h, IQR: 7.00-85.00 h and median: 21.00 h, IQR: 9.50-66.75 h, respectively; p = 0.779). In newborns with BW ≥ 1,500 g and moderate RD, the use of continuous flow NCPAP showed the same benefits as the use of variable flow NCPAP.
On the strong law of large numbers for $\\varphi$-subgaussian random variables
Zajkowski, Krzysztof
2016-01-01
For $p\\ge 1$ let $\\varphi_p(x)=x^2/2$ if $|x|\\le 1$ and $\\varphi_p(x)=1/p|x|^p-1/p+1/2$ if $|x|>1$. For a random variable $\\xi$ let $\\tau_{\\varphi_p}(\\xi)$ denote $\\inf\\{a\\ge 0:\\;\\forall_{\\lambda\\in\\mathbb{R}}\\; \\ln\\mathbb{E}\\exp(\\lambda\\xi)\\le\\varphi_p(a\\lambda)\\}$; $\\tau_{\\varphi_p}$ is a norm in a space $Sub_{\\varphi_p}=\\{\\xi:\\;\\tau_{\\varphi_p}(\\xi)1$) there exist positive constants $c$ and $\\alpha$ such that for every natural number $n$ the following inequality $\\tau_{\\varphi_p}(\\sum_{i=1...
Directory of Open Access Journals (Sweden)
Deli Li
1992-01-01
Full Text Available Let X, X_n, n ≥ 1 be a sequence of i.i.d. real random variables, and S_n = ∑_{k=1}^{n} X_k, n ≥ 1. Convergence rates of moderate deviations are derived; i.e., the rate of convergence to zero of certain tail probabilities of the partial sums is determined. For example, we obtain equivalent conditions for the convergence of the series ∑_{n≥1} (ψ²(n)/n) P(|S_n| ≥ √n φ(n)) only under the assumptions that EX = 0 and EX² = 1, where φ and ψ are taken from a broad class of functions. These results generalize and improve some recent results of Li (1991) and Gafurov (1982) and some previous work of Davis (1968). For b ∈ [0,1] and ϵ > 0, let λ_{ϵ,b} = ∑_{n≥3} ((log log n)^b / n) I(|S_n| ≥ √((2+ϵ) n log log n)). The behaviour of Eλ_{ϵ,b} as ϵ ↓ 0 is also studied.
MODELING THE TIME VARIABILITY OF SDSS STRIPE 82 QUASARS AS A DAMPED RANDOM WALK
International Nuclear Information System (INIS)
MacLeod, C. L.; Ivezic, Z.; Bullock, E.; Kimball, A.; Sesar, B.; Westman, D.; Brooks, K.; Gibson, R.; Becker, A. C.; Kochanek, C. S.; Kozlowski, S.; Kelly, B.; De Vries, W. H.
2010-01-01
We model the time variability of ∼9000 spectroscopically confirmed quasars in SDSS Stripe 82 as a damped random walk (DRW). Using 2.7 million photometric measurements collected over 10 yr, we confirm the results of Kelly et al. and Kozlowski et al. that this model can explain quasar light curves at an impressive fidelity level (0.01-0.02 mag). The DRW model provides a simple, fast (O(N) for N data points), and powerful statistical description of quasar light curves by a characteristic timescale (τ) and an asymptotic rms variability on long timescales (SF∞). We searched for correlations between these two variability parameters and physical parameters such as luminosity, black hole mass, and rest-frame wavelength. Our analysis shows SF∞ increasing with decreasing luminosity and rest-frame wavelength, as observed previously, with no correlation with redshift. We find a correlation between SF∞ and black hole mass with a power-law index of 0.18 ± 0.03, independent of the anti-correlation with luminosity. We find that τ increases with increasing wavelength with a power-law index of 0.17, remains nearly constant with redshift and luminosity, and increases with increasing black hole mass with a power-law index of 0.21 ± 0.07. The amplitude of variability is anti-correlated with the Eddington ratio, which suggests a scenario where optical fluctuations are tied to variations in the accretion rate. However, we find an additional dependence on luminosity and/or black hole mass that cannot be explained by the trend with Eddington ratio. The radio-loudest quasars have systematically larger variability amplitudes, by about 30%, when corrected for the other observed trends, while the distribution of their characteristic timescales is indistinguishable from that of the full sample. We do not detect any statistically robust differences in the characteristic timescale and variability amplitude between the full sample and the small subsample of quasars detected
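A DRW is simply an Ornstein–Uhlenbeck process, so the (τ, SF∞) parameterization described above can be illustrated with a short simulation. The parameters below are arbitrary illustrations, not fits to Stripe 82 data:

```python
import numpy as np

def simulate_drw(n_steps, dt, tau, sf_inf, seed=0):
    """Simulate a damped random walk (Ornstein-Uhlenbeck process).

    tau    : characteristic damping timescale
    sf_inf : asymptotic rms variability on long timescales
    """
    rng = np.random.default_rng(seed)
    a = np.exp(-dt / tau)              # exact OU decay factor over one step
    sigma = sf_inf / np.sqrt(2)        # stationary standard deviation
    x = np.empty(n_steps)
    x[0] = rng.normal(0.0, sigma)      # start in the stationary state
    for i in range(1, n_steps):
        x[i] = a * x[i - 1] + rng.normal(0.0, sigma * np.sqrt(1 - a**2))
    return x

lc = simulate_drw(n_steps=5000, dt=1.0, tau=200.0, sf_inf=0.2)

# Structure function at a lag much longer than tau should approach sf_inf
lag = 2000
sf = np.sqrt(np.mean((lc[lag:] - lc[:-lag]) ** 2))
print(round(sf, 2))
```

At lags much longer than τ the structure function of the simulated light curve flattens toward SF∞, which is how the two parameters are read off real quasar light curves.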
Susukida, Ryoko; Crum, Rosa M; Stuart, Elizabeth A; Ebnesajjad, Cyrus; Mojtabai, Ramin
2016-07-01
To compare the characteristics of individuals participating in randomized controlled trials (RCTs) of treatments of substance use disorder (SUD) with individuals receiving treatment in usual care settings, and to provide a summary quantitative measure of differences between characteristics of these two groups of individuals using propensity score methods. Design: Analyses using data from RCT samples from the National Institute on Drug Abuse Clinical Trials Network (CTN) and target populations of patients drawn from the Treatment Episodes Data Set-Admissions (TEDS-A). Settings: Multiple clinical trial sites and nation-wide usual SUD treatment settings in the United States. Participants: A total of 3,592 individuals from 10 CTN samples and 1,602,226 individuals selected from TEDS-A between 2001 and 2009. Measurements: The propensity scores for enrolling in the RCTs were computed based on the following nine observable characteristics: sex, race/ethnicity, age, education, employment status, marital status, admission to treatment through criminal justice, intravenous drug use and the number of prior treatments. Findings: The proportion of those with ≥ 12 years of education and the proportion of those who had full-time jobs were significantly higher among RCT samples than among target populations (in seven and nine trials, respectively). The difference in the mean propensity scores between the RCTs and the target population was 1.54 standard deviations and was statistically significant, indicating that RCT participants were significantly different from individuals receiving treatment in usual care settings. Notably, RCT participants tend to have more years of education and a greater likelihood of full-time work compared with people receiving care in usual care settings. © 2016 Society for the Study of Addiction.
An AUC-based permutation variable importance measure for random forests.
Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure
2013-04-05
The random forest (RF) method is a commonly used tool for classification with high-dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However, the classification performance of RF is known to be suboptimal in the case of strongly unbalanced data, i.e., data where response class sizes differ considerably. Suggestions have been made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However, to our knowledge, the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings, while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response with increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
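The paper's AUC-based VIM is implemented in the R package party; a rough Python analogue (an illustration of the idea, not the authors' code) is to score scikit-learn's permutation importance with ROC AUC instead of accuracy on an imbalanced dataset:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Imbalanced binary data: features 0-2 are informative, 3-9 are noise
# (shuffle=False keeps the informative features in the first columns)
X, y = make_classification(n_samples=2000, n_features=10, n_informative=3,
                           n_redundant=0, weights=[0.9, 0.1],
                           shuffle=False, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Permutation VIM = drop in a chosen score when one predictor is permuted.
# With strong imbalance the accuracy-based VIM can be nearly flat, while
# the AUC-based VIM still separates informative from noise predictors.
vim_acc = permutation_importance(rf, X_te, y_te, scoring='accuracy',
                                 n_repeats=10, random_state=0).importances_mean
vim_auc = permutation_importance(rf, X_te, y_te, scoring='roc_auc',
                                 n_repeats=10, random_state=0).importances_mean

print(np.argsort(vim_auc)[::-1][:3])  # top-3 predictors by AUC-based VIM
```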
Directory of Open Access Journals (Sweden)
A. Stankovic
2012-12-01
Full Text Available The distributions of random variables are of interest in many areas of science. In this paper, motivated by the importance of multi-hop transmission in contemporary wireless communication systems operating over fading channels in the presence of cochannel interference, the probability density functions (PDFs) of the minimum of an arbitrary number of ratios of Rayleigh, Rician, Nakagami-m, Weibull and α-µ random variables are derived. These expressions can be used to study the outage probability as an important multi-hop system performance measure. Various numerical results complement the proposed mathematical analysis.
Directory of Open Access Journals (Sweden)
Coletta Filho Helvécio Della
2000-01-01
Full Text Available RAPD analysis of 19 Ponkan mandarin accessions was performed using 25 random primers. Of 112 amplification products selected, only 32 were polymorphic across five accessions. The absence of genetic variability among the other 14 accessions suggested that they were either clonal propagations with different local names, or that they had undetectable genetic variability, such as point mutations which cannot be detected by RAPD.
Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?
Directory of Open Access Journals (Sweden)
Andrei Khrennikov
2008-03-01
Full Text Available The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (constructing a single Kolmogorov probability space). These investigations were started over a hundred years ago by G. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorob'ev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell's inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see, a priori, any link to such problems as nonlocality and the "death of reality" which are typically linked to Bell's type inequalities in the physics literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we find that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell's type inequalities, of statistical data from a number of experiments performed under different experimental contexts.
How Far Is Quasar UV/Optical Variability from a Damped Random Walk at Low Frequency?
Energy Technology Data Exchange (ETDEWEB)
Guo Hengxiao; Wang Junxian; Cai Zhenyi; Sun Mouyuan, E-mail: hengxiaoguo@gmail.com, E-mail: jxw@ustc.edu.cn [CAS Key Laboratory for Research in Galaxies and Cosmology, Department of Astronomy, University of Science and Technology of China, Hefei 230026 (China)
2017-10-01
Studies have shown that UV/optical light curves of quasars can be described using the prevalent damped random walk (DRW) model, also known as the Ornstein–Uhlenbeck process. A white noise power spectral density (PSD) is expected at low frequency in this model; however, a direct observational constraint to the low-frequency PSD slope is difficult due to the limited lengths of the light curves available. Meanwhile, quasars show scatter in their DRW parameters that is too large to be attributed to uncertainties in the measurements and dependence on the variation of known physical factors. In this work we present simulations showing that, if the low-frequency PSD deviates from the DRW, the red noise leakage can naturally produce large scatter in the variation parameters measured from simulated light curves. The steeper the low-frequency PSD slope, the larger scatter we expect. Based on observations of SDSS Stripe 82 quasars, we find that the low-frequency PSD slope should be no steeper than −1.3. The actual slope could be flatter, which consequently requires that the quasar variabilities should be influenced by other unknown factors. We speculate that the magnetic field and/or metallicity could be such additional factors.
The quotient of normal random variables and application to asset price fat tails
Caginalp, Carey; Caginalp, Gunduz
2018-06-01
The quotient of random variables with normal distributions is examined and proven to have power law decay, with density f(x) ≃ f₀x⁻², with the coefficient depending on the means and variances of the numerator and denominator and their correlation. We also obtain the conditional probability densities for each of the four quadrants given by the signs of the numerator and denominator for arbitrary correlation ρ ∈ [−1, 1). For ρ = −1 we obtain a particularly simple closed-form solution for all x ∈ ℝ. The results are applied to a basic issue in economics and finance, namely the density of relative price changes. Classical finance stipulates a normal distribution of relative price changes, though empirical studies suggest a power law at the tail end. By considering the supply and demand in a basic price change model, we prove that the relative price change has a density that decays with an x⁻² power law. Various parameter limits are established.
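An x⁻² density tail implies that the survival probability P(|Z| > t) falls off like 1/t, so doubling the threshold should halve the tail probability. This is easy to check by Monte Carlo; the means, variances and ρ = 0 below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000

# Quotient of two independent normals (rho = 0 for simplicity)
x = rng.normal(1.0, 1.0, n)   # numerator: mean 1, sd 1
y = rng.normal(1.0, 1.0, n)   # denominator: mean 1, sd 1
z = x / y

# f(x) ~ f0 * x^-2 gives P(|Z| > t) ~ c / t, so the ratio of tail
# probabilities at thresholds t and 2t should be close to 2.
t = 50.0
p1 = np.mean(np.abs(z) > t)
p2 = np.mean(np.abs(z) > 2 * t)
print(round(p1 / p2, 1))  # ~ 2
```

The same check fails for a normally distributed Z, whose tails thin out far faster than 1/t, which is the contrast with classical finance the abstract points to.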
Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas
2010-02-27
Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and was unbiased under H0. Scaled VIMs were clearly biased under HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increased VIMs for correlated predictors may be considered a "bias" - because they do not directly reflect the coefficients in the generating model - or if it is a beneficial attribute of these VIMs is dependent on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.
Directory of Open Access Journals (Sweden)
Yang Yang
2013-01-01
Full Text Available We investigate the asymptotic tail behavior of randomly weighted sums whose increments have convolution-equivalent distributions. The result obtained can be applied directly to a discrete-time insurance risk model with insurance and financial risks, yielding the asymptotics of the finite-time ruin probability of this model.
Chelminiak, P.; Dixon, J. M.; Tuszyński, J. A.; Marsh, R. E.
2006-05-01
This paper discusses an application of a random network with a variable number of links and traps to the elimination of drug molecules from the body by the liver. The nodes and links represent the transport vessels, and the traps represent liver cells with metabolic enzymes that eliminate drug molecules. By varying the number and configuration of links and nodes, different disease states of the liver related to vascular damage have been simulated, and the effects on the rate of elimination of a drug have been investigated. Results of numerical simulations show the prevalence of exponential decay curves with rates that depend on the concentration of links. In the case of fractal lattices at the percolation threshold, we find that the decay of the concentration is described by exponential functions for high trap concentrations but transitions to stretched exponential behavior at low trap concentrations.
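A toy analogue of this setup (a sketch of the idea, not a reimplementation of the paper's network model) is a population of random walkers on a ring with randomly placed trap nodes, standing in for drug molecules and metabolizing liver cells; the surviving fraction decays over time, roughly exponentially at the trap concentration used here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random walkers on a ring of L nodes; trap_frac of the nodes absorb
# any walker that steps onto them (toy model of drug elimination).
L, n_walkers, trap_frac, n_steps = 500, 20_000, 0.05, 200
traps = rng.random(L) < trap_frac
pos = rng.integers(0, L, n_walkers)
alive = ~traps[pos]                     # walkers starting on a trap die
survival = []
for _ in range(n_steps):
    pos = (pos + rng.choice([-1, 1], n_walkers)) % L
    alive &= ~traps[pos]                # once trapped, always trapped
    survival.append(alive.mean())

print(round(survival[-1], 2))
```

Varying `trap_frac` (the analogue of enzyme concentration) changes the decay rate, mirroring the simulated disease states in the abstract; the crossover to stretched-exponential decay appears only at low trap concentrations and long times.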
International Nuclear Information System (INIS)
Todinov, M.T.
2004-01-01
A new reliability measure is proposed and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the models proposed, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level and a minimum required availability. It is demonstrated that setting reliability requirements solely on the basis of an availability target does not necessarily ensure a low risk of premature failure. Even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level.
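For the simplest case of a single minimum gap, conditioning a homogeneous Poisson process on its number of events n in [0, L] reduces the problem to n i.i.d. uniform points, for which the classical order-statistics result P(all successive gaps ≥ s) = (1 − (n−1)s/L)ⁿ holds. A quick Monte Carlo check of that building block (the paper's full measure covers sets of gaps, which this sketch does not):

```python
import numpy as np

rng = np.random.default_rng(3)

def p_min_gap_exact(n, L, s):
    """P(all successive gaps >= s) for n points uniform on [0, L].
    Classical result, valid when (n - 1) * s <= L."""
    return max(0.0, 1 - (n - 1) * s / L) ** n

def p_min_gap_mc(n, L, s, trials=100_000):
    """Monte Carlo estimate of the same probability."""
    pts = np.sort(rng.uniform(0, L, (trials, n)), axis=1)
    gaps = np.diff(pts, axis=1)
    return float(np.mean(gaps.min(axis=1) >= s))

n, L, s = 5, 1.0, 0.05
print(round(p_min_gap_exact(n, L, s), 3), round(p_min_gap_mc(n, L, s), 3))
```

The complement, 1 − (1 − (n−1)s/L)ⁿ, is the clustering probability the abstract warns about: for n = 5 events and a gap of 5% of the interval it already exceeds 0.67, which illustrates why clustering should not be neglected even at moderate number densities.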
Directory of Open Access Journals (Sweden)
DIYAH MARTANTI
2008-10-01
Full Text Available Amorphophallus muelleri Blume (Araceae) is valued for its glucomannan content for use in the food industry (healthy diet food), paper industry, pharmacy and cosmetics. The species is triploid (2n=3x=39) and the seed develops apomictically. The present research aimed to identify the genetic variability of six populations of A. muelleri from Java (50 accessions in total) using random amplified polymorphic DNA (RAPD). The six populations are, in East Java: (1) Silo-Jember, (2) Saradan-Madiun, (3) IPB (cultivated, from Saradan-Madiun), (4) Panti-Jember, (5) Probolinggo; and in Central Java: (6) Cilacap. The results showed that five RAPD primers generated 42 scorable bands, of which 29 (69.05%) were polymorphic. Band sizes varied from 300 bp to 1.5 kbp. The 50 accessions of A. muelleri were divided into two main clusters; some of them grouped by population and some did not. Individual genetic dissimilarity ranged from 0.02 to 0.36. Among the six populations investigated, the Saradan population showed the highest level of genetic variation, with mean values of na = 1.500 ± 0.5061, ne = 1.3174 ± 0.3841, PLP = 50% and He = 0.1832 ± 0.2054, whereas the Silo-Jember population showed the lowest level of genetic variation, with mean values of na = 1.2619 ± 0.4450, ne = 1.1890 ± 0.3507, PLP = 26.19% and He = 0.1048 ± 0.1887. Efforts to conserve, domesticate, cultivate and genetically improve the species should be based on the genetic properties of each population and of individuals within populations; the Saradan population, which has the highest level of genetic variation, needs particular attention for its conservation.
Lühnen, Julia; Haastert, Burkhard; Mühlhauser, Ingrid; Richter, Tanja
2017-09-15
In Germany, the guardianship system provides adults who are no longer able to handle their own affairs with a court-appointed legal representative, for support without restriction of legal capacity. Although these representatives are only rarely qualified in healthcare, they nevertheless play decisive roles in the decision-making processes for people with dementia. Previously, we developed an education program (PRODECIDE) to address this shortcoming and tested it for feasibility. Typical, autonomy-restricting decisions in the care of people with dementia, namely the use of percutaneous endoscopic gastrostomy (PEG) or physical restraints (PR), or the prescription of antipsychotic drugs (AP), are the subject areas covered by the training. The training course aims to enhance the competency of legal representatives in informed decision-making. In this study, we will evaluate the efficacy of the PRODECIDE education program. A randomized controlled trial with a six-month follow-up will be conducted to compare the PRODECIDE education program with standard care, enrolling legal representatives (N = 216). The education program lasts 10 h and comprises four modules: A, decision-making processes and methods; and B, C and D, evidence-based knowledge about PEG, PR and AP, respectively. The primary outcome measure is knowledge, operationalized as the understanding of decision-making processes in healthcare affairs and the setting of realistic expectations about the benefits and harms of PEG, PR and AP in people with dementia. Secondary outcomes are sufficient and sustainable knowledge and the percentage of the persons concerned who are affected by PEG, PR or AP. A qualitative process evaluation will be performed. Additionally, to support implementation, a concept for translating the educational contents into e-learning modules will be developed. The study results will show whether the efficacy of the education program could justify its implementation into the regular training curricula for legal representatives.
Spieth, Peter M; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo
2014-05-02
General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary function and reduces the systemic inflammatory response. However, it is currently not known whether patients undergoing open abdominal surgery might benefit from intraoperative variable ventilation. The PROtective VARiable ventilation trial ('PROVAR') is a single center, randomized controlled trial enrolling 50 patients scheduled for open abdominal surgery expected to last longer than 3 hours. PROVAR compares conventional (non-variable) lung protective ventilation (CV) with variable lung protective ventilation (VV) regarding pulmonary function and inflammatory response. The primary endpoint of the study is the forced vital capacity on the first postoperative day. Secondary endpoints include further lung function tests, plasma cytokine levels, spatial distribution of ventilation assessed by means of electrical impedance tomography and postoperative pulmonary complications. We hypothesize that VV improves lung function and reduces systemic inflammatory response compared to CV in patients receiving mechanical ventilation during general anesthesia for open abdominal surgery longer than 3 hours. PROVAR is the first randomized controlled trial aiming at intra- and postoperative effects of VV on lung function. This study may help to define the role of VV during general anesthesia requiring mechanical ventilation. Clinicaltrials.gov NCT01683578 (registered on September 3, 2012).
Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.
You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary
2011-02-01
The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using the t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
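A common closed-form approximation from this literature, the design effect 1 + ((CV² + 1)·m̄ − 1)·ρ for mean cluster size m̄, intraclass correlation ρ and coefficient of variation CV of cluster sizes (an approximation attributed to the general cluster-trial literature, not the paper's exact noncentrality-based measure), shows how cluster-size variability inflates the required sample size:

```python
import math

def design_effect(m_bar, icc, cv=0.0):
    """Approximate design effect for cluster randomization with mean
    cluster size m_bar, intraclass correlation icc, and coefficient of
    variation cv of the cluster sizes (cv=0 means equal clusters)."""
    return 1 + ((cv**2 + 1) * m_bar - 1) * icc

# Suppose an individually randomized trial needs n = 400 per arm.
# Inflate for clustering with 20 subjects per cluster, ICC = 0.02,
# at increasing levels of cluster-size variability:
n_flat = 400
for cv in (0.0, 0.4, 0.8):
    de = design_effect(20, 0.02, cv)
    clusters = math.ceil(n_flat * de / 20)
    print(cv, round(de, 3), clusters, "clusters per arm")
```

The equal-cluster case gives the baseline number of clusters; rising CV raises the design effect and hence the clusters (or mean cluster size) needed, which is the adjustment the paper formalizes through noncentrality parameters.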
Rosenblum, Michael; van der Laan, Mark J.
2010-01-01
Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636
Genetic variability of cultivated cowpea in Benin assessed by random amplified polymorphic DNA
Zannou, A.; Kossou, D.K.; Ahanchédé, A.; Zoundjihékpon, J.; Agbicodo, E.; Struik, P.C.; Sanni, A.
2008-01-01
Characterization of genetic diversity among cultivated cowpea [Vigna unguiculata (L.) Walp.] varieties is important to optimize the use of available genetic resources by farmers, local communities, researchers and breeders. Random amplified polymorphic DNA (RAPD) markers were used to evaluate the
Voracek, M
2001-04-01
Evolutionary psychological theories predict pronounced and universal male-female differences in sexual jealousy. Recent cross-cultural research, using the forced-choice jealousy items pioneered by Buss, et al., 1992, repeatedly found a large sex differential on these self-report measures: men significantly more often than women choose their mate's imagined sexual infidelity to be more distressing or upsetting to them than an imagined emotional infidelity. However, this body of evidence is solely based on undergraduate samples and does not take into account demographic factors. This study examined male-female differences in sexual jealousy in a community sample (N = 335, Eastern Austria). Within a logistic regression model, with other variables controlled for, marital status was a stronger predictor for sexual jealousy than respondents' sex. Contrary to previous research, the sex differential's effect size was only modest. These findings stress the pitfalls of prematurely generalizing evidence from undergraduate samples to the general population and the need for representative population samples in this research area.
Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W
2016-02-01
The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional level), meso (mid or provider level), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69 %) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum, at a mean 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multilevel approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were
Probability densities and the random variable transformation theorem
International Nuclear Information System (INIS)
Ramshaw, J.D.
1985-01-01
D. T. Gillespie recently derived a random variable transformation theorem relating the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation, in which the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation theorem is then derived from this relation.
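In its simplest monotone form, the transformation theorem discussed above reduces to the change-of-variables identity p_Y(y) = p_X(g⁻¹(y)) |dg⁻¹/dy|. The sketch below (the choice of X ~ Uniform(0,1) and g(x) = -ln x is an illustrative assumption) checks the transformed density by simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random(200_000)          # X ~ Uniform(0, 1)
y = -np.log(x)                   # Y = g(X) = -ln X

# Change of variables: p_Y(y) = p_X(g^{-1}(y)) |d g^{-1}/dy| = 1 * e^{-y},
# i.e. Y is Exp(1) -- the transformation theorem in its simplest monotone form.
hist, edges = np.histogram(y, bins=50, range=(0.0, 5.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - np.exp(-centers)))
print(max_err)  # small: the empirical density matches the transformed density
```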
Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B
1994-01-01
Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),
General Exact Solution to the Problem of the Probability Density for Sums of Random Variables
Tribelsky, Michael I.
2002-07-01
The exact explicit expression for the probability density pN(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of pN(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). A quantitative description of all three parts, as well as of the entire profile, is obtained. A number of particular examples are considered in detail.
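The core/tail split described above can be seen in a small simulation. Below (N and the uniform summands are illustrative assumptions), the core of the distribution of a sum of N i.i.d. Uniform(0,1) summands is compared with the Gaussian prediction, while the tail is necessarily cut off at the hard bound x = N, where a Gaussian tail is not:

```python
import numpy as np

rng = np.random.default_rng(2)
N, trials = 30, 200_000

# Sums of N i.i.d. Uniform(0,1) summands; the CLT fixes the core, not the tail.
s = rng.random((trials, N)).sum(axis=1)
mean, std = N * 0.5, np.sqrt(N / 12.0)
z = (s - mean) / std

# Core: P(|Z| < 1) is close to the Gaussian value of about 0.6827.
core = np.mean(np.abs(z) < 1.0)
# Tail: the exact density vanishes beyond x = N; a Gaussian tail never does.
print(core, s.max() <= N)
```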
Randomness and variability of the neuronal activity described by the Ornstein-Uhlenbeck model
Czech Academy of Sciences Publication Activity Database
Košťál, Lubomír; Lánský, Petr; Zucca, Ch.
2007-01-01
Roč. 18, č. 1 (2007), s. 63-75 ISSN 0954-898X R&D Projects: GA MŠk(CZ) LC554; GA AV ČR(CZ) 1ET400110401; GA AV ČR(CZ) KJB100110701 Grant - others:MIUR(IT) PRIN-Cofin 2005 Institutional research plan: CEZ:AV0Z50110509 Keywords : Ornstein-Uhlenbeck * entropy * randomness Subject RIV: FH - Neurology Impact factor: 1.385, year: 2007
International Nuclear Information System (INIS)
Maestrini, A.P.
1979-04-01
Several problems related to the application of the theory of random processes by means of state variables are studied. The well-known equations that define the propagation of the mean and the variance for linear and non-linear systems are first presented. The Monte Carlo method is then used to determine the applicability of the hypothesis of a normally distributed output in the case of linear systems subjected to non-Gaussian excitations. Finally, attention is focused on the properties of linear filters and modulation functions proposed to simulate seismic excitations as non-stationary random processes. Acceleration spectra obtained by multiplying rms spectra by a constant factor are compared with design spectra suggested by several authors for various soil conditions. In every case, filter properties are given. (Author) [pt
Wang, Kezhi; Wang, Tian; Chen, Yunfei; Alouini, Mohamed-Slim
2014-01-01
The sum of ratios of products of independent α-μ random variables (RVs) is approximated by using the generalized Gamma ratio approximation (GGRA), with the Gamma ratio approximation (GRA) as a special case. The proposed approximation is used to calculate the outage probability of equal gain combining (EGC) or maximum ratio combining (MRC) receivers for wireless multihop relaying or multiple scattering systems in the presence of interference. Numerical results show that the newly derived approximation agrees very well with simulations, while GRA, though simpler in form, performs slightly worse than GGRA when the outage probability is below 0.1.
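The GGRA/GRA constructions themselves are specific to the paper, but the underlying moment-matching idea can be sketched generically. Below (the choice of summands is an illustrative assumption, not the α-μ model of the paper), a Gamma distribution is matched to the first two moments of a sum of positive RVs and the resulting CDF approximation is checked by simulation:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300_000

# A positive RV with no simple closed form: sum of two independent Gammas
# with unequal scales (a stand-in for the sums treated by GGRA/GRA).
s = rng.gamma(2.0, 1.0, n) + rng.gamma(3.0, 0.5, n)

# Moment-matched Gamma approximation: shape k = m^2/v, scale theta = v/m.
m, v = s.mean(), s.var()
k, theta = m * m / v, v / m
g = rng.gamma(k, theta, n)

# Compare empirical CDFs of the true sum and its Gamma approximation.
diffs = [abs(np.mean(s <= t) - np.mean(g <= t)) for t in (2.0, 3.5, 5.0)]
print(diffs)  # all small: the two-moment Gamma fit tracks the CDF closely
```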
International Nuclear Information System (INIS)
Stotland, Alexander; Peer, Tal; Cohen, Doron; Budoyo, Rangga; Kottos, Tsampikos
2008-01-01
The calculation of the conductance of disordered rings requires a theory that goes beyond the Kubo-Drude formulation. Assuming 'mesoscopic' circumstances the analysis of the electro-driven transitions shows similarities with a percolation problem in energy space. We argue that the texture and the sparsity of the perturbation matrix dictate the value of the conductance, and study its dependence on the disorder strength, ranging from the ballistic to the Anderson localization regime. An improved sparse random matrix model is introduced to capture the essential ingredients of the problem, and leads to a generalized variable range hopping picture. (fast track communication)
A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.
Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco
2005-02-01
Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n_g ≈ 100-150 genes). In the present investigation, a large random European population sample (average n_g ≈ 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005), much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.
r2VIM: A new variable selection method for random forests in genome-wide association studies.
Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E
2016-01-01
Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
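The r2VIM selection rule itself is simple to illustrate without a full random-forest pipeline. The sketch below (the importance scores are synthetic and the relative-importance factor of 2 is an illustrative assumption) applies the rule "relative importance above a factor in all runs" to simulated importance scores from several hypothetical RF runs:

```python
import numpy as np

rng = np.random.default_rng(4)
n_runs, n_snps = 5, 1000

# Synthetic importance scores from 5 hypothetical RF runs: noise around zero
# for most SNPs, with SNPs 0-2 carrying a genuine signal in every run.
imp = rng.normal(0.0, 0.01, (n_runs, n_snps))
imp[:, :3] += 0.15

# r2VIM-style rule: scale each run by the magnitude of its minimal (most
# negative) importance, then keep SNPs whose relative importance exceeds
# a chosen factor in *all* runs.
rel = imp / np.abs(imp.min(axis=1, keepdims=True))
selected = np.where((rel > 2.0).all(axis=0))[0]
print(selected)  # only the three signal SNPs survive
```

Requiring a large relative importance in every run is what controls false positives: a noise SNP would have to exceed the threshold in all runs simultaneously.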
Random variables in forest policy: A systematic sensitivity analysis using CGE models
International Nuclear Information System (INIS)
Alavalapati, J.R.R.
1999-01-01
Computable general equilibrium (CGE) models are extensively used to simulate economic impacts of forest policies. Parameter values used in these models often play a central role in their outcome. Since econometric studies and best guesses are the main sources of these parameters, some randomness exists about the 'true' values of these parameters. Failure to incorporate this randomness into these models may limit the degree of confidence in the validity of the results. In this study, we conduct a systematic sensitivity analysis (SSA) to assess the economic impacts of: 1) a 1% increase in tax on Canadian lumber and wood products exports to the United States (US), and 2) a 1% decrease in technical change in the lumber and wood products and pulp and paper sectors of the US and Canada. We achieve this task by using an aggregated version of the global trade model developed by Hertel (1997) and the automated SSA procedure developed by Arndt and Pearson (1996). The estimated means and standard deviations suggest that certain impacts are more likely than others. For example, an increase in export tax is likely to cause a decrease in Canadian income, while an increase in US income is unlikely. On the other hand, a decrease in US welfare is likely, while an increase in Canadian welfare is unlikely, in response to an increase in tax. It is likely that income and welfare both fall in Canada and the US in response to a decrease in the technical change in lumber and wood products and pulp and paper sectors 21 refs, 1 fig, 5 tabs
International Nuclear Information System (INIS)
Bunzl, K.
2002-01-01
In the field, the distribution coefficient, K_d, for the sorption of a radionuclide by the soil cannot be expected to be constant. Even in a well-defined soil horizon, K_d will vary stochastically in the horizontal as well as the vertical direction around a mean value. While the horizontal random variability of K_d produces a pronounced tailing effect in the concentration depth profile of a fallout radionuclide, much less is known about the corresponding effect of the vertical random variability. To analyze this effect theoretically, the classical convection-dispersion model in combination with the random-walk particle method was applied. The concentration depth profile of a radionuclide was calculated one year after deposition assuming constant values of the pore water velocity and the diffusion/dispersion coefficient, with either a constant distribution coefficient (K_d = 100 cm³·g⁻¹) or a vertically variable K_d following a log-normal distribution with a geometric mean of 100 cm³·g⁻¹ and a coefficient of variation of CV = 0.53. The results show that these two concentration depth profiles are only slightly different: the location of the peak is shifted somewhat upwards, and the dispersion of the concentration depth profile is slightly larger. A substantial tailing effect of the concentration depth profile is not perceivable. Especially with respect to the location of the peak, a very good approximation of the concentration depth profile is obtained if the arithmetic mean of the K_d values (K_d = 113 cm³·g⁻¹) and a slightly increased dispersion coefficient are used in the analytical solution of the classical convection-dispersion equation with constant K_d. The evaluation of the observed concentration depth profile with the analytical solution of the classical convection-dispersion equation with constant parameters will, within the usual experimental limits, hardly reveal the presence of a log-normal random distribution of K_d in the vertical direction in
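The simulation approach described, a random-walk particle method in which each particle carries its own lognormally distributed K_d, can be sketched as follows (pore-water velocity, dispersion coefficient, bulk density, and water content are illustrative assumptions; the soil surface boundary is ignored):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000                       # particles
v, D = 10.0, 5.0                 # pore-water velocity [cm/yr], dispersion [cm^2/yr]
rho_b, theta = 1.3, 0.3          # bulk density [g/cm^3], volumetric water content

# Per-particle lognormal K_d: geometric mean 100 cm^3/g, CV = 0.53.
sigma = np.sqrt(np.log(1.0 + 0.53 ** 2))
kd = rng.lognormal(np.log(100.0), sigma, n)
R = 1.0 + rho_b * kd / theta     # retardation factor, fixed per particle

# One year of transport: retarded drift plus a random-walk step per particle.
dt, steps = 1.0 / 365.0, 365
z = np.zeros(n)
for _ in range(steps):
    z += (v / R) * dt + np.sqrt(2.0 * D / R * dt) * rng.standard_normal(n)

# Mean depth matches the drift v*E[1/R]; the spread of 1/R adds extra dispersion.
print(z.mean(), z.std())
```

With such strong sorption the profile moves only fractions of a centimetre per year, which is consistent with the abstract's finding that the randomized-K_d profile differs only slightly from the constant-K_d one.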
Park, Hame; Lueckmann, Jan-Matthis; von Kriegstein, Katharina; Bitzer, Sebastian; Kiebel, Stefan J.
2016-01-01
Decisions in everyday life are prone to error. Standard models typically assume that errors during perceptual decisions are due to noise. However, it is unclear how noise in the sensory input affects the decision. Here we show that there are experimental tasks for which one can analyse the exact spatio-temporal details of a dynamic sensory noise and better understand variability in human perceptual decisions. Using a new experimental visual tracking task and a novel Bayesian decision making model, we found that the spatio-temporal noise fluctuations in the input of single trials explain a significant part of the observed responses. Our results show that modelling the precise internal representations of human participants helps predict when perceptual decisions go wrong. Furthermore, by modelling precisely the stimuli at the single-trial level, we were able to identify the underlying mechanism of perceptual decision making in more detail than standard models. PMID:26752272
Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen
2018-01-01
This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy (F = 9.55, p = .003) and perceived significantly fewer cons of exercise (F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.
DEFF Research Database (Denmark)
Representing Development presents the different social representations that have formed the idea of development in Western thinking over the past three centuries. Offering an acute perspective on the current state of developmental science and providing constructive insights into future pathways, ...
Bryan, Stephanie; Pinto Zipp, Genevieve; Parasher, Raju
2012-01-01
Physical inactivity is a serious issue for the American public. Because of conditions that result from inactivity, individuals incur close to $1 trillion USD in health-care costs, and approximately 250 000 premature deaths occur per year. Researchers have linked engaging in yoga to improved overall fitness, including improved muscular strength, muscular endurance, flexibility, and balance. Researchers have not yet investigated the impact of yoga on exercise adherence. The research team assessed the effects of 10 weeks of yoga classes held twice a week on exercise adherence in previously sedentary adults. The research team designed a randomized controlled pilot trial. The team collected data from the intervention (yoga) and control groups at baseline, midpoint, and posttest (posttest 1) and also collected data pertaining to exercise adherence for the yoga group at 5 weeks posttest (posttest 2). The pilot took place in a yoga studio in central New Jersey in the United States. The pretesting occurred at the yoga studio for all participants. Midpoint testing and posttesting occurred at the studio for the yoga group and by mail for the control group. Participants were 27 adults (mean age 51 y) who had been physically inactive for a period of at least 6 months prior to the study. Interventions: The intervention group (yoga group) received hour-long hatha yoga classes that met twice a week for 10 weeks. The control group did not participate in classes during the research study; however, they were offered complimentary post research classes. Outcome Measures: The study's primary outcome measure was exercise adherence as measured by the 7-day Physical Activity Recall. The secondary measures included (1) exercise self-efficacy as measured by the Multidimensional Self-Efficacy for Exercise Scale, (2) general well-being as measured by the General Well-Being Schedule, (3) exercise-group cohesion as measured by the Group Environment Questionnaire (GEQ), (4) acute feeling response
Nam, Sungsik
2010-11-01
Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs). With the proposed approach, we can systematically derive the joint statistics of any partial sums of ordered statistics, in terms of the moment generating function (MGF) and the probability density function (PDF). Our MGF-based approach applies not only when all the K ordered RVs are involved but also when only the Ks (Ks < K) best RVs are considered. In addition, we present closed-form expressions for the exponential RV special case. These results apply to the performance analysis of various wireless communication systems over fading channels. © 2006 IEEE.
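One quantity covered by such frameworks is easy to check by simulation: the expected sum of the Ks largest of K i.i.d. Exp(1) variables, for which exponential order statistics give a harmonic-number closed form. The sketch below (K = 5, Ks = 2 are illustrative choices) compares the two:

```python
import numpy as np

rng = np.random.default_rng(6)
K, Ks, trials = 5, 2, 200_000

# Sum of the Ks largest of K i.i.d. Exp(1) variables, per trial.
x = rng.exponential(1.0, (trials, K))
top = np.sort(x, axis=1)[:, -Ks:].sum(axis=1)

# Closed form from exponential order statistics: E[X_(k)] = H_K - H_(K-k),
# with H_m the m-th harmonic number; sum this over the Ks largest ranks.
H = lambda m: sum(1.0 / j for j in range(1, m + 1))
expected = sum(H(K) - H(K - k) for k in range(K - Ks + 1, K + 1))

print(top.mean(), expected)  # Monte Carlo estimate vs closed form
```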
Precise lim sup behavior of probabilities of large deviations for sums of i.i.d. random variables
Directory of Open Access Journals (Sweden)
Andrew Rosalsky
2004-12-01
Full Text Available Let {X, X_n; n ≥ 1} be a sequence of real-valued i.i.d. random variables and let S_n = ∑_{i=1}^{n} X_i, n ≥ 1. In this paper, we study the probabilities of large deviations of the form P(S_n > t n^{1/p}), P(S_n < -t n^{1/p}), and P(|S_n| > t n^{1/p}), where t > 0 and 0 < p < 2. If lim_{x→∞} x^{1/p}/ϕ(x) = 1, then for every t > 0, lim sup_{n→∞} P(|S_n| > t n^{1/p})/(n ϕ(n)) = t^{-p} α.
Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan
2016-07-01
We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA. PMID:27192062
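The transformation bias discussed above can be reproduced in a few lines. The sketch below (true proportion, intracluster correlation, and cluster size are illustrative assumptions) simulates a beta-binomial proportion estimate and measures the bias of its arcsine-square-root transform:

```python
import numpy as np

rng = np.random.default_rng(7)
p, rho, n, sims = 0.2, 0.3, 20, 100_000

# Beta-binomial clusters: intracluster correlation rho inflates Var(p_hat).
a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
counts = rng.binomial(n, rng.beta(a, b, sims))
p_hat = counts / n

# Bias of the arcsine transform: E[arcsin(sqrt(p_hat))] - arcsin(sqrt(p)).
bias = np.arcsin(np.sqrt(p_hat)).mean() - np.arcsin(np.sqrt(p))
print(bias)  # clearly nonzero under overdispersion
```

For p < 0.5 the transform is concave over most of the range, so Jensen's inequality makes the bias negative, and its magnitude grows with the overdispersion, in line with the abstract's finding that the bias is linear in ρ for small ρ.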
Santo, H.; Taylor, P. H.; Gibson, R.
2016-09-01
Long-term estimation of extreme wave height remains a key challenge because of the short duration of available wave data, and also because of the possible impact of climate variability on ocean waves. Here, we analyse storm-based statistics to obtain estimates of extreme wave height at locations in the northeast Atlantic and North Sea using the NORA10 wave hindcast (1958-2011), and use a 5 year sliding window to examine temporal variability. The decadal variability is correlated to the North Atlantic oscillation and other atmospheric modes, using a six-term predictor model incorporating the climate indices and their Hilbert transforms. This allows reconstruction of the historic extreme climate back to 1661, using a combination of known and proxy climate indices. Significant decadal variability primarily driven by the North Atlantic oscillation is observed, and this should be considered for the long-term survivability of offshore structures and marine renewable energy devices. The analysis on wave climate reconstruction reveals that the variation of the mean, 99th percentile and extreme wave climates over decadal time scales for locations close to the dominant storm tracks in the open North Atlantic are comparable, whereas the wave climates for the rest of the locations including the North Sea are rather different.
Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis
2017-06-23
Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13 260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40; P < 0.001). In the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
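The fixed-effects idea used above, removing time-invariant confounding by demeaning within person, can be illustrated on a toy panel. In the sketch below all numbers are illustrative assumptions, with the true effect set to 1.28 only to echo the estimate reported in the abstract:

```python
import numpy as np

rng = np.random.default_rng(8)
n_people, n_waves = 2000, 13

# A time-invariant person-level trait confounds exposure and outcome.
trait = rng.normal(0.0, 1.0, (n_people, 1))
x = 0.8 * trait + rng.normal(0.0, 1.0, (n_people, n_waves))            # job quality
y = 1.28 * x + 2.0 * trait + rng.normal(0.0, 1.0, (n_people, n_waves)) # mental health

# Pooled OLS slope is biased upward by the unmeasured trait...
beta_pooled = np.cov(x.ravel(), y.ravel())[0, 1] / np.var(x.ravel())

# ...but demeaning within person (the fixed-effects transform) removes
# everything time-invariant, recovering the true effect.
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
beta_fe = (xd * yd).sum() / (xd ** 2).sum()
print(beta_pooled, beta_fe)
```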
Directory of Open Access Journals (Sweden)
Röhl Johannes
2011-08-01
Full Text Available Abstract Dispositions and tendencies feature significantly in the biomedical domain and therefore in representations of knowledge of that domain. They are not only important for specific applications like an infectious disease ontology, but also as part of a general strategy for modelling knowledge about molecular interactions. But the task of representing dispositions in some formal ontological systems is fraught with several problems, which are partly due to the fact that Description Logics can only deal well with binary relations. The paper will discuss some of the results of the philosophical debate about dispositions, in order to see whether the formal relations needed to represent dispositions can be broken down to binary relations. Finally, we will discuss problems arising from the possibility of the absence of realizations, of multi-track or multi-trigger dispositions and offer suggestions on how to deal with them.
Directory of Open Access Journals (Sweden)
Luca Poncellini
2010-06-01
Full Text Available The analysis of natural phenomena applied to architectural planning and design is facing the most fascinating and elusive of the four dimensions through which man attempts to define life within the universe: time. We all know what time is, said St. Augustine, but nobody knows how to describe it. Within architectural projects and representations, time rarely appears in explicit form. This paper presents the results of a research project conducted by students of NABA and of the Polytechnic of Milan with the purpose of representing time as a key element within architectural projects. Students investigated new approaches and methodologies to represent time using the two-dimensional support of a sheet of paper.
2014-01-01
Background: Cluster randomized trials (CRTs) present unique ethical challenges. In the absence of a uniform standard for their ethical design and conduct, problems such as variability in procedures and requirements by different research ethics committees will persist. We aimed to assess the need for ethics guidelines for CRTs among research ethics chairs internationally, investigate variability in procedures for research ethics review of CRTs within and among countries, and elicit research ethics chairs’ perspectives on specific ethical issues in CRTs, including the identification of research subjects. The proper identification of research subjects is a necessary requirement in the research ethics review process, to help ensure, on the one hand, that subjects are protected from harm and exploitation, and on the other, that reviews of CRTs are completed efficiently. Methods: A web-based survey with closed- and open-ended questions was administered to research ethics chairs in Canada, the United States, and the United Kingdom. The survey presented three scenarios of CRTs involving cluster-level, professional-level, and individual-level interventions. For each scenario, a series of questions was posed with respect to the type of review required (full, expedited, or no review) and the identification of research subjects at cluster and individual levels. Results: A total of 189 (35%) of 542 chairs responded. Overall, 144 (84%, 95% CI 79 to 90%) agreed or strongly agreed that there is a need for ethics guidelines for CRTs and 158 (92%, 95% CI 88 to 96%) agreed or strongly agreed that research ethics committees could be better informed about distinct ethical issues surrounding CRTs. There was considerable variability among research ethics chairs with respect to the type of review required, as well as the identification of research subjects. The cluster-cluster and professional-cluster scenarios produced the most disagreement. Conclusions: Research ethics committees
Nguyen, Hung T.; Kreinovich, Vladik
2014-01-01
To help computers make better decisions, it is desirable to describe all our knowledge in computer-understandable terms. This is easy for knowledge described in terms on numerical values: we simply store the corresponding numbers in the computer. This is also easy for knowledge about precise (well-defined) properties which are either true or false for each object: we simply store the corresponding “true” and “false” values in the computer. The challenge is how to store information about imprecise properties. In this paper, we overview different ways to fully store the expert information about imprecise properties. We show that in the simplest case, when the only source of imprecision is disagreement between different experts, a natural way to store all the expert information is to use random sets; we also show how fuzzy sets naturally appear in such random-set representation. We then show how the random-set representation can be extended to the general (“fuzzy”) case when, in addition to disagreements, experts are also unsure whether some objects satisfy certain properties or not. PMID:25386045
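The random-set representation described above can be sketched in a few lines: each expert contributes a crisp set, and the induced fuzzy membership of a value is the fraction of expert sets that contain it. The expert intervals below are illustrative assumptions:

```python
# Each hypothetical expert gives a crisp interval for "warm" temperature (°C).
expert_sets = [(18, 27), (20, 30), (19, 26), (21, 28), (17, 29)]

def membership(x, sets=expert_sets):
    """Fuzzy membership induced by the random set of expert opinions:
    the fraction of expert sets that contain x."""
    return sum(lo <= x <= hi for lo, hi in sets) / len(sets)

for t in (16, 19, 23, 28, 31):
    print(t, membership(t))
```

Values covered by every expert set get membership 1, values covered by none get 0, and disagreement among experts produces the graded values in between, which is exactly how a fuzzy set emerges from the random-set view.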
Wang, Kezhi; Wang, Tian; Chen, Yunfei; Alouini, Mohamed-Slim
2015-06-01
Exact results for the probability density function (PDF) and cumulative distribution function (CDF) of the sum of ratios of products (SRP) and the sum of products (SP) of independent α-μ random variables (RVs) are derived. They are in the form of 1-D integrals based on existing works on the products and ratios of α-μ RVs. In the derivation, generalized Gamma (GG) ratio approximation (GGRA) is proposed to approximate the SRP. Gamma ratio approximation (GRA) is proposed to approximate the SRP and the ratio of sums of products (RSP). GG approximation (GGA) and Gamma approximation (GA) are used to approximate the SP. The proposed results for the SRP can be used to calculate the outage probability (OP) for wireless multihop relaying systems or multiple scattering channels with interference. The proposed results for the SP can be used to calculate the OP for these systems without interference. In addition, the proposed approximate result for the RSP can be used to calculate the OP of the signal-to-interference ratio (SIR) in a multiple scattering system with interference. © 1967-2012 IEEE.
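The Gamma approximation (GA) to the sum of products can be sanity-checked by Monte Carlo. The sketch below samples α-μ envelopes via the standard representation R = r̂·(G/μ)^(1/α) with G ~ Gamma(μ, 1) (an assumption of this sketch, as are the fading parameters), builds an SP of two products, and compares its empirical CDF with a moment-matched Gamma CDF:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def alpha_mu_sample(alpha, mu, r_hat, size, rng):
    """Draw α-μ fading envelopes: R = r_hat * (G/μ)^(1/α), G ~ Gamma(μ, 1)."""
    g = rng.gamma(shape=mu, scale=1.0, size=size)
    return r_hat * (g / mu) ** (1.0 / alpha)

# Sum of products (SP): two terms, each a product of two α-μ RVs.
n = 200_000
prod1 = alpha_mu_sample(2.0, 1.5, 1.0, n, rng) * alpha_mu_sample(2.0, 2.0, 1.0, n, rng)
prod2 = alpha_mu_sample(3.0, 1.0, 1.0, n, rng) * alpha_mu_sample(2.5, 2.0, 1.0, n, rng)
sp = prod1 + prod2

# Moment-matched Gamma approximation (GA): shape k = m^2/v, scale = v/m.
m, v = sp.mean(), sp.var()
k, theta = m * m / v, v / m

# Outage-style tail probability P(SP < t), Monte Carlo vs. approximation.
t = 1.0
mc_cdf = (sp < t).mean()
ga_cdf = stats.gamma.cdf(t, a=k, scale=theta)
print(f"MC: {mc_cdf:.4f}  GA: {ga_cdf:.4f}")
```

The paper's exact 1-D integral forms would replace the Monte Carlo reference here; the moment-matched Gamma is only the crude end of the approximation hierarchy it studies.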
Directory of Open Access Journals (Sweden)
Ahmet Kuzu
2014-01-01
Full Text Available This paper proposes two novel master-slave configurations that provide improvements in both control and communication aspects of teleoperation systems to achieve an overall improved performance in position control. The proposed novel master-slave configurations integrate modular control and communication approaches, consisting of a delay regulator to address problems related to variable network delay common to such systems, and a model tracking control that runs on the slave side for the compensation of uncertainties and model mismatch on the slave side. One of the configurations uses a sliding mode observer and the other one uses a modified Smith predictor scheme on the master side to ensure position transparency between the master and slave, while reference tracking of the slave is ensured by a proportional-differentiator type controller in both configurations. Experiments conducted for the networked position control of a single-link arm under system uncertainties and randomly varying network delays demonstrate significant performance improvements with both configurations over the past literature.
Staley, James R.
2017-01-01
ABSTRACT Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
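The stratification idea behind the localized average causal effect (LACE) can be illustrated with a minimal simulation. Everything below — the data-generating model, the residual-based "IV-free" exposure, and the per-stratum Wald ratio — is an assumption-laden sketch, not the authors' fractional polynomial or piecewise linear machinery:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Simulated data: genetic instrument Z, unobserved confounder U, exposure X,
# and an outcome whose dependence on X is quadratic (i.e., nonlinear).
z = rng.binomial(2, 0.3, n).astype(float)   # genotype coded 0/1/2
u = rng.normal(size=n)
x = 0.5 * z + u + rng.normal(size=n)
y = 0.2 * x**2 + u + rng.normal(size=n)

# "IV-free" exposure: residual of X after removing the instrument's part,
# so that stratifying on it does not induce collider bias through Z.
beta_zx = np.cov(x, z)[0, 1] / np.var(z)
x0 = x - beta_zx * z

# Stratify on x0 and compute a Wald ratio (a LACE estimate) per stratum.
quantiles = np.quantile(x0, [0.0, 0.25, 0.5, 0.75, 1.0])
lace = []
for lo, hi in zip(quantiles[:-1], quantiles[1:]):
    idx = (x0 >= lo) & (x0 <= hi)
    num = np.cov(y[idx], z[idx])[0, 1]
    den = np.cov(x[idx], z[idx])[0, 1]
    lace.append(num / den)
print(lace)  # estimates increase across strata for a convex relationship
```

Meta-regressing these stratum-level estimates on the stratum exposure means is the step where the fractional polynomial or piecewise linear model of the paper would enter.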
Directory of Open Access Journals (Sweden)
Sornkitja Boonprong
2018-05-01
Full Text Available Burnt forest recovery is normally monitored with a time-series analysis of satellite data because of its proficiency for large observation areas. Traditional methods, such as linear correlation plotting, have been proven to be effective, as forest recovery naturally increases with time. However, these methods are complicated and time consuming when increasing the number of observed parameters. In this work, we present a random forest variable importance (RF-VIMP) scheme called multilevel RF-VIMP to compare and assess the relationship between 36 spectral indices (parameters) of burnt boreal forest recovery in the Great Xing’an Mountain, China. Six Landsat images were acquired in the same month 0, 1, 4, 14, 16, and 20 years after a fire, and 39,380 fixed-location samples were then extracted to calculate the effectiveness of the 36 parameters. Consequently, the proposed method was applied to find correlations between the forest recovery indices. The experiment showed that the proposed method is suitable for explaining the efficacy of those spectral indices in terms of discrimination and trend analysis, and for showing the satellite data and forest succession dynamics when applied in a time series. The results suggest that the tasseled cap transformation wetness, brightness, and the shortwave infrared bands (both 1 and 2) perform better than other indices for both classification and monitoring.
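Random forest variable importance of the kind used above is readily available in standard libraries. The sketch below uses scikit-learn's impurity-based importances on synthetic stand-ins for spectral indices (three informative features, two pure-noise features — all names and coefficients are assumptions, not the study's data):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-in for spectral indices: the real study used 36 indices
# from Landsat scenes; here three informative and two noise features.
X = rng.normal(size=(n, 5))
recovery = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X, recovery)

# Impurity-based variable importance (VIMP); permutation importance is a
# common alternative that is less biased toward high-cardinality features.
names = ["idx0", "idx1", "idx2", "noise3", "noise4"]
for name, imp in zip(names, rf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

The multilevel aspect of the paper's RF-VIMP scheme (comparing importances across the post-fire time levels) would repeat this ranking per acquisition year.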
Variable screening and ranking using sampling-based sensitivity measures
International Nuclear Information System (INIS)
Wu, Y-T.; Mohanty, Sitakanta
2006-01-01
This paper presents a methodology for screening out insignificant random variables and ranking significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from the test-of-hypothesis, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant random variables.
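The screening idea — compute a sensitivity measure from random samples and classify variables against a test-of-hypothesis acceptance limit — can be sketched with a deliberately crude mean-response measure (split each input at its median and test for a difference in mean response). This simplification and the model below are assumptions of the sketch, not the paper's actual measures:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5000

# Random inputs: x0 (strong linear), x1 (weaker, nonlinear), x2 (insignificant).
x = rng.normal(size=(n, 3))
y = 3.0 * x[:, 0] + np.sin(x[:, 1]) + rng.normal(scale=0.5, size=n)

# Crude sampling-based mean-response measure: split each input's samples at
# its median and test whether the mean response differs, with an acceptance
# limit (test-of-hypothesis) at alpha = 0.01.
alpha = 0.01
pvals = []
for i in range(3):
    hi = y[x[:, i] > np.median(x[:, i])]
    lo = y[x[:, i] <= np.median(x[:, i])]
    _, p = stats.ttest_ind(hi, lo, equal_var=False)
    pvals.append(p)
    print(f"x{i}: p = {p:.3g} -> {'significant' if p < alpha else 'screened out'}")
```

Note the sample count is fixed regardless of how many inputs are screened, which is the property the abstract highlights for large models.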
Directory of Open Access Journals (Sweden)
Qinghui Du
2014-01-01
Full Text Available We consider semi-implicit Euler methods for a stochastic age-dependent capital system with variable delays and random jump magnitudes, and investigate the convergence of the numerical approximation. It is proved that the numerical approximate solutions converge to the analytical solutions in the mean-square sense under the given conditions.
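A semi-implicit Euler step treats the drift implicitly and the noise and jump terms explicitly. The scalar toy equation below, with one constant delay and Poisson-driven jumps, is a stand-in for the age-dependent capital system; all coefficients, the delay, and the pre-history are assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy delay SDE with jumps:
#   dX(t) = a X(t) dt + b X(t - tau) dW(t) + c X(t-) dN(t)
a, b, c = -1.0, 0.3, 0.1
tau, T, h = 0.5, 5.0, 0.01
lam = 2.0                                 # intensity of the Poisson process N
steps, k = int(T / h), int(tau / h)

x = np.empty(steps + 1)
x[0] = 1.0
hist = lambda n: 1.0 if n < 0 else x[n]   # constant pre-history on [-tau, 0]

for n in range(steps):
    dW = rng.normal(scale=np.sqrt(h))
    dN = rng.poisson(lam * h)
    # Semi-implicit step: drift implicit, noise and jump explicit. Solving
    # x[n+1] = x[n] + a*x[n+1]*h + b*hist(n-k)*dW + c*x[n]*dN  for x[n+1]:
    x[n + 1] = (x[n] + b * hist(n - k) * dW + c * x[n] * dN) / (1.0 - a * h)

print(f"X(T) = {x[-1]:.4f}")
```

The implicit drift is what buys the improved stability that motivates semi-implicit schemes; mean-square convergence as h → 0 is what the paper proves for the full system.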
Lin, Shu-Ling; Huang, Ching-Ya; Shiu, Shau-Ping; Yeh, Shu-Hui
2015-08-01
Mental health professionals experiencing work-related stress may experience burn out, leading to a negative impact on their organization and patients. The aim of this study was to examine the effects of yoga classes on work-related stress, stress adaptation, and autonomic nerve activity among mental health professionals. A randomized controlled trial was used, which compared the outcomes between the experimental (e.g., yoga program) and the control groups (e.g., no yoga exercise) for 12 weeks. Work-related stress and stress adaptation were assessed before and after the program. Heart rate variability (HRV) was measured at baseline, midpoint through the weekly yoga classes (6 weeks), and postintervention (after 12 weeks of yoga classes). The results showed that the mental health professionals in the yoga group experienced a significant reduction in work-related stress (t = -6.225, p control group revealed no significant changes. Comparing the mean differences in pre- and posttest scores between yoga and control groups, we found the yoga group significantly decreased work-related stress (t = -3.216, p = .002), but there was no significant change in stress adaptation (p = .084). While controlling for the pretest scores of work-related stress, participants in yoga, but not the control group, revealed a significant increase in autonomic nerve activity at midpoint (6 weeks) test (t = -2.799, p = .007), and at posttest (12 weeks; t = -2.099, p = .040). Because mental health professionals experienced a reduction in work-related stress and an increase in autonomic nerve activity in a weekly yoga program for 12 weeks, clinicians, administrators, and educators should offer yoga classes as a strategy to help health professionals reduce their work-related stress and balance autonomic nerve activities. © 2015 The Authors. Worldviews on Evidence-Based Nursing published by Wiley Periodicals, Inc. on behalf of Society for Worldviews on Evidence-Based Nursing.
Schein, Aso; Correa, Aps; Casali, Karina Rabello; Schaan, Beatriz D
2016-01-20
Physical exercise reduces glucose levels and glucose variability in patients with type 2 diabetes. Acute inspiratory muscle exercise has been shown to reduce these parameters in a small group of patients with type 2 diabetes, but these results have yet to be confirmed in a well-designed study. The aim of this study is to investigate the effect of acute inspiratory muscle exercise on glucose levels, glucose variability, and cardiovascular autonomic function in patients with type 2 diabetes. This study will use a randomized clinical trial crossover design. A total of 14 subjects will be recruited and randomly allocated to two groups to perform acute inspiratory muscle loading at 2% of maximal inspiratory pressure (PImax, placebo load) or 60% of PImax (experimental load). Inspiratory muscle training could be a novel exercise modality to be used to decrease glucose levels and glucose variability. ClinicalTrials.gov NCT02292810.
Spieth, Peter M.; Güldner, Andreas; Uhlig, Christopher; Bluth, Thomas; Kiss, Thomas; Schultz, Marcus J.; Pelosi, Paolo; Koch, Thea; Gama de Abreu, Marcelo
2014-01-01
General anesthesia usually requires mechanical ventilation, which is traditionally accomplished with constant tidal volumes in volume- or pressure-controlled modes. Experimental studies suggest that the use of variable tidal volumes (variable ventilation) recruits lung tissue, improves pulmonary
Allen, Alexander R; Gullixson, Leah R; Wolhart, Sarah C; Kost, Susan L; Schroeder, Darrell R; Eisenach, John H
2014-02-01
Dietary sodium influences intermediate physiological traits in healthy adults independent of changes in blood pressure. The purpose of this study was to test the hypothesis that dietary sodium affects cardiac autonomic modulation during mental stress. In a prospective, randomized cross-over design separated by 1 month between diets, 70 normotensive healthy young adults (F/M: 44/26, aged 18-38 years) consumed a 5-day low (10 mmol/day), normal (150 mmol), and high (400 mmol) sodium diet followed by heart rate variability (HRV) recordings at rest and during 5-min computerized mental arithmetic. Women were studied in the low hormone phase of the menstrual cycle following each diet. Diet did not affect resting blood pressure, but heart rate (HR) (mean ± SE) was 66 ± 1, 64 ± 1, and 63 ± 1 bpm in low, normal, and high sodium conditions, respectively (analysis of variance P = 0.02). For HRV, there was a main effect of sodium on resting SD of normalized RR intervals (SDNN), square root of the mean squared difference of successive normalized RR intervals (RMSSD), high frequency, low-frequency normalized units (LFnu), and high-frequency normalized units (HFnu) (P sodium was most marked and consistent with sympathetic activation and reduced vagal activity, with increased LFnu and decreased SDNN, RMSSD, and HFnu compared to both normal and high sodium conditions (P ≤0.05 for all). Dietary sodium-by-mental stress interactions were significant for mean NN, RMSSD, high-frequency power, LFnu, and low frequency/high frequency ratio (P sodium restriction evoked an increase in resting sympathetic activity and reduced vagal activity to the extent that mental stress caused modest additional disruptions in autonomic balance. Conversely, normal and high sodium evoked a reduction in resting sympathetic activity and incremental increase in resting vagal activity, which were disrupted to a greater extent during mental stress compared to low sodium. We conclude that autonomic control of
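The time-domain HRV measures named above (SDNN and RMSSD, plus the related standard measure pNN50) can be computed from an RR-interval series in a few lines. The RR values below are illustrative, not study data:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Standard time-domain HRV measures from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    sdnn = rr.std(ddof=1)                     # SD of all NN intervals
    rmssd = np.sqrt(np.mean(diff ** 2))       # root mean square of successive diffs
    pnn50 = np.mean(np.abs(diff) > 50) * 100  # % of successive diffs > 50 ms
    return sdnn, rmssd, pnn50

# Illustrative RR series (ms); real recordings come from ECG beat detection.
rr = [812, 790, 860, 835, 790, 905, 870, 810, 845, 890]
sdnn, rmssd, pnn50 = hrv_time_domain(rr)
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms, pNN50 = {pnn50:.0f}%")
```

The frequency-domain measures (LFnu, HFnu) additionally require interpolating the RR series to an even sampling grid and estimating a power spectrum, which is omitted here.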
2013-01-01
Background Chronic work-related stress is an independent risk factor for cardiometabolic diseases and associated mortality, particularly when compounded by a sedentary work environment. The purpose of this study was to determine if an office worksite-based hatha yoga program could improve physiological stress, evaluated via heart rate variability (HRV), and associated health-related outcomes in a cohort of office workers. Methods Thirty-seven adults employed in university-based office positions were randomized upon the completion of baseline testing to an experimental or control group. The experimental group completed a 10-week yoga program prescribed three sessions per week during lunch hour (50 min per session). An experienced instructor led the sessions, which emphasized asanas (postures) and vinyasa (exercises). The primary outcome was the high frequency (HF) power component of HRV. Secondary outcomes included additional HRV parameters, musculoskeletal fitness (i.e. push-up, side-bridge, and sit & reach tests) and psychological indices (i.e. state and trait anxiety, quality of life and job satisfaction). Results All measures of HRV failed to change in the experimental group versus the control group, except that the experimental group significantly increased LF:HF (p = 0.04) and reduced pNN50 (p = 0.04) versus control, contrary to our hypotheses. Flexibility, evaluated via sit & reach test increased in the experimental group versus the control group (p yoga sessions (n = 11) to control (n = 19) yielded the same findings, except that the high adherers also reduced state anxiety (p = 0.02) and RMSSD (p = 0.05), and tended to improve the push-up test (p = 0.07) versus control. Conclusions A 10-week hatha yoga intervention delivered at the office worksite during lunch hour did not improve HF power or other HRV parameters. However, improvements in flexibility, state anxiety and musculoskeletal fitness were noted with high adherence
Directory of Open Access Journals (Sweden)
Chang Dennis
2011-07-01
Full Text Available Abstract Background Chronic work-related stress is a significant and independent risk factor for cardiovascular and metabolic diseases and associated mortality, particularly when compounded by a sedentary work environment. Heart rate variability (HRV) provides an estimate of parasympathetic and sympathetic autonomic control, and can serve as a marker of physiological stress. Hatha yoga is a physically demanding practice that can help to reduce stress; however, time constraints incurred by work and family life may limit participation. The purpose of the present study is to determine if a 10-week, worksite-based yoga program delivered during lunch hour can improve resting HRV and related physical and psychological parameters in sedentary office workers. Methods and design This is a parallel-arm RCT that will compare the outcomes of participants assigned to the experimental treatment group (yoga) to those assigned to a no-treatment control group. Participants randomized to the experimental condition will engage in a 10-week yoga program delivered at their place of work. The yoga sessions will be group-based, prescribed three times per week during lunch hour, and will be led by an experienced yoga instructor. The program will involve teaching beginner students safely and progressively over 10 weeks a yoga sequence that incorporates asanas (poses and postures), vinyasa (exercises), pranayama (breathing control), and meditation. The primary outcome of this study is the high frequency (HF) spectral power component of HRV (measured in absolute units, i.e. ms²), a measure of parasympathetic autonomic control. Secondary outcomes include additional frequency and time domains of HRV, and measures of physical functioning and psychological health status. Measures will be collected prior to and following the intervention period, and at 6 months follow-up to determine the effect of intervention withdrawal. Discussion This study will determine the effect of worksite
Pfeiffer, Christine M.; Sternberg, Maya R.; Schleicher, Rosemary L.; Rybak, Michael E.
2016-01-01
Biochemical indicators of water-soluble vitamin (WSV) status have been measured in a nationally representative sample of the US population in NHANES 2003–2006. To examine whether demographic differentials in nutritional status were related to and confounded by certain variables, we assessed the association of sociodemographic (age, sex, race-ethnicity, education, income) and lifestyle variables (dietary supplement use, smoking, alcohol consumption, BMI, physical activity) with biomarkers of WSV status in adults (≥20 y): serum and RBC folate, serum pyridoxal-5′-phosphate (PLP), serum 4-pyridoxic acid, serum total cobalamin (B-12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA), and serum ascorbic acid. Age (except for PLP) and smoking (except for MMA) were generally the strongest significant correlates of these biomarkers (|r| ≤0.43) and together with supplement use explained more of the variability as compared to the other covariates in bivariate analysis. In multiple regression models, sociodemographic and lifestyle variables together explained from 7% (B-12) to 29% (tHcy) of the biomarker variability. We observed significant associations for most biomarkers (≥6 out of 8) with age, sex, race-ethnicity, supplement use, smoking, and BMI; and for some biomarkers with PIR (5/8), education (1/8), alcohol consumption (4/8), and physical activity (5/8). We noted large estimated percent changes in biomarker concentrations between race-ethnic groups (from −24% to 20%), between supplement users and nonusers (from −12% to 104%), and between smokers and nonsmokers (from −28% to 8%). In summary, age, sex, and race-ethnic differentials in biomarker concentrations remained significant after adjusting for sociodemographic and lifestyle variables. Supplement use and smoking were important correlates of biomarkers of WSV status. PMID:23576641
Pfeiffer, Christine M; Sternberg, Maya R; Schleicher, Rosemary L; Rybak, Michael E
2013-06-01
Biochemical indicators of water-soluble vitamin (WSV) status were measured in a nationally representative sample of the U.S. population in NHANES 2003-2006. To examine whether demographic differentials in nutritional status were related to and confounded by certain variables, we assessed the association of sociodemographic (age, sex, race-ethnicity, education, income) and lifestyle (dietary supplement use, smoking, alcohol consumption, BMI, physical activity) variables with biomarkers of WSV status in adults (aged ≥ 20 y): serum and RBC folate, serum pyridoxal-5'-phosphate (PLP), serum 4-pyridoxic acid, serum total cobalamin (vitamin B-12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA), and serum ascorbic acid. Age (except for PLP) and smoking (except for MMA) were generally the strongest significant correlates of these biomarkers (|r| ≤ 0.43) and together with supplement use explained more of the variability compared with the other covariates in bivariate analysis. In multiple regression models, sociodemographic and lifestyle variables together explained from 7 (vitamin B-12) to 29% (tHcy) of the biomarker variability. We observed significant associations for most biomarkers (≥ 6 of 8) with age, sex, race-ethnicity, supplement use, smoking, and BMI and for some biomarkers with PIR (5 of 8), education (1 of 8), alcohol consumption (4 of 8), and physical activity (5 of 8). We noted large estimated percentage changes in biomarker concentrations between race-ethnic groups (from -24 to 20%), between supplement users and nonusers (from -12 to 104%), and between smokers and nonsmokers (from -28 to 8%). In summary, age, sex, and race-ethnic differentials in biomarker concentrations remained significant after adjusting for sociodemographic and lifestyle variables. Supplement use and smoking were important correlates of biomarkers of WSV status.
International Nuclear Information System (INIS)
Choi, Seon Soon
2012-01-01
The primary aim of this paper was to evaluate several probabilistic fatigue crack propagation models using the residual of a random variable, and to present the model fit for probabilistic fatigue behavior in Mg-Al-Zn alloys. The proposed probabilistic models are the probabilistic Paris-Erdogan model, probabilistic Walker model, probabilistic Forman model, and probabilistic modified Forman model. These models were prepared by applying a random variable to the empirical fatigue crack propagation models of the same names. The best models for describing fatigue crack propagation behavior in Mg-Al-Zn alloys were generally the probabilistic Paris-Erdogan and probabilistic Walker models. The probabilistic Forman model was a good model only for a specimen with a thickness of 9.45 mm.
DEFF Research Database (Denmark)
Andersen, Andreas; Rieckmann, Andreas
2016-01-01
In this article, we illustrate how to use mi impute chained with intreg to fit an analysis of covariance to censored and nondetectable immunological concentrations measured in a randomized pretest–posttest design.
A Geometrical Framework for Covariance Matrices of Continuous and Categorical Variables
Vernizzi, Graziano; Nakai, Miki
2015-01-01
It is well known that a categorical random variable can be represented geometrically by a simplex. Accordingly, several measures of association between categorical variables have been proposed and discussed in the literature. Moreover, the standard definitions of covariance and correlation coefficient for continuous random variables have been…
Nam, Sungsik
2014-08-01
The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new, more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application to the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile. © 1991-2012 IEEE.
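The quantity at the heart of the GSC application — a partial sum of the largest order statistics of non-identically distributed branch gains — is easy to probe by Monte Carlo. Below, branch SNRs are exponential with different means (a non-uniform power delay profile; the specific means are assumptions), and the combiner output is the sum of the Lc largest of L branches:

```python
import numpy as np

rng = np.random.default_rng(0)

# Partial sums of ordered i.n.d. RVs: exponential branch SNRs with
# *different* means, as in GSC over a non-uniform power delay profile.
L, Lc, trials = 5, 3, 100_000
means = np.array([2.0, 1.5, 1.0, 0.7, 0.5])   # assumed non-uniform profile

samples = rng.exponential(means, size=(trials, L))
ordered = np.sort(samples, axis=1)[:, ::-1]   # descending order statistics
gsc_out = ordered[:, :Lc].sum(axis=1)         # partial sum of the Lc largest

print(f"E[sum of {Lc} largest] = {gsc_out.mean():.3f}")
print(f"E[sum of all {L}]      = {samples.sum(axis=1).mean():.3f}")
```

The paper's contribution is the analytical counterpart of this experiment: closed-form joint statistics of such partial sums without the i.i.d. assumption.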
De Ruiter, Naomi M. P.; Den Hartigh, Ruud J. R.; Cox, Ralf F. A.; Van Geert, Paul L. C.; Kunnen, E. Saskia
2015-01-01
Research regarding the variability of state self-esteem (SSE) commonly focuses on the magnitude of variability. In this article we provide the first empirical test of the temporal structure of SSE as a real-time process during parent-adolescent interactions. We adopt a qualitative phenomenological
Respiratory variability preceding and following sighs: a resetter hypothesis.
Vlemincx, Elke; Van Diest, Ilse; Lehrer, Paul M; Aubert, André E; Van den Bergh, Omer
2010-04-01
Respiratory behavior is characterized by complex variability with structured and random components. Assuming that both a lack of variability and too much randomness represent suboptimal breathing regulation, we hypothesized that sighing acts as a resetter inducing structured variability. Spontaneous breathing was measured in healthy persons (N=42) during a 20-min period of quiet sitting using the LifeShirt(®) System. Four blocks of 10 breaths with a 50% window overlap were determined before and after spontaneous sighs. Total respiratory variability of minute ventilation was measured using the coefficient of variation and structured (correlated) variability was quantified using autocorrelation. Towards a sigh, total variability gradually increased without concomitant changes in correlated variability, suggesting that randomness increased. After a sigh, correlated variability increased. No changes in variability were found in comparable epochs without intermediate sighs. We conclude that a sigh resets structured respiratory variability, enhancing information processing in the respiratory system. Copyright © 2009 Elsevier B.V. All rights reserved.
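The two variability measures used above — coefficient of variation for total variability and autocorrelation for structured (correlated) variability — can be sketched on toy breath series. The data below are synthetic illustrations (a purely random series versus one with slow structured drift), not the study's recordings:

```python
import numpy as np

def total_and_correlated_variability(v, max_lag=5):
    """Coefficient of variation (total variability) and lag-1..max_lag
    autocorrelation (structured, correlated variability) of a breath series."""
    v = np.asarray(v, dtype=float)
    cv = v.std(ddof=1) / v.mean()
    centered = v - v.mean()
    denom = np.sum(centered ** 2)
    acf = [np.sum(centered[:-k] * centered[k:]) / denom for k in range(1, max_lag + 1)]
    return cv, acf

rng = np.random.default_rng(0)
# Two toy minute-ventilation series (L/min): one purely random, one with a
# slow structured oscillation on top of noise (values are illustrative).
random_breaths = rng.normal(8.0, 1.0, 40)
structured = 8.0 + np.sin(np.linspace(0, 4 * np.pi, 40)) + rng.normal(0, 0.3, 40)

cv_r, acf_r = total_and_correlated_variability(random_breaths)
cv_s, acf_s = total_and_correlated_variability(structured)
print(f"random:     CV = {cv_r:.3f}, lag-1 ACF = {acf_r[0]:+.2f}")
print(f"structured: CV = {cv_s:.3f}, lag-1 ACF = {acf_s[0]:+.2f}")
```

High total variability with near-zero autocorrelation corresponds to the "randomness" building up before a sigh; the rise in autocorrelation afterwards is the structured variability the resetter hypothesis predicts.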
Siegelaar, S. E.; Kulik, W.; van Lenthe, H.; Mukherjee, R.; Hoekstra, J. B. L.; DeVries, J. H.
2009-01-01
To assess the effect of three times daily mealtime inhaled insulin therapy compared with once daily basal insulin glargine therapy on 72-h glucose profiles, glucose variability and oxidative stress in type 2 diabetes patients. In an inpatient crossover study, 40 subjects with type 2 diabetes were
International Nuclear Information System (INIS)
Cabout, T.; Buckley, J.; Cagli, C.; Jousseaume, V.; Nodin, J.-F.; Salvo, B. de; Bocquet, M.; Muller, Ch.
2013-01-01
This paper deals with the role of platinum or titanium–titanium nitride electrodes on variability of resistive switching characteristics and electrical performances of HfO2-based memory elements. Capacitor-like Pt/HfO2 (10 nm)/Pt and Ti/HfO2 (10 nm)/TiN structures were fabricated on top of a tungsten pillar bottom electrode and integrated in-between two interconnect metal lines. First, quasi-static measurements were performed to apprehend the role of electrodes on electroforming, set and reset operations and their corresponding switching parameters. Memory elements with Pt as top and bottom electrodes exhibited a non-polar behavior with a sharp decrease of current during the reset operation, while Ti/HfO2/TiN capacitors showed a bipolar switching behavior with a gradual reset. In a second step, statistical distributions of switching parameters (voltage and resistance) were extracted from data obtained on a few hundred capacitors. Even if the resistance in the low resistive state and the reset voltage were found to be comparable for both types of electrodes, the progressive reset operation observed on samples with Ti/TiN electrodes led to a lower variability of resistance in the high resistive state and concomitantly of set voltage. In addition, Ti–TiN electrodes enabled gaining: (i) lower forming and set voltages with significantly narrower capacitor-to-capacitor distributions; (ii) a better data retention capability (10 years at 65 °C instead of 10 years at 50 °C for Pt electrodes); (iii) satisfactory dynamic performances with lower set and reset voltages for ramp speeds ranging from 10⁻² to 10⁷ V/s. The significant improvement of switching behavior with Ti–TiN electrodes is mainly attributed to the formation of a native interface layer between the HfO2 oxide and the Ti top electrode. - Highlights: ► HfO2-based capacitor-like structures were fabricated with Pt- and Ti-based electrodes. ► Influence of electrode materials on switching parameter variability is assessed.
International Nuclear Information System (INIS)
Gogolak, C.V.
1986-11-01
The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored ⁸⁵Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
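The likelihood described above — a zero-inflation probability plus a lognormal for positives, left-censored at the detection limit — can be maximized numerically in a short script. This is a minimal sketch on simulated data (all parameter values and the optimizer choice are assumptions), not the paper's analytical treatment:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# Simulate: true zeros with probability p0, otherwise lognormal; values
# below the detection limit DL are censored (indistinguishable from zero).
p0_true, mu_true, sigma_true, DL = 0.2, 0.0, 1.0, 0.3
n = 5000
is_zero = rng.random(n) < p0_true
conc = np.where(is_zero, 0.0, rng.lognormal(mu_true, sigma_true, n))
observed = conc[conc >= DL]          # uncensored positive observations
m = n - observed.size                # censored count (true zeros + values < DL)

def neg_log_lik(theta):
    p0, mu, sigma = theta
    # P(censored) = p0 + (1 - p0) * P(lognormal < DL)
    p_cens = p0 + (1 - p0) * stats.norm.cdf((np.log(DL) - mu) / sigma)
    ll = m * np.log(p_cens)
    ll += observed.size * np.log(1 - p0)
    ll += stats.lognorm.logpdf(observed, s=sigma, scale=np.exp(mu)).sum()
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.1, 0.5, 0.8],
                        bounds=[(1e-6, 1 - 1e-6), (-3, 3), (1e-3, 5)])
p0_hat, mu_hat, sigma_hat = res.x
print(f"p0 = {p0_hat:.3f}, mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```

As the abstract notes, the positive observations effectively estimate the truncated distribution's parameters, and the zero-proportion estimate then follows from the censored count.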
Directory of Open Access Journals (Sweden)
Helmut Prodinger
2007-01-01
Full Text Available In words generated by independent geometrically distributed random variables, we study the ℓth descent, which is, roughly speaking, the ℓth occurrence of a neighbouring pair ab with a > b. The value a is called the initial height, and b the end height. We study these two random variables (and some similar ones) by combinatorial and probabilistic tools. We find in all instances a generating function Ψ(v,u), where the coefficient of v^j u^i refers to the jth descent (ascent), and i to the initial (end) height. From this, various conclusions can be drawn, in particular expected values. In the probabilistic part, a Markov chain model is used, which allows us to get explicit expressions for the heights of the second descent. In principle, one could go further, but the complexity of the results forbids it. This is extended to permutations of a large number of elements. Methods from q-analysis are used to simplify the expressions. This is the reason that we confine ourselves to the geometric distribution only. For general discrete distributions, no such tools are available.
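The objects studied above are concrete enough to simulate: generate words of geometric letters, find the first descent, and record its initial and end heights. The parameter q, word length, and trial count below are assumptions of this Monte Carlo sketch, which only estimates the expectations the paper derives exactly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Words of geometrically distributed letters: letter k occurs with
# probability q * (1 - q)^(k - 1), k = 1, 2, 3, ...
q, word_len, trials = 0.5, 50, 20_000

def first_descent(word):
    """Return (initial height, end height) of the first descent, or None."""
    for a, b in zip(word, word[1:]):
        if a > b:                  # a descent is a neighbouring pair with a > b
            return a, b
    return None

heights = []
for _ in range(trials):
    word = rng.geometric(q, size=word_len)
    d = first_descent(word)
    if d is not None:
        heights.append(d)

heights = np.array(heights)
print(f"mean initial height: {heights[:, 0].mean():.3f}")
print(f"mean end height:     {heights[:, 1].mean():.3f}")
```

By definition the initial height exceeds the end height in every descent; the generating function Ψ(v,u) of the paper encodes the full joint distribution of these heights for every ℓ.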
Sharma, Vivek Kumar; Subramanian, Senthil Kumar; Radhakrishnan, Krishnakumar; Rajendran, Rajathi; Ravindran, Balasubramanian Sulur; Arunachalam, Vinayathan
2017-05-01
Physical inactivity contributes to many health issues. The WHO-recommended physical activity for adolescents encompasses aerobic, resistance, and bone strengthening exercises aimed at achieving health-related physical fitness. Heart rate variability (HRV) and maximal aerobic capacity (VO2max) are considered noninvasive measures of cardiovascular health. The objective of this study was to compare the effect of structured and unstructured physical training on maximal aerobic capacity and HRV among adolescents. We designed a single-blinded, parallel, randomized active-controlled trial (Registration No. CTRI/2013/08/003897) to compare the physiological effects of 6 months of globally recommended structured physical activity (SPA) with that of unstructured physical activity (USPA) in healthy school-going adolescents. We recruited 439 healthy student volunteers (boys: 250, girls: 189) in the age group of 12-17 years. Randomization across the groups was done using an age- and gender-stratified randomization method, and the participants were divided into two groups: SPA (n=219, boys: 117, girls: 102) and USPA (n=220, boys: 119, girls: 101). Depending on their training status and gender, the participants in both SPA and USPA groups were further subdivided into the following four sub-groups: SPA athlete boys (n=22) and girls (n=17), SPA nonathlete boys (n=95) and girls (n=85), USPA athlete boys (n=23) and girls (n=17), and USPA nonathlete boys (n=96) and girls (n=84). We recorded HRV, body fat %, and VO2max using the Rockport Walk Fitness test before and after the intervention. Maximal aerobic capacity and heart rate variability increased significantly, while heart rate, systolic blood pressure, diastolic blood pressure, and body fat percentage decreased significantly after both the SPA and USPA interventions. However, the improvement was greater with SPA than with USPA. SPA is more beneficial for improving cardiorespiratory fitness, HRV, and reducing body fat percentage in terms of…
Forni, Valentina; Bianchi, Giorgia; Ogna, Adam; Salvadé, Igor; Vuistiner, Philippe; Burnier, Michel; Gabutti, Luca
2013-07-22
In a simulation based on a pharmacokinetic model we demonstrated that increasing the erythropoiesis stimulating agents (ESAs) half-life or shortening their administration interval decreases hemoglobin variability. The benefit of reducing the administration interval was, however, lessened by the variability induced by more frequent dosage adjustments. The purpose of this study was to analyze the reticulocyte and hemoglobin kinetics and variability under different ESAs and administration intervals in a cohort of chronic hemodialysis patients. The study was designed as an open-label, randomized, four-period cross-over investigation, including 30 patients under chronic hemodialysis at the regional hospital of Locarno (Switzerland) in February 2010 and lasting 2 years. Four subcutaneous treatment strategies (C.E.R.A. every 4 weeks Q4W and every 2 weeks Q2W, Darbepoetin alfa Q4W and Q2W) were compared with each other. The mean square successive difference of hemoglobin, reticulocyte count and ESAs dose was used to quantify variability. We distinguished short- and long-term variability based on the weekly and monthly successive differences, respectively. No difference was found in the mean values of biological parameters (hemoglobin, reticulocytes, and ferritin) between the 4 strategies. ESAs type did not affect hemoglobin and reticulocyte variability, but C.E.R.A. induced a more sustained reticulocyte response over time and increased the risk of hemoglobin overshooting (OR 2.7, p = 0.01). Shortening the administration interval lessened the amplitude of reticulocyte count fluctuations but resulted in more frequent ESAs dose adjustments and in amplified reticulocyte and hemoglobin variability. The Q2W administration interval was, however, more favorable in terms of ESAs dose, allowing a 38% C.E.R.A. dose reduction and no increase of Darbepoetin alfa. The reticulocyte dynamic was a more sensitive marker of time instability of the hemoglobin response under ESAs therapy.
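The variability measure named in this abstract, the mean square successive difference, is simple to compute; the sketch below uses invented hemoglobin series, not study data.

```python
import numpy as np

def mssd(series):
    """Mean square successive difference: mean of squared first differences."""
    x = np.asarray(series, dtype=float)
    return float(np.mean(np.diff(x) ** 2))

# Made-up monthly hemoglobin values (g/dL), not patient data: similar level,
# very different stability between the two series.
hb_stable = [11.0, 11.2, 11.1, 11.0, 11.1]
hb_swinging = [11.0, 12.5, 10.4, 12.8, 10.6]
```

Unlike a plain variance, the MSSD penalizes visit-to-visit swings rather than distance from the mean, which is why it suits the short-term variability analysis described above.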
Vila-Castelar, Clara; Ly, Jenny J; Kaplan, Lillian; Van Dyk, Kathleen; Berger, Jeffrey T; Macina, Lucy O; Stewart, Jennifer L; Foldi, Nancy S
2018-04-09
Donepezil is widely used to treat Alzheimer's disease (AD), but detecting early response remains challenging for clinicians. Acetylcholine is known to directly modulate attention, particularly under high cognitive conditions, but no studies to date test whether measures of attention under high load can detect early effects of donepezil. We hypothesized that load-dependent attention tasks are sensitive to short-term treatment effects of donepezil, while global and other domain-specific cognitive measures are not. This longitudinal, randomized, double-blind, placebo-controlled pilot trial (ClinicalTrials.gov Identifier: NCT03073876) evaluated 23 participants newly diagnosed with AD initiating de novo donepezil treatment (5 mg). After baseline assessment, participants were randomized into Drug (n = 12) or Placebo (n = 11) groups, and retested after approximately 6 weeks. Cognitive assessment included: (a) attention tasks (Foreperiod Effect, Attentional Blink, and Covert Orienting tasks) measuring processing speed, top-down accuracy, orienting, intra-individual variability, and fatigue; (b) global measures (Alzheimer's Disease Assessment Scale-Cognitive Subscale, Mini-Mental Status Examination, Dementia Rating Scale); and (c) domain-specific measures (memory, language, visuospatial, and executive function). The Drug but not the Placebo group showed benefits of treatment at high-load measures by preserving top-down accuracy, improving intra-individual variability, and averting fatigue. In contrast, other global or cognitive domain-specific measures could not detect treatment effects over the same treatment interval. This pilot study suggests that attention measures targeting accuracy, variability, and fatigue under high-load conditions could be sensitive to short-term cholinergic treatment. Given the central role of acetylcholine in attentional function, load-dependent attentional measures may be valuable cognitive markers of early treatment response.
DEFF Research Database (Denmark)
Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L
2008-01-01
BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefore…
Directory of Open Access Journals (Sweden)
Fabrizio Maturo
2016-06-01
In practical applications relating to business and management sciences, there are many variables that, by their own nature, are better described by a pair of ordered values (e.g., financial data). Summarizing such a measurement with a single value loses information; thus, in these situations, data are better described by interval values rather than by single values. Interval arithmetic studies and analyzes this type of imprecision; however, if the intervals have no sharp boundaries, fuzzy set theory is the most suitable instrument. Moreover, fuzzy regression models are able to overcome some typical limitations of classical regression because they do not need the same strong assumptions. In this paper, we present a review of the main methods introduced in the literature on this topic and introduce some recent developments regarding the concept of randomness in fuzzy regression.
Directory of Open Access Journals (Sweden)
Nuria eRuffini
2015-08-01
Context: Heart rate variability (HRV) indicates how heart rate changes in response to inner and external stimuli. HRV is linked to health status and is an indirect marker of autonomic nervous system (ANS) function. Objective: To investigate the influence of osteopathic manipulative treatment (OMT) on ANS activity through changes in High Frequency, a heart rate variability index indicating parasympathetic activity, in healthy subjects, compared with sham therapy and a control group. Methods: Sixty-six healthy subjects, both male and female, were included in the present 3-armed, randomized, placebo-controlled, within-subject, cross-over, single-blinded study. Participants were asymptomatic adults, both smokers and non-smokers, and not on medications. At enrollment subjects were randomized into 3 groups: A, B, C. A standardized structural evaluation followed by a patient need-based osteopathic treatment was performed in the first session of group A and in the second session of group B. A standardized evaluation followed by a protocoled sham treatment was provided in the second session of group A and in the first session of group B. No intervention was performed in the two sessions of group C, acting as a time control. The trial was registered on ClinicalTrials.gov, identifier: NCT01908920. Main Outcome Measures: HRV was calculated from electrocardiography before, during and after the intervention, for a total time of 25 minutes. Results: OMT engendered a statistically significant increase of parasympathetic activity, as shown by the High Frequency rate (p<0.001), and a decrease of sympathetic activity, as revealed by the Low Frequency rate (p<0.01); results also showed a reduction of the Low Frequency/High Frequency ratio (p<0.001) and of the detrended fluctuation scaling exponent (p<0.05). Conclusions: Findings suggested that OMT can influence ANS activity, increasing parasympathetic function and decreasing sympathetic activity, compared to sham therapy and a control group.
Menezes, Regina; Rodriguez-Mateos, Ana; Kaltsatou, Antonia; González-Sarrías, Antonio; Greyling, Arno; Giannaki, Christoforos; Andres-Lacueva, Cristina; Milenkovic, Dragan; Gibney, Eileen R.; Dumont, Julie; Schär, Manuel; Garcia-Aloy, Mar; Palma-Duran, Susana Alejandra; Ruskovska, Tatjana; Maksimova, Viktorija; Combet, Emilie; Pinto, Paula
2017-01-01
Several epidemiological studies have linked flavonols with decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk such as blood lipids, blood pressure and plasma glucose, as well as factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08), and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to those who were healthy or had normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of origin and health status may influence the effect of flavonol intake on blood lipid levels. PMID:28208791
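The pooling step described in this abstract can be illustrated with a minimal inverse-variance (fixed-effect) sketch; the per-trial numbers below are hypothetical, not the 18 trials of this meta-analysis.

```python
import numpy as np

def fixed_effect_meta(means, ses):
    """Inverse-variance weighted pooled difference in means with a 95% CI."""
    means, ses = np.asarray(means, float), np.asarray(ses, float)
    w = 1.0 / ses ** 2                       # weight each trial by 1/SE^2
    pooled = np.sum(w * means) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-trial differences in total cholesterol (mmol/L) and
# their standard errors -- invented for illustration.
dm = [-0.12, -0.05, -0.15]
se = [0.06, 0.04, 0.09]
pooled, ci = fixed_effect_meta(dm, se)
```

A random-effects model (as also used in the abstract) would additionally add a between-trial variance component to each weight.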
Bhatti, A; Khan, J; Murki, S; Sundaram, V; Saini, S S; Kumar, P
2015-11-01
To compare failure rates between a Jet continuous positive airway pressure device (J-CPAP, variable flow) and a Bubble continuous positive airway pressure device (B-CPAP) in preterm infants with respiratory distress. Preterm newborns with respiratory distress were randomized to J-CPAP (a variable-flow device) or B-CPAP (a continuous-flow device). A standardized protocol was followed for titration, weaning and removal of CPAP. Pressure was monitored close to the nares with both devices every 6 hours and settings were adjusted to provide the desired CPAP. The primary outcome was the CPAP failure rate within 72 h of life. Secondary outcomes were CPAP failure within 7 days of life, need for surfactant post-randomization, time to CPAP failure, duration of CPAP and complications of prematurity. An intention-to-treat analysis was done. One hundred seventy neonates were randomized, 80 to J-CPAP and 90 to B-CPAP. CPAP failure rates within 72 h were similar in infants who received J-CPAP and in those who received B-CPAP (29 versus 21%; relative risk 1.4 (0.8 to 2.3), P=0.25). Mean (95% confidence intervals) time to CPAP failure was 59 h (54 to 64) in the Jet CPAP group in comparison with 65 h (62 to 68) in the Bubble CPAP group (log rank P=0.19). All other secondary outcomes were similar between the two groups. In preterm infants with respiratory distress starting within 6 h of life, CPAP failure rates were similar with Jet CPAP and Bubble CPAP.
Sandberg, Jonna C; Björck, Inger M E; Nilsson, Anne C
2016-01-01
Whole grain has shown potential to prevent obesity, cardiovascular disease and type 2 diabetes. A possible mechanism could be related to colonic fermentation of specific indigestible carbohydrates, i.e. dietary fiber (DF). The aim of this study was to investigate effects on cardiometabolic risk factors and appetite regulation the next day when ingesting rye kernel bread rich in DF as an evening meal. Whole grain rye kernel test bread (RKB) or a white wheat flour based bread (reference product, WWB) was provided as a late evening meal to healthy young adults in a randomized cross-over design. The test products RKB and WWB were provided in two priming settings: as a single evening meal or as three consecutive evening meals prior to the experimental days. Test variables were measured in the morning, 10.5-13.5 hours after ingestion of RKB or WWB. The postprandial phase was analyzed for measures of glucose metabolism, inflammatory markers, appetite regulating hormones and short chain fatty acids (SCFA) in blood, hydrogen excretion in breath and subjective appetite ratings. With the exception of serum CRP, no significant differences in test variables were observed depending on length of priming (P>0.05). The RKB evening meal increased plasma concentrations of PYY (0-120 min) and suppressed subjective appetite ratings during the whole experimental period; these effects on appetite sensation could be beneficial in preventing obesity. These effects could possibly be mediated through colonic fermentation. ClinicalTrials.gov NCT02093481.
Ni, Hsing-Chang; Hwang Gu, Shoou-Lian; Lin, Hsiang-Yuan; Lin, Yu-Ju; Yang, Li-Kuang; Huang, Hui-Chun; Gau, Susan Shur-Fen
2016-05-01
Intra-individual variability in reaction time (IIV-RT) is common in individuals with attention-deficit/hyperactivity disorder (ADHD). It can be improved by stimulants. However, the effects of atomoxetine on IIV-RT are inconclusive. We aimed to investigate the effects of atomoxetine on IIV-RT, and directly compared its efficacy with methylphenidate in adults with ADHD. An 8-10 week, open-label, head-to-head, randomized clinical trial was conducted in 52 drug-naïve adults with ADHD, who were randomly assigned to two treatment groups: immediate-release methylphenidate (n=26) thrice daily (10-20 mg per dose) and atomoxetine once daily (n=26) (0.5-1.2 mg/kg/day). IIV-RT, derived from the Conners' continuous performance test (CCPT), was represented by the Gaussian (reaction time standard error, RTSE) and ex-Gaussian models (sigma and tau). Other neuropsychological functions, including response errors and mean reaction time, were also measured. Participants received CCPT assessments at baseline and at week 8-10 (60.4±6.3 days). We found comparable improvements in performance on the CCPT between the immediate-release methylphenidate- and atomoxetine-treated groups. Both medications significantly improved IIV-RT in terms of reducing tau values, with comparable efficacy. In addition, both medications significantly improved inhibitory control by reducing commission errors. Our results provide evidence that atomoxetine can improve IIV-RT and inhibitory control with efficacy comparable to immediate-release methylphenidate in drug-naïve adults with ADHD. The shared and unique mechanisms underpinning these medication effects on IIV-RT await further investigation. © The Author(s) 2016.
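The ex-Gaussian decomposition of IIV-RT used in this trial can be sketched with classic moment estimators, exploiting the fact that the third central moment of an ex-Gaussian equals 2τ³. All parameter values below are illustrative, not trial data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Ex-Gaussian model of reaction times: a Gaussian (mu, sigma) component
# plus an exponential tail (tau); tau indexes the occasional very long
# RTs that drive intra-individual variability. Values in milliseconds.
mu, sigma, tau, n = 400.0, 50.0, 150.0, 20_000
rt = rng.normal(mu, sigma, n) + rng.exponential(tau, n)

# Moment estimators: third central moment = 2 * tau**3,
# variance = sigma**2 + tau**2, mean = mu + tau.
m = rt.mean()
m2 = np.mean((rt - m) ** 2)
m3 = np.mean((rt - m) ** 3)
tau_hat = (m3 / 2.0) ** (1.0 / 3.0)
mu_hat = m - tau_hat
sigma_hat = np.sqrt(max(m2 - tau_hat ** 2, 0.0))
```

Maximum-likelihood fitting is the more common choice in practice; the moment version above just makes the role of tau transparent.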
Asymptotics for Associated Random Variables
Oliveira, Paulo Eduardo
2012-01-01
The book concerns the notion of association in probability and statistics. Association and some other positive dependence notions were introduced in 1966 and 1967 but received little attention from the probabilistic and statistics community. The interest in these dependence notions increased in the last 15 to 20 years, and many asymptotic results were proved and improved. Despite this increased interest, characterizations and results remained essentially scattered in the literature published in different journals. The goal of this book is to bring together the bulk of these results, presenting
Operant Variability: Some Random Thoughts
Marr, M. Jackson
2012-01-01
Barba's (2012) paper is a serious and thoughtful analysis of a vexing problem in behavior analysis: Just what should count as an operant class and how do people know? The slippery issue of a "generalized operant" or functional response class illustrates one aspect of this problem, and "variation" or "novelty" as an operant appears to fall into…
Burnout in Customer Service Representatives
Directory of Open Access Journals (Sweden)
Tariq Jalees
2008-09-01
The purpose and aim of this research was to (1) identify the factors that contribute towards job burnout in sales service representatives, (2) examine the relationships among these factors, and (3) empirically test the relationships of the determinants relating to burnout in customer service representatives. Based on a literature survey, six different variables related to burnout were identified: (1) emotional exhaustion, (2) reduced personal accomplishment, (3) job-induced tension, (4) job satisfaction, (5) workload, and (6) depersonalization. Each of the variables contained 3 sub-variables. Five different hypotheses were developed and tested through techniques such as the Z-test, F-test and regression analysis. The questionnaire administered for the study contained 15 questions including personal data. The subjects were Moblink company customer sales service representatives in Karachi. The valid sample size was 98, drawn through a multi-cluster technique. Techniques such as measures of dispersion and measures of central tendency were used for analyzing the data. Regression, Z-tests, and F-tests were used for testing the developed hypotheses. According to the respondents' opinions, reduced personal accomplishment had the highest rating with a mean of 3.75, and job-induced tension had the lowest mean of 3.58. The standard deviation of respondents' opinions was highest for the dimension depersonalization and lowest for the dimension workload, indicating that polarization of the respondents' opinions was highest on depersonalization and lowest on workload. The skewness was negative for all the dimensions except emotional exhaustion and workload, indicating that the majority of respondents' opinions on all the dimensions were below the mean except in the case of emotional exhaustion and workload. Five hypotheses were developed and tested: (a) the hypothesis relating to low levels of burnout in customers
Directory of Open Access Journals (Sweden)
Aras Dicle
2016-03-01
Regular physical activity can cause some long-term effects on the human body. The purpose of this research was to examine the effect of sport rock climbing (SRC) training at the 70% HRmax level on echocardiography (ECHO) and heart rate variability (HRV) for one hour a day, three days a week, over an eight-week period. A total of 19 adults participated in this study voluntarily. The subjects were randomly divided into two groups, experimental (EG) and control (CG). While the EG did climbing training using the top-rope method for 60 minutes a day, three days a week for 8 weeks and didn’t join any other physical activity programs, the CG didn’t train or take part in any physical activity during the course of the study. The same measurements were repeated at the end of eight weeks. According to the findings, no significant change was observed in any of the ECHO and HRV parameters. However, an improvement was seen in some HRV parameters [average heart rate (HRave), standard deviation of all NN intervals (SDNN), standard deviation of the averages of NN intervals in all five-minute segments of the entire recording (SDANN), percent of differences between adjacent NN intervals that are greater than 50 ms (PNN50), square root of the mean of the sum of the squares of differences between adjacent NN intervals (RMSSD)] in the EG. An exercise program based on SRC should last longer than eight weeks in order to produce statistically significant changes reflecting an improvement in heart structure and functions. Keywords: echocardiography, heart rate variability, sport rock climbing
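The HRV indices defined in the bracketed list above can be computed directly from an RR-interval series. A minimal sketch follows; SDANN is omitted because it requires five-minute segment averages over a long recording, and the RR values are invented.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Common time-domain HRV indices from a series of NN intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    d = np.diff(rr)
    return {
        "HRave": 60000.0 / rr.mean(),                # average heart rate, bpm
        "SDNN": rr.std(ddof=1),                      # SD of all NN intervals, ms
        "RMSSD": np.sqrt(np.mean(d ** 2)),           # RMS of successive diffs, ms
        "PNN50": 100.0 * np.mean(np.abs(d) > 50.0),  # % successive diffs > 50 ms
    }

# Illustrative RR-interval series (ms); not data from the study
rr = [800, 810, 790, 850, 795, 805, 870, 800]
out = hrv_time_domain(rr)
```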
Nam, Sung Sik
2017-06-19
Complex wireless transmission systems require multi-dimensional joint statistical techniques for performance evaluation. Here, we first present exact closed-form results on order statistics of arbitrary partial sums of Gamma random variables, together with closed-form results for the core functions specialized to independent and identically distributed Nakagami-m fading channels, based on a moment generating function-based unified analytical framework. Neither of these exact closed-form results has previously been published in the literature. In addition, we present a feasible application example in which our newly derived closed-form results can be applied. In particular, we analyze the outage performance of finger replacement schemes over Nakagami fading channels as an application of our method. Note that these analysis results are directly applicable to several applications, such as millimeter-wave communication systems in which an antenna diversity scheme operates using a finger replacement scheme-like combining scheme, and other fading scenarios. Note also that the statistical results can provide potential solutions for ordered statistics in any other research topics based on Gamma distributions or other advanced wireless communications research topics in the presence of Nakagami fading.
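The ordered partial sums of Gamma variates analyzed in this abstract can also be explored by Monte Carlo. The sketch below is ours, not the paper's closed forms: it simulates the sum of the k largest of L i.i.d. branch SNRs under Nakagami-m fading (where the per-branch SNR is Gamma distributed) and an empirical outage probability, with invented parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

# Under Nakagami-m fading, the per-branch SNR is Gamma(m, avg_snr / m).
m, avg_snr, L, k = 2.0, 1.0, 5, 3
n_trials = 200_000
snr = rng.gamma(shape=m, scale=avg_snr / m, size=(n_trials, L))

# Partial sum of the k largest order statistics (e.g. a k-finger combiner
# selecting the k strongest of L branches).
top_k_sum = np.sort(snr, axis=1)[:, -k:].sum(axis=1)
outage_prob = float(np.mean(top_k_sum < 1.0))  # empirical outage at threshold 1
```

Closed-form expressions such as the paper's replace this sampling with exact joint distributions of the ordered partial sums.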
Brugnera, Agostino; Zarbo, Cristina; Tarvainen, Mika P; Marchettini, Paolo; Adorni, Roberta; Compare, Angelo
2018-05-01
Acute psychosocial stress is typically investigated in laboratory settings using protocols with distinctive characteristics. For example, some tasks involve the action of speaking, which seems to alter Heart Rate Variability (HRV) through acute changes in respiration patterns. However, it is still unknown which task induces the strongest subjective and autonomic stress response. The present cross-over randomized trial sought to investigate the differences in perceived stress and in linear and non-linear analyses of HRV between three different verbal (Speech and Stroop) and non-verbal (Montreal Imaging Stress Task; MIST) stress tasks, in a sample of 60 healthy adults (51.7% females; mean age = 25.6 ± 3.83 years). Analyses were run controlling for respiration rates. Participants reported similar levels of perceived stress across the three tasks. However, MIST induced a stronger cardiovascular response than the Speech and Stroop tasks, even after controlling for respiration rates. Finally, women reported higher levels of perceived stress and showed lower HRV both at rest and in response to acute psychosocial stressors, compared to men. Taken together, our results suggest the presence of gender-related differences during psychophysiological experiments on stress. They also suggest that verbal activity masked the vagal withdrawal through the altered respiration patterns imposed by speaking. Therefore, our findings support the use of a highly standardized math task, such as the MIST, as a valid and reliable alternative to verbal protocols during laboratory studies on stress. Copyright © 2018 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Yingxue Cui
2013-01-01
Objective. To determine the effects of moxa smoke on human heart rate (HR) and heart rate variability (HRV). Methods. Fifty-five healthy young adults were randomly divided into experimental (n=28) and control (n=27) groups. Experimental subjects were exposed to moxa smoke (2.5 ± 0.5 mg/m³) twice for 25 minutes in one week. ECG monitoring was performed before, during, and after exposure. Control subjects were exposed to normal indoor air in a similar environment and similarly monitored. Follow-up was performed the following week. Short-term (5 min) HRV parameters were analyzed with HRV analysis software. SPSS software was used for statistical analysis. Results. During and after the first exposure, comparison of percentage changes or changes in all parameters between groups showed no significant differences. During the second exposure, the percentage decrease in HR, percentage increases in lnTP, lnHF, lnLF, and RMSSD, and increase in PNN50 were significantly greater in the experimental group than in the control group. Conclusion. No significant adverse HRV effects were associated with this clinically routine 25-minute exposure to moxa smoke, and the data suggest that short-term exposure to moxa smoke might have positive regulating effects on human autonomic function. Further studies are warranted to confirm these findings.
Parr, Evelyn B; Coffey, Vernon G; Cato, Louise E; Phillips, Stuart M; Burke, Louise M; Hawley, John A
2016-05-01
This study determined the effects of 16-week high-dairy-protein, variable-carbohydrate (CHO) diets and exercise training (EXT) on body composition in men and women with overweight/obesity. One hundred and eleven participants (age 47 ± 6 years, body mass 90.9 ± 11.7 kg, BMI 33 ± 4 kg/m², values mean ± SD) were randomly stratified to diets with either: high dairy protein, moderate CHO (40% CHO: 30% protein: 30% fat; ∼4 dairy servings); high dairy protein, high CHO (55%: 30%: 15%; ∼4 dairy servings); or control (55%: 15%: 30%; ∼1 dairy serving). Energy restriction (500 kcal/day) was achieved through diet (∼250 kcal/day) and EXT (∼250 kcal/day). Body composition was measured using dual-energy X-ray absorptiometry before, midway, and upon completion of the intervention. Eighty-nine (25 M/64 F) of 115 participants completed the 16-week intervention, losing 7.7 ± 3.2 kg of fat mass. © 2016 The Obesity Society.
Redwine, Laura S; Henry, Brook L; Pung, Meredith A; Wilson, Kathleen; Chinh, Kelly; Knight, Brian; Jain, Shamini; Rutledge, Thomas; Greenberg, Barry; Maisel, Alan; Mills, Paul J
2016-01-01
Stage B, asymptomatic heart failure (HF) presents a therapeutic window for attenuating disease progression and development of HF symptoms, and improving quality of life. Gratitude, the practice of appreciating positive life features, is highly related to quality of life, leading to development of promising clinical interventions. However, few gratitude studies have investigated objective measures of physical health; most relied on self-report measures. We conducted a pilot study in Stage B HF patients to examine whether gratitude journaling improved biomarkers related to HF prognosis. Patients (n = 70; mean [standard deviation] age = 66.2 [7.6] years) were randomized to an 8-week gratitude journaling intervention or treatment as usual. Baseline (T1) assessments included the six-item Gratitude Questionnaire, resting heart rate variability (HRV), and an inflammatory biomarker index. At T2 (midintervention), the six-item Gratitude Questionnaire was measured. At T3 (postintervention), T1 measures were repeated but also included a gratitude journaling task. The gratitude intervention was associated with improved trait gratitude scores (F = 6.0, p = .017, η = 0.10), reduced inflammatory biomarker index score over time (F = 9.7, p = .004, η = 0.21), and increased parasympathetic HRV responses during the gratitude journaling task (F = 4.2, p = .036, η = 0.15), compared with treatment as usual. However, there were no resting preintervention to postintervention group differences in HRV (p values > .10). Gratitude journaling may improve biomarkers related to HF morbidity, such as reduced inflammation; large-scale studies with active control conditions are needed to confirm these findings. ClinicalTrials.gov identifier: NCT01615094.
A Statistical and Spectral Model for Representing Noisy Sounds with Short-Time Sinusoids
Directory of Open Access Journals (Sweden)
Myriam Desainte-Catherine
2005-07-01
We propose an original model for noise analysis, transformation, and synthesis: the CNSS model. Noisy sounds are represented with short-time sinusoids whose frequencies and phases are random variables. This spectral and statistical model represents information about the spectral density of frequencies. This perceptually relevant property is modeled by three mathematical parameters that define the distribution of the frequencies. This model also represents the spectral envelope. The mathematical parameters are defined and the analysis algorithms to extract these parameters from sounds are introduced. Then algorithms for generating sounds from the parameters of the model are presented. Applications of this model include tools for composers, psychoacoustic experiments, and pedagogy.
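The core idea of the abstract, frames of sinusoids whose random frequencies and phases are drawn from a distribution that shapes the spectral density, can be sketched as follows. This is a toy version with assumed band limits and frame sizes, not the published CNSS analysis/synthesis algorithms.

```python
import numpy as np

rng = np.random.default_rng(3)
sr, frame_len, n_frames, n_partials = 44100, 512, 40, 60

# Each short-time frame sums sinusoids whose frequencies are drawn from a
# chosen distribution (here uniform over a band; its parameters are the
# model's handle on spectral density) and whose phases are uniform random.
t = np.arange(frame_len) / sr
frames = []
for _ in range(n_frames):
    freqs = rng.uniform(200.0, 4000.0, n_partials)    # random frequencies (Hz)
    phases = rng.uniform(0.0, 2 * np.pi, n_partials)  # random phases
    frame = np.sin(2 * np.pi * freqs[:, None] * t + phases[:, None]).sum(axis=0)
    frames.append(frame / n_partials)                 # keep amplitude bounded
noise = np.concatenate(frames)
```

Reshaping the frequency distribution (and an amplitude envelope per frame) is what turns this skeleton into a controllable noise model.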
DEFF Research Database (Denmark)
Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune
This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing...
DEFF Research Database (Denmark)
Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune
This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...
DEFF Research Database (Denmark)
Asmussen, J. C.; Ibrahim, R.; Brincker, Rune
1998-01-01
This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...
Directory of Open Access Journals (Sweden)
Evropi Theodoratou
Vitamin D deficiency has been associated with several common diseases, including cancer, and is being investigated as a possible risk factor for these conditions. We reported the striking prevalence of vitamin D deficiency in Scotland. Previous epidemiological studies have reported an association between low dietary vitamin D and colorectal cancer (CRC). Using a case-control study design, we tested the association between plasma 25-hydroxy-vitamin D (25-OHD) and CRC (2,001 cases, 2,237 controls). To determine whether plasma 25-OHD levels are causally linked to CRC risk, we applied the control function instrumental variable (IV) method of the mendelian randomization (MR) approach using four single nucleotide polymorphisms (rs2282679, rs12785878, rs10741657, rs6013897) previously shown to be associated with plasma 25-OHD. Low plasma 25-OHD levels were associated with CRC risk in the crude model (odds ratio (OR): 0.76, 95% confidence interval (CI): 0.71, 0.81, p: 1.4×10⁻¹⁴) and after adjusting for age, sex and other confounding factors. Using an allele score that combined all four SNPs as the IV, the estimated causal effect was OR 1.16 (95% CI 0.60, 2.23), whilst it was 0.94 (95% CI 0.46, 1.91) and 0.93 (95% CI 0.53, 1.63) when using an upstream (rs12785878, rs10741657) and a downstream (rs2282679, rs6013897) allele score, respectively. 25-OHD levels were inversely associated with CRC risk, in agreement with recent meta-analyses. The fact that this finding was not replicated when the MR approach was employed might be due to weak instruments, giving low power to demonstrate an effect (power < 0.35). The prevalence and degree of vitamin D deficiency amongst individuals living in northerly latitudes is of considerable importance because of its relationship to disease. To elucidate the effect of vitamin D on CRC risk, additional large studies of vitamin D and CRC risk are required and/or the application of alternative methods that are less sensitive to weak instruments.
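The allele-score IV logic used in this study can be illustrated with a toy Wald-ratio simulation; everything below (effect sizes, allele frequency, the confounder) is invented and unrelated to the study's data or control function estimator.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Toy Mendelian randomization setup: an allele score G instruments the
# exposure X (think 25-OHD) to dodge confounding by an unmeasured U when
# estimating the causal effect beta_true of X on the outcome Y.
G = rng.binomial(2, 0.3, n).astype(float)      # allele count 0/1/2
U = rng.normal(size=n)                          # unmeasured confounder
X = 0.5 * G + U + rng.normal(size=n)            # exposure
beta_true = 0.2
Y = beta_true * X + U + rng.normal(size=n)      # outcome

naive = np.cov(X, Y)[0, 1] / np.var(X)          # ordinary slope, biased by U
iv = np.cov(G, Y)[0, 1] / np.cov(G, X)[0, 1]    # Wald-ratio IV estimate
```

With a weak instrument (small cov(G, X) in the denominator) the IV estimate becomes noisy, which is exactly the low-power caveat the abstract raises.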
List of Accredited Representatives
Department of Veterans Affairs — VA accreditation is for the sole purpose of providing representation services to claimants before VA and does not imply that a representative is qualified to provide...
Representing vision and blindness.
Ray, Patrick L; Cox, Alexander P; Jensen, Mark; Allen, Travis; Duncan, William; Diehl, Alexander D
2016-01-01
There have been relatively few attempts to represent vision or blindness ontologically. This is unsurprising as the related phenomena of sight and blindness are difficult to represent ontologically for a variety of reasons. Blindness has escaped ontological capture at least in part because: blindness or the employment of the term 'blindness' seems to vary from context to context, blindness can present in a myriad of types and degrees, and there is no precedent for representing complex phenomena such as blindness. We explore current attempts to represent vision or blindness, and show how these attempts fail at representing subtypes of blindness (viz., color blindness, flash blindness, and inattentional blindness). We examine the results found through a review of current attempts and identify where they have failed. By analyzing our test cases of different types of blindness along with the strengths and weaknesses of previous attempts, we have identified the general features of blindness and vision. We propose an ontological solution to represent vision and blindness, which capitalizes on resources afforded to one who utilizes the Basic Formal Ontology as an upper-level ontology. The solution we propose here involves specifying the trigger conditions of a disposition as well as the processes that realize that disposition. Once these are specified we can characterize vision as a function that is realized by certain (in this case) biological processes under a range of triggering conditions. When the range of conditions under which the processes can be realized are reduced beyond a certain threshold, we are able to say that blindness is present. We characterize vision as a function that is realized as a seeing process and blindness as a reduction in the conditions under which the sight function is realized. This solution is desirable because it leverages current features of a major upper-level ontology, accurately captures the phenomenon of blindness, and can be
Chetverikov, Andrey; Campana, Gianluca; Kristjánsson, Árni
2017-10-01
Colors are rarely uniform, yet little is known about how people represent color distributions. We introduce a new method for studying color ensembles based on intertrial learning in visual search. Participants looked for an oddly colored diamond among diamonds with colors taken from either uniform or Gaussian color distributions. On test trials, the targets had various distances in feature space from the mean of the preceding distractor color distribution. Targets on test trials therefore served as probes into probabilistic representations of distractor colors. Test-trial response times revealed a striking similarity between the physical distribution of colors and their internal representations. The results demonstrate that the visual system represents color ensembles in a more detailed way than previously thought, coding not only mean and variance but, most surprisingly, the actual shape (uniform or Gaussian) of the distribution of colors in the environment.
OSMOSE experiment representativity studies.
Energy Technology Data Exchange (ETDEWEB)
Aliberti, G.; Klann, R.; Nuclear Engineering Division
2007-10-10
The OSMOSE program aims at improving the neutronic predictions of advanced nuclear fuels through measurements in the MINERVE facility at the CEA-Cadarache (France) on samples containing the following separated actinides: Th-232, U-233, U-234, U-235, U-236, U-238, Np-237, Pu-238, Pu-239, Pu-240, Pu-241, Pu-242, Am-241, Am-243, Cm-244 and Cm-245. The goal of the experimental measurements is to produce a database of reactivity-worth measurements in different neutron spectra for the separated heavy nuclides. This database can then be used as a benchmark for integral reactivity-worth measurements to verify and validate reactor analysis codes and integral cross-section values for the isotopes tested. In particular, the OSMOSE experimental program will produce very accurate sample reactivity-worth measurements for a series of actinides in various spectra, from very thermalized to very fast. The objective of the analytical program is to make use of the experimental data to establish deficiencies in the basic nuclear data libraries, identify their origins, and provide guidelines for nuclear data improvements in coordination with international programs. To achieve the proposed goals, seven different neutron spectra can be created in the MINERVE facility: UO2 dissolved in water (representative of over-moderated LWR systems), UO2 matrix in water (representative of LWRs), a mixed oxide fuel matrix, two thermal spectra containing large epithermal components (representative of under-moderated reactors), a moderated fast spectrum (representative of fast reactors which have some slowing down in moderators such as lead-bismuth or sodium), and a very hard spectrum (representative of fast reactors with little moderation from reactor coolant). The different spectra are achieved by changing the experimental lattice within the MINERVE reactor. The experimental lattice is the replaceable central part of MINERVE, which establishes the spectrum at the sample location. This configuration
Representing distance, consuming distance
DEFF Research Database (Denmark)
Larsen, Gunvor Riber
Title: Representing Distance, Consuming Distance Abstract: Distance is a condition for corporeal and virtual mobilities, for desired and actual travel, yet it has received relatively little attention as a theoretical entity in its own right. Understandings of and assumptions about distance… are being consumed in contemporary society, in the same way as places, media, cultures and status are being consumed (Urry 1995, Featherstone 2007). An exploration of distance and its representations through contemporary consumption theory could expose what role distance plays in forming…
Czerwiec, M K
2018-02-01
Matthew P. McAllister wrote: "Comic books can and have contributed positively to the discourse about AIDS: images that encourage true education, understanding and compassion can help cope with a biomedical condition which has more than a biomedical relevance" [1]. With this in mind, I combined a 23-narrator oral history and my personal memoir about an inpatient Chicago AIDS hospital unit in my book, Taking Turns: Stories from HIV/AIDS Care Unit 371. In doing so, I built upon the existing rich history of HIV/AIDS in comics, which this article will briefly describe. Although not a comprehensive review of the intersection of AIDS and comics, the book is a tour through influences that proved useful to me. In addition, in making my book, I faced a distinct ethical issue with regard to representing patient experiences with HIV/AIDS, and I describe here how I addressed it. © 2018 American Medical Association. All Rights Reserved.
Representative of the municipality
International Nuclear Information System (INIS)
Castellnou Barcelo, J.
2007-01-01
Full text of publication follows. The decommissioning of the Vandellos-I nuclear power plant was a big challenge for the host community of Vandellos i l'Hospitalet de l'Infant and the close-by region. Closing down of the facility resulted in a rise of unemployment and a decrease of municipal income. The public was concerned with three issues: safety, transparency and information about the decommissioning, and economic future. Therefore, from the very beginning, municipal governments entered into negotiations with ENRESA on socio-economic benefits, including local employment in dismantling activities, and other types of financial and non-financial compensation. The ADE business association, i.e. a network of business organisations, was created that guided the allotment of work to local firms. To satisfy public demand, local municipalities focused on the triad of safety, dialogue and local development, considered the three 'pillars of trust'. A Municipal Monitoring Commission was created, made up of representatives of affected municipalities, the regional government, the ADE business association, trade unions, the local university, the NPP management and ENRESA to monitor the dismantling process and regularly inform the local public. Items that were handled by this Commission included: - Work process monitoring. - Workers. - Materials control. - Conventional and radioactive or contaminated waste management. - Emanation waste management (liquid and gas). - Safety (training and accidents). - Surveillance (radiological and environmental: dust, noise). - Effects. - Fulfillment of agreed conditions. A number of communication tools and channels were used, e.g., public information meetings, an information centre, the municipal magazine, the municipal radio station, and meetings with representatives of the local press. Particularly innovative was the idea to ask academics from the University of Tarragona to help with 'translating' technical information into language that could
Directory of Open Access Journals (Sweden)
Silvia Oliveira Ribeiro
2017-07-01
Full Text Available Abstract AIMS Changes resulting from the gestational period may lead to changes in the biomechanics of women, which can alter the performance of functional activities such as sit-to-stand. Thus, the objective of this study was to investigate the influence of a virtual reality-based exercise protocol on the kinematic variables of the sit-to-stand movement in women in their second and third gestational trimesters. METHODS The sample consisted of 44 women selected according to the eligibility criteria, allocated into 4 groups: control group, 2nd trimester (CG2T); experimental group, 2nd trimester (EG2T); control group, 3rd trimester (CG3T); and experimental group, 3rd trimester (EG3T). All the volunteers answered the identification and evaluation form and were sent to the kinematic evaluation through the Qualisys Motion Capture System®. An intervention with game therapy was performed in 12 sessions of 30 minutes each, three times a week. RESULTS No statistically significant differences were found intra-group (P > 0.54) or inter-group (P > 0.059) for kinematic variables. However, there was a tendency for improvement in the analyzed variables after the proposed protocol. CONCLUSIONS The data obtained suggest that the use of the Nintendo Wii Fit Plus® was not able to influence sit-to-stand kinematic variables in the analyzed women.
Separating the contributions of variability and parameter uncertainty in probability distributions
International Nuclear Information System (INIS)
Sankararaman, S.; Mahadevan, S.
2013-01-01
This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters.
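The separation described above can be sketched with a double-loop Monte Carlo based on the law of total variance. The normal model, the parameter values, and the sample sizes below are illustrative assumptions, not the authors' actual sensitivity-analysis setup.

```python
import random

random.seed(42)

# Hypothetical setup: X ~ Normal(mu, sigma) with sigma known (= 2) but mu
# uncertain, mu ~ Normal(10, 1), e.g. estimated from sparse data.
# By the law of total variance,
#   Var(X) = E[Var(X | mu)] + Var(E[X | mu]) = sigma^2 + Var(mu),
# so the two contributions can be estimated with a double-loop Monte Carlo.
SIGMA = 2.0
N_OUTER = 2000  # samples of the uncertain distribution parameter
N_INNER = 200   # samples of the variable given the parameter

cond_means, cond_vars = [], []
for _ in range(N_OUTER):
    mu = random.gauss(10.0, 1.0)  # outer loop: parameter uncertainty
    xs = [random.gauss(mu, SIGMA) for _ in range(N_INNER)]  # inner: variability
    m = sum(xs) / N_INNER
    cond_means.append(m)
    cond_vars.append(sum((x - m) ** 2 for x in xs) / (N_INNER - 1))

grand_mean = sum(cond_means) / N_OUTER
var_from_params = sum((m - grand_mean) ** 2 for m in cond_means) / (N_OUTER - 1)
var_from_variability = sum(cond_vars) / len(cond_vars)
total = var_from_params + var_from_variability

# Analytically the shares are sigma^2 / (sigma^2 + 1) = 4/5 and 1/5
print(f"variability share:           {var_from_variability / total:.2f}")
print(f"parameter-uncertainty share: {var_from_params / total:.2f}")
```

With these assumed values, natural variability dominates; shrinking the parameter uncertainty (more data on mu) shifts the split further toward variability.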
Energy Technology Data Exchange (ETDEWEB)
Fukuma, Masafumi; Sugishita, Sotaro; Umeda, Naoya [Department of Physics, Kyoto University,Kitashirakawa Oiwake-cho, Kyoto 606-8502 (Japan)
2015-07-17
We propose a class of models which generate three-dimensional random volumes, where each configuration consists of triangles glued together along multiple hinges. The models have matrices as the dynamical variables and are characterized by semisimple associative algebras A. Although most of the diagrams represent configurations which are not manifolds, we show that the set of possible diagrams can be drastically reduced such that only (and all of the) three-dimensional manifolds with tetrahedral decompositions appear, by introducing a color structure and taking an appropriate large N limit. We examine the analytic properties when A is a matrix ring or a group ring, and show that the models with matrix ring have a novel strong-weak duality which interchanges the roles of triangles and hinges. We also give a brief comment on the relationship of our models with the colored tensor models.
Papakostas, George I; Martinson, Max A; Fava, Maurizio; Iovieno, Nadia
2016-05-01
The aim of this work is to compare the efficacy of pharmacologic agents for the treatment of major depressive disorder (MDD) and bipolar depression. MEDLINE/PubMed databases were searched for studies published in English between January 1980 and September 2014 by cross-referencing the search term placebo with each of the antidepressant agents identified and with bipolar. The search was supplemented by manual bibliography review. We selected double-blind, randomized, placebo-controlled trials of antidepressant monotherapies for the treatment of MDD and of oral drug monotherapies for the treatment of bipolar depression. 196 trials in MDD and 19 trials in bipolar depression were found eligible for inclusion in our analysis. Data were extracted by one of the authors and checked for accuracy by a second one. Data extracted included year of publication, number of patients randomized, probability of receiving placebo, duration of the trial, baseline symptom severity, dosing schedule, study completion rates, and clinical response rates. Response rates for drug versus placebo in trials of MDD and bipolar depression were 52.7% versus 37.5% and 54.7% versus 40.5%, respectively. The random-effects meta-analysis indicated that drug therapy was more effective than placebo in both MDD (risk ratio for response = 1.373) and bipolar depression (risk ratio = 1.257), with a greater treatment effect size in the MDD trials than in the bipolar depression trials (P = .008). Although a statistically significantly greater treatment effect size was noted in MDD relative to bipolar depression studies, the absolute magnitude of the difference was numerically small. Therefore, the present study suggests no clinically significant differences in the overall short-term efficacy of pharmacologic monotherapies for MDD and bipolar depression. © Copyright 2016 Physicians Postgraduate Press, Inc.
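Random-effects pooling of risk ratios, as used above, is commonly computed with the DerSimonian-Laird estimator of between-trial variance. The sketch below applies it to a handful of hypothetical trial summaries, not the study's actual 196 trials.

```python
import math

# Illustrative-only trial summaries (hypothetical, NOT the study's data):
# log risk ratios and their within-trial variances.
log_rr = [0.60, 0.10, 0.45, -0.05, 0.30]
var_rr = [0.02, 0.05, 0.03, 0.04, 0.02]

# Fixed-effect (inverse-variance) pooled estimate
w = [1.0 / v for v in var_rr]
fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)

# DerSimonian-Laird estimate of the between-trial variance tau^2
q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
df = len(log_rr) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects pooling: tau^2 is added to each trial's variance
w_re = [1.0 / (v + tau2) for v in var_rr]
pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
se = math.sqrt(1.0 / sum(w_re))

print(f"tau^2 = {tau2:.4f}")
print(f"pooled risk ratio = {math.exp(pooled):.3f}")
print(f"95% CI: {math.exp(pooled - 1.96 * se):.3f} "
      f"to {math.exp(pooled + 1.96 * se):.3f}")
```

When tau^2 > 0, the random-effects weights are more equal across trials than the fixed-effect weights, so large trials dominate the pooled estimate less.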
Clinical Implications of Glucose Variability: Chronic Complications of Diabetes
Directory of Open Access Journals (Sweden)
Hye Seung Jung
2015-06-01
Full Text Available Glucose variability has been identified as a potential risk factor for diabetic complications; oxidative stress is widely regarded as the mechanism by which glycemic variability induces diabetic complications. However, there remains no generally accepted gold standard for assessing glucose variability. Representative indices for measuring intraday variability include calculation of the standard deviation along with the mean amplitude of glycemic excursions (MAGE). MAGE is used to measure major intraday excursions and is easily measured using continuous glucose monitoring systems. Despite a lack of randomized controlled trials, recent clinical data suggest that long-term glycemic variability, as determined by variability in hemoglobin A1c, may contribute to the development of microvascular complications. Intraday glycemic variability is also suggested to accelerate coronary artery disease in high-risk patients.
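A simplified MAGE calculation can be sketched as follows. The glucose trace is synthetic, and clinical implementations smooth the data and handle excursion direction more carefully; this shows only the core idea of averaging excursions that exceed one standard deviation.

```python
# Synthetic CGM-like glucose readings (mg/dL), for illustration only
glucose = [90, 95, 130, 180, 160, 110, 85, 100, 150, 200, 170, 120, 95, 105]

mean = sum(glucose) / len(glucose)
sd = (sum((g - mean) ** 2 for g in glucose) / (len(glucose) - 1)) ** 0.5

# Turning points: local minima/maxima of the trace (endpoints included)
turns = [glucose[0]]
for prev, cur, nxt in zip(glucose, glucose[1:], glucose[2:]):
    if (cur - prev) * (nxt - cur) < 0:  # slope changes sign at cur
        turns.append(cur)
turns.append(glucose[-1])

# Excursions: absolute swings between consecutive turning points;
# MAGE averages only those exceeding one standard deviation.
excursions = [abs(b - a) for a, b in zip(turns, turns[1:]) if abs(b - a) > sd]
mage = sum(excursions) / len(excursions)

print(f"SD = {sd:.1f} mg/dL, MAGE = {mage:.1f} mg/dL")
```

On this trace the small 95-to-105 swing is discarded as noise, so MAGE reflects only the four major excursions, which is exactly what distinguishes it from the plain standard deviation.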
Directory of Open Access Journals (Sweden)
Rosemann Thomas
2011-04-01
Full Text Available Abstract Background The purpose of this study was to investigate the effect of short-term supplementation of amino acids before and during a 100 km ultra-marathon on variables of skeletal muscle damage and muscle soreness. We hypothesized that the supplementation of amino acids before and during an ultra-marathon would lead to a reduction in the variables of skeletal muscle damage, a decrease in muscle soreness and an improved performance. Methods Twenty-eight experienced male ultra-runners were divided into two groups, one with amino acid supplementation and the other as a control group. The amino acid group was supplemented with a total of 52.5 g of an amino acid concentrate before and during the 100 km ultra-marathon. Pre- and post-race, creatine kinase, urea and myoglobin were determined. At the same time, the athletes were asked for subjective feelings of muscle soreness. Results Race time was not different between the groups when controlled for personal best time in a 100 km ultra-marathon. The increases in creatine kinase, urea and myoglobin were not different in both groups. Subjective feelings of skeletal muscle soreness were not different between the groups. Conclusions We concluded that short-term supplementation of amino acids before and during a 100 km ultra-marathon had no effect on variables of skeletal muscle damage and muscle soreness.
International Nuclear Information System (INIS)
Lamb, David S.; Denham, James W.; Joseph, David; Matthews, John; Atkinson, Chris; Spry, Nigel A.; Duchesne, Gillian; Ebert, Martin; Steigler, Allison; Delahunt, Brett; D'Este, Catherine
2011-01-01
Purpose: We sought to compare the prognostic value of early prostate-specific antigen (PSA) test-based variables for the 802 eligible patients treated in the Trans-Tasman Radiation Oncology Group 96.01 randomized trial. Methods and Materials: Patients in this trial had T2b, T2c, T3, and T4 N0 prostate cancer and were randomized to 0, 3, or 6 months of neoadjuvant androgen deprivation therapy (NADT) prior to and during radiation treatment at 66 Gy to the prostate and seminal vesicles. The early PSA test-based variables evaluated were the pretreatment initial PSA (iPSA) value, PSA values at 2 and 4 months into NADT, the PSA nadir (nPSA) value after radiation in all patients, and PSA response signatures in men receiving radiation. Comparisons of endpoints were made using Cox models of local progression-free survival, distant failure-free survival, biochemical failure-free survival, and prostate cancer-specific survival. Results: The nPSA value was a powerful predictor of all endpoints regardless of whether NADT was given before radiation. PSA response signatures also predicted all endpoints in men treated by radiation alone. iPSA and PSA results at 2 and 4 months into NADT predicted biochemical failure-free survival but not any of the clinical endpoints. nPSA values correlated with those of iPSA, Gleason grade, and T stage and were significantly higher in men receiving radiation alone than in those receiving NADT. Conclusions: The postradiation nPSA value is the strongest prognostic indicator of all early PSA-based variables. However, its use as a surrogate endpoint needs to take into account its dependence on pretreatment variables and treatment method.
Institute of Scientific and Technical Information of China (English)
Jianping Chu; Xueli Shen; Jun Fan; Changhai Chen; Shuyang Lin
2008-01-01
BACKGROUND: Heart rate variability refers to the beat-to-beat alteration in heart rate. It is usually a slight periodic variation of R-R intervals. Much information about autonomic nervous system balance can be obtained by measuring the heart rate variability of patients. It remains to be shown whether heart rate variability can be used as an index for determining the severity of insomnia and cerebral infarction. OBJECTIVE: This study aimed to analyze the correlation for each frequency spectrum parameter of heart rate variability with an insomnia index, as well as the degree of neurological defects in patients with simple cerebral infarction and cerebral infarction complicated by insomnia. The goal was to verify the feasibility of frequency spectrum parameters of heart rate variability as a marker for insomnia and cerebral infarction. DESIGN: A case-control observation. SETTING: Department of Neurology, First Hospital Affiliated to China Medical University. PARTICIPANTS: Sixty inpatients and/or outpatients with cerebral infarction were admitted to the 202 Hospital of Chinese PLA between December 2005 and October 2006, confirmed by CT, and recruited to the study. According to the insomnia condition (insomnia is defined by a Pittsburgh Sleep Quality Index score > 7), the patients were assigned to a simple cerebral infarction group and a cerebral infarction complicated by insomnia group, with 30 subjects in each group. Thirty additional subjects, who concurrently received examinations and were confirmed to not suffer from cerebral infarction and insomnia, were recruited into the control group. Written informed consent was obtained from each subject for laboratory specimens. The protocol was approved by the Hospital's Ethics Committee. METHODS: Following admission, each subject's neurological impairment was assessed with the National Institutes of Health Stroke Scale and Pittsburgh Sleep Quality Index. Heart rate variability of each subject was measured with an
Ferreira, Amanda M J; Farias-Junior, Luiz F; Mota, Thaynan A A; Elsangedy, Hassan M; Marcadenti, Aline; Lemos, Telma M A M; Okano, Alexandre H; Fayh, Ana P T
2018-01-01
The hypothesis of the central effect of carbohydrate mouth rinse (CMR) on performance improvement in a fed state has not been established, and its psychophysiological responses have not yet been described. The aim of this study was to evaluate the effect of CMR in athletes in a fed state on performance, biochemical and psychophysiological responses compared to ad libitum water intake. Eleven trained male cyclists completed a randomized, crossover trial, which consisted of a 30 km cycle ergometer test at self-selected intensity and in a fed state. Subjects were under random influence of the following interventions: CMR with a 6% unflavored maltodextrin solution; mouth rinsing with a placebo solution (PMR); drinking "ad libitum" (DAL). The time for completion of the test (min), heart rate (bpm) and power (watts), rating of perceived exertion (RPE), affective response, blood glucose (mg/dL) and lactate (mmol/dL) were evaluated before, during and immediately after the test, while insulin (uIL/mL), cortisol (μg/dL) and creatine kinase (U/L) levels were measured before, immediately after the test and 30 min after the test. Time for completion of the 30 km trial did not differ significantly among CMR, PMR and DAL interventions (means = 54.5 ± 2.9, 54.7 ± 2.9 and 54.5 ± 2.5 min, respectively; p = 0.82). RPE and affective response were higher in the DAL intervention, while creatine kinase responses showed no significant difference among interventions. In a fed state, CMR has not caused metabolic changes, and it has not improved physical performance compared to ad libitum water intake, but demonstrated a possible central effect. ReBec registration number: RBR-4vpwkg. Available in http://www.ensaiosclinicos.gov.br/rg/?q=RBR-4vpwkg.
Sample size estimation and sampling techniques for selecting a representative sample
Directory of Open Access Journals (Sweden)
Aamir Omair
2014-01-01
Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables) or standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) for the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied for health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
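The factors listed above combine in the standard formula for estimating a single proportion, n = z² p(1−p)/d², where p is the expected proportion, d the desired margin of error, and z the normal quantile for the chosen confidence level. A minimal sketch with illustrative inputs:

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Required sample size to estimate a proportion p within +/- margin
    at the confidence level implied by z (1.96 for 95%)."""
    n = z ** 2 * p * (1 - p) / margin ** 2
    return math.ceil(n)  # always round up to a whole subject

# Expected prevalence 30%, margin +/- 5 percentage points, 95% confidence:
print(sample_size_proportion(0.30, 0.05))   # 323

# Halving the margin of error roughly quadruples the required sample size:
print(sample_size_proportion(0.30, 0.025))  # 1291
```

Using p = 0.5 gives the most conservative (largest) estimate when the expected proportion is unknown, since p(1−p) is maximized there.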
Dawson, Jacqueline K; Dorff, Tanya B; Todd Schroeder, E; Lane, Christianne J; Gross, Mitchell E; Dieli-Conwright, Christina M
2018-04-03
Prostate cancer patients on androgen deprivation therapy (ADT) experience adverse effects such as lean mass loss, known as sarcopenia, fat gain, and changes in cardiometabolic factors that increase risk of metabolic syndrome (MetS). Resistance training can increase lean mass, reduce body fat, and improve physical function and quality of life, but no exercise interventions in prostate cancer patients on ADT have concomitantly improved body composition and MetS. This pilot trial investigated 12 weeks of resistance training on body composition and MetS changes in prostate cancer patients on ADT. An exploratory aim examined if a combined approach of training and protein supplementation would elicit greater changes in body composition. Prostate cancer patients on ADT were randomized to resistance training and protein supplementation (TRAINPRO), resistance training (TRAIN), protein supplementation (PRO), or control stretching (STRETCH). Exercise groups (EXE = TRAINPRO, TRAIN) performed supervised exercise 3 days per week for 12 weeks, while non-exercise groups (NoEXE = PRO, STRETCH) performed a home-based stretching program. TRAINPRO and PRO received 50 g·day⁻¹ of whey protein. The primary outcome was change in lean mass assessed through dual energy x-ray absorptiometry. Secondary outcomes examined changes in sarcopenia, assessed through appendicular skeletal mass (ASM) index (kg/m²), body fat %, strength, physical function, quality of life, MetS score and the MetS components of waist circumference, blood pressure, glucose, high-density lipoprotein-cholesterol, and triglyceride levels. A total of 37 participants were randomized; 32 participated in the intervention (EXE n = 13; NoEXE n = 19). At baseline, 43.8% of participants were sarcopenic and 40.6% met the criteria for MetS. Post-intervention, EXE significantly improved lean mass (d = 0.9), sarcopenia prevalence (d = 0.8), body fat % (d = 1.1), strength (d = 0.8-3.0), and
Energy Technology Data Exchange (ETDEWEB)
Farooq, S; Iqbal, N; Arif, M [Nuclear Institute for Agriculture and Biology (NIAB), Faisalabad (Pakistan)
1998-03-01
Random Amplified Polymorphic DNA (RAPD) markers were utilized to detect polymorphism between pure lines and commercially available Basmati rice varieties to assess variation which may be helpful in quality control and varietal identification (Basmati-370 and derived radiation induced mutants), differentiation of mutants and parents, and identification of RAPD markers co-segregating with important agronomic traits including plant height, days to flower and grain quality. Basmati varieties were distinguished from non-Basmati varieties with the help of five diagnostic markers which will be useful for detecting mixing of non-Basmati and Basmati rices, currently a serious marketing problem. Different Basmati cultivars were identified with the help of diagnostic RAPD markers which can be used in quality control as well as for "fingerprinting" of cultivars. Different radiation induced mutants were also successfully distinguished from the parents on the basis of variety-specific and mutant-specific markers which will be useful for varietal identification. In addition to this, other markers were also identified which can differentiate mutants from each other and are being used for the fingerprinting of different mutants, particularly the dwarf mutants having similar appearance but different parentage. For identification of RAPD markers co-segregating with plant height and days to flower, 50 F2 plants and four F3 families were studied from a reciprocal cross made between Kashmir Basmati (tall and early) and Basmati-198 (dwarf and late). Segregating bands were observed within these populations, indicating the possible use of RAPD markers for tagging gene(s) of agronomic importance in rice. (author). 38 refs, 6 figs, 3 tabs.
Springvloet, Linda; Lechner, Lilian; Candel, Math J J M; de Vries, Hein; Oenema, Anke
2016-03-01
This study explored whether the determinants that were targeted in two versions of a Web-based computer-tailored nutrition education intervention mediated the effects on fruit, high-energy snack, and saturated fat intake among adults who did not comply with dietary guidelines. An RCT was conducted with a basic group (tailored intervention targeting individual cognitions and self-regulation), a plus group (additionally targeting environmental-level factors), and a control group (generic nutrition information). Participants were recruited from the general Dutch adult population and randomly assigned to one of the study groups. Online self-reported questionnaires assessed dietary intake and potential mediating variables (behavior-specific cognitions, action and coping planning, environmental-level factors) at baseline and one (T1) and four (T2) months post-intervention (i.e. four and seven months after baseline). The joint-significance test was used to establish mediating variables at different time points (T1-mediating variables - T2-intake; T1-mediating variables - T1-intake; T2-mediating variables - T2-intake). Educational differences were examined by testing interaction terms. The effect of the plus version on fruit intake was mediated (T2-T2) by intention and fruit availability at home, and for high-educated participants also by attitude. Among low/moderate-educated participants, high-energy snack availability at home mediated (T1-T1) the effect of the basic version on high-energy snack intake. Subjective norm mediated (T1-T1) the effect of the basic version on fat intake among high-educated participants. Only some of the targeted determinants mediated the effects of both intervention versions on fruit, high-energy snack, and saturated fat intake. A possible reason for not finding a more pronounced pattern of mediating variables is that the educational content was tailored to individual characteristics and that participants only received feedback for relevant and not for all
Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes
Orsingher, Enzo; Polito, Federico
2012-08-01
In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) =_d ∑_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0, 1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
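The distributional identity N_α(N_β(t)) =_d ∑_{j=1}^{N_β(t)} X_j can be checked numerically: conditionally on N_β(t) = n, the composition is Poisson(αn), i.e. a sum of n iid Poisson(α) variables. The rate values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, t, n = 2.0, 3.0, 1.0, 50_000

# Composition: draw N_beta(t), then evaluate the outer Poisson process there.
inner = rng.poisson(beta * t, size=n)
composed = rng.poisson(alpha * inner)        # N_alpha(N_beta(t)) given the inner value

# Random-sum form: a sum of N_beta(t) iid Poisson(alpha) variables X_j.
random_sum = np.array([rng.poisson(alpha, size=k).sum()
                       for k in rng.poisson(beta * t, size=n)])

# Both representations share mean alpha*beta*t and variance alpha*beta*t*(1 + alpha).
print(composed.mean(), random_sum.mean())    # both near alpha*beta*t = 6.0
```

The matching first two moments (6 and 18 here) follow from the conditioning argument; a full check of equality in distribution would compare the empirical probability generating functions.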
Stirban, A; Pop, A; Tschoepe, D
2013-10-01
In a pilot study we suggested that benfotiamine, a thiamine prodrug, prevents postprandial endothelial dysfunction in people with Type 2 diabetes mellitus. The aim of this study was to test these effects in a larger population. In a double-blind, placebo-controlled, randomized, crossover study, 31 people with Type 2 diabetes received 900 mg/day benfotiamine or a placebo for 6 weeks (with a washout period of 6 weeks between). At the end of each treatment period, macrovascular and microvascular function were assessed, together with variables of autonomic nervous function in a fasting state, as well as 2, 4 and 6 h following a heated, mixed test meal. Participants had an impaired baseline flow-mediated dilatation (2.63 ± 2.49%). Compared with the fasting state, neither variable changed postprandially following the placebo treatment. The 6 weeks' treatment with high doses of benfotiamine did not alter this pattern, either in the fasting state or postprandially. Among a subgroup of patients with the highest flow-mediated dilatation, following placebo treatment there was a significant postprandial flow-mediated dilatation decrease, while this effect was attenuated by benfotiamine pretreatment. In people with Type 2 diabetes and markedly impaired fasting flow-mediated dilatation, a mixed test meal does not further deteriorate flow-mediated dilatation or variables of microvascular or autonomic nervous function. Because no significant deterioration of postprandial flow-mediated dilatation, microvascular or autonomic nervous function tests occurred after placebo treatment, a prevention of the postprandial deterioration of these variables with benfotiamine was not feasible. © 2013 The Authors. Diabetic Medicine © 2013 Diabetes UK.
Problems in probability theory, mathematical statistics and theory of random functions
Sveshnikov, A A
1979-01-01
Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy and Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more. The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through limit theorems.
Shimbo, Daichi; Wang, Lu; Lamonte, Michael J.; Allison, Matthew; Wellenius, Gregory A.; Bavry, Anthony A.; Martin, Lisa W.; Aragaki, Aaron; Newman, Jonathan D.; Swica, Yael; Rossouw, Jacques E.; Manson, JoAnn E.; Wassertheil-Smoller, Sylvia
2014-01-01
Objectives Mean and visit-to-visit variability (VVV) of blood pressure are associated with an increased cardiovascular disease risk. We examined the effect of hormone therapy on mean and VVV of blood pressure in postmenopausal women from the Women’s Health Initiative (WHI) randomized controlled trials. Methods Blood pressure was measured at baseline and annually in the two WHI hormone therapy trials in which 10,739 and 16,608 postmenopausal women were randomized to conjugated equine estrogens (CEE, 0.625 mg/day) or placebo, and CEE plus medroxyprogesterone acetate (MPA, 2.5 mg/day) or placebo, respectively. Results At the first annual visit (Year 1), mean systolic blood pressure was 1.04 mmHg (95% CI 0.58, 1.50) and 1.35 mmHg (95% CI 0.99, 1.72) higher in the CEE and CEE+MPA arms respectively compared to corresponding placebos. These effects remained stable after Year 1. CEE also increased VVV of systolic blood pressure (ratio of VVV in CEE vs. placebo, 1.03, Pblood pressure increased at Year 1, and the differences in the CEE and CEE+MPA arms vs. placebos also continued to increase after Year 1. Further, both CEE and CEE+MPA significantly increased VVV of systolic blood pressure (ratio of VVV in CEE vs. placebo, 1.04, Pblood pressure. PMID:24991872
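Visit-to-visit variability (VVV) as analyzed above is commonly operationalized as the within-participant standard deviation of blood pressure across repeated visits. The sketch below uses simulated readings with illustrative numbers, not WHI data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated systolic readings: 1,000 participants x 5 annual visits (mmHg, illustrative).
visits = 120.0 + 8.0 * rng.standard_normal((1000, 5))

# VVV for each participant: the SD across her own visits.
vvv = visits.std(axis=1, ddof=1)

# A group-level contrast (e.g. hormone therapy vs. placebo) is then a ratio of mean VVVs.
mean_vvv = vvv.mean()
print(mean_vvv)
```

Note that with only a handful of visits per person the sample SD is a noisy, slightly biased estimate of true variability, which is why trial analyses typically compare VVV between arms rather than interpret individual values.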
Vianna, Andre Gustavo Daher; Lacerda, Claudio Silva; Pechmann, Luciana Muniz; Polesel, Michelle Garcia; Marino, Emerson Cestari; Faria-Neto, Jose Rocha
2018-05-01
This study aims to evaluate whether there is a difference between the effects of vildagliptin and gliclazide MR (modified release) on glycemic variability (GV) in women with type 2 diabetes (T2DM), as evaluated by continuous glucose monitoring (CGM). An open-label, randomized study was conducted in T2DM women on steady-dose metformin monotherapy who were treated with 50 mg vildagliptin twice daily or 60-120 mg of gliclazide MR once daily. CGM and calculation of GV indices were performed at baseline and after 24 weeks. In total, 42 patients (age: 61.9 ± 5.9 years, baseline glycated hemoglobin (HbA1c): 7.3 ± 0.56%) were selected and 37 completed the 24-week protocol. Vildagliptin and gliclazide MR reduced GV, as measured by the mean amplitude of glycemic excursions (MAGE, p = 0.007 and 0.034, respectively). The difference between the groups did not reach statistical significance. Vildagliptin also significantly decreased the standard deviation of the mean glucose (SD) and the mean of the daily differences (MODD) (p = 0.007 and 0.030). Vildagliptin and gliclazide MR similarly reduced the MAGE in women with T2DM after 24 weeks of treatment. Further studies are required to establish differences between vildagliptin and gliclazide MR regarding glycemic variability. Copyright © 2018 Elsevier B.V. All rights reserved.
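The MAGE index used as the primary GV measure can be approximated as follows. This is a simplified version of the classical Service et al. idea (mean of glucose excursions between successive local extrema that exceed one SD of the trace), not the exact algorithm of any CGM software:

```python
import numpy as np

def mage(glucose):
    """Simplified MAGE: mean absolute difference between successive local
    extrema of the glucose trace, keeping only excursions larger than one
    standard deviation of the whole trace."""
    g = np.asarray(glucose, dtype=float)
    sd = g.std(ddof=1)
    d = np.diff(g)
    turning = np.where(np.sign(d[1:]) != np.sign(d[:-1]))[0] + 1
    extrema = g[np.r_[0, turning, len(g) - 1]]      # include the endpoints
    excursions = np.abs(np.diff(extrema))
    excursions = excursions[excursions > sd]
    return excursions.mean() if excursions.size else 0.0

# A smooth oscillation of amplitude 1: peak-to-trough excursions are about 2.
trace = np.sin(np.linspace(0.0, 6.0 * np.pi, 200))
```

Real CGM traces need smoothing before extrema detection, since sensor noise creates spurious turning points; the sketch omits that step.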
Coupé, Christophe
2018-01-01
As statistical approaches are getting increasingly used in linguistics, attention must be paid to the choice of methods and algorithms used. This is especially true since they require assumptions to be satisfied to provide valid results, and because scientific articles still often fall short of reporting whether such assumptions are met. Progress is, however, being made in various directions, one of them being the introduction of techniques able to model data that cannot be properly analyzed with simpler linear regression models. We report recent advances in statistical modeling in linguistics. We first describe linear mixed-effects regression models (LMM), which address grouping of observations, and generalized linear mixed-effects models (GLMM), which offer a family of distributions for the dependent variable. Generalized additive models (GAM) are then introduced, which allow modeling non-linear parametric or non-parametric relationships between the dependent variable and the predictors. We then highlight the possibilities offered by generalized additive models for location, scale, and shape (GAMLSS). We explain how they make it possible to go beyond common distributions, such as Gaussian or Poisson, and offer the appropriate inferential framework to account for 'difficult' variables such as count data with strong overdispersion. We also demonstrate how they offer interesting perspectives on data when not only the mean of the dependent variable is modeled, but also its variance, skewness, and kurtosis. As an illustration, the case of phonemic inventory size is analyzed throughout the article. For over 1,500 languages, we consider as predictors the number of speakers, the distance from Africa, an estimation of the intensity of language contact, and linguistic relationships. We discuss the use of random effects to account for genealogical relationships, the choice of appropriate distributions to model count data, and non-linear relationships. Relying on GAMLSS, we
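The kind of 'difficult' count variable that motivates GAMLSS can be illustrated with simulated data: counts drawn from a negative binomial are far more dispersed than any Poisson model allows. All numbers below are illustrative, not phoneme-inventory data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Negative binomial with mean 30 and variance mean/p = 210, i.e. ~7x overdispersed.
r, p = 5, 5 / 35                 # shape and success probability -> mean r(1-p)/p = 30
counts = rng.negative_binomial(r, p, size=5000)

# A Poisson model forces variance == mean; these data clearly violate that,
# which is exactly the situation where a distribution with a free dispersion
# (or, with GAMLSS, regression on the dispersion itself) is needed.
print(counts.mean(), counts.var())
```

A diagnostic this simple (sample variance several times the sample mean) is often enough to rule out a plain Poisson GLM before reaching for richer families.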
Crews, W David; Harrison, David W; Wright, James W
2008-04-01
In recent years, there has been increased interest in the potential health-related benefits of antioxidant- and phytochemical-rich dark chocolate and cocoa. The objective of the study was to examine the short-term (6 wk) effects of dark chocolate and cocoa on variables associated with neuropsychological functioning and cardiovascular health in healthy older adults. A double-blind, placebo-controlled, fixed-dose, parallel-group clinical trial was used. Participants (n = 101) were randomly assigned to receive a 37-g dark chocolate bar and 8 ounces (237 mL) of an artificially sweetened cocoa beverage or similar placebo products each day for 6 wk. No significant group (dark chocolate and cocoa or placebo)-by-trial (baseline, midpoint, and end-of-treatment assessments) interactions were found for the neuropsychological, hematological, or blood pressure variables examined. In contrast, the midpoint and end-of-treatment mean pulse rate assessments in the dark chocolate and cocoa group were significantly higher than those at baseline and significantly higher than the midpoint and end-of-treatment rates in the control group. Results of a follow-up questionnaire item on the treatment products that participants believed they had consumed during the trial showed that more than half of the participants in both groups correctly identified the products that they had ingested during the experiment. This investigation failed to support the predicted beneficial effects of short-term dark chocolate and cocoa consumption on any of the neuropsychological or cardiovascular health-related variables included in this research. Consumption of dark chocolate and cocoa was, however, associated with significantly higher pulse rates at 3- and 6-wk treatment assessments.
Edgington, Eugene
2007-01-01
Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani
Marketing norm perception among medical representatives in Indian pharmaceutical industry.
Nagashekhara, Molugulu; Agil, Syed Omar Syed; Ramasamy, Ravindran
2012-03-01
Study of marketing norm perception among medical representatives is an under-portrayed component that deserves further perusal in the pharmaceutical industry. The purpose of this study is to find out the perception of marketing norms among medical representatives. The research design is a quantitative, cross-sectional study with medical representatives as the unit of analysis. Data were collected from medical representatives (n = 300) using simple random and cluster sampling with a structured questionnaire. Results indicate that there is no difference in the perception of marketing norms between male and female medical representatives. But there is a difference of opinion between the medical representatives of domestic and multinational companies. Educational background of medical representatives also shows a difference of opinion among medical representatives. Degree holders and multinational-company medical representatives have a higher perception of marketing norms compared with their counterparts. The researchers strongly believe that mandatory training on marketing norms is beneficial for the decision-making process during dilemmas in the sales field.
On a randomly imperfect spherical cap pressurized by a random ...
African Journals Online (AJOL)
In this paper, we investigate a dynamical system in a random setting of dual randomness in space and time variables, in which both the imperfection of the structure and the load function are considered random, each with a statistical zero mean. The auto-covariance of the load is correlated as an exponentially decaying ...
Drop Spreading with Random Viscosity
Xu, Feng; Jensen, Oliver
2016-11-01
Airway mucus acts as a barrier to protect the lung. However, as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show, for example, how variability in the drop location is a non-monotonic function of the solute correlation length. Funded by the Engineering and Physical Sciences Research Council.
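A Gaussian random field with a prescribed correlation structure, as used above for the initial solute (and hence viscosity) distribution, can be sampled by factorizing its covariance matrix. Grid size, correlation length, and the viscosity law below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, dx, corr_len = 256, 0.05, 0.5

# Squared-exponential covariance on a 1D grid.
x = np.arange(n) * dx
cov = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / corr_len ** 2)

# Zero-mean Gaussian field via a Cholesky factor; a small diagonal jitter
# keeps the nearly singular covariance numerically positive definite.
L = np.linalg.cholesky(cov + 1e-6 * np.eye(n))
solute = L @ rng.standard_normal(n)

# Viscosity depending linearly on the solute concentration (illustrative law),
# with coefficients chosen so the field stays positive.
viscosity = 1.0 + 0.1 * solute
```

Each draw of `solute` gives one Monte Carlo realization; repeating the draw and solving the spreading PDEs for each realization yields the ensemble statistics of drop location and width.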
Shimabukuro, Michio; Tanaka, Atsushi; Sata, Masataka; Dai, Kazuoki; Shibata, Yoshisato; Inoue, Yohei; Ikenaga, Hiroki; Kishimoto, Shinji; Ogasawara, Kozue; Takashima, Akira; Niki, Toshiyuki; Arasaki, Osamu; Oshiro, Koichi; Mori, Yutaka; Ishihara, Masaharu; Node, Koichi
2017-07-06
Little is known about clinical associations between glucose fluctuations including hypoglycemia, heart rate variability (HRV), and the activity of the sympathetic nervous system (SNS) in patients in the acute phase of acute coronary syndrome (ACS). This pilot study aimed to evaluate the short-term effects of glucose fluctuations on HRV and SNS activity in type 2 diabetes mellitus (T2DM) patients with recent ACS. We also examined the effect of suppressing glucose fluctuations with miglitol on these variables. This prospective, randomized, open-label, blinded-endpoint, multicenter, parallel-group comparative study included 39 T2DM patients with recent ACS, who were randomly assigned to either a miglitol group (n = 19) or a control group (n = 20). After an initial 24-h Holter electrocardiogram (ECG) (Day 1), miglitol was commenced and another 24-h Holter ECG (Day 2) was recorded. In addition, continuous glucose monitoring (CGM) was performed throughout the Holter ECG. Although frequent episodes of subclinical hypoglycemia (≤4.44 mmol/L) during CGM were observed on Day 1 in both groups (35% of patients in the control group and 31% in the miglitol group), glucose fluctuations were decreased and the minimum glucose level was increased, with a substantial reduction in the episodes of subclinical hypoglycemia to 7.7%, in the miglitol group on Day 2. Holter ECG showed that the mean and maximum heart rate and mean LF/HF were increased on Day 2 in the control group, and these increases were attenuated by miglitol. When the 24-h period was divided into day-time (0700-1800 h), night-time (1800-0000 h), and bed-time (0000-0700 h), we found increased SNS activity during day-time, increased maximum heart rate during night-time, and glucose fluctuations during bed-time, which were attenuated by miglitol treatment. In T2DM patients with recent ACS, glucose fluctuations with subclinical hypoglycemia were associated with alterations of HRV and SNS activity, which were mitigated by miglitol.
Correlated random sampling for multivariate normal and log-normal distributions
International Nuclear Information System (INIS)
Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.
2012-01-01
A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
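A standard way to implement such correlated sampling is via the Cholesky factor of the target covariance: colour iid standard normals, then exponentiate whichever coordinates are to be log-normal. The means and covariance below are illustrative values, not resonance-parameter data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Target mean and covariance of the underlying normal vector.
mu = np.array([0.0, 1.0])
cov = np.array([[1.0, 0.6],
                [0.6, 0.5]])

# Correlated normal samples: z ~ N(0, I), then shift and colour with Cholesky(cov).
z = rng.standard_normal((100_000, 2))
normal = mu + z @ np.linalg.cholesky(cov).T

# A correlated log-normal coordinate: exponentiate the corresponding normal one.
# Its correlation with the remaining normal coordinate is inherited (in transformed
# form) from the normal-space covariance.
mixed = np.column_stack([normal[:, 0], np.exp(normal[:, 1])])
```

Note that for mixed normal/log-normal targets the covariance specified in normal space differs from the covariance of the final samples; matching a requested output covariance requires the usual log-normal moment transformations first.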
Simulation-based production planning for engineer-to-order systems with random yield
Akcay, Alp; Martagan, Tugce
2018-01-01
We consider an engineer-to-order production system with unknown yield. We model the yield as a random variable which represents the percentage output obtained from one unit of production quantity. We develop a beta-regression model in which the mean value of the yield depends on the unique
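A beta-regression yield model of this flavour can be sketched as follows. The logit link, coefficient values, and precision parameter are assumptions for illustration, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_yield(x, b0=-0.5, b1=1.2, precision=40.0):
    """Draw a production yield in (0, 1) whose mean follows a logit link in a
    covariate x, using the standard beta-regression (mean, precision)
    parameterization: alpha = mean*phi, beta = (1 - mean)*phi."""
    mean = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    return rng.beta(mean * precision, (1.0 - mean) * precision)

# Yield fractions for repeated runs at covariate level x = 1.0
# (expected yield about 0.67 under these assumed coefficients).
yields = np.array([sample_yield(1.0) for _ in range(10_000)])
```

Because the yield is a percentage of the production quantity, the beta support (0, 1) is the natural choice; the precision parameter controls how tightly realized yields cluster around the regression mean.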
DEFF Research Database (Denmark)
Melo, Jean
Although many researchers suggest that preprocessor-based variability amplifies maintenance problems, there is little to no hard evidence on how variability actually affects programs and programmers. Specifically, how does variability affect programmers during maintenance tasks (bug finding in particular)? How much harder is it to debug a program as variability increases? How do developers debug programs with variability? In what ways does variability affect bugs? In this Ph.D. thesis, I set off to address such issues through different perspectives using empirical research (based on controlled experiments) in order to understand quantitatively and qualitatively the impact of variability on programmers at bug finding and on buggy programs. From the program (and bug) perspective, the results show that variability is ubiquitous. There appears to be no specific nature of variability bugs that could…
Representative process sampling - in practice
DEFF Research Database (Denmark)
Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen
2007-01-01
Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or periodicity; different nugget effects; and process variations ranging from less than one lag to the full variogram lag). Variogram data analysis leads to a fundamental decomposition into 0-D sampling vs. 1-D process variances, based on the three principal variogram parameters: range, sill and nugget effect...
Random phenomena; Phenomenes aleatoires
Energy Technology Data Exchange (ETDEWEB)
Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)
1963-07-01
This document gathers a set of lectures presented in 1962. The first proposes a mathematical introduction to the analysis of random phenomena. The second presents an axiomatic approach to probability calculus. The third gives an overview of one-dimensional random variables. The fourth addresses random pairs and presents basic theorems regarding the algebra of mathematical expectations. The fifth discusses some probability laws: the binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last deals with the issues of stochastic convergence and asymptotic distributions.
Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.
2014-01-01
In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to
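A toy simulation of this setting is easy to write: a cloud of independent simple symmetric random walks forms the dynamic environment, and a tagged walker's jump law depends on whether its current site is occupied. The occupied/vacant jump probabilities below, and the uniform initial particle placement (a stand-in for the Poisson equilibrium), are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(4)

def walk_in_dynamic_environment(steps=500, n_particles=2000, width=2000,
                                p_occ=0.7, p_vac=0.3):
    """Tagged nearest-neighbour walk on Z in a dynamic environment of
    independent simple symmetric random walks.  The walker steps right with
    probability p_occ when its site is occupied and p_vac when it is vacant
    (an illustrative jump rule)."""
    env = rng.integers(-width // 2, width // 2, size=n_particles)
    walker = 0
    for _ in range(steps):
        env = env + (2 * rng.integers(0, 2, size=n_particles) - 1)  # environment step
        p_right = p_occ if np.any(env == walker) else p_vac
        walker += 1 if rng.random() < p_right else -1               # walker step
    return walker

final_position = walk_in_dynamic_environment()
```

Averaging `final_position / steps` over many runs estimates the walker's speed, the kind of law-of-large-numbers quantity studied in this literature.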
Marc Treib: Representing Landscape Architecture
DEFF Research Database (Denmark)
Braae, Ellen Marie
2008-01-01
The editor of Representing Landscape Architecture, Marc Treib, argues that there is good reason to evaluate the standard practices of representation that landscape architects have been using for so long. In the rush to the promised land of computer design these practices are now in danger of being...
Does representative wind information exist?
Wieringa, J.
1996-01-01
Representativity requirements are discussed for various wind data users. It is shown that most applications can be dealt with by using data from wind stations when these are made to conform with WMO specifications. Methods to achieve this WMO normalization are reviewed, giving minimum specifications
Judgments of and by Representativeness
1981-05-15
This hypothesis was studied in several contexts, including intuitive statistical judgments and the prediction of professional choice (Kahneman...). Here, X is representative of M either because it is frequently associated with M (e.g., high fever commonly accompanies pneumonia).
WIPP facility representative program plan
International Nuclear Information System (INIS)
1994-01-01
This plan describes the Department of Energy (DOE), Carlsbad Area Office (CAO) facility representative (FR) program at the Waste Isolation Pilot Plant (WIPP). It provides the following information: (1) FR and support organization authorities and responsibilities; (2) FR program requirements; and (3) FR training and qualification requirements
International Nuclear Information System (INIS)
1989-01-01
The study of stellar pulsations is a major route to the understanding of stellar structure and evolution. At the South African Astronomical Observatory (SAAO) the following stellar pulsation studies were undertaken: rapidly oscillating Ap stars; solar-like oscillations in stars; δ Scuti-type variability in a classical Am star; Beta Cephei variables; a pulsating white dwarf and its companion; RR Lyrae variables; and galactic Cepheids. 4 figs
Siegler, Robert S.
2007-01-01
Children's thinking is highly variable at every level of analysis, from neural and associative levels to the level of strategies, theories, and other aspects of high-level cognition. This variability exists within people as well as between them; individual children often rely on different strategies or representations on closely related problems…
International Nuclear Information System (INIS)
Tahir-Kheli, R.A.
1975-01-01
A few simple problems relating to random magnetic systems are presented. Translational symmetry, only on the macroscopic scale, is assumed for these systems. A random set of parameters, on the microscopic scale, is also assumed for the various regions of these systems; this randomness obeys a probability distribution. Knowledge of the form of these probability distributions is assumed in all cases.
Classical randomness in quantum measurements
International Nuclear Information System (INIS)
D'Ariano, Giacomo Mauro; Presti, Paoloplacido Lo; Perinotti, Paolo
2005-01-01
Similarly to quantum states, quantum measurements can also be 'mixed', corresponding to a random choice within an ensemble of measuring apparatuses. Such mixing is equivalent to a sort of hidden variable, which produces noise of a purely classical nature. It is then natural to ask which apparatuses are indecomposable, i.e. do not correspond to any random choice of apparatuses. This problem is interesting not only for foundations, but also for applications, since most optimization strategies give optimal apparatuses that are indecomposable. Mathematically the problem is posed by describing each measuring apparatus by a positive operator-valued measure (POVM), which gives the statistics of the outcomes for any input state. The POVMs form a convex set, and in this language the indecomposable apparatuses are represented by extremal points, the analogues of 'pure states' in the convex set of states. Differently from the case of states, however, indecomposable POVMs are not necessarily rank-one, e.g. von Neumann measurements. In this paper we give a complete classification of indecomposable apparatuses (for discrete spectrum) by providing different necessary and sufficient conditions for extremality of POVMs, along with a simple general algorithm for the decomposition of a POVM into extremals. As an interesting application, 'informationally complete' measurements are analysed in this respect. The convex set of POVMs is fully characterized by determining its border in terms of simple algebraic properties of the corresponding POVMs.
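The notions of POVM and of mixing apparatuses can be made concrete in a few lines. The qubit example below is an illustration of the convex structure described above, not the paper's decomposition algorithm:

```python
import numpy as np

def is_povm(elements, tol=1e-10):
    """A POVM: a set of positive semidefinite operators summing to the identity."""
    d = elements[0].shape[0]
    positive = all(np.linalg.eigvalsh(E)[0] >= -tol for E in elements)  # min eigenvalue
    complete = np.allclose(sum(elements), np.eye(d), atol=tol)
    return positive and complete

# Two extremal (von Neumann) qubit measurements: along Z and along X.
z_meas = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
plus = np.full((2, 2), 0.5)                # projector onto (|0> + |1>)/sqrt(2)
x_meas = [plus, np.eye(2) - plus]

# Tossing a fair coin to choose the apparatus yields a four-outcome POVM that
# is decomposable by construction: each element is half an extremal element.
mixed = [0.5 * E for E in z_meas + x_meas]
```

The mixed apparatus is a valid POVM but not an extremal point of the convex set, exactly the situation the paper's extremality conditions are designed to detect.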
Groundwater recharge: Accurately representing evapotranspiration
CSIR Research Space (South Africa)
Bugan, Richard DH
2011-09-01
Full Text Available of solutes in unsaturated, partially saturated and fully saturated porous media (Simunek et al., 1999). It uses Richards' equation for variably-saturated water flow and the convection-dispersion equations for heat and solute transport, based on Fick's Law... be of irregular shape and having non-uniform soil with a prescribed degree of anisotropy. Water flow can occur in the vertical plane, horizontal plane or radially on both sides of a vertical axis of symmetry. The boundaries of the system can be set at constant...
[Sensitivity of four representative angular cephalometric measures].
Xü, T; Ahn, J; Baumrind, S
2000-05-01
This study examined the sensitivity of four representative cephalometric angles to the detection of different vectors of craniofacial growth. Landmark coordinate data from a stratified random sample of 48 adolescent subjects were used to calculate conventional values for changes between the pretreatment and end-of-treatment lateral cephalograms. By modifying the end-of-treatment coordinate values appropriately, the angular changes could be recalculated to reflect three hypothetical situations: Case 1: what if there were no downward landmark displacement between timepoints? Case 2: what if there were no forward landmark displacement between timepoints? Case 3: what if there were no Nasion change? These questions were asked for four representative cephalometric angles: SNA, ANB, NAPg and UI-SN. For Case 1, the associations (r) between the baseline and the modified measure for the three angles were very highly significant (P < 0.001), with r2 values no lower than 0.94. For Case 2, however, the associations were much weaker and no r value reached significance. These angular measurements are thus less sensitive for measuring downward landmark displacement than for measuring forward landmark displacement.
Picturing and modelling catchments by representative hillslopes
Loritz, Ralf; Hassler, Sibylle; Jackisch, Conrad; Zehe, Erwin
2016-04-01
Hydrological modelling studies often start with a qualitative sketch of the hydrological processes of a catchment. These so-called perceptual models are often pictured as hillslopes and are generalizations displaying only the dominant and relevant processes of a catchment or hillslope. The problem with these models is that they are prone to being overly predetermined by the designer's background and experience. Moreover, it is difficult to know whether such a picture is correct and contains enough complexity to represent the system under study. Nevertheless, because of their qualitative form, perceptual models are easy to understand and can be an excellent tool for multidisciplinary exchange between researchers with different backgrounds, helping to identify the dominant structures and processes in a catchment. In our study we explore whether a perceptual model built upon an intensive field campaign may serve as a blueprint for setting up representative hillslopes in a hydrological model to reproduce the functioning of two distinctly different catchments. We use a physically-based 2D hillslope model which has proven capable of being driven by measured soil-hydrological parameters. A key asset of our approach is that the model structure itself remains a picture of the perceptual model, which is benchmarked against a) geo-physical images of the subsurface and b) observed dynamics of discharge, distributed state variables and fluxes (soil moisture, matric potential and sap flow). Within this approach we are able to set up two behavioral model structures which allow the simulation of the most important hydrological fluxes and state variables in good agreement with available observations within the 19.4 km2 Colpach catchment and the 4.5 km2 Wollefsbach catchment in Luxembourg, without the necessity of calibration. This corroborates, contrary to the widespread opinion, that a) lower mesoscale catchments may be modelled by representative hillslopes and b) physically
Representing culture in interstellar messages
Vakoch, Douglas A.
2008-09-01
As scholars involved with the Search for Extraterrestrial Intelligence (SETI) have contemplated how we might portray humankind in any messages sent to civilizations beyond Earth, one of the challenges they face is adequately representing the diversity of human cultures. For example, in a 2003 workshop in Paris sponsored by the SETI Institute, the International Academy of Astronautics (IAA) SETI Permanent Study Group, the International Society for the Arts, Sciences and Technology (ISAST), and the John Templeton Foundation, a varied group of artists, scientists, and scholars from the humanities considered how to encode notions of altruism in interstellar messages. Though the group represented 10 countries, most were from Europe and North America, leading to the group's recommendation that subsequent discussions on the topic should include more globally representative perspectives. As a result, the IAA Study Group on Interstellar Message Construction and the SETI Institute sponsored a follow-up workshop in Santa Fe, New Mexico, USA in February 2005. The Santa Fe workshop brought together scholars from a range of disciplines including anthropology, archaeology, chemistry, communication science, philosophy, and psychology. Participants included scholars familiar with interstellar message design as well as specialists in cross-cultural research who had participated in the Symposium on Altruism in Cross-cultural Perspective, held just prior to the workshop during the annual conference of the Society for Cross-cultural Research. The workshop included discussion of how cultural understandings of altruism can complement and critique the more biologically based models of altruism proposed for interstellar messages at the 2003 Paris workshop. This paper, written by the chair of both the Paris and Santa Fe workshops, will explore the challenges of communicating concepts of altruism that draw on both biological and cultural models.
Semantic Representatives of the Concept
Directory of Open Access Journals (Sweden)
Elena N. Tsay
2013-01-01
Full Text Available In this article, the concept, one of the principal notions of cognitive linguistics, is investigated. Treating the concept as a cultural phenomenon with language realization and ethnocultural peculiarities, a description of the concept of “happiness” is presented. The lexical and semantic paradigm of the concept of happiness correlates with a great number of lexical and semantic variants. The semantic representatives of the concept of happiness, covering supreme spiritual values, are revealed, and a semantic interpretation of their functioning in Biblical discourse is given.
Randomness and locality in quantum mechanics
International Nuclear Information System (INIS)
Bub, J.
1976-01-01
This paper considers the problem of representing the statistical states of a quantum mechanical system by measures on a classical probability space. The Kochen and Specker theorem proves the impossibility of embedding the possibility structure of a quantum mechanical system into a Boolean algebra. It is shown that a hidden variable theory involves a Boolean representation which is not an embedding, and that such a representation cannot recover the quantum statistics for sequential probabilities without introducing a randomization process for the hidden variables which is assumed to apply only on measurement. It is suggested that the relation of incompatibility is to be understood as a type of stochastic independence, and that the indeterminism of a quantum mechanical system is engendered by the existence of independent families of properties. Thus, the statistical relations reflect the possibility structure of the system: the probabilities are logical. The hidden variable thesis is influenced by the Copenhagen interpretation of quantum mechanics, i.e. by some version of the disturbance theory of measurement. Hence, the significance of the representation problem is missed, and the completeness of quantum mechanics is seen to turn on the possibility of recovering the quantum statistics by a hidden variable scheme which satisfies certain physically motivated conditions, such as locality. Bell's proof that no local hidden variable theory can reproduce the statistical relations of quantum mechanics is considered. (Auth.)
Asymptotic distribution of products of sums of independent random ...
Indian Academy of Sciences (India)
integrable random variables (r.v.) are asymptotically log-normal. This fact ... the product of the partial sums of i.i.d. positive random variables as follows. .... Now define ..... by Henan Province Foundation and Frontier Technology Research Plan.
Becker, Geórgia F; Passos, Eduardo P; Moulin, Cileide C
2015-12-01
Obesity is related to hormonal disorders that affect the reproductive system. Low-glycemic index (LGI) diets seem to exert a positive effect on weight loss and on metabolic changes that result from obesity. We investigated the effects of a hypocaloric diet with an LGI and low glycemic load on anthropometric and metabolic variables, ghrelin and leptin concentrations, and the pregnancy rate in overweight and obese infertile women who were undergoing in vitro fertilization (IVF). The study was a randomized block-design controlled trial in which we analyzed 26 overweight or obese infertile women. Patients were assigned to a hypocaloric LGI-diet group or a control group and followed the protocol for 12 wk. Body weight, body mass index (BMI), percentage of body fat, glucose, insulin, homeostasis model assessment of insulin resistance, serum lipids, reproductive hormones, leptin, acylated ghrelin, number of oocytes retrieved in the IVF cycle, and pregnancy rate were determined. There were greater reductions in body mass, BMI, percentage of body fat, waist:hip ratio, and leptin in the LGI-diet group than in the control group (P …). The LGI-diet group had 85.4% more oocytes retrieved than did the control group (7.75 ± 1.44 and 4.18 ± 0.87, respectively; P = 0.039) in the IVF cycle. Three patients (21.4%) in the LGI group experienced a spontaneous pregnancy during the follow-up, which generated 3 live births. The hypocaloric LGI diet promoted a decrease in BMI, percentage of body fat, and leptin concentrations, which improved oocyte development and pregnancy rate. These results support the clinical recommendation to advise overweight and obese women to lose weight through a balanced diet before being submitted for treatment with assisted reproduction technologies. A hypocaloric diet combined with LGI foods seems to be beneficial for these patients, but additional studies are required before this treatment is recommended. This trial was registered at clinicaltrials.gov as NCT02416960.
Variable Selection for Regression Models of Percentile Flows
Fouad, G.
2017-12-01
Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. Variables suffered from a high
Conspicuous Waste and Representativeness Heuristic
Directory of Open Access Journals (Sweden)
Tatiana M. Shishkina
2017-12-01
Full Text Available The article deals with the similarities between conspicuous waste and the representativeness heuristic. Conspicuous waste is analyzed according to the classic Veblenian interpretation as a strategy to increase social status through conspicuous consumption and conspicuous leisure. In “The Theory of the Leisure Class” Veblen introduced two different types of utility: conspicuous and functional. The article focuses on the possible benefits of analyzing conspicuous utility not only in terms of institutional economic theory, but also in terms of behavioral economics. To this end, the representativeness heuristic is considered, on the one hand, as a way to optimize the decision-making process, which allows it to be examined alongside Simon's procedural rationality. On the other hand, it is also analyzed as a cognitive bias within the approach of Kahneman and Tversky. The article provides an analysis of the patterns in the deviations from the rational behavior strategy that can be observed in cases of conspicuous waste, both in modern market economies in the form of conspicuous consumption and in archaic economies in the form of gift exchange. The article also considers marketing strategies for advertising luxury consumption. It highlights the impact of symbolic capital (in Bourdieu's sense) on the social and symbolic payoffs that actors derive from the act of conspicuous waste. This makes it possible to analyze conspicuous consumption both as a rational way to obtain a particular kind of payoff and, at the same time, as a form of institutionalized cognitive bias.
Randomized random walk on a random walk
International Nuclear Information System (INIS)
Lee, P.A.
1983-06-01
This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)
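The asymptotic behaviour described above can be explored numerically. The sketch below is a minimal Monte Carlo under illustrative assumptions: it omits the Poisson randomization in time and uses simple nearest-neighbour steps, so it captures only the dependence of the mean-square displacement on the symmetry of the superimposed walk.

```python
import random


def simulate_msd(n_steps=2000, n_chains=200, symmetric=True, seed=1):
    """Mean-square displacement of a particle random-walking on a chain
    that was itself constructed by a random-walk procedure.

    The particle hops between neighbouring chain indices; its physical
    displacement is the difference of the chain-site positions.
    """
    rng = random.Random(seed)
    msd = 0.0
    for _ in range(n_chains):
        # Construct the disordered chain by a random-walk procedure.
        chain = [0]
        for _ in range(n_steps):
            chain.append(chain[-1] + rng.choice((-1, 1)))
        # Superimposed walk of the particle on the chain indices.
        idx = n_steps // 2              # start mid-chain to avoid boundaries
        p_right = 0.5 if symmetric else 0.7  # illustrative asymmetric drift
        start = chain[idx]
        for _ in range(n_steps // 4):   # few enough steps to stay in bounds
            idx += 1 if rng.random() < p_right else -1
        msd += (chain[idx] - start) ** 2
    return msd / n_chains
```

Comparing `simulate_msd(symmetric=True)` with `simulate_msd(symmetric=False)` illustrates the contrast noted in the abstract: the drifting (asymmetric) walk spreads diffusion-like, while the symmetric walk spreads much more slowly.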
Representative mass reduction in sampling
DEFF Research Database (Denmark)
Petersen, Lars; Esbensen, Harry Kim; Dahl, Casper Kierulf
2004-01-01
We here present a comprehensive survey of current mass reduction principles and hardware available in the current market. We conduct a rigorous comparison study of the performance of 17 field and/or laboratory instruments or methods which are quantitatively characterized (and ranked) for accuracy... dividers, the Boerner Divider, the 'spoon method', alternate/fractional shoveling and grab sampling. Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant relative differences). Grab sampling, the overwhelmingly... most often used mass reduction method, performs appallingly; its use must be discontinued (with the singular exception of completely homogenized fine powders). Only proper mass reduction (i.e. carried out in complete compliance with all appropriate design principles, maintenance and cleaning rules) can
Uniformity transition for ray intensities in random media
Pradas, Marc; Pumir, Alain; Wilkinson, Michael
2018-04-01
This paper analyses a model for the intensity distribution of rays propagating without absorption in a random medium. The random medium is modelled as a dynamical map. After N iterations, the intensity is modelled as a sum S of 𝒩 contributions from different trajectories, each of which is a product of N independent identically distributed random variables x_k, representing successive focussing or de-focussing events. The number of ray trajectories reaching a given point is assumed to proliferate exponentially: 𝒩 = Λ^N, for some Λ > 1. We investigate the probability distribution of S. We find a phase transition as parameters of the model are varied. There is a phase where the fluctuations of S are suppressed as N → ∞, and a phase where S has large fluctuations, for which we provide a large deviation analysis.
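The distribution of S can be explored by direct Monte Carlo. In the sketch below the factors x_k are taken to be lognormal with unit mean, a distributional assumption made purely for illustration (the abstract does not specify the law of x_k); note that 𝒩 = Λ^N grows exponentially, so only small N is tractable this way.

```python
import math
import random


def sample_S(N=8, Lam=2, sigma=0.3, seed=0):
    """One realization of S = sum over Lam**N trajectories of products of
    N i.i.d. factors x_k (assumed lognormal with E[x] = 1), normalized by
    the number of trajectories so the expected value is 1."""
    rng = random.Random(seed)
    n_traj = Lam ** N
    total = 0.0
    for _ in range(n_traj):
        # log of the product of N lognormal factors with unit mean
        log_prod = sum(rng.gauss(-0.5 * sigma ** 2, sigma) for _ in range(N))
        total += math.exp(log_prod)
    return total / n_traj


# Comparing realizations across seeds for different sigma gives a feel for
# the fluctuation-suppressed versus large-fluctuation regimes.
print(sample_S(N=6, Lam=2, sigma=0.3, seed=1))
```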
Want change? Call your representative
Fischhoff, Ilya R.
2011-07-01
During my tenure as an AGU Congressional Science Fellow, which began in September 2010 and continues until November 2011, my time has been shared between working with the U.S. House of Representatives Natural Resource Committee Democratic staff and in the office of Rep. Ed Markey (D-Mass., ranking Democrat on the committee). I appreciate getting to work with staff, fellows, and interns who inspire me, make me laugh, and know their issues cold. Much of my work on the committee is related to fish, wildlife, oceans, lands, and water issues and is directly related to my background in ecology and evolutionary biology (I studied zebra ecology and behavior in Kenya). My assignments have included asking the Environmental Protection Agency (EPA) about why it has not changed the allowed usage of certain pesticides that the National Marine Fisheries Service has found to jeopardize the recovery of endangered Pacific salmon; helping to identify research needs and management options to combat the swiftly spreading and catastrophic white nose syndrome in North American bats; and inquiring as to whether a captive-ape welfare bill, if passed without amendment, could thwart development of a vaccine to stop the Ebola virus from continuing to cause mass mortality in endangered wild apes.
Fisher, Stephen D
1999-01-01
The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic
Vanmarcke, Erik
1983-03-01
Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
Competing order parameters in quenched random alloys: Fe₁₋ₓCoₓCl₂
International Nuclear Information System (INIS)
Wong, P.; Horn, P.M.; Birgeneau, R.J.; Safinya, C.R.; Shirane, G.
1980-01-01
A study is reported of the magnetic properties of the random alloy Fe₁₋ₓCoₓCl₂, which represents an archetypal example of a system with competing orthogonal spin anisotropies. Behavior similar to previous experiments and theoretical predictions is found, but with important qualitative and quantitative differences; in particular the phase transition in one variable is drastically altered by the existence of long-range order in the other variable. It is hypothesized that this is due to microscopic random-field effects.
International Nuclear Information System (INIS)
Feast, M.W.; Wenzel, W.; Fernie, J.D.; Percy, J.R.; Smak, J.; Gascoigne, S.C.B.; Grindley, J.E.; Lovell, B.; Sawyer Hogg, H.B.; Baker, N.; Fitch, W.S.; Rosino, L.; Gursky, H.
1976-01-01
A critical review of variable stars is presented. A fairly complete summary of major developments and discoveries during the period 1973-1975 is given. The broad developments and new trends are outlined. Essential problems for future research are identified. (B.R.H.)
The Two Sides of the Representative Coin
Directory of Open Access Journals (Sweden)
Keith Sutherland
2011-12-01
Full Text Available In Federalist 10 James Madison drew a functional distinction between “parties” (advocates for factional interests and “judgment” (decision-making for the public good and warned of the corrupting effect of combining both functions in a “single body of men.” This paper argues that one way of overcoming “Madisonian corruption” would be by restricting political parties to an advocacy role, reserving the judgment function to an allotted (randomly-selected microcosm of the whole citizenry, who would determine the outcome of parliamentary debates by secret ballot—a division of labour suggested by James Fishkin’s experiments in deliberative polling. The paper then defends this radical constitutional proposal against Bernard Manin’s (1997 claim that an allotted microcosm could not possibly fulfil the “consent” requirement of Natural Right theory. Not only does the proposal challenge Manin’s thesis, but a 28th Amendment implementing it would finally reconcile the competing visions that have bedevilled representative democracy since the Constitutional Convention of 1787.
International Nuclear Information System (INIS)
Whitelock, P.A.
1990-01-01
The observational characteristics of pulsating red variables are reviewed with particular emphasis on the Miras. These variables represent the last stage in the evolution of stars on the Asymptotic Giant Branch (AGB). A large fraction of the IRAS sources in the Bulge are Mira variables and a subset of these are also OH/IR sources. Their periods range up to 720 days, though most are between 360 and 560 days. At a given period those stars with the highest pulsation amplitudes have the highest mass-loss rates; this is interpreted as evidence for a causal connection between mass-loss and pulsation. It is suggested that once an AGB star has become a Mira it will evolve with increasing pulsation amplitude and mass-loss, but with very little change of luminosity or logarithmic period. 26 refs
Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.
Richie, Megan; Josephson, S Andrew
2018-01-01
Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained
Benchmarking Variable Selection in QSAR.
Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars
2012-02-01
Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
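As a hedged illustration of the forward-selection idea, the sketch below greedily adds the descriptor that most reduces the residual sum of squares of an ordinary least-squares fit on synthetic data. NumPy is assumed; the benchmark itself used random forest models and MARS, which are not reproduced here.

```python
import numpy as np


def forward_select(X, y, k):
    """Greedy forward selection: repeatedly add the column of X that most
    reduces the residual sum of squares of an OLS fit (a minimal sketch,
    not the benchmarked MARS-based variant)."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(k):
        best, best_sse = None, np.inf
        for j in remaining:
            cols = selected + [j]
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            sse = float(np.sum((y - X[:, cols] @ beta) ** 2))
            if sse < best_sse:
                best, best_sse = j, sse
        selected.append(best)
        remaining.remove(best)
    return selected


# Synthetic check: y depends only on descriptors 0 and 3.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.1, size=200)
print(sorted(forward_select(X, y, 2)))  # → [0, 3]
```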
Variational Infinite Hidden Conditional Random Fields
Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin
2015-01-01
Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An Infinite hidden conditional random field is a hidden conditional random field with a countably infinite number of
Random phenomena fundamentals of probability and statistics for engineers
Ogunnaike, Babatunde A
2009-01-01
Prelude: Approach; Philosophy; Four Basic Principles. I. Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II. Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III. Distributions: Ide...
van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel
2014-01-01
Background The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Objective Our objective was to explore the representativeness of a self-selected sample of online gamers using online players’ virtual characters (avatars). Methods All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars’ characteristics were defined using various games’ scores, reported on the WoW’s official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. Results We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Conclusions Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted. PMID:25001007
Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel
2014-07-07
The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted.
Flanigan, Francis J
2010-01-01
A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer noting its originality wrote: ""A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation."" Not so here-Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion
Biological Sampling Variability Study
Energy Technology Data Exchange (ETDEWEB)
Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2016-11-08
There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was applied to the coupons that were to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach makes it possible to carry out dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structural response have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
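A minimal one-dimensional sketch of the idea: the classical spectral representation sums cosines over a discretized spectrum, and the "random function" device below generates all n_freq phase angles from just two elementary random variables. The specific spectrum and phase map used here are illustrative assumptions, not the authors' construction.

```python
import math
import random


def simulate_process(T=10.0, n_freq=128, theta=None, seed=0):
    """Spectral-representation sketch of a stationary scalar process
    X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k), where every phase
    phi_k is derived from only two elementary random variables (theta1,
    theta2) through a deterministic map acting as the random constraint."""
    rng = random.Random(seed)
    if theta is None:
        theta = (rng.uniform(0, 2 * math.pi), rng.uniform(0, 2 * math.pi))
    t1, t2 = theta
    w_max = 4.0
    dw = w_max / n_freq
    spectrum = lambda w: math.exp(-w)      # illustrative one-sided PSD
    times = [i * T / 200 for i in range(201)]
    xs = []
    for t in times:
        x = 0.0
        for k in range(n_freq):
            w = (k + 0.5) * dw
            phi = (t1 + k * t2) % (2 * math.pi)  # random-function constraint
            x += math.sqrt(2 * spectrum(w) * dw) * math.cos(w * t + phi)
        xs.append(x)
    return xs
```

Because each sample function is indexed by a single point (theta1, theta2), a small representative point set of these two variables can stand in for the whole ensemble, which is what makes the coupling with PDEM attractive.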
Comparison of variance estimators for metaanalysis of instrumental variable estimates
Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.
2016-01-01
Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two
International Nuclear Information System (INIS)
Tsallis, C.
1980-03-01
The 'ingredients' which control a phase transition in well-defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connecting this area with field theory, the theory of dynamical systems, etc. (Author) [pt
International Nuclear Information System (INIS)
Tsallis, C.
1981-01-01
The 'ingredients' which control a phase transition in well-defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' are the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connecting this area with field theory, the theory of dynamical systems, etc. (Author) [pt
Effects of randomness on chaos and order of coupled logistic maps
International Nuclear Information System (INIS)
Savi, Marcelo A.
2007-01-01
Natural systems are essentially nonlinear, being neither completely ordered nor completely random. These nonlinearities are responsible for a great variety of possibilities that includes chaos. On this basis, the effect of randomness on the chaos and order of nonlinear dynamical systems is an important feature to be understood. This Letter considers randomness as fluctuations and uncertainties due to noise and investigates its influence on the nonlinear dynamical behavior of coupled logistic maps. The noise effect is included by adding random variations either to parameters or to state variables. In addition, coupling uncertainty is investigated by assuming tiny values for the connection parameters, representing the idea that all Nature is, in some sense, weakly connected. Results from numerical simulations show situations where noise alters the system's nonlinear dynamics.
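The setup described, logistic maps with randomly fluctuating parameters and tiny connection parameters, can be sketched as follows. The map parameter, noise level and coupling strength below are illustrative assumptions, not the values used in the Letter.

```python
import numpy as np

def coupled_logistic(n_steps, r=3.8, eps=1e-3, noise=1e-4, seed=0):
    """Iterate two logistic maps in the chaotic regime (r = 3.8) with a tiny
    bidirectional coupling eps and additive random fluctuations of the
    parameter, returning both trajectories."""
    rng = np.random.default_rng(seed)
    x, y = 0.4, 0.6
    xs, ys = np.empty(n_steps), np.empty(n_steps)
    for i in range(n_steps):
        rx = r + noise * rng.standard_normal()  # noisy parameter for map x
        ry = r + noise * rng.standard_normal()  # noisy parameter for map y
        fx, fy = rx * x * (1 - x), ry * y * (1 - y)
        x, y = (1 - eps) * fx + eps * fy, (1 - eps) * fy + eps * fx
        xs[i], ys[i] = x, y
    return xs, ys

xs, ys = coupled_logistic(1000)
print(0.0 <= xs.min() and xs.max() <= 1.0)  # True: orbits stay in [0, 1]
```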
Selection of Representative Models for Decision Analysis Under Uncertainty
Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.
2016-03-01
The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
Lower limits for distribution tails of randomly stopped sums
Denisov, D.E.; Korshunov, D.A.; Foss, S.G.
2008-01-01
We study lower limits for the ratio $\overline{F^{*\tau}}(x)/\,\overline{F}(x)$ of tail distributions, where $F^{*\tau}$ is the distribution of a sum of a random number $\tau$ of independent identically distributed random variables having a common distribution $F$, and the random variable $\tau$ does not
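For a concrete, hypothetical choice of a heavy-tailed F (Pareto) and a geometric stopping variable τ, the tail ratio in question can be estimated by Monte Carlo; for subexponential F the classical asymptotics give P(S > x)/P(X > x) → E[τ]. The distributions and parameters below are illustrative assumptions, not those analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def compound_sum_tail_ratio(x, n_sims=200_000, p=0.5, alpha=1.5):
    """Monte Carlo estimate of P(S > x) / P(X > x), where
    S = X_1 + ... + X_tau, tau ~ Geometric(p) on {1, 2, ...}, and the X_i
    are i.i.d. Pareto(alpha) with survival function P(X > x) = x**(-alpha)."""
    tau = rng.geometric(p, n_sims)              # random number of summands
    X = (1.0 / rng.uniform(size=(n_sims, tau.max()))) ** (1.0 / alpha)
    mask = np.arange(tau.max()) < tau[:, None]  # keep the first tau_i terms of each row
    S = (X * mask).sum(axis=1)
    return (S > x).mean() / x ** (-alpha)       # denominator tail is known exactly

# For subexponential F, P(S > x) ~ E[tau] * P(X > x) as x grows; E[tau] = 2 here
print(compound_sum_tail_ratio(50.0))
```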
International Nuclear Information System (INIS)
Bennett, D.L.; Brene, N.; Nielsen, H.B.
1986-06-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)
International Nuclear Information System (INIS)
Bennett, D.L.
1987-01-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: Gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)
Bennett, D. L.; Brene, N.; Nielsen, H. B.
1987-01-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.
Energy Technology Data Exchange (ETDEWEB)
Jung, T
2000-07-01
The North Atlantic oscillation (NAO) represents the dominant mode of atmospheric variability in the North Atlantic region and describes the strengthening and weakening of the midlatitude westerlies. In this study, variability of the NAO during wintertime and its relationship to the North Atlantic ocean and Arctic sea ice is investigated. For this purpose, observational data are analyzed along with integrations of models for the Atlantic ocean, Arctic sea ice, and the coupled global climate system. From a statistical point of view, the observed NAO index shows unusually high variance on interdecadal time scales during the 20th century. Variability on other time scales is consistent with realizations of random processes ('white noise'). Recurrence of wintertime NAO anomalies from winter to winter, with missing signals during the intervening non-winter seasons, is primarily associated with interdecadal variability of the NAO. This recurrence indicates that low-frequency changes of the NAO during the 20th century were in part externally forced. (orig.)
Bottom-up and Top-down Input Augment the Variability of Cortical Neurons
Nassi, Jonathan J.; Kreiman, Gabriel; Born, Richard T.
2016-01-01
Neurons in the cerebral cortex respond inconsistently to a repeated sensory stimulus, yet they underlie our stable sensory experiences. Although the nature of this variability is unknown, its ubiquity has encouraged the general view that each cell produces random spike patterns that noisily represent its response rate. In contrast, here we show that reversibly inactivating distant sources of either bottom-up or top-down input to cortical visual areas in the alert primate reduces both the spike train irregularity and the trial-to-trial variability of single neurons. A simple model in which a fraction of the pre-synaptic input is silenced can reproduce this reduction in variability, provided that there exist temporal correlations primarily within, but not between, excitatory and inhibitory input pools. A large component of the variability of cortical neurons may therefore arise from synchronous input produced by signals arriving from multiple sources. PMID:27427459
Genetic Variants Contribute to Gene Expression Variability in Humans
Hulse, Amanda M.; Cai, James J.
2013-01-01
Expression quantitative trait loci (eQTL) studies have established convincing relationships between genetic variants and gene expression. Most of these studies focused on the mean of gene expression level, but not the variance of gene expression level (i.e., gene expression variability). In the present study, we systematically explore genome-wide association between genetic variants and gene expression variability in humans. We adapt the double generalized linear model (dglm) to simultaneously fit the means and the variances of gene expression among the three possible genotypes of a biallelic SNP. The genomic loci showing significant association between the variances of gene expression and the genotypes are termed expression variability QTL (evQTL). Using a data set of gene expression in lymphoblastoid cell lines (LCLs) derived from 210 HapMap individuals, we identify cis-acting evQTL involving 218 distinct genes, among which 8 genes, ADCY1, CTNNA2, DAAM2, FERMT2, IL6, PLOD2, SNX7, and TNFRSF11B, are cross-validated using an extra expression data set of the same LCLs. We also identify ∼300 trans-acting evQTL between >13,000 common SNPs and 500 randomly selected representative genes. We employ two distinct scenarios, emphasizing single-SNP and multiple-SNP effects on expression variability, to explain the formation of evQTL. We argue that detecting evQTL may represent a novel method for effectively screening for genetic interactions, especially when the multiple-SNP influence on expression variability is implied. The implication of our results for revealing genetic mechanisms of gene expression variability is discussed. PMID:23150607
The randomly renewed general item and the randomly inspected item with exponential life distribution
International Nuclear Information System (INIS)
Schneeweiss, W.G.
1979-01-01
For a randomly renewed item the probability distributions of the time to failure and of the duration of down time and the expectations of these random variables are determined. Moreover, it is shown that the same theory applies to randomly checked items with exponential probability distribution of life such as electronic items. The case of periodic renewals is treated as an example. (orig.) [de
International Nuclear Information System (INIS)
Richey, J.B.; McBride, T.R.; Covic, J.
1979-01-01
This invention describes an automatic variable collimator which controls the width and thickness of X-ray beams in X-ray diagnostic medical equipment, and which is particularly adapted for use with computerized axial tomographic scanners. A two-part collimator is provided which shapes an X-ray beam both prior to its entering an object subject to radiographic analysis and after the attenuated beam has passed through the object. Interposed between a source of radiation and the object subject to radiographic analysis is a first or source collimator. The source collimator causes the X-ray beam emitted by the source of radiation to be split into a plurality of generally rectangular shaped beams. Disposed within the source collimator is a movable aperture plate which may be used to selectively vary the thickness of the plurality of generally rectangular shaped beams transmitted through the source collimator. A second or receiver collimator is interposed between the object subject to radiographic analysis and a series of radiation detectors. The receiver collimator is disposed to receive the attenuated X-ray beams passing through the object subject to radiographic analysis. Located within the receiver collimator are a plurality of movable aperture plates adapted to be displaced relative to a plurality of fixed aperture plates for the purpose of varying the width and thickness of the attenuated X-ray beams transmitted through the object subject to radiographic analysis. The movable aperture plates of the source and receiver collimators are automatically controlled by circuitry which is provided to allow remote operation of the movable aperture plates
Human visual system automatically represents large-scale sequential regularities.
Kimura, Motohiro; Widmann, Andreas; Schröger, Erich
2010-03-04
Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adapted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparison of the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.
Grundvold, Irene; Tveit, Arnljot; Smith, Pål; Seljeflot, Ingebjørg; Abdelnoor, Michael; Arnesen, Harald
2008-01-01
The recurrence rate of atrial fibrillation after electrical cardioversion is disappointingly high. The aim of the present study was to prospectively investigate whether standard echocardiographic variables on the day of cardioversion could predict sinus rhythm maintenance. Transthoracic echocardiographic examination was performed within 4 h after cardioversion for all the patients in the CAPRAF (Candesartan in the Prevention of Relapsing Atrial Fibrillation) study. Cardioversion was successful for 137 patients not given specific antiarrhythmic therapy, and only 41 (30%) maintained sinus rhythm at 6-month follow-up. There were significantly (p = 0.05) lower transmitral A wave velocities in the group with relapsing atrial fibrillation compared with the group in sinus rhythm at 6-month follow-up. All patients with the lowest A wave velocities had an early recurrence of atrial fibrillation. There were no differences between the groups regarding atrial dimensions or left ventricular function. The use of the angiotensin II receptor antagonist candesartan had no influence on the echocardiographic variables, nor on the recurrence rate of atrial fibrillation after cardioversion. Transthoracic echocardiographic examination performed a short time after electrical cardioversion of atrial fibrillation showed that only A wave peak velocities were significantly predictive of sinus rhythm maintenance 6 months after the procedure. (c) 2008 S. Karger AG, Basel.
Widaman, Keith F; Grimm, Kevin J; Early, Dawnté R; Robins, Richard W; Conger, Rand D
2013-07-01
Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group.
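A minimal sketch of the data-preparation step of the proposed random-number solution, assuming the data sit in a plain NumPy array; the full multiple-group factor-model specification (for instance, constraining the loadings of the pseudo-variables) is beyond this fragment and is an assumption left to the modeling software.

```python
import numpy as np

def fill_missing_manifest(data, missing_cols, seed=0):
    """Replace manifest variables that are missing completely in a group with
    pseudo-random standard-normal deviates, so the group's model can be
    specified over the same set of variables as the other groups."""
    rng = np.random.default_rng(seed)
    filled = data.copy()
    for j in missing_cols:
        filled[:, j] = rng.standard_normal(data.shape[0])
    return filled

group = np.full((100, 4), np.nan)
group[:, :3] = np.random.default_rng(1).normal(size=(100, 3))  # three observed variables
out = fill_missing_manifest(group, missing_cols=[3])           # fourth is absent in this group
print(np.isnan(out).any())  # False
```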
International Nuclear Information System (INIS)
Shlesinger, Michael F
2009-01-01
There is a wide variety of searching problems, from molecules seeking receptor sites to predators seeking prey. The optimal search strategy can depend on constraints on time, energy, supplies or other variables. We discuss a number of cases and especially remark on the usefulness of Lévy walk search patterns when the targets of the search are scarce.
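A Lévy walk of the kind remarked on can be sketched by drawing power-law step lengths via inverse-CDF sampling; the exponent and step count below are illustrative choices, not values from the paper.

```python
import numpy as np

def levy_walk(n_steps, mu=2.0, seed=0):
    """2-D Levy walk: step lengths with power-law survival function
    P(L > x) = x**(1 - mu) for x >= 1 (inverse-CDF sampling), directions
    uniform on the circle. Exponents near mu = 2 are often cited as
    efficient for scarce, revisitable targets."""
    rng = np.random.default_rng(seed)
    lengths = (1.0 / rng.uniform(size=n_steps)) ** (1.0 / (mu - 1.0))
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = lengths[:, None] * np.column_stack([np.cos(angles), np.sin(angles)])
    return np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])

path = levy_walk(10_000)
print(path.shape)  # (10001, 2)
```

The heavy tail of the step-length distribution is what produces the occasional very long relocations that distinguish a Lévy walk from Brownian search.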
Patient representatives' views on patient information in clinical cancer trials.
Dellson, Pia; Nilbert, Mef; Carlsson, Christina
2016-02-01
Patient enrolment into clinical trials is based on oral information and informed consent, which includes an information sheet and a consent certificate. The written information should be complete, but at the same time risks being so complex that it may be questioned if a fully informed consent is possible to provide. We explored patient representatives' views and perceptions on the written trial information used in clinical cancer trials. Written patient information leaflets used in four clinical trials for colorectal cancer were used for the study. The trials included phase I-III trials, randomized and non-randomized trials that evaluated chemotherapy/targeted therapy in the neoadjuvant, adjuvant and palliative settings. Data were collected through focus groups and were analysed using inductive content analysis. Two major themes emerged: emotional responses and cognitive responses. Subthemes related to the former included individual preferences and perceptions of effect, while subthemes related to the latter were comprehensibility and layout. Based on these observations the patient representatives provided suggestions for improvement, which largely included development of future simplified and more attractive informed consent forms. The emotional and cognitive responses to written patient information reported by patient representatives provides a basis for revised formats in future trials and add to the body of information that support use of plain language, structured text and illustrations to improve the informed consent process and thereby patient enrolment into clinical trials.
On the representativeness of behavior observation samples in classrooms.
Tiger, Jeffrey H; Miller, Sarah J; Mevers, Joanna Lomas; Mintz, Joslyn Cynkus; Scheithauer, Mindy C; Alvarez, Jessica
2013-01-01
School consultants who rely on direct observation typically conduct observational samples (e.g., one 30-min observation per day) with the hope that the sample is representative of performance during the remainder of the day, but the representativeness of these samples is unclear. In the current study, we recorded the problem behavior of 3 referred students for 4 consecutive school days between 9:30 a.m. and 2:30 p.m. using duration recording in consecutive 10-min sessions. We then culled 10-min, 20-min, 30-min, and 60-min observations from the complete record and compared these observations to the true daily mean to assess their accuracy (i.e., how well individual observations represented the daily occurrence of target behaviors). The results indicated that when behavior occurred with low variability, the majority of brief observations were representative of the overall levels; however, when behavior occurred with greater variability, even 60-min observations did not accurately capture the true levels of behavior. © Society for the Experimental Analysis of Behavior.
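The sampling question can be illustrated with synthetic per-minute behavior records: brief windows represent a low-variability record well but misrepresent a bursty one with the same daily mean. All records and window placements below are hypothetical, not the study's data.

```python
import numpy as np

def observation_error(record, window, rng, n_draws=1000):
    """Mean absolute error between randomly placed `window`-minute
    observation samples and the true daily mean of a per-minute record."""
    true_mean = record.mean()
    starts = rng.integers(0, record.size - window, n_draws)
    samples = np.array([record[s:s + window].mean() for s in starts])
    return np.abs(samples - true_mean).mean()

rng = np.random.default_rng(0)
minutes = 300  # a 5-h school day, 9:30 a.m. to 2:30 p.m.
steady = (rng.uniform(size=minutes) < 0.2).astype(float)  # low-variability behavior
bursty = np.zeros(minutes)
bursty[100:160] = 1.0   # same daily mean (0.2), concentrated in one 60-min burst
print(observation_error(steady, 30, rng) < observation_error(bursty, 30, rng))  # True
```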
Assessing the use of cognitive heuristic representativeness in clinical reasoning.
Payne, Velma L; Crowley, Rebecca S; Crowley, Rebecca
2008-11-06
We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors.
Assessing Use of Cognitive Heuristic Representativeness in Clinical Reasoning
Payne, Velma L.; Crowley, Rebecca S.
2008-01-01
We performed a pilot study to investigate use of the cognitive heuristic Representativeness in clinical reasoning. We tested a set of tasks and assessments to determine whether subjects used the heuristics in reasoning, to obtain initial frequencies of heuristic use and related cognitive errors, and to collect cognitive process data using think-aloud techniques. The study investigates two aspects of the Representativeness heuristic - judging by perceived frequency and representativeness as causal beliefs. Results show that subjects apply both aspects of the heuristic during reasoning, and make errors related to misapplication of these heuristics. Subjects in this study rarely used base rates, showed significant variability in their recall of base rates, demonstrated limited ability to use provided base rates, and favored causal data in diagnosis. We conclude that the tasks and assessments we have developed provide a suitable test-bed to study the cognitive processes underlying heuristic errors. PMID:18999140
Represented Speech in Qualitative Health Research
DEFF Research Database (Denmark)
Musaeus, Peter
2017-01-01
Represented speech refers to speech where we reference somebody. Represented speech is an important phenomenon in everyday conversation, health care communication, and qualitative research. This case will draw first from a case study on physicians' workplace learning and second from a case study on nurses' apprenticeship learning. The aim of the case is to guide the qualitative researcher to use own and others' voices in the interview and to be sensitive to represented speech in everyday conversation. Moreover, represented speech matters to health professionals who aim to represent the voice of their patients. Qualitative researchers and students might learn to encourage interviewees to elaborate different voices or perspectives. Qualitative researchers working with natural speech might pay attention to how people talk and use represented speech. Finally, represented speech might be relevant...
High Entropy Random Selection Protocols
H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim
2007-01-01
In this paper, we construct protocols for two parties that do not trust each other to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication and the entropy of the outcome.
Randomized Item Response Theory Models
Fox, Gerardus J.A.
2005-01-01
The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by
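The classic instance of the RR technique is Warner's design; the sketch below shows its unbiased prevalence estimator on simulated answers. This is the generic technique the abstract builds on, not the paper's latent-variable (IRT) model, and all numbers are illustrative.

```python
import numpy as np

def warner_estimate(answers, p):
    """Warner's randomized-response estimator. Each respondent answers the
    sensitive question with probability p and its negation with 1 - p; if
    lam is the observed proportion of 'yes', the unbiased prevalence
    estimate is (lam + p - 1) / (2p - 1), valid for p != 0.5."""
    lam = np.mean(answers)
    return (lam + p - 1.0) / (2.0 * p - 1.0)

# Simulated survey: true prevalence 0.2, randomizing device with p = 0.7
rng = np.random.default_rng(0)
truth = rng.uniform(size=100_000) < 0.2       # sensitive attribute
direct = rng.uniform(size=100_000) < 0.7      # device says: answer truthfully
answers = np.where(direct, truth, ~truth)
print(warner_estimate(answers, 0.7))          # close to 0.2
```

Because the interviewer never learns which question was answered, individual privacy is protected while the population prevalence remains estimable.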
Decompounding random sums: A nonparametric approach
DEFF Research Database (Denmark)
Hansen, Martin Bøgsted; Pitts, Susan M.
Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...
A Model for Positively Correlated Count Variables
DEFF Research Database (Denmark)
Møller, Jesper; Rubak, Ege Holger
2010-01-01
An α-permanental random field is, briefly speaking, a model for a collection of non-negative integer-valued random variables with positive associations. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their potential applications. The purpose of this paper is to summarize useful probabilistic results, study stochastic constructions and simulation techniques, and discuss some examples of α-permanental random fields. This should provide a useful basis for discussing the statistical aspects in future work.
International Nuclear Information System (INIS)
Loubenets, Elena R.
2015-01-01
We prove the existence for each Hilbert space of the two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations, but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper point also to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].
Gurau, Razvan
2017-01-01
Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in Physics confirming long standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce today the need to find an underlying common framework of the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....
Understanding Solar Cycle Variability
Energy Technology Data Exchange (ETDEWEB)
Cameron, R. H.; Schüssler, M., E-mail: cameron@mps.mpg.de [Max-Planck-Institut für Sonnensystemforschung, Justus-von-Liebig-Weg 3, D-37077 Göttingen (Germany)
2017-07-10
The level of solar magnetic activity, as exemplified by the number of sunspots and by energetic events in the corona, varies on a wide range of timescales. Most prominent is the 11-year solar cycle, which is significantly modulated on longer timescales. Drawing from dynamo theory, together with the empirical results of past solar activity and similar phenomena for solar-like stars, we show that the variability of the solar cycle can be essentially understood in terms of a weakly nonlinear limit cycle affected by random noise. In contrast to ad hoc “toy models” for the solar cycle, this leads to a generic normal-form model, whose parameters are all constrained by observations. The model reproduces the characteristics of the variable solar activity on timescales between decades and millennia, including the occurrence and statistics of extended periods of very low activity (grand minima). Comparison with results obtained with a Babcock–Leighton-type dynamo model confirms the validity of the normal-form approach.
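A weakly nonlinear limit cycle affected by random noise can be sketched with the Hopf normal form, which is the generic normal form referred to; the parameters below are arbitrary illustrative values, not the observationally constrained ones of the paper.

```python
import numpy as np

def noisy_normal_form(T, dt=0.01, lam=0.1, omega=2 * np.pi / 11.0,
                      sigma=0.05, seed=0):
    """Euler-Maruyama integration of the Hopf normal form with noise,
    dz = ((lam + i*omega)*z - |z|^2 * z) dt + sigma dW, a weakly nonlinear
    limit cycle; omega gives the deterministic cycle an 11-'year' period."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    z = np.empty(n, dtype=complex)
    z[0] = np.sqrt(lam)  # start on the limit cycle of radius sqrt(lam)
    for i in range(1, n):
        drift = ((lam + 1j * omega) - abs(z[i - 1]) ** 2) * z[i - 1]
        kick = sigma * np.sqrt(dt) * (rng.standard_normal()
                                      + 1j * rng.standard_normal())
        z[i] = z[i - 1] + drift * dt + kick
    return z

amplitude = np.abs(noisy_normal_form(500.0))
print(abs(amplitude.mean() - np.sqrt(0.1)) < 0.15)  # True: noise modulates the cycle
```

The random kicks make the cycle amplitude wander about its deterministic radius, a toy analogue of the modulated activity levels (including low-amplitude episodes) the abstract describes.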
Representative Sampling for reliable data analysis
DEFF Research Database (Denmark)
Petersen, Lars; Esbensen, Kim Harry
2005-01-01
regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...
How to get rid of W: a latent variables approach to modelling spatially lagged variables
Folmer, H.; Oud, J.
2008-01-01
In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are
How to get rid of W : a latent variables approach to modelling spatially lagged variables
Folmer, Henk; Oud, Johan
2008-01-01
In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are
Cryotherapy, Sensation, and Isometric-Force Variability
Denegar, Craig R.; Buckley, William E.; Newell, Karl M.
2003-01-01
Objective: To determine the changes in sensation of pressure, 2-point discrimination, and submaximal isometric-force production variability due to cryotherapy. Design and Setting: Sensation was assessed using a 2 × 2 × 2 × 3 repeated-measures factorial design, with treatment (ice immersion or control), limb (right or left), digit (finger or thumb), and sensation test time (baseline, posttreatment, or postisometric-force trials) as independent variables. Dependent variables were changes in sensation of pressure and 2-point discrimination. Isometric-force variability was tested with a 2 × 2 × 3 repeated-measures factorial design. Treatment condition (ice immersion or control), limb (right or left), and percentage (10, 25, or 40) of maximal voluntary isometric contraction (MVIC) were the independent variables. The dependent variables were the precision or variability (the standard deviation of mean isometric force) and the accuracy or targeting error (the root mean square error) of the isometric force for each percentage of MVIC. Subjects: Fifteen volunteer college students (8 men, 7 women; age = 22 ± 3 years; mass = 72 ± 21.9 kg; height = 183.4 ± 11.6 cm). Measurements: We measured sensation in the distal palmar aspect of the index finger and thumb. Sensation of pressure and 2-point discrimination were measured before treatment (baseline), after treatment (15 minutes of ice immersion or control), and at the completion of isometric testing (final). Variability (standard deviation of mean isometric force) of the submaximal isometric finger forces was measured by having the subjects exert a pinching force with the thumb and index finger for 30 seconds. Subjects performed the pinching task at the 3 submaximal levels of MVIC (10%, 25%, and 40%), with the order of trials assigned randomly. The subjects were given a target representing the submaximal percentage of MVIC and visual feedback of the force produced as they pinched the testing device. The force exerted
Random walks on reductive groups
Benoist, Yves
2016-01-01
The classical theory of random walks describes the asymptotic behavior of sums of independent identically distributed real random variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple (or, equivalently, that the Zariski closure of the group generated by these matrices is reductive) and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes the necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.
Directory of Open Access Journals (Sweden)
Giannasi Lilian
2012-05-01
Full Text Available Abstract Background Few studies demonstrate the effectiveness of therapies for oral rehabilitation of patients with cerebral palsy (CP), given the difficulties in chewing, swallowing and speech, besides the intellectual, sensory and social limitations. Due to upper airway obstruction, they are also vulnerable to sleep disorders. This study aims to assess sleep variables, through polysomnography, and masticatory dynamics, using electromyography, before and after neuromuscular electrical stimulation, associated or not with low-power laser (Gallium Arsenide-Aluminum, λ = 780 nm) and LED (λ = 660 nm) irradiation in CP patients. Methods/design 50 patients with CP, of both genders, aged between 19 and 60 years, will be enrolled in this study. The inclusion criteria are: voluntary participation, patients with hemiparetic, quadriparetic or diparetic CP, with the ability to understand and respond to verbal commands. The exclusion criteria are: patients undergoing or having undergone orthodontic, functional maxillary orthopedic or botulinum toxin treatment. Polysomnographic and surface electromyographic exams of the masseter, temporalis and suprahyoid muscles will be carried out on the whole sample. A questionnaire assessing oral characteristics will be applied. The sample will be divided into 5 treatment groups: Group 1: neuromuscular electrical stimulation; Group 2: laser therapy; Group 3: LED therapy; Group 4: neuromuscular electrical stimulation and laser therapy; and Group 5: neuromuscular electrical stimulation and LED therapy. All patients will be treated for 8 consecutive weeks. After treatment, polysomnographic and electromyographic exams will be collected again. Discussion This paper describes a five-arm clinical trial assessing sleep quality and masticatory function in patients with CP under non-invasive therapies. Trial registration The protocol for this study is registered with the Brazilian Registry of Clinical Trials - ReBEC RBR-994XFS Descriptors Cerebral Palsy
Giannasi, Lilian Chrystiane; Matsui, Miriam Yumi; de Freitas Batista, Sandra Regina; Hardt, Camila Teixeira; Gomes, Carla Paes; Amorim, José Benedito Oliveira; de Carvalho Aguiar, Isabella; Collange, Luanda; Dos Reis Dos Santos, Israel; Dias, Ismael Souza; de Oliveira, Cláudia Santos; de Oliveira, Luis Vicente Franco; Gomes, Mônica Fernandes
2012-05-15
Few studies demonstrate the effectiveness of therapies for oral rehabilitation of patients with cerebral palsy (CP), given the difficulties in chewing, swallowing and speech, besides the intellectual, sensory and social limitations. Due to upper airway obstruction, they are also vulnerable to sleep disorders. This study aims to assess sleep variables, through polysomnography, and masticatory dynamics, using electromyography, before and after neuromuscular electrical stimulation, associated or not with low-power laser (Gallium Arsenide-Aluminum, λ = 780 nm) and LED (λ = 660 nm) irradiation in CP patients. 50 patients with CP, of both genders, aged between 19 and 60 years, will be enrolled in this study. The inclusion criteria are: voluntary participation, patients with hemiparetic, quadriparetic or diparetic CP, with the ability to understand and respond to verbal commands. The exclusion criteria are: patients undergoing or having undergone orthodontic, functional maxillary orthopedic or botulinum toxin treatment. Polysomnographic and surface electromyographic exams of the masseter, temporalis and suprahyoid muscles will be carried out on the whole sample. A questionnaire assessing oral characteristics will be applied. The sample will be divided into 5 treatment groups: Group 1: neuromuscular electrical stimulation; Group 2: laser therapy; Group 3: LED therapy; Group 4: neuromuscular electrical stimulation and laser therapy; and Group 5: neuromuscular electrical stimulation and LED therapy. All patients will be treated for 8 consecutive weeks. After treatment, polysomnographic and electromyographic exams will be collected again. This paper describes a five-arm clinical trial assessing sleep quality and masticatory function in patients with CP under non-invasive therapies. The protocol for this study is registered with the Brazilian Registry of Clinical Trials - ReBEC RBR-994XFS.
14 CFR 1274.906 - Designation of New Technology Representative and Patent Representative.
2010-01-01
... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Designation of New Technology... Conditions § 1274.906 Designation of New Technology Representative and Patent Representative. Designation of New Technology Representative and Patent Representative July 2002 (a) For purposes of administration...
Fuzziness and randomness in an optimization framework
International Nuclear Information System (INIS)
Luhandjula, M.K.
1994-03-01
This paper presents a semi-infinite approach for linear programming in the presence of fuzzy random variable coefficients. As a byproduct, a way of dealing with optimization problems that include both fuzzy and random data is obtained. Numerical examples are provided for the sake of illustration. (author). 13 refs
Towards Representative Metallurgical Sampling and Gold Recovery Testwork Programmes
Directory of Open Access Journals (Sweden)
Simon C. Dominy
2018-05-01
Full Text Available When developing a process flowsheet, the risks in achieving positive financial outcomes are minimised by ensuring representative metallurgical samples and high-quality testwork. The quality and type of samples used are as important as the testwork itself. The key characteristic required of any set of samples is that they represent a given domain and quantify its variability. There are those who think that stating a sample(s) is representative makes it representative, without justification. There is a need to consider both (1) in-situ and (2) testwork sub-sample representativity. Early ore/waste characterisation and domain definition are required, so that sampling and testwork protocols can be designed to suit the style of mineralisation in question. The Theory of Sampling (TOS) provides an insight into the causes and magnitude of errors that may occur during the sampling of particulate materials (e.g., broken rock) and is wholly applicable to metallurgical sampling. Quality assurance/quality control (QAQC) is critical throughout all programmes. Metallurgical sampling and testwork should be fully integrated into geometallurgical studies. Traditional metallurgical testwork is critical for plant design and is an inherent part of geometallurgy. In a geometallurgical study, multiple spatially distributed small-scale tests are used as proxies for process parameters. These will be validated against traditional testwork results. This paper focusses on sampling and testwork for gold recovery determination. It aims to provide the reader with the background to move towards the design, implementation and reporting of representative and fit-for-purpose sampling and testwork programmes. While the paper does not intend to provide a definitive commentary, it critically assesses the hard-rock sampling methods used and their optimal collection and preparation. The need for representative sampling and quality testwork to avoid financial and intangible losses is
Verification of Representative Sampling in RI waste
International Nuclear Information System (INIS)
Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop
2009-01-01
For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the process of radiochemical assay. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individual drums or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. To obtain representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions: half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay and, although the sample should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to a 1-2 cm diameter and a representative aliquot taken for the required analysis. For verification of representative sampling, every classified group is tested for evaluation of 'selection of a representative drum in a group' and 'representative sampling in a drum'
Representing Uncertainty by Probability and Possibility
DEFF Research Database (Denmark)
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters. Monte Carlo simulation is readily used for practical calculations. However, an alternative approach is offered by possibility theory, making use of possibility distributions such as intervals and fuzzy intervals. This approach is well suited to represent lack of knowledge or imprecision.
Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces
International Nuclear Information System (INIS)
Khrennikov, Andrei
2010-01-01
One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.
Seehaus, Frank; Schwarze, Michael; Flörkemeier, Thilo; von Lewinski, Gabriela; Kaptein, Bart L; Jakubowitz, Eike; Hurschler, Christof
2016-05-01
Implant migration can be accurately quantified by model-based Roentgen stereophotogrammetric analysis (RSA), using an implant surface model to locate the implant relative to the bone. In a clinical situation, a single reverse engineering (RE) model for each implant type and size is used. It is unclear to what extent the accuracy and precision of migration measurement is affected by implant manufacturing variability unaccounted for by a single representative model. Individual RE models were generated for five short-stem hip implants of the same type and size. Two phantom analyses and one clinical analysis were performed: "Accuracy-matched models": one stem was assessed, and the results from the original RE model were compared with randomly selected models. "Accuracy-random model": each of the five stems was assessed and analyzed using one randomly selected RE model. "Precision-clinical setting": implant migration was calculated for eight patients, and all five available RE models were applied to each case. For the two phantom experiments, the 95%CI of the bias ranged from -0.28 mm to 0.30 mm for translation and -2.3° to 2.5° for rotation. In the clinical setting, precision is less than 0.5 mm and 1.2° for translation and rotation, respectively, except for rotations about the proximodistal axis (RSA can be achieved and are not biased by using a single representative RE model. At least for implants similar in shape to the investigated short-stem, individual models are not necessary. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:903-910, 2016. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
Quantifying intrinsic and extrinsic variability in stochastic gene expression models.
Singh, Abhyudai; Soltani, Mohammad
2013-01-01
Genetically identical cell populations exhibit considerable intercellular variation in the level of a given protein or mRNA. Both intrinsic and extrinsic sources of noise drive this variability in gene expression. More specifically, extrinsic noise is the expression variability that arises from cell-to-cell differences in cell-specific factors such as enzyme levels, cell size and cell cycle stage. In contrast, intrinsic noise is the expression variability that is not accounted for by extrinsic noise, and typically arises from the inherent stochastic nature of biochemical processes. Two-color reporter experiments are employed to decompose expression variability into its intrinsic and extrinsic noise components. Analytical formulas for intrinsic and extrinsic noise are derived for a class of stochastic gene expression models, where variations in cell-specific factors cause fluctuations in model parameters, in particular, transcription and/or translation rate fluctuations. Assuming mRNA production occurs in random bursts, transcription rate is represented by either the burst frequency (how often the bursts occur) or the burst size (number of mRNAs produced in each burst). Our analysis shows that fluctuations in the transcription burst frequency enhance extrinsic noise but do not affect the intrinsic noise. On the contrary, fluctuations in the transcription burst size or mRNA translation rate dramatically increase both intrinsic and extrinsic noise components. Interestingly, simultaneous fluctuations in transcription and translation rates arising from randomness in ATP abundance can decrease intrinsic noise measured in a two-color reporter assay. Finally, we discuss how these formulas can be combined with single-cell gene expression data from two-color reporter experiments for estimating model parameters.
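The two-color decomposition described above can be sketched numerically. The estimators below follow the standard form used in two-color reporter analyses (intrinsic noise from the mean squared difference between reporters, extrinsic noise from their covariance); the function name and the toy data are illustrative assumptions, not taken from the paper.

```python
# Sketch of the standard two-color reporter noise decomposition; the
# data are made-up illustrative values, not from the paper's models.
from statistics import mean

def noise_decomposition(c1, c2):
    """Split the squared coefficient of variation into intrinsic and
    extrinsic parts from paired reporter measurements c1, c2."""
    m1, m2 = mean(c1), mean(c2)
    n = len(c1)
    # Intrinsic noise: mean squared difference between the two reporters
    eta_int = mean([(x - y) ** 2 for x, y in zip(c1, c2)]) / (2 * m1 * m2)
    # Extrinsic noise: covariance of the two reporters
    eta_ext = (sum(x * y for x, y in zip(c1, c2)) / n - m1 * m2) / (m1 * m2)
    return eta_int, eta_ext

# Perfectly correlated reporters -> purely extrinsic noise
c1 = [8.0, 10.0, 12.0]
eta_int, eta_ext = noise_decomposition(c1, c1)
```

Feeding both channels the same trace makes the intrinsic component vanish, matching the interpretation above: only cell-to-cell (extrinsic) differences remain when the two reporters fluctuate in lockstep.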
Services Subcontract Technical Representative (STR) handbook
International Nuclear Information System (INIS)
Houston, D.H.
1997-06-01
The purpose of this handbook is to provide guidance to Bechtel Hanford, Inc. Subcontract Representatives in their assignments. It is the intention of this handbook to ensure that subcontract work is performed in accordance with the subcontract documents
REFractions: The Representing Equivalent Fractions Game
Tucker, Stephen I.
2014-01-01
Stephen Tucker presents a fractions game that addresses a range of fraction concepts including equivalence and computation. The REFractions game also improves students' fluency with representing, comparing and adding fractions.
Representing Boolean Functions by Decision Trees
Chikalov, Igor
2011-01-01
A Boolean or discrete function can be represented by a decision tree. A compact form of decision tree named binary decision diagram or branching program is widely known in logic design [2, 40]. This representation is equivalent to other forms
Request by the Resident Representative of Iraq
International Nuclear Information System (INIS)
1990-01-01
The attached clarification by a spokesman of the Iraqi Ministry of Foreign Affairs is being circulated for the information of Member States pursuant to a request made by the Resident Representative of Iraq
8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models
Energy Technology Data Exchange (ETDEWEB)
Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-08-03
Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve load over many years or decades. CEMs can be computationally complex and are often forced to estimate key parameters using simplified methods to achieve acceptable solve times or for other reasons. In this paper, we discuss one of these parameters -- capacity value (CV). We first provide a high-level motivation for and overview of CV. We next describe existing modeling simplifications and an alternate approach for estimating CV that utilizes hourly '8760' data of load and VG resources. We then apply this 8760 method to an established CEM, the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) model (Eurek et al. 2016). While this alternative approach for CV is not itself novel, it contributes to the broader CEM community by (1) demonstrating how a simplified 8760 hourly method, which can be easily implemented in other power sector models when data is available, more accurately captures CV trends than a statistical method within the ReEDS CEM, and (2) providing a flexible modeling framework from which other 8760-based system elements (e.g., demand response, storage, and transmission) can be added to further capture important dynamic interactions, such as curtailment.
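The 8760-based idea can be illustrated with a minimal sketch: approximate a variable generator's capacity value as its average capacity factor over the highest-load hours of the year. This is one common approximation, not necessarily the exact method implemented in ReEDS; the function name and the toy data below are illustrative.

```python
def capacity_value(load, vg_output, vg_capacity, top_n=100):
    """Approximate VG capacity value as the average capacity factor of
    the resource during the top_n highest-load hours of the profile."""
    # Rank hours by load and keep the top_n highest-load hours
    top_hours = sorted(range(len(load)), key=lambda h: load[h],
                       reverse=True)[:top_n]
    return sum(vg_output[h] for h in top_hours) / (top_n * vg_capacity)

# Toy 6-hour example (real use would pass 8760 hourly values)
load = [50, 80, 90, 60, 95, 70]        # MW system load
vg = [10, 30, 20, 5, 40, 15]           # MW output of a 50 MW resource
cv = capacity_value(load, vg, vg_capacity=50, top_n=2)  # top-load hours: 95, 90
```

Because the calculation pairs each hour's VG output with the coincident load, it captures the correlation effects that simple annual capacity factors miss, which is the point of using full 8760 data.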
Czech Academy of Sciences Publication Activity Database
Kebza, V.; Šolcová, Iva
2008-01-01
Vol. 23, Suppl. 1 (2008), p. 158. ISSN 0887-0446. R&D Projects: GA AV ČR(CZ) IAA 700250701. Institutional research plan: CEZ:AV0Z70250504. Keywords: hardiness; Czech population sample; social support. Subject RIV: AN - Psychology
Statistical auditing and randomness test of lotto k/N-type games
Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.
2008-11-01
One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
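The moments referred to above can be checked directly for a toy game. Standard results for k draws without replacement from 1, 2, …, N (stated here as textbook facts, not quoted from the paper) are E[X] = (N+1)/2, Var(X) = (N²−1)/12 and Cov(X_i, X_j) = −(N+1)/12 for i ≠ j. The sketch below verifies the mean and covariance by exhaustive enumeration for a small N.

```python
from itertools import permutations

N, k = 6, 2  # a small lotto k/N game, kept tiny so we can enumerate
pairs = list(permutations(range(1, N + 1), 2))  # all ordered draws w/o replacement

# Empirical moments over the uniform distribution on ordered pairs
mean_x = sum(x for x, _ in pairs) / len(pairs)
exy = sum(x * y for x, y in pairs) / len(pairs)
cov = exy - mean_x * mean_x

# Each drawn number is marginally uniform on 1..N
assert mean_x == (N + 1) / 2
# Negative covariance between distinct draws: Cov(X_i, X_j) = -(N+1)/12
```

The negative covariance reflects sampling without replacement (drawing a large number makes the remaining draws slightly smaller on average), and it is these theoretical values that a statistical audit compares against the historical draw sample.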
Crisis Relocation Workshops for Transportation Industry Representatives
1979-12-01
executive, two National Guard members, one Air Force transportation representative, two Red Cross representatives, one school bus coordinator...managers, local transit operator, and military busing authority. Local Government 2: Fire chief, assistant; Air Force 3: Liaison Support; DCPA...to attend the workshop. Major transportation problems anticipated during crisis relocation include: 1. Transportation of carless residents; 2. The
Random distributed feedback fibre lasers
Energy Technology Data Exchange (ETDEWEB)
Turitsyn, Sergei K., E-mail: s.k.turitsyn@aston.ac.uk [Aston Institute of Photonic Technologies, Aston University, Birmingham B4 7ET (United Kingdom); Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Babin, Sergey A. [Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Churkin, Dmitry V. [Aston Institute of Photonic Technologies, Aston University, Birmingham B4 7ET (United Kingdom); Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Vatnik, Ilya D.; Nikulin, Maxim [Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Podivilov, Evgenii V. [Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation)
2014-09-10
generation of a stationary near-Gaussian beam with a narrow spectrum. A random distributed feedback fibre laser has efficiency and performance that are comparable to and even exceed those of similar conventional fibre lasers. The key features of the generated radiation of random distributed feedback fibre lasers include: a stationary narrow-band continuous modeless spectrum that is free of mode competition, nonlinear power broadening, and an output beam with a Gaussian profile in the fundamental transverse mode (generated both in single mode and multi-mode fibres). This review presents the current status of research in the field of random fibre lasers and shows their potential and perspectives. We start with an introductory overview of conventional distributed feedback lasers and traditional random lasers to set the stage for discussion of random fibre lasers. We then present a theoretical analysis and experimental studies of various random fibre laser configurations, including widely tunable, multi-wavelength, narrow-band generation, and random fibre lasers operating in different spectral bands in the 1–1.6 μm range. Then we discuss existing and future applications of random fibre lasers, including telecommunication and distributed long reach sensor systems. A theoretical description of random lasers is very challenging and is strongly linked with the theory of disordered systems and kinetic theory. We outline two key models governing the generation of random fibre lasers: the average power balance model and the nonlinear Schrödinger equation based model. Recently invented random distributed feedback fibre lasers represent a new and exciting field of research that brings together such diverse areas of science as laser physics, the theory of disordered systems, fibre optics and nonlinear science. Stable random generation in optical fibre opens up new possibilities for research on wave transport and localization in disordered media. 
We hope that this review will provide
Random distributed feedback fibre lasers
International Nuclear Information System (INIS)
Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.
2014-01-01
generation of a stationary near-Gaussian beam with a narrow spectrum. A random distributed feedback fibre laser has efficiency and performance that are comparable to and even exceed those of similar conventional fibre lasers. The key features of the generated radiation of random distributed feedback fibre lasers include: a stationary narrow-band continuous modeless spectrum that is free of mode competition, nonlinear power broadening, and an output beam with a Gaussian profile in the fundamental transverse mode (generated both in single mode and multi-mode fibres). This review presents the current status of research in the field of random fibre lasers and shows their potential and perspectives. We start with an introductory overview of conventional distributed feedback lasers and traditional random lasers to set the stage for discussion of random fibre lasers. We then present a theoretical analysis and experimental studies of various random fibre laser configurations, including widely tunable, multi-wavelength, narrow-band generation, and random fibre lasers operating in different spectral bands in the 1–1.6 μm range. Then we discuss existing and future applications of random fibre lasers, including telecommunication and distributed long reach sensor systems. A theoretical description of random lasers is very challenging and is strongly linked with the theory of disordered systems and kinetic theory. We outline two key models governing the generation of random fibre lasers: the average power balance model and the nonlinear Schrödinger equation based model. Recently invented random distributed feedback fibre lasers represent a new and exciting field of research that brings together such diverse areas of science as laser physics, the theory of disordered systems, fibre optics and nonlinear science. Stable random generation in optical fibre opens up new possibilities for research on wave transport and localization in disordered media. 
We hope that this review will provide
Classification and prediction of port variables
Energy Technology Data Exchange (ETDEWEB)
Molina Serrano, B.
2016-07-01
Many variables are included in the planning and management of port terminals. They can be economic, social, environmental and institutional. The agent needs to know the relationships between these variables in order to modify planning conditions. The use of Bayesian networks allows for classifying, predicting and diagnosing these variables. Bayesian networks allow for estimating the posterior probability of unknown variables, based on known variables. At the planning level, this means that it is not necessary to know all variables, because their relationships are known. The agent can obtain useful information about how port variables are connected, which can be interpreted as cause-effect relationships. Bayesian networks can be used to make optimal decisions by introducing possible actions and the utility of their results. In the proposed methodology, a database has been generated with more than 40 port variables. They have been classified into economic, social, environmental and institutional variables, in the same way as the smart port studies in the Spanish Port System. From this database, a network has been generated using a directed acyclic graph, which makes the relationships between port variables (parent-child relationships) known. The obtained network shows that economic variables are, in cause-effect terms, the cause of the remaining variable typologies. Economic variables play the parent role in most cases. Moreover, when environmental variables are known, the obtained network allows for estimating the posterior probability of social variables. It has been concluded that Bayesian networks allow for modeling uncertainty in a probabilistic way, even when the number of variables is high, as occurs in the planning and management of port terminals. (Author)
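The kind of inference described (estimating the posterior probability of an unknown variable from known ones) can be sketched for a minimal two-node network. The variable names and probabilities below are invented for illustration and are not the port variables of the study.

```python
# Toy Bayesian-network inference with made-up probabilities: an economic
# parent variable E (high traffic) and an environmental child C (high
# emissions). Observing C, we update our belief about E via Bayes' rule.
p_e = 0.3                                # prior P(E = high traffic)
p_c_given_e = {True: 0.8, False: 0.2}    # P(C = high emissions | E)

# Marginal P(C) by the law of total probability
p_c = p_c_given_e[True] * p_e + p_c_given_e[False] * (1 - p_e)

# Posterior P(E | C) by Bayes' rule
p_e_given_c = p_c_given_e[True] * p_e / p_c
```

Observing high emissions roughly doubles the belief in high traffic (from 0.3 to about 0.63); larger networks chain this same computation along the parent-child edges of the graph.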
International Nuclear Information System (INIS)
Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing
2007-01-01
Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time, whereas a normal pulse generator generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generator are reviewed, and 2 digital random pulse generators are introduced. (authors)
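A simple way to mimic the random timing of detector signals is to draw exponentially distributed inter-arrival times, giving a homogeneous Poisson pulse train. The sketch below uses inverse-transform sampling; it is an illustrative model, not a description of the generators introduced in the paper.

```python
import random
from math import log

def poisson_pulse_times(rate_hz, n_pulses, seed=None):
    """Generate n_pulses timestamps whose inter-arrival times are
    exponentially distributed (a homogeneous Poisson pulse train)."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n_pulses):
        # Inverse-transform sampling: -ln(U)/rate is Exp(rate)-distributed
        t += -log(rng.random()) / rate_hz
        times.append(t)
    return times

times = poisson_pulse_times(rate_hz=1000.0, n_pulses=10000, seed=42)
mean_interval = times[-1] / len(times)  # should be close to 1/rate = 1 ms
```

Unlike a periodic generator, successive intervals here vary widely (the exponential distribution has a standard deviation equal to its mean), which is exactly the behavior needed to stress-test counting electronics for pile-up and dead-time effects.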
Random matrices and random difference equations
International Nuclear Information System (INIS)
Uppuluri, V.R.R.
1975-01-01
Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete-time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood-bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated from the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models.
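The convergence mentioned for the one-compartment model can be illustrated: if each step retains an independent random fraction R of the material, then x_n = x_0 R_1 ⋯ R_n and, by independence, E[x_n] = x_0 (E[R])^n, an exponential in n. The sketch below checks this by exact enumeration for a two-point retention distribution with made-up values, not a model taken from the paper.

```python
from itertools import product

# Two-point retention fraction: each step keeps 90% or 70% of the
# material with equal probability (illustrative numbers).
values, probs = (0.9, 0.7), (0.5, 0.5)
n, x0 = 3, 1.0

# Exact expectation of x_n = x0 * R_1 * ... * R_n by enumerating all
# 2**n equally structured retention sequences
e_xn = 0.0
for seq in product(range(2), repeat=n):
    p, x = 1.0, x0
    for i in seq:
        p *= probs[i]
        x *= values[i]
    e_xn += p * x

# Independence gives E[x_n] = x0 * (E[R])**n, i.e. exponential decay in n
e_r = sum(v * p for v, p in zip(values, probs))  # E[R] = 0.8
```

With E[R] = 0.8 the exact enumeration gives E[x_3] = 0.8³ = 0.512, matching the product-of-expectations formula; writing (E[R])^n as e^{n ln E[R]} makes the exponential form explicit.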
Random survival forests for competing risks
DEFF Research Database (Denmark)
Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B
2014-01-01
We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...
Algebraic polynomials with random coefficients
Directory of Open Access Journals (Sweden)
K. Farahmand
2002-01-01
Full Text Available This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form $a_0(\omega)+a_1(\omega)\binom{n}{1}^{1/2}x+a_2(\omega)\binom{n}{2}^{1/2}x^2+\dots+a_n(\omega)\binom{n}{n}^{1/2}x^n$ when $n$ is large. The coefficients $\{a_j(\omega)\}_{j=0}^{n}$, $\omega\in\Omega$, are assumed to be a sequence of independent normally distributed random variables with mean zero and variance one, each defined on a fixed probability space $(A,\Omega,\mathrm{Pr})$. A special case of dependent coefficients is also studied.
Temperatures and heating energy in New Zealand houses from a nationally representative study - HEEP
Energy Technology Data Exchange (ETDEWEB)
French, L.J.; Camilleri, M.J.; Isaacs, N.P.; Pollard, A.R. [BRANZ Ltd., Private Bag 50 908, Porirua City (New Zealand)
2007-07-15
The household energy end-use project (HEEP) has collected energy and temperature data from a randomly selected, nationally representative sample of about 400 houses throughout New Zealand. This database has been used to explore the drivers of indoor temperatures and heating energy. Initial analysis of the winter living room temperatures shows that heating type, climate and house age are the key drivers. On average, houses heated by solid fuel are the warmest, with houses heated by portable LPG and electric heaters the coldest. Over the three winter months, living rooms are below 20 °C for 83% of the time - and the living room is typically the warmest room. Central heating is in only 5% of houses. Solid fuel is the dominant heating fuel in houses. The lack of air conditioning means that summer temperatures are affected by passive influences (e.g. house design, construction). Summer temperatures are strongly influenced by the house age and the local climate - together these variables explain 69% of the variation in daytime (9 a.m. to 5 p.m.) living room temperatures. In both summer and winter newer (post-1978) houses are warmer - this is beneficial in winter, but the high temperatures in summer are potentially uncomfortable. (author)
Effect of the Young modulus variability on the mechanical behaviour of a nuclear containment vessel
Energy Technology Data Exchange (ETDEWEB)
Larrard, T. de, E-mail: delarrard@lmt.ens-cachan.f [LMT-ENS Cachan, CNRS/UPMC/PRES UniverSud Paris (France); Colliat, J.B.; Benboudjema, F. [LMT-ENS Cachan, CNRS/UPMC/PRES UniverSud Paris (France); Torrenti, J.M. [Universite Paris-Est, LCPC (France); Nahas, G. [IRSN/DSR/SAMS/BAGS, Fontenay-aux-Roses (France)
2010-12-15
This study investigates the influence of the variability of the Young modulus on the mechanical behaviour of a nuclear containment vessel in the case of a loss-of-coolant accident, under the assumption of elastic behaviour. To achieve this, the Monte-Carlo method is carried out through a middleware which encapsulates the different components (random field generation, FE simulations) and enables parallelisation of the calculations. The main goal is to quantify the uncertainty propagation by comparing the maximal values of the outputs of interest (orthoradial stress and Mazars equivalent strain) for each realisation of the considered random field with the ones obtained from a reference calculation using a uniform field (equal to the expected value of the random field). The Young modulus is assumed to be accurately represented by a weakly homogeneous random field, and realisations are generated through its truncated Karhunen-Loeve expansion. The study reveals that the expected value of the maximal equivalent strain in the structure is larger when the spatial variability of the Young modulus is taken into account than the value obtained from a deterministic approach with a uniform Young modulus field. The influence of the correlation length is investigated as well. Finally, it is shown that, for each realisation, there is no correlation between the locations of the maximal equivalent strain and the locations where the extreme values of the Young modulus are observed.
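A much-simplified 1-D sketch of this workflow, assuming an exponential covariance kernel and purely illustrative material parameters (none taken from the paper): realisations of a spatially variable Young modulus are drawn from a truncated Karhunen-Loeve expansion and pushed through a toy elastic response by Monte Carlo.

```python
import numpy as np

# Hypothetical 1-D sketch: represent a spatially variable Young modulus as a
# truncated Karhunen-Loeve (KL) expansion of a homogeneous Gaussian field,
# then Monte-Carlo the maximum of a response quantity. The mean, standard
# deviation and correlation length below are invented for illustration.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)             # spatial grid
E_mean, E_std, ell = 30e9, 3e9, 0.2        # assumed field parameters (Pa, Pa, m)

# Covariance matrix of the field on the grid (exponential kernel); its
# eigenpairs approximate the KL modes.
C = E_std**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order][:20], vecs[:, order][:, :20]   # keep 20 KL terms

def realisation():
    xi = rng.standard_normal(vals.size)                 # independent N(0,1)
    return E_mean + vecs @ (np.sqrt(vals) * xi)

# Toy response: elastic strain under uniform stress, eps = sigma / E.
# Compare the Monte-Carlo mean of the maximal strain to the uniform-field value.
sigma = 1e6
max_eps = [np.max(sigma / realisation()) for _ in range(500)]
print(np.mean(max_eps), sigma / E_mean)   # variability raises the expected max
```

This reproduces, in miniature, the paper's qualitative conclusion: because the maximum strain occurs wherever the modulus dips lowest, its expected value exceeds the deterministic uniform-field value.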
48 CFR 1852.227-72 - Designation of new technology representative and patent representative.
2010-10-01
... CONTRACT CLAUSES Texts of Provisions and Clauses 1852.227-72 Designation of new technology representative... of New Technology Representative and Patent Representative (JUL 1997) (a) For purposes of administration of the clause of this contract entitled “New Technology” or “Patent Rights—Retention by the...
14 CFR 1260.58 - Designation of new technology representative and patent representative.
2010-01-01
... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Designation of new technology... of new technology representative and patent representative. Designation of New Technology... of this grant entitled “New Technology,” the following named representatives are hereby designated by...
2010-07-01
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Changing Hg designated representative and alternate Hg designated representative; changes in owners and operators. 60.4112 Section 60.4112... Generating Units Hg Designated Representative for Hg Budget Sources § 60.4112 Changing Hg designated...
Risk Gambling and Personality: Results from a Representative Swedish Sample.
Sundqvist, Kristina; Wennberg, Peter
2015-12-01
The association between personality and gambling has been explored previously. However, few studies are based on representative populations. This study aimed at examining the association between risk gambling and personality in a representative Swedish population. A random Swedish sample (N = 19,530) was screened for risk gambling using the Lie/Bet questionnaire. The study sample (N = 257) consisted of those screening positive on Lie/Bet and completing a postal questionnaire about gambling and personality (measured with the NODS-PERC and the HP5i respectively). Risk gambling was positively correlated with Negative Affectivity (a facet of Neuroticism) and Impulsivity (an inversely related facet of Conscientiousness), but all associations were weak. When taking age and gender into account, there were no differences in personality across game preference groups, though preferred game correlated with level of risk gambling. Risk gamblers scored lower than the population norm data with respect to Negative Affectivity, but risk gambling men scored higher on Impulsivity. The association between risk gambling and personality found in previous studies was corroborated in this study using a representative sample. We conclude that risk and problem gamblers should not be treated as a homogeneous group, and prevention and treatment interventions should be adapted according to differences in personality, preferred type of game and the risk potential of the games.
Directory of Open Access Journals (Sweden)
Shuiqing Yu
2013-01-01
Full Text Available This paper investigates the dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based upon the Lyapunov theory. The quantitative relationship of the dropout rate, transition probability matrix, and nonlinear level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.
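The abstract's modelling assumption (packet dropout and delay as two independent Bernoulli random variables) can be illustrated with a toy simulation of the measurement channel seen by an output-feedback controller; the probabilities, plant output, and hold-last-value policy below are all assumptions for illustration, not the paper's design.

```python
import numpy as np

# Illustrative sketch (not the paper's controller): model random packet
# dropout and random one-step delay as two independent Bernoulli variables
# and observe their effect on the measurement the controller receives.
rng = np.random.default_rng(1)
p_drop, p_delay = 0.1, 0.2          # assumed dropout / delay probabilities

y = np.sin(0.1 * np.arange(100))    # hypothetical plant output sequence
received = np.empty_like(y)
last = 0.0                          # hold the last value on dropout
for k in range(y.size):
    if rng.random() < p_drop:                 # packet lost
        received[k] = last
    elif rng.random() < p_delay and k > 0:    # packet delayed one step
        received[k] = y[k - 1]
    else:
        received[k] = y[k]
    last = received[k]

print(np.mean(received != y))   # fraction of corrupted measurements
```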
Representing uncertainty on model analysis plots
Directory of Open Access Journals (Sweden)
Trevor I. Smith
2016-09-01
Full Text Available Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
Representative Democracy in Australian Local Government
Directory of Open Access Journals (Sweden)
Colin Hearfield
2009-01-01
Full Text Available In an assessment of representative democracy in Australian local government, this paper considers long-run changes in forms of political representation, methods of vote counting, franchise arrangements, numbers of local government bodies and elected representatives, as well as the thorny question of constitutional recognition. This discussion is set against the background of ongoing tensions between the drive for economic efficiency and the maintenance of political legitimacy, along with more deep-seated divisions emerging from the legal relationship between local and state governments and the resultant problems inherent in local government autonomy versus state intervention.
Enhancing policy innovation by redesigning representative democracy
DEFF Research Database (Denmark)
Sørensen, Eva
2016-01-01
Policy innovation is a key aspect of public innovation which has been largely overlooked. Political leadership, competition and collaboration are key drivers of policy innovation. It is a barrier in traditional models of representative democracy that they provide weak conditions for collaboration. Two Danish case studies indicate that collaboration between politicians and relevant and affected stakeholders can promote policy innovation, but also that a redesign of representative democracy is needed in order to establish a productive combination of political leadership, competition and collaboration in political life.
Representativeness elements of a hybrid reactor demonstrator
International Nuclear Information System (INIS)
Kerdraon, D.; Billebaud, A.; Brissot, R.; David, S.; Giorni, A.; Heuer, D.; Loiseaux, J.M.; Meplan, O.
2000-11-01
This document deals with the quantification of the minimum thermal power level for a demonstrator and the definition of the physical criteria which establish the representative character of a demonstrator with respect to a power reactor. Solutions that allow an acceptable flux to be maintained in an industrial core have also been studied. The document is divided in three parts: the representativeness elements, the considered solutions, and the characterization of the neutron fluxes at the interfaces and the dose rates at the outer surface of the vessel. (A.L.B.)
Representative process sampling for reliable data analysis
DEFF Research Database (Denmark)
Julius, Lars Petersen; Esbensen, Kim
2005-01-01
(sampling variances) can be reduced greatly however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot...
Machine learning search for variable stars
Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis
2018-04-01
Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
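The classification idea (per-light-curve variability indices as features, a labelled training set, a Random Forest among the classifiers) can be sketched with scikit-learn. The data here are synthetic stand-ins for the OGLE-II indices, and all numbers (feature count, class fraction, separation) are illustrative assumptions, not the paper's values.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 2000
# Two toy "variability index" features: constant stars cluster near zero,
# true variables (~5% of objects, an assumed fraction) have inflated indices.
labels = rng.random(n) < 0.05
features = rng.normal(0.0, 1.0, (n, 2))
features[labels] += 2.5                  # variables stand out in both indices

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))             # accuracy on the held-out set
```

In practice, as the abstract notes, the payoff over a fixed threshold on any single index is that the classifier combines many indices at once while staying robust to outliers.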
Topics in random walks in random environment
International Nuclear Information System (INIS)
Sznitman, A.-S.
2004-01-01
Over the last twenty-five years, random motions in random media have been intensively investigated, and some new general methods and paradigms have by now emerged. Random walks in random environment constitute one of the canonical models of the field. However, in dimension bigger than one they are still poorly understood, and many of the basic issues remain to this day unresolved. The present series of lectures attempts to give an account of the progress which has been made over the last few years, especially in the study of multi-dimensional random walks in random environment with ballistic behavior. (author)
A Review on asymptotic normality of sums of associated random ...
African Journals Online (AJOL)
Association between random variables is a generalization of independence of these random variables. This concept is more and more commonly used in current research across the fields of Statistics. In this paper, we proceed to a simple, clear and rigorous introduction to it. We will present the fundamental asymptotic ...
Qualitatively Assessing Randomness in SVD Results
Lamb, K. W.; Miller, W. P.; Kalra, A.; Anderson, S.; Rodriguez, A.
2012-12-01
Singular Value Decomposition (SVD) is a powerful tool for identifying regions of significant co-variability between two spatially distributed datasets. SVD has been widely used in atmospheric research to define relationships between sea surface temperatures, geopotential height, wind, precipitation and streamflow data for myriad regions across the globe. A typical application for SVD is to identify leading climate drivers (as observed in the wind or pressure data) for a particular hydrologic response variable such as precipitation, streamflow, or soil moisture. One can also investigate the lagged relationship between a climate variable and the hydrologic response variable using SVD. When performing these studies it is important to limit the spatial bounds of the climate variable to reduce the chance of spurious co-variance relationships being identified. On the other hand, a climate region that is too small may ignore climate signals which have more than a statistical relationship to a hydrologic response variable. The proposed research seeks a qualitative method for identifying random co-variability relationships between two data sets. The research takes heterogeneous correlation maps from several past results and compares them with correlation maps produced using purely random and quasi-random climate data. The comparison yields a methodology to determine whether a particular region on a correlation map can be explained by a physical mechanism or is simply statistical chance.
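A minimal sketch of the SVD co-variability computation the abstract refers to, under simplifying assumptions: two synthetic anomaly fields share one common signal, the cross-covariance matrix between them is decomposed, and the squared covariance fraction of the leading mode is inspected. The field sizes, noise level, and shared-signal construction are invented for illustration.

```python
import numpy as np

# Build two anomaly fields (e.g. a climate field and a hydrologic field)
# that share a single time signal plus independent noise, then SVD their
# cross-covariance matrix.
rng = np.random.default_rng(3)
t, n1, n2 = 120, 40, 25                  # time steps; grid points per field
signal = rng.standard_normal(t)
left = np.outer(signal, rng.standard_normal(n1)) + 0.5 * rng.standard_normal((t, n1))
right = np.outer(signal, rng.standard_normal(n2)) + 0.5 * rng.standard_normal((t, n2))

# Remove time means, form the cross-covariance, and decompose it.
left -= left.mean(axis=0)
right -= right.mean(axis=0)
C = left.T @ right / (t - 1)
U, s, Vt = np.linalg.svd(C, full_matrices=False)

scf1 = s[0]**2 / np.sum(s**2)            # squared covariance fraction, mode 1
print(round(scf1, 3))                    # the shared signal dominates mode 1
```

Repeating the same decomposition with `signal` replaced by pure noise gives a baseline for how large a leading mode can look by chance, which is the comparison the proposed research formalizes.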
Attributes Heeded When Representing an Osmosis Problem.
Zuckerman, June Trop
Eighteen high school science students were involved in a study to determine what attributes in the problem statement they need when representing a typical osmosis problem. In order to realize this goal students were asked to solve problems aloud and to explain their answers. Included as a part of the results are the attributes that the students…
Strong imploding shock, the representative curve
International Nuclear Information System (INIS)
Mishkin, E.A.; Alejaldre, C.
1981-01-01
The representative curve of the ideal gas behind the front of a spherically, or cylindrically, symmetric strong imploding shock is shown to pass through the point where the reduced pressure is maximum, P(ξ_m) = P_max. (orig.)
28 CFR 104.4 - Personal Representative.
2010-07-01
... addition, as provided in § 104.21(b)(5) of this part, the Special Master may publish a list of individuals... Special Master may, in his discretion, determine that the Personal Representative for purposes of... administrator of the decedent's estate. In the event no will exists, the Special Master may, in his discretion...
Adjustment Following Disability: Representative Case Studies.
Heinemann, Allen W.; Shontz, Franklin C.
1984-01-01
Examined adjustment following physical disability using the representative case method with two persons with quadriplegia. Results highlighted the importance of previously established coping styles as well as the role of the environment in adjustment. Willingness to mourn aided in later growth. (JAC)
Padilla, Alberto
2009-01-01
Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...
International Nuclear Information System (INIS)
Colbeck, Roger; Kent, Adrian
2006-01-01
Alice is a charismatic quantum cryptographer who believes her parties are unmissable; Bob is a (relatively) glamorous string theorist who believes he is an indispensable guest. To prevent possibly traumatic collisions of self-perception and reality, their social code requires that decisions about invitation or acceptance be made via a cryptographically secure variable-bias coin toss (VBCT). This generates a shared random bit by the toss of a coin whose bias is secretly chosen, within a stipulated range, by one of the parties; the other party learns only the random bit. Thus one party can secretly influence the outcome, while both can save face by blaming any negative decisions on bad luck. We describe here some cryptographic VBCT protocols whose security is guaranteed by quantum theory and the impossibility of superluminal signaling, setting our results in the context of a general discussion of secure two-party computation. We also briefly discuss other cryptographic applications of VBCT
Comparison of variability in pork carcass composition and quality between barrows and gilts.
Overholt, M F; Arkfeld, E K; Mohrhauser, D A; King, D A; Wheeler, T L; Dilger, A C; Shackelford, S D; Boler, D D
2016-10-01
Pigs (n = 8,042) raised in 8 different barns representing 2 seasons (cold and hot) and 2 production focuses (lean growth and meat quality) were used to characterize variability of carcass composition and quality traits between barrows and gilts. Data were collected on 7,684 pigs at the abattoir. Carcass characteristics, subjective loin quality, and fresh ham face color (muscles) were measured on a targeted 100% of carcasses. Fresh belly characteristics, boneless loin weight, instrumental loin color, and ultimate loin pH measurements were collected from 50% of the carcasses each slaughter day. Adipose tissue iodine value (IV), 30-min loin pH, LM slice shear force, and fresh ham muscle characteristic measurements were recorded on 10% of carcasses each slaughter day. Data were analyzed using the MIXED procedure of SAS as a 1-way ANOVA in a randomized complete block design with 2 levels (barrows and gilts). Barn (block), marketing group, production focus, and season were random variables. A 2-variance model was fit using the REPEATED statement of the MIXED procedure, grouped by sex for analysis of least squares means. Homogeneity of variance was tested on raw data using Levene's test of the GLM procedure. Hot carcass weight of pigs (94.6 kg) in this study was similar to the U.S. industry average HCW (93.1 kg). Therefore, these data are representative of typical U.S. pork carcasses. There was no difference (P ≥ 0.09) in variability of HCW or loin depth between barrow and gilt carcasses. Back fat depth and estimated carcass lean were more variable (P ≤ 0.0001) and IV was less variable (P = 0.05) in carcasses from barrows than in carcasses from gilts. Fresh belly weight and thickness were more variable (P ≤ 0.01) for bellies of barrows than bellies of gilts, but there was no difference in variability for belly length, width, or flop distance (P ≥ 0.06). Fresh loin subjective color was less variable ( ham traits. Overall, traits associated with carcass fatness, including
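The homogeneity-of-variance step can be illustrated with SciPy's `levene`, which computes the same test as the GLM-based Levene procedure named in the abstract; the group sizes, means, and standard deviations below are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import levene

# Simulate one trait (say, back-fat depth) for two sex groups with
# deliberately unequal spread, then test homogeneity of variance.
rng = np.random.default_rng(4)
barrows = rng.normal(20.0, 4.0, 500)   # hypothetical values, more variable
gilts = rng.normal(18.0, 2.5, 500)

stat, p = levene(barrows, gilts)
print(p < 0.05)                        # True: variances differ significantly
```

A small p-value here corresponds to the paper's "more variable (P ≤ …)" statements; it says the spreads differ, not the means.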
Funaki, Tadahisa
2016-01-01
Interfaces are created to separate two distinct phases in a situation in which phase coexistence occurs. This book discusses randomly fluctuating interfaces in several different settings and from several points of view: discrete/continuum, microscopic/macroscopic, and static/dynamic theories. The following four topics in particular are dealt with in the book. Assuming that the interface is represented as a height function measured from a fixed-reference discretized hyperplane, the system is governed by the Hamiltonian of gradient of the height functions. This is a kind of effective interface model called ∇φ-interface model. The scaling limits are studied for Gaussian (or non-Gaussian) random fields with a pinning effect under a situation in which the rate functional of the corresponding large deviation principle has non-unique minimizers. Young diagrams determine decreasing interfaces, and their dynamics are introduced. The large-scale behavior of such dynamics is studied from the points of view of the hyd...
International Nuclear Information System (INIS)
Antignani, Sara; Carelli, Vinicio; Cordedda, Carlo; Zonno, Fedele; Ampollini, Marco; Carpentieri, Carmela; Venoso, Gennaro; Bochicchio, Francesco
2013-01-01
Representative national surveys of dwellings are important to evaluate without bias the exposure of the general population to radon. In Italy, a representative national survey was conducted from 1989 to 1996, which involved about 5600 dwellings in 232 towns. Later on, some Regions carried out more detailed surveys, but a new national survey of dwellings is necessary in order to obtain a more thorough estimate of the radon concentration distribution over the Italian territory. The need to make this survey affordable led to the implementation of a new approach based on the collaboration between the Istituto Superiore di Sanità and a national company with workplaces and employees' homes throughout the country. The intent is to carry out a proxy of a population-representative survey by measuring radon concentration in the homes of a random sample of the company employees. The realisation of this survey was affordable thanks to the availability of a corporate e-mail address for each employee, an intranet service, and the company's internal mail service. A dedicated web procedure and e-questionnaires made it possible to automatically manage contact with employees and to collect their data, which was both cost- and time-saving. Using this e-mail contact approach, 53% of contacted employees consented to participate in the survey. Passive radon concentration measuring devices were distributed to about 7000 dwellings, using about 14000 CR-39 detectors (two measured rooms per dwelling). In order to reduce costs, the devices were exposed for 12 months instead of two consecutive 6-month periods (as in the former national survey). A first check of the actual representativeness of the sample was done by comparing characteristics of dwellings and occupants in the sample with corresponding data from the latest National Census. This was possible because the questions in the survey questionnaire were tailored to the categories adopted for the Census questionnaire. A preliminary
Machine learning techniques to select variable stars
Directory of Open Access Journals (Sweden)
García-Varela Alejandro
2017-01-01
Full Text Available In order to perform a supervised classification of variable stars, we propose and evaluate a set of six features extracted from the magnitude density of the light curves. They are used to train automatic classification systems using state-of-the-art classifiers implemented in the R statistical computing environment. We find that random forests are the most successful method for selecting variable stars.
Stable limits for sums of dependent infinite variance random variables
DEFF Research Database (Denmark)
Bartkiewicz, Katarzyna; Jakubowski, Adam; Mikosch, Thomas
2011-01-01
The aim of this paper is to provide conditions which ensure that the affinely transformed partial sums of a strictly stationary process converge in distribution to an infinite variance stable distribution. Conditions for this convergence to hold are known in the literature. However, most of these...
Asymptotics of sums of lognormal random variables with Gaussian copula
DEFF Research Database (Denmark)
Asmussen, Søren; Rojas-Nandayapa, Leonardo
2008-01-01
Let (Y1, ..., Yn) have a joint n-dimensional Gaussian distribution with a general mean vector and a general covariance matrix, and let Xi = eYi, Sn = X1 + ⋯ + Xn. The asymptotics of P (Sn > x) as n → ∞ are shown to be the same as for the independent case with the same lognormal marginals. In part...
Genetic variability of Indian yaks using random amplified ...
African Journals Online (AJOL)
Only five primers (ILO 526, OPAV 15, ILO 1127, ILO 1065 and ILO 876) out of the ten primers tried produced consistant polymorphic fingerprints. Of the 76 fingerprints produced, 49 were present in all types, 21 were individual specific and 6 were polymorphic for different types. The pair wise comparison studied for different ...
Generation of correlated finite alphabet waveforms using Gaussian random variables
Jardak, Seifallah; Ahmed, Sajid; Alouini, Mohamed-Slim
2014-01-01
Although the proposed scheme is general, the main focus of this paper is to generate finite alphabet waveforms for multiple-input multiple-output radar, where correlated waveforms are used to achieve desired beampatterns. © 2014 IEEE.
Fast analytical method for the addition of random variables
International Nuclear Information System (INIS)
Senna, V.; Milidiu, R.L.; Fleming, P.V.; Salles, M.R.; Oliveria, L.F.S.
1983-01-01
Using the minimal cut sets representation of a fault tree, a new approach to the method of moments is proposed in order to estimate confidence bounds on the top event probability. The method utilizes two or three moments either to fit a distribution (from the normal and lognormal families) or to evaluate bounds from standard inequalities (e.g. Markov, Tchebycheff). Examples indicate that the results obtained with the lognormal family are in good agreement with those obtained by Monte Carlo simulation.
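A hedged sketch of the two-moment idea under an independence assumption: propagate the mean and variance of a sum of lognormal cut-set probabilities analytically, fit a lognormal to those two moments, and compare an upper quantile with Monte Carlo simulation. The cut-set parameters are illustrative, not taken from the paper.

```python
import numpy as np

# Three (assumed independent) minimal-cut-set probabilities, each lognormal
# with invented log-mean and log-std-dev parameters.
rng = np.random.default_rng(5)
mu = np.array([-9.0, -10.0, -11.0])
sig = np.array([0.8, 1.0, 1.2])

m = np.exp(mu + sig**2 / 2)              # lognormal means
v = (np.exp(sig**2) - 1) * m**2          # lognormal variances
M, V = m.sum(), v.sum()                  # first two moments of the sum

# Fit a lognormal matching those two moments, and read off its 95th percentile.
s2 = np.log(1 + V / M**2)
mu_fit = np.log(M) - s2 / 2
q95_fit = np.exp(mu_fit + 1.645 * np.sqrt(s2))

# Monte-Carlo reference for the same sum.
samples = np.exp(mu + sig * rng.standard_normal((200000, 3))).sum(axis=1)
q95_mc = np.quantile(samples, 0.95)
print(q95_fit, q95_mc)                   # the moment fit tracks the simulation
```

This is the "fit a distribution" branch of the method; the inequality branch (Markov, Tchebycheff) would bound the same quantile from the moments alone without assuming a family.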
Representing Boolean Functions by Decision Trees
Chikalov, Igor
2011-01-01
A Boolean or discrete function can be represented by a decision tree. A compact form of decision tree named binary decision diagram or branching program is widely known in logic design [2, 40]. This representation is equivalent to other forms, and in some cases it is more compact than a table of values or even the formula [44]. Representing a function in the form of a decision tree allows applying graph algorithms for various transformations [10]. Decision trees and branching programs are used for effective hardware [15] and software [5] implementation of functions. For the implementation to be effective, the function representation should have minimal time and space complexity. The average depth of the decision tree characterizes the expected computing time, and the number of nodes in the branching program characterizes the number of functional elements required for implementation. Often these two criteria are incompatible, i.e. there is no solution that is optimal with respect to both time and space complexity. © Springer-Verlag Berlin Heidelberg 2011.
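As a toy illustration of representing a Boolean function by a decision tree, the sketch below builds a tree by naive Shannon expansion (splitting on variables in fixed order and collapsing constant subfunctions to leaves) for the three-input majority function. This is a simple didactic build, not a reduced ordered BDD.

```python
from itertools import product

def majority(x, y, z):                  # example Boolean function
    return (x and y) or (x and z) or (y and z)

def build(f, nvars, assignment=()):
    """Shannon-expansion decision tree: (var_index, low_child, high_child)."""
    if len(assignment) == nvars:
        return bool(f(*assignment))
    # If every completion agrees, the subtree collapses to one leaf.
    rest = nvars - len(assignment)
    values = {f(*assignment, *tail) for tail in product((0, 1), repeat=rest)}
    if len(values) == 1:
        return bool(values.pop())
    i = len(assignment)                 # split on the next variable
    return (i, build(f, nvars, assignment + (0,)),
               build(f, nvars, assignment + (1,)))

tree = build(majority, 3)

def evaluate(node, inputs):
    while isinstance(node, tuple):
        i, lo, hi = node
        node = hi if inputs[i] else lo
    return node

print(all(evaluate(tree, v) == bool(majority(*v))
          for v in product((0, 1), repeat=3)))   # tree agrees everywhere
```

The tree's depth along a path is the number of variable tests needed on that input, which is exactly the "computing time" criterion the text describes.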
Climate Change and the Representative Agent
Energy Technology Data Exchange (ETDEWEB)
Howarth, R.B. [Environmental Studies Program, Dartmouth College, Hanover, New Hampshire 03755 (United States)
2000-02-01
The artifice of an infinitely-lived representative agent is commonly invoked to balance the present costs and future benefits of climate stabilization policies. Since actual economies are populated by overlapping generations of finite-lived persons, this approach begs important questions of welfare aggregation. This paper compares the results of representative-agent and overlapping-generations models that are numerically calibrated based on standard assumptions regarding climate-economy interactions. Under two social choice rules - Pareto efficiency and classical utilitarianism - the models generate closely similar simulation results. In the absence of policies to redistribute income between present and future generations, efficient rates of carbon dioxide emissions abatement rise from 15 to 20% between the years 2000 and 2105. Under classical utilitarianism, in contrast, optimal control rates rise from 48 to 79% over this same period. 23 refs.
Meeting staff representatives of the European Agencies
Staff Association
2014-01-01
The AASC (Assembly of Agency Staff Committee) held its 27th Meeting of the specialized European Agencies on 26 and 27 May on the premises of the OHIM (Office for Harmonization in the Internal Market) in Alicante, Spain. Two representatives of the CERN Staff Association, in charge of External Relations, attended as observers. This participation is a useful complement to regular contacts we have with FICSA (Federation of International Civil Servants' Associations), which groups staff associations of the UN Agencies, and the annual CSAIO conferences (Conference of Staff Associations of International Organizations), where each Autumn representatives of international organizations based in Europe meet to discuss themes of common interest to better promote and defend the rights of the international civil servants. All these meetings allow us to remain informed on items that are directly or indirectly related to employment and social conditions of our colleagues in other international and Europ...
Data structures and apparatuses for representing knowledge
Hohimer, Ryan E; Thomson, Judi R; Harvey, William J; Paulson, Patrick R; Whiting, Mark A; Tratz, Stephen C; Chappell, Alan R; Butner, Robert S
2014-02-18
Data structures and apparatuses to represent knowledge are disclosed. The processes can comprise labeling elements in a knowledge signature according to concepts in an ontology and populating the elements with confidence values. The data structures can comprise knowledge signatures stored on computer-readable media. The knowledge signatures comprise a matrix structure having elements labeled according to concepts in an ontology, wherein the value of the element represents a confidence that the concept is present in an information space. The apparatus can comprise a knowledge representation unit having at least one ontology stored on a computer-readable medium, at least one data-receiving device, and a processor configured to generate knowledge signatures by comparing datasets obtained by the data-receiving devices to the ontologies.
Allelic variability in species and stocks of Lake Superior ciscoes (Coregoninae)
Todd, Thomas N.
1981-01-01
Starch gel electrophoresis was used as a means of recognizing species and stocks in Lake Superior Coregonus. Allelic variability at isocitrate dehydrogenase and glycerol-3-phosphate dehydrogenase loci was recorded for samples of lake herring (Coregonus artedii), bloater (C. hoyi), kiyi (C. kiyi), and shortjaw cisco (C. zenithicus) from five Lake Superior localities. The observed frequencies of genotypes within each subsample did not differ significantly from those expected on the basis of random mating, and suggested that each subsample represented either a random sample from a larger randomly mating population or an independent and isolated subpopulation within which mating was random. Significant contingency χ² values for comparisons between both localities and species suggested that more than one randomly mating population occurred among the Lake Superior ciscoes, but did not reveal how many such populations there were. In contrast to the genetic results of this study, morphology seems to be a better descriptor of cisco stocks, and identification of cisco stocks and species will still have to be based on morphological criteria until more data are forthcoming. Where several species are sympatric, management should strive to preserve the least abundant. Failure to do so could result in the extinction or depletion of the rarer forms.
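The random-mating check described can be reproduced in miniature with a chi-square goodness-of-fit test of observed genotype counts against Hardy-Weinberg expectations; the counts at a single two-allele locus below are invented for illustration, not the study's cisco data.

```python
import numpy as np
from scipy.stats import chisquare

# Invented genotype counts at one locus: AA, Aa, aa.
obs = np.array([52, 41, 7])
n = obs.sum()
p = (2 * obs[0] + obs[1]) / (2 * n)        # estimated frequency of allele A
exp = n * np.array([p**2, 2 * p * (1 - p), (1 - p)**2])

# One degree of freedom is lost estimating p from the same data.
stat, pval = chisquare(obs, exp, ddof=1)
print(pval > 0.05)                         # consistent with random mating
```

A non-significant result, as in the study's subsamples, means the genotype frequencies are compatible with a single randomly mating population; it cannot by itself distinguish that case from an isolated subpopulation that also mates at random, which is exactly the ambiguity the abstract notes.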
Using semantics for representing experimental protocols.
Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar
2017-11-13
An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is a need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the "SMART Protocols ontology", an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain-specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence-based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?". Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experiences and bring together the views of researchers managing protocols in their laboratory work. Website: https://smartprotocols.github.io/ .
The local brand representative in reseller networks
Gupta, Suraksha; Malhotra, Naresh K; Czinkota, Michael; Foroudi, Pantea
2016-01-01
This study investigates the characteristics of local individuals who represent a brand to its resellers, by first conceptualizing these characteristics using complexity theory and then testing the conceptualization. The research revealed that four characteristics, ‘native’, ‘entrepreneurial’, ‘advisor’, and ‘compatible’, are the main ones that influence reseller brand preferences. The study finds a link between reseller brand preference and reseller brand loyalty, which is useful for mana...
Citizen's initiatives and the representative system
International Nuclear Information System (INIS)
Guggenberger, B.; Kempf, U.
1978-01-01
This anthology, containing contributions from 19 sociologists, is a systematic investigation of the position, the possibilities and the effective reach of citizen's initiatives under the functional conditions of the parliamentary-representative system. It examines the intellectual and political surroundings, the sociological context, the institutional, political and judicial overall conditions, as well as the consequences of this movement for the political system of the Federal Republic of Germany as a whole. (orig.) [de
Yucca Mountain Climate Technical Support Representative
International Nuclear Information System (INIS)
Sharpe, Saxon E
2007-01-01
The primary objective of Project Activity ORD-FY04-012, 'Yucca Mountain Climate Technical Support Representative', was to provide the Office of Civilian Radioactive Waste Management (OCRWM) with expertise on past, present, and future climate scenarios and to support the technical elements of the Yucca Mountain Project (YMP) climate program. The Climate Technical Support Representative was to explain, defend, and interpret the YMP climate program to the various audiences during Site Recommendation and License Application. This technical support representative was to support DOE management in the preparation and review of documents, and to participate in comment response for the Final Environmental Impact Statement, the Site Recommendation Hearings, the NRC Sufficiency Comments, and other forums as designated by DOE management. Because the activity was terminated 12 months early and experienced a 27% reduction in budget, it was not possible to complete all components of the tasks as originally envisioned. Activities not completed include the qualification of climate datasets and the production of a qualified technical report. The following final report is an unqualified summary of the activities that were completed given the reduced time and funding.
Diversity and representativeness: two key factors
Staff Association
2013-01-01
In the past few weeks many of you have filled out the questionnaire for preparing the upcoming Five-yearly review. Similarly, Staff Association members have elected their delegates to the Staff Council for the coming two years. Once again we would like to thank all those who have taken the time and effort to make their voice heard on these two occasions. Elections to the Staff Council Below we publish the new Staff Council with its forty delegates who will represent in 2014 and 2015 all CERN staff in the discussions with Management and Member States in the various bodies and committees. Therefore it is important that the Staff Council represents as far as possible the diversity of the CERN population. By construction, the election process with its electoral colleges and two-step voting procedure guarantees that all Departments, even the small ones, and the various staff categories are correctly represented. Figure 1 shows the participation rate in the elections. The average rate is just above 52 %, with ...
Soil variability in engineering applications
Vessia, Giovanna
2014-05-01
Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity of physical and mechanical properties. They can be measured by in-field and laboratory testing. The heterogeneity concerns different values of litho-technical parameters pertaining to similar lithological units placed close to each other. On the contrary, the variability is inherent to the formation and evolution processes experienced by each geological unit (homogeneous geomaterials on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle, the cohesion, among others. These spatial variations must be managed by engineering models to accomplish reliable design of structures and infrastructures. Matheron (1962) introduced Geostatistics as the most comprehensive tool to manage spatial correlation of parameter measures used in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) developed the first pioneering attempts to describe and manage the inherent variability in geomaterials, although Terzaghi (1943) had already highlighted that spatial fluctuations of physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to Fractal Theory. In the same years, Vanmarcke (1983) proposed the Random Field Theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measurement fluctuations of physical parameters are interpreted through the spatial variability structure, consisting of the correlation function and the scale of fluctuation. Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random
Directory of Open Access Journals (Sweden)
Zhensheng Wang
2017-02-01
Full Text Available The spatial variation of geographical phenomena is a classical problem in spatial data analysis and can provide insight into underlying processes. Traditional exploratory methods mostly depend on the planar distance assumption, but many spatial phenomena are constrained to a subset of Euclidean space. In this study, we apply a method based on a hierarchical Bayesian model to analyse the spatial variation of network-constrained phenomena represented by a link attribute, in conjunction with two experiments based on a simplified hypothetical network and a complex road network in Shenzhen that includes 4212 urban facility points of interest (POIs) for leisure activities. Then, the methods named local indicators of network-constrained clusters (LINCS) are applied to explore local spatial patterns in the given network space. The proposed method is designed for phenomena that are represented by attribute values of network links and is capable of removing part of the random variability resulting from small-sample estimation. The effects of spatial dependence and the base distribution are also considered in the proposed method, which could be applied in the fields of urban planning and safety research.
Variable importance in latent variable regression models
Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.
2014-01-01
The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable
Examining the "Veggie" personality: Results from a representative German sample.
Pfeiler, Tamara M; Egloff, Boris
2018-01-01
An increasing proportion of people choose to follow a vegetarian diet. To date, however, little is known about if and how individual differences in personality relate to following a vegetarian diet. In the two studies presented here, we aimed to (1) estimate the prevalence of self-defined vegetarians in two waves of a German representative sample (N = 4496 and 5125, respectively), (2) analyze the effect of socio-demographic variables on dietary behavior, and (3) examine individual differences between vegetarians and meat eaters in personality traits, political attitudes, and health-related variables. In Study 1, a strict definition of vegetarians was used, while in Study 2 the definition was laxer, to include also individuals who only predominantly followed a vegetarian diet. The prevalence of self-defined vegetarians was 2.74% in Study 1, and 5.97% in Study 2. Participants who were female, younger, and more educated were more likely to report following a vegetarian diet in both studies, and vegetarians had higher income as compared to meat eaters in Study 2. We also found differences between vegetarians and meat eaters with regard to personality traits, political attitudes, and health-related variables. Stepwise logistic regression analyses showed a unique effect beyond socio-demographic variables for openness (Studies 1 and 2), conscientiousness (Study 1), trust (Study 2), conservatism (Studies 1 and 2), and level of interest in politics (Study 1) on diet: Individuals with higher scores in openness and political interest had a higher probability of being vegetarian, whereas people with higher scores in conscientiousness and conservatism had a smaller likelihood of being vegetarian. We conclude that there are individual differences between vegetarians and meat eaters in socio-demographics, personality traits, and political attitudes. Copyright © 2017 Elsevier Ltd. All rights reserved.
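A prevalence figure such as the 2.74% reported in Study 1 carries sampling uncertainty even in a large representative sample. A minimal sketch of a normal-approximation (Wald) 95% confidence interval around that estimate; the interval construction is a standard textbook approximation, not the authors' analysis:

```python
import math

# Sketch: a 95% normal-approximation (Wald) confidence interval for a
# prevalence estimated from a representative sample, using the Study 1
# figures quoted above (2.74% vegetarians among N = 4496 respondents).

def prevalence_ci(p_hat, n, z=1.96):
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)   # standard error of a proportion
    return p_hat - z * se, p_hat + z * se

low, high = prevalence_ci(0.0274, 4496)
```

The interval spans roughly 2.3% to 3.2%, which is why the two studies' prevalence estimates can differ noticeably even before the definition of "vegetarian" changes.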
Entropy as a collective variable
Parrinello, Michele
Sampling complex free energy surfaces that exhibit long-lived metastable states separated by kinetic bottlenecks is one of the most pressing issues in the atomistic simulation of matter. Not surprisingly, many solutions to this problem have been suggested. Many of them are based on the identification of appropriate collective variables that span the manifold of the slowly varying modes of the system. While much effort has been put into devising and even constructing on the fly appropriate collective variables, there is still a cogent need for simple, generic, physically transparent, and yet effective collective variables. Motivated by the physical observation that in many cases transitions between one metastable state and another result from a trade-off between enthalpy and entropy, we introduce collective variables that are able to represent these two physical properties in a simple way. We use these variables in the context of the recently introduced variationally enhanced sampling and apply them with success to the simulation of crystallization from the liquid and to conformational transitions in proteins. Department of Chemistry and Applied Biosciences, ETH Zurich, and Facolta' di Informatica, Istituto di Scienze Computazionali, Universita' della Svizzera Italiana, Via G. Buffi 13, 6900 Lugano, Switzerland.
REPRESENTING GENDER IN COMMUNIST AND POSTCOMMUNIST ROMANIA
Directory of Open Access Journals (Sweden)
Roxana - Elisabeta MARINESCU
2017-05-01
Full Text Available This article examines various representations of gender in communist and postcommunist Romania, with a focus on how women and men were both led towards and sometimes forced into gender roles better suited to the state policies of the respective contexts than to their own interests. Over the years, the public agenda of the state and/or party(ies), from women’s liberation through gender equality to equal opportunities, has met real Romanian women’s and men’s needs to different extents and with variable success.
Stability and complexity of small random linear systems
Hastings, Harold
2010-03-01
We explore the stability of small random linear systems, typically involving 10-20 variables, motivated by the dynamics of the world trade network and the US and Canadian power grid.
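The stability question for such systems reduces to the sign of the largest real part among the eigenvalues of the system matrix. A hedged sketch in the spirit of May-type random stability analysis; the construction (uniform self-damping on the diagonal, Gaussian couplings off-diagonal) is an assumption for illustration, not the author's model:

```python
import numpy as np

# Sketch of a stability check for a small random linear system dx/dt = A x:
# the system is stable when every eigenvalue of A has negative real part.
# Diagonal self-damping -d with random off-diagonal couplings is an
# illustrative May-type construction.

rng = np.random.default_rng(1)

def random_system(n=15, coupling=0.1, damping=1.0):
    A = coupling * rng.standard_normal((n, n))
    np.fill_diagonal(A, -damping)
    return A

def is_stable(A):
    return np.max(np.linalg.eigvals(A).real) < 0.0

weak = random_system(coupling=0.05)    # weak interactions: expected stable
strong = random_system(coupling=2.0)   # strong interactions: typically unstable
```

By the circular-law heuristic, stability is lost once the coupling strength times the square root of the system size exceeds the self-damping, which is why the two examples above behave differently.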
Travel time variability and rational inattention
DEFF Research Database (Denmark)
Fosgerau, Mogens; Jiang, Gege
2017-01-01
This paper sets up a rational inattention model for the choice of departure time for a traveler facing random travel time. The traveler chooses how much information to acquire about the travel time out-come before choosing departure time. This reduces the cost of travel time variability compared...
Directory of Open Access Journals (Sweden)
Steiner Adrian
2006-12-01
Full Text Available Abstract Background Delayed uterine involution has negative effects on the fertility of cows; use of prostaglandin F2alpha alone as a single treatment has not been shown to consistently improve fertility. Combined administration of PGF2alpha and PGE2 increased uterine pressure in healthy cows. We hypothesized that the combination of both prostaglandins would accelerate uterine involution and have, therefore, a positive effect on fertility variables. In commercial dairy farming, the benefit of a single post partum combined prostaglandin treatment should be demonstrated. Methods 383 cows from commercial dairy farms were included in this study. Uterine size and secretion were evaluated at treatment 21–35 days post partum and 14 days later. Cows were randomly allocated to one of three treatment groups: PGF2alpha and PGE2, PGF2alpha or placebo. For every animal participating in the study, the following reproduction variables were recorded: interval from calving to first insemination, days open, number of artificial inseminations (AI) to conception; subsequent treatment of uterus, subsequent treatment of ovaries. Plasma progesterone level at time of treatment was used as a covariable. For continuous measurements, analysis of variance was performed. Fisher's exact test for categorical non-ordered data and exact Kruskal-Wallis test for ordered data were used; pairwise group comparisons with Bonferroni adjustment of significance level were performed. Results There was no significant difference among treatment groups in uterine size. Furthermore, there was no significant difference among treatments concerning days open, number of AI, and subsequent treatment of uterus and ovaries. Days from calving to first insemination tended to be shorter for cows with low progesterone level given PGF2alpha and PGE2 in combination than for the placebo-group (P = 0.024). Conclusion The results of this study indicate that the administration of PGF2alpha or a combination
Hirsbrunner, Gaby; Burkhardt, Heinz W; Steiner, Adrian
2006-12-21
Delayed uterine involution has negative effects on the fertility of cows; use of prostaglandin F2alpha alone as a single treatment has not been shown to consistently improve fertility. Combined administration of PGF2alpha and PGE2 increased uterine pressure in healthy cows. We hypothesized that the combination of both prostaglandins would accelerate uterine involution and have, therefore, a positive effect on fertility variables. In commercial dairy farming, the benefit of a single post partum combined prostaglandin treatment should be demonstrated. 383 cows from commercial dairy farms were included in this study. Uterine size and secretion were evaluated at treatment 21-35 days post partum and 14 days later. Cows were randomly allocated to one of three treatment groups: PGF2alpha and PGE2, PGF2alpha or placebo. For every animal participating in the study, the following reproduction variables were recorded: interval from calving to first insemination, days open, number of artificial inseminations (AI) to conception; subsequent treatment of uterus, subsequent treatment of ovaries. Plasma progesterone level at time of treatment was used as a covariable. For continuous measurements, analysis of variance was performed. Fisher's exact test for categorical non-ordered data and exact Kruskal-Wallis test for ordered data were used; pairwise group comparisons with Bonferroni adjustment of significance level were performed. There was no significant difference among treatment groups in uterine size. Furthermore, there was no significant difference among treatments concerning days open, number of AI, and subsequent treatment of uterus and ovaries. Days from calving to first insemination tended to be shorter for cows with low progesterone level given PGF2alpha and PGE2 in combination than for the placebo-group (P = 0.024). The results of this study indicate that the administration of PGF2alpha or a combination of PGF2alpha and PGE2 21 to 35 days post partum had no beneficial
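For intuition, Fisher's exact test used above for non-ordered categorical data can be sketched as a hypergeometric computation on a 2x2 table (e.g. treated vs. placebo animals needing a subsequent uterus treatment). The counts below are hypothetical, not the study's data:

```python
from math import comb

# Sketch of Fisher's exact test for a 2x2 contingency table: the two-sided
# p-value is the sum of hypergeometric probabilities of all tables (with the
# same margins) no more probable than the observed one. Counts are hypothetical.

def fisher_exact_2x2(a, b, c, d):
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def hyper(x):
        # Probability of x successes in the top-left cell given fixed margins.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = hyper(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(hyper(x) for x in range(lo, hi + 1) if hyper(x) <= p_obs + 1e-12)

p_balanced = fisher_exact_2x2(5, 5, 5, 5)   # no association: p = 1
p_skewed = fisher_exact_2x2(9, 1, 1, 9)     # strong association: small p
```

For real analyses a library routine (e.g. an implementation of Fisher's exact test in a statistics package) would normally be used instead of this hand-rolled version.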
A random matrix approach to VARMA processes
International Nuclear Information System (INIS)
Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata
2010-01-01
We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q1, q2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q1 > 1 and q2 > 1.
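For the simplest case without VARMA structure (i.i.d. data), the N/T-fixed regime can be illustrated numerically: the sample covariance eigenvalues concentrate in the Marchenko-Pastur bulk. A sketch with illustrative sizes, not the paper's Monte Carlo setup:

```python
import numpy as np

# Sketch of the N/T-fixed regime for i.i.d. data (no VARMA structure): with
# N, T large and c = N/T fixed, the eigenvalues of the sample covariance
# matrix fall inside the Marchenko-Pastur bulk [(1-sqrt(c))^2, (1+sqrt(c))^2].

rng = np.random.default_rng(42)

N, T = 100, 400                      # c = N / T = 0.25
X = rng.standard_normal((N, T))      # N variables, T time measurements
C = X @ X.T / T                      # sample covariance matrix
eigs = np.linalg.eigvalsh(C)

c = N / T
mp_low, mp_high = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
```

For VARMA-generated data, the temporal correlations deform this bulk, which is exactly the effect the FRV calculus in the paper computes analytically.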
Exploring Representativeness and Informativeness for Active Learning.
Du, Bo; Wang, Zengmao; Zhang, Lefei; Zhang, Liangpei; Liu, Wei; Shen, Jialie; Tao, Dacheng
2017-01-01
How can we find a general way to choose the most suitable samples for training a classifier? Even with very limited prior information? Active learning, which can be regarded as an iterative optimization procedure, plays a key role to construct a refined training set to improve the classification performance in a variety of applications, such as text analysis, image recognition, social network modeling, etc. Although combining representativeness and informativeness of samples has been proven promising for active sampling, state-of-the-art methods perform well under certain data structures. Then can we find a way to fuse the two active sampling criteria without any assumption on data? This paper proposes a general active learning framework that effectively fuses the two criteria. Inspired by a two-sample discrepancy problem, triple measures are elaborately designed to guarantee that the query samples not only possess the representativeness of the unlabeled data but also reveal the diversity of the labeled data. Any appropriate similarity measure can be employed to construct the triple measures. Meanwhile, an uncertain measure is leveraged to generate the informativeness criterion, which can be carried out in different ways. Rooted in this framework, a practical active learning algorithm is proposed, which exploits a radial basis function together with the estimated probabilities to construct the triple measures and a modified best-versus-second-best strategy to construct the uncertain measure, respectively. Experimental results on benchmark datasets demonstrate that our algorithm consistently achieves superior performance over the state-of-the-art active learning algorithms.
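The uncertainty side of the framework, the modified best-versus-second-best strategy, can be illustrated with a toy sketch. The class probabilities below are made up, and the paper's actual algorithm combines this margin with the triple representativeness measures:

```python
# Sketch of the best-versus-second-best (BvSB) uncertainty measure: for each
# unlabeled sample, take the gap between its two highest class probabilities;
# the smallest gap marks the most ambiguous, hence most informative, query.
# The sample names and probabilities are illustrative assumptions.

def bvsb_margin(probs):
    top_two = sorted(probs, reverse=True)[:2]
    return top_two[0] - top_two[1]

unlabeled = {
    "sample_a": [0.90, 0.05, 0.05],   # confident prediction -> large margin
    "sample_b": [0.45, 0.40, 0.15],   # ambiguous prediction -> small margin
    "sample_c": [0.70, 0.20, 0.10],
}

query = min(unlabeled, key=lambda s: bvsb_margin(unlabeled[s]))
```

Here the classifier is least sure about `sample_b`, so a pure-uncertainty learner would query its label first; the representativeness measures would then re-weight such candidates against the unlabeled pool.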
Identifying optimal models to represent biochemical systems.
Directory of Open Access Journals (Sweden)
Mochamad Apri
Full Text Available Biochemical systems involving a high number of components with intricate interactions often lead to complex models containing a large number of parameters. Although a large model could describe in detail the mechanisms that underlie the system, its very large size may hinder us in understanding the key elements of the system. Also in terms of parameter identification, large models are often problematic. Therefore, a reduced model may be preferred to represent the system. Yet, in order to efficaciously replace the large model, the reduced model should have the same ability as the large model to produce reliable predictions for a broad set of testable experimental conditions. We present a novel method to extract an "optimal" reduced model from a large model to represent biochemical systems by combining a reduction method and a model discrimination method. The former assures that the reduced model contains only those components that are important to produce the dynamics observed in given experiments, whereas the latter ensures that the reduced model gives a good prediction for any feasible experimental conditions that are relevant to answer questions at hand. These two techniques are applied iteratively. The method reveals the biological core of a model mathematically, indicating the processes that are likely to be responsible for certain behavior. We demonstrate the algorithm on two realistic model examples. We show that in both cases the core is substantially smaller than the full model.
Anthropomorphic Networks as Representatives of Global Consciousness
Directory of Open Access Journals (Sweden)
Sergii Yahodzinskyi
2018-02-01
Full Text Available The phenomenon of global consciousness is analysed, and its cultural-historical and civilizational dimensions are substantiated. It is shown that the concepts of planetary consciousness, global thinking and the noosphere were first described in the philosophy of cosmism. In modern conditions, however, the ideas of the representatives of the naturalistic philosophical direction of cosmism have not lost their heuristic potential. They can be reconsidered afresh in the context of emerging anthropomorphic (human-dimension) networks. It is argued that global consciousness is a component of the social and cultural potential of global information networks, defining the prospects of human progress in the 21st century. Relying on the methodology of structural and functional analysis, the author concludes that global networks are acquiring the status of representatives of global consciousness. It is in networks that all relevant information is concentrated, from statistical data to scientific and technical information. Access to these data is limited by human abilities and is realized in the form of discrete requests using heuristic algorithms of information processing. It is suggested that modern society, being a self-organized system, seeks to attain a stable condition. Anthropomorphic networks are a means of decreasing social entropy, which grows as a result of any kind of human intervention in social processes. Thus, for the first time, humans are challenged by their own intellect and their ability to create, discover and control.
How Are Feedbacks Represented in Land Models?
Directory of Open Access Journals (Sweden)
Yang Chen
2016-09-01
Full Text Available Land systems are characterised by many feedbacks that can result in complex system behaviour. We defined feedbacks as the two-way influences between the land use system and a related system (e.g., climate, soils and markets), both of which are encompassed by the land system. Land models that include feedbacks thus probably more accurately mimic how land systems respond to, e.g., policy or climate change. However, representing feedbacks in land models is a challenge. We reviewed articles incorporating feedbacks into land models and analysed each with predefined indicators. We found that (1) most modelled feedbacks couple land use systems with transport, soil and market systems, while only a few include feedbacks between land use and social systems or climate systems; (2) equation-based land use models that follow a top-down approach prevail; and (3) feedbacks’ effects on system behaviour remain relatively unexplored. We recommend that land system modellers (1) consider feedbacks between land use systems and social systems; (2) adopt (bottom-up) approaches suited to incorporating spatial heterogeneity and better representing land use decision-making; and (3) pay more attention to nonlinear system behaviour and its implications for land system management and policy.
May 2013 Council of Chapter Representatives Notes
Directory of Open Access Journals (Sweden)
Robbins RA
2013-05-01
Full Text Available No abstract available. Article truncated at 150 words. The Council of Chapter Representatives met in conjunction with the ATS meeting in Philadelphia on May 18, 2013. Roll Call: the meeting was called to order at 11 AM. Representatives from Arizona, California, DC Metro, Louisiana, Michigan, New Mexico, New York, Oregon, and Rhode Island were in attendance, and by telephone from Washington. Chapter Updates: information on chapter activities and a chapter brochure. There are currently 19 active chapters. Most are having annual meetings. Advocacy: Gary Ewart from ATS Government Relations gave a presentation on Washington activities. Highlights included activities on the SGR, a number of air pollution regulations and a letter campaign advocating regulation of cigars. ATS President 2013-14: vision for the coming year. Patricia Finn gave a summary of what she hopes to accomplish over the next year. The theme of her presidency will be health equality. ATS Executive Director: update. Steve Crane gave a positive presentation on the …
Response variability in balanced cortical networks
DEFF Research Database (Denmark)
Lerchner, Alexander; Ursta, C.; Hertz, J.
2006-01-01
We study the spike statistics of neurons in a network with dynamically balanced excitation and inhibition. Our model, intended to represent a generic cortical column, comprises randomly connected excitatory and inhibitory leaky integrate-and-fire neurons, driven by excitatory input from an external...
Protecting chips against hold time violations due to variability
Neuberger, Gustavo; Reis, Ricardo
2013-01-01
With the development of Very-Deep Sub-Micron technologies, process variability is becoming increasingly significant and is a major issue in the design of complex circuits. Process variability is the statistical variation of process parameters, meaning that these parameters do not always have the same value, but become random variables, with a given mean value and standard deviation. This effect can lead to several issues in digital circuit design. The logical consequence of this parameter variation is that circuit characteristics, such as delay and power, also become random variables. Becaus
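The consequence described above, delay becoming a random variable, can be sketched by modeling a path delay as Gaussian and computing the probability of a hold-time violation. All numbers below are illustrative assumptions:

```python
import math

# Sketch: treat a path delay as a random variable, delay ~ Normal(mu, sigma),
# and estimate the probability that it falls below a hold-time limit.
# The 100 ps mean, 10 ps sigma and 70 ps limit are illustrative assumptions.

def violation_probability(mu, sigma, hold_limit):
    # P(delay < hold_limit) via the standard normal CDF expressed with erf.
    z = (hold_limit - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A path with 100 ps mean delay and 10 ps sigma against a 70 ps hold requirement
# sits three standard deviations above the limit:
p_violation = violation_probability(mu=100.0, sigma=10.0, hold_limit=70.0)
```

Across millions of paths even a per-path probability of about 0.1% yields many expected violations, which is why statistical margins or protection circuits are needed.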
Energy Technology Data Exchange (ETDEWEB)
Zentner, I. [IMSIA, UMR EDF-ENSTA-CNRS-CEA 9219, Université Paris-Saclay, 828 Boulevard des Maréchaux, 91762 Palaiseau Cedex (France); Ferré, G., E-mail: gregoire.ferre@ponts.org [CERMICS – Ecole des Ponts ParisTech, 6 et 8 avenue Blaise Pascal, Cité Descartes, Champs sur Marne, 77455 Marne la Vallée Cedex 2 (France); Poirion, F. [Department of Structural Dynamics and Aeroelasticity, ONERA, BP 72, 29 avenue de la Division Leclerc, 92322 Chatillon Cedex (France); Benoit, M. [Institut de Recherche sur les Phénomènes Hors Equilibre (IRPHE), UMR 7342 (CNRS, Aix-Marseille Université, Ecole Centrale Marseille), 49 rue Frédéric Joliot-Curie, BP 146, 13384 Marseille Cedex 13 (France)
2016-06-01
In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields given a database is proposed. It is based on two successive biorthogonal decompositions aimed at representing spatio-temporal stochastic fields. The proposed double expansion makes it possible to build the model even for large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high-dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).
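A standard way to draw samples from a Gaussian kernel estimator (sketched here with invented 1-D data and bandwidth, not the paper's setup or its high-dimensional case) is to resample a stored data point and add kernel noise:

```python
import random
import statistics

# Hedged sketch: to simulate from a Gaussian kernel density estimate
# fitted to data, draw a data point uniformly at random, then add
# Gaussian noise with the kernel bandwidth h. The data and bandwidth
# below are invented; the paper's estimator is not detailed here.
random.seed(1)

data = [random.gauss(0.0, 1.0) for _ in range(500)]   # stand-in "database"
h = 0.3                                               # kernel bandwidth

def sample_kde(data, h):
    return random.choice(data) + random.gauss(0.0, h)

new_samples = [sample_kde(data, h) for _ in range(5000)]
# In expectation the simulated variance is var(data) + h**2.
print(statistics.pvariance(new_samples))
```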
Random broadcast on random geometric graphs
Energy Technology Data Exchange (ETDEWEB)
Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY
2009-01-01
In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability: (i) the RGG is connected, or (ii) the RGG contains a giant component. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
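The push model quoted in the abstract can be simulated directly. Below is a hedged pure-Python sketch on a small RGG; n and the radius r are illustrative choices, with r set above the usual connectivity threshold of order sqrt(log n / n):

```python
import math
import random

# Sketch of the push broadcast model described in the abstract, run on a
# random geometric graph: n nodes placed uniformly in the unit square,
# with an edge between any two nodes closer than radius r.
random.seed(2)

n = 200
r = 2.0 * math.sqrt(math.log(n) / n)   # above the connectivity threshold
pts = [(random.random(), random.random()) for _ in range(n)]
adj = [[] for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if math.dist(pts[i], pts[j]) <= r:
            adj[i].append(j)
            adj[j].append(i)

# Push broadcast: every informed node informs one random neighbour per round.
informed = {0}
rounds = 0
while len(informed) < n and rounds < 10 * n:
    for u in list(informed):          # snapshot: synchronous rounds
        if adj[u]:
            informed.add(random.choice(adj[u]))
    rounds += 1

print(f"informed {len(informed)}/{n} nodes in {rounds} rounds")
```

If the sampled graph happens to be disconnected, the broadcast covers exactly the component of the start node, matching regime (ii).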
Quantumness, Randomness and Computability
International Nuclear Information System (INIS)
Solis, Aldo; Hirsch, Jorge G
2015-01-01
Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non-signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputability and its implications in physics. (paper)
Gaussian random bridges and a geometric model for information equilibrium
Mengütürk, Levent Ali
2018-03-01
The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T,0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.
Energy Technology Data Exchange (ETDEWEB)
Sudhoff, M; Lamba, M; Kumar, N; Ward, A; Elson, H [University of Cincinnati, Cincinnati, OH (United States)
2015-06-15
Purpose: To systematically characterize inter-fraction breast setup variability and determine its implications for delivered dose. Methods: Weekly port films were used to characterize breast setup variability. Five evenly spaced representative positions across the contour of each breast were chosen on the electronic port film in reference to the graticule, and window and level were set such that the skin surface of the breast was visible. Measurements from the skin surface to the treatment field edge were taken on each port film at each position and compared to the planning DRR, quantifying the variability. The systematic measurement technique was repeated for all port films for 20 recently treated breast cancer patients. Measured setup variability for each patient was modeled as a normal distribution. The distribution was randomly sampled from the model and applied as isocentric shifts in the treatment planning computer, representing setup variability for each fraction. Dose was calculated for each shifted fraction and summed to obtain DVHs and BEDs that modeled the dose with daily setup variability. Patients were categorized into relevant groupings chosen to investigate the rigorousness of immobilization types, treatment techniques, and inherent anatomical difficulties. Mean position differences and dosimetric differences were evaluated between planned and delivered doses. Results: The setup variability was found to follow a normal distribution, with mean position differences between the DRR and port film between −8.6 and +3.5 mm and a sigma range of 5.3–9.8 mm. Setup position was not found to be significantly different from zero. The mean seroma or whole-breast PTV dosimetric difference, calculated as BED, ranged from −0.23 to +1.13 Gy. Conclusion: A systematic technique to quantify and model setup variability was used to calculate the dose in 20 breast cancer patients including variable setup. No statistically significant PTV or OAR BED differences were found between
Variability of fractal dimension of solar radio flux
Bhatt, Hitaishi; Sharma, Som Kumar; Trivedi, Rupal; Vats, Hari Om
2018-04-01
In the present communication, the variation of the fractal dimension of solar radio flux is reported. Solar radio flux observations on a day-to-day basis at 410, 1415, 2695, 4995, and 8800 MHz are used in this study. The data were recorded at Learmonth Solar Observatory, Australia, from 1988 to 2009, covering an epoch of two solar activity cycles (22 yr). The fractal dimension is calculated for the listed frequencies for this period. The fractal dimension, being a measure of randomness, represents the variability of solar radio flux at shorter time-scales. The contour plot of fractal dimension on a grid of years versus radio frequency suggests a high correlation with solar activity. The increase of fractal dimension with frequency suggests that randomness increases towards the inner corona. This study also shows that the low frequencies are more affected by solar activity (at low frequency the fractal-dimension difference between solar maximum and solar minimum is 0.42), whereas the higher frequencies are less affected (here the difference is 0.07). A good positive correlation is found between the fractal dimension averaged over all frequencies and the yearly averaged sunspot number (Pearson's coefficient is 0.87).
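The abstract treats fractal dimension as a randomness measure of a time series. The paper's estimator is not specified here; as a hedged stand-in, the sketch below implements Higuchi's method, a common fractal-dimension estimator for 1-D signals, and checks it on two extremes: a smooth ramp (dimension near 1) and white noise (dimension near 2).

```python
import math
import random

# Hedged sketch: Higuchi's fractal-dimension estimator for a 1-D signal.
# For each scale k, curve lengths L(k) are averaged over k offsets; the
# dimension is the slope of log L(k) versus log(1/k).
def higuchi_fd(x, kmax=8):
    n = len(x)
    logs = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            num = math.floor((n - 1 - m) / k)   # number of increments
            if num == 0:
                continue
            length = sum(abs(x[i] - x[i - k]) for i in range(m + k, n, k))
            lengths.append(length * (n - 1) / (num * k) / k)
        logs.append((math.log(1.0 / k), math.log(sum(lengths) / len(lengths))))
    # Least-squares slope of log L(k) versus log(1/k) is the dimension.
    mx = sum(p[0] for p in logs) / len(logs)
    my = sum(p[1] for p in logs) / len(logs)
    return (sum((p[0] - mx) * (p[1] - my) for p in logs)
            / sum((p[0] - mx) ** 2 for p in logs))

random.seed(3)
line = list(range(1000))                           # smooth signal, FD near 1
noise = [random.gauss(0, 1) for _ in range(1000)]  # white noise, FD near 2
print(higuchi_fd(line), higuchi_fd(noise))
```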
Using resource graphs to represent conceptual change
Directory of Open Access Journals (Sweden)
Michael C. Wittmann
2006-08-01
Full Text Available We introduce resource graphs, a representation of linked ideas used when reasoning about specific contexts in physics. Our model is consistent with previous descriptions of coordination classes and resources. It represents mesoscopic scales that are neither knowledge-in-pieces nor large-scale concepts. We use resource graphs to describe several forms of conceptual change: incremental, cascade, wholesale, and dual construction, giving evidence for each from the physics education research literature. Where possible, we compare our representation to models used by other researchers. Building on our representation, we analyze another form of conceptual change, differentiation, and suggest several experimental studies that would help understand the differences between reform-based curricula.
STATISTICAL MODELS OF REPRESENTING INTELLECTUAL CAPITAL
Directory of Open Access Journals (Sweden)
Andreea Feraru
2016-06-01
Full Text Available This article, entitled Statistical Models of Representing Intellectual Capital, approaches and analyses the concept of intellectual capital, as well as the main models which can support entrepreneurs/managers in evaluating and quantifying the advantages of intellectual capital. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this article we survey the classical static models: Sveiby, Edvinsson, Balanced Scorecard, as well as the canonical model of intellectual capital. Among the group of static models for evaluating organisational intellectual capital, the canonical model stands out. This model enables the structuring of organisational intellectual capital into human capital, structural capital and relational capital. Although the model is widely used, it is a static one and can thus create a series of errors in the process of evaluation, because the three entities mentioned above are not independent from the viewpoint of their contents, as any logic of structuring complex entities requires.
Towards a representative periphytic diatom sample
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available The need to acquire a representative periphytic diatom sample for river water quality monitoring has been recognised in the development of existing diatom indices, important in the development and employment of diatom monitoring tools for the Water Framework Directive. In this study, a nested design with replication is employed to investigate the magnitude of variation in diatom biomass, composition and Trophic Diatom Index at varying scales within a small chalk river. The study shows that the use of artificial substrates may not result in diatom communities that are typical of the surrounding natural substrates. Periphytic diatom biomass and composition varies between artificial and natural substrates, riffles and glides and between two stretches of the river channel. The study also highlights the existence of high variation in diatom frustule frequency and biovolume at the individual replicate scale which may have implications for the use of diatoms in routine monitoring.
Multiple Imputation of Predictor Variables Using Generalized Additive Models
de Jong, Roel; van Buuren, Stef; Spiess, Martin
2016-01-01
The sensitivity of multiple imputation methods to deviations from their distributional assumptions is investigated using simulations, where the parameters of scientific interest are the coefficients of a linear regression model, and values in predictor variables are missing at random. The
How random is a random vector?
Eliazar, Iddo
2015-12-01
Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" -the square root of the generalized variance-is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" -a derivative of the Wilks standard deviation-is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams"-tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
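The two quantities the abstract names are straightforward to compute: the generalized variance is the determinant of the covariance matrix, and the Wilks standard deviation is its square root. A minimal pure-Python sketch for a 2-D random vector with an invented correlation:

```python
import math
import random

# Sketch of the quantities discussed in the abstract: the "generalized
# variance" is det(covariance matrix) and the "Wilks standard deviation"
# is its square root. Shown for a 2-D random vector with invented
# correlation rho; the 2x2 case keeps the determinant explicit.
random.seed(4)

rho = 0.6
xs, ys = [], []
for _ in range(20000):
    u = random.gauss(0, 1)
    v = random.gauss(0, 1)
    xs.append(u)
    ys.append(rho * u + math.sqrt(1 - rho * rho) * v)  # correlation rho

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (len(a) - 1)

c11, c22, c12 = cov(xs, xs), cov(ys, ys), cov(xs, ys)
gen_var = c11 * c22 - c12 * c12     # det of the 2x2 covariance matrix
wilks_sd = math.sqrt(gen_var)
# For unit variances and correlation rho, det = 1 - rho**2 = 0.64, so the
# Wilks standard deviation should be near 0.8: stronger correlation
# shrinks it, matching its reading as an overall randomness measure.
print(wilks_sd)
```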
An analysis of spatial representativeness of air temperature monitoring stations
Liu, Suhua; Su, Hongbo; Tian, Jing; Wang, Weizhen
2018-05-01
Surface air temperature is an essential variable for monitoring the atmosphere, and it is generally acquired at meteorological stations that can provide information about only a small area within a radius of r m (the r-neighborhood) of the station; this radius is called the representable radius. In studies on a local scale, ground-based observations of surface air temperature obtained from scattered stations are usually interpolated using a variety of methods without ascertaining their effectiveness. Thus, it is necessary to evaluate the spatial representativeness of ground-based observations of surface air temperature before conducting studies on a local scale. The present study used remote sensing data to estimate the spatial distribution of surface air temperature using the advection-energy balance for air temperature (ADEBAT) model. Two target stations in the study area were selected to conduct an analysis of spatial representativeness. The results showed that one station (AWS 7) had a representable radius of about 400 m with a possible error of less than 1 K, while the other station (AWS 16) had a representable radius of about 250 m. The representable radius was large when the heterogeneity of land cover around the station was small.
In search of a representative sample of residential building work.
Lobb, Brenda; Woods, Gregory R
2012-09-01
Most research investigating injuries in construction work is limited by reliance on work samples unrepresentative of the multiple, variable-cycle tasks involved, resulting in incomplete characterisation of ergonomic exposures. In this case study, a participatory approach was used including hierarchical task analysis and site observations of a typical team of house builders in New Zealand, over several working days, to obtain a representative work sample. The builders' work consisted of 14 goal-defined jobs using varying subsets of 15 task types, each taking from less than 1 s to more than 1 h and performed in a variety of postures. Task type and duration varied within and between participants and days, although all participants spent at least 25% of the time moving from place to place, mostly carrying materials, and more than half the time either reaching up or bending down to work. This research has provided a description of residential building work based on a work sample more nearly representative than those previously published and has demonstrated a simple, low-cost but robust field observation method that can provide a valid basis for further study of hazard exposures. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Individualized Anemia Management Reduces Hemoglobin Variability in Hemodialysis Patients
Gaweda, Adam E.; Aronoff, George R.; Jacobs, Alfred A.; Rai, Shesh N.; Brier, Michael E.
2013-01-01
One-size-fits-all protocol-based approaches to anemia management with erythropoiesis-stimulating agents (ESAs) may result in undesired patterns of hemoglobin variability. In this single-center, double-blind, randomized controlled trial, we tested the hypothesis that individualized dosing of ESA improves hemoglobin variability over a standard population-based approach. We enrolled 62 hemodialysis patients and followed them over a 12-month period. Patients were randomly assigned to receive ESA ...
Unsupervised classification of variable stars
Valenzuela, Lucas; Pichara, Karim
2018-03-01
During the past 10 years, a considerable amount of effort has been made to develop algorithms for automatic classification of variable stars. That has been primarily achieved by applying machine learning methods to photometric data sets where objects are represented as light curves. Classifiers require training sets to learn the underlying patterns that allow the separation among classes. Unfortunately, building training sets is an expensive process that demands a great deal of human effort. Every time data come from new surveys, the only available training instances are the ones that have a cross-match with previously labelled objects, consequently generating insufficient training sets compared with the large amounts of unlabelled sources. In this work, we present an algorithm that performs unsupervised classification of variable stars, relying only on the similarity among light curves. We tackle the unsupervised classification problem by proposing an untraditional approach. Instead of trying to match classes of stars with clusters found by a clustering algorithm, we propose a query-based method where astronomers can find groups of variable stars ranked by similarity. We also develop a fast similarity function specific to light curves, based on a novel data structure that allows scaling the search over the entire data set of unlabelled objects. Experiments show that our unsupervised model achieves high accuracy in the classification of different types of variable stars and that the proposed algorithm scales up to massive amounts of light curves.
Representativeness and seasonality of major ion records derived from NEEM firn cores
Directory of Open Access Journals (Sweden)
G. Gfeller
2014-10-01
Full Text Available The seasonal and annual representativeness of ionic aerosol proxies (among others calcium, sodium, ammonium and nitrate) in various firn cores in the vicinity of the NEEM drill site in northwest Greenland has been assessed. Seasonal representativeness is very high, as one core explains more than 60% of the variability within the area. The inter-annual representativeness, however, can be substantially lower (depending on the species), making replicate coring indispensable to derive the atmospheric variability of aerosol species. A single core at the NEEM site records only 30% of the inter-annual atmospheric variability in some species, while five replicate cores are already needed to cover approximately 70% of the inter-annual atmospheric variability in all species. The spatial representativeness is very high within 60 cm, rapidly decorrelates within 10 m, but does not diminish further within 3 km. We attribute this to wind reworking of the snow pack leading to sastrugi formation. Due to the high resolution and seasonal representativeness of the records we can derive accurate seasonalities of the measured species for modern (AD 1990–2010) as well as for pre-industrial (AD 1623–1750) times. Sodium and calcium show similar seasonality (peaking in February and March, respectively) for modern and pre-industrial times, whereas ammonium and nitrate are influenced by anthropogenic activities. Nitrate and ammonium both peak in May during modern times, whereas during pre-industrial times ammonium peaked during July–August and nitrate during June–July.
Model parameters for representative wetland plant functional groups
Williams, Amber S.; Kiniry, James R.; Mushet, David M.; Smith, Loren M.; McMurry, Scott T.; Attebury, Kelly; Lang, Megan; McCarty, Gregory W.; Shaffer, Jill A.; Effland, William R.; Johnson, Mari-Vaughn V.
2017-01-01
Wetlands provide a wide variety of ecosystem services including water quality remediation, biodiversity refugia, groundwater recharge, and floodwater storage. Realistic estimation of ecosystem service benefits associated with wetlands requires reasonable simulation of the hydrology of each site and realistic simulation of the upland and wetland plant growth cycles. Objectives of this study were to quantify leaf area index (LAI), light extinction coefficient (k), and plant nitrogen (N), phosphorus (P), and potassium (K) concentrations in natural stands of representative plant species for some major plant functional groups in the United States. Functional groups in this study were based on these parameters and plant growth types to enable process-based modeling. We collected data at four locations representing some of the main wetland regions of the United States. At each site, we collected on-the-ground measurements of fraction of light intercepted, LAI, and dry matter within the 2013–2015 growing seasons. Maximum LAI and k variables showed noticeable variations among sites and years, while overall averages and functional group averages give useful estimates for multisite simulation modeling. Variation within each species gives an indication of what can be expected in such natural ecosystems. For P and K, the concentrations from highest to lowest were spikerush (Eleocharis macrostachya), reed canary grass (Phalaris arundinacea), smartweed (Polygonum spp.), cattail (Typha spp.), and hardstem bulrush (Schoenoplectus acutus). Spikerush had the highest N concentration, followed by smartweed, bulrush, reed canary grass, and then cattail. These parameters will be useful for the actual wetland species measured and for the wetland plant functional groups they represent. These parameters and the associated process-based models offer promise as valuable tools for evaluating environmental benefits of wetlands and for evaluating impacts of various agronomic practices in
Amplification factor variable amplifier
Akitsugu, Oshita; Nauta, Bram
2007-01-01
PROBLEM TO BE SOLVED: To provide an amplification factor variable amplifier capable of achieving temperature compensation of an amplification factor over a wide variable amplification factor range. ; SOLUTION: A Gilbert type amplification factor variable amplifier 11 amplifies an input signal and
Amplification factor variable amplifier
Akitsugu, Oshita; Nauta, Bram
2010-01-01
PROBLEM TO BE SOLVED: To provide an amplification factor variable amplifier capable of achieving temperature compensation of an amplification factor over a wide variable amplification factor range. ; SOLUTION: A Gilbert type amplification factor variable amplifier 11 amplifies an input signal and can
Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction.
Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip
2015-01-01
Random forests consisting of an ensemble of regression trees with equal weights are frequently used for the design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approach the prediction of the ensemble of probabilistic trees from the perspectives of a mixture distribution and of a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic data and the Cancer Cell Line Encyclopedia dataset and illustrated that tree weights can be selected to reduce the average length of the CI without an increase in mean error.
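The mixture-distribution view mentioned in the abstract has simple closed-form first two moments. Below is a hedged sketch with invented per-tree outputs and weights (not the authors' data or their weight-optimization procedure):

```python
import math

# Hedged sketch of the mixture view in the abstract: suppose each
# probabilistic tree returns a normal predictive distribution (mean, sd);
# the weighted ensemble is then a mixture whose mean and variance follow
# in closed form. Tree outputs and weights below are invented.
trees = [(2.0, 0.5), (2.4, 0.4), (1.8, 0.6)]   # (mean, sd) per tree
weights = [0.5, 0.3, 0.2]                      # nonnegative, sum to 1

mean = sum(w * m for w, (m, s) in zip(weights, trees))
second = sum(w * (s * s + m * m) for w, (m, s) in zip(weights, trees))
var = second - mean * mean                     # E[X^2] - (E[X])^2
# A rough ~95% interval from the mixture's first two moments
# (a normal approximation; exact mixture quantiles differ slightly).
lo, hi = mean - 1.96 * math.sqrt(var), mean + 1.96 * math.sqrt(var)
print(mean, var, (lo, hi))
```

Reweighting the trees changes both the mean and the interval width, which is the lever the paper's weight optimization exploits.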
Intraspecific chromosome variability
Directory of Open Access Journals (Sweden)
N Dubinin
2010-12-01
Full Text Available (Editorial preface. The publication is presented in order to remind us of one of the dramatic pages of the history of genetics. It re-opens for the contemporary reader a comprehensive work marking the priority change from plant cytogenetics to animal cytogenetics, led by wide population studies conducted on Drosophila polytene chromosomes. The year of publication (1937) became the point of irretrievable branching between the directions of Old World and New World genetics connected with the problems of chromosome variability and its significance for the evolution of the species. The famous book of T. Dobzhansky (1937) was published by Columbia University in the US under the title “Genetics and the origin of species”, and in the shadow of this American ‘skybuilding’ all other works grew dim. It is remarkable that both Dobzhansky and Dubinin came to similar conclusions about the role of chromosomes in speciation. This is not surprising given that both might be considered representatives of the Russian genetic school, by their birth and education. Interestingly, Dobzhansky never referred to the full paper of Dubinin et al. (1937), though a previous short communication in Nature (1936) was included together with all former papers on the related subject. In full, the original publication printed in the Biological Journal in Moscow comprised 47 pages, including 41 pages of Russian text accompanied by 16 figures, a table and a reference list, and, above all, 6 pages of English summary. This final part in English is now reproduced in the authors’ version, with the only addition being the reference list in the originally printed form.
Representing Water Scarcity in Future Agricultural Assessments
Winter, Jonathan M.; Lopez, Jose R.; Ruane, Alexander C.; Young, Charles A.; Scanlon, Bridget R.; Rosenzweig, Cynthia
2017-01-01
Globally, irrigated agriculture is both essential for food production and the largest user of water. A major challenge for hydrologic and agricultural research communities is assessing the sustainability of irrigated croplands under climate variability and change. Simulations of irrigated croplands generally lack key interactions between water supply, water distribution, and agricultural water demand. In this article, we explore the critical interface between water resources and agriculture by motivating, developing, and illustrating the application of an integrated modeling framework to advance simulations of irrigated croplands. We motivate the framework by examining historical dynamics of irrigation water withdrawals in the United States and quantitatively reviewing previous modeling studies of irrigated croplands with a focus on representations of water supply, agricultural water demand, and impacts on crop yields when water demand exceeds water supply. We then describe the integrated modeling framework for simulating irrigated croplands, which links trends and scenarios with water supply, water allocation, and agricultural water demand. Finally, we provide examples of efforts that leverage the framework to improve simulations of irrigated croplands as well as identify opportunities for interventions that increase agricultural productivity, resiliency, and sustainability.
Dhakal, Rajat; Seale, R Brent; Deeth, Hilton C; Craven, Heather; Turner, Mark S
2014-06-01
The spore-forming bacterium Bacillus licheniformis is a common contaminant of milk and milk products. Strains of this species isolated from dairy products can be differentiated into three major groups, namely, G, F1, and F2, using random amplification of polymorphic DNA (RAPD) analysis; however, little is known about the genomic differences between these groups and the identity of the fragments that make up their RAPD profiles. In this work we obtained high-quality draft genomes of representative strains from each of the three RAPD groups (designated strain G-1, strain F1-1, and strain F2-1) and compared them to each other and to B. licheniformis ATCC 14580 and Bacillus subtilis 168. Whole-genome comparison and multilocus sequence typing revealed that strain G-1 contains significant sequence variability and belongs to a lineage distinct from the group F strains. Strain G-1 was found to contain genes coding for a type I restriction modification system, urease production, and bacitracin synthesis, as well as the 8-kbp plasmid pFL7, and these genes were not present in strains F1-1 and F2-1. In agreement with this, all isolates of group G, but no group F isolates, were found to possess urease activity and antimicrobial activity against Micrococcus. Identification of RAPD band sequences revealed that differences in the RAPD profiles were due to differences in gene lengths, 3' ends of predicted primer binding sites, or gene presence or absence. This work provides a greater understanding of the phylogenetic and phenotypic differences observed within the B. licheniformis species.
Recent activities of the Seismology Division Early Career Representative(s)
Agius, Matthew; Van Noten, Koen; Ermert, Laura; Mai, P. Martin; Krawczyk, CharLotte
2016-04-01
The European Geosciences Union is a bottom-up-organisation, in which its members are represented by their respective scientific divisions, committees and council. In recent years, EGU has embarked on a mission to reach out for its numerous 'younger' members by giving awards to outstanding young scientists and the setting up of Early Career Scientists (ECS) representatives. The division representative's role is to engage in discussions that concern students and early career scientists. Several meetings between all the division representatives are held throughout the year to discuss ideas and Union-wide issues. One important impact ECS representatives have had on EGU is the increased number of short courses and workshops run by ECS during the annual General Assembly. Another important contribution of ECS representatives was redefining 'Young Scientist' to 'Early Career Scientist', which avoids discrimination due to age. Since 2014, the Seismology Division has its own ECS representative. In an effort to more effectively reach out for young seismologists, a blog and a social media page dedicated to seismology have been set up online. With this dedicated blog, we'd like to give more depth to the average browsing experience by enabling young researchers to explore various seismology topics in one place while making the field more exciting and accessible to the broader community. These pages are used to promote the latest research especially of young seismologists and to share interesting seismo-news. Over the months the pages proved to be popular, with hundreds of views every week and an increased number of followers. An online survey was conducted to learn more about the activities and needs of early career seismologists. We present the results from this survey, and the work that has been carried out over the last two years, including detail of what has been achieved so far, and what we would like the ECS representation for Seismology to achieve. Young seismologists are
Statistical conditional sampling for variable-resolution video compression.
Directory of Open Access Journals (Sweden)
Alexander Wong
Full Text Available In this study, we investigate a variable-resolution approach to video compression based on conditional random fields (CRFs) and statistical conditional sampling, in order to further improve the compression rate while maintaining high-quality video. In the proposed approach, representative key-frames within a video shot are identified and stored at full resolution. The remaining frames within the video shot are stored and compressed at a reduced resolution. At the decompression stage, a region-based dictionary is constructed from the key-frames and used to restore the reduced-resolution frames to the original resolution via statistical conditional sampling. The sampling approach is based on the conditional probability of the CRF model, using the constructed dictionary. Experimental results show that the proposed variable-resolution approach via statistical conditional sampling has potential for improving compression rates when compared to compressing the video at full resolution, while achieving higher video quality when compared to compressing the video at reduced resolution.
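As a rough illustration of the dictionary-based restoration step, here is a minimal numpy sketch. The patch size, test images, and the nearest-neighbour lookup are all invented for illustration; the actual method samples from a CRF-conditioned probability distribution rather than picking the single closest dictionary entry.

```python
import numpy as np

rng = np.random.default_rng(0)

def to_patches(img, p):
    """Split a 2-D image into non-overlapping p x p patches (row-major order)."""
    h, w = img.shape
    return [img[i:i+p, j:j+p] for i in range(0, h, p) for j in range(0, w, p)]

def downsample(patch):
    """2x2 block-average, a stand-in for the reduced-resolution encoding."""
    p = patch.shape[0]
    return patch.reshape(p // 2, 2, p // 2, 2).mean(axis=(1, 3))

# Key frame stored at full resolution; the next frame is a shifted copy of it.
key_frame = rng.random((16, 16))
next_frame = np.roll(key_frame, 1, axis=1)

P = 4
dictionary = [(downsample(q), q) for q in to_patches(key_frame, P)]

# "Decode" the reduced-resolution next frame: for each low-res patch pick the
# dictionary entry with the closest low-res appearance (the mode of the
# conditional distribution, a crude stand-in for true conditional sampling).
restored = np.zeros_like(next_frame)
low_res_patches = [downsample(q) for q in to_patches(next_frame, P)]
idx = 0
for i in range(0, 16, P):
    for j in range(0, 16, P):
        lo = low_res_patches[idx]
        idx += 1
        best = min(dictionary, key=lambda d: np.sum((d[0] - lo) ** 2))
        restored[i:i+P, j:j+P] = best[1]

err = np.mean((restored - next_frame) ** 2)
print(f"mean squared restoration error: {err:.4f}")
```

The restored frame reuses full-resolution detail from the key frame while only the reduced-resolution frame needs to be stored.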
Multicriteria analysis of ontologically represented information
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is: which method to choose? The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus the multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we considered the Analytic Hierarchy Process (AHP), which is well suited to hierarchical data structures (e.g., such as those formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited to the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
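Of the methods compared, TOPSIS is the most compact to illustrate. The following is a minimal sketch with an invented software-selection example; the criteria, scores, and weights are assumptions, not data from the AiG project.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision_matrix: (n_alternatives, n_criteria) scores
    weights:         criterion weights (summing to 1)
    benefit:         True for criteria to maximize, False to minimize
    """
    m = np.asarray(decision_matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights)
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    # Relative closeness to the ideal solution, in [0, 1]; higher is better.
    return d_worst / (d_best + d_worst)

# Hypothetical example: three software packages scored on speed (maximize),
# memory use (minimize) and cost (minimize).
scores = [[90, 512, 100],
          [70, 256, 40],
          [80, 384, 60]]
c = topsis(scores, weights=[0.5, 0.25, 0.25], benefit=[True, False, False])
print("closeness to ideal:", np.round(c, 3))
print("best alternative:", int(np.argmax(c)))
```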
What does self rated mental health represent
Directory of Open Access Journals (Sweden)
Daphna Levinson
2014-12-01
Full Text Available Background. Unlike the widely used self-rated health, self-rated mental health was found unsuitable as a proxy for mental illness. This paper analyses the relationships between the self-ratings of physical health, mental health and overall health, and their association with objective indicators of physical and mental health. Design and methods. The study is a secondary analysis of data from a nationwide representative sample of the non-institutionalized adult residents of Israel in 2003, collected via computer-assisted personal interview methods [n=4859]. Results. Self-rated physical health and self-rated mental health were strongly related to each other, yet self-rated mental health was not related to chronic physical conditions and self-rated physical health was not related to mental disorders. In a multiple logistic regression analysis, those with positive self-rated mental health had 93 times the odds of reporting positive overall health, whereas those with positive self-rated physical health had 40 times the odds of reporting positive overall health. Conclusions. The self-rating of mental health presents a qualitatively different dimension from mental illness. Self-rated mental health is two times more important than self-rated physical health in predicting self-rated overall health.
Instrumental variable analysis
Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine; Jager, Kitty J.
2013-01-01
The main advantage of the randomized controlled trial (RCT) is the random assignment of treatment, which prevents selection by prognosis. Nevertheless, only a few RCTs can be performed, given their high cost and the difficulties in conducting such studies. Therefore, several analytical methods for
Directory of Open Access Journals (Sweden)
Regina Menezes
2017-02-01
Full Text Available Several epidemiological studies have linked flavonols with a decreased risk of cardiovascular disease (CVD). However, some heterogeneity in the individual physiological responses to the consumption of these compounds has been identified. This meta-analysis aimed to study the effect of flavonol supplementation on biomarkers of CVD risk, such as blood lipids, blood pressure and plasma glucose, as well as the factors affecting their inter-individual variability. Data from 18 human randomized controlled trials were pooled and the effect was estimated using fixed or random effects meta-analysis models and reported as difference in means (DM). Variability in the response of blood lipids to supplementation with flavonols was assessed by stratifying various population subgroups: age, sex, country, and health status. Results showed significant reductions in total cholesterol (DM = −0.10 mmol/L; 95% CI: −0.20, −0.01), LDL cholesterol (DM = −0.14 mmol/L; 95% CI: −0.21, −0.07), and triacylglycerol (DM = −0.10 mmol/L; 95% CI: −0.18, −0.03), and a significant increase in HDL cholesterol (DM = 0.05 mmol/L; 95% CI: 0.02, 0.07). A significant reduction was also observed in fasting plasma glucose (DM = −0.18 mmol/L; 95% CI: −0.29, −0.08) and in blood pressure (SBP: DM = −4.84 mmHg; 95% CI: −5.64, −4.04; DBP: DM = −3.32 mmHg; 95% CI: −4.09, −2.55). Subgroup analysis showed a more pronounced effect of flavonol intake in participants from Asian countries and in participants with diagnosed disease or dyslipidemia, compared to those with healthy and normal baseline values. In conclusion, flavonol consumption improved biomarkers of CVD risk; however, country of
Variable mechanical ventilation.
Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini, Luiz Alberto; Friedman, Gilberto
2017-01-01
To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". A total of 36 studies were selected. Of these, 24 were original studies, including 21 experimental studies and three clinical studies. Several experimental studies reported the beneficial effects of distinct variable ventilation strategies on lung function using different models of lung injury and healthy lungs. Variable ventilation seems to be a viable strategy for improving gas exchange and respiratory mechanics and preventing lung injury associated with mechanical ventilation. However, further clinical studies are necessary to assess the potential of variable ventilation strategies for the clinical improvement of patients undergoing mechanical ventilation.
Musculoskeletal symptoms in pharmaceutical sales representatives.
Sang, Katherine; Gyi, Diane; Haslam, Cheryl
2010-03-01
Musculoskeletal disorders (MSDs) are a leading cause of work-related ill health. Existing literature indicates that pharmaceutical sales representatives (PSRs) report a high prevalence of MSDs, possibly exacerbated by the nature of work (prolonged driving and manual handling). In addition, they experience difficulty in accessing occupational health services. To assess the prevalence of musculoskeletal symptoms and associated risk factors among PSRs in order to assist their occupational health management through raising risk awareness. A self-completed questionnaire distributed to 205 PSRs within a UK pharmaceutical company was used to assess the prevalence of musculoskeletal symptoms, psychosocial factors, work tasks undertaken and company car use. To assist understanding of work tasks and organizational factors, semi-structured interviews were undertaken with a sample of 12 key personnel. The questionnaire response rate was 68%. PSRs reported high mileage and 100% reported working from the car in a typical day. Forty-seven per cent reported both manual handling for ≥4 h/day and 'often' or 'sometimes' working from the car. Fifty-seven per cent reported low back symptoms in the last 12 months. Interview data revealed issues relating to car choice, storage in the boot and working from the car, which should be considered when developing priorities for preventive management of MSDs. Musculoskeletal symptoms appear to be a problem for PSRs, with risk factors reported as prolonged driving, sitting in the car, working from the car and manual handling. Interventions to facilitate their occupational health management should focus on raising awareness of the risks of prolonged driving and working from the car.
Evaluation of 7Be fallout spatial variability
International Nuclear Information System (INIS)
Pinto, Victor Meriguetti
2011-01-01
The cosmogenic radionuclide beryllium-7 (7Be) is produced in the atmosphere by cosmic particle reactions and is used as a tracer in soil erosion and climatic process research. After production, 7Be bonds to aerosol particles in the atmosphere and is deposited on the soil surface with other radionuclide species by rainfall. Because of its high adsorption on soil particles and its short half-life of 53.2 days, this radionuclide follows the erosion process and can be used as a tracer to evaluate the sediment transport that occurs during a single rain event or a short period of rain events. A key assumption for erosion evaluation with this radiotracer is the uniformity of the spatial distribution of the 7Be fallout. The 7Be method was elaborated recently and, owing to its few applications, some assumptions related to the method have not yet been properly investigated; the hypothesis of 7Be fallout uniformity needs to be evaluated. The aim of this study was to evaluate the 7Be fallout spatial distribution through analysis of the 7Be activity in the rain water of the first five millimeters of single rain events. The rain water was sampled using twelve collectors distributed over an experimental area of about 300 m2, located on the campus of Sao Paulo University, Piracicaba. The 7Be activities were measured using a 53% efficiency gamma-ray spectrometer at the Radioisotope Laboratory of CENA. The 7Be activities in rain water varied from 0.26 to 1.81 Bq L-1, with the highest values in summer and the lowest in spring. In each one of the 5 single events, the spatial variability of 7Be activity in rain water was high, showing the high randomness of the fallout spatial distribution. A simulation using the 7Be spatial variability values obtained here and 7Be average reference inventories taken from the literature was performed, determining the lowest detectable erosion rate estimated by the 7Be model. The importance of taking a representative number of samples to
HOW TO REPRESENT THE GENETIC CODE?
Directory of Open Access Journals (Sweden)
N.S. Santos-Magalhães
2004-05-01
Full Text Available The advent of molecular genetics comprises a true revolution of far-reaching consequences for humankind, which has evolved into a specialized branch of modern-day biochemistry. The analysis of specific genomic information is gaining wide-ranging interest because of its significance to the early diagnosis of disease and the discovery of modern drugs. In order to take advantage of a wide assortment of signal processing (SP) algorithms, the primary step of modern genomic SP involves converting symbolic DNA sequences into complex-valued signals. How to represent the genetic code? Despite being extensively known, the DNA mapping into proteins is one of the relevant discoveries of genetics. The genetic code (GC) is revisited in this work, addressing other descriptions for it which can be worthy for genomic SP. Three original representations are discussed. The inner-to-outer map builds on the unbalanced role of nucleotides of a codon. A two-dimensional Gray genetic representation is offered as a structured map that can help in interpreting DNA spectrograms or scalograms. These are among the powerful visual tools for genome analysis, which depend on the choice of the genetic mapping. Finally, the world-chart for the GC is investigated. Evoking the cyclic structure of the genetic mapping, it can be folded by joining the left-right borders and the top-bottom frontiers. As a result, the GC can be drawn on the surface of a sphere resembling a world map. Eight parallels of latitude are required (four in each hemisphere), as well as four meridians of longitude associated with four corresponding anti-meridians. The tropic circles lie at 11.25°, 33.75°, 56.25°, and 78.75° (North and South). Starting from an arbitrary Greenwich meridian, the meridians of longitude can be plotted at 22.5°, 67.5°, 112.5°, and 157.5° (East and West). Each triplet is assigned to a single point on the surface, which we named the Nirenberg-Kohama Earth. Despite being valuable, usual representations for the GC can be
Hidden variables and locality in quantum theory
International Nuclear Information System (INIS)
Shiva, Vandana.
1978-12-01
The status of hidden variables in quantum theory has been debated since the 1920s. The author examines the no-hidden-variable theories of von Neumann, Kochen, Specker and Bell, and finds that they all share one basic assumption: averaging over the hidden variables should reproduce the quantum mechanical probabilities. Von Neumann also makes a linearity assumption, Kochen and Specker require the preservation of certain functional relations between magnitudes, and Bell proposes a locality condition. It has been assumed that the extrastatistical requirements are needed to serve as criteria of success for the introduction of hidden variables because the statistical condition is trivially satisfied, and that Bell's result is based on a locality condition that is physically motivated. The author shows that the requirement of weak locality, which is not physically motivated, is enough to give Bell's result. The proof of Bell's inequality works equally well for any pair of commuting magnitudes satisfying a condition called the degeneracy principle. None of the no-hidden-variable proofs apply to a class of hidden variable theories that are not phase-space reconstructions of quantum mechanics. The author discusses one of these theories, the Bohm-Bub theory, and finds that hidden variable theories that reproduce all the quantum statistics, for single and sequential measurements, must introduce a randomization process for the hidden variables after each measurement. The philosophical significance of this theory lies in the role it can play in solving the conceptual puzzles posed by quantum theory.
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Burcharth, Hans F.
1999-01-01
Reliability analyses are performed for three Japanese vertical wall breakwaters in this chapter. Only the geotechnical failure modes described in chapter 3 are investigated. For none of the breakwaters detailed data are available for the wave climate and for the soil conditions. Therefore represe...
Birge, Max; Duffy, Stephen; Miler, Joanna Astrid; Hajek, Peter
2017-11-04
The 'conversion rate' from initial experimentation to daily smoking is a potentially important metric of smoking behavior, but estimates of it based on current representative data are lacking. The Global Health Data Exchange was searched for representative surveys conducted in English-speaking developed countries after the year 2000 that included questions about ever trying a cigarette and ever smoking daily. The initial search identified 2776 surveys that were further screened for language, location, year, sample size, survey structure and representativeness. Forty-four surveys that passed the screening process were accessed, and their codebooks were examined to see whether the two questions of interest were included. Eight datasets allowed extraction or estimation of the relevant information. Survey quality was assessed with regard to response rates, sampling methods and data collection procedures. PRISMA guidelines were followed, with explicit rules for approaching derived variables and skip patterns. Proportions were pooled using random effects meta-analysis. The eight surveys used representative samples of the general adult population. Response rates varied from 45% to 88%. Survey methods were on par with best practice in this field. Altogether 216,314 respondents were included, of whom 60.3% (95% CI 51.3-69.3%) ever tried a cigarette. Among those, 68.9% (95% CI 60.9-76.9%) progressed to daily smoking. Over two thirds of people who try one cigarette become, at least temporarily, daily smokers. The finding provides strong support for the current efforts to reduce cigarette experimentation among adolescents. The transition from trying the first cigarette through occasional to daily smoking usually implies that a recreational activity is turning into a compulsive need that has to be satisfied virtually continuously. The 'conversion rate' from initial experimentation to daily smoking is thus a potentially important metric of smoking behavior, but estimates of it based on
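The pooling step ("proportions were pooled using random effects meta-analysis") can be sketched with the DerSimonian-Laird estimator. The survey proportions and sample sizes below are invented for illustration and are not the paper's data.

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight each study by 1 / (within-study + between-study variance).
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical proportions of ever-triers who became daily smokers,
# with variances p(1-p)/n from made-up sample sizes.
props = [0.62, 0.71, 0.68, 0.74]
ns = [5000, 12000, 8000, 3000]
variances = [p * (1 - p) / n for p, n in zip(props, ns)]
pooled, ci = dersimonian_laird(props, variances)
print(f"pooled proportion: {pooled:.3f} (95% CI {ci[0]:.3f}-{ci[1]:.3f})")
```

In practice proportions are often pooled on a transformed (e.g. logit or double-arcsine) scale; the raw scale is used here only to keep the sketch short.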
Quantifying and mapping spatial variability in simulated forest plots
Gavin R. Corral; Harold E. Burkhart
2016-01-01
We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands were developed of regularly-spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...
A Generalized Random Regret Minimization Model
Chorus, C.G.
2013-01-01
This paper presents, discusses and tests a generalized Random Regret Minimization (G-RRM) model. The G-RRM model is created by replacing a fixed constant in the attribute-specific regret functions of the RRM model, by a regret-weight variable. Depending on the value of the regret-weights, the G-RRM
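A minimal sketch of the G-RRM idea as described above: the fixed constant inside the attribute-level regret function is replaced by a regret-weight variable gamma, with gamma = 1 recovering the classical RRM specification. The route-choice data and coefficient values are invented for illustration.

```python
import math

def grrm_regret(i, alternatives, betas, gamma):
    """Regret of alternative i under a G-RRM-style specification.

    gamma is the regret-weight: gamma = 1 gives the classical RRM regret
    ln(1 + exp(beta * (x_j - x_i))), while smaller gamma attenuates the
    regret non-linearity.
    """
    regret = 0.0
    for j, alt in enumerate(alternatives):
        if j == i:
            continue
        for m, beta in enumerate(betas):
            regret += math.log(gamma + math.exp(beta * (alt[m] - alternatives[i][m])))
    return regret

def choice_probabilities(alternatives, betas, gamma):
    """Multinomial-logit probabilities based on negative regret."""
    r = [grrm_regret(i, alternatives, betas, gamma) for i in range(len(alternatives))]
    exps = [math.exp(-ri) for ri in r]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical route choice: alternatives described by (travel time, cost),
# both undesirable, so the betas are negative.
routes = [(30, 4.0), (25, 6.0), (40, 2.5)]
betas = (-0.1, -0.5)
for gamma in (1.0, 0.1):
    probs = choice_probabilities(routes, betas, gamma)
    print(f"gamma={gamma}: {[round(p, 3) for p in probs]}")
```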
Petruzzellis, Francesco; Palandrani, Chiara; Savi, Tadeja; Alberti, Roberto; Nardini, Andrea; Bacaro, Giovanni
2017-12-01
The choice of the best sampling strategy to capture mean values of functional traits for a species or population, while maintaining information about trait variability and minimizing sampling size and effort, is an open issue in functional trait ecology. Intraspecific variability (ITV) of functional traits strongly influences sampling size and effort. However, while adequate information is available about intraspecific variability between individuals (ITV-BI) and among populations (ITV-POP), relatively few studies have analyzed intraspecific variability within individuals (ITV-WI). Here, we provide an analysis of ITV-WI of two foliar traits, namely specific leaf area (SLA) and osmotic potential (π), in a population of Quercus ilex L. We assessed the baseline ITV-WI level of variation between the two traits and determined the minimum and optimal sampling sizes needed to take ITV-WI into account, comparing sampling optimization outputs with those previously proposed in the literature. Different factors accounted for different amounts of variance in the two traits. SLA variance was mostly spread within individuals (43.4% of the total variance), while π variance was mainly spread between individuals (43.2%). Strategies that did not account for all the canopy strata produced mean values not representative of the sampled population. The minimum size to adequately capture the studied functional traits corresponded to 5 leaves taken randomly from 5 individuals, while the most accurate and feasible sampling size was 4 leaves taken randomly from 10 individuals. We demonstrate that the spatial structure of the canopy can significantly affect trait variability. Moreover, different strategies for different traits could be implemented during sampling surveys. We partially confirm sampling sizes previously proposed in the recent literature and encourage future analyses involving different traits.
Analysis and Computation of Acoustic and Elastic Wave Equations in Random Media
Motamed, Mohammad
2014-01-06
We propose stochastic collocation methods for solving the second order acoustic and elastic wave equations in heterogeneous random media, subject to deterministic boundary and initial conditions [1, 4]. We assume that the medium consists of non-overlapping sub-domains with smooth interfaces. In each sub-domain, the material coefficients are smooth and given or approximated by a finite number of random variables. One important example is wave propagation in multi-layered media with smooth interfaces. The numerical scheme consists of a finite difference or finite element method in the physical space and a collocation at the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space. We provide a rigorous convergence analysis and demonstrate different types of convergence of the probability error with respect to the number of collocation points under some regularity assumptions on the data. In particular, we show that, unlike in elliptic and parabolic problems [2, 3], the solution to hyperbolic problems is not in general analytic with respect to the random variables. Therefore, the rate of convergence is only algebraic. A fast spectral rate of convergence is still possible for some quantities of interest and for wave solutions with particular types of data. We also show that the semi-discrete solution is analytic with respect to the random variables with the radius of analyticity proportional to the grid/mesh size h. We therefore obtain an exponential rate of convergence which deteriorates as the quantity h·p gets smaller, with p representing the polynomial degree in the stochastic space. We have shown that the analytical results and numerical examples are consistent and that the stochastic collocation method may be a valid alternative to the more traditional Monte Carlo method. Here we focus on the stochastic acoustic wave equation; similar results are obtained for stochastic elastic equations.
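The collocation idea (one deterministic solve per Gauss point, combined with quadrature weights) can be illustrated on a toy problem with a single random variable. The model equation and parameter values here are invented and far simpler than the wave equations treated in the paper.

```python
import numpy as np

# Toy model: du/dt = -k u, u(0) = 1, with a random decay rate
# k = 1 + 0.1*Y, Y ~ N(0, 1).  The exact solution is u(T) = exp(-k*T),
# so E[u(T)] = exp(-T + (0.1*T)**2 / 2) is known in closed form.
T = 1.0

def solve(k):
    """Deterministic solver (here the exact solution; a time-stepper in general)."""
    return np.exp(-k * T)

# Collocation at Gauss-Hermite nodes: each node is one deterministic solve,
# and the weighted sum approximates the expectation over Y.
nodes, weights = np.polynomial.hermite.hermgauss(5)
y = np.sqrt(2.0) * nodes          # physicists' nodes -> standard normal samples
w = weights / np.sqrt(np.pi)      # normalized quadrature weights
mean_estimate = np.sum(w * solve(1.0 + 0.1 * y))

exact = np.exp(-T + (0.1 * T) ** 2 / 2)
print(f"collocation: {mean_estimate:.8f}  exact: {exact:.8f}")
```

Because the toy solution is analytic in Y, five nodes already give near machine-precision accuracy, illustrating the spectral convergence available when the regularity assumptions hold.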
Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling
Galelli, S.; Castelletti, A.
2013-07-01
Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies - Marina catchment (Singapore) and Canning River (Western Australia) - representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of the input variables provided can be given a physically meaningful interpretation.
Crack Propagation Test Results for Variable Amplitude Spectrum Loading in Surface Flawed D6ac Steel
National Research Council Canada - National Science Library
Wood, H
1971-01-01
.... All spectra used in the program represented the critical wing pivot locations for the F-111 aircraft and were applied in a randomized block sequence containing 58 layers representing 200 flight hours...
Hellier, Coel
2001-01-01
Cataclysmic variable stars are the most variable stars in the night sky, fluctuating in brightness continually on timescales from seconds to hours to weeks to years. The changes can be recorded using amateur telescopes, yet are also the subject of intensive study by professional astronomers. That study has led to an understanding of cataclysmic variables as binary stars, orbiting so closely that material transfers from one star to the other. The resulting process of accretion is one of the most important in astrophysics. This book presents the first account of cataclysmic variables at an introductory level. Assuming no previous knowledge of the field, it explains the basic principles underlying the variability, while providing an extensive compilation of cataclysmic variable light curves. Aimed at amateur astronomers, undergraduates, and researchers, the main text is accessible to those with no mathematical background, while supplementary boxes present technical details and equations.
Understanding Brown Dwarf Variability
Marley, Mark S.
2013-01-01
Surveys of brown dwarf variability continue to find that roughly half of all brown dwarfs are variable. While variability is observed amongst all types of brown dwarfs, amplitudes are typically greatest for L-T transition objects. In my talk I will discuss the possible physical mechanisms that are responsible for the observed variability. I will particularly focus on comparing and contrasting the effects of changes in atmospheric thermal profile and cloud opacity. The two different mechanisms will produce different variability signatures and I will discuss the extent to which the current datasets constrain both mechanisms. By combining constraints from studies of variability with existing spectral and photometric datasets we can begin to construct and test self-consistent models of brown dwarf atmospheres. These models not only aid in the interpretation of existing objects but also inform studies of directly imaged giant planets.
Representativeness-based sampling network design for the State of Alaska
Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove
2013-01-01
Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
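Fisher's pitfall is easy to reproduce: dividing two independent variables by a common, noisy denominator manufactures correlation where none exists. A minimal simulation follows; the distribution parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Three mutually independent measurements, all well away from zero.
x = rng.normal(10, 1, n)
y = rng.normal(10, 1, n)
z = rng.normal(10, 1, n)

# x and y are uncorrelated...
r_xy = np.corrcoef(x, y)[0, 1]
# ...but dividing both by the same denominator z induces correlation,
# driven entirely by the shared random variation in z.
r_ratio = np.corrcoef(x / z, y / z)[0, 1]

print(f"corr(x, y)     = {r_xy:+.3f}")
print(f"corr(x/z, y/z) = {r_ratio:+.3f}")
```

With equal coefficients of variation the induced correlation is close to 0.5, even though x, y and z share no biological relationship at all, which is exactly why ratio indices need careful handling.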
Effect of random edge failure on the average path length
Energy Technology Data Exchange (ETDEWEB)
Guo Dongchao; Liang Mangui; Li Dandan; Jiang Zhongyuan, E-mail: mgliang58@gmail.com, E-mail: 08112070@bjtu.edu.cn [Institute of Information Science, Beijing Jiaotong University, 100044, Beijing (China)
2011-10-14
We study the effect of random removal of edges on the average path length (APL) in a large class of uncorrelated random networks in which vertices are characterized by hidden variables controlling the attachment of edges between pairs of vertices. A formula for approximating the APL of networks suffering random edge removal is derived first. Then, the formula is confirmed by simulations for classical ER (Erdős and Rényi) random graphs, BA (Barabási and Albert) networks, networks with exponential degree distributions, as well as random networks with asymptotic power-law degree distributions with exponent α > 2.
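The effect is straightforward to reproduce numerically. The following sketch computes the APL of an Erdős-Rényi graph before and after random edge removal by brute-force BFS; the graph size, edge probability, and removal fraction are arbitrary choices, and the paper's analytical formula is not reproduced here.

```python
import random
from collections import deque

def average_path_length(n, edges):
    """Mean shortest-path length over reachable pairs (BFS from every vertex)."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    total, pairs = 0, 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

random.seed(1)
n, p = 200, 0.05          # Erdos-Renyi G(n, p), mean degree ~ 10
edges = [(u, v) for u in range(n) for v in range(u + 1, n) if random.random() < p]

apl_before = average_path_length(n, edges)
# Randomly remove roughly 30% of the edges and recompute.
kept = [e for e in edges if random.random() > 0.3]
apl_after = average_path_length(n, kept)
print(f"APL before: {apl_before:.3f}, after ~30% edge removal: {apl_after:.3f}")
```

Removing edges thins out shortcuts, so the surviving network has a larger APL, which is the quantity the paper's approximation formula predicts.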
On the Wigner law in dilute random matrices
Khorunzhy, A.; Rodgers, G. J.
1998-12-01
We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
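The stability of the semicircle law under dilution is easy to check numerically. This sketch builds a diluted random sign matrix and compares the low moments of its spectrum with the semicircle values; the matrix size and dilution probability are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 0.1                  # dilution: each entry kept with probability p

# Symmetric matrix with +-1 entries, randomly diluted, scaled so that each
# entry has variance 1/n and the limiting spectrum is the semicircle on [-2, 2].
signs = rng.choice([-1.0, 1.0], size=(n, n))
mask = (rng.random((n, n)) < p).astype(float)
a = np.triu(signs * mask, 1)
a = (a + a.T) / np.sqrt(n * p)

eig = np.linalg.eigvalsh(a)

# Semicircle moment checks: the 2k-th moments are the Catalan numbers,
# so the second moment tends to 1 and the fourth moment to 2.
m2 = np.mean(eig ** 2)
m4 = np.mean(eig ** 4)
print(f"second moment: {m2:.3f} (semicircle: 1), fourth moment: {m4:.3f} (semicircle: 2)")
```

The moments stay close to the semicircle values even though 90% of the potential matrix entries have been diluted away, consistent with the stability result described in the abstract.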
Random inbreeding, isonymy, and population isolates in Argentina.
Dipierri, José; Rodríguez-Larralde, Alvaro; Barrai, Italo; Camelo, Jorge López; Redomero, Esperanza Gutiérrez; Rodríguez, Concepción Alonso; Ramallo, Virginia; Bronberg, Rubén; Alfaro, Emma
2014-07-01
Population isolates are an important tool in identifying and mapping genes of Mendelian diseases and complex traits. The geographical identification of isolates represents a priority from a genetic and health care standpoint. The purpose of this study is to analyze the spatial distribution of consanguinity by random isonymy (FST) in Argentina and its relationship with the isolates previously identified in the country. FST was estimated from the surname distribution of 22.6 million electors registered for the year 2001 in the 24 provinces, 5 geographical regions, and 510 departments of the country. Statistically significant spatial clustering of FST was determined using the SaTScan V5.1 software. FST exhibited marked regional and departmental variation, showing the highest values towards the North and West of Argentina. The clusters of high consanguinity by random isonymy followed the same distribution. Recognized Argentinean genetic isolates are mainly localized in the north of the country, in clusters of high inbreeding. Given the availability of listings of surnames in high-capacity storage devices for different countries, estimating FST from them can provide information on inbreeding at all levels of administrative subdivision, to be used as a demographic variable for the identification of isolates within the country for public health purposes.
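Consanguinity by random isonymy can be estimated directly from surname counts, here using the classical Crow-Mange relation F_ST ≈ I/4, where I is the unbiased probability that two randomly chosen individuals share a surname. The surname lists below are invented for illustration and are not the Argentinean data.

```python
from collections import Counter

def random_inbreeding_from_surnames(surnames):
    """Estimate consanguinity by random isonymy, F_ST ~= I/4 (Crow-Mange),
    with I the unbiased probability that two random individuals share
    a surname."""
    counts = Counter(surnames)
    n = len(surnames)
    isonymy = sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))
    return isonymy / 4.0

# Hypothetical electoral rolls for two districts: one with many repeated
# surnames (an isolate-like pattern), one with more surname diversity.
isolate = ["Quispe"] * 40 + ["Mamani"] * 30 + ["Flores"] * 20 + ["Diaz"] * 10
town = [f"Surname{i}" for i in range(80)] + ["Lopez"] * 10 + ["Garcia"] * 10

fst_isolate = random_inbreeding_from_surnames(isolate)
fst_town = random_inbreeding_from_surnames(town)
print(f"F_ST isolate-like: {fst_isolate:.4f}, diverse: {fst_town:.4f}")
```

The district with the concentrated surname distribution yields a far higher F_ST, which is the signal the spatial clustering analysis looks for.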
Glasby, John S
1974-01-01
The Nebular Variables focuses on the nebular variables and their characteristics. Discussions are organized by type of nebular variable, namely, RW Aurigae stars, T Orionis stars, T Tauri stars, and peculiar nebular objects. Topics range from light variations of the stars to their spectroscopic and physical characteristics, spatial distribution, interaction with nebulosity, and evolutionary features. This volume is divided into four sections and consists of 25 chapters, the first of which provides general information on nebular variables, including their stellar associations and their classification.
Ultrasonic variables affecting inspection
International Nuclear Information System (INIS)
Lautzenheiser, C.E.; Whiting, A.R.; McElroy, J.T.
1977-01-01
There are many variables which affect the detection of defects and the reproducibility of results when utilizing ultrasonic techniques. The most important variable is the procedure, as this document specifies, to a great extent, the controls that are exercised over the other variables. Of comparable importance is personnel, with regard to training, qualification, integrity, data recording, and data analysis. Although the data are very limited, they indicate that, if the procedure is carefully controlled, the reliability of defect detection and the reproducibility of results are both approximately 90 percent. This applies to relatively small defects, as reliability increases substantially once defect size rises above the recording limit. (author)
THE STRUCTURE OF HAPPINESS REPRESENTATION FOR RUSSIAN AND AMERICAN REPRESENTATIVES
Directory of Open Access Journals (Sweden)
S. Yu. Zhdanova
2017-01-01
Full Text Available Introduction. The emotional state of students exerts a direct impact on their ability and readiness to cope with challenges when studying, and shapes the success and effectiveness of the educational process. In this regard, the search for methods and the determination of the tasks of psychological diagnostics are brought into focus. Above all, the teacher should consider the mentality and value attitudes of representatives of various cultures, including their understanding of happiness and personal well-being, against the background of the increasing scale of international and interethnic mobility. The development of Russian psychology has recently acquired the direction of positive psychology, the focus of which is happiness and the positive functioning of the individual. Modern research reveals significant differences in the indicators of happiness and satisfaction with life between representatives of different cultures. However, the diagnostic tools used in such studies are based primarily on the model of the happiness image that has been developed in American psychology. In this connection, the question arises as to what extent the image of happiness in American culture correlates with the image of happiness in Russian culture. The aim of this work is to study the representation of happiness among representatives of American and Russian culture, and to define the invariable and variable components in the structure of this representation. Methodology and research methods. The study included several stages. At the first stage, a theoretical analysis and the development of an ontology of the subject area "Psychology of Happiness" were carried out. At the second stage, an empirical study of the representations of American and Russian respondents was carried out. The main method of data collection was a narrative interview; a method of early personal memories was used to obtain the narrative of happiness. Subsequent processing of verbal
Idris, K M; Mustafa, A F; Yousif, M A
2012-08-01
Pharmaceutical representatives are an important promotional tool for pharmaceutical companies. This cross-sectional, exploratory study aimed to determine pharmaceutical representatives' beliefs and practices about their professional practice in Sudan. A random sample of 160 pharmaceutical representatives were interviewed using a pretested questionnaire. The majority were male (84.4%) and had received training in professional sales skills (86.3%) and about the products being promoted (82.5%). Only 65.6% agreed that they provided full and balanced information about products. Not providing balanced information was attributed by 23.1% to doctors' lack of time. However, 28.1% confessed they sometimes felt like hiding unfavourable information, 21.9% were sometimes or always inclined to give untrue information to make sales and 66.9% considered free gifts as ethically acceptable. More attention is needed to dissemination of ethical codes of conduct and training about the ethics of drug promotion for pharmaceutical representatives in Sudan.
Effects of variable transformations on errors in FORM results
International Nuclear Information System (INIS)
Qin Quan; Lin Daojin; Mei Gang; Chen Hao
2006-01-01
On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results. It shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the location of the design point in the standard normal space. The transformation of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors
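The transformation in question maps each non-normal basic variable into standard normal space through its CDF, u = Φ⁻¹(F_X(x)); the curvature (second derivatives) of this map is what drives the errors discussed above. A minimal sketch for an exponential variable, with illustrative parameter values:

```python
from math import exp
from statistics import NormalDist

nd = NormalDist()  # standard normal

def exponential_to_standard_normal(x, lam):
    """Map an Exponential(lam) realization into standard normal space."""
    Fx = 1.0 - exp(-lam * x)   # exponential CDF
    return nd.inv_cdf(Fx)      # u = Phi^{-1}(F_X(x))

u = exponential_to_standard_normal(1.0, 1.0)
```

By construction Φ(u) equals F_X(x), so probability content is preserved even though the map is nonlinear — which is exactly why FORM's linearization at the design point picks up distribution-dependent errors.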
Registration Appointment and Services for Representatives Management Information
Social Security Administration — A new internet/intranet application that collects all representative information and establishes the relationship between the claimant and the representative. Allow...
Simulations of Chemotaxis and Random Motility in Finite Domains
National Research Council Canada - National Science Library
Jabbarzadeh, Ehsan; Abrams, Cameron F
2005-01-01
.... The model couples fully time-dependent finite-difference solution of a reaction-diffusion equation for the concentration field of a generic chemoattractant to biased random walks representing individual moving cells...
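The coupling of moving cells to an attractant field can be caricatured with a one-dimensional biased random walk whose step-direction probability is tilted by the local concentration gradient. This is only a toy stand-in for the paper's finite-difference reaction-diffusion model; the linear field and the bias rule are invented for illustration:

```python
import random

rng = random.Random(1)

def concentration(x):
    """Assumed static linear chemoattractant field (illustrative)."""
    return 0.05 * x

x = 0.0
for _ in range(2000):
    # central-difference estimate of the local gradient
    grad = concentration(x + 1) - concentration(x - 1)
    # bias the step toward higher concentration, clipped to keep p in (0, 1)
    p_right = 0.5 + min(max(grad, -0.4), 0.4)
    x += 1 if rng.random() < p_right else -1
```

With a positive gradient the walker drifts steadily up the field, which is the chemotactic signature the full model resolves in a finite domain.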
Ostebee, Heath Michael; Ziminsky, Willy Steve; Johnson, Thomas Edward; Keener, Christopher Paul
2017-01-17
The present application provides a variable volume combustor for use with a gas turbine engine. The variable volume combustor may include a liner, a number of micro-mixer fuel nozzles positioned within the liner, and a linear actuator so as to maneuver the micro-mixer fuel nozzles axially along the liner.
Collective variables and dissipation
International Nuclear Information System (INIS)
Balian, R.
1984-09-01
This is an introduction to some basic concepts of non-equilibrium statistical mechanics. We emphasize in particular the relevant entropy relative to a given set of collective variables, the meaning of the projection method in the Liouville space, its use to establish the generalized transport equations for these variables, and the interpretation of dissipation in the framework of information theory
Variability: A Pernicious Hypothesis.
Noddings, Nel
1992-01-01
The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)
Reinforcing Saccadic Amplitude Variability
Paeye, Celine; Madelain, Laurent
2011-01-01
Saccadic endpoint variability is often viewed as the outcome of neural noise occurring during sensorimotor processing. However, part of this variability might result from operant learning. We tested this hypothesis by reinforcing dispersions of saccadic amplitude distributions, while maintaining constant their medians. In a first experiment we…
International Nuclear Information System (INIS)
Stairs, Allen
2007-01-01
Recent results by Paul Busch and Adan Cabello claim to show that by appealing to POVMs, non-contextual hidden variables can be ruled out in two dimensions. While the results of Busch and Cabello are mathematically correct, interpretive problems render them problematic as no-hidden-variables proofs
Interdependence Among Organizational Variables
Knowles, M. C.
1975-01-01
The interrelationship between a set of organizational variables was investigated at 14 work organizations within a company. The variables were production, quality, costs, job satisfaction of operatives, job satisfaction of supervisors, work anxiety, accidents, absence, labor turnover, and industrial unrest. (Author)
Inoue, Akiomi; Kawakami, Norito; Tsuchiya, Masao; Sakurai, Keiko; Hashimoto, Hideki
2010-01-01
The purpose of this study was to investigate the cross-sectional association of employment contract, company size, and occupation with psychological distress using a nationally representative sample of the Japanese population. From June through July 2007, a total of 9,461 male and 7,717 female employees living in the community were randomly selected and surveyed using a self-administered questionnaire and interview including questions about occupational class variables, psychological distress (K6 scale), treatment for mental disorders, and other covariates. Among males, part-time workers had a significantly higher prevalence of psychological distress than permanent workers. Among females, temporary/contract workers had a significantly higher prevalence of psychological distress than permanent workers. Among males, those who worked at companies with 300-999 employees had a significantly higher prevalence of psychological distress than those who worked at the smallest companies (with 1-29 employees). Company size was not significantly associated with psychological distress among females. Additionally, occupation was not significantly associated with psychological distress among males or females. Similar patterns were observed when the analyses were conducted for those who had psychological distress and/or received treatment for mental disorders. Working as part-time workers, for males, and as temporary/contract workers, for females, may be associated with poor mental health in Japan. No clear gradient in mental health along company size or occupation was observed in Japan.
More randomness from the same data
International Nuclear Information System (INIS)
Bancal, Jean-Daniel; Sheridan, Lana; Scarani, Valerio
2014-01-01
Correlations that cannot be reproduced with local variables certify the generation of private randomness. Usually, the violation of a Bell inequality is used to quantify the amount of randomness produced. Here, we show how private randomness generated during a Bell test can be directly quantified from the observed correlations, without the need to process these data into an inequality. The frequency with which the different measurement settings are used during the Bell test can also be taken into account. This improved analysis turns out to be very relevant for Bell tests performed with a finite collection efficiency. In particular, applying our technique to the data of a recent experiment (Christensen et al 2013 Phys. Rev. Lett. 111 130406), we show that about twice as much randomness as previously reported can be potentially extracted from this setup. (paper)
Describing temporal variability of the mean Estonian precipitation series in climate time scale
Post, P.; Kärner, O.
2009-04-01
Applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from nearly stationary longer-range variability. This indicates that several geophysical time series show short-range non-stationary behaviour and stationary behaviour over longer ranges (Davis et al., 1996). In order to model such series, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model properly the long-range tendencies. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable on scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). Temporal variability of the obtained daily time series is examined by means of an autoregressive integrated moving average (ARIMA) family model of the type (0,1,1). This model is applicable for simulating daily precipitation if an appropriate time step is selected, one that enables us to neglect the short-range non-stationary fluctuations. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability. Each ARIMA (0
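The (0,1,1) structure on a 30-day step can be sketched in a few lines. This is a hedged stand-in, not the authors' fit: synthetic gamma "precipitation" replaces the Estonian series, and a method-of-moments inversion of the MA(1) relation r₁ = −θ/(1+θ²) for the differenced series replaces a full likelihood fit:

```python
import numpy as np

rng = np.random.default_rng(0)
daily = rng.gamma(shape=0.8, scale=3.0, size=45 * 360)   # ~45 years of synthetic daily amounts
monthly = daily.reshape(-1, 30).sum(axis=1)              # aggregate to a 30-day time step

w = np.diff(monthly)                     # the "I" (differencing) part of ARIMA(0,1,1)
r1 = np.corrcoef(w[:-1], w[1:])[0, 1]    # lag-1 autocorrelation of the differences
r1 = float(np.clip(r1, -0.49, 0.49))     # an MA(1) can only attain |r1| < 0.5

# invert r1 = -theta / (1 + theta^2), taking the invertible root |theta| < 1
theta = (-1 + np.sqrt(1 - 4 * r1 * r1)) / (2 * r1)
```

For a near-memoryless aggregated series the differences are strongly negatively correlated at lag 1, so θ comes out close to the random walk plus noise limit — the behaviour the scale-break argument above predicts for the climate scale.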
Behavioral neurocardiac training in hypertension: a randomized, controlled trial.
Nolan, Robert P; Floras, John S; Harvey, Paula J; Kamath, Markad V; Picton, Peter E; Chessex, Caroline; Hiscock, Natalie; Powell, Jonathan; Catt, Michael; Hendrickx, Hilde; Talbot, Duncan; Chen, Maggie H
2010-04-01
It is not established whether behavioral interventions add benefit to pharmacological therapy for hypertension. We hypothesized that behavioral neurocardiac training (BNT) with heart rate variability biofeedback would reduce blood pressure further by modifying vagal heart rate modulation during reactivity and recovery from standardized cognitive tasks ("mental stress"). This randomized, controlled trial enrolled 65 patients with uncomplicated hypertension to BNT or active control (autogenic relaxation), with six 1-hour sessions over 2 months with home practice. Outcomes were analyzed with linear mixed models that adjusted for antihypertensive drugs. BNT reduced daytime and 24-hour systolic blood pressures (-2.4+/-0.9 mm Hg, P=0.009, and -2.1+/-0.9 mm Hg, P=0.03, respectively) and pulse pressures (-1.7+/-0.6 mm Hg, P=0.004, and -1.4+/-0.6 mm Hg, P=0.02, respectively). No effect was observed for controls (P>0.10 for all indices). BNT also increased RR-high-frequency power (0.15 to 0.40 Hz; P=0.01) and RR interval (P0.10). In contrast to relaxation therapy, BNT with heart rate variability biofeedback modestly lowers ambulatory blood pressure during wakefulness, and it augments tonic vagal heart rate modulation. It is unknown whether efficacy of this treatment can be improved with biofeedback of baroreflex gain. BNT, alone or as an adjunct to drug therapy, may represent a promising new intervention for hypertension.
A Note on the Correlated Random Coefficient Model
DEFF Research Database (Denmark)
Kolodziejczyk, Christophe
In this note we derive the bias of the OLS estimator for a correlated random coefficient model with one random coefficient that is correlated with a binary variable. We provide set-identification of the parameters of interest of the model. We also show how to reduce the bias of the estimator...
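The bias mechanism is easy to reproduce by simulation: when the individual coefficient bᵢ co-moves with the binary regressor dᵢ, OLS recovers E[b | d = 1] rather than the average coefficient E[b]. All coefficients below are illustrative, not the note's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

latent = rng.standard_normal(n)
d = (latent > 0).astype(float)                     # binary regressor
b = 1.0 + 0.8 * latent + 0.1 * rng.standard_normal(n)   # coefficient correlated with d
y = 0.5 + b * d + 0.1 * rng.standard_normal(n)

slope = np.cov(y, d)[0, 1] / np.var(d)   # OLS slope of y on d
mean_b = float(b.mean())                 # average coefficient E[b] = 1
```

Here E[b] = 1 but the OLS slope lands near E[b | d = 1] ≈ 1.64, an upward bias of over 60 percent.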
Strong result for real zeros of random algebraic polynomials
Directory of Open Access Journals (Sweden)
T. Uno
2001-01-01
Full Text Available An estimate is given for the lower bound of real zeros of random algebraic polynomials whose coefficients are non-identically distributed dependent Gaussian random variables. Moreover, our estimated measure of the exceptional set, which is independent of the degree of the polynomials, tends to zero as the degree of the polynomial tends to infinity.
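Results of this type have a simple numerical companion: counting real zeros of sampled random polynomials. The sketch below uses i.i.d. standard Gaussian coefficients — the paper's coefficients are dependent and non-identically distributed — for which the expected number of real zeros grows like (2/π) log n:

```python
import numpy as np

rng = np.random.default_rng(0)

def count_real_zeros(coeffs, tol=1e-6):
    """Count roots whose imaginary part is numerically zero."""
    roots = np.roots(coeffs)
    return int(np.sum(np.abs(roots.imag) < tol))

n = 50  # polynomial degree (illustrative)
counts = [count_real_zeros(rng.standard_normal(n + 1)) for _ in range(200)]
mean_real_zeros = float(np.mean(counts))
```

Even at degree 50 the average number of real zeros is only a few — the striking scarcity that lower-bound estimates like the one above quantify.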
On a randomly imperfect spherical cap pressurized by a random ...
African Journals Online (AJOL)
On a randomly imperfect spherical cap pressurized by a random dynamic load. ... In this paper, we investigate a dynamical system in a random setting of dual ... characterization of the random process for determining the dynamic buckling load ...
Energy Technology Data Exchange (ETDEWEB)
Jung, T.
2000-07-01
The North Atlantic oscillation (NAO) represents the dominant mode of atmospheric variability in the North Atlantic region and describes the strengthening and weakening of the midlatitude westerlies. In this study, variability of the NAO during wintertime and its relationship to the North Atlantic ocean and Arctic sea ice is investigated. For this purpose, observational data are analyzed along with integrations of models for the Atlantic ocean, Arctic sea ice, and the coupled global climate system. From a statistical point of view, the observed NAO index shows unusually high variance on interdecadal time scales during the 20th century. Variability on other time scales is consistent with realizations of random processes ("white noise"). Recurrence of wintertime NAO anomalies from winter to winter, with missing signals during the intervening non-winter seasons, is primarily associated with interdecadal variability of the NAO. This recurrence indicates that low-frequency changes of the NAO during the 20th century were in part externally forced. (orig.)
Illusory correlation: a function of availability or representativeness heuristics?
MacDonald, M G
2000-08-01
The present study sought to investigate the illusory correlation phenomenon by experimentally manipulating the availability of information through the use of the "lag" effect (Madigan, 1969). Seventy-four university students voluntarily participated in this study. Similar to Starr and Katkin's (1969) methodology, subjects were visually presented with each possible combination of four experimental problem descriptions and four sentence completions that were paired and shown twice at each of four lags (i.e., with 0, 2, 8 and 20 intervening variables). Subjects were required to make judgements concerning the frequency with which sentence completions and problem descriptions co-occurred. In agreement with previous research (Starr & Katkin, 1969), the illusory correlation effect was found for specific descriptions and sentence completions. Results also yielded a significant effect of lag for mean ratings between 0 and 2 lags; however, there was no reliable increase in judged co-occurrence at lags 8 and 20. Evidence failed to support the hypothesis that greater availability, through the experimental manipulation of lag, would result in increased frequency of co-occurrence judgements. Findings indicate that, in the present study, the illusory correlation effect is probably due to a situational bias based on the representativeness heuristic.
Blocked Randomization with Randomly Selected Block Sizes
Directory of Open Access Journals (Sweden)
Jimmy Efird
2010-12-01
Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
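The scheme the abstract describes is straightforward to implement: each block is a shuffled half-and-half mix of the two arms, and the block length is drawn at random so the sequence stays hard to predict. A minimal sketch — the arm labels, candidate block sizes, and uniform choice over sizes are illustrative assumptions:

```python
import random

def blocked_randomization(n, block_sizes=(2, 4, 6), seed=0):
    """Allocation sequence of length n using randomly selected block sizes."""
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n:
        size = rng.choice(block_sizes)              # random (even) block size
        block = ["T"] * (size // 2) + ["C"] * (size // 2)
        rng.shuffle(block)                          # balance within the block
        sequence.extend(block)
    return sequence[:n]

allocation = blocked_randomization(24)
```

Every completed block is exactly balanced, so the arm counts can differ by at most half of the final (possibly truncated) block, while an unblinded investigator can no longer infer the last assignment of a block from its position.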
The effects of variable practice on locomotor adaptation to a novel asymmetric gait.
Hinkel-Lipsker, Jacob W; Hahn, Michael E
2017-09-01
Very little is known about the effects of specific practice on motor learning of predictive balance control during novel bipedal gait. This information could provide insight into how the direction and magnitude of predictive errors during acquisition of a novel gait task influence transfer of balance control, as well as yield a practice protocol for the restoration of balance for those with locomotor impairments. This study examined the effect of a variable practice paradigm on transfer of a novel asymmetric gait pattern in able-bodied individuals. Using a split-belt treadmill, one limb was driven at a constant velocity (constant limb) and the other underwent specific changes in velocity (variable limb) during practice according to one of three prescribed practice paradigms: serial, where the variable limb velocity increased linearly; random blocked, where the variable limb underwent random belt velocity changes every 20 strides; and random, where the variable limb underwent random step-to-step changes in velocity. Random practice showed the highest balance control variability during acquisition compared to serial and random blocked practice; random blocked practice demonstrated the best transfer of balance control on one transfer test. Both random and random blocked practice showed significantly less balance control variability during a second transfer test compared to serial practice. These results indicate that random blocked practice may be best for generalizability of balance control while learning a novel gait, perhaps indicating that individuals who underwent this practice paradigm were able to find the most optimal balance control solution during practice.
20 CFR 266.7 - Accountability of a representative payee.
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Accountability of a representative payee. 266.7 Section 266.7 Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD RETIREMENT ACT REPRESENTATIVE PAYMENT § 266.7 Accountability of a representative payee. (a) A representative...
A simplified method for random vibration analysis of structures with random parameters
International Nuclear Information System (INIS)
Ghienne, Martin; Blanzé, Claude
2016-01-01
Piezoelectric patches with adapted electrical circuits or viscoelastic dissipative materials are two solutions particularly adapted to reduce vibration of light structures. To accurately design these solutions, it is necessary to describe precisely the dynamical behaviour of the structure. It may quickly become computationally intensive to describe this behaviour robustly for a structure with nonlinear phenomena, such as contact or friction for bolted structures, and uncertain variations of its parameters. The aim of this work is to propose a non-intrusive reduced stochastic method to characterize robustly the vibrational response of a structure with random parameters. Our goal is to characterize the eigenspace of linear systems with dynamic properties considered as random variables. This method is based on a separation of random aspects from deterministic aspects and allows us to estimate the first central moments of each random eigenfrequency with a single deterministic finite element computation. The method is applied to a frame with several Young's moduli modeled as random variables. This example could be expanded to a bolted structure including piezoelectric devices. The method needs to be enhanced when random eigenvalues are closely spaced. An indicator with no additional computational cost is proposed to characterize the "proximity" of two random eigenvalues. (paper)
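The brute-force baseline the method above is designed to avoid is plain Monte Carlo over the random parameter. A toy version on a two-degree-of-freedom spring-mass chain with a random stiffness (standing in, hypothetically, for the frame's random Young's moduli; all parameter values invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def eigenfrequencies(k, m=1.0):
    """Eigenfrequencies (rad/s) of a 2-DOF chain with stiffness k."""
    K = np.array([[2 * k, -k], [-k, 2 * k]])   # chain stiffness matrix
    M = np.eye(2) * m
    lam = np.linalg.eigvalsh(np.linalg.inv(M) @ K)   # eigenvalues k and 3k
    return np.sqrt(lam)

# Monte Carlo over the random stiffness k ~ N(100, 5^2)
samples = np.array([eigenfrequencies(k) for k in rng.normal(100.0, 5.0, size=2000)])
mean_freqs = samples.mean(axis=0)   # first moments of each random eigenfrequency
std_freqs = samples.std(axis=0)
```

Each moment estimate here costs thousands of deterministic solves; the non-intrusive method in the paper targets the same first central moments from a single one.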
Nonlinear deterministic structures and the randomness of protein sequences
Huang Yan Zhao
2003-01-01
To clarify the randomness of protein sequences, we make a detailed analysis of a set of typical protein sequences representing each structural class by using a nonlinear prediction method. No deterministic structures are found in these protein sequences, which implies that they behave as random sequences. We also offer an explanation for the controversial results obtained in previous investigations.
DEFF Research Database (Denmark)
Burgess, Stephen; Thompson, Simon G; Thompson, Grahame
2010-01-01
Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context o...
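The instrumental-variable logic lends itself to a short simulation. In the hedged sketch below (all coefficients invented), a genetic marker g shifts the phenotype x but affects the outcome y only through x, so the Wald ratio recovers the causal effect that confounded OLS overstates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

g = rng.binomial(2, 0.3, size=n).astype(float)   # genotype: 0/1/2 risk alleles
u = rng.standard_normal(n)                       # unobserved confounder
x = 0.5 * g + u + rng.standard_normal(n)         # phenotype
y = 0.7 * x + u + rng.standard_normal(n)         # outcome; true causal effect 0.7

ols = np.cov(y, x)[0, 1] / np.var(x)             # biased upward by the confounder u
# Wald (ratio) estimator: regression of y on g over regression of x on g
wald = (np.cov(y, g)[0, 1] / np.var(g)) / (np.cov(x, g)[0, 1] / np.var(g))
```

Because g is randomized at conception and independent of u, the ratio estimate sits near the true 0.7 while OLS does not — the "analogous to randomization in a clinical trial" point made above.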