WorldWideScience

Sample records for vector-valued random processes

  1. A study of biorthogonal multiple vector-valued wavelets

    International Nuclear Information System (INIS)

    Han Jincang; Cheng Zhengxing; Chen Qingjiang

    2009-01-01

    The notion of vector-valued multiresolution analysis is introduced, together with the concept of biorthogonal multiple vector-valued wavelets, i.e. wavelets for vector fields. It is proved that, as in the scalar and multiwavelet cases, the existence of a pair of biorthogonal multiple vector-valued scaling functions guarantees the existence of a pair of biorthogonal multiple vector-valued wavelet functions. An algorithm for constructing a class of compactly supported biorthogonal multiple vector-valued wavelets is presented. Their properties are investigated by means of operator theory, algebra theory, and time-frequency analysis. Several biorthogonality formulas regarding these wavelet packets are obtained.

  2. Noncommutative and vector-valued Rosenthal inequalities

    NARCIS (Netherlands)

    Dirksen, S.

    2011-01-01

    This thesis is dedicated to the study of a class of probabilistic inequalities, called Rosenthal inequalities. These inequalities provide two-sided estimates for the p-th moments of the sum of a sequence of independent, mean zero random variables in terms of a suitable norm on the sequence itself.
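    In the scalar case, the inequality in question can be stated as follows (a standard formulation quoted for orientation, not taken from the thesis itself): for p ≥ 2 and independent, mean-zero random variables X_1, …, X_n,

```latex
\mathbb{E}\left|\sum_{k=1}^{n} X_k\right|^{p}
\;\asymp_{p}\;
\max\left\{ \sum_{k=1}^{n} \mathbb{E}|X_k|^{p},\;
\left( \sum_{k=1}^{n} \mathbb{E}X_k^{2} \right)^{p/2} \right\},
```

    with constants depending only on p. The noncommutative and vector-valued versions studied in the thesis replace the right-hand side by a suitable norm on the sequence itself.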

  3. A Campbell random process

    International Nuclear Information System (INIS)

    Reuss, J.D.; Misguich, J.H.

    1993-02-01

    The Campbell process is a stationary random process whose correlation function can be varied through the choice of an elementary response function. The statistical properties of this process are presented. A numerical algorithm and a subroutine for generating such a process are built up and tested for the physically interesting case of a Campbell process with Gaussian correlations. The (non-Gaussian) probability distribution appears to be similar to the Gamma distribution.
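    A Campbell (shot-noise) process can be generated by superposing copies of the response function at the points of a Poisson process. The sketch below is in Python rather than the subroutine the abstract refers to, and the rate and pulse width are illustrative choices, not values from the paper:

```python
import numpy as np

def campbell_process(t_grid, rate, response, t_min, t_max, rng=None):
    """Sample a Campbell process X(t) = sum_k h(t - t_k), where the
    arrival times t_k form a homogeneous Poisson process on [t_min, t_max]."""
    rng = np.random.default_rng(rng)
    n = rng.poisson(rate * (t_max - t_min))       # Poisson number of arrivals
    arrivals = rng.uniform(t_min, t_max, size=n)  # arrival times
    return sum(response(t_grid - tk) for tk in arrivals)

# Gaussian response function -> Gaussian-shaped correlation function
tau = 1.0
h = lambda t: np.exp(-t**2 / (2 * tau**2))

t = np.linspace(0, 100, 2001)
# Pad the arrival window so edge effects on [0, 100] are negligible
x = campbell_process(t, rate=5.0, response=h, t_min=-10, t_max=110, rng=0)

# Campbell's theorem: E[X] = rate * integral of h = rate * tau * sqrt(2*pi)
print(x.mean(), 5.0 * tau * np.sqrt(2 * np.pi))
```

    Campbell's theorem gives a quick sanity check: the sample mean should be close to the rate times the integral of the response function.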

  4. Some New Lacunary Strong Convergent Vector-Valued Sequence Spaces

    Directory of Open Access Journals (Sweden)

    M. Mursaleen

    2014-01-01

    Full Text Available We introduce some vector-valued sequence spaces defined by a Musielak-Orlicz function and the concepts of lacunary convergence and strong (A)-convergence, where A=(a_ik) is an infinite matrix of complex numbers. We also make an effort to study some topological properties and some inclusion relations between these spaces.

  5. Some New Lacunary Strong Convergent Vector-Valued Sequence Spaces

    OpenAIRE

    Mursaleen, M.; Alotaibi, A.; Sharma, Sunil K.

    2014-01-01

    We introduce some vector-valued sequence spaces defined by a Musielak-Orlicz function and the concepts of lacunary convergence and strong ( $A$ )-convergence, where $A=({a}_{ik})$ is an infinite matrix of complex numbers. We also make an effort to study some topological properties and some inclusion relations between these spaces.

  6. Some BMO estimates for vector-valued multilinear singular integral ...

    Indian Academy of Sciences (India)


    …the multilinear operator related to some singular integral operators is obtained. The main purpose of this paper is to establish the BMO end-point estimates for some vector-valued multilinear operators related to certain singular integral operators. First, let us introduce some notation [10,16]. Throughout this paper, Q = Q(x,r).

  7. Isometric multipliers of a vector valued Beurling algebra on a ...

    Indian Academy of Sciences (India)

    Isometric multipliers of a vector valued Beurling algebra on a discrete semigroup. Research Article, Proceedings – Mathematical Sciences, Volume 127, Issue 1, February 2017, pp 109- ... Keywords: weighted semigroup; multipliers of a semigroup; Beurling algebra; isometric multipliers.

  8. Multiview vector-valued manifold regularization for multilabel image classification.

    Science.gov (United States)

    Luo, Yong; Tao, Dacheng; Xu, Chang; Xu, Chao; Liu, Hong; Wen, Yonggang

    2013-05-01

    In computer vision, image datasets used for classification are naturally associated with multiple labels and comprised of multiple views, because each image may contain several objects (e.g., pedestrian, bicycle, and tree) and is properly characterized by multiple visual features (e.g., color, texture, and shape). Currently available tools ignore either the label relationship or the view complementarity. Motivated by the success of vector-valued functions that construct matrix-valued kernels to explore the multilabel structure in the output space, we introduce multiview vector-valued manifold regularization (MV(3)MR) to integrate multiple features. MV(3)MR exploits the complementary properties of different features and discovers the intrinsic local geometry of the compact support shared by different features under the theme of manifold regularization. We conduct extensive experiments on two challenging but popular datasets, PASCAL VOC'07 and MIR Flickr, and validate the effectiveness of the proposed MV(3)MR for image classification.

  9. Extensions of vector-valued functions with preservation of derivatives

    Czech Academy of Sciences Publication Activity Database

    Koc, M.; Kolář, Jan

    2017-01-01

    Roč. 449, č. 1 (2017), s. 343-367 ISSN 0022-247X R&D Projects: GA ČR(CZ) GA14-07880S Institutional support: RVO:67985840 Keywords : vector-valued differentiable functions * extensions * strict differentiability * partitions of unity Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.064, year: 2016 http://www.sciencedirect.com/science/article/pii/S0022247X16307703

  10. On the uncertainty relations for vector-valued operators

    International Nuclear Information System (INIS)

    Chistyakov, A.L.

    1976-01-01

    In analogy with the expression of the Heisenberg uncertainty principle in terms of dispersions by means of the Weyl inequality for one-dimensional quantum-mechanical quantities, the principle for many-dimensional quantities can be expressed in terms of generalized dispersions and covariance matrices by means of inequalities similar to the Weyl inequality. The proofs of these inequalities are given in an abstract form, not only for physical vector quantities but also for arbitrary vector-valued operators with commuting self-adjoint components.
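    The scalar inequality being generalized is the standard Weyl–Robertson form (quoted here for orientation, not from the paper): for self-adjoint operators A and B with dispersions σ_A², σ_B²,

```latex
\sigma_A^{2}\,\sigma_B^{2} \;\ge\; \tfrac{1}{4}\,\bigl|\langle [A,B] \rangle\bigr|^{2},
```

    and the many-dimensional versions replace the dispersions by generalized dispersions and covariance matrices of the vector-valued components.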

  11. Construction and decomposition of biorthogonal vector-valued wavelets with compact support

    International Nuclear Information System (INIS)

    Chen Qingjiang; Cao Huaixin; Shi Zhi

    2009-01-01

    In this article, we introduce vector-valued multiresolution analysis and four-scale biorthogonal vector-valued wavelets. The existence of a class of compactly supported biorthogonal vector-valued wavelets associated with a pair of compactly supported biorthogonal vector-valued scaling functions is discussed. A method for designing a class of four-scale compactly supported biorthogonal vector-valued wavelets is proposed by virtue of multiresolution analysis and matrix theory. The biorthogonality properties of vector-valued wavelet packets are characterized with the aid of time-frequency analysis and operator theory. Three biorthogonality formulas regarding them are presented.

  12. Random processes in nuclear reactors

    CERN Document Server

    Williams, M M R

    1974-01-01

    Random Processes in Nuclear Reactors describes the problems a nuclear engineer may meet that involve random fluctuations, and sets out in detail how they may be interpreted in terms of various models of the reactor system. Chapters discuss the origins of random processes and sources; the general technique for zero-power problems, bringing out the basic effects of fission, and of fluctuations in the lifetime of neutrons, on the measured response; the interpretation of power reactor noise; and associated problems connected with mechanical, hydraulic and thermal noise sources.

  13. Vector-valued measure and the necessary conditions for the optimal control problems of linear systems

    International Nuclear Information System (INIS)

    Xunjing, L.

    1981-12-01

    The vector-valued measure defined by well-posed linear boundary value problems is discussed. The maximum principle for the optimal control problem with a non-convex constraint is proved by using the vector-valued measure. In particular, the necessary conditions for the optimal control of elliptic systems are derived without convexity of the control domain and the cost function. (author)

  14. A signal theoretic introduction to random processes

    CERN Document Server

    Howard, Roy M

    2015-01-01

    A fresh introduction to random processes utilizing signal theory. By incorporating a signal theory basis, A Signal Theoretic Introduction to Random Processes presents a unique introduction to random processes with an emphasis on the important random phenomena encountered in electronic and communications engineering. The strong mathematical and signal theory basis provides clarity and precision in the statement of results. The book also features: a coherent account of the mathematical fundamentals and signal theory that underpin the presented material; unique, in-depth coverage of

  15. On the Stone-Weierstrass theorem for scalar and vector valued functions

    International Nuclear Information System (INIS)

    Khan, L.A.

    1991-09-01

    In this paper we discuss the formulation of the Stone-Weierstrass approximation theorem for vector-valued functions and then determine whether the classical Stone-Weierstrass theorem for scalar-valued functions can be deduced from the above one. We also state some open problems in this area. (author). 15 refs

  16. Positive solutions for a nonlocal boundary-value problem with vector-valued response

    Directory of Open Access Journals (Sweden)

    Andrzej Nowakowski

    2002-05-01

    Full Text Available Using variational methods, we study the existence of positive solutions for a nonlocal boundary-value problem with vector-valued response. We develop duality and variational principles for this problem and present a numerical version which enables the approximation of solutions and gives a measure of the duality gap between the primal and dual functionals for approximate solutions of this problem.

  17. A Riesz Representation Theorem for the Space of Henstock Integrable Vector-Valued Functions

    Directory of Open Access Journals (Sweden)

    Tomás Pérez Becerra

    2018-01-01

    Full Text Available Using a bounded bilinear operator, we define the Henstock-Stieltjes integral for vector-valued functions; we prove some integration-by-parts theorems for the Henstock integral and a Riesz-type theorem which provides an alternative proof of the representation theorem for real functions proved by Alexiewicz.

  18. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  19. Pseudo random signal processing theory and application

    CERN Document Server

    Zepernick, Hans-Jurgen

    2013-01-01

    In recent years, pseudo random signal processing has proven to be a critical enabler of modern communication, information, security and measurement systems. The signal's pseudo random, noise-like properties make it vitally important as a tool for protecting against interference, alleviating multipath propagation and allowing the potential of sharing bandwidth with other users. Taking a practical approach to the topic, this text provides a comprehensive and systematic guide to understanding and using pseudo random signals. Covering theoretical principles, design methodologies and applications

  20. Elements of random walk and diffusion processes

    CERN Document Server

    Ibe, Oliver C

    2013-01-01

    Presents an important and unique introduction to random walk theory Random walk is a stochastic process that has proven to be a useful model in understanding discrete-state discrete-time processes across a wide spectrum of scientific disciplines. Elements of Random Walk and Diffusion Processes provides an interdisciplinary approach by including numerous practical examples and exercises with real-world applications in operations research, economics, engineering, and physics. Featuring an introduction to powerful and general techniques that are used in the application of physical and dynamic
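    The simple symmetric random walk described above is easy to simulate directly; the following sketch (an illustration, not code from the book) checks its two basic moments, E[S_n] = 0 and Var(S_n) = n:

```python
import numpy as np

def random_walk(n_steps, n_paths, rng=None):
    """Simulate simple symmetric random walks S_n = X_1 + ... + X_n
    with i.i.d. steps X_i = +/-1, each with probability 1/2."""
    rng = np.random.default_rng(rng)
    steps = rng.choice([-1, 1], size=(n_paths, n_steps))
    return steps.cumsum(axis=1)   # row k holds one sample path S_1, ..., S_n

paths = random_walk(n_steps=1000, n_paths=5000, rng=0)
# For the symmetric walk: E[S_n] = 0 and Var(S_n) = n
print(paths[:, -1].mean(), paths[:, -1].var())
```

    The empirical mean of S_1000 over 5000 paths should be near 0 and the empirical variance near 1000, matching the diffusive scaling discussed in the book.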

  1. Extensions of vector-valued Baire one functions with preservation of points of continuity

    Czech Academy of Sciences Publication Activity Database

    Koc, M.; Kolář, Jan

    2016-01-01

    Roč. 442, č. 1 (2016), s. 138-148 ISSN 0022-247X R&D Projects: GA ČR(CZ) GA14-07880S Institutional support: RVO:67985840 Keywords : vector-valued Baire one functions * extensions * non-tangential limit * continuity points Subject RIV: BA - General Mathematics Impact factor: 1.064, year: 2016 http://www.sciencedirect.com/science/article/pii/S0022247X1630097X

  2. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  3. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

    We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q1, q2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q1 > 1 and q2 > 1.
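    A toy illustration of the fixed-ratio regime N/T (not the FRV calculus itself): for i.i.d. data, the trivial limiting case of these VARMA-type processes, the eigenvalues of the sample covariance matrix fill the Marchenko–Pastur interval with edges (1 ± √(N/T))²:

```python
import numpy as np

# N variables observed for T time steps, with r = N/T fixed as both grow.
rng = np.random.default_rng(0)
N, T = 200, 800                  # r = N/T = 0.25
X = rng.standard_normal((N, T))  # i.i.d. data (no VARMA correlations)
C = X @ X.T / T                  # sample covariance matrix
eig = np.linalg.eigvalsh(C)

r = N / T
# Marchenko-Pastur spectral edges for the uncorrelated case
print(eig.min(), eig.max(), (1 - r**0.5) ** 2, (1 + r**0.5) ** 2)
```

    With genuine VMA/VAR/VARMA correlations the spectrum deforms away from Marchenko–Pastur, and the FRV calculus in the paper is what computes the resulting density.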

  4. 2013 CIME Course Vector-valued Partial Differential Equations and Applications

    CERN Document Server

    Marcellini, Paolo

    2017-01-01

    Collating different aspects of Vector-valued Partial Differential Equations and Applications, this volume is based on the 2013 CIME Course with the same name which took place at Cetraro, Italy, under the scientific direction of John Ball and Paolo Marcellini. It contains the following contributions: The pullback equation (Bernard Dacorogna), The stability of the isoperimetric inequality (Nicola Fusco), Mathematical problems in thin elastic sheets: scaling limits, packing, crumpling and singularities (Stefan Müller), and Aspects of PDEs related to fluid flows (Vladimir Sverák). These lectures are addressed to graduate students and researchers in the field.

  5. A representation result for hysteresis operators with vector valued inputs and its application to models for magnetic materials

    Energy Technology Data Exchange (ETDEWEB)

    Klein, Olaf, E-mail: Olaf.Klein@wias-berlin.de

    2014-02-15

    In this work, hysteresis operators mapping continuous vector-valued input functions that are piecewise monotaffine, i.e. piecewise the composition of a monotone with an affine function, to vector-valued output functions are considered. It is shown that the operator can be generated by a uniquely defined function on the convexity triple free strings. A formulation of a congruence property for periodic inputs is presented and reformulated as a condition on the generating string function.

  6. Provable quantum advantage in randomness processing

    OpenAIRE

    Dale, H; Jennings, D; Rudolph, T

    2015-01-01

    Quantum advantage is notoriously hard to find and even harder to prove. For example, the class of functions computable with classical physics exactly coincides with the class computable quantum-mechanically. It is strongly believed, but not proven, that quantum computing provides exponential speed-up for a range of problems, such as factoring. Here we address a computational scenario of "randomness processing" in which quantum theory provably yields, not only resource reduction over c...

  7. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * A good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Comprehensive choice of topics within the area of probability * Ample homework problems organized into chapter sections

  8. Multi-fidelity Gaussian process regression for prediction of random fields

    Energy Technology Data Exchange (ETDEWEB)

    Parussini, L. [Department of Engineering and Architecture, University of Trieste (Italy); Venturi, D., E-mail: venturi@ucsc.edu [Department of Applied Mathematics and Statistics, University of California Santa Cruz (United States); Perdikaris, P. [Department of Mechanical Engineering, Massachusetts Institute of Technology (United States); Karniadakis, G.E. [Division of Applied Mathematics, Brown University (United States)

    2017-05-01

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.

  9. Multi-fidelity Gaussian process regression for prediction of random fields

    International Nuclear Information System (INIS)

    Parussini, L.; Venturi, D.; Perdikaris, P.; Karniadakis, G.E.

    2017-01-01

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.

  10. Asymptotic theory of weakly dependent random processes

    CERN Document Server

    Rio, Emmanuel

    2017-01-01

    Presenting tools to aid understanding of asymptotic theory and weakly dependent processes, this book is devoted to inequalities and limit theorems for sequences of random variables that are strongly mixing in the sense of Rosenblatt, or absolutely regular. The first chapter introduces covariance inequalities under strong mixing or absolute regularity. These covariance inequalities are applied in Chapters 2, 3 and 4 to moment inequalities, rates of convergence in the strong law, and central limit theorems. Chapter 5 concerns coupling. In Chapter 6 new deviation inequalities and new moment inequalities for partial sums via the coupling lemmas of Chapter 5 are derived and applied to the bounded law of the iterated logarithm. Chapters 7 and 8 deal with the theory of empirical processes under weak dependence. Lastly, Chapter 9 describes links between ergodicity, return times and rates of mixing in the case of irreducible Markov chains. Each chapter ends with a set of exercises. The book is an updated and extended ...

  11. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete-time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure-theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  12. A Parallel Framework with Block Matrices of a Discrete Fourier Transform for Vector-Valued Discrete-Time Signals

    Directory of Open Access Journals (Sweden)

    Pablo Soto-Quiros

    2015-01-01

    Full Text Available This paper presents a parallel implementation of a kind of discrete Fourier transform (DFT): the vector-valued DFT. The vector-valued DFT is a novel tool to analyze the spectra of vector-valued discrete-time signals. This parallel implementation is developed in terms of a mathematical framework with a set of block matrix operations. These block matrix operations contribute to the analysis, design, and implementation of parallel algorithms on multicore processors. In this work, an implementation and experimental investigation of the mathematical framework are performed using MATLAB with the Parallel Computing Toolbox. We found that there is an advantage to using multicore processors and a parallel computing environment to reduce the high execution time. Additionally, the speedup increases as the number of logical processors and the length of the signal increase.
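    A minimal NumPy sketch of a DFT for vector-valued signals, assuming the componentwise definition, i.e. the block-matrix operator F_N ⊗ I_m (which may differ in detail from the paper's exact construction):

```python
import numpy as np

def vector_valued_dft(x):
    """DFT of a vector-valued signal x of shape (N, m): each of the m
    components is transformed along the time axis. Equivalent to applying
    the block matrix F_N (Kronecker product) I_m to the stacked signal."""
    return np.fft.fft(x, axis=0)

# Compare against the explicit block-matrix formulation (F_N kron I_m)
N, m = 8, 3
rng = np.random.default_rng(0)
x = rng.standard_normal((N, m)) + 1j * rng.standard_normal((N, m))

F = np.fft.fft(np.eye(N))           # N x N DFT matrix
block = np.kron(F, np.eye(m))       # block-matrix operator of size Nm x Nm
y_block = (block @ x.reshape(N * m)).reshape(N, m)

print(np.allclose(vector_valued_dft(x), y_block))  # the two formulations agree
```

    The block-matrix form is what lends itself to the parallel decomposition described in the paper: each block row can be computed independently on a separate core.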

  13. Investigating Efficiency of Vector-Valued Intensity Measures in Seismic Demand Assessment of Concrete Dams

    Directory of Open Access Journals (Sweden)

    Mohammad Alembagheri

    2018-01-01

    Full Text Available The efficiency of vector-valued intensity measures for predicting the seismic demand in gravity dams is investigated. The Folsom gravity dam-reservoir coupled system is selected and numerically analyzed under a set of two hundred actual ground motions. First, the well-defined scalar IMs are separately investigated, and then they are coupled to form two-parameter vector IMs. After that, IMs consisting of the spectral acceleration at the first-mode natural period of the dam-reservoir system along with a measure of the spectral shape (the ratio of the spectral acceleration at a second period to the first-mode spectral acceleration value) are considered. It is attempted to determine the optimal second period by categorizing the spectral acceleration at the first-mode period of vibration. The efficiency of the proposed vector IMs is compared with that of scalar ones considering various structural responses as EDPs. Finally, the probabilistic seismic behavior of the dam is investigated by calculating its fragility curves employing scalar and vector IMs, considering the effect of zero response values.

  14. Learn the Lagrangian: A Vector-Valued RKHS Approach to Identifying Lagrangian Systems.

    Science.gov (United States)

    Cheng, Ching-An; Huang, Han-Pang

    2016-12-01

    We study the modeling of Lagrangian systems with multiple degrees of freedom. Based on system dynamics, canonical parametric models require ad hoc derivations and sometimes simplification for a computable solution; on the other hand, due to the lack of prior knowledge in the system's structure, modern nonparametric models in machine learning face the curse of dimensionality, especially in learning large systems. In this paper, we bridge this gap by unifying the theories of Lagrangian systems and vector-valued reproducing kernel Hilbert spaces. We reformulate Lagrangian systems with kernels that embed the governing Euler-Lagrange equation-the Lagrangian kernels-and show that these kernels span a subspace capturing the Lagrangian's projection as inverse dynamics. By such property, our model uses only inputs and outputs as in machine learning and inherits the structured form as in system dynamics, thereby removing the need for mundane derivations for new systems as well as the generalization problem in learning from scratch. In effect, it learns the system's Lagrangian, a simpler task than directly learning the dynamics. To demonstrate, we applied the proposed kernel to identify the robot inverse dynamics in simulations and experiments. Our results present a competitive novel approach to identifying Lagrangian systems, despite using only inputs and outputs.

  15. Traffic and random processes an introduction

    CERN Document Server

    Mauro, Raffaele

    2015-01-01

    This book deals in a basic and systematic manner with the fundamentals of random function theory, and at the same time looks at some aspects related to arrival, vehicle headway and operational speed processes. The work serves as a useful practical and educational tool and aims at providing stimulus and motivation to investigate issues of such a strong applicative interest. It has a clearly discursive and concise structure, in which numerical examples are given to clarify the applications of the suggested theoretical model. Some statistical characterizations are fully developed in order to illustrate the peculiarities of specific modeling approaches; finally, there is a useful bibliography for in-depth thematic analysis.

  16. UNDERSTANDING SEVERE WEATHER PROCESSES THROUGH SPATIOTEMPORAL RELATIONAL RANDOM FORESTS

    Data.gov (United States)

    National Aeronautics and Space Administration — UNDERSTANDING SEVERE WEATHER PROCESSES THROUGH SPATIOTEMPORAL RELATIONAL RANDOM FORESTS AMY MCGOVERN, TIMOTHY SUPINIE, DAVID JOHN GAGNE II, NATHANIEL TROUTMAN,...

  17. A Computerized Approach to Trickle-Process, Random Assignment.

    Science.gov (United States)

    Braucht, G. Nicholas; Reichardt, Charles S.

    1993-01-01

    Procedures for implementing random assignment with trickle processing and ways they can be corrupted are described. A computerized method for implementing random assignment with trickle processing is presented as a desirable alternative in many situations and a way of protecting against threats to assignment validity. (SLD)
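    The abstract does not reproduce the procedure itself; the sketch below shows one common way to computerize random assignment under trickle processing, permuted-block randomization (an illustrative choice, not necessarily the authors' method), which keeps group sizes balanced as participants arrive one at a time:

```python
import random

class TrickleAssigner:
    """Random assignment for participants who arrive one at a time
    ('trickle' processing). Permuted-block randomization keeps group
    sizes balanced while keeping the next assignment unpredictable."""
    def __init__(self, conditions, block_size=None, seed=None):
        self.conditions = list(conditions)
        self.block_size = block_size or 2 * len(self.conditions)
        assert self.block_size % len(self.conditions) == 0
        self.rng = random.Random(seed)
        self._block = []

    def assign(self, participant_id):
        if not self._block:   # exhausted: start a new shuffled block
            reps = self.block_size // len(self.conditions)
            self._block = self.conditions * reps
            self.rng.shuffle(self._block)
        return participant_id, self._block.pop()

assigner = TrickleAssigner(["treatment", "control"], seed=42)
assignments = [assigner.assign(pid) for pid in range(20)]
counts = {c: sum(1 for _, g in assignments if g == c)
          for c in ("treatment", "control")}
print(counts)   # exactly balanced after every complete block
```

    Fixing the seed and logging each assignment as it is made are the kinds of protections against corrupted assignment that the article discusses.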

  18. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  19. Discrete random signal processing and filtering primer with Matlab

    CERN Document Server

    Poularikas, Alexander D

    2013-01-01

    Engineers in all fields will appreciate a practical guide that combines several new effective MATLAB® problem-solving approaches and the very latest in discrete random signal processing and filtering.Numerous Useful Examples, Problems, and Solutions - An Extensive and Powerful ReviewWritten for practicing engineers seeking to strengthen their practical grasp of random signal processing, Discrete Random Signal Processing and Filtering Primer with MATLAB provides the opportunity to doubly enhance their skills. The author, a leading expert in the field of electrical and computer engineering, offe

  20. Level sets and extrema of random processes and fields

    CERN Document Server

    Azais, Jean-Marc

    2009-01-01

    A timely and comprehensive treatment of random field theory with applications across diverse areas of study Level Sets and Extrema of Random Processes and Fields discusses how to understand the properties of the level sets of paths as well as how to compute the probability distribution of its extremal values, which are two general classes of problems that arise in the study of random processes and fields and in related applications. This book provides a unified and accessible approach to these two topics and their relationship to classical theory and Gaussian processes and fields, and the most modern research findings are also discussed. The authors begin with an introduction to the basic concepts of stochastic processes, including a modern review of Gaussian fields and their classical inequalities. Subsequent chapters are devoted to Rice formulas, regularity properties, and recent results on the tails of the distribution of the maximum. Finally, applications of random fields to various areas of mathematics a...
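    The Rice formulas mentioned above are best known in the following special case (stated here for orientation, not quoted from the book): for a stationary, a.s. differentiable, mean-zero Gaussian process with spectral moments λ₀ = Var X(t) and λ₂ = Var X′(t), the expected number of upcrossings of the level u in [0, T] is

```latex
\mathbb{E}\,U_u(T) \;=\; \frac{T}{2\pi}\sqrt{\frac{\lambda_2}{\lambda_0}}\,
\exp\!\left(-\frac{u^{2}}{2\lambda_0}\right).
```

    The book develops such formulas in much greater generality, including for random fields.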

  1. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaard

    with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt,Yt) which converges towards the distribution of (X,Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking...

  2. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  3. Renewal theory for perturbed random walks and similar processes

    CERN Document Server

    Iksanov, Alexander

    2016-01-01

    This book offers a detailed review of perturbed random walks, perpetuities, and random processes with immigration. Being of major importance in modern probability theory, both theoretical and applied, these objects have been used to model various phenomena in the natural sciences as well as in insurance and finance. The book also presents the many significant results and efficient techniques and methods that have been worked out in the last decade. The first chapter is devoted to perturbed random walks and discusses their asymptotic behavior and various functionals pertaining to them, including supremum and first-passage time. The second chapter examines perpetuities, presenting results on continuity of their distributions and the existence of moments, as well as weak convergence of divergent perpetuities. Focusing on random processes with immigration, the third chapter investigates the existence of moments, describes long-time behavior and discusses limit theorems, both with and without scaling. Chapters fou...

  4. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    Science.gov (United States)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) has the same distribution as the random sum Σ_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0,1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
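
The first identity can be checked numerically: a Poisson process of rate α subordinated to an independent Poisson process of rate β has the same law as a random sum of N_β(t) iid Poisson(α) terms. A minimal Monte Carlo sketch (rates and sample sizes are arbitrary choices, not from the paper):

```python
import math
import random

random.seed(1)

alpha, beta, t, trials = 2.0, 3.0, 1.0, 100_000

def poisson(lam):
    """Knuth's method: count uniforms until their product drops below exp(-lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Left side: outer Poisson process evaluated at the inner one, N_a(N_b(t))
lhs = [poisson(alpha * poisson(beta * t)) for _ in range(trials)]

# Right side: random sum of N_b(t) iid Poisson(alpha) variables
rhs = [sum(poisson(alpha) for _ in range(poisson(beta * t)))
       for _ in range(trials)]

mean_lhs = sum(lhs) / trials
mean_rhs = sum(rhs) / trials
print(mean_lhs, mean_rhs)  # both should be close to alpha * beta * t = 6.0
```

Both sample means should agree with the common theoretical mean αβt; matching higher moments would probe the distributional identity further.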

  5. Money creation process in a random redistribution model

    Science.gov (United States)

    Chen, Siyan; Wang, Yougui; Li, Keqiang; Wu, Jinshan

    2014-01-01

    In this paper, the dynamical process of money creation in a random exchange model with debt is investigated. The money creation kinetics are analyzed by both the money-transfer matrix method and the diffusion method. From both approaches, we attain the same conclusion: the source of money creation in the case of random exchange is the agents with neither money nor debt. These analytical results are demonstrated by computer simulations.
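
A minimal sketch of a random-exchange economy with debt (the rules and parameters here are illustrative, not the exact model of the paper): at each step a randomly chosen payer transfers one unit to a randomly chosen receiver, borrowing if necessary down to a debt limit. The money stock (positive balances) can grow, while money minus debt stays pinned at the initial endowment:

```python
import random

random.seed(0)

N, steps, debt_limit = 100, 50_000, 5
balance = [1] * N  # every agent starts with one unit and no debt

for _ in range(steps):
    payer, receiver = random.sample(range(N), 2)
    if balance[payer] > -debt_limit:  # may borrow down to the debt limit
        balance[payer] -= 1
        balance[receiver] += 1

money = sum(b for b in balance if b > 0)  # money stock: positive balances
debt = -sum(b for b in balance if b < 0)  # total outstanding debt
print(money, debt)  # money - debt always equals the initial endowment N
```

The conservation law money − debt = N holds at every step by construction, so any growth of the money stock is exactly mirrored by created debt.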

  6. Scaling behaviour of randomly alternating surface growth processes

    International Nuclear Information System (INIS)

    Raychaudhuri, Subhadip; Shapir, Yonathan

    2002-01-01

    The scaling properties of the roughness of surfaces grown by two different processes randomly alternating in time are addressed. The duration of each application of the two primary processes is assumed to be independently drawn from given distribution functions. We analytically address processes in which the two primary processes are linear and extend the conclusions to nonlinear processes as well. The growth scaling exponent of the average roughness with the number of applications is found to be determined by the long time tail of the distribution functions. For processes in which both mean application times are finite, the scaling behaviour follows that of the corresponding cyclical process in which the uniform application time of each primary process is given by its mean. If the distribution functions decay with a small enough power law for the mean application times to diverge, the growth exponent is found to depend continuously on this power-law exponent. In contrast, the roughness exponent does not depend on the timing of the applications. The analytical results are supported by numerical simulations of various pairs of primary processes and with different distribution functions. Self-affine surfaces grown by two randomly alternating processes are common in nature (e.g., due to randomly changing weather conditions) and in man-made devices such as rechargeable batteries.

  7. Melnikov processes and chaos in randomly perturbed dynamical systems

    Science.gov (United States)

    Yagasaki, Kazuyuki

    2018-07-01

    We consider a wide class of randomly perturbed systems subjected to stationary Gaussian processes and show that chaotic orbits exist almost surely under some nondegenerate condition, no matter how small the random forcing terms are. This result is in sharp contrast to the deterministic forcing case, in which chaotic orbits exist only if the influence of the forcing terms overcomes that of the other terms in the perturbations. To obtain the result, we extend Melnikov’s method and prove that the corresponding Melnikov functions, which we call the Melnikov processes, have infinitely many zeros, so that infinitely many transverse homoclinic orbits exist. In addition, a theorem on the existence and smoothness of stable and unstable manifolds is given and the Smale–Birkhoff homoclinic theorem is extended in an appropriate form for randomly perturbed systems. We illustrate our theory for the Duffing oscillator subjected to the Ornstein–Uhlenbeck process parametrically.

  8. Continuous state branching processes in random environment: The Brownian case

    OpenAIRE

    Palau, Sandra; Pardo, Juan Carlos

    2015-01-01

    We consider continuous state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. The long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, we find five...

  9. Designing neural networks that process mean values of random variables

    International Nuclear Information System (INIS)

    Barber, Michael J.; Clark, John W.

    2014-01-01

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence

  11. On the joint statistics of stable random processes

    International Nuclear Information System (INIS)

    Hopcraft, K I; Jakeman, E

    2011-01-01

    A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)
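
For intuition on fractional moments of stable variables, the standard Cauchy law (a stable law with index 1) is the tractable case: integer moments diverge, yet E|X|^q = 1/cos(πq/2) is finite for |q| < 1. This identity is textbook material, not taken from the paper; a quick Monte Carlo check:

```python
import math
import random

random.seed(5)

# Standard Cauchy variates via the inverse CDF: X = tan(pi * (U - 1/2))
n, q = 500_000, 0.5
acc = sum(abs(math.tan(math.pi * (random.random() - 0.5))) ** q
          for _ in range(n))
frac_moment = acc / n
# Theory for the standard Cauchy: E|X|^q = 1 / cos(pi * q / 2) = sqrt(2)
print(round(frac_moment, 2))
```

The sample fractional moment converges to √2 ≈ 1.41 even though the sample mean of X itself would never settle down.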

  12. Generation and monitoring of a discrete stable random process

    CERN Document Server

    Hopcraft, K I; Matthews, J O

    2002-01-01

    A discrete stochastic process with stationary power law distribution is obtained from a death-multiple immigration population model. Emigrations from the population form a random series of events which are monitored by a counting process with finite-dynamic range and response time. It is shown that the power law behaviour of the population is manifested in the intermittent behaviour of the series of events. (letter to the editor)

  13. Spatial birth-and-death processes in random environment

    OpenAIRE

    Fernandez, Roberto; Ferrari, Pablo A.; Guerberoff, Gustavo R.

    2004-01-01

    We consider birth-and-death processes of objects (animals) defined in Z^d having unit death rates and random birth rates. For animals with uniformly bounded diameter we establish conditions on the rate distribution under which the following holds for almost all realizations of the birth rates: (i) the process is ergodic with at worst power-law time mixing; (ii) the unique invariant measure has exponential decay of (spatial) correlations; (iii) there exists a perfect-simulation algorit...

  15. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    The application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
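
The core idea of evaluating the Fourier sum directly on an irregular time grid can be sketched in one dimension (a toy illustration, not the paper's 3D NMR processing; the signal, decay rate, and grid below are invented): the transform is computed as an explicit sum over the randomly chosen sample times, so no uniform grid is required.

```python
import cmath
import math
import random

random.seed(3)

# Toy signal: decaying cosine at f0, observed at randomly chosen times in [0, T]
f0, T, n_samples = 50.0, 1.0, 200
times = sorted(random.uniform(0, T) for _ in range(n_samples))
signal = [math.cos(2 * math.pi * f0 * t) * math.exp(-2 * t) for t in times]

def ft_mag(freq):
    # Direct Fourier sum over the irregular time grid
    return abs(sum(s * cmath.exp(-2j * math.pi * freq * t)
                   for s, t in zip(signal, times)))

spectrum = {f: ft_mag(f) for f in range(101)}
peak = max(spectrum, key=spectrum.get)
print(peak)  # the largest magnitude should sit at (or right next to) f0
```

Random sampling trades the regular-grid FFT for this direct sum: aliasing artifacts are smeared into a noise floor, while the true peak frequency is still recovered.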

  16. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...

  17. Solution-Processed Carbon Nanotube True Random Number Generator.

    Science.gov (United States)

    Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C

    2017-08-09

    With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.

  18. Optimal redundant systems for works with random processing time

    International Nuclear Information System (INIS)

    Chen, M.; Nakagawa, T.

    2013-01-01

    This paper studies the optimal redundant policies for a manufacturing system processing jobs with random working times. The redundant units of the parallel systems and standby systems are subject to stochastic failures during the continuous production process. First, a job consisting of only one work is considered for both redundant systems and the expected cost functions are obtained. Next, each redundant system with a random number of units is assumed for a single work. The expected cost functions and the optimal expected numbers of units are derived for redundant systems. Subsequently, the production processes of N tandem works are introduced for parallel and standby systems, and the expected cost functions are also summarized. Finally, the number of works is estimated by a Poisson distribution for the parallel and standby systems. Numerical examples are given to demonstrate the optimization problems of redundant systems

  19. Multifractal detrended fluctuation analysis of analog random multiplicative processes

    Energy Technology Data Exchange (ETDEWEB)

    Silva, L.B.M.; Vermelho, M.V.D. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil); Lyra, M.L. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)], E-mail: marcelo@if.ufal.br; Viswanathan, G.M. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)

    2009-09-15

    We investigate non-Gaussian statistical properties of stationary stochastic signals generated by an analog circuit that simulates a random multiplicative process with weak additive noise. The random noises are originated by thermal shot noise and avalanche processes, while the multiplicative process is generated by a fully analog circuit. The resulting signal describes stochastic time series of current interest in several areas such as turbulence, finance, biology and environment, which exhibit power-law distributions. Specifically, we study the correlation properties of the signal by employing a detrended fluctuation analysis and explore its multifractal nature. The singularity spectrum is obtained and analyzed as a function of the control circuit parameter that tunes the asymptotic power-law form of the probability distribution function.
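
The detrended fluctuation analysis step can be sketched in a few lines (a monofractal DFA1 toy on white noise, not the authors' analog-circuit data; for uncorrelated noise the scaling exponent should come out near 0.5):

```python
import math
import random

random.seed(7)

x = [random.gauss(0, 1) for _ in range(8192)]  # white noise: expect H near 0.5
mean = sum(x) / len(x)
profile, acc = [], 0.0
for v in x:
    acc += v - mean
    profile.append(acc)  # cumulative sum ("profile") of the centered series

def fluctuation(s):
    """RMS residual of the profile around a linear fit in windows of size s."""
    var, n_win = 0.0, len(profile) // s
    for w in range(n_win):
        seg = profile[w * s:(w + 1) * s]
        tbar, ybar = (s - 1) / 2, sum(seg) / s
        sxx = sum((i - tbar) ** 2 for i in range(s))
        beta = sum((i - tbar) * (y - ybar) for i, y in enumerate(seg)) / sxx
        var += sum((y - ybar - beta * (i - tbar)) ** 2
                   for i, y in enumerate(seg)) / s
    return math.sqrt(var / n_win)

s1, s2 = 16, 256
H = math.log(fluctuation(s2) / fluctuation(s1)) / math.log(s2 / s1)
print(round(H, 2))  # scaling exponent; near 0.5 for uncorrelated noise
```

The multifractal variant studied in the paper generalises this by raising the window-wise fluctuations to a range of powers q before averaging, yielding a q-dependent family of exponents.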

  20. Random migration processes between two stochastic epidemic centers.

    Science.gov (United States)

    Sazonov, Igor; Kelbert, Mark; Gravenor, Michael B

    2016-04-01

    We consider the epidemic dynamics in stochastic interacting population centers coupled by random migration. Both the epidemic and the migration processes are modeled by Markov chains. We derive explicit formulae for the probability distribution of the migration process, and explore the dependence of outbreak patterns on initial parameters, population sizes and coupling parameters, using analytical and numerical methods. We show the importance of considering the movement of resident and visitor individuals separately. The mean field approximation for a general migration process is derived and an approximate method that allows the computation of statistical moments for networks with highly populated centers is proposed and tested numerically.

  1. Apparent scale correlations in a random multifractal process

    DEFF Research Database (Denmark)

    Cleve, Jochen; Schmiegel, Jürgen; Greiner, Martin

    2008-01-01

    We discuss various properties of a homogeneous random multifractal process, which are related to the issue of scale correlations. By design, the process has no built-in scale correlations. However, when it comes to observables like breakdown coefficients, which are based on a coarse-graining of the multifractal field, scale correlations do appear. In the log-normal limit of the model process, the conditional distributions and moments of breakdown coefficients reproduce the observations made in fully developed small-scale turbulence. These findings help to understand several puzzling empirical details...

  2. Network formation determined by the diffusion process of random walkers

    International Nuclear Information System (INIS)

    Ikeda, Nobutoshi

    2008-01-01

    We studied the diffusion process of random walkers in networks formed by their traces. This model considers the rise and fall of links determined by the frequency of transports of random walkers. In order to examine the relation between the formed network and the diffusion process, a situation in which multiple random walkers start from the same vertex is investigated. The difference in diffusion rate of random walkers according to the difference in dimension of the initial lattice is very important for determining the time evolution of the networks. For example, complete subgraphs can be formed on a one-dimensional lattice while a graph with a power-law vertex degree distribution is formed on a two-dimensional lattice. We derived some formulae for predicting network changes for the 1D case, such as the time evolution of the size of nearly complete subgraphs and conditions for their collapse. The networks formed on the 2D lattice are characterized by the existence of clusters of highly connected vertices and their life time. As the life time of such clusters tends to be small, the exponent of the power-law distribution changes from γ ≅ 1-2 to γ ≅ 3.

  3. Matrix product approach for the asymmetric random average process

    International Nuclear Information System (INIS)

    Zielen, F; Schadschneider, A

    2003-01-01

    We consider the asymmetric random average process which is a one-dimensional stochastic lattice model with nearest-neighbour interaction but continuous and unbounded state variables. First, the explicit functional representations, so-called beta densities, of all local interactions leading to steady states of product measure form are rigorously derived. This also completes an outstanding proof given in a previous publication. Then we present an alternative solution for the processes with factorized stationary states by using a matrix product ansatz. Due to continuous state variables we obtain a matrix algebra in the form of a functional equation which can be solved exactly

  4. Order out of Randomness: Self-Organization Processes in Astrophysics

    Science.gov (United States)

    Aschwanden, Markus J.; Scholkmann, Felix; Béthune, William; Schmutz, Werner; Abramenko, Valentina; Cheung, Mark C. M.; Müller, Daniel; Benz, Arnold; Chernov, Guennadi; Kritsuk, Alexei G.; Scargle, Jeffrey D.; Melatos, Andrew; Wagoner, Robert V.; Trimble, Virginia; Green, William H.

    2018-03-01

    Self-organization is a property of dissipative nonlinear processes that are governed by a global driving force and a local positive feedback mechanism, which creates regular geometric and/or temporal patterns, and decreases the entropy locally, in contrast to random processes. Here we investigate for the first time a comprehensive number of (17) self-organization processes that operate in planetary physics, solar physics, stellar physics, galactic physics, and cosmology. Self-organizing systems create spontaneous " order out of randomness", during the evolution from an initially disordered system to an ordered quasi-stationary system, mostly by quasi-periodic limit-cycle dynamics, but also by harmonic (mechanical or gyromagnetic) resonances. The global driving force can be due to gravity, electromagnetic forces, mechanical forces (e.g., rotation or differential rotation), thermal pressure, or acceleration of nonthermal particles, while the positive feedback mechanism is often an instability, such as the magneto-rotational (Balbus-Hawley) instability, the convective (Rayleigh-Bénard) instability, turbulence, vortex attraction, magnetic reconnection, plasma condensation, or a loss-cone instability. Physical models of astrophysical self-organization processes require hydrodynamic, magneto-hydrodynamic (MHD), plasma, or N-body simulations. Analytical formulations of self-organizing systems generally involve coupled differential equations with limit-cycle solutions of the Lotka-Volterra or Hopf-bifurcation type.

  5. Polymers and Random graphs: Asymptotic equivalence to branching processes

    International Nuclear Information System (INIS)

    Spouge, J.L.

    1985-01-01

    In 1974, Falk and Thomas did a computer simulation of Flory's equireactive RA_f polymer model, rings forbidden and rings allowed. Asymptotically, the Rings Forbidden model tended to Stockmayer's RA_f distribution (in which the sol distribution "sticks" after gelation), while the Rings Allowed model tended to the Flory version of the RA_f distribution. In 1965, Whittle introduced the Tree and Pseudomultigraph models. We show that these random graphs generalize the Falk and Thomas models by incorporating first-shell substitution effects. Moreover, asymptotically the Tree model displays postgelation "sticking." Hence this phenomenon results from the absence of rings and occurs independently of equireactivity. We also show that the Pseudomultigraph model is asymptotically identical to the Branching Process model introduced by Gordon in 1962. This provides a possible basis for the Branching Process model in standard statistical mechanics.

  6. Nonstationary random acoustic and electromagnetic fields as wave diffusion processes

    International Nuclear Information System (INIS)

    Arnaut, L R

    2007-01-01

    We investigate the effects of relatively rapid variations of the boundaries of an overmoded cavity on the stochastic properties of its interior acoustic or electromagnetic field. For quasi-static variations, this field can be represented as an ideal incoherent and statistically homogeneous isotropic random scalar or vector field, respectively. A physical model is constructed showing that the field dynamics can be characterized as a generalized diffusion process. The Langevin-Itô and Fokker-Planck equations are derived and their associated statistics and distributions for the complex analytic field, its magnitude and energy density are computed. The energy diffusion parameter is found to be proportional to the square of the ratio of the standard deviation of the source field to the characteristic time constant of the dynamic process, but is independent of the initial energy density, to first order. The energy drift vanishes in the asymptotic limit. The time-energy probability distribution is in general not separable, as a result of nonstationarity. A general solution of the Fokker-Planck equation is obtained in integral form, together with explicit closed-form solutions for several asymptotic cases. The findings extend known results on statistics and distributions of quasi-stationary ideal random fields (pure diffusions), which are retrieved as special cases.

  7. 5th Seminar on Stochastic Processes, Random Fields and Applications

    CERN Document Server

    Russo, Francesco; Dozzi, Marco

    2008-01-01

    This volume contains twenty-eight refereed research or review papers presented at the 5th Seminar on Stochastic Processes, Random Fields and Applications, which took place at the Centro Stefano Franscini (Monte Verità) in Ascona, Switzerland, from May 30 to June 3, 2005. The seminar focused mainly on stochastic partial differential equations, random dynamical systems, infinite-dimensional analysis, approximation problems, and financial engineering. The book will be a valuable resource for researchers in stochastic analysis and professionals interested in stochastic methods in finance. Contributors: Y. Asai, J.-P. Aubin, C. Becker, M. Benaïm, H. Bessaih, S. Biagini, S. Bonaccorsi, N. Bouleau, N. Champagnat, G. Da Prato, R. Ferrière, F. Flandoli, P. Guasoni, V.B. Hallulli, D. Khoshnevisan, T. Komorowski, R. Léandre, P. Lescot, H. Lisei, J.A. López-Mimbela, V. Mandrekar, S. Méléard, A. Millet, H. Nagai, A.D. Neate, V. Orlovius, M. Pratelli, N. Privault, O. Raimond, M. Röckner, B. Rüdiger, W.J. Runggaldi...

  8. Random number generation as an index of controlled processing.

    Science.gov (United States)

    Jahanshahi, Marjan; Saleem, T; Ho, Aileen K; Dirnberger, Georg; Fuller, R

    2006-07-01

    Random number generation (RNG) is a functionally complex process that is highly controlled and therefore dependent on Baddeley's central executive. This study addresses this issue by investigating whether key predictions from this framework are compatible with empirical data. In Experiment 1, the effect of increasing task demands by increasing the rate of the paced generation was comprehensively examined. As expected, faster rates affected performance negatively because central resources were increasingly depleted. Next, the effects of participants' exposure were manipulated in Experiment 2 by providing increasing amounts of practice on the task. There was no improvement over 10 practice trials, suggesting that the high level of strategic control required by the task was constant and not amenable to any automatization gain with repeated exposure. Together, the results demonstrate that RNG performance is a highly controlled and demanding process sensitive to additional demands on central resources (Experiment 1) and is unaffected by repeated performance or practice (Experiment 2). These features render the easily administered RNG task an ideal and robust index of executive function that is highly suitable for repeated clinical use.

  9. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  10. An empirical test of pseudo random number generators by means of an exponential decaying process

    International Nuclear Information System (INIS)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A.; Mora F, L.E.

    2007-01-01

    Empirical tests of pseudo random number generators based on physical processes or models have been used successfully and are considered complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)
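    The kind of empirical test described above can be sketched as follows: simulate exponential decay with the generator under test and compare the surviving population against the theoretical curve. This is a hypothetical illustration; the function names, parameters, and acceptance threshold are ours, not the authors'.

```python
import random

def decay_test(rng, n0=100000, p=0.1, steps=20):
    """Simulate exponential decay driven by the RNG under test and
    return the largest relative deviation of the surviving population
    from the theoretical curve n0 * (1 - p)**t."""
    alive = n0
    worst = 0.0
    for t in range(1, steps + 1):
        # each surviving particle decays independently with probability p
        alive = sum(1 for _ in range(alive) if rng.random() >= p)
        expected = n0 * (1 - p) ** t
        worst = max(worst, abs(alive - expected) / expected)
    return worst

rng = random.Random(12345)
dev = decay_test(rng)
```

A generator with serious correlations would produce survivor counts that drift systematically away from the exponential curve, inflating the deviation statistic.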

  11. A randomized controlled trial of an electronic informed consent process.

    Science.gov (United States)

    Rothwell, Erin; Wong, Bob; Rose, Nancy C; Anderson, Rebecca; Fedor, Beth; Stark, Louisa A; Botkin, Jeffrey R

    2014-12-01

    A pilot study assessed an electronic informed consent model within a randomized controlled trial (RCT). Participants who were recruited for the parent RCT project were randomly selected and randomized to either an electronic consent group (n = 32) or a simplified paper-based consent group (n = 30). Participants in the electronic consent group reported significantly higher understanding of the purpose of the study, alternatives to participation, and who to contact if they had questions or concerns about the study. However, participants in the paper-based control group reported higher mean scores on some survey items. This research suggests that an electronic informed consent presentation may improve participant understanding for some aspects of a research study. © The Author(s) 2014.

  12. Random skew plane partitions and the Pearcey process

    DEFF Research Database (Denmark)

    Reshetikhin, Nicolai; Okounkov, Andrei

    2007-01-01

    We study random skew 3D partitions weighted by q^vol and, specifically, the q → 1 asymptotics of local correlations near various points of the limit shape. We obtain sine-kernel asymptotics for correlations in the bulk of the disordered region, Airy kernel asymptotics near a general point of the ...

  13. A Randomization Procedure for "Trickle-Process" Evaluations

    Science.gov (United States)

    Goldman, Jerry

    1977-01-01

    This note suggests a solution to the problem of achieving randomization in experimental settings where units deemed eligible for treatment "trickle in," that is, appear at any time. The solution permits replication of the experiment in order to test for time-dependent effects. (Author/CTM)

  14. Do MENA stock market returns follow a random walk process?

    Directory of Open Access Journals (Sweden)

    Salim Lahmiri

    2013-01-01

    Full Text Available In this research, three variance ratio tests: the standard variance ratio test, the wild bootstrap multiple variance ratio test, and the non-parametric rank scores test are adopted to test the random walk hypothesis (RWH) of stock markets in the Middle East and North Africa (MENA) region using the most recent data from January 2010 to September 2012. The empirical results obtained by all three econometric tests show that the RWH is strongly rejected for Kuwait, Tunisia, and Morocco. However, the standard variance ratio test and the wild bootstrap multiple variance ratio test reject the null hypothesis of a random walk in Jordan and KSA, while the non-parametric rank scores test does not. We may conclude that the Jordan and KSA stock markets are weak-form efficient. In sum, the empirical results suggest that return series in Kuwait, Tunisia, and Morocco are predictable. In other words, predictable patterns that can be exploited in these markets still exist. Therefore, investors may make profits in such less efficient markets.
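    A minimal sketch of the standard (Lo-MacKinlay) variance ratio statistic underlying such tests: under the random walk null, the variance of q-period returns equals q times the variance of 1-period returns, so the ratio is close to 1. This is an illustration with simulated data, not the paper's exact estimator.

```python
import random

def variance_ratio(returns, q):
    """Variance of overlapping q-period returns divided by q times
    the variance of 1-period returns; near 1 under a random walk."""
    n = len(returns)
    mu = sum(returns) / n
    var1 = sum((r - mu) ** 2 for r in returns) / (n - 1)
    # overlapping q-period returns
    qrets = [sum(returns[i:i + q]) for i in range(n - q + 1)]
    muq = q * mu
    varq = sum((r - muq) ** 2 for r in qrets) / (len(qrets) - 1)
    return varq / (q * var1)

rng = random.Random(7)
walk_increments = [rng.gauss(0, 1) for _ in range(20000)]
vr = variance_ratio(walk_increments, 5)
```

For a market rejecting the RWH, mean-reverting returns push the ratio below 1 and trending returns push it above 1.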

  15. Investigation of Random Switching Driven by a Poisson Point Process

    DEFF Research Database (Denmark)

    Simonsen, Maria; Schiøler, Henrik; Leth, John-Josef

    2015-01-01

    This paper investigates the switching mechanism of a two-dimensional switched system, when the switching events are generated by a Poisson point process. A model, in the shape of a stochastic process, for such a system is derived and the distribution of the trajectory's position is developed, together with marginal density functions for the coordinate functions. Furthermore, the joint probability distribution is given explicitly.

  16. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
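    As an illustration of the time-domain AR modeling step, the lag-1 Yule-Walker estimate below recovers the coefficient of a simulated AR(1) process. This is a sketch of the general idea under our own simulation choices, not Scargle's FORTRAN algorithm.

```python
import random

def fit_ar1(x):
    """Estimate the AR(1) coefficient as the lag-1 autocorrelation
    (the Yule-Walker estimate for a first-order model)."""
    n = len(x)
    mu = sum(x) / n
    c0 = sum((v - mu) ** 2 for v in x) / n
    c1 = sum((x[i] - mu) * (x[i + 1] - mu) for i in range(n - 1)) / n
    return c1 / c0

# simulate an AR(1) process x_t = phi * x_{t-1} + noise
rng = random.Random(42)
phi_true = 0.6
x, prev = [], 0.0
for _ in range(50000):
    prev = phi_true * prev + rng.gauss(0, 1)
    x.append(prev)
phi_hat = fit_ar1(x)
```

The fitted AR model can then be inverted to an MA representation, whose coefficients describe the average pulse shape of the process.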

  17. Art Therapy and Cognitive Processing Therapy for Combat-Related PTSD: A Randomized Controlled Trial

    Science.gov (United States)

    Campbell, Melissa; Decker, Kathleen P.; Kruk, Kerry; Deaver, Sarah P.

    2016-01-01

    This randomized controlled trial was designed to determine if art therapy in conjunction with Cognitive Processing Therapy (CPT) was more effective for reducing symptoms of combat posttraumatic stress disorder (PTSD) than CPT alone. Veterans (N = 11) were randomized to receive either individual CPT, or individual CPT in conjunction with individual…

  18. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    Science.gov (United States)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ -machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.

  19. Interspinous process device versus standard conventional surgical decompression for lumbar spinal stenosis: Randomized controlled trial

    NARCIS (Netherlands)

    W.A. Moojen (Wouter); M.P. Arts (Mark); W.C.H. Jacobs (Wilco); E.W. van Zwet (Erik); M.E. van den Akker-van Marle (Elske); B.W. Koes (Bart); C.L.A.M. Vleggeert-Lankamp (Carmen); W.C. Peul (Wilco)

    2013-01-01

    Abstract Objective: To assess whether interspinous process device implantation is more effective in the short term than conventional surgical decompression for patients with intermittent neurogenic claudication due to lumbar spinal stenosis. Design: Randomized controlled

  20. Directed motion emerging from two coupled random processes

    DEFF Research Database (Denmark)

    Ambjörnsson, T.; Lomholt, Michael Andersen; Metzler, R.

    2005-01-01

    detail, we develop a dynamical description of the process in terms of a (2+1)-variable master equation for the probability of having m monomers on the target side of the membrane with n bound chaperones at time t. Emphasis is put on the calculation of the mean first passage time as a function of total … dynamics (… and …), we perform the adiabatic elimination of the fast variable n, and find that for a very long polymer …, but with a smaller prefactor than for ratchet-like dynamics. We solve the general case numerically as a function of the dimensionless parameters λ, κ and γ, and compare to the three …

  1. The Initial Regression Statistical Characteristics of Intervals Between Zeros of Random Processes

    Directory of Open Access Journals (Sweden)

    V. K. Hohlov

    2014-01-01

    Full Text Available The article substantiates the initial regression statistical characteristics of intervals between zeros of realizations of random processes, and studies the properties that allow these characteristics to be used in autonomous information systems (AIS) of near location (NL). Coefficients of the initial regression (CIR) minimizing the residual sum of squares of multiple initial regression forms are justified on the basis of vector representations that associate the analyzed signal parameters with the notion of a random vector. It is shown that even without covariance a particular CIR makes it possible to predict one random variable through another with respect to the deterministic components. The paper studies the dependence of the CIR of interval sizes between zeros of a narrowband wide-sense stationary random process on its energy spectrum. Particular CIRs for random processes with Gaussian and rectangular energy spectra are obtained. It is shown that the considered CIRs do not depend on the average frequency of the spectra, are determined by the relative bandwidth of the energy spectra, and depend only weakly on the type of spectrum. These properties enable the CIR to be used as an informative parameter when implementing temporal regression methods of signal processing that are invariant to the average rate and variance of the input realizations. We consider estimates of the average energy spectrum frequency of a stationary random process obtained by calculating the length of the time interval corresponding to a specified number of intervals between zeros. It is shown that the relative variance of the estimate of the average energy spectrum frequency of a stationary random process with increasing relative bandwidth ceases to depend on the process realization when more than ten intervals between zeros are processed.
    The obtained results can be used in the AIS NL to solve the tasks of detection and signal recognition, when a decision is made in conditions of unknown mathematical expectations on a limited observation

  2. Random covering of the circle: the configuration-space of the free deposition process

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-12-12

    Consider a circle of circumference 1. Throw at random n points, sequentially, on this circle and append clockwise an arc (or rod) of length s to each such point. The resulting random set (the free gas of rods) is a collection of a random number of clusters with random sizes. It models a free deposition process on a 1D substrate. For such processes, we shall consider the occurrence times (number of rods) and probabilities, as n grows, of the following configurations: those avoiding rod overlap (the hard-rod gas), those for which the largest gap is smaller than rod length s (the packing gas), those (parking configurations) for which hard rod and packing constraints are both fulfilled and covering configurations. Special attention is paid to the statistical properties of each such (rare) configuration in the asymptotic density domain when ns = ρ, for some finite density ρ of points. Using results from spacings in the random division of the circle, explicit large deviation rate functions can be computed in each case from state equations. Lastly, a process consisting in selecting at random one of these specific equilibrium configurations (called the observable) can be modelled. When particularized to the parking model, this system produces parking configurations differently from Renyi's random sequential adsorption model.
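    The covering configuration above admits a direct simulation check: the circle is covered exactly when every gap between consecutive points (including the wrap-around gap) is at most the rod length s. A minimal sketch with our own function names:

```python
import random

def covers_circle(points, s):
    """True if clockwise arcs of length s appended to each point
    cover the whole circle of circumference 1, i.e. every gap
    between consecutive points is at most s."""
    pts = sorted(points)
    n = len(pts)
    # gap i runs from pts[i] to the next point, wrapping around at i = n-1
    gaps = [(pts[(i + 1) % n] - pts[i]) % 1.0 for i in range(n)]
    return max(gaps) <= s

rng = random.Random(0)
pts = [rng.random() for _ in range(50)]
covered_small = covers_circle(pts, 0.01)  # total arc length 0.5 < 1: cannot cover
covered_large = covers_circle(pts, 1.0)   # arcs of length 1 always cover
```

With n·s < 1 the total arc length is below the circumference, so coverage is impossible for any point configuration, which makes the first case a deterministic check.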

  3. Timing of the Crab pulsar III. The slowing down and the nature of the random process

    International Nuclear Information System (INIS)

    Groth, E.J.

    1975-01-01

    The Crab pulsar arrival times are analyzed. The data are found to be consistent with a smooth slowing down with a braking index of 2.515 ± 0.005. Superposed on the smooth slowdown is a random process which has the same second moments as a random walk in the frequency. The strength of the random process is R⟨ε²⟩ = 0.53 (+0.24, −0.12) × 10⁻²² Hz² s⁻¹, where R is the mean rate of steps and ⟨ε²⟩ is the second moment of the step amplitude distribution. Neither the braking index nor the strength of the random process shows evidence of statistically significant time variations, although small fluctuations in the braking index and rather large fluctuations in the noise strength cannot be ruled out. There is a possibility that the random process contains a small component with the same second moments as a random walk in the phase. If so, a time scale of 3.5 days is indicated

  4. Post-processing Free Quantum Random Number Generator Based on Avalanche Photodiode Array

    International Nuclear Information System (INIS)

    Li Yang; Liao Sheng-Kai; Liang Fu-Tian; Shen Qi; Liang Hao; Peng Cheng-Zhi

    2016-01-01

    Quantum random number generators adopting single photon detection have been restricted due to the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free, ready to use, and their randomness is verified by using the National Institute of Standards and Technology (NIST) statistical test suite. The random bit generation efficiency is as high as 32.8% and the potential generation rate adopting the 32 × 32 APD array is up to tens of Gbits/s. (paper)
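    The comparison step can be illustrated with a toy sketch: compare two detectors' click records pulse by pulse and keep a bit only when exactly one of them clicks. This is our simplification for illustration; the names and the discard rule are assumptions, not the authors' exact protocol.

```python
def bits_from_detector_pairs(clicks_a, clicks_b):
    """Compare the responses of two detectors to the same optical
    pulse train: emit 1 when only detector A clicks, 0 when only
    detector B clicks, and discard pulses where both or neither
    click. For identical detectors the two kept outcomes are
    equiprobable by symmetry, so the output bits are unbiased."""
    bits = []
    for a, b in zip(clicks_a, clicks_b):
        if a != b:
            bits.append(1 if a else 0)
    return bits

bits = bits_from_detector_pairs([1, 0, 1, 1, 0], [0, 0, 1, 0, 1])
```

Discarding the ambiguous pulses trades raw rate for bias-free output, which is why an array of many detector pairs helps recover throughput.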

  5. Efficient tests for equivalence of hidden Markov processes and quantum random walks

    NARCIS (Netherlands)

    U. Faigle; A. Schönhuth (Alexander)

    2011-01-01

    While two hidden Markov process (HMP) resp. quantum random walk (QRW) parametrizations can differ from one another, the stochastic processes arising from them can be equivalent. Here a polynomial-time algorithm is presented which can determine equivalence of two HMP parametrizations

  6. High-Performance Pseudo-Random Number Generation on Graphics Processing Units

    OpenAIRE

    Nandapalan, Nimalan; Brent, Richard P.; Murray, Lawrence M.; Rendell, Alistair

    2011-01-01

    This work considers the deployment of pseudo-random number generators (PRNGs) on graphics processing units (GPUs), developing an approach based on the xorgens generator to rapidly produce pseudo-random numbers of high statistical quality. The chosen algorithm has configurable state size and period, making it ideal for tuning to the GPU architecture. We present a comparison of both speed and statistical quality with other common parallel, GPU-based PRNGs, demonstrating favourable performance o...

  7. Studies in astronomical time series analysis. IV - Modeling chaotic and random processes with linear filters

    Science.gov (United States)

    Scargle, Jeffrey D.

    1990-01-01

    While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.

  8. Characterisation of random Gaussian and non-Gaussian stress processes in terms of extreme responses

    Directory of Open Access Journals (Sweden)

    Colin Bruno

    2015-01-01

    Full Text Available In the field of military land vehicles, random vibration processes generated by all-terrain wheeled vehicles in motion are not classical stochastic processes with a stationary and Gaussian nature. Non-stationarity of the processes, induced by the variability of the vehicle speed, is not a major difficulty because the designer can control the vehicle speed well by characterising the histogram of the instantaneous speed of the vehicle during an operational situation. Beyond this non-stationarity problem, the main difficulty lies in the fact that the random processes are not Gaussian: they are generated mainly by the non-linear behaviour of the undercarriage and the frequent occurrence of shocks caused by the roughness of the terrain. This non-Gaussian nature manifests itself in particular in very high kurtosis (flatness) values, which can affect the design of structures under extreme stresses conventionally obtained by spectral approaches, inherent to Gaussian processes and based essentially on the spectral moments of the stress processes. Due to these technical considerations, the techniques for characterising random excitation processes generated by this type of carrier need to be revised, with innovative characterisation methods based on time-domain rather than spectral-domain approaches, as described in the body of the text.

  9. Multifractal properties of diffusion-limited aggregates and random multiplicative processes

    International Nuclear Information System (INIS)

    Canessa, E.

    1991-04-01

    We consider the multifractal properties of irreversible diffusion-limited aggregation (DLA) from the point of view of the self-similarity of fluctuations in random multiplicative processes. In particular we analyse the breakdown of multifractal behaviour and phase transition associated with the negative moments of the growth probabilities in DLA. (author). 20 refs, 5 figs

  10. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  11. Setting up a randomized clinical trial in the UK: approvals and process.

    Science.gov (United States)

    Greene, Louise Eleanor; Bearn, David R

    2013-06-01

    Randomized clinical trials are considered the 'gold standard' in primary research for healthcare interventions. However, they can be expensive and time-consuming to set up and require many approvals to be in place before they can begin. This paper outlines how to determine what approvals are required for a trial, the background of each approval and the process for obtaining them.

  12. Human norovirus inactivation in oysters by high hydrostatic pressure processing: A randomized double-blinded study

    Science.gov (United States)

    This randomized, double-blinded, clinical trial assessed the effect of high hydrostatic pressure processing (HPP) on genogroup I.1 human norovirus (HuNoV) inactivation in virus-seeded oysters when ingested by subjects. The safety and efficacy of HPP treatments were assessed in three study phases wi...

  13. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.

  14. Time at which the maximum of a random acceleration process is reached

    International Nuclear Information System (INIS)

    Majumdar, Satya N; Rosso, Alberto; Zoia, Andrea

    2010-01-01

    We study the random acceleration model, which is perhaps one of the simplest, yet nontrivial, non-Markov stochastic processes, and is key to many applications. For this non-Markov process, we present exact analytical results for the probability density p(t_m|T) of the time t_m at which the process reaches its maximum, within a fixed time interval [0, T]. We study two different boundary conditions, which correspond to the process representing respectively (i) the integral of a Brownian bridge and (ii) the integral of a free Brownian motion. Our analytical results are also verified by numerical simulations.
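    A numerical check in the spirit of the simulations mentioned: integrate white noise twice with Euler steps (velocity is a Brownian motion, position its integral) and record the time of the maximum. The discretization and parameter choices below are ours, not the authors'.

```python
import random

def argmax_time_random_acceleration(T=1.0, n=1000, seed=1):
    """Simulate one path of the random acceleration process
    (position = twice-integrated white noise) on [0, T] and
    return the time at which the position is maximal."""
    rng = random.Random(seed)
    dt = T / n
    v = x = 0.0
    t_max, x_max = 0.0, 0.0
    for i in range(1, n + 1):
        v += rng.gauss(0, 1) * dt ** 0.5  # velocity: Brownian increment
        x += v * dt                       # position: integral of velocity
        if x > x_max:
            x_max, t_max = x, i * dt
    return t_max

tm = argmax_time_random_acceleration()
```

Averaging a histogram of t_m over many independent paths is how such simulations would be compared against the analytical density p(t_m|T).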

  15. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    Science.gov (United States)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
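    The Poisson order-arrival assumption can be sketched by generating event times from exponential inter-arrival gaps. This is illustrative only; the rate and names are ours, not the paper's calibrated order-flow parameters.

```python
import random

def poisson_arrival_times(rate, horizon, rng):
    """Event times of a Poisson process on [0, horizon], built from
    i.i.d. exponential inter-arrival times with mean 1/rate."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(2024)
orders = poisson_arrival_times(rate=100.0, horizon=1.0, rng=rng)
```

In a zero-intelligence market model of this kind, separate Poisson streams of this form would drive limit order placements, market orders, and cancellations.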

  16. Pseudo-random number generators for Monte Carlo simulations on ATI Graphics Processing Units

    Science.gov (United States)

    Demchik, Vadim

    2011-03-01

    Basic uniform pseudo-random number generators are implemented on ATI Graphics Processing Units (GPU). The performance results of the realized generators (multiplicative linear congruential (GGL), XOR-shift (XOR128), RANECU, RANMAR, RANLUX and Mersenne Twister (MT19937)) on CPU and GPU are discussed. The obtained speed up factor is hundreds of times in comparison with CPU. RANLUX generator is found to be the most appropriate for using on GPU in Monte Carlo simulations. The brief review of the pseudo-random number generators used in modern software packages for Monte Carlo simulations in high-energy physics is presented.

  17. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    CERN Document Server

    Vamos, C; Vereecken, H

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested.
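    A minimal sketch of the scattering step described above: at each grid node, all resident particles are redistributed at once by a Bernoulli (binomial) draw rather than moved one random walker at a time, and the total particle number is conserved. This is our own naive implementation on a periodic 1D grid, not the authors' code.

```python
import random

def grw_step(counts, p, rng):
    """One step of the generalized random walk: at every node,
    scatter all n resident particles simultaneously; the number
    going right is a Binomial(n, p) draw (summed Bernoullis here),
    the rest go left. The grid is periodic."""
    new = [0] * len(counts)
    for i, n in enumerate(counts):
        right = sum(1 for _ in range(n) if rng.random() < p)
        new[(i + 1) % len(counts)] += right
        new[(i - 1) % len(counts)] += n - right
    return new

rng = random.Random(3)
state = [0] * 21
state[10] = 10000              # all particles start at the centre node
for _ in range(50):
    state = grw_step(state, 0.5, rng)
```

Because each node needs only one binomial draw per step (rather than one draw per particle), memory and computing time stay bounded no matter how many particles are simulated.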

  18. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    International Nuclear Information System (INIS)

    Vamos, Calin; Suciu, Nicolae; Vereecken, Harry

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested

  19. Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.

    Science.gov (United States)

    Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai

    2017-02-20

    A chaotic external-cavity semiconductor laser (ECL) is a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post processing. Here, we propose using physical broadband white chaos generated by optical heterodyning of two ECLs as the entropy source to construct high-speed random bit generation (RBG) with minimal post processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which corresponds to a spectral efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
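    The extraction step reduces to keeping the k least significant bits of each ADC sample and concatenating them. A toy sketch (the sample values below are made up for illustration):

```python
def extract_lsbs(samples, k=4):
    """Keep the k least significant bits of each 8-bit ADC sample
    and concatenate them into a bit list, most significant of the
    kept bits first."""
    bits = []
    for s in samples:
        for i in range(k - 1, -1, -1):
            bits.append((s >> i) & 1)
    return bits

bits = extract_lsbs([0xAB, 0x3C], k=4)
```

At an 80-GHz sampling rate, keeping 4 LSBs per sample yields the quoted 4 × 80 = 320 Gbps raw bit rate.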

  20. To be and not to be: scale correlations in random multifractal processes

    DEFF Research Database (Denmark)

    Cleve, Jochen; Schmiegel, Jürgen; Greiner, Martin

    We discuss various properties of a random multifractal process which are related to the issue of scale correlations. By design, the process is homogeneous, non-conservative and has no built-in scale correlations. However, when it comes to observables like breakdown coefficients, which are based on a coarse-graining of the multifractal field, scale correlations do appear. In the log-normal limit of the model process, the conditional distributions and moments of breakdown coefficients reproduce the observations made in fully developed small-scale turbulence. These findings help to understand several …

  1. Gaussian random-matrix process and universal parametric correlations in complex systems

    International Nuclear Information System (INIS)

    Attias, H.; Alhassid, Y.

    1995-01-01

    We introduce the framework of the Gaussian random-matrix process as an extension of Dyson's Gaussian ensembles and use it to discuss the statistical properties of complex quantum systems that depend on an external parameter. We classify the Gaussian processes according to the short-distance diffusive behavior of their energy levels and demonstrate that all parametric correlation functions become universal upon the appropriate scaling of the parameter. The class of differentiable Gaussian processes is identified as the relevant one for most physical systems. We reproduce the known spectral correlators and compute eigenfunction correlators in their universal form. Numerical evidence from both a chaotic model and a weakly disordered model confirms our predictions

  2. Random Process Theory Approach to Geometric Heterogeneous Surfaces: Effective Fluid-Solid Interaction

    Science.gov (United States)

    Khlyupin, Aleksey; Aslyamov, Timur

    2017-06-01

    Realistic fluid-solid interaction potentials are essential in the description of confined fluids, especially in the case of geometrically heterogeneous surfaces. A correlated random field is considered as a model of a random surface with high geometric roughness. We provide a general theory of the effective coarse-grained fluid-solid potential by proper averaging of the free energy of fluid molecules which interact with the solid medium. This procedure is largely based on the theory of random processes. We apply the first passage time probability problem and assume local Markov properties of the random surfaces. A general expression for the effective fluid-solid potential is obtained. In the case of small surface irregularities an analytical approximation for the effective potential is proposed. Both amorphous materials with large surface roughness and crystalline solids with several types of fcc lattices are considered. It is shown that the wider the lattice spacing in terms of the molecular diameter of the fluid, the more the obtained potentials differ from the classical ones. A comparison with published Monte Carlo simulations is discussed. The work provides a promising approach to explore how random geometric heterogeneity affects the thermodynamic properties of fluids.

  3. Auditory detection of an increment in the rate of a random process

    International Nuclear Information System (INIS)

    Brown, W.S.; Emmerich, D.S.

    1994-01-01

    Recent experiments have presented listeners with complex tonal stimuli consisting of components with values (i.e., intensities or frequencies) randomly sampled from probability distributions [e.g., R. A. Lutfi, J. Acoust. Soc. Am. 86, 934--944 (1989)]. In the present experiment, brief tones were presented at intervals corresponding to the intensity of a random process. Specifically, the intervals between tones were randomly selected from exponential probability functions. Listeners were asked to decide whether tones presented during a defined observation interval represented a "noise" process alone or the "noise" with a "signal" process added to it. The number of tones occurring in any observation interval is a Poisson variable; receiver operating characteristics (ROCs) arising from Poisson processes have been considered by Egan [Signal Detection Theory and ROC Analysis (Academic, New York, 1975)]. Several sets of noise and signal intensities and observation interval durations were selected which were expected to yield equivalent performance. Rating ROCs were generated based on subjects' responses in a single-interval, yes--no task. The performance levels achieved by listeners and the effects of intensity and duration are compared to those predicted for an ideal observer
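    The ideal-observer decision problem can be sketched with Poisson tail probabilities: for a count threshold k, the false-alarm and hit rates are P(N ≥ k) under the noise-only and noise-plus-signal rates respectively. The rates and threshold below are arbitrary illustrations, not the experiment's values.

```python
import math

def poisson_sf(k, lam):
    """P(N >= k) for N ~ Poisson(lam): one minus the CDF at k - 1."""
    return 1.0 - sum(math.exp(-lam) * lam ** i / math.factorial(i)
                     for i in range(k))

# decide "signal present" when at least 5 tones fall in the interval
p_fa  = poisson_sf(5, 3.0)   # false alarm: noise rate only
p_hit = poisson_sf(5, 6.0)   # hit: noise + signal rate
```

Sweeping the threshold k traces out the Poisson ROC curve against which the listeners' rating ROCs can be compared.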

  4. Art Therapy and Cognitive Processing Therapy for Combat-Related PTSD: A Randomized Controlled Trial

    Science.gov (United States)

    Campbell, Melissa; Decker, Kathleen P.; Kruk, Kerry; Deaver, Sarah P.

    2018-01-01

    This randomized controlled trial was designed to determine if art therapy in conjunction with Cognitive Processing Therapy (CPT) was more effective for reducing symptoms of combat posttraumatic stress disorder (PTSD) than CPT alone. Veterans (N = 11) were randomized to receive either individual CPT, or individual CPT in conjunction with individual art therapy. PTSD Checklist–Military Version and Beck Depression Inventory–II scores improved with treatment in both groups with no significant difference in improvement between the experimental and control groups. Art therapy in conjunction with CPT was found to improve trauma processing and veterans considered it to be an important part of their treatment as it provided healthy distancing, enhanced trauma recall, and increased access to emotions. PMID:29332989

  5. An Artificial Bee Colony Algorithm for the Job Shop Scheduling Problem with Random Processing Times

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2011-09-01

    Full Text Available Due to the influence of unpredictable random events, the processing time of each operation should be treated as random variables if we aim at a robust production schedule. However, compared with the extensive research on the deterministic model, the stochastic job shop scheduling problem (SJSSP has not received sufficient attention. In this paper, we propose an artificial bee colony (ABC algorithm for SJSSP with the objective of minimizing the maximum lateness (which is an index of service quality. First, we propose a performance estimate for preliminary screening of the candidate solutions. Then, the K-armed bandit model is utilized for reducing the computational burden in the exact evaluation (through Monte Carlo simulation process. Finally, the computational results on different-scale test problems validate the effectiveness and efficiency of the proposed approach.
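    The Monte Carlo evaluation step can be illustrated on a deliberately simplified single-machine version of the problem (the paper treats the full job shop and uses an ABC search over sequences); drawing processing times as exponentials around their means is an assumption for this sketch:

```python
import random

def max_lateness(sequence, mean_times, due_dates, rng):
    """Maximum lateness of one realization of the schedule: processing times
    are drawn as exponential random variables around their means."""
    t, worst = 0.0, float("-inf")
    for j in sequence:
        t += rng.expovariate(1.0 / mean_times[j])
        worst = max(worst, t - due_dates[j])
    return worst

def expected_max_lateness(sequence, mean_times, due_dates, trials=5000, seed=0):
    """Monte Carlo estimate of E[L_max], the robust objective that the search
    would minimize over candidate sequences."""
    rng = random.Random(seed)
    return sum(max_lateness(sequence, mean_times, due_dates, rng)
               for _ in range(trials)) / trials
```

    In an ABC-style search, such an estimate (possibly with few trials for preliminary screening and many for exact evaluation) scores each candidate food source.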

  6. A random point process model for the score in sport matches

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2009-01-01

    Vol. 20, No. 2 (2009), pp. 121-131 ISSN 1471-678X R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z10750506 Keywords : sport statistics * scoring intensity * Cox’s regression model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2009/SI/volf-a random point process model for the score in sport matches.pdf

  7. The McMillan Theorem for Colored Branching Processes and Dimensions of Random Fractals

    Directory of Open Access Journals (Sweden)

    Victor Bakhtin

    2014-12-01

    Full Text Available For the simplest colored branching process, we prove an analog to the McMillan theorem and calculate the Hausdorff dimensions of random fractals defined in terms of the limit behavior of empirical measures generated by finite genetic lines. In this setting, the role of Shannon’s entropy is played by the Kullback–Leibler divergence, and the Hausdorff dimensions are computed by means of the so-called Billingsley–Kullback entropy, defined in the paper.

  8. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    International audience; Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  9. MINIMUM ENTROPY DECONVOLUTION OF ONE- AND MULTI-DIMENSIONAL NON-GAUSSIAN LINEAR RANDOM PROCESSES

    Institute of Scientific and Technical Information of China (English)

    程乾生

    1990-01-01

    The minimum entropy deconvolution is considered as one of the methods for decomposing non-Gaussian linear processes. The concept of peakedness of a system response sequence is presented and its properties are studied. With the aid of the peakedness, the convergence theory of the minimum entropy deconvolution is established. The problem of the minimum entropy deconvolution of multi-dimensional non-Gaussian linear random processes is first investigated and the corresponding theory is given. In addition, the relation between the minimum entropy deconvolution and parameter method is discussed.

  10. Is neutron evaporation from highly excited nuclei a Poisson random process?

    International Nuclear Information System (INIS)

    Simbel, M.H.

    1982-01-01

    It is suggested that neutron emission from highly excited nuclei follows a Poisson random process. The continuous variable of the process is the excitation energy excess over the binding energy of the emitted neutrons and the discrete variable is the number of emitted neutrons. Cross sections for (HI,xn) reactions are analyzed using a formula containing a Poisson distribution function. The post- and pre-equilibrium components of the cross section are treated separately. The agreement between the predictions of this formula and the experimental results is very good. (orig.)
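    The Poisson hypothesis above reduces to a one-line formula for the probability of emitting x neutrons; the mean-multiplicity parameter in this sketch is an assumed stand-in for the quantity the paper ties to the excitation-energy excess:

```python
import math

def poisson_xn_probability(x, mean_multiplicity):
    """P(x neutrons emitted) under the Poisson hypothesis: the discrete
    variable is the number of emitted neutrons, and mean_multiplicity plays
    the role of the continuous excitation-energy excess variable."""
    return math.exp(-mean_multiplicity) * mean_multiplicity ** x / math.factorial(x)
```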

  11. A prospective randomized trial of content expertise versus process expertise in small group teaching.

    Science.gov (United States)

    Peets, Adam D; Cooke, Lara; Wright, Bruce; Coderre, Sylvain; McLaughlin, Kevin

    2010-10-14

    Effective teaching requires an understanding of both what (content knowledge) and how (process knowledge) to teach. While previous studies involving medical students have compared preceptors with greater or lesser content knowledge, it is unclear whether process expertise can compensate for deficient content expertise. Therefore, the objective of our study was to compare the effect of preceptors with process expertise to those with content expertise on medical students' learning outcomes in a structured small group environment. One hundred and fifty-one first year medical students were randomized to 11 groups for the small group component of the Cardiovascular-Respiratory course at the University of Calgary. Each group was then block randomized to one of three streams for the entire course: tutoring exclusively by physicians with content expertise (n = 5), tutoring exclusively by physicians with process expertise (n = 3), and tutoring by content experts for 11 sessions and process experts for 10 sessions (n = 3). After each of the 21 small group sessions, students evaluated their preceptors' teaching with a standardized instrument. Students' knowledge acquisition was assessed by an end-of-course multiple choice (EOC-MCQ) examination. Students rated the process experts significantly higher on each of the instrument's 15 items, including the overall rating. Students' mean score (±SD) on the EOC-MCQ exam was 76.1% (8.1) for groups taught by content experts, 78.2% (7.8) for the combination group and 79.5% (9.2) for process expert groups (p = 0.11). By linear regression, student performance was higher if they had been taught by process experts (regression coefficient 2.7 [0.1, 5.4], p < 0.05). Process experts can teach first year medical students within a structured small group environment; preceptors with process expertise result in at least equivalent, if not superior, student outcomes in this setting.

  12. Generation and monitoring of discrete stable random processes using multiple immigration population models

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, J O; Hopcraft, K I; Jakeman, E [Applied Mathematics Division, School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD (United Kingdom)

    2003-11-21

    Some properties of classical population processes that comprise births, deaths and multiple immigrations are investigated. The rates at which the immigrants arrive can be tailored to produce a population whose steady state fluctuations are described by a pre-selected distribution. Attention is focused on the class of distributions with a discrete stable law, which have power-law tails and whose moments and autocorrelation function do not exist. The separate problem of monitoring and characterizing the fluctuations is studied, analysing the statistics of individuals that leave the population. The fluctuations in the size of the population are transferred to the times between emigrants that form an intermittent time series of events. The emigrants are counted with a detector of finite dynamic range and response time. This is modelled through clipping the time series or saturating it at an arbitrary but finite level, whereupon its moments and correlation properties become finite. Distributions for the time to the first counted event and for the time between events exhibit power-law regimes that are characteristic of the fluctuations in population size. The processes provide analytical models with which properties of complex discrete random phenomena can be explored, and in addition provide generic means by which random time series encompassing a wide range of intermittent and other discrete random behaviour may be generated.
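    A rough Gillespie-style sketch of such a birth-death-multiple-immigration population, with an arbitrary (not tailored) choice of immigration rates, might look like the following; in the paper the immigration rates would instead be tuned so that the stationary fluctuations follow a pre-selected (e.g., discrete stable) law:

```python
import random

def simulate_bdmi(birth, death, immigration_rates, steps=5000, seed=7):
    """Event-by-event simulation of a population with per-individual birth and
    death rates; an arrival of k immigrants occurs at rate immigration_rates[k].
    Returns the population size after each event."""
    rng = random.Random(seed)
    ks = list(immigration_rates.keys())
    n, sizes = 0, []
    for _ in range(steps):
        rates = [birth * n, death * n] + [immigration_rates[k] for k in ks]
        total = sum(rates)
        # choose which event fires, with probability proportional to its rate
        r, acc, idx = rng.random() * total, 0.0, 0
        for idx, w in enumerate(rates):
            acc += w
            if r < acc:
                break
        if idx == 0:
            n += 1          # birth of one individual
        elif idx == 1:
            n -= 1          # death of one individual
        else:
            n += ks[idx - 2]  # multiple immigration of k individuals at once
        sizes.append(n)
    return sizes
```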

  13. Generation and monitoring of discrete stable random processes using multiple immigration population models

    International Nuclear Information System (INIS)

    Matthews, J O; Hopcraft, K I; Jakeman, E

    2003-01-01

    Some properties of classical population processes that comprise births, deaths and multiple immigrations are investigated. The rates at which the immigrants arrive can be tailored to produce a population whose steady state fluctuations are described by a pre-selected distribution. Attention is focused on the class of distributions with a discrete stable law, which have power-law tails and whose moments and autocorrelation function do not exist. The separate problem of monitoring and characterizing the fluctuations is studied, analysing the statistics of individuals that leave the population. The fluctuations in the size of the population are transferred to the times between emigrants that form an intermittent time series of events. The emigrants are counted with a detector of finite dynamic range and response time. This is modelled through clipping the time series or saturating it at an arbitrary but finite level, whereupon its moments and correlation properties become finite. Distributions for the time to the first counted event and for the time between events exhibit power-law regimes that are characteristic of the fluctuations in population size. The processes provide analytical models with which properties of complex discrete random phenomena can be explored, and in addition provide generic means by which random time series encompassing a wide range of intermittent and other discrete random behaviour may be generated

  14. Simulation study on characteristics of long-range interaction in randomly asymmetric exclusion process

    Science.gov (United States)

    Zhao, Shi-Bo; Liu, Ming-Zhe; Yang, Lan-Ying

    2015-04-01

    In this paper we investigate the dynamics of an asymmetric exclusion process on a one-dimensional lattice with long-range hopping and random update via Monte Carlo simulations. Particles in the model first try to hop over successive unoccupied sites with a probability q, which differs from previous exclusion process models; the probability q may represent random access of particles. Numerical simulations for stationary particle currents, density profiles, and phase diagrams are obtained. There are three possible stationary phases in the system: the low density (LD) phase, the high density (HD) phase, and the maximal current (MC) phase. Interestingly, the bulk density in the LD phase tends to zero, while the MC phase is governed by α, β, and q. The HD phase is nearly the same as in the normal TASEP, determined by the exit rate β. Theoretical analysis is in good agreement with the simulation results. The proposed model may provide a better understanding of random interaction dynamics in complex systems. Project supported by the National Natural Science Foundation of China (Grant Nos. 41274109 and 11104022), the Fund for Sichuan Youth Science and Technology Innovation Research Team (Grant No. 2011JTD0013), and the Creative Team Program of Chengdu University of Technology.
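    One possible Monte Carlo implementation of a single random-sequential update, under stated assumptions about how the long-range hop interacts with the open boundaries (the paper's exact update rule may differ in detail):

```python
import random

def tasep_long_range_step(lattice, alpha, beta, q, rng):
    """One random-sequential update of a long-range exclusion process on an
    open 1-D lattice of 0/1 occupations. A chosen particle hops to the next
    site; with probability q it instead jumps over the whole run of empty
    sites ahead of it. alpha and beta are the entry and exit rates."""
    L = len(lattice)
    i = rng.randrange(-1, L)          # -1 denotes the entry reservoir
    if i == -1:
        if lattice[0] == 0 and rng.random() < alpha:
            lattice[0] = 1            # injection at the left boundary
        return
    if lattice[i] == 0:
        return                        # empty site: nothing to move
    if i == L - 1:
        if rng.random() < beta:
            lattice[i] = 0            # particle exits at the right boundary
        return
    if lattice[i + 1] == 0:
        j = i + 1
        if rng.random() < q:          # long-range hop over the empty run
            while j + 1 < L and lattice[j + 1] == 0:
                j += 1
        lattice[i], lattice[j] = 0, 1
```

    Sweeping α, β, and q and measuring the bulk density and current over many such steps reproduces the kind of phase diagram discussed in the abstract.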

  15. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

    Full Text Available The light emitting diode (LED) lamp has attracted increasing interest in the field of lighting systems due to its low energy consumption and long lifetime. For different functions (i.e., illumination and color), it may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes challenging. In this paper, we assume that the system has two performance characteristics and that each is governed by a random effects Gamma process, where the random effects capture unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function, and via the copula function a reliability assessment model is proposed. Since the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
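    The dependence structure can be sketched by sampling increment pairs from a Frank copula via inversion of its conditional CDF; here the margins are simplified to shape-1 gamma (i.e., exponential) increments, and all parameter values and names are illustrative:

```python
import math
import random

def frank_conditional_sample(u, theta, rng):
    """Draw v given u from a Frank copula with parameter theta by inverting
    the conditional CDF C(v | u)."""
    w = rng.random()
    a = math.exp(-theta * u)
    return -math.log(1.0 + w * (1.0 - math.exp(-theta)) /
                     (w * (a - 1.0) - a)) / theta

def dependent_degradation_increments(theta, scale1, scale2, n, seed=11):
    """Sample n pairs of degradation increments for two performance
    characteristics. Margins are shape-1 gamma (exponential) increments for
    simplicity; theta > 0 induces positive dependence between the pair."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u = rng.random()
        v = frank_conditional_sample(u, theta, rng)
        # inverse-CDF transform of the uniforms to exponential margins
        pairs.append((-scale1 * math.log(1.0 - u), -scale2 * math.log(1.0 - v)))
    return pairs
```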

  16. [The third lumbar transverse process syndrome treated with acupuncture at zygapophyseal joint and transverse process: a randomized controlled trial].

    Science.gov (United States)

    Li, Fangling; Bi, Dingyan

    2017-08-12

    To explore the difference in effect for the third lumbar transverse process syndrome between acupuncture mainly at zygapophyseal joint and transverse process and conventional acupuncture. Eighty cases were randomly assigned into an observation group and a control group, 40 cases in each one. In the observation group, patients were treated with acupuncture at zygapophyseal joint, transverse process, the entry point of the superior gluteus nerve into the hip and Weizhong (BL 40), and those in the control group were treated with acupuncture at Qihaishu (BL 24), Jiaji (EX-B 2) of L2-L4, the same superior gluteus nerve point and Weizhong (BL 40). The treatment was given 6 times a week for 2 weeks, once a day. The visual analogue scale (VAS), Japanese Orthopaedic Association (JOA) low back pain score and simplified Chinese Oswestry disability index (SC-ODI) were observed before and after treatment as well as 6 months after treatment, and the clinical effects were evaluated. The total effective rate in the observation group was 95.0% (38/40), which was significantly higher than 82.5% (33/40) in the control group (P < 0.05). Acupuncture mainly at zygapophyseal joint and transverse process achieves good effect for the third lumbar transverse process syndrome, better than conventional acupuncture at relieving pain and improving lumbar function and quality of life.

  17. Finding Order in Randomness: Single-Molecule Studies Reveal Stochastic RNA Processing | Center for Cancer Research

    Science.gov (United States)

    Producing a functional eukaryotic messenger RNA (mRNA) requires the coordinated activity of several large protein complexes to initiate transcription, elongate nascent transcripts, splice together exons, and cleave and polyadenylate the 3’ end. Kinetic competition between these various processes has been proposed to regulate mRNA maturation, but this model could lead to multiple, randomly determined, or stochastic, pathways or outcomes. Regulatory checkpoints have been suggested as a means of ensuring quality control. However, current methods have been unable to tease apart the contributions of these processes at a single gene or on a time scale that could provide mechanistic insight. To begin to investigate the kinetic relationship between transcription and splicing, Daniel Larson, Ph.D., of CCR’s Laboratory of Receptor Biology and Gene Expression, and his colleagues employed a single-molecule RNA imaging approach to monitor production and processing of a human β-globin reporter gene in living cells.

  18. A Correlated Random Effects Model for Non-homogeneous Markov Processes with Nonignorable Missingness.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2013-05-01

    Life history data arising in clusters with prespecified assessment time points for patients often feature incomplete data, since patients may choose to visit the clinic based on their needs. Markov process models provide a useful tool for describing disease progression in life history data. The literature mainly focuses on time-homogeneous processes. In this paper we develop methods to deal with non-homogeneous Markov processes with incomplete clustered life history data. A correlated random effects model is developed to deal with the nonignorable missingness, and a time transformation is employed to address the non-homogeneity in the transition model. Maximum likelihood estimation based on the Monte Carlo EM algorithm is advocated for parameter estimation. Simulation studies demonstrate that the proposed method works well in many situations. We also apply this method to an Alzheimer's disease study.

  19. Quasi-steady-state analysis of two-dimensional random intermittent search processes

    KAUST Repository

    Bressloff, Paul C.

    2011-06-01

    We use perturbation methods to analyze a two-dimensional random intermittent search process, in which a searcher alternates between a diffusive search phase and a ballistic movement phase whose velocity direction is random. A hidden target is introduced within a rectangular domain with reflecting boundaries. If the searcher moves within range of the target and is in the search phase, it has a chance of detecting the target. A quasi-steady-state analysis is applied to the corresponding Chapman-Kolmogorov equation. This generates a reduced Fokker-Planck description of the search process involving a nonzero drift term and an anisotropic diffusion tensor. In the case of a uniform direction distribution, for which there is zero drift, and isotropic diffusion, we use the method of matched asymptotics to compute the mean first passage time (MFPT) to the target, under the assumption that the detection range of the target is much smaller than the size of the domain. We show that an optimal search strategy exists, consistent with previous studies of intermittent search in a radially symmetric domain that were based on a decoupling or moment closure approximation. We also show how the decoupling approximation can break down in the case of biased search processes. Finally, we analyze the MFPT in the case of anisotropic diffusion and find that anisotropy can be useful when the searcher starts from a fixed location. © 2011 American Physical Society.

  20. Random Walk on a Perturbation of the Infinitely-Fast Mixing Interchange Process

    Science.gov (United States)

    Salvi, Michele; Simenhaus, François

    2018-03-01

    We consider a random walk in dimension d ≥ 1 in a dynamic random environment evolving as an interchange process with rate γ > 0. We prove that, if we choose γ large enough, almost surely the empirical velocity of the walker X_t/t eventually lies in an arbitrarily small ball around the annealed drift. This statement is thus a perturbation of the case γ = +∞, where the environment is refreshed between each step of the walker. We extend, in three ways, part of the results of Huveneers and Simenhaus (Electron J Probab 20(105):42, 2015), where the environment was given by the 1-dimensional exclusion process: (i) we deal with any dimension d ≥ 1; (ii) we treat the much more general interchange process, where each particle carries a transition vector chosen according to an arbitrary law μ; (iii) we show that X_t/t is not only in the same direction as the annealed drift, but also close to it.

  1. Characteristics of the probability function for three random-walk models of reaction--diffusion processes

    International Nuclear Information System (INIS)

    Musho, M.K.; Kozak, J.J.

    1984-01-01

    A method is presented for calculating exactly the relative width (σ²)^{1/2}/⟨n⟩, the skewness γ₁, and the kurtosis γ₂ characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere [P. A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)], is based on the use of group-theoretic arguments within the framework of the theory of finite Markov processes

  2. Random Walk on a Perturbation of the Infinitely-Fast Mixing Interchange Process

    Science.gov (United States)

    Salvi, Michele; Simenhaus, François

    2018-05-01

    We consider a random walk in dimension d ≥ 1 in a dynamic random environment evolving as an interchange process with rate γ > 0. We prove that, if we choose γ large enough, almost surely the empirical velocity of the walker X_t/t eventually lies in an arbitrarily small ball around the annealed drift. This statement is thus a perturbation of the case γ = +∞, where the environment is refreshed between each step of the walker. We extend, in three ways, part of the results of Huveneers and Simenhaus (Electron J Probab 20(105):42, 2015), where the environment was given by the 1-dimensional exclusion process: (i) we deal with any dimension d ≥ 1; (ii) we treat the much more general interchange process, where each particle carries a transition vector chosen according to an arbitrary law μ; (iii) we show that X_t/t is not only in the same direction as the annealed drift, but also close to it.

  3. Quasi-steady-state analysis of two-dimensional random intermittent search processes

    KAUST Repository

    Bressloff, Paul C.; Newby, Jay M.

    2011-01-01

    We use perturbation methods to analyze a two-dimensional random intermittent search process, in which a searcher alternates between a diffusive search phase and a ballistic movement phase whose velocity direction is random. A hidden target is introduced within a rectangular domain with reflecting boundaries. If the searcher moves within range of the target and is in the search phase, it has a chance of detecting the target. A quasi-steady-state analysis is applied to the corresponding Chapman-Kolmogorov equation. This generates a reduced Fokker-Planck description of the search process involving a nonzero drift term and an anisotropic diffusion tensor. In the case of a uniform direction distribution, for which there is zero drift, and isotropic diffusion, we use the method of matched asymptotics to compute the mean first passage time (MFPT) to the target, under the assumption that the detection range of the target is much smaller than the size of the domain. We show that an optimal search strategy exists, consistent with previous studies of intermittent search in a radially symmetric domain that were based on a decoupling or moment closure approximation. We also show how the decoupling approximation can break down in the case of biased search processes. Finally, we analyze the MFPT in the case of anisotropic diffusion and find that anisotropy can be useful when the searcher starts from a fixed location. © 2011 American Physical Society.

  4. Spherical particle Brownian motion in viscous medium as non-Markovian random process

    International Nuclear Information System (INIS)

    Morozov, Andrey N.; Skripkin, Alexey V.

    2011-01-01

    The Brownian motion of a spherical particle in an infinite medium is described by the conventional methods and integral transforms considering the entrainment of surrounding particles of the medium by the Brownian particle. It is demonstrated that fluctuations of the Brownian particle velocity represent a non-Markovian random process. The features of Brownian motion in short time intervals and in small displacements are considered. -- Highlights: → Description of Brownian motion considering the entrainment of medium is developed. → We find the equations for statistical characteristics of impulse fluctuations. → Brownian motion at small time intervals is considered. → Theoretical results and experimental data are compared.

  5. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
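    The moving-average and autoregressive ingredients above can be combined in a short simulation sketch, here an ARMA(1,1) recursion with Gaussian pulses; the parameter values are illustrative, not those fitted to 3C 273:

```python
import random

def simulate_arma(phi, theta, n, sigma=1.0, seed=5):
    """Generate n samples of the ARMA(1,1) process
        x[t] = phi * x[t-1] + e[t] + theta * e[t-1]
    with independent Gaussian pulses e[t] of standard deviation sigma,
    combining the autoregressive (phi) and moving-average (theta) models."""
    rng = random.Random(seed)
    x, e_prev, out = 0.0, 0.0, []
    for _ in range(n):
        e = rng.gauss(0.0, sigma)
        x = phi * x + e + theta * e_prev
        e_prev = e
        out.append(x)
    return out
```

    For phi = 0.8 and theta = 0.3 the lag-1 autocorrelation of the generated series is strongly positive, which is the kind of pulse-memory structure such time-domain models capture.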

  6. An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System

    Science.gov (United States)

    Helmy, Tarek; Fatai, Anifowose; Sallam, El-Sayed

    PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency of intensive operations required by networked sensors with minimal hardware requirements. The existing process scheduling algorithms of PicOS, a commercial tiny, low-footprint, real-time operating system, have their associated drawbacks. An efficient alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, confirmed to be efficient and fair on average, and recommended for implementation in PicOS. Simulations were carried out, and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version over the existing ones. The results show that the randomized algorithm is the most attractive for implementation in PicOS, since it is the fairest and has the least AWT and ATT, on average, over the other non-preemptive scheduling algorithms implemented in this paper.
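    A toy version of the comparison, assuming non-preemptive execution of a fixed ready queue (function names and burst values are hypothetical, not from PicOS itself):

```python
import random

def average_waiting_time(burst_times, order):
    """AWT of a non-preemptive schedule that runs jobs in the given order:
    each job waits for the total burst time of everything scheduled before it."""
    t, waits = 0.0, []
    for j in order:
        waits.append(t)
        t += burst_times[j]
    return sum(waits) / len(waits)

def randomized_schedule(burst_times, seed=2):
    """Randomized selection policy: run the ready jobs in a uniformly random
    order, a sketch of the policy evaluated against the existing algorithms."""
    order = list(range(len(burst_times)))
    random.Random(seed).shuffle(order)
    return order
```

    For any burst profile, shortest-job-first and longest-job-first bound the AWT of every ordering, so the randomized schedule always lands between them.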

  7. Increased certification of semi-device independent random numbers using many inputs and more post-processing

    International Nuclear Information System (INIS)

    Mironowicz, Piotr; Tavakoli, Armin; Hameedi, Alley; Marques, Breno; Bourennane, Mohamed; Pawłowski, Marcin

    2016-01-01

    Quantum communication with systems of dimension larger than two provides advantages in information processing tasks. Examples include higher rates of key distribution and random number generation. The main disadvantage of using such multi-dimensional quantum systems is the increased complexity of the experimental setup. Here, we analyze a not-so-obvious problem: the relation between randomness certification and computational requirements of the post-processing of experimental data. In particular, we consider semi-device independent randomness certification from an experiment using a four dimensional quantum system to violate the classical bound of a random access code. Using state-of-the-art techniques, a smaller quantum violation requires more computational power to demonstrate randomness, which at some point becomes impossible with today’s computers although the randomness is (probably) still there. We show that by dedicating more input settings of the experiment to randomness certification, then by more computational postprocessing of the experimental data which corresponds to a quantum violation, one may increase the amount of certified randomness. Furthermore, we introduce a method that significantly lowers the computational complexity of randomness certification. Our results show how more randomness can be generated without altering the hardware and indicate a path for future semi-device independent protocols to follow. (paper)

  8. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering.

    Science.gov (United States)

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.
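    As a scalar stand-in for the compound Poisson process on a rotation group, one can sketch the forward-scattering model with Gaussian per-event deflections; the optical depth, deflection width, and function names are assumptions for illustration only:

```python
import random

def transmitted_angles(optical_depth, kappa, n_waves, seed=4):
    """Compound-Poisson sketch of forward multiple scattering through a slab:
    each wave suffers N ~ Poisson(optical_depth) scattering events, and its
    exit angle is the sum of N independent Gaussian deflections of width
    kappa (a scalar simplification of the process on rotations)."""
    rng = random.Random(seed)
    angles = []
    for _ in range(n_waves):
        # draw N from a Poisson law via exponential inter-event spacings
        n, s = 0, rng.expovariate(1.0)
        while s < optical_depth:
            n += 1
            s += rng.expovariate(1.0)
        angles.append(sum(rng.gauss(0.0, kappa) for _ in range(n)))
    return angles
```

    The angular spread of the transmitted intensity then widens with optical depth, which is the signature the estimation method inverts to recover the heterogeneity of the medium.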

  9. The emergence of typical entanglement in two-party random processes

    International Nuclear Information System (INIS)

    Dahlsten, O C O; Oliveira, R; Plenio, M B

    2007-01-01

    We investigate the entanglement within a system undergoing a random, local process. We find that there is initially a phase of very fast generation and spread of entanglement. At the end of this phase the entanglement is typically maximal. In Oliveira et al (2007 Phys. Rev. Lett. 98 130502) we proved that the maximal entanglement is reached to a fixed arbitrary accuracy within O(N³) steps, where N is the total number of qubits. Here we provide a detailed and more pedagogical proof. We demonstrate that one can use the so-called stabilizer gates to simulate this process efficiently on a classical computer. Furthermore, we discuss three ways of identifying the transition from the phase of rapid spread of entanglement to the stationary phase: (i) the time when saturation of the maximal entanglement is achieved, (ii) the cutoff moment, when the entanglement probability distribution is practically stationary, and (iii) the moment block entanglement exhibits volume scaling. We furthermore investigate the mixed state and multipartite setting. Numerically, we find that the mutual information appears to behave similarly to the quantum correlations and that there is a well-behaved phase-space flow of entanglement properties towards an equilibrium. We describe how the emergence of typical entanglement can be used to create a much simpler tripartite entanglement description. The results form a bridge between certain abstract results concerning typical (also known as generic) entanglement relative to an unbiased distribution on pure states and the more physical picture of distributions emerging from random local interactions

  10. Random mutagenesis of aspergillus niger and process optimization for enhanced production of glucose oxidase

    International Nuclear Information System (INIS)

    Haq, I.; Nawaz, A.; Mukhtar, A.N.H.; Mansoor, H.M.Z.; Ameer, S.M.

    2014-01-01

The study deals with the improvement of the wild strain Aspergillus niger IIB-31 through random mutagenesis using chemical mutagens. The main aim of the work was to enhance the glucose oxidase (GOX) yield of the wild strain (24.57±0.01 U/g of cell mass) through random mutagenesis and process optimization. The wild strain of Aspergillus niger IIB-31 was treated with chemical mutagens such as ethyl methane sulphonate (EMS) and nitrous acid for this purpose. Ninety-eight mutagen-treated variants showing positive results were picked and screened for glucose oxidase production using submerged fermentation. The EMS-treated mutant strain E45 gave the highest glucose oxidase production (69.47±0.01 U/g of cell mass), approximately threefold greater than the wild strain IIB-31. The preliminary cultural conditions for the production of glucose oxidase from strain E45 using submerged fermentation were also optimized. The highest GOX yield was obtained using 8% glucose as carbon source and 0.3% peptone as nitrogen source at a medium pH of 7.0 after an incubation period of 72 h at 30 °C. (author)

  11. Topological characterization of antireflective and hydrophobic rough surfaces: are random process theory and fractal modeling applicable?

    Science.gov (United States)

    Borri, Claudia; Paggi, Marco

    2015-02-01

The random process theory (RPT) has been widely applied to predict the joint probability distribution functions (PDFs) of asperity heights and curvatures of rough surfaces. A check of the predictions of RPT against the actual statistics of numerically generated random fractal surfaces and of real rough surfaces has been only partially undertaken. The present experimental and numerical study provides a detailed critical comparison on this matter, offering insight into the capabilities and limitations of applying RPT and fractal modeling to antireflective and hydrophobic rough surfaces, two important types of textured surfaces. A multi-resolution experimental campaign using a confocal profilometer with different lenses is carried out and a comprehensive software for the statistical description of rough surfaces is developed. It is found that the topology of the analyzed textured surfaces cannot be fully described according to RPT and fractal modeling. The following complexities emerge: (i) the presence of cut-offs or bi-fractality in the power-law power-spectral density (PSD) functions; (ii) a more pronounced shift of the PSD with changing resolution than expected from fractal modeling; (iii) inaccuracy of the RPT in describing the joint PDFs of asperity heights and curvatures of textured surfaces; (iv) lack of resolution-invariance of the joint PDFs of textured surfaces in the case of special surface treatments, not accounted for by fractal modeling.
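The power-law PSD check described above can be sketched in one dimension: synthesize a self-affine profile with a prescribed spectral slope, then re-estimate the slope from its periodogram. This is a consistency check under stated assumptions (the Hurst exponent H, the PSD slope convention C(q) ~ q^-(1+2H), and the fitting band are all chosen for illustration), not the paper's multi-resolution analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Spectral synthesis of a 1-D self-affine profile with Hurst exponent H:
# one-sided PSD C(q) ~ q^{-(1+2H)}  (illustrative convention)
n, H = 4096, 0.8
k = np.arange(1, n // 2 + 1)
amp = k ** (-(1 + 2 * H) / 2)                 # sqrt of the target PSD
phase = rng.uniform(0, 2 * np.pi, size=k.size)
spec = np.zeros(n // 2 + 1, dtype=complex)
spec[1:] = amp * np.exp(1j * phase)           # random phases, fixed amplitudes
z = np.fft.irfft(spec, n)                     # synthetic surface profile

# Re-estimate the PSD with a periodogram and fit the log-log slope
psd = np.abs(np.fft.rfft(z)) ** 2
psd_k = psd[1:]                               # drop the k = 0 bin
sel = (k >= 8) & (k <= n // 8)                # avoid the lowest/highest bins
slope = np.polyfit(np.log(k[sel]), np.log(psd_k[sel]), 1)[0]

print(f"target slope: {-(1 + 2 * H):.2f}, fitted slope: {slope:.2f}")
```

With deterministic amplitudes the recovered slope matches the target almost exactly; with random (e.g. Rayleigh-distributed) amplitudes the periodogram scatters around the power law, which is closer to the situation faced with measured surfaces.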

  12. Randomized clinical trials as reflexive-interpretative process in patients with rheumatoid arthritis: a qualitative study.

    Science.gov (United States)

    de Jorge, Mercedes; Parra, Sonia; de la Torre-Aboki, Jenny; Herrero-Beaumont, Gabriel

    2015-08-01

Patients in randomized clinical trials have to adapt themselves to a restricted language to capture the necessary information to determine the safety and efficacy of a new treatment. The aim of this study was to explore the experience of patients with rheumatoid arthritis after completing their participation in a biologic therapy randomized clinical trial lasting 3 years. A qualitative approach was used. The information was collected using 15 semi-structured interviews of patients with rheumatoid arthritis. Data collection was guided by the emergent analysis until no more relevant variations in the categories were found. The data were analysed using the grounded theory method. The objective of the patients when entering the study was to improve their quality of life by initiating the treatment. However, the experience changed the significance of the illness as they acquired skills and practical knowledge related to the management of their disease. The category "Interactional Empowerment" emerged as the core category, as it represented the participative experience in a clinical trial. The process integrates the following categories: "weight of systematisation", "working together", and the significance of the experience: "the duties". These categories evolved simultaneously. The clinical trial monitoring activities enabled patients to engage in a reflexive-interpretative mechanism that transformed the emotional and symbolic significance of their disease and improved the empowerment of the patient. A better communicative strategy with the health professionals, the relatives of the patients, and the community was also achieved.

  13. Cognitive processing therapy versus supportive counseling for acute stress disorder following assault: a randomized pilot trial.

    Science.gov (United States)

    Nixon, Reginald D V

    2012-12-01

The study tested the efficacy and tolerability of cognitive processing therapy (CPT) for survivors of assault with acute stress disorder. Participants (N=30) were randomly allocated to CPT or supportive counseling. Therapy comprised six individual weekly sessions of 90-min duration. Independent diagnostic assessment for PTSD was conducted at posttreatment. Participants completed self-report measures of posttraumatic stress, depression, and negative trauma-related beliefs at pretreatment, posttreatment, and 6-month follow-up. Results indicated that both interventions were successful in reducing symptoms at posttreatment with no statistical difference between the two; within- and between-group effect sizes and the proportion of participants not meeting PTSD criteria were greater for CPT. Treatment gains were maintained for both groups at 6-month follow-up.

  14. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    Science.gov (United States)

    Kozma, Robert; Puljic, Marko

    2013-09-01

Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation which has distinct advantages compared to traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. The introduced Monte-Carlo simulations obtained by parallel computing point to the importance of computer implementations using very large-scale integration (VLSI) and analog platforms.
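A minimal sketch of the kind of probabilistic cellular automaton that underlies neuropercolation models: each cell follows its local 5-site majority but disobeys with noise probability eps. The lattice size, the specific majority rule, and eps = 0.3 are illustrative assumptions, not the model of the paper; above the ordering transition the activity stays disordered near 0.5:

```python
import numpy as np

rng = np.random.default_rng(8)

def step(grid, eps, rng):
    # noisy local-majority update over the 4-neighbourhood plus self
    # (periodic boundaries); each cell disobeys the majority with prob. eps
    total = (grid
             + np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
             + np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
    majority = (total >= 3).astype(np.int8)
    flip = rng.random(grid.shape) < eps
    return np.where(flip, 1 - majority, majority).astype(np.int8)

n, eps, n_steps = 64, 0.3, 200  # eps chosen above the transition -> disorder
grid = (rng.random((n, n)) < 0.5).astype(np.int8)
for _ in range(n_steps):
    grid = step(grid, eps, rng)
activity = float(grid.mean())
print(f"mean activity after {n_steps} steps (eps={eps}): {activity:.3f}")
```

Lowering eps below a critical value makes the lattice order into a predominantly 0 or predominantly 1 state; this noise-driven phase transition is the kind of spatio-temporal discontinuity the record emphasizes.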

  15. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

In this report the mathematical theory of random-point processes is reviewed and it is shown how use of the theory can obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D), energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line 'learning' of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori
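The core of the detection problem can be illustrated in its simplest, non-recursive form: for Poisson counts, the likelihood ratio between "background plus source" and "background only" is monotone in the observed count, so the optimal (Neyman-Pearson) test reduces to a count threshold. The rates and dwell time below are illustrative; the report's doubly stochastic and recursive 2-D machinery is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)

lam_b, lam_s, T = 5.0, 3.0, 1.0  # background/source rates (cps), dwell time (illustrative)

def lr_threshold(alpha, lam0, T):
    # smallest integer c with P(N >= c | background only) <= alpha
    c, cdf = 0, 0.0
    p = np.exp(-lam0 * T)            # Poisson pmf at 0
    while cdf < 1 - alpha:
        cdf += p
        p *= lam0 * T / (c + 1)      # recurrence: pmf(c+1) from pmf(c)
        c += 1
    return c

c = lr_threshold(0.05, lam_b, T)

trials = 200_000
n_h0 = rng.poisson(lam_b * T, size=trials)            # source absent
n_h1 = rng.poisson((lam_b + lam_s) * T, size=trials)  # source present
pfa = float(np.mean(n_h0 >= c))    # false-alarm probability
pd = float(np.mean(n_h1 >= c))     # detection probability
print(f"threshold c = {c}, P_FA = {pfa:.4f}, P_D = {pd:.4f}")
```

The threshold guarantees the false-alarm rate by construction; the achievable detection probability then depends on the source-to-background ratio and the dwell time, which is what the multichannel extension exploits across energy bins.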

  16. Random Gap Detection Test (RGDT) performance of individuals with central auditory processing disorders from 5 to 25 years of age.

    Science.gov (United States)

    Dias, Karin Ziliotto; Jutras, Benoît; Acrani, Isabela Olszanski; Pereira, Liliane Desgualdo

    2012-02-01

The aim of the present study was to assess the auditory temporal resolution ability of individuals with central auditory processing disorders, to examine the maturation effect and to investigate the relationship between performance on a temporal resolution test and performance on other central auditory tests. Participants were divided into two groups: 131 with Central Auditory Processing Disorder and 94 with normal auditory processing. They had pure-tone air-conduction thresholds no poorer than 15 dB HL bilaterally, normal admittance measures and presence of acoustic reflexes. Also, they were assessed with a central auditory test battery. Participants who failed one or more tests were included in the Central Auditory Processing Disorder group, and those in the control group obtained normal performance on all tests. Following the auditory processing assessment, the Random Gap Detection Test was administered to the participants. A three-way ANOVA was performed. Correlation analyses were also done between the four Random Gap Detection Test subtests as well as between Random Gap Detection Test data and the other auditory processing test results. There was a significant difference between the age-group performances in children with and without Central Auditory Processing Disorder. Also, 48% of children with Central Auditory Processing Disorder failed the Random Gap Detection Test, and the percentage decreased as a function of age. The highest percentage (86%) was found in the 5-6 year-old children. Furthermore, results revealed a strong significant correlation between the four Random Gap Detection Test subtests. There was a modest correlation between the Random Gap Detection Test results and the dichotic listening tests. No significant correlation was observed between the Random Gap Detection Test data and the results of the other tests in the battery. The Random Gap Detection Test should not be administered to children younger than 7 years old because

  17. Brain training game improves executive functions and processing speed in the elderly: a randomized controlled trial.

    Science.gov (United States)

    Nouchi, Rui; Taki, Yasuyuki; Takeuchi, Hikaru; Hashizume, Hiroshi; Akitsuki, Yuko; Shigemune, Yayoi; Sekiguchi, Atsushi; Kotozaki, Yuka; Tsukiura, Takashi; Yomogida, Yukihito; Kawashima, Ryuta

    2012-01-01

The beneficial effects of brain training games are expected to transfer to other cognitive functions, but these transfer effects are poorly understood. Here we investigate the impact of a brain training game (Brain Age) on cognitive functions in the elderly. Thirty-two elderly volunteers were recruited through an advertisement in the local newspaper and randomly assigned to either of two game groups (Brain Age, Tetris). This study was completed by 14 of the 16 members in the Brain Age group and 14 of the 16 members in the Tetris group. To maximize the benefit of the interventions, all participants were non-gamers who reported playing less than one hour of video games per week over the past 2 years. Participants in both the Brain Age and the Tetris groups played their game for about 15 minutes per day, at least 5 days per week, for 4 weeks. Each group played for a total of about 20 days. Measures of cognitive function were taken before and after training and fell into four categories (global cognitive status, executive functions, attention, and processing speed). Results showed that the effects of the brain training game transferred to executive functions and to processing speed. However, the brain training game showed no transfer effect on global cognitive status or attention. Our results showed that playing Brain Age for 4 weeks could improve cognitive functions (executive functions and processing speed) in the elderly. This suggests that the elderly may be able to improve executive functions and processing speed through short-term training. The results need replication in large samples. Long-term effects and relevance for every-day functioning remain uncertain as yet. UMIN Clinical Trial Registry 000002825.


  19. Design of Energy Aware Adder Circuits Considering Random Intra-Die Process Variations

    Directory of Open Access Journals (Sweden)

    Marco Lanuzza

    2011-04-01

Full Text Available Energy consumption is one of the main barriers to current high-performance designs. Moreover, the increased variability experienced in advanced process technologies implies further timing yield concerns and therefore intensifies this obstacle. Thus, proper techniques to achieve robust designs are a critical requirement for integrated circuit success. In this paper, the influence of intra-die random process variations is analyzed considering the particular case of the design of energy-aware adder circuits. Five well-known adder circuits were designed exploiting an industrial 45 nm static complementary metal-oxide semiconductor (CMOS) standard cell library. The designed adders were comparatively evaluated under different energy constraints. As a main result, the performed analysis demonstrates that, for a given energy budget, simpler circuits (which are conventionally identified as low-energy slow architectures) operating at higher power supply voltages can achieve a timing yield significantly better than more complex faster adders when used in low-power designs with supply voltages lower than nominal.

  20. Lattice Boltzmann simulation of the gas-solid adsorption process in reconstructed random porous media

    Science.gov (United States)

    Zhou, L.; Qu, Z. G.; Ding, T.; Miao, J. Y.

    2016-04-01

    The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early time for adsorption equilibrium because of the controlling factor of external resistance. External and internal resistances are dominant at small and large particle sizes, respectively. Particle size, under which the total resistance is minimum, ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.
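A simplified sketch of a QSGS-style reconstruction of random porous media, the first stage of the workflow described above. The directional growth probabilities of the full quartet structure generation set method are collapsed here into a single isotropic growth probability, and all parameter values (grid size, core density, target solid fraction) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def qsgs(nx, ny, cd, phi_solid, p_grow, rng):
    """Simplified quartet-structure-generation-set style reconstruction:
    seed solid growth cores with probability cd, then let them grow with
    per-neighbour probability p_grow until the target solid fraction
    phi_solid is reached. (The full QSGS method uses direction-dependent
    growth probabilities; here they are collapsed to one value.)"""
    solid = rng.random((nx, ny)) < cd           # growth cores
    target = int(phi_solid * nx * ny)
    while solid.sum() < target:
        # cells adjacent to solid (4-neighbourhood, periodic boundaries)
        nb = (np.roll(solid, 1, 0) | np.roll(solid, -1, 0) |
              np.roll(solid, 1, 1) | np.roll(solid, -1, 1))
        frontier = nb & ~solid
        solid |= frontier & (rng.random((nx, ny)) < p_grow)
    return solid

medium = qsgs(128, 128, cd=0.01, phi_solid=0.30, p_grow=0.3, rng=rng)
porosity = 1.0 - float(medium.mean())
print(f"porosity = {porosity:.3f}")
```

The resulting boolean field (solid vs. pore) is the kind of two-phase geometry on which boundary nodes would then be classified and the lattice Boltzmann flow/transport solvers run; the final solid fraction slightly overshoots the target because growth proceeds a full frontier sweep at a time.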

  1. Insights from random vibration analyses using multiple earthquake components

    International Nuclear Information System (INIS)

    DebChaudhury, A.; Gasparini, D.A.

    1981-01-01

The behavior of multi-degree-of-freedom systems subjected to multiple earthquake components is studied by the use of random vibration dynamic analyses. A linear system which has been decoupled into modes and has both translational and rotational degrees of freedom is analyzed. The seismic excitation is modelled as a correlated or uncorrelated, vector-valued, non-stationary random process having a Kanai-Tajimi type of frequency content. Non-stationarity is achieved by using a piecewise-linear strength function; therefore, almost any type of evolution and decay of an earthquake may be modelled. Also, in general, the components of the excitation have different frequency contents and strength functions, i.e. intensities and durations, and the correlations between components can vary with time. A state-space, modal, random vibration approach is used. Exact analytical expressions for both the state transition matrix and the evolutionary modal covariance matrix are utilized to compute time histories of modal RMS responses. Desired responses are then computed by modal superposition. Specifically, relative displacement, relative velocity and absolute acceleration responses are studied. An important advantage of such analyses is that RMS responses vary smoothly in time; therefore large time intervals may be used to generate response time histories. The modal superposition is exact; that is, all cross-correlation terms between modal responses are included. (orig./RW)
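For the stationary special case, the RMS response of a single mode under Kanai-Tajimi excitation can be sketched by frequency-domain integration, sigma^2 = integral of |H(w)|^2 S(w) dw. The paper's evolutionary, non-stationary covariance formulation is not reproduced, and the filter parameters (S0, wg, zg) and oscillator properties below are illustrative:

```python
import numpy as np

def kanai_tajimi(omega, s0, wg, zg):
    # Kanai-Tajimi one-sided ground acceleration PSD
    r = (omega / wg) ** 2
    return s0 * (1 + 4 * zg ** 2 * r) / ((1 - r) ** 2 + 4 * zg ** 2 * r)

def sdof_rms_displacement(wn, zeta, s0=1.0, wg=15.0, zg=0.6):
    # stationary RMS relative displacement of a SDOF oscillator:
    # sigma^2 = ∫ |H(w)|^2 S(w) dw, integrated on a fine uniform grid
    w = np.linspace(0.01, 200.0, 200_000)
    dw = w[1] - w[0]
    h2 = 1.0 / ((wn ** 2 - w ** 2) ** 2 + (2.0 * zeta * wn * w) ** 2)
    return float(np.sqrt(np.sum(h2 * kanai_tajimi(w, s0, wg, zg)) * dw))

rms_low_damping = sdof_rms_displacement(wn=10.0, zeta=0.02)
rms_high_damping = sdof_rms_displacement(wn=10.0, zeta=0.05)
print(f"sigma (2% damping): {rms_low_damping:.4f}")
print(f"sigma (5% damping): {rms_high_damping:.4f}")
```

As expected, lower modal damping gives a larger stationary RMS response; the paper's state-space approach generalizes this scalar picture to time-varying intensities and correlated multi-component input.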

  2. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
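The original spectral representation (OSR) that the paper starts from can be sketched for a scalar process: superpose cosines whose amplitudes are set by the target one-sided PSD and whose phases are independent uniform random variables. The PSD below is an illustrative low-pass shape, not the turbulence spectrum of the paper, and the two-random-variable "random function" reduction itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)

def target_psd(w):
    # illustrative one-sided PSD (low-pass shape)
    return 1.0 / (1.0 + w ** 2)

# classical spectral representation:
# X(t) = sum_k sqrt(2 S(w_k) dw) * cos(w_k t + phi_k),  phi_k ~ U(0, 2*pi)
nfreq, wmax = 1024, 20.0
dw = wmax / nfreq
wk = (np.arange(nfreq) + 0.5) * dw
phi = rng.uniform(0, 2 * np.pi, size=nfreq)
amp = np.sqrt(2 * target_psd(wk) * dw)

t = np.linspace(0.0, 150.0, 3001)   # window shorter than the 2*pi/dw period
x = (amp[:, None] * np.cos(wk[:, None] * t[None, :] + phi[:, None])).sum(axis=0)

# the process variance should match the integral of the one-sided PSD
var_target = float(np.sum(target_psd(wk)) * dw)   # ≈ ∫ S(w) dw
print(f"sample variance: {x.var():.3f}, target: {var_target:.3f}")
```

In the OSR every phase phi_k is an independent random variable (here 1024 of them); the paper's contribution is precisely to replace this high-dimensional randomness with functions of two elementary random variables so that a small representative point set suffices for PDEM.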

  3. A method of signal transmission path analysis for multivariate random processes

    International Nuclear Information System (INIS)

    Oguma, Ritsuo

    1984-04-01

A method for noise analysis called ''STP (signal transmission path) analysis'' is presented as a tool to identify noise sources and their propagation paths in multivariate random processes. The basic idea of the analysis is to identify, via time series analysis, the effective network for signal power transmission among variables in the system and to make use of this information in the noise analysis. In the present paper, we accomplish this through two signal processing steps; first, we estimate, using noise power contribution analysis, the variables which contribute most to the power spectrum of interest, and then evaluate the STPs for each pair of variables to identify the STPs which play a significant role in transmitting the generated noise to the variable under evaluation. The latter part of the analysis is executed through comparison of the partial coherence function and the newly introduced partial noise power contribution function. This paper presents the procedure of the STP analysis and demonstrates, using simulation data as well as Borssele PWR noise data, its effectiveness for the investigation of noise generation and propagation mechanisms. (author)
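Ordinary coherence, the building block behind the partial coherence comparison described above, can be estimated with a Welch-style segment average. This is a generic sketch, not the paper's STP procedure; the signal model (y equals x plus independent noise) and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def welch_coherence(x, y, nseg, nperseg):
    """Segment-averaged (Welch-style) magnitude-squared coherence."""
    win = np.hanning(nperseg)
    sxx = syy = sxy = 0.0
    for k in range(nseg):
        seg = slice(k * nperseg, (k + 1) * nperseg)
        fx = np.fft.rfft(win * x[seg])
        fy = np.fft.rfft(win * y[seg])
        sxx = sxx + np.abs(fx) ** 2          # auto-spectrum of x (accumulated)
        syy = syy + np.abs(fy) ** 2          # auto-spectrum of y
        sxy = sxy + fx * np.conj(fy)         # cross-spectrum
    return np.abs(sxy) ** 2 / (sxx * syy)

n, nperseg = 64 * 256, 256
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)             # y = x plus independent noise
coh = welch_coherence(x, y, nseg=n // nperseg, nperseg=nperseg)

# theory for y = x + noise: coh = 1 / (1 + var_noise / var_x) = 1 / 1.25 = 0.8
mean_coh = float(coh[1:-1].mean())
print(f"mean coherence: {mean_coh:.3f} (theory ≈ 0.80)")
```

Averaging over segments is essential: a single-segment (periodogram) estimate of coherence is identically 1. Partial coherence extends this estimate by first regressing out the other measured variables, which is what lets the STP analysis separate direct paths from indirect ones.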

  4. Leaving Distress Behind: A Randomized Controlled Study on Change in Emotional Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Berthoud, Laurent; Pascual-Leone, Antonio; Caspar, Franz; Tissot, Hervé; Keller, Sabine; Rohde, Kristina B; de Roten, Yves; Despland, Jean-Nicolas; Kramer, Ueli

    2017-01-01

The marked impulsivity and instability of clients suffering from borderline personality disorder (BPD) greatly challenge therapists' understanding and responsiveness. This may hinder the development of a constructive therapeutic relationship despite it being of particular importance in their treatment. Recent studies have shown that using a motive-oriented therapeutic relationship (MOTR), a possible operationalization of appropriate therapist responsiveness, can enhance treatment outcome for BPD. The overall objective of this study is to examine change in emotional processing in BPD clients following the therapist's use of MOTR. The present paper focuses on N = 50 cases, n = 25 taken from each of two conditions of a randomized controlled add-on effectiveness design. Clients were either allocated to a manual-based psychiatric-psychodynamic 10-session version of general psychiatric management (GPM), a borderline-specific treatment, or to a 10-session version of GPM augmented with MOTR. Emotional states were assessed using the Classification of Affective-Meaning States (Pascual-Leone & Greenberg, 2005) at intake, midtreatment, and in the penultimate session. Across treatment, early expressions of distress, especially the emotion state of global distress, were shown to decrease significantly (p = .00), and adaptive emotions were found to emerge; greater emotional variability and stronger outcome predictors were observed in the MOTR condition. The findings indicate initial emotional change in BPD clients in a relatively short time frame and suggest that the addition of MOTR to psychotherapeutic treatments is promising. Clinical implications are discussed.

  5. MMRW-BOOKS, Legacy books on slowing down, thermalization, particle transport theory, random processes in reactors

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2007-01-01

Description: Prof. M.M.R. Williams has now released three of his legacy books for free distribution: 1 - M.M.R. Williams: The Slowing Down and Thermalization of Neutrons, North-Holland Publishing Company - Amsterdam, 582 pages, 1966. Content: Part I - The Thermal Energy Region: 1. Introduction and Historical Review, 2. The Scattering Kernel, 3. Neutron Thermalization in an Infinite Homogeneous Medium, 4. Neutron Thermalization in Finite Media, 5. The Spatial Dependence of the Energy Spectrum, 6. Reactor Cell Calculations, 7. Synthetic Scattering Kernels. Part II - The Slowing Down Region: 8. Scattering Kernels in the Slowing Down Region, 9. Neutron Slowing Down in an Infinite Homogeneous Medium, 10. Neutron Slowing Down and Diffusion. 2 - M.M.R. Williams: Mathematical Methods in Particle Transport Theory, Butterworths, London, 430 pages, 1971. Content: 1. The General Problem of Particle Transport, 2. The Boltzmann Equation for Gas Atoms and Neutrons, 3. Boundary Conditions, 4. Scattering Kernels, 5. Some Basic Problems in Neutron Transport and Rarefied Gas Dynamics, 6. The Integral Form of the Transport Equation in Plane, Spherical and Cylindrical Geometries, 7. Exact Solutions of Model Problems, 8. Eigenvalue Problems in Transport Theory, 9. Collision Probability Methods, 10. Variational Methods, 11. Polynomial Approximations. 3 - M.M.R. Williams: Random Processes in Nuclear Reactors, Pergamon Press Oxford New York Toronto Sydney, 243 pages, 1974. Content: 1. Historical Survey and General Discussion, 2. Introductory Mathematical Treatment, 3. Applications of the General Theory, 4. Practical Applications of the Probability Distribution, 5. The Langevin Technique, 6. Point Model Power Reactor Noise, 7. The Spatial Variation of Reactor Noise, 8. Random Phenomena in Heterogeneous Reactor Systems, 9. Associated Fluctuation Problems, Appendix: Noise Equivalent Sources. Note to the user: Prof. M.M.R. Williams owns the copyright of these books and he authorises the OECD/NEA Data Bank

  6. Modelling estimation and analysis of dynamic processes from image sequences using temporal random closed sets and point processes with application to the cell exocytosis and endocytosis

    OpenAIRE

    Díaz Fernández, Ester

    2010-01-01

In this thesis, new models and methodologies are introduced for the analysis of dynamic processes characterized by image sequences with spatio-temporal overlapping. Spatio-temporal overlapping exists in many natural phenomena and should be addressed properly in several scientific disciplines, such as Microscopy, Material Sciences, Biology, Geostatistics or Communication Networks. This work is related to the Point Process and Random Closed Set theories, within Stochastic Ge...

  7. Efficient rare-event simulation for multiple jump events in regularly varying random walks and compound Poisson processes

    NARCIS (Netherlands)

    B. Chen (Bohan); J. Blanchet; C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2017-01-01

We propose a class of strongly efficient rare event simulation estimators for random walks and compound Poisson processes with a regularly varying increment/jump-size distribution in a general large deviations regime. Our estimator is based on an importance sampling strategy that hinges
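Why importance sampling pays off for regularly varying tails can be shown with a toy single-variable version: estimate P(X > b) for a Pareto target by sampling from a heavier-tailed Pareto proposal and weighting by the likelihood ratio. This is only a caricature of the authors' multiple-jump estimator for random walks; all parameters (tail indices, threshold, sample size) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)

alpha, alpha_p, b, n = 2.0, 0.5, 100.0, 100_000
# target: Pareto(alpha) on [1, inf); true tail P(X > b) = b ** -alpha = 1e-4

# importance sampling from the heavier-tailed Pareto(alpha_p) proposal
u = rng.random(n)
x = u ** (-1.0 / alpha_p)                        # inverse-CDF sampling from g
w = (alpha / alpha_p) * x ** (alpha_p - alpha)   # likelihood ratio f(x)/g(x)
est = float(np.mean(w * (x > b)))

# naive Monte Carlo for comparison: only ~n * 1e-4 = 10 hits expected
naive = float(np.mean(rng.random(n) ** (-1.0 / alpha) > b))

print(f"IS estimate:    {est:.2e}  (true value 1.0e-04)")
print(f"naive estimate: {naive:.2e}")
```

On the rare event the weight w = 4 x^{-1.5} is small and bounded, so the importance-sampling estimator has low relative variance, while naive Monte Carlo sees only a handful of hits at the same budget. For sums of heavy-tailed increments the proposal must additionally account for events caused by several large jumps, which is the regime the paper addresses.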

  8. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise.

    Science.gov (United States)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
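Diffusion entropy analysis, as described above, can be sketched for ordinary (Gaussian) diffusion, where the Shannon entropy of the walker PDF grows as S(t) = A + delta ln t with delta = 0.5. The walker count, bin count, and time points below are illustrative choices, and the Lévy-noise case of the paper is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# diffusion entropy analysis on ordinary Gaussian random walks:
# S(t) = A + delta * ln t, with delta = 0.5 for normal diffusion
walkers, tmax = 5000, 512
steps = rng.normal(size=(walkers, tmax))
paths = np.cumsum(steps, axis=1)

times = np.array([8, 16, 32, 64, 128, 256, 512])
entropies = []
for t in times:
    xs = paths[:, t - 1]                       # walker positions at time t
    hist, edges = np.histogram(xs, bins=60, density=True)
    dx = edges[1] - edges[0]
    p = hist[hist > 0]
    entropies.append(-np.sum(p * np.log(p)) * dx)   # differential Shannon entropy

delta = float(np.polyfit(np.log(times), np.array(entropies), 1)[0])
print(f"estimated scaling exponent delta = {delta:.3f} (0.5 expected for normal diffusion)")
```

For Lévy-stable step distributions the same procedure yields delta = 1/mu > 0.5, which is how the method distinguishes anomalous from normal diffusion without relying on the (possibly divergent) variance.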


  10. Use of Play Therapy in Nursing Process: A Prospective Randomized Controlled Study.

    Science.gov (United States)

    Sezici, Emel; Ocakci, Ayse Ferda; Kadioglu, Hasibe

    2017-03-01

Play therapy is a nursing intervention employed in multidisciplinary approaches to develop the social, emotional, and behavioral skills of children. In this study, we aim to determine the effects of play therapy on the social, emotional, and behavioral skills of pre-school children through the nursing process. A single-blind, prospective, randomized controlled study was undertaken. The design, conduct, and reporting of this study adhere to the Consolidated Standards of Reporting Trials (CONSORT) guidelines. The participants included 4- to 5-year-old kindergarten children with no oral or aural disabilities and parents who agreed to participate in the study. The Pre-school Child and Family Identification Form and the Social Competence and Behavior Evaluation Scale were used to gather data. Games from the play therapy literature targeting the nursing diagnoses determined after the preliminary test (fear, social disturbance, impaired social interactions, ineffective coping, anxiety) constituted the intervention of the study. There was no difference in the average scores of the children in the experimental and control groups in their Anger-Aggression (AA), Social Competence (SC), and Anxiety-Withdrawal (AW) scores beforehand (t = 0.015, p = .988; t = 0.084, p = .933; t = 0.214, p = .831, respectively). The differences between the experimental and control groups in average AA and SC scores were statistically significant in the post-test (t = 2.041, p = .045; t = 2.692, p = .009, respectively) and in the retests (t = 4.538, p = .000; t = 4.693, p = .000, respectively). In AW average scores, no statistical difference was found in the post-test (t = 0.700, p = .486), whereas in the retest a significant difference was identified (t = 5.839, p = .000). Play therapy helped pre-school children to improve their social, emotional, and behavioral skills. It also helped the children to decrease their fear and anxiety levels and to improve

  11. A process evaluation of the Supermarket Healthy Eating for Life (SHELf) randomized controlled trial.

    Science.gov (United States)

    Olstad, Dana Lee; Ball, Kylie; Abbott, Gavin; McNaughton, Sarah A; Le, Ha N D; Ni Mhurchu, Cliona; Pollard, Christina; Crawford, David A

    2016-02-24

Supermarket Healthy Eating for Life (SHELf) was a randomized controlled trial that operationalized a socioecological approach to population-level dietary behaviour change in a real-world supermarket setting. SHELf tested the impact of individual (skill-building), environmental (20% price reductions), and combined (skill-building + 20% price reductions) interventions on women's purchasing and consumption of fruits, vegetables, low-calorie carbonated beverages and water. This process evaluation investigated the reach, effectiveness, implementation, and maintenance of the SHELf interventions. RE-AIM provided a conceptual framework to examine the processes underlying the impact of the interventions using data from participant surveys and objective sales data collected at baseline, post-intervention (3 months) and 6 months post-intervention. Fisher's exact, χ², and t-tests assessed differences in quantitative survey responses among groups. Adjusted linear regression examined the impact of self-reported intervention dose on food purchasing and consumption outcomes. Thematic analysis identified key themes within qualitative survey responses. Reach of the SHELf interventions to disadvantaged groups, and beyond study participants themselves, was moderate. Just over one-third of intervention participants indicated that the interventions were effective in changing the way they bought, cooked or consumed food (p < 0.001 compared to control), with no differences among intervention groups. Improvements in purchasing and consumption outcomes were greatest among those who received a higher intervention dose. Most notably, participants who said they accessed price reductions on fruits and vegetables purchased (519 g/week) and consumed (0.5 servings/day) more vegetables. The majority of participants said they accessed (82%) and appreciated discounts on fruits and vegetables, while there was limited use (40%) and appreciation of discounts on low-calorie carbonated

  12. An innovative scintillation process for correcting, cooling, and reducing the randomness of waveforms

    International Nuclear Information System (INIS)

    Shen, J.

    1991-01-01

    Research activities were concentrated on an innovative scintillation technique for high-energy collider detection. Heretofore, scintillation waveform data of high-energy physics events have been problematically random, representing a bottleneck in data flow for the next generation of detectors for proton colliders such as the SSC or LHC. The prevailing problems to resolve were: additional time walk and jitter resulting from the random hitting positions of particles, increased walk and jitter caused by scintillation photon propagation dispersions, and quantum fluctuations of luminescence. These became manageable once the different aspects of randomness had been clarified in increased detail; for this purpose, the three were defined as pseudorandomness, quasi-randomness, and real randomness, respectively. A unique scintillation counter incorporating long scintillators with light guides, a drift chamber, and fast discriminators plus integrators was employed to correct time walk and reduce the additional jitter by establishing an analytical waveform description V(t,z) for a measured z. The second problem was addressed by reducing jitter through compression of V(t,z) with a nonlinear medium, called cooling scintillation. For the third, orienting molecular and polarizing scintillation through the use of intense magnetic technology was proposed, called stabilizing the waveform

  13. Random practice - one of the factors of the motor learning process

    Directory of Open Access Journals (Sweden)

    Petr Valach

    2012-01-01

    Full Text Available BACKGROUND: An important concept in acquiring motor skills is random practice (contextual interference, CI). The explanation of the contextual interference effect is that the memory has to work more intensively, and therefore random practice yields higher retention of motor skills than blocked practice. Only active remembering of a motor skill gives it practical value for appropriate use in the future. OBJECTIVE: The aim of this research was to determine the difference in how motor skills in sport gymnastics are acquired and retained using two different teaching methods, blocked and random practice. METHODS: Blocked and random practice on three selected gymnastics tasks were applied in two groups of physical education students (blocked practice: group BP; random practice: group RP) for two months, in one session a week (80 trials in total). At the end of the experiment and 6 months later (retention tests), the groups were tested on the selected gymnastics skills. RESULTS: No significant differences in the level of gymnastics skills were found between the BP and RP groups at the end of the experiment. However, the retention tests showed a significantly higher level of gymnastics skills in the RP group than in the BP group. CONCLUSION: The results confirmed that retention of gymnastics skills was significantly higher with the random practice teaching method than with blocked practice.

  14. Processing speed and working memory training in multiple sclerosis: a double-blind randomized controlled pilot study.

    Science.gov (United States)

    Hancock, Laura M; Bruce, Jared M; Bruce, Amanda S; Lynch, Sharon G

    2015-01-01

    Between 40% and 65% of multiple sclerosis patients experience cognitive deficits, with processing speed and working memory most commonly affected. This pilot study investigated the effect of computerized cognitive training focused on improving processing speed and working memory. Participants were randomized into either an active or a sham training group and engaged in six weeks of training. The active training group improved on a measure of processing speed and attention following cognitive training, and data trended toward significance on measures of other domains. Results provide preliminary evidence that cognitive training with multiple sclerosis patients may produce moderate improvement in select areas of cognitive functioning.

  15. Process convergence of self-normalized sums of i.i.d. random ...

    Indian Academy of Sciences (India)

    The study of the asymptotics of self-normalized sums is also interesting. Logan ... if the constituent random variables are from the domain of attraction of a normal distribution ... index of stability α which equals 2 (for definition, see §2).

  16. Analysis, Simulation and Prediction of Multivariate Random Fields with Package RandomFields

    Directory of Open Access Journals (Sweden)

    Martin Schlather

    2015-02-01

    Full Text Available Modeling of and inference on multivariate data that have been measured in space, such as temperature and pressure, are challenging tasks in environmental sciences, physics and materials science. We give an overview of, and some background on, modeling with cross-covariance models. The R package RandomFields supports the simulation, the parameter estimation and the prediction in particular for the linear model of coregionalization, the multivariate Matérn models, the delay model, and a spectrum of physically motivated vector-valued models. An example on weather data is considered, illustrating the use of RandomFields for parameter estimation and prediction.
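The linear model of coregionalization mentioned above can also be sketched outside R. The following Python sketch (not using the RandomFields package) builds a bivariate cross-covariance from hypothetical coregionalization vectors and exponential kernels, then simulates one realization by Cholesky factorization; the vectors A1, A2 and the scales are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lmc_cross_covariance(x, coreg_vectors, scales):
    """Joint covariance of a bivariate random field at 1-D locations x under
    the linear model of coregionalization: C(h) = sum_k rho_k(h) * A_k A_k^T,
    with exponential correlations rho_k(h) = exp(-|h| / scale_k)."""
    h = np.abs(x[:, None] - x[None, :])            # pairwise distances
    C = np.zeros((2 * len(x), 2 * len(x)))
    for A, s in zip(coreg_vectors, scales):
        C += np.kron(A @ A.T, np.exp(-h / s))      # 2x2 block times n x n kernel
    return C

def simulate_bivariate_field(x, coreg_vectors, scales, rng):
    """Draw one realization of the two component fields via Cholesky."""
    C = lmc_cross_covariance(x, coreg_vectors, scales)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(C)))   # jitter for stability
    z = L @ rng.standard_normal(len(C))
    n = len(x)
    return z[:n], z[n:]

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 150)
A1 = np.array([[1.0], [0.6]])    # hypothetical coregionalization coefficients
A2 = np.array([[0.0], [0.8]])
field1, field2 = simulate_bivariate_field(x, [A1, A2], scales=[2.0, 0.5], rng=rng)
```

The same construction extends to more than two variables and to Matérn kernels; RandomFields additionally covers parameter estimation and kriging prediction, which this sketch omits.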

  17. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG)

  18. Choosing between Higher Moment Maximum Entropy Models and Its Application to Homogeneous Point Processes with Random Effects

    Directory of Open Access Journals (Sweden)

    Lotfi Khribi

    2017-12-01

    Full Text Available In the Bayesian framework, the usual choice of prior in the prediction of homogeneous Poisson processes with random effects is the gamma one. Here, we propose the use of higher order maximum entropy priors. Their advantage is illustrated in a simulation study and the choice of the best order is established by two goodness-of-fit criteria: Kullback–Leibler divergence and a discrepancy measure. This procedure is illustrated on a warranty data set from the automobile industry.
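For contrast with the higher order maximum entropy priors studied here, the usual gamma choice admits a closed-form update and a closed-form predictive distribution. A minimal sketch of the standard gamma-Poisson conjugate analysis, with hypothetical count data (the warranty-style numbers below are illustrative, not from the paper):

```python
import math

def gamma_poisson_update(alpha, beta, counts, exposure):
    """Conjugate update for a homogeneous Poisson rate lambda ~ Gamma(alpha, beta)
    after observing event counts over a total exposure time."""
    return alpha + sum(counts), beta + exposure

def predictive_pmf(k, alpha, beta, t=1.0):
    """Posterior predictive P(N = k) for a window of length t: a negative
    binomial, obtained by integrating the Poisson likelihood against the
    Gamma(alpha, beta) posterior."""
    p = beta / (beta + t)
    log_coef = math.lgamma(k + alpha) - math.lgamma(alpha) - math.lgamma(k + 1.0)
    return math.exp(log_coef) * (p ** alpha) * ((1.0 - p) ** k)

# hypothetical data: 3, 5, 4 events observed in three unit periods
alpha_post, beta_post = gamma_poisson_update(1.0, 1.0, [3, 5, 4], 3.0)
```

The predictive mean is alpha_post * t / beta_post; the maximum entropy priors of the paper generalize this setup by constraining higher moments.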

  19. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    Science.gov (United States)

    Godrèche, Claude

    2017-05-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows to revisit past and recent works of the physics literature.
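The discrete quantity studied above is easy to probe by Monte Carlo. A sketch that generates tied-down walks as random permutations of n up-steps and n down-steps and records the longest interval between successive zeros; the mean fraction it estimates is the object analysed exactly by Rosén and Wendel and in this work.

```python
import numpy as np

def longest_zero_gap(steps):
    """Longest interval between consecutive zeros of the partial-sum walk,
    for a tied-down walk that starts and ends at the origin."""
    s = np.concatenate(([0], np.cumsum(steps)))
    zeros = np.flatnonzero(s == 0)            # includes the two endpoints
    return int(np.max(np.diff(zeros)))

rng = np.random.default_rng(1)
n = 500                                        # 2n steps: n up and n down
fractions = []
for _ in range(2000):
    steps = rng.permutation(np.r_[np.ones(n, int), -np.ones(n, int)])
    fractions.append(longest_zero_gap(steps) / (2 * n))
mean_fraction = float(np.mean(fractions))      # a nontrivial fraction of the span
```

Conditioning on return to the origin is enforced exactly here by permuting a balanced step sequence, rather than by rejection sampling.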

  20. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    International Nuclear Information System (INIS)

    Godrèche, Claude

    2017-01-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows to revisit past and recent works of the physics literature. (paper)

  1. Random-walk simulation of diffusion-controlled processes among static traps

    International Nuclear Information System (INIS)

    Lee, S.B.; Kim, I.C.; Miller, C.A.; Torquato, S.; Department of Mechanical and Aerospace Engineering and Department of Chemical Engineering, North Carolina State University, Raleigh, North Carolina 27695-7910)

    1989-01-01

    We present computer-simulation results for the trapping rate (rate constant) k associated with diffusion-controlled reactions among identical, static spherical traps distributed with an arbitrary degree of impenetrability using a Pearson random-walk algorithm. We specifically consider the penetrable-concentric-shell model in which each trap of diameter σ is composed of a mutually impenetrable core of diameter λσ, encompassed by a perfectly penetrable shell of thickness (1-λ)σ/2: λ=0 corresponding to randomly centered or "fully penetrable" traps and λ=1 corresponding to totally impenetrable traps. Trapping rates are calculated accurately from the random-walk algorithm at the extreme limits of λ (λ=0 and 1) and at an intermediate value (λ=0.8), for a wide range of trap densities. Our simulation procedure has a relatively fast execution time. It is found that k increases with increasing impenetrability at fixed trap concentration. These "exact" data are compared with previous theories for the trapping rate. Although a good approximate theory exists for the fully-penetrable-trap case, there are no currently available theories that can provide good estimates of the trapping rate for a moderate to high density of traps with nonzero hard cores (λ>0)
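A stripped-down version of such a simulation is easy to write. The sketch below treats only the fully penetrable case (λ = 0) with hypothetical parameters (box size, trap count, radius, step length are illustrative): it estimates the mean survival time of a Pearson walker, i.e. fixed step length and isotropic random direction, among randomly centred spherical traps in a periodic box. The trapping rate is inversely related to this mean survival time.

```python
import numpy as np

def survival_steps(rng, traps, box, radius, step=0.2, max_steps=20000):
    """Steps until a Pearson random walker first overlaps one of the static
    spherical traps (fully penetrable case, lambda = 0), in a periodic box."""
    pos = rng.uniform(0.0, box, size=3)
    for k in range(max_steps):
        d = np.abs(traps - pos)
        d = np.minimum(d, box - d)                 # minimum-image distances
        if np.min(np.sum(d * d, axis=1)) < radius ** 2:
            return k                               # trapped
        v = rng.standard_normal(3)                 # isotropic random direction
        pos = (pos + step * v / np.linalg.norm(v)) % box
    return max_steps

rng = np.random.default_rng(2)
box, radius = 10.0, 1.0
traps = rng.uniform(0.0, box, size=(80, 3))        # randomly centred traps
times = [survival_steps(rng, traps, box, radius) for _ in range(40)]
mean_survival = float(np.mean(times))              # trapping rate ~ 1 / mean survival
```

Handling λ > 0 would additionally require sampling trap centres with non-overlapping hard cores, which is the harder case discussed in the abstract.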

  2. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated on simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra of high dynamic range of peak intensities preserving benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D ¹⁵N- and ¹³C-edited NOESY-HSQC spectra of human ubiquitin.
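The abstract does not reproduce the algorithm itself; a simplified CLEAN-style variant conveys the idea of iterative artifact suppression for on-grid sparse sampling. Assumptions in this sketch: complex on-grid data, a known sampling mask, a fixed relaxation gain, and simple argmax peak picking rather than the statistical peak recognition of the paper.

```python
import numpy as np

def clean_spectrum(sampled, mask, n_iter=300, gain=0.5):
    """Iteratively locate the strongest peak in the spectrum of the residual,
    transfer a fraction (gain) of it into a model spectrum, and subtract that
    tone's masked time-domain contribution from the residual."""
    n = len(sampled)
    t = np.arange(n)
    residual = sampled * mask
    model = np.zeros(n, dtype=complex)
    for _ in range(n_iter):
        spec = np.fft.fft(residual)
        k = int(np.argmax(np.abs(spec)))
        a = gain * spec[k] / n                    # estimated tone amplitude
        model[k] += a * n                         # store in np.fft.fft units
        residual -= a * np.exp(2j * np.pi * k * t / n) * mask
    return model + np.fft.fft(residual)

rng = np.random.default_rng(3)
n = 128
t = np.arange(n)
signal = np.exp(2j * np.pi * 10 * t / n) + 0.7 * np.exp(2j * np.pi * 30 * t / n)
mask = (rng.random(n) < 0.6).astype(float)        # keep roughly 60% of the points
spectrum = clean_spectrum(signal, mask)
```

Because the masked contribution of each picked tone is subtracted in the time domain, the sampling artifacts (spectral leakage of the mask) shrink along with the residual.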

  3. Direct observation of asperity deformation of specimens with random rough surfaces in upsetting and indentation processes

    DEFF Research Database (Denmark)

    Azushima, A.; Kuba, S.; Tani, S.

    2006-01-01

    The trapping behavior of liquid lubricant and contact behavior of asperities at the workpiece-tool interface during upsetting and indentation are observed directly using a compression subpress which consists of a transparent die made of sapphire, a microscope with a CCD camera and a video system. ... The experiments are carried out without lubricant and with lubricant. Specimens used are commercially pure A1100 aluminum with a random rough surface. From these observations, the change in the fraction of real contact area is measured by an image processor. The real contact area ratios in upsetting experiments ...

  4. Direct Observation of Asperity Deformation of Specimen with Random Rough Surface in Upsetting Process

    DEFF Research Database (Denmark)

    Azushima, A.; Kuba, S.; Tani, S.

    2004-01-01

    The trapping behavior of liquid lubricant and contact behavior of asperities at the workpiece-tool interface during upsetting and indentation are observed directly using a compression subpress which consists of a transparent die made of sapphire, a microscope with a CCD camera and a video system. ... The experiments are carried out without lubricant and with lubricant. Specimens used are commercially pure A1100 aluminum with a random rough surface. From this observation, the change in the fraction of real contact area is measured by an image processor. The real contact area ratios in upsetting experiment ...

  5. The impact of randomness on the distribution of wealth: Some economic aspects of the Wright-Fisher diffusion process

    Science.gov (United States)

    Bouleau, Nicolas; Chorro, Christophe

    2017-08-01

    In this paper we consider some elementary and fair zero-sum games of chance in order to study the impact of random effects on the wealth distribution of N interacting players. Even if an exhaustive analytical study of such games between many players may be tricky, numerical experiments highlight interesting asymptotic properties. In particular, we emphasize that randomness plays a key role in concentrating wealth in the extreme, in the hands of a single player. From a mathematical perspective, we interestingly adopt some diffusion limits for small and high-frequency transactions which are otherwise extensively used in population genetics. Finally, the impact of small tax rates on the preceding dynamics is discussed for several regulation mechanisms. We show that taxation of income is not sufficient to overcome this extreme concentration process in contrast to the uniform taxation of capital which stabilizes the economy and prevents agents from being ruined.
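The concentration mechanism can be reproduced with a few lines of simulation. Below is a sketch of one such fair zero-sum game, a yard-sale-type rule that is illustrative and not necessarily the authors' exact game: each round two randomly chosen players stake a fraction of the poorer one's wealth on a fair coin. Total wealth is conserved, yet inequality, measured here by the Gini coefficient, grows toward full concentration.

```python
import numpy as np

def play(n_players=50, n_rounds=100000, stake=0.1, seed=4):
    """Fair zero-sum exchange game: each round, two random players bet a
    fixed fraction of the poorer player's wealth on a fair coin flip."""
    rng = np.random.default_rng(seed)
    w = np.ones(n_players)
    for _ in range(n_rounds):
        i, j = rng.choice(n_players, size=2, replace=False)
        bet = stake * min(w[i], w[j])
        if rng.random() < 0.5:
            w[i] += bet; w[j] -= bet
        else:
            w[i] -= bet; w[j] += bet
    return w

def gini(w):
    """Gini coefficient: 0 for perfect equality, approaching 1 for full concentration."""
    w = np.sort(np.asarray(w, dtype=float))
    n = len(w)
    return 2.0 * np.sum(np.arange(1, n + 1) * w) / (n * np.sum(w)) - (n + 1) / n

wealth = play()
```

Each player's wealth follows a multiplicative random walk, so log-wealth disperses without bound; this is the extreme concentration that, per the abstract, income taxation fails to prevent while capital taxation stabilizes.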

  6. Accumulated damage evaluation for a piping system by the response factor on non-stationary random process, 2

    International Nuclear Information System (INIS)

    Shintani, Masanori

    1988-01-01

    This paper shows that the average and variance of the accumulated damage caused by earthquakes on the piping system attached to a building are related to the seismic response factor λ. The earthquakes referred to in this paper are modelled as a non-stationary random process. The average is proportional to λ² and the variance to λ⁴. The analytical values of the average and variance for a single-degree-of-freedom system are compared with those obtained from computer simulations. Here the model of the building is a single-degree-of-freedom system. Both averages of accumulated damage are approximately equal. The variance obtained from the analysis does not coincide with that from the simulations. The reason is considered to be the forced vibration by the sinusoidal waves included among the random waves. Taking account of the amplitude magnification factor, the values of the variance approach those obtained from the simulations. (author)

  7. Scaling in Rate-Changeable Birth and Death Processes with Random Removals

    International Nuclear Information System (INIS)

    Ke Jianhong; Lin Zhenquan; Chen Xiaoshuang

    2009-01-01

    We propose a monomer birth-death model with random removals, in which an aggregate of size k can produce a new monomer at a time-dependent rate I(t)k or lose one monomer at a rate J(t)k, and with a probability P(t) an aggregate of any size is randomly removed. We then analytically investigate the kinetic evolution of the model by means of the rate equation. The results show that the scaling behavior of the aggregate size distribution depends crucially on the net birth rate I(t) - J(t) as well as the birth rate I(t). The aggregate size distribution can approach a standard or modified scaling form in some cases, but it may take a scale-free form in other cases. Moreover, the species can survive finally only if either I(t) - J(t) ≥ P(t) or [J(t) + P(t) - I(t)]t ≅ 0 at t >> 1; otherwise, it will become extinct.
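The stochastic dynamics described above can be simulated directly with the Gillespie algorithm. In the sketch below, constant rates stand in for the paper's time-dependent I(t), J(t), P(t) (an assumption made for simplicity), illustrating the survival criterion I - J ≥ P.

```python
import numpy as np

def simulate_aggregate(rng, I=1.0, J=0.5, P=0.2, k0=10, t_max=6.0):
    """Gillespie simulation of one aggregate: size k -> k+1 at rate I*k,
    k -> k-1 at rate J*k, and removal of the whole aggregate at rate P."""
    k, t = k0, 0.0
    while k > 0:
        total = (I + J) * k + P
        t += rng.exponential(1.0 / total)      # waiting time to next event
        if t >= t_max:
            break
        u = rng.random() * total
        if u < I * k:
            k += 1                             # monomer birth
        elif u < (I + J) * k:
            k -= 1                             # monomer death
        else:
            return 0                           # random removal of the aggregate
    return k

rng = np.random.default_rng(5)
final_sizes = [simulate_aggregate(rng) for _ in range(300)]
survivors = [k for k in final_sizes if k > 0]
# here the net birth rate I - J = 0.5 exceeds the removal rate P = 0.2,
# so a fraction of aggregates survives and grows on average
```

Setting I - J below P in the same sketch drives the surviving fraction and the mean size toward zero, matching the extinction condition quoted in the abstract.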

  8. Order acceptance in food processing systems with random raw material requirements

    NARCIS (Netherlands)

    Kilic, Onur A.; van Donk, Dirk Pieter; Wijngaard, Jacob; Tarim, S. Armagan

    This study considers a food production system that processes a single perishable raw material into several products having stochastic demands. In order to process an order, the amount of raw material delivery from storage needs to meet the raw material requirement of the order. However, the amount

  9. [Working memory and executive control: inhibitory processes in updating and random generation tasks].

    Science.gov (United States)

    Macizo, Pedro; Bajo, Teresa; Soriano, Maria Felipa

    2006-02-01

    Working Memory (WM) span predicts subjects' performance in executive control tasks and, in addition, it has been related to the capacity to inhibit irrelevant information. In this paper we investigate the role of WM span in two executive tasks, focusing our attention on the inhibitory components of both tasks. High- and low-span participants recalled target words while rejecting irrelevant items (Experiment 1) and generated random numbers (Experiment 2). Results showed a clear relation between WM span and performance in both tasks. In addition, analyses of intrusion errors (Experiment 1) and stereotyped responses (Experiment 2) indicated that high-span individuals were able to efficiently use the inhibitory component implied in both tasks. The pattern of data provides support to the relation between WM span and executive control tasks through an inhibitory mechanism.

  10. β-decay rates of r-process nuclei in the relativistic quasiparticle random phase approximation

    International Nuclear Information System (INIS)

    Niksic, T.; Marketin, T.; Vretenar, D.; Paar, N.; Ring, P.

    2004-01-01

    The fully consistent relativistic proton-neutron quasiparticle random phase approximation (PN-RQRPA) is employed in the calculation of β-decay half-lives of neutron-rich nuclei in the N∼50 and N∼82 regions. A new density-dependent effective interaction, with an enhanced value of the nucleon effective mass, is used in relativistic Hartree-Bogolyubov calculation of nuclear ground states and in the particle-hole channel of the PN-RQRPA. The finite range Gogny D1S interaction is employed in the T=1 pairing channel, and the model also includes a proton-neutron particle-particle interaction. The theoretical half-lives reproduce the experimental data for the Fe, Zn, Cd, and Te isotopic chains, but overestimate the lifetimes of Ni isotopes and predict a stable ¹³²Sn. (orig.)

  11. β-decay rates of r-process nuclei in the relativistic quasiparticle random phase approximation

    International Nuclear Information System (INIS)

    Niksic, T.; Marketin, T.; Vretenar, D.; Paar, N.; Ring, P.

    2005-01-01

    The fully consistent relativistic proton-neutron quasiparticle random phase approximation (PN-RQRPA) is employed in the calculation of β-decay half-lives of neutron-rich nuclei in the N≅50 and N≅82 regions. A new density-dependent effective interaction, with an enhanced value of the nucleon effective mass, is used in relativistic Hartree-Bogoliubov calculation of nuclear ground states and in the particle-hole channel of the PN-RQRPA. The finite range Gogny D1S interaction is employed in the T=1 pairing channel, and the model also includes a proton-neutron particle-particle interaction. The theoretical half-lives reproduce the experimental data for the Fe, Zn, Cd, and Te isotopic chains but overestimate the lifetimes of Ni isotopes and predict a stable ¹³²Sn

  12. β-decay rates of r-process nuclei in the relativistic quasiparticle random phase approximation

    Energy Technology Data Exchange (ETDEWEB)

    Niksic, T.; Marketin, T.; Vretenar, D. [Zagreb Univ. (Croatia). Faculty of Science, Physics Dept.; Paar, N. [Technische Univ. Darmstadt (Germany). Inst. fuer Kernphysik; Ring, P. [Technische Univ. Muenchen, Garching (Germany). Physik-Department

    2004-12-08

    The fully consistent relativistic proton-neutron quasiparticle random phase approximation (PN-RQRPA) is employed in the calculation of β-decay half-lives of neutron-rich nuclei in the N≈50 and N≈82 regions. A new density-dependent effective interaction, with an enhanced value of the nucleon effective mass, is used in relativistic Hartree-Bogolyubov calculation of nuclear ground states and in the particle-hole channel of the PN-RQRPA. The finite range Gogny D1S interaction is employed in the T=1 pairing channel, and the model also includes a proton-neutron particle-particle interaction. The theoretical half-lives reproduce the experimental data for the Fe, Zn, Cd, and Te isotopic chains, but overestimate the lifetimes of Ni isotopes and predict a stable ¹³²Sn. (orig.)

  13. Solution-processed flexible NiO resistive random access memory device

    Science.gov (United States)

    Kim, Soo-Jung; Lee, Heon; Hong, Sung-Hoon

    2018-04-01

    Non-volatile memories (NVMs) using nanocrystals (NCs) as active materials can be applied to soft electronic devices requiring a low-temperature process because NCs do not require a heat treatment process for crystallization. In addition, memory devices can be implemented simply by using a patterning technique using a solution process. In this study, a flexible NiO ReRAM device was fabricated using a simple NC patterning method that controls the capillary force and dewetting of a NiO NC solution at low temperature. The switching behavior of a NiO NC based memory was clearly observed by conductive atomic force microscopy (c-AFM).

  14. Simultaneous Range-Velocity Processing and SNR Analysis of AFIT’s Random Noise Radar

    Science.gov (United States)

    2012-03-22

    reducing the overall processing time. Two computers, equipped with NVIDIA® GPUs, were used to process the collected data. The specifications for each ... gather the results back to the CPU. Another company, AccelerEyes®, has developed a product called Jacket® that claims to be better than the parallel ...
    Specifications of the two processing computers:
    Number of Processing Cores: 4 / 8
    Processor Speed: 3.33 GHz / 3.07 GHz
    Installed Memory: 48 GB / 48 GB
    GPU Make: NVIDIA / NVIDIA
    GPU Model: Tesla 1060 / Tesla C2070

  15. Hierarchical random additive process and logarithmic scaling of generalized high order, two-point correlations in turbulent boundary layer flow

    Science.gov (United States)

    Yang, X. I. A.; Marusic, I.; Meneveau, C.

    2016-06-01

    Townsend [Townsend, The Structure of Turbulent Shear Flow (Cambridge University Press, Cambridge, UK, 1976)] hypothesized that the logarithmic region in high-Reynolds-number wall-bounded flows consists of space-filling, self-similar attached eddies. Invoking this hypothesis, we express streamwise velocity fluctuations in the inertial layer in high-Reynolds-number wall-bounded flows as a hierarchical random additive process (HRAP): u_z^+ = Σ_{i=1}^{N_z} a_i. Here u is the streamwise velocity fluctuation, + indicates normalization in wall units, z is the wall-normal distance, and the a_i are independently, identically distributed random additives, each of which is associated with an attached eddy in the wall-attached hierarchy. The number of random additives is N_z ~ ln(δ/z), where δ is the boundary layer thickness and ln is the natural logarithm. Due to its simplified structure, such a process leads to predictions of the scaling behaviors for various turbulence statistics in the logarithmic layer. Besides reproducing known logarithmic scaling of moments, structure functions, and the two-point correlation function ⟨u_z(x)u_z(x+r)⟩, new logarithmic laws in two-point statistics such as ⟨u_z²(x)u_z²(x+r)⟩^{1/2}, ⟨u_z³(x)u_z³(x+r)⟩^{1/3}, etc. can be derived using the HRAP formalism. Supporting empirical evidence for the logarithmic scaling in such statistics is found from the Melbourne High Reynolds Number Boundary Layer Wind Tunnel measurements. We also show that, at high Reynolds numbers, the above mentioned new logarithmic laws can be derived by assuming the arrival of an attached eddy at a generic point in the flow field to be a Poisson process [Woodcock and Marusic, Phys. Fluids 27, 015104 (2015), 10.1063/1.4905301]. Taken together, the results provide new evidence supporting the essential ingredients of the attached eddy hypothesis to describe streamwise velocity fluctuations of large, momentum transporting eddies in wall-bounded turbulence, while observed deviations suggest the need for further extensions of the
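The logarithmic growth of the variance follows from the HRAP ansatz with a few lines of code. The sketch below assumes Gaussian unit-variance additives purely for convenience (the formalism itself does not require Gaussianity) and checks that ⟨u²⟩ grows like ln(δ/z).

```python
import numpy as np

def hrap_sample(z_over_delta, n_samples, rng):
    """Hierarchical random additive process: u(z) is the sum of
    N_z ~ ln(delta/z) i.i.d. additives, one per attached-eddy hierarchy level.
    Gaussian unit-variance additives are an assumption of this sketch."""
    n_z = max(1, int(round(np.log(1.0 / z_over_delta))))
    return rng.standard_normal((n_samples, n_z)).sum(axis=1)

rng = np.random.default_rng(6)
# variance of u at several wall distances z/delta: grows like ln(delta/z),
# i.e. the logarithmic law of the streamwise variance
variances = {z: float(hrap_sample(z, 200000, rng).var()) for z in (0.3, 0.1, 0.03, 0.01)}
```

Because the additives are independent, every even moment inherits an additive structure in N_z, which is the mechanism behind the generalized logarithmic laws quoted above.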

  16. A teachable moment communication process for smoking cessation talk: description of a group randomized clinician-focused intervention

    Directory of Open Access Journals (Sweden)

    Flocke Susan A

    2012-05-01

    Full Text Available Abstract Background Effective clinician-patient communication about health behavior change is one of the most important and most overlooked strategies to promote health and prevent disease. Existing guidelines for specific health behavior counseling have been created and promulgated, but not successfully adopted in primary care practice. Building on work focused on creating effective clinician strategies for prompting health behavior change in the primary care setting, we developed an intervention intended to enhance clinician communication skills to create and act on teachable moments for smoking cessation. In this manuscript, we describe the development and implementation of the Teachable Moment Communication Process (TMCP) intervention and the baseline characteristics of a group randomized trial designed to evaluate its effectiveness. Methods/Design This group randomized trial includes thirty-one community-based primary care clinicians practicing in Northeast Ohio and 840 of their adult patients. Clinicians were randomly assigned to receive either the Teachable Moment Communication Process (TMCP) intervention for smoking cessation, or the delayed intervention. The TMCP intervention consisted of two 3-hour educational training sessions including didactic presentation, skill demonstration through video examples, skills practices with standardized patients, and feedback from peers and the trainers. For each clinician enrolled, 12 patients were recruited for two time points. Pre- and post-intervention data from the clinicians, patients and audio-recorded clinician-patient interactions were collected. At baseline, the two groups of clinicians and their patients were similar with regard to all demographic and practice characteristics examined. Both physician and patient recruitment goals were met, and retention was 96% and 94% respectively. Discussion Findings support the feasibility of training clinicians to use the Teachable Moments

  17. Rapid Processing of Net-Shape Thermoplastic Planar-Random Composite Preforms

    Science.gov (United States)

    Jespersen, S. T.; Baudry, F.; Schmäh, D.; Wakeman, M. D.; Michaud, V.; Blanchard, P.; Norris, R. E.; Månson, J.-A. E.

    2009-02-01

    A novel thermoplastic composite preforming and moulding process is investigated to target cost issues in textile composite processing associated with trim waste, and the limited mechanical properties of current bulk flow-moulding composites. The thermoplastic programmable powdered preforming process (TP-P4) uses commingled glass and polypropylene yarns, which are cut to length before air assisted deposition onto a vacuum screen, enabling local preform areal weight tailoring. The as-placed fibres are heat-set for improved handling before an optional preconsolidation stage. The preforms are then preheated and press formed to obtain the final part. The process stages are examined to optimize part quality and throughput versus processing parameters. A viable processing route is proposed with typical cycle times below 40 s (for a plate 0.5 × 0.5 m², weighing 2 kg), enabling high production capacity from one line. The mechanical performance is shown to surpass that of 40 wt.% GMT and has properties equivalent to those of 40 wt.% GMTex at both 20°C and 80°C.

  18. Process and effects of a community intervention on malaria in rural Burkina Faso: randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Gustafsson Lars

    2008-03-01

    Full Text Available Abstract Background In the rural areas of sub-Saharan Africa, the majority of young children affected by malaria have no access to formal health services. Home treatment through mothers of febrile children supported by mother groups and local health workers has the potential to reduce malaria morbidity and mortality. Methods A cluster-randomized controlled effectiveness trial was implemented from 2002–2004 in a malaria endemic area of rural Burkina Faso. Six and seven villages were randomly assigned to the intervention and control arms respectively. Febrile children from intervention villages were treated with chloroquine (CQ) by their mothers, supported by local women group leaders. CQ was regularly supplied through a revolving fund from local health centres. The trial was evaluated through two cross-sectional surveys at baseline and after two years of intervention. The primary endpoint of the study was the proportion of moderate to severe anaemia in children aged 6–59 months. For assessment of the development of drug efficacy over time, an in vivo CQ efficacy study was nested into the trial. The study is registered under http://www.controlled-trials.com (ISRCTN 34104704). Results The intervention was shown to be feasible under program conditions and a total of 1,076 children and 999 children were evaluated at baseline and follow-up time points respectively. Self-reported CQ treatment of fever episodes at home as well as referrals to health centres increased over the study period. At follow-up, CQ was detected in the blood of high proportions of intervention and control children. Compared to baseline findings, the prevalence of anaemia (29% vs 16%) as well as of P. falciparum parasitaemia, fever and palpable spleens was lower at follow-up, but there were no differences between the intervention and control group. CQ efficacy decreased over the study period but this was not associated with the intervention. Discussion The decreasing prevalence of malaria

  19. Likelihood updating of random process load and resistance parameters by monitoring

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2003-01-01

    Spectral parameters for a stationary Gaussian process are most often estimated by Fourier transformation of a realization followed by some smoothing procedure. This smoothing is often a weighted least square fitting of some prespecified parametric form of the spectrum. In this paper it is shown that maximum likelihood estimation is a rational alternative to an arbitrary weighting for least square fitting. The derived likelihood function gets singularities if the spectrum is prescribed with zero values at some frequencies, which is often the case for models of technically relevant processes. The likelihood function, even though it is of complicated mathematical form, allows an approximate Bayesian updating and control of the time development of the parameters. Some of these parameters can be structural parameters that by too much change reveal progressing damage or other malfunctioning. Thus current process …

  20. Impact of Cocoa Consumption on Inflammation Processes-A Critical Review of Randomized Controlled Trials.

    Science.gov (United States)

    Ellinger, Sabine; Stehle, Peter

    2016-05-26

    Cocoa flavanols have strong anti-inflammatory properties in vitro. If these also occur in vivo, cocoa consumption may contribute to the prevention or treatment of diseases mediated by chronic inflammation. This critical review judged the evidence for such effects occurring after cocoa consumption. A literature search in Medline was performed for randomized controlled trials (RCTs) that investigated the effects of cocoa consumption on inflammatory biomarkers. Thirty-three RCTs were included: 9 bolus and 24 regular consumption studies. Acute cocoa consumption decreased adhesion molecules and 4-series leukotrienes in serum, nuclear factor κB activation in leukocytes, and the expression of CD62P and CD11b on monocytes and neutrophils. In healthy subjects and in patients with cardiovascular diseases, most regular consumption trials did not find any changes except for a decreased number of endothelial microparticles, but several cellular and humoral inflammation markers decreased in patients suffering from type 2 diabetes and impaired fasting glucose. Little evidence exists that consumption of cocoa-rich food may reduce inflammation, probably by lowering the activation of monocytes and neutrophils. The efficacy seems to depend on the extent of the basal inflammatory burden. Further well-designed RCTs with inflammation as the primary outcome are needed, focusing on specific markers of leukocyte activation and considering endothelial microparticles as a marker of vascular inflammation.

  1. Aerobic Exercise Training in Post-Polio Syndrome: Process Evaluation of a Randomized Controlled Trial

    NARCIS (Netherlands)

    Voorn, Eric L.; Koopman, Fieke S.; Brehm, Merel A.; Beelen, Anita; de Haan, Arnold; Gerrits, Karin H. L.; Nollet, Frans

    2016-01-01

    To explore reasons for the lack of efficacy of a high intensity aerobic exercise program in post-polio syndrome (PPS) on cardiorespiratory fitness by evaluating adherence to the training program and effects on muscle function. A process evaluation using data from an RCT. Forty-four severely fatigued

  2. A One Line Derivation of DCC: Application of a Vector Random Coefficient Moving Average Process

    NARCIS (Netherlands)

    C.M. Hafner (Christian); M.J. McAleer (Michael)

    2014-01-01

    One of the most widely-used multivariate conditional volatility models is the dynamic conditional correlation (or DCC) specification. However, the underlying stochastic process to derive DCC has not yet been established, which has made problematic the derivation of

  3. Does analgesia affect the diagnostic process in acute abdomen? a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Khashayar P.

    2008-03-01

    Background: About one-fourth of the patients admitted to the emergency department complain of acute abdominal pain. According to surgical records, most surgeons believe that pain relief for these patients may interfere with the clinical examination and the final diagnosis. As a result, analgesics are withheld in patients with acute abdominal pain until a definite diagnosis and a suitable management plan have been determined. The purpose of this study was to evaluate the effect of analgesics on the evaluation course and treatment in acute abdomen. Methods: Two hundred patients at a surgical emergency department with acute abdominal pain were enrolled in this prospective study and randomly divided into two groups at the time of admission. The case group consisted of 98 patients who received intravenous analgesia immediately after admission. The other 102 patients, in the control group, did not receive analgesia until a definite diagnosis was made. Diagnostic and therapeutic procedures were similar between the two groups. The primary and final diagnoses, the time interval between admission and definite diagnosis, and that between admission and surgery were gathered and analyzed. Results: The mean time to definitive diagnosis was 1.7 and 2.04 hours in the case and control groups, respectively. There was no statistically significant relationship between analgesic use and gender, age, time to definite diagnosis, or accuracy of the diagnosis. In fact, the time required to achieve a definite diagnosis and the time between admission and surgery were shorter in the group that had received analgesics. Conclusions: In spite of the fact that analgesics remove the very symptoms that bring patients to the emergency room, appropriate use of analgesics does not reduce diagnostic efficiency for patients with acute abdominal pain.

  4. Incorrect modeling of the failure process of minimally repaired systems under random conditions: The effect on the maintenance costs

    International Nuclear Information System (INIS)

    Pulcini, Gianpaolo

    2015-01-01

    This note investigates the effect of the incorrect modeling of the failure process of minimally repaired systems that operate under random environmental conditions on the costs of a periodic replacement maintenance. This note is motivated by a recently published paper in which a wrong formulation of the expected cost per unit time under a periodic replacement policy was obtained. This wrong formulation is due to the incorrect assumption that the intensity function of minimally repaired systems that operate under random conditions has the same functional form as the failure rate of the first failure time. This produced an incorrect optimization of the replacement maintenance. Thus, in this note the conceptual differences between the intensity function and the failure rate of the first failure time are first highlighted. Then, the correct expressions of the expected cost and of the optimal replacement period are provided. Finally, a real application is used to measure how severe the economic consequences of the incorrect modeling of the failure process can be.
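
    The distinction the note draws can be made concrete with the textbook periodic-replacement model under minimal repair. The sketch below is illustrative only (the power-law intensity and all cost values are assumptions, not taken from the paper): it minimizes the expected cost per unit time C(T) = (c_m·Λ(T) + c_p)/T, where Λ is the cumulative intensity, and checks the closed-form optimum against a grid search.

    ```python
    import numpy as np

    # Hedged sketch: periodic replacement with minimal repair, assuming a
    # power-law (Weibull-type) intensity so that E[N(T)] = (T/eta)**beta.
    def cost_rate(T, beta, eta, c_min, c_rep):
        """Expected cost per unit time: minimal repairs plus one planned
        replacement per period of length T."""
        return (c_min * (T / eta) ** beta + c_rep) / T

    def optimal_period(beta, eta, c_min, c_rep):
        """Closed-form minimiser of cost_rate, valid for beta > 1."""
        return eta * (c_rep / (c_min * (beta - 1))) ** (1.0 / beta)

    # illustrative parameter values
    beta, eta, c_min, c_rep = 2.5, 10.0, 1.0, 5.0
    T_star = optimal_period(beta, eta, c_min, c_rep)
    grid = np.linspace(0.1, 50, 20000)
    T_grid = grid[np.argmin(cost_rate(grid, beta, eta, c_min, c_rep))]
    ```

    The point of the closed form is that it follows from the correct cumulative intensity; plugging in the failure rate of the first failure time instead would shift the optimum, which is exactly the error the note analyses.
    
    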

  5. On the Coupling Time of the Heat-Bath Process for the Fortuin-Kasteleyn Random-Cluster Model

    Science.gov (United States)

    Collevecchio, Andrea; Elçi, Eren Metin; Garoni, Timothy M.; Weigel, Martin

    2018-01-01

    We consider the coupling from the past implementation of the random-cluster heat-bath process, and study its random running time, or coupling time. We focus on hypercubic lattices embedded on tori, in dimensions one to three, with cluster fugacity at least one. We make a number of conjectures regarding the asymptotic behaviour of the coupling time, motivated by rigorous results in one dimension and Monte Carlo simulations in dimensions two and three. Amongst our findings, we observe that, for generic parameter values, the distribution of the appropriately standardized coupling time converges to a Gumbel distribution, and that the standard deviation of the coupling time is asymptotic to an explicit universal constant multiple of the relaxation time. Perhaps surprisingly, we observe these results to hold both off criticality, where the coupling time closely mimics the coupon collector's problem, and also at the critical point, provided the cluster fugacity is below the value at which the transition becomes discontinuous. Finally, we consider analogous questions for the single-spin Ising heat-bath process.
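
    The connection between the off-critical coupling time and the coupon collector's problem can be illustrated with a short simulation: for the coupon collector's problem, the standardized collection time (T_n − n ln n)/n converges to a Gumbel law, whose mean is the Euler–Mascheroni constant. A minimal sketch (the number of coupons and sample size are illustrative choices, not parameters from the paper):

    ```python
    import numpy as np

    # Coupon collector's problem: T_n is a sum of independent geometric
    # waiting times with success probabilities (n-k)/n, k = 0, ..., n-1.
    # The standardized time (T_n - n*ln n)/n converges to a standard Gumbel
    # distribution, with mean euler_gamma (~0.5772) and std pi/sqrt(6).
    rng = np.random.default_rng(42)
    n, reps = 200, 2000
    p = (n - np.arange(n)) / n
    T = rng.geometric(p, size=(reps, n)).sum(axis=1)
    standardized = (T - n * np.log(n)) / n
    gumbel_mean = np.euler_gamma  # mean of the standard Gumbel law
    ```

    This mirrors, in miniature, the Gumbel limit the authors observe for the standardized coupling time away from criticality.
    
    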

  6. Process Convergence of Self-Normalized Sums of i.i.d. Random ...

    Indian Academy of Sciences (India)

    ... either of tightness or finite dimensional convergence to a non-degenerate limiting distribution does not hold. This work is an extension of the work by Csörgő et al., who showed that Donsker's theorem for Y_{n,2}(·), i.e., for p = 2, holds iff α = 2, and identified the limiting process as a standard Brownian motion in sup norm.

  7. On the regularity of the extinction probability of a branching process in varying and random environments

    International Nuclear Information System (INIS)

    Alili, Smail; Rugh, Hans Henrik

    2008-01-01

    We consider a supercritical branching process in a time-dependent environment ξ. We assume that the offspring distributions depend regularly (C^k or real-analytically) on real parameters λ. We show that the extinction probability q_λ(ξ), given the environment ξ, 'inherits' this regularity whenever the offspring distributions satisfy a condition of contraction type. Our proof makes use of the Poincaré metric on the complex unit disc and a real-analytic implicit function theorem.
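
    For the constant-environment (Galton-Watson) special case, the extinction probability is the smallest fixed point of the offspring probability generating function on [0, 1] and can be computed by straightforward iteration. A minimal sketch, far simpler than the paper's varying/random environment setting (the Poisson offspring law and its mean are illustrative assumptions):

    ```python
    import math

    # Galton-Watson extinction probability: iterate q <- f(q) from q = 0,
    # where f is the offspring probability generating function. The iteration
    # converges monotonically to the smallest fixed point of f on [0, 1].
    def extinction_probability(pgf, tol=1e-12, max_iter=10_000):
        q = 0.0
        for _ in range(max_iter):
            q_next = pgf(q)
            if abs(q_next - q) < tol:
                return q_next
            q = q_next
        return q

    lam = 2.0  # mean offspring > 1, so the process is supercritical and q < 1
    poisson_pgf = lambda s: math.exp(lam * (s - 1.0))
    q = extinction_probability(poisson_pgf)
    ```

    For Poisson(2) offspring the fixed point is roughly 0.203, consistent with supercriticality: extinction is possible but not certain.
    
    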

  8. Statistical properties of a filtered Poisson process with additive random noise: distributions, correlations and moment estimation

    International Nuclear Information System (INIS)

    Theodorsen, A; Garcia, O E; Rypdal, M

    2017-01-01

    Filtered Poisson processes are often used as reference models for intermittent fluctuations in physical systems. Such a process is here extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The lowest order moments, probability density function, auto-correlation function and power spectral density are derived and used to identify and compare the effects of the two different noise terms. Monte-Carlo studies of synthetic time series are used to investigate the accuracy of model parameter estimation and to identify methods for distinguishing the noise types. It is shown that the probability density function and the three lowest order moments provide accurate estimations of the model parameters, but are unable to separate the noise types. The auto-correlation function and the power spectral density also provide methods for estimating the model parameters, as well as being capable of identifying the noise type. The number of times the signal crosses a prescribed threshold level in the positive direction also promises to be able to differentiate the noise type. (paper)
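
    A minimal simulation sketch of such a process, assuming one-sided exponential pulse shapes, exponentially distributed amplitudes and a purely additive Gaussian noise term (all parameter values are illustrative, not the paper's): for this model the stationary mean of the noise-free process is ν·τ·⟨A⟩.

    ```python
    import numpy as np

    # Filtered Poisson (shot-noise) process with additive Gaussian noise:
    # pulses arrive at rate nu, amplitudes are exponential with mean <A>,
    # and each pulse decays exponentially with duration time tau.
    rng = np.random.default_rng(0)
    nu, tau, mean_amp, noise_sd = 0.5, 1.0, 1.0, 0.1
    dt, n_steps = 0.01, 200_000

    # total pulse amplitude arriving in each time step
    arrivals = rng.poisson(nu * dt, n_steps)
    amps = rng.exponential(mean_amp, arrivals.sum())
    csum = np.concatenate(([0.0], np.cumsum(amps)))
    ends = np.cumsum(arrivals)
    jumps = csum[ends] - csum[ends - arrivals]

    # exponential pulse shape realised as a one-pole recursive filter
    x = np.empty(n_steps)
    x[0], decay = 0.0, np.exp(-dt / tau)
    for i in range(1, n_steps):
        x[i] = x[i - 1] * decay + jumps[i]

    signal = x + rng.normal(0.0, noise_sd, n_steps)  # purely additive noise
    burn = int(10 * tau / dt)                        # discard the transient
    est_mean = signal[burn:].mean()                  # ~ nu * tau * <A> = 0.5
    ```

    The same synthetic series could then be fed to the moment, auto-correlation and spectral estimators the paper compares.
    
    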

  9. Brain training game boosts executive functions, working memory and processing speed in the young adults: a randomized controlled trial.

    Science.gov (United States)

    Nouchi, Rui; Taki, Yasuyuki; Takeuchi, Hikaru; Hashizume, Hiroshi; Nozawa, Takayuki; Kambara, Toshimune; Sekiguchi, Atsushi; Miyauchi, Carlos Makoto; Kotozaki, Yuka; Nouchi, Haruka; Kawashima, Ryuta

    2013-01-01

    Do brain training games work? The beneficial effects of brain training games are expected to transfer to other cognitive functions. Yet in all honesty, beneficial transfer effects of the commercial brain training games in young adults have little scientific basis. Here we investigated the impact of the brain training game (Brain Age) on a wide range of cognitive functions in young adults. We conducted a double-blind (de facto masking) randomized controlled trial using a popular brain training game (Brain Age) and a popular puzzle game (Tetris). Thirty-two volunteers were recruited through an advertisement in the local newspaper and randomly assigned to either of two game groups (Brain Age, Tetris). Participants in both the Brain Age and the Tetris groups played their game for about 15 minutes per day, at least 5 days per week, for 4 weeks. Measures of the cognitive functions were conducted before and after training. Measures of the cognitive functions fell into eight categories (fluid intelligence, executive function, working memory, short-term memory, attention, processing speed, visual ability, and reading ability). Our results showed that the commercial brain training game improves executive functions, working memory, and processing speed in young adults. Moreover, the popular puzzle game can engender improvement in attention and visuo-spatial ability compared to playing the brain training game. The present study provided scientific evidence that the brain training game has beneficial effects on cognitive functions (executive functions, working memory and processing speed) in healthy young adults. Our results do not indicate that everyone should play brain training games. However, the commercial brain training game might be a simple and convenient means to improve some cognitive functions. We believe that our findings are highly relevant to applications in educational and clinical fields. UMIN Clinical Trial Registry 000005618.

  10. Brain training game boosts executive functions, working memory and processing speed in the young adults: a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Rui Nouchi

    BACKGROUND: Do brain training games work? The beneficial effects of brain training games are expected to transfer to other cognitive functions. Yet in all honesty, beneficial transfer effects of the commercial brain training games in young adults have little scientific basis. Here we investigated the impact of the brain training game (Brain Age) on a wide range of cognitive functions in young adults. METHODS: We conducted a double-blind (de facto masking) randomized controlled trial using a popular brain training game (Brain Age) and a popular puzzle game (Tetris). Thirty-two volunteers were recruited through an advertisement in the local newspaper and randomly assigned to either of two game groups (Brain Age, Tetris). Participants in both the Brain Age and the Tetris groups played their game for about 15 minutes per day, at least 5 days per week, for 4 weeks. Measures of the cognitive functions were conducted before and after training. Measures of the cognitive functions fell into eight categories (fluid intelligence, executive function, working memory, short-term memory, attention, processing speed, visual ability, and reading ability). RESULTS AND DISCUSSION: Our results showed that the commercial brain training game improves executive functions, working memory, and processing speed in young adults. Moreover, the popular puzzle game can engender improvement in attention and visuo-spatial ability compared to playing the brain training game. The present study provided scientific evidence that the brain training game has beneficial effects on cognitive functions (executive functions, working memory and processing speed) in healthy young adults. CONCLUSIONS: Our results do not indicate that everyone should play brain training games. However, the commercial brain training game might be a simple and convenient means to improve some cognitive functions. We believe that our findings are highly relevant to applications in educational and clinical fields.

  11. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)
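
    A small simulation sketch of the symmetric case (chain length, time horizon and sample size are illustrative choices): the particle's index performs a symmetric walk, its physical position is read off a chain that is itself a ±1 walk, and the ensemble mean-square displacement follows the square-root law E[X_t²] = E|J_t| ≈ √(2t/π).

    ```python
    import numpy as np

    # Random walk on a random walk, symmetric case. Each repetition uses an
    # independent quenched chain S (a +-1 walk); the particle's index then
    # performs a symmetric walk, and its physical position is S at that index.
    rng = np.random.default_rng(7)
    reps, t = 4000, 1600
    start = t  # centre index, so the index walk cannot leave the chain

    # one quenched chain per repetition: positions of a +-1 walk of 2t steps
    chain_steps = 2 * rng.integers(0, 2, size=(reps, 2 * t), dtype=np.int16) - 1
    S = np.concatenate([np.zeros((reps, 1), dtype=np.int16),
                        np.cumsum(chain_steps, axis=1, dtype=np.int16)], axis=1)

    # index displacement of the superimposed symmetric walk after t steps
    D = (2 * rng.integers(0, 2, size=(reps, t), dtype=np.int16) - 1).sum(axis=1)
    X = S[np.arange(reps), start + D] - S[np.arange(reps), start]
    msd = np.mean(X.astype(float) ** 2)   # ensemble mean-square displacement
    predicted = np.sqrt(2 * t / np.pi)    # E|J_t| ~ sqrt(2t/pi): square-root law
    ```

    Replacing the ±1 chain steps by a biased distribution would instead produce the diffusion-like (linear in t) growth described for the asymmetric walk.
    
    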

  12. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  13. Fuzzy control with random delays using invariant cones and its application to control of energy processes in microelectromechanical motion devices

    Energy Technology Data Exchange (ETDEWEB)

    Sinha, A.S.C. [Purdue Univ., Indianapolis, IN (United States). Dept. of Electrical Engineering; Lyshevski, S. [Rochester Inst. of Technology, NY (United States)

    2005-05-01

    In this paper, a class of microelectromechanical systems described by nonlinear differential equations with random delays is examined. Robust fuzzy controllers are designed to control the energy conversion processes with the ultimate objective to guarantee optimal achievable performance. The fuzzy rule base used consists of a collection of r fuzzy IF-THEN rules defined as a function of the conditional variable. The method of the theory of cones and Lyapunov functionals is used to design a class of local fuzzy control laws. A verifiably sufficient condition for stochastic stability of fuzzy stochastic microelectromechanical systems is given. As an example, we have considered the design of a fuzzy control law for an electrostatic micromotor. (author)

  14. Fuzzy control with random delays using invariant cones and its application to control of energy processes in microelectromechanical motion devices

    International Nuclear Information System (INIS)

    Sinha, A.S.C.; Lyshevski, S.

    2005-01-01

    In this paper, a class of microelectromechanical systems described by nonlinear differential equations with random delays is examined. Robust fuzzy controllers are designed to control the energy conversion processes with the ultimate objective to guarantee optimal achievable performance. The fuzzy rule base used consists of a collection of r fuzzy IF-THEN rules defined as a function of the conditional variable. The method of the theory of cones and Lyapunov functionals is used to design a class of local fuzzy control laws. A verifiably sufficient condition for stochastic stability of fuzzy stochastic microelectromechanical systems is given. As an example, we have considered the design of a fuzzy control law for an electrostatic micromotor

  15. Narrative exposure therapy for PTSD increases top-down processing of aversive stimuli - evidence from a randomized controlled treatment trial

    Directory of Open Access Journals (Sweden)

    Adenauer Hannah

    2011-12-01

    Background: Little is known about the neurobiological foundations of psychotherapy for Posttraumatic Stress Disorder (PTSD). Prior studies have shown that PTSD is associated with altered processing of threatening and aversive stimuli. It remains unclear whether this functional abnormality can be changed by psychotherapy. This is the first randomized controlled treatment trial that examines whether narrative exposure therapy (NET) causes changes in affective stimulus processing in patients with chronic PTSD. Methods: 34 refugees with PTSD were randomly assigned to a NET group or to a waitlist control (WLC) group. At pre-test and at four-months follow-up, the diagnostics included the assessment of clinical variables and measurements of neuromagnetic oscillatory brain activity (steady-state visual evoked fields, ssVEF) resulting from exposure to aversive pictures compared to neutral pictures. Results: PTSD as well as depressive symptom severity scores declined in the NET group, whereas symptoms persisted in the WLC group. Only in the NET group did parietal and occipital activity towards threatening pictures increase significantly after therapy. Conclusions: Our results indicate that NET causes an increase of activity associated with cortical top-down regulation of attention towards aversive pictures. The increased attention allocation to potential threat cues might allow treated patients to re-appraise the actual danger of the current situation and thereby reduce PTSD symptoms. Registration of the clinical trial: Number: NCT00563888; Name: "Change of Neural Network Indicators Through Narrative Treatment of PTSD in Torture Victims"; URL: http://www.clinicaltrials.gov/ct2/show/NCT00563888

  16. Effectiveness of manual therapy versus surgery in pain processing due to carpal tunnel syndrome: A randomized clinical trial.

    Science.gov (United States)

    Fernández-de-Las-Peñas, C; Cleland, J; Palacios-Ceña, M; Fuensalida-Novo, S; Alonso-Blanco, C; Pareja, J A; Alburquerque-Sendín, F

    2017-08-01

    People with carpal tunnel syndrome (CTS) exhibit widespread pressure pain and thermal pain hypersensitivity as a manifestation of central sensitization. The aim of our study was to compare the effectiveness of manual therapy versus surgery for improving pain and nociceptive gain processing in people with CTS. The trial was conducted at a local regional hospital in Madrid, Spain from August 2014 to February 2015. In this randomized parallel-group, blinded, clinical trial, 100 women with CTS were randomly allocated to either a manual therapy group (n = 50), who received three sessions (once/week) of manual therapies including desensitization manoeuvres of the central nervous system, or a surgical intervention group (n = 50). Outcomes, including pressure pain thresholds (PPT), thermal pain thresholds (HPT or CPT), and pain intensity, were assessed at baseline and at 3, 6, 9 and 12 months after the intervention by an assessor unaware of group assignment. Analysis was by intention to treat with mixed ANCOVAs adjusted for baseline scores. At 12 months, 95 women completed the follow-up. Patients receiving manual therapy exhibited higher increases in PPT over the carpal tunnel at 3, 6 and 9 months (all, p < 0.01) and a greater decrease in pain intensity at the 3-month follow-up (p < 0.001) than those receiving surgery. No significant differences were observed between groups for the remaining outcomes. Manual therapy and surgery have similar effects on decreasing widespread pressure pain sensitivity and pain intensity in women with CTS. Neither manual therapy nor surgery resulted in changes in thermal pain sensitivity. The current study found that manual therapy and surgery exhibited similar effects on decreasing widespread pressure pain sensitivity and pain intensity in women with carpal tunnel syndrome at medium- and long-term follow-ups, when investigating changes in nociceptive gain processing after treatment. © 2017 European Pain Federation - EFIC®.

  17. Markov counting and reward processes for analysing the performance of a complex system subject to random inspections

    International Nuclear Information System (INIS)

    Ruiz-Castro, Juan Eloy

    2016-01-01

    In this paper, a discrete complex reliability system subject to internal failures and external shocks is modelled algorithmically. Two types of internal failure are considered: repairable and non-repairable. When a repairable failure occurs, the unit goes to corrective repair. In addition, the unit is subject to external shocks that may produce an aggravation of the internal degradation level, cumulative damage or extreme failure. When a damage threshold is reached, the unit must be removed. When a non-repairable failure occurs, the device is replaced by a new, identical one. The internal performance and the external damage are partitioned into performance levels. Random inspections are carried out. When an inspection takes place, the internal performance of the system and the damage caused by external shocks are observed and, if necessary, the unit is sent to preventive maintenance. If the inspection observes a minor state for the internal performance and/or external damage, then these states remain in memory when the unit goes to corrective or preventive maintenance. Transient and stationary analyses are performed. Markov counting and reward processes are developed in computational form to analyse the performance and profitability of the system with and without preventive maintenance. These aspects are implemented computationally with Matlab. - Highlights: • A multi-state device is modelled in an algorithmic and computational form. • The performance is partitioned into multi-states and degradation levels. • Several types of failures with repair times according to degradation levels are considered. • Preventive maintenance in response to random inspections is introduced. • Performance and profitability are analysed through Markov counting and reward processes.
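
    As a toy illustration of the Markov reward machinery (the three-state transition matrix and per-step rewards below are invented for the example and far simpler than the paper's inspection/maintenance model): the long-run average reward of a discrete Markov reward process is π·r, where π is the stationary distribution of the chain.

    ```python
    import numpy as np

    # Toy discrete Markov reward process for a multi-state unit:
    # states 0 = full performance, 1 = degraded, 2 = under repair.
    P = np.array([[0.90, 0.08, 0.02],
                  [0.00, 0.85, 0.15],
                  [0.60, 0.00, 0.40]])
    r = np.array([2.0, 1.0, -3.0])  # net reward per step in each state

    # stationary distribution: solve pi P = pi together with sum(pi) = 1
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    avg_reward = pi @ r  # long-run expected reward per step
    ```

    The paper's analysis is richer (counting processes, transient regimes, preventive maintenance), but each of those reduces to computations of this shape on larger state spaces.
    
    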

  18. Modeling reactive transport processes in fractured rock using the time domain random walk approach within a dual-porosity framework

    Science.gov (United States)

    Roubinet, D.; Russian, A.; Dentz, M.; Gouze, P.

    2017-12-01

    Characterizing and modeling hydrodynamic reactive transport in fractured rock are critical challenges for various research fields and applications including environmental remediation, geological storage, and energy production. To this end, we consider a recently developed time domain random walk (TDRW) approach, which is adapted to reproduce anomalous transport behaviors and capture heterogeneous structural and physical properties. This method is also very well suited to optimize numerical simulations by memory-shared massive parallelization and provide numerical results at various scales. So far, the TDRW approach has been applied for modeling advective-diffusive transport with mass transfer between mobile and immobile regions and simple (theoretical) reactions in heterogeneous porous media represented as single continuum domains. We extend this approach to dual-continuum representations considering a highly permeable fracture network embedded into a poorly permeable rock matrix with heterogeneous geochemical reactions occurring in both geological structures. The resulting numerical model enables us to extend the range of the modeled heterogeneity scales with an accurate representation of solute transport processes and no assumption on the Fickianity of these processes. The proposed model is compared to existing particle-based methods that are usually used to model reactive transport in fractured rocks assuming a homogeneous surrounding matrix, and is used to evaluate the impact of the matrix heterogeneity on the apparent reaction rates for different 2D and 3D simple-to-complex fracture network configurations.
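
    A hedged, minimal sketch of the time domain random walk idea for pure diffusion on a 1D lattice (all parameters are illustrative; the paper's dual-continuum fracture/matrix setting with reactions is far richer): at each node the particle waits an exponential time with mean dx²/(2D) and then jumps to a neighbour. With a reflecting wall at x = 0, the mean first-passage time to x = L is L²/(2D).

    ```python
    import numpy as np

    # Minimal TDRW: transition times are drawn in the time domain rather than
    # advancing a fixed time step, so each particle carries its own clock.
    rng = np.random.default_rng(3)
    L, N, D = 1.0, 20, 0.5
    dx = L / N
    tau = dx * dx / (2.0 * D)  # mean waiting time per transition

    def first_passage_time():
        """Time for one particle to first reach node N, reflecting at node 0."""
        i, t = 0, 0.0
        while i < N:
            t += rng.exponential(tau)
            if i == 0:
                i = 1                          # reflecting boundary
            else:
                i += 1 if rng.random() < 0.5 else -1
        return t

    times = np.array([first_passage_time() for _ in range(1000)])
    predicted = L * L / (2.0 * D)  # mean first-passage time, = 1.0 here
    ```

    Heterogeneity enters by letting tau (and, with advection, the jump probabilities) vary from node to node, which is what makes the approach attractive for fractured media.
    
    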

  19. Effect of baking process on postprandial metabolic consequences: randomized trials in normal and type 2 diabetic subjects.

    Science.gov (United States)

    Rizkalla, S W; Laromiguiere, M; Champ, M; Bruzzo, F; Boillot, J; Slama, G

    2007-02-01

    To determine the impact of form, fibre content, baking and processing on the glycaemic, insulinaemic and lipidaemic responses to different French breads. First study: Nine healthy subjects were randomized to consume, in a crossover design, one of six kinds of French bread (each containing 50 g available carbohydrate): classic baguette, traditional baguette, loaf of wholemeal bread (WM-B), loaf of bread fermented with yeast or with leaven, a sandwich, and a glucose challenge as reference. The glycaemic index (GI) values ranged from 57+/-9% (mean+/-s.e.m.), for the traditional baguette, to 85+/-27% for the WM-B. No significant difference was found among the different tested breads. The insulinaemic index (II) of the traditional baguette and of the bread fermented with leaven, however, was lower than that of the other breads (analysis of variance). Some varieties of French bread (the traditional baguette) thus have a lower II, in healthy subjects, and a lower GI, in type 2 diabetic subjects, than the other varieties. These results might be due to differences in bread processing rather than fibre content. Supported by grants from the National French Milling Association.

  20. Feasibility of a randomized controlled trial to evaluate the impact of decision boxes on shared decision-making processes.

    Science.gov (United States)

    Giguere, Anik Mc; Labrecque, Michel; Légaré, France; Grad, Roland; Cauchon, Michel; Greenway, Matthew; Haynes, R Brian; Pluye, Pierre; Syed, Iqra; Banerjee, Debi; Carmichael, Pierre-Hugues; Martin, Mélanie

    2015-02-25

    Decision boxes (DBoxes) are two-page evidence summaries to prepare clinicians for shared decision making (SDM). We sought to assess the feasibility of a clustered randomized controlled trial (RCT) to evaluate their impact. A convenience sample of clinicians (nurses, physicians and residents) from six primary healthcare clinics received eight DBoxes and rated their interest in the topic and their satisfaction. After consultations, their patients rated their involvement in decision-making processes (SDM-Q-9 instrument). We measured clinic and clinician recruitment rates, questionnaire completion rates, and patient eligibility rates, and estimated the sample size needed for the RCT. Among the 20 family medicine clinics invited to participate in this study, four agreed to participate, giving an overall recruitment rate of 20%. Of 148 clinicians invited to the study, 93 participated (63%). Clinicians rated their interest in the topics between 6.4 and 8.2 out of 10 (with 10 highest) and their satisfaction with the DBoxes as 4 or 5 out of 5 (with 5 highest) for 81% of the DBoxes. For the future RCT, we estimated that a sample size of 320 patients would allow detecting a 9% mean difference in the SDM-Q-9 ratings between our two arms (0.02 ICC; 0.05 significance level; 80% power). Clinicians' recruitment and questionnaire completion rates support the feasibility of the planned RCT. The level of interest of participants in the DBox topics and their level of satisfaction with the DBoxes demonstrate the acceptability of the intervention. Processes to recruit clinics and patients should be optimized.

  1. Apatite fission track analysis: geological thermal history analysis based on a three-dimensional random process of linear radiation damage

    International Nuclear Information System (INIS)

    Galbraith, R.F.; Laslett, G.M.; Green, P.F.; Duddy, I.R.

    1990-01-01

    Spontaneous fission of uranium atoms over geological time creates a random process of linearly shaped features (fission tracks) inside an apatite crystal. The theoretical distributions associated with this process are governed by the elapsed time and temperature history, but other factors are also reflected in empirical measurements as consequences of sampling by plane section and chemical etching. These include geometrical biases leading to over-representation of long tracks, the shape and orientation of host features when sampling totally confined tracks, and 'gaps' in heavily annealed tracks. We study the estimation of geological parameters in the presence of these factors using measurements on both confined tracks and projected semi-tracks. Of particular interest is a history of sedimentation, uplift and erosion giving rise to a two-component mixture of tracks in which the parameters reflect the current temperature, the maximum temperature and the timing of uplift. A full likelihood analysis based on all measured densities, lengths and orientations is feasible, but because some geometrical biases and measurement limitations are only partly understood it seems preferable to use conditional likelihoods given numbers and orientations of confined tracks. (author)
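
The geometrical bias toward long tracks that the authors describe can be illustrated with a small Monte Carlo sketch (idealized geometry with two fixed track lengths, not the paper's likelihood model): the probability that an isotropically oriented segment intersects a sampling plane is proportional to its length.

```python
import numpy as np

rng = np.random.default_rng(0)

def section_counts(lengths, n=200_000, slab_half_height=5.0):
    """Count how many randomly placed, isotropically oriented line
    segments of each length intersect the observation plane z = 0.

    A segment with centre height z and polar angle theta crosses the
    plane iff |z| < (L/2)|cos(theta)|, so the intersection probability
    is proportional to L -- the bias that over-represents long tracks
    in plane sections."""
    counts = []
    for L in lengths:
        z = rng.uniform(-slab_half_height, slab_half_height, n)
        cos_theta = rng.uniform(-1.0, 1.0, n)  # isotropic directions
        counts.append(int(np.sum(np.abs(z) < 0.5 * L * np.abs(cos_theta))))
    return counts

short, long_ = section_counts([1.0, 2.0])
print("short:", short, "long:", long_, "ratio:", long_ / short)
```

Doubling the track length roughly doubles the chance of being sampled, which is why length distributions measured on plane sections must be corrected before inference.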

  2. Randomized, double-blinded clinical trial for human norovirus inactivation in oysters by high hydrostatic pressure processing.

    Science.gov (United States)

    Leon, Juan S; Kingsley, David H; Montes, Julia S; Richards, Gary P; Lyon, G Marshall; Abdulhafid, Gwen M; Seitz, Scot R; Fernandez, Marina L; Teunis, Peter F; Flick, George J; Moe, Christine L

    2011-08-01

    Contamination of oysters with human noroviruses (HuNoV) constitutes a human health risk and may lead to severe economic losses in the shellfish industry. There is a need to identify a technology that can inactivate HuNoV in oysters. In this study, we conducted a randomized, double-blinded clinical trial to assess the effect of high hydrostatic pressure processing (HPP) on Norwalk virus (HuNoV genogroup I.1) inactivation in virus-seeded oysters ingested by subjects. Forty-four healthy, positive-secretor adults were divided into three study phases. Subjects in each phase were randomized into control and intervention groups. Subjects received Norwalk virus (8FIIb, 1.0 × 10⁴ genomic equivalent copies) in artificially seeded oysters with or without HPP treatment (400 MPa at 25°C, 600 MPa at 6°C, or 400 MPa at 6°C for 5 min). HPP at 600 MPa, but not 400 MPa (at 6° or 25°C), completely inactivated HuNoV in seeded oysters and resulted in no HuNoV infection among these subjects, as determined by reverse transcription-PCR detection of HuNoV RNA in subjects' stool or vomitus samples. Interestingly, a white blood cell (granulocyte) shift was identified in 92% of the infected subjects and was significantly associated with infection (P = 0.0014). In summary, these data suggest that HPP is effective at inactivating HuNoV in contaminated whole oysters and suggest a potential intervention to inactivate infectious HuNoV in oysters for the commercial shellfish industry.

  3. Process Evaluation of the Type 2 Diabetes Mellitus PULSE Program Randomized Controlled Trial: Recruitment, Engagement, and Overall Satisfaction.

    Science.gov (United States)

    Aguiar, Elroy J; Morgan, Philip J; Collins, Clare E; Plotnikoff, Ronald C; Young, Myles D; Callister, Robin

    2017-07-01

    Men are underrepresented in weight loss and type 2 diabetes mellitus (T2DM) prevention studies. To determine the effectiveness of recruitment and the acceptability of the T2DM Prevention Using LifeStyle Education (PULSE) Program, a gender-targeted, self-administered intervention for men. Men (18-65 years, high risk for T2DM) were randomized to intervention (n = 53) or wait-list control groups (n = 48). The 6-month PULSE Program intervention focused on weight loss, diet, and exercise for T2DM prevention. A process evaluation questionnaire was administered at 6 months to examine recruitment and selection processes, and acceptability of the intervention's delivery and content. Associations between self-monitoring and selected outcomes were assessed using Spearman's rank correlation. A pragmatic recruitment and online screening process was effective in identifying men at high risk of T2DM (prediabetes prevalence 70%). Men reported the trial was appealing because it targeted weight loss, T2DM prevention, and getting fit, and because it was perceived as "doable" and tailored for men. The intervention was considered acceptable, with men reporting high overall satisfaction (83%) and engagement with the various components. Adherence to self-monitoring was poor, with only 13% meeting requisite criteria. However, significant associations were observed between weekly self-monitoring of weight and change in weight (rs = -.47, p = .004) and waist circumference (rs = -.38, p = .026). Men reported they would have preferred more intervention contact, for example, by phone or email. Gender-targeted, self-administered lifestyle interventions are feasible, appealing, and satisfying for men. Future studies should explore the effects of additional non-face-to-face contact on motivation, accountability, self-monitoring adherence, and program efficacy.

  4. On a randomly imperfect spherical cap pressurized by a random ...

    African Journals Online (AJOL)

    On a randomly imperfect spherical cap pressurized by a random dynamic load. ... In this paper, we investigate a dynamical system in a random setting of dual ... characterization of the random process for determining the dynamic buckling load ...

  5. Global industrial impact coefficient based on random walk process and inter-country input-output table

    Science.gov (United States)

    Xing, Lizhi; Dong, Xianlei; Guan, Jun

    2017-04-01

    Input-output tables describe a national economic system in comprehensive detail, containing supply and demand information among industrial sectors. Complex network theory, a method for measuring the structure of complex systems, can describe the structural characteristics of the research object by measuring structural indicators of the social and economic system, revealing the relationship between the inner hierarchy and the external economic function. This paper builds GIVCN-WIOT models based on the World Input-Output Database in order to depict the topological structure of the Global Value Chain (GVC), and assumes that the competitive advantage of a nation equals the overall impact of its domestic sectors on the GVC. From the perspective of econophysics, the Global Industrial Impact Coefficient (GIIC) is proposed to measure national competitiveness in gaining information superiority and intermediate interests. Analysis of the GIVCN-WIOT models yields several insights, including the following: (1) sectors with higher Random Walk Centrality contribute more to transmitting value streams within the global economic system; (2) the Half-Value Ratio can be used to measure the robustness of open-economy macroeconomics in the process of globalization; (3) the positive correlation between GIIC and GDP indicates that one country's global industrial impact could reveal its international competitive advantage.
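
The random-walk view of an input-output table treats each sector as a node and each inter-sector flow as a transition weight. The paper's Random Walk Centrality is a specific measure; as a generic sketch of the idea, the stationary distribution of a walk on a (hypothetical) inter-sector flow matrix can be computed by power iteration:

```python
import numpy as np

def random_walk_importance(flows, tol=1e-12, max_iter=10_000):
    """Stationary distribution of a random walk on a weighted,
    directed flow matrix (rows = selling sectors, columns = buyers).
    The walk moves from i to j with probability proportional to
    flows[i, j]; sectors visited often carry more of the value flow."""
    F = np.asarray(flows, dtype=float)
    P = F / F.sum(axis=1, keepdims=True)       # row-stochastic matrix
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).max() < tol:
            return nxt
        pi = nxt
    return pi

# Toy 3-sector inter-industry flow table (hypothetical values).
flows = [[10.0, 40.0, 50.0],
         [30.0, 10.0, 60.0],
         [45.0, 45.0, 10.0]]
pi = random_walk_importance(flows)
print(pi, pi.sum())
```

On a real WIOT the matrix would have thousands of country-sector nodes, but the fixed-point computation is the same.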

  6. A randomized controlled trial of cognitive training using a visual speed of processing intervention in middle aged and older adults.

    Directory of Open Access Journals (Sweden)

    Fredric D Wolinsky

    Full Text Available Age-related cognitive decline is common and may lead to substantial difficulties and disabilities in everyday life. We hypothesized that 10 hours of visual speed of processing training would prevent age-related declines and potentially improve cognitive processing speed. Within two age bands (50-64 and ≥65), 681 patients were randomized to (a) one of three computerized visual speed of processing training arms (10 hours on-site, 14 hours on-site, or 10 hours at-home) or (b) an on-site attention control group using computerized crossword puzzles for 10 hours. The primary outcome was the Useful Field of View (UFOV) test, and the secondary outcomes were the Trail Making Tests (Trails A and B), Symbol Digit Modalities Test (SDMT), Stroop Color and Word Tests, Controlled Oral Word Association Test (COWAT), and the Digit Vigilance Test (DVT), which were assessed at baseline and at one year. 620 participants (91%) completed the study and were included in the analyses. Linear mixed models were used with Blom rank transformations within age bands. All intervention groups had (p < 0.05) small to medium standardized effect size improvements on UFOV (Cohen's d = -0.322 to -0.579, depending on intervention arm), Trails A (d = -0.204 to -0.265), Trails B (d = -0.225 to -0.320), SDMT (d = 0.263 to 0.351), and Stroop Word (d = 0.240 to 0.271). Converted to years of protection against age-related cognitive declines, these effects reflect 3.0 to 4.1 years on UFOV, 2.2 to 3.5 years on Trails A, 1.5 to 2.0 years on Trails B, 5.4 to 6.6 years on SDMT, and 2.3 to 2.7 years on Stroop Word. Visual speed of processing training delivered on-site or at-home to middle-aged or older adults using standard home computers resulted in stabilization or improvement in several cognitive function tests. Widespread implementation of this intervention is feasible. ClinicalTrials.gov NCT-01165463.
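
The Blom rank transformation used in the mixed models above replaces each raw score with a normal quantile of its shifted rank, z_i = Φ⁻¹((r_i − 3/8)/(n + 1/4)). A minimal sketch (the example scores are hypothetical, and ties are not handled):

```python
import numpy as np
from statistics import NormalDist

def blom_transform(x):
    """Blom rank-based inverse-normal transform:
    z_i = Phi^{-1}((r_i - 3/8) / (n + 1/4))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ranks = np.empty(n)
    ranks[np.argsort(x)] = np.arange(1, n + 1)  # ranks 1..n (no ties here)
    inv = NormalDist().inv_cdf
    return np.array([inv((r - 0.375) / (n + 0.25)) for r in ranks])

scores = np.array([3.0, 1.0, 4.0, 1.5, 9.0, 2.6, 5.3, 5.8, 9.7, 9.3])
z = blom_transform(scores)
print(np.round(z, 3))
```

The transform is rank-preserving and yields approximately standard-normal values, which is why it is paired with linear mixed models on skewed cognitive test scores.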

  7. An empirical test of pseudo random number generators by means of an exponential decaying process; Una prueba empirica de generadores de numeros pseudoaleatorios mediante un proceso de decaimiento exponencial

    Energy Technology Data Exchange (ETDEWEB)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A. [Facultad de Fisica e Inteligencia Artificial, Universidad Veracruzana, A.P. 475, Xalapa, Veracruz (Mexico); Mora F, L.E. [CIMAT, A.P. 402, 36000 Guanajuato (Mexico)]. e-mail: hcoronel@uv.mx

    2007-07-01

    Empirical tests for pseudo random number generators based on the use of processes or physical models have been successfully used and are considered as complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)
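
A minimal version of such an empirical test, assuming inverse-transform sampling of the exponential decay law and a Kolmogorov-Smirnov comparison against the target distribution (the paper's exact statistic is not specified here):

```python
import numpy as np

def ks_statistic_exponential(u):
    """Kolmogorov-Smirnov distance between samples generated by the
    inverse-transform decay law x = -ln(1 - u) and the Exp(1)
    distribution they should follow if u is truly uniform."""
    x = np.sort(-np.log(1.0 - np.asarray(u)))
    n = len(x)
    cdf = 1.0 - np.exp(-x)                    # Exp(1) CDF at the samples
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(ecdf_hi - cdf), np.max(cdf - ecdf_lo))

rng = np.random.default_rng(42)               # generator under test
u = rng.random(100_000)
d = ks_statistic_exponential(u)
critical = 1.358 / np.sqrt(len(u))            # asymptotic 5% critical value
print(f"D = {d:.5f}, 5% critical value = {critical:.5f}")
```

A poor generator shows up as a KS distance that repeatedly exceeds the critical value across independent runs; a single run only gives one draw of the statistic.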

  8. Improving understanding in the research informed consent process: a systematic review of 54 interventions tested in randomized control trials.

    Science.gov (United States)

    Nishimura, Adam; Carey, Jantey; Erwin, Patricia J; Tilburt, Jon C; Murad, M Hassan; McCormick, Jennifer B

    2013-07-23

    Obtaining informed consent is a cornerstone of biomedical research, yet participants' comprehension of the presented information is often low. The most effective interventions to improve understanding rates have not been identified. To systematically analyze randomized controlled trials testing interventions to improve the research informed consent process. The primary outcome of interest was quantitative rates of participant understanding; secondary outcomes were rates of information retention, satisfaction, and accrual. Interventional categories included multimedia, enhanced consent documents, extended discussions, test/feedback quizzes, and miscellaneous methods. The search spanned from database inception through September 2010. It was run on Ovid MEDLINE, Ovid EMBASE, Ovid CINAHL, Ovid PsycInfo and Cochrane CENTRAL, ISI Web of Science and Scopus. Five reviewers working independently and in duplicate screened full abstract text to determine eligibility. We included only RCTs. 39 out of 1523 articles fulfilled review criteria (2.6%), with a total of 54 interventions. A data extraction form was created in Distiller, an online reference management system, through an iterative process. One author collected data on study design, population, demographics, intervention, and analytical technique. Meta-analysis was possible on 22 interventions in the multimedia, enhanced form, and extended discussion categories; all 54 interventions were assessed by review. In the meta-analysis, multimedia approaches were associated with a non-significant increase in understanding scores (SMD 0.30, 95% CI -0.23 to 0.84); enhanced consent forms, with a significant increase (SMD 1.73, 95% CI 0.99 to 2.47); and extended discussion, with a significant increase (SMD 0.53, 95% CI 0.21 to 0.84). By review, 31% of multimedia interventions showed significant improvement in understanding; 41% for enhanced consent form; 50% for extended discussion; 33% for test/feedback; and 29% for miscellaneous. Multiple sources of variation
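
Pooled SMDs with confidence intervals like those above come from inverse-variance weighting of the per-trial effects. A fixed-effect sketch (the per-trial numbers below are toy values, not the review's data; the review likely used a random-effects model, which adds a between-study variance term):

```python
import numpy as np

def pool_smd_fixed(smds, ses):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences; returns the pooled SMD and its 95% CI."""
    smds = np.asarray(smds, dtype=float)
    ses = np.asarray(ses, dtype=float)
    w = 1.0 / ses ** 2                       # inverse-variance weights
    pooled = float(np.sum(w * smds) / np.sum(w))
    se = 1.0 / np.sqrt(np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Toy per-trial effects and standard errors (hypothetical).
smds = [0.30, 0.55, 0.10, 0.45]
ses = [0.20, 0.15, 0.25, 0.10]
est, (lo, hi) = pool_smd_fixed(smds, ses)
print(f"pooled SMD {est:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The pooled estimate is pulled toward the most precise trials, which is why one large trial can dominate a small meta-analysis.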

  9. Specialized rheumatology nurse substitutes for rheumatologists in the diagnostic process of fibromyalgia: a cost-consequence analysis and a randomized controlled trial

    NARCIS (Netherlands)

    Kroese, Mariëlle E.; Severens, Johan L.; Schulpen, Guy J.; Bessems, Monique C.; Nijhuis, Frans J.; Landewé, Robert B.

    2011-01-01

    To perform a cost-consequence analysis of the substitution of specialized rheumatology nurses (SRN) for rheumatologists (RMT) in the diagnostic process of fibromyalgia (FM), using both a healthcare and societal perspective and a 9-month period. Alongside a randomized controlled trial, we measured

  10. Extubation process in bed-ridden elderly intensive care patients receiving inspiratory muscle training: a randomized clinical trial.

    Science.gov (United States)

    Cader, Samária Ali; de Souza Vale, Rodrigo Gomes; Zamora, Victor Emmanuel; Costa, Claudia Henrique; Dantas, Estélio Henrique Martin

    2012-01-01

    The purpose of this study was to evaluate the extubation process in bed-ridden elderly intensive care patients receiving inspiratory muscle training (IMT) and identify predictors of successful weaning. Twenty-eight elderly intubated patients in an intensive care unit were randomly assigned to an experimental group (n = 14) that received conventional physiotherapy plus IMT with a Threshold IMT® device or to a control group (n = 14) that received only conventional physiotherapy. The experimental protocol for muscle training consisted of an initial load of 30% maximum inspiratory pressure, which was increased by 10% daily. The training was administered for 5 minutes, twice daily, 7 days a week, with supplemental oxygen from the beginning of weaning until extubation. Successful extubation was defined by the ventilation time measurement with noninvasive positive pressure. A vacuum manometer was used for measurement of maximum inspiratory pressure, and the patients' Tobin index values were measured using a ventilometer. The maximum inspiratory pressure increased significantly (by 7 cm H2O, 95% confidence interval [CI] 4-10), and the Tobin index decreased significantly (by 16 breaths/min/L, 95% CI -26 to 6) in the experimental group compared with the control group. The Chi-squared distribution did not indicate a significant difference in weaning success between the groups (χ2 = 1.47; P = 0.20). However, a comparison of noninvasive positive pressure time dependence indicated a significantly lower value for the experimental group (P = 0.0001; 95% CI 13.08-18.06). The receiver-operating characteristic curve showed an area beneath the curve of 0.877 ± 0.06 for the Tobin index and 0.845 ± 0.07 for maximum inspiratory pressure. The IMT intervention significantly increased maximum inspiratory pressure and significantly reduced the Tobin index; both measures are considered to be good extubation indices. IMT was associated with a reduction in noninvasive positive
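
The ROC areas reported above (0.877 for the Tobin index, 0.845 for maximum inspiratory pressure) are equivalent to the Mann-Whitney probability that a randomly chosen success scores higher than a randomly chosen failure. A sketch of that computation (the predictor values below are hypothetical, not patient data):

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case (ties count as 1/2)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

# Hypothetical predictor values (e.g. maximum inspiratory pressure,
# cm H2O) for successfully vs unsuccessfully weaned patients.
weaned = [42, 38, 45, 50, 36, 41, 47]
not_weaned = [30, 34, 28, 37, 33]
print("AUC =", roc_auc(weaned, not_weaned))
```

An AUC of 0.5 means the predictor carries no discriminating information; values near 0.85-0.9, as in the study, indicate a useful extubation index.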

  11. Processing/structure/property Relationships of Barium Strontium Titanate Thin Films for Dynamic Random Access Memory Application.

    Science.gov (United States)

    Peng, Cheng-Jien

    The purpose of this study is to see the application feasibility of barium strontium titanate (BST) thin films on ultra large scale integration (ULSI) dynamic random access memory (DRAM) capacitors through the understanding of the relationships among processing, structure and electrical properties. Thin films of BST were deposited by the multi-ion-beam reactive sputtering (MIBERS) technique and the metallo-organic decomposition (MOD) method. The processing parameters such as Ba/Sr ratio, substrate temperature, annealing temperature and time, film thickness and doping concentration were correlated with the structure and electric properties of the films. Some effects of secondary low-energy oxygen ion bombardment were also examined. Microstructures of BST thin films could be classified into two types: (a) Type I structures, with multi-grains through the film thickness, for amorphous as-grown films after high temperature annealing, and (b) columnar structure (Type II) which remained even after high temperature annealing, for well-crystallized films deposited at high substrate temperatures. Type I films showed Curie-von Schweidler response, while Type II films showed Debye-type behavior. Type I behavior may be attributed to the presence of a high density of disordered grain boundaries. Two types of current-voltage characteristics could be seen in non-bombarded films depending on the chemistry of the films (doped or undoped) and substrate temperature during deposition. Only the MIBERS films doped with high donor concentration and deposited at high substrate temperature showed space-charge-limited conduction (SCLC) with discrete shallow traps embedded in a trap-distributed background at high electric field. All other non-bombarded films, including MOD films, showed trap-distributed SCLC behavior with a slope of ~7.5-10 due to the presence of grain boundaries through the film thickness or traps induced by unavoidable acceptor impurities in the films. Donor-doping could

  12. Run charts revisited: a simulation study of run chart rules for detection of non-random variation in health care processes.

    Science.gov (United States)

    Anhøj, Jacob; Olesen, Anne Vingaard

    2014-01-01

    A run chart is a line graph of a measure plotted over time with the median as a horizontal line. The main purpose of the run chart is to identify process improvement or degradation, which may be detected by statistical tests for non-random patterns in the data sequence. We studied the sensitivity to shifts and linear drifts in simulated processes using the shift, crossings and trend rules for detecting non-random variation in run charts. The shift and crossings rules are effective in detecting shifts and drifts in process centre over time while keeping the false signal rate constant around 5% and independent of the number of data points in the chart. The trend rule is virtually useless for detection of linear drift over time, the purpose it was intended for.
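
The shift and crossings rules the authors evaluate can be stated compactly: signal a shift when a run of at least log2(n) + 3 (rounded) consecutive points falls on the same side of the median, and signal unusually few median crossings when the count falls below the lower 5th percentile of Binomial(n − 1, 0.5). A sketch of both rules, based on that published formulation (the example series is constructed, not study data):

```python
import numpy as np
from math import comb

def run_chart_signals(data):
    """Shift and crossings rules for a run chart. Points exactly on
    the median are dropped before counting runs and crossings."""
    x = np.asarray(data, dtype=float)
    sides = np.sign(x - np.median(x))
    sides = sides[sides != 0]                 # drop points on the median
    n = len(sides)
    runs = np.split(sides, np.where(np.diff(sides) != 0)[0] + 1)
    longest_run = max(len(r) for r in runs)
    crossings = int(np.sum(np.diff(sides) != 0))
    shift_limit = round(np.log2(n)) + 3
    # lower 5th percentile of Binomial(n-1, 0.5) by direct summation
    k, cum = 0, 0.0
    while cum + comb(n - 1, k) * 0.5 ** (n - 1) < 0.05:
        cum += comb(n - 1, k) * 0.5 ** (n - 1)
        k += 1
    return {"shift": longest_run >= shift_limit,
            "crossings": crossings < k,
            "longest_run": longest_run, "n_crossings": crossings}

# A process that shifts upward halfway through (constructed data).
series = [3, 5, 4, 2, 5, 3, 4, 2, 5, 4, 7, 8, 6, 9, 7, 8, 9, 6, 8, 7]
print(run_chart_signals(series))
```

On this constructed series both rules fire: the upward shift produces one long run of ten points and only a single median crossing.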

  13. Extubation process in bed-ridden elderly intensive care patients receiving inspiratory muscle training: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Cader SA

    2012-10-01

    Full Text Available Samária Ali Cader,1 Rodrigo Gomes de Souza Vale,1 Victor Emmanuel Zamora,2 Claudia Henrique Costa,2 Estélio Henrique Martin Dantas1 1Laboratory of Human Kinetics Bioscience, Federal University of Rio de Janeiro State, 2Pedro Ernesto University Hospital, School of Medicine, State University of Rio de Janeiro, Rio de Janeiro, Brazil. Background: The purpose of this study was to evaluate the extubation process in bed-ridden elderly intensive care patients receiving inspiratory muscle training (IMT) and identify predictors of successful weaning. Methods: Twenty-eight elderly intubated patients in an intensive care unit were randomly assigned to an experimental group (n = 14) that received conventional physiotherapy plus IMT with a Threshold IMT® device or to a control group (n = 14) that received only conventional physiotherapy. The experimental protocol for muscle training consisted of an initial load of 30% maximum inspiratory pressure, which was increased by 10% daily. The training was administered for 5 minutes, twice daily, 7 days a week, with supplemental oxygen from the beginning of weaning until extubation. Successful extubation was defined by the ventilation time measurement with noninvasive positive pressure. A vacuum manometer was used for measurement of maximum inspiratory pressure, and the patients' Tobin index values were measured using a ventilometer. Results: The maximum inspiratory pressure increased significantly (by 7 cm H2O, 95% confidence interval [CI] 4–10), and the Tobin index decreased significantly (by 16 breaths/min/L, 95% CI −26 to 6) in the experimental group compared with the control group. The Chi-squared distribution did not indicate a significant difference in weaning success between the groups (χ2 = 1.47; P = 0.20). However, a comparison of noninvasive positive pressure time dependence indicated a significantly lower value for the experimental group (P = 0.0001; 95% CI 13.08–18.06). The receiver

  14. Will Mobile Diabetes Education Teams (MDETs) in primary care improve patient care processes and health outcomes? Study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Gucciardi Enza

    2012-09-01

    Full Text Available Abstract Background There is evidence to suggest that delivery of diabetes self-management support by diabetes educators in primary care may improve patient care processes and patient clinical outcomes; however, the evaluation of such a model in primary care is nonexistent in Canada. This article describes the design for the evaluation of the implementation of Mobile Diabetes Education Teams (MDETs) in primary care settings in Canada. Methods/design This study will use a non-blinded, cluster-randomized controlled trial stepped wedge design to evaluate the Mobile Diabetes Education Teams' intervention in improving patient clinical and care process outcomes. A total of 1,200 patient charts at participating primary care sites will be reviewed for data extraction. Eligible patients will be those aged ≥18, who have type 2 diabetes and a hemoglobin A1c (HbA1c) of ≥8%. Clusters (that is, primary care sites) will be randomized to the intervention and control group using a block randomization procedure with practice size as the blocking factor. A stepped wedge design will be used to sequentially roll out the intervention so that all clusters eventually receive the intervention. The time at which each cluster begins the intervention is randomized to one of the four roll-out periods (0, 6, 12, and 18 months). Clusters that are randomized into the intervention later will act as the control for those receiving the intervention earlier. The primary outcome measure will be the difference in the proportion of patients who achieve the recommended HbA1c target of ≤7% between intervention and control groups. Qualitative work (in-depth interviews with primary care physicians, MDET educators and patients; and MDET educators' field notes and debriefing sessions) will be undertaken to assess the implementation process and effectiveness of the MDET intervention. Trial registration ClinicalTrials.gov NCT01553266
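
The stepped wedge allocation described above randomizes each cluster to one of four roll-out periods, so every site eventually receives the intervention. A minimal allocation sketch (clinic names are placeholders, and the protocol's blocking by practice size is omitted for brevity):

```python
import random

def stepped_wedge_allocation(clusters, n_periods=4, seed=1):
    """Randomly assign clusters to intervention roll-out periods so
    each period starts an equal number of clusters; clusters that
    start later serve as controls for those that start earlier."""
    rng = random.Random(seed)
    shuffled = clusters[:]
    rng.shuffle(shuffled)
    schedule = {p: [] for p in range(n_periods)}
    for i, cluster in enumerate(shuffled):
        schedule[i % n_periods].append(cluster)
    return schedule

clinics = [f"clinic_{i:02d}" for i in range(8)]  # hypothetical sites
schedule = stepped_wedge_allocation(clinics)
for period, sites in schedule.items():
    print(f"start at month {6 * period}: {sites}")
```

Shuffling before the round-robin assignment makes the period each cluster enters random while keeping group sizes balanced.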

  15. Omega-3 and -6 fatty acid supplementation and sensory processing in toddlers with ASD symptomology born preterm: A randomized controlled trial.

    Science.gov (United States)

    Boone, Kelly M; Gracious, Barbara; Klebanoff, Mark A; Rogers, Lynette K; Rausch, Joseph; Coury, Daniel L; Keim, Sarah A

    2017-12-01

    Despite advances in the health and long-term survival of infants born preterm, they continue to face developmental challenges including higher risk for autism spectrum disorder (ASD) and atypical sensory processing patterns. This secondary analysis aimed to describe sensory profiles and explore effects of combined dietary docosahexaenoic acid (DHA), eicosapentaenoic acid (EPA), and gamma-linolenic acid (GLA) supplementation on parent-reported sensory processing in toddlers born preterm who were exhibiting ASD symptoms. 90-day randomized, double-blinded, placebo-controlled trial. 31 children aged 18-38 months who were born at ≤29 weeks' gestation. Mixed effects regression analyses followed intent to treat and explored effects on parent-reported sensory processing measured by the Infant/Toddler Sensory Profile (ITSP). Baseline ITSP scores reflected atypical sensory processing, with the majority of atypical scores falling below the mean. Sensory processing sections: auditory (above=0%, below=65%), vestibular (above=13%, below=48%), tactile (above=3%, below=35%), oral sensory (above=10%; below=26%), visual (above=10%, below=16%); sensory processing quadrants: low registration (above=3%; below=71%), sensation avoiding (above=3%; below=39%), sensory sensitivity (above=3%; below=35%), and sensation seeking (above=10%; below=19%). Twenty-eight of 31 children randomized had complete outcome data. Although not statistically significant (p=0.13), the magnitude of the effect for reduction in behaviors associated with sensory sensitivity was medium to large (effect size=0.57). No other scales reflected a similar magnitude of effect size (range: 0.10 to 0.32). The findings provide support for larger randomized trials of omega fatty acid supplementation for children at risk of sensory processing difficulties, especially those born preterm. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Efficient Numerical Methods for Analysis of Square Ratio of κ-μ and η-μ Random Processes with Their Applications in Telecommunications

    Directory of Open Access Journals (Sweden)

    Gradimir V. Milovanović

    2018-01-01

    Full Text Available We provide a statistical analysis of the square ratio of κ-μ and η-μ random processes and its application in signal-to-interference ratio (SIR) based performance analysis of wireless transmission subjected to multipath fading, modelled by the κ-μ fading model, and to undesired co-channel interference (CCI), distributed as an η-μ random process. The first contribution of the paper is the derivation of exact closed-form expressions for the probability density function (PDF) and cumulative distribution function (CDF) of the square ratio of κ-μ and η-μ random processes. Further, the accuracy of these PDF and CDF expressions is verified by comparison with the corresponding approximations obtained by high-precision quadrature formulas of Gaussian type with respect to weight functions on (0, +∞). The computational procedure for such quadrature rules is provided by the constructive theory of orthogonal polynomials and the MATHEMATICA package OrthogonalPolynomials created by Cvetković and Milovanović (2004). Capitalizing on the obtained expressions, an important wireless performance criterion, namely the outage probability (OP), is obtained as a function of transmission parameters. Possible performance improvement through the employment of SC (selection combining) reception is also observed, based on the obtained expressions.
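
Gaussian quadrature with respect to a weight function on (0, +∞) is exactly the tool the authors use for verification. The simplest instance is Gauss-Laguerre quadrature, which approximates integrals of the form ∫₀^∞ e^(−x) f(x) dx by a weighted sum over n nodes and is exact for polynomial f up to degree 2n − 1. A sketch with known closed-form integrals as checks (the paper's weight functions are more elaborate; Laguerre is used here only as an illustration):

```python
import numpy as np
from numpy.polynomial.laguerre import laggauss

# Gauss-Laguerre nodes and weights approximate
#   int_0^inf e^{-x} f(x) dx  ~  sum_k w_k f(x_k)
nodes, weights = laggauss(30)

mean = np.sum(weights * nodes)           # E[X] for X ~ Exp(1): exactly 1
second = np.sum(weights * nodes ** 2)    # E[X^2] for Exp(1): exactly 2
half = np.sum(weights * np.cos(nodes))   # int e^{-x} cos(x) dx = 1/2

print(mean, second, half)
```

The first two integrands are low-degree polynomials, so the rule reproduces them to machine precision; the oscillatory third integrand still converges extremely fast because cos is entire.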

  17. Randomized benchmarking of single- and multi-qubit control in liquid-state NMR quantum information processing

    International Nuclear Information System (INIS)

    Ryan, C A; Laforest, M; Laflamme, R

    2009-01-01

    Being able to quantify the level of coherent control in a proposed device implementing a quantum information processor (QIP) is an important task for both comparing different devices and assessing a device's prospects with regards to achieving fault-tolerant quantum control. We implement in a liquid-state nuclear magnetic resonance QIP the randomized benchmarking protocol presented by Knill et al (2008 Phys. Rev. A 77 012307). We report an error per randomized π/2 pulse of (1.3 ± 0.1) × 10⁻⁴ with a single-qubit QIP and show an experimentally relevant error model where the randomized benchmarking gives a signature fidelity decay which is not possible to interpret as a single error per gate. We explore and experimentally investigate multi-qubit extensions of this protocol and report an average error rate for one- and two-qubit gates of (4.7 ± 0.3) × 10⁻³ for a three-qubit QIP. We estimate that these error rates are still not decoherence limited and thus can be improved with modifications to the control hardware and software.
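
Randomized benchmarking extracts the error per gate from the decay of average sequence fidelity, which follows F(m) = A·p^m + B in sequence length m; for a single qubit (d = 2) the error per gate is r = (1 − p)/2. A fitting sketch on synthetic data (the decay parameters below are invented, not the paper's; successive differences are used to eliminate the offset B before a log-linear fit):

```python
import numpy as np

# Synthetic benchmarking decay F(m) = A p^m + B with small noise.
rng = np.random.default_rng(7)
A, B, p_true = 0.45, 0.5, 0.995
m = np.arange(1, 101)
F = A * p_true ** m + B + rng.normal(0, 1e-5, m.size)

# Successive differences remove the unknown offset B:
#   F(m+1) - F(m) = A p^m (p - 1),
# so log(-diff) is linear in m with slope log(p).
diffs = np.diff(F)
slope = np.polyfit(m[:-1], np.log(-diffs), 1)[0]
p_est = float(np.exp(slope))
r_est = (1 - p_est) / 2                 # error per gate, single qubit
print(f"p ~ {p_est:.4f}, error per gate ~ {r_est:.2e}")
```

In a real experiment F(m) is an average over many random gate sequences, and a direct nonlinear fit of A, B, p is more common; the difference trick above is just a compact way to recover p.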

  18. Process Evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial.

    Science.gov (United States)

    Knowlden, Adam P; Sharma, Manoj

    2014-09-01

    Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose exposure, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.

  19. EPQ model for imperfect production processes with rework and random preventive machine time for deteriorating items and trended demand

    Directory of Open Access Journals (Sweden)

    Shah Nita H.

    2015-01-01

    Full Text Available An economic production quantity (EPQ) model is analyzed for trended demand, with units in inventory deteriorating at a constant rate. The system allows rework of imperfect units, and the preventive maintenance time is random. A search method is used to study the model. The proposed methodology is validated by a numerical example. A sensitivity analysis is carried out to determine the critical model parameters. It is observed that the rate of change of demand and the deterioration rate have a significant impact on the decision variables and the total cost of the inventory system. The model is highly sensitive to the production and demand rates.
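
The model above extends the classical EPQ with rework, deterioration, trended demand, and random maintenance time, which is why a numerical search is needed. The classical baseline it builds on does have a closed form, Q* = √(2KD / (h(1 − D/P))), sketched here with hypothetical parameters:

```python
from math import sqrt

def epq(demand, production, setup_cost, holding_cost):
    """Classic economic production quantity (no rework, no
    deterioration, constant demand):
        Q* = sqrt(2*K*D / (h * (1 - D/P)))."""
    if demand >= production:
        raise ValueError("production rate must exceed demand rate")
    return sqrt(2 * setup_cost * demand /
                (holding_cost * (1 - demand / production)))

# Hypothetical parameters: D = 4000 units/yr, P = 10000 units/yr,
# K = $200 per setup, h = $3 per unit per year.
q = epq(demand=4000, production=10000, setup_cost=200, holding_cost=3)
print(f"optimal lot size ~ {q:.0f} units")
```

The (1 − D/P) factor is what distinguishes EPQ from the plain EOQ: inventory accumulates only at the net rate P − D while production runs, so finite production capacity enlarges the optimal lot.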

  20. Making working memory work: the effects of extended practice on focus capacity and the processes of updating, forward access, and random access.

    Science.gov (United States)

    Price, John M; Colflesh, Gregory J H; Cerella, John; Verhaeghen, Paul

    2014-05-01

    We investigated the effects of 10 hours of practice on variations of the N-Back task to investigate the processes underlying possible expansion of the focus of attention within working memory. Using subtractive logic, we showed that random access (i.e., Sternberg-like search) yielded a modest effect (a 50% increase in speed) whereas the processes of forward access (i.e., retrieval in order, as in a standard N-Back task) and updating (i.e., changing the contents of working memory) were executed about 5 times faster after extended practice. We additionally found that extended practice increased working memory capacity as measured by the size of the focus of attention for the forward-access task, but not for variations where probing was in random order. This suggests that working memory capacity may depend on the type of search process engaged, and that certain working-memory-related cognitive processes are more amenable to practice than others. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Gibbs-non-Gibbs transitions and vector-valued integration

    NARCIS (Netherlands)

    Zuijlen, van W.B.

    2016-01-01

    This thesis consists of two distinct topics. The first part of the thesis considers Gibbs-non-Gibbs transitions. Gibbs measures describe the macroscopic state of a system of a large number of components that is in equilibrium. It may happen that when the system is transformed, for example, by

  2. Mimetic Discretization of Vector-valued Diffusion Problems

    DEFF Research Database (Denmark)

    Olesen, Kennet

    this is the balance of the change of mass in a finite volume with mass fluxes across the surfaces bounding this volume? In the FDM and FEM the derivatives in the gradient-, curl- and divergence operator are approximated by formulating expressions with respect to a finite number of selected points. The continuous...... gradient-, curl- and divergence operators are derived based on geometrical considerations on finite domains, and by introducing geometry into the numerical scheme these operators can be replicated exactly. To incorporate the geometry into the PDEs the field of differential geometry is applied, which has...... of different dimensions through Stokes' theorem. - A clear separation of balance/equilibrium equations and constitutive equations is possible. As mentioned the emphasis is put on diffusion dominated problems containing second order tensors. Earlier work has developed a rigorous framework for problems involving...

  3. TV-L1 optical flow for vector valued images

    DEFF Research Database (Denmark)

    Rakêt, Lars Lau; Roholm, Lars; Nielsen, Mads

    2011-01-01

    The variational TV-L1 framework has become one of the most popular and successful approaches for calculating optical flow. One reason for the popularity is the very appealing properties of the two terms in the energy formulation of the problem, the robust L1-norm of the data fidelity term combined...... with the total variation (TV) regularization that smoothes the flow, but preserves strong discontinuities such as edges. Specifically the approach of Zach et al. [1] has provided a very clean and efficient algorithm for calculating TV-L1 optical flows between grayscale images. In this paper we propose...
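
For context, the scalar TV-L1 energy of Zach et al. [1] referred to above is conventionally written as follows (standard form from the optical-flow literature; the paper's contribution is its extension to vector-valued images):

```latex
E(u) = \int_{\Omega} \lambda \,\bigl| I_1\bigl(x + u(x)\bigr) - I_0(x) \bigr|
       \;+\; \bigl| \nabla u(x) \bigr| \, dx
```

where \(I_0, I_1\) are the two images, \(u\) is the flow field on the domain \(\Omega\), and \(\lambda\) weights the L1 data-fidelity term against the total-variation regularizer.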

  4. Vector valued logarithmic residues and the extraction of elementary factors

    NARCIS (Netherlands)

    H. Bart (Harm); T. Ehrhardt; B. Silbermann

    2007-01-01

    An analysis is presented of the circumstances under which, by the extraction of elementary factors, an analytic Banach algebra valued function can be transformed into one taking invertible values only. Elementary factors are generalizations of the simple scalar expressions λ – α, the

  5. Vector-valued almost convergence and classical properties in ...

    Indian Academy of Sciences (India)

    So, Banach limits are legitimate extensions of the limit function on c. In [14], Lorentz made use of the concept of Banach limit to introduce the notion of 'almost convergence'. DEFINITION 1.2 [14]. A bounded sequence (xn)n∈N ∈ l∞ is called almost convergent exactly when there exists a number y ∈ R (called the almost ...

  6. Isometric multipliers of a vector valued Beurling algebra on a ...

    Indian Academy of Sciences (India)

    Throughout, let S be a nonunital faithful abelian semigroup, and let A be a commutative Banach algebra. A map σ : S → S is a multiplier [1, 4] if σ(xy) = xσ(y) = σ(x)y, x,y ∈ S. Let M(S) be the set of all multipliers of S. Then M(S) is a unital abelian semigroup under composition. Since S is faithful, S can be imbedded as an ...

  7. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Table of contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani
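
The randomization tests covered in the book above rest on one core procedure: under the null hypothesis the group labels are exchangeable, so an observed test statistic is compared against the distribution obtained by reshuffling labels. A minimal sketch (names and data illustrative):

```python
# Hedged sketch of a two-sample randomization (permutation) test for a
# difference in means. No random sampling from a population is assumed;
# only the random assignment of units to groups is used for inference.
import random

def randomization_test(group_a, group_b, n_permutations=10000, seed=0):
    """Return a two-sided p-value for the observed difference in means."""
    rng = random.Random(seed)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    observed = abs(sum(group_a) / n_a - sum(group_b) / len(group_b))
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)                      # reshuffle the group labels
        perm_a, perm_b = pooled[:n_a], pooled[n_a:]
        diff = abs(sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b))
        if diff >= observed:
            count += 1
    # Add-one correction: the observed assignment counts as one permutation.
    return (count + 1) / (n_permutations + 1)
```

Clearly separated groups yield a small p-value, while identical groups yield p = 1, since every relabeling reproduces a difference at least as large as the observed zero.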

  8. Small Acute Benefits of 4 Weeks Processing Speed Training Games on Processing Speed and Inhibition Performance and Depressive Mood in the Healthy Elderly People: Evidence from a Randomized Control Trial.

    Science.gov (United States)

    Nouchi, Rui; Saito, Toshiki; Nouchi, Haruka; Kawashima, Ryuta

    2016-01-01

    Background: Processing speed training over a 1-year intervention period improves cognitive functions and emotional states of elderly people. Nevertheless, it remains unclear whether short-term processing speed training, such as 4 weeks, can benefit elderly people. This study was designed to investigate the effects of 4 weeks of processing speed training on cognitive functions and emotional states of elderly people. Methods: We used a single-blinded randomized control trial (RCT). Seventy-two older adults were assigned randomly to two groups: a processing speed training game (PSTG) group and a knowledge quiz training game (KQTG) group, an active control group. In the PSTG group, participants were asked to play PSTG (12 processing speed games) for 15 min, during five sessions per week, for 4 weeks. In the KQTG group, participants were asked to play KQTG (four knowledge quizzes) for 15 min, during five sessions per week, for 4 weeks. We measured several cognitive functions and emotional states before and after the 4-week intervention period. Results: Our results revealed that PSTG improved performance in processing speed and inhibition compared to KQTG, but did not improve performance in reasoning, shifting, short-term/working memory, or episodic memory. Moreover, PSTG reduced the depressive mood score, as measured by the Profile of Mood States, compared to KQTG during the 4-week intervention period, but did not change other emotional measures. Discussion: This RCT provides the first scientific evidence of small acute benefits of 4-week PSTG on processing speed, inhibition, and depressive mood in healthy elderly people. We discuss possible mechanisms for the improvements in processing speed and inhibition and the reduction in depressive mood. Trial registration: This trial was registered in the University Hospital Medical Information Network Clinical Trials Registry (UMIN000022250).

  9. Liquid-borne nanoparticles' impact on the random yield during critical processes in IC production

    NARCIS (Netherlands)

    Wali, F.; Knotter, D. Martin; Kuper, F.G.

    2008-01-01

    The semiconductor industry faces a continuous challenge to decrease the transistor size as well as to increase the yield by eliminating defect sources. One of the sources of particle defects is the ultrapure water used in different production tools at different stages of processing. In this paper, particle

  10. Return to work and occupational physicians' management of common mental health problems--process evaluation of a randomized controlled trial

    NARCIS (Netherlands)

    Rebergen, David S.; Bruinvels, David J.; Bos, Chris M.; van der Beek, Allard J.; van Mechelen, Willem

    2010-01-01

    The aim of this study was to examine the adherence of occupational physicians (OP) to the Dutch guideline on the management of common mental health problems and its effect on return to work as part of the process evaluation of a trial comparing adherence to the guideline to care as usual. The first

  11. On Using the Volatile Mem-Capacitive Effect of TiO2 Resistive Random Access Memory to Mimic the Synaptic Forgetting Process

    Science.gov (United States)

    Sarkar, Biplab; Mills, Steven; Lee, Bongmook; Pitts, W. Shepherd; Misra, Veena; Franzon, Paul D.

    2018-02-01

    In this work, we report on mimicking the synaptic forgetting process using the volatile mem-capacitive effect of a resistive random access memory (RRAM). A TiO2 dielectric, which is known to show volatile memory operations due to migration of inherent oxygen vacancies, was used to achieve the volatile mem-capacitive effect. By placing the volatile RRAM candidate along with SiO2 at the gate of a MOS capacitor, a volatile capacitance change resembling the forgetting nature of a human brain is demonstrated. Furthermore, the memory operation in the MOS capacitor does not require a current flow through the gate dielectric, indicating the feasibility of obtaining low-power memory operations. Thus, the mem-capacitive effect of volatile RRAM candidates can be attractive for future neuromorphic systems implementing the forgetting process of a human brain.

  12. A model for Intelligent Random Access Memory architecture (IRAM) cellular automata algorithms on the Associative String Processing machine (ASTRA)

    CERN Document Server

    Rohrbach, F; Vesztergombi, G

    1997-01-01

    In the near future, computer performance will be completely determined by how long it takes to access memory. There are bottlenecks in memory latency and memory-to-processor interface bandwidth. The IRAM initiative could be the answer, by putting the Processor In Memory (PIM). Starting from the massively parallel processing concept, one reaches a similar conclusion. The MPPC (Massively Parallel Processing Collaboration) project and the 8K-processor ASTRA machine (Associative String Test bench for Research & Applications) developed at CERN \cite{kuala} can be regarded as forerunners of the IRAM concept. The computing power of the ASTRA machine, regarded as an IRAM with 64 one-bit processors on a 64×64 bit-matrix memory chip, has been demonstrated by running statistical physics algorithms: one-dimensional stochastic cellular automata, as a simple model for dynamical phase transitions. As a relevant result for physics, the damage spreading of this model has been investigated.
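
The damage-spreading experiment mentioned above can be sketched as follows. The abstract does not name the exact cellular-automaton rule, so the Domany-Kinzel model is used here purely as an illustrative one-dimensional stochastic rule: two replicas evolve with identical random numbers from initial configurations differing at a single site, and "damage" is their Hamming distance over time.

```python
# Hedged sketch of damage spreading in a 1-D stochastic cellular automaton,
# using the Domany-Kinzel rule as an illustrative choice (an assumption, not
# the paper's exact model). Both replicas consume the same random numbers.
import random

def dk_step(state, p1, p2, randoms):
    """One synchronous Domany-Kinzel update on a ring of sites in {0, 1}."""
    n = len(state)
    new = [0] * n
    for i in range(n):
        occupied = state[i - 1] + state[(i + 1) % n]  # occupied neighbours
        if occupied == 1:
            new[i] = 1 if randoms[i] < p1 else 0
        elif occupied == 2:
            new[i] = 1 if randoms[i] < p2 else 0
        # occupied == 0: site stays empty
    return new

def damage(a, b):
    """Hamming distance between two configurations."""
    return sum(x != y for x, y in zip(a, b))

def spread_damage(n=64, steps=20, p1=0.8, p2=0.5, seed=7):
    rng = random.Random(seed)
    a = [rng.randint(0, 1) for _ in range(n)]
    b = list(a)
    b[n // 2] ^= 1                      # flip one site in the replica
    history = [damage(a, b)]
    for _ in range(steps):
        randoms = [rng.random() for _ in range(n)]   # shared noise
        a = dk_step(a, p1, p2, randoms)
        b = dk_step(b, p1, p2, randoms)
        history.append(damage(a, b))
    return history
```

Whether the initial single-site damage dies out or spreads as the parameters vary is exactly the kind of dynamical phase transition the abstract alludes to.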

  13. Dual N-Back Working Memory Training in Healthy Adults: A Randomized Comparison to Processing Speed Training

    Science.gov (United States)

    Lawlor-Savage, Linette; Goghari, Vina M.

    2016-01-01

    Enhancing cognitive ability is an attractive concept, particularly for middle-aged adults interested in maintaining cognitive functioning and preventing age-related declines. Computerized working memory training has been investigated as a safe method of cognitive enhancement in younger and older adults, although few studies have considered the potential impact of working memory training on middle-aged adults. This study investigated dual n-back working memory training in healthy adults aged 30–60. Fifty-seven adults completed measures of working memory, processing speed, and fluid intelligence before and after a 5-week web-based dual n-back or active control (processing speed) training program. Results: Repeated measures multivariate analysis of variance failed to identify improvements across the three cognitive composites, working memory, processing speed, and fluid intelligence, after training. Follow-up Bayesian analyses supported null findings for training effects for each individual composite. Findings suggest that dual n-back working memory training may not benefit working memory or fluid intelligence in healthy adults. Further investigation is necessary to clarify if other forms of working memory training may be beneficial, and what factors impact training-related benefits, should they occur, in this population. PMID:27043141

  14. Dual N-Back Working Memory Training in Healthy Adults: A Randomized Comparison to Processing Speed Training.

    Directory of Open Access Journals (Sweden)

    Linette Lawlor-Savage

    Full Text Available Enhancing cognitive ability is an attractive concept, particularly for middle-aged adults interested in maintaining cognitive functioning and preventing age-related declines. Computerized working memory training has been investigated as a safe method of cognitive enhancement in younger and older adults, although few studies have considered the potential impact of working memory training on middle-aged adults. This study investigated dual n-back working memory training in healthy adults aged 30-60. Fifty-seven adults completed measures of working memory, processing speed, and fluid intelligence before and after a 5-week web-based dual n-back or active control (processing speed) training program. Repeated measures multivariate analysis of variance failed to identify improvements across the three cognitive composites, working memory, processing speed, and fluid intelligence, after training. Follow-up Bayesian analyses supported null findings for training effects for each individual composite. Findings suggest that dual n-back working memory training may not benefit working memory or fluid intelligence in healthy adults. Further investigation is necessary to clarify if other forms of working memory training may be beneficial, and what factors impact training-related benefits, should they occur, in this population.

  15. Balancing Opposing Forces—A Nested Process Evaluation Study Protocol for a Stepped Wedge Designed Cluster Randomized Controlled Trial of an Experience Based Codesign Intervention

    Directory of Open Access Journals (Sweden)

    Victoria Jane Palmer

    2016-10-01

    Full Text Available Background: Process evaluations are essential to understand the contextual, relational, and organizational and system factors of complex interventions. The guidance for developing process evaluations for randomized controlled trials (RCTs) has, however, until recently been fairly limited. Method/Design: A nested process evaluation (NPE) was designed and embedded across all stages of a stepped wedge cluster RCT called the CORE study. The aim of the CORE study is to test the effectiveness of an experience-based codesign methodology for improving psychosocial recovery outcomes for people living with severe mental illness (service users). Process evaluation data collection combines qualitative and quantitative methods with four aims: (1) to describe organizational characteristics, service models, policy contexts, and government reforms and examine the interaction of these with the intervention; (2) to understand how the codesign intervention works, the cluster variability in implementation, and if the intervention is or is not sustained in different settings; (3) to assist in the interpretation of the primary and secondary outcomes and determine if the causal assumptions underpinning the codesign interventions are accurate; and (4) to determine the impact of a purposefully designed engagement model on the broader study retention and knowledge transfer in the trial. Discussion: Process evaluations require prespecified study protocols, but finding a balance between their iterative nature and the structure offered by protocol development is an important step forward. Taking this step will advance the role of qualitative research within trials research and enable more focused data collection to occur at strategic points within studies.

  16. Achieving involvement: process outcomes from a cluster randomized trial of shared decision making skill development and use of risk communication aids in general practice.

    Science.gov (United States)

    Elwyn, G; Edwards, A; Hood, K; Robling, M; Atwell, C; Russell, I; Wensing, M; Grol, R

    2004-08-01

    A consulting method known as 'shared decision making' (SDM) has been described and operationalized in terms of several 'competences'. One of these competences concerns the discussion of the risks and benefits of treatment or care options: 'risk communication'. Few data exist on clinicians' ability to acquire skills and implement the competences of SDM or risk communication in consultations with patients. The aims of this study were to evaluate the effects of skill development workshops for SDM and the use of risk communication aids on the process of consultations. A cluster randomized trial with crossover was carried out with the participation of 20 recently qualified GPs in urban and rural general practices in Gwent, South Wales. A total of 747 patients with known atrial fibrillation, prostatism, menorrhagia or menopausal symptoms were invited to a consultation to review their condition or treatments. Half the consultations were randomly selected for audio-taping, of which 352 patients attended and were audio-taped successfully. After baseline, participating doctors were randomized to receive training in (i) SDM skills or (ii) the use of simple risk communication aids, using simulated patients. The alternative training was then provided for the final study phase. Patients were allocated randomly to a consultation during baseline or intervention 1 (SDM or risk communication aids) or intervention 2 phases. A randomly selected half of the consultations were audio-taped from each phase. Raters (independent, trained and blinded to study phase) assessed the audio-tapes using a validated scale to assess levels of patient involvement (OPTION: observing patient involvement), and to analyse the nature of risk information discussed. Clinicians completed questionnaires after each consultation, assessing perceived clinician-patient agreement and level of patient involvement in decisions. 
Multilevel modelling was carried out with the OPTION score as the dependent variable, and

  17. Use of a multimedia module to aid the informed consent process in patients undergoing gynecologic laparoscopy for pelvic pain: randomized controlled trial.

    Science.gov (United States)

    Ellett, Lenore; Villegas, Rocio; Beischer, Andrew; Ong, Nicole; Maher, Peter

    2014-01-01

    To determine whether providing additional information to the standard consent process, in the form of a multimedia module (MM), improves patient knowledge about operative laparoscopy without increasing anxiety. Randomized controlled trial (Canadian Task Force classification I). Two outpatient gynecologic clinics, one in a private hospital and the other in a public teaching hospital. Forty-one women aged 19 to 51 years (median, 35.6 years) requiring operative laparoscopy for investigation and treatment of pelvic pain. Following the standard informed consent process, patients were randomized to watch the MM (intervention group, n = 21) or not (control group, n = 20). The surgeon was blinded to the group assignments. All patients completed a knowledge questionnaire and the Spielberger short-form State-Trait Anxiety Inventory. Six weeks after recruitment, patients completed the knowledge questionnaire and the State-Trait Anxiety Inventory a second time to assess knowledge retention and anxiety scores. Patient knowledge of operative laparoscopy, anxiety level, and acceptance of the MM were recorded. The MM intervention group demonstrated superior knowledge scores. Mean (SE) score in the MM group was 11.3 (0.49), and in the control group was 7.9 (0.50) (p <.001) (maximum score, 14). This did not translate into improved knowledge scores 6 weeks later; the score in the MM group was 8.4 (0.53) vs. 7.8 (0.50) in the control group (p = .44). There was no difference in anxiety levels between the groups at intervention or after 6 weeks. Overall, patients found the MM acceptable, and 18 women (86%) in the intervention group and 12 (60%) in the control group stated they would prefer this style of informed consent in the future. Use of an MM enhances the informed consent process by improving patient knowledge, in the short term, without increasing anxiety. Copyright © 2014 AAGL. Published by Elsevier Inc. All rights reserved.

  18. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Mordechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  19. Performance analysis of spectral-phase-encoded optical code-division multiple-access system regarding the incorrectly decoded signal as a nonstationary random process

    Science.gov (United States)

    Yan, Meng; Yao, Minyu; Zhang, Hongming

    2005-11-01

    The performance of a spectral-phase-encoded (SPE) optical code-division multiple-access (OCDMA) system is analyzed. Regarding the incorrectly decoded signal (IDS) as a nonstationary random process, we derive a novel probability distribution for it. The probability distribution of the IDS is considered a chi-squared distribution with degrees of freedom r=1, which is more reasonable and accurate than in previous work. The bit error rate (BER) of an SPE OCDMA system under multiple-access interference is evaluated. Numerical results show that the system can sustain very low BER even when there are multiple simultaneous users, and as the code length becomes longer or the initial pulse becomes shorter, the system performs better.
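
As a toy check of the distributional claim above (not the paper's BER analysis): a chi-squared variable with r = 1 degree of freedom is the square of a standard normal, so the probability that such a quantity exceeds a decision threshold has a closed form that a Monte Carlo estimate should reproduce.

```python
# Hedged toy illustration: if a quantity is chi-squared distributed with one
# degree of freedom, it is the square of a standard normal Z, so
# P(Z^2 > t) = erfc(sqrt(t/2)). We compare this closed form against a
# Monte Carlo estimate. Thresholds and sample sizes are illustrative.
import math
import random

def tail_exact(t):
    """P(chi2 with 1 d.o.f. > t) in closed form."""
    return math.erfc(math.sqrt(t / 2.0))

def tail_monte_carlo(t, n=200000, seed=1):
    """Estimate the same tail probability by sampling Z^2."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) ** 2 > t)
    return hits / n
```

For t = 1 the exact tail is about 0.317 (the familiar probability that |Z| > 1), and the Monte Carlo estimate should agree to within sampling error.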

  20. Load-Dependent Interference of Deep Brain Stimulation of the Subthalamic Nucleus with Switching from Automatic to Controlled Processing During Random Number Generation in Parkinson's Disease.

    Science.gov (United States)

    Williams, Isobel Anne; Wilkinson, Leonora; Limousin, Patricia; Jahanshahi, Marjan

    2015-01-01

    Deep brain stimulation of the subthalamic nucleus (STN DBS) ameliorates the motor symptoms of Parkinson's disease (PD). However, some aspects of executive control are impaired with STN DBS. We tested the predictions that (i) STN DBS interferes with switching from automatic to controlled processing during fast-paced random number generation (RNG), and (ii) STN DBS-induced cognitive control changes are load-dependent. Fifteen PD patients with bilateral STN DBS performed paced RNG under three levels of cognitive load, synchronised with a pacing stimulus presented at 1, 0.5 and 0.33 Hz (faster rates require greater cognitive control), with DBS on or off. Measures of output randomness were calculated. Countscore 1 (CS1) indicates habitual counting in steps of one. Countscore 2 (CS2) indicates a more controlled strategy of counting in twos. The fastest rate was associated with an increased CS1 score with STN DBS on compared to off. At the slowest rate, patients had higher CS2 scores with DBS off than on, such that the differences between CS1 and CS2 scores disappeared. We provide evidence for a load-dependent effect of STN DBS on paced RNG in PD. Patients could switch to more controlled RNG strategies during conditions of low cognitive load at slower rates only when the STN stimulators were off, but when STN stimulation was on, they engaged in more automatic habitual counting under increased cognitive load. These findings are consistent with the proposal that the STN implements a switch signal from the medial frontal cortex which enables a shift from automatic to controlled processing.
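
A minimal sketch of count scores of the kind described above. Scoring conventions differ across the RNG literature, so this simple proportion-based variant is an assumption for illustration, not the paper's exact index:

```python
# Hedged sketch: a count score for step size k is taken here as the proportion
# of successive response pairs differing by exactly k (wrap-around ignored).
# CS1 captures habitual counting in ones; CS2 captures counting in twos.

def count_score(sequence, step):
    """Proportion of adjacent pairs in `sequence` that differ by `step`."""
    pairs = list(zip(sequence, sequence[1:]))
    if not pairs:
        return 0.0
    hits = sum(1 for a, b in pairs if abs(b - a) == step)
    return hits / len(pairs)

habitual = [1, 2, 3, 4, 5, 6]   # pure counting in ones -> CS1 = 1.0
controlled = [1, 3, 5, 7, 9]    # counting in twos      -> CS2 = 1.0
```

On a genuinely random sequence both scores should be low, so elevated CS1 under load (as reported with stimulation on) signals a fallback to automatic counting.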

  1. Meaning in meaninglessness: The propensity to perceive meaningful patterns in coincident events and randomly arranged stimuli is linked to enhanced attention in early sensory processing.

    Science.gov (United States)

    Rominger, Christian; Schulter, Günter; Fink, Andreas; Weiss, Elisabeth M; Papousek, Ilona

    2018-05-01

    Perception of objectively independent events or stimuli as being significantly connected and the associated proneness to perceive meaningful patterns constitute part of the positive symptoms of schizophrenia, which are associated with altered attentional processes in lateralized speech perception. Since perceiving meaningful patterns is to some extent already prevalent in the general population, the aim of the study was to investigate whether the propensity to experience meaningful patterns in co-occurring events and random stimuli may be associated with similar altered attentional processes in lateralized speech perception. Self-reported and behavioral indicators of the perception of meaningful patterns were assessed in non-clinical individuals, along with EEG auditory evoked potentials during the performance of an attention related lateralized speech perception task (Dichotic Listening Test). A greater propensity to perceive meaningful patterns was associated with higher N1 amplitudes of the evoked potentials to the onset of the dichotically presented consonant-vowel syllables, indicating enhanced automatic attention in early sensory processing. The study suggests that more basic mechanisms in how people associate events may play a greater role in the cognitive biases that are manifest in personality expressions such as positive schizotypy, rather than that positive schizotypy moderates these cognitive biases directly. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Almond Consumption and Processing Affects the Composition of the Gastrointestinal Microbiota of Healthy Adult Men and Women: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Hannah D. Holscher

    2018-01-01

    Full Text Available Background: Almond processing has been shown to differentially impact metabolizable energy; however, the effect of food form on the gastrointestinal microbiota is under-investigated. Objective: We aimed to assess the interrelationship of almond consumption and processing on the gastrointestinal microbiota. Design: A controlled-feeding, randomized, five-period, crossover study with washouts between diet periods was conducted in healthy adults (n = 18). Treatments included: (1) zero servings/day of almonds (control); (2) 1.5 servings (42 g)/day of whole almonds; (3) 1.5 servings/day of whole, roasted almonds; (4) 1.5 servings/day of roasted, chopped almonds; and (5) 1.5 servings/day of almond butter. Fecal samples were collected at the end of each three-week diet period. Results: Almond consumption increased the relative abundances of Lachnospira, Roseburia, and Dialister (p ≤ 0.05). Comparisons between control and the four almond treatments revealed that chopped almonds increased Lachnospira, Roseburia, and Oscillospira compared to control (p < 0.05), while whole almonds increased Dialister compared to control (p = 0.007). There were no differences between almond butter and control. Conclusions: These results reveal that almond consumption induced changes in the microbial community composition of the human gastrointestinal microbiota. Furthermore, the degree of almond processing (e.g., roasting, chopping, and grinding into butter) differentially impacted the relative abundances of bacterial genera.

  3. Impact of acute administration of escitalopram on the processing of emotional and neutral images: a randomized crossover fMRI study of healthy women.

    Science.gov (United States)

    Outhred, Tim; Das, Pritha; Felmingham, Kim L; Bryant, Richard A; Nathan, Pradeep J; Malhi, Gin S; Kemp, Andrew H

    2014-07-01

    Acute neural effects of antidepressant medication on emotion processing biases may provide the foundation on which clinical outcomes are based. Along with effects on positive and negative stimuli, acute effects on neutral stimuli may also relate to antidepressant efficacy, yet these effects are still to be investigated. The present study therefore examined the impact of a single dose of the selective serotonin reuptake inhibitor escitalopram (20 mg) on positive, negative and neutral stimuli using pharmaco-fMRI. Within a double-blind, randomized, placebo-controlled crossover design, healthy women completed 2 sessions of treatment administration and fMRI scanning separated by a 1-week washout period. We enrolled 36 women in our study. When participants were administered escitalopram relative to placebo, left amygdala activity was increased and right inferior frontal gyrus (IFG) activity was decreased during presentation of positive pictures (potentiation of positive emotion processing). In contrast, escitalopram was associated with decreased left amygdala and increased right IFG activity during presentation of negative pictures (attenuation of negative emotion processing). In addition, escitalopram decreased right IFG activity during the processing of neutral stimuli, akin to the effects on positive stimuli (decrease in negative appraisal). Although we used a women-only sample to reduce heterogeneity, our results may not generalize to men. Potential unblinding, which was related to the subjective occurrence of side effects, occurred in the study; however, manipulation check analyses demonstrated that results were not impacted. These novel findings demonstrate that a single dose of the commonly prescribed escitalopram facilitates a positive information processing bias. These findings provide an important lead for better understanding effects of antidepressant medication.

  4. Effects of a Community-Based, Post-Rehabilitation Exercise Program in COPD: Protocol for a Randomized Controlled Trial With Embedded Process Evaluation.

    Science.gov (United States)

    Desveaux, Laura; Beauchamp, Marla K; Lee, Annemarie; Ivers, Noah; Goldstein, Roger; Brooks, Dina

    2016-05-11

    This manuscript (1) outlines the intervention, (2) describes how its effectiveness is being evaluated in a pragmatic randomized controlled trial, and (3) summarizes the embedded process evaluation aiming to understand key barriers and facilitators for implementation in new environments. Participating centers refer eligible individuals with COPD following discharge from their local PR program. Consenting patients are assigned to a year-long community exercise program or usual care using block randomization and stratifying for supplemental oxygen use. Patients in the intervention arm are asked to attend an exercise session at least twice per week at their local community facility where their progress is supervised by a case manager. Each exercise session includes a component of aerobic exercise, and activities designed to optimize balance, flexibility, and strength. All study participants will have access to routine follow-up appointments with their respiratory physician, and additional health care providers as part of their usual care. Assessments will be completed at baseline (post-PR), 6, and 12 months, and include measures of functional exercise capacity, quality of life, self-efficacy, and health care usage. Intervention effectiveness will be assessed by comparing functional exercise capacity between intervention and control groups. A mixed-methods process evaluation will be conducted to better understand intervention implementation, guided by Normalization Process Theory and the Consolidated Framework for Implementation Research. Based on results from our pilot work, we anticipate a maintenance of exercise capacity and improved health-related quality of life in the intervention group, compared with a decline in exercise capacity in the usual care group. 
Findings from this study will improve our understanding of the effectiveness of community-based exercise programs for maintaining benefits following PR in patients with COPD and provide information on how best

  5. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to
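
As a toy illustration of this setting (all parameters, the ring geometry, and the occupancy-dependent jump law below are illustrative choices, not the paper's model): a Poisson cloud of independent simple random walks forms the dynamic environment, and at each step the walker makes a nearest-neighbour jump whose law depends on whether its current site is occupied.

```python
# Hedged toy simulation of a random walk in a dynamic random environment of
# independent simple random walks started from a Poisson field. The jump
# probabilities p_occ/p_vac and the finite ring are illustrative assumptions.
import math
import random

def simulate(steps=200, n_sites=101, density=1.0, p_occ=0.8, p_vac=0.3, seed=3):
    rng = random.Random(seed)
    # Environment: Poisson(density) particles per site of a ring.
    particles = []
    for site in range(n_sites):
        # Poisson sampling via Knuth's inversion method.
        k, p, threshold = 0, 1.0, math.exp(-density)
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        particles.extend([site] * k)
    walker = n_sites // 2
    for _ in range(steps):
        # Nearest-neighbour jump whose law depends on the local environment.
        p_right = p_occ if walker in set(particles) else p_vac
        walker = (walker + (1 if rng.random() < p_right else -1)) % n_sites
        # Each environment particle takes one simple random walk step.
        particles = [(x + rng.choice((-1, 1))) % n_sites for x in particles]
    return walker
```

Questions like the law of large numbers for the walker's displacement, which the paper studies rigorously, correspond here to averaging the final position over many independent runs.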

  6. Tri-state resistive switching characteristics of MnO/Ta2O5 resistive random access memory device by a controllable reset process

    Science.gov (United States)

    Lee, N. J.; Kang, T. S.; Hu, Q.; Lee, T. S.; Yoon, T.-S.; Lee, H. H.; Yoo, E. J.; Choi, Y. J.; Kang, C. J.

    2018-06-01

    Tri-state resistive switching characteristics of bilayer resistive random access memory devices based on manganese oxide (MnO)/tantalum oxide (Ta2O5) have been studied. The current–voltage (I–V) characteristics of the Ag/MnO/Ta2O5/Pt device show tri-state resistive switching (RS) behavior with a high resistance state (HRS), intermediate resistance state (IRS), and low resistance state (LRS), which are controlled by the reset process. The MnO/Ta2O5 film shows bipolar RS behavior through the formation and rupture of conducting filaments without the forming process. The device shows reproducible and stable RS both from the HRS to the LRS and from the IRS to the LRS. In order to elucidate the tri-state RS mechanism in the Ag/MnO/Ta2O5/Pt device, transmission electron microscope (TEM) images are acquired in the LRS, IRS, and HRS. Dendrite-like white lines are observed in the Ta2O5 film in both the LRS and the IRS. Poole–Frenkel conduction, space charge limited conduction, and Ohmic conduction are proposed as the dominant conduction mechanisms for the Ag/MnO/Ta2O5/Pt device based on the obtained I–V characteristics and TEM images.

  7. Material insights of HfO2-based integrated 1-transistor-1-resistor resistive random access memory devices processed by batch atomic layer deposition.

    Science.gov (United States)

    Niu, Gang; Kim, Hee-Dong; Roelofs, Robin; Perez, Eduardo; Schubert, Markus Andreas; Zaumseil, Peter; Costina, Ioan; Wenger, Christian

    2016-06-17

    With the continuous scaling of resistive random access memory (RRAM) devices, an in-depth understanding of the physical mechanism and the material issues, particularly gained by directly studying integrated cells, becomes more and more important for further improving device performance. In this work, HfO2-based integrated 1-transistor-1-resistor (1T1R) RRAM devices were processed in a standard 0.25 μm complementary metal-oxide-semiconductor (CMOS) process line, using a batch atomic layer deposition (ALD) tool particularly designed for mass production. We demonstrate a systematic study of TiN/Ti/HfO2/TiN/Si RRAM devices that correlates key material factors (nano-crystallites and carbon impurities) with the filament-type resistive switching (RS) behaviour. Increasing the density of nano-crystallites in the film increases the forming voltage of devices and its variation. Carbon residues in HfO2 films turn out to be an even more significant factor, strongly impacting the RS behaviour. A relatively high deposition temperature of 300 °C dramatically reduces the residual carbon concentration, leading to enhanced RS performance, including lower power consumption, better endurance, and higher reliability. Such a thorough understanding of the physical mechanism of RS and of the correlation between material quality and device performance will facilitate the realization of high-density, reliable embedded RRAM devices with low power consumption.

  8. On the role of heat and mass transfer into laser processability during selective laser melting AlSi12 alloy based on a randomly packed powder-bed

    Science.gov (United States)

    Wang, Lianfeng; Yan, Biao; Guo, Lijie; Gu, Dongdong

    2018-04-01

    A new transient mesoscopic model with a randomly packed powder bed is proposed to investigate heat and mass transfer and laser processing quality between neighboring tracks during selective laser melting (SLM) of AlSi12 alloy, using the finite volume method (FVM) and accounting for the solid/liquid phase transition, temperature-dependent material properties, and interfacial forces. The results reveal that both the operating temperature and the resulting cooling rate are elevated by increasing the laser power. Accordingly, the viscosity of the liquid is significantly reduced at large laser power and the melt is characterized by a large velocity, which tends to produce a more intensive convection within the pool. In this case, sufficient heat and mass transfer occurs at the interface between the previously fabricated tracks and the track currently being built, giving strong spreading between neighboring tracks and a resultant high-quality surface without obvious porosity. By contrast, the surface quality of SLM-processed components at relatively low laser power is notably degraded due to the limited, insufficient heat and mass transfer at the interface of neighboring tracks. Furthermore, the experimentally acquired morphologies of the top surface are in full accordance with the simulated results.

  9. Comparing Acceptance and Commitment Group Therapy and 12-Steps Narcotics Anonymous in Addict’s Rehabilitation Process: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Manoochehr Azkhosh

    2016-12-01

    Full Text Available Objective: Substance abuse is a socio-psychological disorder. The aim of this study was to compare the effectiveness of acceptance and commitment therapy with the 12-steps Narcotics Anonymous program on the psychological well-being of opiate-dependent individuals in addiction treatment centers in Shiraz, Iran. Method: This was a randomized controlled trial. Data were collected at entry into the study and at post-test and follow-up visits. The participants were selected from opiate-addicted individuals who were referred to addiction treatment centers in Shiraz. Sixty individuals were evaluated according to inclusion/exclusion criteria and were randomly divided into three equal groups (20 participants per group). One group received acceptance and commitment group therapy (twelve 90-minute sessions), the second group was provided with the 12-steps Narcotics Anonymous program, and the control group received the usual methadone maintenance treatment. During the treatment process, seven participants dropped out. Data were collected using the psychological well-being questionnaire and the AAQ questionnaire in the three groups at pre-test, post-test, and follow-up visits. Data were analyzed using repeated-measures analysis of variance. Results: Repeated-measures analysis of variance revealed that the mean difference between the three groups was significant (P<0.05) and that the acceptance and commitment therapy group showed improvement relative to the NA and control groups on psychological well-being and psychological flexibility. Conclusion: The results of this study revealed that acceptance and commitment therapy can be helpful in enhancing positive emotions and increasing the psychological well-being of addicts who seek treatment.

  10. Shamba Maisha: Pilot agricultural intervention for food security and HIV health outcomes in Kenya: design, methods, baseline results and process evaluation of a cluster-randomized controlled trial.

    Science.gov (United States)

    Cohen, Craig R; Steinfeld, Rachel L; Weke, Elly; Bukusi, Elizabeth A; Hatcher, Abigail M; Shiboski, Stephen; Rheingans, Richard; Scow, Kate M; Butler, Lisa M; Otieno, Phelgona; Dworkin, Shari L; Weiser, Sheri D

    2015-01-01

    Despite advances in the treatment of people living with HIV, morbidity and mortality remain unacceptably high in sub-Saharan Africa, largely due to parallel epidemics of poverty and food insecurity. We conducted a pilot cluster randomized controlled trial (RCT) of a multisectoral agricultural and microfinance intervention (entitled Shamba Maisha) designed to improve food security, household wealth, HIV clinical outcomes, and women's empowerment. The intervention was carried out at two HIV clinics in Kenya, one randomized to the intervention arm and one to the control arm. HIV-infected patients >18 years, on antiretroviral therapy, with moderate/severe food insecurity and/or body mass index (BMI) <18.5 were eligible. The intervention included 1) a loan (~$150) to purchase the farming commodities, 2) a micro-irrigation pump, seeds, and fertilizer, and 3) trainings in sustainable agricultural practices and financial literacy. Enrollment of 140 participants took four months, and the screening-to-enrollment ratio was similar between arms. We followed participants for 12 months and conducted structured questionnaires. We also conducted a process evaluation with participants and stakeholders 3-5 months after study start and at study end. Baseline results revealed that participants at the two sites were similar in age, gender, and marital status. A greater proportion of participants at the intervention site had a low BMI in comparison to participants at the control site (18% vs. 7%, p = 0.054). While median CD4 count was similar between arms, a greater proportion of participants enrolled in the intervention arm had a detectable HIV viral load compared with control participants (49% vs. 28%, respectively). Challenges identified in the process evaluation included the loans, agricultural challenges due to weather patterns, and a challenging partnership with the microfinance institution. We expect the results from this pilot study to provide useful data on the impacts of livelihood interventions and to help in the design of a definitive cluster RCT. This trial is registered at ClinicalTrials.gov.

  11. Nonlinear transformations of random processes

    CERN Document Server

    Deutsch, Ralph

    2017-01-01

    This concise treatment of nonlinear noise techniques encountered in system applications is suitable for advanced undergraduates and graduate students. It is also a valuable reference for systems analysts and communication engineers. 1962 edition.

  12. Random magnetism

    International Nuclear Information System (INIS)

    Tahir-Kheli, R.A.

    1975-01-01

    A few simple problems relating to random magnetic systems are presented. Translational symmetry, only on the macroscopic scale, is assumed for these systems. A random set of parameters, on the microscopic scale, for the various regions of these systems is also assumed. A probability distribution for the randomness is obeyed. Knowledge of the form of these probability distributions is assumed in all cases [pt

  13. Dynamical replica analysis of processes on finitely connected random graphs: II. Dynamics in the Griffiths phase of the diluted Ising ferromagnet

    International Nuclear Information System (INIS)

    Mozeika, A; Coolen, A C C

    2009-01-01

    We study the Glauber dynamics of Ising spin models with random bonds, on finitely connected random graphs. We generalize a recent dynamical replica theory with which to predict the evolution of the joint spin-field distribution, to include random graphs with arbitrary degree distributions. The theory is applied to Ising ferromagnets on randomly diluted Bethe lattices, where we study the evolution of the magnetization and the internal energy. It predicts a prominent slowing down of the flow in the Griffiths phase, it suggests a further dynamical transition at lower temperatures within the Griffiths phase, and it is verified quantitatively by the results of Monte Carlo simulations

  14. A Randomized Controlled Clinical Trial of Dialogical Exposure Therapy versus Cognitive Processing Therapy for Adult Outpatients Suffering from PTSD after Type I Trauma in Adulthood.

    Science.gov (United States)

    Butollo, Willi; Karl, Regina; König, Julia; Rosner, Rita

    2016-01-01

    Although there are effective treatments for posttraumatic stress disorder (PTSD), there is little research on treatments with non-cognitive-behavioural backgrounds, such as gestalt therapy. We tested an integrative gestalt-derived intervention, dialogical exposure therapy (DET), against an established cognitive-behavioural treatment (cognitive processing therapy, CPT) for possible differential effects in terms of symptomatic outcome and dropout rates. We randomized 141 treatment-seeking individuals with a diagnosis of PTSD to receive either DET or CPT. Therapy length in both treatments was flexible with a maximum duration of 24 sessions. Dropout rates were 12.2% in DET and 14.9% in CPT. Patients in both conditions achieved significant and large reductions in PTSD symptoms (Impact of Event Scale - Revised; Hedges' g = 1.14 for DET and d = 1.57 for CPT) which were largely stable at the 6-month follow-up. At the posttreatment assessment, CPT performed statistically better than DET on symptom and cognition measures. For several outcome measures, younger patients benefited more from CPT than older ones, while there was no age effect for DET. Our results indicate that DET merits further research and may be an alternative to established treatments for PTSD. It remains to be seen whether DET confers advantages in areas of functioning beyond PTSD symptoms. © 2015 S. Karger AG, Basel.

  15. Qualitative insights into implementation, processes, and outcomes of a randomized trial on peer support and HIV care engagement in Rakai, Uganda.

    Science.gov (United States)

    Monroe, April; Nakigozi, Gertrude; Ddaaki, William; Bazaale, Jeremiah Mulamba; Gray, Ronald H; Wawer, Maria J; Reynolds, Steven J; Kennedy, Caitlin E; Chang, Larry W

    2017-01-10

    People living with human immunodeficiency virus (HIV) who have not yet initiated antiretroviral therapy (ART) can benefit from being engaged in care and utilizing preventive interventions. Community-based peer support may be an effective approach to promote these important HIV services. After conducting a randomized trial of the impact of peer support on pre-ART outcomes, we conducted a qualitative evaluation to better understand trial implementation, processes, and results. Overall, 75 participants, including trial participants (clients), peer supporters, and clinic staff, participated in 41 in-depth interviews and 6 focus group discussions. A situated Information, Motivation, and Behavioral skills (IMB) model of behavior change was used to develop semi-structured interview and focus group guides. Transcripts were coded and thematically synthesized. We found that participant narratives were generally consistent with the theoretical model, indicating that peer support improved information, motivation, and behavioral skills, leading to increased engagement in pre-ART care. Clients described how peer supporters reinforced health messages and helped them better understand complicated health information. Peer supporters also helped clients navigate the health system, develop support networks, and identify strategies for remembering medication and clinic appointments. Some peer supporters adopted roles beyond visiting patients, serving as a bridge between the client and his or her family, community, and health system. Qualitative results demonstrated plausible processes by which peer support improved client engagement in care, cotrimoxazole use, and safe water vessel use. Challenges identified included insufficient messaging surrounding ART initiation, lack of care continuity after ART initiation, rare breaches in confidentiality, and structural challenges. The evaluation found largely positive perceptions of the peer intervention across stakeholders and provided valuable

  16. A note on asymptotic expansions for sums over a weakly dependent random field with application to the Poisson and Strauss processes

    DEFF Research Database (Denmark)

    Jensen, J.L.

    1993-01-01

    Previous results on Edgeworth expansions for sums over a random field are extended to the case where the strong mixing coefficient depends not only on the distance between two sets of random variables, but also on the size of the two sets. The results are applied to the Poisson and the Strauss...

  17. The Prevention Program for Externalizing Problem Behavior (PEP) Improves Child Behavior by Reducing Negative Parenting: Analysis of Mediating Processes in a Randomized Controlled Trial

    Science.gov (United States)

    Hanisch, Charlotte; Hautmann, Christopher; Plück, Julia; Eichelberger, Ilka; Döpfner, Manfred

    2014-01-01

    Background: Our indicated Prevention program for preschool children with Externalizing Problem behavior (PEP) demonstrated improved parenting and child problem behavior in a randomized controlled efficacy trial and in a study with an effectiveness design. The aim of the present analysis of data from the randomized controlled trial was to identify…

  18. Supporting health care professionals to improve the processes of shared decision making and self-management in a web-based intervention: randomized controlled trial.

    Science.gov (United States)

    Sassen, Barbara; Kok, Gerjo; Schepers, Jan; Vanhees, Luc

    2014-10-21

    Research to assess the effect of interventions to improve the processes of shared decision making and self-management directed at health care professionals is limited. Using the protocol of Intervention Mapping, a Web-based intervention directed at health care professionals was developed to complement and optimize health services in patient-centered care. The objective of the Web-based intervention was to increase health care professionals' intention and encouraging behavior toward patient self-management, following cardiovascular risk management guidelines. A randomized controlled trial was used to assess the effect of a theory-based intervention, using a pre-test and post-test design. The intervention website consisted of a module to help improve professionals' behavior, a module to increase patients' intention and risk-reduction behavior toward cardiovascular risk, and a parallel module with a support system for the health care professionals. Health care professionals (n=69) were recruited online and randomly allocated to the intervention group (n=26) or (waiting list) control group (n=43), and invited their patients to participate. The outcome was improved professional behavior toward health education, and was self-assessed through questionnaires based on the Theory of Planned Behavior. Social-cognitive determinants, intention and behavior were measured pre-intervention and at 1-year follow-up. The module to improve professionals' behavior was used by 45% (19/42) of the health care professionals in the intervention group. The module to support the health professional in encouraging behavior toward patients was used by 48% (20/42). The module to improve patients' risk-reduction behavior was provided to 44% (24/54) of patients. In 1 of every 5 patients, the guideline for cardiovascular risk management was used. The Web-based intervention was poorly used. 
In the intervention group, no differences in social-cognitive determinants, intention and behavior were found

  19. A web-based tool to support shared decision making for people with a psychotic disorder: randomized controlled trial and process evaluation.

    Science.gov (United States)

    van der Krieke, Lian; Emerencia, Ando C; Boonstra, Nynke; Wunderink, Lex; de Jonge, Peter; Sytema, Sjoerd

    2013-10-07

    Mental health policy makers encourage the development of electronic decision aids to increase patient participation in medical decision making. Evidence is needed to determine whether these decision aids are helpful in clinical practice and whether they lead to increased patient involvement and better outcomes. This study reports the outcome of a randomized controlled trial and process evaluation of a Web-based intervention to facilitate shared decision making for people with psychotic disorders. The study was carried out in a Dutch mental health institution. Patients were recruited from 2 outpatient teams for patients with psychosis (N=250). Patients in the intervention condition (n=124) were provided an account to access a Web-based information and decision tool aimed to support patients in acquiring an overview of their needs and appropriate treatment options provided by their mental health care organization. Patients were given the opportunity to use the Web-based tool either on their own (at their home computer or at a computer of the service) or with the support of an assistant. Patients in the control group received care as usual (n=126). Half of the patients in the sample were patients experiencing a first episode of psychosis; the other half were patients with a chronic psychosis. Primary outcome was patient-perceived involvement in medical decision making, measured with the Combined Outcome Measure for Risk Communication and Treatment Decision-making Effectiveness (COMRADE). Process evaluation consisted of questionnaire-based surveys, open interviews, and researcher observation. In all, 73 patients completed the follow-up measurement and were included in the final analysis (response rate 29.2%). More than one-third (48/124, 38.7%) of the patients who were provided access to the Web-based decision aid used it, and most used its full functionality. 
No differences were found between the intervention and control conditions on perceived involvement in medical

  20. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
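One of the simplest checks used when evaluating such samples is the frequency (monobit) test from the NIST SP 800-22 suite; the minimal sketch below illustrates the idea and is not taken from this record.

```python
import math

def monobit_test(bits):
    """NIST SP 800-22-style frequency (monobit) test.
    Returns a p-value; small values suggest the sequence is non-random."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)     # +1 for each 1, -1 for each 0
    s_obs = abs(s) / math.sqrt(n)             # normalized excess of ones
    return math.erfc(s_obs / math.sqrt(2))    # two-sided Gaussian tail
```

A perfectly balanced stream such as `[0, 1] * 500` yields a p-value of 1.0, while a constant stream of 1000 ones yields a p-value indistinguishable from 0; real test suites apply many such statistics, not just this one.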

  1. Certified randomness in quantum physics.

    Science.gov (United States)

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
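The Bell-inequality violation underlying these certified generators can be illustrated with the CHSH combination. The sketch below assumes singlet-state correlations E(a, b) = -cos(a - b) and particular measurement angles; it is a textbook illustration, not any specific device-independent protocol from the review.

```python
import math

def chsh_value(angles):
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')
    for singlet-state correlations E(a,b) = -cos(a - b)."""
    a, a2, b, b2 = angles
    E = lambda x, y: -math.cos(x - y)
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

# Standard optimal angles reach |S| = 2*sqrt(2) (Tsirelson's bound),
# exceeding the classical limit |S| <= 2.
S = chsh_value((0, math.pi / 2, math.pi / 4, -math.pi / 4))
```

Observing |S| > 2 in an experiment certifies that the outcomes could not have been pre-programmed, which is what allows randomness to be certified without modelling the devices.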

  2. Quantum random number generator

    Science.gov (United States)

    Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin

    2001-03-01

    Since reflection or transmission of a quantum particle at a beamsplitter is an inherently random quantum process, a device built on this principle suffers from the drawbacks of neither pseudo-random computer generators nor classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied; fortunately, in a quantum-optics realization they can be well controlled. We present a simple random number generator based on the division of weak light pulses at a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.

  3. The impact of Cognitive Processing Therapy on stigma among survivors of sexual violence in eastern Democratic Republic of Congo: results from a cluster randomized controlled trial.

    Science.gov (United States)

    Murray, S M; Augustinavicius, J; Kaysen, D; Rao, D; Murray, L K; Wachter, K; Annan, J; Falb, K; Bolton, P; Bass, J K

    2018-01-01

    Sexual violence is associated with a multitude of poor physical, emotional, and social outcomes. Despite reports of stigma by sexual violence survivors, limited evidence exists on effective strategies to reduce stigma, particularly in conflict-affected settings. We sought to assess the effect of group Cognitive Processing Therapy (CPT) on stigma and the extent to which stigma might moderate the effectiveness of CPT in treating mental health problems among survivors of sexual violence in the Democratic Republic of Congo. Data were drawn from 405 adult female survivors of sexual violence reporting mental distress and poor functioning in North and South Kivu. Women were recruited through organizations providing psychosocial support and then cluster randomized to group CPT or individual support. Women were assessed at baseline, the end of treatment, and again six months later. Assessors were masked to women's treatment assignment. Linear mixed-effect regression models were used to estimate (1) the effect of CPT on feelings of perceived and internalized (felt) stigma, and (2) whether felt stigma and discrimination (enacted stigma) moderated the effects of CPT on combined depression and anxiety symptoms, posttraumatic stress, and functional impairment. Participants receiving CPT experienced moderate reductions in felt stigma relative to those in individual support (Cohen's d = 0.44, p-value = 0.02) following the end of treatment, though this difference was no longer significant six months later (Cohen's d = 0.45, p-value = 0.12). Neither felt nor enacted stigma significantly moderated the effect of CPT on mental health symptoms or functional impairment. Group cognitive-behavioral based therapies may be an effective stigma reduction tool for survivors of sexual violence. Experiences and perceptions of stigma did not hinder therapeutic effects of group psychotherapy on survivors' mental health. ClinicalTrials.gov NCT01385163.

  4. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
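The scheme described above can be sketched in a few lines of Python. The arm labels and the candidate block sizes are illustrative choices for the example, not values prescribed by the paper.

```python
import random

def blocked_randomization(n, arms=("treatment", "control"),
                          block_sizes=(2, 4, 6), seed=None):
    """Allocate n participants to arms using balanced blocks whose
    sizes are drawn at random, so the next assignment stays unpredictable."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n:
        # Only block sizes divisible by the number of arms keep blocks balanced.
        size = rng.choice([s for s in block_sizes if s % len(arms) == 0])
        block = list(arms) * (size // len(arms))  # equal count per arm
        rng.shuffle(block)                        # random order within the block
        allocation.extend(block)
    return allocation[:n]                         # trim a possibly partial last block
```

Within every completed block the arms are exactly balanced, and because the block size itself is random, an unblinded investigator cannot deduce the final assignment of a block the way they could with a fixed block size.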

  5. Random vibrations theory and practice

    CERN Document Server

    Wirsching, Paul H; Ortiz, Keith

    1995-01-01

    Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena— from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...

  6. Random numbers from vacuum fluctuations

    International Nuclear Information System (INIS)

    Shi, Yicheng; Kurtsiefer, Christian; Chng, Brenda

    2016-01-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
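The post-processing step can be illustrated with a toy LFSR-based extractor that whitens a raw bit stream by XOR-ing it with the register's output. This is a sketch of the general idea only; the 16-bit taps below are a textbook maximal-length example, not the register or extraction scheme the authors used.

```python
def lfsr_stream(seed=0xACE1, taps=(16, 14, 13, 11)):
    """16-bit Fibonacci LFSR for x^16 + x^14 + x^13 + x^11 + 1
    (a maximal-length polynomial); yields one bit per step."""
    state = seed
    while True:
        bit = 0
        for t in taps:
            bit ^= (state >> (t - 1)) & 1   # XOR the tapped positions
        state = ((state << 1) | bit) & 0xFFFF
        yield bit

def extract(raw_bits, seed=0xACE1):
    """Whiten a raw bit stream by XOR-ing it with the LFSR output."""
    gen = lfsr_stream(seed)
    return [b ^ next(gen) for b in raw_bits]
```

Because XOR with the same keystream is an involution, applying `extract` twice with the same seed recovers the original stream; real extractors are chosen with information-theoretic guarantees rather than this toy construction.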

  7. Random numbers from vacuum fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com [Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117542 (Singapore); Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore); Chng, Brenda [Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore)

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  8. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)

  9. A Web-Based Tool to Support Shared Decision Making for People With a Psychotic Disorder: Randomized Controlled Trial and Process Evaluation

    Science.gov (United States)

    Emerencia, Ando C; Boonstra, Nynke; Wunderink, Lex; de Jonge, Peter; Sytema, Sjoerd

    2013-01-01

    Background Mental health policy makers encourage the development of electronic decision aids to increase patient participation in medical decision making. Evidence is needed to determine whether these decision aids are helpful in clinical practice and whether they lead to increased patient involvement and better outcomes. Objective This study reports the outcome of a randomized controlled trial and process evaluation of a Web-based intervention to facilitate shared decision making for people with psychotic disorders. Methods The study was carried out in a Dutch mental health institution. Patients were recruited from 2 outpatient teams for patients with psychosis (N=250). Patients in the intervention condition (n=124) were provided an account to access a Web-based information and decision tool aimed to support patients in acquiring an overview of their needs and appropriate treatment options provided by their mental health care organization. Patients were given the opportunity to use the Web-based tool either on their own (at their home computer or at a computer of the service) or with the support of an assistant. Patients in the control group received care as usual (n=126). Half of the patients in the sample were patients experiencing a first episode of psychosis; the other half were patients with a chronic psychosis. Primary outcome was patient-perceived involvement in medical decision making, measured with the Combined Outcome Measure for Risk Communication and Treatment Decision-making Effectiveness (COMRADE). Process evaluation consisted of questionnaire-based surveys, open interviews, and researcher observation. Results In all, 73 patients completed the follow-up measurement and were included in the final analysis (response rate 29.2%). More than one-third (48/124, 38.7%) of the patients who were provided access to the Web-based decision aid used it, and most used its full functionality. 
No differences were found between the intervention and control conditions

  10. Quantum randomness and unpredictability

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)

    2017-06-15

Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle, variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's ''symbolic'' derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  11. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1980-03-01

The 'ingredients' which control a phase transition in well-defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connecting this area with field theory, the theory of dynamical systems, etc. (Author) [pt

  12. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1981-01-01

The 'ingredients' which control a phase transition in well-defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' are the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connecting this area with field theory, the theory of dynamical systems, etc. (Author) [pt

  13. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune

Abstract This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing...

  14. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...

  15. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, R.; Brincker, Rune

    1998-01-01

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...

  16. Random dynamics

    International Nuclear Information System (INIS)

    Bennett, D.L.; Brene, N.; Nielsen, H.B.

    1986-06-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)

  17. Random dynamics

    International Nuclear Information System (INIS)

    Bennett, D.L.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: Gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)

  18. Random Dynamics

    Science.gov (United States)

    Bennett, D. L.; Brene, N.; Nielsen, H. B.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.

  19. On randomly interrupted diffusion

    International Nuclear Information System (INIS)

    Luczka, J.

    1993-01-01

Processes driven by randomly interrupted Gaussian white noise are considered. An evolution equation for single-event probability distributions is presented. Stationary states are considered as a solution of a second-order ordinary differential equation with two imposed conditions. A linear model is analyzed and its stationary distributions are explicitly given. (author). 10 refs
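The linear model mentioned in this record can be sketched numerically. The abstract does not spell out the equations, so the following is an assumed concrete instance: a linear relaxation driven by Gaussian white noise that is switched on and off by a dichotomous 0/1 process with Poisson switching (all parameter values are our own illustration):

```python
import numpy as np

# Assumed model (not the paper's exact equations): dx/dt = -gamma*x + I(t)*xi(t),
# where xi is Gaussian white noise of strength D and I(t) is a 0/1
# "interruption" process that switches state at Poisson rate nu.
rng = np.random.default_rng(0)
gamma, nu, D = 1.0, 0.5, 1.0        # relaxation rate, switching rate, noise strength
dt, n_steps = 1e-3, 200_000

x, on = 0.0, 1
xs = np.empty(n_steps)
for i in range(n_steps):
    if rng.random() < nu * dt:      # Poisson switching of the interruption state
        on = 1 - on
    x += -gamma * x * dt + on * np.sqrt(2 * D * dt) * rng.normal()
    xs[i] = x

# Because the noise is only active part of the time, the stationary variance
# is reduced relative to the uninterrupted Ornstein-Uhlenbeck value D/gamma.
print(round(xs[n_steps // 2:].var(), 3))
```

The second half of the trajectory is used as an estimate of the stationary state, discarding the transient.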

  20. A comparison of random walks in dependent random environments

    NARCIS (Netherlands)

    Scheinhardt, Willem R.W.; Kroese, Dirk

    2015-01-01

    Although the theoretical behavior of one-dimensional random walks in random environments is well understood, the actual evaluation of various characteristics of such processes has received relatively little attention. This paper develops new methodology for the exact computation of the drift in such
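For the baseline case of an independent (i.i.d.) environment, the drift has Solomon's classical closed form, which a direct simulation can be checked against before moving to the dependent environments the paper treats. The site probabilities and sizes below are our own illustration, not the paper's setup:

```python
import numpy as np

# 1D random walk in an i.i.d. random environment: each site x gets a
# right-step probability omega[x], drawn once and then fixed ("quenched").
# For i.i.d. environments the asymptotic speed is (1 - E[rho]) / (1 + E[rho])
# with rho = (1 - omega) / omega, which the empirical drift should approach.
rng = np.random.default_rng(1)
n_sites, n_steps = 2_000_000, 400_000
omega = rng.choice([0.6, 0.8], size=n_sites)   # the random environment

pos = n_sites // 2
for _ in range(n_steps):
    pos += 1 if rng.random() < omega[pos] else -1

v_hat = (pos - n_sites // 2) / n_steps
rho = (1 - omega) / omega
v_theory = (1 - rho.mean()) / (1 + rho.mean())
print(round(v_hat, 3), round(v_theory, 3))
```

With dependent environments no such closed form is generally available, which is what makes exact drift computation a research question in its own right.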

  1. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
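The core idea, randomness judged as statistical evidence for a random versus a regular generating process, can be illustrated with a toy version of such an inference (our own sketch, not the authors' model): score a binary sequence by the log-likelihood ratio between a fair coin and a simple repetition process.

```python
import math

# Toy "randomness as statistical inference" score: log P(seq | random) minus
# log P(seq | regular), where "random" is a fair coin and "regular" repeats
# the previous symbol with probability p_rep. Positive scores favour the
# random generating process; negative scores favour the regular one.
def randomness_score(seq, p_rep=0.8):
    log_random = len(seq) * math.log(0.5)
    log_regular = math.log(0.5)              # first symbol is unconstrained
    for prev, cur in zip(seq, seq[1:]):
        log_regular += math.log(p_rep if cur == prev else 1 - p_rep)
    return log_random - log_regular

# Eight heads in a row is strong evidence for a regular process, matching the
# intuition described above; a mixed sequence scores as evidence for chance.
print(randomness_score("HHHHHHHH"), randomness_score("HHTHTTHT"))
```

Richer models of "regularity" (simpler computing machines, restricted distribution families) slot into the same likelihood-ratio skeleton.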

  2. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

    to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3 span bridge. By applying both methods...... that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation...

  3. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

    1997-01-01

    to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3 span bridge. By applying both methods...... that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation...

  4. The clinical reasoning process in randomized clinical trials with patients with non-specific neck pain is incomplete: A systematic review.

    Science.gov (United States)

    Maissan, Francois; Pool, Jan; de Raaij, Edwin; Mollema, Jürgen; Ostelo, Raymond; Wittink, Harriet

    2018-06-01

Primarily to evaluate the completeness of the description of the clinical reasoning process in RCTs with patients with non-specific neck pain with an argued or diagnosed cause i.e. an impairment or activity limitation. Secondly, to determine the association between the completeness of the clinical reasoning process and the degree of risk of bias. PubMed, CINAHL and PEDro were systematically searched from inception to July 2016. RCTs (n = 122) with patients with non-specific neck pain receiving physiotherapy treatment published in English were included. Data extraction included study characteristics and important features of the clinical reasoning process based on the Hypothesis-Oriented Algorithm for Clinicians II (HOAC II). Thirty-seven studies (30%) had a complete clinical reasoning process of which 8 (6%) had a 'diagnosed cause' and 29 (24%) had an 'argued cause'. The Spearman's rho association between the extent of the clinical reasoning process and the risk of bias was -0.2. In the majority of studies (70%) the described clinical reasoning process was incomplete. A very small proportion (6%) had a 'diagnosed cause'. Therefore, a better methodological quality does not necessarily imply a better described clinical reasoning process. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Investigating the effect of a 3-month workplace-based pedometer-driven walking programme on health-related quality of life in meat processing workers: a feasibility study within a randomized controlled trial.

    Science.gov (United States)

    Mansi, Suliman; Milosavljevic, Stephan; Tumilty, Steve; Hendrick, Paul; Higgs, Chris; Baxter, David G

    2015-04-22

    In New Zealand, meat processing populations face many health problems as a result of the nature of work in meat processing industries. The primary aim of this study was to examine the feasibility of using a pedometer-based intervention to increase physical activity and improve health-related outcomes in a population of meat processing workers. A single-blinded randomized controlled trial (RCT) was conducted. A convenience sample of meat workers (n = 58; mean age 41.0 years; range: 18-65) participated in the trial. Participants were randomly allocated into two groups. Intervention participants (n = 29) utilized a pedometer to self monitor their activity, whilst undertaking a brief intervention, and educational material. Control participants (n = 29) received educational material only. The primary outcomes of ambulatory activity, and health-related quality of life, were evaluated at baseline, immediately following the 12-week intervention and three months post-intervention. Fifty three participants completed the program (91.3% adherence). Adherence with the intervention group was high, 93% (n = 27/29), and this group increased their mean daily step count from 5993 to 9792 steps per day, while the control group steps changed from 5788 to 6551 steps per day from baseline. This increase in step counts remained significant within the intervention group p workplace setting over the short term. Australian New Zealand Clinical Trials Registry (ANZCTR) ACTRN12613000087752.

  6. Levy flights and random searches

    Energy Technology Data Exchange (ETDEWEB)

    Raposo, E P [Laboratorio de Fisica Teorica e Computacional, Departamento de Fisica, Universidade Federal de Pernambuco, Recife-PE, 50670-901 (Brazil); Buldyrev, S V [Department of Physics, Yeshiva University, New York, 10033 (United States); Da Luz, M G E [Departamento de Fisica, Universidade Federal do Parana, Curitiba-PR, 81531-990 (Brazil); Viswanathan, G M [Instituto de Fisica, Universidade Federal de Alagoas, Maceio-AL, 57072-970 (Brazil); Stanley, H E [Center for Polymer Studies and Department of Physics, Boston University, Boston, MA 02215 (United States)

    2009-10-30

    In this work we discuss some recent contributions to the random search problem. Our analysis includes superdiffusive Levy processes and correlated random walks in several regimes of target site density, mobility and revisitability. We present results in the context of mean-field-like and closed-form average calculations, as well as numerical simulations. We then consider random searches performed in regular lattices and lattices with defects, and we discuss a necessary criterion for distinguishing true superdiffusion from correlated random walk processes. We invoke energy considerations in relation to critical survival states on the edge of extinction, and we analyze the emergence of Levy behavior in deterministic search walks. Finally, we comment on the random search problem in the context of biological foraging.
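A minimal simulation of the superdiffusive Lévy processes discussed here can be sketched as follows (illustrative, with assumed parameters): step lengths drawn from a truncated power law P(l) ~ l^(-mu), directions uniform, with mu near 2 being the regime the foraging literature highlights for sparse, revisitable targets.

```python
import numpy as np

rng = np.random.default_rng(2)

def levy_walk(n_steps, mu=2.0, l_min=1.0):
    # Inverse-transform sampling of the power law P(l) ~ l^(-mu), l >= l_min:
    # the survival function is (l / l_min)^(-(mu - 1)), so l = l_min * u^(-1/(mu-1)).
    u = rng.random(n_steps)
    lengths = l_min * u ** (-1.0 / (mu - 1.0))
    angles = rng.uniform(0, 2 * np.pi, n_steps)
    steps = lengths[:, None] * np.column_stack([np.cos(angles), np.sin(angles)])
    return np.cumsum(steps, axis=0)          # 2D search path

path = levy_walk(10_000)
# A few rare, very long jumps dominate the displacement, the hallmark that
# separates true superdiffusion from a correlated random walk.
```

Comparing such a path against a Gaussian-step walk of equal mean step length is the usual starting point for the Lévy-versus-correlated-walk criterion the abstract mentions.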

  7. Computer generation of random deviates

    International Nuclear Information System (INIS)

    Cormack, John

    1991-01-01

    The need for random deviates arises in many scientific applications. In medical physics, Monte Carlo simulations have been used in radiology, radiation therapy and nuclear medicine. Specific instances include the modelling of x-ray scattering processes and the addition of random noise to images or curves in order to assess the effects of various processing procedures. Reliable sources of random deviates with statistical properties indistinguishable from true random deviates are a fundamental necessity for such tasks. This paper provides a review of computer algorithms which can be used to generate uniform random deviates and other distributions of interest to medical physicists, along with a few caveats relating to various problems and pitfalls which can occur. Source code listings for the generators discussed (in FORTRAN, Turbo-PASCAL and Data General ASSEMBLER) are available on request from the authors. 27 refs., 3 tabs., 5 figs
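One classic algorithm of the kind such reviews cover is the Box-Muller transform, which turns pairs of independent uniform deviates into pairs of independent standard Gaussian deviates, the distribution most often needed for adding random noise to images or curves. A minimal sketch:

```python
import numpy as np

def box_muller(n, rng=None):
    # Map uniform deviates (u1, u2) to Gaussian deviates via
    # z1 = sqrt(-2 ln u1) cos(2 pi u2), z2 = sqrt(-2 ln u1) sin(2 pi u2).
    rng = rng or np.random.default_rng()
    u1 = 1.0 - rng.random(n)                 # shift to (0, 1] so log(u1) is finite
    u2 = rng.random(n)
    r = np.sqrt(-2.0 * np.log(u1))
    return r * np.cos(2 * np.pi * u2), r * np.sin(2 * np.pi * u2)

z1, z2 = box_muller(100_000, np.random.default_rng(3))
print(round(z1.mean(), 2), round(z1.std(), 2))   # close to 0 and 1
```

The quality of the output is only as good as the underlying uniform generator, which is exactly the caveat the review stresses.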

  8. Random tensors

    CERN Document Server

    Gurau, Razvan

    2017-01-01

    Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in Physics confirming long standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce today the need to find an underlying common framework of the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....

  9. MEMO--a mobile phone depression prevention intervention for adolescents: development process and postprogram findings on acceptability from a randomized controlled trial.

    Science.gov (United States)

    Whittaker, Robyn; Merry, Sally; Stasiak, Karolina; McDowell, Heather; Doherty, Iain; Shepherd, Matthew; Dorey, Enid; Parag, Varsha; Ameratunga, Shanthi; Rodgers, Anthony

    2012-01-24

    Prevention of the onset of depression in adolescence may prevent social dysfunction, teenage pregnancy, substance abuse, suicide, and mental health conditions in adulthood. New technologies allow delivery of prevention programs scalable to large and disparate populations. To develop and test the novel mobile phone delivery of a depression prevention intervention for adolescents. We describe the development of the intervention and the results of participants' self-reported satisfaction with the intervention. The intervention was developed from 15 key messages derived from cognitive behavioral therapy (CBT). The program was fully automated and delivered in 2 mobile phone messages/day for 9 weeks, with a mixture of text, video, and cartoon messages and a mobile website. Delivery modalities were guided by social cognitive theory and marketing principles. The intervention was compared with an attention control program of the same number and types of messages on different topics. A double-blind randomized controlled trial was undertaken in high schools in Auckland, New Zealand, from June 2009 to April 2011. A total of 1348 students (13-17 years of age) volunteered to participate at group sessions in schools, and 855 were eventually randomly assigned to groups. Of these, 835 (97.7%) self-completed follow-up questionnaires at postprogram interviews on satisfaction, perceived usefulness, and adherence to the intervention. Over three-quarters of participants viewed at least half of the messages and 90.7% (379/418) in the intervention group reported they would refer the program to a friend. Intervention group participants said the intervention helped them to be more positive (279/418, 66.7%) and to get rid of negative thoughts (210/418, 50.2%)--significantly higher than proportions in the control group. Key messages from CBT can be delivered by mobile phone, and young people report that these are helpful. 
Change in clinician-rated depression symptom scores from baseline to 12

  10. Potential use of the non-random distribution of N2 and N2O mole masses in the atmosphere as a tool for tracing atmospheric mixing and isotope fractionation processes

    International Nuclear Information System (INIS)

    Well, R.; Langel, R.; Reineking, A.

    2002-01-01

The variation in the natural abundance of ¹⁵N in atmospheric gas species is often used to determine the mixing of trace gases from different sources. With conventional budget calculations one unknown quantity can be determined if the remaining quantities are known. From ¹⁵N tracer studies in soils with highly enriched ¹⁵N-nitrate a procedure is known to calculate the mixing of atmospheric and soil-derived N₂ based on the measurement of the 30/28 and 29/28 ratios in gas samples collected from soil covers. Because of the non-random distribution of the mole masses ³⁰N₂, ²⁹N₂ and ²⁸N₂ in the mixing gas it is possible to calculate two quantities simultaneously, i.e. the mixing ratio of atmospheric and soil-derived N₂, and the isotopic signature of the soil-derived N₂. Routine standard measurements of laboratory air had suggested a non-random distribution of N₂ mole masses. The objective of this study was to investigate and explain the existence of non-random distributions of ¹⁵N¹⁵N, ¹⁴N¹⁵N and ¹⁴N¹⁴N in N₂ and N₂O in environmental samples. The calculation of theoretical isotope data resulting from hypothetical mixing of two sources differing in ¹⁵N natural abundance demonstrated that the deviation from an ideal random distribution of mole masses is not detectable with the current precision of mass spectrometry. ¹⁵N analysis of N₂ or N₂O was conducted with randomised and non-randomised replicate samples of different origin. ¹⁵N abundances as calculated from 29/28 ratios were generally higher in randomised samples. The differences between the treatments ranged between 0.05 and 0.17 δ‰ ¹⁵N. It was concluded that the observed randomisation effect is probably caused by ¹⁵N¹⁵N fractionation during environmental processes. (author)
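The "two quantities from two ratios" trick rests on the mole-mass frequencies being binomial within each pool but non-binomial in a mixture. A toy calculation makes this visible (the soil 15N fraction and mixing ratio below are our own example values, not the paper's data):

```python
# Within one well-mixed pool with 15N atom fraction a, random pairing gives
# binomial isotopologue frequencies for masses 30, 29, 28.
def isotopologues(a):
    return a * a, 2 * a * (1 - a), (1 - a) * (1 - a)   # 15N15N, 14N15N, 14N14N

a_atm, a_soil, f = 0.003663, 0.05, 0.5   # assumed example: air, enriched soil N2, 50:50 mix
m30, m29, m28 = (f * x + (1 - f) * y
                 for x, y in zip(isotopologues(a_atm), isotopologues(a_soil)))

# If the mixture were itself binomial, both measured ratios would imply the
# same 15N fraction: from 29/28, a = R29 / (2 + R29); from 30/28,
# a = sqrt(R30) / (1 + sqrt(R30)). For a mixture they disagree, and that
# discrepancy is the extra equation that lets two unknowns be solved at once.
R29, R30 = m29 / m28, m30 / m28
a_from_29 = R29 / (2 + R29)
a_from_30 = R30 ** 0.5 / (1 + R30 ** 0.5)
print(a_from_29, a_from_30)
```

The mass-30 estimate exceeds the mass-29 estimate because the enriched pool contributes ¹⁵N¹⁵N far in excess of what a single binomial pool at the mixture's average ¹⁵N fraction would produce.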

  11. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, through training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
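The two biases reported here, an excess of sequential pairs and a deficit of repeats, can be measured with simple pair counts (an illustrative sketch; the digit list is made up, not the study's data). In a truly uniform digit sequence a repeat of the previous digit is expected in 1/10 of adjacent pairs and a ±1 "sequential" step in 18/100 of them (digits 0 and 9 have only one neighbour).

```python
import random

def pair_stats(digits):
    # Fractions of adjacent pairs that are exact repeats or +/-1 steps.
    pairs = list(zip(digits, digits[1:]))
    repeats = sum(a == b for a, b in pairs) / len(pairs)
    sequential = sum(abs(a - b) == 1 for a, b in pairs) / len(pairs)
    return repeats, sequential

random.seed(0)
uniform = [random.randrange(10) for _ in range(100_000)]   # chance baseline
rep_u, seq_u = pair_stats(uniform)

human_like = [1, 2, 3, 5, 7, 8, 2, 4, 6, 1, 3, 9, 8, 5, 2, 7]  # made-up example
rep_h, seq_h = pair_stats(human_like)
print(rep_u, seq_u, rep_h, seq_h)
```

Comparing a subject's fractions against the chance baseline is the standard way such deviations from mathematical randomness are quantified.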

  12. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    Science.gov (United States)

    Howard, Roy

    2017-10-01

An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first undergraduate exposure to the subject. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
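One process from the list above, the random telegraph signal, can be simulated and checked against its known second-order statistics (parameters assumed for illustration): a signal flipping between +1 and -1 at Poisson rate lam has autocovariance exp(-2·lam·tau), whose Fourier transform is the Lorentzian power spectral density.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, dt, n = 1.0, 0.01, 500_000

# Random telegraph signal: a flip occurs in each small step with
# probability lam*dt; the parity of the flip count gives the current state.
flips = rng.random(n) < lam * dt
s = np.where(np.cumsum(flips) % 2 == 0, 1.0, -1.0)

tau = 0.5                                  # lag at which to check the theory
k = int(tau / dt)
acov = np.mean(s[:-k] * s[k:])             # empirical autocovariance at lag tau
print(round(acov, 2), round(np.exp(-2 * lam * tau), 2))
```

Feeding the estimated autocovariance through a Fourier transform reproduces the Lorentzian spectrum, connecting the time-domain and spectral descriptions the paper develops.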

  13. The Impact of Personality Factors and Preceding User Comments on the Processing of Research Findings on Deep Brain Stimulation: A Randomized Controlled Experiment in a Simulated Online Forum.

    Science.gov (United States)

    Feinkohl, Insa; Flemming, Danny; Cress, Ulrike; Kimmerle, Joachim

    2016-03-03

Laypeople frequently discuss medical research findings on Web-based platforms, but little is known about whether they grasp the tentativeness that is inherent in these findings. Potential influential factors involved in understanding medical tentativeness have hardly been assessed to date. The research presented here aimed to examine the effects of personality factors and of other users' previous contributions in a Web-based forum on laypeople's understanding of the tentativeness of medical research findings, using the example of research on deep brain stimulation. We presented 70 university students with an online news article that reported findings on applying deep brain stimulation as a novel therapeutic method for depression, which participants were unfamiliar with. In a randomized controlled experiment, we manipulated the forum such that the article was either accompanied by user comments that addressed the issue of tentativeness, by comments that did not address this issue, or the article was accompanied by no comments at all. Participants were instructed to write their own individual user comments. Their scientific literacy, epistemological beliefs, and academic self-efficacy were measured. The outcomes measured were perceived tentativeness and tentativeness addressed in the participants' own comments. More sophisticated epistemological beliefs enhanced the perception of tentativeness (standardized β=.26, P=.034). Greater scientific literacy (standardized β=.25, P=.025) and greater academic self-efficacy (standardized β=.31, P=.007) were both predictors of a more extensive discussion of tentativeness in participants' comments. When forum posts presented in the experiment addressed the issue of tentativeness, participants' subsequent behavior tended to be consistent with what they had read in the forum, F(2,63)=3.66, P=.049, ηp²=.092. Students' understanding of the tentativeness of research findings on deep brain stimulation in an online forum is influenced by a

  14. Random pulse generator

    International Nuclear Information System (INIS)

    Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing

    2007-01-01

Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time, whereas a normal pulse generator generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generator are reviewed, and two digital random pulse generators are introduced. (authors)
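The timing behaviour such a generator must reproduce can be sketched in software (an illustration of the statistics, not the authors' hardware design): detector events from radioactive decay form a Poisson process, so inter-pulse intervals are exponentially distributed with mean 1/rate, unlike the fixed interval of a periodic pulser.

```python
import numpy as np

rng = np.random.default_rng(5)
rate = 1000.0                                # assumed mean pulse rate in Hz
intervals = rng.exponential(1.0 / rate, size=100_000)
times = np.cumsum(intervals)                 # random pulse arrival time stamps

# Poisson signature: the interval standard deviation equals the interval
# mean (both 1/rate), whereas a periodic pulser has zero interval spread.
print(round(intervals.mean() * rate, 2), round(intervals.std() * rate, 2))
```

Driving a device with such time stamps instead of a fixed clock exposes dead-time and pile-up behaviour that periodic test pulses cannot reveal.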

  15. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood--bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated from the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
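The one-compartment model with random behavior can be sketched as a random difference equation (our assumed form, chosen to match the description): x_{n+1} = A_n · x_n with i.i.d. random retention factors A_n in (0, 1), so the average concentration E[x_n] = E[A]^n decays exponentially, which is the convergence the record describes.

```python
import numpy as np

rng = np.random.default_rng(6)
n_steps, n_paths = 30, 100_000
A = rng.uniform(0.7, 0.9, size=(n_paths, n_steps))   # random retention per step
x = np.cumprod(A, axis=1)                            # x_n for each realization, x_0 = 1

mean_path = x.mean(axis=0)                   # average concentration over realizations
theory = 0.8 ** np.arange(1, n_steps + 1)    # E[A]^n with E[A] = 0.8
print(round(mean_path[9], 4), round(theory[9], 4))
```

Replacing the scalar A_n by random matrices gives exactly the two-compartment and mammillary generalizations mentioned in the abstract.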

  16. Coupled continuous time-random walks in quenched random environment

    Science.gov (United States)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with coupling which is characteristic for Lévy walks. Additionally we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of uncoupled quenched trap model for Lévy flights.
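The Lévy-walk coupling referred to above can be illustrated in a few lines (a sketch of the coupling only, without the quenched site disorder): the walker moves at constant speed, so a jump of length l "costs" time t = l, making jump length and waiting time fully dependent.

```python
import numpy as np

rng = np.random.default_rng(7)
alpha, n_jumps = 1.5, 10_000                 # assumed tail index, 0 < alpha < 2

# Heavy-tailed jump lengths, P(l) ~ l^(-1-alpha) for l >= 1 (classical Pareto).
lengths = rng.pareto(alpha, n_jumps) + 1.0
signs = rng.choice([-1.0, 1.0], n_jumps)
positions = np.cumsum(signs * lengths)       # position after each completed jump
times = np.cumsum(lengths)                   # coupling: elapsed time = distance moved

print(positions[-1], times[-1])
```

The constant-speed coupling keeps |position| bounded by elapsed time, which is what distinguishes Lévy walks from Lévy flights and makes their scaling limits finite-velocity processes.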

  17. Recruiting primary care practices for practice-based research: a case study of a group-randomized study (TRANSLATE CKD) recruitment process.

    Science.gov (United States)

    Loskutova, Natalia Y; Smail, Craig; Ajayi, Kemi; Pace, Wilson D; Fox, Chester H

    2018-01-16

    We assessed the challenging process of recruiting primary care practices in a practice-based research study. In this descriptive case study of recruitment data collected for a large practice-based study (TRANSLATE CKD), 48 single or multiple-site health care organizations in the USA with a total of 114 practices were invited to participate. We collected quantitative and qualitative measures of recruitment process and outcomes for the first 25 practices recruited. Information about 13 additional practices is not provided due to staff transitions and limited data collection resources. Initial outreach was made to 114 practices (from 48 organizations, 41% small); 52 (45%) practices responded with interest. Practices enrolled in the study (n = 25) represented 22% of the total outreach number, or 48% of those initially interested. Average time to enroll was 71 calendar days (range 11-107). There was no difference in the number of days practices remained under recruitment, based on enrolled versus not enrolled (44.8 ± 30.4 versus 46.8 ± 25.4 days, P = 0.86) or by the organization size, i.e. large versus small (defined by having ≤4 distinct practices; 52 ± 23.6 versus 43.6 ± 27.8 days; P = 0.46). The most common recruitment barriers were administrative, e.g. lack of perceived direct organizational benefit, and were more prominent among large organizations. Despite the general belief that the research topic, invitation method, and interest in research may facilitate practice recruitment, our results suggest that most of the recruitment challenges represent managerial challenges. Future research projects may need to consider relevant methodologies from businesses administration and marketing fields. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. A written language intervention for at-risk second grade students: a randomized controlled trial of the process assessment of the learner lesson plans in a tier 2 response-to-intervention (RtI) model.

    Science.gov (United States)

    Hooper, Stephen R; Costa, Lara-Jeane C; McBee, Matthew; Anderson, Kathleen L; Yerby, Donna Carlson; Childress, Amy; Knuth, Sean B

    2013-04-01

    In a randomized controlled trial, 205 students were followed from grades 1 to 3 with a focus on changes in their writing trajectories following an evidence-based intervention during the spring of second grade. Students were identified as being at-risk (n=138), and then randomized into treatment (n=68) versus business-as-usual conditions (n=70). A typical group also was included (n=67). The writing intervention comprised Lesson Sets 4 and 7 from the Process Assessment of the Learner (PAL), and was conducted via small groups (three to six students) twice a week for 12 weeks in accordance with a response-to-intervention Tier 2 model. The primary outcome was the Wechsler Individual Achievement Test-II Written Expression Scale. Results indicated modest support for the PAL lesson plans, with an accelerated rate of growth in writing skills following treatment. There were no significant moderator effects, although there was evidence that the most globally impaired students demonstrated a more rapid rate of growth following treatment. These findings suggest the need for ongoing examination of evidence-based treatments in writing for young elementary students.

  19. The impact of physicians' communication styles on evaluation of physicians and information processing: A randomized study with simulated video consultations on contraception with an intrauterine device.

    Science.gov (United States)

    Bientzle, Martina; Fissler, Tim; Cress, Ulrike; Kimmerle, Joachim

    2017-10-01

    This study aimed at examining the impact of different types of physicians' communication styles on people's subsequent evaluation of physician attributes as well as on their information processing, attitude and decision making. In a between-group experiment, 80 participants watched one of three videos in which a gynaecologist displayed a particular communication style in a consultation situation on contraception with an intrauterine device. We compared doctor-centred communication (DCC) vs patient-centred communication (PCC) vs patient-centred communication with need-orientation (PCC-N). In the PCC condition, participants perceived the physician to be more empathetic and more competent than in the DCC condition. In the DCC condition, participants showed less attitude change compared to the other conditions. In the PCC-N condition, the physician was perceived as more empathetic and more socially competent than in the other conditions. However, participants acquired less knowledge in the PCC-N condition. We conclude that appropriate application of particular communication styles depends on specific consultation goals. Our results suggest that patients' needs should be addressed if the main goal is to build a good relationship, whereas a traditional PCC style appears to be more effective in communicating factual information. © 2016 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  20. Topics in random walks in random environment

    International Nuclear Information System (INIS)

    Sznitman, A.-S.

    2004-01-01

    Over the last twenty-five years random motions in random media have been intensively investigated and some new general methods and paradigms have by now emerged. Random walks in random environment constitute one of the canonical models of the field. However, in dimensions greater than one they are still poorly understood, and many of the basic issues remain unresolved to this day. The present series of lectures attempts to give an account of the progress made over the last few years, especially in the study of multi-dimensional random walks in random environment with ballistic behavior. (author)

  1. Weak convergence to isotropic complex [Formula: see text] random measure.

    Science.gov (United States)

    Wang, Jun; Li, Yunmeng; Sang, Liheng

    2017-01-01

    In this paper, we prove that an isotropic complex symmetric α-stable random measure ([Formula: see text]) can be approximated by a complex process constructed from integrals based on the Poisson process with random intensity.

  2. Implementation of the Blended Care Self-Management Program for Caregivers of People With Early-Stage Dementia (Partner in Balance): Process Evaluation of a Randomized Controlled Trial.

    Science.gov (United States)

    Boots, Lizzy Mm; de Vugt, Marjolein E; Smeets, Claudia Mj; Kempen, Gertrudis Ijm; Verhey, Frans Rj

    2017-12-19

    Caring for a family member with dementia puts caregivers at risk of becoming overburdened. Electronic health (eHealth) support for caregivers offers an opportunity for accessible tailored interventions. The blended care self-management program "Partner in Balance" (PiB) for early-stage dementia caregivers was executed in Dutch dementia care organizations. The program combines face-to-face coaching with tailored Web-based modules. Alongside an evaluation of program effectiveness, an evaluation of sampling and intervention quality is essential for the generalizability and interpretation of results. The aim of this study was to describe the process evaluation from the perspective of both family caregivers (participants) and professionals delivering the intervention (coaches) to determine internal and external validity before the effect analysis and aid future implementation. Implementation, sampling, and intervention quality were evaluated with quantitative and qualitative data from logistical research data, coach questionnaires (n=13), and interviews with coaches (n=10) and participants (n=49). Goal attainment scaling was used to measure treatment-induced change. Analyses were performed with descriptive statistics and deductive content analysis. The participation rate of eligible caregivers was 51.9% (80/154). Recruitment barriers were lack of computer and lack of need for support. Young age and employment were considered recruitment facilitators. All coaches attended training and supervision in blended care self-management. Deviations from the structured protocol were reported on intervention time, structure, and feedback. Coaches described an intensified relationship with the caregiver post intervention. Caregivers appreciated the tailored content and positive feedback. The blended structure increased their openness. The discussion forum was appreciated less. Overall, personal goals were attained after the program (T>50). Implementation barriers included lack of financing

  3. Quantum random flip-flop and its applications in random frequency synthesis and true random number generation

    Energy Technology Data Exchange (ETDEWEB)

    Stipčević, Mario, E-mail: mario.stipcevic@irb.hr [Photonics and Quantum Optics Research Unit, Center of Excellence for Advanced Materials and Sensing Devices, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia)

    2016-03-15

    In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of a RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of RFF in randomness preserving frequency division, random frequency synthesis, and random number generation. Possible usages of these applications in the information and communication technology, cryptographic hardware, and testing equipment are discussed.
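
    A software sketch of the randomness-preserving frequency division idea, with a seeded PRNG standing in for the photon-detection entropy source and all names illustrative rather than taken from the paper:

```python
import random

def random_toggle_ff(n_pulses, p=0.5, seed=None):
    """Simulate a random flip-flop used as a frequency divider: each
    input pulse toggles the output with probability p, so output edges
    occur at random positions rather than on every pulse. A PRNG stands
    in for the photon-detection randomness of a real RFF."""
    rng = random.Random(seed)
    state, trace = 0, []
    for _ in range(n_pulses):
        if rng.random() < p:
            state ^= 1  # random toggle, unlike a deterministic T-flip-flop
        trace.append(state)
    return trace
```

    With p = 0.5 the output carries, on average, half as many edges as there are input pulses, but the positions of the edges are unpredictable, which is what distinguishes this from a deterministic T-flip-flop divider.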

  4. Randomized Primitives for Big Data Processing

    DEFF Research Database (Denmark)

    Stöckel, Morten

    The growth in information technology during the last decade has brought a great increase in the number of users that have access to computers or mobile phones, as well as an increase in the number of data-based services offered to users. For instance, the number of web servers almost doubled from 70 to 135 million during 2005-2007. The growth in users, combined with the growth in services, means that the amount of total data to manage is exploding. An important query in the field of algorithms asks how much two data sets intersect, that is, the "overlap" between the pieces of data. Such a query is fundamental in applications such as recommender systems, where the answer would be used to measure similarity over shopping patterns and, based on that, recommend items to the user. In this dissertation we examine the problem of computing intersection sizes among data sets in several...

  5. Random-process excursions in radioisotope instruments

    International Nuclear Information System (INIS)

    Galochkin, D.V.; Polovko, S.A.

    1984-01-01

    Approximate expressions are derived for the mathematical expectation, variance, and distribution of the durations of the excursions of the output signal from a ratemeter in a radioisotope relay instrument. The tabulated comparison of results from Monte Carlo simulation and analytical calculation shows good agreement over the mean value and the variance of the excursion duration for T 0.2 sec as calculated and as obtained by Monte Carlo simulation with a computer using 5000 realizations. It is suggested that the results should be used in choosing the optimum parameters of radioisotope relay instruments

  6. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    2014-01-01

    In this new edition of this classic text, much of the material has been rearranged and revised for pedagogical reasons. Many classic inequalities and proofs are now incorporated into the text, and many citations have been added.

  7. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm that generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed...
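
    One typical constraint in event-driven acquisition is a minimum spacing between consecutive sampling points, dictated by ADC conversion time. A minimal sketch of a constrained pattern generator, using a rejection-sampling strategy and illustrative names that are not taken from the paper:

```python
import random

def constrained_pattern(n_grid, k, min_gap, seed=None, max_tries=10000):
    """Draw k sampling indices uniformly at random from a time grid of
    n_grid slots, rejecting any pattern whose consecutive samples are
    closer than min_gap (a stand-in for ADC timing limits)."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        pattern = sorted(rng.sample(range(n_grid), k))
        if all(b - a >= min_gap for a, b in zip(pattern, pattern[1:])):
            return pattern
    raise RuntimeError("no pattern satisfied the constraint")
```

    Rejection sampling keeps the accepted patterns uniformly distributed over the constrained set, at the cost of retries when the constraint is tight.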

  8. Reading aloud and solving simple arithmetic calculation intervention (Learning therapy improves inhibition, verbal episodic memory, focus attention, and processing speed in healthy elderly people: Evidence from a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rui eNouchi

    2016-05-01

    Full Text Available Background: Previous reports have described that simple cognitive training using reading aloud and solving simple arithmetic calculations, so-called learning therapy, can improve executive functions and processing speed in older adults. Nevertheless, it is not well known whether learning therapy improves a wider range of cognitive functions. We investigated the beneficial effects of learning therapy on various cognitive functions in healthy older adults. Methods: We used a single-blinded intervention with two groups (learning therapy group: LT; waiting-list control group: WL). Sixty-four elderly participants were randomly assigned to LT or WL. In LT, participants performed training tasks of reading Japanese aloud and solving simple calculations for 6 months. WL did not participate in the intervention. We measured several cognitive functions before and after the 6-month intervention period. Results: Compared to WL, LT improved inhibition performance in executive functions (Stroop: LT mean = 3.88 vs. WL mean = 1.22, adjusted p = .013; reverse Stroop: LT mean = 3.22 vs. WL mean = 1.59, adjusted p = .015), verbal episodic memory (logical memory: LT mean = 4.59 vs. WL mean = 2.47, adjusted p = .015), focus attention (D-CAT: LT mean = 2.09 vs. WL mean = -0.59, adjusted p = .010) and processing speed (digit symbol coding: LT mean = 5.00 vs. WL mean = 1.13, adjusted p = .015; symbol search: LT mean = 3.47 vs. WL mean = 1.81, adjusted p = .014). Discussion: This RCT showed the benefit of learning therapy on inhibition of executive functions, verbal episodic memory, focus attention, and processing speed in healthy elderly people. Our results are discussed under the overlapping hypothesis. Trial registration: This trial was registered in the University Hospital Medical Information Network Clinical Trials Registry (UMIN000006998).

  9. Differential acute postprandial effects of processed meat and isocaloric vegan meals on the gastrointestinal hormone response in subjects suffering from type 2 diabetes and healthy controls: a randomized crossover study.

    Science.gov (United States)

    Belinova, Lenka; Kahleova, Hana; Malinska, Hana; Topolcan, Ondrej; Vrzalova, Jindra; Oliyarnyk, Olena; Kazdova, Ludmila; Hill, Martin; Pelikanova, Terezie

    2014-01-01

    The intake of meat, particularly processed meat, is a dietary risk factor for diabetes. Meat intake impairs insulin sensitivity and leads to increased oxidative stress. However, its effect on postprandial gastrointestinal hormone (GIH) secretion is unclear. We aimed to investigate the acute effects of two standardized isocaloric meals: a processed hamburger meat meal rich in protein and saturated fat (M-meal) and a vegan meal rich in carbohydrates (V-meal). We hypothesized that the meat meal would lead to abnormal postprandial increases in plasma lipids and oxidative stress markers and impaired GIH responses. In a randomized crossover study, 50 patients suffering from type 2 diabetes (T2D) and 50 healthy subjects underwent two 3-h meal tolerance tests. For statistical analyses, repeated-measures ANOVA was performed. The M-meal resulted in a higher postprandial increase in lipids in both groups (p<0.001) and persistent postprandial hyperinsulinemia in patients with diabetes (p<0.001). The plasma glucose levels were significantly higher after the V-meal only at the peak level. The plasma concentrations of glucose-dependent insulinotropic peptide (GIP), peptide tyrosine-tyrosine (PYY) and pancreatic polypeptide (PP) were higher (p<0.05, p<0.001, p<0.001, respectively) and the ghrelin concentration was lower (p<0.001) after the M-meal in healthy subjects. In contrast, the concentrations of GIP, PYY and PP were significantly lower after the M-meal in T2D patients (p<0.001). Compared with the V-meal, the M-meal was associated with a larger increase in lipoperoxidation in T2D patients (p<0.05). Our results suggest that the diet composition and the energy content, rather than the carbohydrate count, should be important considerations for dietary management and demonstrate that processed meat consumption is accompanied by impaired GIH responses and increased oxidative stress marker levels in diabetic patients. ClinicalTrials.gov NCT01572402.

  10. A Method of Erasing Data Using Random Number Generators

    OpenAIRE

    井上,正人

    2012-01-01

    Erasing data is an indispensable step in the disposal of computers or external storage media. Apart from physical destruction, erasing data means writing random information over entire disk drives or media. We propose a method which erases data safely using random number generators. These random number generators create true random numbers based on quantum processes.
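
    The overwrite step itself is short; a minimal Python sketch using `os.urandom` in place of the quantum random number generators proposed here (illustrative only: overwriting may not sanitize flash media that remap blocks internally):

```python
import os

def erase_file(path, passes=3):
    """Overwrite a file with random bytes several times, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # random data over the old contents
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to the device
    os.remove(path)
```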

  11. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability, (i) the RGG is connected, or (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
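
    The push protocol is straightforward to simulate. A sketch in Python with illustrative names; it assumes a connected graph so the loop terminates, matching regime (i):

```python
import random

def push_broadcast_rounds(adj, start=0, seed=None):
    """Rounds of the push protocol until all nodes are informed: in each
    round every informed node picks one neighbour uniformly at random
    and informs it. adj maps each node to its neighbour list; the graph
    is assumed connected."""
    rng = random.Random(seed)
    informed = {start}
    rounds = 0
    while len(informed) < len(adj):
        newly = {rng.choice(adj[u]) for u in informed if adj[u]}
        informed |= newly
        rounds += 1
    return rounds
```

    On a cycle the informed set grows by at most two nodes per round (one at each end of the informed arc), so the count is at least (n − 1)/2, illustrating the diam(G) lower bound.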

  12. Quantumness, Randomness and Computability

    International Nuclear Information System (INIS)

    Solis, Aldo; Hirsch, Jorge G

    2015-01-01

    Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non-signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputability and its implications in physics. (paper)

  13. How random is a random vector?

    Science.gov (United States)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation", the square root of the generalized variance, is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index", a derivative of the Wilks standard deviation, is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams": tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
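
    Both quantities have direct sample versions. A NumPy sketch, in which taking the determinant of the correlation matrix as the uncorrelation index is one plausible normalization, not the paper's verbatim definition:

```python
import numpy as np

def wilks_std(samples):
    """Wilks standard deviation: the square root of the generalized
    variance, i.e. of det(Cov), for samples of shape (n, d)."""
    return np.sqrt(np.linalg.det(np.cov(samples, rowvar=False)))

def uncorrelation_index(samples):
    """Determinant of the correlation matrix: 1 for uncorrelated
    components, approaching 0 as components become linearly dependent
    (an assumed normalization, for illustration)."""
    return np.linalg.det(np.corrcoef(samples, rowvar=False))
```

    For a vector with independent unit-variance components the covariance matrix is close to the identity, so both quantities are close to 1.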

  14. How random is a random vector?

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2015-01-01

    Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation” –the square root of the generalized variance–is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index” –a derivative of the Wilks standard deviation–is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams”—tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.

  15. An introduction to random interlacements

    CERN Document Server

    Drewitz, Alexander; Sapozhnikov, Artëm

    2014-01-01

    This book gives a self-contained introduction to the theory of random interlacements. The intended reader of the book is a graduate student with a background in probability theory who wants to learn about the fundamental results and methods of this rapidly emerging field of research. The model was introduced by Sznitman in 2007 in order to describe the local picture left by the trace of a random walk on a large discrete torus when it runs up to times proportional to the volume of the torus. Random interlacements is a new percolation model on the d-dimensional lattice. The main results covered by the book include the full proof of the local convergence of random walk trace on the torus to random interlacements and the full proof of the percolation phase transition of the vacant set of random interlacements in all dimensions. The reader will become familiar with the techniques relevant to working with the underlying Poisson Process and the method of multi-scale renormalization, which helps in overcoming the ch...

  16. Convergence to Equilibrium in Energy-Reaction-Diffusion Systems Using Vector-Valued Functional Inequalities

    Science.gov (United States)

    Mielke, Alexander; Mittnenzweig, Markus

    2018-04-01

    We discuss how the recently developed energy dissipation methods for reaction diffusion systems can be generalized to the non-isothermal case. For this, we use concave entropies in terms of the densities of the species and the internal energy, where the importance is that the equilibrium densities may depend on the internal energy. Using the log-Sobolev estimate and variants for lower-order entropies as well as estimates for the entropy production of the nonlinear reactions, we give two methods to estimate the relative entropy by the total entropy production, namely a somewhat restrictive convexity method, which provides explicit decay rates, and a very general, but weaker compactness method.

  17. Weighted and vector-valued inequalities for one-sided maximal ...

    Indian Academy of Sciences (India)

    Department of Mathematics, Indian Institute of Science Education and Research, Bhopal 462 ... The theory of one-sided maximal operators is of interest for its intrinsic nature and for connections to the ... Also, another motivation to study these ...

  18. Abstract interpolation in vector-valued de Branges-Rovnyak spaces

    NARCIS (Netherlands)

    Ball, J.A.; Bolotnikov, V.; ter Horst, S.

    2011-01-01

    Following ideas from the Abstract Interpolation Problem of Katsnelson et al. (Operators in spaces of functions and problems in function theory, vol 146, pp 83–96, Naukova Dumka, Kiev, 1987) for Schur class functions, we study a general metric constrained interpolation problem for functions from a

  19. A randomized, placebo-controlled, double-blind trial of supplemental docosahexaenoic acid on cognitive processing speed and executive function in females of reproductive age with phenylketonuria: A pilot study☆, ☆☆

    Science.gov (United States)

    Yi, S.H.L.; Kable, J.A.; Evatt, M.L.; Singh, R.H.

    2014-01-01

    Low blood docosahexaenoic acid (DHA) is reported in patients with phenylketonuria (PKU); however, the functional implications in adolescents and adults are unknown. This pilot study investigated the effect of supplemental DHA on cognitive performance in 33 females with PKU ages 12–47 years. Participants were randomly assigned to receive DHA (10 mg/kg/day) or placebo for 4.5 months. Performance on cognitive processing speed and executive functioning tasks was evaluated at baseline and follow up. Intention-to-treat and per protocol analyses were performed. At follow up, biomarkers of DHA status were significantly higher in the DHA-supplemented group. Performance on the cognitive tasks and reported treatment-related adverse events did not differ. While no evidence of cognitive effect was seen, a larger sample size is needed to be conclusive, which may not be feasible in this population. Supplementation was a safe and effective way to increase biomarkers of DHA status (www.clinicaltrials.gov; Identifier: NCT00892554). PMID:22000478

  20. Micro-Texture Synthesis by Phase Randomization

    Directory of Open Access Journals (Sweden)

    Bruno Galerne

    2011-09-01

    Full Text Available This contribution is concerned with texture synthesis by example, the process of generating new texture images from a given sample. The Random Phase Noise algorithm presented here synthesizes a texture from an original image by simply randomizing its Fourier phase. It is able to reproduce textures which are characterized by their Fourier modulus, namely the random phase textures (or micro-textures).
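
    The core of the algorithm fits in a few FFT lines. In this NumPy sketch (illustrative function name), the random phase is taken from a white-noise image, which keeps the spectrum Hermitian so that the inverse transform stays real:

```python
import numpy as np

def random_phase_noise(image, seed=None):
    """Synthesize a micro-texture: keep the Fourier modulus of a real
    grayscale image, replace its phase by the (automatically
    odd-symmetric) phase of white noise, and invert the transform."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(image.shape)
    # even modulus times Hermitian phase factor -> Hermitian spectrum
    spectrum = np.abs(np.fft.fft2(image)) * np.exp(1j * np.angle(np.fft.fft2(noise)))
    return np.real(np.fft.ifft2(spectrum))
```

    By construction the output has (up to floating-point error) the same Fourier modulus as the input, which is exactly the defining property of a random phase texture.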

  1. Effect of low-level laser therapy on the healing process of donor site in patients with grade 3 burn ulcer after skin graft surgery (a randomized clinical trial).

    Science.gov (United States)

    Vaghardoost, Reza; Momeni, Mahnoush; Kazemikhoo, Nooshafarin; Mokmeli, Soheila; Dahmardehei, Mostafa; Ansari, Fereshteh; Nilforoushzadeh, Mohammad Ali; Sabr Joo, Parisa; Mey Abadi, Sara; Naderi Gharagheshlagh, Soheila; Sassani, Saeed

    2018-04-01

    Skin graft is a standard therapeutic technique in patients with deep ulcers, but managing donor site after grafting is very important. Although several modern dressings are available to enhance the comfort of donor site, using techniques that accelerate wound healing may enhance patient satisfaction. Low-level laser therapy (LLLT) has been used in several medical fields, including healing of diabetic, surgical, and pressure ulcers, but there is not any report of using this method for healing of donor site in burn patients. The protocols and informed consent were reviewed according to Medical Ethics Board of Shahid Beheshti University of Medical Sciences (IR.SBMU.REC.1394.363) and Iranian Registry of Clinical Trials (IRCT2016020226069N2). Eighteen donor sites in 11 patients with grade 3 burn ulcer were selected. Donor areas were divided into 2 parts, for laser irradiation and control randomly. The laser area was irradiated by a red, 655-nm laser light, 150 mW, 2 J/cm², on days 0 (immediately after surgery), 3, 5, and 7. Dressing and other therapeutic care for both sites were the same. The patients and the person who analyzed the results were blinded. The size of donor site reduced in both groups during the 7-day study period (P < 0.01) and this reduction was significantly greater in the laser group (P = 0.01). In the present study, for the first time, we evaluate the effects of LLLT on the healing process of donor site in burn patients. The results showed that local irradiation of red laser accelerates wound healing process significantly.

  2. Fragmentation of random trees

    International Nuclear Information System (INIS)

    Kalay, Z; Ben-Naim, E

    2015-01-01

    We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and is generated by sequential addition of nodes, with each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N→∞. We obtain analytically the size density φ_s of trees of size s. The size density has a power-law tail φ_s ∼ s^(−α) with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees. (paper)
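
    The tree construction and the forest bookkeeping are both short to code. A Python sketch with illustrative names (the union-find accounting is a convenience for measuring tree sizes in simulation, not the paper's analytical method):

```python
import random
from collections import Counter

def random_recursive_tree(n, seed=None):
    """Parent list of a random recursive tree on nodes 0..n-1: each new
    node attaches to a uniformly chosen earlier node."""
    rng = random.Random(seed)
    return [None] + [rng.randrange(i) for i in range(1, n)]

def forest_sizes(parent, removed):
    """Sorted sizes of the trees remaining after deleting the nodes in
    `removed` (an edge disappears with either of its endpoints)."""
    removed = set(removed)
    alive = [i for i in range(len(parent)) if i not in removed]
    root = list(range(len(parent)))
    def find(x):
        while root[x] != x:
            root[x] = root[root[x]]   # path halving
            x = root[x]
        return x
    for i in alive:
        p = parent[i]
        if p is not None and p not in removed:
            root[find(i)] = find(p)   # surviving edge: merge components
    return sorted(Counter(find(i) for i in alive).values())
```

    Repeatedly calling `forest_sizes` with a growing removal set gives the empirical size distribution whose tail the paper derives analytically.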

  3. Random-walk enzymes

    Science.gov (United States)

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  4. Stochastic processes

    CERN Document Server

    Parzen, Emanuel

    1962-01-01

    Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building.Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine

  5. Reserves Represented by Random Walks

    International Nuclear Information System (INIS)

    Filipe, J A; Ferreira, M A M; Andrade, M

    2012-01-01

    The reserves problem is studied through models based on random walks. Random walks are a classical particular case in the analysis of stochastic processes. They appear not only in models of reserves evolution; they are also used to build more complex systems and, at a theoretical level, as instruments for analysing other kinds of systems. In studying reserves, the main objective is to verify and guarantee that pension funds remain sustainable. As the use of such models for this goal is a classical approach in the study of pension funds, this work draws conclusions about the reserves problem. A concrete example is presented.

  6. Random walks and diffusion on networks

    Science.gov (United States)

    Masuda, Naoki; Porter, Mason A.; Lambiotte, Renaud

    2017-11-01

    Random walks are ubiquitous in the sciences, and they are interesting from both theoretical and practical perspectives. They are one of the most fundamental types of stochastic processes; can be used to model numerous phenomena, including diffusion, interactions, and opinions among humans and animals; and can be used to extract information about important entities or dense groups of entities in a network. Random walks have been studied for many decades on both regular lattices and (especially in the last couple of decades) on networks with a variety of structures. In the present article, we survey the theory and applications of random walks on networks, restricting ourselves to simple cases of single and non-adaptive random walkers. We distinguish three main types of random walks: discrete-time random walks, node-centric continuous-time random walks, and edge-centric continuous-time random walks. We first briefly survey random walks on a line, and then we consider random walks on various types of networks. We extensively discuss applications of random walks, including ranking of nodes (e.g., PageRank), community detection, respondent-driven sampling, and opinion models such as voter models.
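
    As a concrete instance of the node-ranking application mentioned above, PageRank is the stationary distribution of a discrete-time random walk with uniform teleportation. A small power-iteration sketch (the damping value 0.85 is the conventional choice, not specific to this survey):

```python
import numpy as np

def pagerank(adj, alpha=0.85, n_iter=100):
    """Power iteration: with probability alpha the walker follows a
    uniformly random out-edge, otherwise it teleports to a uniform
    node. adj[i] is the list of out-neighbours of node i."""
    n = len(adj)
    p = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        new = np.full(n, (1.0 - alpha) / n)   # teleportation mass
        for i, nbrs in enumerate(adj):
            if nbrs:
                share = alpha * p[i] / len(nbrs)
                for j in nbrs:
                    new[j] += share
            else:
                new += alpha * p[i] / n       # dangling nodes teleport
        p = new
    return p
```

    On a star graph the hub collects probability mass from every leaf, so it receives the highest rank, matching the intuition that random-walk occupation time identifies important nodes.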

  7. Random walk on a population of random walkers

    International Nuclear Information System (INIS)

    Agliari, E; Burioni, R; Cassi, D; Neri, F M

    2008-01-01

    We consider a population of N labelled random walkers moving on a substrate, and an excitation jumping among the walkers upon contact. The label X(t) of the walker carrying the excitation at time t can be viewed as a stochastic process, where the transition probabilities are a stochastic process themselves. Upon mapping onto two simpler processes, the quantities characterizing X(t) can be calculated in the limit of long times and low walkers density. The results are compared with numerical simulations. Several different topologies for the substrate underlying diffusion are considered

  8. Differential acute postprandial effects of processed meat and isocaloric vegan meals on the gastrointestinal hormone response in subjects suffering from type 2 diabetes and healthy controls: a randomized crossover study.

    Directory of Open Access Journals (Sweden)

    Lenka Belinova

    The intake of meat, particularly processed meat, is a dietary risk factor for diabetes. Meat intake impairs insulin sensitivity and leads to increased oxidative stress. However, its effect on postprandial gastrointestinal hormone (GIH) secretion is unclear. We aimed to investigate the acute effects of two standardized isocaloric meals: a processed hamburger meat meal rich in protein and saturated fat (M-meal) and a vegan meal rich in carbohydrates (V-meal). We hypothesized that the meat meal would lead to abnormal postprandial increases in plasma lipids and oxidative stress markers and impaired GIH responses. In a randomized crossover study, 50 patients suffering from type 2 diabetes (T2D) and 50 healthy subjects underwent two 3-h meal tolerance tests. For statistical analyses, repeated-measures ANOVA was performed. The M-meal resulted in a higher postprandial increase in lipids in both groups (p<0.001) and persistent postprandial hyperinsulinemia in patients with diabetes (p<0.001). The plasma glucose levels were significantly higher after the V-meal only at the peak level. The plasma concentrations of glucose-dependent insulinotropic peptide (GIP), peptide tyrosine-tyrosine (PYY) and pancreatic polypeptide (PP) were higher (p<0.05, p<0.001, p<0.001, respectively) and the ghrelin concentration was lower (p<0.001) after the M-meal in healthy subjects. In contrast, the concentrations of GIP, PYY and PP were significantly lower after the M-meal in T2D patients (p<0.001). Compared with the V-meal, the M-meal was associated with a larger increase in lipoperoxidation in T2D patients (p<0.05). Our results suggest that the diet composition and the energy content, rather than the carbohydrate count, should be important considerations for dietary management and demonstrate that processed meat consumption is accompanied by impaired GIH responses and increased oxidative stress marker levels in diabetic patients. ClinicalTrials.gov NCT01572402.

  9. Snake representation of a superprocess in random environment

    OpenAIRE

    Mytnik, Leonid; Xiong, Jie; Zeitouni, Ofer

    2011-01-01

    We consider (discrete time) branching particles in a random environment which is i.i.d. in time and possibly spatially correlated. We prove a representation of the limit process by means of a Brownian snake in random environment.

  10. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  11. Cover times of random searches

    Science.gov (United States)

    Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël

    2015-10-01

    How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
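    The regular-random-walk baseline mentioned above is easy to probe numerically. This sketch (graph size, trial count, and seeds are arbitrary choices, not from the paper) estimates the mean cover time of a simple random walk on a cycle and compares it with the classical value n(n-1)/2:

```python
import random

def cover_time_cycle(n, seed):
    """Number of steps for a simple random walk to visit all n nodes of a cycle."""
    rng = random.Random(seed)
    pos, visited, steps = 0, {0}, 0
    while len(visited) < n:
        pos = (pos + rng.choice((-1, 1))) % n  # step left or right on the ring
        visited.add(pos)
        steps += 1
    return steps

# Classical result: the mean cover time of an n-cycle is n*(n-1)/2.
n, trials = 20, 3000
mean = sum(cover_time_cycle(n, s) for s in range(trials)) / trials
print(round(mean), n * (n - 1) // 2)  # empirical mean vs. the exact value 190
```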

  12. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)
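    The unbiased subject selection described above can be done directly with a seeded pseudorandom sampler. In this sketch the roster size, sample size, and seed are arbitrary illustrations:

```python
import random

# Hypothetical roster of 30 candidate subjects; we draw 5 without replacement.
subjects = [f"subject-{i:02d}" for i in range(1, 31)]

rng = random.Random(2024)          # fixed seed so the draw is reproducible
chosen = rng.sample(subjects, 5)   # every subject is equally likely to be picked
print(chosen)
```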

  13. Markov Random Fields on Triangle Meshes

    DEFF Research Database (Denmark)

    Andersen, Vedrana; Aanæs, Henrik; Bærentzen, Jakob Andreas

    2010-01-01

    In this paper we propose a novel anisotropic smoothing scheme based on Markov Random Fields (MRF). Our scheme is formulated as two coupled processes. A vertex process is used to smooth the mesh by displacing the vertices according to a MRF smoothness prior, while an independent edge process label...

  14. Stochastic processes

    CERN Document Server

    Borodin, Andrei N

    2017-01-01

    This book provides a rigorous yet accessible introduction to the theory of stochastic processes. A significant part of the book is devoted to the classic theory of stochastic processes. In turn, it also presents proofs of well-known results, sometimes together with new approaches. Moreover, the book explores topics not previously covered elsewhere, such as distributions of functionals of diffusions stopped at different random times, the Brownian local time, diffusions with jumps, and an invariance principle for random walks and local times. Supported by carefully selected material, the book showcases a wealth of examples that demonstrate how to solve concrete problems by applying theoretical results. It addresses a broad range of applications, focusing on concrete computational techniques rather than on abstract theory. The content presented here is largely self-contained, making it suitable for researchers and graduate students alike.

  15. Random walks, random fields, and disordered systems

    CERN Document Server

    Černý, Jiří; Kotecký, Roman

    2015-01-01

    Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...

  16. The Theory of Random Laser Systems

    International Nuclear Information System (INIS)

    Xunya Jiang

    2002-01-01

    Studies of random laser systems are a new direction with promising potential applications and theoretical interest. The research is based on the theories of localization and laser physics. So far, the research shows that there are random lasing modes inside the systems which are quite different from those of common laser systems. From the properties of the random lasing modes, one can understand the phenomena observed in the experiments, such as multi-peak and anisotropic spectra, lasing-mode number saturation, mode competition and dynamic processes, etc. To summarize, this dissertation has contributed the following to the study of random laser systems: (1) by comparing the Lamb theory with the Letokhov theory, general formulas for the threshold length or gain of random laser systems were obtained; (2) the vital weakness of previous time-independent methods in random laser research was pointed out; (3) a new model was developed which combines the FDTD method and the semi-classical laser theory; its solutions explained the experimental results of multi-peak and anisotropic emission spectra, and predicted the saturation of the number of lasing modes and the length of localized lasing modes; (4) theoretical (Lamb theory) and numerical (FDTD and transfer-matrix calculation) studies of the origin of localized lasing modes in the random laser systems; and (5) a proposal to use random lasing modes as a new path to study wave localization in random systems, with a prediction of the lasing threshold discontinuity at the mobility edge

  17. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. Search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B...... virus (HBV) infection were identified that tested Chinese medicinal herbs. They were published in 49 Chinese journals. Only 10% (18/176) of the studies reported the method by which they randomized patients. Only two reported allocation concealment and were considered as adequate. Twenty percent (30...

  18. Security and Composability of Randomness Expansion from Bell Inequalities

    NARCIS (Netherlands)

    S. Fehr (Serge); R. Gelles; C. Schaffner (Christian)

    2013-01-01

    The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness

  19. The random walk model of intrafraction movement

    International Nuclear Information System (INIS)

    Ballhausen, H; Reiner, M; Kantz, S; Belka, C; Söhn, M

    2013-01-01

    The purpose of this paper is to understand intrafraction movement as a stochastic process driven by random external forces. The hypothetically proposed three-dimensional random walk model has significant impact on optimal PTV margins and offers a quantitatively correct explanation of experimental findings. Properties of the random walk are calculated from first principles, in particular fraction-average population density distributions for displacements along the principal axes. When substituted into the established optimal margin recipes these fraction-average distributions yield safety margins about 30% smaller as compared to the suggested values from end-of-fraction Gaussian fits. Stylized facts of a random walk are identified in clinical data, such as the increase of the standard deviation of displacements with the square root of time. Least squares errors in the comparison to experimental results are reduced by about 50% when accounting for non-Gaussian corrections from the random walk model. (paper)
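    The stylized fact cited above, that the standard deviation of displacements grows with the square root of time, can be checked with a toy one-dimensional walk (step counts, trial numbers, and seed are arbitrary choices, not taken from the paper):

```python
import math
import random

def displacement_std(steps, trials=2000, seed=1):
    """Empirical standard deviation of 1-D random-walk displacement after `steps` steps."""
    rng = random.Random(seed)
    disps = []
    for _ in range(trials):
        x = 0
        for _ in range(steps):
            x += rng.choice((-1, 1))   # unit step left or right
        disps.append(x)
    mean = sum(disps) / trials
    var = sum((d - mean) ** 2 for d in disps) / trials
    return math.sqrt(var)

# Quadrupling the number of steps should roughly double the standard deviation.
s100, s400 = displacement_std(100), displacement_std(400)
print(round(s400 / s100, 2))  # close to 2.0
```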

  20. The random walk model of intrafraction movement.

    Science.gov (United States)

    Ballhausen, H; Reiner, M; Kantz, S; Belka, C; Söhn, M

    2013-04-07

    The purpose of this paper is to understand intrafraction movement as a stochastic process driven by random external forces. The hypothetically proposed three-dimensional random walk model has significant impact on optimal PTV margins and offers a quantitatively correct explanation of experimental findings. Properties of the random walk are calculated from first principles, in particular fraction-average population density distributions for displacements along the principal axes. When substituted into the established optimal margin recipes these fraction-average distributions yield safety margins about 30% smaller as compared to the suggested values from end-of-fraction Gaussian fits. Stylized facts of a random walk are identified in clinical data, such as the increase of the standard deviation of displacements with the square root of time. Least squares errors in the comparison to experimental results are reduced by about 50% when accounting for non-Gaussian corrections from the random walk model.

  1. Random surfaces and strings

    International Nuclear Information System (INIS)

    Ambjoern, J.

    1987-08-01

    The theory of strings is the theory of random surfaces. I review the present attempts to regularize the world sheet of the string by triangulation. The corresponding statistical theory of triangulated random surfaces has a surprising rich structure, but the connection to conventional string theory seems non-trivial. (orig.)

  2. Derandomizing from random strings

    NARCIS (Netherlands)

    Buhrman, H.; Fortnow, L.; Koucký, M.; Loff, B.

    2010-01-01

    In this paper we show that BPP is truth-table reducible to the set of Kolmogorov random strings R(K). It was previously known that PSPACE, and hence BPP is Turing-reducible to R(K). The earlier proof relied on the adaptivity of the Turing-reduction to find a Kolmogorov-random string of polynomial

  3. Quantum random number generator

    Science.gov (United States)

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  4. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  5. Algorithmic randomness and physical entropy

    International Nuclear Information System (INIS)

    Zurek, W.H.

    1989-01-01

    Algorithmic randomness provides a rigorous, entropylike measure of disorder of an individual, microscopic, definite state of a physical system. It is defined by the size (in binary digits) of the shortest message specifying the microstate uniquely up to the assumed resolution. Equivalently, algorithmic randomness can be expressed as the number of bits in the smallest program for a universal computer that can reproduce the state in question (for instance, by plotting it with the assumed accuracy). In contrast to the traditional definitions of entropy, algorithmic randomness can be used to measure disorder without any recourse to probabilities. Algorithmic randomness is typically very difficult to calculate exactly but relatively easy to estimate. In large systems, probabilistic ensemble definitions of entropy (e.g., coarse-grained entropy of Gibbs and Boltzmann's entropy H=lnW, as well as Shannon's information-theoretic entropy) provide accurate estimates of the algorithmic entropy of an individual system or its average value for an ensemble. One is thus able to rederive much of thermodynamics and statistical mechanics in a setting very different from the usual. Physical entropy, I suggest, is a sum of (i) the missing information measured by Shannon's formula and (ii) of the algorithmic information content---algorithmic randomness---present in the available data about the system. This definition of entropy is essential in describing the operation of thermodynamic engines from the viewpoint of information gathering and using systems. These Maxwell demon-type entities are capable of acquiring and processing information and therefore can ''decide'' on the basis of the results of their measurements and computations the best strategy for extracting energy from their surroundings. From their internal point of view the outcome of each measurement is definite
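    The remark above that algorithmic randomness is "relatively easy to estimate" can be illustrated with a standard compression proxy (a common textbook device, not Zurek's own construction): the compressed size of a string is an upper bound on its algorithmic information content, so highly structured data compresses far below its length while random bytes do not:

```python
import random
import zlib

# A highly ordered 1000-byte string versus 1000 pseudorandom bytes.
ordered = b"0123456789" * 100
rng = random.Random(7)
noisy = bytes(rng.randrange(256) for _ in range(1000))

# Compressed length crudely estimates algorithmic information content.
print(len(zlib.compress(ordered, 9)), len(zlib.compress(noisy, 9)))
```

    The ordered string shrinks to a few dozen bytes, while the noisy one stays near (or slightly above) its original length.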

  6. Random scalar fields and hyperuniformity

    Science.gov (United States)

    Ma, Zheng; Torquato, Salvatore

    2017-06-01

    Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystals and liquids. Hyperuniform systems have attracted recent attention because they are endowed with novel transport and optical properties. Recently, the hyperuniformity concept has been generalized to characterize two-phase media, scalar fields, and random vector fields. In this paper, we devise methods to explicitly construct hyperuniform scalar fields. Specifically, we analyze spatial patterns generated from Gaussian random fields, which have been used to model the microwave background radiation and heterogeneous materials, the Cahn-Hilliard equation for spinodal decomposition, and Swift-Hohenberg equations that have been used to model emergent pattern formation, including Rayleigh-Bénard convection. We show that the Gaussian random scalar fields can be constructed to be hyperuniform. We also numerically study the time evolution of spinodal decomposition patterns and demonstrate that they are hyperuniform in the scaling regime. Moreover, we find that labyrinth-like patterns generated by the Swift-Hohenberg equation are effectively hyperuniform. We show that thresholding (level-cutting) a hyperuniform Gaussian random field to produce a two-phase random medium tends to destroy the hyperuniformity of the progenitor scalar field. We then propose guidelines to achieve effectively hyperuniform two-phase media derived from thresholded non-Gaussian fields. Our investigation paves the way for new research directions to characterize the large-structure spatial patterns that arise in physics, chemistry, biology, and ecology. Moreover, our theoretical results are expected to guide experimentalists to synthesize new classes of hyperuniform materials with novel physical properties via coarsening processes and using state-of-the-art techniques, such as stereolithography and 3D printing.

  7. Autonomous Byte Stream Randomizer

    Science.gov (United States)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when combined with all N pieces, it can be reconstructed back to one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
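    The seed-driven swap mechanism described above can be sketched with a seeded Fisher-Yates shuffle and its inverse. Note that this uses Python's `random.Random`, which is not cryptographically secure, so it only illustrates the shuffle/reconstruction mechanics, not the actual NASA software; the sample data and seed are invented:

```python
import random

def shuffle_bytes(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of a byte string, driven by a seeded PRNG."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)          # unbiased: j is uniform on 0..i
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def unshuffle_bytes(data: bytes, seed: int) -> bytes:
    """Invert the shuffle by replaying the same swap sequence in reverse order."""
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(data) - 1, 0, -1)]
    buf = bytearray(data)
    for i, j in reversed(swaps):          # each swap is its own inverse
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

secret = b"the quick brown fox"
scrambled = shuffle_bytes(secret, seed=1234)
roundtrip = unshuffle_bytes(scrambled, seed=1234)
print(roundtrip == secret)  # True
```

    Only a holder of the seed can replay the swap sequence and restore the original byte order.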

  8. Evolving Random Forest for Preference Learning

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Shaker, Noor

    2015-01-01

    This paper introduces a novel approach for pairwise preference learning through a combination of an evolutionary method and random forest. Grammatical evolution is used to describe the structure of the trees in the Random Forest (RF) and to handle the process of evolution. Evolved random forests ...... obtained for predicting pairwise self-reports of users for the three emotional states engagement, frustration and challenge show very promising results that are comparable and in some cases superior to those obtained from state-of-the-art methods....

  9. Correlated randomness and switching phenomena

    Science.gov (United States)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture-crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics and even begin to characterize the switching phenomena that enables a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understand switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon, by discussing data on selected examples, including heart disease and Alzheimer disease.

  10. Tempered stable laws as random walk limits

    OpenAIRE

    Chakrabarty, Arijit; Meerschaert, Mark M.

    2010-01-01

    Stable laws can be tempered by modifying the Lévy measure to cool the probability of large jumps. Tempered stable laws retain their signature power-law behavior at infinity, and infinite divisibility. This paper develops random walk models that converge to a tempered stable law under a triangular array scheme. Since tempered stable laws and processes are useful in statistical physics, these random walk models can provide a basic physical model for the underlying physical phenomena.

  11. Quantum random access memory

    OpenAIRE

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2007-01-01

    A random access memory (RAM) uses n bits to randomly address N=2^n distinct memory cells. A quantum random access memory (qRAM) uses n qubits to address any quantum superposition of N memory cells. We present an architecture that exponentially reduces the requirements for a memory call: O(log N) switches need be thrown instead of the N used in conventional (classical or quantum) RAM designs. This yields a more robust qRAM algorithm, as it in general requires entanglement among exponentially l...

  12. Randomization of inspections

    International Nuclear Information System (INIS)

    Markin, J.T.

    1989-01-01

    As the numbers and complexity of nuclear facilities increase, limitations on resources for international safeguards may restrict attainment of safeguards goals. One option for improving the efficiency of limited resources is to expand the current inspection regime to include random allocation of the amount and frequency of inspection effort to material strata or to facilities. This paper identifies the changes in safeguards policy, administrative procedures, and operational procedures that would be necessary to accommodate randomized inspections and identifies those situations where randomization can improve inspection efficiency and those situations where the current nonrandom inspections should be maintained. 9 refs., 1 tab

  13. Random phenomena; Phenomenes aleatoires

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)

    1963-07-01

    This document gathers a set of conferences presented in 1962. A first one proposes a mathematical introduction to the analysis of random phenomena. The second one presents an axiomatic of probability calculation. The third one proposes an overview of one-dimensional random variables. The fourth one addresses random pairs, and presents basic theorems regarding the algebra of mathematical expectations. The fifth conference discusses some probability laws: binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last one deals with the issues of stochastic convergence and asymptotic distributions.

  14. Introduction to stochastic processes

    CERN Document Server

    Cinlar, Erhan

    2013-01-01

    Clear presentation employs methods that recognize computer-related aspects of theory. Topics include expectations and independence, Bernoulli processes and sums of independent random variables, Markov chains, renewal theory, more. 1975 edition.

  15. Tunable random packings

    International Nuclear Information System (INIS)

    Lumay, G; Vandewalle, N

    2007-01-01

    We present an experimental protocol that allows one to tune the packing fraction η of a random pile of ferromagnetic spheres from a value close to the lower limit of random loose packing η_RLP ≅ 0.56 to the upper limit of random close packing η_RCP ≅ 0.64. This broad range of packing fraction values is obtained under normal gravity in air, by adjusting a magnetic cohesion between the grains during the formation of the pile. Attractive and repulsive magnetic interactions are found to strongly affect the internal structure and the stability of the sphere packing. After the formation of the pile, the induced cohesion is decreased continuously along a linear decreasing ramp. The controlled collapse of the pile is found to generate various and reproducible values of the random packing fraction η

  16. Random maintenance policies

    CERN Document Server

    Nakagawa, Toshio

    2014-01-01

    Exploring random maintenance models, this book provides an introduction to the implementation of random maintenance, and it is one of the first books to be written on this subject.  It aims to help readers learn new techniques for applying random policies to actual reliability models, and it provides new theoretical analyses of various models including classical replacement, preventive maintenance and inspection policies. These policies are applied to scheduling problems, backup policies of database systems, maintenance policies of cumulative damage models, and reliability of random redundant systems. Reliability theory is a major concern for engineers and managers, and in light of Japan’s recent earthquake, the reliability of large-scale systems has increased in importance. This also highlights the need for a new notion of maintenance and reliability theory, and how this can practically be applied to systems. Providing an essential guide for engineers and managers specializing in reliability maintenance a...

  17. Theory of random sets

    CERN Document Server

    Molchanov, Ilya

    2017-01-01

    This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...

  18. Random Number Generation for High Performance Computing

    Science.gov (United States)

    2015-01-01

    number streams, a quality metric for the parallel random number streams.

  19. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  20. The effect of changing stool collection processes on compliance in nationwide organized screening using a fecal occult blood test (FOBT) in Korea: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Shin, Hye Young; Suh, Mina; Baik, Hyung Won; Choi, Kui Son; Park, Boyoung; Jun, Jae Kwan; Lee, Chan Wha; Oh, Jae Hwan; Lee, You Kyoung; Han, Dong Soo; Lee, Do-Hoon

    2014-11-26

    Colorectal cancer (CRC) screening by fecal occult blood test (FOBT) significantly reduces CRC mortality, and compliance rates directly influence the efficacy of this screening method. The aim of this study is to investigate whether stool collection strategies affect compliance with the FOBT. In total, 3,596 study participants aged between 50 and 74 years will be recruited. The study will be conducted using a randomized controlled trial, with a 2 × 2 factorial design resulting in four groups. The first factor is the method of stool-collection device distribution (mailing vs. visiting the clinic) and the second is the type of stool-collection device (sampling kit vs. conventional container). Participants will be randomly assigned to one of four groups: (1) sampling kit received by mail; (2) conventional container received by mail; (3) sampling kit received at the clinic; (4) conventional container received at the clinic (control group). The primary outcome will be the FOBT compliance rate; satisfaction and intention to be rescreened in the next screening round will also be evaluated. The rates of positive FOBT results and detection of advanced adenomas or cancers through colonoscopies will also be compared between the two collection containers. Identifying a method of FOBT that yields high compliance rates will be a key determinant of the success of CRC screening. The findings of this study will provide reliable information for health policy makers to develop evidence-based strategies for a high compliance rate. KCT0000803 Date of registration in primary registry: 9 January, 2013.

  1. More randomness from the same data

    International Nuclear Information System (INIS)

    Bancal, Jean-Daniel; Sheridan, Lana; Scarani, Valerio

    2014-01-01

    Correlations that cannot be reproduced with local variables certify the generation of private randomness. Usually, the violation of a Bell inequality is used to quantify the amount of randomness produced. Here, we show how private randomness generated during a Bell test can be directly quantified from the observed correlations, without the need to process these data into an inequality. The frequency with which the different measurement settings are used during the Bell test can also be taken into account. This improved analysis turns out to be very relevant for Bell tests performed with a finite collection efficiency. In particular, applying our technique to the data of a recent experiment (Christensen et al 2013 Phys. Rev. Lett. 111 130406), we show that about twice as much randomness as previously reported can be potentially extracted from this setup. (paper)

  2. Palm theory for random time changes

    Directory of Open Access Journals (Sweden)

    Masakiyo Miyazawa

    2001-01-01

    Full Text Available Palm distributions are basic tools when studying stationarity in the context of point processes, queueing systems, fluid queues or random measures. The framework varies with the random phenomenon of interest, but usually a one-dimensional group of measure-preserving shifts is the starting point. In the present paper, by alternatively using a framework involving random time changes (RTCs and a two-dimensional family of shifts, we are able to characterize all of the above systems in a single framework. Moreover, this leads to what we call the detailed Palm distribution (DPD which is stationary with respect to a certain group of shifts. The DPD has a very natural interpretation as the distribution seen at a randomly chosen position on the extended graph of the RTC, and satisfies a general duality criterion: the DPD of the DPD gives the underlying probability P in return.

  3. Random thermal stress in concrete containments

    International Nuclear Information System (INIS)

    Singh, M.P.; Heller, R.A.

    1980-01-01

Currently, the overly conservative thermal design forces are obtained on the basis of simplified assumptions made about the temperature gradient across the containment wall. Using the method presented in this paper, a more rational and better estimate of the design forces can be obtained. Herein, the outside temperature is considered to consist of a constant mean on which yearly and daily harmonic changes plus a randomly varying part are superimposed. The random part is modeled as a stationary random process. To obtain the stresses due to random and harmonic temperatures, the complex frequency response function approach has been used. Numerical results obtained for a typical containment show that the higher frequency temperature variations, though of large magnitude, induce relatively small forces in a containment. Therefore, in a containment design, a rational separation of more effective, slowly varying temperatures, such as the seasonal cycle, from less effective but more frequently occurring daily and hourly changes, is desirable to obtain rational design forces. 7 refs
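
The temperature model described above (a constant mean, yearly and daily harmonics, plus a stationary random part) can be sketched numerically. This is only an illustration: the amplitudes, the AR(1) form chosen for the stationary random part, and the function name are assumptions for the sketch, not values from the paper.

```python
import numpy as np

def outside_temperature(days=365, seed=0):
    """Hypothetical outside-temperature record: constant mean, yearly and
    daily harmonics, plus a stationary AR(1) random part (all amplitudes
    here are illustrative, not taken from the paper)."""
    rng = np.random.default_rng(seed)
    t = np.arange(days * 24) / 24.0                 # hourly samples, time in days
    mean, a_year, a_day = 15.0, 10.0, 5.0           # illustrative amplitudes
    temp = mean + a_year * np.sin(2 * np.pi * t / 365.0) \
                + a_day * np.sin(2 * np.pi * t)
    noise = np.zeros(t.size)
    for i in range(1, t.size):                      # AR(1) noise: a simple
        noise[i] = 0.9 * noise[i - 1] + rng.normal(scale=1.0)  # stationary process
    return temp + noise

temp = outside_temperature()
```

Separating the slowly varying seasonal harmonic from the daily and random parts of such a record is exactly the decomposition the paper argues for.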

  4. What Randomized Benchmarking Actually Measures

    International Nuclear Information System (INIS)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-01-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. Here, these theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.

  5. Reconstructing random media

    International Nuclear Information System (INIS)

    Yeong, C.L.; Torquato, S.

    1998-01-01

We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distributions of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones. © 1998 The American Physical Society
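
The correlation functions that drive such reconstructions can be computed directly from a digitized medium. Below is a minimal sketch of the two-point probability function S2 for a binary (two-phase) image, computed with FFTs under periodic boundary conditions; the function name and the random test image are hypothetical, and this is only the input to a reconstruction, not the reconstruction procedure itself.

```python
import numpy as np

def two_point_correlation(img):
    """S2 along the x-axis for a binary image, via FFT autocorrelation:
    the probability that two points separated by r both lie in phase 1
    (periodic boundary conditions)."""
    f = np.fft.fft2(img)
    corr = np.fft.ifft2(f * np.conj(f)).real / img.size
    return corr[0, :]          # separations along one axis

rng = np.random.default_rng(0)
img = (rng.random((64, 64)) < 0.3).astype(float)   # uncorrelated test medium
s2 = two_point_correlation(img)
```

For a binary image, S2(0) equals the phase-1 volume fraction, and for an uncorrelated medium S2 decays toward the volume fraction squared.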

  6. Intermittency and random matrices

    Science.gov (United States)

    Sokoloff, Dmitry; Illarionov, E. A.

    2015-08-01

    A spectacular phenomenon of intermittency, i.e. a progressive growth of higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development and relations between various findings in the field remained obscure. Contemporary results from the theory of the product of independent random matrices (the Furstenberg theory) allowed the elaboration of the phenomenon of intermittency in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.
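
The growth rate at the heart of the Furstenberg theory, the top Lyapunov exponent of a product of independent random matrices, can be estimated by transporting a unit vector through the product and averaging the logarithm of its norm growth. A minimal sketch (the sampler and function names are illustrative):

```python
import numpy as np

def lyapunov_exponent(sample_matrix, n_steps=10000, seed=0):
    """Estimate the top Lyapunov exponent of a product of i.i.d. random
    matrices by tracking the growth rate of a transported unit vector."""
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_growth = 0.0
    for _ in range(n_steps):
        v = sample_matrix(rng) @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm          # renormalise to avoid overflow
    return log_growth / n_steps

# Example: 2x2 matrices with i.i.d. standard normal entries.
draw = lambda rng: rng.normal(size=(2, 2))
lam = lyapunov_exponent(draw)
```

A positive estimate signals the exponential growth regime in which intermittency of the higher moments appears.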

  7. Random quantum operations

    International Nuclear Information System (INIS)

    Bruzda, Wojciech; Cappellini, Valerio; Sommers, Hans-Juergen; Zyczkowski, Karol

    2009-01-01

    We define a natural ensemble of trace preserving, completely positive quantum maps and present algorithms to generate them at random. Spectral properties of the superoperator Φ associated with a given quantum map are investigated and a quantum analogue of the Frobenius-Perron theorem is proved. We derive a general formula for the density of eigenvalues of Φ and show the connection with the Ginibre ensemble of real non-symmetric random matrices. Numerical investigations of the spectral gap imply that a generic state of the system iterated several times by a fixed generic map converges exponentially to an invariant state
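
One standard way to sample a random trace-preserving, completely positive map, sketched here as an assumption rather than the exact ensemble defined in the paper, is to cut a random isometry into Kraus blocks (note that a plain QR decomposition is only approximately Haar-distributed without a phase correction):

```python
import numpy as np

def random_channel_kraus(d=2, k=4, seed=0):
    """Sample a random CPTP map on d-dimensional states as k Kraus
    operators obtained by cutting a random (k*d) x d isometry into blocks."""
    rng = np.random.default_rng(seed)
    g = rng.normal(size=(k * d, d)) + 1j * rng.normal(size=(k * d, d))
    q, _ = np.linalg.qr(g)            # q has orthonormal columns: q† q = I_d
    return [q[i * d:(i + 1) * d, :] for i in range(k)]

kraus = random_channel_kraus()
# Trace preservation: sum_i K_i† K_i = identity.
tp = sum(K.conj().T @ K for K in kraus)
```

Because the Kraus operators are blocks of an isometry, trace preservation holds by construction.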

  8. Random a-adic groups and random net fractals

    Energy Technology Data Exchange (ETDEWEB)

    Li Yin [Department of Mathematics, Nanjing University, Nanjing 210093 (China)], E-mail: Lyjerry7788@hotmail.com; Su Weiyi [Department of Mathematics, Nanjing University, Nanjing 210093 (China)], E-mail: suqiu@nju.edu.cn

    2008-08-15

    Based on random a-adic groups, this paper investigates the relationship between the existence conditions of a positive flow in a random network and the estimation of the Hausdorff dimension of a proper random net fractal. Subsequently we describe some particular random fractals for which our results can be applied. Finally the Mauldin and Williams theorem is shown to be very important example for a random Cantor set with application in physics as shown in E-infinity theory.

  9. Markov processes

    CERN Document Server

    Kirkwood, James R

    2015-01-01

    Review of ProbabilityShort HistoryReview of Basic Probability DefinitionsSome Common Probability DistributionsProperties of a Probability DistributionProperties of the Expected ValueExpected Value of a Random Variable with Common DistributionsGenerating FunctionsMoment Generating FunctionsExercisesDiscrete-Time, Finite-State Markov ChainsIntroductionNotationTransition MatricesDirected Graphs: Examples of Markov ChainsRandom Walk with Reflecting BoundariesGambler’s RuinEhrenfest ModelCentral Problem of Markov ChainsCondition to Ensure a Unique Equilibrium StateFinding the Equilibrium StateTransient and Recurrent StatesIndicator FunctionsPerron-Frobenius TheoremAbsorbing Markov ChainsMean First Passage TimeMean Recurrence Time and the Equilibrium StateFundamental Matrix for Regular Markov ChainsDividing a Markov Chain into Equivalence ClassesPeriodic Markov ChainsReducible Markov ChainsSummaryExercisesDiscrete-Time, Infinite-State Markov ChainsRenewal ProcessesDelayed Renewal ProcessesEquilibrium State f...

  10. [Intel random number generator-based true random number generator].

    Science.gov (United States)

    Huang, Feng; Shen, Hong

    2004-09-01

To establish a true random number generator on the basis of certain Intel chips. The random numbers were acquired by programming using Microsoft Visual C++ 6.0 via register reading from the random number generator (RNG) unit of an Intel 815 chipset-based computer with Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 and χ² R-squared tests, and the results showed that the random numbers it generated satisfied the demands of independence and uniform distribution. We also statistically compared the random numbers generated by the Intel RNG-based true random number generator with those from a random number table, using the same amount of 7500 random numbers in the same value domain, which showed that the SD, SE and CV of the Intel RNG-based random number generator were less than those of the random number table. The result of the u-test of the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and solves some problems with random number tables in the acquisition of random numbers.
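
The FIPS 140-1 monobit test mentioned above is simple to sketch: for a 20,000-bit stream, the number of ones must fall strictly between 9654 and 10346. A minimal illustration, using Python's pseudo-random generator as a stand-in bit source:

```python
import random

def monobit_pass(bits):
    """FIPS 140-1 monobit test: for a 20,000-bit stream the number of
    ones must lie strictly between 9654 and 10346."""
    assert len(bits) == 20000
    ones = sum(bits)
    return 9654 < ones < 10346

random.seed(1)
bits = [random.getrandbits(1) for _ in range(20000)]
```

A hardware source under test would simply replace the `getrandbits` stand-in with bits read from the device.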

  11. Uniform random number generators

    Science.gov (United States)

    Farr, W. R.

    1971-01-01

    Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
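
The normal-distribution step attributed to Marsaglia and Bray above can be illustrated with the polar variant of their method: rejection-sample a point in the unit disc and transform it into two independent standard normals. This is a modern Python sketch, not the original Fortran subprograms:

```python
import random
import math

def polar_normal_pair(rng=random):
    """Marsaglia-Bray polar method: rejection-sample a point in the unit
    disc, then transform it into two independent standard normals."""
    while True:
        u = 2.0 * rng.random() - 1.0
        v = 2.0 * rng.random() - 1.0
        s = u * u + v * v
        if 0.0 < s < 1.0:                       # accept points inside the disc
            f = math.sqrt(-2.0 * math.log(s) / s)
            return u * f, v * f
```

The uniform deviates feeding the method would come from a mixed congruential generator in the original setting; any uniform source works.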

  12. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as ...

  13. Random eigenvalue problems revisited

    Indian Academy of Sciences (India)

statistical distributions; linear stochastic systems. 1. ... dimensional multivariate Gaussian random vector with mean μ ∈ R^m and covariance ... 5, the proposed analytical methods are applied to a three degree-of-freedom system and the ... The joint pdf of ω1 and ω3 is however close to a bivariate Gaussian density function.

  14. Generating random numbers by means of nonlinear dynamic systems

    Science.gov (United States)

    Zang, Jiaqi; Hu, Haojie; Zhong, Juhua; Luo, Duanbin; Fang, Yi

    2018-07-01

To introduce the randomness of a physical process to students, a chaotic pendulum experiment was opened in East China University of Science and Technology (ECUST) at the undergraduate level in the physics department. It was shown that chaotic motion could be initiated through adjusting the operation of a chaotic pendulum. By using the data of the angular displacements of the chaotic motion, random binary numerical arrays can be generated. To check the randomness of the generated numerical arrays, the NIST Special Publication 800-20 method was adopted. As a result, it was found that all the random arrays generated by the chaotic motion could pass the validity criteria, and some of them were even better in quality than pseudo-random numbers generated by a computer. Through the experiments, it is demonstrated that a chaotic pendulum can be used as an efficient mechanical facility for generating random numbers, and can be applied in teaching random motion to students.
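
The bit-extraction step can be illustrated with a different chaotic system, since the pendulum data are not reproducible here. The following sketch thresholds iterates of a logistic map; the choice of map, threshold, and parameters are stand-in assumptions, not the paper's procedure.

```python
def chaotic_bits(n, x=0.1234, r=3.99):
    """Extract a bit stream from the chaotic logistic map x -> r*x*(1-x)
    by thresholding the state (a stand-in for the pendulum angle data)."""
    bits = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = chaotic_bits(20000)
```

Streams produced this way would then be passed through a statistical test suite, exactly as the pendulum-derived arrays were.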

  15. Alzheimer random walk

    Science.gov (United States)

    Odagaki, Takashi; Kasuya, Keisuke

    2017-09-01

Using the Monte Carlo simulation, we investigate a memory-impaired self-avoiding walk on a square lattice in which a random walker marks each of the sites visited with a given probability p and makes a random walk avoiding the marked sites. Namely, p = 0 and p = 1 correspond to the simple random walk and the self-avoiding walk, respectively. When p > 0, there is a finite probability that the walker is trapped. We show that the trap time distribution can be well fitted by Stacy's Weibull distribution b(a/b)^{(a+1)/b} [Γ((a+1)/b)]^{-1} x^a exp(-(a/b)x^b), where a and b are fitting parameters depending on p. We also find that the mean trap time diverges at p = 0 as p^{-α} with α = 1.89. In order to produce a sufficient number of long walks, we exploit the pivot algorithm and obtain the mean square displacement and its Flory exponent ν(p) as functions of p. We find that the exponent determined for 1000-step walks interpolates between both limits, ν(0) for the simple random walk and ν(1) for the self-avoiding walk, as [ν(p) - ν(0)] / [ν(1) - ν(0)] = p^β with β = 0.388 when p ≪ 0.1 and β = 0.0822 when p ≫ 0.1. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
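
The walk model itself is straightforward to simulate. A minimal sketch of one realization up to trapping (the function name and step cap are illustrative):

```python
import random

def trap_time(p, max_steps=100000, rng=random):
    """Walk on Z^2; each visited site is marked with probability p, and
    marked sites may never be revisited.  Returns the number of steps
    taken before the walker is trapped (all four neighbours marked)."""
    marked = set()
    x, y = 0, 0
    for step in range(max_steps):
        if rng.random() < p:
            marked.add((x, y))             # remember this site with prob. p
        moves = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        free = [m for m in moves if m not in marked]
        if not free:
            return step                    # trapped: no unmarked neighbour
        x, y = rng.choice(free)
    return max_steps                       # never trapped within the cap
```

With p = 0 no site is ever marked (simple random walk, never trapped); with p = 1 the walk is self-avoiding and is eventually trapped.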

  16. Random walk of the baryon number

    International Nuclear Information System (INIS)

    Kazaryan, A.M.; Khlebnikov, S.Y.; Shaposhnikov, M.E.

    1989-01-01

A new approach is suggested for the anomalous nonconservation of baryon number in the electroweak theory at high temperatures. Arguments are presented in support of the idea that the baryon-number changing reactions may be viewed as random Markov processes. Making use of the general theory of Markov processes, the Fokker-Planck equation for the baryon-number distribution density is obtained and kinetic coefficients are calculated

  17. Brownian motion, dynamical randomness and irreversibility

    International Nuclear Information System (INIS)

    Gaspard, Pierre

    2005-01-01

    A relationship giving the entropy production as the difference between a time-reversed entropy per unit time and the standard one is applied to stochastic processes of diffusion of Brownian particles between two reservoirs at different concentrations. The entropy production in the nonequilibrium steady state is interpreted in terms of a time asymmetry in the dynamical randomness between the forward and backward paths of the diffusion process

  18. Generating equilateral random polygons in confinement II

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue an earlier study (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202) on the generation algorithms of random equilateral polygons confined in a sphere. Here, the equilateral random polygons are rooted at the center of the confining sphere and the confining sphere behaves like an absorbing boundary. One way to generate such a random polygon is the accept/reject method in which an unconditioned equilateral random polygon rooted at origin is generated. The polygon is accepted if it is within the confining sphere, otherwise it is rejected and the process is repeated. The algorithm proposed in this paper offers an alternative to the accept/reject method, yielding a faster generation process when the confining sphere is small. In order to use this algorithm effectively, a large, reusable data set needs to be pre-computed only once. We derive the theoretical distribution of the given random polygon model and demonstrate, with strong numerical evidence, that our implementation of the algorithm follows this distribution. A run time analysis and a numerical error estimate are given at the end of the paper. (paper)

  19. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
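
The vertex-by-vertex accept/reject step described above (a uniform unit-step direction, redrawn until the walk stays inside the sphere) can be sketched as follows. As the paper notes, this simple rule does not produce a uniform vertex distribution; the confinement radius and function name are illustrative.

```python
import numpy as np

def confined_walk(n, radius, seed=0):
    """Rooted equilateral random walk confined to a sphere: each unit-step
    direction is drawn uniformly and redrawn (accept/reject) until the
    new vertex stays inside the confinement radius."""
    rng = np.random.default_rng(seed)
    verts = [np.zeros(3)]
    while len(verts) <= n:
        step = rng.normal(size=3)
        step /= np.linalg.norm(step)        # uniform direction on the sphere
        cand = verts[-1] + step
        if np.linalg.norm(cand) <= radius:  # reject steps leaving the sphere
            verts.append(cand)
    return np.array(verts)

walk = confined_walk(50, radius=2.0)
```

Closing such a walk into an equilateral polygon requires the additional machinery developed in the paper; this sketch covers only the confined-walk half.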

  20. An integrable low-cost hardware random number generator

    Science.gov (United States)

    Ranasinghe, Damith C.; Lim, Daihyun; Devadas, Srinivas; Jamali, Behnam; Zhu, Zheng; Cole, Peter H.

    2005-02-01

A hardware random number generator is different from a pseudo-random number generator; a pseudo-random number generator approximates the assumed behavior of a real hardware random number generator. Simple pseudo-random number generators suffice for most applications; however, demanding situations such as the generation of cryptographic keys require an efficient and cost-effective source of random numbers. Arbiter-based Physical Unclonable Functions (PUFs), proposed for physical authentication of ICs, exploit statistical delay variation of wires and transistors across integrated circuits, as a result of process variations, to build a secret key unique to each IC. Experimental results and theoretical studies show that a sufficient amount of variation exists across ICs. This variation enables each IC to be identified securely. It is possible to exploit the unreliability of these PUF responses to build a physical random number generator. There exists measurement noise, which comes from the instability of an arbiter when it is in a racing condition. There exist challenges whose responses are unpredictable. Without environmental variations, the responses of these challenges are random in repeated measurements. Compared to other physical random number generators, PUF-based random number generators can be a compact and low-power solution, since the generator need only be turned on when required. A 64-stage PUF circuit costs less than 1000 gates, and the circuit can be implemented using a standard IC manufacturing process. In this paper we have presented a fast and efficient random number generator, and analysed the quality of the random numbers produced using an array of tests used by the National Institute of Standards and Technology to evaluate the randomness of random number generators designed for cryptographic applications.

  1. A matrix contraction process

    Science.gov (United States)

    Wilkinson, Michael; Grant, John

    2018-03-01

We consider a stochastic process in which independent identically distributed random matrices are multiplied and where the Lyapunov exponent of the product is positive. We continue multiplying the random matrices as long as the norm, ɛ, of the product is less than unity. If the norm is greater than unity we reset the matrix to a multiple of the identity and then continue the multiplication. We address the problem of determining the probability density function of the norm, ɛ.
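
The reset dynamics can be simulated directly. The sketch below multiplies i.i.d. Gaussian 2×2 matrices (an illustrative ensemble with positive Lyapunov exponent, not necessarily the one in the paper) and resets the product to a multiple of the identity whenever its norm reaches unity:

```python
import numpy as np

def contraction_process(n_resets=200, eps0=0.5, seed=0):
    """Multiply i.i.d. Gaussian 2x2 matrices; whenever the product's
    spectral norm reaches 1, record that norm and reset the product to
    eps0 times the identity."""
    rng = np.random.default_rng(seed)
    prod = eps0 * np.eye(2)
    norms = []
    while len(norms) < n_resets:
        prod = rng.normal(size=(2, 2)) @ prod
        eps = np.linalg.norm(prod, 2)       # spectral norm of the product
        if eps >= 1.0:
            norms.append(eps)
            prod = eps0 * np.eye(2)         # reset to a multiple of identity
    return np.array(norms)

norms = contraction_process()
```

Because the Lyapunov exponent is positive, the norm escapes to unity almost surely, so the resets recur and a histogram of ɛ between resets approximates the stationary density studied in the paper.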

  2. Random Interchange of Magnetic Connectivity

    Science.gov (United States)

    Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.

    2015-12-01

    Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to standard field line random walk. Quantitative estimates are provided, for transverse magnetic field models and anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and close field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)

  3. Entanglement dynamics in random media

    Science.gov (United States)

    Menezes, G.; Svaiter, N. F.; Zarro, C. A. D.

    2017-12-01

    We study how the entanglement dynamics between two-level atoms is impacted by random fluctuations of the light cone. In our model the two-atom system is envisaged as an open system coupled with an electromagnetic field in the vacuum state. We employ the quantum master equation in the Born-Markov approximation in order to describe the completely positive time evolution of the atomic system. We restrict our investigations to the situation in which the atoms are coupled individually to two spatially separated cavities, one of which displays the emergence of light-cone fluctuations. In such a disordered cavity, we assume that the coefficients of the Klein-Gordon equation are random functions of the spatial coordinates. The disordered medium is modeled by a centered, stationary, and Gaussian process. We demonstrate that disorder has the effect of slowing down the entanglement decay. We conjecture that in a strong-disorder environment the mean life of entangled states can be enhanced in such a way as to almost completely suppress quantum nonlocal decoherence.

  4. The physics of randomness and regularities for languages (lifetimes, family trees, and the second languages); in terms of random matrices

    OpenAIRE

    Tuncay, Caglar

    2007-01-01

    The physics of randomness and regularities for languages (mother tongues) and their lifetimes and family trees and for the second languages are studied in terms of two opposite processes; random multiplicative noise [1], and fragmentation [2], where the original model is given in the matrix format. We start with a random initial world, and come out with the regularities, which mimic various empirical data [3] for the present languages.

  5. Free random variables

    CERN Document Server

    Voiculescu, Dan; Nica, Alexandru

    1992-01-01

    This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.

  6. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  7. On Complex Random Variables

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2012-07-01

Full Text Available In this paper, it is shown that a complex multivariate random variable  is a complex multivariate normal random variable of dimensionality if and only if all nondegenerate complex linear combinations of  have a complex univariate normal distribution. The characteristic function of  has been derived, and simpler forms of some theorems have been given using this characterization theorem without assuming that the variance-covariance matrix of the vector  is Hermitian positive definite. Marginal distributions of  have been given. In addition, a complex multivariate t-distribution has been defined and the density derived. A characterization of the complex multivariate t-distribution is given. A few possible uses of this distribution have been suggested.

  8. Effect and process evaluation of a kindergarten-based, family-involved intervention with a randomized cluster design on sedentary behaviour in 4- to 6- year old European preschool children: The ToyBox-study.

    Science.gov (United States)

    Latomme, Julie; Cardon, Greet; De Bourdeaudhuij, Ilse; Iotova, Violeta; Koletzko, Berthold; Socha, Piotr; Moreno, Luis; Androutsos, Odysseas; Manios, Yannis; De Craemer, Marieke

    2017-01-01

The aim of the present study was to evaluate the effect and process of the ToyBox-intervention on proxy-reported sedentary behaviours in 4- to 6-year-old preschoolers from six European countries. In total, 2434 preschoolers' parents/primary caregivers (mean age: 4.7±0.4 years, 52.2% boys) filled out a questionnaire, assessing preschoolers' sedentary behaviours (TV/DVD/video viewing, computer/video games use and quiet play) on weekdays and weekend days. Multilevel repeated measures analyses were conducted to measure the intervention effects. Additionally, process evaluation data were included to better understand the intervention effects. Positive intervention effects were found for computer/video games use. In the total sample, the intervention group showed a smaller increase in computer/video games use on weekdays (ß = -3.40, p = 0.06; intervention: +5.48 min/day, control: +8.89 min/day) and on weekend days (ß = -5.97, p = 0.05; intervention: +9.46 min/day, control: +15.43 min/day) from baseline to follow-up, compared to the control group. Country-specific analyses showed similar effects in Belgium and Bulgaria, while no significant intervention effects were found in the other countries. Process evaluation data showed relatively low teachers' and low parents' process evaluation scores for the sedentary behaviour component of the intervention (mean: 15.6/24, range: 2.5-23.5 and mean: 8.7/17, range: 0-17, respectively). Higher parents' process evaluation scores were related to a larger intervention effect, but higher teachers' process evaluation scores were not. The ToyBox-intervention had a small, positive effect on European preschoolers' computer/video games use on both weekdays and weekend days, but not on TV/DVD/video viewing or quiet play. The lack of larger effects can possibly be due to the fact that parents were only passively involved in the intervention and to the fact that the intervention was too demanding for the teachers. Future interventions targeting

  9. Effect and process evaluation of a kindergarten-based, family-involved intervention with a randomized cluster design on sedentary behaviour in 4- to 6- year old European preschool children: The ToyBox-study.

    Directory of Open Access Journals (Sweden)

    Julie Latomme

    Full Text Available The present study evaluated the effect and process of the ToyBox-intervention on proxy-reported sedentary behaviours in 4- to 6-year-old preschoolers from six European countries. In total, 2434 preschoolers' parents/primary caregivers (mean age: 4.7±0.4 years, 52.2% boys) filled out a questionnaire assessing preschoolers' sedentary behaviours (TV/DVD/video viewing, computer/video games use and quiet play) on weekdays and weekend days. Multilevel repeated measures analyses were conducted to measure the intervention effects. Additionally, process evaluation data were included to better understand the intervention effects. Positive intervention effects were found for computer/video games use. In the total sample, the intervention group showed a smaller increase in computer/video games use on weekdays (β = -3.40, p = 0.06; intervention: +5.48 min/day, control: +8.89 min/day) and on weekend days (β = -5.97, p = 0.05; intervention: +9.46 min/day, control: +15.43 min/day) from baseline to follow-up, compared to the control group. Country-specific analyses showed similar effects in Belgium and Bulgaria, while no significant intervention effects were found in the other countries. Process evaluation data showed relatively low teachers' and parents' process evaluation scores for the sedentary behaviour component of the intervention (mean: 15.6/24, range: 2.5-23.5 and mean: 8.7/17, range: 0-17, respectively). Higher parents' process evaluation scores were related to a larger intervention effect, but higher teachers' process evaluation scores were not. The ToyBox-intervention had a small, positive effect on European preschoolers' computer/video games use on both weekdays and weekend days, but not on TV/DVD/video viewing or quiet play. The lack of larger effects may be because parents were only passively involved in the intervention and because the intervention was too demanding for the teachers. Future interventions

  10. Effect and process evaluation of a kindergarten-based, family-involved intervention with a randomized cluster design on sedentary behaviour in 4- to 6- year old European preschool children: The ToyBox-study

    Science.gov (United States)

    Latomme, Julie; Cardon, Greet; De Bourdeaudhuij, Ilse; Iotova, Violeta; Koletzko, Berthold; Socha, Piotr; Moreno, Luis; Androutsos, Odysseas; Manios, Yannis; De Craemer, Marieke

    2017-01-01

    Background The present study evaluated the effect and process of the ToyBox-intervention on proxy-reported sedentary behaviours in 4- to 6-year-old preschoolers from six European countries. Methods In total, 2434 preschoolers’ parents/primary caregivers (mean age: 4.7±0.4 years, 52.2% boys) filled out a questionnaire assessing preschoolers’ sedentary behaviours (TV/DVD/video viewing, computer/video games use and quiet play) on weekdays and weekend days. Multilevel repeated measures analyses were conducted to measure the intervention effects. Additionally, process evaluation data were included to better understand the intervention effects. Results Positive intervention effects were found for computer/video games use. In the total sample, the intervention group showed a smaller increase in computer/video games use on weekdays (β = -3.40, p = 0.06; intervention: +5.48 min/day, control: +8.89 min/day) and on weekend days (β = -5.97, p = 0.05; intervention: +9.46 min/day, control: +15.43 min/day) from baseline to follow-up, compared to the control group. Country-specific analyses showed similar effects in Belgium and Bulgaria, while no significant intervention effects were found in the other countries. Process evaluation data showed relatively low teachers’ and parents’ process evaluation scores for the sedentary behaviour component of the intervention (mean: 15.6/24, range: 2.5–23.5 and mean: 8.7/17, range: 0–17, respectively). Higher parents’ process evaluation scores were related to a larger intervention effect, but higher teachers’ process evaluation scores were not. Conclusions The ToyBox-intervention had a small, positive effect on European preschoolers’ computer/video games use on both weekdays and weekend days, but not on TV/DVD/video viewing or quiet play. The lack of larger effects may be because parents were only passively involved in the intervention and because the intervention was too

  11. Cross over of recurrence networks to random graphs and random ...

    Indian Academy of Sciences (India)

    2017-01-27

    Jan 27, 2017 ... that all recurrence networks can cross over to random geometric graphs by adding sufficient amount of noise to .... municative [19] or social [20], deviate from the random ..... He has shown that the spatial effects become.

  12. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
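
    The linear congruential recurrence such a program tests can be sketched in a few lines. The multiplier, increment, and modulus below are the widely used "Numerical Recipes" constants, chosen only for illustration; they are not the parameters selected by the RANDOM program itself.

```python
# Sketch of a linear congruential generator (LCG): x_{n+1} = (a*x_n + c) mod m.
# The constants are the classic "Numerical Recipes" choices, used purely as an
# illustration; the RANDOM program's actual parameters are not reproduced here.
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # uniform variate in [0, 1)

gen = lcg(seed=42)
sample = [next(gen) for _ in range(5)]  # five reproducible uniform variates
```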

  13. A random number generator for continuous random variables

    Science.gov (United States)

    Guerra, V. M.; Tapia, R. A.; Thompson, J. R.

    1972-01-01

    A FORTRAN 4 routine is given which may be used to generate random observations of a continuous real-valued random variable. Results for the normal distribution (F(x), X, E(Akima), and E(linear)) are presented in tabular form.
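
    The core idea behind such a routine is inverse-transform sampling: pass a uniform variate through the inverse of the target CDF. The exponential distribution below is a stand-in with a closed-form inverse; the routine's tabulated normal distribution and its Akima/linear interpolation schemes are not reproduced.

```python
# Inverse-transform sampling: if U ~ Uniform(0,1) and F is a continuous CDF,
# then F^{-1}(U) is distributed according to F. The exponential distribution
# serves here as an example with a closed-form inverse CDF.
import math
import random

def draw_exponential(rate, rng):
    u = rng.random()
    return -math.log(1.0 - u) / rate  # F^{-1}(u) for F(x) = 1 - exp(-rate*x)

rng = random.Random(0)
samples = [draw_exponential(2.0, rng) for _ in range(2000)]
mean = sum(samples) / len(samples)  # should be near 1/rate = 0.5
```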

  14. Random and cooperative sequential adsorption

    Science.gov (United States)

    Evans, J. W.

    1993-10-01

    Irreversible random sequential adsorption (RSA) on lattices, and continuum "car parking" analogues, have long received attention as models for reactions on polymer chains, chemisorption on single-crystal surfaces, adsorption in colloidal systems, and solid state transformations. Cooperative generalizations of these models (CSA) are sometimes more appropriate, and can exhibit richer kinetics and spatial structure, e.g., autocatalysis and clustering. The distribution of filled or transformed sites in RSA and CSA is not described by an equilibrium Gibbs measure. This is the case even for the saturation "jammed" state of models where the lattice or space cannot fill completely. However exact analysis is often possible in one dimension, and a variety of powerful analytic methods have been developed for higher dimensional models. Here we review the detailed understanding of asymptotic kinetics, spatial correlations, percolative structure, etc., which is emerging for these far-from-equilibrium processes.
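
    The 1D lattice case admits a quick simulation. The sketch below deposits dimers at random until the lattice jams; the exact jamming coverage for this model, 1 - e^(-2) ≈ 0.8647, emerges from the simulation. The attempt budget is an ad hoc choice that comfortably reaches the jammed state.

```python
# Random sequential adsorption (RSA) of dimers on a 1D lattice: a dimer lands
# on a random pair of adjacent sites and sticks only if both are empty; once
# no adjacent empty pair remains, the lattice is jammed. The exact jamming
# coverage is 1 - exp(-2) ≈ 0.8647.
import random

def rsa_dimer_coverage(n_sites, attempts_per_site, rng):
    occupied = [False] * n_sites
    for _ in range(attempts_per_site * n_sites):
        i = rng.randrange(n_sites - 1)
        if not occupied[i] and not occupied[i + 1]:
            occupied[i] = occupied[i + 1] = True
    return sum(occupied) / n_sites

coverage = rsa_dimer_coverage(20_000, 25, random.Random(1))
```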

  15. Random skew plane partitions with a piecewise periodic back wall

    DEFF Research Database (Denmark)

    Boutillier, Cedric; Mkrtchyan, Sevak; Reshetikhin, Nicolai

    Random skew plane partitions of large size distributed according to an appropriately scaled Schur process develop limit shapes. In the present work we consider the limit of large random skew plane partitions where the inner boundary approaches a piecewise linear curve with non-lattice slopes. Muc...

  16. A random energy model for size dependence : recurrence vs. transience

    NARCIS (Netherlands)

    Külske, Christof

    1998-01-01

    We investigate the size dependence of disordered spin models having an infinite number of Gibbs measures in the framework of a simplified 'random energy model for size dependence'. We introduce two versions (involving either independent random walks or branching processes) that can be seen as

  17. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

    In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  18. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    This paper describes methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a diagnostic for assessing the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  19. On a randomly imperfect spherical cap pressurized by a random ...

    African Journals Online (AJOL)

    In this paper, we investigate a dynamical system in a random setting of dual randomness in space and time variables, in which both the imperfection of the structure and the load function are considered random, each with a statistical zero mean. The auto-covariance of the load is correlated as an exponentially decaying ...

  20. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  1. A comparison of random walks in dependent random environments

    NARCIS (Netherlands)

    Scheinhardt, Willem R.W.; Kroese, Dirk

    We provide exact computations for the drift of random walks in dependent random environments, including $k$-dependent and moving average environments. We show how the drift can be characterized and evaluated using Perron–Frobenius theory. Comparing random walks in various dependent environments, we

  2. Filling of a Poisson trap by a population of random intermittent searchers

    KAUST Repository

    Bressloff, Paul C.; Newby, Jay M.

    2012-01-01

    We extend the continuum theory of random intermittent search processes to the case of N independent searchers looking to deliver cargo to a single hidden target located somewhere on a semi-infinite track. Each searcher randomly switches between a

  3. Vacuum instability in a random electric field

    International Nuclear Information System (INIS)

    Krive, I.V.; Pastur, L.A.

    1984-01-01

    The reaction of the vacuum on an intense spatially homogeneous random electric field is investigated. It is shown that a stochastic electric field always causes a breakdown of the boson vacuum, and the number of pairs of particles which are created by the electric field increases exponentially in time. For the choice of potential field in the form of a dichotomic random process we find in explicit form the dependence of the average number of pairs of particles on the time of the action of the source of the stochastic field. For the fermion vacuum the average number of pairs of particles which are created by the field in the lowest order of perturbation theory in the amplitude of the random field is independent of time

  4. Diffusion in randomly perturbed dissipative dynamics

    Science.gov (United States)

    Rodrigues, Christian S.; Chechkin, Aleksei V.; de Moura, Alessandro P. S.; Grebogi, Celso; Klages, Rainer

    2014-11-01

    Dynamical systems having many coexisting attractors present interesting properties from both fundamental theoretical and modelling points of view. When such dynamics is under bounded random perturbations, the basins of attraction are no longer invariant and there is the possibility of transport among them. Here we introduce a basic theoretical setting which enables us to study this hopping process from the perspective of anomalous transport using the concept of a random dynamical system with holes. We apply it to a simple model by investigating the role of hyperbolicity for the transport among basins. We show numerically that our system exhibits non-Gaussian position distributions, power-law escape times, and subdiffusion. Our simulation results are reproduced consistently from stochastic continuous time random walk theory.

  5. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    Full Text Available In this paper, we present a systematic study of the HOG feature, one of the most widely used features in modern computer vision and image processing applications, and report several insights into it. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor, termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
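
    The random projection step can be sketched generically: multiply a feature vector by a matrix with i.i.d. Gaussian entries, which approximately preserves pairwise Euclidean distances (the Johnson-Lindenstrauss property). The dimensions below are arbitrary; nothing here reproduces the IHRP descriptor itself.

```python
# Generic random projection: y = R x, with R having i.i.d. N(0, 1/k) entries
# (k = output dimension). Such projections approximately preserve Euclidean
# geometry, which is why they can compress gradient-magnitude features.
import math
import random

def random_projection_matrix(k, d, rng):
    scale = 1.0 / math.sqrt(k)
    return [[rng.gauss(0.0, scale) for _ in range(d)] for _ in range(k)]

def project(R, x):
    return [sum(r_i * x_i for r_i, x_i in zip(row, x)) for row in R]

rng = random.Random(5)
d, k = 64, 16
R = random_projection_matrix(k, d, rng)
x = [rng.random() for _ in range(d)]  # stand-in feature vector
y = project(R, x)                     # compressed representation of x
```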

  6. Random numbers spring from alpha decay

    International Nuclear Information System (INIS)

    Frigerio, N.A.; Sanathanan, L.P.; Morley, M.; Clark, N.A.; Tyler, S.A.

    1980-05-01

    Congruential random number generators, which are widely used in Monte Carlo simulations, are deficient in that the numbers they generate are concentrated in a relatively small number of hyperplanes. While this deficiency may not be a limitation in small Monte Carlo studies involving a few variables, it introduces a significant bias in large simulations requiring high resolution. This bias was recognized and assessed during preparations for an accident analysis study of nuclear power plants. This report describes a random number device based on the radioactive decay of alpha particles from a 235U source in a high-resolution gas proportional counter. The signals were fed to a 4096-channel analyzer and for each channel the frequency of signals registered in a 20,000-microsecond interval was recorded. The parity bits of these frequency counts (0 for an even count and 1 for an odd count) were then assembled in sequence to form 31-bit binary random numbers and transcribed to a magnetic tape. This cycle was repeated as many times as were necessary to create 3 million random numbers. The frequency distribution of counts from the present device conforms to the Brockwell-Moyal distribution, which takes into account the dead time of the counter (both the dead time and decay constant of the underlying Poisson process were estimated). Analysis of the count data and tests of randomness on a sample set of the 31-bit binary numbers indicate that this random number device is a highly reliable source of truly random numbers. Its use is, therefore, recommended in Monte Carlo simulations for which the congruential pseudorandom number generators are found to be inadequate. 6 figures, 5 tables
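
    The parity-bit assembly step described above is simple to sketch. The simulated counter readings below stand in for the alpha-decay counts and are an assumption for illustration; the real device drew them from a 235U source in a gas proportional counter.

```python
# Pack the parity bits of successive counts into 31-bit integers, as the
# report describes. The counts here are simulated stand-ins for the
# alpha-decay counter readings.
import random

def parity_bits_to_ints(counts, width=31):
    numbers, acc, nbits = [], 0, 0
    for c in counts:
        acc = (acc << 1) | (c & 1)  # parity bit: 0 for an even count, 1 for odd
        nbits += 1
        if nbits == width:
            numbers.append(acc)
            acc, nbits = 0, 0
    return numbers

rng = random.Random(3)
counts = [rng.randint(0, 1000) for _ in range(31 * 4)]  # simulated readings
nums = parity_bits_to_ints(counts)  # four 31-bit random integers
```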

  7. Randomness at the root of things 1: Random walks

    Science.gov (United States)

    Ogborn, Jon; Collins, Simon; Brown, Mick

    2003-09-01

    This is the first of a pair of articles about randomness in physics. In this article, we use some variations on the idea of a `random walk' to consider first the path of a particle in Brownian motion, and then the random variation to be expected in radioactive decay. The arguments are set in the context of the general importance of randomness both in physics and in everyday life. We think that the ideas could usefully form part of students' A-level work on random decay and quantum phenomena, as well as being good for their general education. In the second article we offer a novel and simple approach to Poisson sequences.
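
    The "random walk" of the title is easy to simulate; the diffusive signature is that the mean squared displacement grows linearly with the number of steps.

```python
# 1D random walk of N steps of ±1: the mean displacement is ~0 while the
# mean squared displacement equals N, the hallmark of diffusion (and of
# Brownian motion in the continuum limit).
import random

def walk(n_steps, rng):
    x = 0
    for _ in range(n_steps):
        x += rng.choice((-1, 1))
    return x

rng = random.Random(7)
n_steps, trials = 100, 2000
msd = sum(walk(n_steps, rng) ** 2 for _ in range(trials)) / trials
# msd should be close to n_steps = 100
```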

  8. Randomized central limit theorems: A unified theory.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  9. Strong Decomposition of Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.

    2007-01-01

    A random variable X is strongly decomposable if X=Y+Z, where Y=Φ(X) and Z=X-Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability of a discrete random variable.

  10. Random Numbers and Quantum Computers

    Science.gov (United States)

    McCartney, Mark; Glass, David

    2002-01-01

    The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…
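
    The logistic map mentioned above is a one-line recurrence; at parameter r = 4 its orbit is chaotic, which is what makes it a candidate (if a flawed one) pseudo-random source.

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n). At r = 4 the orbit is
# chaotic and stays in [0, 1], so it is sometimes used as a toy
# pseudo-random sequence, as the article notes.
def logistic(x0, n, r=4.0):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

seq = logistic(0.123, 5)  # first iterate is 4 * 0.123 * 0.877 = 0.431484
```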

  11. Random number generation based on digital differential chaos

    KAUST Repository

    Zidan, Mohammed A.; Radwan, Ahmed G.; Salama, Khaled N.

    2012-01-01

    In this paper, we present a fully digital differential chaos based random number generator. The output of the digital circuit is proved to be chaotic by calculating the output time series maximum Lyapunov exponent. We introduce a new post processing

  12. Random walk loop soup

    OpenAIRE

    Lawler, Gregory F.; Ferreras, José A. Trujillo

    2004-01-01

    The Brownian loop soup introduced in Lawler and Werner (2004) is a Poissonian realization from a sigma-finite measure on unrooted loops. This measure satisfies both conformal invariance and a restriction property. In this paper, we define a random walk loop soup and show that it converges to the Brownian loop soup. In fact, we give a strong approximation result making use of the strong approximation result of Komlós, Major, and Tusnády. To make the paper self-contained, we include a proof...

  13. Random matrix theory

    CERN Document Server

    Deift, Percy

    2009-01-01

    This book features a unified derivation of the mathematical theory of the three classical types of invariant random matrix ensembles: orthogonal, unitary, and symplectic. The authors follow the approach of Tracy and Widom, but the exposition here contains a substantial amount of additional material, in particular, facts from functional analysis and the theory of Pfaffians. The main result in the book is a proof of universality for orthogonal and symplectic ensembles corresponding to generalized Gaussian type weights, following the authors' prior work. New, quantitative error estimates are derived

  14. On random unitary channels

    International Nuclear Information System (INIS)

    Audenaert, Koenraad M R; Scheel, Stefan

    2008-01-01

    In this paper, we provide necessary and sufficient conditions for a completely positive trace-preserving (CPT) map to be decomposable into a convex combination of unitary maps. Additionally, we set out to define a proper distance measure between a given CPT map and the set of random unitary maps, and methods for calculating it. In this way one could determine whether non-classical error mechanisms such as spontaneous decay or photon loss dominate over classical uncertainties, for example, in a phase parameter. The present paper is a step towards achieving this goal

  15. Drawing a random number

    DEFF Research Database (Denmark)

    Wanscher, Jørgen Bundgaard; Sørensen, Majken Vildrik

    2006-01-01

    Random numbers are used for a great variety of applications in almost any field of computer and economic sciences today. Examples range from stock market forecasting in economics, through stochastic traffic modelling in operations research, to photon and ray tracing in graphics. The construction... distributions into others with most of the required characteristics. In essence, a uniform sequence is transformed into a new sequence with the required distribution. The subject of this article is to consider the well known highly uniform Halton sequence and modifications to it. The intent is to generate...

  16. Benford's law and continuous dependent random variables

    Science.gov (United States)

    Becker, Thealexa; Burt, David; Corcoran, Taylor C.; Greaves-Tunnell, Alec; Iafrate, Joseph R.; Jing, Joy; Miller, Steven J.; Porfilio, Jaclyn D.; Ronan, Ryan; Samranvedhya, Jirapat; Strauch, Frederick W.; Talbut, Blaine

    2018-01-01

    Many mathematical, man-made and natural systems exhibit a leading-digit bias, where a first digit (base 10) of 1 occurs not 11% of the time, as one would expect if all digits were equally likely, but rather 30%. This phenomenon is known as Benford's Law. Analyzing which datasets adhere to Benford's Law and how quickly Benford behavior sets in are the two most important problems in the field. Most previous work studied systems of independent random variables, and relied on the independence in their analyses. Inspired by natural processes such as particle decay, we study the dependent random variables that emerge from models of decomposition of conserved quantities. We prove that in many instances the distribution of lengths of the resulting pieces converges to Benford behavior as the number of divisions grows, and give several conjectures for other fragmentation processes. The main difficulty is that the resulting random variables are dependent. We handle this by using tools from Fourier analysis and irrationality exponents to obtain quantified convergence rates, as well as by introducing and developing techniques to measure and control the dependencies. The construction of these tools is one of the major motivations of this work, as our approach can be applied to many other dependent systems. As an example, we show that the n! entries in the determinant expansions of n × n matrices with entries independently drawn from nice random variables converge to Benford's Law.
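
    Benford's Law itself is easy to state and check numerically: the leading digit d appears with probability log10(1 + 1/d). The powers of 2, a standard textbook example not taken from this paper, illustrate it well.

```python
# Benford check on a classic example: the leading digits of 2^n follow
# Benford's Law, so digit 1 should appear with frequency log10(2) ≈ 0.301.
import math
from collections import Counter

def leading_digit(x):
    return int(str(x)[0])

N = 5000
freq = Counter(leading_digit(2 ** n) for n in range(1, N + 1))
p1 = freq[1] / N          # observed frequency of leading digit 1
expected = math.log10(2)  # Benford prediction ≈ 0.30103
```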

  17. Random walk in dynamically disordered chains: Poisson white noise disorder

    International Nuclear Information System (INIS)

    Hernandez-Garcia, E.; Pesquera, L.; Rodriguez, M.A.; San Miguel, M.

    1989-01-01

    Exact solutions are given for a variety of models of random walks in a chain with time-dependent disorder. Dynamic disorder is modeled by white Poisson noise. Models with site-independent (global) and site-dependent (local) disorder are considered. Results are described in terms of an effective random walk in a nondisordered medium. In the cases of global disorder the effective random walk contains multistep transitions, so that the continuous limit is not a diffusion process. In the cases of local disorder the effective process is equivalent to the usual random walk in the absence of disorder but with slower diffusion. Difficulties associated with the continuous-limit representation of random walk in a disordered chain are discussed. In particular, the authors consider explicit cases in which taking the continuous limit and averaging over disorder sources do not commute

  18. Distance covariance for stochastic processes

    DEFF Research Database (Denmark)

    Matsui, Muneya; Mikosch, Thomas Valentin; Samorodnitsky, Gennady

    2017-01-01

    The distance covariance of two random vectors is a measure of their dependence. The empirical distance covariance and correlation can be used as statistical tools for testing whether two random vectors are independent. We propose an analog of the distance covariance for two stochastic processes...
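
    The empirical version of this measure is short enough to sketch: double-center the pairwise distance matrices of the two samples and average the entrywise products. This follows Székely, Rizzo and Bakirov's formulation for random variables, not the stochastic-process extension proposed in the paper.

```python
# Empirical squared distance covariance of two 1D samples of equal length:
# build pairwise distance matrices, double-center them, and average the
# entrywise products. Independence corresponds to a population value of 0.
def dcov2(x, y):
    n = len(x)

    def centered(v):
        d = [[abs(a - b) for b in v] for a in v]
        row = [sum(r) / n for r in d]
        grand = sum(row) / n
        return [[d[i][j] - row[i] - row[j] + grand for j in range(n)]
                for i in range(n)]

    A, B = centered(x), centered(y)
    return sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n**2

v = [1.0, 2.0, 4.0, 8.0]
val = dcov2(v, v)  # a sample is maximally "dependent" on itself, so val > 0
```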

  19. Random ancestor trees

    International Nuclear Information System (INIS)

    Ben-Naim, E; Krapivsky, P L

    2010-01-01

    We investigate a network growth model in which the genealogy controls the evolution. In this model, a new node selects a random target node and links either to this target node, or to its parent, or to its grandparent, etc.; all nodes from the target node to its most ancient ancestor are equiprobable destinations. The emerging random ancestor tree is very shallow: the fraction g_n of nodes at distance n from the root decreases super-exponentially with n, g_n = e^{-1}/(n-1)!. We find that a macroscopic hub at the root coexists with highly connected nodes at higher generations. The maximal degree of a node at the nth generation grows algebraically as N^{1/β_n}, where N is the system size. We obtain a series of nontrivial exponents which are roots of transcendental equations: β_1 ≈ 1.351746, β_2 ≈ 1.682201, etc. As a consequence, the fraction p_k of nodes with degree k has an algebraic tail, p_k ~ k^{-γ}, with γ = β_1 + 1 = 2.351746
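
    The growth rule is simple to simulate: choosing uniformly among the target and all its ancestors is the same as drawing a uniform depth between 0 (the root) and the target's depth. The loose tolerance in the depth-1 check reflects that e^{-1} is the paper's asymptotic prediction, not a finite-size exact value.

```python
# Simulate the random ancestor tree: each new node picks a random target and
# attaches to a node chosen uniformly from the path target -> root, i.e. at a
# uniform depth in [0, depth(target)]. The paper predicts that the fraction of
# nodes at distance n from the root tends to e^{-1}/(n-1)!.
import random

def grow(n_nodes, rng):
    depth = [0]  # the root
    for _ in range(n_nodes - 1):
        target = rng.randrange(len(depth))
        attach_depth = rng.randint(0, depth[target])
        depth.append(attach_depth + 1)
    return depth

rng = random.Random(11)
depths = grow(50_000, rng)
frac_depth1 = depths.count(1) / len(depths)  # predicted ~ e^{-1} ≈ 0.368
```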

  20. Lectures on random interfaces

    CERN Document Server

    Funaki, Tadahisa

    2016-01-01

    Interfaces are created to separate two distinct phases in a situation in which phase coexistence occurs. This book discusses randomly fluctuating interfaces in several different settings and from several points of view: discrete/continuum, microscopic/macroscopic, and static/dynamic theories. The following four topics in particular are dealt with in the book. Assuming that the interface is represented as a height function measured from a fixed-reference discretized hyperplane, the system is governed by the Hamiltonian of gradient of the height functions. This is a kind of effective interface model called ∇φ-interface model. The scaling limits are studied for Gaussian (or non-Gaussian) random fields with a pinning effect under a situation in which the rate functional of the corresponding large deviation principle has non-unique minimizers. Young diagrams determine decreasing interfaces, and their dynamics are introduced. The large-scale behavior of such dynamics is studied from the points of view of the hyd...

  1. Random catalytic reaction networks

    Science.gov (United States)

    Stadler, Peter F.; Fontana, Walter; Miller, John H.

    1993-03-01

    We study networks that are a generalization of replicator (or Lotka-Volterra) equations. They model the dynamics of a population of object types whose binary interactions determine the specific type of interaction product. Such a system always reduces its dimension to a subset that contains production pathways for all of its members. The network equation can be rewritten at a level of collectives in terms of two basic interaction patterns: replicator sets and cyclic transformation pathways among sets. Although the system contains well-known cases that exhibit very complicated dynamics, the generic behavior of randomly generated systems is found (numerically) to be extremely robust: convergence to a globally stable rest point. It is easy to tailor networks that display replicator interactions where the replicators are entire self-sustaining subsystems, rather than structureless units. A numerical scan of random systems highlights the special properties of elementary replicators: they reduce the effective interconnectedness of the system, resulting in enhanced competition, and strong correlations between the concentrations.

  2. Quincke random walkers

    Science.gov (United States)

    Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia

    2017-11-01

    The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. The Quincke rollers have been widely used as a model system to study collective behavior in "active" suspensions. If the applied field is DC, an isolated Quincke roller follows a straight-line trajectory. In this talk, we discuss the design of a Quincke roller that executes a random-walk-like behavior. We utilize an AC field: upon reversal of the field direction, a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on the frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers, the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particles. Supported by NSF Grant CBET 1437545.

  3. Smooth random change point models.

    Science.gov (United States)

    van den Hout, Ardo; Muniz-Terrera, Graciela; Matthews, Fiona E

    2011-03-15

    Change point models are used to describe processes over time that show a change in direction. An example of such a process is cognitive ability, where a decline a few years before death is sometimes observed. A broken-stick model consists of two linear parts and a breakpoint where the two lines intersect. Alternatively, models can be formulated that imply a smooth change between the two linear parts. Change point models can be extended by adding random effects to account for variability between subjects. A new smooth change point model is introduced and examples are presented that show how change point models can be estimated using functions in R for mixed-effects models. The Bayesian inference using WinBUGS is also discussed. The methods are illustrated using data from a population-based longitudinal study of ageing, the Cambridge City over 75 Cohort Study. The aim is to identify how many years before death individuals experience a change in the rate of decline of their cognitive ability. Copyright © 2010 John Wiley & Sons, Ltd.
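
    A broken-stick mean function, and a smooth variant of it, can be written down directly. The logistic blend below is one common smoothing choice, used here as an assumption; the paper's exact smooth parameterization may differ.

```python
# Broken-stick mean: linear with slope b1 before the change point tau and
# slope b2 after it. The smooth variant replaces the kink with a logistic
# blend of the two slopes over a window of width ~eps (one common choice;
# not necessarily the paper's parameterization).
import math

def broken_stick(t, b0, b1, b2, tau):
    return b0 + b1 * t + (b2 - b1) * max(t - tau, 0.0)

def smooth_stick(t, b0, b1, b2, tau, eps=0.5):
    blend = eps * math.log1p(math.exp((t - tau) / eps))  # smooth max(t-tau, 0)
    return b0 + b1 * t + (b2 - b1) * blend

# Cognitive score: slow decline (-0.2/yr), then steep decline (-1.5/yr)
# starting 3 years before death (t = 0 at death); values are illustrative.
y_before = broken_stick(-5.0, b0=25.0, b1=-0.2, b2=-1.5, tau=-3.0)  # 26.0
y_after = broken_stick(-1.0, b0=25.0, b1=-0.2, b2=-1.5, tau=-3.0)   # 22.6
```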

  4. Aggregated recommendation through random forests.

    Science.gov (United States)

    Zhang, Heng-Ru; Min, Fan; He, Xu

    2014-01-01

    Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new user, new item, and both new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete rating. In the testing stage, we present four predicting approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy.

  5. Solid-State Random Lasers

    CERN Document Server

    Noginov, Mikhail A

    2005-01-01

    Random lasers are the simplest sources of stimulated emission without a cavity, with the feedback provided by scattering in a gain medium. First proposed in the late 60’s, random lasers have grown into a large research field. This book reviews the history and the state of the art of random lasers, provides an outline of the basic models describing their behavior, and describes the recent advances in the field. The major focus of the book is on solid-state random lasers. However, it also briefly describes random lasers based on liquid dyes with scatterers. The chapters of the book are almost independent of each other, so scientists or engineers interested in any particular aspect of random lasers can read the relevant section directly. Researchers entering the field of random lasers will find in the book an overview of the field of study. Scientists working in the field can use the book as a reference source.

  6. Chemical Continuous Time Random Walks

    Science.gov (United States)

    Aquino, T.; Dentz, M.

    2017-12-01

    Traditional methods for modeling solute transport through heterogeneous media employ Eulerian schemes to solve for solute concentration. More recently, Lagrangian methods have removed the need for spatial discretization through the use of Monte Carlo implementations of Langevin equations for solute particle motions. While there have been recent advances in modeling chemically reactive transport with recourse to Lagrangian methods, these remain less developed than their Eulerian counterparts, and many open problems such as efficient convergence and reconstruction of the concentration field remain. We explore a different avenue and consider the question: In heterogeneous chemically reactive systems, is it possible to describe the evolution of macroscopic reactant concentrations without explicitly resolving the spatial transport? Traditional Kinetic Monte Carlo methods, such as the Gillespie algorithm, model chemical reactions as random walks in particle number space, without the introduction of spatial coordinates. The inter-reaction times are exponentially distributed under the assumption that the system is well mixed. In real systems, transport limitations lead to incomplete mixing and decreased reaction efficiency. We introduce an arbitrary inter-reaction time distribution, which may account for the impact of incomplete mixing. This process defines an inhomogeneous continuous time random walk in particle number space, from which we derive a generalized chemical Master equation and formulate a generalized Gillespie algorithm. We then determine the modified chemical rate laws for different inter-reaction time distributions. We trace Michaelis-Menten-type kinetics back to finite-mean delay times, and predict time-nonlocal macroscopic reaction kinetics as a consequence of broadly distributed delays. Non-Markovian kinetics exhibit weak ergodicity breaking and show key features of reactions under local non-equilibrium.
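    The generalized Gillespie step with a non-exponential inter-reaction time can be sketched for a single decay reaction (an illustrative toy under our own naming; the authors' algorithm covers general reaction networks and arbitrary delay distributions):

```python
import random

def generalized_gillespie(n0, rate, shape, t_max, seed=1):
    """Kinetic Monte Carlo for a single decay reaction A -> 0, where the
    inter-reaction time is gamma-distributed rather than exponential
    (shape=1 recovers the classical Gillespie algorithm).

    Returns the trajectory as a list of (time, particle_number) pairs.
    """
    rng = random.Random(seed)
    t, n = 0.0, n0
    traj = [(t, n)]
    while n > 0 and t < t_max:
        propensity = rate * n  # total reaction propensity
        # mean waiting time 1/propensity, with gamma-distributed spread
        dt = rng.gammavariate(shape, 1.0 / (shape * propensity))
        t += dt
        n -= 1                 # one decay event fires
        traj.append((t, n))
    return traj

traj = generalized_gillespie(n0=100, rate=1.0, shape=1.0, t_max=10.0)
```

    Replacing the exponential waiting time with a broadly distributed one is what produces the time-nonlocal macroscopic kinetics described in the abstract; here the gamma distribution merely stands in for "an arbitrary inter-reaction time distribution".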

  7. Random sets and random fuzzy sets as ill-perceived random variables an introduction for Ph.D. students and practitioners

    CERN Document Server

    Couso, Inés; Sánchez, Luciano

    2014-01-01

    This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on its use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...

  8. How random are random numbers generated using photons?

    International Nuclear Information System (INIS)

    Solis, Aldo; Angulo Martínez, Alí M; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U’Ren, Alfred B; Hirsch, Jorge G

    2015-01-01

    Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined. (paper)
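    A simplified version of the finite Borel normality criterion can be sketched as follows (the deviation bound and block-length cutoff used here are simplifying assumptions, not the exact test used in the paper):

```python
import math
import random

def max_block_deviation(bits, k):
    """Largest |empirical frequency - 2**-k| over all k-bit patterns,
    counted on non-overlapping k-bit blocks of the sequence."""
    counts = {}
    m = len(bits) // k
    for i in range(m):
        block = tuple(bits[i * k:(i + 1) * k])
        counts[block] = counts.get(block, 0) + 1
    patterns = [tuple((j >> s) & 1 for s in range(k)) for j in range(2 ** k)]
    return max(abs(counts.get(b, 0) / m - 2.0 ** -k) for b in patterns)

def is_borel_normal(bits, k_max=3):
    """Simplified finite Borel-normality check: every block length up to
    k_max must have frequencies within sqrt(log2(n)/n) of the uniform value."""
    n = len(bits)
    bound = math.sqrt(math.log2(n) / n)
    return all(max_block_deviation(bits, k) <= bound for k in range(1, k_max + 1))

rng = random.Random(42)
bits = [rng.randrange(2) for _ in range(100000)]
```

    A strictly alternating sequence like 0101... passes the single-bit count (exactly half ones) but fails badly at block length 2, which is the point of testing all block lengths rather than the bit frequency alone.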

  9. Tailored Random Graph Ensembles

    International Nuclear Information System (INIS)

    Roberts, E S; Annibale, A; Coolen, A C C

    2013-01-01

    Tailored graph ensembles are a developing bridge between biological networks and statistical mechanics. The aim is to use this concept to generate a suite of rigorous tools that can be used to quantify and compare the topology of cellular signalling networks, such as protein-protein interaction networks and gene regulation networks. We calculate exact and explicit formulae for the leading orders in the system size of the Shannon entropies of random graph ensembles constrained with degree distribution and degree-degree correlation. We also construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities which converges to a strictly uniform measure and is based on edge swaps that conserve all degrees. The acceptance probabilities can be generalized to define Markov chains that target any alternative desired measure on the space of directed or undirected graphs, in order to generate graphs with more sophisticated topological features.
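    The degree-conserving edge-swap move can be sketched as a plain rejection chain (note: without the non-trivial acceptance probabilities derived in the paper, a chain like this conserves all degrees but does not in general sample graphs uniformly):

```python
import random

def degree_preserving_swaps(edges, n_swaps, seed=7):
    """Randomize an undirected simple graph by double edge swaps
    (a,b),(c,d) -> (a,d),(c,b), rejecting any swap that would create
    a self-loop or a multi-edge, so every vertex degree is conserved."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(frozenset(e) for e in edges)
    for _ in range(n_swaps):
        i, j = rng.sample(range(len(edges)), 2)
        a, b = edges[i]
        c, d = edges[j]
        if len({a, b, c, d}) < 4:
            continue  # swap would touch a shared vertex (self-loop risk)
        new1, new2 = frozenset((a, d)), frozenset((c, b))
        if new1 in edge_set or new2 in edge_set:
            continue  # swap would create a multi-edge
        edge_set -= {frozenset((a, b)), frozenset((c, d))}
        edge_set |= {new1, new2}
        edges[i], edges[j] = (a, d), (c, b)
    return edges

def degrees(edges):
    deg = {}
    for a, b in edges:
        deg[a] = deg.get(a, 0) + 1
        deg[b] = deg.get(b, 0) + 1
    return deg

ring = [(i, (i + 1) % 10) for i in range(10)]
shuffled = degree_preserving_swaps(ring, 100)
```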

  10. Gossip in Random Networks

    Science.gov (United States)

    Malarz, K.; Szvetelszky, Z.; Szekf, B.; Kulakowski, K.

    2006-11-01

    We consider the average probability X of being informed on a gossip in a given social network. The network is modeled within the random graph theory of Erdős and Rényi. In this theory, a network is characterized by two parameters: the size N and the link probability p. Our experimental data suggest three levels of social inclusion of friendship. The critical value pc, for which half of agents are informed, scales with the system size as N^(-γ), with γ ≈ 0.68. Computer simulations show that the probability X varies with p as a sigmoidal curve. Influence of the correlations between neighbors is also evaluated: with increasing clustering coefficient C, X decreases.

  11. The random projection method

    CERN Document Server

    Vempala, Santosh S

    2005-01-01

    Random projection is a simple geometric technique for reducing the dimensionality of a set of points in Euclidean space while preserving pairwise distances approximately. The technique plays a key role in several breakthrough developments in the field of algorithms. In other cases, it provides elegant alternative proofs. The book begins with an elementary description of the technique and its basic properties. Then it develops the method in the context of applications, which are divided into three groups. The first group consists of combinatorial optimization problems such as maxcut, graph coloring, minimum multicut, graph bandwidth and VLSI layout. Presented in this context is the theory of Euclidean embeddings of graphs. The next group is machine learning problems, specifically, learning intersections of halfspaces and learning large margin hypotheses. The projection method is further refined for the latter application. The last set consists of problems inspired by information retrieval, namely, nearest neig...
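    The basic technique can be sketched in a few lines (a minimal sketch assuming a Gaussian projection matrix; function names are ours):

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points to k dimensions with a Gaussian
    random matrix scaled by 1/sqrt(k); pairwise distances are preserved
    up to a (1 +/- eps) factor with high probability (Johnson-Lindenstrauss)."""
    rng = random.Random(seed)
    d = len(points[0])
    R = [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)] for _ in range(k)]
    return [[sum(R[i][j] * p[j] for j in range(d)) for i in range(k)] for p in points]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Distances between 500-dimensional points survive projection to 200 dimensions.
rng = random.Random(1)
pts = [[rng.gauss(0, 1) for _ in range(500)] for _ in range(5)]
proj = random_projection(pts, k=200)
ratios = [dist(proj[i], proj[j]) / dist(pts[i], pts[j])
          for i in range(5) for j in range(i + 1, 5)]
```

    The striking property is that the target dimension k needed for a given distortion depends only logarithmically on the number of points, not on the original dimension d.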

  12. Random volumes from matrices

    Energy Technology Data Exchange (ETDEWEB)

    Fukuma, Masafumi; Sugishita, Sotaro; Umeda, Naoya [Department of Physics, Kyoto University,Kitashirakawa Oiwake-cho, Kyoto 606-8502 (Japan)

    2015-07-17

    We propose a class of models which generate three-dimensional random volumes, where each configuration consists of triangles glued together along multiple hinges. The models have matrices as the dynamical variables and are characterized by semisimple associative algebras A. Although most of the diagrams represent configurations which are not manifolds, we show that the set of possible diagrams can be drastically reduced such that only (and all of the) three-dimensional manifolds with tetrahedral decompositions appear, by introducing a color structure and taking an appropriate large N limit. We examine the analytic properties when A is a matrix ring or a group ring, and show that the models with matrix ring have a novel strong-weak duality which interchanges the roles of triangles and hinges. We also give a brief comment on the relationship of our models with the colored tensor models.

  13. Random Intercept and Random Slope 2-Level Multilevel Models

    Directory of Open Access Journals (Sweden)

    Rehan Ahmad Khan

    2012-11-01

    Full Text Available Random intercept model and random intercept & random slope model carrying two levels of hierarchy in the population are presented and compared with the traditional regression approach. The impact of students’ satisfaction on their grade point average (GPA was explored with and without controlling teachers influence. The variation at level-1 can be controlled by introducing the higher levels of hierarchy in the model. The fanning of the fitted lines shows the variation of student grades across teachers.

  14. Random walk of passive tracers among randomly moving obstacles

    OpenAIRE

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-01-01

    Background: This study is mainly motivated by the need of understanding how the diffusion behaviour of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. Method: By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random en...

  15. General distributions in process algebra

    NARCIS (Netherlands)

    Katoen, Joost P.; d' Argenio, P.R.; Brinksma, Hendrik; Hermanns, H.; Katoen, Joost P.

    2001-01-01

    This paper is an informal tutorial on stochastic process algebras, i.e., process calculi where action occurrences may be subject to a delay that is governed by a (mostly continuous) random variable. Whereas most stochastic process algebras consider delays determined by negative exponential distributions

  16. Random lasing in human tissues

    International Nuclear Information System (INIS)

    Polson, Randal C.; Vardeny, Z. Valy

    2004-01-01

    A random collection of scatterers in a gain medium can produce coherent laser emission lines dubbed 'random lasing'. We show that biological tissues, including human tissues, can support coherent random lasing when infiltrated with a concentrated laser dye solution. To extract a typical random resonator size within the tissue we average the power Fourier transform of random laser spectra collected from many excitation locations in the tissue; we verified this procedure by a computer simulation. Surprisingly, we found that malignant tissues show many more laser lines compared to healthy tissues taken from the same organ. Consequently, the obtained typical random resonator was found to be different for healthy and cancerous tissues, and this may lead to a technique for separating malignant from healthy tissues for diagnostic imaging

  17. Random Correlation Matrix and De-Noising

    OpenAIRE

    Ken-ichi Mitsui; Yoshio Tabata

    2006-01-01

    In Finance, the modeling of a correlation matrix is one of the important problems. In particular, the correlation matrix obtained from market data has the noise. Here we apply the de-noising processing based on the wavelet analysis to the noisy correlation matrix, which is generated by a parametric function with random parameters. First of all, we show that two properties, i.e. symmetry and ones of all diagonal elements, of the correlation matrix preserve via the de-noising processing and the...

  18. Groupies in multitype random graphs

    OpenAIRE

    Shang, Yilun

    2016-01-01

    A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.

  19. Groupies in multitype random graphs.

    Science.gov (United States)

    Shang, Yilun

    2016-01-01

    A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.
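    The groupie proportion is easy to check numerically for an Erdős-Rényi sample (an illustrative sketch; the parameter choices are ours):

```python
import random

def groupie_fraction(n, p, seed=3):
    """Fraction of 'groupies' (vertices whose degree is at least the mean
    degree of their neighbors) in one Erdos-Renyi G(n, p) sample."""
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    groupies = 0
    for i in range(n):
        if not adj[i]:
            continue  # isolated vertices have no neighbor average to compare
        neighbor_mean = sum(len(adj[j]) for j in adj[i]) / len(adj[i])
        if len(adj[i]) >= neighbor_mean:
            groupies += 1
    return groupies / n

frac = groupie_fraction(n=1000, p=0.02)  # expected to be close to 1/2
```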

  20. Temporal changes in randomness of bird communities across Central Europe.

    Science.gov (United States)

    Renner, Swen C; Gossner, Martin M; Kahl, Tiemo; Kalko, Elisabeth K V; Weisser, Wolfgang W; Fischer, Markus; Allan, Eric

    2014-01-01

    Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high portion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.
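    The nugget estimation step described above, i.e. reading the y-intercept off a regression of dissimilarity on distance, can be sketched on synthetic data (the 0.63 below is chosen to echo the reported overall mean; everything else is ours):

```python
import random

def nugget(distances, dissimilarities):
    """Ordinary least squares of dissimilarity on distance; the y-intercept
    is the 'nugget', read as the compositional dissimilarity expected
    between two patches at zero spatial separation."""
    n = len(distances)
    mx = sum(distances) / n
    my = sum(dissimilarities) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(distances, dissimilarities))
    sxx = sum((x - mx) ** 2 for x in distances)
    slope = sxy / sxx
    return my - slope * mx  # y-intercept of the fitted line

# Synthetic pairs: dissimilarity = 0.63 + 0.002 * distance + noise,
# so the recovered nugget should be close to 0.63.
rng = random.Random(5)
d = [rng.uniform(0, 100) for _ in range(500)]
y = [0.63 + 0.002 * x + rng.gauss(0, 0.01) for x in d]
est = nugget(d, y)
```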

  1. Temporal changes in randomness of bird communities across Central Europe.

    Directory of Open Access Journals (Sweden)

    Swen C Renner

    Full Text Available Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high portion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.

  2. Mobile access to virtual randomization for investigator-initiated trials.

    Science.gov (United States)

    Deserno, Thomas M; Keszei, András P

    2017-08-01

    Background/aims Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places, where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting the R server using the Java R Interface. Mobile devices communicate via the representational state transfer web services. Furthermore, a simple web-based setup allows configuring the appropriate statistics by non-statisticians. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results Apps are provided for iOS and Android and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees a strictly sequential processing in all trial sites. 
Covering 88% of all randomization models that are used in recent trials, virtual randomization
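    Of the supported schemes, permuted-block randomization can be sketched as follows (an illustrative sketch in Python, not the system's R implementation; names and defaults are ours):

```python
import random

def block_randomization(n_subjects, arms=("A", "B"), block_size=4, seed=11):
    """Permuted-block allocation: within every block each arm appears
    equally often, so arm sizes never drift apart by more than half a
    block. Allocations are computed on demand, so no pre-printed list
    is needed."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)  # random order within the balanced block
        allocations.extend(block)
    return allocations[:n_subjects]

alloc = block_randomization(20)
```

    Seeding the generator per trial makes the allocation reproducible and strictly sequential across sites, which is the property the virtual approach guarantees.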

  3. Dissecting the circle, at random*

    Directory of Open Access Journals (Sweden)

    Curien Nicolas

    2014-01-01

    Full Text Available Random laminations of the disk are the continuous limits of random non-crossing configurations of regular polygons. We provide an expository account on this subject. Initiated by the work of Aldous on the Brownian triangulation, this field now possesses many characters such as the random recursive triangulation, the stable laminations and the Markovian hyperbolic triangulation of the disk. We will review the properties and constructions of these objects as well as the close relationships they enjoy with the theory of continuous random trees. Some open questions are scattered along the text.

  4. Efficient search by optimized intermittent random walks

    International Nuclear Information System (INIS)

    Oshanin, Gleb; Lindenberg, Katja; Wio, Horacio S; Burlatsky, Sergei

    2009-01-01

    We study the kinetics for the search of an immobile target by randomly moving searchers that detect it only upon encounter. The searchers perform intermittent random walks on a one-dimensional lattice. Each searcher can step on a nearest neighbor site with probability α or go off lattice with probability 1 - α to move in a random direction until it lands back on the lattice at a fixed distance L away from the departure point. Considering α and L as optimization parameters, we seek to enhance the chances of successful detection by minimizing the probability P_N that the target remains undetected up to the maximal search time N. We show that even in this simple model, a number of very efficient search strategies can lead to a decrease of P_N by orders of magnitude upon appropriate choices of α and L. We demonstrate that, in general, such optimal intermittent strategies are much more efficient than Brownian searches and are as efficient as search algorithms based on random walks with heavy-tailed Cauchy jump-length distributions. In addition, such intermittent strategies appear to be more advantageous than Lévy-based ones in that they lead to more thorough exploration of visited regions in space and thus lend themselves to parallelization of the search processes.
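    The intermittent search model can be simulated directly (a minimal sketch on a ring rather than an infinite lattice; all parameter choices are ours):

```python
import random

def undetected_probability(alpha, L, N, lattice=101, trials=500, seed=13):
    """Monte Carlo estimate of P_N: probability that a single intermittent
    walker on a ring of `lattice` sites has NOT hit the target (site 0)
    within N steps. Each step: nearest-neighbor move with probability
    alpha, otherwise a jump of fixed length L in a random direction."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        x = lattice // 2  # start as far from the target as possible
        found = False
        for _ in range(N):
            if rng.random() < alpha:
                x = (x + rng.choice((-1, 1))) % lattice
            else:
                x = (x + rng.choice((-L, L))) % lattice
            if x == 0:
                found = True
                break
        if not found:
            misses += 1
    return misses / trials

p_pure = undetected_probability(alpha=1.0, L=10, N=2000)  # plain Brownian search
p_mix = undetected_probability(alpha=0.9, L=10, N=2000)   # intermittent strategy
```

    Sweeping alpha and L and minimizing the returned estimate reproduces, in miniature, the optimization described in the abstract.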

  5. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  6. Doing the Impossible: A Note on Induction and the Experience of Randomness.

    Science.gov (United States)

    Lopes, Lola L.

    1982-01-01

    The process of induction is formulated as a problem in detecting nonrandomness, or pattern, against a background of randomness, or noise. Experimental and philosophical approaches to human conceptions of randomness are contrasted. The relation between induction and the experience of randomness is discussed in terms of signal-detection theory.

  7. Random matrix ensembles with random interactions: Results for ...

    Indian Academy of Sciences (India)

    Pramana – Journal of Physics, Volume 73, Issue 3, September 2009, pp. 521-531. Random matrix ensembles with random interactions: Results for EGUE(2)-(4). Manan Vyas.

  8. Dynamic computing random access memory

    International Nuclear Information System (INIS)

    Traversa, F L; Bonani, F; Pershin, Y V; Di Ventra, M

    2014-01-01

    The present von Neumann computing paradigm involves a significant amount of information transfer between a central processing unit and memory, with concomitant limitations in the actual execution speed. However, it has been recently argued that a different form of computation, dubbed memcomputing (Di Ventra and Pershin 2013 Nat. Phys. 9 200–2) and inspired by the operation of our brain, can resolve the intrinsic limitations of present day architectures by allowing for computing and storing of information on the same physical platform. Here we show a simple and practical realization of memcomputing that utilizes easy-to-build memcapacitive systems. We name this architecture dynamic computing random access memory (DCRAM). We show that DCRAM provides massively-parallel and polymorphic digital logic, namely it allows for different logic operations with the same architecture, by varying only the control signals. In addition, by taking into account realistic parameters, its energy expenditures can be as low as a few fJ per operation. DCRAM is fully compatible with CMOS technology, can be realized with current fabrication facilities, and therefore can really serve as an alternative to the present computing technology. (paper)

  9. Pseudo-random number generator based on asymptotic deterministic randomness

    Science.gov (United States)

    Wang, Kai; Pei, Wenjiang; Xia, Haishan; Cheung, Yiu-ming

    2008-06-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks.
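    A chaos-based bit generator built on a piecewise linear map can be sketched with a skew tent map (illustrative only: this is not the authors' construction, which additionally applies a noninvertible nonlinearity transform to obtain asymptotic deterministic randomness, and a bare map like this is not cryptographically secure):

```python
def skew_tent_prbg(n_bits, x0=0.37, a=0.499, burn_in=100):
    """Toy pseudo-random bit generator from a piecewise linear (skew tent)
    map x -> x/a if x < a else (1-x)/(1-a), emitting one bit per iterate
    by thresholding at 1/2. The invariant density of the skew tent map is
    uniform on [0, 1], so ones and zeros appear with equal frequency."""
    x = x0
    bits = []
    for i in range(burn_in + n_bits):
        x = x / a if x < a else (1.0 - x) / (1.0 - a)
        if i >= burn_in:
            bits.append(1 if x >= 0.5 else 0)
    return bits

bits = skew_tent_prbg(10000)
frac_ones = sum(bits) / len(bits)
```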

  10. Pseudo-random number generator based on asymptotic deterministic randomness

    International Nuclear Information System (INIS)

    Wang Kai; Pei Wenjiang; Xia Haishan; Cheung Yiuming

    2008-01-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks

  11. Random walk of passive tracers among randomly moving obstacles.

    Science.gov (United States)

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-04-14

    This study is mainly motivated by the need of understanding how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random environment is here considered in the case of a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity which is worked out is the diffusion coefficient of the passive tracer which is computed as a function of the average inter-obstacles distance. The results reported here suggest that if a biomolecule, let us call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.

  12. Random distributed feedback fibre lasers

    Energy Technology Data Exchange (ETDEWEB)

    Turitsyn, Sergei K., E-mail: s.k.turitsyn@aston.ac.uk [Aston Institute of Photonic Technologies, Aston University, Birmingham B4 7ET (United Kingdom); Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Babin, Sergey A. [Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Churkin, Dmitry V. [Aston Institute of Photonic Technologies, Aston University, Birmingham B4 7ET (United Kingdom); Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Vatnik, Ilya D.; Nikulin, Maxim [Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Podivilov, Evgenii V. [Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation)

    2014-09-10

    The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with “negative absorption” of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors–random distributed feedback fibre laser–was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although an effective reflection due to the Rayleigh scattering is extremely small (∼0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the

  13. Random distributed feedback fibre lasers

    International Nuclear Information System (INIS)

    Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.

    2014-01-01

The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with “negative absorption” of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors, a random distributed feedback fibre laser, was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although the effective reflection due to Rayleigh scattering is extremely small (∼0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the

  14. Experimental Analysis of a Piezoelectric Energy Harvesting System for Harmonic, Random, and Sine on Random Vibration

    Energy Technology Data Exchange (ETDEWEB)

    Cryns, Jackson W.; Hatchell, Brian K.; Santiago-Rojas, Emiliano; Silvers, Kurt L.

    2013-07-01

Harvesting power with a piezoelectric vibration-powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random and sine on random (SOR) input vibration scenarios. Additionally, the implications of source vibration characteristics on harvester design are discussed. Studies in vibration harvesting have yielded numerous alternatives for harvesting electrical energy from vibrations, but piezoceramics arose as the most compact, energy-dense means of energy transduction. The rise in popularity of harvesting energy from ambient vibrations has made piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. In this manuscript, variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. We characterize the source vibration by its acceleration response for repeatability and transcription to general application. The results agree with numerical and theoretical predictions in the previous literature that optimal load resistance varies with transducer natural frequency and source type, and the findings demonstrate that significant gains are seen with lower tuned transducer natural frequencies for similar source amplitudes. Going beyond idealized steady-state sinusoidal and simplified random vibration input, SOR testing allows for more accurate representation of real world ambient vibration. It is shown that characteristic interactions from more complex vibrational sources significantly alter power generation and power processing

  15. From random process to chaotic behavior in swarms of UAVs

    OpenAIRE

    Rosalie , Martin; Danoy , Grégoire; Chaumette , Serge; Bouvry , Pascal

    2016-01-01

International audience; Unmanned Aerial Vehicles (UAVs) applications have seen an important increase in the last decade for both military and civilian applications ranging from fire and high seas rescue to military surveillance and target detection. While this technology is now mature for a single UAV, new methods are needed to operate UAVs in swarms, also referred to as fleets. This work focuses on the mobility management of a single autonomous swarm of UAVs whose mission is to cover a giv...

  16. Strategies for processing diffraction data from randomly oriented particles

    International Nuclear Information System (INIS)

    Elser, Veit

    2011-01-01

    The high intensity of free-electron X-ray light sources may enable structure determinations of viruses or even individual proteins without the encumbrance of first forming crystals. This note compares two schemes of non-crystalline diffraction data collection that have been proposed: serial single-shot data from individual particles, and averaged cross-correlation data from particle ensembles. The information content of these schemes is easily compared and we show that the single-shot approach, although experimentally more challenging, is always superior in this respect. In fact, for 3D structure determination a constraint counting argument shows that the cross-correlation scheme suffers from data deficiency. -- Research Highlights: →We compare two data collection schemes for imaging single particles with x-rays. →Cross-correlation data suffers an information deficit relative to single-shot data. →We recognize John Spence for his many contributions to single particle imaging.

  17. The random continued fraction transformation

    Science.gov (United States)

    Kalle, Charlene; Kempton, Tom; Verbitskiy, Evgeny

    2017-03-01

    We introduce a random dynamical system related to continued fraction expansions. It uses random combinations of the Gauss map and the Rényi (or backwards) continued fraction map. We explore the continued fraction expansions that this system produces, as well as the dynamical properties of the system.
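The two maps that the system mixes can be sketched in a few lines. The sketch below uses the standard forms G(x) = 1/x mod 1 (Gauss map) and R(x) = 1/(1 − x) mod 1 (Rényi, or backwards, map); the mixing probability, seed, and starting point are illustrative choices, not taken from the paper.

```python
import random

def gauss_map(x):
    # Gauss map G(x) = 1/x mod 1; the integer part floor(1/x) is the
    # regular continued fraction digit produced at this step.
    y = 1.0 / x
    d = int(y)
    return y - d, d

def renyi_map(x):
    # Renyi (backwards) map R(x) = 1/(1-x) mod 1, with digit floor(1/(1-x)).
    y = 1.0 / (1.0 - x)
    d = int(y)
    return y - d, d

def random_cf_orbit(x, n, p=0.5, seed=0):
    """Iterate a random i.i.d. mix of the two maps (Gauss with prob. p),
    recording which map was applied and the digit it produced."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        if x == 0.0 or x == 1.0:   # orbit hit an endpoint; stop
            break
        if rng.random() < p:
            x, d = gauss_map(x)
            out.append(("G", d))
        else:
            x, d = renyi_map(x)
            out.append(("R", d))
    return out

print(random_cf_orbit(0.7305, 8))
```

Each run with a fixed seed is reproducible, so the random continued fraction expansion of a given point can be compared across different mixing probabilities p.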

  18. Bell inequalities for random fields

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Peter [Physics Department, Yale University, CT 06520 (United States)

    2006-06-09

    The assumptions required for the derivation of Bell inequalities are not satisfied for random field models in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.

  19. Bell inequalities for random fields

    OpenAIRE

    Morgan, Peter

    2004-01-01

    The assumptions required for the derivation of Bell inequalities are not usually satisfied for random fields in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.

  20. Object grammars and random generation

    Directory of Open Access Journals (Sweden)

    I. Dutour

    1998-12-01

Full Text Available This paper presents a new systematic approach for the uniform random generation of combinatorial objects. The method is based on the notion of object grammars, which give recursive descriptions of objects and generalize context-free grammars. The application of particular valuations to these grammars leads to enumeration and random generation of objects according to non-algebraic parameters.
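The recursive, counting-based style of uniform generation that object grammars build on can be illustrated with the simplest recursive grammar, that of binary trees. The sketch below is a generic instance of the counting method, not the object grammars or valuations studied in the paper; the tree grammar and sizes are illustrative.

```python
import random
from functools import lru_cache

@lru_cache(maxsize=None)
def count(n):
    # Number of binary trees with n internal nodes (the Catalan numbers),
    # computed from the grammar T = leaf | (T, node, T).
    if n == 0:
        return 1
    return sum(count(k) * count(n - 1 - k) for k in range(n))

def random_tree(n, rng=random):
    """Draw a uniform binary tree with n internal nodes: choose the
    left-subtree size k with probability count(k)*count(n-1-k)/count(n),
    then recurse on both sides."""
    if n == 0:
        return None
    r = rng.randrange(count(n))
    for k in range(n):
        w = count(k) * count(n - 1 - k)
        if r < w:
            return (random_tree(k, rng), random_tree(n - 1 - k, rng))
        r -= w
    raise AssertionError("unreachable")

def size(t):
    return 0 if t is None else 1 + size(t[0]) + size(t[1])

print(count(10))   # 16796 trees with 10 internal nodes
```

Because the left-subtree size is drawn with weight proportional to the number of completions, every tree of the requested size is produced with equal probability.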

  1. Fields on a random lattice

    International Nuclear Information System (INIS)

    Itzykson, C.

    1983-10-01

    We review the formulation of field theory and statistical mechanics on a Poissonian random lattice. Topics discussed include random geometry, the construction of field equations for arbitrary spin, the free field spectrum and the question of localization illustrated in the one dimensional case

  2. a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    MS Yıldırım

    2016-02-01

Full Text Available The aim of this study was to compare the effects of static stretching, proprioceptive neuromuscular facilitation (PNF) stretching and the Mulligan technique on hip flexion range of motion (ROM) in subjects with bilateral hamstring tightness. A total of 40 students (mean age: 21.5±1.3 years, mean body height: 172.8±8.2 cm, mean body mass index: 21.9±3.0 kg·m-2) with bilateral hamstring tightness were enrolled in this randomized trial, of whom 26 completed the study. Subjects were divided into 4 groups performing (I) typical static stretching, (II) PNF stretching, (III) the Mulligan traction straight leg raise (TSLR) technique, or (IV) no intervention. Hip flexion ROM was measured using a digital goniometer with the passive straight leg raise test before and after 4 weeks by two physiotherapists blinded to the groups. 52 extremities of 26 subjects were analyzed. Hip flexion ROM increased in all three intervention groups (p<0.05) but not in the no-intervention group after 4 weeks. A statistically significant change in initial–final assessment differences of hip flexion ROM was found between groups (p<0.001), in favour of PNF stretching and the Mulligan TSLR technique in comparison to typical static stretching (p=0.016 and p=0.02, respectively). No significant difference was found between the Mulligan TSLR technique and PNF stretching (p=0.920). The initial–final assessment difference of hip flexion ROM was similar in typical static stretching and no intervention (p=0.491). A 4-week stretching intervention is beneficial for increasing hip flexion ROM in bilateral hamstring tightness. However, PNF stretching and the Mulligan TSLR technique are superior to typical static stretching. These two interventions can alternatively be used for stretching in hamstring tightness.

  3. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  4. A Solution Method for Linear and Geometrically Nonlinear MDOF Systems with Random Properties subject to Random Excitation

    DEFF Research Database (Denmark)

    Micaletti, R. C.; Cakmak, A. S.; Nielsen, Søren R. K.

A method for computing the lower-order moments of randomly-excited multi-degree-of-freedom (MDOF) systems with random structural properties is proposed. The method is grounded in the techniques of stochastic calculus, utilizing a Markov diffusion process to model the structural system with random structural properties. The resulting state-space formulation is a system of ordinary stochastic differential equations with random coefficients and deterministic initial conditions, which are subsequently transformed into ordinary stochastic differential equations with deterministic coefficients and random initial conditions. This transformation facilitates the derivation of differential equations which govern the evolution of the unconditional statistical moments of the response. Primary consideration is given to linear systems and systems with odd polynomial nonlinearities, for in these cases...

  5. Implementing traceability using particle randomness-based textile printed tags

    Science.gov (United States)

    Agrawal, T. K.; Koehl, L.; Campagne, C.

    2017-10-01

This article introduces a random particle-based traceability tag for textiles. The proposed tag not only acts as a unique signature for the corresponding textile product but is also easy to manufacture and hard to copy. It seeks applications in brand authentication and traceability in the textile and clothing (T&C) supply chain. A prototype has been developed by a screen printing process, in which micron-scale particles were mixed with the printing paste and printed on cotton fabrics to attain the required randomness. To encode the randomness, an image of the developed tag was taken and analyzed using image processing. The randomness of the particles acts as a product key or unique signature which is required to decode the tag. Finally, washing and abrasion resistance tests were conducted to check the durability of the printed tag.

  6. Experimental Analysis of a Piezoelectric Energy Harvesting System for Harmonic, Random, and Sine on Random Vibration

    Directory of Open Access Journals (Sweden)

    Jackson W. Cryns

    2013-01-01

    Full Text Available Harvesting power with a piezoelectric vibration powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random, and sine on random (SOR input vibration scenarios; the implications of source vibration characteristics on harvester design are discussed. The rise in popularity of harvesting energy from ambient vibrations has made compact, energy dense piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. Variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. The results agree with numerical and theoretical predictions in the previous literature for optimal power harvesting in sinusoidal and flat broadband vibration scenarios. Going beyond idealized steady-state sinusoidal and flat random vibration input, experimental SOR testing allows for more accurate representation of real world ambient vibration. It is shown that characteristic interactions from more complex vibration sources significantly alter power generation and processing requirements by varying harvested power, shifting optimal conditioning impedance, inducing voltage fluctuations, and ultimately rendering idealized sinusoidal and random analyses incorrect.
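The reported dependence of optimal load resistance on transducer natural frequency can be illustrated with the textbook impedance-matching estimate R_opt = 1/(2πfC) for a capacitive piezoelectric source driven near resonance. The capacitance value and frequencies below are hypothetical, not measurements from the study.

```python
import math

def optimal_load_resistance(f_hz, c_farads):
    """Impedance-matching estimate for a piezo harvester driven at its
    natural frequency f: R_opt = 1 / (2*pi*f*C), where C is the
    transducer's clamped capacitance."""
    return 1.0 / (2.0 * math.pi * f_hz * c_farads)

# Lower tuned natural frequency -> larger optimal load resistance,
# consistent with the trend the abstract describes (values illustrative).
for f in (50.0, 120.0, 250.0):            # Hz, hypothetical transducers
    r = optimal_load_resistance(f, 20e-9)  # 20 nF assumed capacitance
    print(f"{f:6.1f} Hz -> R_opt = {r / 1e3:8.1f} kOhm")
```

This simple estimate breaks down for the SOR and broadband inputs discussed above, which is exactly why the experimental characterization in the paper is needed.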

  7. Dimer coverings on random multiple chains of planar honeycomb lattices

    International Nuclear Information System (INIS)

    Ren, Haizhen; Zhang, Fuji; Qian, Jianguo

    2012-01-01

    We study dimer coverings on random multiple chains. A multiple chain is a planar honeycomb lattice constructed by successively fusing copies of a ‘straight’ condensed hexagonal chain at the bottom of the previous one in two possible ways. A random multiple chain is then generated by admitting the Bernoulli distribution on the two types of fusing, which describes a zeroth-order Markov process. We determine the expectation of the number of the pure dimer coverings (perfect matchings) over the ensemble of random multiple chains by the transfer matrix approach. Our result shows that, with only two exceptions, the average of the logarithm of this expectation (i.e., the annealed entropy per dimer) is asymptotically nonzero when the fusing process goes to infinity and the length of the hexagonal chain is fixed, though it is zero when the fusing process and the length of the hexagonal chain go to infinity simultaneously. Some numerical results are provided to support our conclusion, from which we can see that the asymptotic behavior fits well to the theoretical results. We also apply the transfer matrix approach to the quenched entropy and reveal that the quenched entropy of random multiple chains has a close connection with the well-known Lyapunov exponent of random matrices. Using the theory of Lyapunov exponents we show that, for some random multiple chains, the quenched entropy per dimer is strictly smaller than the annealed one when the fusing process goes to infinity. Finally, we determine the expectation of the free energy per dimer over the ensemble of the random multiple chains in which the three types of dimers in different orientations are distinguished, and specify a series of non-random multiple chains whose free energy per dimer is asymptotically equal to this expectation. (paper)
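The link between quenched quantities and Lyapunov exponents of random matrix products can be sketched generically with the standard renormalized-product estimator. The two positive matrices below are arbitrary illustrative choices, not the transfer matrices of the paper, and the Bernoulli mixing mirrors the zeroth-order Markov fusing process described above.

```python
import math, random

rng = random.Random(5)

# Two illustrative positive 2x2 "transfer" matrices, chosen i.i.d. with
# probability 1/2 each (not the actual matrices from the paper).
A = ((1.0, 1.0), (1.0, 0.0))
B = ((2.0, 1.0), (1.0, 1.0))

def apply(M, v):
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

def lyapunov(n):
    """Quenched growth rate (1/n) * log ||M_n ... M_1 v||, computed with
    per-step renormalization of v to avoid floating-point overflow."""
    v = (1.0, 1.0)
    acc = 0.0
    for _ in range(n):
        M = A if rng.random() < 0.5 else B
        v = apply(M, v)
        norm = math.hypot(v[0], v[1])
        acc += math.log(norm)
        v = (v[0] / norm, v[1] / norm)
    return acc / n

lam = lyapunov(200000)
print(f"estimated Lyapunov exponent: {lam:.3f}")
```

By subadditivity the quenched rate is at most the annealed one, the same inequality the paper establishes (strictly, in some cases) for the entropy per dimer.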

  8. Description of two-process surface topography

    International Nuclear Information System (INIS)

    Grabon, W; Pawlus, P

    2014-01-01

After two machining processes, a large number of surface topography measurements were made using Talyscan 150 stylus measuring equipment. The measured samples were divided into two groups. The first group contained two-process surfaces of random nature, while the second group contained random-deterministic textures consisting of random plateau parts and portions of deterministic valleys. For comparison, one-process surfaces were also analysed. Correlation and regression analysis was used to study the dependencies among surface texture parameters in 2D and 3D systems. As the result of this study, sets of parameters describing multi-process surface topography were obtained for two-process surfaces of both the random and the random-deterministic types. (papers)

  9. An introduction to random sets

    CERN Document Server

    Nguyen, Hung T

    2006-01-01

The study of random sets is a large and rapidly growing area with connections to many areas of mathematics and applications in widely varying disciplines, from economics and decision theory to biostatistics and image analysis. The drawback to such diversity is that the research reports are scattered throughout the literature, with the result that in science and engineering, and even in the statistics community, the topic is not well known and much of the enormous potential of random sets remains untapped. An Introduction to Random Sets provides a friendly but solid initiation into the theory of random sets. It builds the foundation for studying random set data, which, viewed as imprecise or incomplete observations, are ubiquitous in today's technological society. The author, widely known for his best-selling A First Course in Fuzzy Logic text as well as his pioneering work in random sets, explores motivations, such as coarse data analysis and uncertainty analysis in intelligent systems, for studying random s...

  10. Quantifiers for randomness of chaotic pseudo-random number generators.

    Science.gov (United States)

    De Micco, L; Larrondo, H A; Plastino, A; Rosso, O A

    2009-08-28

    We deal with randomness quantifiers and concentrate on their ability to discern the hallmark of chaos in time series used in connection with pseudo-random number generators (PRNGs). Workers in the field are motivated to use chaotic maps for generating PRNGs because of the simplicity of their implementation. Although there exist very efficient general-purpose benchmarks for testing PRNGs, we feel that the analysis provided here sheds additional didactic light on the importance of the main statistical characteristics of a chaotic map, namely (i) its invariant measure and (ii) the mixing constant. This is of help in answering two questions that arise in applications: (i) which is the best PRNG among the available ones? and (ii) if a given PRNG turns out not to be good enough and a randomization procedure must still be applied to it, which is the best applicable randomization procedure? Our answer provides a comparative analysis of several quantifiers advanced in the extant literature.
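A minimal example of the kind of chaos-based PRNG under discussion is the fully chaotic logistic map with a threshold output. The seed, burn-in, and sample size below are arbitrary, and the simple bit-frequency check is only a crude stand-in for the quantifiers (invariant measure, mixing constant) analysed in the paper.

```python
def logistic_bits(x0, n, burn=100):
    """Naive PRNG from the fully chaotic logistic map x -> 4x(1-x):
    emit 1 when the iterate exceeds 1/2. The raw map has a non-uniform
    (arcsine-shaped) invariant measure, but that density is symmetric
    about 1/2, so the threshold splits it into two halves of equal
    measure and the bit frequencies come out balanced."""
    x = x0
    for _ in range(burn):          # discard transient
        x = 4.0 * x * (1.0 - x)
    bits = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(0.1234567, 10000)
print(sum(bits) / len(bits))   # close to 0.5
```

A balanced bit frequency alone says nothing about correlations, which is precisely why the paper's finer quantifiers, and possibly an extra randomization step, are needed in practice.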

  11. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

Full Text Available Syndrome coding using linear codes is a technique that allows improvement in the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code, and in parallel it offers easy generation of the parity check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear code [8, 2] was used as the base for the algorithm modification. An implementation of the proposed algorithm, along with a practical evaluation of the algorithm's parameters based on the test images, was made. Keywords: steganography, random linear codes, RLC, LSB
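The syndrome-coding idea behind such an LSB modification can be sketched as follows. The [8, 2] code parameters match the abstract, but the specific parity check matrix (random, kept in systematic form so every message is embeddable) and the brute-force minimum-weight search are illustrative choices, not the paper's implementation.

```python
import itertools, random

rng = random.Random(42)
n, k = 8, 2
# Parity-check matrix of a random [8, 2] binary code in systematic form
# [I | A]: the identity block guarantees full rank, so every 6-bit
# message corresponds to some flip pattern.
H = []
for i in range(n - k):
    row = [0] * n
    row[i] = 1
    row[n - 2] = rng.randint(0, 1)
    row[n - 1] = rng.randint(0, 1)
    H.append(row)

def syndrome(mat, x):
    return tuple(sum(h * xi for h, xi in zip(row, x)) % 2 for row in mat)

def embed(mat, cover, message):
    """Syndrome coding: flip a minimum-weight pattern e in the cover bits
    so that the receiver recovers the message as syndrome(stego)."""
    target = tuple(m ^ s for m, s in zip(message, syndrome(mat, cover)))
    best = None
    for e in itertools.product((0, 1), repeat=n):   # brute force, n = 8 only
        if syndrome(mat, e) == target and (best is None or sum(e) < sum(best)):
            best = e
    return tuple(c ^ b for c, b in zip(cover, best))

cover = tuple(rng.randint(0, 1) for _ in range(n))        # LSBs of 8 pixels
message = tuple(rng.randint(0, 1) for _ in range(n - k))  # 6 message bits
stego = embed(H, cover, message)
print("changed bits:", sum(c != s for c, s in zip(cover, stego)))
```

The receiver needs only H: it computes the syndrome of the extracted LSBs, so 6 message bits are carried by 8 cover bits while the embedder changes as few of them as possible.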

  12. Orthogonal polynomials and random matrices

    CERN Document Server

    Deift, Percy

    2000-01-01

This volume expands on a set of lectures held at the Courant Institute on Riemann-Hilbert problems, orthogonal polynomials, and random matrix theory. The goal of the course was to prove universality for a variety of statistical quantities arising in the theory of random matrix models. The central question was the following: Why do very general ensembles of random n × n matrices exhibit universal behavior as n → ∞? The main ingredient in the proof is the steepest descent method for oscillatory Riemann-Hilbert problems.
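The universal behavior in question can be glimpsed numerically even in the smallest case. The sketch below samples eigenvalue gaps of 2×2 GUE matrices (an illustrative ensemble with the conventional entry variances) and checks the level repulsion predicted by the Wigner surmise: small normalized spacings are suppressed like s².

```python
import math, random

rng = random.Random(7)

def gue2_spacing():
    """Eigenvalue gap of a random 2x2 GUE matrix [[a, b+ic], [b-ic, d]]
    with a, d ~ N(0, 1) and b, c ~ N(0, 1/2); the gap works out to
    sqrt((a-d)^2 + 4*(b^2 + c^2))."""
    a, d = rng.gauss(0, 1), rng.gauss(0, 1)
    b, c = rng.gauss(0, math.sqrt(0.5)), rng.gauss(0, math.sqrt(0.5))
    return math.sqrt((a - d) ** 2 + 4 * (b * b + c * c))

gaps = [gue2_spacing() for _ in range(20000)]
mean = sum(gaps) / len(gaps)
norm = [g / mean for g in gaps]
# Level repulsion: for independent (Poisson) levels about 9.5% of
# normalized spacings would fall below 0.1; here almost none do.
tiny = sum(s < 0.1 for s in norm) / len(norm)
print(f"fraction of normalized gaps below 0.1: {tiny:.4f}")
```

The same suppression of small spacings appears, by the universality theorems the book proves, in far more general ensembles than this 2×2 toy model.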

  13. Curvature of random walks and random polygons in confinement

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2013-01-01

The purpose of this paper is to study the curvature of equilateral random walks and polygons that are confined in a sphere. Curvature is one of several basic geometric properties that can be used to describe random walks and polygons. We show that confinement affects curvature quite strongly, and in the limit case where the confinement diameter equals the edge length the unconfined expected curvature value doubles from π/2 to π. To study curvature a simple model of an equilateral random walk in spherical confinement in dimensions 2 and 3 is introduced. For this simple model we derive explicit integral expressions for the expected value of the total curvature in both dimensions. These expressions are functions that depend only on the radius R of the confinement sphere. We then show that the values obtained by numeric integration of these expressions agree with numerical average curvature estimates obtained from simulations of random walks. Finally, we compare the confinement effect on curvature of random walks with random polygons. (paper)
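The unconfined baseline quoted above, an expected curvature of π/2 per vertex, is easy to check by simulation: for independent uniform step directions in 3D the turning angle θ has density sin(θ)/2 on [0, π], whose mean is π/2. The walk length and sample count below are arbitrary.

```python
import math, random

rng = random.Random(1)

def random_unit_vector():
    # Uniform direction on the unit sphere (uniform z, uniform azimuth).
    z = rng.uniform(-1.0, 1.0)
    t = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(t), r * math.sin(t), z)

def mean_turning_angle(steps, samples):
    """Average curvature per vertex of an unconfined equilateral random
    walk: the angle between consecutive independent uniform steps."""
    total, count = 0.0, 0
    for _ in range(samples):
        u = random_unit_vector()
        for _ in range(steps):
            v = random_unit_vector()
            dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
            total += math.acos(dot)
            count += 1
            u = v
    return total / count

print(mean_turning_angle(50, 200))   # ~ pi/2 = 1.5708
```

Reproducing the confined case would require rejection sampling inside the sphere, which is where the paper's explicit integral expressions take over.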

  14. Local randomness: Examples and application

    Science.gov (United States)

    Fu, Honghao; Miller, Carl A.

    2018-03-01

    When two players achieve a superclassical score at a nonlocal game, their outputs must contain intrinsic randomness. This fact has many useful implications for quantum cryptography. Recently it has been observed [C. Miller and Y. Shi, Quantum Inf. Computat. 17, 0595 (2017)] that such scores also imply the existence of local randomness—that is, randomness known to one player but not to the other. This has potential implications for cryptographic tasks between two cooperating but mistrustful players. In the current paper we bring this notion toward practical realization, by offering near-optimal bounds on local randomness for the CHSH game, and also proving the security of a cryptographic application of local randomness (single-bit certified deletion).

  15. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple (or, equivalently, that the Zariski closure of the group generated by these matrices is reductive) and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.

  16. Microcomputer Unit: Generating Random Numbers.

    Science.gov (United States)

    Haigh, William E.

    1986-01-01

    Presents an activity, suitable for students in grades 6-12, on generating random numbers. Objectives, equipment needed, list of prerequisite experiences, instructional strategies, and ready-to-copy student worksheets are included. (JN)

  17. Chaotic systems are dynamically random

    International Nuclear Information System (INIS)

    Svozil, K.

    1988-01-01

The idea is put forward that the significant route to chaos is driven by recursive iterations of suitable evolution functions. The corresponding formal notion of randomness is based on dynamic complexity rather than on static complexity. 24 refs. (Author)

  18. A Randomized Central Limit Theorem

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2010-01-01

    The Central Limit Theorem (CLT), one of the most elemental pillars of Probability Theory and Statistical Physics, asserts that: the universal probability law of large aggregates of independent and identically distributed random summands with zero mean and finite variance, scaled by the square root of the aggregate-size (√(n)), is Gaussian. The scaling scheme of the CLT is deterministic and uniform - scaling all aggregate-summands by the common and deterministic factor √(n). This Letter considers scaling schemes which are stochastic and non-uniform, and presents a 'Randomized Central Limit Theorem' (RCLT): we establish a class of random scaling schemes which yields universal probability laws of large aggregates of independent and identically distributed random summands. The RCLT universal probability laws, in turn, are the one-sided and the symmetric Levy laws.
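The deterministic √n scaling of the classical CLT described above is easy to demonstrate numerically (the Letter's randomized scaling schemes are not reproduced here). With zero-mean summands uniform on [−1, 1], whose variance is 1/3, the √n-scaled sums keep that variance at every aggregate size, which is exactly the stabilization the Gaussian limit rests on.

```python
import random, statistics

rng = random.Random(3)

def scaled_sum(n):
    """Sum of n i.i.d. zero-mean summands (uniform on [-1, 1], variance
    1/3), scaled by the deterministic CLT factor sqrt(n)."""
    return sum(rng.uniform(-1.0, 1.0) for _ in range(n)) / n ** 0.5

# The variance of the scaled sum stays near 1/3 regardless of n.
for n in (10, 100, 1000):
    xs = [scaled_sum(n) for _ in range(3000)]
    print(n, round(statistics.mean(xs), 3), round(statistics.variance(xs), 3))
```

Replacing the common factor √n by a stochastic, summand-dependent scaling is what the RCLT of the abstract does, and it shifts the limit from the Gaussian to the Lévy laws.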

  19. Electromagnetic scattering from random media

    CERN Document Server

    Field, Timothy R

    2009-01-01

The book develops the dynamical theory of scattering from random media from first principles. Its key findings are to characterize the time evolution of the scattered field in terms of stochastic differential equations, and to illustrate this framework

  20. Cluster randomization and political philosophy.

    Science.gov (United States)

    Chwang, Eric

    2012-11-01

    In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy. © 2011 Blackwell Publishing Ltd.

  1. Quantum-noise randomized ciphers

    International Nuclear Information System (INIS)

    Nair, Ranjith; Yuen, Horace P.; Kumar, Prem; Corndorf, Eric; Eguchi, Takami

    2006-01-01

We review the notion of a classical random cipher and its advantages. We sharpen the usual description of random ciphers to a particular mathematical characterization suggested by the salient feature responsible for their increased security. We describe a concrete system known as αη and show that it is equivalent to a random cipher in which the required randomization is effected by coherent-state quantum noise. We describe the currently known security features of αη and similar systems, including lower bounds on the unicity distances against ciphertext-only and known-plaintext attacks. We show how αη used in conjunction with any standard stream cipher such as the Advanced Encryption Standard provides an additional, qualitatively different layer of security from physical encryption against known-plaintext attacks on the key. We refute some claims in the literature that αη is equivalent to a nonrandom stream cipher

  2. Random matrix improved subspace clustering

    KAUST Repository

    Couillet, Romain; Kammoun, Abla

    2017-01-01

    This article introduces a spectral method for statistical subspace clustering. The method is built upon standard kernel spectral clustering techniques, however carefully tuned by theoretical understanding arising from random matrix findings. We show

  3. Weak convergence to isotropic complex SαS random measure

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2017-09-01

Full Text Available In this paper, we prove that an isotropic complex symmetric α-stable (SαS) random measure (0 < α < 2) can be approximated by a complex process constructed by integrals based on the Poisson process with random intensity.

  4. Generating and using truly random quantum states in Mathematica

    Science.gov (United States)

    Miszczak, Jarosław Adam

    2012-01-01

    The problem of generating random quantum states is of a great interest from the quantum information theory point of view. In this paper we present a package for Mathematica computing system harnessing a specific piece of hardware, namely Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing. Program summaryProgram title: TRQS Catalogue identifier: AEKA_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 7924 No. of bytes in distributed program, including test data, etc.: 88 651 Distribution format: tar.gz Programming language: Mathematica, C Computer: Requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and supporting a recent version of Mathematica Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit) RAM: Case dependent Classification: 4.15 Nature of problem: Generation of random density matrices. Solution method: Use of a physical quantum random number generator. Running time: Generating 100 random numbers takes about 1 second, generating 1000 random density matrices takes more than a minute.

  5. Experimentally generated randomness certified by the impossibility of superluminal signals.

    Science.gov (United States)

    Bierhorst, Peter; Knill, Emanuel; Glancy, Scott; Zhang, Yanbao; Mink, Alan; Jordan, Stephen; Rommal, Andrea; Liu, Yi-Kai; Christensen, Bradley; Nam, Sae Woo; Stevens, Martin J; Shalm, Lynden K

    2018-04-01

    From dice to modern electronic circuits, there have been many attempts to build better devices to generate random numbers. Randomness is fundamental to security and cryptographic systems and to safeguarding privacy. A key challenge with random-number generators is that it is hard to ensure that their outputs are unpredictable [1-3]. For a random-number generator based on a physical process, such as a noisy classical system or an elementary quantum measurement, a detailed model that describes the underlying physics is necessary to assert unpredictability. Imperfections in the model compromise the integrity of the device. However, it is possible to exploit the phenomenon of quantum non-locality with a loophole-free Bell test to build a random-number generator that can produce output that is unpredictable to any adversary that is limited only by general physical principles, such as special relativity [1-11]. With recent technological developments, it is now possible to carry out such a loophole-free Bell test [12-14,22]. Here we present certified randomness obtained from a photonic Bell experiment and extract 1,024 random bits that are uniformly distributed to within 10^(-12). These random bits could not have been predicted according to any physical theory that prohibits faster-than-light (superluminal) signalling and that allows independent measurement choices. To certify and quantify the randomness, we describe a protocol that is optimized for devices that are characterized by a low per-trial violation of Bell inequalities. Future random-number generators based on loophole-free Bell tests may have a role in increasing the security and trust of our cryptographic systems and infrastructure.

  6. Random sequential adsorption of cubes

    Science.gov (United States)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Additionally, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as the kinetics of packing growth. Microstructural properties of the packings were analyzed using the density autocorrelation function.
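A minimal sketch of the random sequential adsorption idea, reduced for brevity to axis-aligned squares in 2D rather than oriented cubes in 3D (so the overlap test is a simple coordinate comparison), with a consecutive-failure cutoff standing in for true saturation; all parameters below are illustrative:

```python
import random

def rsa_squares(side, box, max_failures=2000, seed=0):
    """Random sequential adsorption (RSA) of axis-aligned squares:
    propose uniformly random positions one at a time, accept a square
    only if it overlaps no previously placed square, and stop after
    many consecutive rejections (a practical proxy for saturation)."""
    rng = random.Random(seed)
    placed, failures = [], 0
    while failures < max_failures:
        x = rng.uniform(0.0, box - side)
        y = rng.uniform(0.0, box - side)
        # Two axis-aligned squares are disjoint iff they are separated
        # along at least one axis by the side length.
        if all(abs(x - px) >= side or abs(y - py) >= side for px, py in placed):
            placed.append((x, y))
            failures = 0
        else:
            failures += 1
    return placed

squares = rsa_squares(side=1.0, box=20.0)
packing_fraction = len(squares) * 1.0 / 20.0**2
```

The saturated fraction for oriented cubes in 3D differs from this 2D analogue; the sketch only illustrates the accept/reject kinetics of the algorithm.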

  7. Random walk through fractal environments

    OpenAIRE

    Isliker, H.; Vlahos, L.

    2002-01-01

    We analyze random walks through fractal environments embedded in 3-dimensional, permeable space. Particles travel freely and are scattered into random directions when they hit the fractal. The statistical distribution of the flight increments (i.e. of the displacements between two consecutive hits) is analytically derived from a common, practical definition of fractal dimension, and it turns out to approximate a power law quite well in the case where the dimension D of the fractal is ...

  8. Randomness in Contemporary Graphic Art

    OpenAIRE

    Zavřelová, Veronika

    2016-01-01

    Veronika Zavřelová, Bachelor thesis, Charles University in Prague, Faculty of Education, Department of Art Education. Randomness in contemporary graphic art: an imaginative picture card game. ANNOTATION: This bachelor thesis explores the connection between verbal and visual sign systems within the topic of randomness in contemporary graphic art, realized as an imaginative picture card game. The thesis is based mainly on its practical part, an originally created card game called Piktim. The card game uses as...

  9. Staggered chiral random matrix theory

    International Nuclear Information System (INIS)

    Osborn, James C.

    2011-01-01

    We present a random matrix theory for the staggered lattice QCD Dirac operator. The staggered random matrix theory is equivalent to the zero-momentum limit of the staggered chiral Lagrangian and includes all taste breaking terms at their leading order. This is an extension of previous work which only included some of the taste breaking terms. We will also present some results for the taste breaking contributions to the partition function and the Dirac eigenvalues.

  10. Digital random-number generator

    Science.gov (United States)

    Brocker, D. H.

    1973-01-01

    For a binary digit array of N bits, N noise sources feed N nonlinear operators; each flip-flop in the digit array is set by its nonlinear operator to reflect whether the amplitude of the generator feeding it is above or below the mean value of the generated noise. The fixed-point uniform-distribution method of random number generation can also be used to generate random numbers with distributions other than uniform.
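A software sketch of this thresholding scheme, with Gaussian pseudo-random samples standing in for the hardware noise sources (an assumption of this illustration; the original design uses physical noise generators and flip-flops):

```python
import random

def random_word(n_bits, samples=1000, seed=None):
    """Software analogue of the noise-threshold scheme: each bit of an
    N-bit word is set by comparing one noise source's instantaneous
    amplitude with that source's estimated mean level."""
    rng = random.Random(seed)
    bits = []
    for _ in range(n_bits):
        # Estimate the mean level of this "noise source"...
        noise = [rng.gauss(0.0, 1.0) for _ in range(samples)]
        mean = sum(noise) / samples
        # ...then threshold the current amplitude against it.
        amplitude = rng.gauss(0.0, 1.0)
        bits.append(1 if amplitude > mean else 0)
    return bits

word = random_word(8, seed=1)  # one 8-bit random word
```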

  11. Distribution functions for fluids in random media

    International Nuclear Information System (INIS)

    Madden, W.G.; Glandt, E.D.

    1988-01-01

    A random medium is considered, composed of identifiable interactive sites or obstacles equilibrated at a high temperature and then quenched rapidly to form a rigid structure, statistically homogeneous on all but molecular length scales. The equilibrium statistical mechanics of a fluid contained inside this quenched medium is discussed. Various particle-particle and particle-obstacle correlation functions, which differ from the corresponding functions for a fully equilibrated binary mixture, are defined through an averaging process over the static ensemble of obstacle configurations and the application of topological reduction techniques. The Ornstein-Zernike equations also differ from their equilibrium counterparts.

  12. Connectivity ranking of heterogeneous random conductivity models

    Science.gov (United States)

    Rizzo, C. B.; de Barros, F.

    2017-12-01

    To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state-of-the-art provides several methods to generate 2D or 3D random K-fields, such as the classic multi-Gaussian fields or non-Gaussian fields, training-image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with the early-time arrival of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte-Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity for each group of random fields and make it possible to rank the fields according to their minimum hydraulic resistance.
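A graph-based computation of minimum hydraulic resistance can be sketched with Dijkstra's algorithm. The cost model below (each cell contributes resistance 1/K, 4-connected paths from the left to the right edge, a lognormal K-field) is an illustrative assumption, not the specific formulation of the study:

```python
import heapq
import random

def min_hydraulic_resistance(K):
    """Minimum hydraulic resistance across a 2D conductivity grid via
    Dijkstra's algorithm: entering a cell costs 1/K there, and we seek
    the cheapest 4-connected path from the left edge to the right edge."""
    rows, cols = len(K), len(K[0])
    heap = [(1.0 / K[r][0], (r, 0)) for r in range(rows)]
    heapq.heapify(heap)
    settled = set()
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) in settled:
            continue
        settled.add((r, c))
        if c == cols - 1:
            return d  # first settled right-edge cell carries the minimum
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in settled:
                heapq.heappush(heap, (d + 1.0 / K[nr][nc], (nr, nc)))
    return float("inf")

rng = random.Random(0)
K = [[rng.lognormvariate(0.0, 1.0) for _ in range(30)] for _ in range(30)]
R = min_hydraulic_resistance(K)                          # heterogeneous field
R_uniform = min_hydraulic_resistance([[1.0] * 30] * 30)  # straight path: 30 cells of resistance 1
```

Repeating this over many generated K-fields gives the Monte-Carlo distribution of the connectivity measure described in the record.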

  13. Random walks on generalized Koch networks

    International Nuclear Information System (INIS)

    Sun, Weigang

    2013-01-01

    For deterministically growing networks, determining their topological properties and dynamical processes is a theoretical challenge. In this paper, we study random walks on generalized Koch networks, whose initial state is a globally connected network of r nodes and in which, at each step, every existing node produces m complete graphs. We then obtain analytical expressions for the first passage time (FPT), the average return time (ART), i.e. the average of the FPTs for a random walk from node i to return to the starting point i for the first time, and the average sending time (AST), defined as the average of the FPTs from a hub node to all other nodes excluding the hub itself, as functions of the network parameters m and r. For this family of Koch networks, the ART of the newly emerging nodes is identical and increases with the parameters m and r. In addition, the AST of our networks grows with the network size N as N ln N and also increases with the parameter m. The results obtained in this paper generalize the random walk results for the original Koch network. (paper)
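The FPT quantities above are easy to estimate by direct simulation on any small graph. The sketch below uses a 4-cycle rather than a Koch network because its mean first-passage times are known exactly (from node 0 to the opposite node the exact value is 4), which makes the estimator easy to check:

```python
import random

def mean_first_passage_time(adj, start, target, walks=20000, seed=0):
    """Estimate the mean first-passage time (FPT) of an unbiased random
    walk from `start` to `target` by direct simulation on an adjacency list."""
    rng = random.Random(seed)
    total = 0
    for _ in range(walks):
        node, steps = start, 0
        while node != target:
            node = rng.choice(adj[node])  # uniform step to a neighbor
            steps += 1
        total += steps
    return total / walks

# 4-cycle: exact FPT is d*(n-d) for distance d, so 0 -> 2 gives 2*2 = 4
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
fpt = mean_first_passage_time(cycle4, 0, 2)
```

The analytical results in the record replace such simulation with closed-form expressions in m and r, but the simulated estimator is a useful cross-check on small instances.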

  14. Physical Principle for Generation of Randomness

    Science.gov (United States)

    Zak, Michail

    2009-01-01

    A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)

  15. Fuel processing. Wastes processing

    International Nuclear Information System (INIS)

    Bourgeois, M.

    2000-01-01

    The gaseous, liquid and solid radioactive effluents generated by the fuel reprocessing, can't be release in the environment. They have to be treated in order to respect the limits of the pollution regulations. These processing are detailed and discussed in this technical paper. A second part is devoted to the SPIN research program relative to the separation of the long life radionuclides in order to reduce the radioactive wastes storage volume. (A.L.B.)

  16. The MIXMAX random number generator

    Science.gov (United States)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high-quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail in empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat map: the first computes multiplication by the MIXMAX matrix in O(N) operations; the second recursively computes its characteristic polynomial in O(N²) operations; and the third applies skips of a large number of steps S to the sequence in O(N² log(S)) operations.
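The matrix recurrence underlying such generators can be illustrated with a deliberately small sketch. The 3×3 matrix, modulus and seed below are illustrative stand-ins, not the actual MIXMAX matrix or parameters: the state vector is advanced by x → Ax (mod p) with a determinant-1 (unimodular) integer matrix, and one normalized coordinate is emitted per step:

```python
def matrix_generator(A, seed, p):
    """Toy unimodular-matrix congruential generator: the state vector is
    advanced by x -> A x (mod p); one coordinate, scaled to [0, 1), is
    emitted per step. Not the MIXMAX matrix itself."""
    x = list(seed)
    n = len(x)
    while True:
        # Full matrix-vector product mod p (MIXMAX exploits its matrix's
        # special structure to do this in O(N) instead of O(N^2)).
        x = [sum(A[i][j] * x[j] for j in range(n)) % p for i in range(n)]
        yield x[0] / p

# A small integer matrix with determinant 1 (unimodular) and a prime modulus
A = [[1, 1, 1],
     [1, 2, 2],
     [1, 2, 3]]
gen = matrix_generator(A, seed=(1, 2, 3), p=2**31 - 1)
sample = [next(gen) for _ in range(5)]
```

The real MIXMAX generators choose N, the matrix entries and the modulus to attain the maximal period and strong mixing; this sketch only shows the shape of the iteration.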

  17. Perceptions of randomized security schedules.

    Science.gov (United States)

    Scurich, Nicholas; John, Richard S

    2014-04-01

    Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to choose between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but was equivalent between the two schedule types. In general, participants were indifferent between the two security schedules, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between the traditional and random schedules in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.

  18. A relation between non-Markov and Markov processes

    International Nuclear Information System (INIS)

    Hara, H.

    1980-01-01

    With the aid of a transformation technique, it is shown that some memory effects in non-Markov processes can be eliminated. In other words, some non-Markov processes can be rewritten in the form of a random walk process, i.e. a Markov process. To this end, two model processes which have some memory or correlation in the random walk process are introduced. An explanation of the memory in the processes is given. (orig.)
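The state-enlargement idea behind such transformations is standard and easy to sketch: a walk whose next step depends on its previous step is non-Markovian in the positions alone, but becomes Markovian on the pair (previous, current). The persistence probability 0.7 below is an arbitrary illustrative choice, not taken from the paper:

```python
import random

def step_with_memory(prev, cur, rng):
    """One step of a persistent walk (one-step memory): the walker
    repeats its last direction with probability 0.7, else reverses.
    The pair (prev, cur) determines the transition law, so the pair
    process is Markovian even though the position process is not."""
    last_dir = cur - prev
    d = last_dir if rng.random() < 0.7 else -last_dir
    return cur, cur + d

rng = random.Random(0)
state = (0, 1)          # enlarged Markov state: (previous, current)
path = [state[1]]
for _ in range(10):
    state = step_with_memory(*state, rng)
    path.append(state[1])
```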

  19. Virial expansion for almost diagonal random matrices

    Science.gov (United States)

    Yevtushenko, Oleg; Kravtsov, Vladimir E.

    2003-08-01

    Energy level statistics of Hermitian random matrices Ĥ with independent Gaussian random entries H_ij (i ≥ j) is studied for a generic ensemble of almost diagonal random matrices with ⟨|H_ii|²⟩ ~ 1 and ⟨|H_i...

  20. Analysis of random number generators in abnormal usage conditions

    International Nuclear Information System (INIS)

    Soucarros, M.

    2012-01-01

    Random numbers have been used through the ages for games of chance, more recently for secret codes, and today they are necessary to the execution of computer programs. Random number generators have evolved from simple dice to electronic circuits and algorithms. Accordingly, the ability to distinguish between random and non-random numbers has become more difficult. Furthermore, whereas in the past dice were loaded in order to increase winning chances, it is now possible to influence the outcome of random number generators. This subject is therefore still very much an issue and has recently made the headlines: there was talk of the PS3 game console, which generates constant random numbers, and of the redundant distribution of secret keys on the internet. This thesis presents a study of several generators as well as different means of perturbing them. It shows the inherent defects in their design and the possible consequences of their failure when they are embedded inside security components. Moreover, this work highlights problems yet to be solved concerning the testing of random numbers and the post-processing that eliminates bias in the distribution of these numbers. (author) [fr]
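As one concrete example of the random-number testing the thesis discusses, here is a minimal sketch of the classic frequency (monobit) test, along the lines of NIST SP 800-22, which distinguishes a fair bit stream from a biased one; the bias level and sample size below are illustrative:

```python
import math
import random

def monobit_p_value(bits):
    """NIST-style frequency (monobit) test: map bits to +/-1, sum them,
    and compare the normalized sum to a standard normal. A tiny p-value
    means the stream's ones/zeros balance is implausible for a fair source."""
    s = sum(2 * b - 1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * len(bits)))

rng = random.Random(42)
good = [rng.randrange(2) for _ in range(10000)]            # unbiased stream
biased = [1 if rng.random() < 0.6 else 0 for _ in range(10000)]  # 60% ones
p_good = monobit_p_value(good)
p_biased = monobit_p_value(biased)
```

Passing this single test is of course far from a proof of randomness; it is the first and weakest of a battery, and the perturbation attacks studied in the thesis aim precisely at biases that such batteries should expose.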