WorldWideScience

Sample records for random process underlying

  1. Incorrect modeling of the failure process of minimally repaired systems under random conditions: The effect on the maintenance costs

    International Nuclear Information System (INIS)

    Pulcini, Gianpaolo

    2015-01-01

    This note investigates the effect that incorrect modeling of the failure process of minimally repaired systems operating under random environmental conditions has on the costs of periodic replacement maintenance. The motivation for this note is a recently published paper in which an incorrect formulation of the expected cost per unit time under a periodic replacement policy was obtained. This incorrect formulation is due to the assumption that the intensity function of minimally repaired systems that operate under random conditions has the same functional form as the failure rate of the first failure time, and it produced an incorrect optimization of the replacement maintenance. Thus, in this note the conceptual differences between the intensity function and the failure rate of the first failure time are first highlighted. Then, the correct expressions of the expected cost and of the optimal replacement period are provided. Finally, a real application is used to measure how severe the economic consequences caused by the incorrect modeling of the failure process can be.
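
    As a minimal sketch of the cost structure discussed above (not the note's corrected model), the snippet below assumes a power-law intensity lambda(t) = (beta/eta)*(t/eta)**(beta-1) for the minimal-repair process, so the expected number of repairs in (0, T] is (T/eta)**beta, and it numerically minimizes the standard cost rate C(T) = (c_p + c_m * Lambda(T)) / T. All parameter values are placeholders.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Placeholder parameters (not taken from the paper)
beta, eta = 2.2, 1000.0   # shape and scale of the power-law intensity lambda(t)
c_m, c_p = 1.0, 25.0      # cost of one minimal repair / one planned replacement

def expected_repairs(T):
    """Expected number of minimal repairs in (0, T]: the integrated intensity (T/eta)**beta."""
    return (T / eta) ** beta

def cost_rate(T):
    """Expected cost per unit time when the system is replaced every T time units."""
    return (c_p + c_m * expected_repairs(T)) / T

res = minimize_scalar(cost_rate, bounds=(1.0, 10 * eta), method="bounded")
print(f"optimal replacement period T* ~ {res.x:.0f}, minimum cost rate ~ {res.fun:.4f}")
```

    For this particular power-law intensity the minimizer also has the closed form T* = eta * (c_p / (c_m * (beta - 1)))**(1/beta), which can be used to check the numerical answer.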

  2. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  3. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  4. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

    We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q1, q2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q1 > 1 and q2 > 1.
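
    The Monte Carlo side of this comparison is easy to reproduce at a small scale. The sketch below (with assumed values of N, T and the autoregressive coefficient, not those used in the paper) generates an N-dimensional VAR(1) process, forms the sample covariance matrix, and collects its eigenvalues over several realizations; a histogram of these eigenvalues approximates the spectral density that the FRV calculus gives analytically.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, a = 100, 400, 0.3           # keep N/T = 0.25 fixed; a is the VAR(1) coefficient

def sample_var1(N, T, a):
    """One realization of X_t = a * X_{t-1} + eps_t with iid standard normal innovations."""
    X = np.zeros((N, T))
    eps = rng.standard_normal((N, T))
    for t in range(1, T):
        X[:, t] = a * X[:, t - 1] + eps[:, t]
    return X

eigs = []
for _ in range(20):                # average over independent realizations
    X = sample_var1(N, T, a)
    C = X @ X.T / T                # sample covariance matrix
    eigs.append(np.linalg.eigvalsh(C))
eigs = np.concatenate(eigs)

density, edges = np.histogram(eigs, bins=40, density=True)   # approximates the spectral density
print("empirical spectral support: [%.3f, %.3f]" % (eigs.min(), eigs.max()))
```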

  5. Melnikov processes and chaos in randomly perturbed dynamical systems

    Science.gov (United States)

    Yagasaki, Kazuyuki

    2018-07-01

    We consider a wide class of randomly perturbed systems subjected to stationary Gaussian processes and show that chaotic orbits exist almost surely under some nondegeneracy condition, no matter how small the random forcing terms are. This result contrasts sharply with the deterministic forcing case, in which chaotic orbits exist only if the influence of the forcing terms overcomes that of the other terms in the perturbations. To obtain the result, we extend Melnikov’s method and prove that the corresponding Melnikov functions, which we call the Melnikov processes, have infinitely many zeros, so that infinitely many transverse homoclinic orbits exist. In addition, a theorem on the existence and smoothness of stable and unstable manifolds is given and the Smale–Birkhoff homoclinic theorem is extended in an appropriate form for randomly perturbed systems. We illustrate our theory for the Duffing oscillator subjected parametrically to the Ornstein–Uhlenbeck process.

  6. Effect of random microstructure on crack propagation in cortical bone tissue under dynamic loading

    International Nuclear Information System (INIS)

    Gao, X; Li, S; Adel-Wahab, A; Silberschmidt, V

    2013-01-01

    The fracture process in cortical bone tissue depends on various factors, such as bone loss, heterogeneous microstructure, variation of its material properties and accumulation of microcracks. Therefore, it is crucial to comprehend and describe the effect of the microstructure and material properties of the components of cortical bone on crack propagation in a dynamic loading regime. At the microscale level, osteonal bone demonstrates a random distribution of osteons embedded in an interstitial matrix and surrounded by a thin layer known as the cement line. Such a distribution of osteons can lead to localization of deformation processes. The global mechanical behavior of bone and the crack-propagation process are affected by such localization under external loads. Hence, the random distribution of microstructural features plays a key role in the fracture process of cortical bone. The purpose of this study is two-fold: firstly, to develop two-dimensional microstructured numerical models of cortical bone tissue in order to examine the interaction between the propagating crack and bone microstructure using an extended finite-element method under both quasi-static and dynamic loading conditions; secondly, to investigate the effect of randomly distributed microstructural constituents on the crack propagation processes and crack paths. The results of the numerical simulations show the influence of random microstructure on the global response of bone tissue at the macroscale and on the crack-propagation process under quasi-static and dynamic loading conditions.

  7. Spatial birth-and-death processes in random environment

    OpenAIRE

    Fernandez, Roberto; Ferrari, Pablo A.; Guerberoff, Gustavo R.

    2004-01-01

    We consider birth-and-death processes of objects (animals) defined in ${\bf Z}^d$ having unit death rates and random birth rates. For animals with uniformly bounded diameter we establish conditions on the rate distribution under which the following holds for almost all realizations of the birth rates: (i) the process is ergodic with at worst power-law time mixing; (ii) the unique invariant measure has exponential decay of (spatial) correlations; (iii) there exists a perfect-simulation algorit...

  8. Random processes in nuclear reactors

    CERN Document Server

    Williams, M M R

    1974-01-01

    Random Processes in Nuclear Reactors describes the problems that a nuclear engineer may meet which involve random fluctuations and sets out in detail how they may be interpreted in terms of various models of the reactor system. Chapters discuss the origins of random processes and their sources; the general technique applied to zero-power problems, bringing out the basic effect of fission, and of fluctuations in the lifetime of neutrons, on the measured response; the interpretation of power reactor noise; and associated problems connected with mechanical, hydraulic and thermal noise sources.

  9. A signal theoretic introduction to random processes

    CERN Document Server

    Howard, Roy M

    2015-01-01

    A fresh introduction to random processes utilizing signal theory. By incorporating a signal theory basis, A Signal Theoretic Introduction to Random Processes presents a unique introduction to random processes with an emphasis on the important random phenomena encountered in the electronic and communications engineering field. The strong mathematical and signal theory basis provides clarity and precision in the statement of results. The book also features: a coherent account of the mathematical fundamentals and signal theory that underpin the presented material; unique, in-depth coverage of

  10. A Campbell random process

    International Nuclear Information System (INIS)

    Reuss, J.D.; Misguich, J.H.

    1993-02-01

    The Campbell process is a stationary random process which can have various correlation functions, according to the choice of an elementary response function. The statistical properties of this process are presented. A numerical algorithm and a subroutine for generating such a process are built up and tested, for the physically interesting case of a Campbell process with Gaussian correlations. The (non-Gaussian) probability distribution appears to be similar to the Gamma distribution.
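
    A Campbell (shot-noise) process can be generated directly from its definition Y(t) = sum_k h(t - t_k), with Poisson arrival times t_k and an elementary response h. The sketch below is a generic illustration, not the subroutine described in this record; it assumes a Gaussian response (giving Gaussian correlations) and placeholder values for the pulse rate and width, and checks the empirical mean and variance against Campbell's theorem.

```python
import numpy as np

rng = np.random.default_rng(1)

def campbell_process(t, rate=5.0, tau=1.0):
    """One realization of Y(t) = sum_k h(t - t_k), with Poisson arrivals t_k of
    intensity `rate` and Gaussian elementary response h(s) = exp(-s**2 / (2 tau**2))."""
    lo, hi = t[0] - 6 * tau, t[-1] + 6 * tau          # pad so edge effects are negligible
    n_pulses = rng.poisson(rate * (hi - lo))
    t_k = rng.uniform(lo, hi, size=n_pulses)
    return np.exp(-(t[:, None] - t_k[None, :]) ** 2 / (2 * tau**2)).sum(axis=1)

t = np.linspace(0.0, 200.0, 4001)
y = campbell_process(t)

# Campbell's theorem: mean = rate * integral(h), variance = rate * integral(h**2)
rate, tau = 5.0, 1.0
print("mean:    ", y.mean(), " theory:", rate * tau * np.sqrt(2 * np.pi))
print("variance:", y.var(),  " theory:", rate * tau * np.sqrt(np.pi))
```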

  11. Asymptotic theory of weakly dependent random processes

    CERN Document Server

    Rio, Emmanuel

    2017-01-01

    Presenting tools to aid understanding of asymptotic theory and weakly dependent processes, this book is devoted to inequalities and limit theorems for sequences of random variables that are strongly mixing in the sense of Rosenblatt, or absolutely regular. The first chapter introduces covariance inequalities under strong mixing or absolute regularity. These covariance inequalities are applied in Chapters 2, 3 and 4 to moment inequalities, rates of convergence in the strong law, and central limit theorems. Chapter 5 concerns coupling. In Chapter 6 new deviation inequalities and new moment inequalities for partial sums via the coupling lemmas of Chapter 5 are derived and applied to the bounded law of the iterated logarithm. Chapters 7 and 8 deal with the theory of empirical processes under weak dependence. Lastly, Chapter 9 describes links between ergodicity, return times and rates of mixing in the case of irreducible Markov chains. Each chapter ends with a set of exercises. The book is an updated and extended ...

  12. Convergence to equilibrium under a random Hamiltonian

    Science.gov (United States)

    Brandão, Fernando G. S. L.; Ćwikliński, Piotr; Horodecki, Michał; Horodecki, Paweł; Korbicz, Jarosław K.; Mozrzymas, Marek

    2012-09-01

    We analyze equilibration times of subsystems of a larger system under a random total Hamiltonian, in which the basis of the Hamiltonian is drawn from the Haar measure. We obtain that the time of equilibration is of the order of the inverse of the arithmetic average of the Bohr frequencies. To compute the average over a random basis, we compute the inverse of a matrix of overlaps of operators which permute four systems. We first obtain results on such a matrix for a representation of an arbitrary finite group and then apply it to the particular representation of the permutation group under consideration.
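
    The headline statement, that the equilibration time scales like the inverse of the arithmetic average of the Bohr frequencies, is easy to probe numerically. The toy sketch below draws a GUE Hamiltonian (whose eigenbasis is Haar-distributed) and evaluates that average; the ensemble, dimension and normalization are assumptions made for illustration, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(12)
d = 200                                      # Hilbert-space dimension (placeholder)

# GUE Hamiltonian: Hermitian matrix whose eigenbasis is Haar-distributed
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2

E = np.linalg.eigvalsh(H)
bohr = np.abs(E[:, None] - E[None, :])       # Bohr frequencies |E_i - E_j|
bohr = bohr[~np.eye(d, dtype=bool)]          # discard the i == j entries

print("average Bohr frequency:", bohr.mean())
print("equilibration-time estimate ~", 1.0 / bohr.mean())
```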

  13. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    Science.gov (United States)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes $N_\alpha(t)$, $N_\beta(t)$, $t>0$, we have that $N_\alpha(N_\beta(t)) \stackrel{d}{=} \sum_{j=1}^{N_\beta(t)} X_j$, where the $X_j$ are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form $N_\alpha(\tau_k^\nu)$, $\nu\in(0,1]$, where $\tau_k^\nu$ is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form $\Theta(N(t))$, $t>0$, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
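
    The identity $N_\alpha(N_\beta(t)) \stackrel{d}{=} \sum_{j=1}^{N_\beta(t)} X_j$ can be checked by simulation: conditional on $N_\beta(t)=k$, the left side is Poisson with mean $\lambda_\alpha k$, which is exactly the distribution of a sum of $k$ independent Poisson($\lambda_\alpha$) variables. The sketch below compares the two constructions empirically for assumed rates.

```python
import numpy as np

rng = np.random.default_rng(2)
lam_a, lam_b, t, n_samples = 2.0, 3.0, 1.5, 50_000

# Left side: the outer Poisson process evaluated at the random time N_beta(t)
N_beta = rng.poisson(lam_b * t, size=n_samples)
lhs = rng.poisson(lam_a * N_beta)           # Poisson(lam_a * k) given N_beta(t) = k

# Right side: a random sum of N_beta(t) iid Poisson(lam_a) variables
rhs = np.array([rng.poisson(lam_a, size=k).sum() for k in N_beta])

print("means:    ", lhs.mean(), rhs.mean())   # both close to lam_a * lam_b * t
print("variances:", lhs.var(),  rhs.var())
```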

  14. Elements of random walk and diffusion processes

    CERN Document Server

    Ibe, Oliver C

    2013-01-01

    Presents an important and unique introduction to random walk theory. Random walk is a stochastic process that has proven to be a useful model in understanding discrete-state discrete-time processes across a wide spectrum of scientific disciplines. Elements of Random Walk and Diffusion Processes provides an interdisciplinary approach by including numerous practical examples and exercises with real-world applications in operations research, economics, engineering, and physics. Featuring an introduction to powerful and general techniques that are used in the application of physical and dynamic

  15. Consistency under sampling of exponential random graph models.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  16. Stochastic Games for Continuous-Time Jump Processes Under Finite-Horizon Payoff Criterion

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Qingda, E-mail: weiqd@hqu.edu.cn [Huaqiao University, School of Economics and Finance (China); Chen, Xian, E-mail: chenxian@amss.ac.cn [Peking University, School of Mathematical Sciences (China)

    2016-10-15

    In this paper we study two-person nonzero-sum games for continuous-time jump processes with randomized history-dependent strategies under the finite-horizon payoff criterion. The state space is countable, and the transition rates and payoff functions are allowed to be unbounded from above and from below. Under suitable conditions, we introduce a new topology for the set of all randomized Markov multi-strategies and establish its compactness and metrizability. Then, by constructing approximating sequences of the transition rates and payoff functions, we show that the optimal value function for each player is the unique solution to the corresponding optimality equation and obtain the existence of a randomized Markov Nash equilibrium. Furthermore, we illustrate the applications of our main results with a controlled birth and death system.

  17. Stochastic Games for Continuous-Time Jump Processes Under Finite-Horizon Payoff Criterion

    International Nuclear Information System (INIS)

    Wei, Qingda; Chen, Xian

    2016-01-01

    In this paper we study two-person nonzero-sum games for continuous-time jump processes with randomized history-dependent strategies under the finite-horizon payoff criterion. The state space is countable, and the transition rates and payoff functions are allowed to be unbounded from above and from below. Under suitable conditions, we introduce a new topology for the set of all randomized Markov multi-strategies and establish its compactness and metrizability. Then, by constructing approximating sequences of the transition rates and payoff functions, we show that the optimal value function for each player is the unique solution to the corresponding optimality equation and obtain the existence of a randomized Markov Nash equilibrium. Furthermore, we illustrate the applications of our main results with a controlled birth and death system.

  18. Stochastic stability of mechanical systems under renewal jump process parametric excitation

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R.K.; Larsen, Jesper Winther

    2005-01-01

    A dynamic system under parametric excitation in the form of a non-Erlang renewal jump process is considered. The excitation is a random train of nonoverlapping rectangular pulses with equal, deterministic heights. The time intervals between two consecutive jumps up (or down) are the sum of two...

  19. Performance of Power Systems under Sustained Random Perturbations

    Directory of Open Access Journals (Sweden)

    Humberto Verdejo

    2014-01-01

    This paper studies linear systems under sustained additive random perturbations. The stable operating point of an electric power system is replaced by an attracting stationary solution if the system is subjected to (small) random additive perturbations. The invariant distribution of this stationary solution gives rise to several performance indices that measure how well the system copes with the randomness. These indices are introduced, showing how they can be used for the optimal tuning of system parameters in the presence of noise. Results on a four-generator two-area system are presented and discussed.
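
    For a linear model dx = A x dt + B dW driven by white noise, the invariant distribution mentioned above is Gaussian with covariance P solving the Lyapunov equation A P + P A^T + B B^T = 0, and scalar performance indices can be read off from P (for instance its trace). The sketch below uses a made-up two-state Hurwitz system, not the four-generator two-area model of the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical two-state linearized system dx = A x dt + B dW (not the paper's model)
A = np.array([[0.0,  1.0],
              [-1.2, -0.4]])        # A must be Hurwitz for a stationary solution to exist
B = np.array([[0.0],
              [0.3]])               # noise enters through the second state only

# Stationary covariance P of the invariant distribution: A P + P A^T = -B B^T
P = solve_continuous_lyapunov(A, -B @ B.T)

print("stationary covariance P:\n", P)
print("performance index trace(P):", np.trace(P))
```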

  20. A Computerized Approach to Trickle-Process, Random Assignment.

    Science.gov (United States)

    Braucht, G. Nicholas; Reichardt, Charles S.

    1993-01-01

    Procedures for implementing random assignment with trickle processing and ways they can be corrupted are described. A computerized method for implementing random assignment with trickle processing is presented as a desirable alternative in many situations and a way of protecting against threats to assignment validity. (SLD)
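
    The paper's specific software is not described in this record, so the sketch below only illustrates one common way to computerize trickle-process random assignment, namely permuted-block randomization that assigns each participant at the moment of enrollment; the block size, condition labels, and seeding are assumptions made for the example.

```python
import random

def make_trickle_assigner(conditions=("treatment", "control"), block_size=4, seed=None):
    """Assign participants one at a time as they 'trickle' in, drawing from randomly
    permuted blocks so that group sizes stay balanced and upcoming assignments
    cannot be anticipated (and hence corrupted) by study staff."""
    rng = random.Random(seed)
    block = []

    def assign(participant_id):
        nonlocal block
        if not block:                                   # start a freshly shuffled block
            per_condition = block_size // len(conditions)
            block = [c for c in conditions for _ in range(per_condition)]
            rng.shuffle(block)
        return participant_id, block.pop()

    return assign

assign = make_trickle_assigner(seed=42)
for pid in ["P001", "P002", "P003", "P004", "P005"]:
    print(assign(pid))
```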

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  2. On the joint statistics of stable random processes

    International Nuclear Information System (INIS)

    Hopcraft, K I; Jakeman, E

    2011-01-01

    A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)

  3. Dynamic analysis and reliability assessment of structures with uncertain-but-bounded parameters under stochastic process excitations

    International Nuclear Information System (INIS)

    Do, Duy Minh; Gao, Wei; Song, Chongmin; Tangaramvong, Sawekchai

    2014-01-01

    This paper presents the non-deterministic dynamic analysis and reliability assessment of structures with uncertain-but-bounded parameters under stochastic process excitations. Random ground acceleration from earthquake motion is adopted to illustrate the stochastic process force. The exact change ranges of natural frequencies, random vibration displacement and stress responses of structures are investigated under the interval analysis framework. Formulations for structural reliability are developed considering the safe boundary and structural random vibration responses as interval parameters. An improved particle swarm optimization algorithm, namely randomised lower sequence initialized high-order nonlinear particle swarm optimization algorithm, is employed to capture the better bounds of structural dynamic characteristics, random vibration responses and reliability. Three numerical examples are used to demonstrate the presented method for interval random vibration analysis and reliability assessment of structures. The accuracy of the results obtained by the presented method is verified by the randomised Quasi-Monte Carlo simulation method (QMCSM) and direct Monte Carlo simulation method (MCSM). - Highlights: • Interval uncertainty is introduced into structural random vibration responses. • Interval dynamic reliability assessments of structures are implemented. • Boundaries of structural dynamic response and reliability are achieved

  4. Discrete random signal processing and filtering primer with Matlab

    CERN Document Server

    Poularikas, Alexander D

    2013-01-01

    Engineers in all fields will appreciate a practical guide that combines several new effective MATLAB® problem-solving approaches and the very latest in discrete random signal processing and filtering. Numerous Useful Examples, Problems, and Solutions - An Extensive and Powerful Review. Written for practicing engineers seeking to strengthen their practical grasp of random signal processing, Discrete Random Signal Processing and Filtering Primer with MATLAB provides the opportunity to doubly enhance their skills. The author, a leading expert in the field of electrical and computer engineering, offe

  5. Characterisation of random Gaussian and non-Gaussian stress processes in terms of extreme responses

    Directory of Open Access Journals (Sweden)

    Colin Bruno

    2015-01-01

    In the field of military land vehicles, random vibration processes generated by all-terrain wheeled vehicles in motion are not classical stochastic processes with a stationary and Gaussian nature. Non-stationarity of the processes, induced by the variability of the vehicle speed, does not pose a major difficulty because the designer can have good control over the vehicle speed by characterising the histogram of the instantaneous speed of the vehicle during an operational situation. Beyond this non-stationarity problem, the main difficulty clearly lies in the fact that the random processes are not Gaussian and are generated mainly by the non-linear behaviour of the undercarriage and the strong occurrence of shocks caused by the roughness of the terrain. This non-Gaussian nature is expressed particularly by very high flattening (kurtosis) levels that can affect the design of structures under extreme stresses conventionally acquired by spectral approaches, which are inherent to Gaussian processes and based essentially on the spectral moments of the stress processes. Due to these technical considerations, techniques for the characterisation of random excitation processes generated by this type of carrier need to be changed, by proposing innovative characterisation methods based on time domain approaches, as described in the body of the text, rather than spectral domain approaches.
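
    The spectral quantities mentioned above are straightforward to compute from a measured stress history. The sketch below estimates a one-sided PSD with Welch's method, forms the spectral moments m_n = integral of f**n * S(f) df, and reports the irregularity factor m_2 / sqrt(m_0 * m_4) together with the kurtosis ("flattening") of the signal; the Gaussian test signal is only a placeholder for real vehicle data.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import kurtosis

rng = np.random.default_rng(3)
fs = 1024.0
x = rng.standard_normal(2**16)            # placeholder stress signal (Gaussian here)

f, S = welch(x, fs=fs, nperseg=4096)      # one-sided PSD estimate

def spectral_moment(f, S, n):
    """m_n = integral of f**n * S(f) over frequency."""
    return np.trapz(f**n * S, f)

m0, m2, m4 = (spectral_moment(f, S, n) for n in (0, 2, 4))

print("irregularity factor m2 / sqrt(m0*m4):", m2 / np.sqrt(m0 * m4))
print("excess kurtosis (0 for a Gaussian signal):", kurtosis(x))
```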

  6. Pseudo random signal processing theory and application

    CERN Document Server

    Zepernick, Hans-Jurgen

    2013-01-01

    In recent years, pseudo random signal processing has proven to be a critical enabler of modern communication, information, security and measurement systems. The signal's pseudo random, noise-like properties make it vitally important as a tool for protecting against interference, alleviating multipath propagation and allowing the potential of sharing bandwidth with other users. Taking a practical approach to the topic, this text provides a comprehensive and systematic guide to understanding and using pseudo random signals. Covering theoretical principles, design methodologies and applications
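
    A classic building block for pseudo random signals is the maximal-length sequence produced by a linear feedback shift register. The sketch below is a generic 5-stage Fibonacci LFSR with taps at stages 5 and 3 (a primitive configuration), not an example taken from the book; one period of the output has length 2**5 - 1 = 31 and shows the balance property typical of m-sequences.

```python
def lfsr_bits(taps=(5, 3), state=0b10101, nbits=62):
    """Generate bits from a Fibonacci LFSR. `taps` are 1-indexed stage numbers whose
    XOR forms the feedback; with taps (5, 3) and a nonzero seed the output is a
    maximal-length (m-)sequence of period 2**5 - 1 = 31."""
    n_stages = max(taps)
    out = []
    for _ in range(nbits):
        out.append(state & 1)                      # output the last stage
        feedback = 0
        for t in taps:
            feedback ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (feedback << (n_stages - 1))
    return out

bits = lfsr_bits()
print("".join(map(str, bits)))
print("ones in one period (m-sequence balance):", sum(bits[:31]), "of 31")
```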

  7. Level sets and extrema of random processes and fields

    CERN Document Server

    Azais, Jean-Marc

    2009-01-01

    A timely and comprehensive treatment of random field theory with applications across diverse areas of study Level Sets and Extrema of Random Processes and Fields discusses how to understand the properties of the level sets of paths as well as how to compute the probability distribution of its extremal values, which are two general classes of problems that arise in the study of random processes and fields and in related applications. This book provides a unified and accessible approach to these two topics and their relationship to classical theory and Gaussian processes and fields, and the most modern research findings are also discussed. The authors begin with an introduction to the basic concepts of stochastic processes, including a modern review of Gaussian fields and their classical inequalities. Subsequent chapters are devoted to Rice formulas, regularity properties, and recent results on the tails of the distribution of the maximum. Finally, applications of random fields to various areas of mathematics a...
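
    A central tool in this area is the Rice formula for the mean rate of upcrossings of a level u by a stationary Gaussian process, nu_u = (1/(2*pi)) * sqrt(m2/m0) * exp(-u**2 / (2*m0)), where m0 and m2 are the variances of the process and of its derivative. The sketch below checks it by Monte Carlo for a band-limited process synthesized as a random-phase sum of cosines; the flat band and the level u are arbitrary choices, not examples from the book.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stationary Gaussian process with a flat one-sided PSD on the band [0.5, 2.0] Hz,
# synthesized as a sum of cosines with random phases.
f = np.linspace(0.5, 2.0, 200)
df = f[1] - f[0]
S = np.ones_like(f)
amp = np.sqrt(2 * S * df)
phi = rng.uniform(0, 2 * np.pi, size=f.size)

t = np.arange(0.0, 1000.0, 0.02)
x = np.zeros_like(t)
for a, fk, p in zip(amp, f, phi):
    x += a * np.cos(2 * np.pi * fk * t + p)

m0 = np.trapz(S, f)                            # variance of x
m2 = np.trapz((2 * np.pi * f) ** 2 * S, f)     # variance of dx/dt

u = 1.0
rice = np.sqrt(m2 / m0) / (2 * np.pi) * np.exp(-u**2 / (2 * m0))
empirical = np.sum((x[:-1] < u) & (x[1:] >= u)) / t[-1]
print("Rice upcrossing rate:", rice, "  empirical:", empirical)
```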

  8. Solution-Processed Carbon Nanotube True Random Number Generator.

    Science.gov (United States)

    Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C

    2017-08-09

    With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.
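
    The paper's contribution is the SWCNT SRAM hardware itself; purely as a software illustration of the downstream bit-processing ideas (not the authors' pipeline), the sketch below models a slightly biased physical bit source, applies a von Neumann extractor to remove the bias, and runs the NIST-style frequency (monobit) test on both streams. The 2% bias value is an assumption.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(5)

# Raw bits from a noisy physical source, modeled here as slightly biased coin flips
raw = (rng.random(200_000) < 0.52).astype(np.uint8)     # assumed 2% bias toward 1

def von_neumann(bits):
    """Debias using non-overlapping pairs: 01 -> 0, 10 -> 1, 00/11 -> discard."""
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

def monobit_p_value(bits):
    """Frequency (monobit) test: p-value of the normalized +/-1 partial sum."""
    s = np.sum(2 * bits.astype(np.int64) - 1)
    return erfc(abs(s) / sqrt(2 * len(bits)))

out = von_neumann(raw)
print("kept", out.size, "of", raw.size, "raw bits")
print("monobit p-value, raw:", monobit_p_value(raw), " debiased:", monobit_p_value(out))
```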

  9. Scaling behaviour of randomly alternating surface growth processes

    International Nuclear Information System (INIS)

    Raychaudhuri, Subhadip; Shapir, Yonathan

    2002-01-01

    The scaling properties of the roughness of surfaces grown by two different processes randomly alternating in time are addressed. The duration of each application of the two primary processes is assumed to be independently drawn from given distribution functions. We analytically address processes in which the two primary processes are linear and extend the conclusions to nonlinear processes as well. The growth scaling exponent of the average roughness with the number of applications is found to be determined by the long time tail of the distribution functions. For processes in which both mean application times are finite, the scaling behaviour follows that of the corresponding cyclical process in which the uniform application time of each primary process is given by its mean. If the distribution functions decay with a small enough power law for the mean application times to diverge, the growth exponent is found to depend continuously on this power-law exponent. In contrast, the roughness exponent does not depend on the timing of the applications. The analytical results are supported by numerical simulations of various pairs of primary processes and with different distribution functions. Self-affine surfaces grown by two randomly alternating processes are common in nature (e.g., due to randomly changing weather conditions) and in man-made devices such as rechargeable batteries

  10. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    Science.gov (United States)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.

  11. X-ray microtomography study of the compaction process of rods under tapping.

    Science.gov (United States)

    Fu, Yang; Xi, Yan; Cao, Yixin; Wang, Yujie

    2012-05-01

    We present an x-ray microtomography study of the compaction process of cylindrical rods under tapping. The process is monitored by measuring the evolution of the orientational order parameter, local, and overall packing densities as a function of the tapping number for different tapping intensities. The slow relaxation dynamics of the orientational order parameter can be well fitted with a stretched-exponential law with stretching exponents ranging from 0.9 to 1.6. The corresponding relaxation time versus tapping intensity follows an Arrhenius behavior which is reminiscent of the slow dynamics in thermal glassy systems. We also investigated the boundary effect on the ordering process and found that boundary rods order faster than interior ones. In searching for the underlying mechanism of the slow dynamics, we estimated the initial random velocities of the rods under tapping and found that the ordering process is compatible with a diffusion mechanism. The average coordination number as a function of the tapping number at different tapping intensities has also been measured, which spans a range from 6 to 8.
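
    The stretched-exponential relaxation reported above can be fitted with a standard nonlinear least-squares routine. The sketch below fits S(n) = S_inf - A*exp(-(n/tau)**beta) to synthetic "order parameter versus tap number" data; the synthetic data and the starting values are placeholders, not the experimental measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)

# Synthetic "order parameter vs. tap number" data standing in for the experiment
taps = np.arange(1, 301, dtype=float)
truth = 0.8 - 0.6 * np.exp(-((taps / 60.0) ** 1.2))
data = truth + 0.01 * rng.standard_normal(taps.size)

def stretched_exp(n, s_inf, a, tau, beta):
    """Kohlrausch (stretched-exponential) law S(n) = S_inf - A*exp(-(n/tau)**beta)."""
    return s_inf - a * np.exp(-((n / tau) ** beta))

popt, _ = curve_fit(stretched_exp, taps, data, p0=(0.8, 0.6, 50.0, 1.0))
print("fitted S_inf, A, tau, beta:", popt)     # beta should come back close to 1.2
```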

  12. The Damage Effects in Steel Bridges under Highway Random Loading

    DEFF Research Database (Denmark)

    Agerskov, Henning; Nielsen, Jette Andkjær

    1996-01-01

    In the present investigation, fatigue damage accumulation in steel bridges under highway random loading is studied. In the experimental part of the investigation, fatigue test series on welded plate test specimens have been carried through. The fatigue tests have been carried out using load histories … The results indicate that the linear fatigue damage accumulation formula, which is normally used in the design against fatigue in steel bridges, may give results which are unconservative.
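
    The linear damage accumulation formula referred to above is Miner's rule, D = sum_i n_i / N_i, with failure predicted at D = 1. The sketch below evaluates it for a counted stress-range spectrum using a Basquin-type S-N curve N(S) = C * S**(-m); the S-N constants and the Rayleigh-distributed spectrum are placeholders, not data from the test series.

```python
import numpy as np

# Basquin-type S-N curve N(S) = C * S**(-m); the constants are placeholder values
C, m = 2.0e12, 3.0

def miner_damage(stress_ranges):
    """Palmgren-Miner linear damage sum D = sum_i n_i / N_i, with one array entry
    per counted cycle, so n_i = 1 for every entry."""
    N_allowed = C * np.asarray(stress_ranges, dtype=float) ** (-m)
    return np.sum(1.0 / N_allowed)

rng = np.random.default_rng(7)
spectrum = rng.rayleigh(scale=40.0, size=2_000_000)   # stand-in for a counted load history (MPa)

D = miner_damage(spectrum)
print("Miner damage sum D =", D, " (failure predicted when D reaches 1.0)")
```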

  13. A prospective randomized study comparing percutaneous nephrolithotomy under combined spinal-epidural anesthesia with percutaneous nephrolithotomy under general anesthesia.

    Science.gov (United States)

    Singh, Vishwajeet; Sinha, Rahul Janak; Sankhwar, S N; Malik, Anita

    2011-01-01

    A prospective randomized study was conducted to compare the surgical parameters and stone clearance in patients who underwent percutaneous nephrolithotomy (PNL) under combined spinal-epidural anesthesia (CSEA) versus those who underwent PNL under general anesthesia (GA). Between January 2008 and December 2009, 64 patients with renal calculi were randomized into 2 groups and evaluated for the purpose of this study. Group 1 consisted of patients who underwent PNL under CSEA and Group 2 consisted of patients who underwent PNL under GA. The operative time, stone clearance rate, visual pain analog score, mean analgesic dose and mean hospital stay were compared, amongst other parameters. The differences in postoperative visual pain analog score and analgesic requirement between the two groups were statistically significant. PNL under CSEA is as effective and safe as PNL under GA. Patients who undergo PNL under CSEA require a lower analgesic dose and have a shorter hospital stay. Copyright © 2011 S. Karger AG, Basel.

  14. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  15. Crack closure and growth behavior of short fatigue cracks under random loading (part I : details of crack closure behavior)

    International Nuclear Information System (INIS)

    Lee, Shin Young; Song, Ji Ho

    2000-01-01

    Crack closure and growth behavior of physically short fatigue cracks under random loading are investigated by performing narrow- and wide-band random loading tests for various stress ratios. Artificially prepared two-dimensional, short through-thickness cracks are used. The closure behavior of short cracks under random loading is discussed in comparison with that of short cracks under constant-amplitude loading and also that of long cracks under random loading. Irrespective of the random loading spectrum or block length, the crack opening load of short cracks is much lower under random loading than under constant-amplitude loading corresponding to the largest load cycle in a random load history, contrary to the behavior of long cracks, for which the crack opening load under random loading is nearly the same as or slightly higher than the constant-amplitude results. This result indicates that the largest load cycle in a random load history has the effect of enhancing crack opening of short cracks.

  16. Renewal theory for perturbed random walks and similar processes

    CERN Document Server

    Iksanov, Alexander

    2016-01-01

    This book offers a detailed review of perturbed random walks, perpetuities, and random processes with immigration. Being of major importance in modern probability theory, both theoretical and applied, these objects have been used to model various phenomena in the natural sciences as well as in insurance and finance. The book also presents the many significant results and efficient techniques and methods that have been worked out in the last decade. The first chapter is devoted to perturbed random walks and discusses their asymptotic behavior and various functionals pertaining to them, including supremum and first-passage time. The second chapter examines perpetuities, presenting results on continuity of their distributions and the existence of moments, as well as weak convergence of divergent perpetuities. Focusing on random processes with immigration, the third chapter investigates the existence of moments, describes long-time behavior and discusses limit theorems, both with and without scaling. Chapters fou...

  17. Time-varying output performances of piezoelectric vibration energy harvesting under nonstationary random vibrations

    Science.gov (United States)

    Yoon, Heonjun; Kim, Miso; Park, Choon-Su; Youn, Byeng D.

    2018-01-01

    Piezoelectric vibration energy harvesting (PVEH) has received much attention as a potential solution that could ultimately realize self-powered wireless sensor networks. Since most ambient vibrations in nature are inherently random and nonstationary, the output performances of PVEH devices also randomly change with time. However, little attention has been paid to investigating the randomly time-varying electroelastic behaviors of PVEH systems both analytically and experimentally. The objective of this study is thus to make a step forward towards a deep understanding of the time-varying performances of PVEH devices under nonstationary random vibrations. Two typical cases of nonstationary random vibration signals are considered: (1) randomly-varying amplitude (amplitude modulation; AM) and (2) randomly-varying amplitude with randomly-varying instantaneous frequency (amplitude and frequency modulation; AM-FM). In both cases, this study pursues well-balanced correlations of analytical predictions and experimental observations to deduce the relationships between the time-varying output performances of the PVEH device and two primary input parameters, such as a central frequency and an external electrical resistance. We introduce three correlation metrics to quantitatively compare analytical prediction and experimental observation, including the normalized root mean square error, the correlation coefficient, and the weighted integrated factor. Analytical predictions are in an excellent agreement with experimental observations both mechanically and electrically. This study provides insightful guidelines for designing PVEH devices to reliably generate electric power under nonstationary random vibrations.
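
    Two of the three correlation metrics named above have standard definitions and are easy to compute for any prediction/measurement pair; the weighted integrated factor is specific to the paper and is not reproduced here. The sketch below uses synthetic voltage envelopes as placeholder data.

```python
import numpy as np

def nrmse(pred, meas):
    """Root-mean-square error normalized by the range of the measured signal."""
    pred, meas = np.asarray(pred, float), np.asarray(meas, float)
    return np.sqrt(np.mean((pred - meas) ** 2)) / (meas.max() - meas.min())

def correlation_coefficient(pred, meas):
    """Pearson correlation coefficient between prediction and measurement."""
    return np.corrcoef(pred, meas)[0, 1]

# Placeholder output-voltage envelopes standing in for analytical vs. measured data
t = np.linspace(0.0, 10.0, 1000)
measured = np.abs(np.sin(t)) * (1 + 0.05 * np.random.default_rng(8).standard_normal(t.size))
predicted = np.abs(np.sin(t))

print("NRMSE:", nrmse(predicted, measured))
print("correlation coefficient:", correlation_coefficient(predicted, measured))
```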

  18. Circular random motion in diatom gliding under isotropic conditions

    International Nuclear Information System (INIS)

    Gutiérrez-Medina, Braulio; Maldonado, Ana Iris Peña; Guerra, Andrés Jiménez; Rubio, Yadiralia Covarrubias; Meza, Jessica Viridiana García

    2014-01-01

    How cells migrate has been investigated primarily for the case of trajectories composed of joined straight segments. In contrast, little is known when cellular motion follows intrinsically curved paths. Here, we use time-lapse optical microscopy and automated trajectory tracking to investigate how individual cells of the diatom Nitzschia communis glide across surfaces under isotropic environmental conditions. We find a distinct kind of random motion, where trajectories are formed by circular arcs traveled at constant speed, alternated with random stoppages, direction reversals and changes in the orientation of the arcs. Analysis of experimental and computer-simulated trajectories shows that the circular random motion of diatom gliding is not optimized for long-distance travel but rather for recurrent coverage of a limited surface area. These results suggest that one main biological role for this type of diatom motility is to efficiently build the foundation of algal biofilms. (paper)

  19. Money creation process in a random redistribution model

    Science.gov (United States)

    Chen, Siyan; Wang, Yougui; Li, Keqiang; Wu, Jinshan

    2014-01-01

    In this paper, the dynamical process of money creation in a random exchange model with debt is investigated. The money creation kinetics are analyzed by both the money-transfer matrix method and the diffusion method. From both approaches, we attain the same conclusion: the source of money creation in the case of random exchange is the agents with neither money nor debt. These analytical results are demonstrated by computer simulations.
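
    A minimal version of a random exchange model with debt makes the mechanism quantifiable: when an agent with a zero balance pays by going into debt, the total stock of (positive) money grows by one unit. The sketch below is a generic toy model with assumed parameters, not the specific transfer model or the matrix/diffusion analysis of the paper.

```python
import numpy as np

rng = np.random.default_rng(9)
n_agents, n_steps, debt_limit = 1000, 100_000, 5
balance = np.zeros(n_agents, dtype=int)    # all agents start with neither money nor debt

for _ in range(n_steps):
    payer, receiver = rng.integers(n_agents, size=2)
    if payer != receiver and balance[payer] > -debt_limit:
        balance[payer] -= 1                # paying from a zero balance creates debt...
        balance[receiver] += 1             # ...and, at the same moment, new money

money_supply = balance[balance > 0].sum()
total_debt = -balance[balance < 0].sum()
print("money supply:", money_supply, " total debt:", total_debt)   # equal by construction
```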

  20. Quasi-steady-state analysis of two-dimensional random intermittent search processes

    KAUST Repository

    Bressloff, Paul C.

    2011-06-01

    We use perturbation methods to analyze a two-dimensional random intermittent search process, in which a searcher alternates between a diffusive search phase and a ballistic movement phase whose velocity direction is random. A hidden target is introduced within a rectangular domain with reflecting boundaries. If the searcher moves within range of the target and is in the search phase, it has a chance of detecting the target. A quasi-steady-state analysis is applied to the corresponding Chapman-Kolmogorov equation. This generates a reduced Fokker-Planck description of the search process involving a nonzero drift term and an anisotropic diffusion tensor. In the case of a uniform direction distribution, for which there is zero drift, and isotropic diffusion, we use the method of matched asymptotics to compute the mean first passage time (MFPT) to the target, under the assumption that the detection range of the target is much smaller than the size of the domain. We show that an optimal search strategy exists, consistent with previous studies of intermittent search in a radially symmetric domain that were based on a decoupling or moment closure approximation. We also show how the decoupling approximation can break down in the case of biased search processes. Finally, we analyze the MFPT in the case of anisotropic diffusion and find that anisotropy can be useful when the searcher starts from a fixed location. © 2011 American Physical Society.

  1. Quasi-steady-state analysis of two-dimensional random intermittent search processes

    KAUST Repository

    Bressloff, Paul C.; Newby, Jay M.

    2011-01-01

    We use perturbation methods to analyze a two-dimensional random intermittent search process, in which a searcher alternates between a diffusive search phase and a ballistic movement phase whose velocity direction is random. A hidden target is introduced within a rectangular domain with reflecting boundaries. If the searcher moves within range of the target and is in the search phase, it has a chance of detecting the target. A quasi-steady-state analysis is applied to the corresponding Chapman-Kolmogorov equation. This generates a reduced Fokker-Planck description of the search process involving a nonzero drift term and an anisotropic diffusion tensor. In the case of a uniform direction distribution, for which there is zero drift, and isotropic diffusion, we use the method of matched asymptotics to compute the mean first passage time (MFPT) to the target, under the assumption that the detection range of the target is much smaller than the size of the domain. We show that an optimal search strategy exists, consistent with previous studies of intermittent search in a radially symmetric domain that were based on a decoupling or moment closure approximation. We also show how the decoupling approximation can break down in the case of biased search processes. Finally, we analyze the MFPT in the case of anisotropic diffusion and find that anisotropy can be useful when the searcher starts from a fixed location. © 2011 American Physical Society.

  2. Probabilistic SSME blades structural response under random pulse loading

    Science.gov (United States)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.

  3. Multifractal detrended fluctuation analysis of analog random multiplicative processes

    Energy Technology Data Exchange (ETDEWEB)

    Silva, L.B.M.; Vermelho, M.V.D. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil); Lyra, M.L. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)], E-mail: marcelo@if.ufal.br; Viswanathan, G.M. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)

    2009-09-15

    We investigate non-Gaussian statistical properties of stationary stochastic signals generated by an analog circuit that simulates a random multiplicative process with weak additive noise. The random noises originate from thermal shot noise and avalanche processes, while the multiplicative process is generated by a fully analog circuit. The resulting signal describes stochastic time series of current interest in several areas such as turbulence, finance, biology and environment, which exhibit power-law distributions. Specifically, we study the correlation properties of the signal by employing a detrended fluctuation analysis and explore its multifractal nature. The singularity spectrum is obtained and analyzed as a function of the control circuit parameter that tunes the asymptotic power-law form of the probability distribution function.
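
    Detrended fluctuation analysis itself is a short, standard algorithm: integrate the mean-subtracted signal, fit and remove a local linear trend in windows of size n, and read the scaling exponent from log F(n) versus log n. The sketch below is a generic first-order DFA applied to white noise (exponent close to 0.5), not the multifractal analysis of the circuit data.

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis: F(n) for each window size n."""
    y = np.cumsum(x - np.mean(x))               # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        segments = y[: n_win * n].reshape(n_win, n)
        t = np.arange(n)
        resid = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)         # local linear detrending
            resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(resid)))
    return np.array(F)

rng = np.random.default_rng(10)
x = rng.standard_normal(2**14)                   # placeholder signal, not circuit data
scales = np.unique(np.logspace(1.0, 3.0, 12).astype(int))

F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("DFA scaling exponent:", alpha)            # ~0.5 for uncorrelated noise
```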

  4. Fatigue in Aluminum Highway Bridges under Random Loading

    DEFF Research Database (Denmark)

    Rom, Søren; Agerskov, Henning

    2014-01-01

    Fatigue damage accumulation in aluminum highway bridges under random loading is studied. The fatigue life of welded joints has been determined both experimentally and from a fracture mechanics analysis. In the experimental part of the investigation, fatigue test series on welded plate test specimens have been carried through. … The results indicate that Miner’s rule, which is normally used in the design against fatigue in aluminum bridges, may give results which are unconservative. The validity of the results obtained from Miner’s rule will depend on the distribution of the load history in tension and compression.

  5. The Initial Regression Statistical Characteristics of Intervals Between Zeros of Random Processes

    Directory of Open Access Journals (Sweden)

    V. K. Hohlov

    2014-01-01

    The article substantiates the initial regression statistical characteristics of intervals between zeros of realizations of random processes and studies the properties that allow these features to be used in autonomous information systems (AIS) of near location (NL). Coefficients of initial regression (CIR) that minimize the residual sum of squares of multiple initial regression views are justified on the basis of vector representations associated with a random-vector notion of the analyzed signal parameters. It is shown that, even with no covariance-based private CIR, it is possible to predict one random variable through another with respect to the deterministic components. The paper studies the dependence of the CIR of interval sizes between zeros of a narrowband wide-sense-stationary random process on its energy spectrum. Particular CIRs for random processes with Gaussian and rectangular energy spectra are obtained. It is shown that the considered CIRs do not depend on the average frequency of the spectra, are determined by the relative bandwidth of the energy spectra, and weakly depend on the type of spectrum. These CIR properties enable their use as an informative parameter when implementing time-domain regression methods of signal processing that are invariant to the average rate and variance of the input realizations. We consider estimates of the average energy spectrum frequency of a stationary random process obtained by calculating the length of the time interval corresponding to a specified number of intervals between zeros. It is shown that, with increasing relative bandwidth, the relative variance in the estimation of the average energy spectrum frequency of a stationary random process ceases to depend on the particular process realization when more than ten intervals between zeros are processed. The obtained results can be used in the AIS of NL to solve the tasks of detection and signal recognition, when a decision is made under conditions of unknown mathematical expectations on a limited observation

  6. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    International Nuclear Information System (INIS)

    Fujimoto, Kazufumi; Nagai, Hideo; Runggaldier, Wolfgang J.

    2013-01-01

    We consider the problem of maximization of expected terminal power utility (risk sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process where the intensity is driven by the unobserved Markovian factor process as well. This leads to a more realistic modeling for many practical situations, like in markets with liquidity restrictions; on the other hand it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power-utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).

  7. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    Energy Technology Data Exchange (ETDEWEB)

    Fujimoto, Kazufumi, E-mail: m_fuji@kvj.biglobe.ne.jp [Bank of Tokyo-Mitsubishi UFJ, Ltd., Corporate Risk Management Division (Japan); Nagai, Hideo, E-mail: nagai@sigmath.es.osaka-u.ac.jp [Osaka University, Division of Mathematical Science for Social Systems, Graduate School of Engineering Science (Japan); Runggaldier, Wolfgang J., E-mail: runggal@math.unipd.it [Universita di Padova, Dipartimento di Matematica Pura ed Applicata (Italy)

    2013-02-15

    We consider the problem of maximization of expected terminal power utility (risk sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process where the intensity is driven by the unobserved Markovian factor process as well. This leads to a more realistic modeling for many practical situations, like in markets with liquidity restrictions; on the other hand it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power-utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).

  8. Random accumulated damage evaluation under multiaxial fatigue loading conditions

    Directory of Open Access Journals (Sweden)

    V. Anes

    2015-07-01

    Multiaxial fatigue is a very important physical phenomenon to take into account in several mechanical components; its study is of utmost importance to avoid unexpected failure of equipment, vehicles or structures. Among several fatigue characterization tools, a correct definition of a damage parameter and a load cycle counting method under multiaxial loading conditions prove to be crucial to estimate multiaxial fatigue life. In this paper, the SSF equivalent stress and the virtual cycle counting method are presented and discussed with regard to their physical foundations and their capability to characterize multiaxial fatigue damage under complex loading blocks. Moreover, their applicability to evaluating random fatigue damage is presented.

  9. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    Science.gov (United States)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
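
    For the simpler discrete-time, finite-state case (not the continuous-time semi-Markov setting of the paper), both quantities have closed forms over a unifilar presentation: the entropy rate is h = -sum_i pi_i sum_j P_ij log2 P_ij and the statistical complexity is the entropy of the stationary state distribution pi. The sketch below evaluates them for an assumed two-state transition matrix.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution pi with pi P = pi (left eigenvector for eigenvalue 1)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def entropy_rate(P):
    """Entropy rate h = -sum_i pi_i sum_j P_ij log2 P_ij, in bits per step."""
    pi = stationary_distribution(P)
    logs = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(pi[:, None] * P * logs)

def statistical_complexity(P):
    """Shannon entropy of the stationary state distribution, in bits."""
    pi = stationary_distribution(P)
    return -np.sum(pi * np.log2(np.where(pi > 0, pi, 1.0)))

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])      # assumed two-state unifilar example
print("entropy rate (bits/step):     ", entropy_rate(P))
print("statistical complexity (bits):", statistical_complexity(P))
```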

  10. Provable quantum advantage in randomness processing

    OpenAIRE

    Dale, H; Jennings, D; Rudolph, T

    2015-01-01

    Quantum advantage is notoriously hard to find and even harder to prove. For example the class of functions computable with classical physics actually exactly coincides with the class computable quantum-mechanically. It is strongly believed, but not proven, that quantum computing provides exponential speed-up for a range of problems, such as factoring. Here we address a computational scenario of "randomness processing" in which quantum theory provably yields, not only resource reduction over c...

  11. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
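
    The AR-to-MA step described above follows from inverting the AR polynomial: if x_t = sum_k a_k x_{t-k} + e_t, then the MA (impulse-response) weights obey psi_0 = 1 and psi_j = sum_k a_k psi_{j-k}. The sketch below fits an AR(2) model by least squares to a synthetic series and converts it; the series, the order, and the coefficients are placeholders, not the 3C 273 light curve or the paper's FORTRAN routine.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic AR(2) series standing in for an observed light curve
a_true = (0.75, -0.2)
x = np.zeros(5000)
for t in range(2, x.size):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.standard_normal()

def fit_ar(x, p):
    """Least-squares fit of x[t] = sum_{k=1..p} a_k x[t-k] + e[t]."""
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def ar_to_ma(a, n_terms=10):
    """Impulse-response (MA) weights of the AR model: psi_0 = 1, psi_j = sum_k a_k psi_{j-k}."""
    psi = np.zeros(n_terms)
    psi[0] = 1.0
    for j in range(1, n_terms):
        for k, ak in enumerate(a, start=1):
            if j - k >= 0:
                psi[j] += ak * psi[j - k]
    return psi

a_hat = fit_ar(x, 2)
print("fitted AR coefficients:", a_hat)       # close to (0.75, -0.2)
print("first MA weights:", ar_to_ma(a_hat))
```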

  12. UNDERSTANDING SEVERE WEATHER PROCESSES THROUGH SPATIOTEMPORAL RELATIONAL RANDOM FORESTS

    Data.gov (United States)

    National Aeronautics and Space Administration — UNDERSTANDING SEVERE WEATHER PROCESSES THROUGH SPATIOTEMPORAL RELATIONAL RANDOM FORESTS AMY MCGOVERN, TIMOTHY SUPINIE, DAVID JOHN GAGNE II, NATHANIEL TROUTMAN,...

  13. Optimal redundant systems for works with random processing time

    International Nuclear Information System (INIS)

    Chen, M.; Nakagawa, T.

    2013-01-01

    This paper studies the optimal redundant policies for a manufacturing system processing jobs with random working times. The redundant units of the parallel systems and standby systems are subject to stochastic failures during the continuous production process. First, a job consisting of only one work is considered for both redundant systems and the expected cost functions are obtained. Next, each redundant system with a random number of units is assumed for a single work. The expected cost functions and the optimal expected numbers of units are derived for redundant systems. Subsequently, the production processes of N tandem works are introduced for parallel and standby systems, and the expected cost functions are also summarized. Finally, the number of works is estimated by a Poisson distribution for the parallel and standby systems. Numerical examples are given to demonstrate the optimization problems of redundant systems.
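
    The kind of expected-cost optimization described here can be mimicked by brute-force Monte Carlo; the sketch below does this for a parallel system whose units have exponential lifetimes and whose single job has a random (exponential) processing time. The cost constants, distributions and ranges are made up and do not reproduce the analytical cost functions derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up parameters: cost per redundant unit, penalty if the job is
# interrupted, exponential unit failure rate and random job duration.
c_unit, c_fail = 1.0, 50.0
fail_rate, mean_job = 0.5, 3.0
n_sim = 20000

def expected_cost(n_units):
    """Monte Carlo expected cost of an n-unit parallel system for one job."""
    lifetimes = rng.exponential(1.0 / fail_rate, size=(n_sim, n_units))
    system_life = lifetimes.max(axis=1)           # parallel: last unit to fail
    job = rng.exponential(mean_job, size=n_sim)   # random processing time
    interrupted = system_life < job
    return c_unit * n_units + c_fail * interrupted.mean()

costs = {n: expected_cost(n) for n in range(1, 11)}
best = min(costs, key=costs.get)
print({n: round(c, 2) for n, c in costs.items()})
print("cost-minimizing number of units:", best)
```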

  14. High cycle fatigue of austenitic stainless steels under random loading

    International Nuclear Information System (INIS)

    Gauthier, J.P.; Petrequin, P.

    1987-08-01

    To investigate reactor components, load-controlled random fatigue tests were performed at 300 °C and 550 °C on specimens taken in the transverse orientation from austenitic stainless steel plates. Random loadings were produced on closed-loop servo-hydraulic machines by a minicomputer which generates the random load sequence by means of a reduced Markovian matrix. The method has the advantage of taking into account the mean load of each cycle. The loadings generated are those of a stationary Gaussian process. Fatigue tests have mainly been performed in the endurance region of the fatigue curve, with scatter determined using the staircase method. Experimental results have been analysed with the aim of determining design curves for component calculations, depending on irregularity factor and temperature. Analysis in terms of the root-mean-square fatigue limit shows that random loading gives more damage than constant amplitude loading. Damage calculations following the Miner rule have been made using the probability density function for the case where the irregularity factor is nearest to 100%. The Miner rule is too conservative for our results. A method using design curves including random loading effects, with the irregularity factor as an indexing parameter, is proposed.
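
    The Miner-rule damage summation referred to above can be illustrated with a short sketch: a narrow-band Gaussian loading is represented by Rayleigh-distributed cycle amplitudes and accumulated against a Basquin-type S-N curve. The S-N constants, the RMS stress and the cycle count are invented, and the Markovian-matrix load generation used in the actual tests is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented Basquin-type S-N curve: N(S) = C * S**(-m) cycles to failure
# at stress amplitude S (MPa).
C, m = 2.0e12, 3.0
def cycles_to_failure(s):
    return C * s ** (-m)

# Narrow-band Gaussian random loading: cycle amplitudes are Rayleigh
# distributed with scale equal to the RMS stress of the process.
rms_stress = 60.0
amplitudes = rng.rayleigh(scale=rms_stress, size=100_000)

# Miner's rule: damage D = sum_i 1/N(S_i); failure is predicted at D = 1.
damage = np.sum(1.0 / cycles_to_failure(amplitudes))
print(f"Miner damage after 1e5 random cycles: D = {damage:.3f}")
print(f"predicted life ≈ {amplitudes.size / damage:.3e} cycles")
```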

  15. Apparent scale correlations in a random multifractal process

    DEFF Research Database (Denmark)

    Cleve, Jochen; Schmiegel, Jürgen; Greiner, Martin

    2008-01-01

    We discuss various properties of a homogeneous random multifractal process, which are related to the issue of scale correlations. By design, the process has no built-in scale correlations. However, when it comes to observables like breakdown coefficients, which are based on a coarse-graining of the multifractal field, scale correlations do appear. In the log-normal limit of the model process, the conditional distributions and moments of breakdown coefficients reproduce the observations made in fully developed small-scale turbulence. These findings help to understand several puzzling empirical details...

  16. Robustness of Dengue Complex Network under Targeted versus Random Attack

    Directory of Open Access Journals (Sweden)

    Hafiz Abid Mahmood Malik

    2017-01-01

    Full Text Available Dengue virus infection is one of those epidemic diseases that require much consideration in order to save humankind from its unsafe impacts. According to the World Health Organization (WHO), 3.6 billion individuals are at risk because of the dengue virus sickness. Researchers are striving to comprehend the dengue threat, and this study is a small contribution to those endeavors. To observe the robustness of the dengue network, we removed links between nodes both randomly and in a targeted manner by utilizing different centrality measures. The outcomes demonstrated that a 5% targeted attack is equivalent to a 65% random attack, which shows that the topology of this complex network corresponds to a scale-free network rather than a random network. Four centrality measures (Degree, Closeness, Betweenness, and Eigenvector) have been computed to look for focal hubs. It has been observed through the results of this study that the robustness of nodes and links depends on the topology of the network. The dengue epidemic network presented robust behaviour under random attack, and turned out to be more vulnerable when hubs of higher degree have a higher probability of failing. Moreover, a representation of this network has been projected, and the impact of hub removal has been shown on the real map of Gombak (Malaysia).
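
    The targeted-versus-random attack comparison can be reproduced in outline with networkx on a synthetic scale-free graph standing in for the real dengue network; the graph size, attack fractions and Barabási-Albert parameters below are illustrative only.

```python
import random
import networkx as nx

random.seed(0)

# Synthetic scale-free stand-in for the dengue network.
G0 = nx.barabasi_albert_graph(n=500, m=2, seed=0)

def attack(G0, fraction, targeted):
    """Remove a fraction of nodes, by degree (targeted) or at random, and
    return the surviving giant-component size relative to the original graph."""
    G = G0.copy()
    k = int(fraction * G0.number_of_nodes())
    if targeted:
        by_degree = sorted(G.degree, key=lambda kv: kv[1], reverse=True)
        victims = [node for node, _ in by_degree[:k]]
    else:
        victims = random.sample(list(G.nodes), k)
    G.remove_nodes_from(victims)
    giant = max((len(c) for c in nx.connected_components(G)), default=0)
    return giant / G0.number_of_nodes()

print("giant component after  5% targeted attack:", round(attack(G0, 0.05, True), 3))
print("giant component after 65% random attack:  ", round(attack(G0, 0.65, False), 3))
```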

  17. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  18. Network formation determined by the diffusion process of random walkers

    International Nuclear Information System (INIS)

    Ikeda, Nobutoshi

    2008-01-01

    We studied the diffusion process of random walkers in networks formed by their traces. This model considers the rise and fall of links determined by the frequency of transports of random walkers. In order to examine the relation between the formed network and the diffusion process, a situation in which multiple random walkers start from the same vertex is investigated. The difference in diffusion rate of random walkers according to the difference in dimension of the initial lattice is very important for determining the time evolution of the networks. For example, complete subgraphs can be formed on a one-dimensional lattice while a graph with a power-law vertex degree distribution is formed on a two-dimensional lattice. We derived some formulae for predicting network changes for the 1D case, such as the time evolution of the size of nearly complete subgraphs and conditions for their collapse. The networks formed on the 2D lattice are characterized by the existence of clusters of highly connected vertices and their lifetime. As the lifetime of such clusters tends to be small, the exponent of the power-law distribution changes from γ ≅ 1-2 to γ ≅ 3.

  19. Random covering of the circle: the configuration-space of the free deposition process

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-12-12

    Consider a circle of circumference 1. Throw at random n points, sequentially, on this circle and append clockwise an arc (or rod) of length s to each such point. The resulting random set (the free gas of rods) is a collection of a random number of clusters with random sizes. It models a free deposition process on a 1D substrate. For such processes, we shall consider the occurrence times (number of rods) and probabilities, as n grows, of the following configurations: those avoiding rod overlap (the hard-rod gas), those for which the largest gap is smaller than rod length s (the packing gas), those (parking configurations) for which hard rod and packing constraints are both fulfilled and covering configurations. Special attention is paid to the statistical properties of each such (rare) configuration in the asymptotic density domain when ns = ρ, for some finite density ρ of points. Using results from spacings in the random division of the circle, explicit large deviation rate functions can be computed in each case from state equations. Lastly, a process consisting in selecting at random one of these specific equilibrium configurations (called the observable) can be modelled. When particularized to the parking model, this system produces parking configurations differently from Rényi's random sequential adsorption model.
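
    A Monte Carlo sketch of two of the configuration events named above that translate unambiguously into conditions on circular spacings (hard-rod, i.e. no arc overlaps, and covering, i.e. the arcs cover the whole circle); the point counts and densities are illustrative, and the large-deviation rate functions of the paper are not computed.

```python
import numpy as np

rng = np.random.default_rng(4)

def configuration_probs(n, s, trials=20000):
    """Monte Carlo estimates of P(hard-rod) and P(covering) for n random
    points on the unit circle, each carrying a clockwise arc of length s."""
    hard = cover = 0
    for _ in range(trials):
        pts = np.sort(rng.uniform(0.0, 1.0, n))
        gaps = np.diff(np.append(pts, pts[0] + 1.0))   # circular spacings
        if np.all(gaps >= s):      # no two arcs overlap
            hard += 1
        if np.all(gaps <= s):      # the arcs cover the whole circle
            cover += 1
    return hard / trials, cover / trials

# Asymptotic density regime n*s = rho, here with n = 20 points.
for rho in (0.2, 3.0):    # dilute: hard-rod visible; dense: covering visible
    p_hard, p_cover = configuration_probs(20, rho / 20)
    print(f"rho = {rho}: P(hard-rod) ≈ {p_hard:.4f}, P(covering) ≈ {p_cover:.4f}")
```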

  20. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical...... dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...

  1. On a Stochastic Failure Model under Random Shocks

    Science.gov (United States)

    Cha, Ji Hwan

    2013-02-01

    In most conventional settings, the events caused by an external shock are initiated at the moments of its occurrence. In this paper, we study a new class of shock models, where each shock from a nonhomogeneous Poisson process can trigger a failure of a system not immediately, as in classical extreme shock models, but with a delay of some random time. We derive the corresponding survival and failure rate functions. Furthermore, we study the limiting behaviour of the failure rate function, where applicable.
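
    The delayed-shock mechanism can be simulated directly; the sketch below draws shocks from a nonhomogeneous Poisson process by thinning, adds an independent random delay to each, and estimates the survival function empirically. The linear intensity and the exponential delay law are assumptions made for illustration, not the paper's model choices.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed ingredients: a nonhomogeneous Poisson shock process with linear
# intensity lambda(t) = a + b*t on [0, T], and an independent exponential
# delay (mean 1/mu) before each shock takes effect.
a, b, mu, T = 0.2, 0.05, 0.5, 30.0

def failure_time():
    """One realization of the system failure time (inf if no shock occurs)."""
    lam_max = a + b * T
    cand = rng.uniform(0.0, T, rng.poisson(lam_max * T))
    shocks = cand[rng.uniform(size=cand.size) < (a + b * cand) / lam_max]  # thinning
    if shocks.size == 0:
        return np.inf
    return np.min(shocks + rng.exponential(1.0 / mu, size=shocks.size))

samples = np.array([failure_time() for _ in range(20000)])
for t in np.linspace(0.0, T, 7):
    print(f"P(system survives beyond t = {t:4.1f}) ≈ {(samples > t).mean():.3f}")
```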

  2. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  3. Design of Energy Aware Adder Circuits Considering Random Intra-Die Process Variations

    Directory of Open Access Journals (Sweden)

    Marco Lanuzza

    2011-04-01

    Full Text Available Energy consumption is one of the main barriers to current high-performance designs. Moreover, the increased variability experienced in advanced process technologies implies further timing yield concerns and therefore intensifies this obstacle. Thus, proper techniques to achieve robust designs are a critical requirement for integrated circuit success. In this paper, the influence of intra-die random process variations is analyzed considering the particular case of the design of energy aware adder circuits. Five well known adder circuits were designed exploiting an industrial 45 nm static complementary metal-oxide semiconductor (CMOS) standard cell library. The designed adders were comparatively evaluated under different energy constraints. As a main result, the performed analysis demonstrates that, for a given energy budget, simpler circuits (which are conventionally identified as low-energy, slow architectures) operating at higher power supply voltages can achieve a timing yield significantly better than more complex faster adders when used in low-power design with supply voltages lower than nominal.
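
    The timing-yield comparison can be caricatured with a toy Monte Carlo in which each adder is reduced to a chain of gate delays with independent Gaussian intra-die variation; the stage counts, nominal delays, variability and clock constraint below are invented and are not taken from the 45 nm library used in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy critical paths: (number of gate stages, nominal delay per stage in ps).
# The simpler adder has more, slower stages; the complex one fewer, faster ones.
architectures = {"simple (ripple-carry)": (32, 30.0),
                 "complex (lookahead)": (12, 55.0)}
sigma_rel = 0.08        # relative sigma of random intra-die variation
t_clock = 980.0         # timing constraint in ps
n_chips = 100_000

for name, (stages, d_nom) in architectures.items():
    # Independent Gaussian delay variation on every stage of every die.
    delays = rng.normal(d_nom, sigma_rel * d_nom, size=(n_chips, stages))
    critical_path = delays.sum(axis=1)
    timing_yield = (critical_path <= t_clock).mean()
    print(f"{name:22s} mean delay {critical_path.mean():7.1f} ps, "
          f"timing yield {timing_yield:.4f}")
```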

  4. Synthesis for robust synchronization of chaotic systems under output feedback control with multiple random delays

    International Nuclear Information System (INIS)

    Wen Guilin; Wang Qingguo; Lin Chong; Han Xu; Li Guangyao

    2006-01-01

    Synchronization under output feedback control with multiple random time delays is studied, using a paradigm of nonlinear physics, Chua's circuit. Compared with other synchronization control methods, output feedback control with multiple random delays is superior for realistic synchronization applications such as secure communications. A sufficient condition for global stability of delay-dependent synchronization is established based on the LMI technique. Numerical simulations fully support the analytical approach, in spite of the random delays.

  5. Random walk on a population of random walkers

    International Nuclear Information System (INIS)

    Agliari, E; Burioni, R; Cassi, D; Neri, F M

    2008-01-01

    We consider a population of N labelled random walkers moving on a substrate, and an excitation jumping among the walkers upon contact. The label X(t) of the walker carrying the excitation at time t can be viewed as a stochastic process, where the transition probabilities are a stochastic process themselves. Upon mapping onto two simpler processes, the quantities characterizing X(t) can be calculated in the limit of long times and low walkers density. The results are compared with numerical simulations. Several different topologies for the substrate underlying diffusion are considered

  6. High-Performance Pseudo-Random Number Generation on Graphics Processing Units

    OpenAIRE

    Nandapalan, Nimalan; Brent, Richard P.; Murray, Lawrence M.; Rendell, Alistair

    2011-01-01

    This work considers the deployment of pseudo-random number generators (PRNGs) on graphics processing units (GPUs), developing an approach based on the xorgens generator to rapidly produce pseudo-random numbers of high statistical quality. The chosen algorithm has configurable state size and period, making it ideal for tuning to the GPU architecture. We present a comparison of both speed and statistical quality with other common parallel, GPU-based PRNGs, demonstrating favourable performance o...

  7. Nonlinear Estimation of Discrete-Time Signals Under Random Observation Delay

    International Nuclear Information System (INIS)

    Caballero-Aguila, R.; Jimenez-Lopez, J. D.; Hermoso-Carazo, A.; Linares-Perez, J.; Nakamori, S.

    2008-01-01

    This paper presents an approximation to the nonlinear least-squares estimation problem of discrete-time stochastic signals using nonlinear observations with additive white noise which can be randomly delayed by one sampling time. The observation delay is modelled by a sequence of independent Bernoulli random variables whose values, zero or one, indicate that the real observation arrives on time or it is delayed and, hence, the available measurement to estimate the signal is not up-to-date. Assuming that the state-space model generating the signal is unknown and only the covariance functions of the processes involved in the observation equation are ready for use, a filtering algorithm based on linear approximations of the real observations is proposed.
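
    The observation model described above (each measurement independently delayed by one sampling time with some probability) is easy to write down explicitly; the sketch below generates such randomly delayed nonlinear observations, with an invented signal, sensor nonlinearity and delay probability, and without implementing the covariance-based filter itself.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented signal, nonlinear sensor and noise level; p is the probability
# that a measurement is delayed by one sampling time.
n, p, noise_std = 200, 0.3, 0.1
x = np.sin(0.1 * np.arange(n)) + 0.05 * rng.normal(size=n)   # signal
z = np.tanh(x) + noise_std * rng.normal(size=n)              # on-time observations

# Bernoulli(p) indicators: gamma[k] = 1 means the measurement available at
# time k is the previous one, i.e. it is not up-to-date.
gamma = rng.uniform(size=n) < p
y = z.copy()
y[1:] = np.where(gamma[1:], z[:-1], z[1:])
print(f"fraction of delayed measurements: {gamma[1:].mean():.2f}")
```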

  8. The Fatigue Behavior of Steel Structures under Random Loading

    DEFF Research Database (Denmark)

    Agerskov, Henning

    2008-01-01

    Fatigue damage accumulation in steel structures under random loading has been studied in a number of investigations at the Technical University of Denmark. The fatigue life of welded joints has been determined both experimentally and from a fracture mechanics analysis. In the experimental part...... and variable amplitude fatigue test results. Both the fracture mechanics analysis and the fatigue test results indicate that Miner’s rule, which is normally used in the design against fatigue in steel structures, may give results, which are unconservative, and that the validity of the results obtained from...

  9. Modeling Random Telegraph Noise Under Switched Bias Conditions Using Cyclostationary RTS Noise

    NARCIS (Netherlands)

    van der Wel, A.P.; Klumperink, Eric A.M.; Vandamme, L.K.J.; Nauta, Bram

    In this paper, we present measurements and simulation of random telegraph signal (RTS) noise in n-channel MOSFETs under periodic large signal gate-source excitation (switched bias conditions). This is particularly relevant to analog CMOS circuit design where large signal swings occur and where LF

  10. An empirical test of pseudo random number generators by means of an exponential decaying process

    International Nuclear Information System (INIS)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A.; Mora F, L.E.

    2007-01-01

    Empirical tests for pseudo random number generators based on the use of processes or physical models have been successfully used and are considered as complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)
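
    A sketch of the kind of empirical test described above: the decay of a large population of nuclei is simulated with a candidate generator and the histogram of decay times is compared to the expected exponential (geometric) law via a chi-square statistic; the generators, population size and decay probability are illustrative choices.

```python
import numpy as np

def decay_histogram(rng, n0=100_000, p=0.05, steps=40):
    """Simulate n0 independent nuclei, each decaying with probability p per
    step, and histogram the step at which each nucleus first decays."""
    events = rng.random((n0, steps)) < p
    decayed = events.any(axis=1)
    first_step = events[decayed].argmax(axis=1)
    return np.bincount(first_step, minlength=steps)

def chi_square(observed, p=0.05):
    """Chi-square against the geometric law, conditioned on decay in-window."""
    steps = len(observed)
    probs = p * (1.0 - p) ** np.arange(steps)
    probs /= probs.sum()
    expected = observed.sum() * probs
    return float(np.sum((observed - expected) ** 2 / expected))

# Two candidate generators put through the same test (illustrative choices).
for name, bitgen in [("PCG64", np.random.PCG64(123)),
                     ("MT19937", np.random.MT19937(123))]:
    hist = decay_histogram(np.random.Generator(bitgen))
    print(f"{name:8s} chi-square ({len(hist) - 1} dof): {chi_square(hist):.1f}")
```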

  11. Generation and monitoring of a discrete stable random process

    CERN Document Server

    Hopcraft, K I; Matthews, J O

    2002-01-01

    A discrete stochastic process with stationary power law distribution is obtained from a death-multiple immigration population model. Emigrations from the population form a random series of events which are monitored by a counting process with finite-dynamic range and response time. It is shown that the power law behaviour of the population is manifested in the intermittent behaviour of the series of events. (letter to the editor)

  12. Scaling behaviour of randomly alternating surface growth processes

    CERN Document Server

    Raychaudhuri, S

    2002-01-01

    The scaling properties of the roughness of surfaces grown by two different processes randomly alternating in time are addressed. The duration of each application of the two primary processes is assumed to be independently drawn from given distribution functions. We analytically address processes in which the two primary processes are linear and extend the conclusions to nonlinear processes as well. The growth scaling exponent of the average roughness with the number of applications is found to be determined by the long time tail of the distribution functions. For processes in which both mean application times are finite, the scaling behaviour follows that of the corresponding cyclical process in which the uniform application time of each primary process is given by its mean. If the distribution functions decay with a small enough power law for the mean application times to diverge, the growth exponent is found to depend continuously on this power-law exponent. In contrast, the roughness exponent does not depe...

  13. Nuclear structure and weak rates of heavy waiting point nuclei under rp-process conditions

    Science.gov (United States)

    Nabi, Jameel-Un; Böyükata, Mahmut

    2017-01-01

    The structure and the weak interaction mediated rates of the heavy waiting point (WP) nuclei 80Zr, 84Mo, 88Ru, 92Pd and 96Cd along the N = Z line were studied within the interacting boson model-1 (IBM-1) and the proton-neutron quasi-particle random phase approximation (pn-QRPA). The energy levels of the N = Z WP nuclei were calculated by fitting the essential parameters of the IBM-1 Hamiltonian, and their geometric shapes were predicted by plotting potential energy surfaces (PESs). Half-lives, continuum electron capture rates, positron decay rates, electron capture cross sections of WP nuclei, energy rates of β-delayed protons and their emission probabilities were later calculated using the pn-QRPA. The calculated Gamow-Teller strength distributions were compared with previous calculations. We present positron decay and continuum electron capture rates on these WP nuclei under rp-process conditions using the same model. For the rp-process conditions, the calculated total weak rates are twice the Skyrme HF+BCS+QRPA rates for 80Zr. For the remaining nuclei the two calculations compare well. The electron capture rates are significant and compete well with the corresponding positron decay rates under rp-process conditions. The findings of the present study support the conclusion that electron capture rates form an integral part of the weak rates under rp-process conditions and have an important role in nuclear model calculations.

  14. Fatigue in Steel Structures under Random Loading

    DEFF Research Database (Denmark)

    Agerskov, Henning

    1999-01-01

    types of welded plate test specimens and full-scale offshore tubular joints. The materials that have been used are either conventional structural steel with a yield stress of ~ 360-410 MPa or high-strength steel with a yield stress of ~ 810-1010 MPa. The fatigue tests and the fracture mechanics analyses......Fatigue damage accumulation in steel structures under random loading is studied. The fatigue life of welded joints has been determined both experimentally and from a fracture mechanics analysis. In the experimental part of the investigation, fatigue test series have been carried through on various...... have been carried out using load histories, which are realistic in relation to the types of structures studied, i.e. primarily bridges, offshore structures and chimneys. In general, the test series carried through show a significant difference between constant amplitude and variable amplitude fatigue...

  15. Order out of Randomness: Self-Organization Processes in Astrophysics

    Science.gov (United States)

    Aschwanden, Markus J.; Scholkmann, Felix; Béthune, William; Schmutz, Werner; Abramenko, Valentina; Cheung, Mark C. M.; Müller, Daniel; Benz, Arnold; Chernov, Guennadi; Kritsuk, Alexei G.; Scargle, Jeffrey D.; Melatos, Andrew; Wagoner, Robert V.; Trimble, Virginia; Green, William H.

    2018-03-01

    Self-organization is a property of dissipative nonlinear processes that are governed by a global driving force and a local positive feedback mechanism, which creates regular geometric and/or temporal patterns, and decreases the entropy locally, in contrast to random processes. Here we investigate for the first time a comprehensive set of 17 self-organization processes that operate in planetary physics, solar physics, stellar physics, galactic physics, and cosmology. Self-organizing systems create spontaneous "order out of randomness" during the evolution from an initially disordered system to an ordered quasi-stationary system, mostly by quasi-periodic limit-cycle dynamics, but also by harmonic (mechanical or gyromagnetic) resonances. The global driving force can be due to gravity, electromagnetic forces, mechanical forces (e.g., rotation or differential rotation), thermal pressure, or acceleration of nonthermal particles, while the positive feedback mechanism is often an instability, such as the magneto-rotational (Balbus-Hawley) instability, the convective (Rayleigh-Bénard) instability, turbulence, vortex attraction, magnetic reconnection, plasma condensation, or a loss-cone instability. Physical models of astrophysical self-organization processes require hydrodynamic, magneto-hydrodynamic (MHD), plasma, or N-body simulations. Analytical formulations of self-organizing systems generally involve coupled differential equations with limit-cycle solutions of the Lotka-Volterra or Hopf-bifurcation type.
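
    As a minimal numerical illustration of the coupled limit-cycle-type equations mentioned at the end of this abstract, the sketch below integrates a classic Lotka-Volterra system with SciPy; the parameter values and initial conditions are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic Lotka-Volterra predator-prey system: the prototype of coupled
# nonlinear equations with periodic, limit-cycle-like solutions.
alpha, beta, delta, gamma = 1.0, 0.4, 0.1, 0.4

def lotka_volterra(t, z):
    x, y = z
    return [alpha * x - beta * x * y,      # driven population
            delta * x * y - gamma * y]     # responding population

sol = solve_ivp(lotka_volterra, (0.0, 50.0), [10.0, 5.0], dense_output=True)
for t in np.linspace(0.0, 50.0, 6):
    x, y = sol.sol(t)
    print(f"t = {t:4.1f}: x = {x:6.2f}, y = {y:6.2f}")
```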

  16. Random migration processes between two stochastic epidemic centers.

    Science.gov (United States)

    Sazonov, Igor; Kelbert, Mark; Gravenor, Michael B

    2016-04-01

    We consider the epidemic dynamics in stochastic interacting population centers coupled by random migration. Both the epidemic and the migration processes are modeled by Markov chains. We derive explicit formulae for the probability distribution of the migration process, and explore the dependence of outbreak patterns on initial parameters, population sizes and coupling parameters, using analytical and numerical methods. We show the importance of considering the movement of resident and visitor individuals separately. The mean field approximation for a general migration process is derived and an approximate method that allows the computation of statistical moments for networks with highly populated centers is proposed and tested numerically. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Designing neural networks that process mean values of random variables

    International Nuclear Information System (INIS)

    Barber, Michael J.; Clark, John W.

    2014-01-01

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence

  19. Timing of the Crab pulsar III. The slowing down and the nature of the random process

    International Nuclear Information System (INIS)

    Groth, E.J.

    1975-01-01

    The Crab pulsar arrival times are analyzed. The data are found to be consistent with a smooth slowing down with a braking index of 2.515 ± 0.005. Superposed on the smooth slowdown is a random process which has the same second moments as a random walk in the frequency. The strength of the random process is R⟨a^2⟩ ≥ 0.53 (+0.24, -0.12) × 10^-22 Hz^2 s^-1, where R is the mean rate of steps and ⟨a^2⟩ is the second moment of the step amplitude distribution. Neither the braking index nor the strength of the random process shows evidence of statistically significant time variations, although small fluctuations in the braking index and rather large fluctuations in the noise strength cannot be ruled out. There is a possibility that the random process contains a small component with the same second moments as a random walk in the phase. If so, a time scale of 3.5 days is indicated.

  20. Continuous state branching processes in random environment: The Brownian case

    OpenAIRE

    Palau, Sandra; Pardo, Juan Carlos

    2015-01-01

    We consider continuous state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. The long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, we find five...

  1. DYNAMIC STRAIN MAPPING AND REAL-TIME DAMAGE STATE ESTIMATION UNDER BIAXIAL RANDOM FATIGUE LOADING

    Data.gov (United States)

    National Aeronautics and Space Administration — DYNAMIC STRAIN MAPPING AND REAL-TIME DAMAGE STATE ESTIMATION UNDER BIAXIAL RANDOM FATIGUE LOADING SUBHASISH MOHANTY*, ADITI CHATTOPADHYAY, JOHN N. RAJADAS, AND CLYDE...

  2. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and in the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contamination transport in porous media; a stochastic approach for modeling the transit time of Municipal Solid Waste contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four; thus, four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which helps in preparing a pro-ecological strategy, and which can lead to reducing t...

  3. Research on the reliability of friction system under combined additive and multiplicative random excitations

    Science.gov (United States)

    Sun, Jiaojiao; Xu, Wei; Lin, Zifei

    2018-01-01

    In this paper, the reliability of a non-linearly damped friction oscillator under combined additive and multiplicative Gaussian white noise excitations is investigated. The stochastic averaging method, which is usually applied to the study of smooth systems, has been extended to the study of the reliability of the non-smooth friction system. The results indicate that the reliability of the friction system can be improved by Coulomb friction and reduced by random excitations. In particular, the effect of the external random excitation on the reliability is larger than that of the parametric random excitation. The validity of the analytical results is verified by the numerical results.

  4. A new crack growth model for life prediction under random loading

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Chen, Zhi Wei

    1999-01-01

    The load interaction effect in variable amplitude fatigue tests is a very important issue for correctly predicting fatigue life. Some prediction methods for retardation are reviewed and the problems discussed. The so-called 'under-load' effect is also of importance for a prediction model to work properly under a random load spectrum. A new model that is simple in form but combines overload plastic zone and residual stress considerations together with Elber's closure concept is proposed to take full account of the load-interaction effects, including both overload and under-load effects. Applying this new model to complex load sequences is explored here. Simulations of tests show the improvement of the new model over other models. The best prediction (most closely resembling the test curves) is given by the newly proposed Chen-Lee model.

  5. Auditory detection of an increment in the rate of a random process

    International Nuclear Information System (INIS)

    Brown, W.S.; Emmerich, D.S.

    1994-01-01

    Recent experiments have presented listeners with complex tonal stimuli consisting of components with values (i.e., intensities or frequencies) randomly sampled from probability distributions [e.g., R. A. Lutfi, J. Acoust. Soc. Am. 86, 934-944 (1989)]. In the present experiment, brief tones were presented at intervals corresponding to the intensity of a random process. Specifically, the intervals between tones were randomly selected from exponential probability functions. Listeners were asked to decide whether tones presented during a defined observation interval represented a "noise" process alone or the "noise" with a "signal" process added to it. The number of tones occurring in any observation interval is a Poisson variable; receiver operating characteristics (ROCs) arising from Poisson processes have been considered by Egan [Signal Detection Theory and ROC Analysis (Academic, New York, 1975)]. Several sets of noise and signal intensities and observation interval durations were selected which were expected to yield equivalent performance. Rating ROCs were generated based on subjects' responses in a single-interval, yes-no task. The performance levels achieved by listeners and the effects of intensity and duration are compared to those predicted for an ideal observer.
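
    The ideal-observer prediction for this Poisson counting decision can be sketched directly: respond "signal" whenever the tone count in the observation interval reaches a criterion, and read off false-alarm and hit probabilities from the two Poisson tails. The noise and signal-plus-noise means below are made up, not the values used in the experiment.

```python
from math import exp, factorial

def poisson_sf(k, mean):
    """P(N >= k) for a Poisson(mean) count."""
    return 1.0 - sum(exp(-mean) * mean ** i / factorial(i) for i in range(k))

# Made-up mean tone counts per observation interval.
mean_noise = 8.0            # "noise" process alone
mean_signal = 12.0          # "noise" + "signal" processes

# Ideal observer: respond "signal" if the tone count reaches criterion k.
print(" k   P(false alarm)   P(hit)")
for k in range(5, 18):
    print(f"{k:2d}       {poisson_sf(k, mean_noise):.3f}         "
          f"{poisson_sf(k, mean_signal):.3f}")
```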

  6. Tempered stable laws as random walk limits

    OpenAIRE

    Chakrabarty, Arijit; Meerschaert, Mark M.

    2010-01-01

    Stable laws can be tempered by modifying the Lévy measure to cool the probability of large jumps. Tempered stable laws retain their signature power law behavior at infinity, and infinite divisibility. This paper develops random walk models that converge to a tempered stable law under a triangular array scheme. Since tempered stable laws and processes are useful in statistical physics, these random walk models can provide a basic physical model for the underlying physical phenomena.

  7. Traffic and random processes an introduction

    CERN Document Server

    Mauro, Raffaele

    2015-01-01

    This book deals in a basic and systematic manner with the fundamentals of random function theory and, at the same time, looks at some aspects related to arrival, vehicle headway and operational speed processes. The work serves as a useful practical and educational tool and aims at providing stimulus and motivation to investigate issues of such a strong applicative interest. It has a clearly discursive and concise structure, in which numerical examples are given to clarify the applications of the suggested theoretical model. Some statistical characterizations are fully developed in order to illustrate the peculiarities of specific modeling approaches; finally, there is a useful bibliography for in-depth thematic analysis.

  8. Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.

    Science.gov (United States)

    Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai

    2017-02-20

    A chaotic external-cavity semiconductor laser (ECL) is a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using a physical broadband white chaos generated by optical heterodyning of two ECLs as the entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which means a spectral efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
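
    The quantize-and-keep-LSBs step is easy to sketch: below, a synthetic symmetric-amplitude waveform stands in for the real heterodyne chaos, an 8-bit quantizer is applied and 4 LSBs are retained per sample as in the paper, while the full-scale range and the Gaussian stand-in signal are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# Stand-in for the white-chaos waveform: zero-mean samples with a symmetric
# amplitude distribution (the real entropy source is optical heterodyne chaos).
samples = rng.normal(0.0, 1.0, size=100_000)

# 8-bit quantization over an assumed full-scale range, then keep the 4 least
# significant bits of every code word, with no further post-processing.
full_scale = 4.0
codes = np.floor((samples + full_scale / 2) / full_scale * 256).astype(int)
codes = np.clip(codes, 0, 255)
lsb4 = codes & 0b1111                                   # 4 LSBs per sample
bits = ((lsb4[:, None] >> np.arange(3, -1, -1)) & 1).ravel()

print(f"{bits.size} bits from {samples.size} samples, "
      f"fraction of ones = {bits.mean():.4f}")
```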

  9. Post-processing Free Quantum Random Number Generator Based on Avalanche Photodiode Array

    International Nuclear Information System (INIS)

    Li Yang; Liao Sheng-Kai; Liang Fu-Tian; Shen Qi; Liang Hao; Peng Cheng-Zhi

    2016-01-01

    Quantum random number generators adopting single photon detection have been restricted due to the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free, ready to use, and their randomness is verified by using the National Institute of Standards and Technology (NIST) statistical test suite. The random bit generation efficiency is as high as 32.8% and the potential generation rate adopting the 32 × 32 APD array is up to tens of Gbits/s. (paper)

  10. Component processes underlying future thinking.

    Science.gov (United States)

    D'Argembeau, Arnaud; Ortoleva, Claudia; Jumentier, Sabrina; Van der Linden, Martial

    2010-09-01

    This study sought to investigate the component processes underlying the ability to imagine future events, using an individual-differences approach. Participants completed several tasks assessing different aspects of future thinking (i.e., fluency, specificity, amount of episodic details, phenomenology) and were also assessed with tasks and questionnaires measuring various component processes that have been hypothesized to support future thinking (i.e., executive processes, visual-spatial processing, relational memory processing, self-consciousness, and time perspective). The main results showed that executive processes were correlated with various measures of future thinking, whereas visual-spatial processing abilities and time perspective were specifically related to the number of sensory descriptions reported when specific future events were imagined. Furthermore, individual differences in self-consciousness predicted the subjective feeling of experiencing the imagined future events. These results suggest that future thinking involves a collection of processes that are related to different facets of future-event representation.

  11. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  12. Making working memory work: the effects of extended practice on focus capacity and the processes of updating, forward access, and random access.

    Science.gov (United States)

    Price, John M; Colflesh, Gregory J H; Cerella, John; Verhaeghen, Paul

    2014-05-01

    We investigated the effects of 10h of practice on variations of the N-Back task to investigate the processes underlying possible expansion of the focus of attention within working memory. Using subtractive logic, we showed that random access (i.e., Sternberg-like search) yielded a modest effect (a 50% increase in speed) whereas the processes of forward access (i.e., retrieval in order, as in a standard N-Back task) and updating (i.e., changing the contents of working memory) were executed about 5 times faster after extended practice. We additionally found that extended practice increased working memory capacity as measured by the size of the focus of attention for the forward-access task, but not for variations where probing was in random order. This suggests that working memory capacity may depend on the type of search process engaged, and that certain working-memory-related cognitive processes are more amenable to practice than others. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    The application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and ¹⁵N-edited NOESY-HSQC spectra of a ¹³C,¹⁵N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
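
    A minimal sketch of the processing idea, assuming a direct (non-uniform) 2D Fourier transform evaluated for pairs of frequencies from randomly sampled evolution-time points; the two-peak synthetic signal, sample count and frequency grid are invented and no real NMR data are involved.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic 2D signal with two "cross peaks" (frequencies in Hz), recorded at
# randomly chosen evolution-time pairs instead of a regular grid.
peaks_hz = [(50.0, 120.0), (90.0, 30.0)]
n_samples, t_max = 400, 0.05
t1 = rng.uniform(0.0, t_max, n_samples)
t2 = rng.uniform(0.0, t_max, n_samples)
signal = sum(np.exp(2j * np.pi * (f1 * t1 + f2 * t2)) for f1, f2 in peaks_hz)

# Direct 2D Fourier transform evaluated for pairs of frequencies.
freqs = np.arange(0.0, 151.0)          # 1 Hz grid in both dimensions
spectrum = np.array([[abs(np.sum(signal * np.exp(-2j * np.pi * (f1 * t1 + f2 * t2))))
                      for f2 in freqs] for f1 in freqs])

i, j = np.unravel_index(np.argmax(spectrum), spectrum.shape)
print(f"strongest peak found near ({freqs[i]:.0f} Hz, {freqs[j]:.0f} Hz)")
```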

  14. Non-destructive Testing by Infrared Thermography Under Random Excitation and ARMA Analysis

    Science.gov (United States)

    Bodnar, J. L.; Nicolas, J. L.; Candoré, J. C.; Detalle, V.

    2012-11-01

    Photothermal thermography is a non-destructive testing (NDT) method which has many applications in the field of control and characterization of thin materials. This technique is usually implemented under CW or flash excitation. Such excitations are not well adapted to the control of fragile materials or to multi-frequency analysis. To allow these analyses, the use of a new control mode is proposed in this article: infrared thermography under random excitation and autoregressive moving average (ARMA) analysis. First, the principle of this NDT method is presented. Then, the method is shown to permit detection, with low energy constraints, of detachments situated in mural paintings.

  15. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaaard

    with a complementary spatial point process Y  to obtain a Poisson process X∪Y  with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt,Yt) which converges towards the distribution of (X,Y). We study the joint distribution of X and Y, and their marginal and conditional distributions....... In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson...... process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking...

  16. Efficient tests for equivalence of hidden Markov processes and quantum random walks

    NARCIS (Netherlands)

    U. Faigle; A. Schönhuth (Alexander)

    2011-01-01

    While two hidden Markov process (HMP) resp. quantum random walk (QRW) parametrizations can differ from one another, the stochastic processes arising from them can be equivalent. Here a polynomial-time algorithm is presented which can determine equivalence of two HMP parametrizations.

  17. Increased certification of semi-device independent random numbers using many inputs and more post-processing

    International Nuclear Information System (INIS)

    Mironowicz, Piotr; Tavakoli, Armin; Hameedi, Alley; Marques, Breno; Bourennane, Mohamed; Pawłowski, Marcin

    2016-01-01

    Quantum communication with systems of dimension larger than two provides advantages in information processing tasks. Examples include higher rates of key distribution and random number generation. The main disadvantage of using such multi-dimensional quantum systems is the increased complexity of the experimental setup. Here, we analyze a not-so-obvious problem: the relation between randomness certification and computational requirements of the post-processing of experimental data. In particular, we consider semi-device independent randomness certification from an experiment using a four dimensional quantum system to violate the classical bound of a random access code. Using state-of-the-art techniques, a smaller quantum violation requires more computational power to demonstrate randomness, which at some point becomes impossible with today’s computers although the randomness is (probably) still there. We show that by dedicating more input settings of the experiment to randomness certification, then by more computational postprocessing of the experimental data which corresponds to a quantum violation, one may increase the amount of certified randomness. Furthermore, we introduce a method that significantly lowers the computational complexity of randomness certification. Our results show how more randomness can be generated without altering the hardware and indicate a path for future semi-device independent protocols to follow. (paper)

  18. Analysis of axial compressive loaded beam under random support excitations

    Science.gov (United States)

    Xiao, Wensheng; Wang, Fengde; Liu, Jian

    2017-12-01

    An analytical procedure to investigate the response spectrum of a uniform Bernoulli-Euler beam with axial compressive load subjected to random support excitations is implemented based on the Mindlin-Goodman method and the mode superposition method in the frequency domain. The random response spectrum of the simply supported beam subjected to white noise excitation and to Pierson-Moskowitz spectrum excitation is investigated, and the characteristics of the response spectrum are further explored. Moreover, the effect of axial compressive load is studied and a method to determine the axial load is proposed. The research results show that the response spectrum mainly consists of the beam's additional displacement response spectrum when the excitation is white noise; however, the quasi-static displacement response spectrum is the main component when the excitation is the Pierson-Moskowitz spectrum. Under white noise excitation, the amplitude of the power spectral density function decreased as the axial compressive load increased, while the frequency band of the vibration response spectrum increased with the increase of axial compressive load.

  19. Erotic stimulus processing under amisulpride and reboxetine: a placebo-controlled fMRI study in healthy subjects.

    Science.gov (United States)

    Graf, Heiko; Wiegers, Maike; Metzger, Coraline D; Walter, Martin; Grön, Georg; Abler, Birgit

    2014-10-31

    Impaired sexual function is increasingly recognized as a side effect of psychopharmacological treatment. However, underlying mechanisms of action of the different drugs on sexual processing are still to be explored. Using functional magnetic resonance imaging, we previously investigated effects of serotonergic (paroxetine) and dopaminergic (bupropion) antidepressants on sexual functioning (Abler et al., 2011). Here, we studied the impact of noradrenergic and antidopaminergic medication on neural correlates of visual sexual stimulation in a new sample of subjects. Nineteen healthy heterosexual males (mean age 24 years, SD 3.1) under subchronic intake (7 days) of the noradrenergic agent reboxetine (4 mg/d), the antidopaminergic agent amisulpride (200mg/d), and placebo were included and studied with functional magnetic resonance imaging within a randomized, double-blind, placebo-controlled, within-subjects design during an established erotic video-clip task. Subjective sexual functioning was assessed using the Massachusetts General Hospital-Sexual Functioning Questionnaire. Relative to placebo, subjective sexual functioning was attenuated under reboxetine along with diminished neural activations within the caudate nucleus. Altered neural activations correlated with decreased sexual interest. Under amisulpride, neural activations and subjective sexual functioning remained unchanged. In line with previous interpretations of the role of the caudate nucleus in the context of primary reward processing, attenuated caudate activation may reflect detrimental effects on motivational aspects of erotic stimulus processing under noradrenergic agents. © The Author 2015. Published by Oxford University Press on behalf of CINP.

  20. Fatigue in Steel Highway Bridges under Random Loading

    DEFF Research Database (Denmark)

    Agerskov, Henning; Nielsen, J.A.; Vejrum, Tina

    1997-01-01

    on welded plate test specimens have been carried through. The materials that have been used are either conventional structural steel with a yield stress of ~ 400-410 MPa or high-strength steel with a yield stress of ~ 810-840 MPa.The fatigue tests have been carried out using load histories, which correspond......In the present investigation, fatigue damage accumulation in steel highway bridges under random loading is studied. The fatigue life of welded joints has been determined both experimentally and from a fracture mechanics analysis.In the experimental part of the investigation, fatigue test series...... to one week's traffic loading, determined by means of strain gage measurements on the orthotropic steel deck structure of the Farø Bridges in Denmark.The test series which have been carried through show a significant difference between constant amplitude and variable amplitude fatigue test results. Both...

  1. Fatigue in Steel Highway Bridges under Random Loading

    DEFF Research Database (Denmark)

    Agerskov, Henning; Nielsen, Jette Andkjær

    1999-01-01

    have been carried through. The materials that have been used are either conventional structural steel with a yield stress of f_y ~ 400-410 MPa or high-strength steel with a yield stress of f_y ~ 810-840 MPa. The fatigue tests have been carried out using load histories, which...... correspond to one week's traffic loading, determined by means of strain gauge measurements on the orthotropic steel deck structure of the Farø Bridges in Denmark. The test series carried through show a significant difference between constant amplitude and variable amplitude fatigue test results. Both...

  2. Matrix product approach for the asymmetric random average process

    International Nuclear Information System (INIS)

    Zielen, F; Schadschneider, A

    2003-01-01

    We consider the asymmetric random average process which is a one-dimensional stochastic lattice model with nearest-neighbour interaction but continuous and unbounded state variables. First, the explicit functional representations, so-called beta densities, of all local interactions leading to steady states of product measure form are rigorously derived. This also completes an outstanding proof given in a previous publication. Then we present an alternative solution for the processes with factorized stationary states by using a matrix product ansatz. Due to continuous state variables we obtain a matrix algebra in the form of a functional equation which can be solved exactly

  3. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads...... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed....

  4. A method of signal transmission path analysis for multivariate random processes

    International Nuclear Information System (INIS)

    Oguma, Ritsuo

    1984-04-01

    A method for noise analysis called "STP (signal transmission path) analysis" is presented as a tool to identify noise sources and their propagation paths in multivariate random processes. The basic idea of the analysis is to identify, via time series analysis, an effective network for signal power transmission among the variables in the system and to make use of this information in the noise analysis. In the present paper, we accomplish this through two steps of signal processing: first, we estimate, using noise power contribution analysis, the variables which contribute strongly to the power spectrum of interest, and then we evaluate the STPs for each pair of variables to identify those which play a significant role in transmitting the generated noise to the variable under evaluation. The latter part of the analysis is carried out through a comparison of the partial coherence function and the newly introduced partial noise power contribution function. This paper presents the procedure of the STP analysis and demonstrates, using simulation data as well as Borssele PWR noise data, its effectiveness for the investigation of noise generation and propagation mechanisms. (author)

  5. Multifractal properties of diffusion-limited aggregates and random multiplicative processes

    International Nuclear Information System (INIS)

    Canessa, E.

    1991-04-01

    We consider the multifractal properties of irreversible diffusion-limited aggregation (DLA) from the point of view of the self-similarity of fluctuations in random multiplicative processes. In particular we analyse the breakdown of multifractal behaviour and phase transition associated with the negative moments of the growth probabilities in DLA. (author). 20 refs, 5 figs

  6. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    Science.gov (United States)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if in addition to the coordinate of the random walk in Z^d we introduce an extra variable that characterizes the position of the random walk inside the period then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove the convergence in the path space for the said random walk.

  7. The levels of processing effect under nitrogen narcosis.

    Science.gov (United States)

    Kneller, Wendy; Hobbs, Malcolm

    2013-01-01

    Previous research has consistently demonstrated that inert gas (nitrogen) narcosis affects free recall but not recognition memory in the depth range of 30 to 50 meters of sea water (msw), possibly as a result of narcosis preventing processing when learned material is encoded. The aim of the current research was to test this hypothesis by applying a levels of processing approach to the measurement of free recall under narcosis. Experiment 1 investigated the effect of depth (0-2 msw vs. 37-39 msw) and level of processing (shallow vs. deep) on free recall memory performance in 67 divers. When age was included as a covariate, recall was significantly worse in deep water (i.e., under narcosis), compared to shallow water, and was significantly higher in the deep processing compared to shallow processing conditions in both depth conditions. Experiment 2 demonstrated that this effect was not simply due to the different underwater environments used for the depth conditions in Experiment 1. It was concluded that memory performance can be altered by processing under narcosis, which supports the contention that narcosis affects the encoding stage of memory as opposed to self-guided search (retrieval).

  8. Art Therapy and Cognitive Processing Therapy for Combat-Related PTSD: A Randomized Controlled Trial

    Science.gov (United States)

    Campbell, Melissa; Decker, Kathleen P.; Kruk, Kerry; Deaver, Sarah P.

    2016-01-01

    This randomized controlled trial was designed to determine if art therapy in conjunction with Cognitive Processing Therapy (CPT) was more effective for reducing symptoms of combat posttraumatic stress disorder (PTSD) than CPT alone. Veterans (N = 11) were randomized to receive either individual CPT, or individual CPT in conjunction with individual…

  9. Random Walk on a Perturbation of the Infinitely-Fast Mixing Interchange Process

    Science.gov (United States)

    Salvi, Michele; Simenhaus, François

    2018-03-01

    We consider a random walk in dimension d≥1 in a dynamic random environment evolving as an interchange process with rate γ>0. We prove that, if we choose γ large enough, almost surely the empirical velocity of the walker X_t/t eventually lies in an arbitrarily small ball around the annealed drift. This statement is thus a perturbation of the case γ=+∞ where the environment is refreshed between each step of the walker. We extend, in three ways, part of the results of Huveneers and Simenhaus (Electron J Probab 20(105):42, 2015), where the environment was given by the 1-dimensional exclusion process: (i) We deal with any dimension d≥1; (ii) We treat the much more general interchange process, where each particle carries a transition vector chosen according to an arbitrary law μ; (iii) We show that X_t/t is not only in the same direction as the annealed drift, but that it is also close to it.

  10. Random Walk on a Perturbation of the Infinitely-Fast Mixing Interchange Process

    Science.gov (United States)

    Salvi, Michele; Simenhaus, François

    2018-05-01

    We consider a random walk in dimension d≥1 in a dynamic random environment evolving as an interchange process with rate γ>0. We prove that, if we choose γ large enough, almost surely the empirical velocity of the walker X_t/t eventually lies in an arbitrarily small ball around the annealed drift. This statement is thus a perturbation of the case γ=+∞ where the environment is refreshed between each step of the walker. We extend, in three ways, part of the results of Huveneers and Simenhaus (Electron J Probab 20(105):42, 2015), where the environment was given by the 1-dimensional exclusion process: (i) We deal with any dimension d≥1; (ii) We treat the much more general interchange process, where each particle carries a transition vector chosen according to an arbitrary law μ; (iii) We show that X_t/t is not only in the same direction as the annealed drift, but that it is also close to it.

  11. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)

  12. The Fatigue Behavior of Steel Structures under Random Loading

    DEFF Research Database (Denmark)

    Agerskov, Henning

    2009-01-01

    Fatigue damage accumulation in steel structures under random loading has been studied in a number of investigations at the Technical University of Denmark. The fatigue life of welded joints has been determined both experimentally and from a fracture mechanics analysis. In the experimental part of the investigation, fatigue test series with a total of 540 fatigue tests have been carried through on various types of welded plate test specimens and full-scale offshore tubular joints. The materials that have been used are either conventional structural steel or high-strength steel. The fatigue tests and the fracture mechanics analyses have been carried out using load histories, which are realistic in relation to the types of structures studied, i.e. primarily bridges, offshore structures and chimneys. In general, the test series carried through show a significant difference between constant amplitude...

  13. Randomized central limit theorems: A unified theory.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic, scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  14. Processes underlying treatment success and failure in assertive community treatment.

    Science.gov (United States)

    Stull, Laura G; McGrew, John H; Salyers, Michelle P

    2012-02-01

    Processes underlying success and failure in assertive community treatment (ACT), a widely investigated treatment model for persons with severe mental illness, are poorly understood. The purpose of the current study was to examine processes in ACT by (1) understanding how consumers and staff describe the processes underlying treatment success and failure and (2) comparing processes identified by staff and consumers. Investigators conducted semi-structured interviews with 25 staff and 23 consumers from four ACT teams. Both staff and consumers identified aspects of the ACT team itself as the most critical in the process of consumer success. For failure, consumers identified consumer characteristics as most critical and staff identified lack of social relationships. Processes underlying failure were not viewed as merely the opposite of processes underlying success. In addition, there was notable disagreement between staff and consumers on important processes. Findings overlap with critical ingredients identified in previous studies, including aspects of the ACT team, social involvement and employment. In contrast to prior studies, there was little emphasis on hospitalizations and greater emphasis on not abusing substances, obtaining wants and desires, and consumer characteristics.

  15. Random Process Theory Approach to Geometric Heterogeneous Surfaces: Effective Fluid-Solid Interaction

    Science.gov (United States)

    Khlyupin, Aleksey; Aslyamov, Timur

    2017-06-01

    Realistic fluid-solid interaction potentials are essential in the description of confined fluids, especially in the case of geometrically heterogeneous surfaces. A correlated random field is considered as a model of a random surface with high geometric roughness. We provide a general theory of the effective coarse-grained fluid-solid potential, obtained by properly averaging the free energy of fluid molecules which interact with the solid medium. This procedure is largely based on the theory of random processes. We apply the first-passage-time probability problem and assume local Markov properties of the random surfaces. A general expression for the effective fluid-solid potential is obtained. In the case of small surface irregularities an analytical approximation for the effective potential is proposed. Both amorphous materials with large surface roughness and crystalline solids with several types of fcc lattices are considered. It is shown that the wider the lattice spacing in terms of the molecular diameter of the fluid, the more the obtained potentials differ from the classical ones. A comparison with published Monte Carlo simulations is discussed. The work provides a promising approach to explore how random geometric heterogeneity affects the thermodynamic properties of fluids.

  16. Stability of prebiotic, laminaran oligosaccharide under food processing conditions

    Science.gov (United States)

    Chamidah, A.

    2018-04-01

    Prebiotic stability tests on laminaran oligosaccharide under food processing conditions were performed to determine the ability of the prebiotic to withstand processing. Laminaran oligosaccharide is produced by enzymatic hydrolysis. To further apply this prebiotic, it is necessary to test its performance under food processing. A single prebiotic, or a prebiotic in combination with a probiotic, can improve human digestive health. The evaluation of prebiotic effectiveness should take into account both its chemical and its functional stability. This study aims to investigate the stability of laminaran oligosaccharide under food processing conditions.

  17. Optimization of Wireless Transceivers under Processing Energy Constraints

    Science.gov (United States)

    Wang, Gaojian; Ascheid, Gerd; Wang, Yanlu; Hanay, Oner; Negra, Renato; Herrmann, Matthias; Wehn, Norbert

    2017-09-01

    The focus of this article is on achieving maximum data rates under a processing energy constraint. For a given amount of processing energy per information bit, the overall power consumption increases with the data rate. When targeting data rates beyond 100 Gb/s, the system's overall power consumption soon exceeds the power which can be dissipated without forced cooling. To achieve a maximum data rate under this power constraint, the processing energy per information bit must be minimized. Therefore, in this article, suitable processing-efficient transmission schemes together with energy-efficient architectures and their implementations are investigated in a true cross-layer approach. Target use cases are short-range wireless transmitters working at carrier frequencies around 60 GHz and bandwidths between 1 GHz and 10 GHz.

  18. The effects of divided attention on encoding processes under incidental and intentional learning instructions: underlying mechanisms?

    Science.gov (United States)

    Naveh-Benjamin, Moshe; Guez, Jonathan; Hara, Yoko; Brubaker, Matthew S; Lowenschuss-Erlich, Iris

    2014-01-01

    Divided attention (DA) at encoding has been shown to significantly disrupt later memory for the studied information. However, what type of processing gets disrupted during DA remains unresolved. In this study, we assessed the degree to which strategic effortful processes are affected under DA by comparing the effects of DA at encoding under intentional and pure incidental learning instructions. In three experiments, participants studied lists of words or word pairs under either full or divided attention. The results of the three experiments, which used different methodologies, converged to show that DA at encoding reduces memory performance to the same degree under incidental and intentional learning. Secondary task performance indicated that encoding under intentional learning instructions was more effortful than under incidental learning instructions. In addition, the results indicated enhanced attention to the initial appearance of the words under both types of learning instructions. The results are interpreted to imply that processes other than only strategic effortful ones might be affected by DA at encoding.

  19. Modeling random telegraph signal noise in CMOS image sensor under low light based on binomial distribution

    International Nuclear Information System (INIS)

    Zhang Yu; Wang Guangyi; Lu Xinmiao; Hu Yongcai; Xu Jiangtao

    2016-01-01

    The random telegraph signal noise in the pixel source follower MOSFET is the principal component of the noise in a CMOS image sensor under low light. In this paper, a physical and statistical model of the random telegraph signal noise in the pixel source follower, based on the binomial distribution, is set up. The number of electrons captured or released by the oxide traps in unit time is described as a random variable obeying the binomial distribution. As a result, the output states and the corresponding probabilities of the first and the second samples of the correlated double sampling circuit are acquired. The standard deviation of the output states after the correlated double sampling circuit can be obtained accordingly. In the simulation section, one hundred thousand samples of the source follower MOSFET have been simulated, and the simulation results show that the proposed model has statistical characteristics similar to the existing models under the effect of the channel length and the density of the oxide traps. Moreover, the noise histogram of the proposed model has been evaluated at different environmental temperatures. (paper)
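    A minimal Monte Carlo sketch of the idea, not the authors' implementation: trap occupancy at the two correlated double sampling (CDS) instants is drawn from a binomial distribution, and the CDS difference gives the noise histogram. The trap count, occupancy probability and voltage step per trapped electron below are illustrative assumptions.

```python
import numpy as np

# Minimal Monte Carlo sketch of an RTS-noise model where the number of
# electrons captured/released by oxide traps follows a binomial distribution
# (illustrative parameters, not the paper's values).
rng = np.random.default_rng(0)

n_pixels = 100_000        # simulated source-follower samples
n_traps = 4               # oxide traps per source follower (assumed)
p_occupy = 0.3            # probability a trap holds an electron at a sampling instant (assumed)
dv_per_electron = 0.5e-3  # output voltage shift per trapped electron, volts (assumed)

# Trap occupancy at the two correlated-double-sampling (CDS) instants.
sample1 = rng.binomial(n_traps, p_occupy, size=n_pixels)
sample2 = rng.binomial(n_traps, p_occupy, size=n_pixels)

# CDS output is the difference of the two samples scaled to a voltage.
cds_out = (sample2 - sample1) * dv_per_electron

print(f"CDS output std: {cds_out.std() * 1e3:.3f} mV")
# A histogram of cds_out approximates the noise histogram discussed in the record.
```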

  20. Quantification of damage and fatigue life under random loading

    Directory of Open Access Journals (Sweden)

    Sahnoun ZENGAH

    2017-12-01

    Full Text Available The fatigue of components and structures under real loading is a very complex process that originates at the grain scale. The present work highlights a fatigue life prediction procedure for variable loading, using the rainflow cycle counting method and cumulative damage models. Four cumulative damage models are retained and used to estimate the lifetime and to evaluate the damage indicator (D), namely: the Miner model, the damaged stress model ("DSM"), the unified theory and, finally, Henry's law.
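    A minimal sketch of the damage-summation step under assumed inputs: the stress amplitudes and cycle counts below stand in for a rainflow count of a random load history, and the S-N curve is an assumed Basquin form, not one of the four models compared in the paper.

```python
import numpy as np

# Minimal sketch of a Miner-rule damage sum for rainflow-counted cycles.
# S-N curve assumed in Basquin form: N(S) = C * S**(-m), with C and m assumed.
C, m = 2.0e12, 3.0

# (stress amplitude in MPa, number of counted cycles) pairs, illustrative only.
rainflow_cycles = [(80.0, 1.5e5), (120.0, 4.0e4), (160.0, 8.0e3), (200.0, 1.2e3)]

def cycles_to_failure(s_amp):
    """Allowable cycles at constant amplitude s_amp from the assumed Basquin curve."""
    return C * s_amp**(-m)

# Miner's linear damage indicator D = sum(n_i / N_i); failure predicted at D >= 1.
D = sum(n / cycles_to_failure(s) for s, n in rainflow_cycles)
print(f"Damage indicator D = {D:.3f}")
print(f"Estimated life fraction consumed: {100 * D:.1f}%")
```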

  1. Time at which the maximum of a random acceleration process is reached

    International Nuclear Information System (INIS)

    Majumdar, Satya N; Rosso, Alberto; Zoia, Andrea

    2010-01-01

    We study the random acceleration model, which is perhaps one of the simplest, yet nontrivial, non-Markov stochastic processes, and is key to many applications. For this non-Markov process, we present exact analytical results for the probability density p(t m |T) of the time t m at which the process reaches its maximum, within a fixed time interval [0, T]. We study two different boundary conditions, which correspond to the process representing respectively (i) the integral of a Brownian bridge and (ii) the integral of a free Brownian motion. Our analytical results are also verified by numerical simulations.
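    A simple Monte Carlo check of case (ii), the integral of a free Brownian motion: simulate many discretized paths and histogram the time at which each attains its maximum on [0, T]. The step size and path count are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo sketch of the random acceleration process x''(t) = xi(t),
# i.e. x is the integral of a free Brownian motion, estimating the
# distribution of the time t_m at which x attains its maximum on [0, T].
rng = np.random.default_rng(1)

T, n_steps, n_paths = 1.0, 1_000, 5_000
dt = T / n_steps

# The velocity v is a Brownian motion; the position x is its running integral.
dv = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
v = np.cumsum(dv, axis=1)
x = np.cumsum(v, axis=1) * dt

# Time at which each path reaches its maximum.
t_max = (np.argmax(x, axis=1) + 1) * dt

hist, edges = np.histogram(t_max, bins=20, range=(0.0, T), density=True)
width = edges[1] - edges[0]
for lo, p in zip(edges[:-1], hist):
    print(f"t_m in [{lo:.2f}, {lo + width:.2f}): estimated p(t_m|T) ≈ {p:.2f}")
```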

  2. Total-Evidence Dating under the Fossilized Birth-Death Process.

    Science.gov (United States)

    Zhang, Chi; Stadler, Tanja; Klopfstein, Seraina; Heath, Tracy A; Ronquist, Fredrik

    2016-03-01

    Bayesian total-evidence dating involves the simultaneous analysis of morphological data from the fossil record and morphological and sequence data from recent organisms, and it accommodates the uncertainty in the placement of fossils while dating the phylogenetic tree. Due to the flexibility of the Bayesian approach, total-evidence dating can also incorporate additional sources of information. Here, we take advantage of this and expand the analysis to include information about fossilization and sampling processes. Our work is based on the recently described fossilized birth-death (FBD) process, which has been used to model speciation, extinction, and fossilization rates that can vary over time in a piecewise manner. So far, sampling of extant and fossil taxa has been assumed to be either complete or uniformly at random, an assumption which is only valid for a minority of data sets. We therefore extend the FBD process to accommodate diversified sampling of extant taxa, which is standard practice in studies of higher-level taxa. We verify the implementation using simulations and apply it to the early radiation of Hymenoptera (wasps, ants, and bees). Previous total-evidence dating analyses of this data set were based on a simple uniform tree prior and dated the initial radiation of extant Hymenoptera to the late Carboniferous (309 Ma). The analyses using the FBD prior under diversified sampling, however, date the radiation to the Triassic and Permian (252 Ma), slightly older than the age of the oldest hymenopteran fossils. By exploring a variety of FBD model assumptions, we show that it is mainly the accommodation of diversified sampling that causes the push toward more recent divergence times. Accounting for diversified sampling thus has the potential to close the long-discussed gap between rocks and clocks. We conclude that the explicit modeling of fossilization and sampling processes can improve divergence time estimates, but only if all important model aspects

  3. Gaussian random-matrix process and universal parametric correlations in complex systems

    International Nuclear Information System (INIS)

    Attias, H.; Alhassid, Y.

    1995-01-01

    We introduce the framework of the Gaussian random-matrix process as an extension of Dyson's Gaussian ensembles and use it to discuss the statistical properties of complex quantum systems that depend on an external parameter. We classify the Gaussian processes according to the short-distance diffusive behavior of their energy levels and demonstrate that all parametric correlation functions become universal upon the appropriate scaling of the parameter. The class of differentiable Gaussian processes is identified as the relevant one for most physical systems. We reproduce the known spectral correlators and compute eigenfunction correlators in their universal form. Numerical evidence from both a chaotic model and a weakly disordered model confirms our predictions.

  4. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.

  5. Quantum Entanglement Growth under Random Unitary Dynamics

    Directory of Open Access Journals (Sweden)

    Adam Nahum

    2017-07-01

    Full Text Available Characterizing how entanglement grows with time in a many-body system, for example, after a quantum quench, is a key problem in nonequilibrium quantum physics. We study this problem for the case of random unitary dynamics, representing either Hamiltonian evolution with time-dependent noise or evolution by a random quantum circuit. Our results reveal a universal structure behind noisy entanglement growth, and also provide simple new heuristics for the “entanglement tsunami” in Hamiltonian systems without noise. In 1D, we show that noise causes the entanglement entropy across a cut to grow according to the celebrated Kardar-Parisi-Zhang (KPZ) equation. The mean entanglement grows linearly in time, while fluctuations grow like (time)^{1/3} and are spatially correlated over a distance ∝(time)^{2/3}. We derive KPZ universal behavior in three complementary ways, by mapping random entanglement growth to (i) a stochastic model of a growing surface, (ii) a “minimal cut” picture, reminiscent of the Ryu-Takayanagi formula in holography, and (iii) a hydrodynamic problem involving the dynamical spreading of operators. We demonstrate KPZ universality in 1D numerically using simulations of random unitary circuits. Importantly, the leading-order time dependence of the entropy is deterministic even in the presence of noise, allowing us to propose a simple coarse grained minimal cut picture for the entanglement growth of generic Hamiltonians, even without noise, in arbitrary dimensionality. We clarify the meaning of the “velocity” of entanglement growth in the 1D entanglement tsunami. We show that in higher dimensions, noisy entanglement evolution maps to the well-studied problem of pinning of a membrane or domain wall by disorder.

  6. Quantum Entanglement Growth under Random Unitary Dynamics

    Science.gov (United States)

    Nahum, Adam; Ruhman, Jonathan; Vijay, Sagar; Haah, Jeongwan

    2017-07-01

    Characterizing how entanglement grows with time in a many-body system, for example, after a quantum quench, is a key problem in nonequilibrium quantum physics. We study this problem for the case of random unitary dynamics, representing either Hamiltonian evolution with time-dependent noise or evolution by a random quantum circuit. Our results reveal a universal structure behind noisy entanglement growth, and also provide simple new heuristics for the "entanglement tsunami" in Hamiltonian systems without noise. In 1D, we show that noise causes the entanglement entropy across a cut to grow according to the celebrated Kardar-Parisi-Zhang (KPZ) equation. The mean entanglement grows linearly in time, while fluctuations grow like (time)^{1/3} and are spatially correlated over a distance ∝(time)^{2/3}. We derive KPZ universal behavior in three complementary ways, by mapping random entanglement growth to (i) a stochastic model of a growing surface, (ii) a "minimal cut" picture, reminiscent of the Ryu-Takayanagi formula in holography, and (iii) a hydrodynamic problem involving the dynamical spreading of operators. We demonstrate KPZ universality in 1D numerically using simulations of random unitary circuits. Importantly, the leading-order time dependence of the entropy is deterministic even in the presence of noise, allowing us to propose a simple coarse grained minimal cut picture for the entanglement growth of generic Hamiltonians, even without noise, in arbitrary dimensionality. We clarify the meaning of the "velocity" of entanglement growth in the 1D entanglement tsunami. We show that in higher dimensions, noisy entanglement evolution maps to the well-studied problem of pinning of a membrane or domain wall by disorder.
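    As an illustration of the surface-growth mapping mentioned above (not the paper's circuit simulations), a generic 1D growth model in the KPZ universality class, ballistic deposition, can be used to see interface fluctuations growing roughly like t^(1/3). The lattice size and deposition times are arbitrary choices.

```python
import numpy as np

# Ballistic deposition: a simple 1D growth model in the KPZ class, used here
# only to illustrate the (time)^{1/3} fluctuation growth quoted in the abstract.
rng = np.random.default_rng(2)

L = 1_000                         # number of lattice sites (assumed)
h = np.zeros(L, dtype=np.int64)   # interface height profile

def deposit(h, n_particles, rng):
    """Drop particles at random columns; each sticks at the ballistic-deposition height."""
    n_sites = len(h)
    for site in rng.integers(0, n_sites, size=n_particles):
        left, right = h[(site - 1) % n_sites], h[(site + 1) % n_sites]
        h[site] = max(h[site] + 1, left, right)

checkpoints = [10, 100, 1_000]    # time measured in deposited monolayers
t_done = 0
for t in checkpoints:
    deposit(h, (t - t_done) * L, rng)
    t_done = t
    print(f"t = {t:5d} monolayers, interface width W(t) = {h.std():8.2f}")
# Up to finite-size corrections, W(t) should grow roughly like t**(1/3),
# i.e. by about a factor of 2 per decade in t.
```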

  7. Nonstationary random acoustic and electromagnetic fields as wave diffusion processes

    International Nuclear Information System (INIS)

    Arnaut, L R

    2007-01-01

    We investigate the effects of relatively rapid variations of the boundaries of an overmoded cavity on the stochastic properties of its interior acoustic or electromagnetic field. For quasi-static variations, this field can be represented as an ideal incoherent and statistically homogeneous isotropic random scalar or vector field, respectively. A physical model is constructed showing that the field dynamics can be characterized as a generalized diffusion process. The Langevin-Itô and Fokker-Planck equations are derived and their associated statistics and distributions for the complex analytic field, its magnitude and energy density are computed. The energy diffusion parameter is found to be proportional to the square of the ratio of the standard deviation of the source field to the characteristic time constant of the dynamic process, but is independent of the initial energy density, to first order. The energy drift vanishes in the asymptotic limit. The time-energy probability distribution is in general not separable, as a result of nonstationarity. A general solution of the Fokker-Planck equation is obtained in integral form, together with explicit closed-form solutions for several asymptotic cases. The findings extend known results on statistics and distributions of quasi-stationary ideal random fields (pure diffusions), which are retrieved as special cases.

  8. To be and not to be: scale correlations in random multifractal processes

    DEFF Research Database (Denmark)

    Cleve, Jochen; Schmiegel, Jürgen; Greiner, Martin

    We discuss various properties of a random multifractal process, which are related to the issue of scale correlations. By design, the process is homogeneous, non-conservative and has no built-in scale correlations. However, when it comes to observables like breakdown coefficients, which are based...... on a coarse-graining of the multifractal field, scale correlations do appear. In the log-normal limit of the model process, the conditional distributions and moments of breakdown coefficients reproduce the observations made in fully developed small-scale turbulence. These findings help to understand several...

  9. An introduction to random interlacements

    CERN Document Server

    Drewitz, Alexander; Sapozhnikov, Artëm

    2014-01-01

    This book gives a self-contained introduction to the theory of random interlacements. The intended reader of the book is a graduate student with a background in probability theory who wants to learn about the fundamental results and methods of this rapidly emerging field of research. The model was introduced by Sznitman in 2007 in order to describe the local picture left by the trace of a random walk on a large discrete torus when it runs up to times proportional to the volume of the torus. Random interlacements is a new percolation model on the d-dimensional lattice. The main results covered by the book include the full proof of the local convergence of random walk trace on the torus to random interlacements and the full proof of the percolation phase transition of the vacant set of random interlacements in all dimensions. The reader will become familiar with the techniques relevant to working with the underlying Poisson Process and the method of multi-scale renormalization, which helps in overcoming the ch...

  10. Repairable system analysis in presence of covariates and random effects

    International Nuclear Information System (INIS)

    Giorgio, M.; Guida, M.; Pulcini, G.

    2014-01-01

    This paper aims to model the failure pattern of repairable systems in the presence of explained and unexplained heterogeneity. The failure pattern of each system is described by a Power Law Process. Part of the heterogeneity among the patterns is explained through the use of a covariate, and the residual unexplained heterogeneity (random effects) is modeled via a joint probability distribution on the PLP parameters. The proposed approach is applied to a real set of failure time data of powertrain systems mounted on 33 buses employed on urban and suburban routes. Moreover, the joint probability distribution on the PLP parameters estimated from the data is used as an informative prior to make Bayesian inference on the future failure process of a generic system belonging to the same population and employed on an urban or suburban route under randomly chosen working conditions. - Highlights: • We describe the failure process of buses' powertrain systems subject to heterogeneity. • Heterogeneity due to different service types is explained by a covariate. • Random effects are modeled through a joint pdf on the failure process parameters. • The powertrain reliability under new future operating conditions is estimated

  11. Is neutron evaporation from highly excited nuclei a Poisson random process?

    International Nuclear Information System (INIS)

    Simbel, M.H.

    1982-01-01

    It is suggested that neutron emission from highly excited nuclei follows a Poisson random process. The continuous variable of the process is the excitation energy excess over the binding energy of the emitted neutrons and the discrete variable is the number of emitted neutrons. Cross sections for (HI,xn) reactions are analyzed using a formula containing a Poisson distribution function. The post- and pre-equilibrium components of the cross section are treated separately. The agreement between the predictions of this formula and the experimental results is very good. (orig.)
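    A minimal numerical sketch of the underlying idea: if the number of evaporated neutrons x is Poisson-distributed with a mean set by the excitation-energy excess, the relative (HI,xn) yields follow directly. The linear parametrization of the mean is an assumption for illustration, not the fitted formula of the paper.

```python
from math import exp, factorial

# Sketch: the number x of evaporated neutrons is treated as a Poisson variate
# whose mean grows with the excitation-energy excess E over the neutron binding
# energies. The mean lam = E / e_per_neutron is an assumed parametrization.
def poisson_pmf(x, lam):
    return lam**x * exp(-lam) / factorial(x)

e_excess = 30.0        # MeV of excitation energy above threshold (illustrative)
e_per_neutron = 10.0   # MeV effectively removed per evaporated neutron (assumed)
lam = e_excess / e_per_neutron

# Relative population of (HI, xn) channels.
for x in range(7):
    print(f"x = {x} neutrons: relative yield ≈ {poisson_pmf(x, lam):.3f}")
```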

  12. Some functional limit theorems for compound Cox processes

    Energy Technology Data Exchange (ETDEWEB)

    Korolev, Victor Yu. [Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Institute of Informatics Problems FRC CSC RAS (Russian Federation); Chertok, A. V. [Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Euphoria Group LLC (Russian Federation); Korchagin, A. Yu. [Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Kossova, E. V. [Higher School of Economics National Research University, Moscow (Russian Federation); Zeifman, Alexander I. [Vologda State University, S.Orlova, 6, Vologda (Russian Federation); Institute of Informatics Problems FRC CSC RAS, ISEDT RAS (Russian Federation)

    2016-06-08

    An improved version of the functional limit theorem is proved establishing weak convergence of random walks generated by compound doubly stochastic Poisson processes (compound Cox processes) to Lévy processes in the Skorokhod space under more realistic moment conditions. As corollaries, theorems are proved on convergence of random walks with jumps having finite variances to Lévy processes with variance-mean mixed normal distributions, in particular, to stable Lévy processes.

  13. Some functional limit theorems for compound Cox processes

    International Nuclear Information System (INIS)

    Korolev, Victor Yu.; Chertok, A. V.; Korchagin, A. Yu.; Kossova, E. V.; Zeifman, Alexander I.

    2016-01-01

    An improved version of the functional limit theorem is proved establishing weak convergence of random walks generated by compound doubly stochastic Poisson processes (compound Cox processes) to Lévy processes in the Skorokhod space under more realistic moment conditions. As corollaries, theorems are proved on convergence of random walks with jumps having finite variances to Lévy processes with variance-mean mixed normal distributions, in particular, to stable Lévy processes.

  14. Studies in astronomical time series analysis. IV - Modeling chaotic and random processes with linear filters

    Science.gov (United States)

    Scargle, Jeffrey D.

    1990-01-01

    While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.

  15. Random Gap Detection Test (RGDT) performance of individuals with central auditory processing disorders from 5 to 25 years of age.

    Science.gov (United States)

    Dias, Karin Ziliotto; Jutras, Benoît; Acrani, Isabela Olszanski; Pereira, Liliane Desgualdo

    2012-02-01

    The aim of the present study was to assess auditory temporal resolution ability in individuals with central auditory processing disorders, to examine the maturation effect, and to investigate the relationship between performance on a temporal resolution test and performance on other central auditory tests. Participants were divided into two groups: 131 with Central Auditory Processing Disorder and 94 with normal auditory processing. They had pure-tone air-conduction thresholds no poorer than 15 dB HL bilaterally, normal admittance measures and presence of acoustic reflexes. They were also assessed with a central auditory test battery. Participants who failed one or more tests were included in the Central Auditory Processing Disorder group, and those in the control group obtained normal performance on all tests. Following the auditory processing assessment, the Random Gap Detection Test was administered to the participants. A three-way ANOVA was performed. Correlation analyses were also done between the four Random Gap Detection Test subtests as well as between the Random Gap Detection Test data and the other auditory processing test results. There was a significant difference between the age-group performances in children with and without Central Auditory Processing Disorder. Also, 48% of children with Central Auditory Processing Disorder failed the Random Gap Detection Test, and the percentage decreased as a function of age. The highest percentage (86%) was found in the 5-6 year-old children. Furthermore, results revealed a strong significant correlation between the four Random Gap Detection Test subtests. There was a modest correlation between the Random Gap Detection Test results and the dichotic listening tests. No significant correlation was observed between the Random Gap Detection Test data and the results of the other tests in the battery. The Random Gap Detection Test should not be administered to children younger than 7 years old because

  16. Random pulse generator

    International Nuclear Information System (INIS)

    Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing

    2007-01-01

    Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time. A normal pulse generator, however, generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generators are reviewed, and two digital random pulse generators are introduced. (authors)

  17. Art Therapy and Cognitive Processing Therapy for Combat-Related PTSD: A Randomized Controlled Trial

    Science.gov (United States)

    Campbell, Melissa; Decker, Kathleen P.; Kruk, Kerry; Deaver, Sarah P.

    2018-01-01

    This randomized controlled trial was designed to determine if art therapy in conjunction with Cognitive Processing Therapy (CPT) was more effective for reducing symptoms of combat posttraumatic stress disorder (PTSD) than CPT alone. Veterans (N = 11) were randomized to receive either individual CPT, or individual CPT in conjunction with individual art therapy. PTSD Checklist–Military Version and Beck Depression Inventory–II scores improved with treatment in both groups with no significant difference in improvement between the experimental and control groups. Art therapy in conjunction with CPT was found to improve trauma processing and veterans considered it to be an important part of their treatment as it provided healthy distancing, enhanced trauma recall, and increased access to emotions. PMID:29332989

  18. Polymers and Random graphs: Asymptotic equivalence to branching processes

    International Nuclear Information System (INIS)

    Spouge, J.L.

    1985-01-01

    In 1974, Falk and Thomas did a computer simulation of Flory's Equireactive RA_f Polymer model, rings forbidden and rings allowed. Asymptotically, the Rings Forbidden model tended to Stockmayer's RA_f distribution (in which the sol distribution "sticks" after gelation), while the Rings Allowed model tended to the Flory version of the RA_f distribution. In 1965, Whittle introduced the Tree and Pseudomultigraph models. We show that these random graphs generalize the Falk and Thomas models by incorporating first-shell substitution effects. Moreover, asymptotically the Tree model displays postgelation "sticking." Hence this phenomenon results from the absence of rings and occurs independently of equireactivity. We also show that the Pseudomultigraph model is asymptotically identical to the Branching Process model introduced by Gordon in 1962. This provides a possible basis for the Branching Process model in standard statistical mechanics.

  19. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    CERN Document Server

    Vamos, C; Vereecken, H

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested.

  20. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    International Nuclear Information System (INIS)

    Vamos, Calin; Suciu, Nicolae; Vereecken, Harry

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested
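    A simplified sketch of the node-wise scattering idea for an unbiased 1D walk: all particles on a node are redistributed with a single binomial draw per node and per step. The absence of a "stay" probability and the parameter values are simplifying assumptions; the full algorithm ties the jump probabilities to the diffusion coefficient and grid spacing.

```python
import numpy as np

# Global-random-walk sketch: instead of moving particles one by one, all n
# particles sitting on a grid node are scattered at once with one binomial
# (Bernoulli-repartition) draw per node. Unbiased walk, no "stay" fraction
# (an assumption made for brevity).
rng = np.random.default_rng(3)

n_nodes, n_steps = 201, 400
particles = np.zeros(n_nodes, dtype=np.int64)
particles[n_nodes // 2] = 1_000_000     # all particles start at the centre node

for _ in range(n_steps):
    go_left = rng.binomial(particles, 0.5)   # one binomial draw per node
    go_right = particles - go_left
    new = np.zeros_like(particles)
    new[:-1] += go_left[1:]     # left-movers land one node to the left
    new[1:] += go_right[:-1]    # right-movers land one node to the right
    particles = new             # (particles leaving the ends are discarded)

# The coarse-grained concentration profile approximates the Gaussian solution
# of the diffusion equation, as in the finite-difference scheme it generalizes.
x = np.arange(n_nodes) - n_nodes // 2
mean = np.average(x, weights=particles)
var = np.average((x - mean) ** 2, weights=particles)
print(f"sample mean ≈ {mean:.3f}, sample variance ≈ {var:.1f} (theory: {n_steps})")
```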

  1. False Operation of Static Random Access Memory Cells under Alternating Current Power Supply Voltage Variation

    Science.gov (United States)

    Sawada, Takuya; Takata, Hidehiro; Nii, Koji; Nagata, Makoto

    2013-04-01

    Static random access memory (SRAM) cores exhibit susceptibility to power supply voltage variation. False operation of SRAM cells under sinusoidal voltage variation on the power lines, introduced by direct RF power injection, is investigated. A standard 16 kbyte SRAM core in a 90 nm 1.5 V technology is diagnosed with built-in self-test and on-die noise monitor techniques. The bit error rate is shown to be highly sensitive to the frequency of the injected voltage variation, while it is not greatly influenced by the frequency and phase difference relative to the SRAM clocking. It is also observed that the distribution of false bits is substantially random across the cell array.

  2. Formation process of Malaysian modern architecture under influence of nationalism

    OpenAIRE

    宇高, 雄志; 山崎, 大智

    2001-01-01

    This paper examines the formation process of Malaysian modern architecture under the influence of nationalism, through the process of Malaysia's independence. The national style, "Malaysian national architecture", was developed against the background of the political environment of the post-colonial situation. Malaysian urban design is also determined by the balance between the ethnic cultures and the national culture. In Malaysia, the Malay ethnic culture was chosen as the national culture....

  3. Coverage maximization under resource constraints using a nonuniform proliferating random walk.

    Science.gov (United States)

    Saha, Sudipta; Ganguly, Niloy

    2013-02-01

    Information management services on networks, such as search and dissemination, play a key role in any large-scale distributed system. One of the most desirable features of these services is the maximization of the coverage, i.e., the number of distinctly visited nodes under constraints of network resources as well as time. However, redundant visits of nodes by different message packets (modeled, e.g., as walkers) initiated by the underlying algorithms for these services cause wastage of network resources. In this work, using results from analytical studies done in the past on a K-random-walk-based algorithm, we identify that redundancy quickly increases with an increase in the density of the walkers. Based on this postulate, we design a very simple distributed algorithm which dynamically estimates the density of the walkers and thereby carefully proliferates walkers in sparse regions. We use extensive computer simulations to test our algorithm in various kinds of network topologies whereby we find it to be performing particularly well in networks that are highly clustered as well as sparse.

  4. 5th Seminar on Stochastic Processes, Random Fields and Applications

    CERN Document Server

    Russo, Francesco; Dozzi, Marco

    2008-01-01

    This volume contains twenty-eight refereed research or review papers presented at the 5th Seminar on Stochastic Processes, Random Fields and Applications, which took place at the Centro Stefano Franscini (Monte Verità) in Ascona, Switzerland, from May 30 to June 3, 2005. The seminar focused mainly on stochastic partial differential equations, random dynamical systems, infinite-dimensional analysis, approximation problems, and financial engineering. The book will be a valuable resource for researchers in stochastic analysis and professionals interested in stochastic methods in finance. Contributors: Y. Asai, J.-P. Aubin, C. Becker, M. Benaïm, H. Bessaih, S. Biagini, S. Bonaccorsi, N. Bouleau, N. Champagnat, G. Da Prato, R. Ferrière, F. Flandoli, P. Guasoni, V.B. Hallulli, D. Khoshnevisan, T. Komorowski, R. Léandre, P. Lescot, H. Lisei, J.A. López-Mimbela, V. Mandrekar, S. Méléard, A. Millet, H. Nagai, A.D. Neate, V. Orlovius, M. Pratelli, N. Privault, O. Raimond, M. Röckner, B. Rüdiger, W.J. Runggaldi...

  5. Asymmetric Spatial Processing Under Cognitive Load.

    Science.gov (United States)

    Naert, Lien; Bonato, Mario; Fias, Wim

    2018-01-01

    Spatial attention allows us to selectively process information within a certain location in space. Despite the vast literature on spatial attention, the effect of cognitive load on spatial processing is still not fully understood. In this study we added cognitive load to a spatial processing task, so as to see whether it would differentially impact upon the processing of visual information in the left versus the right hemispace. The main paradigm consisted of a detection task that was performed during the maintenance interval of a verbal working memory task. We found that increasing cognitive working memory load had a more negative impact on detecting targets presented on the left side compared to those on the right side. The strength of the load effect correlated with the strength of the interaction on an individual level. The implications of an asymmetric attentional bias with a relative disadvantage for the left (vs the right) hemispace under high verbal working memory (WM) load are discussed.

  6. Asymmetric Spatial Processing Under Cognitive Load

    Directory of Open Access Journals (Sweden)

    Lien Naert

    2018-04-01

    Full Text Available Spatial attention allows us to selectively process information within a certain location in space. Despite the vast literature on spatial attention, the effect of cognitive load on spatial processing is still not fully understood. In this study we added cognitive load to a spatial processing task, so as to see whether it would differentially impact upon the processing of visual information in the left versus the right hemispace. The main paradigm consisted of a detection task that was performed during the maintenance interval of a verbal working memory task. We found that increasing cognitive working memory load had a more negative impact on detecting targets presented on the left side compared to those on the right side. The strength of the load effect correlated with the strength of the interaction on an individual level. The implications of an asymmetric attentional bias with a relative disadvantage for the left (vs the right) hemispace under high verbal working memory (WM) load are discussed.

  7. Reliability analysis of structures under periodic proof tests in service

    Science.gov (United States)

    Yang, J.-N.

    1976-01-01

    A reliability analysis of structures subjected to random service loads and periodic proof tests treats gust loads and maneuver loads as random processes. Crack initiation, crack propagation, and strength degradation are treated as the fatigue process. The time to fatigue crack initiation and ultimate strength are random variables. Residual strength decreases during crack propagation, so that failure rate increases with time. When a structure fails under periodic proof testing, a new structure is built and proof-tested. The probability of structural failure in service is derived from treatment of all the random variables, strength degradations, service loads, proof tests, and the renewal of failed structures. Some numerical examples are worked out.

  8. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.

  9. Diffusion in randomly perturbed dissipative dynamics

    Science.gov (United States)

    Rodrigues, Christian S.; Chechkin, Aleksei V.; de Moura, Alessandro P. S.; Grebogi, Celso; Klages, Rainer

    2014-11-01

    Dynamical systems having many coexisting attractors present interesting properties from both fundamental theoretical and modelling points of view. When such dynamics is under bounded random perturbations, the basins of attraction are no longer invariant and there is the possibility of transport among them. Here we introduce a basic theoretical setting which enables us to study this hopping process from the perspective of anomalous transport using the concept of a random dynamical system with holes. We apply it to a simple model by investigating the role of hyperbolicity for the transport among basins. We show numerically that our system exhibits non-Gaussian position distributions, power-law escape times, and subdiffusion. Our simulation results are reproduced consistently from stochastic continuous time random walk theory.

  10. Simulation study on characteristics of long-range interaction in randomly asymmetric exclusion process

    Science.gov (United States)

    Zhao, Shi-Bo; Liu, Ming-Zhe; Yang, Lan-Ying

    2015-04-01

    In this paper we investigate, via Monte Carlo simulations and theoretical analysis, the dynamics of an asymmetric exclusion process on a one-dimensional lattice with long-range hopping and random update. Unlike in previous exclusion process models, particles in this model first try to hop over successive unoccupied sites with a probability q, which may represent random access of particles. Numerical simulations for the stationary particle currents, density profiles, and phase diagrams are obtained. There are three possible stationary phases in the system: the low density (LD) phase, the high density (HD) phase, and the maximal current (MC) phase. Interestingly, the bulk density in the LD phase tends to zero, while the MC phase is governed by α, β, and q. The HD phase is nearly the same as in the normal TASEP, determined by the exit rate β. The theoretical analysis is in good agreement with the simulation results. The proposed model may provide a better understanding of random interaction dynamics in complex systems. Project supported by the National Natural Science Foundation of China (Grant Nos. 41274109 and 11104022), the Fund for Sichuan Youth Science and Technology Innovation Research Team (Grant No. 2011JTD0013), and the Creative Team Program of Chengdu University of Technology.
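    A hedged Monte Carlo sketch of such a model: an open-boundary exclusion process with random-sequential update in which, with probability q, the selected particle jumps to the far end of the vacant stretch ahead of it. The precise jump rule and the rates α, β, q used here are illustrative guesses, not necessarily the paper's definition.

```python
import numpy as np

# 1D open-boundary exclusion process with random-sequential update and an
# assumed long-range hop rule (jump to the farthest empty site with prob. q).
rng = np.random.default_rng(4)

L, alpha, beta, q = 100, 0.3, 0.7, 0.4
steps, warmup = 200_000, 50_000
lattice = np.zeros(L, dtype=np.int8)
hops = density_sum = density_cnt = 0

for step in range(steps):
    i = rng.integers(-1, L)              # -1 selects the left boundary (injection)
    if i == -1:
        if lattice[0] == 0 and rng.random() < alpha:
            lattice[0] = 1
    elif i == L - 1:
        if lattice[i] == 1 and rng.random() < beta:
            lattice[i] = 0               # extraction at the right boundary
    elif lattice[i] == 1 and lattice[i + 1] == 0:
        j = i + 1
        if rng.random() < q:
            while j + 1 < L and lattice[j + 1] == 0:
                j += 1                   # farthest empty site of the vacant run
        lattice[i], lattice[j] = 0, 1
        if step >= warmup:
            hops += 1
    if step >= warmup and step % 200 == 0:
        density_sum += lattice[L // 4: 3 * L // 4].mean()
        density_cnt += 1

print(f"bulk density ≈ {density_sum / density_cnt:.3f}")
print(f"hops per update attempt (current proxy) ≈ {hops / (steps - warmup):.4f}")
```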

  11. Pseudo-random number generators for Monte Carlo simulations on ATI Graphics Processing Units

    Science.gov (United States)

    Demchik, Vadim

    2011-03-01

    Basic uniform pseudo-random number generators are implemented on ATI Graphics Processing Units (GPU). The performance results of the implemented generators (multiplicative linear congruential (GGL), XOR-shift (XOR128), RANECU, RANMAR, RANLUX and Mersenne Twister (MT19937)) on CPU and GPU are discussed. The obtained speed-up factor is hundreds of times in comparison with the CPU. The RANLUX generator is found to be the most appropriate for use on GPU in Monte Carlo simulations. A brief review of the pseudo-random number generators used in modern software packages for Monte Carlo simulations in high-energy physics is presented.
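    For reference, a CPU-side sketch of Marsaglia's xorshift128 ("XOR128") update rule, one of the generators benchmarked in the article; on a GPU each thread would simply carry its own four-word state. The seeding scheme below is an assumption.

```python
# Reference (CPU) sketch of Marsaglia's xorshift128 ("XOR128") generator.
MASK32 = 0xFFFFFFFF

class Xor128:
    def __init__(self, seed=123456789):
        # Non-zero 128-bit state split into four 32-bit words (seeding scheme assumed).
        self.x, self.y, self.z, self.w = seed & MASK32, 362436069, 521288629, 88675123

    def next_u32(self):
        t = (self.x ^ (self.x << 11)) & MASK32
        self.x, self.y, self.z = self.y, self.z, self.w
        self.w = (self.w ^ (self.w >> 19) ^ (t ^ (t >> 8))) & MASK32
        return self.w

    def next_uniform(self):
        """Uniform float in [0, 1)."""
        return self.next_u32() / 2**32

rng = Xor128(seed=2011)
# Crude Monte Carlo use: estimate pi from pairs of uniforms.
n = 100_000
hits = sum(rng.next_uniform()**2 + rng.next_uniform()**2 < 1.0 for _ in range(n))
print(f"pi ≈ {4 * hits / n:.4f}")
```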

  12. A random point process model for the score in sport matches

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2009-01-01

    Roč. 20, č. 2 (2009), s. 121-131 ISSN 1471-678X R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z10750506 Keywords : sport statistics * scoring intensity * Cox’s regression model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2009/SI/volf-a random point process model for the score in sport matches.pdf

  13. On the generation of log-Levy distributions and extreme randomness

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2011-01-01

    The log-normal distribution is prevalent across the sciences, as it emerges from the combination of multiplicative processes and the central limit theorem (CLT). The CLT, beyond yielding the normal distribution, also yields the class of Levy distributions. The log-Levy distributions are the Levy counterparts of the log-normal distribution; they appear in the context of ultraslow diffusion processes, and they are categorized by Mandelbrot as belonging to the class of extreme randomness. In this paper, we present a natural stochastic growth model from which both the log-normal distribution and the log-Levy distributions emerge universally: the former in the case of a deterministic underlying setting, and the latter in the case of a stochastic underlying setting. In particular, we establish a stochastic growth model which universally generates Mandelbrot's extreme randomness. (paper)

  14. Palm theory for random time changes

    Directory of Open Access Journals (Sweden)

    Masakiyo Miyazawa

    2001-01-01

    Full Text Available Palm distributions are basic tools when studying stationarity in the context of point processes, queueing systems, fluid queues or random measures. The framework varies with the random phenomenon of interest, but usually a one-dimensional group of measure-preserving shifts is the starting point. In the present paper, by alternatively using a framework involving random time changes (RTCs) and a two-dimensional family of shifts, we are able to characterize all of the above systems in a single framework. Moreover, this leads to what we call the detailed Palm distribution (DPD), which is stationary with respect to a certain group of shifts. The DPD has a very natural interpretation as the distribution seen at a randomly chosen position on the extended graph of the RTC, and satisfies a general duality criterion: the DPD of the DPD gives the underlying probability P in return.

  15. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  16. On a randomly imperfect spherical cap pressurized by a random ...

    African Journals Online (AJOL)

    On a randomly imperfect spherical cap pressurized by a random dynamic load. ... In this paper, we investigate a dynamical system in a random setting of dual ... characterization of the random process for determining the dynamic buckling load ...

  17. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise.

    Science.gov (United States)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.

  19. Setting up a randomized clinical trial in the UK: approvals and process.

    Science.gov (United States)

    Greene, Louise Eleanor; Bearn, David R

    2013-06-01

    Randomized clinical trials are considered the 'gold standard' in primary research for healthcare interventions. However, they can be expensive and time-consuming to set up and require many approvals to be in place before they can begin. This paper outlines how to determine what approvals are required for a trial, the background of each approval and the process for obtaining them.

  20. Generation and monitoring of discrete stable random processes using multiple immigration population models

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, J O; Hopcraft, K I; Jakeman, E [Applied Mathematics Division, School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD (United Kingdom)

    2003-11-21

    Some properties of classical population processes that comprise births, deaths and multiple immigrations are investigated. The rates at which the immigrants arrive can be tailored to produce a population whose steady state fluctuations are described by a pre-selected distribution. Attention is focused on the class of distributions with a discrete stable law, which have power-law tails and whose moments and autocorrelation function do not exist. The separate problem of monitoring and characterizing the fluctuations is studied, analysing the statistics of individuals that leave the population. The fluctuations in the size of the population are transferred to the times between emigrants that form an intermittent time series of events. The emigrants are counted with a detector of finite dynamic range and response time. This is modelled through clipping the time series or saturating it at an arbitrary but finite level, whereupon its moments and correlation properties become finite. Distributions for the time to the first counted event and for the time between events exhibit power-law regimes that are characteristic of the fluctuations in population size. The processes provide analytical models with which properties of complex discrete random phenomena can be explored, and in addition provide generic means by which random time series encompassing a wide range of intermittent and other discrete random behaviour may be generated.

  1. Generation and monitoring of discrete stable random processes using multiple immigration population models

    International Nuclear Information System (INIS)

    Matthews, J O; Hopcraft, K I; Jakeman, E

    2003-01-01

    Some properties of classical population processes that comprise births, deaths and multiple immigrations are investigated. The rates at which the immigrants arrive can be tailored to produce a population whose steady state fluctuations are described by a pre-selected distribution. Attention is focused on the class of distributions with a discrete stable law, which have power-law tails and whose moments and autocorrelation function do not exist. The separate problem of monitoring and characterizing the fluctuations is studied, analysing the statistics of individuals that leave the population. The fluctuations in the size of the population are transferred to the times between emigrants that form an intermittent time series of events. The emigrants are counted with a detector of finite dynamic range and response time. This is modelled through clipping the time series or saturating it at an arbitrary but finite level, whereupon its moments and correlation properties become finite. Distributions for the time to the first counted event and for the time between events exhibit power-law regimes that are characteristic of the fluctuations in population size. The processes provide analytical models with which properties of complex discrete random phenomena can be explored, and in addition provide generic means by which random time series encompassing a wide range of intermittent and other discrete random behaviour may be generated
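
    The following is a minimal Gillespie-style sketch of a birth-death-multiple-immigration population of the kind described above, recording the times at which individuals leave ("emigrants") and imposing a finite dynamic range by clipping. The immigration rates used here are a simple geometric series chosen for illustration, not the tailored rates that produce a discrete stable law.

```python
import numpy as np

# Minimal Gillespie-style sketch of a birth-death-multiple-immigration process.
# Assumptions: illustrative rates; groups of size m arrive with rate nu[m-1] chosen as a
# geometric series rather than the tailored rates that yield discrete stable fluctuations.
rng = np.random.default_rng(1)
birth, death = 0.5, 1.0
group_sizes = np.arange(1, 11)
nu = 0.3 * 0.5 ** (group_sizes - 1)

n, t, t_end = 0, 0.0, 5000.0
emigrant_times = []                           # times of individual departures ("emigrants")
while t < t_end:
    rates = np.concatenate(([birth * n, death * n], nu))
    total = rates.sum()
    t += rng.exponential(1.0 / total)
    event = rng.choice(len(rates), p=rates / total)
    if event == 0:
        n += 1
    elif event == 1:
        n -= 1
        emigrant_times.append(t)
    else:
        n += group_sizes[event - 2]

waits = np.diff(emigrant_times)
clipped = np.minimum(waits, np.quantile(waits, 0.99))   # crude finite dynamic range via clipping
print(len(emigrant_times), float(clipped.mean()))
```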

  2. Fatigue life of drilling bit bearings under arbitrary random loads

    Energy Technology Data Exchange (ETDEWEB)

    Talimi, M.; Farshidi, R. [Calgary Univ., AB (Canada)

    2009-07-01

    A fatigue analysis was conducted in order to estimate the bearing life of a roller cone rock bit under arbitrary random loads. The aim of the study was to reduce bearing failures that can interrupt well operations. Fatigue was considered as the main reason for bearing failure. The expected value of cumulative fatigue damage was used to estimate bearing life. An equation was used to express the relation between bearing life and bearing load when the bearing was subjected to a steady load and constant speed. The Palmgren-Miner hypothesis was used to determine the ultimate tensile strength of the material. The rain flow counting principle was used to determine distinct amplitude cycles. Hertzian equations were used to determine maximum stress loads. Fourier series were used to obtain simple harmonic functions for estimating stress-life relations. It was concluded that the method can be used during the well planning phase to prevent bearing failures. 6 refs.

  3. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  4. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
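
    The random under-sampling step mentioned above can be illustrated with a short sketch: the majority (negative) class is down-sampled to the size of the minority class before training. The arrays, imbalance ratio and 1:1 target are illustrative assumptions, not the paper's actual datasets.

```python
import numpy as np

# Sketch of random under-sampling of the majority (negative) class before model training.
# The synthetic dataset and the 1:1 target ratio are illustrative assumptions.
def random_under_sample(X, y, rng=np.random.default_rng(0)):
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    keep_neg = rng.choice(neg, size=len(pos), replace=False)   # match the minority count
    idx = rng.permutation(np.concatenate([pos, keep_neg]))
    return X[idx], y[idx]

X = np.random.rand(1000, 20)
y = (np.random.rand(1000) < 0.05).astype(int)       # highly imbalanced labels
Xb, yb = random_under_sample(X, y)
print(np.bincount(yb))                              # roughly balanced classes
```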

  5. Randomly transitional phenomena in the system governed by Duffing's equation

    International Nuclear Information System (INIS)

    Ueda, Yoshisuke.

    1978-06-01

    This paper deals with turbulent or chaotic phenomena which occur in the system governed by Duffing's equation, a special type of 2-dimensional periodic system. By using analog and digital computers, experiments are undertaken with special reference to the changes of attractors and of average power spectra of the random processes under variation of the system parameters. On the basis of the experimental results, an outline of the random process is made clear. The results obtained in this paper will be applied to phenomena of the same kind which occur in 3-dimensional autonomous systems. (author)
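
    A minimal numerical sketch of the class of system discussed above is the forced Duffing oscillator below, sampled stroboscopically once per forcing period; the damping and forcing values are illustrative choices in a chaotic regime rather than parameters taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sketch of the forced Duffing oscillator x'' + k x' + x**3 = B cos(t).
# k and B are illustrative values in a chaotic regime, not taken from the paper.
k, B = 0.1, 12.0

def duffing(t, state):
    x, v = state
    return [v, -k * v - x ** 3 + B * np.cos(t)]

# Sample the trajectory once per forcing period (a stroboscopic / Poincare section);
# for chaotic parameters the sampled points scatter over a strange attractor.
strobe = np.arange(0.0, 400.0, 2.0 * np.pi)
sol = solve_ivp(duffing, (0.0, 400.0), [1.0, 0.0], t_eval=strobe, max_step=0.01)
print(sol.y[:, -5:].T)
```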

  6. Characteristics of the probability function for three random-walk models of reaction--diffusion processes

    International Nuclear Information System (INIS)

    Musho, M.K.; Kozak, J.J.

    1984-01-01

    A method is presented for calculating exactly the relative width (σ²)^(1/2)/⟨n⟩, the skewness γ₁, and the kurtosis γ₂ characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere [P.A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)], is based on the use of group theoretic arguments within the framework of the theory of finite Markov processes.
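
    For orientation, the sketch below estimates the same three statistics by brute-force Monte Carlo for the first model (an unbiased nearest-neighbour walk on a periodic lattice with a single deep trap). The lattice size and sample count are arbitrary, and the paper itself obtains these quantities exactly rather than by simulation.

```python
import numpy as np

# Monte Carlo sketch: unbiased nearest-neighbour walk on a periodic 2D lattice with one trap.
# Lattice size and sample count are illustrative; the paper computes these moments exactly
# with group-theoretic / finite Markov-chain methods, not by sampling.
rng = np.random.default_rng(2)
L, trials = 11, 5_000
moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
walk_lengths = np.empty(trials)

for i in range(trials):
    x, y = 0, 0
    while (x, y) == (0, 0):                       # random start site, excluding the trap
        x, y = rng.integers(L), rng.integers(L)
    n = 0
    while (x, y) != (0, 0):                       # trap sits at the origin
        dx, dy = moves[rng.integers(4)]
        x, y, n = (x + dx) % L, (y + dy) % L, n + 1
    walk_lengths[i] = n

m, s = walk_lengths.mean(), walk_lengths.std()
gamma1 = ((walk_lengths - m) ** 3).mean() / s ** 3
gamma2 = ((walk_lengths - m) ** 4).mean() / s ** 4 - 3.0
print(f"<n> = {m:.1f}, relative width = {s / m:.2f}, gamma1 = {gamma1:.2f}, gamma2 = {gamma2:.2f}")
```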

  7. Interplay of Determinism and Randomness: From Irreversibility to Chaos, Fractals, and Stochasticity

    Science.gov (United States)

    Tsonis, A.

    2017-12-01

    We will start our discussion into randomness by looking exclusively at our formal mathematical system to show that even in this pure and strictly logical system one cannot do away with randomness. By employing simple mathematical models, we will identify the three possible sources of randomness: randomness due to inability to find the rules (irreversibility), randomness due to inability to have infinite power (chaos), and randomness due to stochastic processes. Subsequently we will move from the mathematical system to our physical world to show that randomness, through the quantum mechanical character of small scales, through chaos, and because of the second law of thermodynamics, is an intrinsic property of nature as well. We will subsequently argue that the randomness in the physical world is consistent with the three sources of randomness suggested from the study of simple mathematical systems. Many examples ranging from purely mathematical to natural processes will be presented, which clearly demonstrate how the combination of rules and randomness produces the world we live in. Finally, the principle of least effort or the principle of minimum energy consumption will be suggested as the underlying principle behind this symbiosis between determinism and randomness.

  8. Navigation by anomalous random walks on complex networks.

    Science.gov (United States)

    Weng, Tongfeng; Zhang, Jie; Khajehnejad, Moein; Small, Michael; Zheng, Rui; Hui, Pan

    2016-11-23

    Anomalous random walks having long-range jumps are a critical branch of dynamical processes on networks, which can model a number of search and transport processes. However, traditional measurements based on mean first passage time are not useful, as they fail to characterize the cost associated with each jump. Here we introduce a new concept, the mean first traverse distance (MFTD), to characterize anomalous random walks; it represents the expected traverse distance taken by walkers searching from a source node to a target node, and we provide a procedure for calculating the MFTD between two nodes. We use Lévy walks on networks as an example, and demonstrate that the proposed approach can unravel the interplay between the diffusion dynamics of Lévy walks and the underlying network structure. Moreover, applying our framework to the famous PageRank search, we show how it informs the optimality of the PageRank search. The framework for analyzing anomalous random walks on complex networks offers a useful new paradigm to understand the dynamics of anomalous diffusion processes, and provides a unified scheme to characterize search and transport processes on networks.
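
    A crude way to see what an MFTD-type quantity measures is to simulate it directly, as in the sketch below: a walker jumps between nodes with probability proportional to d(i,j)^(-α), and the graph distance covered until it first reaches the target is averaged. The graph model, jump kernel and parameters are assumptions for illustration; the paper derives the MFTD analytically rather than by simulation.

```python
import numpy as np
import networkx as nx

# Monte Carlo sketch of a "mean first traverse distance" style quantity: the expected total
# graph distance covered before a Levy-type walker first hits the target node.
# The jump kernel p(j|i) ~ d(i,j)**(-alpha), the graph model, and all parameters are assumptions.
rng = np.random.default_rng(3)
G = nx.barabasi_albert_graph(100, 3, seed=3)              # connected scale-free toy network
dist = dict(nx.all_pairs_shortest_path_length(G))
alpha, source, target, trials = 2.0, 0, 1, 200

def one_traverse():
    node, travelled = source, 0
    while node != target:
        others = [j for j in G.nodes if j != node]
        w = np.array([float(dist[node][j]) ** (-alpha) for j in others])
        nxt = rng.choice(others, p=w / w.sum())            # long-range jump with distance-decaying weight
        travelled += dist[node][nxt]
        node = int(nxt)
    return travelled

print("estimated MFTD-like cost:", np.mean([one_traverse() for _ in range(trials)]))
```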

  9. Navigation by anomalous random walks on complex networks

    Science.gov (United States)

    Weng, Tongfeng; Zhang, Jie; Khajehnejad, Moein; Small, Michael; Zheng, Rui; Hui, Pan

    2016-11-01

    Anomalous random walks having long-range jumps are a critical branch of dynamical processes on networks, which can model a number of search and transport processes. However, traditional measurements based on mean first passage time are not useful, as they fail to characterize the cost associated with each jump. Here we introduce a new concept, the mean first traverse distance (MFTD), to characterize anomalous random walks; it represents the expected traverse distance taken by walkers searching from a source node to a target node, and we provide a procedure for calculating the MFTD between two nodes. We use Lévy walks on networks as an example, and demonstrate that the proposed approach can unravel the interplay between the diffusion dynamics of Lévy walks and the underlying network structure. Moreover, applying our framework to the famous PageRank search, we show how it informs the optimality of the PageRank search. The framework for analyzing anomalous random walks on complex networks offers a useful new paradigm to understand the dynamics of anomalous diffusion processes, and provides a unified scheme to characterize search and transport processes on networks.

  10. The emergence of typical entanglement in two-party random processes

    International Nuclear Information System (INIS)

    Dahlsten, O C O; Oliveira, R; Plenio, M B

    2007-01-01

    We investigate the entanglement within a system undergoing a random, local process. We find that there is initially a phase of very fast generation and spread of entanglement. At the end of this phase the entanglement is typically maximal. In Oliveira et al (2007 Phys. Rev. Lett. 98 130502) we proved that the maximal entanglement is reached to a fixed arbitrary accuracy within O(N³) steps, where N is the total number of qubits. Here we provide a detailed and more pedagogical proof. We demonstrate that one can use the so-called stabilizer gates to simulate this process efficiently on a classical computer. Furthermore, we discuss three ways of identifying the transition from the phase of rapid spread of entanglement to the stationary phase: (i) the time when saturation of the maximal entanglement is achieved, (ii) the cutoff moment, when the entanglement probability distribution is practically stationary, and (iii) the moment block entanglement exhibits volume scaling. We furthermore investigate the mixed state and multipartite setting. Numerically, we find that the mutual information appears to behave similarly to the quantum correlations and that there is a well-behaved phase-space flow of entanglement properties towards an equilibrium. We describe how the emergence of typical entanglement can be used to create a much simpler tripartite entanglement description. The results form a bridge between certain abstract results concerning typical (also known as generic) entanglement relative to an unbiased distribution on pure states and the more physical picture of distributions emerging from random local interactions.
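
    The growth of bipartite entanglement under random local interactions can be reproduced qualitatively with a small state-vector simulation, sketched below with Haar-random two-qubit gates acting on randomly chosen pairs. System size and step count are arbitrary, and no attempt is made to reproduce the stabilizer-circuit construction used in the paper.

```python
import numpy as np

# Toy simulation of entanglement growth under random two-qubit unitaries on random pairs.
# Haar gates are drawn via QR of a Ginibre matrix; N and the step count are illustrative.
rng = np.random.default_rng(8)
N = 8
psi = np.zeros(2 ** N, dtype=complex)
psi[0] = 1.0                                       # start in |0...0>

def haar_unitary(d):
    z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # fix column phases for Haar measure

def apply_two_qubit(psi, U, i, j):
    t = np.moveaxis(psi.reshape((2,) * N), (i, j), (0, 1)).reshape(4, -1)
    t = (U @ t).reshape((2, 2) + (2,) * (N - 2))
    return np.moveaxis(t, (0, 1), (i, j)).reshape(-1)

def half_chain_entropy(psi):
    s = np.linalg.svd(psi.reshape(2 ** (N // 2), -1), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

for step in range(200):
    i, j = rng.choice(N, size=2, replace=False)
    psi = apply_two_qubit(psi, haar_unitary(4), int(i), int(j))
    if step % 50 == 49:
        print(step + 1, round(half_chain_entropy(psi), 3))   # entropy saturates near N/2 - O(1)
```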

  11. Effects of improved sanitation on diarrheal reduction for children under five in Idiofa, DR Congo: a cluster randomized trial.

    Science.gov (United States)

    Cha, Seungman; Lee, JaeEun; Seo, DongSik; Park, Byoung Mann; Mansiangi, Paul; Bernard, Kabore; Mulakub-Yazho, Guy Jerome Nkay; Famasulu, Honore Minka

    2017-09-19

    The lack of safe water and sanitation contributes to the rampancy of diarrhea in many developing countries. This study describes the design of a cluster-randomized trial in Idiofa, the Democratic Republic of the Congo, seeking evidence of the impact of improved sanitation on diarrhea for children under four. Of the 276 quartiers, 18 quartiers were randomly allocated to the intervention or control arm. Seven hundred and twenty households were sampled and the youngest under-four child in each household was registered for this study. The primary endpoint of the study is diarrheal incidence, prevalence and duration in children under five. Material subsidies will be provided only to the households who complete pit digging plus superstructure and roof construction, regardless of their income level. This study employs a Sanitation Calendar so that the mother of each household can record the diarrheal episodes of her under-four child on a daily basis. The diary enables examination of the effect of the sanitation intervention on diarrhea duration and also resolves the limitation of the small number of clusters in the trial. In addition, the project will be monitored through the 'Sanitation Map', on which all households in the study area, including both the control and intervention arms, are registered. To avoid information bias or courtesy bias, photos will be taken of the latrine during the household visit, and a supervisor will determine well-equipped latrine uptake based on the photos. This reduces the possibility of recall bias and under- or over-estimation of diarrhea, which was the main limitation of previous studies. The study was approved by the Institutional Review Board of the School of Public Health, Kinshasa University (ESP/CE/040/15; April 13, 2015) and registered as an International Standard Randomized Controlled Trial (ISRCTN 10419317) on March 13, 2015.

  12. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  13. Permutational distribution of the log-rank statistic under random censorship with applications to carcinogenicity assays.

    Science.gov (United States)

    Heimann, G; Neuhaus, G

    1998-03-01

    In the random censorship model, the log-rank test is often used for comparing a control group with different dose groups. If the number of tumors is small, so-called exact methods are often applied for computing critical values from a permutational distribution. Two of these exact methods are discussed and shown to be incorrect. The correct permutational distribution is derived and studied with respect to its behavior under unequal censoring in the light of recent results proving that the permutational version and the unconditional version of the log-rank test are asymptotically equivalent even under unequal censoring. The log-rank test is studied by simulations of a realistic scenario from a bioassay with small numbers of tumors.
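
    A permutation reference distribution for the log-rank statistic under random censorship can be sketched as below, by shuffling group labels over the observed (time, status) pairs. The data-generating distributions are illustrative assumptions, and the lifelines package is assumed to be available for the log-rank computation.

```python
import numpy as np
from lifelines.statistics import logrank_test

# Sketch of a permutation-based reference distribution for the log-rank statistic under
# random censorship. Exponential event and censoring times are illustrative assumptions.
rng = np.random.default_rng(9)
n = 40
group = np.repeat([0, 1], n // 2)
event_time = rng.exponential(scale=np.where(group == 0, 10.0, 14.0))
censor_time = rng.exponential(scale=12.0, size=n)          # censoring may differ between arms in practice
time = np.minimum(event_time, censor_time)
observed = (event_time <= censor_time).astype(int)

def stat(labels):
    a, b = labels == 0, labels == 1
    return logrank_test(time[a], time[b], observed[a], observed[b]).test_statistic

obs = stat(group)
perm = np.array([stat(rng.permutation(group)) for _ in range(1000)])   # permute labels over pairs
print("permutation p-value:", float(np.mean(perm >= obs)))
```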

  14. Stochastic rocket dynamics under random nozzle side loads: Ornstein-Uhlenbeck boundary layer separation and its coarse grained connection to side loading and rocket response

    Energy Technology Data Exchange (ETDEWEB)

    Keanini, R.G.; Srivastava, N.; Tkacik, P.T. [Department of Mechanical Engineering, University of North Carolina at Charlotte, 9201 University City Blvd., Charlotte, NC 28078 (United States); Weggel, D.C. [Department of Civil and Environmental Engineering, University of North Carolina at Charlotte, 9201 University City Blvd., Charlotte, NC 28078 (United States); Knight, P.D. [Mitchell Aerospace and Engineering, Statesville, North Carolina 28677 (United States)

    2011-06-15

    A long-standing, though ill-understood problem in rocket dynamics, rocket response to random, altitude-dependent nozzle side-loads, is investigated. Side loads arise during low altitude flight due to random, asymmetric, shock-induced separation of in-nozzle boundary layers. In this paper, stochastic evolution of the in-nozzle boundary layer separation line, an essential feature underlying side load generation, is connected to random, altitude-dependent rotational and translational rocket response via a set of simple analytical models. Separation line motion, extant on a fast boundary layer time scale, is modeled as an Ornstein-Uhlenbeck process. Pitch and yaw responses, taking place on a long, rocket dynamics time scale, are shown to likewise evolve as OU processes. Stochastic, altitude-dependent rocket translational motion follows from linear, asymptotic versions of the full nonlinear equations of motion; the model is valid in the practical limit where random pitch, yaw, and roll rates all remain small. Computed altitude-dependent rotational and translational velocity and displacement statistics are compared against those obtained using recently reported high fidelity simulations [Srivastava, Tkacik, and Keanini, J. Appl. Phys. 108, 044911 (2010)]; in every case, reasonable agreement is observed. As an important prelude, evidence indicating the physical consistency of the model introduced in the above article is first presented: it is shown that the study's separation line model allows direct derivation of experimentally observed side load amplitude and direction densities. Finally, it is found that the analytical models proposed in this paper allow straightforward identification of practical approaches for: (i) reducing pitch/yaw response to side loads, and (ii) enhancing pitch/yaw damping once side loads cease. (Copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
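
    Since the separation-line motion above is modelled as an Ornstein-Uhlenbeck process, a minimal Euler-Maruyama sketch (with illustrative θ and σ values) shows the kind of stationary statistics such a model produces.

```python
import numpy as np

# Euler-Maruyama sketch of an Ornstein-Uhlenbeck process, the model used above for the
# boundary-layer separation-line motion. theta, sigma, dt and x0 are illustrative values.
rng = np.random.default_rng(4)
theta, sigma, dt, n = 2.0, 0.5, 1e-3, 50_000

x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(dt), n - 1)
for k in range(n - 1):
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * noise[k]

# Stationary statistics should approach mean 0 and variance sigma**2 / (2 * theta)
print(x[n // 2:].mean(), x[n // 2:].var(), sigma ** 2 / (2 * theta))
```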

  15. Quantum random number generator based on quantum tunneling effect

    OpenAIRE

    Zhou, Haihan; Li, Junlin; Pan, Dong; Zhang, Weixing; Long, Guilu

    2017-01-01

    In this paper, we propose an experimental implementation of a quantum random number generator (QRNG) based on the inherent randomness of the quantum tunneling effect of electrons. We exploited InGaAs/InP diodes, whose valence band and conduction band share a quasi-constant energy barrier. We applied a bias voltage to the InGaAs/InP avalanche diode, which made the diode work in Geiger mode, and triggered the tunneling events with a periodic pulse. Finally, after data collection and post-processing, our...

  16. The behaviour of random forest permutation-based variable importance measures under predictor correlation.

    Science.gov (United States)

    Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas

    2010-02-27

    Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and was unbiased under H0. Scaled VIMs were clearly biased under HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increased VIMs for correlated predictors may be considered a "bias" - because they do not directly reflect the coefficients in the generating model - or if it is a beneficial attribute of these VIMs is dependent on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.
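
    The qualitative setting of the simulation study, correlated predictors that are associated with the outcome, can be reproduced with a short sketch. Here scikit-learn's permutation_importance on the training data is used as a stand-in for the RF permutation VIM analysed in the paper, and all effect sizes, correlation levels and forest settings are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Sketch of permutation importance with a correlated predictor block (x0, x1) where only x0
# enters the generating model; x2 is an independent signal and x3 is pure noise.
rng = np.random.default_rng(5)
n = 1000
z = rng.normal(size=n)
X = np.column_stack([
    z + 0.3 * rng.normal(size=n),     # x0: in the model, correlated with x1
    z + 0.3 * rng.normal(size=n),     # x1: not in the model, but correlated with x0
    rng.normal(size=n),               # x2: independent predictor in the model
    rng.normal(size=n),               # x3: pure noise
])
y = X[:, 0] + X[:, 2] + rng.normal(scale=0.5, size=n)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
print(np.round(imp.importances_mean, 3))   # note the share attributed to the correlated block
```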

  17. MINIMUM ENTROPY DECONVOLUTION OF ONE-AND MULTI-DIMENSIONAL NON-GAUSSIAN LINEAR RANDOM PROCESSES

    Institute of Scientific and Technical Information of China (English)

    程乾生

    1990-01-01

    The minimum entropy deconvolution is considered as one of the methods for decomposing non-Gaussian linear processes. The concept of peakedness of a system response sequence is presented and its properties are studied. With the aid of the peakedness, the convergence theory of the minimum entropy deconvolution is established. The problem of the minimum entropy deconvolution of multi-dimensional non-Gaussian linear random processes is first investigated and the corresponding theory is given. In addition, the relation between the minimum entropy deconvolution and parameter method is discussed.

  18. Large deviations and mixing for dissipative PDEs with unbounded random kicks

    Science.gov (United States)

    Jakšić, V.; Nersesyan, V.; Pillet, C.-A.; Shirikyan, A.

    2018-02-01

    We study the problem of exponential mixing and large deviations for discrete-time Markov processes associated with a class of random dynamical systems. Under some dissipativity and regularisation hypotheses for the underlying deterministic dynamics and a non-degeneracy condition for the driving random force, we discuss the existence and uniqueness of a stationary measure and its exponential stability in the Kantorovich-Wasserstein metric. We next turn to the large deviations principle (LDP) and establish its validity for the occupation measures of the Markov processes in question. The proof is based on Kifer’s criterion for non-compact spaces, a result on large-time asymptotics for generalised Markov semigroup, and a coupling argument. These tools combined together constitute a new approach to LDP for infinite-dimensional processes without strong Feller property in a non-compact space. The results obtained can be applied to the two-dimensional Navier-Stokes system in a bounded domain and to the complex Ginzburg-Landau equation.

  19. Dispute settlement process under GATT/WTO diplomatic or judicial ...

    African Journals Online (AJOL)

    This paper probes the mechanisms of the dispute resolution process under the World Trade Organisation (WTO) and the General Agreement on Tariffs and Trade (GATT). It tries to analyse the evolution of the dispute process, which was initially based on diplomatic procedures, and gives an account of its evolution and ...

  20. Damage Detection in Bridge Structure Using Vibration Data under Random Travelling Vehicle Loads

    International Nuclear Information System (INIS)

    Loh, C H; Hung, T Y; Chen, S F; Hsu, W T

    2015-01-01

    Due to the random nature of road excitation and the inherent uncertainties in the bridge-vehicle system, damage identification of bridge structures through continuous monitoring under operating conditions becomes a challenging problem. Methods for system identification and damage detection of a continuous two-span concrete bridge structure in the time domain are presented, using the interaction forces from random moving vehicles as excitation. The signals recorded at different locations of the instrumented bridge are mixed with signals from different internal and external (road roughness) vibration sources. The damaged structure is modelled as a stiffness reduction in one of the beam elements. For the purpose of system identification and damage detection, three different output-only modal analysis techniques are proposed: covariance-driven stochastic subspace identification (SSI-COV), a blind source separation algorithm (Second Order Blind Identification), and the multivariate AR model. The advantages and disadvantages of the three algorithms are discussed. Finally, the null-space damage index, subspace damage indices and mode shape slope change are used to detect and locate the damage. The proposed approaches have been tested in simulation and proved to be effective for structural health monitoring. (paper)

  1. An Artificial Bee Colony Algorithm for the Job Shop Scheduling Problem with Random Processing Times

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2011-09-01

    Full Text Available Due to the influence of unpredictable random events, the processing time of each operation should be treated as a random variable if we aim at a robust production schedule. However, compared with the extensive research on the deterministic model, the stochastic job shop scheduling problem (SJSSP) has not received sufficient attention. In this paper, we propose an artificial bee colony (ABC) algorithm for the SJSSP with the objective of minimizing the maximum lateness (which is an index of service quality). First, we propose a performance estimate for preliminary screening of the candidate solutions. Then, the K-armed bandit model is utilized for reducing the computational burden in the exact evaluation (through Monte Carlo simulation) process. Finally, the computational results on different-scale test problems validate the effectiveness and efficiency of the proposed approach.
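
    The Monte Carlo evaluation step referred to above can be illustrated, in a much-simplified single-machine setting, by estimating the expected maximum lateness of a fixed job sequence under random processing times. The distributions, due dates and sequence below are assumptions; the paper treats the full job-shop problem with an ABC search built on top of this kind of evaluation.

```python
import numpy as np

# Much-simplified illustration of Monte Carlo schedule evaluation: expected maximum lateness
# of a fixed job sequence on a single machine with random processing times.
rng = np.random.default_rng(7)
mean_proc = np.array([4.0, 6.0, 3.0, 5.0])        # illustrative mean processing times
due = np.array([8.0, 15.0, 18.0, 25.0])           # illustrative due dates
sequence = [0, 1, 2, 3]                           # candidate schedule to evaluate

def max_lateness(sample):
    completion = np.cumsum(sample[sequence])
    return np.max(completion - due[sequence])

samples = rng.gamma(shape=4.0, scale=mean_proc / 4.0, size=(10_000, len(mean_proc)))
print("estimated E[max lateness]:", np.mean([max_lateness(s) for s in samples]))
```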

  2. Optical image encryption based on interference under convergent random illumination

    International Nuclear Information System (INIS)

    Kumar, Pramod; Joseph, Joby; Singh, Kehar

    2010-01-01

    In an optical image encryption system based on the interference principle, two pure phase masks are designed analytically to hide an image. These two masks are illuminated with a plane wavefront to retrieve the original image in the form of an interference pattern at the decryption plane. Replacement of the plane wavefront with convergent random illumination in the proposed scheme leads to an improvement in the security of interference based encryption. The proposed encryption scheme retains the simplicity of an interference based method, as the two pure masks are generated with an analytical method without any iterative algorithm. In addition to the free-space propagation distance and the two pure phase masks, the convergence distance and the randomized lens phase function are two new encryption parameters to enhance the system security. The robustness of this scheme against occlusion of the random phase mask of the randomized lens phase function is investigated. The feasibility of the proposed scheme is demonstrated with numerical simulation results

  3. Gaussian Mixture Random Coefficient model based framework for SHM in structures with time-dependent dynamics under uncertainty

    Science.gov (United States)

    Avendaño-Valencia, Luis David; Fassois, Spilios D.

    2017-12-01

    The problem of vibration-based damage diagnosis in structures characterized by time-dependent dynamics under significant environmental and/or operational uncertainty is considered. A stochastic framework consisting of a Gaussian Mixture Random Coefficient model of the uncertain time-dependent dynamics under each structural health state, proper estimation methods, and Bayesian or minimum distance type decision making, is postulated. The Random Coefficient (RC) time-dependent stochastic model with coefficients following a multivariate Gaussian Mixture Model (GMM) allows for significant flexibility in uncertainty representation. Certain of the model parameters are estimated via a simple procedure which is founded on the related Multiple Model (MM) concept, while the GMM weights are explicitly estimated for optimizing damage diagnostic performance. The postulated framework is demonstrated via damage detection in a simple simulated model of a quarter-car active suspension with time-dependent dynamics and considerable uncertainty on the payload. Comparisons with a simpler Gaussian RC model based method are also presented, with the postulated framework shown to be capable of offering considerable improvement in diagnostic performance.

  4. Interspinous process device versus standard conventional surgical decompression for lumbar spinal stenosis: Randomized controlled trial

    NARCIS (Netherlands)

    W.A. Moojen (Wouter); M.P. Arts (Mark); W.C.H. Jacobs (Wilco); E.W. van Zwet (Erik); M.E. van den Akker-van Marle (Elske); B.W. Koes (Bart); C.L.A.M. Vleggeert-Lankamp (Carmen); W.C. Peul (Wilco)

    2013-01-01

    Objective: To assess whether interspinous process device implantation is more effective in the short term than conventional surgical decompression for patients with intermittent neurogenic claudication due to lumbar spinal stenosis. Design: Randomized controlled trial.

  5. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  6. Concentration processes under tubesheet sludge piles in nuclear steam generators

    International Nuclear Information System (INIS)

    Gonzalez, F.; Spekkens, P.

    1987-01-01

    The process by which bulk water solutes are concentrated under tubesheet sludge piles in nuclear steam generators was investigated in the laboratory under simulated CANDU operating conditions. Concentration rates were found to depend on the tube heat flux and pile depth, although beyond a critical depth the concentration efficiency decreased. This efficiency could be expressed by a concentration coefficient, and was found to depend also on the sludge pile porosity. Solute concentration profiles in the sludge pile suggested that the concentration mechanism in a high-porosity/permeability pile is characterized by boiling mainly near or at the tube surface, while in low-porosity piles, the change of phase may also become important in the body of the sludge pile. In all cases, the full depth of the pile was active to some extent in the concentration process. As long as the heat transfer under the pile was continued, the solute remained under the pile and slowly migrated toward the bottom. When the heat transfer was stopped, the solute diffused back into the bulk solution at a rate slower than that of the concentration process

  7. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

    In this report the mathematical theory of random-point processes is reviewed and it is shown how the theory can be used to obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D), energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line ''learning'' of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori.

  8. Portfolio Selection with Jumps under Regime Switching

    Directory of Open Access Journals (Sweden)

    Lin Zhao

    2010-01-01

    Full Text Available We investigate a continuous-time version of the mean-variance portfolio selection model with jumps under regime switching. The portfolio selection is proposed and analyzed for a market consisting of one bank account and multiple stocks. The random regime switching is assumed to be independent of the underlying Brownian motion and jump processes. A Markov chain modulated diffusion formulation is employed to model the problem.

  9. A prospective randomized trial of content expertise versus process expertise in small group teaching.

    Science.gov (United States)

    Peets, Adam D; Cooke, Lara; Wright, Bruce; Coderre, Sylvain; McLaughlin, Kevin

    2010-10-14

    Effective teaching requires an understanding of both what (content knowledge) and how (process knowledge) to teach. While previous studies involving medical students have compared preceptors with greater or lesser content knowledge, it is unclear whether process expertise can compensate for deficient content expertise. Therefore, the objective of our study was to compare the effect of preceptors with process expertise to those with content expertise on medical students' learning outcomes in a structured small group environment. One hundred and fifty-one first year medical students were randomized to 11 groups for the small group component of the Cardiovascular-Respiratory course at the University of Calgary. Each group was then block randomized to one of three streams for the entire course: tutoring exclusively by physicians with content expertise (n = 5), tutoring exclusively by physicians with process expertise (n = 3), and tutoring by content experts for 11 sessions and process experts for 10 sessions (n = 3). After each of the 21 small group sessions, students evaluated their preceptors' teaching with a standardized instrument. Students' knowledge acquisition was assessed by an end-of-course multiple choice (EOC-MCQ) examination. Students rated the process experts significantly higher on each of the instrument's 15 items, including the overall rating. Students' mean score (±SD) on the EOC-MCQ exam was 76.1% (8.1) for groups taught by content experts, 78.2% (7.8) for the combination group and 79.5% (9.2) for process expert groups (p = 0.11). By linear regression, student performance was higher if they had been taught by process experts (regression coefficient 2.7 [0.1, 5.4]). Content expertise is not a prerequisite to teach first year medical students within a structured small group environment; preceptors with process expertise result in at least equivalent, if not superior, student outcomes in this setting.

  10. [Scientific connotation of processing Bombyx Batryticatus under high temperature].

    Science.gov (United States)

    Ma, Li; Wang, Xuan; Ma, Lin; Wang, Man-yuan; Qiu, Feng

    2015-12-01

    The aim of this study was to elucidate the scientific connotation of processing Bombyx Batryticatus with wheat bran under high temperature. The contents of soluble protein extracted from Bombyx Batryticatus and its processed products, and the limit contents of aflatoxins (AFT) in the crude and processed drug, were compared. Protein concentration was measured with the Bradford method, and the differences in protein composition between Bombyx Batryticatus and its processed products were compared by SDS-PAGE analysis. Aflatoxins B1, B2, G1, and G2 were determined by reversed-phase HPLC. The results showed that the soluble protein contents of Bombyx Batryticatus and its processed products were (47.065 +/- 0.249) and (29.756 +/- 1.961) mg x g(-1), respectively. Protein gel electrophoresis showed no significant differences between the crude and processed drug in protein varieties; 6 bands were detected: 31.90, 26.80, 18.71, 15.00, 10.18, and 8.929 kDa. Below 10 kDa, the bands of the processed product were deeper in colour than those of the crude drug, indicating that macromolecular protein was degraded into smaller molecules. The contents of AFG1, AFB1, AFG2 and AFB2 were 0.382, 0.207, 0.223 and 0.073 microg x kg(-1), respectively, which did not exceed the 5 microg x kg(-1) limit, while none were detected in the processed product. Through processing with wheat bran under high temperature, the content of soluble protein in Bombyx Batryticatus decreased and the processing purpose of alleviating the drug property was achieved. Meanwhile, the aflatoxin content was reduced or removed by the processing procedure or absorbed by the processing auxiliary material, adding to the safety of the traditional Chinese medicine. In conclusion, as a traditional processing method, bran-frying of Bombyx Batryticatus is scientific and reasonable.

  11. Fuel corrosion processes under waste disposal conditions

    International Nuclear Information System (INIS)

    Shoesmith, D.W.

    2000-01-01

    The release of the majority of radionuclides from spent nuclear fuel under permanent disposal conditions will be controlled by the rate of dissolution of the UO2 fuel matrix. In this manuscript the mechanism of the coupled anodic (fuel dissolution) and cathodic (oxidant reduction) reactions which constitute the overall fuel corrosion process is reviewed, and the many published observations on fuel corrosion under disposal conditions are discussed. The primary emphasis is on summarizing the overall mechanistic behaviour and establishing the primary factors likely to control fuel corrosion. Included are discussions on the influence of various oxidants including radiolytic ones, pH, temperature, groundwater composition, and the formation of corrosion product deposits. The relevance of the data recorded on unirradiated UO2 to the interpretation of spent fuel behaviour is included. Based on the review, the data used to develop fuel corrosion models under the conditions anticipated in Yucca Mountain (NV, USA) are evaluated.

  12. Nonlinear dynamic analysis of atomic force microscopy under deterministic and random excitation

    International Nuclear Information System (INIS)

    Pishkenari, Hossein Nejat; Behzad, Mehdi; Meghdari, Ali

    2008-01-01

    The atomic force microscope (AFM) system has evolved into a useful tool for direct measurements of intermolecular forces with atomic-resolution characterization that can be employed in a broad spectrum of applications. This paper is devoted to the analysis of the nonlinear behavior of the amplitude modulation (AM) and frequency modulation (FM) modes of atomic force microscopy. For this, the microcantilever (which forms the basis for the operation of the AFM) is modeled as a single mode approximation and the interaction between the sample and cantilever is derived from a van der Waals potential. Using perturbation methods such as averaging and the Fourier transform, the nonlinear equations of motion are solved analytically and the advantageous results are extracted from this nonlinear analysis. The results of the proposed techniques for AM-AFM clearly depict the existence of two stable and one unstable (saddle) solutions for some of the excitation parameters under deterministic vibration. The basins of attraction of the two stable solutions are different and depend on the excitation frequency. From this analysis the range of frequencies which will result in a unique periodic response can be obtained and used in practical experiments. Furthermore, the analytical responses determined by perturbation techniques can be used to detect the parameter region where chaotic motion is avoided. On the other hand, for FM-AFM, the relation between the frequency shift and the system parameters can be extracted and used for investigation of the system's nonlinear behavior. The nonlinear behavior of the oscillating tip can easily explain the observed shift of frequency as a function of tip-sample distance. Also in this paper we have investigated the AM-AFM system response under random excitation. Using two different methods we have obtained the statistical properties of the tip motion. The results show that we can use the mean square value of the tip motion to image the sample when the excitation signal is random.

  13. Nonlinear dynamic analysis of atomic force microscopy under deterministic and random excitation

    Energy Technology Data Exchange (ETDEWEB)

    Pishkenari, Hossein Nejat [Center of Excellence in Design, Robotics and Automation (CEDRA), School of Mechanical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Behzad, Mehdi [Center of Excellence in Design, Robotics and Automation (CEDRA), School of Mechanical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of)], E-mail: m_behzad@sharif.edu; Meghdari, Ali [Center of Excellence in Design, Robotics and Automation (CEDRA), School of Mechanical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of)

    2008-08-15

    The atomic force microscope (AFM) system has evolved into a useful tool for direct measurements of intermolecular forces with atomic-resolution characterization that can be employed in a broad spectrum of applications. This paper is devoted to the analysis of the nonlinear behavior of the amplitude modulation (AM) and frequency modulation (FM) modes of atomic force microscopy. For this, the microcantilever (which forms the basis for the operation of the AFM) is modeled as a single mode approximation and the interaction between the sample and cantilever is derived from a van der Waals potential. Using perturbation methods such as averaging and the Fourier transform, the nonlinear equations of motion are solved analytically and the advantageous results are extracted from this nonlinear analysis. The results of the proposed techniques for AM-AFM clearly depict the existence of two stable and one unstable (saddle) solutions for some of the excitation parameters under deterministic vibration. The basins of attraction of the two stable solutions are different and depend on the excitation frequency. From this analysis the range of frequencies which will result in a unique periodic response can be obtained and used in practical experiments. Furthermore, the analytical responses determined by perturbation techniques can be used to detect the parameter region where chaotic motion is avoided. On the other hand, for FM-AFM, the relation between the frequency shift and the system parameters can be extracted and used for investigation of the system's nonlinear behavior. The nonlinear behavior of the oscillating tip can easily explain the observed shift of frequency as a function of tip-sample distance. Also in this paper we have investigated the AM-AFM system response under random excitation. Using two different methods we have obtained the statistical properties of the tip motion. The results show that we can use the mean square value of the tip motion to image the sample when the excitation signal is random.

  14. Probabilistic Design in a Sheet Metal Stamping Process under Failure Analysis

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao, Jian; Chen, Wei; Xia, Z. Cedric

    2005-01-01

    Sheet metal stamping processes have been widely implemented in many industries due to its repeatability and productivity. In general, the simulations for a sheet metal forming process involve nonlinearity, complex material behavior and tool-material interaction. Instabilities in terms of tearing and wrinkling are major concerns in many sheet metal stamping processes. In this work, a sheet metal stamping process of a mild steel for a wheelhouse used in automobile industry is studied by using an explicit nonlinear finite element code and incorporating failure analysis (tearing and wrinkling) and design under uncertainty. Margins of tearing and wrinkling are quantitatively defined via stress-based criteria for system-level design. The forming process utilizes drawbeads instead of using the blank holder force to restrain the blank. The main parameters of interest in this work are friction conditions, drawbead configurations, sheet metal properties, and numerical errors. A robust design model is created to conduct a probabilistic design, which is made possible for this complex engineering process via an efficient uncertainty propagation technique. The method called the weighted three-point-based method estimates the statistical characteristics (mean and variance) of the responses of interest (margins of failures), and provide a systematic approach in designing a sheet metal forming process under the framework of design under uncertainty

  15. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
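
    A minimal sketch of blocked randomization with randomly selected block sizes for a two-arm trial is given below; the permitted block sizes and arm labels are illustrative, and a real trial would pre-specify and conceal them.

```python
import random

# Sketch of blocked randomization with randomly selected block sizes for a two-arm trial.
# Block sizes and arm labels are illustrative assumptions.
def blocked_randomization(n_participants, block_sizes=(4, 6, 8), arms=("A", "B"), seed=42):
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        size = rng.choice(block_sizes)            # block size chosen at random
        block = list(arms) * (size // len(arms))  # equal numbers of each arm within the block
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_participants]

seq = blocked_randomization(20)
print(seq, seq.count("A"), seq.count("B"))        # counts stay close to balanced
```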

  16. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  17. Random number generation as an index of controlled processing.

    Science.gov (United States)

    Jahanshahi, Marjan; Saleem, T; Ho, Aileen K; Dirnberger, Georg; Fuller, R

    2006-07-01

    Random number generation (RNG) is a functionally complex process that is highly controlled and therefore dependent on Baddeley's central executive. This study addresses this issue by investigating whether key predictions from this framework are compatible with empirical data. In Experiment 1, the effect of increasing task demands by increasing the rate of the paced generation was comprehensively examined. As expected, faster rates affected performance negatively because central resources were increasingly depleted. Next, the effects of participants' exposure were manipulated in Experiment 2 by providing increasing amounts of practice on the task. There was no improvement over 10 practice trials, suggesting that the high level of strategic control required by the task was constant and not amenable to any automatization gain with repeated exposure. Together, the results demonstrate that RNG performance is a highly controlled and demanding process sensitive to additional demands on central resources (Experiment 1) and is unaffected by repeated performance or practice (Experiment 2). These features render the easily administered RNG task an ideal and robust index of executive function that is highly suitable for repeated clinical use. ((c) 2006 APA, all rights reserved).

  18. An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System

    Science.gov (United States)

    Helmy, Tarek; Fatai, Anifowose; Sallam, El-Sayed

    PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency in intensive operations required by networked sensors with minimal hardware requirements. The existing process scheduling algorithms of PicOS, a commercial tiny, low-footprint, real-time operating system, have their associated drawbacks. An efficient, alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, confirmed for efficiency and fairness on the average, and recommended for implementation in PicOS. Simulations were carried out and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version over the existing ones. The results show that the randomized algorithm is the most attractive for implementation in PicOS, since it is the fairest and has the lowest AWT and ATT, on average, among the non-preemptive scheduling algorithms implemented in this paper.
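
    The comparison of a randomized selection policy against first-come-first-served on AWT and ATT can be sketched with a toy non-preemptive scheduler; the burst times and the assumption that all jobs arrive together are illustrative and do not model PicOS itself.

```python
import random

# Toy comparison of non-preemptive schedulers: FCFS versus random selection from the ready queue.
# Burst times and the single-batch arrival model are simplifying assumptions, not PicOS details.
def schedule_metrics(order, burst):
    t, wait, turnaround = 0, [], []
    for job in order:
        wait.append(t)               # time the job waited before starting
        t += burst[job]
        turnaround.append(t)         # completion time measured from the common arrival
    return sum(wait) / len(wait), sum(turnaround) / len(turnaround)

rng = random.Random(0)
burst = {i: rng.randint(1, 10) for i in range(8)}   # job -> burst time

fcfs = list(burst)                                  # arrival order
rand_order = list(burst)
rng.shuffle(rand_order)                             # randomized selection policy

print("FCFS   AWT/ATT:", schedule_metrics(fcfs, burst))
print("Random AWT/ATT:", schedule_metrics(rand_order, burst))
```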

  19. Derrida's Generalized Random Energy models; 4, Continuous state branching and coalescents

    CERN Document Server

    Bovier, A

    2003-01-01

    In this paper we conclude our analysis of Derrida's Generalized Random Energy Models (GREM) by identifying the thermodynamic limit with a one-parameter family of probability measures related to a continuous state branching process introduced by Neveu. Using a construction introduced by Bertoin and Le Gall in terms of a coherent family of subordinators related to Neveu's branching process, we show how the Gibbs geometry of the limiting Gibbs measure is given in terms of the genealogy of this process via a deterministic time-change. This construction is fully universal in that all different models (characterized by the covariance of the underlying Gaussian process) differ only through that time change, which in turn is expressed in terms of Parisi's overlap distribution. The proof uses strongly the Ghirlanda-Guerra identities that impose the structure of Neveu's process as the only possible asymptotic random mechanism.

  20. Lattice Boltzmann simulation of the gas-solid adsorption process in reconstructed random porous media

    Science.gov (United States)

    Zhou, L.; Qu, Z. G.; Ding, T.; Miao, J. Y.

    2016-04-01

    The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early time for adsorption equilibrium because of the controlling factor of external resistance. External and internal resistances are dominant at small and large particle sizes, respectively. Particle size, under which the total resistance is minimum, ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.

  1. Annuities under random rates of interest - revisited

    OpenAIRE

    Burnecki, K.; Marciniuk, A.; Weron, A.

    2001-01-01

    In the article we consider accumulated values of annuities-certain with yearly payments with independent random interest rates. We focus on annuities with payments varying in arithmetic and geometric progression which are important basic varying annuities (see Kellison, 1991). They appear to be a generalization of the types studied recently by Zaks (2001). We derive, via recursive relationships, mean and variance formulae of the final values of the annuities. As a consequence, we obtain momen...
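
    As a hedged illustration of the quantities studied (not the closed-form recursions of the paper), the following Monte Carlo sketch estimates the mean and variance of the accumulated value of an annuity-certain with payments in arithmetic progression and independent yearly interest rates; the payment schedule and rate distribution are arbitrary examples.

```python
import numpy as np

def accumulated_value(payments, rates):
    """Accumulated value via the recursion S_k = S_{k-1} * (1 + i_k) + c_k,
    where c_k is the payment at the end of year k and i_k the (random)
    interest rate earned during year k."""
    s = 0.0
    for c, i in zip(payments, rates):
        s = s * (1.0 + i) + c
    return s

rng = np.random.default_rng(0)
n_years = 10
payments = [100 + 10 * k for k in range(n_years)]       # arithmetic progression

samples = np.array([
    accumulated_value(payments, rng.uniform(0.02, 0.06, size=n_years))
    for _ in range(100_000)                              # iid yearly rates
])
print("estimated mean     :", samples.mean())
print("estimated variance :", samples.var())
```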

  2. Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation

    DEFF Research Database (Denmark)

    Witt, Carsten

    2012-01-01

    The analysis of randomized search heuristics on classes of functions is fundamental for the understanding of the underlying stochastic process and the development of suitable proof techniques. Recently, remarkable progress has been made in bounding the expected optimization time of the simple (1...

  3. Asymptotic results for the semi-Markovian random walk with delay

    International Nuclear Information System (INIS)

    Khaniyev, T.A.; Aliyev, R.T.

    2006-12-01

    In this study, the semi-Markovian random walk with a discrete interference of chance, X(t), is considered and, under some weak assumptions, the ergodicity of this process is discussed. The characteristic function of the ergodic distribution of X(t) is expressed by means of the probability characteristics of the boundary functionals (N, S_N). Some exact formulas for the first and second moments of the ergodic distribution of the process X(t) are obtained when the random variable ζ₁, which describes a discrete interference of chance, has a Gamma distribution on the interval [0, ∞) with parameters (α, λ). Based on these results, asymptotic expansions with three terms for the first two moments of the ergodic distribution of the process X(t) are obtained as λ → 0. (author)

  4. Waiting time analysis for MX/G/1 priority queues with/without vacations under random order of service discipline

    Directory of Open Access Journals (Sweden)

    Norikazu Kawasaki

    2000-01-01

    Full Text Available We study MX/G/1 nonpreemptive and preemptive-resume priority queues with/without vacations under random order of service (ROS discipline within each class. By considering the conditional waiting times given the states of the system, which an arbitrary message observes upon arrival, we derive the Laplace-Stieltjes transforms of the waiting time distributions and explicitly obtain the first two moments. The relationship for the second moments under ROS and first-come first-served disciplines extends the one found previously by Takacs and Fuhrmann for non-priority single arrival queues.

  5. Learning process mapping heuristics under stochastic sampling overheads

    Science.gov (United States)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading between the solution quality of the process mapping heuristics and their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance is presented using the more realistic assumption along with some methods that alleviate the additional complexity.

  6. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering.

    Science.gov (United States)

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.

  7. 22 CFR 92.92 - Service of legal process under provisions of State law.

    Science.gov (United States)

    2010-04-01

    22 CFR Part 92, Quasi-Legal Services, § 92.92 Service of legal process under provisions of State law. It may be found that a State statute purporting to regulate the service of process in foreign...

  8. Human norovirus inactivation in oysters by high hydrostatic pressure processing: A randomized double-blinded study

    Science.gov (United States)

    This randomized, double-blinded, clinical trial assessed the effect of high hydrostatic pressure processing (HPP) on genogroup I.1 human norovirus (HuNoV) inactivation in virus-seeded oysters when ingested by subjects. The safety and efficacy of HPP treatments were assessed in three study phases wi...

  9. Quantum random flip-flop and its applications in random frequency synthesis and true random number generation

    Energy Technology Data Exchange (ETDEWEB)

    Stipčević, Mario, E-mail: mario.stipcevic@irb.hr [Photonics and Quantum Optics Research Unit, Center of Excellence for Advanced Materials and Sensing Devices, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia)

    2016-03-15

    In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of a RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of RFF in randomness preserving frequency division, random frequency synthesis, and random number generation. Possible usages of these applications in the information and communication technology, cryptographic hardware, and testing equipment are discussed.

  10. Will electrical cyber-physical interdependent networks undergo first-order transition under random attacks?

    Science.gov (United States)

    Ji, Xingpei; Wang, Bo; Liu, Dichen; Dong, Zhaoyang; Chen, Guo; Zhu, Zhenshan; Zhu, Xuedong; Wang, Xunting

    2016-10-01

    Whether the realistic electrical cyber-physical interdependent networks will undergo first-order transition under random failures still remains a question. To reflect the reality of Chinese electrical cyber-physical system, the "partial one-to-one correspondence" interdependent networks model is proposed and the connectivity vulnerabilities of three realistic electrical cyber-physical interdependent networks are analyzed. The simulation results show that due to the service demands of power system the topologies of power grid and its cyber network are highly inter-similar which can effectively avoid the first-order transition. By comparing the vulnerability curves between electrical cyber-physical interdependent networks and its single-layer network, we find that complex network theory is still useful in the vulnerability analysis of electrical cyber-physical interdependent networks.

  11. Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability

    Science.gov (United States)

    Kar, Soummya; Moura, José M. F.

    2011-04-01

    The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering for networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: 1. the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and 2. the network achieves weak consensus, i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node in the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying as a random dynamical system the associated switched (random) Riccati equation, the switching being dictated by a non-stationary Markov chain on the network graph.

  12. BURDEN OF PROOF IN CONSTITUTIONAL PROCESS UNDER CONSIDERATION OF DEMOCRATIC STATE

    Directory of Open Access Journals (Sweden)

    Patrícia Mendanha Dias

    2016-12-01

    Full Text Available Under the aegis of the democratic rule of law, the adversarial principle appears as a fundamental premise of the constitutionalised process. In this view, the adversarial principle, more than the mere right of a party to exercise its defense, should be seen as a form of joint participation in the process, allowing the parties, through the production of evidence, to exert an effective influence on the ascertainment of the right. Thus, attempts to mitigate the adversarial principle in the evidentiary phase should be rejected, as should imposing on a party the need to produce proof that is impossible or excessively difficult to obtain, under penalty of offending the process as understood in the modern normative framework.

  13. Transient Oscillations in Mechanical Systems of Automatic Control with Random Parameters

    Science.gov (United States)

    Royev, B.; Vinokur, A.; Kulikov, G.

    2018-04-01

    Transient oscillations in mechanical systems of automatic control with random parameters are a relevant but insufficiently studied issue. In this paper, a modified spectral method was applied to investigate the problem. The nature of dynamic processes and the phase portraits are analyzed depending on the amplitude and frequency of external influence. It is evident from the obtained results that the dynamic phenomena occurring in systems with random parameters under external influence are complex, and their study requires further investigation.

  14. A Correlated Random Effects Model for Non-homogeneous Markov Processes with Nonignorable Missingness.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2013-05-01

    Life history data arising in clusters with prespecified assessment time points for patients often feature incomplete data since patients may choose to visit the clinic based on their needs. Markov process models provide a useful tool for describing disease progression for life history data. The literature mainly focuses on time-homogeneous processes. In this paper we develop methods to deal with non-homogeneous Markov processes with incomplete clustered life history data. A correlated random effects model is developed to deal with the nonignorable missingness, and a time transformation is employed to address the non-homogeneity in the transition model. Maximum likelihood estimation based on the Monte Carlo EM algorithm is advocated for parameter estimation. Simulation studies demonstrate that the proposed method works well in many situations. We also apply this method to an Alzheimer's disease study.

  15. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just a few hundred sample functions generated with the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulent wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
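
    For orientation, the snippet below implements the classical spectral representation method with random phases for a scalar stationary process; it is not the dimension-reduced random-function formulation of the paper, and the spectrum used is purely illustrative.

```python
import numpy as np

def srm_sample(spectrum, w_max, n_terms, t, rng):
    """One sample of a zero-mean stationary process by the classical
    spectral representation method:
        X(t) = sum_k sqrt(2 * S(w_k) * dw) * cos(w_k * t + phi_k),
    with independent phases phi_k uniform on [0, 2*pi)."""
    dw = w_max / n_terms
    w = (np.arange(n_terms) + 0.5) * dw
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_terms)
    amp = np.sqrt(2.0 * spectrum(w) * dw)
    return (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

# Illustrative one-sided spectrum (not a calibrated wind spectrum)
spectrum = lambda w: 1.0 / (1.0 + w ** 2)
t = np.linspace(0.0, 300.0, 3001)
rng = np.random.default_rng(7)
x = srm_sample(spectrum, w_max=4.0 * np.pi, n_terms=1024, t=t, rng=rng)
print("sample standard deviation:", x.std())
```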

  16. Ultrasonic signal processing for sizing under-clad flaws

    International Nuclear Information System (INIS)

    Shankar, R.; Paradiso, T.J.; Lane, S.S.; Quinn, J.R.

    1985-01-01

    Ultrasonic digital data were collected from underclad cracks in sample pressure vessel specimen blocks. These blocks were weld cladded under different processes to simulate actual conditions in US Pressurized Water Reactors. Each crack was represented by a flaw-echo dynamic curve, which is a plot of the transducer motion on the surface as a function of the ultrasonic response into the material. Crack depth sizing was performed by identifying in the dynamic curve the crack tip diffraction signals from the upper and lower tips. This paper describes the experimental procedure, the digital signal processing methods used, and the algorithms developed for crack depth sizing.

  17. A comparison of random walks in dependent random environments

    NARCIS (Netherlands)

    Scheinhardt, Willem R.W.; Kroese, Dirk

    2015-01-01

    Although the theoretical behavior of one-dimensional random walks in random environments is well understood, the actual evaluation of various characteristics of such processes has received relatively little attention. This paper develops new methodology for the exact computation of the drift in such

  18. Mental skills training effectively minimizes operative performance deterioration under stressful conditions: Results of a randomized controlled study.

    Science.gov (United States)

    Anton, N E; Beane, J; Yurco, A M; Howley, L D; Bean, E; Myers, E M; Stefanidis, D

    2018-02-01

    Stress can negatively impact surgical performance, but mental skills may help. We hypothesized that a comprehensive mental skills curriculum (MSC) would minimize resident performance deterioration under stress. Twenty-four residents were stratified and then randomized to receive mental skills and FLS training (MSC group), or only FLS training (control group). Laparoscopic suturing skill was assessed on a live porcine model with and without external stressors. Outcomes were compared with t-tests. Twenty-three residents completed the study. The groups were similar at baseline. There were no differences in suturing at posttest or transfer test under normal conditions. Both groups experienced significantly decreased performance when stress was applied, but the MSC group significantly outperformed controls under stress. This MSC enabled residents to perform significantly better than controls in the simulated OR under unexpected stressful conditions. These findings support the use of psychological skills as an integral part of surgical resident training.

  19. Dynamic Processes in Nanostructured Crystals Under Ion Irradiation

    Science.gov (United States)

    Uglov, V. V.; Kvasov, N. T.; Shimanski, V. I.; Safronov, I. V.; Komarov, N. D.

    2018-02-01

    The paper presents detailed investigations of dynamic processes occurring in nanostructured Si(Fe) material under radiation exposure, namely heating, thermoelastic stress generation, elastic disturbances of the surrounding medium similar to weak shock waves, and dislocation generation. Calculations of the elastic properties of the nanostructured material are proposed, taking into account size effects in the nanoparticles.

  20. Optimum design of forging process parameters and preform shape under uncertainties

    International Nuclear Information System (INIS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2004-01-01

    Forging is a highly complex non-linear process that is vulnerable to various uncertainties, such as variations in billet geometry, die temperature, material properties, workpiece and forging equipment positional errors and process parameters. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion and production risk. Identifying the sources of uncertainties, quantifying and controlling them will reduce risk in the manufacturing environment, which will minimize the overall cost of production. In this paper, various uncertainties that affect forging tool life and preform design are identified, and their cumulative effect on the forging process is evaluated. Since the forging process simulation is computationally intensive, the response surface approach is used to reduce time by establishing a relationship between the system performance and the critical process design parameters. Variability in system performance due to randomness in the parameters is computed by applying Monte Carlo Simulations (MCS) on generated Response Surface Models (RSM). Finally, a Robust Methodology is developed to optimize forging process parameters and preform shape. The developed method is demonstrated by applying it to an axisymmetric H-cross section disk forging to improve the product quality and robustness

  1. On plasma stability under anisotropic random electric field influence

    International Nuclear Information System (INIS)

    Rabich, L.N.; Sosenko, P.P.

    1987-01-01

    The influence of an anisotropic random field on plasma stability is studied. The thresholds and instability increments are obtained. The stabilizing influence of frequency mismatch and external magnetic field is pointed out.

  2. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

    The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.

  3. Emergent dynamics of Cucker-Smale particles under the effects of random communication and incompressible fluids

    Science.gov (United States)

    Ha, Seung-Yeal; Xiao, Qinghua; Zhang, Xiongtao

    2018-04-01

    We study the dynamics of infinitely many Cucker-Smale (C-S) flocking particles under the interplay of random communication and incompressible fluids. For the dynamics of an ensemble of flocking particles, we use the kinetic Cucker-Smale-Fokker-Planck (CS-FP) equation with a degenerate diffusion, whereas for the fluid component, we use the incompressible Navier-Stokes (N-S) equations. These two subsystems are coupled via the drag force. For this coupled model, we present the global existence of weak and strong solutions in R^d (d = 2, 3). Under the extra regularity assumptions of the initial data, the unique solvability of strong solutions is also established in R^2. In a large coupling regime and periodic spatial domain T^2 := R^2/Z^2, we show that the velocities of C-S particles and fluids are asymptotically aligned to two constant velocities which may be different.

  4. Option pricing under stochastic volatility: the exponential Ornstein–Uhlenbeck model

    International Nuclear Information System (INIS)

    Perelló, Josep; Masoliver, Jaume; Sircar, Ronnie

    2008-01-01

    We study the pricing problem for a European call option when the volatility of the underlying asset is random and follows the exponential Ornstein–Uhlenbeck model. The random diffusion model proposed is a two-dimensional market process that takes a log-Brownian motion to describe price dynamics and an Ornstein–Uhlenbeck subordinated process describing the randomness of the log-volatility. We derive an approximate option price that is valid when (i) the fluctuations of the volatility are larger than its normal level, (ii) the volatility presents a slow driving force toward its normal level and, finally, (iii) the market price of risk is a linear function of the log-volatility. We study the resulting European call price and its implied volatility for a range of parameters consistent with daily Dow Jones index data.
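
    A minimal Euler-Maruyama sketch of the exponential Ornstein-Uhlenbeck stochastic-volatility dynamics described above; the parameter values, the correlation parameter rho, and the function name are illustrative and not calibrated to the Dow Jones data used in the paper.

```python
import numpy as np

def simulate_exp_ou_sv(s0=100.0, mu=0.0, m=0.2, alpha=1.0, k=0.5, rho=-0.3,
                       horizon=1.0, n_steps=252, seed=0):
    """Euler-Maruyama paths of the expOU model: Y is an Ornstein-Uhlenbeck
    log-volatility process and the instantaneous volatility is m * exp(Y)."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    s, y = np.empty(n_steps + 1), np.empty(n_steps + 1)
    s[0], y[0] = s0, 0.0
    for i in range(n_steps):
        z1, z2 = rng.standard_normal(2)
        dw1 = np.sqrt(dt) * z1                                     # drives volatility
        dw2 = np.sqrt(dt) * (rho * z1 + np.sqrt(1 - rho**2) * z2)  # drives price
        sigma = m * np.exp(y[i])
        y[i + 1] = y[i] - alpha * y[i] * dt + k * dw1
        s[i + 1] = s[i] * (1.0 + mu * dt + sigma * dw2)
    return s, y

price, log_vol = simulate_exp_ou_sv()
print("final price %.2f, final log-volatility %.3f" % (price[-1], log_vol[-1]))
```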

  5. Direct generation of all-optical random numbers from optical pulse amplitude chaos.

    Science.gov (United States)

    Li, Pu; Wang, Yun-Cai; Wang, An-Bang; Yang, Ling-Zhen; Zhang, Ming-Jiang; Zhang, Jian-Zhong

    2012-02-13

    We propose and theoretically demonstrate an all-optical method for directly generating all-optical random numbers from pulse amplitude chaos produced by a mode-locked fiber ring laser. Under an appropriate pump intensity, the mode-locked laser can experience a quasi-periodic route to chaos. Such chaos consists of a stream of pulses with a fixed repetition frequency but random intensities. In this method, we require neither a sampling procedure nor externally triggered clocks, but directly quantize the chaotic pulse stream into a random number sequence via an all-optical flip-flop. Moreover, our simulation results show that the pulse amplitude chaos has no periodicity and possesses a highly symmetric distribution of amplitude. Thus, in theory, the obtained random number sequence without post-processing has high-quality randomness verified by industry-standard statistical tests.
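
    A software sketch of the quantization step only: a stand-in chaotic amplitude stream (a logistic map, not the mode-locked laser model) is thresholded at its median to emulate the flip-flop decision, and two elementary randomness checks are printed. All names and parameters are illustrative.

```python
import numpy as np

def amplitudes_to_bits(amplitudes):
    """Quantize a stream of pulse amplitudes into bits by comparing each
    amplitude with the median of the stream (a software stand-in for the
    all-optical flip-flop thresholding)."""
    amplitudes = np.asarray(amplitudes)
    return (amplitudes > np.median(amplitudes)).astype(np.uint8)

# Stand-in chaotic amplitude stream: a logistic map, NOT the laser model
x, amps = 0.37, []
for _ in range(10_000):
    x = 3.99 * x * (1.0 - x)
    amps.append(x)

bits = amplitudes_to_bits(amps)
print("bit bias           :", bits.mean())
print("serial correlation :", np.corrcoef(bits[:-1], bits[1:])[0, 1])
```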

  6. Performance analysis of spectral-phase-encoded optical code-division multiple-access system regarding the incorrectly decoded signal as a nonstationary random process

    Science.gov (United States)

    Yan, Meng; Yao, Minyu; Zhang, Hongming

    2005-11-01

    The performance of a spectral-phase-encoded (SPE) optical code-division multiple-access (OCDMA) system is analyzed. Regarding the incorrectly decoded signal (IDS) as a nonstationary random process, we derive a novel probability distribution for it. The probability distribution of the IDS is considered a chi-squared distribution with degrees of freedom r=1, which is more reasonable and accurate than in previous work. The bit error rate (BER) of an SPE OCDMA system under multiple-access interference is evaluated. Numerical results show that the system can sustain very low BER even when there are multiple simultaneous users, and as the code length becomes longer or the initial pulse becomes shorter, the system performs better.

  7. Mechanisms within the Parietal Cortex Correlate with the Benefits of Random Practice in Motor Adaptation

    Directory of Open Access Journals (Sweden)

    Benjamin Thürer

    2017-08-01

    Full Text Available The motor learning literature shows an increased retest or transfer performance after practicing under unstable (random) conditions. This random practice effect (also known as the contextual interference effect) is frequently investigated on the behavioral level and discussed in the context of mechanisms of the dorsolateral prefrontal cortex and increased cognitive efforts during movement planning. However, there is a lack of studies examining the random practice effect in motor adaptation tasks and, in general, the underlying neural processes of the random practice effect are not fully understood. We tested 24 right-handed human subjects performing a reaching task using a robotic manipulandum. Subjects learned to adapt either to a blocked or a random schedule of different force field perturbations while their electroencephalography (EEG) was recorded. The behavioral results showed a distinct random practice effect in terms of a more stabilized retest performance of the random compared to the blocked practicing group. Further analyses showed that this effect correlates with changes in the alpha band power in electrodes over parietal areas. We conclude that the random practice effect in this study is facilitated by mechanisms within the parietal cortex during movement execution which might reflect online feedback mechanisms.

  8. The McMillan Theorem for Colored Branching Processes and Dimensions of Random Fractals

    Directory of Open Access Journals (Sweden)

    Victor Bakhtin

    2014-12-01

    Full Text Available For the simplest colored branching process, we prove an analog to the McMillan theorem and calculate the Hausdorff dimensions of random fractals defined in terms of the limit behavior of empirical measures generated by finite genetic lines. In this setting, the role of Shannon’s entropy is played by the Kullback–Leibler divergence, and the Hausdorff dimensions are computed by means of the so-called Billingsley–Kullback entropy, defined in the paper.

  9. Recommendations and illustrations for the evaluation of photonic random number generators

    Science.gov (United States)

    Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi

    2017-09-01

    The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.
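
    As a rough, hedged illustration of an (ε, τ)-entropy-rate estimate, the sketch below uses a naive plug-in block-entropy difference on a coarse-grained, subsampled signal; the actual Cohen-Procaccia estimator discussed in the paper is based on correlation sums, so this is only a stand-in for the idea.

```python
import numpy as np
from collections import Counter

def block_entropy_rate(signal, eps, tau, depth=2):
    """Naive (eps, tau)-entropy-rate estimate in bits per (subsampled) symbol:
    subsample every tau steps, coarse-grain into bins of width eps, and
    return H(depth+1) - H(depth) for the resulting symbol sequence."""
    symbols = np.floor(np.asarray(signal)[::tau] / eps).astype(int)

    def block_entropy(m):
        counts = Counter(tuple(symbols[i:i + m])
                         for i in range(len(symbols) - m + 1))
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return -(p * np.log2(p)).sum()

    return block_entropy(depth + 1) - block_entropy(depth)

rng = np.random.default_rng(1)
white_noise = rng.standard_normal(200_000)          # test signal
print(block_entropy_rate(white_noise, eps=1.0, tau=1, depth=2))
```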

  10. Recommendations and illustrations for the evaluation of photonic random number generators

    Directory of Open Access Journals (Sweden)

    Joseph D. Hart

    2017-09-01

    Full Text Available The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.

  11. Stochastic Analysis of Natural Convection in Vertical Channels with Random Wall Temperature

    Directory of Open Access Journals (Sweden)

    Ryoichi Chiba

    2017-01-01

    Full Text Available This study attempts to derive the statistics of temperature and velocity fields of laminar natural convection in a heated vertical channel with random wall temperature. The wall temperature is expressed as a random function with respect to time, or a random process. First, analytical solutions of the transient temperature and flow velocity fields for an arbitrary temporal variation in the channel wall temperature are obtained by the integral transform and convolution theorem. Second, the autocorrelations of the temperature and velocity are formed from the solutions, assuming a stationarity in time. The mean square values of temperature and velocity are computed under the condition that the fluctuation in the channel wall temperature can be considered as white noise or a stationary Markov process. Numerical results demonstrate that a decrease in the Prandtl number or an increase in the correlation time of the random process increases the level of mean square velocity but does not change its spatial distribution tendency, which is a bell-shaped profile with a peak at a certain horizontal distance from the channel wall. The peak position is not substantially affected by the Prandtl number or the correlation time.

  12. Contextuality is about identity of random variables

    International Nuclear Information System (INIS)

    Dzhafarov, Ehtibar N; Kujala, Janne V

    2014-01-01

    Contextual situations are those in which seemingly ‘the same’ random variable changes its identity depending on the conditions under which it is recorded. Such a change of identity is observed whenever the assumption that the variable is one and the same under different conditions leads to contradictions when one considers its joint distribution with other random variables (this is the essence of all Bell-type theorems). In our Contextuality-by-Default approach, instead of asking why or how the conditions force ‘one and the same’ random variable to change ‘its’ identity, any two random variables recorded under different conditions are considered different ‘automatically.’ They are never the same, nor are they jointly distributed, but one can always impose on them a joint distribution (probabilistic coupling). The special situations when there is a coupling in which these random variables are equal with probability 1 are considered noncontextual. Contextuality means that such couplings do not exist. We argue that the determination of the identity of random variables by conditions under which they are recorded is not a causal relationship and cannot violate laws of physics. (paper)

  13. Resistance and resistance fluctuations in random resistor networks under biased percolation.

    Science.gov (United States)

    Pennetta, Cecilia; Reggiani, L; Trefán, Gy; Alfinito, E

    2002-06-01

    We consider a two-dimensional random resistor network (RRN) in the presence of two competing biased processes consisting of the breaking and recovering of elementary resistors. These two processes are driven by the joint effects of an electrical bias and of the heat exchange with a thermal bath. The electrical bias is set up by applying a constant voltage or, alternatively, a constant current. Monte Carlo simulations are performed to analyze the network evolution in the full range of bias values. Depending on the bias strength, electrical failure or steady state are achieved. Here we investigate the steady state of the RRN focusing on the properties of the non-Ohmic regime. In constant-voltage conditions, a scaling relation is found between ⟨R⟩/⟨R⟩0 and V/V0, where ⟨R⟩ is the average network resistance, ⟨R⟩0 the linear regime resistance, and V0 the threshold value for the onset of nonlinearity. A similar relation is found in constant-current conditions. The relative variance of resistance fluctuations also exhibits a strong nonlinearity whose properties are investigated. The power spectral density of resistance fluctuations presents a Lorentzian spectrum and the amplitude of fluctuations shows a significant non-Gaussian behavior in the prebreakdown region. These results compare well with electrical breakdown measurements in thin films of composites and of other conducting materials.

  14. A Randomization Procedure for "Trickle-Process" Evaluations

    Science.gov (United States)

    Goldman, Jerry

    1977-01-01

    This note suggests a solution to the problem of achieving randomization in experimental settings where units deemed eligible for treatment "trickle in," that is, appear at any time. The solution permits replication of the experiment in order to test for time-dependent effects. (Author/CTM)

  15. Cognitive Processes Underlying the Artistic Experience

    Directory of Open Access Journals (Sweden)

    Alejandra Wah

    2017-08-01

    Full Text Available Based on the field of aesthetics, for centuries philosophers and more recently scientists have been concerned with understanding the artistic experience focusing on emotional responses to the perception of artworks. By contrast, in the last decades, evolutionary biology has been concerned with explaining the artistic experience by focusing on the cognitive processes underlying this experience. Up until now, the cognitive mechanisms that allow humans to experience objects and events as art remain largely unexplored and there is still no conventional use of terms for referring to the processes which may explain why the artistic experience is characteristically human and universal to human beings (Dissanayake, 1992, p. 24; Donald, 2006, p. 4). In this paper, I will first question whether it is productive to understand the artistic experience in terms of perception and emotion, and I will subsequently propose a possible alternative explanation to understand this experience. Drawing upon the work of Ellen Dissanayake (1992, 2000, 2015), Merlin Donald (2001, 2006, 2013), Antonio Damasio (1994, 2000, 2003, 2010), Barend van Heusden (2004, 2009, 2010), and Alejandra Wah (2014), I will argue that this experience is characterized by particular degrees of imagination and consciousness.

  16. Coupled continuous time-random walks in quenched random environment

    Science.gov (United States)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with the coupling characteristic of Lévy walks. Additionally, we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of the uncoupled quenched trap model for Lévy flights.
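
    A minimal sketch of the uncoupled quenched trap model to which, according to the abstract, the coupled walk reduces at long times; the lattice size, disorder exponent, and boundary handling are arbitrary choices made for illustration.

```python
import numpy as np

def quenched_trap_walk(n_sites=2001, alpha=0.7, n_jumps=10_000, seed=0):
    """Nearest-neighbour walk on a 1-D lattice with quenched, heavy-tailed
    mean waiting times tau_i ~ U**(-1/alpha) (site disorder fixed in time)."""
    rng = np.random.default_rng(seed)
    tau = rng.random(n_sites) ** (-1.0 / alpha)      # quenched trap depths
    center = n_sites // 2
    pos, t = center, 0.0
    times, displacements = [0.0], [0]
    for _ in range(n_jumps):
        t += rng.exponential(tau[pos])               # wait in the current trap
        pos = min(max(pos + rng.choice((-1, 1)), 0), n_sites - 1)  # unbiased jump
        times.append(t)
        displacements.append(pos - center)
    return np.array(times), np.array(displacements)

times, displacements = quenched_trap_walk()
print("elapsed time %.1f after %d jumps, displacement %d"
      % (times[-1], len(displacements) - 1, displacements[-1]))
```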

  17. Divided Attention and Processes Underlying Sense of Agency

    Directory of Open Access Journals (Sweden)

    Wen Wen

    2016-01-01

    Full Text Available Sense of agency refers to the subjective feeling of controlling events through one’s behavior or will. Sense of agency results from matching predictions of one’s own actions with actual feedback regarding the action. Furthermore, when an action involves a cued goal, performance-based inference contributes to sense of agency. That is, if people achieve their goal, they would believe themselves to be in control. Previous studies have shown that both action-effect comparison and performance-based inference contribute to sense of agency; however, the dominance of one process over the other may shift based on task conditions such as the presence or absence of specific goals. In this study, we examined the influence of divided attention on these two processes underlying sense of agency in two conditions. In the experimental task, participants continuously controlled a moving dot for 10 s while maintaining a string of three or seven digits in working memory. We found that when there was no cued goal (no-cued-goal condition, sense of agency was impaired by high cognitive load. Contrastingly, when participants controlled the dot based on a cued goal (cued-goal-directed condition, their sense of agency was lower than in the no-cued-goal condition and was not affected by cognitive load. The results suggest that the action-effect comparison process underlying sense of agency requires attention. On the other hand, the weaker influence of divided attention in the cued-goal-directed condition could be attributed to the dominance of performance-based inference, which is probably automatic.

  18. Stability of a nonlinear second order equation under parametric bounded noise excitation

    International Nuclear Information System (INIS)

    Wiebe, Richard; Xie, Wei-Chau

    2016-01-01

    The motivation for the following work is a structural column under dynamic axial loads with both deterministic (harmonic transmitted forces from the surrounding structure) and random (wind and/or earthquake) loading components. The bounded noise used herein is a sinusoid with an argument composed of a random (Wiener) process deviation about a mean frequency. By this approach, a noise parameter may be used to investigate the behavior through the spectrum from simple harmonic forcing, to a bounded random process with very little harmonic content. The stability of both the trivial and non-trivial stationary solutions of an axially-loaded column (which is modeled as a second order nonlinear equation) under parametric bounded noise excitation is investigated by use of Lyapunov exponents. Specifically the effect of noise magnitude, amplitude of the forcing, and damping on stability of a column is investigated. First order averaging is employed to obtain analytical approximations of the Lyapunov exponents of the trivial solution. For the non-trivial stationary solution however, the Lyapunov exponents are obtained via Monte Carlo simulation as the stability equations become analytically intractable. (paper)
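
    A hedged numerical sketch of the bounded-noise excitation and a largest-Lyapunov-exponent estimate for the trivial solution of a linearized, parametrically forced oscillator; the equation form, parameters, and integration scheme are illustrative and not those of the column model analyzed in the paper.

```python
import numpy as np

def lyapunov_trivial(beta=0.05, omega0=1.0, mu=0.5, nu=2.0, sigma=0.2,
                     dt=1e-3, total_time=500.0, seed=0):
    """Largest Lyapunov exponent of x = 0 for
        x'' + 2*beta*x' + omega0**2 * (1 + mu*xi(t)) * x = 0,
    driven by the bounded noise xi(t) = cos(nu*t + sigma*W(t)),
    estimated by Euler integration with occasional renormalization."""
    rng = np.random.default_rng(seed)
    n_steps = int(total_time / dt)
    x, v, phase, log_growth = 1.0, 0.0, 0.0, 0.0
    for i in range(n_steps):
        xi = np.cos(nu * i * dt + phase)
        a = -2.0 * beta * v - omega0**2 * (1.0 + mu * xi) * x
        x, v = x + v * dt, v + a * dt
        phase += sigma * np.sqrt(dt) * rng.standard_normal()   # Wiener deviation
        r = np.hypot(x, v)
        if r > 1e6 or r < 1e-6:                 # renormalize to avoid overflow
            log_growth += np.log(r)
            x, v = x / r, v / r
    return (log_growth + np.log(np.hypot(x, v))) / total_time

print("estimated largest Lyapunov exponent:", lyapunov_trivial())
```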

  19. Processing speed and working memory training in multiple sclerosis: a double-blind randomized controlled pilot study.

    Science.gov (United States)

    Hancock, Laura M; Bruce, Jared M; Bruce, Amanda S; Lynch, Sharon G

    2015-01-01

    Between 40-65% of multiple sclerosis patients experience cognitive deficits, with processing speed and working memory most commonly affected. This pilot study investigated the effect of computerized cognitive training focused on improving processing speed and working memory. Participants were randomized into either an active or a sham training group and engaged in six weeks of training. The active training group improved on a measure of processing speed and attention following cognitive training, and data trended toward significance on measures of other domains. Results provide preliminary evidence that cognitive training with multiple sclerosis patients may produce moderate improvement in select areas of cognitive functioning.

  20. Handedness is related to neural mechanisms underlying hemispheric lateralization of face processing

    Science.gov (United States)

    Frässle, Stefan; Krach, Sören; Paulus, Frieder Michel; Jansen, Andreas

    2016-06-01

    While the right-hemispheric lateralization of the face perception network is well established, recent evidence suggests that handedness affects the cerebral lateralization of face processing at the hierarchical level of the fusiform face area (FFA). However, the neural mechanisms underlying differential hemispheric lateralization of face perception in right- and left-handers are largely unknown. Using dynamic causal modeling (DCM) for fMRI, we aimed to unravel the putative processes that mediate handedness-related differences by investigating the effective connectivity in the bilateral core face perception network. Our results reveal an enhanced recruitment of the left FFA in left-handers compared to right-handers, as evidenced by more pronounced face-specific modulatory influences on both intra- and interhemispheric connections. As structural and physiological correlates of handedness-related differences in face processing, right- and left-handers varied with regard to their gray matter volume in the left fusiform gyrus and their pupil responses to face stimuli. Overall, these results describe how handedness is related to the lateralization of the core face perception network, and point to different neural mechanisms underlying face processing in right- and left-handers. In a wider context, this demonstrates the entanglement of structurally and functionally remote brain networks, suggesting a broader underlying process regulating brain lateralization.

  1. Redox processes at a nanostructured interface under strong electric fields.

    Science.gov (United States)

    Steurer, Wolfram; Surnev, Svetlozar; Netzer, Falko P; Sementa, Luca; Negreiros, Fabio R; Barcaro, Giovanni; Durante, Nicola; Fortunelli, Alessandro

    2014-09-21

    Manipulation of chemistry and film growth via external electric fields is a longstanding goal in surface science. Numerous systems have been predicted to show such effects but experimental evidence is sparse. Here we demonstrate in a custom-designed UHV apparatus that the application of spatially extended, homogeneous, very high (>1 V nm⁻¹) DC fields not only changes the system energetics but triggers dynamic processes which become important much before static contributions appreciably modify the potential energy landscape. We take a well characterized ultrathin NiO film on a Ag(100) support as a proof-of-principle test case, and show how it gets reduced to supported Ni clusters under fields exceeding the threshold of +0.9 V nm⁻¹. Using an effective model, we trace the observed interfacial redox process down to a dissociative electron attachment resonant mechanism. The proposed approach can be easily implemented and generally applied to a wide range of interfacial systems, thus opening new opportunities for the manipulation of film growth and reaction processes at solid surfaces under strong external fields.

  2. Interindividual differences in stress sensitivity: basal and stress-induced cortisol levels differentially predict neural vigilance processing under stress.

    Science.gov (United States)

    Henckens, Marloes J A G; Klumpers, Floris; Everaerd, Daphne; Kooijman, Sabine C; van Wingen, Guido A; Fernández, Guillén

    2016-04-01

    Stress exposure is known to precipitate psychological disorders. However, large differences exist in how individuals respond to stressful situations. A major marker for stress sensitivity is hypothalamus-pituitary-adrenal (HPA)-axis function. Here, we studied how interindividual variance in both basal cortisol levels and stress-induced cortisol responses predicts differences in neural vigilance processing during stress exposure. Implementing a randomized, counterbalanced, crossover design, 120 healthy male participants were exposed to a stress-induction and control procedure, followed by an emotional perception task (viewing fearful and happy faces) during fMRI scanning. Stress sensitivity was assessed using physiological (salivary cortisol levels) and psychological measures (trait questionnaires). High stress-induced cortisol responses were associated with increased stress sensitivity as assessed by psychological questionnaires, a stronger stress-induced increase in medial temporal activity and greater differential amygdala responses to fearful as opposed to happy faces under control conditions. In contrast, high basal cortisol levels were related to relative stress resilience as reflected by higher extraversion scores, a lower stress-induced increase in amygdala activity and enhanced differential processing of fearful compared with happy faces under stress. These findings seem to reflect a critical role for HPA-axis signaling in stress coping; higher basal levels indicate stress resilience, whereas higher cortisol responsivity to stress might facilitate recovery in those individuals prone to react sensitively to stress.

  3. β-Decay half-lives and nuclear structure of exotic proton-rich waiting point nuclei under rp-process conditions

    Science.gov (United States)

    Nabi, Jameel-Un; Böyükata, Mahmut

    2016-03-01

    We investigate even-even nuclei in the A ∼ 70 mass region within the framework of the proton-neutron quasi-particle random phase approximation (pn-QRPA) and the interacting boson model-1 (IBM-1). Our work includes calculation of the energy spectra and the potential energy surfaces V(β, γ) of Zn, Ge, Se, Kr and Sr nuclei with the same proton and neutron number, N = Z. The parametrization of the IBM-1 Hamiltonian was performed for the calculation of the energy levels in the ground state bands. The geometric shape of the nuclei was predicted by plotting the potential energy surfaces V(β, γ) obtained from the IBM-1 Hamiltonian in the classical limit. The pn-QRPA model was later used to compute half-lives of the neutron-deficient nuclei, which were found to be in very good agreement with the measured ones. The pn-QRPA model was also used to calculate the Gamow-Teller strength distributions and was found to be in decent agreement with the measured data. We further calculate the electron capture and positron decay rates for these N = Z waiting point (WP) nuclei in the stellar environment employing the pn-QRPA model. For rp-process conditions, our total weak rates are within a factor of two of the Skyrme HF+BCS+QRPA calculation. All calculated electron capture rates are comparable to the competing positron decay rates under rp-process conditions. Our study confirms the finding that electron capture rates form an integral part of the weak rates under rp-process conditions and should not be neglected in nuclear network calculations.

  4. Biomimetic propulsion under random heaving conditions, using active pitch control

    Science.gov (United States)

    Politis, Gerasimos; Politis, Konstantinos

    2014-05-01

    Marine mammals travel long distances by utilizing and transforming wave energy to thrust through proper control of their caudal fin. On the other hand, manmade ships traveling in a wavy sea store large amounts of wave energy in the form of kinetic energy for heaving, pitching, rolling and other ship motions. A natural way to extract this energy and transform it to useful propulsive thrust is by using a biomimetic wing. The aim of this paper is to show how an actively pitched biomimetic wing could achieve this goal when it performs a random heaving motion. More specifically, we consider a biomimetic wing traveling with a given translational velocity in an infinitely extended fluid and performing a random heaving motion with a given energy spectrum which corresponds to a given sea state. A formula is devised by which the instantaneous pitch angle of the wing is determined using the heaving data of the current and past time steps. Simulations are then performed for a biomimetic wing at different heave energy spectra, using an indirect Source-Doublet 3-D BEM, together with a time stepping algorithm capable of tracking the random motion of the wing. A nonlinear pressure-type Kutta condition is applied at the trailing edge of the wing. With a mollifier-based filtering technique, the 3-D unsteady rollup pattern created by the random motion of the wing is calculated without any simplifying assumptions regarding its geometry. Calculated unsteady forces, moments, and useful power show that the proposed active pitch control always results in thrust-producing motions, with significant propulsive power production and considerable beneficial stabilizing action to ship motions. Calculation of the power required to set the pitch angle proves it to be a very small percentage of the useful power, thus making the practical application of the device very tractable.

  5. Cognitive Processes in Decisions Under Risk Are Not the Same As in Decisions Under Uncertainty

    Directory of Open Access Journals (Sweden)

    Kirsten G Volz

    2012-07-01

    Full Text Available We deal with risk versus uncertainty, a distinction that is of fundamental importance for cognitive neuroscience yet largely neglected. In a world of risk (small world), all alternatives, consequences, and probabilities are known. In uncertain (large) worlds, some of this information is unknown or unknowable. Most cognitive neuroscience studies exclusively study the neural correlates of decisions under risk (e.g., lotteries), with the tacit implication that understanding these would lead to an understanding of decision making in general. First, we show that normative strategies for decisions under risk do not generalize to uncertain worlds, where simple heuristics are often the more accurate strategies. Second, we argue that the cognitive processes for making decisions in a world of risk are not the same as those for dealing with uncertainty. Because situations with known risks are the exception rather than the rule in human evolution, it is unlikely that our brains are adapted to them. We therefore suggest a paradigm shift towards studying decision processes in uncertain worlds and provide first examples.

  6. [The third lumbar transverse process syndrome treated with acupuncture at zygapophyseal joint and transverse process: a randomized controlled trial].

    Science.gov (United States)

    Li, Fangling; Bi, Dingyan

    2017-08-12

    To compare the effects on the third lumbar transverse process syndrome of acupuncture mainly at the zygapophyseal joint and transverse process with those of conventional acupuncture. Eighty cases were randomly assigned to an observation group and a control group, 40 cases in each. In the observation group, patients were treated with acupuncture at the zygapophyseal joint, the transverse process, the point where the superior gluteal nerve enters the hip and Weizhong (BL 40); those in the control group were treated with acupuncture at Qihaishu (BL 24), Jiaji (EX-B 2) of L2-L4, the point where the superior gluteal nerve enters the hip and Weizhong (BL 40). The treatment was given once a day, 6 times a week, for 2 weeks. The visual analogue scale (VAS), Japanese Orthopaedic Association (JOA) low back pain score and simplified Chinese Oswestry disability index (SC-ODI) were observed before and after treatment as well as 6 months after treatment, and the clinical effects were evaluated. The total effective rate in the observation group was 95.0% (38/40), significantly higher than the 82.5% (33/40) in the control group (P<0.05). Acupuncture mainly at the zygapophyseal joint and transverse process for the third lumbar transverse process syndrome achieves a good effect, better than that of conventional acupuncture in relieving pain and improving lumbar function and quality of life.

  7. Load-Dependent Interference of Deep Brain Stimulation of the Subthalamic Nucleus with Switching from Automatic to Controlled Processing During Random Number Generation in Parkinson's Disease.

    Science.gov (United States)

    Williams, Isobel Anne; Wilkinson, Leonora; Limousin, Patricia; Jahanshahi, Marjan

    2015-01-01

    Deep brain stimulation of the subthalamic nucleus (STN DBS) ameliorates the motor symptoms of Parkinson's disease (PD). However, some aspects of executive control are impaired with STN DBS. We tested the predictions that (i) STN DBS interferes with switching from automatic to controlled processing during fast-paced random number generation (RNG) and (ii) STN DBS-induced cognitive control changes are load-dependent. Fifteen PD patients with bilateral STN DBS performed paced RNG under three levels of cognitive load synchronised with a pacing stimulus presented at 1, 0.5 and 0.33 Hz (faster rates require greater cognitive control), with DBS on or off. Measures of output randomness were calculated. Countscore 1 (CS1) indicates habitual counting in steps of one. Countscore 2 (CS2) indicates a more controlled strategy of counting in twos. The fastest rate was associated with an increased CS1 score with STN DBS on compared to off. At the slowest rate, patients had higher CS2 scores with DBS off than on, such that the differences between CS1 and CS2 scores disappeared. We provide evidence for a load-dependent effect of STN DBS on paced RNG in PD. Patients could switch to more controlled RNG strategies during conditions of low cognitive load at slower rates only when the STN stimulators were off, but when STN stimulation was on, they engaged in more automatic habitual counting under increased cognitive load. These findings are consistent with the proposal that the STN implements a switch signal from the medial frontal cortex which enables a shift from automatic to controlled processing.
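
    A simplified sketch of the two count scores mentioned above, counting only adjacent pairs; published RNG indices typically also weight longer counting runs, so this illustrates the idea rather than the exact scoring used in the study.

```python
def count_scores(digits):
    """Simplified count scores for a random-number-generation sequence:
    CS1 counts adjacent pairs that differ by 1 (counting in ones),
    CS2 counts adjacent pairs that differ by 2 (counting in twos)."""
    pairs = list(zip(digits, digits[1:]))
    cs1 = sum(1 for a, b in pairs if abs(a - b) == 1)
    cs2 = sum(1 for a, b in pairs if abs(a - b) == 2)
    return cs1, cs2

print(count_scores([3, 4, 5, 2, 9, 7, 5, 3, 1, 0]))   # -> (3, 4)
```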

  8. A Variable Impacts Measurement in Random Forest for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jae-Hee Hur

    2017-01-01

    Full Text Available Recently, the importance of mobile cloud computing has increased. Mobile devices can collect personal data from various sensors within a short period of time, and sensor-based data consist of valuable information about users. Advanced computational power and data analysis technology based on cloud computing provide an opportunity to classify massive sensor data into given labels. The random forest algorithm is, however, a black-box model whose internal process is hard to interpret. In this paper, we propose a method that analyzes variable impact in the random forest algorithm to clarify which variable affects classification accuracy the most. We apply the Shapley value with random forest to analyze variable impact. Under the assumption that every variable cooperates as a player in a cooperative game, the Shapley value fairly distributes the payoff among the variables. Our proposed method calculates the relative contribution of each variable within the classification process. We analyze the influence of the variables and rank them by their effect on classification accuracy. The proposed method demonstrates its suitability for interpreting a black-box model such as a random forest, so the algorithm is applicable in mobile cloud computing environments.
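
    As an illustration of the idea described above, the sketch below estimates per-feature Shapley contributions to random-forest accuracy by Monte Carlo sampling of feature orderings, using scikit-learn and a toy dataset. It is a minimal sketch of the general technique, not the authors' implementation; the payoff definition (test accuracy of a forest trained on a feature coalition) and all names are illustrative assumptions.

```python
# Minimal sketch: Monte Carlo estimate of per-feature Shapley contributions
# to random-forest accuracy, treating features as players in a cooperative game.
# Illustrative only; not the authors' implementation.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def coalition_accuracy(features, X_tr, X_te, y_tr, y_te):
    """Payoff of a feature coalition: test accuracy of a forest trained on it."""
    if not features:
        return np.mean(y_te == np.bincount(y_tr).argmax())  # majority-class baseline
    idx = list(features)
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X_tr[:, idx], y_tr)
    return clf.score(X_te[:, idx], y_te)

def shapley_impacts(X_tr, X_te, y_tr, y_te, n_perm=50, seed=1):
    rng = np.random.default_rng(seed)
    d = X_tr.shape[1]
    phi = np.zeros(d)
    for _ in range(n_perm):                      # average marginal contributions
        order = rng.permutation(d)               # over random feature orderings
        coalition = []
        prev = coalition_accuracy([], X_tr, X_te, y_tr, y_te)
        for j in order:
            coalition.append(j)
            cur = coalition_accuracy(coalition, X_tr, X_te, y_tr, y_te)
            phi[j] += cur - prev
            prev = cur
    return phi / n_perm

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
print(shapley_impacts(X_tr, X_te, y_tr, y_te, n_perm=10))
```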

  9. Random numbers spring from alpha decay

    International Nuclear Information System (INIS)

    Frigerio, N.A.; Sanathanan, L.P.; Morley, M.; Clark, N.A.; Tyler, S.A.

    1980-05-01

    Congruential random number generators, which are widely used in Monte Carlo simulations, are deficient in that the numbers they generate are concentrated in a relatively small number of hyperplanes. While this deficiency may not be a limitation in small Monte Carlo studies involving a few variables, it introduces a significant bias in large simulations requiring high resolution. This bias was recognized and assessed during preparations for an accident analysis study of nuclear power plants. This report describes a random number device based on the radioactive alpha decay of a 235U source in a high-resolution gas proportional counter. The signals were fed to a 4096-channel analyzer and for each channel the frequency of signals registered in a 20,000-microsecond interval was recorded. The parity bits of these frequency counts (0 for an even count and 1 for an odd count) were then assembled in sequence to form 31-bit binary random numbers and transcribed to a magnetic tape. This cycle was repeated as many times as necessary to create 3 million random numbers. The frequency distribution of counts from the present device conforms to the Brockwell-Moyal distribution, which takes into account the dead time of the counter (both the dead time and the decay constant of the underlying Poisson process were estimated). Analysis of the count data and tests of randomness on a sample set of the 31-bit binary numbers indicate that this random number device is a highly reliable source of truly random numbers. Its use is, therefore, recommended in Monte Carlo simulations for which congruential pseudorandom number generators are found to be inadequate. 6 figures, 5 tables
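
    The parity-bit assembly step described above can be sketched in a few lines. The snippet below stands in for the detector with simulated Poisson counts and packs the parity bits into 31-bit integers; the count rate and sample size are arbitrary assumptions, not the device's.

```python
# Sketch of the parity-bit assembly described above: counts per channel are
# simulated as Poisson draws (a stand-in for the alpha-decay counter), and their
# parity bits (0 = even, 1 = odd) are packed into 31-bit integers.
import numpy as np

rng = np.random.default_rng(42)
counts = rng.poisson(lam=7.3, size=31 * 1000)   # simulated frequency counts
parity_bits = counts & 1                        # 0 for an even count, 1 for odd

def pack_31bit(bits):
    """Assemble consecutive groups of 31 parity bits into integers."""
    words = []
    for i in range(0, len(bits) - 30, 31):
        word = 0
        for b in bits[i:i + 31]:
            word = (word << 1) | int(b)
        words.append(word)
    return words

numbers = pack_31bit(parity_bits)
print(len(numbers), hex(numbers[0]))
```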

  10. Modelling estimation and analysis of dynamic processes from image sequences using temporal random closed sets and point processes with application to the cell exocytosis and endocytosis

    OpenAIRE

    Díaz Fernández, Ester

    2010-01-01

    In this thesis, new models and methodologies are introduced for the analysis of dynamic processes characterized by image sequences with spatial temporal overlapping. The spatial temporal overlapping exists in many natural phenomena and should be addressed properly in several Science disciplines such as Microscopy, Material Sciences, Biology, Geostatistics or Communication Networks. This work is related to the Point Process and Random Closed Set theories, within Stochastic Ge...

  11. Spherical particle Brownian motion in viscous medium as non-Markovian random process

    International Nuclear Information System (INIS)

    Morozov, Andrey N.; Skripkin, Alexey V.

    2011-01-01

    The Brownian motion of a spherical particle in an infinite medium is described by the conventional methods and integral transforms considering the entrainment of surrounding particles of the medium by the Brownian particle. It is demonstrated that fluctuations of the Brownian particle velocity represent a non-Markovian random process. The features of Brownian motion in short time intervals and in small displacements are considered. -- Highlights: → Description of Brownian motion considering the entrainment of medium is developed. → We find the equations for statistical characteristics of impulse fluctuations. → Brownian motion at small time intervals is considered. → Theoretical results and experimental data are compared.

  12. Efficient rare-event simulation for multiple jump events in regularly varying random walks and compound Poisson processes

    NARCIS (Netherlands)

    B. Chen (Bohan); J. Blanchet; C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2017-01-01

    textabstractWe propose a class of strongly efficient rare event simulation estimators for random walks and compound Poisson processes with a regularly varying increment/jump-size distribution in a general large deviations regime. Our estimator is based on an importance sampling strategy that hinges

  13. Levy flights and random searches

    Energy Technology Data Exchange (ETDEWEB)

    Raposo, E P [Laboratorio de Fisica Teorica e Computacional, Departamento de Fisica, Universidade Federal de Pernambuco, Recife-PE, 50670-901 (Brazil); Buldyrev, S V [Department of Physics, Yeshiva University, New York, 10033 (United States); Da Luz, M G E [Departamento de Fisica, Universidade Federal do Parana, Curitiba-PR, 81531-990 (Brazil); Viswanathan, G M [Instituto de Fisica, Universidade Federal de Alagoas, Maceio-AL, 57072-970 (Brazil); Stanley, H E [Center for Polymer Studies and Department of Physics, Boston University, Boston, MA 02215 (United States)

    2009-10-30

    In this work we discuss some recent contributions to the random search problem. Our analysis includes superdiffusive Levy processes and correlated random walks in several regimes of target site density, mobility and revisitability. We present results in the context of mean-field-like and closed-form average calculations, as well as numerical simulations. We then consider random searches performed in regular lattices and lattices with defects, and we discuss a necessary criterion for distinguishing true superdiffusion from correlated random walk processes. We invoke energy considerations in relation to critical survival states on the edge of extinction, and we analyze the emergence of Levy behavior in deterministic search walks. Finally, we comment on the random search problem in the context of biological foraging.

  14. Run charts revisited: a simulation study of run chart rules for detection of non-random variation in health care processes.

    Science.gov (United States)

    Anhøj, Jacob; Olesen, Anne Vingaard

    2014-01-01

    A run chart is a line graph of a measure plotted over time with the median as a horizontal line. The main purpose of the run chart is to identify process improvement or degradation, which may be detected by statistical tests for non-random patterns in the data sequence. We studied the sensitivity to shifts and linear drifts in simulated processes using the shift, crossings and trend rules for detecting non-random variation in run charts. The shift and crossings rules are effective in detecting shifts and drifts in process centre over time while keeping the false signal rate constant around 5% and independent of the number of data points in the chart. The trend rule is virtually useless for detection of linear drift over time, the purpose it was intended for.
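
    A minimal sketch of the two useful rules is given below: the longest run of points on one side of the median (shift rule) and the number of median crossings (crossings rule). The signal thresholds shown are illustrative placeholders, not the critical values derived in the study.

```python
# Sketch of run-chart tests for non-random variation: the longest run on one
# side of the median (shift rule) and the number of median crossings
# (crossings rule). Thresholds below are illustrative placeholders, not the
# exact critical values derived in the study.
import numpy as np

def run_chart_signals(y, max_run_limit, min_crossings):
    y = np.asarray(y, dtype=float)
    med = np.median(y)
    side = np.sign(y - med)
    side = side[side != 0]                 # points exactly on the median are skipped

    # longest run of consecutive points on the same side of the median
    longest, current = 1, 1
    for a, b in zip(side[:-1], side[1:]):
        current = current + 1 if a == b else 1
        longest = max(longest, current)

    crossings = int(np.sum(side[:-1] != side[1:]))
    return {"shift_signal": longest > max_run_limit,
            "crossings_signal": crossings < min_crossings,
            "longest_run": longest, "crossings": crossings}

rng = np.random.default_rng(0)
data = np.cumsum(rng.normal(0.1, 1.0, 24))     # a process with a slow upward drift
print(run_chart_signals(data, max_run_limit=8, min_crossings=8))
```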

  15. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, just like many other Sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice in analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, some covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; otherwise, using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are increasingly becoming popular in analysing survival data, particularly in the case of large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used in understanding factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. Thus the first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the

  16. Randomly forced CGL equation stationary measures and the inviscid limit

    CERN Document Server

    Kuksin, S

    2003-01-01

    We study a complex Ginzburg-Landau (CGL) equation perturbed by a random force which is white in time and smooth in the space variable~$x$. Assuming that $\\dim x\\le4$, we prove that this equation has a unique solution and discuss its asymptotic in time properties. Next we consider the case when the random force is proportional to the square root of the viscosity and study the behaviour of stationary solutions as the viscosity goes to zero. We show that, under this limit, a subsequence of solutions in question converges to a nontrivial stationary process formed by global strong solutions of the nonlinear Schr\\"odinger equation.

  17. On the Coupling Time of the Heat-Bath Process for the Fortuin-Kasteleyn Random-Cluster Model

    Science.gov (United States)

    Collevecchio, Andrea; Elçi, Eren Metin; Garoni, Timothy M.; Weigel, Martin

    2018-01-01

    We consider the coupling from the past implementation of the random-cluster heat-bath process, and study its random running time, or coupling time. We focus on hypercubic lattices embedded on tori, in dimensions one to three, with cluster fugacity at least one. We make a number of conjectures regarding the asymptotic behaviour of the coupling time, motivated by rigorous results in one dimension and Monte Carlo simulations in dimensions two and three. Amongst our findings, we observe that, for generic parameter values, the distribution of the appropriately standardized coupling time converges to a Gumbel distribution, and that the standard deviation of the coupling time is asymptotic to an explicit universal constant multiple of the relaxation time. Perhaps surprisingly, we observe these results to hold both off criticality, where the coupling time closely mimics the coupon collector's problem, and also at the critical point, provided the cluster fugacity is below the value at which the transition becomes discontinuous. Finally, we consider analogous questions for the single-spin Ising heat-bath process.

  18. Front propagation in flipping processes

    International Nuclear Information System (INIS)

    Antal, T; Ben-Avraham, D; Ben-Naim, E; Krapivsky, P L

    2008-01-01

    We study a directed flipping process that underlies the performance of the random edge simplex algorithm. In this stochastic process, which takes place on a one-dimensional lattice whose sites may be either occupied or vacant, occupied sites become vacant at a constant rate and simultaneously cause all sites to the right to change their state. This random process exhibits rich phenomenology. First, there is a front, defined by the position of the leftmost occupied site, that propagates at a nontrivial velocity. Second, the front involves a depletion zone with an excess of vacant sites; the total excess Δk increases logarithmically, Δk ≅ ln k, with the distance k from the front. Third, the front exhibits ageing: young fronts are vigorous but old fronts are sluggish. We investigate these phenomena using a quasi-static approximation, direct solutions of small systems and numerical simulations.

  19. A randomized controlled trial of an electronic informed consent process.

    Science.gov (United States)

    Rothwell, Erin; Wong, Bob; Rose, Nancy C; Anderson, Rebecca; Fedor, Beth; Stark, Louisa A; Botkin, Jeffrey R

    2014-12-01

    A pilot study assessed an electronic informed consent model within a randomized controlled trial (RCT). Participants who were recruited for the parent RCT project were randomly selected and randomized to either an electronic consent group (n = 32) or a simplified paper-based consent group (n = 30). Results from the electronic consent group reported significantly higher understanding of the purpose of the study, alternatives to participation, and who to contact if they had questions or concerns about the study. However, participants in the paper-based control group reported higher mean scores on some survey items. This research suggests that an electronic informed consent presentation may improve participant understanding for some aspects of a research study. © The Author(s) 2014.

  20. Process antecedents of challenging, under-cover and readily-adopted innovations.

    Science.gov (United States)

    Adams, Richard; Tranfield, David; Denyer, David

    2013-01-01

    The purpose of the study is to test the utility of a taxonomy of innovation based on perceived characteristics in the context of healthcare by exploring the extent to which discrete innovation types could be distinguished from each other in terms of process antecedents. A qualitative approach was adopted to explore the process antecedents of nine exemplar cases of "challenging", "under-cover" and "readily-adopted" healthcare innovations. Data were collected by semi-structured interview and from secondary sources, and content analysed according to a theoretically informed framework of innovation process. Cluster analysis was applied to determine whether innovation types could be distinguished on the basis of process characteristics. The findings provide moderate support for the proposition that innovations differentiated on the basis of the way they are perceived by potential users exhibit different process characteristics. Innovations exhibiting characteristics previously believed negatively to impact adoption may be successfully adopted but by a different configuration of processes than by innovations exhibiting a different set of characteristics. The findings must be treated with caution because the sample consists of self-selected cases of successful innovation and is limited by sample size. Nevertheless, the study sheds new light on important process differences in healthcare innovation. The paper offers a heuristic device to aid clinicians and managers to better understand the relatively novel task of promoting and managing innovation in healthcare. The paper advances the argument that there is under-exploited opportunity for cross-disciplinary organisational learning for innovation management in the NHS. If efficiency and quality improvement targets are to be met through a strategy of encouraging innovation, it may be advantageous for clinicians and managers to reflect on what this study found mostly to be absent from the processes of the innovations studied

  1. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  2. Parametric interaction of waves in the plasma with random large-scale inhomogeneities

    International Nuclear Information System (INIS)

    Abramovich, B.S.; Tamojkin, V.V.

    1980-01-01

    Parametric processes of the decay and fusion of three waves are considered in a weakly turbulent plasma with random inhomogeneities whose size is large compared with the wavelengths. Within the range of applicability of the diffusive approximation, closed equations are obtained which determine the behaviour of all the intensity moments of the parametrically coupled waves. It is shown that, when the characteristic length of multiple scattering is considerably less than the nonlinear interaction length, the effective growth increment of the average intensity and of its moments in decay processes is small compared with the homogeneous-plasma case. For fusion processes the same increment (decrement) determines the distance at which all intensity moments reach the saturation regime.

  3. Random walks in Euclidean space

    OpenAIRE

    Varjú, Péter Pál

    2012-01-01

    Consider a sequence of independent random isometries of Euclidean space with a previously fixed probability law. Apply these isometries successively to the origin and consider the sequence of random points that we obtain this way. We prove a local limit theorem under a suitable moment condition and a necessary non-degeneracy condition. Under stronger hypothesis, we prove a limit theorem on a wide range of scales: between e^(-cl^(1/4)) and l^(1/2), where l is the number of steps.

  4. Rayleigh scattering under light-atom coherent interaction

    OpenAIRE

    Takamizawa, Akifumi; Shimoda, Koichi

    2012-01-01

    Semi-classical calculation of an oscillating dipole induced in a two-level atom indicates that spherical radiation from the dipole under coherent interaction, i.e., Rayleigh scattering, has a power level comparable to that of spontaneous emission resulting from an incoherent process. Whereas spontaneous emission is nearly isotropic and has random polarization generally, Rayleigh scattering is strongly anisotropic and polarized in association with incident light. In the case where Rabi frequen...

  5. A Time-Variant Reliability Model for Copper Bending Pipe under Seawater-Active Corrosion Based on the Stochastic Degradation Process

    Directory of Open Access Journals (Sweden)

    Bo Sun

    2018-03-01

    Full Text Available In a degradation process, the randomness and multiplicity of the variables are difficult to describe with mathematical models, yet they are common in engineering and cannot be neglected, so it is necessary to study this issue in depth. In this paper, the copper bending pipe in seawater piping systems is taken as the analysis object, and its time-variant reliability is calculated by solving the interference between limit strength and maximum stress. We performed degradation and tensile experiments on the copper material and obtained the limit strength at each point in time. In addition, degradation experiments on the copper bending pipe were carried out to obtain the wall thickness at each point in time, and the maximum stress response was then calculated by simulation. Further, with the help of a Monte Carlo method we propose, the time-variant reliability of the copper bending pipe was calculated based on the stochastic degradation process and interference theory. Compared with traditional methods and verified against maintenance records, the results show that the time-variant reliability model based on the stochastic degradation process proposed in this paper is better suited to this reliability analysis, and it allows the replacement cycle of copper bending pipe under seawater-active corrosion to be predicted more conveniently and accurately.
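
    A minimal sketch of the stress-strength interference idea is given below: the limit strength degrades along a stochastic path while the maximum stress is random, and reliability at time t is estimated as the Monte Carlo probability that strength still exceeds stress. All distributions and parameter values are illustrative assumptions, not the paper's experimental data.

```python
# Minimal sketch of time-variant reliability via stress-strength interference:
# limit strength degrades along a stochastic (gamma-process-like) path while the
# maximum stress is random; reliability at time t is P(strength(t) > stress).
# All distributions and parameters are illustrative, not the paper's data.
import numpy as np

rng = np.random.default_rng(7)
n_samples, years = 100_000, np.arange(0, 21)

strength0 = rng.normal(330.0, 15.0, n_samples)        # initial limit strength [MPa]
stress    = rng.normal(180.0, 20.0, n_samples)        # maximum working stress [MPa]

reliability = []
degradation = np.zeros(n_samples)
for t in years:
    if t > 0:                                          # accumulate random yearly loss
        degradation += rng.gamma(shape=2.0, scale=3.0, size=n_samples)
    reliability.append(np.mean(strength0 - degradation > stress))

for t, r in zip(years, reliability):
    print(f"t = {t:2d} a   R(t) = {r:.4f}")
```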

  6. Multi-fidelity Gaussian process regression for prediction of random fields

    International Nuclear Information System (INIS)

    Parussini, L.; Venturi, D.; Perdikaris, P.; Karniadakis, G.E.

    2017-01-01

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.
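
    A common, simple way to realise the recursive idea is sketched below for two fidelity levels: a Gaussian process is fitted to cheap low-fidelity data, and its prediction is appended as an extra input to a second Gaussian process trained on scarce high-fidelity data. The sketch uses scikit-learn and the classic Forrester test functions as stand-ins; it only works in the spirit of recursive co-kriging and is not the authors' implementation.

```python
# Sketch of a simple two-fidelity GP regression in the spirit of recursive
# co-kriging: a GP is fit to cheap low-fidelity data, and its prediction is fed
# as an extra input to a second GP trained on scarce high-fidelity data.
# Illustrative only; not the authors' recursive co-kriging implementation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def f_high(x):                       # expensive "truth" (Forrester function)
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def f_low(x):                        # cheap, biased surrogate
    return 0.5 * f_high(x) + 10 * (x - 0.5) - 5

x_lo = np.linspace(0, 1, 21).reshape(-1, 1)          # many low-fidelity samples
x_hi = np.array([0.0, 0.4, 0.6, 1.0]).reshape(-1, 1) # few high-fidelity samples

kernel_lo = ConstantKernel(1.0) * RBF(length_scale=0.2)
gp_lo = GaussianProcessRegressor(kernel=kernel_lo, normalize_y=True)
gp_lo.fit(x_lo, f_low(x_lo).ravel())

# augment the high-fidelity inputs with the low-fidelity GP prediction
X_hi_aug = np.hstack([x_hi, gp_lo.predict(x_hi).reshape(-1, 1)])
gp_hi = GaussianProcessRegressor(kernel=ConstantKernel(1.0) * RBF([0.2, 10.0]),
                                 normalize_y=True)
gp_hi.fit(X_hi_aug, f_high(x_hi).ravel())

x_test = np.linspace(0, 1, 5).reshape(-1, 1)
X_test_aug = np.hstack([x_test, gp_lo.predict(x_test).reshape(-1, 1)])
mean, std = gp_hi.predict(X_test_aug, return_std=True)
print(np.c_[x_test.ravel(), mean, std, f_high(x_test).ravel()])
```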

  7. Multi-fidelity Gaussian process regression for prediction of random fields

    Energy Technology Data Exchange (ETDEWEB)

    Parussini, L. [Department of Engineering and Architecture, University of Trieste (Italy); Venturi, D., E-mail: venturi@ucsc.edu [Department of Applied Mathematics and Statistics, University of California Santa Cruz (United States); Perdikaris, P. [Department of Mechanical Engineering, Massachusetts Institute of Technology (United States); Karniadakis, G.E. [Division of Applied Mathematics, Brown University (United States)

    2017-05-01

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.

  8. 75 FR 65707 - Notice Regarding Consideration and Processing of Applications for Financial Assistance Under the...

    Science.gov (United States)

    2010-10-26

    ... consideration and processing of applications for financial assistance under the RRIF Program. FOR FURTHER...) regarding FRA's consideration and processing of applications for financial assistance under the RRIF Program... DEPARTMENT OF TRANSPORTATION Federal Railroad Administration Notice Regarding Consideration and...

  9. Synthesis of Optimal Processing Pathway for Microalgae-based Biorefinery under Uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Lee, Jay H.; Gani, Rafiqul

    2015-01-01

    The research in the field of microalgae-based biofuels and chemicals is in an early phase of development, and therefore a wide range of uncertainties exist due to inconsistencies among and shortage of technical information. In order to handle and address these uncertainties to ensure robust decision making, we propose a systematic framework for the synthesis and optimal design of microalgae-based processing networks under uncertainty. By incorporating major uncertainties into the biorefinery superstructure model we developed previously, a stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery structure under given parameter uncertainties modelled as sampled scenarios. The solution to the sMINLP problem determines the optimal decisions with respect to processing technologies, material flows, and product portfolio in the presence of these uncertainties.

  10. Auditory Magnetoencephalographic Frequency-Tagged Responses Mirror the Ongoing Segmentation Processes Underlying Statistical Learning.

    Science.gov (United States)

    Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe

    2017-03-01

    Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were presented in random order. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses developed rapidly, reaching significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.

  11. Groupies in multitype random graphs

    OpenAIRE

    Shang, Yilun

    2016-01-01

    A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.

  12. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    Science.gov (United States)

    Howard, Roy

    2017-10-01

    An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first exposure to the subject at an undergraduate level. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
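
    As a small worked example of spectral characterisation, the sketch below simulates a random telegraph signal (a two-state process switching at a Poisson rate) and estimates its power spectral density with an averaged periodogram, which should approach the expected Lorentzian shape. The sample rate and switching rate are illustrative assumptions.

```python
# Sketch: simulate a random telegraph signal (switching between +1/-1 at rate
# lam) and estimate its power spectral density with an averaged periodogram.
# The expected PSD is Lorentzian; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
fs, n, lam = 1000.0, 2**14, 20.0          # sample rate [Hz], samples, switch rate [1/s]
dt = 1.0 / fs

# switching events form a Poisson process: flip with probability lam*dt per step
flips = rng.random((64, n)) < lam * dt
x = np.where(np.cumsum(flips, axis=1) % 2 == 0, 1.0, -1.0)   # 64 realisations

# averaged periodogram estimate of the PSD
X = np.fft.rfft(x, axis=1)
psd = np.mean(np.abs(X) ** 2, axis=0) * dt / n
freqs = np.fft.rfftfreq(n, d=dt)
print(freqs[1:6], psd[1:6])               # low-frequency plateau of the Lorentzian
```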

  13. Tunable random packings

    International Nuclear Information System (INIS)

    Lumay, G; Vandewalle, N

    2007-01-01

    We present an experimental protocol that allows one to tune the packing fraction η of a random pile of ferromagnetic spheres from a value close to the lower limit of random loose packing η_RLP ≅ 0.56 to the upper limit of random close packing η_RCP ≅ 0.64. This broad range of packing fraction values is obtained under normal gravity in air, by adjusting a magnetic cohesion between the grains during the formation of the pile. Attractive and repulsive magnetic interactions are found to strongly affect the internal structure and the stability of the sphere packing. After the formation of the pile, the induced cohesion is decreased continuously along a linear decreasing ramp. The controlled collapse of the pile is found to generate various and reproducible values of the random packing fraction η.

  14. An optimization methodology for identifying robust process integration investments under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden); Patriksson, Michael [Department of Mathematical Sciences, Chalmers University of Technology and Department of Mathematical Sciences, University of Gothenburg, SE-412 96 Goeteborg (Sweden)

    2009-02-15

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)

  15. An optimization methodology for identifying robust process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith; Patriksson, Michael

    2009-01-01

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)

  16. Investigation of hydrogen isotopes interaction processes with lithium under neutron irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Zaurbekova, Zhanna, E-mail: zaurbekova@nnc.kz [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Skakov, Mazhyn; Ponkratov, Yuriy; Kulsartov, Timur; Gordienko, Yuriy; Tazhibayeva, Irina; Baklanov, Viktor; Barsukov, Nikolay [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Chikhray, Yevgen [Institute of Experimental and Theoretical Physics of Kazakh National University, Almaty (Kazakhstan)

    2016-11-01

    Highlights: • Experiments on the helium and tritium generation and release processes under neutron irradiation from lithium saturated with deuterium are described. • The relative tritium and helium yields from the lithium sample at different levels of neutron irradiation are calculated. • It is concluded that the process mainly affecting tritium release from lithium is its interaction with lithium atoms to form lithium tritide. - Abstract: The paper describes experiments on the helium and tritium generation and release processes from lithium saturated with deuterium under neutron irradiation (in the temperature range from 473 to 773 K). Diagrams of two reactor experiments show the time dependences of the helium, DT, T2 and tritiated water partial pressures in the experimental chamber containing the lithium sample under investigation. From the experimental results, the relative tritium and helium yields from the lithium sample at different levels of neutron irradiation were calculated, and their time dependences were plotted. It was concluded that the process mainly affecting tritium release from lithium is its interaction with lithium atoms to form lithium tritide.

  17. Dynamics and bifurcations of random circle diffeomorphisms

    NARCIS (Netherlands)

    Zmarrou, H.; Homburg, A.J.

    2008-01-01

    We discuss iterates of random circle diffeomorphisms with identically distributed noise, where the noise is bounded and absolutely continuous. Using arguments of B. Deroin, V.A. Kleptsyn and A. Navas, we provide precise conditions under which random attracting fixed points or random attracting

  18. Response-only modal identification using random decrement algorithm with time-varying threshold level

    International Nuclear Information System (INIS)

    Lin, Chang Sheng; Tseng, Tse Chuan

    2014-01-01

    Modal identification from response data only is studied for structural systems under nonstationary ambient vibration. The topic of this paper is the estimation of modal parameters from nonstationary ambient vibration data by applying the random decrement algorithm with a time-varying threshold level. In the conventional random decrement algorithm, the threshold level for evaluating random dec signatures is defined as the standard deviation of the response data of the reference channel. In practice, however, random dec signatures may be distorted by the noise contained in the original response data. To improve the accuracy of identification, a modification of the sampling procedure in the random decrement algorithm is proposed for modal-parameter identification from nonstationary ambient response data. A time-varying threshold level is introduced for acquiring the sample time histories used in the averaging analysis; it is defined as the temporal root-mean-square function of the structural response, which can appropriately describe a wide variety of nonstationary behaviors encountered in reality, such as the time-varying amplitude (variance) of a nonstationary process in a seismic record. Numerical simulations confirm the validity and robustness of the proposed modal-identification method for nonstationary ambient response data under noisy conditions.
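
    A minimal sketch of the random decrement signature with a time-varying threshold is given below: the trigger level follows the running root-mean-square of the response rather than its global standard deviation, and triggered segments are averaged. The test signal, window length and trigger level are illustrative assumptions, not the paper's procedure.

```python
# Sketch of the random decrement technique with a time-varying threshold:
# the trigger level follows the running RMS of the (nonstationary) response
# instead of its global standard deviation. Illustrative only.
import numpy as np

def running_rms(x, win):
    """Temporal root-mean-square of x over a sliding window."""
    pad = win // 2
    xx = np.pad(x.astype(float) ** 2, pad, mode="edge")
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(xx, kernel, mode="same")[pad:pad + len(x)])

def random_decrement(x, seg_len, win=256, level=1.0):
    """Average segments starting where x up-crosses the time-varying threshold."""
    thr = level * running_rms(x, win)
    triggers = np.where((x[:-1] < thr[:-1]) & (x[1:] >= thr[1:]))[0] + 1
    triggers = triggers[triggers + seg_len <= len(x)]
    if len(triggers) == 0:
        return np.zeros(seg_len)
    return np.mean([x[i:i + seg_len] for i in triggers], axis=0)

# nonstationary ambient response: a damped 5 Hz oscillation in modulated noise
rng = np.random.default_rng(0)
t = np.arange(0, 60, 0.01)
envelope = 1.0 + 0.8 * np.sin(2 * np.pi * t / 30.0)       # slowly varying amplitude
x = (envelope * np.sin(2 * np.pi * 5 * t) * np.exp(-0.01 * t)
     + 0.3 * envelope * rng.normal(size=t.size))
signature = random_decrement(x, seg_len=400)
print(signature[:5])
```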

  19. Eye Movement Analysis of Information Processing under Different Testing Conditions.

    Science.gov (United States)

    Dillon, Ronna F.

    1985-01-01

    Undergraduates were given complex figural analogies items, and eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced the rule-governed information processing during inductive reasoning. (Author/GDC)

  20. Quantum-like Viewpoint on the Complexity and Randomness of the Financial Market

    Science.gov (United States)

    Choustova, Olga

    In economics and financial theory, analysts use random walk and more general martingale techniques to model behavior of asset prices, in particular share prices on stock markets, currency exchange rates and commodity prices. This practice has its basis in the presumption that investors act rationally and without bias, and that at any moment they estimate the value of an asset based on future expectations. Under these conditions, all existing information affects the price, which changes only when new information comes out. By definition, new information appears randomly and influences the asset price randomly. Corresponding continuous time models are based on stochastic processes (this approach was initiated in the thesis of [4]), see, e.g., the books of [33] and [37] for historical and mathematical details.

  1. An empirical test of pseudo random number generators by means of an exponential decaying process; Una prueba empirica de generadores de numeros pseudoaleatorios mediante un proceso de decaimiento exponencial

    Energy Technology Data Exchange (ETDEWEB)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A. [Facultad de Fisica e Inteligencia Artificial, Universidad Veracruzana, A.P. 475, Xalapa, Veracruz (Mexico); Mora F, L.E. [CIMAT, A.P. 402, 36000 Guanajuato (Mexico)]. e-mail: hcoronel@uv.mx

    2007-07-01

    Empirical tests for pseudo random number generators based on the use of processes or physical models have been successfully used and are considered as complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)
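
    A minimal sketch of such a test is given below: the generator under test drives a simulated exponential-decay process, and the surviving population is compared with the exact exponential law through a chi-square statistic. The sample sizes and decay constant are illustrative assumptions, not the authors' test design.

```python
# Sketch of an empirical PRNG check via an exponential-decay process: each of
# N nuclei decays in a time step with probability p = 1 - exp(-lam*dt); the
# simulated survival curve is compared with the exact exponential law by a
# chi-square statistic. Illustrative, not the authors' test procedure.
import numpy as np

def decay_chi2(rng, n0=100_000, lam=0.05, dt=1.0, steps=60):
    p = 1.0 - np.exp(-lam * dt)
    alive, observed = n0, []
    for _ in range(steps):
        decays = np.sum(rng.random(alive) < p)     # PRNG under test drives decays
        alive -= decays
        observed.append(alive)
    observed = np.array(observed, dtype=float)
    expected = n0 * np.exp(-lam * dt * np.arange(1, steps + 1))
    return np.sum((observed - expected) ** 2 / expected)

print("chi2 (PCG64):", decay_chi2(np.random.default_rng(1)))
# a deliberately bad "generator" would inflate the statistic far above ~steps
```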

  2. Methodology for optimization of process integration schemes in a biorefinery under uncertainty

    International Nuclear Information System (INIS)

    González-Cortés, Meilyn; Martínez-Martínez, Yenisleidys; Albernas-Carvajal, Yailet; Pedraza-Garciga, Julio; Morales-Zamora, Marlen (Departamento de Ingeniería Química, Facultad de Química y Farmacia, Universidad Central Marta Abreu de las Villas (Cuba))

    2017-01-01

    Uncertainty has a great impact on investment decisions, on the operability of plants and on the feasibility of integration opportunities in chemical processes. This paper presents the steps for optimizing process-integration investments under conditions of uncertainty. It shows the potential of sugarcane biomass for integration with several plants in a biorefinery scheme to obtain chemical products as well as thermal and electric energy. Among the factories with potential for this integration are pulp and paper mills, sugar factories and other derivative processes. These factories share common resources and also produce a variety of products that can be exchanged between them, so that certain products generated in one of them can serve as raw material in another plant. The methodology developed guides the identification of feasible investment projects under uncertainty. The objective function considered was the maximization of the net present value across the different scenarios generated from the integration scheme. (author)

  3. Quantum randomness and unpredictability

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)

    2017-06-15

    Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's "symbolic" derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  4. Computer generation of random deviates

    International Nuclear Information System (INIS)

    Cormack, John

    1991-01-01

    The need for random deviates arises in many scientific applications. In medical physics, Monte Carlo simulations have been used in radiology, radiation therapy and nuclear medicine. Specific instances include the modelling of x-ray scattering processes and the addition of random noise to images or curves in order to assess the effects of various processing procedures. Reliable sources of random deviates with statistical properties indistinguishable from true random deviates are a fundamental necessity for such tasks. This paper provides a review of computer algorithms which can be used to generate uniform random deviates and other distributions of interest to medical physicists, along with a few caveats relating to various problems and pitfalls which can occur. Source code listings for the generators discussed (in FORTRAN, Turbo-PASCAL and Data General ASSEMBLER) are available on request from the authors. 27 refs., 3 tabs., 5 figs
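
    Two of the classic transforms such reviews typically cover are sketched below: inverse-CDF sampling for exponential deviates and the Box-Muller transform for normal deviates, written here in Python rather than the FORTRAN, Pascal or assembler of the original listings.

```python
# Sketch of two classic transforms from uniform deviates: inverse-CDF sampling
# for the exponential distribution and the Box-Muller transform for the normal
# distribution. Illustrative Python, not the generators reviewed in the report.
import numpy as np

def exponential_deviates(u, rate=1.0):
    """Inverse CDF: if U ~ Uniform(0,1), then -ln(1-U)/rate ~ Exponential(rate)."""
    return -np.log1p(-u) / rate

def normal_deviates(u1, u2):
    """Box-Muller: two independent uniforms give two independent standard normals."""
    r = np.sqrt(-2.0 * np.log(1.0 - u1))     # 1 - u1 avoids log(0) since u1 < 1
    return r * np.cos(2 * np.pi * u2), r * np.sin(2 * np.pi * u2)

rng = np.random.default_rng(0)
u = rng.random(100_000)
e = exponential_deviates(u, rate=2.0)
z1, z2 = normal_deviates(rng.random(100_000), rng.random(100_000))
print(e.mean(), z1.mean(), z1.std())      # expect ~0.5, ~0, ~1
```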

  5. Random numbers from vacuum fluctuations

    International Nuclear Information System (INIS)

    Shi, Yicheng; Kurtsiefer, Christian; Chng, Brenda

    2016-01-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  6. Random numbers from vacuum fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com [Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117542 (Singapore); Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore); Chng, Brenda [Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore)

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  7. Analysis of covariance with pre-treatment measurements in randomized trials under the cases that covariances and post-treatment variances differ between groups.

    Science.gov (United States)

    Funatogawa, Takashi; Funatogawa, Ikuko; Shyr, Yu

    2011-05-01

    When primary endpoints of randomized trials are continuous variables, the analysis of covariance (ANCOVA) with pre-treatment measurements as a covariate is often used to compare two treatment groups. In the ANCOVA, equal slopes (coefficients of pre-treatment measurements) and equal residual variances are commonly assumed. However, random allocation guarantees only equal variances of pre-treatment measurements. Unequal covariances and variances of post-treatment measurements indicate unequal slopes and, usually, unequal residual variances. For non-normal data with unequal covariances and variances of post-treatment measurements, it is known that the ANCOVA with equal slopes and equal variances using an ordinary least-squares method provides an asymptotically normal estimator for the treatment effect. However, the asymptotic variance of the estimator differs from the variance estimated from a standard formula, and its property is unclear. Furthermore, the asymptotic properties of the ANCOVA with equal slopes and unequal variances using a generalized least-squares method are unclear. In this paper, we consider non-normal data with unequal covariances and variances of post-treatment measurements, and examine the asymptotic properties of the ANCOVA with equal slopes using the variance estimated from a standard formula. Analytically, we show that the actual type I error rate, thus the coverage, of the ANCOVA with equal variances is asymptotically at a nominal level under equal sample sizes. That of the ANCOVA with unequal variances using a generalized least-squares method is asymptotically at a nominal level, even under unequal sample sizes. In conclusion, the ANCOVA with equal slopes can be asymptotically justified under random allocation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Groupies in multitype random graphs.

    Science.gov (United States)

    Shang, Yilun

    2016-01-01

    A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.

  9. Intermittency and random matrices

    Science.gov (United States)

    Sokoloff, Dmitry; Illarionov, E. A.

    2015-08-01

    A spectacular phenomenon of intermittency, i.e. a progressive growth of higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development and relations between various findings in the field remained obscure. Contemporary results from the theory of the product of independent random matrices (the Furstenberg theory) allowed the elaboration of the phenomenon of intermittency in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.

  10. Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView

    International Nuclear Information System (INIS)

    Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane

    2000-01-01

    Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that fits the experimental conditions best

  11. Mean-Variance Optimization in Markov Decision Processes

    OpenAIRE

    Mannor, Shie; Tsitsiklis, John N.

    2011-01-01

    We consider finite horizon Markov decision processes under performance measures that involve both the mean and the variance of the cumulative reward. We show that either randomized or history-based policies can improve performance. We prove that the complexity of computing a policy that maximizes the mean reward under a variance constraint is NP-hard for some cases, and strongly NP-hard for others. We finally offer pseudo-polynomial exact and approximation algorithms.

  12. Scaling law of resistance fluctuations in stationary random resistor networks

    Science.gov (United States)

    Pennetta; Trefan; Reggiani

    2000-12-11

    In a random resistor network we consider the simultaneous evolution of two competing random processes consisting in breaking and recovering the elementary resistors with probabilities W_D and W_R. The condition W_R > W_D/(1 + W_D) leads to a stationary state, while in the opposite case the broken resistor fraction reaches the percolation threshold p_c. We study the resistance noise of this system under stationary conditions by Monte Carlo simulations. The variance of the resistance fluctuations is found to follow a scaling law |p - p_c|^(-κ_0) with κ_0 = 5.5. The proposed model quantitatively relates the defectiveness of a disordered medium to its electrical and excess-noise characteristics.
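
    A deliberately simplified sketch of the two competing processes is given below, acting on resistor states only and ignoring the network currents that the full model couples to: each intact resistor breaks with probability W_D per step and each broken one recovers with probability W_R. Under these simplifications the broken fraction settles into a stationary value with small fluctuations rather than running away to the percolation threshold; the lattice size and rates are illustrative assumptions.

```python
# Simplified sketch of the two competing processes acting on resistor states
# only (breaking with probability W_D, recovering with probability W_R per
# step), without solving the network currents of the full model.
import numpy as np

rng = np.random.default_rng(5)
n_resistors, steps = 100 * 100, 2000
W_D, W_R = 0.002, 0.02

broken = np.zeros(n_resistors, dtype=bool)
fraction = np.empty(steps)
for t in range(steps):
    u = rng.random(n_resistors)
    breaks = (~broken) & (u < W_D)        # intact resistors that fail this step
    recovers = broken & (u < W_R)         # broken resistors that heal this step
    broken = (broken | breaks) & ~recovers
    fraction[t] = broken.mean()

print("mean stationary broken fraction:", fraction[steps // 2:].mean())
print("variance of fluctuations:       ", fraction[steps // 2:].var())
```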

  13. Security of Semi-Device-Independent Random Number Expansion Protocols.

    Science.gov (United States)

    Li, Dan-Dan; Wen, Qiao-Yan; Wang, Yu-Kun; Zhou, Yu-Qian; Gao, Fei

    2015-10-27

    Semi-device-independent random number expansion (SDI-RNE) protocols require some truly random numbers in order to generate fresh ones, while making no assumptions about the internal working of the quantum devices except for the dimension of the Hilbert space. The generated randomness is certified by non-classical correlation in the prepare-and-measure test. Until now, the analytical relations between the amount of generated randomness and the degree of non-classical correlation, which are crucial for evaluating the security of SDI-RNE protocols, have not been clear under either the ideal condition or practical ones. In this paper, we first give the analytical relation between the above two factors under the ideal condition. We also derive the analytical relation under practical conditions, where the devices' behavior is not independent and identical in each round and there exists deviation in estimating the non-classical behavior of the devices. Furthermore, we choose a different randomness extractor (i.e., a two-universal random function) and give the security proof.

  14. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  15. Markov decision processes: a tool for sequential decision making under uncertainty.

    Science.gov (United States)

    Alagoz, Oguzhan; Hsu, Heather; Schaefer, Andrew J; Roberts, Mark S

    2010-01-01

    We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), which are powerful analytical tools used for sequential decision making under uncertainty that have been widely used in many industrial and manufacturing applications but are underutilized in medical decision making (MDM). We demonstrate the use of an MDP to solve a sequential clinical treatment problem under uncertainty. Markov decision processes generalize standard Markov models in that a decision process is embedded in the model and multiple decisions are made over time. Furthermore, they have significant advantages over standard decision analysis. We compare MDPs to standard Markov-based simulation models by solving the problem of the optimal timing of living-donor liver transplantation using both methods. Both models result in the same optimal transplantation policy and the same total life expectancies for the same patient and living donor. The computation time for solving the MDP model is significantly smaller than that for solving the Markov model. We briefly describe the growing literature of MDPs applied to medical decisions.
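
    A minimal sketch of solving a finite-horizon MDP by backward induction is given below. The toy "treat now versus wait" problem, its three health states and all transition probabilities and rewards are illustrative assumptions, not the liver-transplantation model discussed in the tutorial.

```python
# Minimal sketch of solving a finite-horizon MDP by backward induction
# (dynamic programming). The toy "treat now vs. wait" problem and its numbers
# are illustrative, not the liver-transplantation model from the tutorial.
import numpy as np

n_states, n_actions, horizon = 3, 2, 10          # states: 0 good, 1 sick, 2 dead
# P[a, s, s'] = transition probability, R[a, s] = immediate reward (e.g. QALYs)
P = np.array([
    [[0.85, 0.10, 0.05],    # action 0: wait
     [0.10, 0.70, 0.20],
     [0.00, 0.00, 1.00]],
    [[0.90, 0.05, 0.05],    # action 1: treat
     [0.30, 0.55, 0.15],
     [0.00, 0.00, 1.00]],
])
R = np.array([[1.0, 0.6, 0.0],                   # wait
              [0.9, 0.7, 0.0]])                  # treat (treatment burden in state 0)

V = np.zeros(n_states)                           # terminal values
policy = np.zeros((horizon, n_states), dtype=int)
for t in reversed(range(horizon)):
    Q = R + P @ V                                # Q[a, s] = R[a, s] + E[V(s')]
    policy[t] = np.argmax(Q, axis=0)
    V = np.max(Q, axis=0)

print("expected total reward from each state:", V)
print("optimal first-period action per state:", policy[0])
```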

  16. Global industrial impact coefficient based on random walk process and inter-country input-output table

    Science.gov (United States)

    Xing, Lizhi; Dong, Xianlei; Guan, Jun

    2017-04-01

    The input-output table describes the national economic system comprehensively and in detail, capturing a wealth of economic relationships, including supply and demand information among industrial sectors. Complex network theory, a framework for measuring the structure of complex systems, can describe the structural characteristics of the research object by measuring structural indicators of the social and economic system, revealing the relationship between the inner hierarchy and the external economic function. This paper builds GIVCN-WIOT models based on the World Input-Output Database in order to depict the topological structure of the Global Value Chain (GVC), and assumes that the competitive advantage of a nation is equal to the overall impact of its domestic sectors on the GVC. From the perspective of econophysics, the Global Industrial Impact Coefficient (GIIC) is proposed to measure national competitiveness in gaining information superiority and intermediate interests. Analysis of the GIVCN-WIOT models yields several insights, including the following: (1) sectors with higher Random Walk Centrality contribute more to transmitting value streams within the global economic system; (2) the Half-Value Ratio can be used to measure the robustness of an open macroeconomy in the process of globalization; (3) the positive correlation between GIIC and GDP indicates that a country's global industrial impact can reveal its international competitive advantage.

  17. A teachable moment communication process for smoking cessation talk: description of a group randomized clinician-focused intervention

    Directory of Open Access Journals (Sweden)

    Flocke Susan A

    2012-05-01

Background: Effective clinician-patient communication about health behavior change is one of the most important and most overlooked strategies to promote health and prevent disease. Existing guidelines for specific health behavior counseling have been created and promulgated, but not successfully adopted in primary care practice. Building on work focused on creating effective clinician strategies for prompting health behavior change in the primary care setting, we developed an intervention intended to enhance clinician communication skills to create and act on teachable moments for smoking cessation. In this manuscript, we describe the development and implementation of the Teachable Moment Communication Process (TMCP) intervention and the baseline characteristics of a group randomized trial designed to evaluate its effectiveness. Methods/Design: This group randomized trial includes thirty-one community-based primary care clinicians practicing in Northeast Ohio and 840 of their adult patients. Clinicians were randomly assigned to receive either the Teachable Moments Communication Process (TMCP) intervention for smoking cessation, or the delayed intervention. The TMCP intervention consisted of two 3-hour educational training sessions including didactic presentation, skill demonstration through video examples, skills practice with standardized patients, and feedback from peers and the trainers. For each clinician enrolled, 12 patients were recruited for two time points. Pre- and post-intervention data from the clinicians, patients and audio-recorded clinician-patient interactions were collected. At baseline, the two groups of clinicians and their patients were similar with regard to all demographic and practice characteristics examined. Both physician and patient recruitment goals were met, and retention was 96% and 94%, respectively. Discussion: Findings support the feasibility of training clinicians to use the Teachable Moments Communication Process…

  18. Quantum random number generator

    Science.gov (United States)

    Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin

    2001-03-01

Since the reflection or transmission of a quantum particle at a beamsplitter is an inherently random quantum process, a device built on this principle suffers from the drawbacks of neither pseudo-random computer generators nor classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied. Fortunately, in a quantum optics realization they can be well controlled. We present a simple random number generator based on the division of weak light pulses at a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.
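
The principle can be mimicked conceptually in a few lines: each detected photon leaves the beamsplitter through one of two ports with probability 1/2 and is recorded as a bit. The simulation below only illustrates the idea; a pseudo-random generator obviously cannot replace the quantum device.

```python
# Conceptual simulation of beamsplitter-based bit generation.
import numpy as np

rng = np.random.default_rng(seed=42)
bits = (rng.random(10_000) < 0.5).astype(np.uint8)   # port "1" with probability 0.5

ones = int(bits.sum())
print(f"ones: {ones}, zeros: {bits.size - ones}")     # should be roughly balanced
```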

  19. Designing management strategies for carbon dioxide storage and utilization under uncertainty using inexact modelling

    Science.gov (United States)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2017-06-01

Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The research objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting the CCUS management system under random circumstances. The major advantage of the ECCP model is that it treats random variables as bi-random variables with a normal distribution whose mean values themselves follow a normal distribution. This avoids irrational assumptions and oversimplifications in the process of parameter design and enriches the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which makes it convenient for decision makers to rank the solution set using the natural order of real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could be useful in helping managers to design and generate rational CO2-allocation patterns under complexities and uncertainties.

  20. Theoretical Background for the Decision-Making Process Modelling under Controlled Intervention Conditions

    OpenAIRE

    Bakanauskienė Irena; Baronienė Laura

    2017-01-01

    This article is intended to theoretically justify the decision-making process model for the cases, when active participation of investing entities in controlling the activities of an organisation and their results is noticeable. Based on scientific literature analysis, a concept of controlled conditions is formulated, and using a rational approach to the decision-making process, a model of the 11-steps decision-making process under controlled intervention is presented. Also, there have been u...

  1. Group-buying inventory policy with demand under Poisson process

    Directory of Open Access Journals (Sweden)

    Tammarat Kleebmek

    2016-02-01

Group buying is a modern form of selling in an uncertain market. With the objective of minimizing the costs to sellers arising from ordering and reordering, we present in this paper a group-buying inventory model, with demand governed by a Poisson process and product sales following a binomial distribution. The inventory level is under continuous review, while the lead time is fixed. A numerical example is illustrated.
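
A generic continuous-review calculation with Poisson lead-time demand gives a feel for this kind of model. The demand rate, lead time and service level below are assumed values, and the sketch is not the paper's full group-buying formulation.

```python
# Generic sketch: reorder point for Poisson demand over a fixed lead time under
# continuous review, chosen to meet a target service level.
from scipy.stats import poisson

demand_rate_per_day = 4.0      # mean Poisson demand per day (assumed)
lead_time_days = 3             # fixed lead time (assumed)
service_level = 0.95           # target probability of no stockout during lead time

lead_time_mean = demand_rate_per_day * lead_time_days
reorder_point = int(poisson.ppf(service_level, mu=lead_time_mean))
print(f"lead-time demand mean = {lead_time_mean}, reorder point = {reorder_point}")
```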

  2. Effects of Fentanyl on Emergence Agitation in Children under Sevoflurane Anesthesia: Meta-Analysis of Randomized Controlled Trials

    Science.gov (United States)

    Xiong, Wei; Zhou, Qin; Yang, Peng; Huang, Xiongqing

    2015-01-01

Background and Objectives: The goal of this meta-analysis was to assess the effects of fentanyl on emergence agitation (EA) under sevoflurane anesthesia in children. Subjects and Methods: We searched electronic databases (PubMed, Embase, Web of Science and the Cochrane Central Register of Controlled Trials) for articles published until December 2014. Randomized controlled trials (RCTs) that assessed the effects of fentanyl versus placebo on EA under sevoflurane anesthesia in children, with outcomes including the incidence of EA, postoperative pain, emergence time or adverse effects, were included in this meta-analysis. Results: A total of 16 studies, including 1362 patients (737 in the fentanyl group and 625 in the placebo group), were evaluated in the final analysis. We found that administration of fentanyl decreased the incidence of EA (RR = 0.37, 95% CI 0.27~0.49) … fentanyl decreases the incidence of EA under sevoflurane anesthesia in children and reduces postoperative pain, but is associated with a higher incidence of PONV. Considering the inherent limitations of the included studies, more RCTs with extensive follow-up should be performed to validate our findings. PMID:26275039

  3. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which … indicates the signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose … algorithm generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed…
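
A minimum-spacing constraint is a typical example of such an ADC-driven restriction. The sketch below draws a random sampling pattern subject to a minimum gap between samples; the construction and parameters are illustrative, not the algorithm proposed in this record.

```python
# Sketch of a constrained random sampling pattern: choose K sample instants on a
# uniform time grid such that consecutive samples are at least d_min grid points apart.
import numpy as np

def constrained_pattern(n_grid, n_samples, d_min, rng):
    """Draw sampling points with a minimum-spacing constraint by distributing
    the remaining slack among the gaps (sorted random offsets)."""
    slack = n_grid - 1 - (n_samples - 1) * d_min
    if slack < 0:
        raise ValueError("constraint infeasible for these parameters")
    extra = np.sort(rng.integers(0, slack + 1, size=n_samples))
    return extra + d_min * np.arange(n_samples)

rng = np.random.default_rng(1)
pattern = constrained_pattern(n_grid=1000, n_samples=50, d_min=5, rng=rng)
print(pattern[:10], "min spacing:", np.diff(pattern).min())
```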

  4. Random skew plane partitions and the Pearcey process

    DEFF Research Database (Denmark)

    Reshetikhin, Nicolai; Okounkov, Andrei

    2007-01-01

    We study random skew 3D partitions weighted by q vol and, specifically, the q → 1 asymptotics of local correlations near various points of the limit shape. We obtain sine-kernel asymptotics for correlations in the bulk of the disordered region, Airy kernel asymptotics near a general point of the ...

  5. On a business cycle model with fractional derivative under narrow-band random excitation

    International Nuclear Information System (INIS)

    Lin, Zifei; Li, Jiaorui; Li, Shuang

    2016-01-01

This paper analyzes the dynamics of a business cycle model with a fractional derivative of order α (0 < α < 1) subject to narrow-band random excitation, in which the fractional derivative describes the memory property of the economic variables. Stochastic dynamical system concepts are integrated into the business cycle model for understanding economic fluctuation. Firstly, the method of multiple scales is applied to the model to obtain an approximate analytical solution. Secondly, the effects of economic policy with a fractional derivative on the amplitude of the economic fluctuation and on the stationary probability density are studied. The results show that macroeconomic regulation and control can lower the stable amplitude of economic fluctuation, although during the approach to the equilibrium state the amplitude is magnified. Macroeconomic regulation and control also improves the stability of the equilibrium state. Thirdly, how external stochastic perturbation affects the dynamics of the economic system is investigated.

  6. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

Light-emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. For different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes challenging. In this paper, we assume that the system has two performance characteristics, each governed by a random-effects Gamma process, where the random effects capture unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function. Via the copula function, a reliability assessment model is proposed. Because the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
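
A random-effects gamma process is easy to simulate, which helps to see what unit-to-unit differences mean here. The sketch below generates degradation paths with a unit-specific random scale parameter; the values are illustrative, and the Frank-copula dependence and MCMC inference of the paper are omitted.

```python
# Sketch: degradation paths from a gamma process with a unit-specific random effect
# on the scale parameter (capturing unit-to-unit variability). Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(7)
t_grid = np.linspace(0, 100, 51)          # inspection times
shape_rate = 0.2                          # increment shape per unit time (assumed)

def simulate_unit(rng):
    scale = rng.gamma(shape=5.0, scale=0.2)                        # random effect per unit
    increments = rng.gamma(shape=shape_rate * np.diff(t_grid), scale=scale)
    return np.concatenate(([0.0], np.cumsum(increments)))          # monotone degradation path

paths = np.array([simulate_unit(rng) for _ in range(5)])
print("degradation at t=100 for 5 units:", np.round(paths[:, -1], 2))
```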

  7. Omega-3 and -6 fatty acid supplementation and sensory processing in toddlers with ASD symptomology born preterm: A randomized controlled trial.

    Science.gov (United States)

    Boone, Kelly M; Gracious, Barbara; Klebanoff, Mark A; Rogers, Lynette K; Rausch, Joseph; Coury, Daniel L; Keim, Sarah A

    2017-12-01

Despite advances in the health and long-term survival of infants born preterm, they continue to face developmental challenges including higher risk for autism spectrum disorder (ASD) and atypical sensory processing patterns. This secondary analysis aimed to describe sensory profiles and explore effects of combined dietary docosahexaenoic acid (DHA), eicosapentaenoic acid (EPA), and gamma-linolenic acid (GLA) supplementation on parent-reported sensory processing in toddlers born preterm who were exhibiting ASD symptoms. This was a 90-day randomized, double-blinded, placebo-controlled trial of 31 children aged 18-38 months who were born at ≤29 weeks' gestation. Mixed effects regression analyses followed intent to treat and explored effects on parent-reported sensory processing measured by the Infant/Toddler Sensory Profile (ITSP). Baseline ITSP scores reflected atypical sensory processing, with the majority of atypical scores falling below the mean. Sensory processing sections: auditory (above=0%, below=65%), vestibular (above=13%, below=48%), tactile (above=3%, below=35%), oral sensory (above=10%; below=26%), visual (above=10%, below=16%); sensory processing quadrants: low registration (above=3%; below=71%), sensation avoiding (above=3%; below=39%), sensory sensitivity (above=3%; below=35%), and sensation seeking (above=10%; below=19%). Twenty-eight of 31 children randomized had complete outcome data. Although not statistically significant (p=0.13), the magnitude of the effect for reduction in behaviors associated with sensory sensitivity was medium to large (effect size=0.57). No other scales reflected a similar magnitude of effect size (range: 0.10 to 0.32). The findings provide support for larger randomized trials of omega fatty acid supplementation for children at risk of sensory processing difficulties, especially those born preterm. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG)

  9. Efficient Numerical Methods for Analysis of Square Ratio of κ-μ and η-μ Random Processes with Their Applications in Telecommunications

    Directory of Open Access Journals (Sweden)

    Gradimir V. Milovanović

    2018-01-01

We provide a statistical analysis of the square ratio of κ-μ and η-μ random processes and its application in the signal-to-interference ratio (SIR) based performance analysis of wireless transmission subjected to multipath fading, modelled by the κ-μ fading model, and to the undesired occurrence of co-channel interference (CCI), distributed as an η-μ random process. The first contribution of the paper is the derivation of exact closed-form expressions for the probability density function (PDF) and cumulative distribution function (CDF) of the square ratio of κ-μ and η-μ random processes. Further, the accuracy of these PDF and CDF expressions is verified by comparison with the corresponding approximations obtained by high-precision quadrature formulas of Gaussian type with respect to weight functions on (0,+∞). The computational procedure for such quadrature rules is provided by the constructive theory of orthogonal polynomials and the MATHEMATICA package OrthogonalPolynomials created by Cvetković and Milovanović (2004). Capitalizing on the obtained expressions, an important wireless performance criterion, namely the outage probability (OP), is obtained as a function of the transmission parameters. Also, possible performance improvement through selection combining (SC) reception is examined based on the obtained expressions.
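
The quadrature-based verification step can be illustrated with Gauss-Laguerre quadrature on (0, +∞). In the sketch below a gamma density stands in for the actual ratio density, and the node count is arbitrary.

```python
# Sketch: approximating an integral over (0, +inf) with Gauss-Laguerre quadrature.
import numpy as np
from math import gamma as gamma_fn

nodes, weights = np.polynomial.laguerre.laggauss(40)   # weight function e^{-x} on (0, inf)

k = 2.5                                                # shape of the stand-in gamma PDF
# integrand f(x) = x^{k-1} e^{-x} / Gamma(k); with the e^{-x} weight factored out,
# the Gauss-Laguerre sum uses g(x) = f(x) * e^{x} = x^{k-1} / Gamma(k).
g = nodes ** (k - 1) / gamma_fn(k)
integral = np.sum(weights * g)
print(f"quadrature estimate of the total probability: {integral:.6f}")   # ~ 1.0
```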

  10. Brownian Optimal Stopping and Random Walks

    International Nuclear Information System (INIS)

    Lamberton, D.

    2002-01-01

One way to compute the value function of an optimal stopping problem along Brownian paths consists of approximating Brownian motion by a random walk. We derive error estimates for this type of approximation under various assumptions on the distribution of the approximating random walk.
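
The approximation idea can be sketched directly: replace Brownian motion by a scaled symmetric random walk and compute the optimal stopping value by backward induction. The payoff and horizon below are illustrative; the error analysis itself is not reproduced.

```python
# Sketch: valuing an optimal stopping problem on a scaled symmetric random walk.
import numpy as np

T, n_steps = 1.0, 500
dt = T / n_steps
dx = np.sqrt(dt)                                        # step size so the walk ~ Brownian motion
payoff = lambda x: np.maximum(1.0 - np.exp(x), 0.0)     # an example payoff g(B_t)

# value at the final time on all reachable walk positions
values = payoff(dx * np.arange(-n_steps, n_steps + 1))
for step in range(n_steps, 0, -1):
    x = dx * np.arange(-(step - 1), step)               # positions at time step-1
    continuation = 0.5 * (values[2:] + values[:-2])     # E[V(next)] over the two moves
    values = np.maximum(payoff(x), continuation)        # stop or continue
print(f"approximate optimal stopping value at x=0: {values[0]:.4f}")
```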

  11. Quality assessment of baby food made of different pre-processed organic raw materials under industrial processing conditions.

    Science.gov (United States)

    Seidel, Kathrin; Kahl, Johannes; Paoletti, Flavio; Birlouez, Ines; Busscher, Nicolaas; Kretzschmar, Ursula; Särkkä-Tirkkonen, Marjo; Seljåsen, Randi; Sinesio, Fiorella; Torp, Torfinn; Baiamonte, Irene

    2015-02-01

The market for processed food is rapidly growing. The industry needs methods for "processing with care" leading to high-quality products in order to meet consumers' expectations. Processing influences the quality of the finished product through various factors. In carrot baby food, these are the raw material, the pre-processing and storage treatments as well as the processing conditions. In this study, a quality assessment was performed on baby food made from different pre-processed raw materials. The experiments were carried out under industrial conditions using fresh, frozen and stored organic carrots as raw material. Statistically significant differences were found for sensory attributes among the three autoclaved puree samples (e.g. overall odour F = 90.72, p …). Samples processed from frozen carrots show increased moisture content and a decrease of several chemical constituents. Biocrystallization identified changes between replications of the cooking. Pre-treatment of the raw material has a significant influence on the final quality of the baby food.

  12. Almond Consumption and Processing Affects the Composition of the Gastrointestinal Microbiota of Healthy Adult Men and Women: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Hannah D. Holscher

    2018-01-01

Background: Almond processing has been shown to differentially impact metabolizable energy; however, the effect of food form on the gastrointestinal microbiota is under-investigated. Objective: We aimed to assess the interrelationship of almond consumption and processing on the gastrointestinal microbiota. Design: A controlled-feeding, randomized, five-period, crossover study with washouts between diet periods was conducted in healthy adults (n = 18). Treatments included: (1) zero servings/day of almonds (control); (2) 1.5 servings (42 g)/day of whole almonds; (3) 1.5 servings/day of whole, roasted almonds; (4) 1.5 servings/day of roasted, chopped almonds; and (5) 1.5 servings/day of almond butter. Fecal samples were collected at the end of each three-week diet period. Results: Almond consumption increased the relative abundances of Lachnospira, Roseburia, and Dialister (p ≤ 0.05). Comparisons between control and the four almond treatments revealed that chopped almonds increased Lachnospira, Roseburia, and Oscillospira compared to control (p < 0.05), while whole almonds increased Dialister compared to control (p = 0.007). There were no differences between almond butter and control. Conclusions: These results reveal that almond consumption induced changes in the microbial community composition of the human gastrointestinal microbiota. Furthermore, the degree of almond processing (e.g., roasting, chopping, and grinding into butter) differentially impacted the relative abundances of bacterial genera.

  13. 77 FR 43492 - Expedited Vocational Assessment Under the Sequential Evaluation Process

    Science.gov (United States)

    2012-07-25

…or visit our Internet site, Social Security Online, at http://www.socialsecurity.gov. SUPPLEMENTARY… SOCIAL SECURITY ADMINISTRATION, 20 CFR Parts 404 and 416 [Docket No. SSA-2010-0060], RIN 0960-AH26: Expedited Vocational Assessment Under the Sequential Evaluation Process. AGENCY: Social Security…

  14. Certified randomness in quantum physics.

    Science.gov (United States)

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  15. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    Science.gov (United States)

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation which has distinct advantages compared to traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. The introduced Monte-Carlo simulations obtained by parallel computing point to the importance of computer implementations using very large-scale integration (VLSI) and analog platforms. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control

    DEFF Research Database (Denmark)

    Capaci, Francesca; Kulahci, Murat; Vanhatalo, Erik

    2017-01-01

Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. … Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system … responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing…

  17. Theoretical Background for the Decision-Making Process Modelling under Controlled Intervention Conditions

    Directory of Open Access Journals (Sweden)

    Bakanauskienė Irena

    2017-12-01

This article is intended to theoretically justify a decision-making process model for cases when active participation of investing entities in controlling the activities of an organisation and their results is noticeable. Based on an analysis of the scientific literature, a concept of controlled conditions is formulated, and using a rational approach to the decision-making process, a model of an 11-step decision-making process under controlled intervention is presented. Conditions describing the case of controlled interventions have also been unified, providing preconditions to ensure the adequacy of the proposed decision-making process model.

  18. Random practice - one of the factors of the motor learning process

    Directory of Open Access Journals (Sweden)

    Petr Valach

    2012-01-01

BACKGROUND: An important concept in acquiring motor skills is random practice (contextual interference - CI). The explanation of the effect of contextual interference is that memory has to work more intensively, and therefore it yields better retention of motor skills than blocked practice. Only active recall of a motor skill gives it practical value for appropriate use in the future. OBJECTIVE: The aim of this research was to determine the difference in how motor skills in sport gymnastics are acquired and retained using two different teaching methods - blocked and random practice. METHODS: Blocked and random practice of three selected gymnastics tasks were applied in two groups of physical education students (blocked practice: group BP; random practice: group RP) during two months, in one session a week (80 trials in total). At the end of the experiment and 6 months later (retention tests) the groups were tested on the selected gymnastics skills. RESULTS: No significant differences in the level of the gymnastics skills were found between the BP group and the RP group at the end of the experiment. However, the retention tests showed a significantly higher level of the gymnastics skills in the RP group in comparison with the BP group. CONCLUSION: The results confirmed that retention of the gymnastics skills was significantly higher with random practice than with blocked practice.

  19. Time distributions of solar energetic particle events: Are SEPEs really random?

    Science.gov (United States)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.
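
A simple version of such an analysis is to compare observed waiting times against the exponential distribution implied by a stationary Poisson process, for example with a Kolmogorov-Smirnov test. The data below are synthetic, and the p-value is only approximate because the rate is estimated from the same sample.

```python
# Sketch: KS test of waiting times against an exponential (stationary Poisson) model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
waiting_times = rng.weibull(0.7, size=300) * 30.0   # synthetic, non-exponential waiting times

scale = waiting_times.mean()                        # fitted exponential scale (1/rate)
result = stats.kstest(waiting_times, "expon", args=(0.0, scale))
print(f"KS statistic = {result.statistic:.3f}, approximate p-value = {result.pvalue:.3g}")
# A small p-value suggests the simple Poisson (exponential waiting time) model is inadequate.
```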

  20. The peeling process of infinite Boltzmann planar maps

    DEFF Research Database (Denmark)

    Budd, Timothy George

    2016-01-01

…criterion has a very simple interpretation. The finite random planar maps under consideration were recently proved to possess a well-defined local limit known as the infinite Boltzmann planar map (IBPM). Inspired by recent work of Curien and Le Gall, we show that the peeling process on the IBPM can…

  1. Mapping Common Aphasia Assessments to Underlying Cognitive Processes and Their Neural Substrates.

    Science.gov (United States)

    Lacey, Elizabeth H; Skipper-Kallal, Laura M; Xing, Shihui; Fama, Mackenzie E; Turkeltaub, Peter E

    2017-05-01

    Understanding the relationships between clinical tests, the processes they measure, and the brain networks underlying them, is critical in order for clinicians to move beyond aphasia syndrome classification toward specification of individual language process impairments. To understand the cognitive, language, and neuroanatomical factors underlying scores of commonly used aphasia tests. Twenty-five behavioral tests were administered to a group of 38 chronic left hemisphere stroke survivors and a high-resolution magnetic resonance image was obtained. Test scores were entered into a principal components analysis to extract the latent variables (factors) measured by the tests. Multivariate lesion-symptom mapping was used to localize lesions associated with the factor scores. The principal components analysis yielded 4 dissociable factors, which we labeled Word Finding/Fluency, Comprehension, Phonology/Working Memory Capacity, and Executive Function. While many tests loaded onto the factors in predictable ways, some relied heavily on factors not commonly associated with the tests. Lesion symptom mapping demonstrated discrete brain structures associated with each factor, including frontal, temporal, and parietal areas extending beyond the classical language network. Specific functions mapped onto brain anatomy largely in correspondence with modern neural models of language processing. An extensive clinical aphasia assessment identifies 4 independent language functions, relying on discrete parts of the left middle cerebral artery territory. A better understanding of the processes underlying cognitive tests and the link between lesion and behavior may lead to improved aphasia diagnosis, and may yield treatments better targeted to an individual's specific pattern of deficits and preserved abilities.
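
The factor-extraction step can be illustrated with a principal components analysis on a score matrix. The sketch below uses random numbers in place of the real behavioural data and ignores details such as rotation and score standardization.

```python
# Sketch: extracting latent factors from a participants-by-tests score matrix with PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
scores = rng.normal(size=(38, 25))        # 38 participants x 25 tests (synthetic stand-in data)

pca = PCA(n_components=4)
factor_scores = pca.fit_transform(scores)         # per-participant factor scores
loadings = pca.components_                        # how each test loads on each factor
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("factor scores shape:", factor_scores.shape, "loadings shape:", loadings.shape)
```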

  2. Stationary responses of a Rayleigh viscoelastic system with zero barrier impacts under external random excitation.

    Science.gov (United States)

    Wang, Deli; Xu, Wei; Zhao, Xiangrong

    2016-03-01

This paper deals with the stationary responses of a Rayleigh viscoelastic system with zero-barrier impacts under external random excitation. First, the original stochastic viscoelastic system is converted to an equivalent stochastic system without viscoelastic terms by approximately adding equivalent stiffness and damping. Relying on a non-smooth transformation of the state variables, the above system is then replaced by a new system without an impact term. The stationary probability density functions of the system are then obtained analytically through the stochastic averaging method. By considering the effects of the biquadratic nonlinear damping coefficient and the noise intensity on the system responses, the effectiveness of the theoretical method is tested by comparing the analytical results with those generated from Monte Carlo simulations. Additionally, it is worth noting that some system parameters can induce the occurrence of stochastic P-bifurcation.

  3. Path Integral Formulation of Anomalous Diffusion Processes

    OpenAIRE

    Friedrich, Rudolf; Eule, Stephan

    2011-01-01

    We present the path integral formulation of a broad class of generalized diffusion processes. Employing the path integral we derive exact expressions for the path probability densities and joint probability distributions for the class of processes under consideration. We show that Continuous Time Random Walks (CTRWs) are included in our framework. A closed expression for the path probability distribution of CTRWs is found in terms of their waiting time distribution as the solution of a Dyson ...
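
A CTRW is straightforward to simulate, which helps to fix ideas about the process class being treated. The sketch below uses heavy-tailed (Pareto) waiting times and Gaussian jumps; it only illustrates the process, not the path-integral calculation.

```python
# Sketch: simulating a Continuous Time Random Walk (CTRW) with heavy-tailed waiting times.
import numpy as np

rng = np.random.default_rng(5)

def ctrw_position(t_final, alpha=0.7, rng=rng):
    """Position of a CTRW at time t_final; waiting times are Pareto-distributed."""
    t, x = 0.0, 0.0
    while True:
        wait = rng.pareto(alpha) + 1.0      # heavy-tailed waiting time (infinite mean for alpha <= 1)
        if t + wait > t_final:
            return x                        # no further jump before t_final
        t += wait
        x += rng.normal()                   # Gaussian jump length

positions = np.array([ctrw_position(100.0) for _ in range(2000)])
print(f"sample mean = {positions.mean():.3f}, sample variance = {positions.var():.3f}")
```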

  4. Optimal Decisions in a Single-Period Supply Chain with Price-Sensitive Random Demand under a Buy-Back Contract

    Directory of Open Access Journals (Sweden)

    Feng Wang

    2014-01-01

This paper studies a single-period supply chain with a buy-back contract under a Stackelberg game model, in which the supplier (leader) decides on the wholesale price and the retailer (follower) responds by determining the retail price and the order quantity. We analytically investigate the decentralized retailer's optimal decision. Our results demonstrate that the retailer has a unique optimal simultaneous decision on the retail price and the order quantity, under a mild restriction on the demand distribution. Moreover, as it can be shown that the decentralized supply chain facing price-sensitive random demand cannot be coordinated with a buy-back contract, we propose a scheme for the system to achieve Pareto improvement. Theoretical analysis suggests that there exists a unique Pareto equilibrium for the supply chain. In particular, when the Pareto equilibrium is reached, the supply chain is coordinated. Numerical experiments confirm our results.

  5. Finding Order in Randomness: Single-Molecule Studies Reveal Stochastic RNA Processing | Center for Cancer Research

    Science.gov (United States)

    Producing a functional eukaryotic messenger RNA (mRNA) requires the coordinated activity of several large protein complexes to initiate transcription, elongate nascent transcripts, splice together exons, and cleave and polyadenylate the 3’ end. Kinetic competition between these various processes has been proposed to regulate mRNA maturation, but this model could lead to multiple, randomly determined, or stochastic, pathways or outcomes. Regulatory checkpoints have been suggested as a means of ensuring quality control. However, current methods have been unable to tease apart the contributions of these processes at a single gene or on a time scale that could provide mechanistic insight. To begin to investigate the kinetic relationship between transcription and splicing, Daniel Larson, Ph.D., of CCR’s Laboratory of Receptor Biology and Gene Expression, and his colleagues employed a single-molecule RNA imaging approach to monitor production and processing of a human β-globin reporter gene in living cells.

  6. Random mutagenesis of aspergillus niger and process optimization for enhanced production of glucose oxidase

    International Nuclear Information System (INIS)

    Haq, I.; Nawaz, A.; Mukhtar, A.N.H.; Mansoor, H.M.Z.; Ameer, S.M.

    2014-01-01

The study deals with the improvement of the wild strain Aspergillus niger IIB-31 through random mutagenesis using chemical mutagens. The main aim of the work was to enhance the glucose oxidase (GOX) yield of the wild strain (24.57 ± 0.01 U/g of cell mass) through random mutagenesis and process optimization. The wild strain of Aspergillus niger IIB-31 was treated with chemical mutagens such as ethyl methane sulphonate (EMS) and nitrous acid for this purpose. Ninety-eight mutagen-treated variants showing positive results were picked and screened for glucose oxidase production using submerged fermentation. The EMS-treated mutant strain E45 gave the highest glucose oxidase production (69.47 ± 0.01 U/g of cell mass), which was approximately 3-fold greater than the wild strain IIB-31. The preliminary cultural conditions for the production of glucose oxidase by strain E45 under submerged fermentation were also optimized. The highest yield of GOX was obtained using 8% glucose as the carbon source and 0.3% peptone as the nitrogen source at a medium pH of 7.0 after an incubation period of 72 h at 30 °C. (author)

  7. Application of computer picture processing to dynamic strain measurement under electromagnetic field

    International Nuclear Information System (INIS)

    Yagawa, G.; Soneda, N.

    1987-01-01

For the structural design of fusion reactors, it is very important to ensure the structural integrity of components under various dynamic loading conditions due to solid-electromagnetic field interaction, earthquakes, MHD effects and so on. As one of the experimental approaches to assessing dynamic fracture, we consider the strain measurement near a crack tip under a transient electromagnetic field, which in general involves several experimental difficulties. The authors have developed a strain measurement method using a picture processing technique. In this method, the locations of marks printed on the surface of a specimen are determined by picture processing. The displacement field is interpolated using the mark displacements and finite elements. Finally, the strain distribution is calculated by differentiating the displacement field. In the present study, the method is improved and automated to apply it to the measurement of the dynamic strain distribution under an electromagnetic field. The effects of dynamic loading on the strain distribution are then investigated by comparing the dynamic results with the static ones. (orig./GL)

  8. PREFACE: The random search problem: trends and perspectives The random search problem: trends and perspectives

    Science.gov (United States)

    da Luz, Marcos G. E.; Grosberg, Alexander; Raposo, Ernesto P.; Viswanathan, Gandhi M.

    2009-10-01

    aircraft, a given web site). Regarding the nature of the searching drive, in certain instances, it can be guided almost entirely by external cues, either by the cognitive (memory) or detective (olfaction, vision, etc) skills of the searcher. However, in many situations the movement is non-oriented, being in essence a stochastic process. Therefore, in such cases (and even when a small deterministic component in the locomotion exists) a random search effectively defines the final rates of encounters. Hence, one reason underlying the richness of the random search problem relates just to the `ignorance' of the locations of the randomly located targets. Contrary to conventional wisdom, the lack of complete information does not necessarily lead to greater complexity. As an illustrative example, let us consider the case of complete information. If the positions of all target sites are known in advance, then the question of what sequential order to visit the sites so to reduce the energy costs of locomotion itself becomes a rather challenging problem: the famous `travelling salesman' optimization query, belonging to the NP-complete class of problems. The ignorance of the target site locations, however, considerably modifies the problem and renders it not amenable to be treated by purely deterministic computational methods. In fact, as expected, the random search problem is not particularly suited to search algorithms that do not use elements of randomness. So, only a statistical approach to the search problem can adequately deal with the element of ignorance. In other words, the incomplete information renders the search under-determined, i.e., it is not possible to find the `best' solution to the problem because all the information is not given. Instead, one must guess and probabilistic or stochastic strategies become unavoidable. Also, the random search problem bears a relation to reaction-diffusion processes, because the search involves a diffusive aspect, movement, as well as a

  9. Colonic stem cell data are consistent with the immortal model of stem cell division under non-random strand segregation.

    Science.gov (United States)

    Walters, K

    2009-06-01

Colonic stem cells are thought to reside towards the base of crypts of the colon, but their numbers and proliferation mechanisms are not well characterized. A defining property of stem cells is that they are able to divide asymmetrically, but it is not known whether they always divide asymmetrically (immortal model) or whether there are occasional symmetrical divisions (stochastic model). By measuring the diversity of methylation patterns in colon crypt samples, a recent study found evidence in favour of the stochastic model, assuming random segregation of stem cell DNA strands during cell division. Here, preferential segregation of the template strand, consistent with the 'immortal strand hypothesis', is considered, and its effect on the conclusions of previously published results is explored. For a sample of crypts, it is shown how, under the immortal model, to calculate the mean and variance of the number of unique methylation patterns allowing for non-random strand segregation and to compare them with those observed. The calculated mean and variance are consistent with an immortal model that incorporates non-random strand segregation for a range of stem cell numbers and levels of preferential strand segregation. Allowing for preferential strand segregation considerably alters previously published conclusions relating to stem cell numbers and turnover mechanisms. Evidence in favour of the stochastic model may not be as strong as previously thought.

  10. Experimentally generated randomness certified by the impossibility of superluminal signals.

    Science.gov (United States)

    Bierhorst, Peter; Knill, Emanuel; Glancy, Scott; Zhang, Yanbao; Mink, Alan; Jordan, Stephen; Rommal, Andrea; Liu, Yi-Kai; Christensen, Bradley; Nam, Sae Woo; Stevens, Martin J; Shalm, Lynden K

    2018-04-01

From dice to modern electronic circuits, there have been many attempts to build better devices to generate random numbers. Randomness is fundamental to security and cryptographic systems and to safeguarding privacy. A key challenge with random-number generators is that it is hard to ensure that their outputs are unpredictable 1-3. For a random-number generator based on a physical process, such as a noisy classical system or an elementary quantum measurement, a detailed model that describes the underlying physics is necessary to assert unpredictability. Imperfections in the model compromise the integrity of the device. However, it is possible to exploit the phenomenon of quantum non-locality with a loophole-free Bell test to build a random-number generator that can produce output that is unpredictable to any adversary that is limited only by general physical principles, such as special relativity 1-11. With recent technological developments, it is now possible to carry out such a loophole-free Bell test 12-14,22. Here we present certified randomness obtained from a photonic Bell experiment and extract 1,024 random bits that are uniformly distributed to within 10^-12. These random bits could not have been predicted according to any physical theory that prohibits faster-than-light (superluminal) signalling and that allows independent measurement choices. To certify and quantify the randomness, we describe a protocol that is optimized for devices that are characterized by a low per-trial violation of Bell inequalities. Future random-number generators based on loophole-free Bell tests may have a role in increasing the security and trust of our cryptographic systems and infrastructure.

  11. Evidence of different underlying processes in pattern recall and decision-making.

    Science.gov (United States)

    Gorman, Adam D; Abernethy, Bruce; Farrow, Damian

    2015-01-01

    The visual search characteristics of expert and novice basketball players were recorded during pattern recall and decision-making tasks to determine whether the two tasks shared common visual-perceptual processing strategies. The order in which participants entered the pattern elements in the recall task was also analysed to further examine the nature of the visual-perceptual strategies and the relative emphasis placed upon particular pattern features. The experts demonstrated superior performance across the recall and decision-making tasks [see also Gorman, A. D., Abernethy, B., & Farrow, D. (2012). Classical pattern recall tests and the prospective nature of expert performance. The Quarterly Journal of Experimental Psychology, 65, 1151-1160; Gorman, A. D., Abernethy, B., & Farrow, D. (2013a). Is the relationship between pattern recall and decision-making influenced by anticipatory recall? The Quarterly Journal of Experimental Psychology, 66, 2219-2236)] but a number of significant differences in the visual search data highlighted disparities in the processing strategies, suggesting that recall skill may utilize different underlying visual-perceptual processes than those required for accurate decision-making performance in the natural setting. Performance on the recall task was characterized by a proximal-to-distal order of entry of the pattern elements with participants tending to enter the players located closest to the ball carrier earlier than those located more distal to the ball carrier. The results provide further evidence of the underlying perceptual processes employed by experts when extracting visual information from complex and dynamic patterns.

  12. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that describes the information flow. Then the uncertainty of the information is quantified using Conant's model, an information-theoretic approach. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload.

  13. Stochastic Dominance under the Nonlinear Expected Utilities

    Directory of Open Access Journals (Sweden)

    Xinling Xiao

    2014-01-01

In 1947, von Neumann and Morgenstern introduced the well-known expected utility and the related axiomatic system (see von Neumann and Morgenstern (1953)). It is widely used in economics, for example in financial economics. But the well-known Allais paradox (see Allais (1979)) shows that the linear expected utility has some limitations. Because of this, Peng proposed a concept of nonlinear expected utility (see Peng (2005)). In this paper we propose a concept of stochastic dominance under nonlinear expected utilities. We give sufficient conditions under which a random choice X stochastically dominates a random choice Y under nonlinear expected utilities. We also provide sufficient conditions under which a random choice X strictly stochastically dominates a random choice Y under sublinear expected utilities.
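
For contrast with the nonlinear-expectation notion studied in the paper, the classical (linear-expectation) first-order stochastic dominance criterion can be checked empirically as below; this sketch is not the sublinear criterion of the paper.

```python
# Sketch: empirical check of classical first-order stochastic dominance
# (X dominates Y if F_X(t) <= F_Y(t) for all t).
import numpy as np

def fsd_dominates(x, y):
    """True if the empirical CDF of x lies at or below that of y everywhere."""
    grid = np.sort(np.concatenate([x, y]))
    F_x = np.searchsorted(np.sort(x), grid, side="right") / x.size
    F_y = np.searchsorted(np.sort(y), grid, side="right") / y.size
    return bool(np.all(F_x <= F_y + 1e-12))

rng = np.random.default_rng(11)
x = rng.normal(loc=1.0, size=5000)      # shifted upward, so it should dominate
y = rng.normal(loc=0.0, size=5000)
print("X first-order stochastically dominates Y:", fsd_dominates(x, y))
```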

  14. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)

  15. Randomizer for High Data Rates

    Science.gov (United States)

    Garon, Howard; Sank, Victor J.

    2018-01-01

NASA, as well as a number of other space agencies, now recognizes that the currently recommended CCSDS randomizer used for telemetry (TM) is too short. When multiple applications of the PN8 maximal length sequence (MLS) are required in order to fully cover a channel access data unit (CADU), spectral problems in the form of elevated spurious discretes (spurs) appear. Originally the randomizer was called a bit transition generator (BTG) precisely because it was thought that its primary value was to ensure sufficient bit transitions to allow the bit/symbol synchronizer to lock and remain locked. We (NASA) have shown that the old BTG concept is a limited view of the real value of the randomizer sequence and that the randomizer also aids in signal acquisition as well as minimizing the potential for false decoder lock. Under the guidelines considered here, there are multiple maximal length sequences over GF(2) which appear attractive in this application. Although there may be mitigating reasons why another MLS could be selected, one sequence in particular possesses a combination of desired properties which sets it apart from the others.
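
A maximal length sequence is generated by a linear feedback shift register whose feedback taps come from a primitive polynomial over GF(2). The sketch below uses the degree-8 primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 purely as an example; it is not necessarily the CCSDS randomizer polynomial.

```python
# Sketch: generating a maximal length sequence (MLS) with a Fibonacci LFSR over GF(2).
import numpy as np

def lfsr_mls(taps, degree, seed=1):
    """Return one full period (2^degree - 1 bits) of the LFSR output."""
    state = [(seed >> i) & 1 for i in range(degree)]   # any nonzero seed works
    out = []
    for _ in range(2 ** degree - 1):
        out.append(state[0])
        feedback = 0
        for t in taps:
            feedback ^= state[t]
        state = state[1:] + [feedback]
    return np.array(out, dtype=np.uint8)

seq = lfsr_mls(taps=[0, 2, 3, 4], degree=8)        # taps from x^8 + x^4 + x^3 + x^2 + 1
print("period:", seq.size, "ones:", int(seq.sum()))  # a degree-8 MLS has 128 ones, 127 zeros
```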

  16. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as ...

  17. High paraffin Kumkol petroleum processing under fuel and lubricant petroleum scheme

    International Nuclear Information System (INIS)

    Nadirov, N.K.; Konaev, Eh.N.

    1997-01-01

The technological feasibility of processing high-paraffin Kumkol petroleum under the fuel and lubricant scheme, with production of lubricant materials in short supply, combustible materials and technical paraffin, is shown. Putting a mini petroleum-processing unit into operation at the Kumkol deposit is economically reasonable and raises the profitability of hydrocarbon raw material production. (author)

  18. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

…to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions, which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3-span bridge. By applying both methods … that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation…

  19. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

    1997-01-01

…to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions, which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3-span bridge. By applying both methods … that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation…
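
The controlled averaging of time segments mentioned in these two records can be sketched as follows: segments of the response are collected whenever it up-crosses a trigger level and then averaged. The signal below is synthetic, and the subsequent Fourier-transform/FRF step is not shown.

```python
# Sketch of the basic Random Decrement averaging idea with a level-crossing trigger.
import numpy as np

rng = np.random.default_rng(8)
fs, n = 256.0, 60_000
t = np.arange(n) / fs
# synthetic lightly damped oscillatory response to broadband excitation (illustrative only)
kernel = np.exp(-2.0 * t[:2000]) * np.sin(2 * np.pi * 5.0 * t[:2000])
response = np.convolve(rng.normal(size=n), kernel, mode="same")

def random_decrement(x, trigger, segment_len):
    """Average segments of x starting at up-crossings of the trigger level."""
    starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    starts = starts[starts + segment_len <= x.size]
    segments = np.stack([x[s:s + segment_len] for s in starts])
    return segments.mean(axis=0), starts.size

rd, n_triggers = random_decrement(response, trigger=np.std(response), segment_len=512)
print(f"averaged {n_triggers} segments; RD function length = {rd.size}")
```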

  20. Random sets and random fuzzy sets as ill-perceived random variables an introduction for Ph.D. students and practitioners

    CERN Document Server

    Couso, Inés; Sánchez, Luciano

    2014-01-01

This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on its use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...

  1. A Dynamic Programming-Based Sustainable Inventory-Allocation Planning Problem with Carbon Emissions and Defective Item Disposal under a Fuzzy Random Environment

    Directory of Open Access Journals (Sweden)

    Kai Kang

    2018-01-01

There is a growing concern that business enterprises focus primarily on their economic activities and ignore the impact of these activities on the environment and society. This paper investigates a novel sustainable inventory-allocation planning model with carbon emissions and defective item disposal over multiple periods under a fuzzy random environment. In this paper, a carbon credit price and a carbon cap are proposed to demonstrate the effect of carbon emissions costs on the inventory-allocation network costs. The percentage of poor-quality products from manufacturers that need to be rejected is assumed to be fuzzy random. Because of the complexity of the model, a dynamic programming-based particle swarm optimization with multiple social learning structures (DP-based GLNPSO) and a fuzzy random simulation are proposed to solve the model. A case study is then given to demonstrate the efficiency and effectiveness of the proposed model and the DP-based GLNPSO algorithm. The results show that total costs across the inventory-allocation network vary with changes in the carbon cap and that reductions in carbon emissions can be utilized to gain greater profits.

  2. Research on Collapse Process of Cable-Stayed Bridges under Strong Seismic Excitations

    Directory of Open Access Journals (Sweden)

    Xuewei Wang

    2017-01-01

In order to present the collapse process and failure mechanism of long-span cable-stayed bridges under strong seismic excitations, a rail-cum-road steel truss cable-stayed bridge was selected as the engineering background, a collapse failure numerical model of the cable-stayed bridge was established based on the explicit dynamic finite element method (FEM), and the whole collapse process of the cable-stayed bridge was analyzed with three different seismic waves applied in the horizontal longitudinal direction. The numerical simulation shows that the collapse failure process and failure modes of the cable-stayed bridge under the three seismic waves are similar. Furthermore, the piers and the main pylons are the critical components contributing to the collapse of the cable-stayed bridge structure. The cables and the main girder are damaged as a consequence of the failure of the piers and main pylons during the collapse process, so the failure of cable and main girder components is not the main cause of the collapse of the cable-stayed bridge. The analysis results can provide a theoretical basis for collapse-resistance design and for determining the critical damage components of long-span highway and railway cable-stayed bridges in seismic vulnerability analysis.

  3. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are predefined by experts, increases the difficulty of effective complex event processing. It inevitably involves various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limitations of human subjective judgment. D numbers are a new mathematical tool for modeling such uncertainty, since they relax the condition that elements of the frame must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
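
The TOPSIS step itself is simple to sketch: alternatives are ranked by their relative closeness to an ideal solution. The decision matrix and weights below are invented, and the D-numbers and fuzzy modelling of the paper are not reproduced.

```python
# Minimal TOPSIS sketch for ranking alternatives by closeness to an ideal solution.
import numpy as np

decision = np.array([[7.0, 9.0, 9.0],      # rows: alternatives, columns: criteria (made up)
                     [8.0, 7.0, 8.0],
                     [9.0, 6.0, 7.0]])
weights = np.array([0.5, 0.3, 0.2])        # criteria weights (all treated as benefit criteria)

norm = decision / np.linalg.norm(decision, axis=0)          # vector normalisation
weighted = norm * weights
ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0)

d_plus = np.linalg.norm(weighted - ideal, axis=1)           # distance to the ideal solution
d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)     # distance to the anti-ideal solution
closeness = d_minus / (d_plus + d_minus)
print("closeness coefficients:", np.round(closeness, 3), "best alternative:", int(np.argmax(closeness)))
```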

  4. Random vibrations theory and practice

    CERN Document Server

    Wirsching, Paul H; Ortiz, Keith

    1995-01-01

    Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena— from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...

  5. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
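
    As a rough illustration of the moving average and autoregressive models discussed above (not the paper's FORTRAN algorithm), the following Python sketch simulates an AR(1) and an MA(1) process with assumed parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(phi, sigma, n):
    """AR(1): x[t] = phi * x[t-1] + e[t], with e[t] ~ N(0, sigma^2)."""
    x = np.zeros(n)
    e = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def simulate_ma1(theta, sigma, n):
    """MA(1): x[t] = e[t] + theta * e[t-1], with e[t] ~ N(0, sigma^2)."""
    e = rng.normal(0.0, sigma, n + 1)
    return e[1:] + theta * e[:-1]

ar = simulate_ar1(phi=0.8, sigma=1.0, n=1000)
ma = simulate_ma1(theta=0.5, sigma=1.0, n=1000)
print(ar.std(), ma.std())   # sample standard deviations of the two series
```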

  6. Mid-infrared optical parametric oscillator pumped by an amplified random fiber laser

    Science.gov (United States)

    Shang, Yaping; Shen, Meili; Wang, Peng; Li, Xiao; Xu, Xiaojun

    2017-01-01

    Recently, the concept of random fiber lasers has attracted a great deal of attention for its ability to generate incoherent light without a traditional laser resonator, which is free of mode competition and ensures a stationary, narrow-band, continuous modeless spectrum. In this Letter, we report the first, to the best of our knowledge, optical parametric oscillator (OPO) pumped by an amplified 1070 nm random fiber laser (RFL), used to generate stationary mid-infrared (mid-IR) laser output. The experiment realized watt-level laser output in the mid-IR range and operated relatively stably. The use of the RFL seed source allowed us to take advantage of its stable time-domain characteristics. The beam profile, spectrum and time-domain properties of the signal light were measured to analyze the frequency down-conversion process under this new pumping condition. The results suggest that the near-infrared (near-IR) signal light 'inherited' good beam performance from the pump light. These findings would benefit further development of optical parametric processes under different pumping conditions.

  7. Location-Dependent Query Processing Under Soft Real-Time Constraints

    Directory of Open Access Journals (Sweden)

    Zoubir Mammeri

    2009-01-01

    Full Text Available In recent years, mobile devices and applications have developed rapidly. In the database field, this development required methods to handle new query types such as location-dependent queries (i.e., queries whose results depend on the location of the query issuer). Although several studies have addressed problems related to location-dependent query processing, few have considered the timing requirements that may be associated with queries (i.e., that the query results must be delivered to mobile clients on time). The main objective of this paper is to propose a solution for location-dependent query processing under soft real-time constraints. Hence, we propose methods to take client location-dependency into account and to maximize the percentage of queries respecting their deadlines. We validate our proposal by implementing a prototype based on the Oracle DBMS. Performance evaluation results show that the proposed solution optimizes the percentage of queries meeting their deadlines and the communication cost.

  8. Will Mobile Diabetes Education Teams (MDETs) in primary care improve patient care processes and health outcomes? Study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Gucciardi Enza

    2012-09-01

    Full Text Available Abstract Background There is evidence to suggest that delivery of diabetes self-management support by diabetes educators in primary care may improve patient care processes and patient clinical outcomes; however, the evaluation of such a model in primary care is nonexistent in Canada. This article describes the design for the evaluation of the implementation of Mobile Diabetes Education Teams (MDETs) in primary care settings in Canada. Methods/design This study will use a non-blinded, cluster-randomized controlled trial stepped wedge design to evaluate the Mobile Diabetes Education Teams' intervention in improving patient clinical and care process outcomes. A total of 1,200 patient charts at participating primary care sites will be reviewed for data extraction. Eligible patients will be those aged ≥18 who have type 2 diabetes and a hemoglobin A1c (HbA1c) of ≥8%. Clusters (that is, primary care sites) will be randomized to the intervention and control group using a block randomization procedure with practice size as the blocking factor. A stepped wedge design will be used to sequentially roll out the intervention so that all clusters eventually receive the intervention. The time at which each cluster begins the intervention is randomized to one of the four roll-out periods (0, 6, 12, and 18 months). Clusters that are randomized into the intervention later will act as the control for those receiving the intervention earlier. The primary outcome measure will be the difference in the proportion of patients who achieve the recommended HbA1c target of ≤7% between intervention and control groups. Qualitative work (in-depth interviews with primary care physicians, MDET educators and patients, and MDET educators' field notes and debriefing sessions) will be undertaken to assess the implementation process and effectiveness of the MDET intervention. Trial registration ClinicalTrials.gov NCT01553266
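
    The cluster allocation scheme described above (block randomization of sites to stepped-wedge roll-out periods, blocked on practice size) can be sketched as follows. This is a hypothetical illustration with made-up site names and size strata, not the trial's actual allocation procedure.

```python
import numpy as np

rng = np.random.default_rng(42)

def stepped_wedge_assignment(site_sizes, n_steps=4):
    """Assign clusters (primary care sites) to stepped-wedge roll-out steps,
    blocking on practice size: within each size block the sites are randomly
    permuted and dealt out across the steps (0, 6, 12, 18 months)."""
    blocks = {}
    for site, size in site_sizes.items():
        blocks.setdefault(size, []).append(site)
    assignment = {}
    for size, sites in blocks.items():
        for i, site in enumerate(rng.permutation(sites)):
            assignment[site] = i % n_steps
    return assignment

# Hypothetical sites labelled by a size stratum.
sites = {"A": "small", "B": "small", "C": "large", "D": "large",
         "E": "small", "F": "large", "G": "small", "H": "large"}
print(stepped_wedge_assignment(sites))
```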

  9. Enhanced performance of denitrifying sulfide removal process under micro-aerobic condition

    International Nuclear Information System (INIS)

    Chen Chuan; Ren Nanqi; Wang Aijie; Liu Lihong; Lee, Duu-Jong

    2010-01-01

    The denitrifying sulfide removal (DSR) process with bio-granules comprising both heterotrophic and autotrophic denitrifiers can simultaneously convert nitrate, sulfide and acetate into di-nitrogen gas, elementary sulfur and carbon dioxide, respectively, at high loading rates. This study shows that the rate of sulfide oxidation to sulfur, as well as the reduction of nitrate to nitrite, is enhanced under a micro-aerobic condition. The presence of limited oxygen mitigated the inhibition effects of sulfide on denitrifier activities and enhanced the performance of the DSR granules. The advantages and disadvantages of applying the micro-aerobic condition to the DSR process are discussed.

  10. Continuous-time random walks on networks with vertex- and time-dependent forcing.

    Science.gov (United States)

    Angstmann, C N; Donnelly, I C; Henry, B I; Langlands, T A M

    2013-08-01

    We have investigated the transport of particles moving as random walks on the vertices of a network, subject to vertex- and time-dependent forcing. We have derived the generalized master equations for this transport using continuous time random walks, characterized by jump and waiting time densities, as the underlying stochastic process. The forcing is incorporated through a vertex- and time-dependent bias in the jump densities governing the random walking particles. As a particular case, we consider particle forcing proportional to the concentration of particles on adjacent vertices, analogous to self-chemotactic attraction in a spatial continuum. Our algebraic and numerical studies of this system reveal an interesting pair-aggregation pattern formation in which the steady state is composed of a high concentration of particles on a small number of isolated pairs of adjacent vertices. The steady states do not exhibit this pair aggregation if the transport is random on the vertices, i.e., without forcing. The manifestation of pair aggregation on a transport network may thus be a signature of self-chemotactic-like forcing.
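
    A minimal sketch of this kind of forced continuous-time random walk, assuming exponential waiting times, a small ring network and a jump bias proportional to neighbour occupancy; these are illustrative choices, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Small ring network: adjacency list for 6 vertices (purely illustrative).
neighbors = {v: [(v - 1) % 6, (v + 1) % 6] for v in range(6)}

def simulate_forced_ctrw(n_particles=2000, t_max=50.0, coupling=1.0):
    """CTRW on the vertices with a jump bias proportional to the number of
    particles on adjacent vertices (a self-chemotactic-like forcing).
    Waiting times are exponential here for simplicity."""
    position = rng.integers(0, 6, size=n_particles)
    clock = rng.exponential(1.0, size=n_particles)   # next jump time of each particle
    while True:
        i = int(np.argmin(clock))                    # next particle to jump
        t = clock[i]
        if t > t_max:
            break
        counts = np.bincount(position, minlength=6)
        nbrs = neighbors[int(position[i])]
        weights = 1.0 + coupling * counts[nbrs]
        position[i] = rng.choice(nbrs, p=weights / weights.sum())
        clock[i] = t + rng.exponential(1.0)
    return np.bincount(position, minlength=6)

print(simulate_forced_ctrw())   # vertex occupation numbers at t_max
```

    With coupling set to zero the occupation numbers stay roughly uniform; a positive coupling makes the counts markedly uneven, loosely echoing the aggregation behaviour described above.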

  11. Random-walk simulation of diffusion-controlled processes among static traps

    International Nuclear Information System (INIS)

    Lee, S.B.; Kim, I.C.; Miller, C.A.; Torquato, S. (Department of Mechanical and Aerospace Engineering and Department of Chemical Engineering, North Carolina State University, Raleigh, North Carolina 27695-7910)

    1989-01-01

    We present computer-simulation results for the trapping rate (rate constant) k associated with diffusion-controlled reactions among identical, static spherical traps distributed with an arbitrary degree of impenetrability using a Pearson random-walk algorithm. We specifically consider the penetrable-concentric-shell model in which each trap of diameter σ is composed of a mutually impenetrable core of diameter λσ, encompassed by a perfectly penetrable shell of thickness (1-λ)σ/2: λ=0 corresponding to randomly centered or ''fully penetrable'' traps and λ=1 corresponding to totally impenetrable traps. Trapping rates are calculated accurately from the random-walk algorithm at the extreme limits of λ (λ=0 and 1) and at an intermediate value (λ=0.8), for a wide range of trap densities. Our simulation procedure has a relatively fast execution time. It is found that k increases with increasing impenetrability at fixed trap concentration. These ''exact'' data are compared with previous theories for the trapping rate. Although a good approximate theory exists for the fully-penetrable-trap case, there are no currently available theories that can provide good estimates of the trapping rate for a moderate to high density of traps with nonzero hard cores (λ>0)
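
    A simplified sketch of a Pearson random walk among fully penetrable traps (the λ=0 limit only), estimating the mean number of steps before trapping, from which a trapping rate is commonly inferred. The trap density, trap diameter, step length and periodic-box treatment are all assumed values and a crude stand-in for the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(29)

def min_image_dist2(traps, x, box):
    """Squared distances from point x to all trap centres in a periodic box."""
    d = traps - x
    d -= box * np.round(d / box)
    return np.sum(d * d, axis=1)

def mean_survival_steps(trap_density=0.2, sigma=1.0, step=0.1,
                        n_walkers=300, box=10.0):
    """Pearson random walk (fixed step length, isotropic random direction)
    among randomly centred, fully penetrable spherical traps (lambda = 0).
    Returns the mean number of steps before the walker first enters a trap."""
    n_traps = rng.poisson(trap_density * box ** 3)
    traps = rng.uniform(0.0, box, size=(n_traps, 3))
    r2 = (0.5 * sigma) ** 2
    steps_to_trap = []
    for _ in range(n_walkers):
        while True:                                  # start outside all traps
            x = rng.uniform(0.0, box, size=3)
            if min_image_dist2(traps, x, box).min() > r2:
                break
        n = 0
        while min_image_dist2(traps, x, box).min() > r2:
            u = rng.normal(size=3)
            x = (x + step * u / np.linalg.norm(u)) % box
            n += 1
        steps_to_trap.append(n)
    return np.mean(steps_to_trap)

print(mean_survival_steps())
```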

  12. Fuel corrosion processes under waste disposal conditions

    International Nuclear Information System (INIS)

    Shoesmith, D.W.

    1999-09-01

    Under the oxidizing conditions likely to be encountered in the Yucca Mountain Repository, fuel dissolution is a corrosion process involving the coupling of the anodic dissolution of the fuel with the cathodic reduction of oxidants available within the repository. The oxidants potentially available to drive fuel corrosion are environmental oxygen, supplied by transport through the permeable rock of the mountain, and molecular and radical species produced by the radiolysis of available aerated water. The mechanism of these coupled anodic and cathodic reactions is reviewed in detail. While gaps in understanding remain, many kinetic features of these reactions have been studied in considerable detail, and a reasonably justified mechanism for fuel corrosion is available. The corrosion rate is determined primarily by environmental factors rather than the properties of the fuel. Thus, with the exception of an increase in rate due to an increase in surface area, pre-oxidation of the fuel has little effect on the corrosion rate

  13. Fuel corrosion processes under waste disposal conditions

    Energy Technology Data Exchange (ETDEWEB)

    Shoesmith, D.W. [Univ. of Western Ontario, Dept. of Chemistry, London, Ontario (Canada)

    1999-09-01

    Under the oxidizing conditions likely to be encountered in the Yucca Mountain Repository, fuel dissolution is a corrosion process involving the coupling of the anodic dissolution of the fuel with the cathodic reduction of oxidants available within the repository. The oxidants potentially available to drive fuel corrosion are environmental oxygen, supplied by transport through the permeable rock of the mountain, and molecular and radical species produced by the radiolysis of available aerated water. The mechanism of these coupled anodic and cathodic reactions is reviewed in detail. While gaps in understanding remain, many kinetic features of these reactions have been studied in considerable detail, and a reasonably justified mechanism for fuel corrosion is available. The corrosion rate is determined primarily by environmental factors rather than the properties of the fuel. Thus, with the exception of an increase in rate due to an increase in surface area, pre-oxidation of the fuel has little effect on the corrosion rate.

  14. Face processing pattern under top-down perception: a functional MRI study

    Science.gov (United States)

    Li, Jun; Liang, Jimin; Tian, Jie; Liu, Jiangang; Zhao, Jizheng; Zhang, Hui; Shi, Guangming

    2009-02-01

    Although the top-down perceptual process plays an important role in face processing, its neural substrate is still puzzling because the top-down stream is difficult to extract from activation patterns contaminated by bottom-up face perception input. In the present study, a novel paradigm instructing participants to detect faces in pure noise images is employed, which can efficiently eliminate the interference of bottom-up face perception in top-down face processing. By analyzing the map of functional connectivity with the right FFA, computed with conventional Pearson correlation, a possible face processing pattern induced by top-down perception can be obtained. Apart from the brain areas of the bilateral fusiform gyrus (FG), left inferior occipital gyrus (IOG) and left superior temporal sulcus (STS), which are consistent with the core system in the distributed cortical network for face perception, activation induced by top-down face processing is also found in regions including the anterior cingulate gyrus (ACC), right orbitofrontal cortex (OFC), left precuneus, right parahippocampal cortex, left dorsolateral prefrontal cortex (DLPFC), right frontal pole, bilateral premotor cortex, left inferior parietal cortex and bilateral thalamus. The results indicate that decision-making, attention, episodic memory retrieval and contextual associative processing networks cooperate with general face processing regions to process face information under top-down perception.

  15. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    Science.gov (United States)

    Pospisil, J.; Jakubik, P.; Machala, L.

    2005-11-01

    This article reports the suggestion, realization and verification of a newly developed means of measuring the noiseless and locally shift-invariant modulation transfer function (MTF) of a digital video camera in the usual incoherent visible region of optical intensity, in particular of its combined imaging, detection, sampling and digitizing steps, which are influenced by the additive and spatially discrete photodetector, aliasing and quantization noises. The method relies on the still-camera automatic working regime and a static, two-dimensional, spatially continuous light-reflection random target with white-noise properties. The theoretical justification of the random-target method is also presented, exploiting the proposed simulation model of the linear optical intensity response and the possibility of expressing the resultant MTF as a normalized and smoothed ratio of the ascertainable output and input power spectral densities. The random-target and resultant image data were obtained and processed on a PC with computation programs developed on the basis of MATLAB 6.5. The presented examples and other obtained results of the performed measurements demonstrate sufficient repeatability and acceptability of the described method for comparative evaluations of the performance of digital video cameras under various conditions.
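
    The core idea, expressing the MTF through the ratio of output and input power spectral densities of a white-noise random target, can be sketched as follows. The 5-pixel box blur stands in for a real camera and is purely illustrative; a real measurement would also smooth or average the spectral estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

def mtf_from_random_target(input_img, output_img):
    """Estimate a horizontal MTF profile as the square root of the ratio of
    output to input power spectral densities (a single, unsmoothed estimate;
    in practice the PSDs would be averaged over many frames or blocks)."""
    def psd(img):
        spec = np.fft.fftshift(np.fft.fft2(img - img.mean()))
        return np.abs(spec) ** 2
    p_in, p_out = psd(input_img), psd(output_img)
    row = p_in.shape[0] // 2                      # zero vertical-frequency row
    return np.sqrt(p_out[row] / np.maximum(p_in[row], 1e-12))

target = rng.normal(size=(256, 256))              # white-noise random target
kernel = np.ones(5) / 5.0                         # hypothetical camera: 5-pixel box blur
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, target)
mtf = mtf_from_random_target(target, blurred)
print(np.round(mtf[129:139], 2))                  # low horizontal frequencies (DC bin excluded)
```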

  16. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    Science.gov (United States)

    Godrèche, Claude

    2017-05-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows us to revisit past and recent works of the physics literature.
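
    A quick Monte Carlo look at the quantity studied here, the longest interval between zeros of a tied-down simple random walk; the walk length and sample size below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(7)

def longest_zero_free_interval(n):
    """Longest interval between successive zeros of a simple random walk of
    2n steps conditioned to end at the origin (tied-down walk), sampled by
    rejection."""
    while True:
        steps = rng.choice([-1, 1], size=2 * n)
        if steps.sum() == 0:
            break
    walk = np.concatenate(([0], np.cumsum(steps)))
    zeros = np.flatnonzero(walk == 0)
    return np.max(np.diff(zeros))

n = 200
samples = [longest_zero_free_interval(n) for _ in range(2000)]
print("mean longest interval as a fraction of 2n:", np.mean(samples) / (2 * n))
```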

  17. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    International Nuclear Information System (INIS)

    Godrèche, Claude

    2017-01-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows us to revisit past and recent works of the physics literature. (paper)

  18. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    Science.gov (United States)

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  19. Two-terminal reliability of a mobile ad hoc network under the asymptotic spatial distribution of the random waypoint model

    International Nuclear Information System (INIS)

    Chen, Binchao; Phillips, Aaron; Matis, Timothy I.

    2012-01-01

    The random waypoint (RWP) mobility model is frequently used in describing the movement pattern of mobile users in a mobile ad hoc network (MANET). As the asymptotic spatial distribution of nodes under a RWP model exhibits central tendency, the two-terminal reliability of the MANET is investigated as a function of the source node location. In particular, analytical expressions for one and two hop connectivities are developed as well as an efficient simulation methodology for two-terminal reliability. A study is then performed to assess the effect of nodal density and network topology on network reliability.
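
    A hedged sketch of the Monte Carlo estimation of two-terminal reliability: relay positions are drawn from a separable polynomial density with central tendency, used here only as a stand-in for the true asymptotic RWP spatial distribution, and connectivity is checked by breadth-first search. The node count, radio range and source/destination locations are assumed values.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(31)

def sample_rwp_like(n):
    """Relay node positions on the unit square, drawn by rejection sampling
    from the separable density f(x, y) = 36 x(1-x) y(1-y) -- a simple
    central-tendency stand-in for the asymptotic RWP spatial distribution."""
    pts = []
    while len(pts) < n:
        x, y, u = rng.random(3)
        if u * 2.25 < 36.0 * x * (1.0 - x) * y * (1.0 - y):   # 2.25 = density maximum
            pts.append((x, y))
    return np.array(pts)

def two_terminal_reliability(src, dst, n_nodes=30, radio_range=0.25, trials=1000):
    """Monte Carlo probability that src and dst are connected through the
    random relay nodes, with links between nodes closer than radio_range."""
    hits = 0
    for _ in range(trials):
        pts = np.vstack([src, sample_rwp_like(n_nodes), dst])
        d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
        adj = d2 <= radio_range ** 2
        seen, queue = {0}, deque([0])
        while queue:                                  # BFS from the source node
            v = queue.popleft()
            for w in np.flatnonzero(adj[v]):
                if int(w) not in seen:
                    seen.add(int(w))
                    queue.append(int(w))
        hits += int(len(pts) - 1 in seen)
    return hits / trials

print(two_terminal_reliability(np.array([0.1, 0.5]), np.array([0.9, 0.5])))
```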

  20. Algorithmic randomness, physical entropy, measurements, and the second law

    International Nuclear Information System (INIS)

    Zurek, W.H.

    1989-01-01

    Algorithmic information content is equal to the size -- in the number of bits -- of the shortest program for a universal Turing machine which can reproduce a state of a physical system. In contrast to the statistical Boltzmann-Gibbs-Shannon entropy, which measures ignorance, the algorithmic information content is a measure of the available information. It is defined without recourse to probabilities and can be regarded as a measure of randomness of a definite microstate. I suggest that the physical entropy S -- that is, the quantity which determines the amount of work ΔW that can be extracted in the cyclic isothermal expansion process through the equation ΔW = k_B TΔS -- is a sum of two contributions: the missing information measured by the usual statistical entropy and the known randomness measured by the algorithmic information content. The sum of these two contributions is a ''constant of motion'' in the process of a dissipationless measurement on an equilibrium ensemble. This conservation under a measurement, which can be traced back to the noiseless coding theorem of Shannon, is necessary to rule out the existence of a successful Maxwell's demon. 17 refs., 3 figs

  1. Hierarchical random additive process and logarithmic scaling of generalized high order, two-point correlations in turbulent boundary layer flow

    Science.gov (United States)

    Yang, X. I. A.; Marusic, I.; Meneveau, C.

    2016-06-01

    Townsend [Townsend, The Structure of Turbulent Shear Flow (Cambridge University Press, Cambridge, UK, 1976)] hypothesized that the logarithmic region in high-Reynolds-number wall-bounded flows consists of space-filling, self-similar attached eddies. Invoking this hypothesis, we express streamwise velocity fluctuations in the inertial layer in high-Reynolds-number wall-bounded flows as a hierarchical random additive process (HRAP): u_z^+ = ∑_{i=1}^{N_z} a_i. Here u is the streamwise velocity fluctuation, + indicates normalization in wall units, z is the wall normal distance, and the a_i's are independently, identically distributed random additives, each of which is associated with an attached eddy in the wall-attached hierarchy. The number of random additives is N_z ~ ln(δ/z), where δ is the boundary layer thickness and ln is the natural log. Due to its simplified structure, such a process leads to predictions of the scaling behaviors for various turbulence statistics in the logarithmic layer. Besides reproducing known logarithmic scaling of moments, structure functions, and the two-point correlation function ⟨u_z(x) u_z(x+r)⟩, new logarithmic laws in higher-order two-point statistics of u_z can be derived using the HRAP formalism. Supporting empirical evidence for the logarithmic scaling in such statistics is found from the Melbourne High Reynolds Number Boundary Layer Wind Tunnel measurements. We also show that, at high Reynolds numbers, the above mentioned new logarithmic laws can be derived by assuming the arrival of an attached eddy at a generic point in the flow field to be a Poisson process [Woodcock and Marusic, Phys. Fluids 27, 015104 (2015), 10.1063/1.4905301]. Taken together, the results provide new evidence supporting the essential ingredients of the attached eddy hypothesis to describe streamwise velocity fluctuations of large, momentum transporting eddies in wall-bounded turbulence, while observed deviations suggest the need for further extensions of the
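
    The HRAP construction itself is easy to simulate: draw N_z ≈ ln(δ/z) i.i.d. additives and sum them. The sketch below (Gaussian additives, arbitrary parameters) only illustrates the resulting logarithmic growth of the streamwise variance, not the full set of two-point laws derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

def hrap_samples(delta=1.0, z=0.01, n_samples=200000):
    """Hierarchical random additive process: u_z^+ is the sum of
    N_z ~ ln(delta/z) i.i.d. additives (Gaussian here for simplicity)."""
    n_z = max(int(np.log(delta / z)), 1)
    return rng.normal(0.0, 1.0, size=(n_samples, n_z)).sum(axis=1)

for z in (0.2, 0.1, 0.05, 0.02):
    u = hrap_samples(z=z)
    n_z = max(int(np.log(1.0 / z)), 1)
    # The variance grows linearly with N_z ~ ln(delta/z), i.e. logarithmically in z.
    print(f"z = {z:5.2f}   N_z = {n_z}   var(u+) = {u.var():.2f}")
```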

  2. Gaussian random bridges and a geometric model for information equilibrium

    Science.gov (United States)

    Mengütürk, Levent Ali

    2018-03-01

    The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T , 0) -bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.

  3. Absolute continuity under time shift of trajectories and related stochastic calculus

    CERN Document Server

    Löbus, Jörg-Uwe

    2017-01-01

    The text is concerned with a class of two-sided stochastic processes of the form X = W + A. Here W is a two-sided Brownian motion with random initial data at time zero and A ≡ A(W) is a function of W. Elements of the related stochastic calculus are introduced. In particular, the calculus is adjusted to the case when A is a jump process. Absolute continuity of (X,P) under time shift of trajectories is investigated. For example, under various conditions on the initial density with respect to the Lebesgue measure, m, and on A with A_0 = 0, we verify P(dX_{·-t})/P(dX_·) = (m(X_{-t})/m(X_0)) · ∏_i |…

  4. A random walk model for evaluating clinical trials involving serial observations.

    Science.gov (United States)

    Hopper, J L; Young, G P

    1988-05-01

    For clinical trials where the variable of interest is ordered and categorical (for example, disease severity, symptom scale), and where measurements are taken at intervals, it might be possible to achieve a greater discrimination between the efficacy of treatments by modelling each patient's progress as a stochastic process. The random walk is a simple, easily interpreted model that can be fitted by maximum likelihood using a maximization routine with inference based on standard likelihood theory. In general the model can allow for randomly censored data, incorporates measured prognostic factors, and inference is conditional on the (possibly non-random) allocation of patients. Tests of fit and of model assumptions are proposed, and application to two therapeutic trials of gastroenterological disorders are presented. The model gave measures of the rate of, and variability in, improvement for patients under different treatments. A small simulation study suggested that the model is more powerful than considering the difference between initial and final scores, even when applied to data generated by a mechanism other than the random walk model assumed in the analysis. It thus provides a useful additional statistical method for evaluating clinical trials.

  5. Micro-Texture Synthesis by Phase Randomization

    Directory of Open Access Journals (Sweden)

    Bruno Galerne

    2011-09-01

    Full Text Available This contribution is concerned with texture synthesis by example, the process of generating new texture images from a given sample. The Random Phase Noise algorithm presented here synthesizes a texture from an original image by simply randomizing its Fourier phase. It is able to reproduce textures which are characterized by their Fourier modulus, namely the random phase textures (or micro-textures.
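
    The Random Phase Noise idea, keeping the Fourier modulus of the sample and drawing a fresh uniform phase, takes only a few lines of numpy. The random input array stands in for a real texture image, and taking the real part of the inverse transform is a common shortcut in place of enforcing Hermitian symmetry of the phase field.

```python
import numpy as np

rng = np.random.default_rng(5)

def random_phase_texture(sample):
    """Synthesize a micro-texture by keeping the Fourier modulus of the
    grey-level sample and randomizing its phase."""
    modulus = np.abs(np.fft.fft2(sample))
    phase = rng.uniform(0.0, 2.0 * np.pi, size=sample.shape)
    return np.real(np.fft.ifft2(modulus * np.exp(1j * phase)))

sample = rng.random((128, 128))        # stand-in for a real input texture image
synthesized = random_phase_texture(sample)
print(sample.shape, synthesized.shape)
```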

  6. Reliability of Broadcast Communications Under Sparse Random Linear Network Coding

    OpenAIRE

    Brown, Suzie; Johnson, Oliver; Tassi, Andrea

    2018-01-01

    Ultra-reliable Point-to-Multipoint (PtM) communications are expected to become pivotal in networks offering future dependable services for smart cities. In this regard, sparse Random Linear Network Coding (RLNC) techniques have been widely employed to provide an efficient way to improve the reliability of broadcast and multicast data streams. This paper addresses the pressing concern of providing a tight approximation to the probability of a user recovering a data stream protected by this kin...

  7. Understanding Nutrient Processing Under Similar Hydrologic Conditions Along a River Continuum

    Science.gov (United States)

    Garayburu-Caruso, V. A.; Mortensen, J.; Van Horn, D. J.; Gonzalez-Pinzon, R.

    2015-12-01

    Eutrophication is one of the main causes of water impairment across the US. The fate of nutrients in streams is typically described by the dynamic coupling of physical processes and biochemical processes. However, isolating each of these processes and determining its contribution to the whole system is challenging due to the complexity of the physical, chemical and biological domains. We conducted column experiments seeking to understand nutrient processing in shallow sediment-water interactions along representative sites of the Jemez River-Rio Grande continuum (eight stream orders), in New Mexico (USA). For each stream order, we used a set of 6 columns packed with 3 different sediments, i.e., Silica Cone Density Sand ASTM D 1556 (0.075-2.00 mm), gravel (> 2mm) and native sediments from each site. We incubated the sediments for three months and performed tracer experiments in the laboratory under identical flow conditions, seeking to normalize the physical processes along the river continuum. We added a short-term pulse injection of NO3, resazurin and NaCl to each column and determined metabolism and NO3 processing using the Tracer Additions for Spiraling Curve Characterization method (TASCC). Our methods allowed us to study how changes in bacterial communities and sediment composition along the river continuum define nutrient processing.

  8. Age-dependent impairment of auditory processing under spatially focused and divided attention: an electrophysiological study.

    Science.gov (United States)

    Wild-Wall, Nele; Falkenstein, Michael

    2010-01-01

    By using event-related potentials (ERPs) the present study examines if age-related differences in preparation and processing especially emerge during divided attention. Binaurally presented auditory cues called for focused (valid and invalid) or divided attention to one or both ears. Responses were required to subsequent monaurally presented valid targets (vowels), but had to be suppressed to non-target vowels or invalidly cued vowels. Middle-aged participants were more impaired under divided attention than young ones, likely due to an age-related decline in preparatory attention following cues as was reflected in a decreased CNV. Under divided attention, target processing was increased in the middle-aged, likely reflecting compensatory effort to fulfill task requirements in the difficult condition. Additionally, middle-aged participants processed invalidly cued stimuli more intensely as was reflected by stimulus ERPs. The results suggest an age-related impairment in attentional preparation after auditory cues especially under divided attention and latent difficulties to suppress irrelevant information.

  9. Enhanced performance of denitrifying sulfide removal process under micro-aerobic condition.

    Science.gov (United States)

    Chen, Chuan; Ren, Nanqi; Wang, Aijie; Liu, Lihong; Lee, Duu-Jong

    2010-07-15

    The denitrifying sulfide removal (DSR) process with bio-granules comprising both heterotrophic and autotrophic denitrifiers can simultaneously convert nitrate, sulfide and acetate into di-nitrogen gas, elementary sulfur and carbon dioxide, respectively, at high loading rates. This study shows that the rate of sulfide oxidation to sulfur, as well as the reduction of nitrate to nitrite, is enhanced under a micro-aerobic condition. The presence of limited oxygen mitigated the inhibition effects of sulfide on denitrifier activities and enhanced the performance of the DSR granules. The advantages and disadvantages of applying the micro-aerobic condition to the DSR process are discussed. 2010 Elsevier B.V. All rights reserved.

  10. A process evaluation of the Supermarket Healthy Eating for Life (SHELf) randomized controlled trial.

    Science.gov (United States)

    Olstad, Dana Lee; Ball, Kylie; Abbott, Gavin; McNaughton, Sarah A; Le, Ha N D; Ni Mhurchu, Cliona; Pollard, Christina; Crawford, David A

    2016-02-24

    Supermarket Healthy Eating for Life (SHELf) was a randomized controlled trial that operationalized a socioecological approach to population-level dietary behaviour change in a real-world supermarket setting. SHELf tested the impact of individual (skill-building), environmental (20% price reductions), and combined (skill-building + 20% price reductions) interventions on women's purchasing and consumption of fruits, vegetables, low-calorie carbonated beverages and water. This process evaluation investigated the reach, effectiveness, implementation, and maintenance of the SHELf interventions. RE-AIM provided a conceptual framework to examine the processes underlying the impact of the interventions using data from participant surveys and objective sales data collected at baseline, post-intervention (3 months) and 6-months post-intervention. Fisher's exact, χ² and t-tests assessed differences in quantitative survey responses among groups. Adjusted linear regression examined the impact of self-reported intervention dose on food purchasing and consumption outcomes. Thematic analysis identified key themes within qualitative survey responses. Reach of the SHELf interventions to disadvantaged groups, and beyond study participants themselves, was moderate. Just over one-third of intervention participants indicated that the interventions were effective in changing the way they bought, cooked or consumed food (p < 0.001 compared to control), with no differences among intervention groups. Improvements in purchasing and consumption outcomes were greatest among those who received a higher intervention dose. Most notably, participants who said they accessed price reductions on fruits and vegetables purchased (519 g/week) and consumed (0.5 servings/day) more vegetables. The majority of participants said they accessed (82%) and appreciated discounts on fruits and vegetables, while there was limited use (40%) and appreciation of discounts on low-calorie carbonated

  11. Efficient Option Pricing under Levy Processes, with CVA and FVA

    Directory of Open Access Journals (Sweden)

    Jimmy eLaw

    2015-07-01

    Full Text Available We generalize the Piterbarg (2010) model to include (1) bilateral default risk, as in Burgard and Kjaer (2012), and (2) jumps in the dynamics of the underlying asset using general classes of Lévy processes of exponential type. We develop an efficient explicit-implicit scheme for European options and barrier options taking CVA-FVA into account. We highlight the importance of this work in the context of trading, pricing and managing a derivative portfolio given the trajectory of regulations.
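
    As a much simpler stand-in for the paper's explicit-implicit scheme (no CVA-FVA adjustments, and a Merton jump-diffusion taken as the exponential Lévy model), a plain risk-neutral Monte Carlo valuation of a European call might look like the sketch below; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(23)

def merton_mc_call(s0, k, r, sigma, lam, mu_j, sig_j, t, n_paths=200000):
    """European call under a Merton jump-diffusion (a simple exponential
    Levy model), priced by risk-neutral Monte Carlo simulation of S_T."""
    kappa = np.exp(mu_j + 0.5 * sig_j ** 2) - 1.0      # mean relative jump size
    n_jumps = rng.poisson(lam * t, n_paths)
    jump_sum = rng.normal(mu_j * n_jumps, sig_j * np.sqrt(n_jumps))
    z = rng.normal(0.0, 1.0, n_paths)
    drift = (r - lam * kappa - 0.5 * sigma ** 2) * t   # martingale-corrected drift
    st = s0 * np.exp(drift + sigma * np.sqrt(t) * z + jump_sum)
    payoff = np.maximum(st - k, 0.0)
    return np.exp(-r * t) * payoff.mean()

print(merton_mc_call(s0=100, k=100, r=0.02, sigma=0.2,
                     lam=0.3, mu_j=-0.1, sig_j=0.15, t=1.0))
```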

  12. Specialized rheumatology nurse substitutes for rheumatologists in the diagnostic process of fibromyalgia: a cost-consequence analysis and a randomized controlled trial

    NARCIS (Netherlands)

    Kroese, Mariëlle E.; Severens, Johan L.; Schulpen, Guy J.; Bessems, Monique C.; Nijhuis, Frans J.; Landewé, Robert B.

    2011-01-01

    To perform a cost-consequence analysis of the substitution of specialized rheumatology nurses (SRN) for rheumatologists (RMT) in the diagnostic process of fibromyalgia (FM), using both a healthcare and societal perspective and a 9-month period. Alongside a randomized controlled trial, we measured

  13. Description of two-process surface topography

    International Nuclear Information System (INIS)

    Grabon, W; Pawlus, P

    2014-01-01

    After two machining processes, a large number of surface topography measurements were made using Talyscan 150 stylus measuring equipment. The measured samples were divided into two groups. The first group contained two-process surfaces of random nature, while the second group contained random-deterministic textures with random plateau parts and portions of deterministic valleys. For comparison, one-process surfaces were also analysed. Correlation and regression analysis was used to study the dependencies among surface texture parameters in 2D and 3D systems. As a result of this study, sets of parameters describing multi-process surface topography were obtained for two-process surfaces of random and of random-deterministic types. (papers)

  14. From Performance to Decision Processes in 33 Years: A History of Organizational Behavior and Human Decision Processes under James C. Naylor.

    Science.gov (United States)

    Weber

    1998-12-01

    For the past 33 years, Organizational Behavior and Human Decision Processes has thrived under a single editor. That editor, James C. Naylor, is retiring from his long stewardship. This article chronicles the course of the journal under Jim's direction and marks some of the accomplishments and changes over the past three decades that go to his credit. Copyright 1998 Academic Press.

  15. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Chemical Continuous Time Random Walks

    Science.gov (United States)

    Aquino, T.; Dentz, M.

    2017-12-01

    Traditional methods for modeling solute transport through heterogeneous media employ Eulerian schemes to solve for solute concentration. More recently, Lagrangian methods have removed the need for spatial discretization through the use of Monte Carlo implementations of Langevin equations for solute particle motions. While there have been recent advances in modeling chemically reactive transport with recourse to Lagrangian methods, these remain less developed than their Eulerian counterparts, and many open problems such as efficient convergence and reconstruction of the concentration field remain. We explore a different avenue and consider the question: In heterogeneous chemically reactive systems, is it possible to describe the evolution of macroscopic reactant concentrations without explicitly resolving the spatial transport? Traditional Kinetic Monte Carlo methods, such as the Gillespie algorithm, model chemical reactions as random walks in particle number space, without the introduction of spatial coordinates. The inter-reaction times are exponentially distributed under the assumption that the system is well mixed. In real systems, transport limitations lead to incomplete mixing and decreased reaction efficiency. We introduce an arbitrary inter-reaction time distribution, which may account for the impact of incomplete mixing. This process defines an inhomogeneous continuous time random walk in particle number space, from which we derive a generalized chemical Master equation and formulate a generalized Gillespie algorithm. We then determine the modified chemical rate laws for different inter-reaction time distributions. We trace Michaelis-Menten-type kinetics back to finite-mean delay times, and predict time-nonlocal macroscopic reaction kinetics as a consequence of broadly distributed delays. Non-Markovian kinetics exhibit weak ergodicity breaking and show key features of reactions under local non-equilibrium.
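
    A sketch of a generalized Gillespie step in the spirit described above: the inter-reaction time is drawn from an arbitrary distribution (here a gamma with the same mean as the exponential of the classical algorithm), while the firing channel is still chosen in proportion to the propensities. The bimolecular example and all rate constants are assumed, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(13)

def generalized_gillespie(x0, rate, stoich, draw_delay, t_max):
    """Kinetic Monte Carlo with an arbitrary inter-reaction time distribution.

    x0         : initial copy numbers (1-D array)
    rate       : function x -> propensities of each reaction channel
    stoich     : (n_reactions, n_species) stoichiometry matrix
    draw_delay : function total_propensity -> waiting-time sample
    """
    t, x = 0.0, np.array(x0, dtype=float)
    history = [(t, x.copy())]
    while t < t_max:
        a = rate(x)
        a0 = a.sum()
        if a0 <= 0.0:
            break
        t += draw_delay(a0)                       # non-exponential delays allowed
        k = rng.choice(len(a), p=a / a0)          # which reaction fires
        x += stoich[k]
        history.append((t, x.copy()))
    return history

# Example: A + B -> C with rate constant 0.01 and gamma-distributed delays.
stoich = np.array([[-1, -1, 1]])
rate = lambda x: np.array([0.01 * x[0] * x[1]])
gamma_delay = lambda a0: rng.gamma(shape=2.0, scale=1.0 / (2.0 * a0))  # mean 1/a0
hist = generalized_gillespie([100, 80, 0], rate, stoich, gamma_delay, t_max=50.0)
print(len(hist), hist[-1])
```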

  17. 20 CFR 10.7 - What forms are needed to process claims under the FECA?

    Science.gov (United States)

    2010-04-01

    20 CFR 10.7 (2010-04-01 edition), Title 20, Employees' Benefits; Office of Workers' Compensation Programs, Department of Labor; Federal Employees' Compensation Act, as amended; General Provisions, Definitions and Forms; § 10.7: What forms are needed to process claims under the FECA?

  18. An innovative scintillation process for correcting, cooling, and reducing the randomness of waveforms

    International Nuclear Information System (INIS)

    Shen, J.

    1991-01-01

    Research activities were concentrated on an innovative scintillation technique for high-energy collider detection. Heretofore, scintillation waveform data of high-energy physics events have been problematically random, and this randomness represents a bottleneck of data flow for the next generation of detectors for proton colliders like the SSC or LHC. The prevailing problems to resolve were: (1) additional time walk and jitter resulting from the random hitting positions of particles, (2) increased walk and jitter caused by scintillation photon propagation dispersions, and (3) quantum fluctuations of luminescence. However, these were manageable once the different aspects of randomness had been clarified in greater detail. For this purpose, the three were defined as pseudorandomness, quasi-randomness, and real randomness, respectively. A unique scintillation counter incorporating long scintillators with light guides, a drift chamber, and fast discriminators plus integrators was employed to resolve the first problem, correcting the time walk and reducing the additional jitter by establishing an analytical waveform description V(t,z) for a measured hit position z. Resolving the second problem was accomplished by reducing jitter through compression of V(t,z) with a nonlinear medium, called cooling scintillation. Resolving the third problem was proposed by orienting and polarizing the scintillating molecules through the use of intense magnetic technology, called stabilizing the waveform.

  19. On the Wigner law in dilute random matrices

    Science.gov (United States)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
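
    A quick numerical check of the statement, assuming i.i.d. Gaussian entries kept independently with probability p (one simple dilution scheme): the eigenvalue histogram of the diluted symmetric matrix is compared with the semicircle density on [-2, 2].

```python
import numpy as np

rng = np.random.default_rng(17)

def diluted_wigner_spectrum(n=1000, p=0.1):
    """Eigenvalues of a randomly diluted n x n symmetric matrix, rescaled so
    that the limiting density is the Wigner semicircle on [-2, 2]."""
    entries = rng.normal(0.0, 1.0, size=(n, n))
    mask = rng.random((n, n)) < p            # random dilution: keep an entry w.p. p
    m = np.triu(entries * mask, k=1)
    m = m + m.T                              # symmetrize (zero diagonal)
    m /= np.sqrt(n * p)                      # normalization for the diluted ensemble
    return np.linalg.eigvalsh(m)

eigs = diluted_wigner_spectrum()
hist, edges = np.histogram(eigs, bins=40, range=(-2.5, 2.5), density=True)
centers = 0.5 * (edges[1:] + edges[:-1])
semicircle = np.where(np.abs(centers) < 2,
                      np.sqrt(np.maximum(4 - centers ** 2, 0)) / (2 * np.pi), 0.0)
print(np.round(hist[:10], 3))        # empirical density in the first bins
print(np.round(semicircle[:10], 3))  # semicircle density at the bin centres
```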

  20. Population density equations for stochastic processes with memory kernels

    Science.gov (United States)

    Lai, Yi Ming; de Kamps, Marc

    2017-06-01

    We present a method for solving population density equations (PDEs)—a mean-field technique describing homogeneous populations of uncoupled neurons—where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization called the generalized Montroll-Weiss equation—a recent result from random network theory—describing a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky- and quadratic-integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to accurately model the jump responses of both models to both excitatory and inhibitory input under the assumption that all inputs are generated by one renewal process.

  1. A Correction of Random Incidence Absorption Coefficients for the Angular Distribution of Acoustic Energy under Measurement Conditions

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2009-01-01

    Most acoustic measurements are based on an assumption of ideal conditions. One such ideal condition is a diffuse and reverberant field. In practice, a perfectly diffuse sound field cannot be achieved in a reverberation chamber. Uneven incident energy density under measurement conditions can cause discrepancies between the measured value and the theoretical random incidence absorption coefficient. Therefore the angular distribution of the incident acoustic energy onto an absorber sample should be taken into account. The angular distribution of the incident energy density was simulated using the beam tracing method for various room shapes and source positions. The averaged angular distribution is found to be similar to a Gaussian distribution. As a result, an angle-weighted absorption coefficient was proposed by considering the angular energy distribution to improve the agreement between the measured value and the theoretical random incidence absorption coefficient.
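
    The angle-weighted average proposed here can be sketched as a one-dimensional quadrature: an angle-dependent absorption coefficient is weighted by an assumed incident-energy distribution (uniform for the ideal diffuse field, a Gaussian-like profile for a simulated chamber field). The absorber model and the Gaussian parameters below are purely illustrative, not taken from the paper.

```python
import numpy as np

def angle_weighted_absorption(alpha_theta, weight_theta, n=2000):
    """Weighted angular average of a statistical absorption coefficient.

    alpha_theta  : function theta -> absorption coefficient at incidence angle theta
    weight_theta : function theta -> relative incident energy density at theta
    A uniform weight reproduces the classical random-incidence (Paris) average;
    a Gaussian-like weight mimics a non-ideal chamber field.
    """
    theta = np.linspace(0.0, np.pi / 2, n)
    w = weight_theta(theta) * np.sin(theta) * np.cos(theta)  # projected solid angle
    return np.sum(alpha_theta(theta) * w) / np.sum(w)

# Hypothetical absorber whose absorption rises towards oblique incidence.
alpha = lambda th: 0.6 + 0.3 * np.sin(th) ** 2
uniform = lambda th: np.ones_like(th)                          # ideal diffuse field
gaussian = lambda th: np.exp(-(th - np.pi / 4) ** 2 / 0.2)     # assumed chamber field
print("ideal diffuse field :", angle_weighted_absorption(alpha, uniform))
print("Gaussian-weighted   :", angle_weighted_absorption(alpha, gaussian))
```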

  2. Geomagnetic storm under laboratory conditions: randomized experiment

    Science.gov (United States)

    Gurfinkel, Yu I.; Vasin, A. L.; Pishchalnikov, R. Yu; Sarimov, R. M.; Sasonko, M. L.; Matveeva, T. A.

    2017-10-01

    The influence of a previously recorded geomagnetic storm (GS) on the human cardiovascular system and microcirculation has been studied under laboratory conditions. Healthy volunteers in the lying position were exposed under two artificially created conditions: quiet (Q) and storm (S). The Q regime plays back a noise-free magnetic field (MF) which is close to the natural geomagnetic conditions at Moscow's latitude. The S regime plays back the initially recorded 6-h geomagnetic storm, repeated four times sequentially. The cardiovascular response to the GS impact was assessed by measuring capillary blood velocity (CBV) and blood pressure (BP) and by analysis of the 24-h ECG recording. A storm-to-quiet ratio for the cardio intervals (CI) and the heart rate variability (HRV) was introduced in order to reveal significant group-averaged differences in HRV. Individual sensitivity to the GS was estimated using autocorrelation function analysis of the high-frequency (HF) part of the CI spectrum. The autocorrelation analysis allowed for the detection of a group of subjects whose autocorrelation functions (ACF) react differently in the Q and S regimes of exposure.

  3. Geomagnetic storm under laboratory conditions: randomized experiment.

    Science.gov (United States)

    Gurfinkel, Yu I; Vasin, A L; Pishchalnikov, R Yu; Sarimov, R M; Sasonko, M L; Matveeva, T A

    2018-04-01

    The influence of a previously recorded geomagnetic storm (GS) on the human cardiovascular system and microcirculation has been studied under laboratory conditions. Healthy volunteers in the lying position were exposed under two artificially created conditions: quiet (Q) and storm (S). The Q regime plays back a noise-free magnetic field (MF) which is close to the natural geomagnetic conditions at Moscow's latitude. The S regime plays back the initially recorded 6-h geomagnetic storm, repeated four times sequentially. The cardiovascular response to the GS impact was assessed by measuring capillary blood velocity (CBV) and blood pressure (BP) and by analysis of the 24-h ECG recording. A storm-to-quiet ratio for the cardio intervals (CI) and the heart rate variability (HRV) was introduced in order to reveal significant group-averaged differences in HRV. Individual sensitivity to the GS was estimated using autocorrelation function analysis of the high-frequency (HF) part of the CI spectrum. The autocorrelation analysis allowed for the detection of a group of subjects whose autocorrelation functions (ACF) react differently in the Q and S regimes of exposure.

  4. Accumulated damage evaluation for a piping system by the response factor on non-stationary random process, 2

    International Nuclear Information System (INIS)

    Shintani, Masanori

    1988-01-01

    This paper shows that the average and variance of the accumulated damage caused by earthquakes on a piping system attached to a building are related to the seismic response factor λ. The earthquakes referred to in this paper are of the non-stationary random process kind. The average is proportional to λ² and the variance to λ⁴. The analytical values of the average and variance for a single-degree-of-freedom system are compared with those obtained from computer simulations. Here the model of the building is a single-degree-of-freedom system. Both averages of accumulated damage are approximately equal. The variance obtained from the analysis does not coincide with that from the simulations. The reason is considered to be forced vibration by the sinusoidal waves included in the random waves. Taking account of the amplitude magnification factor, the values of the variance approach those obtained from the simulations. (author)

  5. Internal mechanisms underlying anticipatory language processing: Evidence from event-related-potentials and neural oscillations.

    Science.gov (United States)

    Li, Xiaoqing; Zhang, Yuping; Xia, Jinyan; Swaab, Tamara Y

    2017-07-28

    Although numerous studies have demonstrated that the language processing system can predict upcoming content during comprehension, there is still no clear picture of the anticipatory stage of predictive processing. This electroencephalograph study examined the cognitive and neural oscillatory mechanisms underlying anticipatory processing during language comprehension, and the consequences of this prediction for bottom-up processing of predicted/unpredicted content. Participants read Mandarin Chinese sentences that were either strongly or weakly constraining and that contained critical nouns that were congruent or incongruent with the sentence contexts. We examined the effects of semantic predictability on anticipatory processing prior to the onset of the critical nouns and on integration of the critical nouns. The results revealed that, at the integration stage, the strong-constraint condition (compared to the weak-constraint condition) elicited a reduced N400 and reduced theta activity (4-7Hz) for the congruent nouns, but induced beta (13-18Hz) and theta (4-7Hz) power decreases for the incongruent nouns, indicating benefits of confirmed predictions and potential costs of disconfirmed predictions. More importantly, at the anticipatory stage, the strongly constraining context elicited an enhanced sustained anterior negativity and beta power decrease (19-25Hz), which indicates that strong prediction places a higher processing load on the anticipatory stage of processing. The differences (in the ease of processing and the underlying neural oscillatory activities) between anticipatory and integration stages of lexical processing were discussed with regard to predictive processing models. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. The Investigations of Friction under Die Surface Vibration in Cold Forging Process

    DEFF Research Database (Denmark)

    Jinming, Sha

    investigation, and the second stage is to design and manufacture a more practical tool system which can be used for forging some industrial components with larger capacity. A high-performance, high-power piezoelectric actuator stack as the vibration source will be used for designing the vibration system in order … to 50% with vibration being applied in the forming process. Furthermore, by using the finite element method, a series of simulations of the cold forging process under die surface excitation has been implemented in order to further understand the influence of vibration on friction, especially the influence...

  7. Functional connectivity in cortico-subcortical brain networks underlying reward processing in attention-deficit/hyperactivity disorder

    NARCIS (Netherlands)

    Oldehinkel, Marianne; Beckmann, Christian F.; Franke, Barbara; Hartman, Catharina A.; Hoekstra, Pieter J.; Oosterlaan, Jaap; Heslenfeld, Dirk; Buitelaar, Jan K.; Mennes, Maarten

    2016-01-01

    Background: Many patients with attention-deficit/hyperactivity disorder (ADHD) display aberrant reward-related behavior. Task-based fMRI studies have related atypical reward processing in ADHD to altered BOLD activity in regions underlying reward processing such as ventral striatum and orbitofrontal

  8. Analysis in nuclear power accident emergency based on random network and particle swarm optimization

    International Nuclear Information System (INIS)

    Gong Dichen; Fang Fang; Ding Weicheng; Chen Zhi

    2014-01-01

    The GERT random network model of nuclear power accident emergency response was built in this paper, and intelligent computation was combined with the random network based on an analysis of the Fukushima nuclear accident in Japan. The emergency process was divided into a series link and a parallel link, the parallel link being part of the series link. The overall allocation of resources was first optimized, and then the parallel link was analyzed. The effect of the emergency resources used in different links was analyzed, and it was proposed that the corresponding particle velocity vector be limited under the condition of limited emergency resources. A resource-constrained particle swarm optimization was obtained by using a velocity projection matrix to correct the motion of the particles. The optimized allocation of resources in the emergency process was obtained, and the time consumption of the nuclear power accident emergency response was reduced. (authors)

  9. Choosing between Higher Moment Maximum Entropy Models and Its Application to Homogeneous Point Processes with Random Effects

    Directory of Open Access Journals (Sweden)

    Lotfi Khribi

    2017-12-01

    Full Text Available In the Bayesian framework, the usual choice of prior in the prediction of homogeneous Poisson processes with random effects is the gamma one. Here, we propose the use of higher order maximum entropy priors. Their advantage is illustrated in a simulation study and the choice of the best order is established by two goodness-of-fit criteria: Kullback–Leibler divergence and a discrepancy measure. This procedure is illustrated on a warranty data set from the automobile industry.

  10. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  11. Unidirectional interference in use of nondominant hand during concurrent Grooved Pegboard and random number generation tasks.

    Science.gov (United States)

    Strenge, Hans; Niederberger, Uwe

    2008-06-01

    The interference effect between the Grooved Pegboard task performed with either hand and the executive task of cued verbal random number generation was investigated. 24 normal right-handed subjects performed each task under separate (single-task) and concurrent (dual-task) conditions. Articulatory suppression was required as an additional secondary task during pegboard performance. Analysis indicated an unambiguous distinction between the two hands. Comparisons of single-task and dual-task conditions showed an asymmetrical pattern of unidirectional interference, with no practice effects during pegboard performance. Under concurrent performance with the nondominant hand, but not the dominant hand, random number generation became continuously slower. There was no effect of divided attention on pegboard performance. Findings support the idea that pegboard performance with the nondominant hand and random number generation draw on the same processing resources, but that the executive aspect of random number generation is more sensitive to changes in the allocation of attentional resources.

  12. A random-walk model for pore pressure accumulation in marine soils

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Cheng, Niang-Sheng

    1999-01-01

    A numerical random-walk model has been developed for the pore-water pressure. The model is based on the analogy between the variation of the pore pressure and the diffusion process of any passive quantity such as concentration. The pore pressure in the former process is analogous to the concentration in the latter. In the simulation, particles are released in the soil, and followed as they travel through the statistical field variables. The model has been validated (1) against the Terzaghi consolidation process, and (2) against the process where the pore pressure builds up under progressive waves. The model will apparently enable the researcher to handle complex geometries (such as a pipeline buried in a soil) relatively easily. Early results with regard to the latter example, namely the buildup of pore pressure around a buried pipeline subject to a progressive wave, are encouraging.
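
    The random-walk/diffusion analogy itself (not the paper's pore-pressure model) can be illustrated with a minimal particle simulation in which the spread of Gaussian-stepping walkers reproduces the variance growth predicted by the one-dimensional diffusion equation.

```python
import numpy as np

# Random-walk solution of du/dt = D * d2u/dx2 with a point release at x = 0:
# each particle takes Gaussian steps of variance 2*D*dt, and the empirical
# variance of particle positions should approach the analytical value 2*D*t.
rng = np.random.default_rng(2)
D, dt, n_steps, n_particles = 0.5, 0.01, 1000, 50_000

x = np.zeros(n_particles)
for _ in range(n_steps):
    x += rng.normal(0.0, np.sqrt(2.0 * D * dt), n_particles)

t = n_steps * dt
print("empirical variance:", x.var(), " analytical 2*D*t:", 2.0 * D * t)
```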

  13. The Knot Spectrum of Confined Random Equilateral Polygons

    Directory of Open Access Journals (Sweden)

    Diao Y.

    2014-01-01

    Full Text Available It is well known that genomic materials (long DNA chains) of living organisms are often packed compactly under extreme confining conditions using macromolecular self-assembly processes but the general DNA packing mechanism remains an unsolved problem. It has been proposed that the topology of the packed DNA may be used to study the DNA packing mechanism. For example, in the case of (mutant) bacteriophage P4, DNA molecules packed inside the bacteriophage head are considered to be circular since the two sticky ends of the DNA are close to each other. The DNAs extracted from the capsid without separating the two ends can thus preserve the topology of the (circular) DNAs. It turns out that the circular DNAs extracted from bacteriophage P4 are non-trivially knotted with very high probability and with a bias toward chiral knots. In order to study this problem using a systematic approach based on mathematical modeling, one needs to introduce a DNA packing model under extreme volume confinement condition and test whether such a model can produce the kind of knot spectrum observed in the experiments. In this paper we introduce and study a model of equilateral random polygons confined in a sphere. This model is not meant to generate polygons that model DNA packed in a virus head directly. Instead, the average topological characteristics of this model may serve as benchmark data for totally randomly packed circular DNAs. The difference between the biologically observed topological characteristics and our benchmark data might reveal the bias of DNA packed in the viral capsids and possibly lead to a better understanding of the DNA packing mechanism, at least for the bacteriophage DNA. The purpose of this paper is to provide information about the knot spectrum of equilateral random polygons under such a spherical confinement with length and confinement ratios in a range comparable to circular DNAs packed inside bacteriophage heads.

  14. Independent component processes underlying emotions during natural music listening.

    Science.gov (United States)

    Rogenmoser, Lars; Zollinger, Nina; Elmer, Stefan; Jäncke, Lutz

    2016-09-01

    The aim of this study was to investigate the brain processes underlying emotions during natural music listening. To address this, we recorded high-density electroencephalography (EEG) from 22 subjects while presenting a set of individually matched whole musical excerpts varying in valence and arousal. Independent component analysis was applied to decompose the EEG data into functionally distinct brain processes. A k-means cluster analysis calculated on the basis of a combination of spatial (scalp topography and dipole location mapped onto the Montreal Neurological Institute brain template) and functional (spectra) characteristics revealed 10 clusters referring to brain areas typically involved in music and emotion processing, namely in the proximity of thalamic-limbic and orbitofrontal regions as well as at frontal, fronto-parietal, parietal, parieto-occipital, temporo-occipital and occipital areas. This analysis revealed that arousal was associated with a suppression of power in the alpha frequency range. On the other hand, valence was associated with an increase in theta frequency power in response to excerpts inducing happiness compared to sadness. These findings are partly compatible with the model proposed by Heller, arguing that the frontal lobe is involved in modulating valenced experiences (the left frontal hemisphere for positive emotions) whereas the right parieto-temporal region contributes to the emotional arousal. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  15. Small Acute Benefits of 4 Weeks Processing Speed Training Games on Processing Speed and Inhibition Performance and Depressive Mood in the Healthy Elderly People: Evidence from a Randomized Control Trial.

    Science.gov (United States)

    Nouchi, Rui; Saito, Toshiki; Nouchi, Haruka; Kawashima, Ryuta

    2016-01-01

    Background: Processing speed training using a 1-year intervention period improves cognitive functions and emotional states of elderly people. Nevertheless, it remains unclear whether short-term processing speed training such as 4 weeks can benefit elderly people. This study was designed to investigate effects of 4 weeks of processing speed training on cognitive functions and emotional states of elderly people. Methods: We used a single-blinded randomized control trial (RCT). Seventy-two older adults were assigned randomly to two groups: a processing speed training game (PSTG) group and knowledge quiz training game (KQTG) group, an active control group. In PSTG, participants were asked to play PSTG (12 processing speed games) for 15 min, during five sessions per week, for 4 weeks. In the KQTG group, participants were asked to play KQTG (four knowledge quizzes) for 15 min, during five sessions per week, for 4 weeks. We measured several cognitive functions and emotional states before and after the 4 week intervention period. Results: Our results revealed that PSTG improved performances in processing speed and inhibition compared to KQTG, but did not improve performance in reasoning, shifting, short term/working memory, and episodic memory. Moreover, PSTG reduced the depressive mood score as measured by the Profile of Mood State compared to KQTG during the 4 week intervention period, but did not change other emotional measures. Discussion: This RCT first provided scientific evidence related to small acute benefits of 4 week PSTG on processing speed, inhibition, and depressive mood in healthy elderly people. We discuss possible mechanisms for improvements in processing speed and inhibition and reduction of the depressive mood. Trial registration: This trial was registered in The University Hospital Medical Information Network Clinical Trials Registry (UMIN000022250).

  16. Smooth conditional distribution function and quantiles under random censorship.

    Science.gov (United States)

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
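
    The generalized Kaplan-Meier (Beran) estimator used as the benchmark can be sketched with a naive implementation based on Nadaraya-Watson weights and a Gaussian kernel (bandwidth, kernel choice and the no-ties assumption are simplifications, and the paper's estimators smoothed in the response variable are not implemented).

```python
import numpy as np

def beran_cdf(t, x0, X, T, delta, h=0.5):
    """Beran (1981) estimate of F(t | x0) from covariates X, follow-up times T and
    censoring indicators delta (1 = observed), with a Gaussian kernel of bandwidth h.
    Assumes no ties in T for simplicity."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    w = w / w.sum()                                # Nadaraya-Watson weights
    order = np.argsort(T)
    T, delta, w = T[order], delta[order], w[order]
    surv = 1.0
    cum_w = 0.0                                    # total weight of strictly earlier times
    for Ti, di, wi in zip(T, delta, w):
        if Ti > t:
            break
        if di == 1 and cum_w < 1.0 - 1e-12:
            surv *= 1.0 - wi / (1.0 - cum_w)
        cum_w += wi
    return 1.0 - surv

# Toy example: exponential lifetimes whose hazard increases with x, independent censoring.
rng = np.random.default_rng(3)
n = 500
X = rng.uniform(0, 2, n)
lifetimes = rng.exponential(1.0 / (0.5 + X))
censor = rng.exponential(2.0, n)
T = np.minimum(lifetimes, censor)
delta = (lifetimes <= censor).astype(int)

print("F(1 | x=1.0) ~", round(beran_cdf(1.0, 1.0, X, T, delta), 3),
      " true:", round(1 - np.exp(-(0.5 + 1.0) * 1.0), 3))
```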

  17. Determination of the underlying cause of death in three multicenter international HIV clinical trials

    DEFF Research Database (Denmark)

    Lifson, Alan R; Lundgren, Jens; Belloso, Waldo H

    2008-01-01

    PURPOSE: Describe processes and challenges for an Endpoint Review Committee (ERC) in determining and adjudicating underlying causes of death in HIV clinical trials. METHOD: Three randomized HIV trials (two evaluating interleukin-2 and one treatment interruption) enrolled 11,593 persons from 36...... information or supporting documentation to determine cause of death. Half (51%) of deaths reviewed by the ERC required follow-up adjudication; consensus was eventually always reached. CONCLUSION: ERCs can successfully provide blinded, independent, and systematic determinations of underlying cause of death...

  18. The MIXMAX random number generator

    Science.gov (United States)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail in empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat-map: the first computes the multiplication by the MIXMAX matrix with O(N) operations; the second recursively computes its characteristic polynomial with O(N^2) operations; and the third applies skips of a large number of steps S to the sequence in O(N^2 log(S)) operations.
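
    To illustrate the general mechanism only (this is neither the MIXMAX matrix nor its efficient O(N) multiplication algorithm), a toy generator in the spirit of the cat-map can be written by iterating a determinant-one integer matrix modulo a large prime.

```python
# Toy unimodular matrix generator (illustrative only, NOT the MIXMAX generator):
# the 2-D cat-map matrix [[2, 1], [1, 1]] has determinant 1 and is iterated modulo
# a large prime; real generators such as MIXMAX use a much larger N-dimensional matrix.
P = 2**61 - 1                      # a Mersenne prime modulus

def cat_map_rng(seed=(12345, 67890)):
    a, b = seed
    while True:
        a, b = (2 * a + b) % P, (a + b) % P
        yield a / P                # crude mapping of one state component to [0, 1)

gen = cat_map_rng()
print([round(next(gen), 6) for _ in range(5)])
```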

  19. Convergence and approximate calculation of average degree under different network sizes for decreasing random birth-and-death networks

    Science.gov (United States)

    Long, Yin; Zhang, Xiao-Jun; Wang, Kui

    2018-05-01

    In this paper, convergence and approximate calculation of the average degree under different network sizes for decreasing random birth-and-death networks (RBDNs) are studied. First, we find and demonstrate that the average degree converges in the form of a power law. Meanwhile, we discover that the ratios of later terms to earlier terms of the convergent remainder are independent of the number of network links for large network sizes, and we theoretically prove that the limit of this ratio is a constant. Moreover, since it is difficult to calculate the analytical solution of the average degree for large network sizes, we adopt a numerical method to obtain an approximate expression of the average degree that approximates its analytical solution. Finally, simulations are presented to verify our theoretical results.

  20. Mixtures in nonstable Levy processes

    International Nuclear Information System (INIS)

    Petroni, N Cufaro

    2007-01-01

    We analyse the Levy processes produced by means of two interconnected classes of nonstable, infinitely divisible distributions: the variance gamma and the Student laws. While the variance gamma family is closed under convolution, the Student one is not: this makes its time evolution more complicated. We prove that, at least for one particular type of Student process suggested by recent empirical results, and for integral times, the distribution of the process is a mixture of other types of Student distributions, randomized by means of a new probability distribution. The mixture is such that along the time the asymptotic behaviour of the probability density functions always coincides with that of the generating Student law. We put forward the conjecture that this can be a general feature of the Student processes. We finally analyse the Ornstein-Uhlenbeck process driven by our Levy noises and show a few simulations of it

  1. Weak convergence to isotropic complex SαS random measure.

    Science.gov (United States)

    Wang, Jun; Li, Yunmeng; Sang, Liheng

    2017-01-01

    In this paper, we prove that an isotropic complex symmetric α-stable (SαS) random measure can be approximated by a complex process constructed by integrals based on the Poisson process with random intensity.

  2. The impact of randomness on the distribution of wealth: Some economic aspects of the Wright-Fisher diffusion process

    Science.gov (United States)

    Bouleau, Nicolas; Chorro, Christophe

    2017-08-01

    In this paper we consider some elementary and fair zero-sum games of chance in order to study the impact of random effects on the wealth distribution of N interacting players. Even if an exhaustive analytical study of such games between many players may be tricky, numerical experiments highlight interesting asymptotic properties. In particular, we emphasize that randomness plays a key role in concentrating wealth in the extreme, in the hands of a single player. From a mathematical perspective, interestingly, we adopt diffusion limits for small and high-frequency transactions that are otherwise extensively used in population genetics. Finally, the impact of small tax rates on the preceding dynamics is discussed for several regulation mechanisms. We show that taxation of income is not sufficient to overcome this extreme concentration process, in contrast to the uniform taxation of capital, which stabilizes the economy and prevents agents from being ruined.
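
    A minimal simulation in the spirit of the games described already shows wealth concentrating in very few hands despite every bet being fair; the specific rules below (a yard-sale-type exchange in which two players stake a fraction of the poorer player's wealth on a fair coin flip) are an illustrative assumption rather than the paper's game.

```python
import numpy as np

# Fair zero-sum exchange game: at each step two random players stake a fraction of the
# poorer player's wealth on a fair coin flip. Total wealth is conserved, yet wealth
# tends to concentrate in ever fewer hands as the game goes on.
rng = np.random.default_rng(4)
N, steps, stake = 100, 100_000, 0.1
wealth = np.ones(N)

for _ in range(steps):
    i, j = rng.choice(N, size=2, replace=False)
    bet = stake * min(wealth[i], wealth[j])
    if rng.random() < 0.5:
        wealth[i] += bet; wealth[j] -= bet
    else:
        wealth[i] -= bet; wealth[j] += bet

print("total wealth:", round(wealth.sum(), 2))
print("share held by the richest player:", round(wealth.max() / wealth.sum(), 3))
```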

  3. Estimation of a monotone percentile residual life function under random censorship.

    Science.gov (United States)

    Franco-Pereira, Alba M; de Uña-Álvarez, Jacobo

    2013-01-01

    In this paper, we introduce a new estimator of a percentile residual life function with censored data under a monotonicity constraint. Specifically, it is assumed that the percentile residual life is a decreasing function. This assumption is useful when estimating the percentile residual life of units which degenerate with age. We establish a law of the iterated logarithm for the proposed estimator, and its √n-equivalence to the unrestricted estimator. The asymptotic normal distribution of the estimator and its strong approximation to a Gaussian process are also established. We investigate the finite sample performance of the monotone estimator in an extensive simulation study. Finally, data from a clinical trial in primary biliary cirrhosis of the liver are analyzed with the proposed methods. One of the conclusions of our work is that the restricted estimator may be much more efficient than the unrestricted one. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Equivalent non-Gaussian excitation method for response moment calculation of systems under non-Gaussian random excitation

    International Nuclear Information System (INIS)

    Tsuchida, Takahiro; Kimura, Koji

    2015-01-01

    An equivalent non-Gaussian excitation method is proposed to obtain the moments up to the fourth order of the response of systems under non-Gaussian random excitation. The excitation is prescribed by its probability density and power spectrum. Moment equations for the response can be derived from the stochastic differential equations for the excitation and the system. However, the moment equations are not closed due to the nonlinearity of the diffusion coefficient in the equation for the excitation. In the proposed method, the diffusion coefficient is replaced approximately with an equivalent diffusion coefficient to obtain a closed set of moment equations. The square of the equivalent diffusion coefficient is expressed by a second-order polynomial. In order to demonstrate the validity of the method, a linear system subjected to non-Gaussian excitation with a generalized Gaussian distribution is analyzed. The results show the method is applicable to non-Gaussian excitation with widely different kurtosis and bandwidth. (author)

  5. A Solution Method for Linear and Geometrically Nonlinear MDOF Systems with Random Properties subject to Random Excitation

    DEFF Research Database (Denmark)

    Micaletti, R. C.; Cakmak, A. S.; Nielsen, Søren R. K.

    A method for computing the lower-order moments of randomly-excited multi-degree-of-freedom (MDOF) systems with random structural properties is proposed. The method is grounded in the techniques of stochastic calculus, utilizing a Markov diffusion process to model the structural system with random structural properties. The resulting state-space formulation is a system of ordinary stochastic differential equations with random coefficients and deterministic initial conditions which are subsequently transformed into ordinary stochastic differential equations with deterministic coefficients and random initial conditions. This transformation facilitates the derivation of differential equations which govern the evolution of the unconditional statistical moments of response. Primary consideration is given to linear systems and systems with odd polynomial nonlinearities, for in these cases...

  6. Working through the pain: working memory capacity and differences in processing and storage under pain.

    Science.gov (United States)

    Sanchez, Christopher A

    2011-02-01

    It has been suggested that pain perception and attention are closely linked at both a neural and a behavioural level. If pain and attention are so linked, it is reasonable to speculate that those who vary in working memory capacity (WMC) should be affected by pain differently. This study compares the performance of individuals who differ in WMC as they perform processing and memory span tasks under mild pain and while pain-free. While processing performance under mild pain does not interact with WMC, the ability to store information for later recall does. This suggests that pain operates much like an additional processing burden, and that the ability to overcome this physical sensation is related to differences in WMC. © 2011 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business

  7. Random walks and diffusion on networks

    Science.gov (United States)

    Masuda, Naoki; Porter, Mason A.; Lambiotte, Renaud

    2017-11-01

    Random walks are ubiquitous in the sciences, and they are interesting from both theoretical and practical perspectives. They are one of the most fundamental types of stochastic processes; can be used to model numerous phenomena, including diffusion, interactions, and opinions among humans and animals; and can be used to extract information about important entities or dense groups of entities in a network. Random walks have been studied for many decades on both regular lattices and (especially in the last couple of decades) on networks with a variety of structures. In the present article, we survey the theory and applications of random walks on networks, restricting ourselves to simple cases of single and non-adaptive random walkers. We distinguish three main types of random walks: discrete-time random walks, node-centric continuous-time random walks, and edge-centric continuous-time random walks. We first briefly survey random walks on a line, and then we consider random walks on various types of networks. We extensively discuss applications of random walks, including ranking of nodes (e.g., PageRank), community detection, respondent-driven sampling, and opinion models such as voter models.
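
    One of the applications mentioned, ranking nodes with PageRank, amounts to computing the stationary distribution of a random walk with teleportation; a small power-iteration sketch on a toy directed graph is shown below (the graph and damping factor are arbitrary illustrative choices).

```python
import numpy as np

def pagerank(A, d=0.85, tol=1e-12):
    """PageRank via power iteration on adjacency matrix A (entry A[i, j] = edge i -> j)."""
    n = A.shape[0]
    out_deg = A.sum(axis=1, keepdims=True)
    # Row-stochastic transition matrix; dangling nodes are sent uniformly to all nodes.
    P = np.where(out_deg > 0, A / np.where(out_deg == 0, 1, out_deg), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (r @ P)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Toy directed graph: 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0, 3 -> 2
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 0]], dtype=float)
print(np.round(pagerank(A), 3))   # node 2 receives the highest rank
```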

  8. A Method of Erasing Data Using Random Number Generators

    OpenAIRE

    井上,正人

    2012-01-01

    Erasing data is an indispensable step for disposal of computers or external storage media. Except physical destruction, erasing data means writing random information on entire disk drives or media. We propose a method which erases data safely using random number generators. These random number generators create true random numbers based on quantum processes.

  9. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2011-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views [fr

  10. Guidelines regarding the review process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2002-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  11. Guidelines regarding the review process under the convention on nuclear safety

    International Nuclear Information System (INIS)

    1998-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing national reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of national reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  12. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2011-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  13. Guidelines regarding the review process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    1999-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing national reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of national reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  14. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2011-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views [es

  15. Random walk generated by random permutations of {1, 2, 3, ..., n + 1}

    International Nuclear Information System (INIS)

    Oshanin, G; Voituriez, R

    2004-01-01

    We study properties of a non-Markovian random walk X_l^(n), l = 0, 1, 2, ..., n, evolving in discrete time l on a one-dimensional lattice of integers, whose moves to the right or to the left are prescribed by the rise-and-descent sequences characterizing random permutations π of [n + 1] = {1, 2, 3, ..., n + 1}. We determine exactly the probability of finding the end-point X_n = X_n^(n) of the trajectory of such a permutation-generated random walk (PGRW) at site X, and show that in the limit n → ∞ it converges to a normal distribution with a smaller, compared to the conventional Polya random walk, diffusion coefficient. We formulate, as well, an auxiliary stochastic process whose distribution is identical to the distribution of the intermediate points X_l^(n), l < n, which enables us to obtain the probability measure of different excursions and to define the asymptotic distribution of the number of 'turns' of the PGRW trajectories

  16. Republic of Georgia estimates for prevalence of drug use: Randomized response techniques suggest under-estimation.

    Science.gov (United States)

    Kirtadze, Irma; Otiashvili, David; Tabatadze, Mzia; Vardanashvili, Irina; Sturua, Lela; Zabransky, Tomas; Anthony, James C

    2018-06-01

    Validity of responses in surveys is an important research concern, especially in emerging market economies where surveys in the general population are a novelty, and the level of social control is traditionally higher. The Randomized Response Technique (RRT) can be used as a check on response validity when the study aim is to estimate population prevalence of drug experiences and other socially sensitive and/or illegal behaviors. To apply RRT and to study potential under-reporting of drug use in a nation-scale, population-based general population survey of alcohol and other drug use. For this first-ever household survey on addictive substances for the Country of Georgia, we used the multi-stage probability sampling of 18-to-64-year-old household residents of 111 urban and 49 rural areas. During the interviewer-administered assessments, RRT involved pairing of sensitive and non-sensitive questions about drug experiences. Based upon the standard household self-report survey estimate, an estimated 17.3% [95% confidence interval, CI: 15.5%, 19.1%] of Georgian household residents have tried cannabis. The corresponding RRT estimate was 29.9% [95% CI: 24.9%, 34.9%]. The RRT estimates for other drugs such as heroin also were larger than the standard self-report estimates. We remain unsure about what is the "true" value for prevalence of using illegal psychotropic drugs in the Republic of Georgia study population. Our RRT results suggest that standard non-RRT approaches might produce 'under-estimates' or at best, highly conservative, lower-end estimates. Copyright © 2018 Elsevier B.V. All rights reserved.
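
    The abstract does not specify the RRT design used in the survey; as a generic illustration of how a prevalence estimate is recovered from randomized responses, the following sketch implements Warner's classic estimator (the design probability p and the counts are made-up values).

```python
import math

def warner_estimate(n_yes, n, p):
    """Warner (1965) randomized response: with probability p the respondent answers about
    the sensitive statement, with probability 1-p about its negation (requires p != 0.5).
    Returns the prevalence estimate and its standard error."""
    lam = n_yes / n                                   # observed proportion of 'yes' answers
    pi_hat = (lam - (1 - p)) / (2 * p - 1)
    var = lam * (1 - lam) / (n * (2 * p - 1) ** 2)
    return pi_hat, math.sqrt(var)

pi_hat, se = warner_estimate(n_yes=520, n=1500, p=0.7)
print(f"estimated prevalence: {pi_hat:.3f} +/- {1.96 * se:.3f}")
```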

  17. Listening to the Noise: Random Fluctuations Reveal Gene Network Parameters

    Science.gov (United States)

    Munsky, Brian; Trinh, Brooke; Khammash, Mustafa

    2010-03-01

    The cellular environment is abuzz with noise originating from the inherent random motion of reacting molecules in the living cell. In this noisy environment, clonal cell populations exhibit cell-to-cell variability that can manifest significant prototypical differences. Noise induced stochastic fluctuations in cellular constituents can be measured and their statistics quantified using flow cytometry, single molecule fluorescence in situ hybridization, time lapse fluorescence microscopy and other single cell and single molecule measurement techniques. We show that these random fluctuations carry within them valuable information about the underlying genetic network. Far from being a nuisance, the ever-present cellular noise acts as a rich source of excitation that, when processed through a gene network, carries its distinctive fingerprint that encodes a wealth of information about that network. We demonstrate that in some cases the analysis of these random fluctuations enables the full identification of network parameters, including those that may otherwise be difficult to measure. We use theoretical investigations to establish experimental guidelines for the identification of gene regulatory networks, and we apply these guideline to experimentally identify predictive models for different regulatory mechanisms in bacteria and yeast.
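
    One concrete way such fluctuation statistics carry parameter information is the simple birth-death model of constitutive gene expression, whose stationary mean k/γ and Poissonian Fano factor already constrain the rates; a minimal Gillespie simulation with arbitrary illustrative rates is sketched below (this is not the identification procedure of the study).

```python
import numpy as np

def gillespie_birth_death(k=10.0, gamma=1.0, t_end=2000.0, seed=5):
    """Stochastic simulation of protein production (rate k) and degradation (rate gamma*n)."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, 0
    times, counts = [0.0], [0]
    while t < t_end:
        birth, death = k, gamma * n
        total = birth + death
        t += rng.exponential(1.0 / total)
        n += 1 if rng.random() < birth / total else -1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
# Time-averaged statistics over the second half of the run (approximately stationary).
dt = np.diff(times)
half = len(dt) // 2
w, c = dt[half:], counts[:-1][half:]
mean = np.average(c, weights=w)
var = np.average((c - mean) ** 2, weights=w)
print("mean:", round(mean, 2), " Fano factor:", round(var / mean, 2))  # expect ~k/gamma and ~1
```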

  18. Removal of chlortetracycline from spiked municipal wastewater using a photoelectrocatalytic process operated under sunlight irradiations

    Energy Technology Data Exchange (ETDEWEB)

    Daghrir, Rimeh, E-mail: rimeh.daghrir@ete.inrs.ca [Institut National de la Recherche Scientifique, Centre Eau, Terre et Environnement, 490 rue de la Couronne, Québec, Qc G1K 9A9 (Canada); Drogui, Patrick, E-mail: patrick.drogui@ete.inrs.ca [Institut National de la Recherche Scientifique, Centre Eau, Terre et Environnement, 490 rue de la Couronne, Québec, Qc G1K 9A9 (Canada); Delegan, Nazar, E-mail: delegan@emt.inrs.ca [Institut National de la Recherche Scientifique, INRS-Énergie, Matériaux et Télécommunications, 1650 Blvd. Lionel-Boulet, Varennes, Qc J3X 1S2 (Canada); El Khakani, My Ali, E-mail: elkhakani@emt.inrs.ca [Institut National de la Recherche Scientifique, INRS-Énergie, Matériaux et Télécommunications, 1650 Blvd. Lionel-Boulet, Varennes, Qc J3X 1S2 (Canada)

    2014-01-01

    The degradation of chlortetracycline in synthetic solution and in municipal effluent was investigated using a photoelectrocatalytic oxidation process under visible irradiation. The N-doped TiO2 used as photoanode, with a nitrogen content of 3.4 at.%, was prepared by means of a radiofrequency magnetron sputtering (RF-MS) process. Under visible irradiation, higher photoelectrocatalytic removal efficiency of CTC was recorded using N-doped TiO2 compared to the conventional electrochemical oxidation, direct photolysis and photocatalysis processes. The photoelectrocatalytic process operated at 0.6 A of current intensity during 180 min of treatment time promotes the degradation of 99.1 ± 0.1% of CTC. Under these conditions, removal rates of 85.4 ± 3.6%, 87.4 ± 3.1% and 55.7 ± 2.9% of TOC, TN and NH4+ have been recorded. During the treatment, CTC was mainly transformed into CO2 and H2O. The process was also found to be effective in removing indicators of pathogens such as fecal coliform (log-inactivation was higher than 1.2 units). - Highlights: •PECO process is a feasible technology for the treatment of MWW contaminated by CTC. •99.1 ± 0.1% of CTC was degraded by PECO using N-doped TiO2. •85.4 ± 3.6% of TOC removal and 97.5 ± 1.2% of COD removal were achieved. •87.4 ± 3.1% of TN removal and 55.7 ± 2.9% of NH4+ removal were recorded. •More than 94% of fecal coliform was removed (abatement > 1.2-log units)

  19. Do MENA stock market returns follow a random walk process?

    Directory of Open Access Journals (Sweden)

    Salim Lahmiri

    2013-01-01

    Full Text Available In this research, three variance ratio tests: the standard variance ratio test, the wild bootstrap multiple variance ratio test, and the non-parametric rank scores test are adopted to test the random walk hypothesis (RWH) of stock markets in the Middle East and North Africa (MENA) region using the most recent data from January 2010 to September 2012. The empirical results obtained by all three econometric tests show that the RWH is strongly rejected for Kuwait, Tunisia, and Morocco. However, the standard variance ratio test and the wild bootstrap multiple variance ratio test reject the null hypothesis of a random walk in Jordan and KSA, while the non-parametric rank scores test does not. We may conclude that the Jordan and KSA stock markets are weak-form efficient. In sum, the empirical results suggest that return series in Kuwait, Tunisia, and Morocco are predictable. In other words, predictable patterns that can be exploited in these markets still exist. Therefore, investors may make profits in such less efficient markets.
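
    The standard (Lo-MacKinlay style) variance ratio statistic underlying the first of the three tests can be sketched as below, in a plain version without the bias and heteroskedasticity corrections and without the bootstrap or rank-score variants; the simulated price series is only a sanity check.

```python
import numpy as np

def variance_ratio(prices, q):
    """Variance ratio VR(q): variance of overlapping q-period log returns divided by
    q times the variance of 1-period log returns. Under a random walk VR(q) ~ 1
    (bias and heteroskedasticity corrections are omitted here)."""
    r = np.diff(np.log(prices))                        # 1-period log returns
    rq = np.log(prices[q:]) - np.log(prices[:-q])      # overlapping q-period log returns
    return rq.var(ddof=1) / (q * r.var(ddof=1))

# Toy check on a simulated geometric random walk: VR(q) should hover around 1.
rng = np.random.default_rng(6)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))
for q in (2, 5, 10):
    print(f"VR({q}) = {variance_ratio(prices, q):.3f}")
```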

  20. Disentangling Complexity from Randomness and Chaos

    Directory of Open Access Journals (Sweden)

    Lena C. Zuchowski

    2012-02-01

    Full Text Available This study aims to disentangle complexity from randomness and chaos, and to present a definition of complexity that emphasizes its epistemically distinct qualities. I will review existing attempts at defining complexity and argue that these suffer from two major faults: a tendency to neglect the underlying dynamics and to focus exclusively on the phenomenology of complex systems; and linguistic imprecisions in describing these phenomenologies. I will argue that the tendency to discuss phenomenology removed from the underlying dynamics is the main root of the difficulties in distinguishing complex from chaotic or random systems. In my own definition, I will explicitly try to avoid these pitfalls. The theoretical contemplations in this paper will be tested on a sample of five models: the random Kac ring, the chaotic CA30, the regular CA90, the complex CA110 and the complex Bak-Sneppen model. Although these modelling studies are restricted in scope and can only be seen as preliminary, they still constitute one of the first attempts to investigate complex systems comparatively.
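
    Three of the five sample models are elementary cellular automata, which can be reproduced in a few lines (rule numbers, lattice size and initial condition below are illustrative choices; the Kac ring and Bak-Sneppen models are not included).

```python
import numpy as np

def eca_evolve(rule, width=101, steps=50):
    """Evolve an elementary cellular automaton (e.g. rule 30, 90 or 110) from a single
    live cell, with periodic boundary conditions."""
    table = [(rule >> i) & 1 for i in range(8)]        # rule number -> lookup table
    row = np.zeros(width, dtype=int)
    row[width // 2] = 1
    history = [row.copy()]
    for _ in range(steps):
        left, right = np.roll(row, 1), np.roll(row, -1)
        idx = 4 * left + 2 * row + right               # neighbourhood as a 3-bit number
        row = np.array([table[i] for i in idx])
        history.append(row.copy())
    return np.array(history)

for rule in (30, 90, 110):                             # chaotic, regular, complex
    grid = eca_evolve(rule)
    print(f"rule {rule}: fraction of live cells after 50 steps = {grid[-1].mean():.2f}")
```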

  1. A One Line Derivation of DCC: Application of a Vector Random Coefficient Moving Average Process

    NARCIS (Netherlands)

    C.M. Hafner (Christian); M.J. McAleer (Michael)

    2014-01-01

    One of the most widely-used multivariate conditional volatility models is the dynamic conditional correlation (or DCC) specification. However, the underlying stochastic process to derive DCC has not yet been established, which has made problematic the derivation of

  2. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  3. Obsolescence : The underlying processes

    NARCIS (Netherlands)

    Thomsen, A.F.; Nieboer, N.E.T.; Van der Flier, C.L.

    2015-01-01

    Obsolescence, defined as the process of declining performance of buildings, is a serious threat for the value, the usefulness and the life span of housing properties. Thomsen and van der Flier (2011) developed a model in which obsolescence is categorised on the basis of two distinctions, namely

  4. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    We propose a systematic framework for the selection of optimal processing pathways for a microalgaebased biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among...... and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution...... the accounting of uncertainty are compared with respect to different objectives. (C) 2015 Elsevier Ltd. All rights reserved....

  5. Research on photodiode detector-based spatial transient light detection and processing system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Yang; Zhao, Hui; Nan, Meng

    2016-10-01

    In order to realize real-time signal identification and processing of spatial transient light, the features and the energy of the captured target light signal are first described and quantitatively calculated. Considering that a transient light signal occurs randomly, has a short duration and has an evident beginning and ending, a photodiode-detector-based spatial transient light detection and processing system is proposed and designed in this paper. This system has a large field of view and is used to realize non-imaging energy detection of a random, transient and weak point target under the complex background of the space environment. Extracting a weak signal from a strong background is difficult. In this paper, considering that the background signal changes slowly while the target signal changes quickly, a filter is adopted for background subtraction. Variable-speed sampling is realized by sampling data points at gradually increasing intervals. This resolves two difficulties: the real-time processing of a large amount of data, and the power consumption required to store it. Test results with a self-made simulated signal demonstrate the effectiveness of the design scheme. The practical system can be operated reliably. The detection and processing of the target signal under a strong sunlight background was realized. The results indicate that the system can realize real-time detection of the target signal's characteristic waveform and monitor the system working parameters. The prototype design could be used in a variety of engineering applications.
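
    The slow-background/fast-target reasoning can be illustrated with a simple software filter: a moving-average background estimate is subtracted and the residual is thresholded (the window length, threshold and synthetic signal below are illustrative assumptions, not the system's actual parameters).

```python
import numpy as np

def detect_transients(signal, window=200, k_sigma=6.0):
    """Subtract a slowly varying background (moving average) and flag samples whose
    residual exceeds k_sigma times a robust estimate of the residual's scale."""
    kernel = np.ones(window) / window
    padded = np.pad(signal, window // 2, mode="edge")   # avoid edge artefacts
    background = np.convolve(padded, kernel, mode="same")[window // 2 : window // 2 + len(signal)]
    residual = signal - background                      # fast component
    scale = 1.4826 * np.median(np.abs(residual - np.median(residual)))
    return np.flatnonzero(residual > k_sigma * scale)

# Synthetic test: slow sinusoidal background, sensor noise, and one short flash.
rng = np.random.default_rng(10)
n = 5000
t = np.arange(n)
signal = 10 + 2 * np.sin(2 * np.pi * t / 2500) + rng.normal(0, 0.1, n)
signal[3000:3005] += 3.0                                # transient light event
print("transient detected at samples:", detect_transients(signal))
```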

  6. Chaos-based Pseudo-random Number Generation

    KAUST Repository

    Barakat, Mohamed L.

    2014-04-10

    Various methods and systems related to chaos-based pseudo-random number generation are presented. In one example, among others, a system includes a pseudo-random number generator (PRNG) to generate a series of digital outputs and a nonlinear post processing circuit to perform an exclusive OR (XOR) operation on a first portion of a current digital output of the PRNG and a permutated version of a corresponding first portion of a previous post processed output to generate a corresponding first portion of a current post processed output. In another example, a method includes receiving at least a first portion of a current output from a PRNG and performing an XOR operation on the first portion of the current PRNG output with a permutated version of a corresponding first portion of a previous post processed output to generate a corresponding first portion of a current post processed output.

  7. Chaos-based Pseudo-random Number Generation

    KAUST Repository

    Barakat, Mohamed L.; Mansingka, Abhinav S.; Radwan, Ahmed Gomaa Ahmed; Salama, Khaled N.

    2014-01-01

    Various methods and systems related to chaos-based pseudo-random number generation are presented. In one example, among others, a system includes a pseudo-random number generator (PRNG) to generate a series of digital outputs and a nonlinear post processing circuit to perform an exclusive OR (XOR) operation on a first portion of a current digital output of the PRNG and a permutated version of a corresponding first portion of a previous post processed output to generate a corresponding first portion of a current post processed output. In another example, a method includes receiving at least a first portion of a current output from a PRNG and performing an XOR operation on the first portion of the current PRNG output with a permutated version of a corresponding first portion of a previous post processed output to generate a corresponding first portion of a current post processed output.
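
    The post-processing step described in these records can be illustrated in software as follows; here Python's random module stands in for the chaos-based PRNG, the bit permutation is an arbitrary rotation, and the 32-bit word size is an assumption.

```python
import random

WORD_BITS = 32
MASK = (1 << WORD_BITS) - 1

def rotate_left(x, k):
    """Simple bit permutation applied to the previous post-processed output."""
    return ((x << k) | (x >> (WORD_BITS - k))) & MASK

def post_processed_stream(seed=42, rot=7):
    """XOR each raw PRNG word with a permutated copy of the previous post-processed word."""
    prng = random.Random(seed)             # stand-in for the chaos-based PRNG
    prev_post = 0
    while True:
        raw = prng.getrandbits(WORD_BITS)
        post = raw ^ rotate_left(prev_post, rot)
        prev_post = post
        yield post

gen = post_processed_stream()
print([hex(next(gen)) for _ in range(4)])
```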

  8. Study of drain-extended NMOS under electrostatic discharge stress in 28 nm and 40 nm CMOS process

    Science.gov (United States)

    Wang, Weihuai; Jin, Hao; Dong, Shurong; Zhong, Lei; Han, Yan

    2016-02-01

    Research on the electrostatic discharge (ESD) performance of drain-extended NMOS (DeNMOS) under state-of-the-art 28 nm and 40 nm bulk CMOS processes is presented in this paper. Three distinguishing phases of the gate-grounded DeNMOS (GG-DeNMOS) fabricated under the 28 nm CMOS process and measured with transmission line pulsing (TLP) tests, namely the avalanche breakdown stage, the depletion region push-out stage and the parasitic NPN turn-on stage, are analyzed in detail through TCAD simulations and tape-out silicon verification. The damage mechanisms and failure spots of GG-DeNMOS under both CMOS processes are thermal breakdown of the drain junction. Improvements based on adjustments of the basic structure can increase the GG-DeNMOS robustness from the original 2.87 mA/μm to at most 5.41 mA/μm. Under the 40 nm process, parameter adjustments based on the basic structure bring no significant benefits to the robustness. By inserting P+ segments in the N+ implantation of the drain, or an entire P+ strip between the N+ implantation of the drain and the polysilicon gate to form the typical DeMOS-SCR (silicon-controlled rectifier) structure, the ESD robustness can be enhanced from 1.83 mA/μm to 8.79 mA/μm and 29.78 mA/μm, respectively.

  9. Polarized ensembles of random pure states

    Science.gov (United States)

    Deelan Cunden, Fabio; Facchi, Paolo; Florio, Giuseppe

    2013-08-01

    A new family of polarized ensembles of random pure states is presented. These ensembles are obtained by linear superposition of two random pure states with suitable distributions, and are quite manageable. We will use the obtained results for two purposes: on the one hand we will be able to derive an efficient strategy for sampling states from isopurity manifolds. On the other, we will characterize the deviation of a pure quantum state from separability under the influence of noise.
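
    Haar-random pure states can be sampled by normalizing complex Gaussian vectors, and a "polarized" state can then be formed as a linear superposition of two such samples; the sketch below uses an arbitrary mixing weight and does not reproduce the specific distributions studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_random_state(dim):
    """Sample a Haar-distributed pure state as a normalized complex Gaussian vector."""
    v = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def superposed_state(dim, weight=0.5):
    """Linear superposition of two independent random pure states, renormalized."""
    psi = np.sqrt(weight) * haar_random_state(dim) + np.sqrt(1 - weight) * haar_random_state(dim)
    return psi / np.linalg.norm(psi)

# Purity of the reduced state of one qubit in a two-qubit (dim = 4) superposed state.
psi = superposed_state(4).reshape(2, 2)
rho_A = psi @ psi.conj().T                 # partial trace over the second qubit
print("reduced-state purity:", round(float(np.real(np.trace(rho_A @ rho_A))), 3))
```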

  10. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2013-01-01

    These Guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 of the Convention and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views. [fr

  11. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2013-01-01

    These Guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 of the Convention and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views.

  12. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2013-01-01

    These Guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 of the Convention and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views. [es

  13. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times.

    Science.gov (United States)

    Yang, Xin; Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. A robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, with consideration of the completion time and the total energy consumption under stochastic processing times. The case study on GM Corporation verifies that the NSGA-II used in this paper is effective and has advantages in solving the proposed model compared with HPSO and PSO+SA. The idea and method of the paper can be generalized widely in the manufacturing industry, because they can reduce the energy consumption of energy-intensive manufacturing enterprises with less investment when the new approach is applied in existing systems.

  14. On the entropy of a hidden Markov process.

    Science.gov (United States)

    Jacquet, Philippe; Seroussi, Gadiel; Szpankowski, Wojciech

    2008-05-01

    We study the entropy rate of a hidden Markov process (HMP) defined by observing the output of a binary symmetric channel whose input is a first-order binary Markov process. Despite the simplicity of the models involved, the characterization of this entropy is a long standing open problem. By presenting the probability of a sequence under the model as a product of random matrices, one can see that the entropy rate sought is equal to a top Lyapunov exponent of the product. This offers an explanation for the elusiveness of explicit expressions for the HMP entropy rate, as Lyapunov exponents are notoriously difficult to compute. Consequently, we focus on asymptotic estimates, and apply the same product of random matrices to derive an explicit expression for a Taylor approximation of the entropy rate with respect to the parameter of the binary symmetric channel. The accuracy of the approximation is validated against empirical simulation results. We also extend our results to higher-order Markov processes and to Rényi entropies of any order.
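
    A Monte Carlo counterpart of the quantity studied: simulate a symmetric binary Markov source through a binary symmetric channel and estimate the entropy rate of the output from the forward-algorithm normalization constants (the parameter values below are arbitrary; this is a numerical check, not the paper's Taylor expansion).

```python
import numpy as np

def hmp_entropy_rate(p=0.1, eps=0.2, n=100_000, seed=8):
    """Estimate the entropy rate (bits/symbol) of the output of a BSC(eps) driven by a
    symmetric binary Markov chain with flip probability p, via the forward algorithm:
    H ~ -(1/n) log2 P(y_1 .. y_n)."""
    rng = np.random.default_rng(seed)
    # Simulate the hidden chain and the noisy observations.
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    flips = rng.random(n) < p
    for t in range(1, n):
        x[t] = x[t - 1] ^ int(flips[t])
    y = x ^ (rng.random(n) < eps).astype(int)
    # Forward recursion with per-step normalization; the log-normalizers sum to log P(y).
    A = np.array([[1 - p, p], [p, 1 - p]])             # transition matrix of the hidden chain
    def emit(obs):                                     # vector of P(y_t = obs | x_t = 0, 1)
        return np.where(np.arange(2) == obs, 1 - eps, eps)
    alpha = np.array([0.5, 0.5]) * emit(y[0])
    log_prob = np.log2(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, n):
        alpha = (alpha @ A) * emit(y[t])
        c = alpha.sum()
        log_prob += np.log2(c)
        alpha /= c
    return -log_prob / n

print("estimated entropy rate:", round(hmp_entropy_rate(), 4), "bits/symbol")
```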

  15. Investigating category- and shape-selective neural processing in ventral and dorsal visual stream under interocular suppression.

    Science.gov (United States)

    Ludwig, Karin; Kathmann, Norbert; Sterzer, Philipp; Hesselmann, Guido

    2015-01-01

    Recent behavioral and neuroimaging studies using continuous flash suppression (CFS) have suggested that action-related processing in the dorsal visual stream might be independent of perceptual awareness, in line with the "vision-for-perception" versus "vision-for-action" distinction of the influential dual-stream theory. It remains controversial if evidence suggesting exclusive dorsal stream processing of tool stimuli under CFS can be explained by their elongated shape alone or by action-relevant category representations in dorsal visual cortex. To approach this question, we investigated category- and shape-selective functional magnetic resonance imaging-blood-oxygen level-dependent responses in both visual streams using images of faces and tools. Multivariate pattern analysis showed enhanced decoding of elongated relative to non-elongated tools, both in the ventral and dorsal visual stream. The second aim of our study was to investigate whether the depth of interocular suppression might differentially affect processing in dorsal and ventral areas. However, parametric modulation of suppression depth by varying the CFS mask contrast did not yield any evidence for differential modulation of category-selective activity. Together, our data provide evidence for shape-selective processing under CFS in both dorsal and ventral stream areas and, therefore, do not support the notion that dorsal "vision-for-action" processing is exclusively preserved under interocular suppression. © 2014 Wiley Periodicals, Inc.

  16. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

    Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on the Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD) credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare the various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purposes, a randomly censored real data set is discussed.
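
    Under random right censoring, the maximum likelihood estimator of the geometric parameter takes a simple closed form; the sketch below assumes support {1, 2, ...} and does not implement the Bayes estimators or interval procedures of the paper.

```python
import numpy as np

def geometric_mle_censored(t, delta):
    """MLE of p for X ~ Geometric(p), P(X = x) = p (1-p)**(x-1), x = 1, 2, ..., from
    randomly right-censored data t_i = min(X_i, C_i), delta_i = 1 if uncensored.
    The likelihood  prod_{d=1} p (1-p)^(t-1) * prod_{d=0} (1-p)^t  is maximized at
    p_hat = (number of uncensored observations) / (sum of all observed times)."""
    t, delta = np.asarray(t), np.asarray(delta)
    return delta.sum() / t.sum()

# Toy check: censoring by an independent geometric variable.
rng = np.random.default_rng(9)
p_true = 0.3
X = rng.geometric(p_true, 2000)
C = rng.geometric(0.1, 2000)
t = np.minimum(X, C)
delta = (X <= C).astype(int)
print("p_hat =", round(geometric_mle_censored(t, delta), 3), " (true 0.3)")
```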

  17. Security and Composability of Randomness Expansion from Bell Inequalities

    NARCIS (Netherlands)

    S. Fehr (Serge); R. Gelles; C. Schaffner (Christian)

    2013-01-01

    The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness

  18. Markov Random Fields on Triangle Meshes

    DEFF Research Database (Denmark)

    Andersen, Vedrana; Aanæs, Henrik; Bærentzen, Jakob Andreas

    2010-01-01

    In this paper we propose a novel anisotropic smoothing scheme based on Markov Random Fields (MRF). Our scheme is formulated as two coupled processes. A vertex process is used to smooth the mesh by displacing the vertices according to an MRF smoothness prior, while an independent edge process label...

  19. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.

  20. Evolving Random Forest for Preference Learning

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Shaker, Noor

    2015-01-01

    This paper introduces a novel approach for pairwise preference learning through a combination of an evolutionary method and random forest. Grammatical evolution is used to describe the structure of the trees in the Random Forest (RF) and to handle the process of evolution. Evolved random forests ...... obtained for predicting pairwise self-reports of users for the three emotional states engagement, frustration and challenge show very promising results that are comparable and in some cases superior to those obtained from state-of-the-art methods....

  1. Attention Modulates the Neural Processes Underlying Multisensory Integration of Emotion

    Directory of Open Access Journals (Sweden)

    Hao Tam Ho

    2011-10-01

    Full Text Available Integrating emotional information from multiple sensory modalities is generally assumed to be a pre-attentive process (de Gelder et al., 1999). This assumption, however, presupposes that the integrative process occurs independent of attention. Using event-related potentials (ERP), the present study investigated whether the neural processes underlying the integration of dynamic facial expression and emotional prosody are indeed unaffected by attentional manipulations. To this end, participants were presented with congruent and incongruent face-voice combinations (e.g., an angry face combined with a neutral voice) and performed different two-choice tasks in four consecutive blocks. Three of the tasks directed the participants' attention to emotion expressions in the face, the voice or both. The fourth task required participants to attend to the synchronicity between voice and lip movements. The results show divergent modulations of early ERP components by the different attentional manipulations. For example, when attention was directed to the face (or the voice), incongruent stimuli elicited a reduced N1 as compared to congruent stimuli. This effect was absent when attention was diverted away from the emotionality in both face and voice, suggesting that the detection of emotional incongruence already requires attention. Based on these findings, we question whether multisensory integration of emotion occurs indeed pre-attentively.

  2. MendelianRandomization: an R package for performing Mendelian randomization analyses using summarized data.

    Science.gov (United States)

    Yavorska, Olena O; Burgess, Stephen

    2017-12-01

    MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
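    The core inverse-variance weighted (IVW) calculation that the package implements can be written down compactly; the sketch below is a Python illustration of the standard first-order, fixed-effect IVW formula applied to made-up summary statistics, not the package's R interface:

      import numpy as np

      def ivw_estimate(beta_x, beta_y, se_y):
          """Fixed-effect inverse-variance weighted (IVW) causal estimate.

          beta_x: per-variant associations with the exposure.
          beta_y, se_y: per-variant associations with the outcome and their SEs.
          Weights use only the outcome standard errors, as in the standard
          first-order IVW formula.
          """
          beta_x, beta_y, se_y = map(np.asarray, (beta_x, beta_y, se_y))
          w = beta_x**2 / se_y**2
          estimate = np.sum(beta_x * beta_y / se_y**2) / np.sum(w)
          se = np.sqrt(1.0 / np.sum(w))
          return estimate, se

      # Toy summary statistics for three genetic variants (hypothetical values).
      est, se = ivw_estimate([0.10, 0.20, 0.15],
                             [0.05, 0.11, 0.07],
                             [0.02, 0.03, 0.025])
      print(f"IVW estimate = {est:.3f} +/- {se:.3f}")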

  3. An integrable low-cost hardware random number generator

    Science.gov (United States)

    Ranasinghe, Damith C.; Lim, Daihyun; Devadas, Srinivas; Jamali, Behnam; Zhu, Zheng; Cole, Peter H.

    2005-02-01

    A hardware random number generator is different from a pseudo-random number generator; a pseudo-random number generator approximates the assumed behavior of a real hardware random number generator. Simple pseudo-random number generators suffice for most applications, but demanding situations such as the generation of cryptographic keys require an efficient and cost-effective source of random numbers. Arbiter-based Physical Unclonable Functions (PUFs), proposed for the physical authentication of ICs, exploit the statistical delay variation of wires and transistors across integrated circuits, arising from process variations, to build a secret key unique to each IC. Experimental results and theoretical studies show that a sufficient amount of variation exists across ICs. This variation enables each IC to be identified securely. It is possible to exploit the unreliability of these PUF responses to build a physical random number generator. There exists measurement noise, which comes from the instability of an arbiter when it is in a racing condition, and there exist challenges whose responses are unpredictable: without environmental variations, the responses to these challenges are random in repeated measurements. Compared to other physical random number generators, PUF-based random number generators can be a compact and low-power solution, since the generator need only be turned on when required. A 64-stage PUF circuit costs less than 1000 gates and the circuit can be implemented using a standard IC manufacturing process. In this paper we present a fast and efficient random number generator and analyse the quality of the random numbers produced using an array of tests used by the National Institute of Standards and Technology to evaluate the randomness of random number generators designed for cryptographic applications.
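    A flavour of the NIST-style evaluation mentioned here is given by the frequency (monobit) test, sketched below in Python on software-generated bits purely for illustration; the bit source and sample size are assumptions, not the authors' hardware data:

      import math
      import random

      def monobit_pvalue(bits):
          """NIST SP 800-22 style frequency (monobit) test.

          Maps bits to +/-1, sums them, normalises by sqrt(n) and returns the
          two-sided p-value; very small p-values indicate biased output.
          """
          n = len(bits)
          s = sum(1 if b else -1 for b in bits)
          s_obs = abs(s) / math.sqrt(n)
          return math.erfc(s_obs / math.sqrt(2))

      bits = [random.getrandbits(1) for _ in range(100_000)]
      print("p-value:", monobit_pvalue(bits))   # should usually exceed 0.01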

  4. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood--bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated from the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
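    The convergence of the average concentration to an exponential can be checked numerically; the sketch below assumes an illustrative one-compartment model X_{n+1} = R_n X_n with a uniformly distributed random retention fraction R_n, a simplification of the models discussed:

      import numpy as np

      rng = np.random.default_rng(1)

      # Discrete-time one-compartment model with a random retention fraction:
      # X_{n+1} = R_n * X_n, with R_n i.i.d. on (0, 1).
      n_steps, n_paths, x0 = 50, 20_000, 1.0
      R = rng.uniform(0.6, 0.9, size=(n_paths, n_steps))   # illustrative retention range
      paths = x0 * np.cumprod(R, axis=1)

      mean_retention = 0.75                                # E[R_n] for Uniform(0.6, 0.9)
      empirical = paths.mean(axis=0)
      theoretical = x0 * mean_retention ** np.arange(1, n_steps + 1)
      # The mean concentration decays exponentially, matching the theory closely.
      print(np.max(np.abs(empirical - theoretical)))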

  5. QUALITY OF MINIMALLY PROCESSED ‘FUJI’ APPLE UNDER REFRIGERATED STORAGE AND TREATMENT WITH ADDITIVES

    Directory of Open Access Journals (Sweden)

    MARINES BATALHA MORENO

    Full Text Available ABSTRACT The aim of this study was to evaluate the ability to prolong the useful life of minimally processed ‘Fuji’ apple by applying individual or combined additives (L-cysteine chloride, L-ascorbic acid and calcium chloride) and to determine the appropriate storage period of the whole fruit before performing the minimal processing. The experiment followed a completely randomized three-factor design with three replications. Factor A comprised the pre-processing storage periods of whole apples in cold chambers (20, 78, 138 and 188 days); factor B comprised the post-processing storage periods, simulating shelf life (3, 6, 9 and 12 days); and factor C comprised the chemical additives (distilled water as control, 0.5% L-cysteine chloride, 1% L-ascorbic acid, 0.5% L-cysteine chloride together with 1% calcium chloride, and 1% L-ascorbic acid together with 1% calcium chloride). The dependent variables evaluated were pulp color (L* and hº), soluble solids, titratable acidity, content of phenolic compounds, antioxidant capacity and quantification of polyphenol oxidase. In addition, the presence or absence of Salmonella sp. and Escherichia coli was analyzed. Prolonging the refrigerated storage time of whole ‘Fuji’ apples increases susceptibility to browning and softening after processing from 78 days of storage onwards. The use of additives in the process helps to prevent these problems, especially when 0.5% L-cysteine chloride is combined with 1% calcium chloride, achieving excellent conservation on the refrigerated shelf for up to 6 days. From a microbiological standpoint, the minimally processed apples are toxicologically safe.

  6. Reducing under-reporting of stigmatized health events using the List Experiment: results from a randomized, population-based study of abortion in Liberia.

    Science.gov (United States)

    Moseson, Heidi; Massaquoi, Moses; Dehlendorf, Christine; Bawo, Luke; Dahn, Bernice; Zolia, Yah; Vittinghoff, Eric; Hiatt, Robert A; Gerdts, Caitlin

    2015-12-01

    Direct measurement of sensitive health events is often limited by high levels of under-reporting due to stigma and concerns about privacy. Abortion in particular is notoriously difficult to measure. This study implements a novel method to estimate the cumulative lifetime incidence of induced abortion in Liberia. In a randomly selected sample of 3219 women aged 15-49 years, surveyed in June 2013 in Liberia, we implemented the ‘Double List Experiment’. To measure abortion incidence, each woman was read two lists: (A) a list of non-sensitive items and (B) a list of correlated non-sensitive items. The sensitive item, abortion, was randomly added to either List A or List B for each respondent. The respondent reported a simple count of the options on each list that she had experienced, without indicating which options. The differences in mean counts between the two versions of each list were then averaged to provide an estimate of the population proportion that has had an abortion. The list experiment estimates that 32% (95% confidence interval: 29-34%) of respondents surveyed had ever had an abortion (26% of women in urban areas and 36% of women in rural areas). These estimates are higher than previously reported figures for Liberia, indicating the potential utility of this method to reduce under-reporting in the measurement of abortion. The method could be widely applied to measure other stigmatized health topics, including sexual behaviours, sexual assault or domestic violence.
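    The difference-in-means estimator behind the Double List Experiment is simple to state in code; the sketch below uses simulated counts (hypothetical, not the Liberia data) to illustrate how the two list versions are combined:

      import numpy as np

      def list_experiment_estimate(count_a_treat, count_a_ctrl,
                                   count_b_treat, count_b_ctrl):
          """Double list experiment (difference-in-means) estimator.

          count_a_treat: item counts from respondents whose List A included the
          sensitive item; count_a_ctrl: counts from those whose List A did not
          (and vice versa for List B). Each list gives a prevalence estimate as
          the difference in mean counts; the two estimates are averaged.
          """
          est_a = np.mean(count_a_treat) - np.mean(count_a_ctrl)
          est_b = np.mean(count_b_treat) - np.mean(count_b_ctrl)
          return 0.5 * (est_a + est_b)

      # Toy counts (purely illustrative), with a true prevalence of 0.32.
      rng = np.random.default_rng(3)
      base_a = rng.binomial(3, 0.4, size=1000)
      base_b = rng.binomial(3, 0.5, size=1000)
      sensitive = rng.binomial(1, 0.32, size=1000)
      print(list_experiment_estimate(base_a + sensitive, base_a,
                                     base_b + sensitive, base_b))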

  7. Genetic parameters for quail body weights using a random ...

    African Journals Online (AJOL)

    A model including fixed and random linear regressions is described for analyzing body weights at different ages. In this study, (co)variance components, heritabilities for quail weekly weights and genetic correlations among these weights were estimated using a random regression model by DFREML under DXMRR option.

  8. Tuned mass absorbers on damped structures under random load

    DEFF Research Database (Denmark)

    Krenk, Steen; Høgsberg, Jan Becker

    2008-01-01

    A substantial literature exists on the optimal choice of parameters of a tuned mass absorber on a structure excited by a force or by ground acceleration with random characteristics in the form of white noise. In the absence of structural damping the optimal frequency tuning is determined from the mass ratio alone, and the damping can be determined subsequently. Only approximate results are available for the influence of damping in the original structure, typically in the form of series expansions. In the present paper it is demonstrated that for typical mass ratios in the order of a few percent an explicit result can be obtained for the response variance of a structure with initial damping in terms of the mass ratio and both damping ratios. Within this format the optimal tuning of the absorber turns out to be independent of the structural damping, and a simple explicit expression is obtained for the equivalent total damping.
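    The quantity discussed here, the stationary response variance of a damped structure carrying a tuned mass absorber under white-noise forcing, can be checked numerically from a state-space model via a Lyapunov equation; the sketch below uses illustrative parameter values and is not the closed-form result derived in the paper:

      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov

      # Two-DOF model: main mass m1 (stiffness k1, damping c1 to ground) with an
      # absorber m2 attached through k2, c2.  A white-noise force of intensity S0
      # acts on the main mass; the stationary covariance P solves A P + P A^T + Q = 0.
      m1, m2 = 1.0, 0.02                    # 2% mass ratio (illustrative)
      w1, zeta1 = 2 * np.pi, 0.01           # structure frequency and damping
      w2, zeta2 = 0.98 * w1, 0.08           # absorber tuning (illustrative values)
      k1, c1 = m1 * w1**2, 2 * zeta1 * m1 * w1
      k2, c2 = m2 * w2**2, 2 * zeta2 * m2 * w2

      A = np.array([
          [0, 0, 1, 0],
          [0, 0, 0, 1],
          [-(k1 + k2) / m1,  k2 / m1, -(c1 + c2) / m1,  c2 / m1],
          [ k2 / m2,        -k2 / m2,   c2 / m2,       -c2 / m2],
      ])
      S0 = 1.0
      Q = np.zeros((4, 4))
      Q[2, 2] = S0 / m1**2                  # noise enters through the main-mass acceleration

      P = solve_continuous_lyapunov(A, -Q)
      print("displacement variance of the main mass:", P[0, 0])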

  9. Using Pupil Diameter Changes for Measuring Mental Workload under Mental Processing

    Science.gov (United States)

    Batmaz, Ihsan; Ozturk, Mustafa

    In this study, the aim is to evaluate mental workload using a practical approach based on measuring the pupil diameter changes that occur during mental processing. To determine the mental effort required for each task, video recordings of the subjects' eyes were taken while they performed different tasks, and pupil diameters were measured from the recordings. A group of university students (one female, nine males) participated in the experiment. Additionally, the NASA-TLX questionnaire was applied for the related mental tasks. To compare the results obtained from the two indices, the correlation coefficient was calculated for each task. The results show a weak, negative correlation between the indices for every task except the third. The pupil diameter measurements also show that the pupil dilates under mental workload while the related tasks are being performed. For all tasks, pupil diameters during the response periods increased relative to the reference baseline period.

  10. Random tensors

    CERN Document Server

    Gurau, Razvan

    2017-01-01

    Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in Physics confirming long standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce today the need to find an underlying common framework of the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....

  11. Training processes in under 6s football competition: The transition from ingenuity to institutionalization

    OpenAIRE

    Abel Merino Orozco; Ana Arraiz Pérez; Fernando Sabirón Sierra

    2016-01-01

    Under 6s football competition is a school sport with inherent educational implications. Moreover, it is a booming non-formal socio-educational setting in which families and children place their training expectations and dreams. The aim is to understand the emerging learning processes promoted in this environment for 6-year-old children, as the child begins the process of institutionalization in regulated sport. The research uses a case-study design in the ethnographic mode, through participant obser...

  12. Markov counting and reward processes for analysing the performance of a complex system subject to random inspections

    International Nuclear Information System (INIS)

    Ruiz-Castro, Juan Eloy

    2016-01-01

    In this paper, a discrete complex reliability system subject to internal failures and external shocks is modelled algorithmically. Two types of internal failure are considered: repairable and non-repairable. When a repairable failure occurs, the unit goes to corrective repair. In addition, the unit is subject to external shocks that may produce an aggravation of the internal degradation level, cumulative damage or extreme failure. When a damage threshold is reached, the unit must be removed. When a non-repairable failure occurs, the device is replaced by a new, identical one. The internal performance and the external damage are partitioned into performance levels. Random inspections are carried out. When an inspection takes place, the internal performance of the system and the damage caused by external shocks are observed and, if necessary, the unit is sent to preventive maintenance. If the inspection observes a minor state for the internal performance and/or external damage, then these states remain in memory when the unit goes to corrective or preventive maintenance. Transient and stationary analyses are performed. Markov counting and reward processes are developed in computational form to analyse the performance and profitability of the system with and without preventive maintenance. These aspects are implemented computationally with Matlab. - Highlights: • A multi-state device is modelled in an algorithmic and computational form. • The performance is partitioned into multi-states and degradation levels. • Several types of failures are considered, with repair times according to degradation levels. • Preventive maintenance in response to random inspection is introduced. • Performance and profitability are analysed through Markov counting and reward processes.
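    A minimal discrete-time Markov reward calculation conveys the idea of analysing profitability through reward processes; the states, transition matrix and rewards below are hypothetical placeholders, far simpler than the multi-state model of the paper:

      import numpy as np

      # Three observable macro-states of a unit (0 = operative, 1 = corrective
      # repair, 2 = preventive maintenance) with a hypothetical transition matrix
      # and a net reward (income minus cost) per time step.
      P = np.array([
          [0.90, 0.06, 0.04],
          [0.60, 0.40, 0.00],
          [0.70, 0.00, 0.30],
      ])
      reward = np.array([1.0, -0.8, -0.3])

      # Stationary distribution: left eigenvector of P for eigenvalue 1.
      vals, vecs = np.linalg.eig(P.T)
      pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
      pi /= pi.sum()

      print("stationary distribution:", pi)
      print("long-run expected reward per step:", pi @ reward)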

  13. Polarized ensembles of random pure states

    International Nuclear Information System (INIS)

    Cunden, Fabio Deelan; Facchi, Paolo; Florio, Giuseppe

    2013-01-01

    A new family of polarized ensembles of random pure states is presented. These ensembles are obtained by linear superposition of two random pure states with suitable distributions, and are quite manageable. We will use the obtained results for two purposes: on the one hand we will be able to derive an efficient strategy for sampling states from isopurity manifolds. On the other, we will characterize the deviation of a pure quantum state from separability under the influence of noise. (paper)
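    A toy version of such an ensemble can be sampled by superposing two independent random pure states and examining, for example, the purity of a subsystem; the weight and dimensions below are illustrative and the construction is only loosely modelled on the paper's definition:

      import numpy as np

      rng = np.random.default_rng(7)

      def random_pure_state(dim):
          """Haar-like random pure state: normalised complex Gaussian vector."""
          v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
          return v / np.linalg.norm(v)

      def polarized_state(dim, weight):
          """Normalised superposition sqrt(w)|psi1> + sqrt(1-w)|psi2> of two
          independent random pure states (illustrative 'polarized' sample)."""
          psi = np.sqrt(weight) * random_pure_state(dim) \
              + np.sqrt(1 - weight) * random_pure_state(dim)
          return psi / np.linalg.norm(psi)

      # Average purity of the reduced state of a 4x4 bipartite system.
      d_a, d_b, w = 4, 4, 0.7
      purities = []
      for _ in range(2000):
          psi = polarized_state(d_a * d_b, w).reshape(d_a, d_b)
          rho_a = psi @ psi.conj().T            # reduced density matrix of subsystem A
          purities.append(np.real(np.trace(rho_a @ rho_a)))
      print("mean subsystem purity:", np.mean(purities))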

  14. The underlying processes of a soil mite metacommunity on a small scale

    Science.gov (United States)

    Guo, Chuanwei; Lin, Lin; Wu, Donghui; Zhang, Limin

    2017-01-01

    Metacommunity theory provides an understanding of how ecological processes regulate local community assemblies. However, few field studies have evaluated the underlying mechanisms of a metacommunity on a small scale through revealing the relative roles of spatial and environmental filtering in structuring local community composition. Based on a spatially explicit sampling design in 2012 and 2013, this study aims to evaluate the underlying processes of a soil mite metacommunity on a small spatial scale (50 m) in a temperate deciduous forest located at the Maoershan Ecosystem Research Station, Northeast China. Moran’s eigenvector maps (MEMs) were used to model independent spatial variables. The relative importance of spatial (including trend variables, i.e., geographical coordinates, and broad- and fine-scale spatial variables) and environmental factors in driving the soil mite metacommunity was determined by variation partitioning. Mantel and partial Mantel tests and a redundancy analysis (RDA) were also used to identify the relative contributions of spatial and environmental variables. The results of variation partitioning suggested that the relatively large and significant variance was a result of spatial variables (including broad- and fine-scale spatial variables and trend), indicating the importance of dispersal limitation and autocorrelation processes. The significant contribution of environmental variables was detected in 2012 based on a partial Mantel test, and soil moisture and soil organic matter were especially important for the soil mite metacommunity composition in both years. The study suggested that the soil mite metacommunity was primarily regulated by dispersal limitation due to broad-scale and neutral biotic processes at a fine-scale and that environmental filtering might be of subordinate importance. In conclusion, a combination of metacommunity perspectives between neutral and species sorting theories was suggested to be important in the

  15. The underlying processes of a soil mite metacommunity on a small scale.

    Directory of Open Access Journals (Sweden)

    Chengxu Dong

    Full Text Available Metacommunity theory provides an understanding of how ecological processes regulate local community assemblies. However, few field studies have evaluated the underlying mechanisms of a metacommunity on a small scale through revealing the relative roles of spatial and environmental filtering in structuring local community composition. Based on a spatially explicit sampling design in 2012 and 2013, this study aims to evaluate the underlying processes of a soil mite metacommunity on a small spatial scale (50 m) in a temperate deciduous forest located at the Maoershan Ecosystem Research Station, Northeast China. Moran's eigenvector maps (MEMs) were used to model independent spatial variables. The relative importance of spatial (including trend variables, i.e., geographical coordinates, and broad- and fine-scale spatial variables) and environmental factors in driving the soil mite metacommunity was determined by variation partitioning. Mantel and partial Mantel tests and a redundancy analysis (RDA) were also used to identify the relative contributions of spatial and environmental variables. The results of variation partitioning suggested that the relatively large and significant variance was a result of spatial variables (including broad- and fine-scale spatial variables and trend), indicating the importance of dispersal limitation and autocorrelation processes. The significant contribution of environmental variables was detected in 2012 based on a partial Mantel test, and soil moisture and soil organic matter were especially important for the soil mite metacommunity composition in both years. The study suggested that the soil mite metacommunity was primarily regulated by dispersal limitation due to broad-scale and neutral biotic processes at a fine-scale and that environmental filtering might be of subordinate importance. In conclusion, a combination of metacommunity perspectives between neutral and species sorting theories was suggested to be important

  16. Law of large numbers and central limit theorem for randomly forced PDE's

    CERN Document Server

    Shirikyan, A

    2004-01-01

    We consider a class of dissipative PDE's perturbed by an external random force. Under the condition that the distribution of perturbation is sufficiently non-degenerate, a strong law of large numbers (SLLN) and a central limit theorem (CLT) for solutions are established and the corresponding rates of convergence are estimated. It is also shown that the estimates obtained are close to being optimal. The proofs are based on the property of exponential mixing for the problem in question and some abstract SLLN and CLT for mixing-type Markov processes.

  17. A cyano-terminated dithienyldiketopyrrolopyrrole dimer as a solution processable ambipolar semiconductor under ambient conditions.

    Science.gov (United States)

    Wang, Li; Zhang, Xiaojie; Tian, Hongkun; Lu, Yunfeng; Geng, Yanhou; Wang, Fosong

    2013-12-14

    A cyano-terminated dimer of dithienyldiketopyrrolopyrrole (TDPP), DPP2-CN, is a solution processable ambipolar semiconductor with field-effect hole and electron mobilities of 0.066 and 0.033 cm(2) V(-1) s(-1), respectively, under ambient conditions.

  18. 78 FR 70088 - Agency Proposed Business Process Vision Under the Rehabilitation Act of 1973

    Science.gov (United States)

    2013-11-22

    SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2013-0042] Agency Proposed Business Process Vision Under the Rehabilitation Act of 1973. AGENCY: Social Security Administration (SSA). ACTION: Notice of... SUPPLEMENTARY INFORMATION: Background... site, Social Security Online, at http://www.socialsecurity.gov .

  19. Temporal changes in randomness of bird communities across Central Europe.

    Science.gov (United States)

    Renner, Swen C; Gossner, Martin M; Kahl, Tiemo; Kalko, Elisabeth K V; Weisser, Wolfgang W; Fischer, Markus; Allan, Eric

    2014-01-01

    Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high portion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.
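    The 'nugget' described here is simply the intercept of a dissimilarity-distance regression; the sketch below demonstrates the computation on synthetic site-by-species data, with Bray-Curtis dissimilarity assumed as the dissimilarity measure:

      import numpy as np

      rng = np.random.default_rng(11)

      def bray_curtis(a, b):
          """Bray-Curtis dissimilarity between two abundance vectors."""
          return np.abs(a - b).sum() / (a + b).sum()

      # Synthetic 'forest patches': coordinates plus species abundance vectors.
      n_sites, n_species = 40, 25
      coords = rng.uniform(0, 100, size=(n_sites, 2))
      abund = rng.poisson(3.0, size=(n_sites, n_species))

      dists, dissims = [], []
      for i in range(n_sites):
          for j in range(i + 1, n_sites):
              dists.append(np.linalg.norm(coords[i] - coords[j]))
              dissims.append(bray_curtis(abund[i], abund[j]))

      # 'Nugget' = intercept of the dissimilarity-distance regression, read as
      # the compositional dissimilarity expected at zero spatial distance.
      slope, intercept = np.polyfit(dists, dissims, 1)
      print("estimated nugget (degree-of-randomness proxy):", intercept)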

  20. Temporal changes in randomness of bird communities across Central Europe.

    Directory of Open Access Journals (Sweden)

    Swen C Renner

    Full Text Available Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high portion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.

  1. Snake representation of a superprocess in random environment

    OpenAIRE

    Mytnik, Leonid; Xiong, Jie; Zeitouni, Ofer

    2011-01-01

    We consider (discrete time) branching particles in a random environment which is i.i.d. in time and possibly spatially correlated. We prove a representation of the limit process by means of a Brownian snake in random environment.

  2. Generating equilateral random polygons in confinement II

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue an earlier study (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202) on the generation algorithms of random equilateral polygons confined in a sphere. Here, the equilateral random polygons are rooted at the center of the confining sphere and the confining sphere behaves like an absorbing boundary. One way to generate such a random polygon is the accept/reject method in which an unconditioned equilateral random polygon rooted at origin is generated. The polygon is accepted if it is within the confining sphere, otherwise it is rejected and the process is repeated. The algorithm proposed in this paper offers an alternative to the accept/reject method, yielding a faster generation process when the confining sphere is small. In order to use this algorithm effectively, a large, reusable data set needs to be pre-computed only once. We derive the theoretical distribution of the given random polygon model and demonstrate, with strong numerical evidence, that our implementation of the algorithm follows this distribution. A run time analysis and a numerical error estimate are given at the end of the paper. (paper)

  3. Information content versus word length in random typing

    International Nuclear Information System (INIS)

    Ferrer-i-Cancho, Ramon; Moscoso del Prado Martín, Fermín

    2011-01-01

    Recently, it has been claimed that a linear relationship between a measure of information content and word length is expected from word length optimization and it has been shown that this linearity is supported by a strong correlation between information content and word length in many languages (Piantadosi et al 2011 Proc. Nat. Acad. Sci. 108 3825). Here, we study in detail some connections between this measure and standard information theory. The relationship between the measure and word length is studied for the popular random typing process where a text is constructed by pressing keys at random from a keyboard containing letters and a space behaving as a word delimiter. Although this random process does not optimize word lengths according to information content, it exhibits a linear relationship between information content and word length. The exact slope and intercept are presented for three major variants of the random typing process. A strong correlation between information content and word length can simply arise from the units making a word (e.g., letters) and not necessarily from the interplay between a word and its context as proposed by Piantadosi and co-workers. In itself, the linear relation does not entail the results of any optimization process. (letter)
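    The linear relation between information content and word length in random typing is easy to reproduce by simulation; the sketch below uses an alphabet of five letters plus a space and an empirical per-word-type information measure, which is a simplification of the contextual measure discussed:

      import math
      import random
      from collections import Counter

      random.seed(42)

      # Random typing: press one of N letters or the space bar uniformly at random;
      # the space acts as the word delimiter.
      N = 5
      keys = [chr(ord('a') + i) for i in range(N)] + [' ']
      text = ''.join(random.choices(keys, k=2_000_000))
      words = [w for w in text.split(' ') if w]

      freq = Counter(words)
      total = sum(freq.values())

      # Keep only word types seen often enough for a stable frequency estimate
      # (threshold is an assumption of this sketch).
      items = [(w, c) for w, c in freq.items() if c >= 10]
      lengths = [len(w) for w, _ in items]
      info = [-math.log(c / total) for _, c in items]   # empirical information content

      # Pearson correlation between word length and information content.
      n = len(lengths)
      mx, my = sum(lengths) / n, sum(info) / n
      cov = sum((x - mx) * (y - my) for x, y in zip(lengths, info))
      sx = math.sqrt(sum((x - mx) ** 2 for x in lengths))
      sy = math.sqrt(sum((y - my) ** 2 for y in info))
      print("correlation(length, information):", cov / (sx * sy))   # strongly positive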

  4. Evaluation of Gas Phase Dispersion in Flotation under Predetermined Hydrodynamic Conditions

    Science.gov (United States)

    Młynarczykowska, Anna; Oleksik, Konrad; Tupek-Murowany, Klaudia

    2018-03-01

    Results of various investigations show a relationship between flotation parameters and gas distribution in a flotation cell. The size of gas bubbles is a random variable with a specific distribution, and the analysis of this distribution is useful for a mathematical description of the flotation process. The flotation process depends on many variable factors, mainly events such as the collision of a single particle with a gas bubble, the adhesion of the particle to the bubble surface and the detachment process. These factors are characterized by randomness; because of that, it is only possible to speak of the probability of occurrence of one of these events, which directly affects the speed of the process and thus the flotation rate constant. The probability of bubble-particle collision in a flotation chamber with mechanical pulp agitation depends on the surface tension of the solution, air consumption, degree of pulp aeration, energy dissipation and average feed particle size. Appropriate identification and description of the parameters of gas bubble dispersion help to complete the analysis of the flotation process under specific physicochemical and hydrodynamic conditions for any raw material. The article presents the results of measurements and analysis of gas phase dispersion through the size distribution of air bubbles in a flotation chamber under fixed hydrodynamic conditions. The tests were carried out in the Laboratory of Instrumental Methods in the Department of Environmental Engineering and Mineral Processing, Faculty of Mining and Geoengineering, AGH University of Science and Technology in Krakow.

  5. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to

  6. Tritium test of the tritium processing components under the Annex III US-Japan Collaboration

    International Nuclear Information System (INIS)

    Konishi, Satoshi; Yoshida, Hiroshi; Naruse, Yuji; Binning, K.E.; Carlson, R.V.; Bartlit, J.R.; Anderson, J.L.

    1993-03-01

    The process-ready components for the Fuel Cleanup System were tested at TSTA under the US-Japan Collaboration program. A palladium diffuser for tritium purification and a ceramic electrolysis cell for the decomposition of tritiated water were each tested with pure tritium over several years. The characteristics of the components with hydrogen isotopes, the effects of impurities, and the long-term reliability of the components were studied. It was concluded that these components are suitable and attractive for fusion fuel processing systems. (author)

  7. The physics of randomness and regularities for languages (lifetimes, family trees, and the second languages); in terms of random matrices

    OpenAIRE

    Tuncay, Caglar

    2007-01-01

    The physics of randomness and regularities for languages (mother tongues), their lifetimes and family trees, and for second languages is studied in terms of two opposing processes: random multiplicative noise [1] and fragmentation [2], where the original model is given in matrix format. We start with a random initial world and arrive at regularities which mimic various empirical data [3] for present-day languages.

  8. Multiple Scattering in Random Mechanical Systems and Diffusion Approximation

    Science.gov (United States)

    Feres, Renato; Ng, Jasmine; Zhang, Hong-Kun

    2013-10-01

    This paper is concerned with stochastic processes that model multiple (or iterated) scattering in classical mechanical systems of billiard type, defined below. From a given (deterministic) system of billiard type, a random process with transition probabilities operator P is introduced by assuming that some of the dynamical variables are random with prescribed probability distributions. Of particular interest are systems with weak scattering, which are associated to parametric families of operators P_h, depending on a geometric or mechanical parameter h, that approach the identity as h goes to 0. It is shown that (P_h - I)/h converges for small h to a second-order elliptic differential operator L on compactly supported functions, and that the Markov chain process associated to P_h converges to a diffusion with infinitesimal generator L. Both P_h and L are self-adjoint, (densely) defined on the space of square-integrable functions over the (lower) half-space with respect to a stationary measure η. This measure's density is either the (post-collision) Maxwell-Boltzmann distribution or the Knudsen cosine law, and the random processes with infinitesimal generator L respectively correspond to what we call MB diffusion and (generalized) Legendre diffusion. Concrete examples of simple mechanical systems are given and illustrated by numerically simulating the random processes.

  9. 30 CFR 285.612 - How will my SAP be processed for Federal consistency under the Coastal Zone Management Act?

    Science.gov (United States)

    2010-07-01

    Title 30 (Mineral Resources), Vol. 2, 2010-07-01 edition: How will my SAP be processed for Federal consistency under the Coastal Zone Management Act? Section 285.612, MINERALS... Plan § 285.612...

  10. 30 CFR 285.647 - How will my GAP be processed for Federal consistency under the Coastal Zone Management Act?

    Science.gov (United States)

    2010-07-01

    Title 30 (Mineral Resources), Vol. 2, 2010-07-01 edition: How will my GAP be processed for Federal consistency under the Coastal Zone Management Act? Section 285.647, MINERALS... Activities Plan § 285.647...

  11. Body Weight Management in Adults Under Chronic Stress Through Treatment With Ashwagandha Root Extract: A Double-Blind, Randomized, Placebo-Controlled Trial.

    Science.gov (United States)

    Choudhary, Dnyanraj; Bhattacharyya, Sauvik; Joshi, Kedar

    2017-01-01

    Chronic stress has been associated with a number of illnesses, including obesity. Ashwagandha is a well-known adaptogen and known for reducing stress and anxiety in humans. The objective of this study was to evaluate the safety and efficacy of a standardized root extract of Ashwagandha through a double-blind, randomized, placebo-controlled trial. A total of 52 subjects under chronic stress received either Ashwagandha (300 mg) or placebo twice daily. Primary efficacy measures were Perceived Stress Scale and Food Cravings Questionnaire. Secondary efficacy measures were Oxford Happiness Questionnaire, Three-Factor Eating Questionnaire, serum cortisol, body weight, and body mass index. Each subject was assessed at the start and at 4 and 8 weeks. The treatment with Ashwagandha resulted in significant improvements in primary and secondary measures. Also, the extract was found to be safe and tolerable. The outcome of this study suggests that Ashwagandha root extract can be used for body weight management in adults under chronic stress. © The Author(s) 2016.

  12. Response of stiff piles to random two-way lateral loading

    DEFF Research Database (Denmark)

    Bakmar, Christian LeBlanc; Byrne, B.W.; Houlsby, G. T.

    2010-01-01

    A model for predicting the accumulated rotation of stiff piles under random two-way loading is presented. The model is based on a strain superposition rule similar to Miner's rule and uses rainflow-counting to decompose a random time-series of varying loads into a set of simple load reversals. The method is consistent with the work of LeBlanc et al. (2010) and is supported by 1g laboratory tests. An example is given for an offshore wind turbine, indicating that the accumulated pile rotation during the life of the turbine is dominated by the worst expected load.

  13. Multi-agent coordination in directed moving neighbourhood random networks

    International Nuclear Information System (INIS)

    Yi-Lun, Shang

    2010-01-01

    This paper considers the consensus problem of dynamical multiple agents that communicate via a directed moving neighbourhood random network. Each agent performs random walk on a weighted directed network. Agents interact with each other through random unidirectional information flow when they coincide in the underlying network at a given instant. For such a framework, we present sufficient conditions for almost sure asymptotic consensus. Numerical examples are taken to show the effectiveness of the obtained results. (general)

  14. Depolarization current relaxation process of insulating dielectrics after corona poling under different charging conditions

    Directory of Open Access Journals (Sweden)

    J. W. Zhang

    2017-10-01

    Full Text Available As an insulating dielectric, polyimide is favorable for applications in optoelectronics, electrical insulation systems in the electric power industry, and insulating and packaging materials in spacecraft, due to its excellent thermal, mechanical and electrical insulating stability. The charge storage profile of such an insulating dielectric is of utmost importance to its application when it is exposed to electron irradiation, high-voltage corona discharge or other treatments, since these treatments can induce changes in the physical and chemical properties of the treated samples. To investigate the charge storage mechanism of the insulating dielectric after high-voltage corona discharge, the relaxation processes of corona-charged polyimide films under different poling conditions were analyzed by the Thermally Stimulated Discharge Currents (TSDC) method. The appearance of various peaks in the TSDC spectra provides a deep insight into the molecular status of the dielectric material and reflects the relaxation of stored space charge in the insulating polymers after corona discharge treatment. Furthermore, the space charge distribution under various poling temperatures and discharge voltage levels was also investigated, which partly reflects the influence of ambient conditions on the functional dielectric after corona poling.

  15. Depolarization current relaxation process of insulating dielectrics after corona poling under different charging conditions

    Science.gov (United States)

    Zhang, J. W.; Zhou, T. C.; Wang, J. X.; Yang, X. F.; Zhu, F.; Tian, L. M.; Liu, R. T.

    2017-10-01

    As an insulating dielectric, polyimide is favorable for applications in optoelectronics, electrical insulation systems in the electric power industry, and insulating and packaging materials in spacecraft, due to its excellent thermal, mechanical and electrical insulating stability. The charge storage profile of such an insulating dielectric is of utmost importance to its application when it is exposed to electron irradiation, high-voltage corona discharge or other treatments, since these treatments can induce changes in the physical and chemical properties of the treated samples. To investigate the charge storage mechanism of the insulating dielectric after high-voltage corona discharge, the relaxation processes of corona-charged polyimide films under different poling conditions were analyzed by the Thermally Stimulated Discharge Currents (TSDC) method. The appearance of various peaks in the TSDC spectra provides a deep insight into the molecular status of the dielectric material and reflects the relaxation of stored space charge in the insulating polymers after corona discharge treatment. Furthermore, the space charge distribution under various poling temperatures and discharge voltage levels was also investigated, which partly reflects the influence of ambient conditions on the functional dielectric after corona poling.

  16. Subgeometric Ergodicity Analysis of Continuous-Time Markov Chains under Random-Time State-Dependent Lyapunov Drift Conditions

    Directory of Open Access Journals (Sweden)

    Mokaedi V. Lekgari

    2014-01-01

    Full Text Available We investigate random-time state-dependent Foster-Lyapunov analysis on subgeometric rate ergodicity of continuous-time Markov chains (CTMCs. We are mainly concerned with making use of the available results on deterministic state-dependent drift conditions for CTMCs and on random-time state-dependent drift conditions for discrete-time Markov chains and transferring them to CTMCs.

  17. A Partial Backlogging Inventory Model for Deteriorating Item under Fuzzy Inflation and Discounting over Random Planning Horizon: A Fuzzy Genetic Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Dipak Kumar Jana

    2013-01-01

    Full Text Available An inventory model for deteriorating item is considered in a random planning horizon under inflation and time value money. The model is described in two different environments: random and fuzzy random. The proposed model allows stock-dependent consumption rate and shortages with partial backlogging. In the fuzzy stochastic model, possibility chance constraints are used for defuzzification of imprecise expected total profit. Finally, genetic algorithm (GA and fuzzy simulation-based genetic algorithm (FSGA are used to make decisions for the above inventory models. The models are illustrated with some numerical data. Sensitivity analysis on expected profit function is also presented. Scope and Purpose. The traditional inventory model considers the ideal case in which depletion of inventory is caused by a constant demand rate. However, to keep sales higher, the inventory level would need to remain high. Of course, this would also result in higher holding or procurement cost. Also, in many real situations, during a longer-shortage period some of the customers may refuse the management. For instance, for fashionable commodities and high-tech products with short product life cycle, the willingness for a customer to wait for backlogging is diminishing with the length of the waiting time. Most of the classical inventory models did not take into account the effects of inflation and time value of money. But in the past, the economic situation of most of the countries has changed to such an extent due to large-scale inflation and consequent sharp decline in the purchasing power of money. So, it has not been possible to ignore the effects of inflation and time value of money any more. The purpose of this paper is to maximize the expected profit in the random planning horizon.

  18. An Electrochemical Processing Strategy for Improving Tribological Performance of Aisi 316 Stainless Steel Under Grease Lubrication

    Science.gov (United States)

    Zou, Jiaojuan; Li, Maolin; Lin, Naiming; Zhang, Xiangyu; Qin, Lin; Tang, Bin

    2014-12-01

    In order to improve the tribological performance of AISI 316 stainless steel (316 SS) under grease lubrication, electrochemical processing was conducted on it to obtain a rough (surface texturing-like) surface by making use of the high sensitivity of austenitic stainless steel to pitting corrosion in Cl--rich environment. Numerous corrosion pits or micro-ditches acted as micro-reservoirs on the obtained surface. While the grease could offer consistent lubrication, and then improve the tribological performance of 316 SS. Tribological behaviors of raw 316 SS and the treated sample were measured using a reciprocating type tribometer sliding against GCr15 steel counterpart under dry and grease lubrication conditions. The results showed that the mass losses of the two samples were in the same order of magnitude, and the raw sample exhibited lower friction coefficient in dry sliding. When the tests were conducted under grease lubrication condition, the friction coefficients and mass losses of the treated sample were far lower than those of the raw 316 SS. The tribological performance of 316 SS under grease lubrication was drastically improved after electrochemical processing.

  19. Killing (absorption) versus survival in random motion

    Science.gov (United States)

    Garbaczewski, Piotr

    2017-09-01

    We address diffusion processes in a bounded domain, while focusing on somewhat unexplored affinities between the presence of absorbing and/or inaccessible boundaries. For the Brownian motion (Lévy-stable cases are briefly mentioned) model-independent features are established of the dynamical law that underlies the short-time behavior of these random paths, whose overall lifetime is predefined to be long. As a by-product, the limiting regime of a permanent trapping in a domain is obtained. We demonstrate that the adopted conditioning method, involving the so-called Bernstein transition function, works properly also in an unbounded domain, for stochastic processes with killing (Feynman-Kac kernels play the role of transition densities), provided the spectrum of the related semigroup operator is discrete. The method is shown to be useful in the case, when the spectrum of the generator goes down to zero and no isolated minimal (ground state) eigenvalue is in existence, like in the problem of the long-term survival on a half-line with a sink at origin.

  20. Contextuality in canonical systems of random variables

    Science.gov (United States)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
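    One ingredient of this framework, the maximal coupling of a content-sharing pair of binary random variables, has a simple closed form; the sketch below computes it for hypothetical marginal probabilities and is not the full contextuality criterion established in the article:

      def max_coupling_agreement(p, q):
          """Maximal probability that two binary random variables with
          P(X=1) = p and P(Y=1) = q take equal values under a joint coupling.

          This equals min(p, q) + min(1-p, 1-q) = 1 - |p - q|, i.e. one minus
          the total variation distance between the two marginals.
          """
          return min(p, q) + min(1 - p, 1 - q)

      # Two measurements of the same content recorded in different contexts
      # (marginal probabilities are hypothetical).
      print(max_coupling_agreement(0.7, 0.55))   # 0.85 = 1 - |0.7 - 0.55|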

  1. Search for Directed Networks by Different Random Walk Strategies

    Science.gov (United States)

    Zhu, Zi-Qi; Jin, Xiao-Ling; Huang, Zhi-Long

    2012-03-01

    A comparative study is carried out on the efficiency of five different random walk strategies searching on directed networks constructed from several typical complex networks. Because the differences in search efficiency among the strategies are rooted in network clustering, the clustering coefficient as seen by a random walker on a directed network is defined and computed, and is found to be half of that of the corresponding undirected network. The search processes are performed on directed networks based on the Erdős–Rényi model, the Watts–Strogatz model, the Barabási–Albert model and a clustered scale-free network model. It is found that the self-avoiding random walk strategy is the best search strategy for such directed networks. Compared to the unrestricted random walk strategy, path-iteration-avoiding random walks also make the search process much more efficient. However, no-triangle-loop and no-quadrangle-loop random walks do not improve the search efficiency as expected, which differs from the behaviour on undirected networks, since the clustering coefficient of directed networks is smaller than that of undirected networks.

  2. Evaluation of electron beam irradiation under heating process on vulcanized EPDM

    International Nuclear Information System (INIS)

    Gabriel, Leandro; Cardoso, Jessica R.; Moura, Eduardo; Geraldo, Aurea B.C.

    2015-01-01

    The global consumption of rubber is estimated at around 30.5 million tons in 2015, and an increase of 4.3% in this volume is expected in the coming years. This demand is mainly attributed to the production of elastomeric accessories for the automotive sector. However, the generation of this type of waste also reaches major proportions at the end of its useful life, when the environmental liability must be disposed of. Rubber reprocessing is one alternative, in which the material can be used as filler in other polymer matrices or in other types of materials. The devulcanization process is another alternative, and it includes the study of methods that allow economic viability and waste reduction. Therefore, this study aims to recycle vulcanized EPDM rubber with the use of ionizing radiation. In this work we use electron beam irradiation with simultaneous heating, at absorbed doses from 150 kGy to 800 kGy and a high dose rate of 22.3 kGy/s, on vulcanized EPDM powder and on samples about 4 mm thick. Their characterization, before and after the irradiation process, has been carried out by thermal analysis and the observed changes are discussed. (author)

  3. Evaluation of electron beam irradiation under heating process on vulcanized EPDM

    Energy Technology Data Exchange (ETDEWEB)

    Gabriel, Leandro; Cardoso, Jessica R.; Moura, Eduardo; Geraldo, Aurea B.C., E-mail: lgabriell@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The global consumption of rubber is estimated at around 30.5 million tons in 2015, and an increase of 4.3% in this volume is expected in the coming years. This demand is mainly attributed to the production of elastomeric accessories for the automotive sector. However, the generation of this type of waste also reaches major proportions at the end of its useful life, when the environmental liability must be disposed of. Rubber reprocessing is one alternative, in which the material can be used as filler in other polymer matrices or in other types of materials. The devulcanization process is another alternative, and it includes the study of methods that allow economic viability and waste reduction. Therefore, this study aims to recycle vulcanized EPDM rubber with the use of ionizing radiation. In this work we use electron beam irradiation with simultaneous heating, at absorbed doses from 150 kGy to 800 kGy and a high dose rate of 22.3 kGy/s, on vulcanized EPDM powder and on samples about 4 mm thick. Their characterization, before and after the irradiation process, has been carried out by thermal analysis and the observed changes are discussed. (author)

  4. Randomized clinical trials as reflexive-interpretative process in patients with rheumatoid arthritis: a qualitative study.

    Science.gov (United States)

    de Jorge, Mercedes; Parra, Sonia; de la Torre-Aboki, Jenny; Herrero-Beaumont, Gabriel

    2015-08-01

    Patients in randomized clinical trials have to adapt themselves to a restricted language to capture the necessary information to determine the safety and efficacy of a new treatment. The aim of this study was to explore the experience of patients with rheumatoid arthritis after completing their participation in a biologic therapy randomized clinical trial for a period of 3 years. A qualitative approach was used. The information was collected using 15 semi-structured interviews of patients with rheumatoid arthritis. Data collection was guided by the emergent analysis until no more relevant variations in the categories were found. The data were analysed using the grounded theory method. The objective of the patients when entering the study was to improve their quality of life by initiating the treatment. However, the experience changed the significance of the illness as they acquired skills and practical knowledge related to the management of their disease. The category "Interactional Empowerment" emerged as the core category, as it represented the participative experience in a clinical trial. The process integrates the following categories: "weight of systematisation", "working together" and the significance of the experience, "the duties"; these categories evolved simultaneously. The clinical trial monitoring activities enabled patients to engage in a reflexive-interpretative mechanism that transformed the emotional and symbolic significance of their disease and improved the empowerment of the patient. A better communicative strategy with the health professionals, the relatives of the patients, and the community was also achieved.

  5. Process convergence of self-normalized sums of i.i.d. random ...

    Indian Academy of Sciences (India)

    The study of the asymptotics of self-normalized sums is also interesting. Logan ... if the constituent random variables are from the domain of attraction of a normal distribution ... index of stability α which equals 2 (for definition, see §2).

  6. Cover times of random searches

    Science.gov (United States)

    Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël

    2015-10-01

    How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
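    Cover times are straightforward to estimate by simulation for simple geometries; the sketch below treats an unbiased nearest-neighbour walk on a cycle, for which the mean cover time n(n-1)/2 is classical, and is not one of the optimized strategies analysed in the paper:

      import random

      def cover_time_cycle(n, trials=2000, seed=0):
          """Simulated mean cover time of a simple symmetric random walk on a
          cycle of n sites (the walker must visit every site at least once)."""
          rng = random.Random(seed)
          times = []
          for _ in range(trials):
              pos, visited, steps = 0, {0}, 0
              while len(visited) < n:
                  pos = (pos + rng.choice((-1, 1))) % n
                  visited.add(pos)
                  steps += 1
              times.append(steps)
          return sum(times) / len(times)

      # For the cycle the mean cover time is known to be n*(n-1)/2.
      n = 30
      print(cover_time_cycle(n), "vs theory", n * (n - 1) / 2)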

  7. Experimental Analysis of a Piezoelectric Energy Harvesting System for Harmonic, Random, and Sine on Random Vibration

    Directory of Open Access Journals (Sweden)

    Jackson W. Cryns

    2013-01-01

    Full Text Available Harvesting power with a piezoelectric vibration powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random, and sine on random (SOR input vibration scenarios; the implications of source vibration characteristics on harvester design are discussed. The rise in popularity of harvesting energy from ambient vibrations has made compact, energy dense piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. Variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. The results agree with numerical and theoretical predictions in the previous literature for optimal power harvesting in sinusoidal and flat broadband vibration scenarios. Going beyond idealized steady-state sinusoidal and flat random vibration input, experimental SOR testing allows for more accurate representation of real world ambient vibration. It is shown that characteristic interactions from more complex vibration sources significantly alter power generation and processing requirements by varying harvested power, shifting optimal conditioning impedance, inducing voltage fluctuations, and ultimately rendering idealized sinusoidal and random analyses incorrect.

  8. Dynamics of random Boolean networks under fully asynchronous stochastic update based on linear representation.

    Directory of Open Access Journals (Sweden)

    Chao Luo

    A novel algebraic approach is proposed to study the dynamics of asynchronous random Boolean networks in which a random number of nodes can be updated at each time step (ARBNs). In this article, the logical equations of ARBNs are converted into the discrete-time linear representation and the dynamical behaviors of the systems are investigated. We provide a general formula for the network transition matrices of ARBNs as well as a necessary and sufficient algebraic criterion to determine whether a group of given states composes an attractor of length [Formula: see text] in ARBNs. Consequently, algorithms are obtained to find all of the attractors and basins in ARBNs. Examples are shown to demonstrate the feasibility of the proposed scheme.
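
    The algebraic machinery (linear representation, transition matrices) is not reproduced here; purely as an illustration of the update scheme, the sketch below simulates a small asynchronous random Boolean network in which a random subset of nodes fires at each step and reports when a previously seen state recurs. Network size, wiring and the firing probability are arbitrary assumptions; note that in the stochastic setting a revisited state is not by itself proof of an attractor, which is exactly why the paper develops an algebraic criterion.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 5
      # Random Boolean functions: each node reads 3 random inputs through a random truth table.
      inputs = [rng.integers(0, n, size=3) for _ in range(n)]
      tables = [rng.integers(0, 2, size=8) for _ in range(n)]

      def update(state):
          # Fully asynchronous stochastic update: a random (possibly empty) subset of nodes fires.
          new = state.copy()
          fire = rng.random(n) < 0.5
          for i in np.flatnonzero(fire):
              a, b, c = (state[j] for j in inputs[i])
              new[i] = tables[i][4 * a + 2 * b + c]
          return new

      state = rng.integers(0, 2, size=n)
      seen = {tuple(state)}
      for step in range(1, 10_000):
          state = update(state)
          if tuple(state) in seen:
              print("a previously visited state recurred after", step, "updates")
              break
          seen.add(tuple(state))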

  9. The underlying event in hard scattering processes

    International Nuclear Information System (INIS)

    Field, R.

    2002-01-01

    The authors study the behavior of the underlying event in hard scattering proton-antiproton collisions at 1.8 TeV and compare it with the QCD Monte-Carlo models. The underlying event is everything except the two outgoing hard scattered jets and receives contributions from the beam-beam remnants plus initial- and final-state radiation. The data indicate that neither ISAJET nor HERWIG produces enough charged particles (with p_T > 0.5 GeV/c) from the beam-beam remnant component and that ISAJET produces too many charged particles from initial-state radiation. PYTHIA, which uses multiple parton scattering to enhance the underlying event, does the best job of describing the data.

  10. Cloud Macroscopic Organization: Order Emerging from Randomness

    Science.gov (United States)

    Yuan, Tianle

    2011-01-01

    Clouds play a central role in many aspects of the climate system and their forms and shapes are remarkably diverse. Appropriate representation of clouds in climate models is a major challenge because cloud processes span at least eight orders of magnitude in spatial scales. Here we show that there exists order in the cloud size distribution of low-level clouds, and that it follows a power-law distribution with exponent gamma close to 2. gamma is insensitive to yearly variations in environmental conditions, but has regional variations and land-ocean contrasts. More importantly, we demonstrate that this self-organizing behavior of clouds emerges naturally from a complex network model with simple, physical organizing principles: random clumping and merging. We also demonstrate symmetry between clear and cloudy skies in terms of macroscopic organization because of similar fundamental underlying organizing principles. The order in the apparently complex cloud-clear field thus has its roots in random local interactions. Studying cloud organization with complex network models is an attractive new approach that has wide applications in climate science. We also propose the concept of a cloud statistical mechanics approach. This approach is fully complementary to deterministic models, and, working in tandem, the two approaches provide a powerful framework to meet the challenge of representing clouds in our climate models.
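
    The authors' complex network model is not specified in this record; as a loosely analogous toy illustration of how "random clumping and merging" can produce a heavy-tailed size distribution, the sketch below populates a lattice at random, merges neighbouring occupied cells into clusters, and histograms the cluster sizes. The grid size and occupation probability (chosen near the site-percolation threshold, where the tail is broadest) are illustrative assumptions only.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(4)
      L, p = 512, 0.59                         # grid size and occupation probability (illustrative)
      cloudy = rng.random((L, L)) < p          # random "clumping": each cell is cloudy with probability p

      # "Merging": touching cloudy cells are joined into a single cloud (connected components).
      labels, n_clouds = ndimage.label(cloudy)
      sizes = np.bincount(labels.ravel())[1:]  # drop the background label 0

      # Cloud-size histogram on logarithmic bins, to inspect the tail of the distribution.
      bins = np.logspace(0.0, np.log10(sizes.max() + 1), 20)
      counts, edges = np.histogram(sizes, bins=bins)
      for lo, hi, c in zip(edges[:-1], edges[1:], counts):
          print(f"sizes in [{lo:8.1f}, {hi:8.1f}): {c} clouds")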

  11. Pseudo-random number generation using a 3-state cellular automaton

    Science.gov (United States)

    Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta

    This paper investigates the potential of a 3-neighborhood, 3-state cellular automaton (CA) under periodic boundary conditions as a pseudo-random number generator. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
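
    The specific CA rule selected in the paper is not reproduced in this record; the sketch below only illustrates the general mechanism under stated assumptions: a 3-neighborhood, 3-state automaton on a ring (periodic boundary), driven by an arbitrary randomly chosen rule table, with one cell tapped as a ternary output stream.

      import numpy as np

      rng = np.random.default_rng(5)
      n_cells, n_steps = 64, 1_000
      # A 3-neighborhood, 3-state rule is a lookup table over the 3**3 = 27 neighborhood patterns.
      rule = rng.integers(0, 3, size=27)            # arbitrary stand-in rule, not the paper's choice
      state = rng.integers(0, 3, size=n_cells)      # random initial configuration

      def step(state):
          left, right = np.roll(state, 1), np.roll(state, -1)   # periodic boundary condition
          return rule[9 * left + 3 * state + right]

      stream = []
      for _ in range(n_steps):
          state = step(state)
          stream.append(int(state[0]))              # tap a single cell as the pseudo-random output
      print("first 30 ternary outputs:", stream[:30])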

  12. Neurophysiological processes and functional neuroanatomical structures underlying proactive effects of emotional conflicts.

    Science.gov (United States)

    Schreiter, Marie Luise; Chmielewski, Witold; Beste, Christian

    2018-07-01

    There is a strong inter-relation of cognitive and emotional processes as evidenced by emotional conflict monitoring processes. In the cognitive domain, proactive effects of conflicts have widely been studied; i.e. effects of conflicts in the n-1 trial on trial n. Yet, the neurophysiological processes and associated functional neuroanatomical structures underlying such proactive effects during emotional conflicts have not been investigated. This is done in the current study combining EEG recordings with signal decomposition methods and source localization approaches. We show that an emotional conflict in the n-1 trial differentially influences processing of positive and negative emotions in trial n, but not the processing of conflicts in trial n. The dual competition framework stresses the importance of dissociable 'perceptual' and 'response selection' or cognitive control levels for interactive effects of cognition and emotion. Only once these coding levels were isolated in the neurophysiological data, processes explaining the behavioral effects were detectable. The data show that there is not only a close correspondence between theoretical propositions of the dual competition framework and neurophysiological processes. Rather, processing levels conceptualized in the framework operate in overlapping time windows, but are implemented via distinct functional neuroanatomical structures; the precuneus (BA31) and the insula (BA13). It seems that decoding of information in the precuneus, as well as the integration of information during response selection in the insula is more difficult when confronted with angry facial emotions whenever cognitive control resources have been highly taxed by previous conflicts. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Can we always sweep the details of RNA-processing under the carpet?

    International Nuclear Information System (INIS)

    Klironomos, Filippos D; Berg, Johannes; De Meaux, Juliette

    2013-01-01

    RNA molecules follow a succession of enzyme-mediated processing steps from transcription to maturation. The participating enzymes, for example the spliceosome for mRNAs and Drosha and Dicer for microRNAs, are also produced in the cell and their copy-numbers fluctuate over time. Enzyme copy-number changes affect the processing rate of the substrate molecules; high enzyme numbers increase the processing rate, while low enzyme numbers decrease it. We study different RNA-processing cascades where enzyme copy-numbers are either fixed or fluctuate. We find that for the fixed enzyme copy-numbers, the substrates at steady-state are Poisson-distributed, and the whole RNA cascade dynamics can be understood as a single birth–death process of the mature RNA product. In this case, solely fluctuations in the timing of RNA processing lead to variation in the number of RNA molecules. However, we show analytically and numerically that when enzyme copy-numbers fluctuate, the strength of RNA fluctuations increases linearly with the RNA transcription rate. This linear effect becomes stronger as the speed of enzyme dynamics decreases relative to the speed of RNA dynamics. Interestingly, we find that under certain conditions, the RNA cascade can reduce the strength of fluctuations in the expression level of the mature RNA product. Finally, by investigating the effects of processing polymorphisms, we show that it is possible for the effects of transcriptional polymorphisms to be enhanced, reduced or even reversed. Our results provide a framework to understand the dynamics of RNA processing. (paper)
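
    As a minimal numerical companion to the fixed-enzyme case only (fluctuating enzyme copy-numbers, the paper's main focus, are not modelled here), the following Gillespie-type simulation runs a linear cascade of transcription, processing and degradation with constant rates and checks that the mature-RNA copy number is close to Poissonian (Fano factor near 1) at steady state. All rate constants and times are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(6)
      k_tx, k_proc, k_deg = 10.0, 1.0, 0.5        # transcription, processing, degradation rates (illustrative)
      pre, mat, t = 0, 0, 0.0
      grid = np.arange(500.0, 5000.0, 1.0)        # sampling times, after a burn-in of 500 time units
      samples, g = [], 0

      while g < grid.size:
          rates = np.array([k_tx, k_proc * pre, k_deg * mat])
          dt = rng.exponential(1.0 / rates.sum())
          while g < grid.size and t + dt > grid[g]:
              samples.append(mat)                  # record the copy number that prevails at the grid time
              g += 1
          t += dt
          r = rng.random() * rates.sum()
          if r < rates[0]:
              pre += 1                             # a precursor is transcribed
          elif r < rates[0] + rates[1]:
              pre, mat = pre - 1, mat + 1          # processing: precursor -> mature RNA
          else:
              mat -= 1                             # degradation of a mature RNA

      samples = np.array(samples)
      print("Fano factor of the mature RNA (near 1 for a Poisson law):", samples.var() / samples.mean())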

  14. Decolourisation of dyes under electro-Fenton process using Fe alginate gel beads

    International Nuclear Information System (INIS)

    Rosales, E.; Iglesias, O.; Pazos, M.; Sanromán, M.A.

    2012-01-01

    Highlights: ► Catalytic activity of Fe alginate gel beads for the remediation of wastewater was tested. ► New electro-Fenton process for the remediation of polluted wastewater. ► Continuous dye treatment with high removal and no operational problems. - Abstract: This study focuses on the application of the electro-Fenton technique using the catalytic activity of Fe alginate gel beads for the remediation of wastewater contaminated with synthetic dyes. The Fe alginate gel beads were evaluated for the decolourisation of two typical dyes, Lissamine Green B and Azure B, under the electro-Fenton process. After characterization of the Fe alginate gel beads, the effect of pH on the process with Fe alginate beads was examined and a comparative study of the electro-Fenton process with free Fe and Fe alginate beads was carried out. The results showed that the use of Fe alginate beads increases the efficiency of the process; moreover, the developed particles maintain their physical integrity over a wide pH range (2–8). Around 98–100% dye decolourisation was obtained for both dyes by the electro-Fenton process in successive batches. Therefore, the process was performed with Fe alginate beads in a continuous bubble reactor. High colour removal (87–98%) was attained for both dyes operating at a residence time of 30 min, without operational problems and with the particle shapes maintained throughout the oxidation process. Consequently, the stable performance of the Fe alginate beads opens promising perspectives for the fast and economical treatment of wastewater polluted by dyes or similar organic contaminants.

  15. Decolourisation of dyes under electro-Fenton process using Fe alginate gel beads

    Energy Technology Data Exchange (ETDEWEB)

    Rosales, E.; Iglesias, O.; Pazos, M. [Department of Chemical Engineering, University of Vigo, Isaac Newton Building, Campus As Lagoas, Marcosende 36310, Vigo (Spain); Sanroman, M.A., E-mail: sanroman@uvigo.es [Department of Chemical Engineering, University of Vigo, Isaac Newton Building, Campus As Lagoas, Marcosende 36310, Vigo (Spain)

    2012-04-30

    Highlights: ► Catalytic activity of Fe alginate gel beads for the remediation of wastewater was tested. ► New electro-Fenton process for the remediation of polluted wastewater. ► Continuous dye treatment with high removal and no operational problems. - Abstract: This study focuses on the application of the electro-Fenton technique using the catalytic activity of Fe alginate gel beads for the remediation of wastewater contaminated with synthetic dyes. The Fe alginate gel beads were evaluated for the decolourisation of two typical dyes, Lissamine Green B and Azure B, under the electro-Fenton process. After characterization of the Fe alginate gel beads, the effect of pH on the process with Fe alginate beads was examined and a comparative study of the electro-Fenton process with free Fe and Fe alginate beads was carried out. The results showed that the use of Fe alginate beads increases the efficiency of the process; moreover, the developed particles maintain their physical integrity over a wide pH range (2–8). Around 98–100% dye decolourisation was obtained for both dyes by the electro-Fenton process in successive batches. Therefore, the process was performed with Fe alginate beads in a continuous bubble reactor. High colour removal (87–98%) was attained for both dyes operating at a residence time of 30 min, without operational problems and with the particle shapes maintained throughout the oxidation process. Consequently, the stable performance of the Fe alginate beads opens promising perspectives for the fast and economical treatment of wastewater polluted by dyes or similar organic contaminants.

  16. Dimer coverings on random multiple chains of planar honeycomb lattices

    International Nuclear Information System (INIS)

    Ren, Haizhen; Zhang, Fuji; Qian, Jianguo

    2012-01-01

    We study dimer coverings on random multiple chains. A multiple chain is a planar honeycomb lattice constructed by successively fusing copies of a ‘straight’ condensed hexagonal chain at the bottom of the previous one in two possible ways. A random multiple chain is then generated by imposing the Bernoulli distribution on the two types of fusing, which describes a zeroth-order Markov process. We determine the expectation of the number of the pure dimer coverings (perfect matchings) over the ensemble of random multiple chains by the transfer matrix approach. Our result shows that, with only two exceptions, the average of the logarithm of this expectation (i.e., the annealed entropy per dimer) is asymptotically nonzero when the fusing process goes to infinity and the length of the hexagonal chain is fixed, though it is zero when the fusing process and the length of the hexagonal chain go to infinity simultaneously. Some numerical results are provided to support our conclusion, from which we can see that the asymptotic behavior fits the theoretical results well. We also apply the transfer matrix approach to the quenched entropy and reveal that the quenched entropy of random multiple chains has a close connection with the well-known Lyapunov exponent of random matrices. Using the theory of Lyapunov exponents we show that, for some random multiple chains, the quenched entropy per dimer is strictly smaller than the annealed one when the fusing process goes to infinity. Finally, we determine the expectation of the free energy per dimer over the ensemble of the random multiple chains in which the three types of dimers in different orientations are distinguished, and specify a series of non-random multiple chains whose free energy per dimer is asymptotically equal to this expectation. (paper)
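
    The transfer matrices of the multiple chains are not given in this record, so the sketch below illustrates only the generic annealed-versus-quenched distinction the abstract invokes, with placeholder 2x2 nonnegative matrices: for i.i.d. Bernoulli choices, the annealed growth rate comes from the expected matrix, while the quenched rate is the top Lyapunov exponent of the random product, estimated by propagating and renormalising a vector. Jensen's inequality guarantees annealed >= quenched.

      import numpy as np

      rng = np.random.default_rng(7)
      # Placeholder transfer matrices for the two fusing types (not those of the actual lattices).
      A = np.array([[1.0, 1.0], [1.0, 0.0]])
      B = np.array([[2.0, 1.0], [1.0, 1.0]])
      p, n = 0.5, 100_000                      # Bernoulli parameter and number of fusing steps

      # Annealed growth rate: for i.i.d. choices E[M_n ... M_1] = (p*A + (1-p)*B)**n,
      # so the per-step rate is the spectral radius of the mean matrix.
      mean_matrix = p * A + (1 - p) * B
      annealed = np.log(np.abs(np.linalg.eigvals(mean_matrix)).max())

      # Quenched growth rate: the top Lyapunov exponent E[log ||M_n ... M_1||] / n,
      # estimated by propagating a vector and renormalising to avoid overflow.
      v, log_norm = np.ones(2), 0.0
      for _ in range(n):
          M = A if rng.random() < p else B
          v = M @ v
          s = np.linalg.norm(v)
          log_norm += np.log(s)
          v /= s
      quenched = log_norm / n

      print("annealed rate:", annealed, ">= quenched (Lyapunov) rate:", quenched)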

  17. On pricing futures options on random binomial tree

    International Nuclear Information System (INIS)

    Bayram, Kamola; Ganikhodjaev, Nasir

    2013-01-01

    The discrete-time approach to real option valuation has typically been implemented in the finance literature using a binomial tree framework. Instead, we develop a new model by randomizing the environment and call such a model a random binomial tree. Whereas the usual model has only one environment (u, d), in which the price of the underlying asset can move up by a factor u or down by a factor d and the pair (u, d) is constant over the life of the underlying asset, in our new model the underlying security moves in two environments, namely (u1, d1) and (u2, d2). Thus we obtain two volatilities, σ1 and σ2. This new approach enables calculations that better reflect the real market, since it considers two market states: normal and extraordinary. In this paper we define and study futures options for such models.
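
    The paper's exact pricing construction is not reproduced in this record; the sketch below is one plausible reading of the idea, under stated assumptions: at each step the environment is drawn from {(u1, d1), (u2, d2)} with a fixed probability, the risk-neutral up-probability in each environment is q = (1 - d)/(u - d) (so that the futures price is a martingale), and a European call on the futures price is valued by Monte Carlo averaging of discounted payoffs. All parameter values are illustrative.

      import numpy as np

      rng = np.random.default_rng(8)
      F0, K, r, T, n = 100.0, 100.0, 0.03, 1.0, 50     # futures price, strike, rate, maturity, steps
      envs = ((1.020, 0.985), (1.030, 0.980))          # the two environments (u1, d1) and (u2, d2)
      p_env = 0.5                                      # probability of drawing environment 1 at a step

      def one_path():
          F = F0
          for _ in range(n):
              u, d = envs[0] if rng.random() < p_env else envs[1]
              q = (1.0 - d) / (u - d)     # risk-neutral up-probability: futures price is a martingale
              F *= u if rng.random() < q else d
          return max(F - K, 0.0)          # European call payoff on the futures price

      payoffs = [one_path() for _ in range(50_000)]
      price = np.exp(-r * T) * np.mean(payoffs)
      print("Monte Carlo estimate of the futures call price:", round(price, 3))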

  18. Neural processes underlying cultural differences in cognitive persistence.

    Science.gov (United States)

    Telzer, Eva H; Qu, Yang; Lin, Lynda C

    2017-08-01

    Self-improvement motivation, which occurs when individuals seek to improve upon their competence by gaining new knowledge and improving upon their skills, is critical for cognitive, social, and educational adjustment. While many studies have delineated the neural mechanisms supporting extrinsic motivation induced by monetary rewards, less work has examined the neural processes that support intrinsically motivated behaviors, such as self-improvement motivation. Because cultural groups traditionally vary in terms of their self-improvement motivation, we examined cultural differences in the behavioral and neural processes underlying motivated behaviors during cognitive persistence in the absence of extrinsic rewards. In Study 1, 71 American (47 females, M=19.68 years) and 68 Chinese (38 females, M=19.37 years) students completed a behavioral cognitive control task that required cognitive persistence across time. In Study 2, 14 American and 15 Chinese students completed the same cognitive persistence task during an fMRI scan. Across both studies, American students showed significant declines in cognitive performance across time, whereas Chinese participants demonstrated effective cognitive persistence. These behavioral effects were explained by cultural differences in self-improvement motivation and paralleled by increasing activation and functional coupling between the inferior frontal gyrus (IFG) and ventral striatum (VS) across the task among Chinese participants, neural activation and coupling that remained low in American participants. These findings suggest a potential neural mechanism by which the VS and IFG work in concert to promote cognitive persistence in the absence of extrinsic rewards. Thus, frontostriatal circuitry may be a neurobiological signal representing intrinsic motivation for self-improvement that serves an adaptive function, increasing Chinese students' motivation to engage in cognitive persistence. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Mechanical and tribological behaviour of molten salt processed self-lubricated aluminium composite under different treatments

    Science.gov (United States)

    Kannan, C.; Ramanujam, R.

    2018-05-01

    The aim of this research work is to evaluate the mechanical and tribological behaviour of an Al 7075 based self-lubricated hybrid nanocomposite under different treatment conditions, viz. as-cast, T6 and deep cryo treated. In order to overcome the drawbacks associated with conventional stir casting, a combinational approach that consists of molten salt processing, ultrasonic assistance and optimized mechanical stirring is adopted in this study to fabricate the nanocomposite. The mechanical characterisation tests carried out on this nanocomposite reveal an improvement of about 39% in hardness and 22% in ultimate tensile strength under the T6 condition. Under specific conditions, the wear rate can be reduced by about 63% through the use of the self-lubricated hybrid nanocomposite under the T6 condition.

  20. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
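
    As a concrete companion to the discussion, the sketch below implements the naive scheme the abstract analyses: each unit step is drawn uniformly over the directions that keep the next vertex inside the confining sphere of radius R (via rejection sampling). As the paper points out, this does not produce a uniform vertex distribution; the corrected algorithm and the walk-to-polygon conversion are not reproduced here. Function names and the radius are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(9)

      def confined_equilateral_walk(n_steps, R):
          # Unit-step walk whose vertices all stay inside a sphere of radius R >= 1, started at the centre.
          # Each step direction is uniform subject to the confinement (naive rejection sampling).
          vertices = [np.zeros(3)]
          for _ in range(n_steps):
              while True:
                  step = rng.standard_normal(3)
                  step /= np.linalg.norm(step)       # uniform direction on the unit sphere
                  nxt = vertices[-1] + step
                  if np.linalg.norm(nxt) <= R:       # keep only steps that respect the confinement
                      vertices.append(nxt)
                      break
          return np.array(vertices)

      walk = confined_equilateral_walk(50, R=3.0)
      print("maximum distance of a vertex from the centre:", np.linalg.norm(walk, axis=1).max())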