WorldWideScience

Sample records for random measurement bases

  1. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    Science.gov (United States)

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  2. Experimental nonlocality-based randomness generation with nonprojective measurements

    Science.gov (United States)

    Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.

    2018-04-01

    We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.

  3. An explicit semantic relatedness measure based on random walk

    Directory of Open Access Journals (Sweden)

    HU Sihui

    2016-10-01

    Full Text Available The semantic relatedness calculation of an open-domain knowledge network is a significant issue. In this paper, a pheromone strategy is drawn from the thought of the ant colony algorithm and integrated into the random walk, which is taken as the basic framework for calculating the semantic relatedness degree. The pheromone distribution is taken as a criterion for determining the tightness of semantic relatedness. A method of calculating the semantic relatedness degree based on random walk is proposed, and the exploration process of calculating the semantic relatedness degree is presented explicitly. The method mainly contains the Path Select Model (PSM) and the Semantic Relatedness Computing Model (SRCM). PSM is used to simulate the path selection of ants and the release of pheromone. SRCM is used to calculate the semantic relatedness by utilizing the information returned by ants. The results indicate that the method can complete the semantic relatedness calculation in linear complexity and extends the feasible strategies of semantic relatedness calculation.
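
The pheromone-guided random walk described above can be sketched in a few lines. This is a toy illustration, not the authors' PSM/SRCM implementation: the graph representation, the reinforcement rule, and all parameter values are assumptions.

```python
import random
from collections import defaultdict

def semantic_relatedness(graph, source, target, n_ants=200, max_steps=6,
                         deposit=1.0, seed=0):
    """Toy pheromone-guided random walk (hypothetical PSM/SRCM sketch).

    graph: dict node -> list of neighbour nodes.
    Returns a score in [0, 1]: the fraction of ants that reach `target`,
    with each ant's path choices biased by pheromone left by earlier ants.
    """
    rng = random.Random(seed)
    pheromone = defaultdict(lambda: 1.0)   # edge -> pheromone level
    hits = 0
    for _ in range(n_ants):
        node, path = source, []
        for _ in range(max_steps):
            neighbours = graph.get(node, [])
            if not neighbours:
                break
            # transition probability proportional to edge pheromone
            nxt = rng.choices(
                neighbours,
                weights=[pheromone[(node, n)] for n in neighbours])[0]
            path.append((node, nxt))
            node = nxt
            if node == target:
                hits += 1
                for edge in path:          # reinforce the successful path
                    pheromone[edge] += deposit
                break
    return hits / n_ants
```

Successful paths accumulate pheromone, so later ants find the target more often; the hit rate serves as the relatedness score.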

  4. Quantum authentication based on the randomness of measurement bases in BB84

    International Nuclear Information System (INIS)

    Dang Minh Dung; Bellot, P.; Alleaume, R.

    2005-01-01

    Full text: The establishment of a secret key between two legitimate end points of a communication link, let us name them Alice and Bob, using Quantum Key Distribution (QKD) is unconditionally secure thanks to the laws of quantum physics. However, the various QKD protocols do not intend to provide authentication of the end points: Alice cannot be sure that she is communicating with Bob, and reciprocally. Therefore, these protocols are subject to various attacks. The most obvious is the man-in-the-middle attack, in which an eavesdropper, let us name her Eve, stands in the middle of the communication link. Alice communicates with Eve while she thinks she is communicating with Bob, and Bob communicates with Eve while he thinks he is communicating with Alice. Eve, acting as a relay, can read all the communications between Alice and Bob and retransmit them. To prevent this kind of attack, the solution is to authenticate the two end points of the communication link. One solution is for Alice and Bob to share an authentication key prior to the communication. To preserve security, Alice and Bob must share a set of one-time authentication keys: each key has to be used only once, because every use leaks a little information to the eavesdropper Eve, and re-using the same key many times would finally reveal it. However, Eve can initiate the authentication process with Alice many times. Each time she does so, one of the pre-positioned keys is depleted, leading to the exhaustion of the set of pre-positioned keys. This type of attack is named a Denial of Service attack. In this work, we propose to use the randomness of the measurement bases in BB84 to build an authentication scheme based on the existence of a pre-positioned authentication key. This authentication scheme can be used with BB84 but also with any other Quantum Key Distribution protocol. It is protected against the Denial of Service attack.
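
The core idea, deriving the BB84 measurement bases from a pre-positioned shared key so that an impostor who lacks the key betrays itself through an elevated error rate, can be illustrated with a toy intercept-and-resend simulation. This is a hypothetical sketch, not the authors' protocol; the key schedule and the error-rate check are assumptions.

```python
import random

def bb84_round(n, key_bases, intercept=False, seed=1):
    """Toy sketch: Alice and Bob both derive their BB84 bases from a
    pre-shared authentication key (`key_bases`), so their bases always
    match.  An impostor without the key must guess bases, which raises
    the observed error rate to about 25%.  Returns the error rate."""
    rng = random.Random(seed)
    errors = 0
    for i in range(n):
        basis = key_bases[i % len(key_bases)]      # key-derived basis
        bit = rng.randint(0, 1)
        state = (bit, basis)                       # Alice prepares
        if intercept:                              # Eve: measure-and-resend
            eve_basis = rng.randint(0, 1)
            if eve_basis == state[1]:
                eve_bit = state[0]                 # right basis: exact copy
            else:
                eve_bit = rng.randint(0, 1)        # wrong basis: random
            state = (eve_bit, eve_basis)
        # Bob measures in the same key-derived basis as Alice
        measured = state[0] if state[1] == basis else rng.randint(0, 1)
        errors += measured != bit
    return errors / n
```

With the shared key the legitimate parties see zero sifting loss and zero error; an intercept-resend attacker who must guess the bases induces roughly a 25% error rate, which authenticates the channel.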

  5. Random walk-based similarity measure method for patterns in complex object

    Directory of Open Access Journals (Sweden)

    Liu Shihu

    2017-04-01

    Full Text Available This paper discusses the similarity of patterns in complex objects. A complex object is composed both of the attribute information of patterns and of the relational information between patterns. Bearing in mind this specificity of complex objects, a random walk-based similarity measurement method for patterns is constructed. In this method, the reachability of any two patterns with respect to the relational information is studied in full, so that the similarity of patterns with respect to the relational information can be calculated. On this basis, an integrated similarity measurement method is proposed, and Algorithms 1 and 2 present the calculation procedure. One can find that this method makes full use of both the attribute information and the relational information. Finally, a synthetic example validates the proposed similarity measurement method.
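
A minimal sketch of such an integrated measure, assuming cosine similarity for the attribute part and multi-step random-walk reachability for the relational part (the paper's Algorithms 1 and 2 are not reproduced here; the weighting scheme is an assumption):

```python
import numpy as np

def integrated_similarity(X, A, alpha=0.5, steps=3):
    """Sketch: combine attribute similarity with random-walk reachability.

    X: (n, d) attribute matrix of the patterns.
    A: (n, n) adjacency matrix of the relational information (no zero rows).
    Attribute part: cosine similarity.  Relational part: symmetrised
    multi-step transition probabilities of a random walk on A.
    """
    # cosine similarity of attribute vectors
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    S_attr = (X / norms) @ (X / norms).T
    # row-stochastic transition matrix and averaged multi-step reachability
    P = A / A.sum(axis=1, keepdims=True)
    reach = sum(np.linalg.matrix_power(P, k)
                for k in range(1, steps + 1)) / steps
    S_rel = (reach + reach.T) / 2            # symmetrise reachability
    return alpha * S_attr + (1 - alpha) * S_rel
```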

  6. Hardware random number generator based on monostable multivibrators dedicated to distributed measurement and control systems

    Science.gov (United States)

    Czernik, Pawel

    2013-10-01

    A hardware random number generator based on 74121 monostable multivibrators, intended for cryptographically secure distributed measurement and control systems with asymmetric resources, is presented. The device is built around a physical electronic oscillator whose circuit is composed of two "looped" 74121 monostable multivibrators, a D flip-flop, and an external clock signal source. The clock signal, which controls the D flip-flop, is generated by a computer on one of the parallel port pins. The author's acquisition procedure for transferring random data from the measuring system to a computer is also presented. The system was designed, built, and thoroughly tested in terms of cryptographic security in our laboratory, which constitutes the most important part of this publication. Real cryptographic security was tested with the author's software and the RDieHarder software environment. The obtained results are presented and analyzed in detail, with particular reference to the specificity of distributed measurement and control systems with asymmetric resources.

  7. An AUC-based permutation variable importance measure for random forests.

    Science.gov (United States)

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high-dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However, the classification performance of RF is known to be suboptimal in the case of strongly unbalanced data, i.e., data where response class sizes differ considerably. Suggestions have been made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However, to our knowledge, the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings, while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response as class imbalance increases. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
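
The AUC-based permutation VIM idea can be sketched independently of the RF machinery: permute one predictor column and record the resulting drop in AUC. Here `score_fn` is a stand-in for a fitted model's predictions (an assumption made for the sketch; the paper uses the forest's out-of-bag predictions).

```python
import numpy as np

def auc(scores, y):
    """AUC via the rank-sum (Mann-Whitney) statistic; y is a 0/1 array."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = y.sum()
    n_neg = len(y) - n_pos
    return (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def auc_permutation_importance(score_fn, X, y, n_rep=20, seed=0):
    """AUC-based permutation VIM sketch: importance of feature j is the
    mean drop in AUC after permuting column j of X."""
    rng = np.random.default_rng(seed)
    base = auc(score_fn(X), y)
    importances = []
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_rep):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])           # destroy feature j only
            drops.append(base - auc(score_fn(Xp), y))
        importances.append(np.mean(drops))
    return np.array(importances)
```

Because AUC is insensitive to the class prior, this VIM degrades more gracefully than an accuracy-based one when the classes are strongly unbalanced, which is the paper's central point.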

  8. Covariance-Based Estimation from Multisensor Delayed Measurements with Random Parameter Matrices and Correlated Noises

    Directory of Open Access Journals (Sweden)

    R. Caballero-Águila

    2014-01-01

    Full Text Available The optimal least-squares linear estimation problem is addressed for a class of discrete-time multisensor linear stochastic systems subject to randomly delayed measurements with different delay rates. For each sensor, a different binary sequence is used to model the delay process. The measured outputs are perturbed by both random parameter matrices and one-step autocorrelated and cross-correlated noises. Using an innovation approach, computationally simple recursive algorithms are obtained for the prediction, filtering, and smoothing problems, without requiring full knowledge of the state-space model generating the signal process, but only the information provided by the delay probabilities and the mean and covariance functions of the processes (signal, random parameter matrices, and noises) involved in the observation model. The accuracy of the estimators is measured by their error covariance matrices, which allow us to analyze the estimator performance in a numerical simulation example that illustrates the feasibility of the proposed algorithms.
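
The one-step random-delay observation model underlying such estimators can be simulated directly. This is a single-sensor sketch with i.i.d. Bernoulli delays, not the paper's multisensor algorithm:

```python
import numpy as np

def delayed_measurements(y, delay_prob, seed=0):
    """One-step random-delay model used in this literature:
        z(t) = (1 - gamma_t) * y(t) + gamma_t * y(t-1),
    where gamma_t ~ Bernoulli(delay_prob) indicates that sensor output at
    time t is the previous measurement.  Returns (z, gamma)."""
    rng = np.random.default_rng(seed)
    gamma = rng.random(len(y)) < delay_prob
    z = y.copy()
    z[1:][gamma[1:]] = y[:-1][gamma[1:]]    # delayed samples (t >= 1)
    return z, gamma
```

The recursive estimators in the paper use only the delay probability and the second-order statistics of this model, not the realisations of gamma.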

  9. The behaviour of random forest permutation-based variable importance measures under predictor correlation.

    Science.gov (United States)

    Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas

    2010-02-27

    Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and were unbiased under H0. Scaled VIMs were clearly biased under both HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increase in VIMs for correlated predictors may be considered a "bias" (because they do not directly reflect the coefficients in the generating model) or a beneficial attribute of these VIMs depends on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.

  10. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

    1997-01-01
    to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3 span bridge. By applying both methods...... that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation...
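
The controlled averaging that defines the Random Decrement technique can be sketched as level-crossing-triggered segment averaging; FRFs then follow from ratios of Fourier transforms of the load and response signatures. The trigger condition and segment length below are assumptions for illustration:

```python
import numpy as np

def random_decrement(x, trigger, seg_len):
    """Random Decrement signature: the average of all segments of `x`
    that start where x crosses up through `trigger` (a simple
    level-crossing triggering condition)."""
    # indices of upward crossings of the trigger level
    starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    starts = starts[starts + seg_len <= len(x)]
    if len(starts) == 0:
        raise ValueError("no triggering points found")
    segments = np.stack([x[s:s + seg_len] for s in starts])
    return segments.mean(axis=0)
```

With load and response signatures D_x and D_y computed this way, an FRF estimate in the spirit of the paper is H(f) ≈ FFT(D_y)/FFT(D_x); the averaging in the time domain is what reduces leakage relative to plain segment-based FFT methods.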

  12. Development of measurement system for radiation effect on static random access memory based field programmable gate array

    International Nuclear Information System (INIS)

    Yao Zhibin; He Baoping; Zhang Fengqi; Guo Hongxia; Luo Yinhong; Wang Yuanming; Zhang Keying

    2009-01-01

    Based on a detailed investigation of the theory of radiation effects in field programmable gate arrays (FPGA), a measurement system for radiation effects on static random access memory (SRAM)-based FPGAs was developed. The testing principles for internal memory, function, and supply current are introduced, and the hardware and software implementation of the system is presented. Important parameters of radiation effects on SRAM-based FPGAs, such as the configuration RAM upset cross section, block RAM upset cross section, function fault cross section, and single event latchup cross section, can be obtained with this system. The transmission distance of the system can exceed 50 m and the maximum number of tested gates can reach one million. (authors)

  13. A new reliability measure based on specified minimum distances before the locations of random variables in a finite interval

    International Nuclear Information System (INIS)

    Todinov, M.T.

    2004-01-01

    A new reliability measure is proposed, and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the proposed models, a new strategy, models, and reliability tools have been developed for setting quantitative reliability requirements, which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) that deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level, and a minimum required availability. It is demonstrated that setting reliability requirements solely on the basis of an availability target does not necessarily mean a low risk of premature failure: even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level.
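
The probability of a specified minimum gap between Poisson-process points in a finite interval, for which the paper derives closed-form equations, can also be estimated by Monte Carlo simulation, for example:

```python
import numpy as np

def prob_min_gap(rate, length, min_gap, n_sim=20000, seed=0):
    """Monte Carlo estimate of the probability that all gaps between
    successive points of a homogeneous Poisson process with intensity
    `rate` on [0, length] are at least `min_gap` (the no-clustering
    event).  Realisations with 0 or 1 points trivially satisfy it."""
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(n_sim):
        n = rng.poisson(rate * length)
        if n <= 1:
            ok += 1
            continue
        # given n points, their locations are i.i.d. uniform on [0, length]
        pts = np.sort(rng.random(n) * length)
        if np.diff(pts).min() >= min_gap:
            ok += 1
    return ok / n_sim
```

Such a simulation gives a quick check of the derived equations and shows how quickly the no-clustering probability collapses as the number density grows.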

  14. Classical randomness in quantum measurements

    International Nuclear Information System (INIS)

    D'Ariano, Giacomo Mauro; Presti, Paoloplacido Lo; Perinotti, Paolo

    2005-01-01

    Similarly to quantum states, quantum measurements can also be 'mixed', corresponding to a random choice within an ensemble of measuring apparatuses. Such mixing is equivalent to a sort of hidden variable, which produces noise of purely classical nature. It is then natural to ask which apparatuses are indecomposable, i.e. do not correspond to any random choice of apparatuses. This problem is interesting not only for foundations, but also for applications, since most optimization strategies give optimal apparatuses that are indecomposable. Mathematically the problem is posed by describing each measuring apparatus by a positive operator-valued measure (POVM), which gives the statistics of the outcomes for any input state. The POVMs form a convex set, and in this language the indecomposable apparatuses are represented by extremal points, the analogs of 'pure states' in the convex set of states. Unlike the case of states, however, indecomposable POVMs are not necessarily rank-one (e.g., von Neumann measurements). In this paper we give a complete classification of indecomposable apparatuses (for discrete spectrum) by providing different necessary and sufficient conditions for extremality of POVMs, along with a simple general algorithm for the decomposition of a POVM into extremals. As an interesting application, 'informationally complete' measurements are analysed in this respect. The convex set of POVMs is fully characterized by determining its border in terms of simple algebraic properties of the corresponding POVMs.
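
A POVM in this sense is any set of positive operators summing to the identity. A small numerical check of that definition, together with the standard qubit "trine" example of a non-projective extremal POVM:

```python
import numpy as np

def is_povm(elements, tol=1e-10):
    """Check the POVM conditions: every element positive semidefinite
    and all elements summing to the identity."""
    d = elements[0].shape[0]
    psd = all(np.linalg.eigvalsh(E).min() >= -tol for E in elements)
    complete = np.allclose(sum(elements), np.eye(d), atol=tol)
    return psd and complete

# The 'trine' POVM: three rank-one elements (2/3)|v_k><v_k| with Bloch
# vectors 120 degrees apart -- a standard extremal, non-projective POVM.
trine = []
for k in range(3):
    theta = 2 * np.pi * k / 3
    v = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    trine.append(2 / 3 * np.outer(v, v))
```

The trine has three outcomes on a two-dimensional system, so it cannot be a von Neumann measurement, yet it is extremal: it is not a classical mixture of other apparatuses.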

  15. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  16. Error evaluation method for material accountancy measurement. Evaluation of random and systematic errors based on material accountancy data

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2008-01-01

    The International Target Values (ITV) document gives random and systematic measurement uncertainty components as a reference for routinely achievable measurement quality in accountancy measurement. The measurement uncertainty, called error henceforth, needs to be evaluated periodically and checked against the ITV for consistency, as the error varies with measurement methods, instruments, operators, certified reference samples, frequency of calibration, and so on. In this paper an error evaluation method is developed with focus on (1) specifying the error calculation model clearly, (2) always obtaining positive random and systematic error variances, (3) obtaining the probability density distribution of an error variance, and (4) confirming the evaluation method by simulation. In addition, the method is demonstrated by applying it to real data. (author)

  17. What Randomized Benchmarking Actually Measures

    International Nuclear Information System (INIS)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-01-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set: it depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. These theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
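
The RB decay curve itself is a simple exponential, P(m) = A p^m + B, and r is conventionally defined from p as r = (d-1)(1-p)/d. A sketch of extracting p and r from survival data, under the simplifying assumption that the asymptote B = 1/d is known:

```python
import numpy as np

def rb_error_rate(lengths, survival, dim=2):
    """Fit the RB decay P(m) = A p^m + B and return (p, r).

    Assumes the depolarising asymptote B = 1/dim, so log(P - B) is
    linear in the sequence length m; r = (dim-1)(1-p)/dim is the RB
    error metric discussed in the abstract."""
    y = np.log(np.asarray(survival) - 1 / dim)
    slope, intercept = np.polyfit(lengths, y, 1)   # linear fit in log space
    p = np.exp(slope)
    return p, (dim - 1) * (1 - p) / dim
```

In practice A and B are fitted as free parameters; the point of the paper is precisely what r does and does not mean once the fit is done.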

  18. A Maximin Approach for the Bi-criteria 0-1 Random Fuzzy Programming Problem Based on the Necessity Measure

    International Nuclear Information System (INIS)

    Hasuike, Takashi; Ishii, Hiroaki; Katagiri, Hideki

    2009-01-01

    This paper considers a bi-criteria general 0-1 random fuzzy programming problem based on the degree of necessity, which includes several previous 0-1 stochastic and fuzzy programming problems. The proposed problem is not well defined because it contains both randomness and fuzziness. Therefore, by introducing chance constraints and fuzzy goals for the objectives, and considering the maximization of the aspiration level for total profit together with the degree of necessity that the objective function's value satisfies the fuzzy goal, the main problem is transformed into a deterministic equivalent problem. Furthermore, under the assumption that each random variable is normally distributed, the problem is equivalently transformed into a basic 0-1 programming problem, and an efficient strict solution method for finding an optimal solution is constructed.
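
The normal-distribution step can be made concrete: a chance constraint P(ξ ≤ b) ≥ α on a Gaussian ξ reduces to the deterministic inequality b ≥ μ + z_α·σ, where z_α is the standard normal α-quantile. A standard-library sketch of that deterministic equivalent:

```python
from statistics import NormalDist

def deterministic_equivalent(mu, sigma, alpha):
    """For xi ~ N(mu, sigma^2), the chance constraint
        P(xi <= b) >= alpha
    is equivalent to the deterministic constraint
        b >= mu + z_alpha * sigma,
    where z_alpha is the standard normal alpha-quantile.
    Returns that right-hand-side threshold mu + z_alpha * sigma."""
    z = NormalDist().inv_cdf(alpha)
    return mu + z * sigma
```

Applying this to every chance constraint is what turns the random fuzzy problem into an ordinary 0-1 program.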

  19. Weak convergence to isotropic complex SαS random measure.

    Science.gov (United States)

    Wang, Jun; Li, Yunmeng; Sang, Liheng

    2017-01-01

    In this paper, we prove that an isotropic complex symmetric α-stable (SαS) random measure can be approximated by a complex process constructed from integrals based on a Poisson process with random intensity.

  20. Multipartite nonlocality and random measurements

    Science.gov (United States)

    de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław

    2017-07-01

    We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings and even for relatively small values local realism is violated for almost all observables. We have observed this effect to be typical in the sense that it emerged for all investigated states including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.

  1. [Intel random number generator-based true random number generator].

    Science.gov (United States)

    Huang, Feng; Shen, Hong

    2004-09-01

    To establish a true random number generator on the basis of certain Intel chips. The random numbers were acquired by programming in Microsoft Visual C++ 6.0, via register reads from the random number generator (RNG) unit of an Intel 815 chipset-based computer with the Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 tests and a chi-square test, and the results showed that the random numbers it generated satisfied the requirements of independence and uniform distribution. We also compared, statistically, the random numbers generated by the Intel RNG-based true random number generator with those from a random number table, using the same amount of 7500 random numbers over the same value domain; the SD, SE, and CV of the Intel RNG-based generator were less than those of the random number table. A u-test on the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and avoids some problems of random number tables in the acquisition of random numbers.

  2. Frequency characteristic measurement of a fiber optic gyroscope using a correlation spectrum analysis method based on a pseudo-random sequence

    International Nuclear Information System (INIS)

    Li, Yang; Chen, Xingfan; Liu, Cheng

    2015-01-01

    The frequency characteristic is an important indicator of a system’s dynamic performance. The identification of a fiber optic gyroscope (FOG)’s frequency characteristic using a correlation spectrum analysis method based on a pseudo-random sequence is proposed. Taking the angle vibrator as the source of the test rotation stimulation and a pseudo-random sequence as the test signal, the frequency characteristic of a FOG is calculated according to the power spectral density of the rotation rate signal and the cross-power spectral density of the FOG’s output signal and rotation rate signal. A theoretical simulation is done to confirm the validity of this method. An experiment system is built and the test results indicate that the measurement error of the normalized amplitude–frequency response is less than 0.01, that the error of the phase–frequency response is less than 0.3 rad, and the overall measurement accuracy is superior to the traditional frequency-sweep method. By using this method, the FOG’s amplitude–frequency response and phase–frequency response can be measured simultaneously, quickly, accurately, and with a high frequency resolution. The described method meets the requirements of engineering applications. (paper)
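
The correlation-spectrum idea, estimating the frequency response as the cross-power spectral density of stimulus and output divided by the stimulus auto-spectrum, can be sketched with a bare-bones H1 estimator (the segment length and non-overlapping averaging scheme are assumptions of this sketch):

```python
import numpy as np

def frf_h1(x, y, seg_len=256):
    """H1 frequency-response estimate from stimulus x to response y:
        H(f) = S_xy(f) / S_xx(f),
    averaging cross- and auto-spectra over non-overlapping segments."""
    n_seg = len(x) // seg_len
    Sxx = np.zeros(seg_len // 2 + 1)
    Sxy = np.zeros(seg_len // 2 + 1, dtype=complex)
    for i in range(n_seg):
        X = np.fft.rfft(x[i * seg_len:(i + 1) * seg_len])
        Y = np.fft.rfft(y[i * seg_len:(i + 1) * seg_len])
        Sxx += (X.conj() * X).real
        Sxy += X.conj() * Y
    return Sxy / Sxx
```

Driving the system with a broadband pseudo-random sequence, as in the paper, excites all frequencies at once, so amplitude and phase responses are obtained simultaneously from a single record instead of by frequency sweeping.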

  3. Design and usability evaluation of user-centered and visual-based aids for dietary food measurement on mobile devices in a randomized controlled trial.

    Science.gov (United States)

    Liu, Ying-Chieh; Chen, Chien-Hung; Lee, Chien-Wei; Lin, Yu-Sheng; Chen, Hsin-Yun; Yeh, Jou-Yin; Chiu, Sherry Yueh-Hsia

    2016-12-01

    We designed and developed two interactive app interfaces for dietary food measurements on mobile devices. The user-centered designs of both the IPI (interactive photo interface) and the SBI (sketching-based interface) were evaluated. Four types of outcomes were assessed to evaluate the usability of mobile devices for dietary measurements, including accuracy, absolute weight differences, and the response time to determine the efficacy of food measurements. The IPI presented users with images of pre-determined portion sizes of a specific food and allowed users to scan and then select the most representative image matching the food that they were measuring. The SBI required users to relate the food shape to a readily available comparator (e.g., a credit card) and scribble to shade in the appropriate area. A randomized controlled trial was conducted to evaluate their usability. A total of 108 participants were randomly assigned into the following three groups: the IPI (n=36) and SBI (n=38) experimental groups and the traditional life-size photo (TLP) group as the control. A total of 18 types of food items, each at 3-4 different weights, were randomly selected for assessment. The independent chi-square test and t-test were performed for the dichotomous and continuous variable analyses, respectively. The total accuracy rates were 66.98%, 44.15%, and 72.06% for the IPI, SBI, and TLP, respectively. No significant difference was observed between the IPI and TLP, regardless of the accuracy proportion or weight differences. The SBI accuracy rates were significantly lower than the IPI and TLP accuracy rates, especially for several spooned, square cube, and sliced pie food items. The time needed to complete the operation assessment by the user was significantly lower for the IPI than for the SBI. Our study corroborates that the user-centered visual-based design of the IPI on a mobile device is comparable to the TLP in terms of the usability for dietary food measurements.

  4. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies.

    Science.gov (United States)

    Wayne, Peter M; Gow, Brian J; Costa, Madalena D; Peng, C-K; Lipsitz, Lewis A; Hausdorff, Jeffrey M; Davis, Roger B; Walsh, Jacquelyn N; Lough, Matthew; Novak, Vera; Yeh, Gloria Y; Ahn, Andrew C; Macklin, Eric A; Manor, Brad

    2014-01-01

    Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Multiscale entropy offers a complementary
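
Multiscale entropy, the complexity measure used here, coarse-grains the series at increasing scales and computes sample entropy at each scale, with the tolerance fixed from the original series as in Costa et al.'s definition. A compact sketch (m = 2 and r = 0.15·SD are conventional parameter choices):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r): -log of the conditional probability
    that sequences matching for m points (Chebyshev distance <= r) also
    match for m + 1 points.  Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()

    def pair_count(length):
        t = np.lib.stride_tricks.sliding_window_view(x, length)
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return ((d <= r).sum() - len(t)) / 2   # matched pairs, no self

    B, A = pair_count(m), pair_count(m + 1)
    return -np.log(A / B)

def multiscale_entropy(x, scales=(1, 2, 3, 4), m=2):
    """Coarse-grain x at each scale (non-overlapping means) and compute
    SampEn with the tolerance r fixed from the original series."""
    x = np.asarray(x, dtype=float)
    r = 0.15 * x.std()
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m=m, r=r))
    return out
```

For uncorrelated noise the entropy falls off quickly with scale, whereas physiologic signals with long-range structure maintain entropy across scales; that persistence is the "complexity" the study attributes to TC experts' sway.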

  5. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies.

    Directory of Open Access Journals (Sweden)

    Peter M Wayne

    Full Text Available Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Multiscale entropy offers a complementary

  6. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies

    Science.gov (United States)

    Wayne, Peter M.; Gow, Brian J.; Costa, Madalena D.; Peng, C.-K.; Lipsitz, Lewis A.; Hausdorff, Jeffrey M.; Davis, Roger B.; Walsh, Jacquelyn N.; Lough, Matthew; Novak, Vera; Yeh, Gloria Y.; Ahn, Andrew C.; Macklin, Eric A.; Manor, Brad

    2014-01-01

    Background Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale “complexity” of postural sway fluctuations. Objectives To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. Methods A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. Results At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Conclusions Multiscale entropy offers a complementary approach to characterizing postural control in healthy older adults. Trial Registration ClinicalTrials.gov NCT01340365 PMID:25494333
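    Entries 4-6 above rest on multiscale entropy (MSE): the sway signal is coarse-grained at increasing time scales and sample entropy is computed at each scale. The pipeline can be sketched in a few lines of pure Python (an illustration of the standard algorithm, not the authors' code; the template length m = 2 and tolerance r = 0.15 SD are conventional parameter choices assumed here):

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts pairs of length-m templates and A pairs of
    length-(m+1) templates that match within Chebyshev tolerance r."""
    n = len(x)
    def matches(length):
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) < r:
                    c += 1
        return c
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, scale):
    """Average non-overlapping windows of the given time scale."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r_frac=0.15):
    """Sample entropy of the coarse-grained series at each scale, with the
    tolerance fixed as a fraction of the original series' SD."""
    mu = sum(x) / len(x)
    sd = (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    return [sample_entropy(coarse_grain(x, s), m, r_frac * sd) for s in scales]

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(300)]
mse = multiscale_entropy(noise)
```

For white noise the entropy values typically fall with scale, whereas structured physiologic signals hold or gain entropy across scales; that scale profile is what the "complexity" of sway summarizes.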

  7. Classical and nonclassical randomness in quantum measurements

    International Nuclear Information System (INIS)

    Farenick, Douglas; Plosker, Sarah; Smith, Jerrod

    2011-01-01

    The space POVM_H(X) of positive operator-valued probability measures on the Borel sets of a compact (or even locally compact) Hausdorff space X with values in B(H), the algebra of linear operators acting on a d-dimensional Hilbert space H, is studied from the perspectives of classical and nonclassical convexity through a transform Γ that associates any positive operator-valued measure ν with a certain completely positive linear map Γ(ν) of the homogeneous C*-algebra C(X) ⊗ B(H) into B(H). This association is achieved by using an operator-valued integral in which nonclassical random variables (that is, operator-valued functions) are integrated with respect to positive operator-valued measures and which has the feature that the integral of a random quantum effect is itself a quantum effect. A left inverse Ω for Γ yields an integral representation, along the lines of the classical Riesz representation theorem for linear functionals on C(X), of certain (but not all) unital completely positive linear maps φ:C(X) ⊗ B(H)→B(H). The extremal and C*-extremal points of POVM_H(X) are determined.

  8. Pseudo-random number generator based on asymptotic deterministic randomness

    Science.gov (United States)

    Wang, Kai; Pei, Wenjiang; Xia, Haishan; Cheung, Yiu-ming

    2008-06-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks.
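    The abstract above pairs a piecewise linear map with a noninvertible transform; the general idea of harvesting bits from a piecewise linear chaotic map can be sketched as follows (a toy tent-map generator for illustration only, not the authors' scheme, and not cryptographically secure):

```python
def tent_map(x, mu=1.99999):
    """Piecewise linear ('tent') map on [0, 1]. A slope slightly below 2
    avoids the floating-point collapse to 0 of the exact mu = 2 map."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def chaotic_bits(seed, n):
    """Emit one bit per iterate by thresholding the state at 0.5."""
    x, out = seed, []
    for _ in range(n):
        x = tent_map(x)
        out.append(1 if x > 0.5 else 0)
    return out

bits = chaotic_bits(0.1234567, 10000)
balance = sum(bits) / len(bits)  # close to 0.5 for a well-mixed orbit
```

Real chaos-based PRBGs add exactly the ingredients the abstract describes (noninvertible transforms, careful discretization) to defeat the entropy and symbolic-dynamics attacks that break this naive thresholding scheme.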

  9. Pseudo-random number generator based on asymptotic deterministic randomness

    International Nuclear Information System (INIS)

    Wang Kai; Pei Wenjiang; Xia Haishan; Cheung Yiuming

    2008-01-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks

  10. Quantitative time domain analysis of lifetime-based Förster resonant energy transfer measurements with fluorescent proteins: Static random isotropic fluorophore orientation distributions

    DEFF Research Database (Denmark)

    Alexandrov, Yuriy; Nikolic, Dino Solar; Dunsby, Christopher

    2018-01-01

    Förster resonant energy transfer (FRET) measurements are widely used to obtain information about molecular interactions and conformations through the dependence of FRET efficiency on the proximity of donor and acceptor fluorophores. Fluorescence lifetime measurements can provide quantitative...... into new software for fitting donor emission decay profiles. Calculated FRET parameters, including molar population fractions, are compared for the analysis of simulated and experimental FRET data under the assumption of static and dynamic fluorophores and the intermediate regimes between fully dynamic...... analysis of FRET efficiency and interacting population fraction. Many FRET experiments exploit the highly specific labelling of genetically expressed fluorescent proteins, applicable in live cells and organisms. Unfortunately, the typical assumption of fast randomization of fluorophore orientations...

  11. Concise biomarker for spatial-temporal change in three-dimensional ultrasound measurement of carotid vessel wall and plaque thickness based on a graph-based random walk framework: Towards sensitive evaluation of response to therapy.

    Science.gov (United States)

    Chiu, Bernard; Chen, Weifu; Cheng, Jieyu

    2016-12-01

    Rapid progression in total plaque area and volume measured from ultrasound images has been shown to be associated with an elevated risk of cardiovascular events. Since atherosclerosis is focal and predominantly occurring at the bifurcation, biomarkers that are able to quantify the spatial distribution of vessel-wall-plus-plaque thickness (VWT) change may allow for more sensitive detection of treatment effect. The goal of this paper is to develop simple and sensitive biomarkers to quantify the responsiveness to therapies based on the spatial distribution of VWT-Change on the entire 2D carotid standardized map previously described. Point-wise VWT-Changes computed for each patient were reordered lexicographically to a high-dimensional data node in a graph. A graph-based random walk framework was applied with the novel Weighted Cosine (WCos) similarity function introduced, which was tailored for quantification of responsiveness to therapy. The converging probability of each data node to the VWT regression template in the random walk process served as a scalar descriptor for VWT responsiveness to treatment. The WCos-based biomarker was 14 times more sensitive than the mean VWT-Change in discriminating responsive and unresponsive subjects based on the p-values obtained in T-tests. The proposed framework was extended to quantify where VWT-Change occurred by including multiple VWT-Change distribution templates representing focal changes at different regions. Experimental results show that the framework was effective in classifying carotid arteries with focal VWT-Change at different locations and may facilitate future investigations to correlate risk of cardiovascular events with the location where focal VWT-Change occurs. Copyright © 2016 Elsevier Ltd. All rights reserved.
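    The scalar descriptor above is a converging (absorption) probability of a random walk on a graph of data nodes. The idea can be illustrated with a toy Monte Carlo estimate (the three-node chain, node labels, and uniform transition probabilities are hypothetical; the paper's VWT graph, WCos similarity weights, and template nodes are not reproduced):

```python
import random

def absorption_probability(P, start, target, absorbing, n_walks=5000, seed=9):
    """Monte Carlo estimate of the probability that a walk started at
    `start` converges to (is absorbed at) `target`. P maps each transient
    node to a list of (neighbor, transition probability) pairs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_walks):
        node = start
        while node not in absorbing:
            r, acc = rng.random(), 0.0
            for nxt, p in P[node]:
                acc += p
                if r < acc:
                    node = nxt
                    break
        hits += node == target
    return hits / n_walks

# Toy chain 0 - 1 - 2: the endpoints are absorbing "templates" and the data
# node 1 steps either way with probability 0.5, so it converges to node 0
# about half the time.
P = {1: [(0, 0.5), (2, 0.5)]}
prob = absorption_probability(P, start=1, target=0, absorbing={0, 2})
```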

  12. Random amplified polymorphic DNA based genetic characterization ...

    African Journals Online (AJOL)

    Random amplified polymorphic DNA based genetic characterization of four important species of Bamboo, found in Raigad district, Maharashtra State, India. ... Bambusoideae are differentiated from other members of the family by the presence of petiolate blades with parallel venation and stamens are three, four, six or more, ...

  13. SDE based regression for random PDEs

    KAUST Repository

    Bayer, Christian

    2016-01-01

    A simulation based method for the numerical solution of PDE with random coefficients is presented. By the Feynman-Kac formula, the solution can be represented as conditional expectation of a functional of a corresponding stochastic differential equation driven by independent noise. A time discretization of the SDE for a set of points in the domain and a subsequent Monte Carlo regression lead to an approximation of the global solution of the random PDE. We provide an initial error and complexity analysis of the proposed method along with numerical examples illustrating its behaviour.
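    The Feynman-Kac representation described above can be sketched for the simplest case, the heat equation u_t + 0.5*u_xx = 0 with terminal data u(T, x) = g(x), whose solution is u(t, x) = E[g(x + W_{T-t})]. The following is a minimal Monte Carlo illustration with Brownian paths only; general drift and diffusion coefficients enter the Euler-Maruyama step the same way, and the paper's regression step over multiple domain points is omitted:

```python
import math
import random

def feynman_kac_estimate(g, x0, T, n_paths=10000, n_steps=25, seed=1):
    """Estimate u(0, x0) = E[g(x0 + W_T)] by simulating Brownian paths with
    Euler-Maruyama time stepping and averaging g at the terminal time."""
    rng = random.Random(seed)
    dt = T / n_steps
    total = 0.0
    for _ in range(n_paths):
        x = x0
        for _ in range(n_steps):
            # dX = dW here; a general SDE would add b(x)*dt + sigma(x)*dW.
            x += math.sqrt(dt) * rng.gauss(0.0, 1.0)
        total += g(x)
    return total / n_paths

# For g(x) = x^2 the heat equation gives the exact value u(0, x0) = x0^2 + T.
u_hat = feynman_kac_estimate(lambda x: x * x, x0=0.5, T=1.0)
```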

  14. Brain Tumor Segmentation Based on Random Forest

    Directory of Open Access Journals (Sweden)

    László Lefkovits

    2016-09-01

    Full Text Available In this article we present a discriminative model for tumor detection from multimodal MR images. The main part of the model is built around the random forest (RF classifier. We created an optimization algorithm able to select the important features for reducing the dimensionality of data. This method is also used to find out the training parameters used in the learning phase. The algorithm is based on random feature properties for evaluating the importance of the variable, the evolution of learning errors and the proximities between instances. The detection performances obtained have been compared with the most recent systems, offering similar results.

  15. SDE based regression for random PDEs

    KAUST Repository

    Bayer, Christian

    2016-01-06

    A simulation based method for the numerical solution of PDE with random coefficients is presented. By the Feynman-Kac formula, the solution can be represented as conditional expectation of a functional of a corresponding stochastic differential equation driven by independent noise. A time discretization of the SDE for a set of points in the domain and a subsequent Monte Carlo regression lead to an approximation of the global solution of the random PDE. We provide an initial error and complexity analysis of the proposed method along with numerical examples illustrating its behaviour.

  16. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature by observing an apparent angular shift in an interference fringe pattern produced by back or forward scattering interferometry, ambiguities in the measurement caused...... by the apparent shift being consistent with one of a number of numerical possibilities for the real shift which differ by 2n are resolved by combining measurements performed on the same sample using light paths therethrough of differing lengths....

  17. A comparison between the original and Tablet-based Symbol Digit Modalities Test in patients with schizophrenia: Test-retest agreement, random measurement error, practice effect, and ecological validity.

    Science.gov (United States)

    Tang, Shih-Fen; Chen, I-Hui; Chiang, Hsin-Yu; Wu, Chien-Te; Hsueh, I-Ping; Yu, Wan-Hui; Hsieh, Ching-Lin

    2017-11-27

    We aimed to compare the test-retest agreement, random measurement error, practice effect, and ecological validity of the original and Tablet-based Symbol Digit Modalities Test (T-SDMT) over five serial assessments, and to examine the concurrent validity of the T-SDMT in patients with schizophrenia. Sixty patients with chronic schizophrenia completed five serial assessments (one week apart) of the SDMT and T-SDMT and one assessment of the Activities of Daily Living Rating Scale III at the first time point. Both measures showed high test-retest agreement and similar levels of random measurement error over five serial assessments. Moreover, the practice effects of the two measures did not reach a plateau phase after five serial assessments in young and middle-aged participants. Nevertheless, only the practice effect of the T-SDMT became trivial after the first assessment. Like the SDMT, the T-SDMT had good ecological validity. The T-SDMT also had good concurrent validity with the SDMT. In addition, only the T-SDMT had discriminative validity to distinguish processing speed in young and middle-aged participants. Compared to the SDMT, the T-SDMT had overall slightly better psychometric properties, so it can be an alternative measure to the SDMT for assessing processing speed in patients with schizophrenia. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  19. Antibiotic treatment interruption of suspected lower respiratory tract infections based on a single procalcitonin measurement at hospital admission--a randomized trial

    DEFF Research Database (Denmark)

    Kristoffersen, K B; Søgaard, O S; Wejse, C

    2009-01-01

    Recent studies have suggested that procalcitonin (PCT) is a safe marker for the discrimination between bacterial and viral infection, and that PCT-guided treatment may lead to substantial reductions in antibiotic use. The present objective was to evaluate the effect of a single PCT measurement...... to either PCT-guided treatment or standard treatment. Antibiotic treatment duration in the PCT group was based on the serum PCT value at admission. The cut-off point for recommending antibiotic treatment was PCT > or =0.25 microg/L. Physicians could overrule treatment guidelines. The mean duration...... of hospital stay was 5.9 days in the PCT group vs. 6.7 days in the control group (p = 0.22). The mean duration of antibiotic treatment during hospitalization in the PCT group was 5.1 days on average, as compared to 6.8 days in the control group (p = 0.007). In a subgroup analysis of chronic obstructive pulmonary...

  20. Weak convergence to isotropic complex SαS random measure

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2017-09-01

    Full Text Available Abstract In this paper, we prove that an isotropic complex symmetric α-stable random measure (0 < α < 2) can be approximated by a complex process constructed by integrals based on the Poisson process with random intensity.

  1. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    A refractive index based measurement of a property of a fluid is measured in an apparatus comprising a variable wavelength coherent light source (16), a sample chamber (12), a wavelength controller (24), a light sensor (20), a data recorder (26) and a computation apparatus (28), by - directing...... coherent light having a wavelength along an input light path, - producing scattering of said light from each of a plurality of interfaces within said apparatus including interfaces between said fluid and a surface bounding said fluid, said scattering producing an interference pattern formed by said...... scattered light, - cyclically varying the wavelength of said light in said input light path over a 1 nm to 20nm wide range of wavelengths a rate of from 10Hz to 50 KHz, - recording variation of intensity of the interfering light with change in wavelength of the light at an angle of observation...

  2. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  3. Measuring and Supporting Language Function for Children with Autism: Evidence from a Randomized Control Trial of a Social-Interaction-Based Therapy

    Science.gov (United States)

    Casenhiser, Devin M.; Binns, Amanda; McGill, Fay; Morderer, Olga; Shanker, Stuart G.

    2015-01-01

    In a report of the effectiveness of MEHRIT, a social-interaction-based intervention for autism, Casenhiser et al. ("Autism" 17(2):220-241, 2013) failed to find a significant advantage for language development in the treatment group using standardized language assessments. We present the results from a re-analysis of their results to…

  4. Randomness Representation of Turbulence in Canopy Flows Using Kolmogorov Complexity Measures

    Directory of Open Access Journals (Sweden)

    Dragutin Mihailović

    2017-09-01

    Full Text Available Turbulence is often expressed in terms of either irregular or random fluid flows, without quantification. In this paper, a methodology to evaluate the randomness of turbulence using measures based on the Kolmogorov complexity (KC) is proposed. This methodology is applied to experimental data from a turbulent flow developing in a laboratory channel with a canopy of three different densities. The methodology is also compared with the traditional approach based on classical turbulence statistics.
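    Kolmogorov complexity itself is uncomputable; KC-based turbulence measures typically fall back on the Lempel-Ziv (1976) phrase-counting complexity of a binarized signal as a computable proxy. A minimal sketch (illustrative, not the paper's exact estimator):

```python
import random

def lz_complexity(s):
    """Lempel-Ziv (1976) production complexity: the number of phrases in a
    left-to-right parsing where each new phrase extends for as long as it
    remains a substring of everything preceding its last character."""
    i, phrases, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        phrases += 1
        i += l
    return phrases

rng = random.Random(11)
random_seq = "".join(rng.choice("01") for _ in range(2000))  # disordered series
periodic_seq = "01" * 1000                                   # ordered series
c_random, c_periodic = lz_complexity(random_seq), lz_complexity(periodic_seq)
```

A random binary series yields on the order of n/log2(n) phrases, while a periodic one yields only a handful, which is exactly the ordered-vs-random contrast such measures quantify.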

  5. Random Number Simulations Reveal How Random Noise Affects the Measurements and Graphical Portrayals of Self-Assessed Competency

    Directory of Open Access Journals (Sweden)

    Edward Nuhfer

    2016-01-01

    Full Text Available Self-assessment measures of competency are blends of an authentic self-assessment signal that researchers seek to measure and random disorder or "noise" that accompanies that signal. In this study, we use random number simulations to explore how random noise affects critical aspects of self-assessment investigations: reliability, correlation, critical sample size, and the graphical representations of self-assessment data. We show that graphical conventions common in the self-assessment literature introduce artifacts that invite misinterpretation. Troublesome conventions include: (y minus x) vs. (x) scatterplots; (y minus x) vs. (x) column graphs aggregated as quantiles; line charts that display data aggregated as quantiles; and some histograms. Graphical conventions that generate minimal artifacts include scatterplots with a best-fit line that depict (y) vs. (x) measures (self-assessed competence vs. measured competence) plotted by individual participant scores, and (y) vs. (x) scatterplots of collective average measures of all participants plotted item-by-item. This last graphic convention attenuates noise and improves the definition of the signal. To provide relevant comparisons across varied graphical conventions, we use a single dataset derived from paired measures of 1154 participants' self-assessed competence and demonstrated competence in science literacy. Our results show that different numerical approaches employed in investigating and describing self-assessment accuracy are not equally valid. By modeling this dataset with random numbers, we show how recognizing the varied expressions of randomness in self-assessment data can improve the validity of numeracy-based descriptions of self-assessment.
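    The central artifact is easy to reproduce with random numbers: even when self-assessed and measured competence are completely independent noise, plotting their difference against the measured score manufactures a strong negative correlation of about -1/sqrt(2). A minimal simulation in the spirit of the study (sample size and seed are arbitrary):

```python
import random

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb)

rng = random.Random(42)
x = [rng.random() for _ in range(2000)]  # "measured competence": pure noise
y = [rng.random() for _ in range(2000)]  # "self-assessed": independent noise
r_plain = pearson(y, x)                  # y vs. x: near zero, as it should be
r_diff = pearson([yi - xi for xi, yi in zip(x, y)], x)  # (y - x) vs. x: strongly negative
```

The negative r_diff is pure mathematical artifact of subtracting x and then plotting against x, not evidence of any self-assessment effect.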

  6. Reheating-volume measure for random-walk inflation

    International Nuclear Information System (INIS)

    Winitzki, Sergei

    2008-01-01

    The recently proposed 'reheating-volume' (RV) measure promises to solve the long-standing problem of extracting probabilistic predictions from cosmological multiverse scenarios involving eternal inflation. I give a detailed description of the new measure and its applications to generic models of eternal inflation of random-walk type. For those models I derive a general formula for RV-regulated probability distributions that is suitable for numerical computations. I show that the results of the RV cutoff in random-walk type models are always gauge invariant and independent of the initial conditions at the beginning of inflation. In a toy model where equal-time cutoffs lead to the 'youngness paradox', the RV cutoff yields unbiased results that are distinct from previously proposed measures.

  7. A matched pair cluster randomized implementation trial to measure the effectiveness of an intervention package aiming to decrease perinatal mortality and increase institution-based obstetric care among indigenous women in Guatemala: study protocol.

    Science.gov (United States)

    Kestler, Edgar; Walker, Dilys; Bonvecchio, Anabelle; de Tejada, Sandra Sáenz; Donner, Allan

    2013-03-21

    Maternal and perinatal mortality continue to be a high priority problem on the health agendas of less developed countries. Despite the progress made in the last decade to quantify the magnitude of maternal mortality, few interventions have been implemented with the intent to measure impact directly on maternal or perinatal deaths. The success of interventions implemented in less developed countries to reduce mortality has been questioned, in terms of the tendency to maintain a clinical perspective with a focus on purely medical care separate from community-based approaches that take cultural and social aspects of maternal and perinatal deaths into account. Our innovative approach utilizes both the clinical and community perspectives; moreover, our study will report the weight that each of these components may have had on reducing perinatal mortality and increasing institution-based deliveries. A matched pair cluster-randomized trial will be conducted in clinics in four rural indigenous districts with the highest maternal mortality ratios in Guatemala. The individual clinic will serve as the unit of randomization, with 15 matched pairs of control and intervention clinics composing the final sample. Three interventions will be implemented in indigenous, rural and poor populations: a simulation training program for emergency obstetric and perinatal care, increased participation of the professional midwife in strengthening the link between traditional birth attendants (TBA) and the formal health care system, and a social marketing campaign to promote institution-based deliveries. No external intervention is planned for control clinics, although enhanced monitoring, surveillance and data collection will occur throughout the study in all clinics throughout the four districts. All obstetric events occurring in any of the participating health facilities and districts during the 18 months implementation period will be included in the analysis, controlling for the cluster

  8. Algorithmic randomness, physical entropy, measurements, and the second law

    International Nuclear Information System (INIS)

    Zurek, W.H.

    1989-01-01

    Algorithmic information content is equal to the size -- in the number of bits -- of the shortest program for a universal Turing machine which can reproduce a state of a physical system. In contrast to the statistical Boltzmann-Gibbs-Shannon entropy, which measures ignorance, the algorithmic information content is a measure of the available information. It is defined without a recourse to probabilities and can be regarded as a measure of randomness of a definite microstate. I suggest that the physical entropy S -- that is, the quantity which determines the amount of the work ΔW which can be extracted in the cyclic isothermal expansion process through the equation ΔW = k_B TΔS -- is a sum of two contributions: the missing information measured by the usual statistical entropy and the known randomness measured by the algorithmic information content. The sum of these two contributions is a ''constant of motion'' in the process of a dissipationless measurement on an equilibrium ensemble. This conservation under a measurement, which can be traced back to the noiseless coding theorem of Shannon, is necessary to rule out existence of a successful Maxwell's demon. 17 refs., 3 figs
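    Algorithmic information content is uncomputable in general, but compressed size gives a computable upper-bound proxy that captures the contrast drawn above between a regular and a random microstate (zlib is an arbitrary choice of compressor for this sketch):

```python
import random
import zlib

def approx_info_bits(data: bytes) -> int:
    """Compressed size in bits: a crude, computable upper-bound proxy for
    the algorithmic information content of a definite (micro)state."""
    return 8 * len(zlib.compress(data, 9))

ordered = b"01" * 5000  # regular "microstate": admits a very short description
rng = random.Random(7)
disordered = bytes(rng.getrandbits(8) for _ in range(10000))  # random bytes

k_ordered = approx_info_bits(ordered)        # tiny: a short program suffices
k_disordered = approx_info_bits(disordered)  # near the raw size: incompressible
```

The periodic string compresses to a few hundred bits while the random string stays near its raw 80,000 bits, mirroring the distinction between a describable microstate and one whose shortest description is essentially itself.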

  9. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening insignificant random variables and ranking significant important random variables using sensitivity measures including two cumulative distribution function (CDF)-based and two mean-response based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from the test-of-hypothesis, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few numbers of significant random variables
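    A mean-response sensitivity computed from a single set of random samples can be sketched as follows. The split-at-median contrast and the fixed screening threshold below are simplified stand-ins for the paper's CDF-based and mean-response measures and its test-of-hypothesis acceptance limits:

```python
import random

def mean_response_sensitivity(f, n_vars, n_samples=4000, seed=2):
    """For each input variable, contrast the mean response over samples
    where that input falls in its upper half against its lower half.
    Near-zero contrasts flag variables that can be screened out; larger
    contrasts rank the significant variables. One sample set serves all
    variables, so cost does not grow with n_vars."""
    rng = random.Random(seed)
    X = [[rng.random() for _ in range(n_vars)] for _ in range(n_samples)]
    Y = [f(x) for x in X]
    sens = []
    for j in range(n_vars):
        hi = [y for x, y in zip(X, Y) if x[j] >= 0.5]
        lo = [y for x, y in zip(X, Y) if x[j] < 0.5]
        sens.append(abs(sum(hi) / len(hi) - sum(lo) / len(lo)))
    return sens

# x[0] and x[1] drive the response; x[2] is inert and should rank last.
sens = mean_response_sensitivity(lambda x: 5 * x[0] + 2 * x[1] ** 2 + 0 * x[2], 3)
```

Because only samples of the performance function are needed, the same idea works for highly nonlinear models with non-normal inputs, which is the property the abstract emphasizes.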

  10. Measuring symmetry, asymmetry and randomness in neural network connectivity.

    Directory of Open Access Journals (Sweden)

    Umberto Esposito

    Full Text Available Cognitive functions are stored in the connectome, the wiring diagram of the brain, which exhibits non-random features, so-called motifs. In this work, we focus on bidirectional, symmetric motifs, i.e. two neurons that project to each other via connections of equal strength, and unidirectional, non-symmetric motifs, i.e. within a pair of neurons only one neuron projects to the other. We hypothesise that such motifs have been shaped via activity dependent synaptic plasticity processes. As a consequence, learning moves the distribution of the synaptic connections away from randomness. Our aim is to provide a global, macroscopic, single parameter characterisation of the statistical occurrence of bidirectional and unidirectional motifs. To this end we define a symmetry measure that does not require any a priori thresholding of the weights or knowledge of their maximal value. We calculate its mean and variance for random uniform or Gaussian distributions, which allows us to introduce a confidence measure of how significantly symmetric or asymmetric a specific configuration is, i.e. how likely it is that the configuration is the result of chance. We demonstrate the discriminatory power of our symmetry measure by inspecting the eigenvalues of different types of connectivity matrices. We show that a Gaussian weight distribution biases the connectivity motifs to more symmetric configurations than a uniform distribution and that introducing a random synaptic pruning, mimicking developmental regulation in synaptogenesis, biases the connectivity motifs to more asymmetric configurations, regardless of the distribution. We expect that our work will benefit the computational modelling community, by providing a systematic way to characterise symmetry and asymmetry in network structures. Further, our symmetry measure will be of use to electrophysiologists that investigate symmetry of network connectivity.

  11. Measuring symmetry, asymmetry and randomness in neural network connectivity.

    Science.gov (United States)

    Esposito, Umberto; Giugliano, Michele; van Rossum, Mark; Vasilaki, Eleni

    2014-01-01

    Cognitive functions are stored in the connectome, the wiring diagram of the brain, which exhibits non-random features, so-called motifs. In this work, we focus on bidirectional, symmetric motifs, i.e. two neurons that project to each other via connections of equal strength, and unidirectional, non-symmetric motifs, i.e. within a pair of neurons only one neuron projects to the other. We hypothesise that such motifs have been shaped via activity dependent synaptic plasticity processes. As a consequence, learning moves the distribution of the synaptic connections away from randomness. Our aim is to provide a global, macroscopic, single parameter characterisation of the statistical occurrence of bidirectional and unidirectional motifs. To this end we define a symmetry measure that does not require any a priori thresholding of the weights or knowledge of their maximal value. We calculate its mean and variance for random uniform or Gaussian distributions, which allows us to introduce a confidence measure of how significantly symmetric or asymmetric a specific configuration is, i.e. how likely it is that the configuration is the result of chance. We demonstrate the discriminatory power of our symmetry measure by inspecting the eigenvalues of different types of connectivity matrices. We show that a Gaussian weight distribution biases the connectivity motifs to more symmetric configurations than a uniform distribution and that introducing a random synaptic pruning, mimicking developmental regulation in synaptogenesis, biases the connectivity motifs to more asymmetric configurations, regardless of the distribution. We expect that our work will benefit the computational modelling community, by providing a systematic way to characterise symmetry and asymmetry in network structures. Further, our symmetry measure will be of use to electrophysiologists that investigate symmetry of network connectivity.
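    A symmetry measure over connection pairs that needs no a priori thresholding of the weights can be sketched like this (a simple norm-based index, equal to 1 for a symmetric weight matrix and 0 for an anti-symmetric one; not necessarily the authors' exact definition):

```python
import random

def symmetry_index(W):
    """Normalized symmetry of the off-diagonal weights: for each pair
    (i, j) compare the symmetric part W[i][j] + W[j][i] against the
    anti-symmetric part W[i][j] - W[j][i]. No weight threshold or
    knowledge of the maximal weight is required."""
    num = den = 0.0
    n = len(W)
    for i in range(n):
        for j in range(i + 1, n):
            s = W[i][j] + W[j][i]
            d = W[i][j] - W[j][i]
            num += s * s
            den += s * s + d * d
    return num / den if den else 1.0

rng = random.Random(3)
n = 40
W = [[rng.random() for _ in range(n)] for _ in range(n)]                 # random weights
sym = [[0.5 * (W[i][j] + W[j][i]) for j in range(n)] for i in range(n)]  # symmetrized
anti = [[W[i][j] - W[j][i] for j in range(n)] for i in range(n)]         # anti-symmetric
```

Comparing a network's index against the value expected for its weight distribution (about 0.875 for i.i.d. uniform weights) gives the kind of confidence statement the abstract describes: how likely the observed motif balance is to be the result of chance.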

  12. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
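    The grid placement that such a tool automates can be sketched in a few lines; the function name and the rectangular region-of-interest assumption below are illustrative, not RandomSpot's actual API:

```python
import random

# Systematic random sampling on a rectangular region: an equidistant grid with
# one random offset per axis, so every location has equal inclusion
# probability (the source of the unbiasedness).
def srs_points(x0, y0, width, height, spacing, seed=None):
    rng = random.Random(seed)
    ox = rng.uniform(0, spacing)   # random offset fixes the grid phase
    oy = rng.uniform(0, spacing)
    pts = []
    y = y0 + oy
    while y < y0 + height:
        x = x0 + ox
        while x < x0 + width:
            pts.append((x, y))
            x += spacing
        y += spacing
    return pts

points = srs_points(0, 0, 1000, 800, spacing=100, seed=42)
```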

  13. Quantitative measure of randomness and order for complete genomes

    Science.gov (United States)

    Kong, Sing-Guan; Fan, Wen-Lang; Chen, Hong-Da; Wigger, Jan; Torda, Andrew E.; Lee, H. C.

    2009-06-01

    We propose an order index, ϕ, which gives a quantitative measure of randomness and order of complete genomic sequences. It maps genomes to a number from 0 (random and of infinite length) to 1 (fully ordered) and applies regardless of sequence length. The 786 complete genomic sequences in GenBank were found to have ϕ values in a very narrow range, ϕ_g = 0.031 (+0.028/−0.015). We show this implies that genomes are halfway toward being completely random, or, at the “edge of chaos.” We further show that artificial “genomes” converted from literary classics have ϕ's that almost exactly coincide with ϕ_g, but sequences of low information content do not. We infer that ϕ_g represents a high information-capacity “fixed point” in sequence space, and that genomes are driven to it by the dynamics of a robust growth and evolution process. We show that a growth process characterized by random segmental duplication can robustly drive genomes to the fixed point.
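    To make the notion of an order index concrete, here is a hedged sketch using k-mer Shannon entropy; the article's ϕ has its own definition and normalization, so this only mirrors the convention of mapping a sequence toward 0 when maximally random and toward 1 when fully ordered:

```python
import math
import random
from collections import Counter

# Illustrative index only, not the article's phi: one minus the normalized
# Shannon entropy of the k-mer distribution of the sequence.
def order_index(seq, k=3, alphabet_size=4):
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    n = len(kmers)
    h = -sum((c / n) * math.log2(c / n) for c in Counter(kmers).values())
    h_max = k * math.log2(alphabet_size)   # entropy of a uniform k-mer source
    return 1.0 - h / h_max

rng = random.Random(1)
rand_seq = "".join(rng.choice("ACGT") for _ in range(1000))  # near 0
ordered = order_index("ACGT" * 250)        # periodic, highly ordered sequence
```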

  14. Ashtanga-Based Yoga Therapy Increases the Sensory Contribution to Postural Stability in Visually-Impaired Persons at Risk for Falls as Measured by the Wii Balance Board: A Pilot Randomized Controlled Trial

    Science.gov (United States)

    Haaz Moonaz, Steffany; Bittner, Ava K.

    2015-01-01

    Objective: Persons with visual impairment (VI) are at greater risk for falls due to irreparable damage to visual sensory input contributing to balance. Targeted training may significantly improve postural stability by strengthening the remaining sensory systems. Here, we evaluate the Ashtanga-based Yoga Therapy (AYT) program as a multi-sensory behavioral intervention to develop postural stability in VI. Design: A randomized, waitlist-controlled, single-blind clinical trial. Methods: The trial was conducted between October 2012 and December 2013. Twenty-one legally blind participants were randomized to an 8-week AYT program (n = 11, mean (SD) age = 55(17)) or waitlist control (n=10, mean (SD) age = 55(10)). AYT subjects convened for one group session at a local yoga studio with an instructor and two individual home-based practice sessions per week for a total of 8 weeks. Subjects completed outcome measures at baseline and post-8 weeks of AYT. The primary outcome, absolute Center of Pressure (COP), was derived from the Wii Balance Board (WBB), a standalone posturography device, in 4 sensory conditions: firm surface, eyes open (EO); firm surface, eyes closed (EC); foam surface, EO; and foam surface, EC. Stabilization Indices (SI) were computed from COP measures to determine the relative visual (SIfirm, SIfoam), somatosensory (SIEO, SIEC) and vestibular (SIV, i.e., FoamEC vs. FirmEO) contributions to balance. This study was not powered to detect between group differences, so significance of pre-post changes was assessed by paired samples t-tests within each group. Results: Groups were equivalent at baseline (all p > 0.05). In the AYT group, absolute COP significantly increased in the FoamEO (t(8) = -3.66, p = 0.01) and FoamEC (t(8) = -3.90, p = 0.01) conditions. Relative somatosensory SIEO (t(8) = -2.42, p = 0.04) and SIEC (t(8) = -3.96, p = 0.01), and vestibular SIV (t(8) = -2.47, p = 0.04) contributions to balance increased significantly. As expected, no significant

  15. Ashtanga-Based Yoga Therapy Increases the Sensory Contribution to Postural Stability in Visually-Impaired Persons at Risk for Falls as Measured by the Wii Balance Board: A Pilot Randomized Controlled Trial.

    Science.gov (United States)

    Jeter, Pamela E; Haaz Moonaz, Steffany; Bittner, Ava K; Dagnelie, Gislin

    2015-01-01

    Persons with visual impairment (VI) are at greater risk for falls due to irreparable damage to visual sensory input contributing to balance. Targeted training may significantly improve postural stability by strengthening the remaining sensory systems. Here, we evaluate the Ashtanga-based Yoga Therapy (AYT) program as a multi-sensory behavioral intervention to develop postural stability in VI. A randomized, waitlist-controlled, single-blind clinical trial. The trial was conducted between October 2012 and December 2013. Twenty-one legally blind participants were randomized to an 8-week AYT program (n = 11, mean (SD) age = 55(17)) or waitlist control (n=10, mean (SD) age = 55(10)). AYT subjects convened for one group session at a local yoga studio with an instructor and two individual home-based practice sessions per week for a total of 8 weeks. Subjects completed outcome measures at baseline and post-8 weeks of AYT. The primary outcome, absolute Center of Pressure (COP), was derived from the Wii Balance Board (WBB), a standalone posturography device, in 4 sensory conditions: firm surface, eyes open (EO); firm surface, eyes closed (EC); foam surface, EO; and foam surface, EC. Stabilization Indices (SI) were computed from COP measures to determine the relative visual (SIfirm, SIfoam), somatosensory (SIEO, SIEC) and vestibular (SIV, i.e., FoamEC vs. FirmEO) contributions to balance. This study was not powered to detect between group differences, so significance of pre-post changes was assessed by paired samples t-tests within each group. Groups were equivalent at baseline (all p > 0.05). In the AYT group, absolute COP significantly increased in the FoamEO (t(8) = -3.66, p = 0.01) and FoamEC (t(8) = -3.90, p = 0.01) conditions. Relative somatosensory SIEO (t(8) = -2.42, p = 0.04) and SIEC (t(8) = -3.96, p = 0.01), and vestibular SIV (t(8) = -2.47, p = 0.04) contributions to balance increased significantly. 
As expected, no significant changes from EO to EC conditions were

  16. Ashtanga-Based Yoga Therapy Increases the Sensory Contribution to Postural Stability in Visually-Impaired Persons at Risk for Falls as Measured by the Wii Balance Board: A Pilot Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Pamela E Jeter

    Full Text Available Persons with visual impairment (VI) are at greater risk for falls due to irreparable damage to visual sensory input contributing to balance. Targeted training may significantly improve postural stability by strengthening the remaining sensory systems. Here, we evaluate the Ashtanga-based Yoga Therapy (AYT) program as a multi-sensory behavioral intervention to develop postural stability in VI. A randomized, waitlist-controlled, single-blind clinical trial. The trial was conducted between October 2012 and December 2013. Twenty-one legally blind participants were randomized to an 8-week AYT program (n = 11, mean (SD) age = 55(17)) or waitlist control (n=10, mean (SD) age = 55(10)). AYT subjects convened for one group session at a local yoga studio with an instructor and two individual home-based practice sessions per week for a total of 8 weeks. Subjects completed outcome measures at baseline and post-8 weeks of AYT. The primary outcome, absolute Center of Pressure (COP), was derived from the Wii Balance Board (WBB), a standalone posturography device, in 4 sensory conditions: firm surface, eyes open (EO); firm surface, eyes closed (EC); foam surface, EO; and foam surface, EC. Stabilization Indices (SI) were computed from COP measures to determine the relative visual (SIfirm, SIfoam), somatosensory (SIEO, SIEC) and vestibular (SIV, i.e., FoamEC vs. FirmEO) contributions to balance. This study was not powered to detect between group differences, so significance of pre-post changes was assessed by paired samples t-tests within each group. Groups were equivalent at baseline (all p > 0.05). In the AYT group, absolute COP significantly increased in the FoamEO (t(8) = -3.66, p = 0.01) and FoamEC (t(8) = -3.90, p = 0.01) conditions. Relative somatosensory SIEO (t(8) = -2.42, p = 0.04) and SIEC (t(8) = -3.96, p = 0.01), and vestibular SIV (t(8) = -2.47, p = 0.04) contributions to balance increased significantly. As expected, no significant changes from EO to EC conditions were found

  17. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    OpenAIRE

    Dhruba Das; Hemanta K. Baruah

    2015-01-01

    In this article, based on Zadeh’s extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, following Baruah’s Randomness-Fuzziness Consistency Principle. The Randomness-Fuzziness Consistency Principle leads to defining a normal law of fuzziness using two different laws of randomness. In this article, two fuzzy queues FM...

  18. Measurement model choice influenced randomized controlled trial results.

    Science.gov (United States)

    Gorter, Rosalie; Fox, Jean-Paul; Apeldoorn, Adri; Twisk, Jos

    2016-11-01

    In randomized controlled trials (RCTs), outcome variables are often patient-reported outcomes measured with questionnaires. Ideally, all available item information is used for score construction, which requires an item response theory (IRT) measurement model. However, in practice, the classical test theory measurement model (sum scores) is mostly used, and differences between response patterns leading to the same sum score are ignored. The enhanced differentiation between scores with IRT enables more precise estimation of individual trajectories over time and group effects. The objective of this study was to show the advantages of using IRT scores instead of sum scores when analyzing RCTs. Two studies are presented, a real-life RCT, and a simulation study. Both IRT and sum scores are used to measure the construct and are subsequently used as outcomes for effect calculation. The bias in RCT results is conditional on the measurement model that was used to construct the scores. A bias in estimated trend of around one standard deviation was found when sum scores were used, where IRT showed negligible bias. Accurate statistical inferences are made from an RCT study when using IRT to estimate construct measurements. The use of sum scores leads to incorrect RCT results.

  19. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can thus be implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and confirm the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
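    The greedy selection idea can be sketched as follows, using the common A-optimality proxy trace((H_S^T H_S)^{-1}) as the error measure; the function name, the ridge regularization, and the choice of error measure are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

# Greedy measurement selection: at each step, add the row of H that most
# reduces trace((H_S^T H_S + ridge*I)^{-1}), a standard MSE proxy for
# estimating an m-dimensional parameter. Ridge keeps early Grams invertible.
def greedy_select(H, k, ridge=1e-6):
    n, m = H.shape
    selected, remaining = [], set(range(n))
    for _ in range(k):
        best, best_cost = None, np.inf
        for i in remaining:
            rows = selected + [i]
            G = H[rows].T @ H[rows] + ridge * np.eye(m)
            cost = np.trace(np.linalg.inv(G))
            if cost < best_cost:
                best, best_cost = i, cost
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
H = rng.standard_normal((50, 4))   # n = 50 candidate measurements, m = 4
chosen = greedy_select(H, k=8)
```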

  20. A generator for unique quantum random numbers based on vacuum states

    DEFF Research Database (Denmark)

    Gabriel, C.; Wittmann, C.; Sych, D.

    2010-01-01

    Random numbers are a valuable component in diverse applications that range from simulations(1) over gambling to cryptography(2,3). The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics(4-11). However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique(12-15). Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably...
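    A toy software model of the measurement stage: homodyne quadratures of a vacuum state are Gaussian distributed, so raw bits can be simulated by thresholding Gaussian samples at zero. This ignores the verifiability and side-information analysis that is the paper's actual contribution:

```python
import numpy as np

# Simulated homodyne outcomes of a vacuum mode: zero-mean Gaussian quadrature
# values. One raw bit per sample is taken from the sign of the quadrature.
rng = np.random.default_rng(7)
quadratures = rng.normal(loc=0.0, scale=1.0, size=100_000)  # vacuum noise
raw_bits = (quadratures > 0).astype(int)
bias = abs(raw_bits.mean() - 0.5)   # should be small for a symmetric source
```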

  1. Random walks in the quarter-plane: invariant measures and performance bounds

    NARCIS (Netherlands)

    Chen, Y.

    2015-01-01

    This monograph focuses on random walks in the quarter-plane. Such random walks are frequently used to model queueing systems and the invariant measure of a random walk is of major importance in studying the performance of these systems. In special cases the invariant measure of a random walk can be

  2. The Random Walk Model Based on Bipartite Network

    Directory of Open Access Journals (Sweden)

    Zhang Man-Dun

    2016-01-01

    Full Text Available With the continuing development of electronic commerce and the growth of network information, users are increasingly at risk of being overwhelmed by information. Although traditional information-retrieval technology can relieve this overload to some extent, it cannot offer a personalized service targeted at a user's interests and activities. In this context, recommendation algorithms arose. In this paper, building on conventional recommendation methods, we study the random walk scheme based on a bipartite network and its application. We put forward a similarity measure based on implicit feedback, in which an uneven character vector (the weight of each item in the system) is introduced. We also put forward an improved random walk pattern that makes use of partial or incomplete neighbor information to create recommendations. Finally, in an experiment on a real data set, recommendation accuracy and practicality are improved, confirming the validity of the approach.
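    The unweighted two-step mass-diffusion walk that such recommenders build on can be sketched as follows; the article's variant adds an item-weight vector and partial neighbor information, which this minimal version omits:

```python
import numpy as np

# Classic two-step mass diffusion (random-walk scoring) on a user-item
# bipartite network: resource starts on the target user's items, spreads to
# users (divided by item degree), then back to items (divided by user degree).
def diffusion_scores(R, user):
    """R: binary user-item matrix (users x items). Scores items for `user`."""
    R = np.asarray(R, dtype=float)
    item_deg = np.maximum(R.sum(axis=0), 1.0)   # guard empty columns
    user_deg = np.maximum(R.sum(axis=1), 1.0)
    f = R[user]                                  # unit resource on chosen items
    to_users = (R * (f / item_deg)).sum(axis=1)  # step 1: items -> users
    return (R.T * (to_users / user_deg)).sum(axis=1)  # step 2: users -> items

R = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
scores = diffusion_scores(R, user=0)
```

    Total resource is conserved by the two normalized steps; in practice, items the user has already collected are masked out before ranking the remaining scores.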

  3. A programmable Gaussian random pulse generator for automated performance measurements

    International Nuclear Information System (INIS)

    Abdel-Aal, R.E.

    1989-01-01

    This paper describes a versatile random signal generator which produces logic pulses with a Gaussian distribution for the pulse spacing. The average rate at the pulse generator output can be software-programmed, which makes it useful in performing automated measurements of dead time and CPU time performance of data acquisition systems and modules over a wide range of data rates. Hardware and software components are described and data on the input-output characteristics and the statistical properties of the pulse generator are given. Typical applications are discussed together with advantages over using radioactive test sources. Results obtained from an automated performance run on a VAX 11/785 data acquisition system are presented. (orig.)
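    A software analogue of the generator's output process, assuming Gaussian-distributed pulse spacings truncated to positive values (the hardware implementation itself is not reproduced here):

```python
import random

# Pulse train with Gaussian inter-pulse spacing; the mean spacing sets the
# average rate, and non-physical (non-positive) spacings are redrawn.
def pulse_times(n, mean_spacing, sigma, seed=None):
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        dt = rng.gauss(mean_spacing, sigma)
        while dt <= 0:                 # redraw non-physical spacings
            dt = rng.gauss(mean_spacing, sigma)
        t += dt
        times.append(t)
    return times

times = pulse_times(10_000, mean_spacing=1e-3, sigma=2e-4, seed=1)  # ~1 kHz
```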

  4. Online evolution reconstruction from a single measurement record with random time intervals for quantum communication

    Science.gov (United States)

    Zhou, Hua; Su, Yang; Wang, Rong; Zhu, Yong; Shen, Huiping; Pu, Tao; Wu, Chuanxin; Zhao, Jiyong; Zhang, Baofu; Xu, Zhiyong

    2017-10-01

    Online reconstruction of a time-variant quantum state from the encoding/decoding results of quantum communication is addressed by developing a method of evolution reconstruction from a single measurement record with random time intervals. A time-variant two-dimensional state is reconstructed on the basis of recovering its expectation value functions of three nonorthogonal projectors from a random single measurement record, which is composed from the discarded qubits of the six-state protocol. The simulated results prove that our method is robust to typical metro quantum channels. Our work extends the Fourier-based method of evolution reconstruction from the version for a regular single measurement record with equal time intervals to a unified one, which can be applied to arbitrary single measurement records. The proposed protocol of evolution reconstruction runs concurrently with the one of quantum communication, which can facilitate the online quantum tomography.

  5. Pseudo-random bit generator based on Chebyshev map

    Science.gov (United States)

    Stoyanov, B. P.

    2013-10-01

    In this paper, we study a pseudo-random bit generator based on two Chebyshev polynomial maps. The novel derivative algorithm shows perfect statistical properties, established by a number of statistical tests.
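    A minimal sketch of a generator driven by two Chebyshev maps x → cos(k·arccos x); the published generator's exact coupling and bit-extraction rule are not given in the abstract, so the comparison rule below is an assumption:

```python
import math

# Two chaotic Chebyshev orbits on [-1, 1]; one raw bit per step is extracted
# by comparing the two orbits (an illustrative extraction rule).
def chebyshev_prbg(n_bits, x0=0.3, y0=0.7, kx=4, ky=5):
    x, y, bits = x0, y0, []
    for _ in range(n_bits):
        x = math.cos(kx * math.acos(x))   # Chebyshev map T_kx
        y = math.cos(ky * math.acos(y))   # Chebyshev map T_ky
        bits.append(1 if x > y else 0)
    return bits

bits = chebyshev_prbg(10_000)
```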

  6. Random number generation based on digital differential chaos

    KAUST Repository

    Zidan, Mohammed A.; Radwan, Ahmed G.; Salama, Khaled N.

    2012-01-01

    In this paper, we present a fully digital differential chaos based random number generator. The output of the digital circuit is proved to be chaotic by calculating the output time series maximum Lyapunov exponent. We introduce a new post processing

  7. Ultrafast quantum random number generation based on quantum phase fluctuations.

    Science.gov (United States)

    Xu, Feihu; Qi, Bing; Ma, Xiongfeng; Xu, He; Zheng, Haoxuan; Lo, Hoi-Kwong

    2012-05-21

    A quantum random number generator (QRNG) can generate true randomness by exploiting the fundamental indeterminism of quantum mechanics. Most approaches to QRNG employ single-photon detection technologies and are limited in speed. Here, we experimentally demonstrate an ultrafast QRNG at a rate over 6 Gbits/s based on the quantum phase fluctuations of a laser operating near threshold. Moreover, we consider a potential adversary who has partial knowledge on the raw data and discuss how one can rigorously remove such partial knowledge with postprocessing. We quantify the quantum randomness through min-entropy by modeling our system and employ two randomness extractors--Trevisan's extractor and Toeplitz-hashing--to distill the randomness, which is information-theoretically provable. The simplicity and high-speed of our experimental setup show the feasibility of a robust, low-cost, high-speed QRNG.
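    Toeplitz hashing, one of the two extractors named above, can be sketched directly: the output is a Toeplitz matrix-vector product over GF(2), with the matrix fixed by a seed of n + m − 1 bits. In practice the output length m would be set from the estimated min-entropy; the sizes below are illustrative:

```python
import numpy as np

# Toeplitz-hashing extractor: out = T @ raw (mod 2), where the m x n binary
# Toeplitz matrix T has constant diagonals, T[i, j] = seed[i - j + n - 1].
def toeplitz_extract(raw_bits, seed_bits, m):
    raw = np.asarray(raw_bits, dtype=np.int64)
    s = np.asarray(seed_bits, dtype=np.int64)
    n = raw.size
    assert s.size == n + m - 1, "Toeplitz seed needs n + m - 1 bits"
    idx = np.arange(m)[:, None] - np.arange(n)[None, :] + (n - 1)
    return (s[idx] @ raw) % 2

rng = np.random.default_rng(3)
raw = rng.integers(0, 2, size=256)            # raw (imperfect) bits
seed = rng.integers(0, 2, size=256 + 64 - 1)  # uniform seed for T
out = toeplitz_extract(raw, seed, m=64)       # 64 distilled bits
```

    Because the map is linear over GF(2), extracting the XOR of two raw strings equals the XOR of their extractions, a property the security proofs of such two-universal hash extractors rely on.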

  8. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  9. Modeling and optimizing of the random atomic spin gyroscope drift based on the atomic spin gyroscope

    Energy Technology Data Exchange (ETDEWEB)

    Quan, Wei; Lv, Lin, E-mail: lvlinlch1990@163.com; Liu, Baiqi [School of Instrument Science and Opto-Electronics Engineering, Beihang University, Beijing 100191 (China)

    2014-11-15

    In order to improve the atomic spin gyroscope's operational accuracy and compensate the random error caused by the nonlinear, weakly stable character of the random atomic spin gyroscope (ASG) drift, a hybrid random drift error model based on autoregressive (AR) and genetic programming (GP) + genetic algorithm (GA) techniques is established. The time series of random ASG drift is taken as the study object; it is acquired by analyzing and preprocessing the measured data of the ASG. The linear section of the model is established based on the AR technique. After that, the nonlinear section is built based on the GP technique, and GA is used to optimize the coefficients of the mathematical expression acquired by GP in order to obtain a more accurate model. The simulation result indicates that this hybrid model can effectively reflect the characteristics of the ASG's random drift. The square error of the ASG's random drift is reduced by 92.40%. Compared with the AR technique and the GP + GA technique alone, the random drift is reduced by 9.34% and 5.06%, respectively. The hybrid modeling method can effectively compensate the ASG's random drift and improve the stability of the system.

  10. Pseudo-random bit generator based on lag time series

    Science.gov (United States)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we use a delay in the generation of the time series. When these new series are mapped xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
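    The lag idea can be sketched as follows, using only a positive bifurcation parameter for brevity (the paper also uses negative values); all parameter choices below are illustrative:

```python
# Lag-series PRBG sketch: iterate the logistic map x -> r*x*(1-x), discard
# `lag` intermediate samples between emitted values so that consecutive
# outputs no longer trace the map's parabola, then threshold to bits.
def lag_logistic_bits(n_bits, r=3.99, x0=0.123, lag=5):
    x, bits = x0, []
    for _ in range(n_bits):
        for _ in range(lag + 1):        # advance lag extra steps per output
            x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

bits = lag_logistic_bits(5_000)
```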

  11. Strain measurement based battery testing

    Science.gov (United States)

    Xu, Jeff Qiang; Steiber, Joe; Wall, Craig M.; Smith, Robert; Ng, Cheuk

    2017-05-23

    A method and system for strain-based estimation of the state of health of a battery, from an initial state to an aged state, is provided. A strain gauge is applied to the battery. A first strain measurement is performed on the battery, using the strain gauge, at a selected charge capacity of the battery and at the initial state of the battery. A second strain measurement is performed on the battery, using the strain gauge, at the selected charge capacity of the battery and at the aged state of the battery. The capacity degradation of the battery is estimated as the difference between the first and second strain measurements divided by the first strain measurement.
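    The estimate described above transcribes directly to code; the sign convention (aged minus initial) is an assumption, since the abstract only specifies "the difference between the first and second strain measurements divided by the first strain measurement":

```python
# Capacity degradation estimate from two strain-gauge readings taken at the
# same charge capacity: fractional change relative to the initial reading.
# Sign convention (aged - initial) is assumed, not stated in the source.
def capacity_degradation(strain_initial, strain_aged):
    return (strain_aged - strain_initial) / strain_initial

# e.g. a reading of 100.0 at the initial state and 112.0 at the aged state
change = capacity_degradation(100.0, 112.0)
```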

  12. DNA-based random number generation in security circuitry.

    Science.gov (United States)

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.

  13. Variances as order parameter and complexity measure for random Boolean networks

    International Nuclear Information System (INIS)

    Luque, Bartolo; Ballesteros, Fernando J; Fernandez, Manuel

    2005-01-01

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems
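    A sketch of the order parameter on a standard N-K random Boolean network; the construction below (quenched random inputs and truth tables, synchronous update) follows the usual RBN model rather than the article's exact setup:

```python
import random

# Mean temporal variance of node states in an N-K random Boolean network:
# frozen (ordered-phase) nodes contribute ~0, fluctuating nodes up to 0.25.
def temporal_variance(N=200, K=2, steps=300, transient=100, seed=0):
    rng = random.Random(seed)
    inputs = [[rng.randrange(N) for _ in range(K)] for _ in range(N)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
    state = [rng.randint(0, 1) for _ in range(N)]
    history = []
    for t in range(steps):
        # synchronous update: each node reads its K inputs as a table index
        state = [tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
                 for i in range(N)]
        if t >= transient:
            history.append(state)
    T = len(history)
    cols = list(zip(*history))
    means = [sum(c) / T for c in cols]
    variances = [sum((s - m) ** 2 for s in c) / T for c, m in zip(cols, means)]
    return sum(variances) / N

v = temporal_variance()   # K = 2 is the critical connectivity for RBNs
```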

  14. Variances as order parameter and complexity measure for random Boolean networks

    Energy Technology Data Exchange (ETDEWEB)

    Luque, Bartolo [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Ballesteros, Fernando J [Observatori Astronomic, Universitat de Valencia, Ed. Instituts d' Investigacio, Pol. La Coma s/n, E-46980 Paterna, Valencia (Spain); Fernandez, Manuel [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain)

    2005-02-04

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems.

  15. Implementing traceability using particle randomness-based textile printed tags

    Science.gov (United States)

    Agrawal, T. K.; Koehl, L.; Campagne, C.

    2017-10-01

    This article introduces a random particle-based traceability tag for textiles. The proposed tag not only acts as a unique signature for the corresponding textile product but is also easy to manufacture and hard to copy. It targets applications in brand authentication and traceability in the textile and clothing (T&C) supply chain. A prototype was developed by a screen printing process, in which micron-scale particles were mixed with the printing paste and printed on cotton fabrics to attain the required randomness. To encode the randomness, an image of the developed tag was taken and analyzed using image processing. The randomness of the particles acts as a product key or unique signature which is required to decode the tag. Finally, washing and abrasion resistance tests were conducted to check the durability of the printed tag.

  16. N-state random switching based on quantum tunnelling

    Science.gov (United States)

    Bernardo Gavito, Ramón; Jiménez Urbanos, Fernando; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J.; Woodhead, Christopher S.; Missous, Mohamed; Roedig, Utz; Young, Robert J.

    2017-08-01

    In this work, we show how the hysteretic behaviour of resonant tunnelling diodes (RTDs) can be exploited for new functionalities. In particular, the RTDs exhibit a stochastic 2-state switching mechanism that could be useful for random number generation and cryptographic applications. This behaviour can be scaled to N-bit switching by connecting various RTDs in series. The InGaAs/AlAs RTDs used in our experiments display very sharp negative differential resistance (NDR) peaks at room temperature which show hysteresis cycles that, rather than having a fixed switching threshold, show a probability distribution about a central value. We propose to use this intrinsic uncertainty, emerging from the quantum nature of the RTDs, as a source of randomness. We show that a combination of two RTDs in series results in devices with three-state outputs and discuss the possibility of scaling to N-state devices by subsequent series connections of RTDs, which we demonstrate for up to the 4-state case. We suggest that the intrinsic uncertainty in the conduction paths of resonant tunnelling diodes can behave as a source of randomness that can be integrated into current electronics to produce on-chip true random number generators. The N-shaped I-V characteristic of RTDs results in a two-level random voltage output when driven with current pulse trains. Electrical characterisation and randomness testing of the devices were conducted in order to determine the validity of the true randomness assumption. Based on the results obtained for the single RTD case, we suggest the possibility of using multi-well devices to generate N-state random switching devices for use in random number generation or multi-valued logic devices.

  17. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study

    OpenAIRE

    Ayça Utkan Karasu; Elif Balevi Batur; Gülçin Kaymak Karataş

    2018-01-01

    Objective: To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. Methods: During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Asses...

  18. Some Limits Using Random Slope Models to Measure Academic Growth

    Directory of Open Access Journals (Sweden)

    Daniel B. Wright

    2017-11-01

    Full Text Available Academic growth is often estimated using a random slope multilevel model with several years of data. However, if there are few time points, the estimates can be unreliable. While using random slope multilevel models can lower the variance of the estimates, these procedures can produce more highly erroneous estimates—zero and negative correlations with the true underlying growth—than using ordinary least squares estimates calculated for each student or school individually. An example is provided where schools with increasing graduation rates are estimated to have negative growth and vice versa. The estimation is worse when the underlying data are skewed. It is recommended that there are at least six time points for estimating growth if using a random slope model. A combination of methods can be used to avoid some of the aberrant results if it is not possible to have six or more time points.

  19. Randomly forced CGL equation: stationary measures and the inviscid limit

    CERN Document Server

    Kuksin, S

    2003-01-01

    We study a complex Ginzburg-Landau (CGL) equation perturbed by a random force which is white in time and smooth in the space variable $x$. Assuming that $\dim x \le 4$, we prove that this equation has a unique solution and discuss its asymptotic-in-time properties. Next we consider the case when the random force is proportional to the square root of the viscosity and study the behaviour of stationary solutions as the viscosity goes to zero. We show that, under this limit, a subsequence of solutions in question converges to a nontrivial stationary process formed by global strong solutions of the nonlinear Schrödinger equation.

  20. Age replacement policy based on imperfect repair with random probability

    International Nuclear Information System (INIS)

    Lim, J.H.; Qu, Jian; Zuo, Ming J.

    2016-01-01

    In most of the literature on age replacement policies, failures before the planned replacement age can be either minimally repaired or perfectly repaired, depending on the type of failure, the cost of repair and so on. In this paper, we propose an age replacement policy based on imperfect repair with random probability. The proposed policy covers the case in which an intermittent failure can be either minimally repaired or perfectly repaired with random probabilities. The mathematical formulas of the expected cost rate per unit time are derived for both the infinite-horizon case and the one-replacement-cycle case. For each case, we show that the optimal replacement age exists and is finite. - Highlights: • We propose a new age replacement policy with random probability of perfect repair. • We develop the expected cost per unit time. • We discuss the optimal age for replacement minimizing the expected cost rate.
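
    For orientation only, the classical age-replacement cost rate (without the paper's random-probability imperfect-repair extension) is the standard benchmark such expected-cost formulas generalize; here c_p and c_f are the planned and failure replacement costs and F the lifetime distribution:

```latex
C(T) \;=\; \frac{c_p\,\bigl(1 - F(T)\bigr) \;+\; c_f\,F(T)}
               {\int_0^{T} \bigl(1 - F(t)\bigr)\,dt}
```

    Minimizing C(T) over the replacement age T gives the classical optimal policy; the paper's contribution is the analogous optimization when a failure is perfectly repaired only with some random probability.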

  1. ANALYTIC WORD RECOGNITION WITHOUT SEGMENTATION BASED ON MARKOV RANDOM FIELDS

    NARCIS (Netherlands)

    Coisy, C.; Belaid, A.

    2004-01-01

    In this paper, a method for analytic handwritten word recognition based on causal Markov random fields is described. The word models are HMMs where each state corresponds to a letter; each letter is modelled by an NSHP-HMM (Markov field). Global models are built dynamically, and used for recognition.

  2. Modal Analysis Based on the Random Decrement Transform

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    During the last years several papers utilizing the Random Decrement transform as a basis for extraction of modal parameters from the response of linear systems subjected to unknown ambient loads have been presented. Although the Random Decrement technique was developed in a decade starting from the introduction in 1968, the technique still seems to be attractive. This is probably due to the simplicity and the speed of the algorithm and the fact that the theory of the technique has been extended by introducing statistical measures such as correlation functions or spectral densities. The purpose of this paper is to present a state-of-the-art description of the Random Decrement technique where the statistical theory is outlined and examples are given. But also new results such as estimation of frequency response functions and quality assessment are introduced. Special attention is given...
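
    The core Random Decrement computation lends itself to a compact sketch: average all segments of the measured response that follow a triggering condition, here a positive-slope level crossing. The trigger choice and the data are illustrative assumptions.

```python
def random_decrement(signal, trigger, length):
    """Average the segments following each positive-slope crossing of `trigger`."""
    segments = []
    for i in range(1, len(signal) - length):
        if signal[i - 1] < trigger <= signal[i]:  # triggering condition met at i
            segments.append(signal[i:i + length])
    if not segments:
        return []
    # element-wise average of all captured segments: the RD signature
    return [sum(col) / len(segments) for col in zip(*segments)]
```

    For a linear system under broadband ambient loading, this averaged signature is proportional to the free decay of the system, which is what makes modal parameter extraction from the signature possible.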

  3. Novel pseudo-random number generator based on quantum random walks

    Science.gov (United States)

    Yang, Yu-Guang; Zhao, Qian-Qian

    2016-02-01

    In this paper, we investigate the potential application of quantum computation for constructing pseudo-random number generators (PRNGs) and further construct a novel PRNG based on quantum random walks (QRWs), a famous quantum computation model. The PRNG merely relies on the equations used in the QRWs, and thus the generation algorithm is simple and the computation speed is fast. The proposed PRNG was subjected to statistical tests such as NIST and passed them successfully. Compared with the representative PRNG based on quantum chaotic maps (QCM), the present QRWs-based PRNG has some advantages such as better statistical complexity and recurrence. For example, the normalized Shannon entropy and the statistical complexity of the QRWs-based PRNG are 0.999699456771172 and 1.799961178212329e-04 respectively given the number of 8-bit words, say, 16 Mbits. By contrast, the corresponding values of the QCM-based PRNG are 0.999448131481064 and 3.701210794388818e-04 respectively. Thus the statistical complexity and the normalized entropy of the QRWs-based PRNG are closer to 0 and 1 respectively than those of the QCM-based PRNG when the number of words of the analyzed sequence increases. It provides a new clue to construct PRNGs and also extends the applications of quantum computation.
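
    The mechanics can be sketched with a one-dimensional Hadamard walk whose position distribution seeds the bit stream; the bit-extraction rule below (keeping a low-order digit of each position probability) is an illustrative assumption, not the authors' exact construction.

```python
import math

def hadamard_walk(steps):
    """Discrete-time quantum walk on a line with a Hadamard coin."""
    # amp[pos] = (up, down) coin amplitudes at lattice site pos
    amp = {0: (1 / math.sqrt(2), 1j / math.sqrt(2))}  # balanced initial coin
    h = 1 / math.sqrt(2)
    for _ in range(steps):
        new = {}
        for pos, (u, d) in amp.items():
            cu, cd = h * (u + d), h * (u - d)  # Hadamard coin toss
            a = new.get(pos + 1, (0, 0))       # "up" component shifts right
            new[pos + 1] = (a[0] + cu, a[1])
            b = new.get(pos - 1, (0, 0))       # "down" component shifts left
            new[pos - 1] = (b[0], b[1] + cd)
        amp = new
    return {p: abs(u) ** 2 + abs(d) ** 2 for p, (u, d) in amp.items()}

def bits_from_walk(steps, n_bits):
    """Derive pseudo-random bits from low-order digits of the probabilities."""
    probs = hadamard_walk(steps)
    out = []
    for p in sorted(probs):
        out.append(int(probs[p] * 10 ** 8) & 1)
        if len(out) == n_bits:
            break
    return out
```

    Because the walk evolves by simple linear amplitude updates, generation is cheap, which is the property the abstract emphasizes.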

  5. Text Clustering Algorithm Based on Random Cluster Core

    Directory of Open Access Journals (Sweden)

    Huang Long-Jun

    2016-01-01

    Full Text Available Nowadays clustering has become a popular text mining algorithm, but the huge data can put forward higher requirements for the accuracy and performance of text mining. In view of the performance bottleneck of traditional text clustering algorithm, this paper proposes a text clustering algorithm with random features. This is a kind of clustering algorithm based on text density, at the same time using the neighboring heuristic rules, the concept of random cluster is introduced, which effectively reduces the complexity of the distance calculation.

  6. Optical image encryption based on interference under convergent random illumination

    International Nuclear Information System (INIS)

    Kumar, Pramod; Joseph, Joby; Singh, Kehar

    2010-01-01

    In an optical image encryption system based on the interference principle, two pure phase masks are designed analytically to hide an image. These two masks are illuminated with a plane wavefront to retrieve the original image in the form of an interference pattern at the decryption plane. Replacement of the plane wavefront with convergent random illumination in the proposed scheme leads to an improvement in the security of interference based encryption. The proposed encryption scheme retains the simplicity of an interference based method, as the two pure masks are generated with an analytical method without any iterative algorithm. In addition to the free-space propagation distance and the two pure phase masks, the convergence distance and the randomized lens phase function are two new encryption parameters to enhance the system security. The robustness of this scheme against occlusion of the random phase mask of the randomized lens phase function is investigated. The feasibility of the proposed scheme is demonstrated with numerical simulation results

  7. A random spatial network model based on elementary postulates

    Science.gov (United States)

    Karlinger, Michael R.; Troutman, Brent M.

    1989-01-01

    A model for generating random spatial networks that is based on elementary postulates comparable to those of the random topology model is proposed. In contrast to the random topology model, this model ascribes a unique spatial specification to generated drainage networks, a distinguishing property of some network growth models. The simplicity of the postulates creates an opportunity for potential analytic investigations of the probabilistic structure of the drainage networks, while the spatial specification enables analyses of spatially dependent network properties. In the random topology model all drainage networks, conditioned on magnitude (number of first-order streams), are equally likely, whereas in this model all spanning trees of a grid, conditioned on area and drainage density, are equally likely. As a result, link lengths in the generated networks are not independent, as usually assumed in the random topology model. For a preliminary model evaluation, scale-dependent network characteristics, such as geometric diameter and link length properties, and topologic characteristics, such as bifurcation ratio, are computed for sets of drainage networks generated on square and rectangular grids. Statistics of the bifurcation and length ratios fall within the range of values reported for natural drainage networks, but geometric diameters tend to be relatively longer than those for natural networks.
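
    The "all spanning trees of a grid equally likely" property can be realized with the Aldous-Broder random-walk construction, sketched below for a small grid; the grid size and the walk details are illustrative and not the authors' model.

```python
import random

def uniform_spanning_tree(rows, cols, seed=0):
    """Uniformly random spanning tree of a rows x cols grid (Aldous-Broder)."""
    rng = random.Random(seed)

    def neighbors(v):
        r, c = v
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= r + dr < rows and 0 <= c + dc < cols:
                yield (r + dr, c + dc)

    current = (0, 0)
    visited = {current}
    edges = set()
    while len(visited) < rows * cols:
        nxt = rng.choice(list(neighbors(current)))
        if nxt not in visited:  # keep only the edge of first entrance
            visited.add(nxt)
            edges.add(frozenset((current, nxt)))
        current = nxt
    return edges
```

    Every spanning tree of the grid is produced with equal probability, so link lengths in the resulting networks are dependent, as the abstract points out.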

  8. Mindfulness-based stress reduction for residents: A randomized controlled trial

    NARCIS (Netherlands)

    Verweij, H.; Ravesteijn, H.J. van; Hooff, M.L.M. van; Lagro-Janssen, A.L.M.; Speckens, A.E.M.

    2018-01-01

    Background: Burnout is highly prevalent in residents. No randomized controlled trials have been conducted measuring the effects of Mindfulness-Based Stress Reduction (MBSR) on burnout in residents. Objective: To determine the effectiveness of MBSR in reducing burnout in residents. Design: A

  9. Random Valued Impulse Noise Removal Using Region Based Detection Approach

    Directory of Open Access Journals (Sweden)

    S. Banerjee

    2017-12-01

    Full Text Available Removal of random-valued impulse noise is extremely challenging when the noise density is above 50%. Existing filters are generally not capable of eliminating such noise when the density is above 70%. In this paper a region-wise, density-based detection algorithm for random-valued impulse noise has been proposed. On the basis of their intensity values, the pixels of a particular window are sorted and then assigned to four regions. The highest-density region is considered for stepwise detection of noisy pixels. As a result of this detection scheme a maximum of 75% of noisy pixels can be detected. For this purpose this paper proposes a unique noise removal algorithm. It was experimentally shown that the proposed algorithm not only performs exceptionally well under visual qualitative judgment of standard images but also outperforms existing algorithms in terms of MSE, PSNR and SSIM, even up to the 70% noise density level.

  10. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Dhruba Das

    2015-04-01

    Full Text Available In this article, based on Zadeh's extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, following Baruah's Randomness-Fuzziness Consistency Principle. The Randomness-Fuzziness Consistency Principle leads to defining a normal law of fuzziness using two different laws of randomness. In this article, two fuzzy queues, FM/M/1 and M/FM/1, have been studied and the membership functions of their system characteristics constructed based on the aforesaid principle. The former represents a queue with fuzzy exponential arrivals and an exponential service rate, while the latter represents a queue with an exponential arrival rate and a fuzzy exponential service rate.

  11. Effects of psychological therapies in randomized trials and practice-based studies.

    Science.gov (United States)

    Barkham, Michael; Stiles, William B; Connell, Janice; Twigg, Elspeth; Leach, Chris; Lucock, Mike; Mellor-Clark, John; Bower, Peter; King, Michael; Shapiro, David A; Hardy, Gillian E; Greenberg, Leslie; Angus, Lynne

    2008-11-01

    Randomized trials of the effects of psychological therapies seek internal validity via homogeneous samples and standardized treatment protocols. In contrast, practice-based studies aim for clinical realism and external validity via heterogeneous samples of clients treated under routine practice conditions. We compared indices of treatment effects in these two types of studies. Using published transformation formulas, the Beck Depression Inventory (BDI) scores from five randomized trials of depression (N = 477 clients) were transformed into Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM) scores and compared with CORE-OM data collected in four practice-based studies (N = 4,196 clients). Conversely, the practice-based studies' CORE-OM scores were transformed into BDI scores and compared with randomized trial data. Randomized trials showed a modest advantage over practice-based studies in amount of pre-post improvement. This difference was compressed or exaggerated depending on the direction of the transformation but averaged about 12%. There was a similarly sized advantage to randomized trials in rates of reliable and clinically significant improvement (RCSI). The largest difference was yielded by comparisons of effect sizes which suggested an advantage more than twice as large, reflecting narrower pre-treatment distributions in the randomized trials. Outcomes of completed treatments for depression in randomized trials appeared to be modestly greater than those in routine care settings. The size of the difference may be distorted depending on the method for calculating degree of change. Transforming BDI scores into CORE-OM scores and vice versa may be a preferable alternative to effect sizes for comparisons of studies using these measures.
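
    One index compared above, the standardized pre-post effect size, divides mean change by the pre-treatment standard deviation, so the narrower pre-treatment distributions in randomized trials inflate effect sizes even for identical raw change. A minimal sketch; the scores below are made up for illustration:

```python
import statistics

def prepost_effect_size(pre, post):
    """Mean improvement divided by the pre-treatment SD (lower scores = better)."""
    change = statistics.mean(pre) - statistics.mean(post)
    return change / statistics.stdev(pre)
```

    Halving the pre-treatment SD doubles the effect size for the same raw change, which is exactly the distortion the authors flag when comparing trial and practice-based samples.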

  12. A probability measure for random surfaces of arbitrary genus and bosonic strings in 4 dimensions

    International Nuclear Information System (INIS)

    Albeverio, S.; Høegh-Krohn, R.; Paycha, S.; Scarlatti, S.

    1989-01-01

    We define a probability measure describing random surfaces in R^D, 3 ≤ D ≤ 13, parametrized by compact Riemann surfaces of arbitrary genus. The measure involves the path space measure for scalar fields with exponential interaction in 2 space-time dimensions. We show that it gives a mathematical realization of Polyakov's heuristic measure for bosonic strings. (orig.)

  13. Prediction of Geological Subsurfaces Based on Gaussian Random Field Models

    Energy Technology Data Exchange (ETDEWEB)

    Abrahamsen, Petter

    1997-12-31

    During the sixties, random functions became practical tools for predicting ore reserves with associated precision measures in the mining industry. This was the start of the geostatistical methods called kriging. These methods are used, for example, in petroleum exploration. This thesis reviews the possibilities for using Gaussian random functions in modelling of geological subsurfaces. It develops methods for including many sources of information and observations for precise prediction of the depth of geological subsurfaces. The simple properties of Gaussian distributions make it possible to calculate optimal predictors in the mean square sense. This is done in a discussion of kriging predictors. These predictors are then extended to deal with several subsurfaces simultaneously. It is shown how additional velocity observations can be used to improve predictions. The use of gradient data and even higher order derivatives are also considered and gradient data are used in an example. 130 refs., 44 figs., 12 tabs.

  14. Some effects of random dose measurement errors on analysis of atomic bomb survivor data

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1985-01-01

    The effects of random dose measurement errors on analyses of atomic bomb survivor data are described and quantified for several procedures. It is found that the ways in which measurement error is most likely to mislead are through downward bias in the estimated regression coefficients and through distortion of the shape of the dose-response curve. The magnitude of the bias with simple linear regression is evaluated for several dose treatments including the use of grouped and ungrouped data, analyses with and without truncation at 600 rad, and analyses which exclude doses exceeding 200 rad. Limited calculations have also been made for maximum likelihood estimation based on Poisson regression. 16 refs., 6 tabs
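
    The downward bias described above is classical regression attenuation: with independent measurement error of variance s_e² added to a true dose of variance s_x², the fitted slope shrinks by the factor s_x² / (s_x² + s_e²). A small simulation illustrates this; all parameters are assumptions for illustration, not survivor-study values.

```python
import random

def fitted_slope(true_slope=2.0, n=20000, err_sd=1.0, seed=1):
    """OLS slope of outcome on an error-contaminated dose; expect attenuation."""
    rng = random.Random(seed)
    obs, ys = [], []
    for _ in range(n):
        x = rng.gauss(0, 1)                   # true dose
        ys.append(true_slope * x + rng.gauss(0, 0.1))
        obs.append(x + rng.gauss(0, err_sd))  # dose measured with error
    mo = sum(obs) / n
    my = sum(ys) / n
    cov = sum((o - mo) * (y - my) for o, y in zip(obs, ys)) / n
    var = sum((o - mo) ** 2 for o in obs) / n
    return cov / var  # about true_slope / (1 + err_sd**2) here
```

    With the error SD equal to the dose SD, half of the regression coefficient is lost, which is the kind of downward bias the paper quantifies.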

  15. Bayesian randomized item response modeling for sensitive measurements

    NARCIS (Netherlands)

    Avetisyan, Marianna

    2012-01-01

    In behavioral, health, and social sciences, any endeavor involving measurement is directed at accurate representation of the latent concept with the manifest observation. However, when sensitive topics, such as substance abuse, tax evasion, or felony, are inquired, substantial distortion of reported

  16. Towards the generation of random bits at terahertz rates based on a chaotic semiconductor laser

    International Nuclear Information System (INIS)

    Kanter, Ido; Aviad, Yaara; Reidler, Igor; Cohen, Elad; Rosenbluh, Michael

    2010-01-01

    Random bit generators (RBGs) are important in many aspects of statistical physics and crucial in Monte-Carlo simulations, stochastic modeling and quantum cryptography. The quality of a RBG is measured by the unpredictability of the bit string it produces and the speed at which the truly random bits can be generated. Deterministic algorithms generate pseudo-random numbers at high data rates as they are only limited by electronic hardware speed, but their unpredictability is limited by the very nature of their deterministic origin. It is widely accepted that the core of any true RBG must be an intrinsically non-deterministic physical process, e.g. measuring thermal noise from a resistor. Owing to low signal levels, such systems are highly susceptible to bias, introduced by amplification, and to small nonrandom external perturbations resulting in a limited generation rate, typically less than 100 Mbit/s. We present a physical random bit generator, based on a chaotic semiconductor laser, having delayed optical feedback, which operates reliably at rates up to 300 Gbit/s. The method uses a high derivative of the digitized chaotic laser intensity and generates the random sequence by retaining a number of the least significant bits of the high derivative value. The method is insensitive to laser operational parameters and eliminates the necessity for all external constraints such as incommensurate sampling rates and laser external cavity round trip time. The randomness of long bit strings is verified by standard statistical tests.
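
    The extraction step described above, a high-order discrete derivative followed by retention of the least significant bits, can be sketched directly; the sample values, derivative order and bit count below are assumptions for illustration.

```python
def nth_derivative(samples, order):
    """Repeated first differences of a sampled (digitized) signal."""
    d = list(samples)
    for _ in range(order):
        d = [b - a for a, b in zip(d, d[1:])]
    return d

def extract_bits(samples, order=4, keep=3):
    """Keep the `keep` least significant bits of each derivative value."""
    out = []
    for v in nth_derivative(samples, order):
        for i in range(keep):
            out.append((v >> i) & 1)
    return out
```

    Differencing amplifies the fast chaotic fluctuations relative to slow deterministic drifts, and the low-order bits of the result are the least predictable, which is the rationale given in the abstract.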

  18. Cellular Automata-Based Parallel Random Number Generators Using FPGAs

    Directory of Open Access Journals (Sweden)

    David H. K. Hoe

    2012-01-01

    Full Text Available Cellular computing represents a new paradigm for implementing high-speed massively parallel machines. Cellular automata (CA, which consist of an array of locally connected processing elements, are a basic form of a cellular-based architecture. The use of field programmable gate arrays (FPGAs for implementing CA accelerators has shown promising results. This paper investigates the design of CA-based pseudo-random number generators (PRNGs using an FPGA platform. To improve the quality of the random numbers that are generated, the basic CA structure is enhanced in two ways. First, the addition of a superrule to each CA cell is considered. The resulting self-programmable CA (SPCA uses the superrule to determine when to make a dynamic rule change in each CA cell. The superrule takes its inputs from neighboring cells and can be considered itself a second CA working in parallel with the main CA. When implemented on an FPGA, the use of lookup tables in each logic cell removes any restrictions on how the super-rules should be defined. Second, a hybrid configuration is formed by combining a CA with a linear feedback shift register (LFSR. This is advantageous for FPGA designs due to the compactness of the LFSR implementations. A standard software package for statistically evaluating the quality of random number sequences known as Diehard is used to validate the results. Both the SPCA and the hybrid CA/LFSR were found to pass all the Diehard tests.
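
    The hybrid idea, a cellular automaton combined with an LFSR, can be sketched as follows; the rule-30 CA, the 16-bit maximal-length taps and the XOR combining rule are illustrative assumptions rather than the paper's FPGA design.

```python
def rule30_step(cells):
    """One step of the rule-30 elementary CA on a circular array of bits."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def lfsr_step(state, taps=(16, 14, 13, 11), width=16):
    """One step of a Fibonacci LFSR with maximal-length taps."""
    fb = 0
    for t in taps:
        fb ^= (state >> (t - 1)) & 1
    return ((state << 1) | fb) & ((1 << width) - 1)

def hybrid_bits(n, lfsr_seed=0xACE1):
    """XOR a CA cell stream with the LFSR output stream."""
    cells = [0] * 16
    cells[8] = 1  # single live cell to start the CA
    state, out = lfsr_seed, []
    for _ in range(n):
        cells = rule30_step(cells)
        state = lfsr_step(state)
        out.append(cells[8] ^ (state & 1))
    return out
```

    The LFSR is compact on FPGAs while the CA adds nonlinearity; combining the two streams is the motivation for the hybrid configuration described in the abstract.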

  19. Research on machine learning framework based on random forest algorithm

    Science.gov (United States)

    Ren, Qiong; Cheng, Hui; Han, Hai

    2017-03-01

    With the continuous development of machine learning, industry and academia have released many machine learning frameworks based on distributed computing platforms, and these have been widely used. However, existing machine learning frameworks are constrained by the limitations of the machine learning algorithms themselves, such as the choice of parameters, interference from noise, a high threshold for use, and so on. This paper introduces the research background of machine learning frameworks and, in combination with the random forest algorithm commonly used for classification in machine learning, puts forward the research objectives and content, proposes an improved adaptive random forest algorithm (referred to as ARF), and on the basis of ARF designs and implements a machine learning framework.

  20. Chaos-based Pseudo-random Number Generation

    KAUST Repository

    Barakat, Mohamed L.; Mansingka, Abhinav S.; Radwan, Ahmed Gomaa Ahmed; Salama, Khaled N.

    2014-04-10

    Various methods and systems related to chaos-based pseudo-random number generation are presented. In one example, among others, a system includes a pseudo-random number generator (PRNG) to generate a series of digital outputs and a nonlinear post processing circuit to perform an exclusive OR (XOR) operation on a first portion of a current digital output of the PRNG and a permutated version of a corresponding first portion of a previous post processed output to generate a corresponding first portion of a current post processed output. In another example, a method includes receiving at least a first portion of a current output from a PRNG and performing an XOR operation on the first portion of the current PRNG output with a permutated version of a corresponding first portion of a previous post processed output to generate a corresponding first portion of a current post processed output.
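
    The described post-processing can be sketched as follows; the left rotation standing in for the bit permutation, and the 8-bit word width, are illustrative assumptions.

```python
def rotate_left(x, r, width=8):
    """Rotate an unsigned `width`-bit word left by r bits (a bit permutation)."""
    return ((x << r) | (x >> (width - r))) & ((1 << width) - 1)

def post_process(raw_words, width=8, rot=3):
    """XOR each raw PRNG word with a permuted copy of the previous output word."""
    prev, out = 0, []
    for w in raw_words:
        v = w ^ rotate_left(prev, rot, width)
        out.append(v)
        prev = v
    return out
```

    Feeding each output back into the next XOR couples successive words, so short-term correlations in the raw PRNG stream are diffused across the post-processed sequence.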

  2. Random walk hierarchy measure: What is more hierarchical, a chain, a tree or a star?

    Science.gov (United States)

    Czégel, Dániel; Palla, Gergely

    2015-01-01

    Signs of hierarchy are prevalent in a wide range of systems in nature and society. One of the key problems is quantifying the importance of hierarchical organisation in the structure of the network representing the interactions or connections between the fundamental units of the studied system. Although a number of notable methods are already available, their vast majority is treating all directed acyclic graphs as already maximally hierarchical. Here we propose a hierarchy measure based on random walks on the network. The novelty of our approach is that directed trees corresponding to multi level pyramidal structures obtain higher hierarchy scores compared to directed chains and directed stars. Furthermore, in the thermodynamic limit the hierarchy measure of regular trees is converging to a well defined limit depending only on the branching number. When applied to real networks, our method is computationally very effective, as the result can be evaluated with arbitrary precision by subsequent multiplications of the transition matrix describing the random walk process. In addition, the tests on real world networks provided very intuitive results, e.g., the trophic levels obtained from our approach on a food web were highly consistent with former results from ecology. PMID:26657012
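
    The iterative evaluation the abstract mentions, repeated multiplication by the walk's transition matrix, can be sketched on a toy graph; the graph, step count and uniform start are assumptions for illustration, not the authors' hierarchy score itself.

```python
def walk_distribution(adj, steps=50):
    """Evolve a uniform start distribution under a simple random walk."""
    n = len(adj)
    T = []
    for row in adj:
        deg = sum(row)
        # row-stochastic transition matrix; isolated nodes jump uniformly
        T.append([v / deg if deg else 1 / n for v in row])
    p = [1 / n] * n
    for _ in range(steps):
        p = [sum(p[i] * T[i][j] for i in range(n)) for j in range(n)]
    return p
```

    Each additional matrix multiplication refines the distribution estimate, which is why such a measure can be evaluated to arbitrary precision at modest cost.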

  4. Two Notes on Measure-Theoretic Entropy of Random Dynamical Systems

    Institute of Scientific and Technical Information of China (English)

    YuJun ZHU

    2009-01-01

    In this paper, the Brin-Katok local entropy formula and Katok's definition of the measure-theoretic entropy using spanning sets are established for the random dynamical system over an invertible ergodic system.

  5. DNA based random key generation and management for OTP encryption.

    Science.gov (United States)

    Zhang, Yunpeng; Liu, Xin; Sun, Manhui

    2017-09-01

    One-time pad (OTP) is a principle of key generation applied to the stream ciphering method which offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specifically requires absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capabilities of DNA chromosomes can be used as one-time-pad structures with pseudo-random number generation and indexing in order to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, by using restriction enzymes known only to sender and receiver to combine the secure key represented by a DNA sequence with the T vector, we generate the DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio-experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
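
    The OTP principle underlying the scheme, a key at least as long as the message combined by XOR, together with a 2-bit encoding of DNA bases for the key stream, can be sketched as follows; the A/C/G/T encoding and the key sequence are illustrative assumptions, not the paper's molecular protocol.

```python
DNA = {'A': 0b00, 'C': 0b01, 'G': 0b10, 'T': 0b11}

def dna_key_to_bytes(seq):
    """Pack a DNA base sequence into key bytes, four bases per byte."""
    out, acc, nbits = bytearray(), 0, 0
    for base in seq:
        acc = (acc << 2) | DNA[base]
        nbits += 2
        if nbits == 8:
            out.append(acc)
            acc, nbits = 0, 0
    return bytes(out)

def otp(data, key):
    """One-time-pad combine: XOR each byte with the corresponding key byte."""
    assert len(key) >= len(data), "OTP key must cover the whole message"
    return bytes(d ^ k for d, k in zip(data, key))
```

    Decryption is the same XOR; security rests entirely on the key being truly random, secret and never reused, which is why key generation and transport dominate the paper's design.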

  6. The invariant measure of random walks in the quarter-plane: representation in geometric terms

    NARCIS (Netherlands)

    Chen, Y.; Boucherie, Richardus J.; Goseling, Jasper

    We consider the invariant measure of homogeneous random walks in the quarter-plane. In particular, we consider measures that can be expressed as a finite linear combination of geometric terms and present conditions on the structure of these linear combinations such that the resulting measure may

  7. An AES chip with DPA resistance using hardware-based random order execution

    International Nuclear Information System (INIS)

    Yu Bo; Li Xiangyu; Chen Cong; Sun Yihe; Wu Liji; Zhang Xiangmin

    2012-01-01

    This paper presents an AES (advanced encryption standard) chip that combats differential power analysis (DPA) side-channel attacks through hardware-based random order execution. Both the decryption and encryption procedures of AES are implemented on the chip. A fine-grained dataflow architecture is proposed, which dynamically exploits intrinsic byte-level independence in the algorithm. A novel circuit called an HMF (Hold-Match-Fetch) unit is proposed for random control, which randomly sets execution orders for concurrent operations. The AES chip was manufactured in SMIC 0.18 μm technology. The average energy for encrypting one group of plain texts (128-bit secret keys) is 19 nJ. The core area is 0.43 mm². A sophisticated experimental setup was built to test the DPA resistance. Measurement-based experimental results show that not even one byte of the secret key could be disclosed from our chip in random mode after 64000 power traces were used in the DPA attack. Compared with the corresponding fixed order execution, hardware-based random order execution improves DPA resistance by at least 21 times. (semiconductor integrated circuits)

  8. An AES chip with DPA resistance using hardware-based random order execution

    Science.gov (United States)

    Bo, Yu; Xiangyu, Li; Cong, Chen; Yihe, Sun; Liji, Wu; Xiangmin, Zhang

    2012-06-01

    This paper presents an AES (advanced encryption standard) chip that combats differential power analysis (DPA) side-channel attacks through hardware-based random order execution. Both the decryption and encryption procedures of AES are implemented on the chip. A fine-grained dataflow architecture is proposed, which dynamically exploits intrinsic byte-level independence in the algorithm. A novel circuit called an HMF (Hold-Match-Fetch) unit is proposed for random control, which randomly sets execution orders for concurrent operations. The AES chip was manufactured in SMIC 0.18 μm technology. The average energy for encrypting one group of plain texts (128-bit secret keys) is 19 nJ. The core area is 0.43 mm². A sophisticated experimental setup was built to test the DPA resistance. Measurement-based experimental results show that not even one byte of the secret key could be disclosed from our chip in random mode after 64000 power traces were used in the DPA attack. Compared with the corresponding fixed order execution, hardware-based random order execution improves DPA resistance by at least 21 times.

  9. Random fiber lasers based on artificially controlled backscattering fibers

    Science.gov (United States)

    Chen, Daru; Wang, Xiaoliang; She, Lijuan; Qiang, Zexuan; Yu, Zhangwei

    2017-10-01

    The random fiber laser (RFL), which is a milestone in laser physics and nonlinear optics, has attracted considerable attention recently. Most previous RFLs are based on distributed feedback of Rayleigh scattering amplified through the stimulated Raman/Brillouin scattering effect in single-mode fibers, which requires long (tens of kilometers) single-mode fibers and high thresholds, up to watt level, due to the extremely small Rayleigh scattering coefficient of the fiber. We proposed and demonstrated a half-open-cavity RFL based on a segment of an artificially controlled backscattering SMF (ACB-SMF) with a length of 210 m, 310 m, or 390 m. A fiber Bragg grating with a central wavelength of 1530 nm and a segment of ACB-SMF form the half-open cavity. The proposed RFL achieves thresholds of 25 mW, 30 mW, and 30 mW, respectively. Random lasing at a wavelength of 1530 nm with an extinction ratio of 50 dB is achieved when a segment of 5 m EDF is pumped by a 980 nm LD in the RFL. Another half-open-cavity RFL, based on a segment of an artificially controlled backscattering EDF (ACB-EDF), is also demonstrated without an ACB-SMF. The 3 m ACB-EDF is fabricated using a femtosecond laser with a pulse energy of 0.34 mJ, which introduces about 50 reflectors into the EDF. Random lasing at a wavelength of 1530 nm is achieved with an output power of 7.5 mW and an efficiency of 1.88%. Two novel RFLs with much shorter cavities have thus been achieved with low threshold and high efficiency.

  10. A random network based, node attraction facilitated network evolution method

    Directory of Open Access Journals (Sweden)

    WenJun Zhang

    2016-03-01

    Full Text Available In the present study, I present a method of network evolution that is based on a random network and facilitated by node attraction. In this method, I assume that the initial network is a random network or a given initial network. When a node is ready to connect, it tends to link to the node already owning the most connections, which coincides with the general rule of node connection (Barabasi and Albert, 1999). In addition, a node may randomly disconnect a connection, i.e., the addition of connections in the network is accompanied by the pruning of some connections. The dynamics of network evolution is determined by the attraction factor Lambda of nodes, the probability of node connection, the probability of node disconnection, and the expected initial connectance. The attraction factor, the probability of node connection, and the probability of node disconnection vary with time and node. Various dynamics can be achieved by adjusting these parameters. Effects of simplified parameters on network evolution are analyzed. Changes in the attraction factor Lambda can reflect various effects of node degree on the connection mechanism. Changing Lambda alone can generate networks ranging from the random to the complex. Therefore, the present algorithm can be treated as a general model for network evolution. Modeling results show that to generate a power-law type of network, the likelihood of a node attracting connections must depend on a higher-order power of the node's degree. Matlab codes for a simplified version of the method are provided.
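
A minimal sketch of the evolution rule described above is given below. For simplicity the attraction exponent (standing in for Lambda) and the connection/disconnection probabilities are held constant, whereas in the paper they may vary with time and node; all parameter values are hypothetical:

```python
import random

def evolve(n_init=5, steps=200, lam=1.0, p_connect=0.9, p_disconnect=0.1, seed=1):
    # Toy version of the described evolution: start from a small random
    # network, let each new node attach preferentially with weight
    # degree**lam (the attraction factor), and occasionally prune a
    # random edge. Holding the parameters constant is a simplification.
    rng = random.Random(seed)
    nodes = list(range(n_init))
    edges = set()
    for u in nodes:                       # sparse random initial network
        v = rng.randrange(n_init)
        if v != u:
            edges.add((min(u, v), max(u, v)))
    for _ in range(steps):
        new = len(nodes)
        nodes.append(new)
        if rng.random() < p_connect:      # attraction-weighted attachment
            deg = {u: 1e-9 for u in nodes}
            for a, b in edges:
                deg[a] += 1.0
                deg[b] += 1.0
            weights = [deg[u] ** lam for u in nodes[:-1]]
            target = rng.choices(nodes[:-1], weights=weights)[0]
            edges.add((target, new))
        if edges and rng.random() < p_disconnect:   # random pruning
            edges.discard(rng.choice(sorted(edges)))
    return nodes, edges

nodes, edges = evolve()
```

Raising `lam` strengthens the rich-get-richer effect, which is the mechanism behind the abstract's observation that a higher-order power of the degree is needed for power-law networks.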

  11. Random number generation based on digital differential chaos

    KAUST Repository

    Zidan, Mohammed A.

    2012-07-29

    In this paper, we present a fully digital differential-chaos-based random number generator. The output of the digital circuit is proven to be chaotic by calculating the maximum Lyapunov exponent of the output time series. We introduce a new post-processing technique to improve the distribution and statistical properties of the generated data. The post-processed output passes the NIST SP 800-22 statistical tests. The system is written in Verilog HDL and realized on a Xilinx Virtex® FPGA. The generator fits into a very small area and has a maximum throughput of 2.1 Gb/s.

  12. Random-Profiles-Based 3D Face Recognition System

    Directory of Open Access Journals (Sweden)

    Joongrock Kim

    2014-03-01

    Full Text Available In this paper, a novel nonintrusive three-dimensional (3D) face modeling system for random-profile-based 3D face recognition is presented. Although recent two-dimensional (2D) face recognition systems can achieve a reliable recognition rate under certain conditions, their performance is limited by internal and external changes, such as illumination and pose variation. To address these issues, 3D face recognition, which uses 3D face data, has recently received much attention. However, the performance of 3D face recognition highly depends on the precision of the acquired 3D face data, and it requires more computational power and storage capacity than 2D face recognition systems. In this paper, we present a nonintrusive 3D face modeling system composed of a stereo vision system and an invisible near-infrared line laser, which can be directly applied to profile-based 3D face recognition. We further propose a novel random-profile-based 3D face recognition method that is memory-efficient and pose-invariant. The experimental results demonstrate that the reconstructed 3D face data consist of more than 50k 3D points and that the method achieves a reliable recognition rate under pose variation.

  13. Dipole location using SQUID based measurements: Application to magnetocardiography

    Science.gov (United States)

    Mariyappa, N.; Parasakthi, C.; Sengottuvel, S.; Gireesan, K.; Patel, Rajesh; Janawadkar, M. P.; Sundar, C. S.; Radhakrishnan, T. S.

    2012-07-01

    We report a method of inferring the dipole location using iterative nonlinear least-squares optimization based on the Levenberg-Marquardt algorithm, wherein we use different sets of pseudo-random numbers as initial parameter values. The method has been applied to (i) simulated data representing the calculated magnetic field distribution produced by a point dipole placed at a known position, (ii) experimental data from SQUID-based measurements of the magnetic field distribution produced by a source coil carrying current, and (iii) actual experimentally measured magnetocardiograms of human subjects using a SQUID-based system.
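
The strategy of running Levenberg-Marquardt from several pseudo-random starting points can be sketched as follows. A simple point-source field model stands in for the actual dipole field, so the model, sensor grid, and parameters are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def field(p, xs, ys, h=0.5):
    # Hypothetical point-source model standing in for the dipole field:
    # the signal falls off with squared distance to the source at (x0, y0).
    x0, y0 = p
    return 1.0 / ((xs - x0) ** 2 + (ys - y0) ** 2 + h ** 2)

def jacobian(p, xs, ys, h=0.5):
    x0, y0 = p
    d = (xs - x0) ** 2 + (ys - y0) ** 2 + h ** 2
    return np.stack([2.0 * (xs - x0) / d ** 2, 2.0 * (ys - y0) / d ** 2], axis=1)

def lm_fit(data, xs, ys, p0, iters=150):
    # Plain Levenberg-Marquardt on the two source coordinates.
    p, lam = np.asarray(p0, dtype=float), 1e-3
    cost = np.sum((field(p, xs, ys) - data) ** 2)
    for _ in range(iters):
        r = field(p, xs, ys) - data
        J = jacobian(p, xs, ys)
        step = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
        p_new = p + step
        c_new = np.sum((field(p_new, xs, ys) - data) ** 2)
        if c_new < cost:                      # accept step, relax damping
            p, cost, lam = p_new, c_new, lam * 0.5
        else:                                 # reject step, increase damping
            lam *= 10.0
    return p, cost

rng = np.random.default_rng(0)
gx, gy = np.meshgrid(np.linspace(-2, 2, 8), np.linspace(-2, 2, 8))
xs, ys = gx.ravel(), gy.ravel()               # sensor grid
data = field((0.3, -0.7), xs, ys) + 1e-3 * rng.normal(size=xs.size)
# Different sets of pseudo-random initial values; keep the best fit.
fits = [lm_fit(data, xs, ys, rng.uniform(-2, 2, size=2)) for _ in range(8)]
best = min(fits, key=lambda pc: pc[1])[0]
```

Multiple pseudo-random restarts guard against convergence to a poor local minimum, which is the role the random initial parameter sets play in the reported method.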

  14. Status of radiation-based measurement technology

    International Nuclear Information System (INIS)

    Moon, B. S.; Lee, J. W.; Chung, C. E.; Hong, S. B.; Kim, J. T.; Park, W. M.; Kim, J. Y.

    1999-03-01

    This report describes the status of measurement equipment using radiation sources and new technologies in this field. It covers the development status in Korea together with a brief description of the technology development and application status in ten countries including France, the United States, and Japan. The report also describes technical factors related to radiation-based measurement and trends in new technologies. Measurement principles are described for the equipment most widely used among radiation-based measurements, such as level measurement, density measurement, basis weight measurement, moisture measurement, and thickness measurement. (author). 7 refs., 2 tabs., 21 figs

  15. SQUID-based measuring systems

    Indian Academy of Sciences (India)

    field produced by a given two-dimensional current density distribution is inverted using the Fourier transform technique. Keywords ... Superconducting quantum interference devices (SQUIDs) are the most sensitive detectors for measurement of ... omagnetic prospecting, detection of gravity waves etc. Judging the importance ...

  16. European wet deposition maps based on measurements

    NARCIS (Netherlands)

    Leeuwen EP van; Erisman JW; Draaijers GPJ; Potma CJM; Pul WAJ van; LLO

    1995-01-01

    To date, wet deposition maps on a European scale have been based on long-range transport model results. For most components wet deposition maps based on measurements are only available on national scales. Wet deposition maps of acidifying components and base cations based on measurements are needed

  17. Order-based representation in random networks of cortical neurons.

    Directory of Open Access Journals (Sweden)

    Goded Shahaf

    2008-11-01

    Full Text Available The wide range of time scales involved in neural excitability and synaptic transmission might lead to ongoing change in the temporal structure of responses to recurring stimulus presentations on a trial-to-trial basis. This is probably the most severe biophysical constraint on putative time-based primitives of stimulus representation in neuronal networks. Here we show that in spontaneously developing large-scale random networks of cortical neurons in vitro the order in which neurons are recruited following each stimulus is a naturally emerging representation primitive that is invariant to significant temporal changes in spike times. With a relatively small number of randomly sampled neurons, the information about stimulus position is fully retrievable from the recruitment order. The effective connectivity that makes order-based representation invariant to time warping is characterized by the existence of stations through which activity is required to pass in order to propagate further into the network. This study uncovers a simple invariant in a noisy biological network in vitro; its applicability under in vivo constraints remains to be seen.

  18. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main drawback of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple scattering steps into a single-step process through random table queries, thus greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast counterpart of the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations in this work were conducted in a homogeneous medium. We also present a reconstruction approach to estimate the position of a fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
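
The idea of replacing repeated per-sample work with a single table query can be illustrated with a generic tabulated-CDF sampler (the actual TBRS tables encode multistep photon scattering, which is not reproduced here):

```python
import bisect
import math
import random

def build_table(pdf, xs):
    # Precompute the (discretized) CDF once; this is the "table".
    dx = xs[1] - xs[0]
    cdf, acc = [], 0.0
    for x in xs:
        acc += pdf(x) * dx
        cdf.append(acc)
    return [c / acc for c in cdf]             # normalize to [0, 1]

def table_sample(xs, cdf, rng):
    # One binary-search table query replaces per-sample analytic
    # inversion or rejection loops.
    return xs[bisect.bisect_left(cdf, rng.random())]

rng = random.Random(42)
xs = [0.01 * i for i in range(1, 1000)]        # grid on (0, 10)
cdf = build_table(lambda x: math.exp(-x), xs)  # Exp(1) step-length pdf
samples = [table_sample(xs, cdf, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)             # should be close to 1
```

The table is built once up front, so each subsequent sample costs only a binary search, mirroring how TBRS trades precomputation for per-photon speed.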

  19. Random fiber laser based on artificially controlled backscattering fibers.

    Science.gov (United States)

    Wang, Xiaoliang; Chen, Daru; Li, Haitao; She, Lijuan; Wu, Qiong

    2018-01-10

    The random fiber laser (RFL), which is a milestone in laser physics and nonlinear optics, has attracted considerable attention recently. Most previously reported RFLs are based on distributed feedback of Rayleigh scattering amplified through the stimulated Raman-Brillouin scattering effect in single-mode fibers, which require long-distance (tens of kilometers) single-mode fibers and high threshold, up to watt level, due to the extremely small Rayleigh scattering coefficient of the fiber. We proposed and demonstrated a half-open-cavity RFL based on a segment of an artificially controlled backscattering single-mode fiber with a length of 210 m, 310 m, or 390 m. A fiber Bragg grating with a central wavelength of 1530 nm and a segment of artificially controlled backscattering single-mode fiber fabricated by using a femtosecond laser form the half-open cavity. The proposed RFL achieves thresholds of 25 mW, 30 mW, and 30 mW, respectively. Random lasing at a wavelength of 1530 nm and extinction ratio of 50 dB is achieved when a segment of 5 m erbium-doped fiber is pumped by a 980 nm laser diode in the RFL. A novel RFL with many short cavities has been achieved with low threshold.

  20. Spectrophotometer-Based Color Measurements

    Science.gov (United States)

    2017-10-24

    equipment. There are several American Society for Testing and Materials (ASTM) chapters covering the use of spectrometers for color measurements (refs. 3...Perkin Elmer software and procedures described in ASTM chapter E308 (ref. 3). All spectral data were stored on the computer. A summary of the color...similarity, or lack thereof, between two colors (ref. 5). In this report, the Euclidean distance metric, E, is used and recommended in ASTM D2244

  1. A Rewritable, Random-Access DNA-Based Storage System.

    Science.gov (United States)

    Yazdi, S M Hossein Tabatabaei; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica

    2015-09-18

    We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile media suitable for both ultrahigh density archival and rewritable storage applications.

  2. The generation of 68 Gbps quantum random number by measuring laser phase fluctuations

    International Nuclear Information System (INIS)

    Nie, You-Qi; Liu, Yang; Zhang, Jun; Pan, Jian-Wei; Huang, Leilei; Payne, Frank

    2015-01-01

    The speed of a quantum random number generator is essential for practical applications, such as high-speed quantum key distribution systems. Here, we push the speed of a quantum random number generator to 68 Gbps by operating a laser around its threshold level. To achieve this rate, not only a high-speed photodetector and a high sampling rate but also a very stable interferometer are required. A practical interferometer with active feedback, instead of common temperature control, is developed to meet the stability requirement. Phase fluctuations of the laser are measured by the interferometer with a photodetector and then digitized to raw random numbers at a rate of 80 Gbps. The min-entropy of the raw data is evaluated by modeling the system and is used to quantify the quantum randomness of the raw data. The bias of the raw data caused by other signals, such as classical and detection noises, can be removed by Toeplitz-matrix hashing randomness extraction. The final random numbers pass the standard randomness tests. Our demonstration shows that high-speed quantum random number generators are ready for practical use.
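
The Toeplitz-matrix hashing step mentioned above can be sketched generically. The min-entropy accounting that determines the output length is omitted; the sizes and seed below are illustrative assumptions:

```python
import numpy as np

def toeplitz_extract(raw_bits, m, seed_bits):
    # Toeplitz-matrix hashing: multiply n raw bits by a random (m x n)
    # binary Toeplitz matrix over GF(2) to distill m nearly uniform
    # bits. The matrix is fully determined by an (n + m - 1)-bit seed:
    # entry T[i, j] = seed[i - j + n - 1].
    n = len(raw_bits)
    assert len(seed_bits) == n + m - 1
    s = np.asarray(seed_bits, dtype=int)
    i = np.arange(m)[:, None]
    j = np.arange(n)[None, :]
    T = s[i - j + n - 1]
    return (T @ np.asarray(raw_bits, dtype=int)) % 2

rng = np.random.default_rng(7)
raw = (rng.random(256) < 0.7).astype(int)     # deliberately biased raw bits
seed = rng.integers(0, 2, size=256 + 64 - 1)  # public random seed
out = toeplitz_extract(raw, 64, seed)         # 64 nearly unbiased bits
```

Each output bit is the parity of many raw bits, so biases in the raw data are exponentially suppressed; in practice the ratio m/n is set from the measured min-entropy of the raw stream.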

  3. Nonparametric indices of dependence between components for inhomogeneous multivariate random measures and marked sets

    OpenAIRE

    van Lieshout, Maria Nicolette Margaretha

    2018-01-01

    We propose new summary statistics to quantify the association between the components in coverage-reweighted moment stationary multivariate random sets and measures. They are defined in terms of the coverage-reweighted cumulant densities and extend classic functional statistics for stationary random closed sets. We study the relations between these statistics and evaluate them explicitly for a range of models. Unbiased estimators are given for all statistics and applied to simulated examples a...

  4. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    Energy Technology Data Exchange (ETDEWEB)

    Chelouche, Doron; Pozo-Nuñez, Francisco [Department of Physics, Faculty of Natural Sciences, University of Haifa, Haifa 3498838 (Israel); Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il [Department of Geosciences, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Tel Aviv 6997801 (Israel)

    2017-08-01

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
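
The core of the von Neumann scheme, computing the mean-square successive difference of the merged, time-ordered light curves and scanning for the lag that minimizes it, can be sketched on synthetic random-walk variability. The data and parameters below are illustrative, and the paper's optimized weighting scheme is omitted:

```python
import numpy as np

def von_neumann_score(t_a, f_a, t_b, f_b, lag):
    # Shift curve B back by the trial lag, merge it with A, time-order
    # the combined series, and return von Neumann's mean-square
    # successive-difference statistic; the true delay minimizes it.
    t = np.concatenate([t_a, t_b - lag])
    f = np.concatenate([f_a, f_b])[np.argsort(t)]
    return np.mean(np.diff(f) ** 2)

rng = np.random.default_rng(3)
base_t = np.linspace(-50.0, 250.0, 600)
base_f = np.cumsum(rng.normal(size=600))       # random-walk variability
t = np.sort(rng.uniform(0.0, 200.0, 300))      # irregular sampling times
true_lag = 12.0
f_a = np.interp(t, base_t, base_f)             # driving light curve
f_b = np.interp(t, base_t + true_lag, base_f)  # delayed echo of the driver
lags = np.arange(0.0, 30.0, 0.5)
scores = [von_neumann_score(t, f_a, t, f_b, lag) for lag in lags]
best_lag = lags[int(np.argmin(scores))]        # should land near 12
```

Note that no interpolation, binning, or stochastic model of the merged series is needed, which is why the approach suits irregularly sampled data.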

  5. Pseudo random number generator based on quantum chaotic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Mobaraki, A.; Lim, S.-C.; Hassan, Z.

    2014-01-01

    For many years dissipative quantum maps have been widely used as informative models of quantum chaos. In this paper, a new scheme for generating good pseudo-random numbers (PRNG), based on the quantum logistic map, is proposed. Note that the PRNG relies merely on the equations used in the quantum chaotic map. The algorithm is not complex, imposes no high requirements on computer hardware, and is therefore fast. To face the challenge of using the proposed PRNG in quantum cryptography and other practical applications, it is subjected to statistical tests using well-known test suites such as NIST, DIEHARD, ENT, and TestU01. The results of the statistical tests are promising, as the proposed PRNG successfully passes all these tests. Moreover, the degree of non-periodicity of the chaotic sequences of the quantum map is investigated through the scale index technique. The obtained result shows that the sequence is highly non-periodic. From these results it can be concluded that the new scheme can generate a high percentage of usable pseudo-random numbers for simulation and other applications in scientific computing.

  6. Motifs in triadic random graphs based on Steiner triple systems

    Science.gov (United States)

    Winkler, Marco; Reichardt, Jörg

    2013-08-01

    Conventionally, pairwise relationships between nodes are considered to be the fundamental building blocks of complex networks. However, over the last decade, the overabundance of certain subnetwork patterns, i.e., the so-called motifs, has attracted much attention. It has been hypothesized that these motifs, instead of links, serve as the building blocks of network structures. Although the relation between a network's topology and the general properties of the system, such as its function, its robustness against perturbations, or its efficiency in spreading information, is the central theme of network science, there is still a lack of sound generative models needed for testing the functional role of subgraph motifs. Our work aims to overcome this limitation. We employ the framework of exponential random graph models (ERGMs) to define models based on triadic substructures. The fact that only a small portion of triads can actually be set independently poses a challenge for the formulation of such models. To overcome this obstacle, we use Steiner triple systems (STSs). These are partitions of sets of nodes into pair-disjoint triads, which thus can be specified independently. Combining the concepts of ERGMs and STSs, we suggest generative models capable of generating ensembles of networks with nontrivial triadic Z-score profiles. Further, we discover inevitable correlations between the abundance of triad patterns, which occur solely for statistical reasons and need to be taken into account when discussing the functional implications of motif statistics. Moreover, we calculate the degree distributions of our triadic random graphs analytically.

  7. Strong Tracking Filter for Nonlinear Systems with Randomly Delayed Measurements and Correlated Noises

    Directory of Open Access Journals (Sweden)

    Hongtao Yang

    2018-01-01

    Full Text Available This paper proposes a novel strong tracking filter (STF) suitable for the filtering problem of nonlinear systems in which the constructed model does not match the actual system, the measurements have a one-step random delay, and the process and measurement noises are correlated at the same epoch. Firstly, a framework of decoupling filter (DF) based on equivalent model transformation is derived. Further, according to the framework of the DF, a new extended Kalman filtering (EKF) algorithm using first-order linearization approximation is developed. Secondly, the computation of the suboptimal fading factor is derived on the basis of the extended orthogonality principle (EOP). Thirdly, the ultimate form of the proposed STF is obtained by introducing the suboptimal fading factor into the above EKF algorithm. The proposed STF automatically tunes the suboptimal fading factor on the basis of the residuals between available and predicted measurements, and the gain matrices of the proposed STF are tuned online to improve the filtering performance. Finally, the effectiveness of the proposed STF is demonstrated through numerical simulation experiments.

  8. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Full Text Available Abstract Background Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
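
The selection-bias mechanism described above, that a noise predictor with many categories offers more candidate splits and therefore a larger best split gain by chance, can be demonstrated directly without growing any forest (a toy Gini-gain computation, not the paper's simulation design):

```python
import random

def gini(labels):
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split_gain(x_cats, y):
    # Best one-category-vs-rest Gini gain for a nominal predictor.
    # More categories means more candidate splits, so the best gain is
    # larger by chance alone, even when x carries no signal.
    n, parent = len(y), gini(y)
    best = 0.0
    for c in set(x_cats):
        left = [y[i] for i in range(n) if x_cats[i] == c]
        right = [y[i] for i in range(n) if x_cats[i] != c]
        child = (len(left) * gini(left) + len(right) * gini(right)) / n
        best = max(best, parent - child)
    return best

rng = random.Random(0)
trials, wins_many = 200, 0
for _ in range(trials):
    y = [rng.randint(0, 1) for _ in range(60)]      # pure-noise target
    x2 = [rng.randint(0, 1) for _ in range(60)]     # noise, 2 categories
    x20 = [rng.randint(0, 19) for _ in range(60)]   # noise, 20 categories
    if best_split_gain(x20, y) > best_split_gain(x2, y):
        wins_many += 1
# The 20-category noise predictor offers the better split most of the time.
```

Because impurity-based importance sums such gains over many trees, the many-category noise variable accumulates spuriously high importance, which is the bias the unbiased conditional-inference implementation avoids.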

  9. Generative Learning Objects Instantiated with Random Numbers Based Expressions

    Directory of Open Access Journals (Sweden)

    Ciprian Bogdan Chirila

    2015-12-01

    Full Text Available The development of interactive e-learning content requires special skills like programming techniques, web integration, graphic design, etc. Generally, online educators do not possess such skills, and their e-learning products tend to be static, like presentation slides and textbooks. In this paper we propose a new interactive model of generative learning objects as a compromise between static, dull materials and dynamic, complex e-learning software developed by specialized teams. We find that learning objects automatically initialized from random-number-based expressions increase content diversity and interactivity, thus enabling learners' engagement. The resulting learning object model has a limited level of complexity compared to specialized e-learning software, is intuitive, and is capable of increasing learners' interactivity, engagement, and motivation through dynamic content. The approach was applied successfully to several computer programming disciplines.

  10. Pseudo-Random Number Generator Based on Coupled Map Lattices

    Science.gov (United States)

    Lü, Huaping; Wang, Shihong; Hu, Gang

    A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that with suitable cooperative applications of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast speed of random number generations. This pseudo-random number generator system can be used as ideal synchronous and self-synchronizing stream cipher systems for secure communications.
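
A schematic of a one-way coupled map lattice generator is shown below, with bits thresholded from one site's state. The lattice size, coupling strength, and output rule are illustrative assumptions rather than the authors' parameters, and the conventional post-processing they combine with the chaotic stage is omitted:

```python
def cml_prng_bits(n_bits, size=8, eps=0.95, seed=0.123):
    # One-way coupled logistic-map lattice: site i is driven only by
    # site i-1, so perturbations propagate in a single direction.
    # Parameter values here are illustrative, not the paper's.
    f = lambda x: 4.0 * x * (1.0 - x)          # fully chaotic logistic map
    lattice = [(seed + 0.618 * i) % 1.0 for i in range(size)]
    bits = []
    for _ in range(n_bits):
        new = [f(lattice[0])]                  # free-running driving site
        for i in range(1, size):
            new.append((1.0 - eps) * f(lattice[i]) + eps * f(lattice[i - 1]))
        lattice = new
        bits.append(1 if lattice[-1] > 0.5 else 0)   # threshold the end site
    return bits

bits = cml_prng_bits(200)
```

The one-way coupling is also what makes such lattices usable as self-synchronizing stream ciphers: a receiver driven by the same leading site converges to the same downstream states.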

  11. Bluetooth-based distributed measurement system

    International Nuclear Information System (INIS)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng

    2007-01-01

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, the advantages and disadvantages of the system are analyzed at the end of the paper, and the measurement system has successfully been used in the Daqing oilfield, China, for measuring parameters such as temperature, flow rate, and oil pressure at an electromotor-pump unit

  12. Bluetooth-based distributed measurement system

    Science.gov (United States)

    Tang, Baoping; Chen, Zhuo; Wei, Yuguo; Qin, Xiaofeng

    2007-07-01

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, the advantages and disadvantages of the system are analyzed at the end of the paper, and the measurement system has successfully been used in the Daqing oilfield, China, for measuring parameters such as temperature, flow rate, and oil pressure at an electromotor-pump unit.

  13. Bluetooth-based distributed measurement system

    Energy Technology Data Exchange (ETDEWEB)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng [Department of Mechatronics, College of Mechanical Engineering, Chongqing University, Chongqing, 400030 (China)

    2007-07-15

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are performed so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed, with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has been used successfully in the Daqing oilfield, China, for measuring parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  14. An improved label propagation algorithm based on node importance and random walk for community detection

    Science.gov (United States)

    Ma, Tianren; Xia, Zhengyou

    2017-05-01

    Currently, with the rapid development of information technology, electronic media for social communication are becoming more and more popular. Discovery of communities is a very effective way to understand the properties of complex networks. However, traditional community detection algorithms consider only the structural characteristics of a social organization, wasting much of the information carried by nodes and edges; they also fail to judge each node on its merits. The label propagation algorithm (LPA) is a near-linear-time algorithm for finding communities in a network, and its high efficiency has attracted many scholars; in recent years, several improvements on LPA have been put forward. In this paper, an improved LPA based on random walks and node importance (NILPA) is proposed. First, a node-importance list is computed and the nodes in the network are sorted in descending order of importance. On the basis of random walks, a matrix measuring the similarity of nodes is constructed, which avoids the random choice in LPA. Second, a new metric, IAS (importance and similarity), is calculated from node importance and the similarity matrix; it replaces the random selection of the original LPA and improves the algorithm's stability. Finally, tests on real-world and synthetic networks show that this algorithm finds community structure better than existing methods.
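    The importance-based idea can be illustrated with a toy sketch (pure Python). The graph, the degree-based importance proxy, and the importance-weighted voting below are invented for illustration and are much simpler than NILPA's random-walk similarity matrix; the point is only that a deterministic, importance-driven label update replaces LPA's random tie-breaking:

```python
from collections import defaultdict

# Two hub-and-spoke groups joined by one bridge edge (3-7); all made up.
edges = [(0, 1), (0, 2), (0, 3), (4, 5), (4, 6), (4, 7), (3, 7)]
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

importance = {n: len(adj[n]) for n in adj}              # degree as importance proxy
order = sorted(adj, key=importance.get, reverse=True)   # important nodes update first

labels = {n: n for n in adj}                            # singleton initialisation
for _ in range(10):                                     # asynchronous sweeps
    for n in order:
        votes = defaultdict(float)
        for m in adj[n]:
            votes[labels[m]] += importance[m]           # importance-weighted vote
        labels[n] = max(sorted(votes), key=votes.get)   # deterministic, no random pick
```

On this graph the two hub-centred groups settle into two stable communities without any random label choice.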

  15. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that combines Markov random fields (MRF), watershed segmentation and merging techniques to perform image segmentation and edge detection. MRF is used to obtain an initial estimate of the regions in the image under process; in the MRF model, the gray level x at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result, which is obtained by K-means clustering with the minimum-distance rule; the region process is then modeled by MRF to obtain an image composed of regions of different intensity. Starting from this, the gradient values of that image are calculated and a watershed technique is employed. The MRF step yields an image with distinct intensity regions carrying all the edge and region information; the watershed algorithm then improves the segmentation by superimposing a closed, accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merging process based on averaged mean values is employed. The final segmentation and edge-detection result is one closed boundary per actual region in the image.
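    The initial segmentation step — K-means clustering of gray levels with the minimum-distance rule — can be sketched as follows. The intensity data are synthetic (two well-separated gray-level populations), and this is not the authors' implementation:

```python
import numpy as np

# Toy 1-D K-means on gray levels, the kind of initial segmentation
# the paper feeds into the MRF stage (image values are synthetic).
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(50, 5, 500),    # dark region pixels
                      rng.normal(200, 5, 500)])  # bright region pixels

centers = np.array([0.0, 255.0])                 # two intensity classes
for _ in range(20):
    # assign each pixel to the nearest center (minimum-distance rule)
    assign = np.argmin(np.abs(img[:, None] - centers[None, :]), axis=1)
    # update each center to the mean of its assigned pixels
    centers = np.array([img[assign == k].mean() for k in range(2)])
```

The resulting per-pixel class map is what the MRF regularisation would then refine using neighborhood information.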

  16. Random measurement error: Why worry? An example of cardiovascular risk factors.

    Science.gov (United States)

    Brakenhoff, Timo B; van Smeden, Maarten; Visseren, Frank L J; Groenwold, Rolf H H

    2018-01-01

    With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.
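    The phenomenon is easy to reproduce in a small simulation (all variables synthetic; a linear model is assumed purely for illustration): classical error added to a *confounder* leaves residual confounding, so the adjusted exposure estimate is pushed toward the unadjusted one — here upward, the opposite of the commonly expected attenuation:

```python
import numpy as np

# Synthetic example: true exposure effect on y is 0.5, confounded by c.
rng = np.random.default_rng(1)
n = 100_000
c = rng.normal(size=n)                        # confounder
x = 0.8 * c + rng.normal(size=n)              # exposure, driven partly by c
y = 0.5 * x + 1.0 * c + rng.normal(size=n)    # outcome; true beta_x = 0.5

def fit_beta_x(x, c, y):
    # OLS of y on (x, c, intercept); return the exposure coefficient
    X = np.column_stack([x, c, np.ones_like(x)])
    return np.linalg.lstsq(X, y, rcond=None)[0][0]

beta_clean = fit_beta_x(x, c, y)              # ~0.5, correctly adjusted
c_noisy = c + rng.normal(scale=1.0, size=n)   # classical error on the confounder
beta_noisy = fit_beta_x(x, c_noisy, y)        # biased *upward*, not attenuated
```

With these (invented) parameters, the theoretical adjusted-but-noisy coefficient is about 0.80 rather than 0.5, illustrating overestimation from random measurement error.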

  17. Random measurement error: Why worry? An example of cardiovascular risk factors.

    Directory of Open Access Journals (Sweden)

    Timo B Brakenhoff

    Full Text Available With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.

  18. Zero-inflated count models for longitudinal measurements with heterogeneous random effects.

    Science.gov (United States)

    Zhu, Huirong; Luo, Sheng; DeSantis, Stacia M

    2017-08-01

    Longitudinal zero-inflated count data arise frequently in substance use research when assessing the effects of behavioral and pharmacological interventions. Zero-inflated count models (e.g. zero-inflated Poisson or zero-inflated negative binomial) with random effects have been developed to analyze this type of data. In random effects zero-inflated count models, the random effects covariance matrix is typically assumed to be homogeneous (constant across subjects). However, in many situations this matrix may be heterogeneous (differ by measured covariates). In this paper, we extend zero-inflated count models to account for random effects heterogeneity by modeling their variance as a function of covariates. We show via simulation that ignoring intervention and covariate-specific heterogeneity can produce biased estimates of covariate and random effect estimates. Moreover, those biased estimates can be rectified by correctly modeling the random effects covariance structure. The methodological development is motivated by and applied to the Combined Pharmacotherapies and Behavioral Interventions for Alcohol Dependence (COMBINE) study, the largest clinical trial of alcohol dependence performed in the United States, with 1383 individuals.
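    The core of a zero-inflated Poisson model is its mixture pmf: a point mass at zero mixed with an ordinary Poisson count. A minimal sketch with illustrative parameter values (not COMBINE estimates):

```python
import math

# Zero-inflated Poisson pmf: with probability pi the count is a
# "structural" zero; otherwise it is Poisson(lam).
def zip_pmf(k, pi, lam):
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    point_mass = pi if k == 0 else 0.0
    return point_mass + (1 - pi) * poisson

pi, lam = 0.3, 2.0            # illustrative mixing weight and Poisson rate
p0 = zip_pmf(0, pi, lam)      # structural zeros plus Poisson zeros
```

Note that P(0) = pi + (1 - pi)·exp(-lam), which is why excess zeros inflate the zero cell relative to a plain Poisson model.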

  19. Standardized Effect Size Measures for Mediation Analysis in Cluster-Randomized Trials

    Science.gov (United States)

    Stapleton, Laura M.; Pituch, Keenan A.; Dion, Eric

    2015-01-01

    This article presents 3 standardized effect size measures to use when sharing results of an analysis of mediation of treatment effects for cluster-randomized trials. The authors discuss 3 examples of mediation analysis (upper-level mediation, cross-level mediation, and cross-level mediation with a contextual effect) with demonstration of the…

  20. Robust state estimation for double pantographs with random missing measurements in high-speed railway

    DEFF Research Database (Denmark)

    Lu, Xiaobing; Liu, Zhigang; Wang, Yanbo

    2016-01-01

    Active control of pantograph could be performed to decrease the fluctuation in pantograph-catenary contact force (PCCF) in high-speed railway. However, it is difficult to obtain the states of the pantograph when state feedback control is implemented. And the measurements may randomly miss due...

  1. Focal plane based wavefront sensing with random DM probes

    Science.gov (United States)

    Pluzhnik, Eugene; Sirbu, Dan; Belikov, Ruslan; Bendek, Eduardo; Dudinov, Vladimir N.

    2017-09-01

    An internal coronagraph with an adaptive optical system for wavefront control is being considered for direct imaging of exoplanets with upcoming space missions and concepts, including WFIRST, HabEx, LUVOIR, EXCEDE and ACESat. The main technical challenge associated with direct imaging of exoplanets is to control both diffracted and scattered light from the star so that even a dim planetary companion can be imaged. For a deformable mirror (DM) to create a dark hole with 10^-10 contrast in the image plane, wavefront errors must be accurately measured on the science focal plane detector to ensure a common optical path. We present here a method that uses a set of random phase probes applied to the DM to obtain a high accuracy wavefront estimate even for a dynamically changing optical system. The presented numerical simulations and experimental results show low noise sensitivity, high reliability, and robustness of the proposed approach. The method does not use any additional optics or complex calibration procedures and can be used during the calibration stage of any direct imaging mission. It can also be used in any optical experiment that uses a DM as an active optical element in the layout.
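    The estimation idea can be sketched for a single focal-plane pixel. Each probe p_k added by the DM shifts the measured intensity by I_k − I_0 − |p_k|² = 2·(Re E·Re p_k + Im E·Im p_k), which is linear in the unknown field E, so a few random probes give an overdetermined linear system (toy numbers, not the mission pipeline):

```python
import numpy as np

# Unknown focal-plane field at one pixel (value invented for the demo).
rng = np.random.default_rng(7)
E = 0.3 - 0.2j

probes = rng.normal(size=5) + 1j * rng.normal(size=5)   # random DM probe fields
I0 = abs(E) ** 2                                        # unprobed intensity
I = np.abs(E + probes) ** 2                             # probed intensities

# Linear system: I_k - I0 - |p_k|^2 = 2 Re(p_k) Re(E) + 2 Im(p_k) Im(E)
A = np.column_stack([2 * probes.real, 2 * probes.imag])
b = I - I0 - np.abs(probes) ** 2
re, im = np.linalg.lstsq(A, b, rcond=None)[0]
E_hat = re + 1j * im                                    # recovered complex field
```

Least squares over more probes than unknowns is what gives the method its robustness to noise in a real system.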

  2. Visibility and aerosol measurement by diode-laser random-modulation CW lidar

    Science.gov (United States)

    Takeuchi, N.; Baba, H.; Sakurai, K.; Ueno, T.; Ishikawa, N.

    1986-01-01

    Examples of diode laser (DL) random-modulation continuous-wave (RM-CW) lidar measurements are reported, demonstrating the ability to measure visibility, vertical aerosol profiles and cloud ceiling height. Although the data shown here were all measured at night, daytime measurement is also possible; for that purpose, accurate control of the laser frequency to the center frequency of a narrow-band filter is required. A new system with such frequency control is now under construction.
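    The RM-CW principle — modulate the CW laser with a pseudorandom code and cross-correlate the echo with that code to recover the range profile — can be sketched numerically. The code length, target delay, albedo and noise level below are invented for illustration:

```python
import numpy as np

# RM-CW lidar toy: pseudorandom +/-1 modulation, correlation ranging.
rng = np.random.default_rng(3)
N = 512
code = rng.choice([-1.0, 1.0], size=N)     # random binary modulation sequence

delay, albedo = 97, 0.05                   # single hard target (made up)
echo = albedo * np.roll(code, delay)       # delayed, attenuated return
echo += rng.normal(scale=0.01, size=N)     # detector noise

# circular cross-correlation via FFT; the peak marks the target range bin
corr = np.fft.ifft(np.fft.fft(echo) * np.conj(np.fft.fft(code))).real
range_bin = int(np.argmax(corr))
```

The correlation peak grows like N while code sidelobes grow like sqrt(N), which is why a weak echo buried in noise is still recoverable.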

  3. Microbiota-based Signature of Gingivitis Treatments: A Randomized Study.

    Science.gov (United States)

    Huang, Shi; Li, Zhen; He, Tao; Bo, Cunpei; Chang, Jinlan; Li, Lin; He, Yanyan; Liu, Jiquan; Charbonneau, Duane; Li, Rui; Xu, Jian

    2016-04-20

    Plaque-induced gingivitis can be alleviated by various treatment regimens. To probe the impacts of various anti-gingivitis treatments on plaque microflora, a double-blinded, randomized controlled trial of 91 adults with moderate gingivitis was designed with two anti-gingivitis regimens: brush-alone treatment and brush-plus-rinse treatment. In the latter group, greater reduction in both Plaque Index (TMQHI) and Gingival Index (mean MGI) at Day 3, Day 11 and Day 27 was evident, and more dramatic changes were found between baseline and other time points for both supragingival plaque microbiota structure and salivary metabonomic profiles. A comparison of plaque microbiota changes was also performed between these two treatments and a third dataset in which 50 subjects received a regimen of dental scaling. Only Actinobaculum, TM7 and Leptotrichia were consistently reduced by all three treatments, whereas the distinct microbial signatures of the three treatments during gingivitis relief indicate distinct mechanisms of action. Our study suggests that microbiota-based signatures can serve as a valuable approach for understanding, and potentially comparing, the modes of action of clinical treatments and oral-care products in the future.

  4. Enhancing Security of Double Random Phase Encoding Based on Random S-Box

    Science.gov (United States)

    Girija, R.; Singh, Hukum

    2018-06-01

    In this paper, we propose a novel asymmetric cryptosystem for double random phase encoding (DRPE) using a random S-Box. Because an S-Box used on its own is not reliable and DRPE does not provide non-linearity, our system unites the effectiveness of the S-Box with an asymmetric DRPE system (through the Fourier transform). The uniqueness of the proposed cryptosystem lies in employing a highly sensitive dynamic S-Box in the DRPE system; the randomness and scalability achieved by the applied technique are additional features of the proposed solution. The strength of the random S-Box is investigated in terms of performance parameters such as non-linearity, the strict avalanche criterion, the bit independence criterion, and linear and differential approximation probabilities. S-Boxes bring non-linearity to cryptosystems, a significant parameter that is essential for DRPE. The strength of the proposed cryptosystem has been analysed using various measures such as MSE, PSNR, correlation coefficient analysis, noise analysis and SVD analysis. Experimental results are presented in detail to show that the proposed cryptosystem is highly secure.
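    For reference, the classical DRPE core that the paper builds on — without the proposed S-Box layer or the asymmetric keys — is a two-mask Fourier round trip, sketched here with synthetic data:

```python
import numpy as np

# Classical DRPE round trip: one random phase mask in the input plane,
# one in the Fourier plane (image and keys are synthetic).
rng = np.random.default_rng(5)
img = rng.random((32, 32))                       # plaintext image

m1 = np.exp(2j * np.pi * rng.random((32, 32)))   # input-plane phase mask (key 1)
m2 = np.exp(2j * np.pi * rng.random((32, 32)))   # Fourier-plane phase mask (key 2)

# Encryption: mask, transform, mask again, inverse transform.
enc = np.fft.ifft2(np.fft.fft2(img * m1) * m2)   # ciphertext (complex field)

# Decryption: undo each mask with its conjugate in reverse order.
dec = np.abs(np.fft.ifft2(np.fft.fft2(enc) * np.conj(m2)) * np.conj(m1))
```

Because every step here is linear, the scheme on its own is vulnerable to known-plaintext attacks — which is exactly the gap a non-linear S-Box layer is meant to close.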

  5. Canonical Naimark extension for generalized measurements involving sets of Pauli quantum observables chosen at random

    Science.gov (United States)

    Sparaciari, Carlo; Paris, Matteo G. A.

    2013-01-01

    We address measurement schemes where certain observables Xk are chosen at random within a set of nondegenerate isospectral observables and then measured on repeated preparations of a physical system. Each observable has a probability zk to be measured, with ∑kzk=1, and the statistics of this generalized measurement is described by a positive operator-valued measure. This kind of scheme is referred to as quantum roulettes, since each observable Xk is chosen at random, e.g., according to the fluctuating value of an external parameter. Here we focus on quantum roulettes for qubits involving the measurements of Pauli matrices, and we explicitly evaluate their canonical Naimark extensions, i.e., their implementation as indirect measurements involving an interaction scheme with a probe system. We thus provide a concrete model to realize the roulette without destroying the signal state, which can be measured again after the measurement or can be transmitted. Finally, we apply our results to the description of Stern-Gerlach-like experiments on a two-level system.
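    The resulting POVM is easy to write down for a roulette over σx and σz with equal weights z_k = 1/2: the effects are the z_k-weighted eigenprojectors of each Pauli, and they must sum to the identity. A minimal numerical check (this illustrates the POVM only, not the Naimark extension itself):

```python
import numpy as np

# Pauli observables for the two-outcome roulette.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def projectors(pauli):
    # +1 and -1 eigenprojectors of a Pauli matrix
    return [(I2 + pauli) / 2, (I2 - pauli) / 2]

z = {"x": 0.5, "z": 0.5}                  # roulette probabilities, sum to 1
effects = [z["x"] * P for P in projectors(sx)] + \
          [z["z"] * P for P in projectors(sz)]
total = sum(effects)                      # completeness: should equal I
```

Each effect is positive and the four of them resolve the identity, which is the defining property of the roulette's POVM.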

  6. Necessary conditions for the invariant measure of a random walk to be a sum of geometric terms

    NARCIS (Netherlands)

    Chen, Y.; Boucherie, Richardus J.; Goseling, Jasper

    We consider the invariant measure of homogeneous random walks in the quarter-plane. In particular, we consider measures that can be expressed as an infinite sum of geometric terms. We present necessary conditions for the invariant measure of a random walk to be a sum of geometric terms. We

  7. Concentration inequalities for functions of Gibbs fields with application to diffraction and random Gibbs measures

    CERN Document Server

    Külske, C

    2003-01-01

    We derive useful general concentration inequalities for functions of Gibbs fields in the uniqueness regime. We also consider expectations of random Gibbs measures that depend on an additional disorder field, and prove concentration w.r.t the disorder field. Both fields are assumed to be in the uniqueness regime, allowing in particular for non-independent disorder field. The modification of the bounds compared to the case of an independent field can be expressed in terms of constants that resemble the Dobrushin contraction coefficient, and are explicitly computable. On the basis of these inequalities, we obtain bounds on the deviation of a diffraction pattern created by random scatterers located on a general discrete point set in the Euclidean space, restricted to a finite volume. Here we also allow for thermal dislocations of the scatterers around their equilibrium positions. Extending recent results for independent scatterers, we give a universal upper bound on the probability of a deviation of the random sc...

  8. Using complete measurement statistics for optimal device-independent randomness evaluation

    International Nuclear Information System (INIS)

    Nieto-Silleras, O; Pironio, S; Silman, J

    2014-01-01

    The majority of recent works investigating the link between non-locality and randomness, e.g. in the context of device-independent cryptography, do so with respect to some specific Bell inequality, usually the CHSH inequality. However, the joint probabilities characterizing the measurement outcomes of a Bell test are richer than just the degree of violation of a single Bell inequality. In this work we show how to take this extra information into account in a systematic manner in order to optimally evaluate the randomness that can be certified from non-local correlations. We further show that taking into account the complete set of outcome probabilities is equivalent to optimizing over all possible Bell inequalities, thereby allowing us to determine the optimal Bell inequality for certifying the maximal amount of randomness from a given set of non-local correlations. (paper)
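    The point that the full outcome table p(ab|xy) is richer than a single Bell value can be illustrated by building the (idealised, noise-free) quantum table for the optimal CHSH measurements and recovering the Tsirelson value S = 2√2 as one linear function of it:

```python
import numpy as np

def joint_probs(ta, tb):
    # p(ab|xy) for outcomes a, b = +/-1 at measurement angles ta, tb
    # on a maximally entangled state: p = (1 + a*b*cos(2(ta - tb))) / 4
    E = np.cos(2 * (ta - tb))
    return {(a, b): (1 + a * b * E) / 4 for a in (1, -1) for b in (1, -1)}

def E_from_table(p):
    # correlator recovered from the full joint-probability table
    return sum(a * b * q for (a, b), q in p.items())

a0, a1 = 0.0, np.pi / 4            # Alice's optimal CHSH settings
b0, b1 = np.pi / 8, -np.pi / 8     # Bob's optimal CHSH settings
S = (E_from_table(joint_probs(a0, b0)) + E_from_table(joint_probs(a0, b1))
     + E_from_table(joint_probs(a1, b0)) - E_from_table(joint_probs(a1, b1)))
```

The optimization described in the paper works directly with the sixteen entries of these tables rather than with the single number S.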

  9. Korean Clinic Based Outcome Measure Studies

    OpenAIRE

    Jongbae Park

    2003-01-01

    Background: Evidence-based medicine has become a main tool for medical practice. However, conducting a study ranked highly in the evidence hierarchy pyramid is not easy or feasible at all times and places; there remains room for descriptive clinical outcome measure studies, provided the limits of their interpretation are admitted. Aims: To present three Korean clinic-based outcome measure studies, with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented...

  10. Influence of low ambient temperature on epitympanic temperature measurement: a prospective randomized clinical study.

    Science.gov (United States)

    Strapazzon, Giacomo; Procter, Emily; Putzer, Gabriel; Avancini, Giovanni; Dal Cappello, Tomas; Überbacher, Norbert; Hofer, Georg; Rainer, Bernhard; Rammlmair, Georg; Brugger, Hermann

    2015-11-05

    Epitympanic temperature (Tty) measured with thermistor probes correlates with core body temperature (Tcore), but the reliability of measurements at low ambient temperature is unknown. The aim of this study was to determine if commercially-available thermistor-based Tty reflects Tcore in low ambient temperature and if Tty is influenced by insulation of the ear. Thirty-one participants (two females) were exposed to room (23.2 ± 0.4 °C) and low (-18.7 ± 1.0 °C) ambient temperature for 10 min using a randomized cross-over design. Tty was measured using an epitympanic probe (M1024233, GE Healthcare Finland Oy) and oesophageal temperature (Tes) with an oesophageal probe (M1024229, GE Healthcare Finland Oy) inserted into the lower third of the oesophagus. Ten participants wore ear protectors (Arton 2200, Emil Lux GmbH & Co. KG, Wermelskirchen, Switzerland) to insulate the ear from ambient air. During exposure to room temperature, mean Tty increased from 33.4 ± 1.5 to 34.2 ± 0.8 °C without insulation of the ear and from 35.0 ± 0.8 to 35.5 ± 0.7 °C with insulation. During exposure to low ambient temperature, mean Tty decreased from 32.4 ± 1.6 to 28.5 ± 2.0 °C without insulation and from 35.6 ± 0.6 to 35.2 ± 0.9 °C with insulation. The difference between Tty and Tes at low ambient temperature was reduced by 82% (from 7.2 to 1.3 °C) with insulation of the ear. Epitympanic temperature measurements are influenced by ambient temperature and deviate from Tes at room and low ambient temperature. Insulating the ear with ear protectors markedly reduced the difference between Tty and Tes and improved the stability of measurements. The use of models to correct Tty may be possible, but results should be validated in larger studies.

  11. Bearing Fault Classification Based on Conditional Random Field

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2013-01-01

    Full Text Available Condition monitoring of rolling element bearings is paramount for predicting the lifetime and performing effective maintenance of mechanical equipment. To overcome the drawbacks of the hidden Markov model (HMM) and improve diagnosis accuracy, a classifier based on the conditional random field (CRF) model is proposed. In this model, the feature-vector sequences and the fault categories are linked by an undirected graphical model in which their relationship is represented by a global conditional probability distribution. In comparison with the HMM, the main advantage of the CRF model is that it can depict the temporal dynamics between the observation sequences and state sequences without assuming independence of the input feature vectors; the interrelationship between adjacent observation vectors can therefore also be depicted and integrated into the model, which makes the classifier more robust and accurate than the HMM. To evaluate the effectiveness of the proposed method, four kinds of bearing vibration signals, corresponding to normal, inner-race pit, outer-race pit and roller pit conditions, are collected from a test rig, and CRF and HMM models are built to perform fault classification using the sub-band energy features of wavelet packet decomposition (WPD) as the observation sequences. Moreover, K-fold cross-validation is adopted to improve the evaluation accuracy of the classifier. Analysis and comparison under different fold numbers show that the classification accuracy of the CRF model is higher than that of the HMM. This method sheds new light on the accurate classification of bearing faults.
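    The observation features — sub-band energies from wavelet packet decomposition — can be sketched with a hand-rolled Haar filter bank. The signal is synthetic, and the paper's actual wavelet and decomposition depth may differ:

```python
import numpy as np

# Depth-2 Haar wavelet packet: split into approximation/detail twice,
# then use normalised sub-band energies as the observation features.
def haar_split(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)    # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)    # detail (high-pass)
    return a, d

rng = np.random.default_rng(2)
sig = np.sin(2 * np.pi * 0.3 * np.arange(64)) + 0.1 * rng.normal(size=64)

bands = [sig]
for _ in range(2):                          # two decomposition levels -> 4 bands
    bands = [part for b in bands for part in haar_split(b)]

energies = np.array([np.sum(b ** 2) for b in bands])
features = energies / energies.sum()        # normalised sub-band energy vector
```

Because the Haar transform is orthonormal, the sub-band energies sum exactly to the signal energy; the normalised vector is a compact fault-sensitive feature.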

  12. A randomized controlled trial of aquatic and land-based exercise in patients with knee osteoarthritis

    DEFF Research Database (Denmark)

    Lund, H.; Weile, U.; Christensen, R.

    2008-01-01

    Objective: To compare the efficacy of aquatic exercise and a land-based exercise programme vs control in patients with knee osteoarthritis. Methods: The primary outcome was change in pain; in addition, the Knee Injury and Osteoarthritis Outcome Score questionnaire (KOOS), standing balance and strength... was also measured after exercise and at 3-month follow-up. Seventy-nine patients (62 women), with a mean age of 68 years (age range 40-89 years), were randomized to aquatic exercise (n = 27), land-based exercise (n = 25) or control (n = 27). Results: No effect was observed immediately after exercise cessation (8... patients reported adverse events (i.e. discomfort) in land-based exercise, while only 3 reported adverse events in the aquatic exercise. Conclusion: Only land-based exercise showed some improvement in pain and muscle strength compared with the control group, while no clinical benefits were detectable after...

  13. A pilot randomized trial teaching mindfulness-based stress reduction to traumatized youth in foster care.

    Science.gov (United States)

    Jee, Sandra H; Couderc, Jean-Philippe; Swanson, Dena; Gallegos, Autumn; Hilliard, Cammie; Blumkin, Aaron; Cunningham, Kendall; Heinert, Sara

    2015-08-01

    This article presents a pilot project implementing a mindfulness-based stress reduction program among traumatized youth in foster and kinship care over 10 weeks. Forty-two youth participated in this randomized controlled trial that used a mixed-methods (quantitative, qualitative, and physiologic) evaluation. Youth self-report scores measuring mental health problems, mindfulness, and stress were lower than anticipated, and the relatively short time frame for teaching these skills to traumatized youth may not have been sufficient to capture significant changes in stress as measured by electrocardiograms. Main themes from qualitative data included expressed competence in managing ongoing stress, enhanced self-awareness, and new strategies to manage stress. We share our experiences and recommendations for future research and practice, including focusing efforts on younger youth, and using community-based participatory research principles to promote engagement and co-learning. CLINICALTRIALS.GOV: Protocol Registration System ID NCT01708291. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Multimedia based health information to parents in a pediatric acute ward: a randomized controlled trial.

    Science.gov (United States)

    Botngård, Anja; Skranes, Lars P; Skranes, Jon; Døllner, Henrik

    2013-12-01

    To determine whether multimedia-based health information presented to parents of children with breathing difficulties in a pediatric acute ward is more effective than verbal information in reducing parental anxiety and increasing satisfaction. This randomized controlled trial was conducted in a pediatric acute ward in Norway, from January to March 2011. Parents were randomly assigned to a multimedia intervention (n=53) or verbal health information (n=48). The primary outcome measure was parental anxiety; secondary outcome measures were parental satisfaction with nursing care and with health information. Parental anxiety decreased from arrival to discharge within both groups; at discharge, anxiety levels in the intervention group were no lower than in the control group. There was no difference in satisfaction with nursing care between the groups, but parents in the intervention group reported higher satisfaction with the health information given in the acute ward (p=.005). Multimedia-based health information did not reduce anxiety more than verbal information among parents of children with breathing difficulties. However, after discharge the parents were more satisfied with the multimedia approach. More research is needed before multimedia-based information can be recommended as a routine for parents in pediatric emergency care. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. Multi-parameter sensor based on random fiber lasers

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2016-09-01

    Full Text Available We demonstrate a concept of utilizing random fiber lasers to achieve multi-parameter sensing. The proposed random fiber ring laser consists of an erbium-doped fiber as the gain medium and a random fiber grating as the feedback. The random feedback is effectively realized by a large number of reflections from around 50,000 femtosecond-laser-induced refractive index modulation regions over a 10 cm standard single-mode fiber. Numerous polarization-dependent spectral filters are formed and superimposed to provide multiple lasing lines with a high signal-to-noise ratio of up to 40 dB, which provides access to a high-fidelity multi-parameter sensing scheme. The number of sensing parameters can be controlled by the number of lasing lines via the input polarization, and the wavelength shifts of each peak can be exploited for simultaneous multi-parameter sensing with one sensing probe. In addition, the random-grating-induced coupling between core and cladding modes can potentially be used for sensing liquid medical samples in medical diagnostics and biology, and for remote sensing in hostile environments.
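    The multi-parameter idea reduces to a linear inversion: if each lasing line's wavelength shift responds linearly but differently to two measurands, the pair of measured shifts inverts to both quantities at once. The sensitivity matrix below is illustrative only, not a calibration from the paper:

```python
import numpy as np

# Hypothetical sensitivity matrix: rows are lasing lines, columns are
# responses to temperature (pm/degC) and strain (pm/microstrain).
K = np.array([[10.0, 1.2],
              [ 8.5, 0.9]])

true = np.array([5.0, 100.0])          # assumed dT = 5 degC, dStrain = 100 ue
shifts = K @ true                      # the wavelength shifts one would measure

dT, dstrain = np.linalg.solve(K, shifts)   # invert to recover both measurands
```

The scheme generalises: with n lasing lines and n measurands the same inversion applies, provided the sensitivity matrix is well conditioned.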

  16. Nurse-Moderated Internet-Based Support for New Mothers: Non-Inferiority, Randomized Controlled Trial.

    Science.gov (United States)

    Sawyer, Michael G; Reece, Christy E; Bowering, Kerrie; Jeffs, Debra; Sawyer, Alyssa C P; Mittinty, Murthy; Lynch, John W

    2017-07-24

    Internet-based interventions moderated by community nurses have the potential to improve support offered to new mothers, many of whom now make extensive use of the Internet to obtain information about infant care. However, evidence from population-based randomized controlled trials is lacking. The aim of this study was to test the non-inferiority of outcomes for mothers and infants who received a clinic-based postnatal health check plus nurse-moderated, Internet-based group support when infants were aged 1-7 months as compared with outcomes for those who received standard care consisting of postnatal home-based support provided by a community nurse. The design of the study was a pragmatic, preference, non-inferiority randomized controlled trial. Participants were recruited from mothers contacted for their postnatal health check, which is offered to all mothers in South Australia. Mothers were assigned either (1) on the basis of their preference to clinic+Internet or home-based support groups (n=328), or (2) randomly assigned to clinic+Internet or home-based groups if they declared no strong preference (n=491). The overall response rate was 44.8% (819/1827). The primary outcome was parenting self-competence, as measured by the Parenting Stress Index (PSI) Competence subscale, and the Karitane Parenting Confidence Scale scores. Secondary outcome measures included PSI Isolation, Interpersonal Support Evaluation List-Short Form, Maternal Support Scale, Ages and Stages Questionnaire-Social-Emotional and MacArthur Communicative Development Inventory (MCDI) scores. Assessments were completed offline via self-assessment questionnaires at enrolment (mean child age=4.1 weeks, SD 1.3) and again when infants were aged 9, 15, and 21 months. Generalized estimating equations adjusting for post-randomization baseline imbalances showed that differences in outcomes between mothers in the clinic+Internet and home-based support groups did not exceed the pre-specified margin of

  17. A Variable Impacts Measurement in Random Forest for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jae-Hee Hur

    2017-01-01

    Full Text Available Recently, the importance of mobile cloud computing has increased. Mobile devices can collect personal data from various sensors within a short period of time, and sensor-based data consist of valuable information about users. Advanced computational power and data-analysis technology based on cloud computing provide an opportunity to classify massive sensor data into given labels. The random forest algorithm is known as a black-box model whose internal workings are hard to interpret. In this paper, we propose a method that analyzes variable impact in the random forest algorithm to clarify which variables affect classification accuracy the most. We apply the Shapley value with random forest to analyze variable impact: under the assumption that every variable cooperates as a player in a cooperative game, the Shapley value fairly distributes the payoff among the variables. Our proposed method calculates the relative contribution of each variable within the classification process, analyzes the influence of the variables, and ranks the variables by their effect on classification accuracy. The method proves its suitability for data interpretation in black-box models such as random forests, making the algorithm applicable in mobile cloud computing environments.
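    The Shapley computation itself can be sketched on a tiny three-variable cooperative game, where v(S) stands in for the classification accuracy reached using variable subset S (all numbers invented, not from a real forest):

```python
from itertools import permutations

# Characteristic function v(S): accuracy reached with each variable subset.
v = {(): 0.0, ("a",): 0.5, ("b",): 0.3, ("c",): 0.1,
     ("a", "b"): 0.7, ("a", "c"): 0.55, ("b", "c"): 0.35,
     ("a", "b", "c"): 0.75}

def val(s):
    return v[tuple(sorted(s))]

# Exact Shapley value: average marginal contribution over all orderings.
players = ["a", "b", "c"]
shapley = dict.fromkeys(players, 0.0)
for order in permutations(players):
    seen = []
    for p in order:
        shapley[p] += (val(seen + [p]) - val(seen)) / 6   # 3! = 6 orderings
        seen.append(p)
```

By the efficiency property the values sum to v of the grand coalition, and the ranking of the values gives the variable priority the paper describes.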

  18. Sensitivity analysis for missing dichotomous outcome data in multi-visit randomized clinical trial with randomization-based covariance adjustment.

    Science.gov (United States)

    Li, Siying; Koch, Gary G; Preisser, John S; Lam, Diana; Sanchez-Kam, Matilde

    2017-01-01

    Dichotomous endpoints in clinical trials have only two possible outcomes, either directly or via categorization of an ordinal or continuous observation. It is common to have missing data for one or more visits during a multi-visit study. This paper presents a closed form method for sensitivity analysis of a randomized multi-visit clinical trial that possibly has missing not at random (MNAR) dichotomous data. Counts of missing data are redistributed to the favorable and unfavorable outcomes mathematically to address possibly informative missing data. Adjusted proportion estimates and their closed form covariance matrix estimates are provided. Treatment comparisons over time are addressed with Mantel-Haenszel adjustment for a stratification factor and/or randomization-based adjustment for baseline covariables. The application of such sensitivity analyses is illustrated with an example. An appendix outlines an extension of the methodology to ordinal endpoints.
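
The redistribution idea in this record can be sketched as follows. This is a simplified illustration, not the paper's closed-form covariance method: a sensitivity parameter (here named `pi_fav`, an invented name) sets the fraction of missing outcomes counted as favorable, and sweeping it from 0 to 1 bounds the adjusted proportion.

```python
def adjusted_proportion(favorable, unfavorable, missing, pi_fav):
    """Redistribute missing counts: a fraction pi_fav of the missing
    outcomes is counted as favorable, the rest as unfavorable."""
    fav = favorable + pi_fav * missing
    total = favorable + unfavorable + missing
    return fav / total

# Sweep the sensitivity parameter from "all missing unfavorable" (0.0)
# to "all missing favorable" (1.0) for one hypothetical treatment arm.
bounds = [adjusted_proportion(40, 30, 10, p) for p in (0.0, 0.5, 1.0)]
# -> adjusted proportions 0.5, 0.5625, 0.625
```

Reporting the treatment comparison across this sweep shows how sensitive the conclusion is to assumptions about the missing-not-at-random mechanism.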

  19. Reporting of Positive Results in Randomized Controlled Trials of Mindfulness-Based Mental Health Interventions.

    Directory of Open Access Journals (Sweden)

    Stephanie Coronado-Montoya

    Full Text Available A large proportion of mindfulness-based therapy trials report statistically significant results, even in the context of very low statistical power. The objective of the present study was to characterize the reporting of "positive" results in randomized controlled trials of mindfulness-based therapy. We also assessed mindfulness-based therapy trial registrations for indications of possible reporting bias and reviewed recent systematic reviews and meta-analyses to determine whether reporting biases were identified. CINAHL, Cochrane CENTRAL, EMBASE, ISI, MEDLINE, PsycInfo, and SCOPUS databases were searched for randomized controlled trials of mindfulness-based therapy. The number of positive trials was described and compared to the number that would be expected if mindfulness-based therapy were similarly effective to individual therapy for depression. Trial registries were searched for mindfulness-based therapy registrations. CINAHL, Cochrane CENTRAL, EMBASE, ISI, MEDLINE, PsycInfo, and SCOPUS were also searched for mindfulness-based therapy systematic reviews and meta-analyses. 108 (87%) of 124 published trials reported ≥1 positive outcome in the abstract, and 109 (88%) concluded that mindfulness-based therapy was effective, 1.6 times the expected number of positive trials based on effect size d = 0.55 (expected number of positive trials = 65.7). Of 21 trial registrations, 13 (62%) remained unpublished 30 months post-trial completion. No trial registrations adequately specified a single primary outcome measure with time of assessment. None of 36 systematic reviews and meta-analyses concluded that effect estimates were overestimated due to reporting biases. The proportion of mindfulness-based therapy trials with statistically significant results may overstate what would occur in practice.

  20. Quantum random number generator based on quantum tunneling effect

    OpenAIRE

    Zhou, Haihan; Li, Junlin; Pan, Dong; Zhang, Weixing; Long, Guilu

    2017-01-01

    In this paper, we proposed an experimental implementation of a quantum random number generator (QRNG) based on the inherent randomness of the quantum tunneling effect of electrons. We exploited InGaAs/InP diodes, whose valence band and conduction band share a quasi-constant energy barrier. We applied a bias voltage on the InGaAs/InP avalanche diode, which made the diode work under Geiger mode, and triggered the tunneling events with a periodic pulse. Finally, after data collection and post-processing, our...
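
The abstract above does not say which post-processing was applied to the raw tunneling data. A classic choice for debiasing a raw bitstream from a physical source is the von Neumann extractor, sketched here purely as an illustration, not as the authors' method.

```python
def von_neumann_extract(bits):
    """Debias a raw bitstream: read non-overlapping pairs, output the
    first bit of each unequal pair ('01' -> 0, '10' -> 1), and discard
    equal pairs ('00', '11'). Removes bias at the cost of throughput."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

raw = [0, 1, 1, 1, 1, 0, 0, 0, 0, 1]
print(von_neumann_extract(raw))  # [0, 1, 0]
```

The output bits are unbiased provided the raw pairs are independent and identically distributed, which is why hardware generators typically combine such extraction with statistical testing.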

  1. Utility based maintenance analysis using a Random Sign censoring model

    International Nuclear Information System (INIS)

    Andres Christen, J.; Ruggeri, Fabrizio; Villa, Enrique

    2011-01-01

    Industrial systems subject to failures are usually inspected when there are evident signs of an imminent failure. Maintenance is therefore performed at a random time, somehow dependent on the failure mechanism. A competing risk model, namely a Random Sign model, is considered to relate failure and maintenance times. We propose a novel Bayesian analysis of the model and apply it to actual data from a water pump in an oil refinery. The design of an optimal maintenance policy is then discussed under a formal decision theoretic approach, analyzing the goodness of the current maintenance policy and making decisions about the optimal maintenance time.

  2. Community-based peer-led diabetes self-management: a randomized trial.

    Science.gov (United States)

    Lorig, Kate; Ritter, Philip L; Villa, Frank J; Armas, Jean

    2009-01-01

    The purpose of this study is to determine the effectiveness of a community-based diabetes self-management program comparing treatment participants to a randomized usual-care control group at 6 months. A total of 345 adults with type 2 diabetes but no criteria for high A1C were randomized to a usual-care control group or 6-week community-based, peer-led diabetes self-management program (DSMP). Randomized participants were compared at 6 months. The DSMP intervention participants were followed for an additional 6 months (12 months total). A1C and body mass index were measured at baseline, 6 months, and 12 months. All other data were collected by self-administered questionnaires. At 6 months, DSMP participants did not demonstrate improvements in A1C as compared with controls. Baseline A1C was much lower than in similar trials. Participants did have significant improvements in depression, symptoms of hypoglycemia, communication with physicians, healthy eating, and reading food labels (P < .01). They also had significant improvements in patient activation and self-efficacy. At 12 months, DSMP intervention participants continued to demonstrate improvements in depression, communication with physicians, healthy eating, patient activation, and self-efficacy (P < .01). There were no significant changes in utilization measures. These findings suggest that people with diabetes without elevated A1C can benefit from a community-based, peer-led diabetes program. Given the large number of people with diabetes and lack of low-cost diabetes education, the DSMP deserves consideration for implementation.

  3. Basic properties of the current-current correlation measure for random Schroedinger operators

    International Nuclear Information System (INIS)

    Hislop, Peter D.; Lenoble, Olivier

    2006-01-01

    The current-current correlation measure plays a crucial role in the theory of conductivity for disordered systems. We prove a Pastur-Shubin-type formula for the current-current correlation measure expressing it as a thermodynamic limit for random Schroedinger operators on the lattice and the continuum. We prove that the limit is independent of the self-adjoint boundary conditions and independent of a large family of expanding regions. We relate this finite-volume definition to the definition obtained by using the infinite-volume operators and the trace-per-unit volume

  4. How to Measure Motivational Interviewing Fidelity in Randomized Controlled Trials: Practical Recommendations.

    Science.gov (United States)

    Jelsma, Judith G M; Mertens, Vera-Christina; Forsberg, Lisa; Forsberg, Lars

    2015-07-01

    Many randomized controlled trials in which motivational interviewing (MI) is a key intervention make no provision for the assessment of treatment fidelity. This methodological shortcoming makes it impossible to distinguish between high- and low-quality MI interventions, and, consequently, to know whether MI provision has contributed to any intervention effects. This article makes some practical recommendations for the collection, selection, coding and reporting of MI fidelity data, as measured using the Motivational Interviewing Treatment Integrity Code. We hope that researchers will consider these recommendations and include MI fidelity measures in future studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Reconstruction of photon number conditioned states using phase randomized homodyne measurements

    International Nuclear Information System (INIS)

    Chrzanowski, H M; Assad, S M; Bernu, J; Hage, B; Lam, P K; Symul, T; Lund, A P; Ralph, T C

    2013-01-01

    We experimentally demonstrate the reconstruction of a photon number conditioned state without using a photon number discriminating detector. By using only phase randomized homodyne measurements, we reconstruct up to the three photon subtracted squeezed vacuum state. The reconstructed Wigner functions of these states show regions of pronounced negativity, signifying the non-classical nature of the reconstructed states. The techniques presented allow for complete characterization of the role of a conditional measurement on an ensemble of states, and might prove useful in systems where photon counting still proves technically challenging. (paper)

  6. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  7. Genetic diversity of Kenyan Prosopis populations based on random ...

    African Journals Online (AJOL)

    To determine whether naturally established stands consist of a single or mixture of species, six populations from Bamburi, Bura, Isiolo, Marigat, Taveta and Turkwel were compared for relatedness with reference to Prosopis chilensis, Prosopis juliflora and Prosopis pallida using random amplified polymorphic DNA markers.

  8. Genetic relationships among Rosa species based on random ...

    African Journals Online (AJOL)

    To investigate the genetic diversity of Rosa accessions, random amplified polymorphism DNA (RAPD) approach was employed. Nine of ten primers amplified 138 scorable RAPD loci with 111 polymorphic bands (80%). Percentages of polymorphic bands ranged from 75 to 100%. Sizes of amplified DNA fragments ranged ...

  9. Forecasting method in multilateration accuracy based on laser tracker measurement

    International Nuclear Information System (INIS)

    Aguado, Sergio; Santolaria, Jorge; Samper, David; José Aguilar, Juan

    2017-01-01

    Multilateration based on a laser tracker (LT) requires the measurement of a set of points from three or more positions. Although the LT's angular information is not used, multilateration produces a volume of measurement uncertainty. This paper presents two new coefficients from which to determine, before performing the necessary measurements, whether the measurement of a set of points will improve or worsen the accuracy of the multilateration results, avoiding unnecessary measurements and reducing the time and economic cost required. The first, the specific measurement coefficient (MC_LT), is unique to each laser tracker: it determines the relationship between the radial and angular laser tracker measurement noise. The second coefficient, β, is related to the specific measurement conditions: it captures the spatial angle α between the laser tracker positions and its effect on error reduction. Both parameters MC_LT and β are linked in the error-reduction limits. Besides these, a new methodology is presented to determine the multilateration error-reduction limit for both an ideal laser tracker distribution and a random one. It provides general rules and advice from synthetic tests that are validated through a real test carried out on a coordinate measuring machine. (paper)

  10. Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd

    2012-01-01

    Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…

  11. A USB-based time measurement system

    International Nuclear Information System (INIS)

    Qin Xi; Liu Shubin; An Qi

    2010-01-01

    In this paper, we report the electronics of a timing measurement system, the PTB (portable TDC board), a handy USB-based tool customized for high-precision time measurements without any crates. The time digitization is based on the High Performance TDC Chip (HPTDC). The real-time compensation for HPTDC outputs and the USB master logic are implemented in an ALTERA Cyclone FPGA. The architecture design and logic design are described in detail. Tests of the system showed a time resolution of 13.3 ps. (authors)

  12. Toward Measuring Network Aesthetics Based on Symmetry

    Directory of Open Access Journals (Sweden)

    Zengqiang Chen

    2017-05-01

    Full Text Available In this exploratory paper, we discuss quantitative graph-theoretical measures of network aesthetics. Related work in this area has typically focused on geometrical features (e.g., line crossings or edge bendiness of drawings or visual representations of graphs which purportedly affect an observer’s perception. Here we take a very different approach, abandoning reliance on geometrical properties, and apply information-theoretic measures to abstract graphs and networks directly (rather than to their visual representations) as a means of capturing classical appreciation of structural symmetry. Examples are used solely to motivate the approach to measurement, and to elucidate our symmetry-based mathematical theory of network aesthetics.

  13. Characterization of cervigram image sharpness using multiple self-referenced measurements and random forest classifiers

    Science.gov (United States)

    Jaiswal, Mayoore; Horning, Matt; Hu, Liming; Ben-Or, Yau; Champlin, Cary; Wilson, Benjamin; Levitz, David

    2018-02-01

    Cervical cancer is the fourth most common cancer among women worldwide and is especially prevalent in low resource settings due to lack of screening and treatment options. Visual inspection with acetic acid (VIA) is a widespread and cost-effective screening method for cervical pre-cancer lesions, but accuracy depends on the experience level of the health worker. Digital cervicography, capturing images of the cervix, enables review by an off-site expert or potentially a machine learning algorithm. These reviews require images of sufficient quality. However, image quality varies greatly across users. A novel algorithm was developed to evaluate the sharpness of images captured with MobileODT's digital cervicography device (EVA System), with the eventual aim of providing feedback to the health worker. The key challenges are that the algorithm evaluates only a single image of each cervix; it needs to be robust to the variability in cervix images and fast enough to run in real time on a mobile device; and the machine learning model needs to be small enough to fit in a mobile device's memory, train on a small imbalanced dataset, and run in real time. In this paper, the focus scores of a preprocessed image and a Gaussian-blurred version of the image are calculated using established methods and used as features. A feature selection metric is proposed to select the top features, which were then used in a random forest classifier to produce the final focus score. The resulting model, based on nine calculated focus scores, achieved significantly better accuracy than any single focus measure when tested on a holdout set of images. The area under the receiver operating characteristics curve was 0.9459.
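
The self-referenced feature idea in this record, comparing a focus score of the image with the same score on a blurred copy, can be sketched in pure Python. This is a toy under stated assumptions: variance of the Laplacian stands in for the paper's nine focus measures, and a box filter stands in for the Gaussian blur.

```python
def laplacian_variance(img):
    """Sharpness proxy: variance of the 4-neighbour Laplacian response
    over the interior pixels of a 2D grayscale image."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                   - 4 * img[y][x])
            vals.append(lap)
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def box_blur(img):
    """Cheap stand-in for a Gaussian blur: 3x3 box filter, interior only."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y+dy][x+dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return out

# A sharp horizontal edge: blurring it lowers the focus score, so the
# pair (score, blurred score) is a self-referenced feature vector.
sharp = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [9, 9, 9, 9, 9],
    [9, 9, 9, 9, 9],
    [9, 9, 9, 9, 9],
]
features = [laplacian_variance(sharp), laplacian_variance(box_blur(sharp))]
```

The drop from the first to the second score is what lets a classifier judge sharpness from a single image, without a reference photograph of the same cervix.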

  14. Accuracy of magnetic resonance based susceptibility measurements

    Science.gov (United States)

    Erdevig, Hannah E.; Russek, Stephen E.; Carnicka, Slavka; Stupic, Karl F.; Keenan, Kathryn E.

    2017-05-01

    Magnetic Resonance Imaging (MRI) is increasingly used to map the magnetic susceptibility of tissue to identify cerebral microbleeds associated with traumatic brain injury and pathological iron deposits associated with neurodegenerative diseases such as Parkinson's and Alzheimer's disease. Accurate measurements of susceptibility are important for determining oxygen and iron content in blood vessels and brain tissue for use in noninvasive clinical diagnosis and treatment assessments. Induced magnetic fields, with amplitudes on the order of 100 nT, can be detected using MRI phase images. The induced field distributions can then be inverted to obtain quantitative susceptibility maps. The focus of this research was to determine the accuracy of MRI-based susceptibility measurements using simple phantom geometries and to compare the susceptibility measurements with magnetometry measurements where SI-traceable standards are available. The susceptibilities of paramagnetic salt solutions in cylindrical containers were measured as a function of orientation relative to the static MRI field. The observed induced fields as a function of orientation of the cylinder were in good agreement with simple models. The MRI susceptibility measurements were compared with SQUID magnetometry using NIST-traceable standards. MRI can accurately measure relative magnetic susceptibilities while SQUID magnetometry measures absolute magnetic susceptibility. Given the accuracy of moment measurements of tissue mimicking samples, and the need to look at small differences in tissue properties, the use of existing NIST standard reference materials to calibrate MRI reference structures is problematic and better reference materials are required.

  15. A Correction of Random Incidence Absorption Coefficients for the Angular Distribution of Acoustic Energy under Measurement Conditions

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2009-01-01

    Most acoustic measurements are based on an assumption of ideal conditions. One such ideal condition is a diffuse and reverberant field. In practice, a perfectly diffuse sound field cannot be achieved in a reverberation chamber. Uneven incident energy density under measurement conditions can cause discrepancies between the measured value and the theoretical random incidence absorption coefficient. Therefore the angular distribution of the incident acoustic energy onto an absorber sample should be taken into account. The angular distribution of the incident energy density was simulated using the beam tracing method for various room shapes and source positions. The averaged angular distribution is found to be similar to a Gaussian distribution. As a result, an angle-weighted absorption coefficient was proposed by considering the angular energy distribution to improve the agreement between...

  16. Analysis of covariance with pre-treatment measurements in randomized trials: comparison of equal and unequal slopes.

    Science.gov (United States)

    Funatogawa, Ikuko; Funatogawa, Takashi

    2011-09-01

    In randomized trials, an analysis of covariance (ANCOVA) is often used to analyze post-treatment measurements with pre-treatment measurements as a covariate to compare two treatment groups. Random allocation guarantees only equal variances of pre-treatment measurements. We hence consider data with unequal covariances and variances of post-treatment measurements without assuming normality. Recently, we showed that the actual type I error rate of the usual ANCOVA assuming equal slopes and equal residual variances is asymptotically at a nominal level under equal sample sizes, and that of the ANCOVA with unequal variances is asymptotically at a nominal level, even under unequal sample sizes. In this paper, we investigated the asymptotic properties of the ANCOVA with unequal slopes for such data. The estimators of the treatment effect at the observed mean are identical between equal and unequal variance assumptions, and these are asymptotically normal estimators for the treatment effect at the true mean. However, the variances of these estimators based on standard formulas are biased, and the actual type I error rates are not at a nominal level, irrespective of variance assumptions. In equal sample sizes, the efficiency of the usual ANCOVA assuming equal slopes and equal variances is asymptotically the same as those of the ANCOVA with unequal slopes and higher than that of the ANCOVA with equal slopes and unequal variances. Therefore, the use of the usual ANCOVA is appropriate in equal sample sizes. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Expectation-based approach for one-dimensional randomly disordered phononic crystals

    International Nuclear Information System (INIS)

    Wu, Feng; Gao, Qiang; Xu, Xiaoming; Zhong, Wanxie

    2014-01-01

    An expectation-based approach to the statistical theorem is proposed for the one-dimensional randomly disordered phononic crystal. In the proposed approach, the expectations of the random eigenstates of randomly disordered phononic crystals are investigated. In terms of the expectations of the random eigenstates, the wave propagation and localization phenomenon in the random phononic crystal can be understood from a statistical perspective. Using the proposed approach, it is proved that for a randomly disordered phononic crystal, the Bloch theorem holds in the sense of expectation. A one-dimensional randomly disordered binary phononic crystal consisting of two materials with a random geometry size or random physical parameter is addressed by using the proposed approach. From the result, it can be observed that with the increase of the disorder degree, the localization of the expectations of the eigenstates is strengthened. The effect of the random disorder on the eigenstates at higher frequencies is more significant than that at lower frequencies. Furthermore, after introducing the random disorder into phononic crystals, some random divergent eigenstates are changed to localized eigenstates in the expectation sense.

  18. Morphological changes after pelvic floor muscle training measured by 3-dimensional ultrasonography: a randomized controlled trial.

    Science.gov (United States)

    Braekken, Ingeborg Hoff; Majida, Memona; Engh, Marie Ellström; Bø, Kari

    2010-02-01

    To investigate morphological and functional changes after pelvic floor muscle training in women with pelvic organ prolapse. This randomized controlled trial was conducted at a university hospital and a physical therapy clinic. One hundred nine women with pelvic organ prolapse stages I, II, and III were randomly allocated by a computer-generated random number system to pelvic floor muscle training (n=59) or control (n=50). Both groups received lifestyle advice and learned to contract the pelvic floor muscles before and during increases in intraabdominal pressure. In addition the pelvic floor muscle training group did individual strength training with a physical therapist and daily home exercise for 6 months. Primary outcome measures were pelvic floor muscle (pubovisceral muscle) thickness, levator hiatus area, pubovisceral muscle length at rest and Valsalva, and resting position of bladder and rectum, measured by three-dimensional ultrasonography. Seventy-nine percent of women in the pelvic floor muscle training group adhered to at least 80% of the training protocol. Compared with women in the control group, women in the pelvic floor muscle training group increased muscle thickness (difference between groups: 1.9 mm, 95% confidence interval [CI] 1.1-2.7) and pelvic floor muscle stiffness. Supervised pelvic floor muscle training can increase muscle volume, close the levator hiatus, shorten muscle length, and elevate the resting position of the bladder and rectum. www.clinicaltrials.gov, NCT00271297. I.

  19. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study

    Directory of Open Access Journals (Sweden)

    Ayça Utkan Karasu

    2018-03-01

    Full Text Available Objective: To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. Methods: During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Assessment Scale for Stroke Patients, Timed Up and Go Test and Static Balance Index. Secondary outcome measures were postural sway, as assessed with Emed-X, Functional Independence Measure Transfer and Ambulation Scores. An evaluator who was blinded to the groups made assessments immediately before (baseline), immediately after (post-treatment), and 4 weeks after completion of the study (follow-up). Results: Group-time interaction was significant in the Berg Balance Scale, Functional Reach Test, anteroposterior and mediolateral centre of pressure displacement with eyes open, anteroposterior centre of pressure displacement with eyes closed, centre of pressure displacement during weight shifting to affected side, to unaffected side and total centre of pressure displacement during weight shifting. Demonstrating significant group-time interaction in those parameters suggests that, while both groups exhibited significant improvement, the experimental group showed greater improvement than the control group. Conclusion: Virtual reality exercises with the Nintendo Wii system could represent a useful adjunctive therapy to traditional treatment to improve static and dynamic balance in stroke patients.

  20. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study.

    Science.gov (United States)

    Karasu, Ayça Utkan; Batur, Elif Balevi; Karataş, Gülçin Kaymak

    2018-05-08

    To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Assessment Scale for Stroke Patients, Timed Up and Go Test and Static Balance Index. Secondary outcome measures were postural sway, as assessed with Emed-X, Functional Independence Measure Transfer and Ambulation Scores. An evaluator who was blinded to the groups made assessments immediately before (baseline), immediately after (post-treatment), and 4 weeks after completion of the study (follow-up). Group-time interaction was significant in the Berg Balance Scale, Functional Reach Test, anteroposterior and mediolateral centre of pressure displacement with eyes open, anteroposterior centre of pressure displacement with eyes closed, centre of pressure displacement during weight shifting to affected side, to unaffected side and total centre of pressure displacement during weight shifting. Demonstrating significant group-time interaction in those parameters suggests that, while both groups exhibited significant improvement, the experimental group showed greater improvement than the control group. Virtual reality exercises with the Nintendo Wii system could represent a useful adjunctive therapy to traditional treatment to improve static and dynamic balance in stroke patients.

  1. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general, a simple random sampling procedure is commonly used to estimate the inequality indices and their related inference. Although the key condition that the samples must be drawn via a simple random sampling procedure makes calculations much simpler, this assumption is often violated in practice, as the data does not always yield simple random ...
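
For reference, the plug-in estimator of the Gini index from a simple random sample can be written directly from its mean-absolute-difference definition. This is a sketch of the baseline estimator only; the ranked set and systematic sampling estimators the record compares are not reproduced here.

```python
def gini(sample):
    """Plug-in Gini index from a simple random sample of incomes:
    mean absolute difference between all pairs, divided by twice
    the mean. Ranges from 0 (equality) toward 1 (inequality)."""
    n = len(sample)
    mean = sum(sample) / n
    mad = sum(abs(x - y) for x in sample for y in sample) / (n * n)
    return mad / (2 * mean)

print(gini([10, 10, 10, 10]))  # 0.0 (perfect equality)
print(gini([1, 1, 1, 97]))     # 0.72 (highly unequal sample)
```

The O(n²) pairwise sum is fine for illustration; for large samples the same estimator is usually computed from the sorted values in O(n log n).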

  2. An efficient binomial model-based measure for sequence comparison and its application.

    Science.gov (United States)

    Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong

    2011-04-01

    Sequence comparison is one of the major tasks in bioinformatics, which could serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presented a binomial model-based measure to analyze biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a random Bernoulli variable, and the distribution of a sum of the word occurrences is well known to be a binomial one. By using a recursive formula, we computed the binomial probability of the word count and proposed a binomial model-based measure based on the relative entropy. The proposed measure was tested by extensive experiments including classification of HEV genotypes and phylogenetic analysis, and further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed measure based on the binomial model is more efficient.
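
The binomial machinery described in this record can be sketched as follows. The recursion used here is the standard one for the binomial pmf (the paper's exact formula is not reproduced), and the relative-entropy comparison uses illustrative word probabilities, not real sequence data.

```python
import math

def binom_pmf(n, p):
    """P(X = k) for k = 0..n via the standard recursion
    P(k) = P(k-1) * (n-k+1)/k * p/(1-p), seeded at P(0) = (1-p)^n."""
    probs = [(1 - p) ** n]
    for k in range(1, n + 1):
        probs.append(probs[-1] * (n - k + 1) / k * p / (1 - p))
    return probs

def relative_entropy(p_dist, q_dist, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) between two word-count
    distributions (a dissimilarity, not a symmetric metric)."""
    return sum(p * math.log((p + eps) / (q + eps))
               for p, q in zip(p_dist, q_dist))

# Hypothetical example: a word occurs with probability 0.1 in sequence A
# and 0.3 in sequence B, over n = 20 positions.
d = relative_entropy(binom_pmf(20, 0.1), binom_pmf(20, 0.3))
```

In the paper's setting such divergences would be accumulated over many words to produce the overall dissimilarity between two sequences.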

  3. A homodyne detector integrated onto a photonic chip for measuring quantum states and generating random numbers

    Science.gov (United States)

    Raffaelli, Francesco; Ferranti, Giacomo; Mahler, Dylan H.; Sibson, Philip; Kennard, Jake E.; Santamato, Alberto; Sinclair, Gary; Bonneau, Damien; Thompson, Mark G.; Matthews, Jonathan C. F.

    2018-04-01

    Optical homodyne detection has found use as a characterisation tool in a range of quantum technologies. So far implementations have been limited to bulk optics. Here we present the optical integration of a homodyne detector onto a silicon photonics chip. The resulting device operates at high speed, up to 150 MHz; it is compact; and it operates with low noise, quantified with 11 dB clearance between shot noise and electronic noise. We perform on-chip quantum tomography of coherent states with the detector and show that it meets the requirements for characterising more general quantum states of light. We also show that the detector is able to produce quantum random numbers at a rate of 1.2 Gbps, by measuring the vacuum state of the electromagnetic field and applying off-line post processing. The produced random numbers pass all the statistical tests provided by the NIST test suite.

  4. Implementation of IMAGE STEGANOGRAPHY Based on Random LSB

    OpenAIRE

    Ashish kumari; Shyama Sharma; Navdeep Bohra

    2012-01-01

    Steganography is the technique of hiding a private message within a file in such a manner that a third party cannot know the existence of the hidden information. The purpose of steganography is to create secret communication between the sender and the receiver by replacing the least significant bits (LSB) of the cover image with the data bits. In this paper we have shown how image steganography (random and sequential LSB) works and give a practical understanding of what image Stegano...
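
    As an illustration of the random-LSB idea described above, here is a sketch under stated assumptions, not the authors' implementation: a flat list of pixel values stands in for the image, and a seeded PRNG shuffle stands in for the secret embedding order shared between sender and receiver.

```python
import random

def embed(pixels, message_bits, seed):
    """Hide message_bits in the LSBs of pseudorandomly chosen pixels.
    The seed is the shared stego key: the same seed reproduces the same
    pixel ordering at extraction time."""
    pixels = pixels[:]                       # leave the cover untouched
    positions = list(range(len(pixels)))
    random.Random(seed).shuffle(positions)   # secret traversal order
    for bit, pos in zip(message_bits, positions):
        pixels[pos] = (pixels[pos] & ~1) | bit   # overwrite the LSB
    return pixels

def extract(pixels, n_bits, seed):
    positions = list(range(len(pixels)))
    random.Random(seed).shuffle(positions)
    return [pixels[pos] & 1 for pos in positions[:n_bits]]

cover = [200, 13, 55, 254, 7, 128, 99, 42]   # toy 8-pixel "image"
stego = embed(cover, [1, 0, 1, 1], seed=1234)
print(extract(stego, 4, seed=1234))  # [1, 0, 1, 1]
```

    Each touched pixel changes by at most one intensity level, which is what makes LSB embedding visually imperceptible in real images.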

  5. Deep Random based Key Exchange protocol resisting unlimited MITM

    OpenAIRE

    de Valroger, Thibault

    2018-01-01

    We present a protocol enabling two legitimate partners sharing an initial secret to mutually authenticate and to exchange an encryption session key. The opponent is an active Man In The Middle (MITM) with unlimited computation and storage capacities. The resistance to unlimited MITM is obtained through the combined use of Deep Random secrecy, formerly introduced and proved as unconditionally secure against passive opponent for key exchange, and universal hashing techniques. We prove the resis...

  6. Improving Pulse Rate Measurements during Random Motion Using a Wearable Multichannel Reflectance Photoplethysmograph

    Directory of Open Access Journals (Sweden)

    Kristen M. Warren

    2016-03-01

    Full Text Available Photoplethysmographic (PPG) waveforms are used to acquire pulse rate (PR) measurements from pulsatile arterial blood volume. PPG waveforms are highly susceptible to motion artifacts (MA), limiting the implementation of PR measurements in mobile physiological monitoring devices. Previous studies have shown that multichannel photoplethysmograms can successfully acquire diverse signal information during simple, repetitive motion, leading to differences in motion tolerance across channels. In this paper, we investigate the performance of a custom-built multichannel forehead-mounted photoplethysmographic sensor under a variety of intense motion artifacts. We introduce an advanced multichannel template-matching algorithm that chooses the channel with the least motion artifact to calculate PR for each time instant. We show that for a wide variety of random motion, channels respond differently to motion artifacts, and the multichannel estimate outperforms single-channel estimates in terms of motion tolerance, signal quality, and PR errors. We have acquired 31 data sets consisting of PPG waveforms corrupted by random motion and show that the accuracy of PR measurements achieved was increased by up to 2.7 bpm when the multichannel-switching algorithm was compared to individual channels. The percentage of PR measurements with error ≤ 5 bpm during motion increased by 18.9% when the multichannel switching algorithm was compared to the mean PR from all channels. Moreover, our algorithm enables automatic selection of the best signal fidelity channel at each time point among the multichannel PPG data.
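
    The channel-switching step can be pictured as follows. This is a simplified, hypothetical sketch, not the authors' algorithm: normalised cross-correlation against a pulse template serves as the per-channel signal-quality score, and the best-scoring channel is selected for the window.

```python
import numpy as np

def best_channel_segment(channels, template):
    """Pick, for one time window, the channel whose waveform best matches
    a pulse template (normalised correlation at zero lag).
    `channels` is an (n_channels, n_samples) array."""
    def ncc(x, y):
        x = (x - x.mean()) / (x.std() + 1e-12)
        y = (y - y.mean()) / (y.std() + 1e-12)
        return float(np.mean(x * y))
    scores = [ncc(ch, template) for ch in channels]
    best = int(np.argmax(scores))
    return best, scores[best]

t = np.linspace(0, 1, 100)
template = np.sin(2 * np.pi * 1.5 * t)               # idealised pulse shape
rng = np.random.default_rng(0)
corrupt = template + 2.0 * rng.standard_normal(100)  # motion-corrupted channel
clean = template + 0.1 * rng.standard_normal(100)    # good channel
idx, score = best_channel_segment(np.vstack([corrupt, clean]), template)
print(idx)  # 1: the clean channel is selected
```

    Running this per window and stitching together the selected channels gives a single PR estimate that tolerates motion better than any fixed channel.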

  7. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  8. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics-based on finitely correlated or projected entangled pair states-to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  9. Ellipsometry measurements of glass transition breadth in bulk films of random, block, and gradient copolymers.

    Science.gov (United States)

    Mok, M M; Kim, J; Marrou, S R; Torkelson, J M

    2010-03-01

    Bulk films of random, block, and gradient copolymer systems were studied using ellipsometry to demonstrate the applicability of the numerical differentiation technique pioneered by Kawana and Jones for studying the glass transition temperature (Tg) behavior and thermal expansivities of copolymers possessing different architectures and different levels of nanoheterogeneity. In a series of styrene/n-butyl methacrylate (S/nBMA) random copolymers, Tg breadths were observed to increase from approximately 17 °C in styrene-rich cases to almost 30 °C in nBMA-rich cases, reflecting previous observations of significant nanoheterogeneity in PnBMA homopolymers. The derivative technique also revealed for the first time a substantial increase in glassy-state expansivity with increasing nBMA content in S/nBMA random copolymers, from 1.4 × 10^-4 K^-1 in PS to 3.5 × 10^-4 K^-1 in PnBMA. The first characterization of block copolymer Tg's and Tg breadths by ellipsometry is given, examining the impact of nanophase-segregated copolymer structure on ellipsometric measurements of the glass transition. The results show that, while the technique is effective in detecting the two Tg's expected in certain block copolymer systems, the details of the glass transition can become suppressed in ellipsometry measurements of a rubbery minor phase under conditions where the matrix is glassy; meanwhile, both transitions are easily discernible by differential scanning calorimetry. Finally, broad glass transition regions were measured in gradient copolymers, yielding in some cases extraordinary Tg breadths of 69-71 °C, factors of 4-5 larger than the Tg breadths of related homopolymers and random copolymers. Surprisingly, one gradient copolymer demonstrated a slightly narrower Tg breadth than the S/nBMA random copolymers with the highest nBMA content. This highlights the fact that nanoheterogeneity relevant to the glass transition response in selected

  10. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into the short period cycles caused by digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have proved that our method can considerably improve the randomness of the generated keystreams. In order to incorporate our randomness-enhancement technique, only 41 extra slices were needed, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.
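
    As a purely software illustration of the two ingredients named above, a skew tent map keystream generator plus a periodic state perturbation to break short cycles, consider the following sketch. The map parameters, perturbation rule, and period are illustrative assumptions, not the paper's FPGA design.

```python
def skew_tent(x, p):
    """Skew tent map on [0, 1) with break point p."""
    return x / p if x < p else (1.0 - x) / (1.0 - p)

def keystream(n_bits, x0=0.3, p=0.41, perturb_every=64):
    """Chaos-based keystream with a periodic state perturbation: every
    `perturb_every` steps a tiny counter-derived nudge is injected so a
    finite-precision orbit cannot settle into a short cycle."""
    x, bits = x0, []
    for i in range(n_bits):
        x = skew_tent(x, p)
        if i % perturb_every == perturb_every - 1:
            x = (x + (i * 1e-7) % 1e-4) % 1.0   # small, state-independent nudge
        bits.append(1 if x >= 0.5 else 0)
    return bits

ks = keystream(1024)
print(sum(ks) / len(ks))  # fraction of ones, close to 0.5
```

    Because the skew tent map has a uniform invariant density, thresholding at 0.5 yields a roughly balanced bit stream; the nudge only matters once finite precision would otherwise trap the orbit.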

  11. Effects of curriculum-based measurement on teachers' instructional planning.

    Science.gov (United States)

    Fuchs, L S; Fuchs, D; Stecker, P M

    1989-01-01

    This study assessed the effects of curriculum-based measurement (CBM) on teachers' instructional planning. Subjects were 30 teachers, assigned randomly to a computer-assisted CBM group, a noncomputer CBM group, and a contrast group. In the CBM groups, teachers specified 15-week reading goals, established CBM systems to measure student progress toward goals at least twice weekly, and systematically evaluated those databases to determine when instructional modifications were necessary. Contrast teachers monitored student progress toward Individualized Education Program (IEP) goals as they wished and were encouraged to develop instructional programs as necessary. At the end of a 12- to 15-week implementation period, teachers completed a questionnaire with reference to one randomly selected pupil. Analyses of variance indicated no difference between the CBM groups. However, compared to the contrast group, CBM teachers (a) used more specific, acceptable goals; (b) were less optimistic about goal attainment; (c) cited more objective and frequent data sources for determining the adequacy of student progress and for deciding whether program modifications were necessary; and (d) modified student programs more frequently. Questionnaire responses were correlated with verifiable data sources, and results generally supported the usefulness of the self-report information. Implications for special education research and practice are discussed.

  12. Estimating random transverse velocities in the fast solar wind from EISCAT Interplanetary Scintillation measurements

    Directory of Open Access Journals (Sweden)

    A. Canals

    2002-09-01

    Full Text Available Interplanetary scintillation measurements can yield estimates of a large number of solar wind parameters, including bulk flow speed, variation in bulk velocity along the observing path through the solar wind and random variation in transverse velocity. This last parameter is of particular interest, as it can indicate the flux of low-frequency Alfvén waves, and the dissipation of these waves has been proposed as an acceleration mechanism for the fast solar wind. Analysis of IPS data is, however, a significantly unresolved problem and a variety of a priori assumptions must be made in interpreting the data. Furthermore, the results may be affected by the physical structure of the radio source and by variations in the solar wind along the scintillation ray path. We have used observations of simple point-like radio sources made with EISCAT between 1994 and 1998 to obtain estimates of random transverse velocity in the fast solar wind. The results obtained with various a priori assumptions made in the analysis are compared, and we hope thereby to be able to provide some indication of the reliability of our estimates of random transverse velocity and the variation of this parameter with distance from the Sun.

    Key words. Interplanetary physics (MHD waves and turbulence; solar wind plasma; instruments and techniques)

  14. Does level of specificity affect measures of motivation to comply? A randomized evaluation.

    Science.gov (United States)

    Branscum, Paul; Senkowski, Valerie

    2018-05-30

    The theory of planned behavior (TPB) is a popular value-expectancy model in social and behavioral health. Motivation to comply, one of the theory's constructs, has not been well operationalized and measured in the past, and to date, there has been no assessment of whether level of specificity affects the measurement of the construct. The purpose of this study was to measure the motivation to comply construct across four domains (from general to TACT-behavior specific) and evaluate the potential impact the differences have when identifying determinants of generalized injunctive norms. Students (n = 234) attending a large southwestern university completed a TPB survey related to sleep and physical activity, and were randomized to one of four domains that measured motivation to comply (General domain, n = 58; Health domain, n = 60; Behavioral domain, n = 56; and TACT domain, n = 60). Across both behaviors, motivation to comply measurements did not appear to be affected by changing the level of specificity. Referents for sleep and physical activity were mostly significant, but the effects were small to medium. Future researchers should consider removing motivation to comply measures from TPB surveys to reduce respondent burden or find alternative ways of measuring the construct.

  15. Intensified follow-up in colorectal cancer patients using frequent Carcino-Embryonic Antigen (CEA) measurements and CEA-triggered imaging : Results of the randomized "CEAwatch" trial

    NARCIS (Netherlands)

    Verberne, C. J.; Zhan, Z.; van den Heuvel, E.; Grossmann, I.; Doornbos, P. M.; Havenga, K.; Manusama, E.; Klaase, J.; van der Mijle, H. C. J.; Lamme, B.; Bosscha, K.; Baas, P.; van Ooijen, B.; Nieuwenhuijzen, G.; Marinelli, A.; van der Zaag, E.; Wasowicz, D.; de Bock, G. H.; Wiggers, T.

    Aim: The value of frequent Carcino-Embryonic Antigen (CEA) measurements and CEA-triggered imaging for detecting recurrent disease in colorectal cancer (CRC) patients was investigated in search of an evidence-based follow-up protocol. Methods: This is a randomized-controlled multicenter prospective

  16. A randomized controlled trial of smartphone-based mindfulness training for smoking cessation: a study protocol.

    Science.gov (United States)

    Garrison, Kathleen A; Pal, Prasanta; Rojiani, Rahil; Dallery, Jesse; O'Malley, Stephanie S; Brewer, Judson A

    2015-04-14

    Tobacco use is responsible for the death of about 1 in 10 individuals worldwide. Mindfulness training has shown preliminary efficacy as a behavioral treatment for smoking cessation. Recent advances in mobile health suggest advantages to smartphone-based smoking cessation treatment, including smartphone-based mindfulness training. This study evaluates the efficacy of a smartphone app-based mindfulness training program for improving smoking cessation rates at 6-month follow-up. A two-group parallel-randomized clinical trial with allocation concealment will be conducted. Group assignment will be concealed from study researchers through to follow-up. The study will be conducted by smartphone and online. Daily smokers who are interested in quitting smoking and own a smartphone (n = 140) will be recruited through study advertisements posted online. After completion of a baseline survey, participants will be allocated randomly to the control or intervention group. Participants in both groups will receive a 22-day smartphone-based treatment program for smoking. Participants in the intervention group will receive mobile mindfulness training plus experience sampling. Participants in the control group will receive experience sampling only. The primary outcome measure will be one-week point prevalence abstinence from smoking (at 6-month follow-up) assessed using carbon monoxide breath monitoring, which will be validated through smartphone-based video chat. This is the first intervention study to evaluate smartphone-based delivery of mindfulness training for smoking cessation. Such an intervention may provide treatment in-hand, in real-world contexts, to help individuals quit smoking. ClinicalTrials.gov NCT02134509. Registered 7 May 2014.

  17. Link-Based Similarity Measures Using Reachability Vectors

    Directory of Open Access Journals (Sweden)

    Seok-Ho Yoon

    2014-01-01

    Full Text Available We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach, each target object is represented by a vector. Each element of the vector corresponds to one of the objects in the given data, and the value of each element denotes the weight for the corresponding object. For this weight value, we propose to utilize the probability of reaching the corresponding object from the target object, computed using the "Random Walk with Restart" strategy. Then, we define the similarity between two objects as the cosine similarity of their two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets: scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures.
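
    A minimal sketch of the described pipeline: compute a Random Walk with Restart reachability vector per node and take the cosine of two such vectors. The restart probability, iteration count, and toy graph are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def rwr_vector(adj, start, restart=0.15, iters=100):
    """Reachability vector of `start` via Random Walk with Restart: the
    stationary probability of finding the walker at each node when it
    follows edges and teleports back to `start` with prob. `restart`."""
    n = adj.shape[0]
    # column-stochastic transition matrix (column j = moves out of node j)
    P = adj / np.maximum(adj.sum(axis=0, keepdims=True), 1e-12)
    e = np.zeros(n); e[start] = 1.0
    r = e.copy()
    for _ in range(iters):
        r = (1 - restart) * P @ r + restart * e
    return r

def link_similarity(adj, u, v):
    """Cosine similarity of the two nodes' reachability vectors."""
    ru, rv = rwr_vector(adj, u), rwr_vector(adj, v)
    return float(ru @ rv / (np.linalg.norm(ru) * np.linalg.norm(rv)))

# toy graph: nodes 0, 1, 2 form a triangle; node 3 hangs off node 2
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(link_similarity(A, 0, 1))  # high: same neighbourhood
print(link_similarity(A, 0, 3))  # lower: structurally different
```

    Nodes with similar link structure produce nearly parallel reachability vectors, so their cosine similarity is high even when they are not directly linked.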

  18. Phonon structures of GaN-based random semiconductor alloys

    Science.gov (United States)

    Zhou, Mei; Chen, Xiaobin; Li, Gang; Zheng, Fawei; Zhang, Ping

    2017-12-01

    Accurate modeling of thermal properties is strikingly important for developing next-generation electronics with high performance. Many thermal properties are closely related to phonon dispersions, such as sound velocity. However, random substituted semiconductor alloys AxB1-x usually lack translational symmetry, and simulation with periodic boundary conditions often requires large supercells, which makes phonon dispersion highly folded and hardly comparable with experimental results. Here, we adopt a large supercell with randomly distributed A and B atoms to investigate the substitution effect on the phonon dispersions of semiconductor alloys systematically by using the phonon unfolding method [F. Zheng, P. Zhang, Comput. Mater. Sci. 125, 218 (2016)]. The results reveal the extent to which phonon band characteristics in (In,Ga)N and Ga(N,P) are preserved or lost at different compositions and q points. Generally, most characteristics of phonon dispersions can be preserved with indium substitution of gallium in GaN, while substitution of nitrogen with phosphorus strongly perturbs the phonon dispersion of GaN, showing a rapid disintegration of the Bloch characteristics of optical modes and introducing localized impurity modes. In addition, the sound velocities of both (In,Ga)N and Ga(N,P) display a nearly linear behavior as a function of substitution compositions. Supplementary material in the form of one pdf file is available from the Journal web page at https://doi.org/10.1140/epjb/e2017-80481-0.

  19. Embedded Memory Hierarchy Exploration Based on Magnetic Random Access Memory

    Directory of Open Access Journals (Sweden)

    Luís Vitório Cargnini

    2014-08-01

    Full Text Available Static random access memory (SRAM) is the most commonly employed semiconductor in the design of on-chip processor memory. However, it is unlikely that the SRAM technology will have a cell size that will continue to scale below 45 nm, due to the leakage current that is caused by the quantum tunneling effect. Magnetic random access memory (MRAM) is a candidate technology to replace SRAM, assuming appropriate dimensioning given an operating threshold voltage. The write current of spin transfer torque (STT) MRAM is a known limitation; however, this has been recently mitigated by leveraging perpendicular magnetic tunneling junctions. In this article, we present a comprehensive comparison of spin transfer torque MRAM (STT-MRAM) and SRAM cache set banks. The non-volatility of STT-MRAM allows the definition of new instant on/off policies and leakage current optimizations. Through our experiments, we demonstrate that STT-MRAM is a candidate for the memory hierarchy of embedded systems, due to the higher densities and reduced leakage of MRAM. We demonstrate that adopting STT-MRAM in L1 and L2 caches mitigates the impact of higher write latencies and increased current draw due to the use of MRAM. With the correct system-on-chip (SoC) design, we believe that STT-MRAM is a viable alternative to SRAM, which minimizes leakage current and the total power consumed by the SoC.

  20. A SVD Based Image Complexity Measure

    DEFF Research Database (Denmark)

    Gustafsson, David Karl John; Pedersen, Kim Steenstrup; Nielsen, Mads

    2009-01-01

    Images are composed of geometric structures and texture, and different image processing tools - such as denoising, segmentation and registration - are suitable for different types of image contents. Characterization of the image content in terms of geometric structure and texture is an important problem that one is often faced with. We propose a patch-based complexity measure, based on how well the patch can be approximated using singular value decomposition. As such the image complexity is determined by the complexity of the patches. The concept is demonstrated on sequences from the newly collected DIKU Multi-Scale image database.
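
    One plausible reading of such a measure, not necessarily the authors' exact formulation, scores a patch by how much of its spectral energy a low-rank SVD approximation fails to capture: near zero for flat or simple geometric patches, larger for texture.

```python
import numpy as np

def patch_complexity(patch, k=1):
    """Complexity of an image patch as the fraction of spectral energy
    *not* captured by its rank-k SVD approximation: ~0 for a simple
    geometric patch, closer to 1 for texture or noise."""
    s = np.linalg.svd(patch, compute_uv=False)   # singular values
    energy = np.sum(s ** 2)
    return 1.0 - np.sum(s[:k] ** 2) / (energy + 1e-12)

flat = np.outer(np.arange(8), np.ones(8))        # rank-1 ramp: simple
rng = np.random.default_rng(0)
noise = rng.standard_normal((8, 8))              # full-rank: complex
print(patch_complexity(flat))    # ≈ 0
print(patch_complexity(noise))   # substantially larger
```

    Averaging such patch scores over an image then yields a global complexity map that can route regions to structure-oriented or texture-oriented processing.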

  1. Ordinal-Measure Based Shape Correspondence

    Directory of Open Access Journals (Sweden)

    Faouzi Alaya Cheikh

    2002-04-01

    Full Text Available We present a novel approach to shape similarity estimation based on distance transformation and ordinal correlation. The proposed method operates in three steps: object alignment, contour to multilevel image transformation, and similarity evaluation. This approach is suitable for use in shape classification, content-based image retrieval and performance evaluation of segmentation algorithms. The latter two applications are addressed in this paper. Simulation results show that in both applications our proposed measure performs quite well in quantifying shape similarity. The scores obtained using this technique reflect well the correspondence between object contours as humans perceive it.

  2. Web-Based Cognitive Remediation Improves Supported Employment Outcomes in Severe Mental Illness: Randomized Controlled Trial.

    Science.gov (United States)

    Harris, Anthony Wf; Kosic, Tanya; Xu, Jean; Walker, Chris; Gye, William; Redoblado Hodge, Antoinette

    2017-09-20

    Finding work is a top priority for most people; however, this goal remains out of reach for the majority of individuals with a severe mental illness (SMI) who remain on benefits or are unemployed. Supported employment (SE) programs aimed at returning people with a severe mental illness to work are successful; however, they still leave a significant number of people with severe mental illness unemployed. Cognitive deficits are commonly found in SMI and are a powerful predictor of poor outcome. Fortunately, these deficits are amenable to treatment with cognitive remediation therapy (CRT) that significantly improves cognition in SMI. CRT combined with SE significantly increases the likelihood of individuals with severe mental illness obtaining and staying in work. However, the availability of CRT is limited in many settings. The aim of this study was to examine whether Web-based CRT combined with a SE program can improve the rate of return to work of people with severe mental illness. A total of 86 people with severe mental illness (mean age 39.6 years; male: n=55) who were unemployed and who had joined a SE program were randomized to either a Web-based CRT program (CogRem) or an Internet-based control condition (WebInfo). The primary outcome measured was hours worked over 6 months post-treatment. At 6 months, those participants randomized to CogRem had worked significantly more hours (P=.01) and had earned significantly more money (P=.03) than those participants randomized to the WebInfo control condition. No change was observed in cognition. This study corroborates other work that has found a synergistic effect of combining CRT with a SE program and extends this to the use of Web-based CRT. The lack of any improvement in cognition obscures the mechanism by which an improved wage outcome for participants randomized to the active treatment was achieved. However, the study substantially lowers the barrier to the deployment of CRT with other psychosocial interventions for

  3. Evaluation of a Drowning Prevention Program Based on Testimonial Videos: A Randomized Controlled Trial.

    Science.gov (United States)

    Shen, Jiabin; Pang, Shulan; Schwebel, David C

    2016-06-01

    Unintentional drowning is the most common cause of childhood death in rural China. Global intervention efforts offer mixed results regarding the efficacy of educational programs. Using a randomized controlled design, we evaluated a testimonial-based intervention to reduce drowning risk among 280 3rd- and 4th-grade rural Chinese children. Children were randomly assigned to view either testimonials on drowning risk (intervention) or dog-bite risk (control). Safety knowledge and perceived vulnerability were measured by self-report questionnaires, and simulated behaviors in and near water were assessed with a culturally appropriate dollhouse task. Children in the intervention group showed improved safety knowledge and simulated behaviors, but not perceived vulnerability, compared with controls. The testimonial-based intervention's efficacy appears promising, as it improved safety knowledge and simulated risk behaviors with water among rural Chinese children. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Green maritime transportation: Market based measures

    DEFF Research Database (Denmark)

    Psaraftis, Harilaos N.

    2016-01-01

    The purpose of this chapter is to introduce the concept of Market Based Measures (MBMs) to reduce greenhouse gas (GHG) emissions from ships, and review several distinct MBM proposals that have been under consideration by the International Maritime Organization (IMO). The chapter discusses the mechanisms used by MBMs, and explores how the concept of the Marginal Abatement Cost (MAC) can be linked to MBMs. It also attempts to discuss the pros and cons of the submitted proposals.

  5. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  6. Quantum random number generator based on quantum nature of vacuum fluctuations

    Science.gov (United States)

    Ivanova, A. E.; Chivilikhin, S. A.; Gleim, A. V.

    2017-11-01

    Quantum random number generators (QRNGs) allow obtaining true random bit sequences. In a QRNG based on the quantum nature of vacuum, an optical beam splitter with two inputs and two outputs is normally used. We compare mathematical descriptions of a spatial beam splitter and a fiber Y-splitter in the quantum model for a QRNG based on homodyne detection. These descriptions proved identical, which allows fiber Y-splitters to be used in practical QRNG schemes, simplifying the setup. We also derive relations between the input radiation and the resulting differential current in the homodyne detector. We experimentally demonstrate the possibility of true random bit generation by using a QRNG based on homodyne detection with a Y-splitter.
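
    A toy software model of the harvesting step, with a pseudorandom Gaussian standing in for the quantum vacuum noise on the homodyne detector's differential current (so the output here is of course not truly random), might look like this; the calibration scheme is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng()  # stand-in for the physical noise source

def homodyne_bits(n, calibration=10_000):
    """Toy model of the harvesting stage of a vacuum-fluctuation QRNG:
    the differential current of a balanced homodyne detector measuring
    vacuum is ideally zero-mean Gaussian, and one bit is harvested per
    sample by comparing it with a calibrated threshold."""
    threshold = np.median(rng.standard_normal(calibration))
    samples = rng.standard_normal(n)        # simulated differential current
    return (samples > threshold).astype(int)

bits = homodyne_bits(10_000)
print(bits.mean())  # close to 0.5 for a well-centred threshold
```

    In a real device the raw bits still carry classical-noise bias and correlations, which is why the abstract's off-line post-processing (randomness extraction) step is needed before the NIST tests.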

  7. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  8. Bioimpedance measurement based evaluation of wound healing.

    Science.gov (United States)

    Kekonen, Atte; Bergelin, Mikael; Eriksson, Jan-Erik; Vaalasti, Annikki; Ylänen, Heimo; Viik, Jari

    2017-06-22

    Our group has developed a bipolar bioimpedance measurement-based method for determining the state of wound healing. The objective of this study was to assess the capability of the method. To assess the performance of the method, we arranged a follow-up study of four acute wounds. The wounds were measured using the method and photographed throughout the healing process. Initially, the bioimpedance of the wounds was significantly lower than the impedance of the undamaged skin, used as a baseline. Gradually, as healing progressed, the wound impedance increased and finally reached the impedance of the undamaged skin. The clinical appearance of the wounds examined in this study corresponded well with the parameters derived from the bioimpedance data. Hard-to-heal wounds are a significant and growing socioeconomic burden, especially in the developed countries, due to aging populations and the increasing prevalence of various lifestyle-related diseases. The assessment and monitoring of chronic wounds are mainly based on visual inspection by medical professionals. The dressings covering the wound must be removed before assessment; this may disturb the wound healing process and significantly increases the work effort of the medical staff. There is a need for an objective and quantitative method for determining the status of a wound without removing the wound dressings. This study provided evidence of the capability of the bioimpedance-based method for assessing wound status. In the future, measurements with the method should be extended to hard-to-heal wounds.

  9. On the design of Henon and logistic map-based random number generator

    Science.gov (United States)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) method is one approach to generating the key sequence. The randomness sources of TRNGs fall into three main groups: electrical noise based, jitter based, and chaos based. The chaos-based approach utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed and simulated in LabVIEW. The principle of the design is to combine 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as the harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy and passed all NIST 800.22 statistical tests.
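
    The comparator harvesting described above can be sketched for the 1D (logistic-map) half of such a design; the parameter r = 3.99, the seed, and the 0.5 threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

def logistic_bits(x0: float, n: int, r: float = 3.99) -> np.ndarray:
    """Iterate the logistic map x -> r*x*(1-x), emitting 1 when x > 0.5."""
    bits = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        bits[i] = 1 if x > 0.5 else 0
    return bits

# A chaotic trajectory spends comparable time on both sides of the threshold,
# so the raw bitstream is roughly balanced even before post-processing.
bits = logistic_bits(0.123456, 10_000)
```

    The full design additionally mixes in a 2D (Henon) system; entropy estimation and the NIST 800.22 suite would then be applied to the combined stream.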

  10. High-speed true random number generation based on paired memristors for security electronics

    Science.gov (United States)

    Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru

    2017-11-01

    True random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and internet of things. Here we demonstrate a TRNG using intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaOx/Ta memristors fabricated in this work have fast programming/erasing speeds of ~30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically-implemented random number generation.
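
    The compare-and-alternate idea can be mimicked with a toy statistical model; the lognormal resistance spread and the 5% mean mismatch below are assumed numbers for illustration, not measured device data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
# Per-cycle off-state resistances of the two devices; the deliberate mean
# mismatch mimics a device asymmetry that would bias a naive comparator.
r_a = rng.lognormal(mean=np.log(1.00e6), sigma=0.3, size=n)
r_b = rng.lognormal(mean=np.log(1.05e6), sigma=0.3, size=n)

raw = (r_a > r_b).astype(np.uint8)  # biased toward 0 by the mismatch
# Alternating read: swap the comparator inputs on every other cycle, which
# flips the sign of the systematic offset and cancels the bias on average.
alt = raw.copy()
alt[1::2] ^= 1
```

    The raw stream inherits the device asymmetry as a bit bias, while the alternated stream is balanced, which is the effect the alternating read scheme exploits.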

  11. Fiber-Type Random Laser Based on a Cylindrical Waveguide with a Disordered Cladding Layer.

    Science.gov (United States)

    Zhang, Wei Li; Zheng, Meng Ya; Ma, Rui; Gong, Chao Yang; Yang, Zhao Ji; Peng, Gang Ding; Rao, Yun Jiang

    2016-05-25

    This letter reports a fiber-type random laser (RL) made from a capillary coated with a disordered layer on its internal surface and filled with a gain (laser dye) solution in the core region. This fiber-type optical structure, with the disordered layer scattering light randomly into the gain region and the cylindrical waveguide confining the light, assists the formation of random lasing modes and enables a flexible and efficient way of making random lasers. We found that the RL is sensitive to the laser dye concentration in the core region and that there is a clear exponential relationship between the lasing intensity and the particle concentration in the gain solution. The proposed structure could be a versatile platform for realizing random lasing and random-lasing-based sensing.

  12. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  13. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  14. A school-based randomized controlled trial to improve physical activity among Iranian high school girls

    Directory of Open Access Journals (Sweden)

    Ghofranipour Fazloalha

    2008-04-01

    Full Text Available Abstract Background Physical activity (PA) rates decline precipitously during the high school years and are consistently lower among adolescent girls than adolescent boys. Due to cultural barriers, this problem might be exacerbated in female Iranian adolescents. However, little intervention research has been conducted to try to increase PA participation rates in this population. Because PA interventions in schools have the potential to reach many children and adolescents, this study reports on PA intervention research conducted in all-female Iranian high schools. Methods A randomized controlled trial was conducted to examine the effects of two six-month tailored interventions on potential determinants of PA and PA behavior. Students (N = 161) were randomly allocated to one of three conditions: an intervention based on Pender's Health Promotion model (HP), an intervention based on an integration of the health promotion model and selected constructs from the Transtheoretical model (THP), and a control group (CON). Measures were administered prior to the intervention, at post-intervention and at a six-month follow-up. Results Repeated-measures ANOVAs showed a significant interaction between group and time for perceived benefits, self-efficacy, interpersonal norms, social support, behavioral processes, and PA behavior, indicating that both intervention groups significantly improved across the 24-week intervention, whereas the control group did not. Participants in the THP group showed greater use of counter-conditioning and stimulus control at post-intervention and at follow-up. While there were no significant differences in PA between the HP and CON groups at follow-up, a significant difference was still found between the THP and the CON group. Conclusion This study provides the first evidence of the effectiveness of a PA intervention based on Pender's HP model combined with selected aspects of the TTM on potential determinants to increase PA among

  15. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  16. Animal-based measures for welfare assessment

    Directory of Open Access Journals (Sweden)

    Agostino Sevi

    2010-01-01

    Full Text Available Animal welfare assessment cannot be carried out irrespective of measures taken on animals. Indeed, housing parameters related to structures, design and micro-environment, even if reliable and easier to take, can only identify conditions which could be detrimental to animal welfare, but cannot predict poor welfare in animals per se. Welfare assessment through animal-based measures is rather complex, given that animals' responses to stressful conditions largely depend on the nature, length and intensity of the challenges and on the physiological status, age, genetic susceptibility and previous experience of the animals. To be exhaustive and reliable, welfare assessment requires a multi-disciplinary approach and the monitoring of productive, ethological, endocrine, immunological and pathological parameters. So many measures are needed because stresses can act on only some of the mentioned parameters, or on all of them but at different times and to different degrees. From this point of view, the main aim of research is to find feasible and highly responsive indicators of poor animal welfare. In recent decades, studies have focused on the following parameters for animal welfare assessment: indexes of biological efficiency, responses to behavioral tests, cortisol secretion, neutrophil-to-lymphocyte ratio, lymphocyte proliferation, production of antigen-specific IgG and cytokine release, somatic cell count and acute phase proteins. Recently, many studies have aimed at reducing the handling and constraint of animals when taking measures used in welfare assessment, since such procedures can induce stress in animals and undermine the reliability of the measures taken. The range of animal-based measures for welfare assessment is much wider under experimental conditions than at the on-farm level. In on-farm welfare monitoring the main aim is to find feasible measures of proved validity and reliability

  17. Errors due to random noise in velocity measurement using incoherent-scatter radar

    Directory of Open Access Journals (Sweden)

    P. J. S. Williams

    1996-12-01

    Full Text Available The random-noise errors involved in measuring the Doppler shift of an 'incoherent-scatter' spectrum are predicted theoretically for all values of Te/Ti from 1.0 to 3.0. After correction has been made for the effects of convolution during transmission and reception and the additional errors introduced by subtracting the average of the background gates, the rms errors can be expressed by a simple semi-empirical formula. The observed errors are determined from a comparison of simultaneous EISCAT measurements using an identical pulse code on several adjacent frequencies. The plot of observed versus predicted error has a slope of 0.991 and a correlation coefficient of 99.3%. The prediction also agrees well with the mean of the error distribution reported by the standard EISCAT analysis programme.

  18. Evaluation of domain randomness in periodically poled lithium niobate by diffraction noise measurement.

    Science.gov (United States)

    Dwivedi, Prashant Povel; Choi, Hee Joo; Kim, Byoung Joo; Cha, Myoungsik

    2013-12-16

    Random duty-cycle errors (RDE) in ferroelectric quasi-phase-matching (QPM) devices not only affect the frequency conversion efficiency, but also generate non-phase-matched parasitic noise that can be detrimental to some applications. We demonstrate an accurate but simple method for measuring the RDE in periodically poled lithium niobate. Due to the equivalence between the undepleted harmonic generation spectrum and the diffraction pattern from the QPM grating, we employed a linear diffraction measurement which is much simpler than tunable harmonic generation experiments [J. S. Pelc, et al., Opt. Lett. 36, 864-866 (2011)]. As a result, we could relate the RDE of the QPM device to the relative noise intensity between the diffraction orders.

  19. Measuring globalization-based acculturation in Ladakh

    DEFF Research Database (Denmark)

    Ozer, Simon; Schwartz, Seth

    2016-01-01

    Theories and methodologies within acculturation psychology have been advanced in order to capture the complex process of intercultural contact in various contexts. Differentiating globalization-based acculturation from immigrant-based acculturation has broadened the field of acculturation psychology to include groups who are exposed to global cultural streams without international migration. The globalization-based acculturation process in the North Indian region of Ladakh appears to be a tricultural encounter, suggesting an addendum to the bidimensional acculturation model for this group (and perhaps for others as well). This study explores the development, usability, and validity of a tridimensional acculturation measure aiming to capture the multicultural orientations initiated by the process of globalization in Ladakh. The tridimensional acculturation scale was found to fit the data significantly better

  20. Property-Based Software Engineering Measurement

    Science.gov (United States)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). The framework is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.

  1. Measurements of Electromagnetic Fields Emitted from Cellular Base Stations in

    Directory of Open Access Journals (Sweden)

    K. J. Ali

    2013-05-01

    Full Text Available With the increasing usage of mobile communication devices and internet services, private telecommunications companies have been entering Iraq since 2003. These companies began to build cellular towers to provide telecommunication services, but sited them haphazardly, ignoring the safety conditions imposed to protect health and the environment; the emitted radiation may pose a health risk to living beings and pollute the environment. The aim of this work is to determine the safe and unsafe ranges, discuss damage that could be caused by radiation emitted from Asia Cell base stations in Shirqat city, and discuss the best ways to minimize the exposure level so as to avoid negative health effects. Practical measurements of power density around base stations were carried out using a radiation survey meter (Radio Frequency EMF Strength Meter 480846) in two ways. The first set of measurements was taken at a height of 2 meters above ground at different distances from 0 to 300 meters. The second set was taken at a distance of 150 meters at different heights from 2 to 15 meters above ground level. The maximum measured power density is about 3 mW/m2. Results indicate that the levels of power density are far below the RF radiation exposure limits of the USSR safety standards, which means these cellular base stations do not cause negative health effects for living beings as long as exposure remains within the acceptable international standard levels.

  2. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the convergence mechanism of transition probabilities and steady states differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address is about the consequences of estimating the probability of a cell being in a particular state from measurements of small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
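
    A minimal numerical sketch of this estimation setting, assuming two cell states, Gaussian readout noise, and a plain least-squares estimator (far simpler than the MMSE and ML estimators developed in the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])  # true transition matrix (rows sum to 1)

# Collect noisy proportion measurements from many short experiments with
# varied initial mixtures, as aggregated FACS data would provide.
X, Y = [], []
for _ in range(50):
    p = rng.dirichlet([1.0, 1.0])
    for _ in range(5):
        p_next = p @ P
        X.append(p + rng.normal(scale=0.005, size=2))
        Y.append(p_next + rng.normal(scale=0.005, size=2))
        p = p_next

# Least-squares fit of Y = X @ P_hat over all measured pairs.
P_hat, *_ = np.linalg.lstsq(np.asarray(X), np.asarray(Y), rcond=None)
```

    With larger noise or fewer samples the recovered transition probabilities drift from the true values, which is the effect the abstract warns about when the observed state is treated as deterministic rather than as a random variable.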

  3. Brownian motion properties of optoelectronic random bit generators based on laser chaos.

    Science.gov (United States)

    Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge

    2016-07-11

    The nondeterministic properties of an optoelectronic random bit generator (RBG) based on laser chaos are experimentally analyzed from the two aspects of the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical-feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits have no statistical distance from Brownian motion, and that they pass the state-of-the-art industry-benchmark statistical test suite (NIST SP800-22). Together, these results give mathematically provable evidence that an ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.
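
    Both checks can be sketched on an ideal fair bitstream, which stands in here for the measured laser-chaos bits:

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 10_000, 500
steps = 2 * rng.integers(0, 2, size=(trials, n)) - 1  # bits mapped to +/-1
S = steps.cumsum(axis=1)                              # one random walk per trial

# Central limit theorem: S_n / sqrt(n) should look standard normal across trials.
z = S[:, -1] / np.sqrt(n)

# Law of the iterated logarithm: |S_n| should only rarely exceed
# (1 + eps) * sqrt(2 n log log n) at a large fixed n.
bound = 1.1 * np.sqrt(2 * n * np.log(np.log(n)))
frac_inside = (np.abs(S[:, -1]) < bound).mean()
```

    A bitstream with hidden determinism would show up as a non-Gaussian spread of z or as systematic violations of the iterated-logarithm bound.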

  4. Cross-covariance functions for multivariate random fields based on latent dimensions

    KAUST Repository

    Apanasovich, T. V.; Genton, M. G.

    2010-01-01

    The problem of constructing valid parametric cross-covariance functions is challenging. We propose a simple methodology, based on latent dimensions and existing covariance models for univariate random fields, to develop flexible, interpretable

  5. Markov random field based automatic image alignment for electron tomography.

    Science.gov (United States)

    Amat, Fernando; Moussavi, Farshid; Comolli, Luis R; Elidan, Gal; Downing, Kenneth H; Horowitz, Mark

    2008-03-01

    We present a method for automatic full-precision alignment of the images in a tomographic tilt series. Full-precision automatic alignment of cryo electron microscopy images has remained a difficult challenge to date, due to the limited electron dose and low image contrast. These facts lead to poor signal to noise ratio (SNR) in the images, which causes automatic feature trackers to generate errors, even with high contrast gold particles as fiducial features. To enable fully automatic alignment for full-precision reconstructions, we frame the problem probabilistically as finding the most likely particle tracks given a set of noisy images, using contextual information to make the solution more robust to the noise in each image. To solve this maximum likelihood problem, we use Markov Random Fields (MRF) to establish the correspondence of features in alignment and robust optimization for projection model estimation. The resulting algorithm, called Robust Alignment and Projection Estimation for Tomographic Reconstruction, or RAPTOR, has not needed any manual intervention for the difficult datasets we have tried, and has provided sub-pixel alignment that is as good as the manual approach by an expert user. We are able to automatically map complete and partial marker trajectories and thus obtain highly accurate image alignment. Our method has been applied to challenging cryo electron tomographic datasets with low SNR from intact bacterial cells, as well as several plastic section and X-ray datasets.

  6. Korean Clinic Based Outcome Measure Studies

    Directory of Open Access Journals (Sweden)

    Jongbae Park

    2003-02-01

    Full Text Available Background: Evidence-based medicine has become a main tool for medical practice. However, conducting studies ranked highly in the evidence hierarchy pyramid is not easy or feasible at all times and places, so there remains room for descriptive clinical outcome measure studies, provided the limits of their interpretation are acknowledged. Aims: To present three Korean clinic-based outcome measure studies with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented briefly here: (1) quality of life of liver cancer patients after 8 Constitutional acupuncture; (2) developing a Korean version of the Measure Yourself Medical Outcome Profile (MYMOP); and (3) a pilot survey on the 5 Shu points. In the first study, we included 4 primary or secondary liver cancer patients, collecting their diagnostic X-ray films and clinical data from their hospital, and asked them to fill in the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire before the commencement of the treatment. The acupuncture treatment follows a set format that has not yet been disclosed. A translated outcome measure that is friendly to Korean clinicians has been sought, and MYMOP is one of the most appropriate. Permission was granted, the measure was translated into Korean, and then back-translated into English, based only on the Korean translation, by a researcher who is bilingual in both languages. The back-translation was compared by the original developer of MYMOP and confirmed usable. In order to test the existence of acupoints and meridians through popular forms of Korean acupuncture regimes, we aim at collecting opinions from 101 Korean clinicians who have used those forms. The questions asked include the most effective symptoms, the 5 Shu points, points that are least likely to be used due to either adverse events or lack of effectiveness, theoretical reasons for the above proposals, proposing outcome measures

  7. Effectiveness of a web-based intervention for injured claimants: a randomized controlled trial.

    Science.gov (United States)

    Elbers, Nieke A; Akkermans, Arno J; Cuijpers, Pim; Bruinvels, David J

    2013-07-20

    There is considerable evidence showing that injured people who are involved in a compensation process show poorer physical and mental recovery than those with similar injuries who are not involved in a compensation process. One explanation for this reduced recovery is that the legal process and the associated retraumatization are very stressful for the claimant. The aim of this study was to empower injured claimants in order to facilitate recovery. Participants were recruited by three Dutch claims settlement offices. The participants had all been injured in a traffic crash and were involved in a compensation process. The study design was a randomized controlled trial. An intervention website was developed with (1) information about the compensation process, and (2) an evidence-based, therapist-assisted problem-solving course. The control website contained a few links to already existing websites. Outcome measures were empowerment, self-efficacy, health status (including depression, anxiety, and somatic symptoms), perceived fairness, ability to work, claims knowledge and extent of burden. The outcomes were self-reported through online questionnaires and were measured four times: at baseline, and at 3, 6, and 12 months. In total, 176 participants completed the baseline questionnaire after which they were randomized into either the intervention group (n=88) or the control group (n=88). During the study, 35 participants (20%) dropped out. The intervention website was used by 55 participants (63%). The health outcomes of the intervention group were no different to those of the control group. However, the intervention group considered the received compensation to be fairer, and the website was evaluated positively. Although the web-based intervention was not used enough to improve the health of injured claimants in compensation processes, it increased the perceived fairness of the compensation amount. Netherlands Trial Register NTR2360.

  8. Internet-Based, Randomized Controlled Trial of Omega-3 Fatty Acids for Hyperactivity in Autism

    Science.gov (United States)

    Bent, Stephen; Hendren, Robert L.; Zandi, Tara; Law, Kiely; Choi, Jae-Eun; Widjaja, Felicia; Kalb, Luther; Nestle, Jay; Law, Paul

    2014-01-01

    Objective Preliminary evidence suggests that omega-3 fatty acids may reduce hyperactivity in children with autism spectrum disorder (ASD). We sought to examine the feasibility of a novel, internet-based clinical trial design to evaluate the efficacy of this supplement. Method E-mail invitations were sent to parents of children aged 5-8 enrolled in the Interactive Autism Network. All study procedures, including screening, informed consent, and collection of outcome measures took place over the internet. The primary outcome measures were parent- and teacher-rated changes in hyperactivity on the Aberrant Behavior Checklist. Results During the 6-week recruitment period, 57 children from 28 states satisfied all eligibility criteria and were randomly assigned to 1.3 grams of omega-3 fatty acids or an identical placebo daily for 6 weeks. Outcome assessments were obtained from all 57 participants and 57 teachers, and the study was completed in 3 months. Children in the omega-3 fatty acid group had a greater reduction in hyperactivity (-5.3 points) compared to the placebo group (-2.6 points), but the difference was not statistically significant (1.9 point greater improvement in the omega-3 group, 95% CI -2.2 to 5.2). Side effects were rare and not associated with omega-3 fatty acids. Participant feedback was positive. Conclusion Internet-based randomized controlled trials of therapies in children with ASD are feasible and may lead to marked reductions in the time and cost of completing trials. A larger sample size is required to definitively determine the efficacy of omega-3 fatty acids. Clinical trial registration information—Omega-3 Fatty Acids for Hyperactivity Treatment in Autism Spectrum Disorder; http://clinicaltrials.gov; NCT01694667. PMID:24839884

  9. Internet-based cognitive-behavior therapy for procrastination: A randomized controlled trial.

    Science.gov (United States)

    Rozental, Alexander; Forsell, Erik; Svensson, Andreas; Andersson, Gerhard; Carlbring, Per

    2015-08-01

    Procrastination can be a persistent behavior pattern associated with personal distress. However, research investigating different treatment interventions is scarce, and no randomized controlled trial has examined the efficacy of cognitive-behavior therapy (CBT). Meanwhile, Internet-based CBT has been found promising for several conditions, but has not yet been used for procrastination. Participants (N = 150) were randomized to guided self-help, unguided self-help, and wait-list control. Outcome measures were administered before and after treatment, or weekly throughout the treatment period. They included the Pure Procrastination Scale, the Irrational Procrastination Scale, the Susceptibility to Temptation Scale, the Montgomery Åsberg Depression Rating Scale-Self-report version, the Generalized Anxiety Disorder Assessment, and the Quality of Life Inventory. The intention-to-treat principle was used for all statistical analyses. Mixed-effects models revealed moderate between-groups effect sizes comparing guided and unguided self-help with wait-list control; the Pure Procrastination Scale, Cohen's d = 0.70, 95% confidence interval (CI) [0.29, 1.10], and d = 0.50, 95% CI [0.10, 0.90], and the Irrational Procrastination Scale, d = 0.81, 95% CI [0.40, 1.22], and d = 0.69, 95% CI [0.29, 1.09]. Clinically significant change was achieved among 31.3-40.0% for guided self-help, compared with 24.0-36.0% for unguided self-help. Neither of the treatment conditions was found to be superior on any of the outcome measures, Fs(98, 65.17-72.55), ps > .19. Internet-based CBT could be useful for managing self-reported difficulties due to procrastination, both with and without the guidance of a therapist. (c) 2015 APA, all rights reserved.

  10. Development of a Layered Conditional Random Field Based ...

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    2014-12-01

    Dec 1, 2014 ... The recent denial of service attacks on major Internet sites have shown that no open ... of a single record, which further degrades attack detection accuracy ... distributed intrusion detection framework based on mobile agents.

  11. Task-Based Mirror Therapy Augmenting Motor Recovery in Poststroke Hemiparesis: A Randomized Controlled Trial.

    Science.gov (United States)

    Arya, Kamal Narayan; Pandian, Shanta; Kumar, Dharmendra; Puri, Vinod

    2015-08-01

    To establish the effect of task-based mirror therapy (TBMT) on upper limb recovery in stroke, a pilot, randomized, controlled, assessor-blinded trial was conducted in a rehabilitation institute. A convenience sample of 33 poststroke (mean duration, 12.5 months) hemiparetic subjects was randomized into 2 groups (experimental, 17; control, 16). The subjects were allocated to receive either TBMT or standard motor rehabilitation: 40 sessions (5/week) over a period of 8 weeks. The TBMT group performed movements using various goal-directed tasks and a mirror box; the movements were performed by the less-affected side superimposed on the affected side. The main outcome measures were the Brunnstrom recovery stage (BRS) and the Fugl-Meyer assessment of upper extremity (FMA-UE), including upper arm (FMA-UA) and wrist-hand (FMA-WH) subscores. The TBMT group exhibited highly significant improvement on mean scores of the FMA-WH in poststroke hemiparesis. MT using tasks may be used as an adjunct in stroke rehabilitation. Copyright © 2015 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  12. New distributed fusion filtering algorithm based on covariances over sensor networks with random packet dropouts

    Science.gov (United States)

    Caballero-Águila, R.; Hermoso-Carazo, A.; Linares-Pérez, J.

    2017-07-01

    This paper studies the distributed fusion estimation problem from multisensor measured outputs perturbed by correlated noises and uncertainties modelled by random parameter matrices. Each sensor transmits its outputs to a local processor over a packet-erasure channel and, consequently, random losses may occur during transmission. Different white sequences of Bernoulli variables are introduced to model the transmission losses. For the estimation, each lost output is replaced by its estimator based on the information received previously, and only the covariances of the processes involved are used, without requiring the signal evolution model. First, a recursive algorithm for the local least-squares filters is derived by using an innovation approach. Then, the cross-correlation matrices between any two local filters is obtained. Finally, the distributed fusion filter weighted by matrices is obtained from the local filters by applying the least-squares criterion. The performance of the estimators and the influence of both sensor uncertainties and transmission losses on the estimation accuracy are analysed in a numerical example.
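    The compensation step described here, where each lost output is replaced by an estimate built from previously received information, can be sketched in miniature. In the toy version below (all names illustrative), the Bernoulli arrival variables are simulated explicitly and the "estimator" for a lost packet is simply the last value received, a much cruder stand-in than the paper's least-squares predictor:

```python
import random

def receive_sequence(outputs, p_arrival, seed=0):
    """Pass sensor outputs through a packet-erasure channel: each output
    arrives independently with probability p_arrival (a Bernoulli trial);
    a lost output is replaced by the last value received, standing in for
    an estimator built from previously received information."""
    rng = random.Random(seed)
    received, last = [], 0.0
    for y in outputs:
        arrived = rng.random() < p_arrival   # Bernoulli(p_arrival) variable
        last = y if arrived else last        # hold the previous value on loss
        received.append(last)
    return received
```

    With p_arrival = 1 the channel is lossless and the sequence passes through unchanged; lowering p_arrival shows how the holds fill in for dropped packets.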

  13. True random number generation from mobile telephone photo based on chaotic cryptography

    International Nuclear Information System (INIS)

    Zhao Liang; Liao Xiaofeng; Xiao Di; Xiang Tao; Zhou Qing; Duan Shukai

    2009-01-01

    A cheap, convenient and universal TRNG based on mobile telephone photos for producing random bit sequences is proposed. To settle the problems of sequential pixels and comparability, three chaos-based approaches are applied to post-process the generated binary image. The random numbers produced by three users are tested using the US NIST RNG statistical test software. The experimental results indicate that the Arnold cat map is the fastest way to generate a random bit sequence and can be accepted on a general PC. The 'MASK' algorithm also performs well. Finally, compared with the TRNG of Hu et al. [Hu Y, Liao X, Wong KW, Zhou Q. A true random number generator based on mouse movement and chaotic cryptography. Chaos, Solitons and Fractals 2007. doi: 10.1016/j.chaos.2007.10.022], the TRNG proposed in this paper is found to have many merits.

  14. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  15. A Data Management System Integrating Web-based Training and Randomized Trials: Requirements, Experiences and Recommendations.

    Science.gov (United States)

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.

  16. Analytic degree distributions of horizontal visibility graphs mapped from unrelated random series and multifractal binomial measures

    Science.gov (United States)

    Xie, Wen-Jie; Han, Rui-Qi; Jiang, Zhi-Qiang; Wei, Lijian; Zhou, Wei-Xing

    2017-08-01

    Complex networks are not only a powerful tool for the analysis of complex systems, but also a promising way to analyze time series. The horizontal visibility graph (HVG) algorithm maps time series into graphs, whose degree distributions are numerically and analytically investigated for certain time series. We derive the degree distributions of HVGs through an iterative construction process. The degree distributions of the HVG and the directed HVG for random series are derived to be exponential, which confirms the analytical results from other methods. We also obtain analytical expressions for the degree distributions of HVGs, and for the in-degree and out-degree distributions of directed HVGs, transformed from multifractal binomial measures; these agree excellently with numerical simulations.
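    The HVG construction referenced above has a simple statement: two points of the series are connected when every value strictly between them lies below both. A compact sketch (function name illustrative), which for an i.i.d. random series reproduces the known exponential law P(k) = (1/3)(2/3)^(k-2) with mean degree 4:

```python
import numpy as np

def horizontal_visibility_graph(series):
    """Edge set of the horizontal visibility graph: i and j (i < j) are
    linked iff every value strictly between them is lower than both."""
    n = len(series)
    edges = {(i, i + 1) for i in range(n - 1)}   # neighbours always see each other
    for i in range(n - 2):
        top = series[i + 1]          # running max of the values between i and j
        for j in range(i + 2, n):
            if top < series[i] and top < series[j]:
                edges.add((i, j))
            top = max(top, series[j])
            if top >= series[i]:
                break                # node i can see nothing beyond this point
    return edges
```

    For a long uniform random series the empirical mean degree 2|E|/n comes out close to the theoretical value of 4.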

  17. Inference-Based Similarity Search in Randomized Montgomery Domains for Privacy-Preserving Biometric Identification.

    Science.gov (United States)

    Wang, Yi; Wan, Jianwu; Guo, Jun; Cheung, Yiu-Ming; Yuen, Pong C.

    2017-07-14

    Similarity search is essential to many important applications and often involves searching at scale on high-dimensional data based on their similarity to a query. In biometric applications, recent vulnerability studies have shown that adversarial machine learning can compromise biometric recognition systems by exploiting the biometric similarity information. Existing methods for biometric privacy protection are in general based on pairwise matching of secured biometric templates and have inherent limitations in search efficiency and scalability. In this paper, we propose an inference-based framework for privacy-preserving similarity search in Hamming space. Our approach builds on an obfuscated distance measure that can conceal Hamming distance in a dynamic interval. Such a mechanism enables us to systematically design statistically reliable methods for retrieving most likely candidates without knowing the exact distance values. We further propose to apply Montgomery multiplication for generating search indexes that can withstand adversarial similarity analysis, and show that information leakage in randomized Montgomery domains can be made negligibly small. Our experiments on public biometric datasets demonstrate that the inference-based approach can achieve a search accuracy close to the best performance possible with secure computation methods, but the associated cost is reduced by orders of magnitude compared to cryptographic primitives.

  18. Optimizing block-based maintenance under random machine usage

    NARCIS (Netherlands)

    de Jonge, Bram; Jakobsons, Edgars

    Existing studies on maintenance optimization generally assume that machines are either used continuously, or that times until failure do not depend on the actual usage. In practice, however, these assumptions are often not realistic. In this paper, we consider block-based maintenance optimization

  19. Web-Based Cognitive Behavioral Therapy for Female Patients With Eating Disorders: Randomized Controlled Trial.

    Science.gov (United States)

    ter Huurne, Elke D; de Haan, Hein A; Postel, Marloes G; van der Palen, Job; VanDerNagel, Joanne E L; DeJong, Cornelis A J

    2015-06-18

    Many patients with eating disorders do not receive help for their symptoms, even though these disorders have severe morbidity. The Internet may offer alternative low-threshold treatment interventions. This study evaluated the effects of a Web-based cognitive behavioral therapy (CBT) intervention using intensive asynchronous therapeutic support to improve eating disorder psychopathology, and to reduce body dissatisfaction and related health problems among patients with eating disorders. A two-arm open randomized controlled trial comparing a Web-based CBT intervention to a waiting list control condition (WL) was carried out among female patients with bulimia nervosa (BN), binge eating disorder (BED), and eating disorders not otherwise specified (EDNOS). The eating disorder diagnosis was in accordance with the Diagnostic and Statistical Manual of Mental Disorders, 4th edition, and was established based on participants' self-report. Participants were recruited from an open-access website, and the intervention consisted of a structured two-part program within a secure Web-based application. The aim of the first part was to analyze participant's eating attitudes and behaviors, while the second part focused on behavioral change. Participants had asynchronous contact with a personal therapist twice a week, solely via the Internet. Self-report measures of eating disorder psychopathology (primary outcome), body dissatisfaction, physical health, mental health, self-esteem, quality of life, and social functioning were completed at baseline and posttest. A total of 214 participants were randomized to either the Web-based CBT group (n=108) or to the WL group (n=106) stratified by type of eating disorder (BN: n=44; BED: n=85; EDNOS: n=85). Study attrition was low with 94% of the participants completing the posttest assignment. 
Overall, Web-based CBT showed a significant improvement over time for eating disorder psychopathology (F97=63.07, P<.001), with greater improvement among Web-based CBT participants in all three eating disorder groups.

  20. A true random number generator based on mouse movement and chaotic cryptography

    International Nuclear Information System (INIS)

    Hu Yue; Liao Xiaofeng; Wong, Kwok-wo; Zhou Qing

    2009-01-01

    True random number generators are in general more secure than pseudo random number generators. In this paper, we propose a novel true random number generator which generates a 256-bit random number by computer mouse movement. It is cheap, convenient and universal for personal computers. To eliminate the effect of similar movement patterns generated by the same user, three chaos-based approaches, namely, discretized 2D chaotic map permutation, spatiotemporal chaos and 'MASK' algorithm, are adopted to post-process the captured mouse movements. Random bits generated by three users are tested using NIST statistical tests. Both the spatiotemporal chaos approach and the 'MASK' algorithm pass the tests successfully. However, the latter has a better performance in terms of efficiency and effectiveness and so is more practical for common personal computer applications.
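    As a flavor of such chaos-based post-processing, the toy whitener below folds raw samples (e.g. captured mouse coordinates) into the state of a logistic map and harvests one bit per sample. The map parameters, the modulus, and the iteration count are illustrative assumptions; the paper's discretized 2D map permutation, spatiotemporal chaos, and 'MASK' algorithms are considerably more elaborate:

```python
def logistic_whiten(samples, r=3.99, x0=0.4142, rounds=32):
    """Fold each raw sample (e.g. a mouse coordinate) into the state of a
    logistic map, iterate the map to mix, and harvest one output bit."""
    bits, x = [], x0
    for s in samples:
        x = (x + (int(s) % 997) / 997.0) % 1.0   # inject the raw sample
        if x == 0.0:
            x = x0                               # avoid the map's fixed point
        for _ in range(rounds):
            x = r * x * (1.0 - x)                # chaotic mixing
        bits.append(1 if x >= 0.5 else 0)
    return bits
```

    The extraction is deterministic in its inputs, so all the entropy comes from the captured movements; the chaotic iteration only spreads similar inputs apart.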

  1. Post-processing Free Quantum Random Number Generator Based on Avalanche Photodiode Array

    International Nuclear Information System (INIS)

    Li Yang; Liao Sheng-Kai; Liang Fu-Tian; Shen Qi; Liang Hao; Peng Cheng-Zhi

    2016-01-01

    Quantum random number generators adopting single photon detection have been restricted by the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free and ready to use, and their randomness is verified by using the National Institute of Standards and Technology statistical test suite. The random bit generation efficiency is as high as 32.8% and the potential generation rate adopting the 32 × 32 APD array is up to tens of Gbits/s. (paper)
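    One plausible reading of the comparison rule, keeping a bit only when a detector's responses to two consecutive pulses differ, is a von Neumann-style extractor; the authors' exact mapping may differ, and the names below are illustrative:

```python
def extract_bits(frame_a, frame_b):
    """Compare each detector's response (1 = click, 0 = no click) across two
    consecutive optical pulses: a (1,0) pair yields 1, a (0,1) pair yields 0,
    and equal responses are discarded."""
    return [a for a, b in zip(frame_a, frame_b) if a != b]
```

    Discarding equal pairs removes per-detector click-rate bias at the cost of throughput, consistent with a bit-generation efficiency well below 100%.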

  2. Incorporating PROMIS Symptom Measures into Primary Care Practice-a Randomized Clinical Trial.

    Science.gov (United States)

    Kroenke, Kurt; Talib, Tasneem L; Stump, Timothy E; Kean, Jacob; Haggstrom, David A; DeChant, Paige; Lake, Kittie R; Stout, Madison; Monahan, Patrick O

    2018-04-05

    Symptoms account for more than 400 million clinic visits annually in the USA. The SPADE symptoms (sleep, pain, anxiety, depression, and low energy/fatigue) are particularly prevalent and undertreated. To assess the effectiveness of providing PROMIS (Patient-Reported Outcomes Measurement Information System) symptom scores to clinicians on symptom outcomes, a randomized clinical trial was conducted from March 2015 through May 2016 in general internal medicine and family practice clinics in an academic healthcare system, enrolling primary care patients who screened positive for at least one SPADE symptom. After completing the PROMIS symptom measures electronically immediately prior to their visit, the 300 study participants were randomized to a feedback group, in which their clinician received a visual display of symptom scores, or a control group, in which scores were not provided to clinicians. The primary outcome was the 3-month change in composite SPADE score. Secondary outcomes were individual symptom scores, symptom documentation in the clinic note, symptom-specific clinician actions, and patient satisfaction. Most patients (84%) had multiple clinically significant (T-score ≥ 55) SPADE symptoms. Both groups demonstrated moderate symptom improvement with a non-significant trend favoring the feedback group (between-group difference in composite T-score improvement, 1.1; P = 0.17). Symptoms present at baseline resolved at 3-month follow-up only one third of the time, and patients frequently still desired treatment. Except for pain, clinically significant symptoms were documented less than half the time. Neither symptom documentation, symptom-specific clinician actions, nor patient satisfaction differed between treatment arms. Predictors of greater symptom improvement included female sex, black race, fewer medical conditions, and receiving care in a family medicine clinic. Simple feedback of symptom scores to primary care clinicians in the absence of

  3. Mindfulness-based prevention for eating disorders: A school-based cluster randomized controlled study.

    Science.gov (United States)

    Atkinson, Melissa J; Wade, Tracey D

    2015-11-01

    Successful prevention of eating disorders represents an important goal due to damaging long-term impacts on health and well-being, modest treatment outcomes, and low treatment seeking among individuals at risk. Mindfulness-based approaches have received early support in the treatment of eating disorders, but have not been evaluated as a prevention strategy. This study aimed to assess the feasibility, acceptability, and efficacy of a novel mindfulness-based intervention for reducing the risk of eating disorders among adolescent females, under both optimal (trained facilitator) and task-shifted (non-expert facilitator) conditions. A school-based cluster randomized controlled trial was conducted in which 19 classes of adolescent girls (N = 347) were allocated to a three-session mindfulness-based intervention, dissonance-based intervention, or classes as usual control. A subset of classes (N = 156) receiving expert facilitation were analyzed separately as a proxy for delivery under optimal conditions. Task-shifted facilitation showed no significant intervention effects across outcomes. Under optimal facilitation, students receiving mindfulness demonstrated significant reductions in weight and shape concern, dietary restraint, thin-ideal internalization, eating disorder symptoms, and psychosocial impairment relative to control by 6-month follow-up. Students receiving dissonance showed significant reductions in socio-cultural pressures. There were no statistically significant differences between the two interventions. Moderate intervention acceptability was reported by both students and teaching staff. Findings show promise for the application of mindfulness in the prevention of eating disorders; however, further work is required to increase both impact and acceptability, and to enable successful outcomes when delivered by less expert providers. © 2015 Wiley Periodicals, Inc.

  4. Permutation Entropy for Random Binary Sequences

    Directory of Open Access Journals (Sweden)

    Lingfeng Liu

    2015-12-01

    Full Text Available In this paper, we generalize the permutation entropy (PE) measure, which is based on Shannon's entropy, to binary sequences, and theoretically analyze this measure for random binary sequences. We deduce the theoretical value of PE for random binary sequences, which can be used to measure the randomness of binary sequences. We also reveal the relationship between this PE measure and other randomness measures, such as Shannon's entropy and Lempel–Ziv complexity. The results show that PE is consistent with these two measures. Furthermore, we use PE as one of the randomness measures to evaluate the randomness of chaotic binary sequences.
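    For context, the classic Bandt-Pompe permutation entropy that the paper generalizes can be sketched as follows (ties broken by position, as is conventional; normalization to [0, 1] divides by log2 of the number of possible ordinal patterns):

```python
from collections import Counter
import math

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy in [0, 1]: the Shannon
    entropy of the ordinal patterns of consecutive windows of `order` values."""
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: index permutation that sorts the window's values.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / math.log2(math.factorial(order))
```

    A monotone series exhibits a single ordinal pattern and scores 0, while an i.i.d. random series uses all patterns nearly uniformly and scores close to 1, which is the qualitative behavior the abstract's randomness measure relies on.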

  5. An audit strategy for time-to-event outcomes measured with error: application to five randomized controlled trials in oncology.

    Science.gov (United States)

    Dodd, Lori E; Korn, Edward L; Freidlin, Boris; Gu, Wenjuan; Abrams, Jeffrey S; Bushnell, William D; Canetta, Renzo; Doroshow, James H; Gray, Robert J; Sridhara, Rajeshwari

    2013-10-01

    Measurement error in time-to-event end points complicates interpretation of treatment effects in clinical trials. Non-differential measurement error is unlikely to produce large bias [1]. When error depends on treatment arm, bias is of greater concern. Blinded-independent central review (BICR) of all images from a trial is commonly undertaken to mitigate differential measurement-error bias that may be present in hazard ratios (HRs) based on local evaluations. Similar BICR and local evaluation HRs may provide reassurance about the treatment effect, but BICR adds considerable time and expense to trials. We describe a BICR audit strategy [2] and apply it to five randomized controlled trials to evaluate its use and to provide practical guidelines. The strategy requires BICR on a subset of study subjects, rather than a complete-case BICR, and makes use of an auxiliary-variable estimator. When the effect size is relatively large, the method provides a substantial reduction in the size of the BICRs. In a trial with 722 participants and a HR of 0.48, an average audit of 28% of the data was needed and always confirmed the treatment effect as assessed by local evaluations. More moderate effect sizes and/or smaller trial sizes required larger proportions of audited images, ranging from 57% to 100% for HRs ranging from 0.55 to 0.77 and sample sizes between 209 and 737. The method is developed for a simple random sample of study subjects. In studies with low event rates, more efficient estimation may result from sampling individuals with events at a higher rate. The proposed strategy can greatly decrease the costs and time associated with BICR, by reducing the number of images undergoing review. The savings will depend on the underlying treatment effect and trial size, with larger treatment effects and larger trials requiring smaller proportions of audited data.

  6. Linear systems a measurement based approach

    CERN Document Server

    Bhattacharyya, S P; Mohsenizadeh, D N

    2014-01-01

    This brief presents recent results obtained on the analysis, synthesis and design of systems described by linear equations. It is well known that linear equations arise in most branches of science and engineering as well as social, biological and economic systems. The novelty of this approach is that no models of the system are assumed to be available, nor are they required. Instead, a few measurements made on the system can be processed strategically to directly extract design values that meet specifications without constructing a model of the system, implicitly or explicitly. These new concepts are illustrated by applying them to linear DC and AC circuits, mechanical, civil and hydraulic systems, signal flow block diagrams and control systems. These applications are preliminary and suggest many open problems. The results presented in this brief are the latest effort in this direction and the authors hope these will lead to attractive alternatives to model-based design of engineering and other systems.

  7. A Random-Dot Kinematogram for Web-Based Vision Research

    Directory of Open Access Journals (Sweden)

    Sivananda Rajananda

    2018-01-01

    Full Text Available Web-based experiments using visual stimuli have become increasingly common in recent years, but many frequently used stimuli in vision research have yet to be developed for online platforms. Here, we introduce the first open-access random-dot kinematogram (RDK) for use in web browsers. This fully customizable RDK offers options to implement several different types of noise (random position, random walk, random direction) and parameters to control aperture shape, coherence level, the number of dots, and other features. We include links to commented JavaScript code for easy implementation in web-based experiments, as well as an example of how this stimulus can be integrated as a plugin with a JavaScript library for online studies (jsPsych).
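    The noise types listed (random position, random walk, random direction) differ only in how the incoherent dots are redrawn each frame. A minimal per-frame update in the random-direction style, written here in Python for illustration although the toolkit itself is JavaScript (names and update rule are assumptions, not the plugin's API):

```python
import math
import random

def update_dots(dots, coherence, direction, speed, rng=random):
    """Advance each (x, y) dot one frame: a `coherence` fraction of dots steps
    along `direction` (radians); the rest step in a uniformly random direction."""
    out = []
    for x, y in dots:
        theta = direction if rng.random() < coherence else rng.uniform(0.0, 2.0 * math.pi)
        out.append((x + speed * math.cos(theta), y + speed * math.sin(theta)))
    return out
```

    At coherence 1.0 every dot moves along the signal direction; at 0.0 the motion is pure noise, which is the continuum an observer's direction judgments are measured against.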

  8. A data based random number generator for a multivariate distribution (using stochastic interpolation)

    Science.gov (United States)

    Thompson, J. R.; Taylor, M. S.

    1982-01-01

    Let X be a k-dimensional random variable serving as input for a system with output Y (not necessarily of dimension k). Given X, an outcome Y or a distribution of outcomes G(Y|X) may be obtained either explicitly or implicitly. The situation is considered in which there is a real-world data set {X_j}, j = 1, ..., n, and a means of simulating an outcome Y. A method for empirical random number generation based on this sample of observations of the random variable X, without estimating the underlying density, is discussed.
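    The approach is in the spirit of a nearest-neighbour smoothed bootstrap: resample a base observation, then perturb it using the local geometry of the data rather than an estimated density. The sketch below is a simplified variant of the Taylor-Thompson scheme; the weight distribution is chosen only so the weights average 1/m, and does not reproduce the original algorithm's variance-preserving constants:

```python
import numpy as np

def stochastic_interpolation_sample(data, rng, m=5):
    """Draw one synthetic k-dimensional point: choose a base observation at
    random, take its m nearest neighbours (including itself), and return a
    randomly weighted combination of them about their local mean."""
    n, _k = data.shape
    base = data[rng.integers(n)]
    order = np.argsort(np.linalg.norm(data - base, axis=1))
    nbrs = data[order[:m]]                    # m nearest neighbours of the base
    centre = nbrs.mean(axis=0)
    # Random weights averaging 1/m; deviations from 1/m create the jitter.
    w = rng.uniform(0.5 / m, 1.5 / m, size=m)
    return centre + (nbrs - centre).T @ w
```

    Because the neighbour deviations (nbrs - centre) sum to zero, the expected synthetic point is the local mean, so repeated draws track the empirical distribution without ever fitting a density.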

  9. Heterogeneity Measurement Based on Distance Measure for Polarimetric SAR Data

    Science.gov (United States)

    Xing, Xiaoli; Chen, Qihao; Liu, Xiuguo

    2018-04-01

    To effectively test scene heterogeneity in polarimetric synthetic aperture radar (PolSAR) data, this paper introduces a distance measure that exploits the similarity between a sample and its surrounding pixels. Moreover, given the influence of the distribution and of texture modeling, a K distance measure is deduced from the Wishart distance measure. Specifically, the average of the pixels in a local window replaces the class-center coherency or covariance matrix, and the Wishart and K distance measures are calculated between this average matrix and the pixels. Then, the ratio of the standard deviation to the mean is established for the Wishart and K distance measures, and the two features are defined and applied to reflect the complexity of the scene. The proposed heterogeneity measure is obtained by integrating the two features using the Pauli basis. Experiments conducted on single-look and multilook PolSAR data demonstrate the effectiveness of the proposed method for detecting scene heterogeneity.

  10. Motivational interviewing in a Web-based physical activity intervention with an avatar: randomized controlled trial.

    Science.gov (United States)

    Friederichs, Stijn; Bolman, Catherine; Oenema, Anke; Guyaux, Janneke; Lechner, Lilian

    2014-02-13

    Developing Web-based physical activity (PA) interventions based on motivational interviewing (MI) could increase the availability and reach of MI techniques for PA promotion. Integrating an avatar in such an intervention could lead to more positive appreciation and higher efficacy of the intervention, compared to an intervention that is purely text-based. The present study aims to determine whether a Web-based PA intervention based on MI with an avatar results in more positive appreciation and higher effectiveness, when compared to an intervention that is purely text-based. A three-arm randomized controlled trial was conducted with the following research conditions: (1) a Web-based PA intervention based on MI with an avatar, (2) a content-identical intervention without an avatar, and (3) a control condition that received no intervention. Measurements included PA behavior and process variables, assessed at baseline, directly following the intervention, and 1 month post intervention. Both interventions significantly increased self-reported PA at 1 month, compared to the control condition (beta(AVATARvsCONTROL)=.39, P=.011; beta(TEXTvsCONTROL)=.44, P=.006). No differences in intervention effect on PA were found between the two interventions. Similarly, the results of the process evaluation did not indicate any significant differences between the two interventions. Due to the limited relational skills of the avatar in this study, it probably did not succeed in forming a stronger relationship with the user, over and above text alone. The findings suggest that avatars that do not strengthen the social relationship with the user do not enhance the intervention impact. Future research should determine whether Web-based PA interventions based on MI could benefit from inclusion of a virtual coach capable of more complex relational skills than used in the current study, such as responding in gesture to the user's state and input. Dutch Trial

  11. Using satellite-based measurements to explore ...

    Science.gov (United States)

    New particle formation (NPF) can potentially alter regional climate by increasing aerosol particle (hereafter particle) number concentrations and ultimately cloud condensation nuclei. The large scales on which NPF is manifest indicate potential to use satellite-based (inherently spatially averaged) measurements of atmospheric conditions to diagnose the occurrence of NPF and NPF characteristics. We demonstrate the potential for using satellite-measurements of insolation (UV), trace gas concentrations (sulfur dioxide (SO2), nitrogen dioxide (NO2), ammonia (NH3), formaldehyde (HCHO), ozone (O3)), aerosol optical properties (aerosol optical depth (AOD), Ångström exponent (AE)), and a proxy of biogenic volatile organic compound emissions (leaf area index (LAI), temperature (T)) as predictors for NPF characteristics: formation rates, growth rates, survival probabilities, and ultrafine particle (UFP) concentrations at five locations across North America. NPF at all sites is most frequent in spring, exhibits a one-day autocorrelation, and is associated with low condensational sink (AOD×AE) and HCHO concentrations, and high UV. However, there are important site-to-site variations in NPF frequency and characteristics, and in which of the predictor variables (particularly gas concentrations) significantly contribute to the explanatory power of regression models built to predict those characteristics. This finding may provide a partial explanation for the reported spatia

  12. Measuring participant rurality in Web-based interventions

    Directory of Open Access Journals (Sweden)

    McKay H Garth

    2007-08-01

    Full Text Available Abstract Background Web-based health behavior change programs can reach large groups of disparate participants and thus they provide promise of becoming important public health tools. Data on participant rurality can complement other demographic measures to deepen our understanding of the success of these programs. Specifically, analysis of participant rurality can inform recruitment and social marketing efforts, and facilitate the targeting and tailoring of program content. Rurality analysis can also help evaluate the effectiveness of interventions across population groupings. Methods We describe how the RUCAs (Rural-Urban Commuting Area Codes) methodology can be used to examine results from two Randomized Controlled Trials of Web-based tobacco cessation programs: the ChewFree.com project for smokeless tobacco cessation and the Smokers' Health Improvement Program (SHIP) project for smoking cessation. Results Using the RUCAs methodology helped to highlight the extent to which both Web-based interventions reached a substantial percentage of rural participants. The ChewFree program was found to have more rural participation, which is consistent with the greater prevalence of smokeless tobacco use in rural settings as well as ChewFree's multifaceted recruitment program that specifically targeted rural settings. Conclusion Researchers of Web-based health behavior change programs targeted to the US should routinely include RUCAs as a part of analyzing participant demographics. Researchers in other countries should examine rurality indices germane to their country.

  13. An internet-based intervention for adjustment disorder (TAO): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Rachyla, Iryna; Pérez-Ara, Marian; Molés, Mar; Campos, Daniel; Mira, Adriana; Botella, Cristina; Quero, Soledad

    2018-05-31

    Adjustment Disorder (AjD) is a common and disabling mental health problem. The lack of research on this disorder has led to the absence of evidence-based interventions for its treatment. Moreover, because the available data indicate that a high percentage of people with mental illness are not treated, it is necessary to develop new ways to provide psychological assistance. The present study describes a Randomized Controlled Trial (RCT) aimed at assessing the effectiveness and acceptance of a linear internet-delivered cognitive-behavioral therapy (ICBT) intervention for AjD. A two-armed RCT was designed to compare an intervention group to a waiting list control group. Participants from the intervention group will receive TAO, an internet-based program for AjD composed of seven modules. TAO combines CBT and Positive Psychology strategies in order to provide patients with complete support, reducing their clinical symptoms and enhancing their capacity to overcome everyday adversity. Participants will also receive short weekly telephone support. Participants in the control group will be assessed before and after a seven-week waiting period, and then they will be offered the same intervention. Participants will be randomly assigned to one of the 2 groups. Measurements will be taken at five different moments: baseline, post-intervention, and three follow-up periods (3-, 6- and 12-month). BDI-II and BAI will be used as primary outcome measures. Secondary outcomes will be symptoms of AjD, posttraumatic growth, positive and negative affect, and quality of life. The development of ICBT programs like TAO responds to a need for evidence-based interventions that can reach most of the people who need them, reducing the burden and cost of mental disorders. More specifically, TAO targets AjD and will entail a step forward in the treatment of this prevalent but under-researched disorder. Finally, it should be noted that this is the first RCT focusing on an internet-based

  14. Guided Web-Based Cognitive Behavior Therapy for Perfectionism: Results From Two Different Randomized Controlled Trials.

    Science.gov (United States)

    Rozental, Alexander; Shafran, Roz; Wade, Tracey D; Kothari, Radha; Egan, Sarah J; Ekberg, Linda; Wiss, Maria; Carlbring, Per; Andersson, Gerhard

    2018-04-26

    Perfectionism can become a debilitating condition that may negatively affect functioning in multiple areas, including mental health. Prior research has indicated that internet-based cognitive behavioral therapy can be beneficial, but few studies have included follow-up data. The objective of this study was to explore the outcomes at follow-up of internet-based cognitive behavioral therapy with guided self-help, delivered as 2 separate randomized controlled trials conducted in Sweden and the United Kingdom. In total, 120 participants randomly assigned to internet-based cognitive behavioral therapy were included in both intention-to-treat and completer analyses: 78 in the Swedish trial and 62 in the UK trial. The primary outcome measure was the Frost Multidimensional Perfectionism Scale, Concern over Mistakes subscale (FMPS CM). Secondary outcome measures varied between the trials and consisted of the Clinical Perfectionism Questionnaire (CPQ; both trials), the 9-item Patient Health Questionnaire (PHQ-9; Swedish trial), the 7-item Generalized Anxiety Disorder scale (GAD-7; Swedish trial), and the 21-item Depression Anxiety Stress Scale (DASS-21; UK trial). Follow-up occurred after 6 months for the UK trial and after 12 months for the Swedish trial. Analysis of covariance revealed a significant difference between pretreatment and follow-up in both studies. Intention-to-treat within-group Cohen d effect sizes were 1.21 (Swedish trial; 95% CI 0.86-1.54) and 1.24 (UK trial; 95% CI 0.85-1.62) for the FMPS CM. Furthermore, 29 (59%; Swedish trial) and 15 (43%; UK trial) of the participants met the criteria for recovery on the FMPS CM. Improvements were also significant for the CPQ, with effect sizes of 1.32 (Swedish trial; 95% CI 0.97-1.66) and 1.49 (UK trial; 95% CI 1.09-1.88); the PHQ-9, effect size 0.60 (95% CI 0.28-0.92); the GAD-7, effect size 0.67 (95% CI 0.34-0.99); and the DASS-21, effect size 0.50 (95% CI 0.13-0.85). The results are promising for the use of

  15. Web-Based Tools for Educating Caregivers About Childhood Fever: A Randomized Controlled Trial.

    Science.gov (United States)

    Hart, Lara; Nedadur, Rashmi; Reardon, Jaime; Sirizzotti, Natalie; Poonai, Caroline; Speechley, Kathy N; Loftus, Jay; Miller, Michael; Salvadori, Marina; Spadafora, Amanda; Poonai, Naveen

    2016-10-04

    Fever is a common reason for an emergency department visit and misconceptions abound. We assessed the effectiveness of an interactive Web-based module (WBM), read-only Web site (ROW), and written and verbal information (standard of care [SOC]) to educate caregivers about fever in their children. Caregivers in the emergency department were randomized to a WBM, ROW, or SOC. Primary outcome was the gain score on a novel questionnaire testing knowledge surrounding measurement and management of fever. Secondary outcome was caregiver satisfaction with the interventions. There were 77, 79, and 77 participants in the WBM, ROW, and SOC groups, respectively. With a maximum of 33 points, Web-based interventions were associated with a significant mean (SD) pretest to immediate posttest gain score of 3.5 (4.2) for WBM (P ROW > SOC (P Web-based interventions are associated with significant improvements in caregiver knowledge about fever and high caregiver satisfaction. These interventions should be used to educate caregivers pending the demonstration of improved patient-centered outcomes.

  16. Fault Diagnosis for Hydraulic Servo System Using Compressed Random Subspace Based ReliefF

    Directory of Open Access Journals (Sweden)

    Yu Ding

    2018-01-01

    Full Text Available Playing an important role in electromechanical systems, the hydraulic servo system is crucial to mechanical systems such as engineering machinery, metallurgical machinery, ships, and other equipment. Fault diagnosis based on monitoring and sensory signals plays an important role in avoiding catastrophic accidents and enormous economic losses. This study presents a fault diagnosis scheme for the hydraulic servo system using the compressed random subspace based ReliefF (CRSR) method. From the point of view of feature selection, the scheme utilizes the CRSR method to determine the most stable feature combination that simultaneously contains the most adequate information. Based on the feature selection structure of ReliefF, CRSR employs feature integration rules in the compressed domain. Meanwhile, CRSR substitutes information entropy and fuzzy membership for the traditional distance measurement index. The proposed CRSR method is able to enhance the robustness of the feature information against interference while selecting the feature combination with balanced information expressing ability. To demonstrate the effectiveness of the proposed CRSR method, a hydraulic servo system joint simulation model is constructed in HyPneu and Simulink, and three fault modes are injected to generate the validation data.
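
    As a rough illustration of the ReliefF backbone that CRSR builds on (the compressed-subspace integration, information-entropy, and fuzzy-membership indices of the paper are omitted), a minimal distance-based ReliefF might look like the sketch below; the data and parameter choices are illustrative assumptions.

```python
import numpy as np

def relieff(X, y, k=2):
    """Minimal ReliefF: reward features that differ across classes
    (nearest misses) and penalize features that differ within a class
    (nearest hits). Assumes features scaled to [0, 1] and at least
    k + 1 samples per class."""
    n, d = X.shape
    w = np.zeros(d)
    for i in range(n):
        dist = np.abs(X - X[i]).sum(axis=1)  # L1 distance to every sample
        dist[i] = np.inf                     # never pick the sample itself
        hits = np.argsort(np.where(y == y[i], dist, np.inf))[:k]
        misses = np.argsort(np.where(y != y[i], dist, np.inf))[:k]
        w -= np.abs(X[hits] - X[i]).mean(axis=0) / n
        w += np.abs(X[misses] - X[i]).mean(axis=0) / n
    return w

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = np.array([[0.0, 0.5], [0.1, 0.1], [0.2, 0.9],
              [0.8, 0.3], [0.9, 0.7], [1.0, 0.2]])
y = np.array([0, 0, 0, 1, 1, 1])
w = relieff(X, y)
```

    The informative feature receives the larger weight, which is the selection criterion CRSR refines with its entropy- and membership-based indices.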

  17. A cluster-based randomized controlled trial promoting community participation in arsenic mitigation efforts in Bangladesh

    OpenAIRE

    George, Christine Marie; van Geen, Alexander; Slavkovich, Vesna; Singha, Ashit; Levy, Diane; Islam, Tariqul; Ahmed, Kazi Matin; Moon-Howard, Joyce; Tarozzi, Alessandro; Liu, Xinhua; Factor-Litvak, Pam; Graziano, Joseph

    2012-01-01

    Abstract Objective To reduce arsenic (As) exposure, we evaluated the effectiveness of training community members to perform water arsenic (WAs) testing and provide As education compared to sending representatives from outside communities to conduct these tasks. Methods We conducted a cluster-based randomized controlled trial of 20 villages in Singair, Bangladesh. Fifty eligible respondents were randomly selected in each village. In 10 villages, a community member provided As education and WAs...

  18. Dissemination of Evidence-Based Antipsychotic Prescribing Guidelines to Nursing Homes: A Cluster Randomized Trial.

    Science.gov (United States)

    Tjia, Jennifer; Field, Terry; Mazor, Kathleen; Lemay, Celeste A; Kanaan, Abir O; Donovan, Jennifer L; Briesacher, Becky A; Peterson, Daniel; Pandolfi, Michelle; Spenard, Ann; Gurwitz, Jerry H

    2015-07-01

    To evaluate the effectiveness of efforts to translate and disseminate evidence-based guidelines about atypical antipsychotic use to nursing homes (NHs). Three-arm, cluster randomized trial of NHs in the state of Connecticut. Evidence-based guidelines for atypical antipsychotic prescribing were translated into a toolkit targeting NH stakeholders, and 42 NHs were recruited and randomized to one of three toolkit dissemination strategies: mailed toolkit delivery (minimal intensity); mailed toolkit delivery with quarterly audit and feedback reports about facility-level antipsychotic prescribing (moderate intensity); and in-person toolkit delivery with academic detailing, on-site behavioral management training, and quarterly audit and feedback reports (high intensity). Outcomes were evaluated using the Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM) framework. Toolkit awareness among 30% (7/23) of leadership of low-intensity NHs, 54% (19/35) of moderate-intensity NHs, and 82% (18/22) of high-intensity NHs reflected adoption and implementation of the intervention. Highest levels of use and knowledge among direct care staff were reported in high-intensity NHs. Antipsychotic prescribing levels declined during the study period, but there were no statistically significant differences between study arms or from secular trends. RE-AIM indicators suggest some success in disseminating the toolkit and differences in reach, adoption, and implementation according to dissemination strategy but no measurable effect on antipsychotic prescribing trends. Further dissemination to external stakeholders such as psychiatry consultants and hospitals may be needed to influence antipsychotic prescribing for NH residents. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  19. Internet-based cognitive behavior therapy for major depressive disorder: A randomized controlled trial.

    Science.gov (United States)

    Rosso, Isabelle M; Killgore, William D S; Olson, Elizabeth A; Webb, Christian A; Fukunaga, Rena; Auerbach, Randy P; Gogel, Hannah; Buchholz, Jennifer L; Rauch, Scott L

    2017-03-01

    Prior research has shown that the Sadness Program, a technician-assisted Internet-based cognitive behavioral therapy (iCBT) intervention developed in Australia, is effective for treating major depressive disorder (MDD). The current study aimed to expand this work by adapting the protocol for an American population and testing the Sadness Program with an attention control group. In this parallel-group, randomized controlled trial, adult MDD participants (18-45 years) were randomized to a 10-week period of iCBT (n = 37) or monitored attention control (MAC; n = 40). Participants in the iCBT group completed six online therapy lessons, which included access to content summaries and homework assignments. During the 10-week trial, iCBT and MAC participants logged into the web-based system six times to complete self-report symptom scales, and a nonclinician technician contacted participants weekly to provide encouragement and support. The primary outcome was the Hamilton Rating Scale for Depression (HRSD), and the secondary outcomes were the Patient Health Questionnaire-9 and Kessler-10. Intent-to-treat analyses revealed significantly greater reductions in depressive symptoms in iCBT compared with MAC participants, using both the self-report measures and the clinician-rated HRSD (d = -0.80). Importantly, iCBT participants also showed significantly higher rates of clinical response and remission. Exploratory analyses did not support illness severity as a moderator of treatment outcome. The Sadness Program led to significant reductions in depression and distress symptoms. With its potential to be delivered in a scalable, cost-efficient manner, iCBT is a promising strategy to enhance access to effective care. © 2016 Wiley Periodicals, Inc.

  20. Development and evaluation of an Individualized Outcome Measure (IOM) for randomized controlled trials in mental health.

    Science.gov (United States)

    Pesola, Francesca; Williams, Julie; Bird, Victoria; Freidl, Marion; Le Boutillier, Clair; Leamy, Mary; Macpherson, Rob; Slade, Mike

    2015-12-01

    Pre-defined, researcher-selected outcomes are routinely used as the clinical end-point in randomized controlled trials (RCTs); however, individualized approaches may be an effective way to assess outcome in mental health research. The present study describes the development and evaluation of the Individualized Outcome Measure (IOM), which is a patient-specific outcome measure to be used for RCTs of complex interventions. IOM was developed using a narrative review, expert consultation and piloting with mental health service users (n = 20). The final version of IOM comprises two components: Goal Attainment (GA) and Personalized Primary Outcome (PPO). For GA, patients identify one relevant goal at baseline and rate its attainment at follow-up. For PPO, patients choose an outcome domain related to their goal from a pre-defined list at baseline, and complete a standardized questionnaire assessing the chosen outcome domain at baseline and follow-up. A feasibility study indicated that IOM had adequate completion (89%) and acceptability (96%) rates in a clinical sample (n = 84). IOM was then evaluated in a RCT (ISRCTN02507940). GA and PPO components were associated with each other and with the trial primary outcome. The use of the PPO component of IOM as the primary outcome could be considered in future RCTs. Copyright © 2015 John Wiley & Sons, Ltd.

  1. The determination of random event-rate based on counter live-time measurement

    Energy Technology Data Exchange (ETDEWEB)

    Radeka, V [Institut Rudjer Boskovic, Zagreb, Yugoslavia (Croatia)

    1962-04-15

    The method of determining the true rate of events generated by a random process based on a counting device and live-time measurement is analysed. The determined rate is basically independent of the counter's resolving time. It is shown that the error caused by the resolving time of an event-to-pulse converter at the input of the system is substantially lower than the actual reduction of the rate by the converter itself. Live-time measurement error is discussed with respect to the application limit of the method. The analysis given may be applied to pulse-height analysers and counters using live-time measurement. The method can simply be realized in pulse-height analysers and counters with electronic timers. (author)
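
    The core of the method — refer the observed counts to the measured live time rather than the elapsed time — can be sketched as follows; the counts and dead-time figures are illustrative, not from the paper.

```python
def true_event_rate(counts, elapsed_s, dead_s):
    """True random-event rate from a live-time measurement.

    The counter is blind for dead_s seconds in total, so dividing the
    counts by the live time (elapsed minus dead) gives an estimate that
    is basically independent of the counter's resolving time.
    """
    live_s = elapsed_s - dead_s
    if live_s <= 0:
        raise ValueError("measured dead time exceeds elapsed time")
    return counts / live_s

# 90 000 counts in 1 s with 0.9 s of accumulated dead time:
naive = 90_000 / 1.0                            # 90 kHz, badly underestimates
corrected = true_event_rate(90_000, 1.0, 0.9)   # ~900 kHz
```

    The naive estimate divides by elapsed time and so depends strongly on the resolving time; the live-time estimate does not, which is the point of the method.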

  2. Internet-based guided self-help for posttraumatic stress disorder (PTSD): Randomized controlled trial.

    Science.gov (United States)

    Lewis, Catrin E; Farewell, Daniel; Groves, Vicky; Kitchiner, Neil J; Roberts, Neil P; Vick, Tracey; Bisson, Jonathan I

    2017-06-01

    There are numerous barriers that limit access to evidence-based treatment for posttraumatic stress disorder (PTSD). Internet-based guided self-help is a treatment option that may help widen access to effective intervention, but the approach has not been sufficiently explored for the treatment of PTSD. Forty two adults with DSM-5 PTSD of mild to moderate severity were randomly allocated to internet-based self-help with up to 3 h of therapist assistance, or to a delayed treatment control group. The internet-based program included eight modules that focused on psychoeducation, grounding, relaxation, behavioural activation, real-life and imaginal exposure, cognitive therapy, and relapse prevention. The primary outcome measure was reduction in clinician-rated traumatic stress symptoms using the clinician administered PTSD scale for DSM-V (CAPS-5). Secondary outcomes were self-reported PTSD symptoms, depression, anxiety, alcohol use, perceived social support, and functional impairment. Posttreatment, the internet-based guided self-help group had significantly lower clinician assessed PTSD symptoms than the delayed treatment control group (between-group effect size Cohen's d = 1.86). The difference was maintained at 1-month follow-up and dissipated once both groups had received treatment. Similar patterns of difference between the two groups were found for depression, anxiety, and functional impairment. The average contact with treating clinicians was 2½ h. Internet-based trauma-focused guided self-help for PTSD is a promising treatment option that requires far less therapist time than current first line face-to-face psychological therapy. © 2017 Wiley Periodicals, Inc.

  3. Teaching of evidence-based medicine to medical students in Mexico: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Sánchez-Mendiola Melchor

    2012-11-01

    Full Text Available Abstract Background Evidence-Based Medicine (EBM) is an important competency for the healthcare professional. Experimental evidence of EBM educational interventions from rigorous research studies is limited. The main objective of this study was to assess EBM learning (knowledge, attitudes and self-reported skills) in undergraduate medical students with a randomized controlled trial. Methods The educational intervention was a one-semester EBM course in the 5th year of a public medical school in Mexico. The study design was an experimental parallel-group randomized controlled trial for the main outcome measures in the 5th year class (M5 EBM vs. M5 non-EBM groups), and quasi-experimental with static-group comparisons for the 4th year (M4, not yet exposed) and 6th year (M6, exposed 6 months to a year earlier) groups. EBM attitudes, knowledge and self-reported skills were measured using Taylor's questionnaire and a summative exam comprising a 100-item multiple-choice question (MCQ) test. Results 289 medical students were assessed: M5 EBM=48, M5 non-EBM=47, M4=87, and M6=107. There was a higher reported use of the Cochrane Library and secondary journals in the intervention group (M5 EBM vs. M5 non-EBM). Critical appraisal skills and attitude scores were higher in the intervention group (M5 EBM) and in the group of students exposed to EBM instruction during the previous year (M6). The knowledge level was higher after the intervention in the M5 EBM group compared to the M5 non-EBM group (d=0.88 with Taylor's instrument and 3.54 with the 100-item MCQ test). M6 students that received the intervention in the previous year had a knowledge score higher than the M4 and M5 non-EBM groups, but lower than the M5 EBM group. Conclusions Formal medical student training in EBM produced higher scores in attitudes, knowledge and self-reported critical appraisal skills compared with a randomized control group. Data from the concurrent groups add validity evidence to the

  4. A randomized, controlled clinical trial: the effect of mindfulness-based cognitive therapy on generalized anxiety disorder among Chinese community patients: protocol for a randomized trial

    Directory of Open Access Journals (Sweden)

    Wong Samuel YS

    2011-11-01

    Full Text Available Abstract Background Research suggests that an eight-week Mindfulness-Based Cognitive Therapy (MBCT) program may be effective in the treatment of generalized anxiety disorder. Our objective is to compare the clinical effectiveness of the MBCT program with a psycho-education programme and usual care in reducing anxiety symptoms in people suffering from generalized anxiety disorder. Methods A three-armed randomized, controlled clinical trial including 9-month post-treatment follow-up is proposed. Participants who screen positive for generalized anxiety disorder using the Structured Clinical Interview for DSM-IV (SCID) will be recruited from community-based clinics. 228 participants will be randomly allocated to the MBCT program plus usual care, the psycho-education program plus usual care, or the usual care group. Validated Chinese versions of instruments measuring anxiety and worry symptoms, depression, quality of life and health service utilization will be used. Our primary end point is the change in anxiety and worry scores (Beck Anxiety Inventory and Penn State Worry Scale) from baseline to the end of intervention. For primary analyses, treatment outcomes will be assessed by ANCOVA, with change in anxiety score as the outcome variable, while the baseline anxiety score and other baseline characteristics that significantly differ between groups will serve as covariates. Conclusions This is the first randomized controlled trial to compare the effectiveness of MBCT with an active control; the findings will advance current knowledge in the management of GAD and the way that group interventions can be delivered, and will inform future research. Unique Trial Number (assigned by the Centre for Clinical Trials, Clinical Trials Registry, The Chinese University of Hong Kong): CUHK_CCT00267

  5. Multiparty correlation measure based on the cumulant

    International Nuclear Information System (INIS)

    Zhou, D. L.; Zeng, B.; Xu, Z.; You, L.

    2006-01-01

    We propose a genuine multiparty correlation measure for a multiparty quantum system as the trace norm of the cumulant of the state. The legitimacy of our multiparty correlation measure is explicitly demonstrated by proving that it satisfies the five basic conditions required for a correlation measure. As an application we construct an efficient algorithm for the calculation of our measure for all stabilizer states
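
    For the bipartite case the cumulant reduces to ρ_AB − ρ_A ⊗ ρ_B, so the measure can be evaluated numerically as a trace norm. The sketch below is an illustration of that special case (not the authors' algorithm): the measure vanishes on a product state and equals 3/2 on a Bell state.

```python
import numpy as np

def trace_norm(m):
    # Trace norm = sum of singular values.
    return np.linalg.svd(m, compute_uv=False).sum()

def bipartite_correlation(rho):
    """Trace norm of the second-order cumulant rho_AB - rho_A (x) rho_B
    for a two-qubit density matrix rho (4x4, row index (a b))."""
    r = rho.reshape(2, 2, 2, 2)            # r[a, b, a', b']
    rho_a = np.trace(r, axis1=1, axis2=3)  # partial trace over B
    rho_b = np.trace(r, axis1=0, axis2=2)  # partial trace over A
    return trace_norm(rho - np.kron(rho_a, rho_b))

bell = np.zeros(4)
bell[[0, 3]] = 1 / np.sqrt(2)              # (|00> + |11>)/sqrt(2)
rho_bell = np.outer(bell, bell)
rho_prod = np.diag([1.0, 0, 0, 0])         # |00><00|, no correlations
```

    For the Bell state the cumulant is ρ − I/4, whose eigenvalues are 3/4 and −1/4 (three-fold), giving a trace norm of 3/2.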

  6. Random numbers from vacuum fluctuations

    International Nuclear Information System (INIS)

    Shi, Yicheng; Kurtsiefer, Christian; Chng, Brenda

    2016-01-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  7. Random numbers from vacuum fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com [Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117542 (Singapore); Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore); Chng, Brenda [Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore)

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
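
    The extraction stage can be illustrated with a toy 16-bit Fibonacci LFSR (taps at bits 16, 14, 13, 11, a maximal-length choice). XOR-ing the raw digitized bits with the LFSR stream is only a stand-in for the authors' extractor: the abstract does not specify the register length or combining rule, so everything below is an assumption for illustration.

```python
def lfsr16_next(state):
    """One step of a 16-bit Fibonacci LFSR with taps 16, 14, 13, 11."""
    bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return ((state >> 1) | (bit << 15)) & 0xFFFF

def whiten(raw_bits, seed=0xACE1):
    """XOR raw digitized bits with the LFSR output stream."""
    state, out = seed, []
    for b in raw_bits:
        out.append(b ^ (state & 1))
        state = lfsr16_next(state)
    return out
```

    Because the taps give a maximal-length sequence, the register cycles through all 65 535 nonzero states before repeating; applying the same stream twice undoes the whitening, since XOR is its own inverse.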

  8. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
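
    The T-score arithmetic behind this comparison is simple; the sketch below uses made-up reference-range numbers to show how switching reference samples shifts the score and can change the classification (the WHO cutoffs are standard, the BMD values are not the study's).

```python
def t_score(bmd, ref_mean, ref_sd):
    """Individual's BMD expressed in SDs relative to a reference range."""
    return (bmd - ref_mean) / ref_sd

def classify(t):
    # WHO operational categories
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia"
    return "normal"

# The same BMD (0.85 g/cm^2) against two hypothetical reference ranges:
t_pop = t_score(0.85, ref_mean=1.00, ref_sd=0.12)     # population-based sample
t_healthy = t_score(0.85, ref_mean=1.10, ref_sd=0.10) # selected healthy subset
```

    Here the identical measurement yields T = −1.25 against one reference range and T = −2.5 against the other, illustrating how the choice of sampling technique alone can move a patient across a diagnostic threshold.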

  9. Home-based family intervention increases knowledge, communication and living donation rates: a randomized controlled trial.

    Science.gov (United States)

    Ismail, S Y; Luchtenburg, A E; Timman, R; Zuidema, W C; Boonstra, C; Weimar, W; Busschbach, J J V; Massey, E K

    2014-08-01

    Our aim was to develop and test an educational program to support well-informed decision making among patients and their social network regarding living donor kidney transplantation (LDKT). One hundred sixty-three patients who were unable to find a living donor were randomized to standard care or standard care plus home-based education. In the education condition, patients and members of their social network participated in home-based educational meetings and discussed renal replacement therapy options. Patients and invitees completed pre-post self-report questionnaires measuring knowledge, risk perception, communication, self-efficacy and subjective norm. LDKT activities were observed for 6 months postintervention. Patients in the experimental group showed significantly more improvements in knowledge (p communication (p = 0.012) compared with the control group. The invitees showed pre-post increases in knowledge (p decision making and promotes access to LDKT. © Copyright 2014 The American Society of Transplantation and the American Society of Transplant Surgeons.

  10. A Randomized Pilot Study of a Phone-Based Mindfulness and Weight Loss Program.

    Science.gov (United States)

    Carpenter, Kelly M; Vickerman, Katrina A; Salmon, Erica E; Javitz, Harold S; Epel, Elissa S; Lovejoy, Jennifer C

    2017-10-06

    This study evaluated the feasibility and efficacy of integrating mindfulness training into a phone-based weight loss program to improve outcomes in those with high levels of emotional eating. Participants were 75 enrollees into an employer-sponsored weight loss program who reported high levels of overeating in response to thoughts and feelings. Seventy-five overweight and obese participants (92% female, 65% Caucasian, aged 26 to 68 years) were randomized to the new mindfulness weight loss program (n = 50) or the standard behavioral weight loss program (n = 25). Both programs consisted of 11 coaching calls with health coaches and registered dietitians with supplemental online materials. Satisfaction, engagement, and percent weight lost did not significantly differ for intervention vs. control at six months. Intervention participants had significantly better scores at six-month follow-up on mindful eating, binge eating, experiential avoidance, and one mindfulness subscale. Exploratory analyses showed that improvements on several measures predicted more weight loss in the intervention group. This pilot study found that integrating mindfulness into a brief phone-based behavioral weight loss program was feasible and acceptable to participants, but did not produce greater weight loss on average, despite hypothesized changes in mindful eating. Only one third of intervention participants reported participating in mindfulness exercises regularly. Mechanisms of change observed within the intervention group suggest that for adults with high levels of emotional eating those who embrace mindful eating and meditation may lose more weight with a mindfulness intervention.

  11. Study protocol: home-based telehealth stroke care: a randomized trial for veterans

    Directory of Open Access Journals (Sweden)

    McGee-Hernandez Nancy

    2010-06-01

    Full Text Available Abstract Background Stroke is one of the most disabling and costly impairments of adulthood in the United States. Stroke patients clearly benefit from intensive inpatient care, but due to the high cost, there is considerable interest in implementing interventions to reduce hospital lengths of stay. Early discharge rehabilitation programs require coordinated, well-organized home-based rehabilitation, yet lack of sufficient information about the home setting impedes successful rehabilitation. This trial examines a multifaceted telerehabilitation (TR) intervention that uses telehealth technology to simultaneously evaluate the home environment, assess the patient's mobility skills, initiate rehabilitative treatment, prescribe exercises tailored for stroke patients and provide periodic goal-oriented reassessment, feedback and encouragement. Methods We describe an ongoing Phase II, 2-arm, 3-site randomized controlled trial (RCT) that determines primarily the effect of TR on physical function and secondarily the effect on disability, falls-related self-efficacy, and patient satisfaction. Fifty participants with a diagnosis of ischemic or hemorrhagic stroke will be randomly assigned to one of two groups: (a) TR; or (b) Usual Care. The TR intervention uses a combination of three videotaped visits and five telephone calls, an in-home messaging device, and additional telephonic contact as needed over a 3-month study period, to provide a progressive rehabilitative intervention with a treatment goal of safe functional mobility of the individual within an accessible home environment. Dependent variables will be measured at baseline, 3-, and 6-months and analyzed with a linear mixed-effects model across all time points. Discussion For patients recovering from stroke, the use of TR to provide home assessments and follow-up training in prescribed equipment has the potential to effectively supplement existing home health services, assist transition to home and

  12. Shape measurement system for single point incremental forming (SPIF) manufacts by using trinocular vision and random pattern

    International Nuclear Information System (INIS)

    Setti, Francesco; Bini, Ruggero; Lunardelli, Massimo; Bosetti, Paolo; Bruschi, Stefania; De Cecco, Mariolino

    2012-01-01

Many contemporary works show the interest of the scientific community in measuring the shape of artefacts made by single point incremental forming. In this paper, we present an algorithm able to detect feature points in a random pattern, check the compatibility of associations by exploiting multi-stereo constraints to reject outliers, and perform a 3D reconstruction from dense random patterns. The algorithm is suitable for real-time application: it needs just three images and relatively fast synchronous processing. The proposed method has been tested on a simple geometry, and results have been compared with a coordinate measuring machine acquisition. (paper)

  13. A universal algorithm to generate pseudo-random numbers based on uniform mapping as homeomorphism

    International Nuclear Information System (INIS)

    Fu-Lai, Wang

    2010-01-01

A specific uniform map is constructed as a homeomorphism mapping chaotic time series into [0,1] to obtain sequences with a standard uniform distribution. Under the uniform map, a chaotic orbit and the sequence obtained from it are topologically equivalent, so the map preserves most dynamic properties of chaotic systems, such as permutation entropy. Based on the uniform map, a universal algorithm to generate pseudo-random numbers is proposed, and the pseudo-random series is tested both theoretically and experimentally to follow the standard 0-1 random distribution. The algorithm is not complex, imposes no high requirements on computer hardware, and thus computes quickly. The method not only extends the parameter space but also avoids the drawback of a small function space caused by constraints on the chaotic maps used to generate pseudo-random numbers. The algorithm can be applied to any chaotic system, produces pseudo-random sequences of high quality, and thus can serve as a good universal pseudo-random number generator. (general)
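The idea can be illustrated with the fully chaotic logistic map, whose invariant density is known in closed form: the arcsine-based map below is a textbook example of a homeomorphism onto the uniform distribution. This is a sketch of the general technique, not the authors' specific construction, and the seed value is arbitrary:

```python
import math

def logistic_orbit(x0, n, burn_in=100):
    """Iterate the fully chaotic logistic map x -> 4x(1-x)."""
    x = x0
    for _ in range(burn_in):              # discard the transient
        x = 4.0 * x * (1.0 - x)
    orbit = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        orbit.append(x)
    return orbit

def uniform_map(x):
    """Homeomorphism of [0,1] sending the logistic invariant density
    1/(pi*sqrt(x(1-x))) to the uniform density on [0,1]."""
    return (2.0 / math.pi) * math.asin(math.sqrt(x))

orbit = logistic_orbit(0.123456, 10000)
u = [uniform_map(x) for x in orbit]       # approximately uniform on [0,1]
```

Because the uniform map is a homeomorphism, the transformed sequence inherits the topological dynamics of the chaotic orbit while its marginal distribution becomes uniform.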

  14. A universal algorithm to generate pseudo-random numbers based on uniform mapping as homeomorphism

    Science.gov (United States)

    Wang, Fu-Lai

    2010-09-01

A specific uniform map is constructed as a homeomorphism mapping chaotic time series into [0,1] to obtain sequences with a standard uniform distribution. Under the uniform map, a chaotic orbit and the sequence obtained from it are topologically equivalent, so the map preserves most dynamic properties of chaotic systems, such as permutation entropy. Based on the uniform map, a universal algorithm to generate pseudo-random numbers is proposed, and the pseudo-random series is tested both theoretically and experimentally to follow the standard 0-1 random distribution. The algorithm is not complex, imposes no high requirements on computer hardware, and thus computes quickly. The method not only extends the parameter space but also avoids the drawback of a small function space caused by constraints on the chaotic maps used to generate pseudo-random numbers. The algorithm can be applied to any chaotic system, produces pseudo-random sequences of high quality, and thus can serve as a good universal pseudo-random number generator.

  15. A Bidirectional Generalized Synchronization Theorem-Based Chaotic Pseudo-random Number Generator

    Directory of Open Access Journals (Sweden)

    Han Shuangshuang

    2013-07-01

Based on a bidirectional generalized synchronization theorem for discrete chaotic systems, this paper introduces a new 5-dimensional bidirectional generalized chaos synchronization system (BGCSDS), whose prototype is a novel chaotic system introduced in [12]. Numerical simulation showed that two pairs of variables of the BGCSDS achieve generalized chaos synchronization via a transform H. A chaos-based pseudo-random number generator (CPNG) was designed using the new BGCSDS. The FIPS-140-2 tests issued by the National Institute of Standards and Technology (NIST) were used to verify the randomness of the 1000 binary number sequences generated via the CPNG and the RC4 algorithm, respectively. The results showed that all the tested sequences passed the FIPS-140-2 tests. A confidence interval analysis showed that the statistical randomness properties of the sequences generated via the CPNG and the RC4 algorithm do not differ significantly.

  16. Model-based cartilage thickness measurement in the submillimeter range

    International Nuclear Information System (INIS)

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-01-01

    Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical

  17. Video-based measurements for wireless capsule endoscope tracking

    International Nuclear Information System (INIS)

    Spyrou, Evaggelos; Iakovidis, Dimitris K

    2014-01-01

    The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions. (paper)
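Registration by random sample consensus can be sketched in its simplest form: hypothesize a transform from a minimal sample of matched feature points, count inliers, and keep the best hypothesis. The toy example below estimates only a pure 2D translation between frames (the paper also estimates rotation); the point lists and tolerance are made up for illustration:

```python
import math
import random

def ransac_translation(src, dst, n_iter=200, tol=2.0, seed=0):
    """Estimate a 2D translation between matched point sets by RANSAC:
    hypothesize from one random match, keep the largest inlier set."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), -1
    for _ in range(n_iter):
        i = rng.randrange(len(src))
        tx, ty = dst[i][0] - src[i][0], dst[i][1] - src[i][1]
        inliers = sum(
            1 for (sx, sy), (dx, dy) in zip(src, dst)
            if math.hypot(dx - sx - tx, dy - sy - ty) < tol
        )
        if inliers > best_inliers:
            best_inliers, best_t = inliers, (tx, ty)
    return best_t, best_inliers

# synthetic matches: true shift (5, -3) plus two gross outliers
src = [(float(x), float(y)) for x in range(10) for y in range(10)]
dst = [(x + 5.0, y - 3.0) for x, y in src]
dst[0] = (999.0, 999.0)   # mismatched feature (outlier)
dst[1] = (-50.0, 40.0)    # mismatched feature (outlier)
t, inliers = ransac_translation(src, dst)
```

Because any hypothesis drawn from a correct match explains all 98 correct matches exactly, RANSAC recovers the true displacement despite the outliers.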

  18. Video-based measurements for wireless capsule endoscope tracking

    Science.gov (United States)

    Spyrou, Evaggelos; Iakovidis, Dimitris K.

    2014-01-01

    The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions.

  19. Randomized controlled resistance training based physical activity trial for central European nursing home residing older adults.

    Science.gov (United States)

    Barthalos, Istvan; Dorgo, Sandor; Kopkáné Plachy, Judit; Szakály, Zsolt; Ihász, Ferenc; Ráczné Németh, Teodóra; Bognár, József

    2016-10-01

Nursing home residing older adults often experience fear of sickness or death, functional impairment and pain. It is difficult for these older adults to maintain a physically active lifestyle and to keep a positive outlook on life. This study evaluated the changes in quality of life, attitude to aging, assertiveness, physical fitness and body composition of nursing home residing elderly through a 15-week organized resistance training based physical activity program. Inactive older adults living in a state-financed nursing home (N=45) were randomly divided into two intervention groups and a control group. Both intervention groups were assigned to two physical activity sessions a week, but one of these groups also had weekly discussions on health and quality of life (Mental group). Data on anthropometric measures and fitness performance, as well as quality of life and attitudes to aging survey data, were collected. Due to a low attendance rate, 12 subjects were excluded from the analyses. Statistical analysis included paired-samples t-tests and repeated measures analysis of variance. Both intervention groups significantly improved their social participation and their upper- and lower-body strength scores. Subjects in the Mental group also showed improvement in the agility fitness test and certain survey scales. No positive changes were detected in attitude towards aging or body composition measures in any group. The post-hoc results suggest that the Mental group improved significantly more than the control group. Regular physical activity with discussions on health and quality of life made a more meaningful difference for older adults living in a nursing home than physical activity alone. Because all participants were influenced by the program, further exploration of this area is suggested for a better understanding of enhanced quality of life.

  20. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  1. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    Science.gov (United States)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
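The model's mechanistic core, order arrivals and cancellations as independent Poisson processes, can be sketched by drawing exponential inter-event gaps; the rates and horizon below are arbitrary illustration values, not parameters from the paper:

```python
import random

def poisson_events(rate, horizon, rng):
    """Event times of a homogeneous Poisson process on [0, horizon]:
    inter-event gaps are exponential with mean 1/rate."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
# independent streams, e.g. limit-order arrivals and cancellations
arrivals = poisson_events(rate=2.0, horizon=1000.0, rng=rng)
cancels = poisson_events(rate=0.5, horizon=1000.0, rng=rng)
```

Feeding such streams into a simulated order book is what lets the model's scaling predictions (spread, price impact, diffusion rate) be expressed in terms of the order-flow rates alone.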

  2. A Fast Reactive Power Optimization in Distribution Network Based on Large Random Matrix Theory and Data Analysis

    Directory of Open Access Journals (Sweden)

    Wanxing Sheng

    2016-05-01

In this paper, a reactive power optimization method based on historical data is investigated to solve the dynamic reactive power optimization problem in distribution networks. In order to reflect the variation of loads, network loads are represented in the form of a random matrix. Load similarity (LS) is defined to measure the degree of similarity between the loads on different days, and a calculation method for the load similarity of the load random matrix (LRM) is presented. By calculating the load similarity between the forecast random matrix and the random matrices of historical loads, the historical reactive power optimization dispatching scheme that best matches the forecast load can be found for reactive power control. The differences between daily load curves on working days and weekends in different seasons are considered in the proposed method. The proposed method is tested on a standard 14-node distribution network with three different types of load. The computational results demonstrate that the proposed method for reactive power optimization is fast, feasible and effective in distribution networks.
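The matching step can be sketched with a simple similarity measure over flattened load matrices; cosine similarity is used here as a hypothetical stand-in for the paper's LRM-based load similarity, and the tiny matrices are invented for illustration:

```python
import math

def load_similarity(a, b):
    """Cosine similarity between two load matrices of equal shape,
    flattened to vectors (stand-in for the paper's LS measure)."""
    x = [v for row in a for v in row]
    y = [v for row in b for v in row]
    dot = sum(p * q for p, q in zip(x, y))
    nx = math.sqrt(sum(p * p for p in x))
    ny = math.sqrt(sum(q * q for q in y))
    return dot / (nx * ny)

def best_match(forecast, history):
    """Index of the historical day whose load most resembles the forecast;
    its stored dispatching scheme would then be reused."""
    return max(range(len(history)),
               key=lambda i: load_similarity(forecast, history[i]))

forecast = [[1.0, 2.0, 3.0], [2.0, 2.5, 3.5]]     # nodes x time slots
history = [
    [[9.0, 1.0, 0.5], [0.2, 8.0, 1.0]],           # dissimilar day
    [[1.1, 2.1, 2.9], [2.0, 2.4, 3.6]],           # near-identical day
]
idx = best_match(forecast, history)
```

The historical day with the highest similarity supplies the reactive power dispatching scheme, avoiding a fresh optimization run.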

  3. Effectiveness in practice-based research: Looking for alternatives to the randomized controlled trial (RCT)

    NARCIS (Netherlands)

    Tavecchio, L.

    2015-01-01

Over the last decade, the status of the randomized controlled trial (RCT), hallmark of evidence-based medicine and research, has been growing strongly in general practice, social work and public health. But this type of research is practicable only in strictly controlled and well-defined settings.

  4. The Hidden Flow Structure and Metric Space of Network Embedding Algorithms Based on Random Walks.

    Science.gov (United States)

    Gu, Weiwei; Gong, Li; Lou, Xiaodan; Zhang, Jiang

    2017-10-13

Network embedding, which encodes all vertices in a network as a set of numerical vectors in accordance with its local and global structures, has drawn widespread attention. Network embedding not only learns significant features of a network, such as clustering and link prediction, but also learns latent vector representations of the nodes, which provides theoretical support for a variety of applications, such as visualization, link prediction, node classification, and recommendation. As the latest progress in this research, several algorithms based on random walks have been devised. Although those algorithms have drawn much attention for their high learning efficiency and accuracy, there is still a lack of theoretical explanation, and the transparency of those algorithms has been doubted. Here, we propose an approach based on the open-flow network model to reveal the underlying flow structure and its hidden metric space for different random walk strategies on networks. We show that the essence of embedding based on random walks is the latent metric structure defined on the open-flow network. This not only deepens our understanding of random-walk-based embedding algorithms but also helps in finding new potential applications in network embedding.
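The random-walk strategies such embeddings build on can be sketched as truncated uniform walks over an adjacency list; downstream, these walks act as the "sentences" fed to a skip-gram model, DeepWalk-style. The toy graph and walk parameters below are illustrative:

```python
import random

def random_walks(adj, walk_len=5, walks_per_node=2, seed=1):
    """Corpus of truncated uniform random walks starting from every
    node of the graph given as an adjacency list."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:              # dead end: stop early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

# toy graph: two triangles joined by the edge 2-3
adj = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
walks = random_walks(adj)
```

Each walk step follows an existing edge, so co-occurrence statistics over the corpus reflect the graph's flow structure, which is exactly the object the open-flow analysis examines.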

  5. Assessing the Promise of Standards-Based Performance Evaluation for Principals: Results from a Randomized Trial

    Science.gov (United States)

    Kimball, Steven Miller; Milanowski, Anthony; McKinney, Sarah A.

    2009-01-01

    Principals (N = 76) in a large western U.S. school district were randomly assigned to be evaluated using either a new standards-based system or to continue with the old system. It was hypothesized that principals evaluated with the new system would report clearer performance expectations, better feedback, greater fairness and system satisfaction,…

  6. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  7. School-based cognitive behavioral interventions for anxious youth: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Haugland, Bente Storm Mowatt; Raknes, Solfrid; Haaland, Aashild Tellefsen; Wergeland, Gro Janne; Bjaastad, Jon Fauskanger; Baste, Valborg; Himle, Joe; Rapee, Ron; Hoffart, Asle

    2017-03-04

Anxiety disorders are prevalent among adolescents and may have long-lasting negative consequences for the individual, the family and society. Cognitive behavioral therapy (CBT) is an effective treatment. However, many anxious youth do not seek treatment. Low-intensity CBT in schools may improve access to evidence-based services. We aim to investigate the efficacy of two CBT youth anxiety programs with different intensities (i.e., number and length of sessions), both group-based and administered as early interventions in a school setting. The objectives of the study are to examine the effects of school-based interventions for youth anxiety and to determine whether a less intensive intervention is non-inferior to a more intensive intervention. The present study is a randomized controlled trial comparing two CBT interventions to a waitlist control group. A total of 18 schools participate, and we aim to recruit 323 adolescents (12-16 years). Youth who score above a cutoff on an anxiety symptom scale will be included in the study. School nurses recruit participants and deliver the interventions, with mental health workers as co-therapists and/or supervisors. Primary outcomes are level of anxiety symptoms and anxiety-related functional impairments. Secondary outcomes are level of depressive symptoms, quality of life and general psychosocial functioning. Non-inferiority between the two active interventions will be declared if a difference of 1.4 or less is found on the anxiety symptom measure post-intervention and a difference of 0.8 on the interference scale. Effects will be analyzed by mixed-effects models, applying an intention-to-treat procedure. The present study extends previous research by comparing two programs with different intensity. A brief intervention, if effective, could more easily be subject to large-scale implementation in school health services. ClinicalTrials.gov, NCT02279251. Registered on 15 October 2014. Retrospectively registered.

  8. SAR-based change detection using hypothesis testing and Markov random field modelling

    Science.gov (United States)

    Cao, W.; Martinis, S.

    2015-04-01

The objective of this study is to automatically detect changed areas caused by natural disasters from bi-temporal co-registered and calibrated TerraSAR-X data. The technique in this paper consists of two steps. First, an automatic coarse detection step is applied, based on a statistical hypothesis test, to initialize the classification. The original analytical formula as proposed in the constant false alarm rate (CFAR) edge detector is reviewed and rewritten in a compact form of the incomplete beta function, which is a built-in routine in commercial scientific software such as MATLAB and IDL. Second, a post-classification step is introduced to optimize the noisy classification result from the previous step. Generally, an optimization problem can be formulated as a Markov random field (MRF) on which the quality of a classification is measured by an energy function; the optimal classification under the MRF corresponds to the lowest energy value. Previous studies provide methods for the optimization problem using MRFs, such as the iterated conditional modes (ICM) algorithm. Recently, a novel algorithm was presented based on graph-cut theory. This method transforms an MRF into an equivalent graph and solves the optimization problem by a max-flow/min-cut algorithm on the graph. In this study, the graph-cut algorithm is applied iteratively to improve the coarse classification. At each iteration, the parameters of the energy function for the current classification are set by the logarithmic probability density function (PDF), with the relevant parameters estimated by the method of logarithmic cumulants (MoLC). Experiments are performed on pre- and post-event TerraSAR-X data covering two flood events in Germany and Australia in 2011 and a forest fire on La Palma in 2009. The results show convincing coarse classifications and considerable improvement by the graph-cut post-classification step.
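The incomplete-beta rewrite can be illustrated numerically: for two independent multilook speckle means with equal shape parameter L, the CDF of their ratio R satisfies P(R ≤ r) = I_{r/(1+r)}(L, L), so a CFAR threshold reduces to evaluating the regularized incomplete beta function I_x(a, b). The midpoint-rule evaluation below is a small stand-in for the built-in routines the abstract mentions (MATLAB/IDL); the L and r values are illustrative:

```python
import math

def betainc(a, b, x, n=20000):
    """Regularized incomplete beta I_x(a, b) via midpoint integration
    (valid for a, b >= 1; library routines use better algorithms)."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    h = x / n
    s = sum(((i + 0.5) * h) ** (a - 1) * (1.0 - (i + 0.5) * h) ** (b - 1)
            for i in range(n))
    beta = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return s * h / beta

def ratio_cdf(r, looks):
    """P(R <= r) for the ratio of two independent gamma means with
    equal shape `looks` -- the CFAR quantity in incomplete-beta form."""
    return betainc(looks, looks, r / (1.0 + r))

p = ratio_cdf(2.0, looks=4)   # probability the intensity ratio stays below 2
```

By symmetry I_{1/2}(L, L) = 1/2, so a ratio threshold of r = 1 always sits at the median, which is a convenient sanity check on any implementation.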

  9. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    Czech Academy of Sciences Publication Activity Database

    Pospíšil, Jaroslav; Jakubík, P.; Machala, L.

    2005-01-01

Vol. 116, - (2005), s. 573-585 ISSN 0030-4026 Institutional research plan: CEZ:AV0Z10100522 Keywords: random-target measuring method * light-reflection white-noise target * digital video camera * modulation transfer function * power spectral density Subject RIV: BH - Optics, Masers, Lasers Impact factor: 0.395, year: 2005

  10. Community-based group exercise for persons with Parkinson disease: a randomized controlled trial.

    Science.gov (United States)

    Combs, Stephanie A; Diehl, M Dyer; Chrzastowski, Casey; Didrick, Nora; McCoin, Brittany; Mox, Nicholas; Staples, William H; Wayman, Jessica

    2013-01-01

The purpose of this study was to compare group boxing training to traditional group exercise on function and quality of life in persons with Parkinson disease (PD). A convenience sample of adults with PD (n = 31) was randomly assigned to boxing training or traditional exercise for 24-36 sessions, each lasting 90 minutes, over 12 weeks. Boxing training included stretching, boxing (e.g. lateral foot work, punching bags), resistance exercises, and aerobic training. Traditional exercise included stretching, resistance exercises, aerobic training, and balance activities. Participants were tested before and after completion of training on balance, balance confidence, mobility, gait velocity, gait endurance, and quality of life. The traditional exercise group demonstrated significantly greater gains in balance confidence than the boxing group, with an effect size for gait endurance of d = 0.65. Both groups demonstrated significant improvements in balance, mobility, and quality of life, with large within-group effect sizes (d ≥ 0.80). While the groups differed significantly in balance confidence after training, both groups demonstrated improvements in most outcome measures. Supporting options for long-term community-based group exercise for persons with PD will be an important future consideration for rehabilitation professionals.

  11. Oriented Markov random field based dendritic spine segmentation for fluorescence microscopy images.

    Science.gov (United States)

    Cheng, Jie; Zhou, Xiaobo; Miller, Eric L; Alvarez, Veronica A; Sabatini, Bernardo L; Wong, Stephen T C

    2010-10-01

Dendritic spines have been shown to be closely related to various functional properties of the neuron. Usually dendritic spines are manually labeled to analyze their morphological changes, which is very time-consuming and susceptible to operator bias, even with the assistance of computers. To deal with these issues, several methods have recently been proposed to automatically detect and measure dendritic spines with little human interaction. However, problems such as degraded detection performance for images with larger pixel size (e.g. 0.125 μm/pixel instead of 0.08 μm/pixel) still exist in these methods. Moreover, the shapes of detected spines are also distorted; for example, the "necks" of some spines are missed. Here we present an oriented Markov random field (OMRF) based algorithm which improves spine detection as well as its geometric characterization. We begin with the identification of a region of interest (ROI) containing all the dendrites and spines to be analyzed. For this purpose, we introduce an adaptive procedure for identifying the image background. Next, the OMRF model is discussed within a statistical framework, and the segmentation is solved as a maximum a posteriori (MAP) estimation problem, whose optimal solution is found by a knowledge-guided iterative conditional mode (KICM) algorithm. Compared with the existing algorithms, the proposed algorithm not only provides a more accurate representation of the spine shape, but also improves the detection performance by more than 50% with regard to reducing both misses and false detections.

  12. Laser-based measuring equipment controlled by microcomputer

    International Nuclear Information System (INIS)

    Miron, N.; Sporea, D.; Velculescu, V.G.; Petre, M.

    1988-03-01

Several laser-based measuring instruments controlled by microcomputers, developed for industrial and scientific purposes, are described. These instruments are intended for the verification of dial indicators, the measurement of graduated rules, and very accurate measurement of the gravitational constant. (authors)

  13. Multiple ECG Fiducial Points-Based Random Binary Sequence Generation for Securing Wireless Body Area Networks.

    Science.gov (United States)

    Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif

    2017-05-01

Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, each with a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated, in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of fiducial points such as the P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated from these arrival times and used as ECG features to generate random BSes with low latency. According to our analysis of real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with schemes that rely solely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heartbeat cycle, and can be up to five times faster than solely IPI-based methods. It thus achieves the design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
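The interval-to-bit step can be sketched as follows: quantize each fiducial interval to milliseconds and keep its least significant bits, where most of the beat-to-beat variability lives. The fiducial times, interval set, and 4-bit quantization below are illustrative choices, not the paper's exact encoding:

```python
def beat_intervals(beat):
    """Intra-beat intervals relative to the R peak; `beat` maps a
    fiducial name (P, Q, R, S, T) to its arrival time in seconds."""
    r = beat["R"]
    return [r - beat["P"], r - beat["Q"], beat["S"] - r, beat["T"] - r]

def intervals_to_bits(intervals, bits_per_interval=4):
    """Quantize each interval to whole milliseconds and keep its least
    significant bits, which carry most of the physiological randomness."""
    out = []
    for iv in intervals:
        ms = int(round(iv * 1000.0))
        for k in range(bits_per_interval):
            out.append((ms >> k) & 1)
    return out

# two hypothetical beats with detected fiducial arrival times (seconds)
beats = [
    {"P": 0.10, "Q": 0.22, "R": 0.26, "S": 0.31, "T": 0.48},
    {"P": 0.93, "Q": 1.05, "R": 1.09, "S": 1.15, "T": 1.30},
]
rr = beats[1]["R"] - beats[0]["R"]        # inter-beat RR interval
intervals = beat_intervals(beats[0]) + beat_intervals(beats[1]) + [rr]
bits = intervals_to_bits(intervals)       # 9 intervals -> 36 bits
```

Because each beat yields several intervals instead of the single IPI, the bit rate per heartbeat rises accordingly, which is the source of the claimed speed-up.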

  14. Random laser based on Rhodamine 6G (Rh6G) doped poly(methyl methacrylate) (PMMA) films coated on ZnO nanorods synthesized by hydrothermal oxidation

    Directory of Open Access Journals (Sweden)

    Hua Zhang

Random lasing from Rh6G-doped PMMA thin films coated on ZnO nanorods synthesized by a simple hydrothermal oxidation method has been demonstrated. This random laser medium is based on a waveguide structure consisting of ZnO nanorods, an Rh6G-doped PMMA film, and air. By controlling the duration of the hydrothermal oxidation reaction, wheat-like and hexagonal-prism ZnO nanorods have been successfully fabricated. The emission spectra of gain media based on the different ZnO nanorods differ: the one based on wheat-like ZnO nanorods mainly exhibits amplified spontaneous emission, while the one based on hexagonal-prism ZnO nanorods shows random laser emission. The threshold of the random laser medium is about 73.8 μJ/pulse, and the full width at half maximum (FWHM) is around 2.1 nm. The emission spectra measured at different detection angles reveal that the output direction is strongly confined to ±30° by the waveguide effect. Our experiments demonstrate a promising method for achieving an organic random laser medium. Keywords: Random laser, ZnO nanorods, Hydrothermal oxidation, Rhodamine 6G (Rh6G), Poly(methyl methacrylate) (PMMA)

  15. Effects of Video Game Training on Behavioral and Electrophysiological Measures of Attention and Memory: Protocol for a Randomized Controlled Trial.

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Ruiz-Marquez, Eloisa; Prieto, Antonio; Toril, Pilar; Ponce de Leon, Laura; de Ceballos, Maria L; Reales Avilés, José Manuel

    2017-01-24

    Neuroplasticity-based approaches seem to offer promising ways of maintaining cognitive health in older adults and postponing the onset of cognitive decline symptoms. Although previous research suggests that training can produce transfer effects, this study was designed to overcome some limitations of previous studies by incorporating an active control group and the assessment of training expectations. The main objectives of this study are (1) to evaluate the effects of a randomized computer-based intervention consisting of training older adults with nonaction video games on brain and cognitive functions that decline with age, including attention and spatial working memory, using behavioral measures and electrophysiological recordings (event-related potentials [ERPs]) just after training and after a 6-month no-contact period; (2) to explore whether motivation, engagement, or expectations might account for possible training-related improvements; and (3) to examine whether inflammatory mechanisms assessed with noninvasive measurement of C-reactive protein in saliva impair cognitive training-induced effects. A better understanding of these mechanisms could elucidate pathways that could be targeted in the future by either behavioral or neuropsychological interventions. A single-blinded randomized controlled trial with an experimental group and an active control group, pretest, posttest, and 6-month follow-up repeated measures design is used in this study. A total of 75 cognitively healthy older adults were randomly distributed into experimental and active control groups. Participants in the experimental group received 16 1-hour training sessions with cognitive nonaction video games selected from Lumosity, a commercial brain training package. The active control group received the same number of training sessions with The Sims and SimCity, a simulation strategy game. We have recruited participants, have conducted the training protocol and pretest assessments, and are

  16. Water-Based Aerobic Training Successfully Improves Lipid Profile of Dyslipidemic Women: A Randomized Controlled Trial

    Science.gov (United States)

    Costa, Rochelle Rocha; Pilla, Carmen; Buttelli, Adriana Cristine Koch; Barreto, Michelle Flores; Vieiro, Priscila Azevedo; Alberton, Cristine Lima; Bracht, Cláudia Gomes; Kruel, Luiz Fernando Martins

    2018-01-01

    Purpose: This study aimed to investigate the effects of water-based aerobic training on the lipid profile and lipoprotein lipase (LPL) levels in premenopausal women with dyslipidemia. Method: Forty women were randomly assigned to: aquatic training (WA; n = 20) or a control group (CG; n = 20). The WA group underwent 12 weeks of water-based interval…

  17. Web-Based and Mobile Stress Management Intervention for Employees: A Randomized Controlled Trial

    OpenAIRE

    Heber, Elena; Lehr, Dirk; Ebert, David Daniel; Berking, Matthias; Riper, Heleen

    2016-01-01

Background: Work-related stress is highly prevalent among employees and is associated with adverse mental health consequences. Web-based interventions offer the opportunity to deliver effective solutions on a large scale; however, the evidence is limited and the results conflicting. Objective: This randomized controlled trial evaluated the efficacy of guided Web- and mobile-based stress management training for employees. Methods: A total of 264 employees with elevated symptoms of stress (Perce...

  18. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  19. Parkinson's disease detection based on dysphonia measurements

    Science.gov (United States)

    Lahmiri, Salim

    2017-04-01

Assessing dysphonic symptoms is a noninvasive and effective approach to detecting Parkinson's disease (PD) in patients. The main purpose of this study is to investigate the effect of different dysphonia measurements on PD detection by support vector machine (SVM). Seven categories of dysphonia measurements are considered. Experimental results from the ten-fold cross-validation technique demonstrate that vocal fundamental frequency statistics yield the highest accuracy of 88 % ± 0.04. When all dysphonia measurements are employed, the SVM classifier achieves 94 % ± 0.03 accuracy. A refinement of the original patterns space by removing dysphonia measurements with similar variation across healthy and PD subjects allows achieving 97.03 % ± 0.03 accuracy. The latter performance is higher than that reported in the literature on the same dataset with the ten-fold cross-validation technique. Finally, it was found that measures of the ratio of noise to tonal components in the voice are the most suitable dysphonic symptoms for detecting PD subjects, as they achieve 99.64 % ± 0.01 specificity. This finding is highly promising for understanding PD symptoms.
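The evaluation protocol described above (an SVM scored with ten-fold cross-validation) can be sketched as follows; the synthetic data, the RBF kernel, and all parameter values are illustrative assumptions, not the study's actual dataset or settings:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for a dysphonia feature matrix (hypothetical data):
# 195 voice recordings, 22 acoustic measurements each.
X, y = make_classification(n_samples=195, n_features=22, n_informative=10,
                           random_state=0)

# Standardize features, then classify with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Ten-fold cross-validation: mean accuracy +/- standard deviation,
# the same "accuracy ± SD" form reported in the abstract.
scores = cross_val_score(clf, X, y, cv=10)
print(f"accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```

Dropping features whose distributions barely differ between classes, as the abstract's "refinement of the patterns space" does, would simply mean re-running the same loop on a reduced column set.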

  20. Web-based alcohol screening and brief intervention for university students: a randomized trial.

    Science.gov (United States)

    Kypri, Kypros; Vater, Tina; Bowe, Steven J; Saunders, John B; Cunningham, John A; Horton, Nicholas J; McCambridge, Jim

    2014-03-26

    Unhealthy alcohol use is a leading contributor to the global burden of disease, particularly among young people. Systematic reviews suggest efficacy of web-based alcohol screening and brief intervention and call for effectiveness trials in settings where it could be sustainably delivered. To evaluate a national web-based alcohol screening and brief intervention program. A multisite, double-blind, parallel-group, individually randomized trial was conducted at 7 New Zealand universities. In April and May of 2010, invitations containing hyperlinks to the Alcohol Use Disorders Identification Test-Consumption (AUDIT-C) screening test were e-mailed to 14,991 students aged 17 to 24 years. Participants who screened positive (AUDIT-C score ≥4) were randomized to undergo screening alone or to 10 minutes of assessment and feedback (including comparisons with medical guidelines and peer norms) on alcohol expenditure, peak blood alcohol concentration, alcohol dependence, and access to help and information. A fully automated 5-month follow-up assessment was conducted that measured 6 primary outcomes: consumption per typical occasion, drinking frequency, volume of alcohol consumed, an academic problems score, and whether participants exceeded medical guidelines for acute harm (binge drinking) and chronic harm (heavy drinking). A Bonferroni-corrected significance threshold of .0083 was used to account for the 6 comparisons and a sensitivity analysis was used to assess possible attrition bias. Of 5135 students screened, 3422 scored 4 or greater and were randomized, and 83% were followed up. There was a significant effect on 1 of the 6 prespecified outcomes. Relative to control participants, those who received intervention consumed less alcohol per typical drinking occasion (median 4 drinks [interquartile range {IQR}, 2-8] vs 5 drinks [IQR 2-8]; rate ratio [RR], 0.93 [99.17% CI, 0.86-1.00]; P = .005) but not less often (RR, 0.95 [99.17% CI, 0.88-1.03]; P = .08) or less
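The Bonferroni-corrected threshold of .0083 and the 99.17% confidence intervals quoted above follow directly from dividing the family-wise error rate across the 6 primary outcomes (a conventional 0.05 family-wise level is assumed here):

```python
alpha_family = 0.05        # conventional family-wise error rate (assumed)
n_comparisons = 6          # the trial's 6 prespecified primary outcomes

threshold = alpha_family / n_comparisons
print(round(threshold, 4))     # → 0.0083, the per-comparison significance level

ci_level = 100 * (1 - threshold)
print(round(ci_level, 2))      # → 99.17, the matching confidence level in percent
```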

  1. A Community-Based Randomized Trial of Hepatitis B Screening Among High-Risk Vietnamese Americans.

    Science.gov (United States)

    Ma, Grace X; Fang, Carolyn Y; Seals, Brenda; Feng, Ziding; Tan, Yin; Siu, Philip; Yeh, Ming Chin; Golub, Sarit A; Nguyen, Minhhuyen T; Tran, Tam; Wang, Minqi

    2017-03-01

    To evaluate the effectiveness of a community-based liver cancer prevention program on hepatitis B virus (HBV) screening among low-income, underserved Vietnamese Americans at high risk. We conducted a cluster randomized trial involving 36 Vietnamese community-based organizations and 2337 participants in Pennsylvania, New Jersey, and New York City between 2009 and 2014. We randomly assigned 18 community-based organizations to a community-based multilevel HBV screening intervention (n = 1131). We randomly assigned the remaining 18 community-based organizations to a general cancer education program (n = 1206), which included information about HBV-related liver cancer prevention. We assessed HBV screening rates at 6-month follow-up. Intervention participants were significantly more likely to have undergone HBV screening (88.1%) than were control group participants (4.6%). In a Cochran-Mantel-Haenszel analysis, the intervention effect on screening outcomes remained statistically significant after adjustment for demographic and health care access variables, including income, having health insurance, having a regular health provider, and English proficiency. A community-based, culturally appropriate, multilevel HBV screening intervention effectively increases screening rates in a high-risk, hard-to-reach Vietnamese American population.

  2. Does Patient Preference Measurement in Decision Aids Improve Decisional Conflict? A Randomized Trial in Men with Prostate Cancer.

    Science.gov (United States)

    Shirk, Joseph D; Crespi, Catherine M; Saucedo, Josemanuel D; Lambrechts, Sylvia; Dahan, Ely; Kaplan, Robert; Saigal, Christopher

    2017-12-01

Shared decision making (SDM) has been advocated as an approach to medical decision making that can improve decisional quality. Decision aids are tools that facilitate SDM in the context of limited physician time; however, many decision aids do not incorporate preference measurement. We aim to understand whether adding preference measurement to a standard patient educational intervention improves decisional quality and is feasible in a busy clinical setting. Men with incident localized prostate cancer (n = 122) were recruited from the Greater Los Angeles Veterans Affairs (VA) Medical Center urology clinic, Olive View UCLA Medical Center, and Harbor UCLA Medical Center from January 2011 to May 2015 and randomized to education with a brochure about prostate cancer treatment or software-based preference assessment in addition to the brochure. Men undergoing preference assessment received a report detailing the relative strength of their preferences for treatment outcomes, for use in review with their doctor. Participants completed instruments measuring decisional conflict, knowledge, SDM, and patient satisfaction with care before and/or after their cancer consultation. Baseline knowledge scores were low (mean 62%). The baseline mean total score on the Decisional Conflict Scale was 2.3 (±0.9), signifying moderate decisional conflict. Men undergoing preference assessment had a significantly larger decrease in decisional conflict total score (p = 0.023) and the Perceived Effective Decision Making subscale (p = 0.003) post consult compared with those receiving education only. Improvements in satisfaction with care, SDM, and knowledge were similar between groups. Individual-level preference assessment is feasible in the clinic setting. Patients with prostate cancer who undergo preference assessment are more certain about their treatment decisions and report decreased levels of decisional conflict when making these decisions.

  3. Validation and Reliability of a Smartphone Application for the International Prostate Symptom Score Questionnaire: A Randomized Repeated Measures Crossover Study

    Science.gov (United States)

    Shim, Sung Ryul; Sun, Hwa Yeon; Ko, Young Myoung; Chun, Dong-Il; Yang, Won Jae

    2014-01-01

Background Smartphone-based assessment may be a useful diagnostic and monitoring tool for patients. There have been many attempts to create a smartphone diagnostic tool for clinical use in various medical fields but few have demonstrated scientific validity. Objective The purpose of this study was to develop a smartphone application of the International Prostate Symptom Score (IPSS) and to demonstrate its validity and reliability. Methods From June 2012 to May 2013, a total of 1581 male participants (≥40 years old), with or without lower urinary tract symptoms (LUTS), visited our urology clinic via the health improvement center at Soonchunhyang University Hospital (Republic of Korea) and were enrolled in this study. A randomized repeated measures crossover design was employed using a smartphone application of the IPSS and the conventional paper form of the IPSS. A paired t test under a non-inferiority hypothesis was conducted. For the reliability test, the intraclass correlation coefficient (ICC) was measured. Results The total score of the IPSS (P=.289) and each item of the IPSS (P=.157-1.000) showed no differences between the paper version and the smartphone version of the IPSS. The mild, moderate, and severe LUTS groups showed no differences between the two versions of the IPSS. A significant correlation was noted in the total group (ICC=.935). … Only participants able to use smartphones could participate. Conclusions The validity and reliability of the smartphone application version were comparable to the conventional paper version of the IPSS. The smartphone application of the IPSS could be an effective method for measuring lower urinary tract symptoms. PMID:24513507

  4. A robust random number generator based on differential comparison of chaotic laser signals.

    Science.gov (United States)

    Zhang, Jianzhong; Wang, Yuncai; Liu, Ming; Xue, Lugang; Li, Pu; Wang, Anbang; Zhang, Mingjiang

    2012-03-26

We experimentally realize a robust real-time random number generator by differentially comparing the signal from a chaotic semiconductor laser and its delayed signal through a 1-bit analog-to-digital converter. The probability density distribution of the output chaotic signal based on the differential comparison method possesses an extremely small coefficient of Pearson's median skewness (1.5 × 10⁻⁶), which can yield a balanced random sequence much more easily than the previously reported method that compares the signal from the chaotic laser with a certain threshold value. Moreover, we experimentally demonstrate that our method can stably generate good random numbers at rates of 1.44 Gbit/s with excellent immunity from external perturbations, while the previously reported method fails.
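The balance criterion used above is easy to reproduce numerically. In this sketch, Gaussian white noise stands in for the chaotic laser signal (an illustrative assumption only), and the 1-bit comparison keeps just the sign of the difference between the signal and its delayed copy:

```python
import numpy as np

def pearson_median_skewness(x):
    """Pearson's second skewness coefficient: 3 * (mean - median) / std."""
    x = np.asarray(x, dtype=float)
    return 3.0 * (x.mean() - np.median(x)) / x.std()

# Stand-in for the chaotic laser intensity; the real signal is a broadband
# analog waveform, so white noise only illustrates the comparison step.
rng = np.random.default_rng(0)
raw = rng.normal(size=100_000)
diff = raw - np.roll(raw, 1)       # signal minus its delayed copy
bits = (diff > 0).astype(int)      # 1-bit comparator: sign of the difference

print(pearson_median_skewness(diff))   # near zero for a symmetric distribution
print(bits.mean())                     # fraction of ones (balance of the stream)
```

Because subtracting a delayed copy symmetrizes the distribution around zero, the comparator threshold sits at the median automatically, which is why the differential method yields balanced bits without the careful threshold tuning the older single-threshold scheme required.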

  5. Experimental study of a quantum random-number generator based on two independent lasers

    Science.gov (United States)

    Sun, Shi-Hai; Xu, Feihu

    2017-12-01

    A quantum random-number generator (QRNG) can produce true randomness by utilizing the inherent probabilistic nature of quantum mechanics. Recently, the spontaneous-emission quantum phase noise of the laser has been widely deployed for quantum random-number generation, due to its high rate, its low cost, and the feasibility of chip-scale integration. Here, we perform a comprehensive experimental study of a phase-noise-based QRNG with two independent lasers, each of which operates in either continuous-wave (CW) or pulsed mode. We implement the QRNG by operating the two lasers in three configurations, namely, CW + CW, CW + pulsed, and pulsed + pulsed, and demonstrate their trade-offs, strengths, and weaknesses.

  6. Possibility/Necessity-Based Probabilistic Expectation Models for Linear Programming Problems with Discrete Fuzzy Random Variables

    Directory of Open Access Journals (Sweden)

    Hideki Katagiri

    2017-10-01

This paper considers linear programming problems (LPPs) where the objective functions involve discrete fuzzy random variables (fuzzy set-valued discrete random variables). New decision making models, which are useful in fuzzy stochastic environments, are proposed based on both possibility theory and probability theory. In multi-objective cases, Pareto optimal solutions of the proposed models are newly defined. Computational algorithms for obtaining the Pareto optimal solutions of the proposed models are provided. It is shown that problems involving discrete fuzzy random variables can be transformed into deterministic nonlinear mathematical programming problems which can be solved through a conventional mathematical programming solver under practically reasonable assumptions. A numerical example of agriculture production problems is given to demonstrate the applicability of the proposed models to real-world problems in fuzzy stochastic environments.

  7. Benefits of Individualized Feedback in Internet-Based Interventions for Depression: A Randomized Controlled Trial.

    Science.gov (United States)

    Zagorscak, Pavle; Heinrich, Manuel; Sommer, Daniel; Wagner, Birgit; Knaevelsrud, Christine

    2018-01-01

    Even though there is an increasing number of studies on the efficacy of Internet-based interventions (IBI) for depression, experimental trials on the benefits of added guidance by clinicians are scarce and inconsistent. This study compared the efficacy of semistandardized feedback provided by psychologists with fully standardized feedback in IBI. Participants with mild-to-moderate depression (n = 1,089, 66% female) from the client pool of a health insurance company participated in a cognitive-behavioral IBI targeting depression over 6 weeks. Individuals were randomized to weekly semistandardized e-mail feedback from psychologists (individual counseling; IC) or to automated, standardized feedback where a psychologist could be contacted on demand (CoD). The contents and tasks were identical across conditions. The primary outcome was depression; secondary outcomes included anxiety, rumination, and well-being. Outcomes were assessed before and after the intervention and 3, 6, and 12 months later. Changes in outcomes were evaluated using latent change score modeling. Both interventions yielded large pre-post effects on depression (Beck Depression Inventory-II: dIC = 1.53, dCoD = 1.37; Patient Health Questionnaire-9: dIC = 1.20, dCoD = 1.04), as well as significant improvements of all other outcome measures. The effects remained significant after 3, 6, and 12 months. The groups differed with regard to attrition (IC: 17.3%, CoD: 25.8%, p = 0.001). Between-group effects were statistically nonsignificant across outcomes and measurement occasions. Adding semistandardized guidance in IBI for depression did not prove to be more effective than fully standardized feedback on primary and secondary outcomes, but it had positive effects on attrition. © 2018 S. Karger AG, Basel.

  8. The effect of community-based health management on the health of the elderly: a randomized controlled trial from China

    Directory of Open Access Journals (Sweden)

    Chao Jianqian

    2012-12-01

Background: An aging population poses significant challenges to health care in China. Health management has been implemented to reduce the costs of care, raise health service utilization, increase health knowledge and improve quality of life. Several studies have tried to verify the effectiveness of health management in achieving these goals worldwide. However, there have been insufficient randomized controlled trials (RCTs) to draw reliable conclusions. The few small-scale studies conducted in China include mostly the general population rather than the elderly. Our study is designed to evaluate the impact of community-based health management on the health of the elderly through an RCT in Nanjing, China. Methods: Two thousand four hundred participants, aged 60 or older and who gave informed consent, were randomly allocated 1:1 into management and control groups; the randomization schedule was concealed from community health service center staff until allocation. Community-based health management was applied in the former, while the latter received only usual care. After 18 months, three categories of variables (subjective grading health indices, objective health indices and health service utilization) were measured based on a questionnaire, clinical monitoring and diagnostic measurements. Differences between the two groups were assessed before and after the intervention and analyzed with t-test, χ2-test, and multiple regression analysis. Results: Compared with the control group, the management group demonstrated improvement on the following variables (P …). Conclusion: Community-based health management improved both subjective grading health indices and objective health indices, and decreased the number of outpatient clinic visits, demonstrating effectiveness in improving elderly health. Trial registration: ChiCTR-OCH-11001716

  9. Automatic Recognition of Chinese Personal Name Using Conditional Random Fields and Knowledge Base

    Directory of Open Access Journals (Sweden)

    Chuan Gu

    2015-01-01

According to the features of Chinese personal names, we present an approach for Chinese personal name recognition based on conditional random fields (CRF) and a knowledge base in this paper. The method builds multiple features of the CRF model by adopting the Chinese character as the processing unit, selects useful features based on a selection algorithm over the knowledge base and an incremental feature template, and finally implements automatic recognition of Chinese personal names in Chinese documents. Experimental results on an open real-world corpus demonstrate the effectiveness of our method, which achieved high accuracy and recall rates of recognition.
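A character-unit observation for such a CRF is typically encoded as a small dictionary of features per position. The template below is a minimal illustrative sketch; the paper's actual feature set, drawn from its knowledge base and incremental template, is richer:

```python
def char_features(sentence, i):
    """Minimal character-level feature template for one CRF observation."""
    return {
        "char": sentence[i],
        "prev_char": sentence[i - 1] if i > 0 else "<BOS>",
        "next_char": sentence[i + 1] if i + 1 < len(sentence) else "<EOS>",
        "is_first": i == 0,
        "is_last": i + 1 == len(sentence),
    }

sentence = "张伟在北京工作"  # "Zhang Wei works in Beijing" (example text)
features = [char_features(sentence, i) for i in range(len(sentence))]
print(features[0]["char"], features[0]["prev_char"])
```

In practice these per-character observations, paired with BIO-style name/non-name labels, would be fed to a CRF trainer; the knowledge-base selection step described above would then prune or extend this template.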

  10. Who participates in a randomized trial of mindfulness-based stress reduction (MBSR) after breast cancer?

    DEFF Research Database (Denmark)

    Würtzen, Hanne; Oksbjerg Dalton, Susanne; Kaae Andersen, Klaus

    2013-01-01

BACKGROUND: Discussion regarding the necessity to identify patients with both the need and motivation for psychosocial intervention is ongoing. Evidence for an effect of mindfulness-based interventions among cancer patients is based on few studies with no systematic enrollment. METHODS: We used Danish population-based registries and clinical databases to determine differences in demographics, breast cancer and co-morbidity among 1208 women eligible for a randomized controlled trial (www.clinicaltrials.gov identifier: NCT00990977) of mindfulness-based stress reduction (MBSR). RESULTS: Participants…

  11. Developing evidence-based dentistry skills: how to interpret randomized clinical trials and systematic reviews.

    Science.gov (United States)

    Kiriakou, Juliana; Pandis, Nikolaos; Madianos, Phoebus; Polychronopoulou, Argy

    2014-10-30

Decision-making based on reliable evidence is more likely to lead to effective and efficient treatments. Evidence-based dentistry was developed, similarly to evidence-based medicine, to help clinicians apply current and valid research findings to their own clinical practice. Interpreting and appraising the literature is fundamental and involves the development of evidence-based dentistry (EBD) skills. Systematic reviews (SRs) of randomized controlled trials (RCTs) are considered the highest level of evidence for evaluating the effectiveness of interventions. Furthermore, appraising the report of an RCT, or of an SR, allows the reader to gauge how well the study was designed and conducted.

  12. Calculating radiotherapy margins based on Bayesian modelling of patient specific random errors

    International Nuclear Information System (INIS)

    Herschtal, A; Te Marvelde, L; Mengersen, K; Foroudi, F; Ball, D; Devereux, T; Pham, D; Greer, P B; Pichler, P; Eade, T; Kneebone, A; Bell, L; Caine, H; Hindson, B; Kron, T; Hosseinifard, Z

    2015-01-01

    Collected real-life clinical target volume (CTV) displacement data show that some patients undergoing external beam radiotherapy (EBRT) demonstrate significantly more fraction-to-fraction variability in their displacement (‘random error’) than others. This contrasts with the common assumption made by historical recipes for margin estimation for EBRT, that the random error is constant across patients. In this work we present statistical models of CTV displacements in which random errors are characterised by an inverse gamma (IG) distribution in order to assess the impact of random error variability on CTV-to-PTV margin widths, for eight real world patient cohorts from four institutions, and for different sites of malignancy. We considered a variety of clinical treatment requirements and penumbral widths. The eight cohorts consisted of a total of 874 patients and 27 391 treatment sessions. Compared to a traditional margin recipe that assumes constant random errors across patients, for a typical 4 mm penumbral width, the IG based margin model mandates that in order to satisfy the common clinical requirement that 90% of patients receive at least 95% of prescribed RT dose to the entire CTV, margins be increased by a median of 10% (range over the eight cohorts −19% to +35%). This substantially reduces the proportion of patients for whom margins are too small to satisfy clinical requirements. (paper)
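The effect of patient-specific random errors on margin width can be illustrated with a small simulation. The inverse gamma shape and scale below, the 1-D Gaussian displacement model, and the cohort size are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Patient-specific random-error variances drawn from an inverse gamma
# distribution (shape and scale chosen for illustration only).
variances = stats.invgamma.rvs(a=4.0, scale=9.0, size=5000, random_state=rng)
sigma = np.sqrt(variances)          # per-patient random-error SD (mm)

# For 1-D Gaussian displacements, a patient with SD sigma_i achieves >= 95%
# geometric coverage iff the margin m >= z * sigma_i, z = Phi^-1(0.975).
z = stats.norm.ppf(0.975)

margin_ig = z * np.quantile(sigma, 0.90)   # margin satisfying 90% of patients
margin_const = z * sigma.mean()            # constant-error recipe, for contrast
print(round(margin_ig, 2), round(margin_const, 2))
```

Because the inverse gamma distribution is right-skewed, covering 90% of patients pins the margin to the upper tail of the per-patient SDs, which is wider than a recipe built from the pooled average; this is the qualitative effect behind the median 10% increase reported above (the paper's full model also accounts for systematic errors and penumbral width, which this sketch omits).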

  13. When to base clinical policies on observational versus randomized trial data.

    Science.gov (United States)

    Hornberger, J; Wrone, E

    1997-10-15

Physicians must decide when the evidence is sufficient to adopt a new clinical policy. Analysis of large clinical and administrative databases is becoming an important source of evidence for changing clinical policies. Because such analysis cannot control for the effects of all potential confounding variables, physicians risk drawing the wrong conclusion about the cause-and-effect relation between a change in clinical policy and outcomes. Randomized studies offer protection against drawing a conclusion that would lead to adoption of an inferior policy. However, a randomized study may be difficult to justify because of the extra costs of collecting data for it and concerns that the study will not directly benefit the patients enrolled in it. This article reviews the advantages and disadvantages of basing clinical policy on analysis of large databases compared with conducting a randomized study. A technique is described and illustrated for assessing the potential costs and benefits of conducting such a study. This type of analysis formed the basis for a physician-managed health care organization deciding to sponsor a randomized study among patients with end-stage renal disease as part of a quality-improvement initiative.

  14. FOG Random Drift Signal Denoising Based on the Improved AR Model and Modified Sage-Husa Adaptive Kalman Filter.

    Science.gov (United States)

    Sun, Jin; Xu, Xiaosu; Liu, Yiting; Zhang, Tao; Li, Yao

    2016-07-12

In order to reduce the influence of fiber optic gyroscope (FOG) random drift error on inertial navigation systems, an improved auto-regressive (AR) model is put forward in this paper. First, based on real-time observations at each restart of the gyroscope, the model of FOG random drift can be established online. In the improved AR model, the FOG measured signal is employed instead of the zero-mean signal. Then, the modified Sage-Husa adaptive Kalman filter (SHAKF) is introduced, which can directly carry out real-time filtering of the FOG signals. Finally, static and dynamic experiments were conducted to verify the method's effectiveness, and the filtering results were analyzed with the Allan variance. The analysis shows that the improved AR model has high fitting accuracy and strong adaptability, with a minimum single-noise fitting accuracy of 93.2%. Based on the improved AR(3) model, the SHAKF denoising method is more effective than traditional methods, improving the denoising effect by more than 30%. The random drift error of the FOG is reduced effectively, and the precision of the FOG is improved.
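The AR-model-plus-Kalman-filter pipeline can be sketched in a few lines. Here the drift signal is synthetic, the noise covariances are fixed illustrative values, and the Sage-Husa adaptive estimation of those covariances is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)
drift = np.cumsum(rng.normal(0.0, 0.01, 2000))   # slow random drift
signal = drift + rng.normal(0.0, 0.1, 2000)      # noisy FOG output

# Least-squares fit of AR(3) coefficients to the measured signal itself
# (the improved model works on the measurements, not a zero-mean version).
p = 3
N = len(signal)
X = np.column_stack([signal[p - k - 1:N - k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(X, signal[p:], rcond=None)

# Companion-form state space for the fitted AR(3) model.
F = np.vstack([a, np.eye(p)[:-1]])
H = np.array([[1.0, 0.0, 0.0]])
Q = np.eye(p) * 1e-4        # process noise (illustrative, fixed)
R = np.array([[1e-2]])      # measurement noise (illustrative, fixed)

x = signal[p - 1::-1].copy().reshape(-1, 1)   # initial state [y2, y1, y0]
P = np.eye(p)
filtered = []
for z in signal[p:]:
    x = F @ x                                        # predict
    P = F @ P @ F.T + Q
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    x = x + K @ (np.array([[z]]) - H @ x)            # update
    P = (np.eye(p) - K @ H) @ P
    filtered.append(x[0, 0])
filtered = np.array(filtered)
```

A full SHAKF implementation would additionally update Q and R recursively from the innovation sequence using a fading factor; that online noise estimation is the "adaptive" part the abstract refers to.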

  15. Web-Based Education Prior to Outpatient Orthopaedic Surgery Enhances Early Patient Satisfaction Scores: A Prospective Randomized Controlled Study.

    Science.gov (United States)

    van Eck, Carola F; Toor, Aneet; Banffy, Michael B; Gambardella, Ralph A

    2018-01-01

    A good patient-surgeon relationship relies on adequate preoperative education and counseling. Several multimedia resources, such as web-based education tools, have become available to enhance aspects of perioperative care. The purpose of this study was to evaluate the effect of an interactive web-based education tool on perioperative patient satisfaction scores after outpatient orthopaedic surgery. It was hypothesized that web-based education prior to outpatient orthopaedic surgery enhances patient satisfaction scores. Randomized controlled trial; Level of evidence, 1. All patients undergoing knee arthroscopy with meniscectomy, chondroplasty, or anterior cruciate ligament reconstruction or shoulder arthroscopy with rotator cuff repair were eligible for inclusion and were randomized to the study or control group. The control group received routine education by the surgeon, whereas the study group received additional web-based education. At the first postoperative visit, all patients completed the OAS CAHPS (Outpatient and Ambulatory Surgery Consumer Assessment of Healthcare Providers and Systems) survey. Differences in patient satisfaction scores between the study and control groups were determined with an independent t test. A total of 177 patients were included (104 [59%] males; mean age, 42 ± 14 years); 87 (49%) patients were randomized to receive additional web-based education. Total patient satisfaction score was significantly higher in the study group (97 ± 5) as compared with the control group (94 ± 8; P = .019), specifically for the OAS CAHPS core measure "recovery" (92 ± 13 vs 82 ± 23; P = .001). Age, sex, race, workers' compensation status, education level, overall health, emotional health, procedure type and complexity, and addition of a video did not influence patient satisfaction scores. Supplemental web-based patient education prior to outpatient orthopaedic surgery enhances patient satisfaction scores.

  16. A generalized complexity measure based on Rényi entropy

    Science.gov (United States)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minima at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not minimal for maximal randomness in general. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
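One common two-parameter construction of this type, written with Rényi entropies R_α, is C_{α,β} = exp(R_α − R_β) with α < β. The sketch below uses that form for illustration (the paper's generalized measure may differ in detail) and shows the value 1 at both perfect order and maximal randomness:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy R_alpha(p) = ln(sum p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):          # Shannon limit as alpha -> 1
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def lmc_renyi_complexity(p, alpha, beta):
    """One common LMC-Rényi form: C = exp(R_alpha - R_beta), alpha < beta."""
    return np.exp(renyi_entropy(p, alpha) - renyi_entropy(p, beta))

uniform = np.full(8, 1.0 / 8)                  # maximal randomness
ordered = np.array([1.0] + [0.0] * 7)          # perfect order
mixed = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.05, 0.0, 0.0])

print(lmc_renyi_complexity(uniform, 0.5, 2.0))  # ≈ 1.0 (minimal)
print(lmc_renyi_complexity(ordered, 0.5, 2.0))  # ≈ 1.0 (minimal)
print(lmc_renyi_complexity(mixed, 0.5, 2.0))    # > 1 for structured states
```

For the uniform distribution every Rényi entropy equals ln n, and for a delta distribution every Rényi entropy is zero, so the exponent vanishes in both cases; only distributions with intermediate structure raise C above 1, which is exactly the minimality-at-both-extremes requirement discussed above.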

  17. Effects of the X:IT smoking intervention: a school-based cluster randomized trial.

    Science.gov (United States)

    Andersen, Anette; Krølner, Rikker; Bast, Lotus Sofie; Thygesen, Lau Caspar; Due, Pernille

    2015-12-01

    Uptake of smoking in adolescence is still of major public health concern. Evaluations of school-based programmes for smoking prevention show mixed results. The aim of this study was to examine the effect of X:IT, a multi-component school-based programme to prevent adolescent smoking. Data from a Danish cluster randomized trial included 4041 year-7 students (mean age: 12.5) from 51 intervention and 43 control schools. Outcome measure 'current smoking' was dichotomized into smoking daily, weekly, monthly or more seldom vs do not smoke. Analyses were adjusted for baseline covariates: sex, family socioeconomic position (SEP), best friend's smoking and parental smoking. We performed multilevel, logistic regression analyses of available cases and intention-to-treat (ITT) analyses, replacing missing outcome values by multiple imputation. At baseline, 4.7% and 6.8% of the students at the intervention and the control schools smoked, respectively. After 1 year of the intervention, the prevalence was 7.9% and 10.7%, respectively. At follow-up, 553 students (13.7%) did not answer the question on smoking. Available case analyses: crude odds ratios (OR) for smoking at intervention schools compared with control schools: 0.65 (0.48-0.88) and adjusted: 0.70 (0.47-1.04). ITT analyses: crude OR for smoking at intervention schools compared with control schools: 0.67 (0.50-0.89) and adjusted: 0.61 (0.45-0.82). Students at intervention schools had a lower risk of smoking after a year of intervention in year 7. This multi-component intervention involving educational, parental and context-related intervention components seems to be efficient in lowering or postponing smoking uptake in Danish adolescents. © The Author 2015; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  18. Depression, Craving and Substance Use Following a Randomized Trial of Mindfulness-Based Relapse Prevention

    Science.gov (United States)

    Witkiewitz, Katie; Bowen, Sarah

    2012-01-01

    Objective A strong relation between negative affect and craving has been demonstrated in laboratory and clinical studies, with depressive symptomatology showing particularly strong links to craving and substance abuse relapse. Mindfulness-Based Relapse Prevention (MBRP), shown to be efficacious for reduction of substance use, uses mindfulness-based practices to teach alternative responses to emotional discomfort and lessen the conditioned response of craving in the presence of depressive symptoms. The goal of the current study was to examine the relation between measures of depressive symptoms, craving, and substance use following MBRP. Methods Individuals with substance use disorders (N=168; mean age 40.45, SD=10.28; 36.3% female; 46.4% nonwhite) were recruited after intensive stabilization, then randomly assigned to either eight weekly sessions of MBRP or a treatment-as-usual control group. Approximately 73% of the sample was retained at the final four-month follow-up assessment. Results Results confirmed a moderated-mediation effect, whereby craving mediated the relation between depressive symptoms (Beck Depression Inventory) and substance use (Time Line Follow Back) among the treatment-as-usual group, but not among MBRP participants. Specifically, MBRP attenuated the relation between postintervention depressive symptoms and craving (Penn Alcohol Craving Scale) two months following the intervention (f2=.21). This moderation effect predicted substance use four months following the intervention (f2=.18). Conclusion MBRP appears to influence cognitive and behavioral responses to depressive symptoms, partially explaining reductions in postintervention substance use among the MBRP group. Although preliminary, the current study provides evidence for the value of incorporating mindfulness practice into substance abuse treatment and identifies one potential mechanism of change following MBRP. PMID:20515211

  19. A permutation test to analyse systematic bias and random measurement errors of medical devices via boosting location and scale models.

    Science.gov (United States)

    Mayr, Andreas; Schmid, Matthias; Pfahlberg, Annette; Uter, Wolfgang; Gefeller, Olaf

    2017-06-01

    Measurement errors of medico-technical devices can be separated into systematic bias and random error. We propose a new method to address both simultaneously via generalized additive models for location, scale and shape (GAMLSS) in combination with permutation tests. More precisely, we extend a recently proposed boosting algorithm for GAMLSS to provide a test procedure to analyse potential device effects on the measurements. We carried out a large-scale simulation study to provide empirical evidence that our method is able to identify possible sources of systematic bias as well as random error under different conditions. Finally, we apply our approach to compare measurements of skin pigmentation from two different devices in an epidemiological study.
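    The boosted GAMLSS machinery itself is beyond a short snippet, but the core idea — permuting device labels to test for effects on both location (systematic bias) and scale (random error) — can be sketched as follows. The data, sample sizes and test statistics below are illustrative assumptions, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measurements of the same quantity from two devices
# (simulated here; device B has a small bias and a larger random error).
device_a = rng.normal(loc=50.0, scale=2.0, size=300)
device_b = rng.normal(loc=51.0, scale=3.0, size=300)

def location_scale_stats(x, y):
    """Device effects on location (mean shift) and scale (log SD ratio)."""
    return abs(x.mean() - y.mean()), abs(np.log(x.std() / y.std()))

obs_loc, obs_scale = location_scale_stats(device_a, device_b)

# Permutation null: shuffle device labels, recompute both statistics.
pooled = np.concatenate([device_a, device_b])
n, n_perm = len(device_a), 2000
loc_null = np.empty(n_perm)
scale_null = np.empty(n_perm)
for i in range(n_perm):
    perm = rng.permutation(pooled)
    loc_null[i], scale_null[i] = location_scale_stats(perm[:n], perm[n:])

p_loc = (1 + np.sum(loc_null >= obs_loc)) / (1 + n_perm)
p_scale = (1 + np.sum(scale_null >= obs_scale)) / (1 + n_perm)
print(f"p (systematic bias / location): {p_loc:.4f}")
print(f"p (random error / scale):      {p_scale:.4f}")
```

Both p-values come out small here because the simulated devices differ in both mean and spread; the paper's contribution is embedding this kind of test in boosted GAMLSS, where location and scale are modeled jointly with covariates.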

  20. A Copula Based Approach for Design of Multivariate Random Forests for Drug Sensitivity Prediction.

    Science.gov (United States)

    Haider, Saad; Rahman, Raziur; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Modeling sensitivity to drugs based on genetic characterizations is a significant challenge in the area of systems medicine. Ensemble-based approaches such as Random Forests have been shown to perform well in both individual sensitivity prediction studies and team-science-based prediction challenges. However, Random Forests generate a deterministic predictive model for each drug based on the genetic characterization of the cell lines and ignore the relationship between different drug sensitivities during model generation. This motivates the need for multivariate ensemble learning techniques that can increase prediction accuracy and improve variable importance ranking by incorporating the relationships between different output responses. In this article, we propose a novel cost criterion that captures the dissimilarity in the output response structure between the training data and node samples as the difference in the two empirical copulas. We illustrate that copulas are suitable for capturing the multivariate structure of output responses independent of the marginal distributions, and that the copula-based multivariate random forest framework can provide higher prediction accuracy and improved variable selection. The proposed framework has been validated on the Genomics of Drug Sensitivity in Cancer and Cancer Cell Line Encyclopedia databases.
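    As a rough illustration of such a cost criterion (not the authors' implementation), the sketch below compares the empirical copula of a parent sample's bivariate responses with that of a candidate node sample; the simulated responses and the evaluation grid are made up for the example:

```python
import numpy as np

def empirical_copula(y, grid):
    """Empirical copula of a bivariate sample y (n x 2) at grid points (m x 2)."""
    n = len(y)
    # Pseudo-observations: normalized ranks in (0, 1].
    u = (np.argsort(np.argsort(y, axis=0), axis=0) + 1) / n
    return np.array([np.mean((u[:, 0] <= g0) & (u[:, 1] <= g1))
                     for g0, g1 in grid])

def copula_dissimilarity(y_parent, y_node, k=10):
    """Mean absolute difference of the two empirical copulas on a k x k grid."""
    g = np.arange(1, k + 1) / k
    grid = np.array([(a, b) for a in g for b in g])
    return np.mean(np.abs(empirical_copula(y_parent, grid)
                          - empirical_copula(y_node, grid)))

rng = np.random.default_rng(1)
# Two strongly correlated "drug responses" in the parent data.
z = rng.normal(size=(400, 1))
y_parent = np.hstack([z + 0.3 * rng.normal(size=(400, 1)),
                      z + 0.3 * rng.normal(size=(400, 1))])
node_similar = y_parent[rng.choice(400, 100, replace=False)]
node_independent = rng.normal(size=(100, 2))   # dependence destroyed

d_sim = copula_dissimilarity(y_parent, node_similar)
d_ind = copula_dissimilarity(y_parent, node_independent)
print(f"dissimilarity, subsample:   {d_sim:.3f}")
print(f"dissimilarity, independent: {d_ind:.3f}")
```

A node sample drawn from the parent preserves the dependence structure and scores low, while a sample with the same margins but no dependence scores high — exactly the property a copula-based split criterion exploits, independent of the marginal distributions.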

  1. Classification of high resolution remote sensing image based on geo-ontology and conditional random fields

    Science.gov (United States)

    Hong, Liang

    2013-10-01

    The availability of high spatial resolution remote sensing data provides new opportunities for urban land-cover classification. More geometric detail can be observed in high resolution remote sensing images, and ground objects display rich texture, structure, shape and hierarchical semantic characteristics; more landscape elements are represented by small groups of pixels. In recent years, object-based remote sensing analysis has become widely accepted and applied in high resolution remote sensing image processing. A classification method based on geo-ontology and conditional random fields is presented in this paper. The proposed method is made up of four blocks: (1) a hierarchical ground-object semantic framework is constructed based on geo-ontology; (2) image objects are generated by mean-shift segmentation, which yields boundary-preserving and spectrally homogeneous over-segmented regions; (3) the relations between the hierarchical ground-object semantics and the over-segmented regions are defined within a conditional random fields framework; (4) hierarchical classification results are obtained based on geo-ontology and conditional random fields. Finally, high-resolution remotely sensed image data (GeoEye) is used to test the performance of the presented method. The experimental results show the superiority of this method over the eCognition method in both effectiveness and accuracy, which implies it is suitable for the classification of high resolution remote sensing images.

  2. Cryptographic analysis on the key space of optical phase encryption algorithm based on the design of discrete random phase mask

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Li, Zengyan

    2013-07-01

    The key space of the phase encryption algorithm using a discrete random phase mask is investigated by numerical simulation in this paper. A random phase mask with finite, discrete phase levels is considered the core component in most practical optical encryption architectures. The key space analysis is based on the design criteria of the discrete random phase mask. The roles of the random amplitude mask and the random phase mask in an optical encryption system are identified from the perspective of confusion and diffusion. The properties of the discrete random phase mask in a practical double random phase encoding scheme working in both amplitude encoding (AE) and phase encoding (PE) modes are comparatively analyzed. The key space of the random phase encryption algorithm is evaluated considering both encryption quality and brute-force attack resistibility. A method for enlarging the key space of the phase encryption algorithm is also proposed to enhance the security of optical phase encryption techniques.
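    For a sense of scale, the key space of such a mask grows as M^N for a mask of N independent pixels with M discrete phase levels each, i.e. N·log2(M) bits of key material. A quick back-of-the-envelope count, with an illustrative mask size and level count (not the paper's parameters):

```python
import math

# Hypothetical discrete random phase mask: 256 x 256 pixels, each
# taking one of 16 discrete phase levels.
N = 256 * 256   # mask pixels
M = 16          # discrete phase levels per pixel

key_bits = N * math.log2(M)   # M**N keys = 2**key_bits
print(f"key space: {M}^{N} keys = 2^{key_bits:.0f}")
```

This is why the paper's key-space evaluation must weigh the number of levels against encryption quality: fewer levels shrink the exponent directly, while the brute-force resistance scales with 2^(N·log2 M).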

  3. A randomized, controlled trial of team-based competition to increase learner participation in quality-improvement education.

    Science.gov (United States)

    Scales, Charles D; Moin, Tannaz; Fink, Arlene; Berry, Sandra H; Afsar-Manesh, Nasim; Mangione, Carol M; Kerfoot, B Price

    2016-04-01

    Several barriers challenge resident engagement in learning quality improvement (QI). We investigated whether the incorporation of team-based game mechanics into an evidence-based online learning platform could increase resident participation in a QI curriculum. Randomized, controlled trial. Tertiary-care medical center residency training programs. Resident physicians (n = 422) from nine training programs (anesthesia, emergency medicine, family medicine, internal medicine, ophthalmology, orthopedics, pediatrics, psychiatry and general surgery) randomly allocated to a team competition environment (n = 200) or the control group (n = 222). Specialty-based team assignment with leaderboards to foster competition, and alias assignment to de-identify individual participants. Participation in online learning, as measured by percentage of questions attempted (primary outcome) and additional secondary measures of engagement (i.e. response time). Changes in participation measures over time between groups were assessed with a repeated measures ANOVA framework. Residents in the intervention arm demonstrated greater participation than the control group. The percentage of questions attempted at least once was greater in the competition group (79% [SD ± 32] versus control, 68% [SD ± 37], P = 0.03). Median response time was faster in the competition group (P = 0.006). Differences in participation continued to increase over the duration of the intervention, as measured by average response time and cumulative percent of questions attempted. Team-based competition increased resident participation in the online course delivering QI content. Medical educators should consider game mechanics to optimize participation when designing learning experiences. Published by Oxford University Press in association with the International Society for Quality in Health Care 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  4. Ground-based measurements of ionospheric dynamics

    Science.gov (United States)

    Kouba, Daniel; Chum, Jaroslav

    2018-05-01

    Different methods are used to research and monitor the ionospheric dynamics using ground measurements: Digisonde Drift Measurements (DDM) and Continuous Doppler Sounding (CDS). For the first time, we present comparison between both methods on specific examples. Both methods provide information about the vertical drift velocity component. The DDM provides more information about the drift velocity vector and detected reflection points. However, the method is limited by the relatively low time resolution. In contrast, the strength of CDS is its high time resolution. The discussed methods can be used for real-time monitoring of medium scale travelling ionospheric disturbances. We conclude that it is advantageous to use both methods simultaneously if possible. The CDS is then applied for the disturbance detection and analysis, and the DDM is applied for the reflection height control.

  5. Mindfulness-based cognitive therapy (MBCT) for multiple chemical sensitivity (MCS): Results from a randomized controlled trial with 1-year follow-up

    DEFF Research Database (Denmark)

    Hauge, Christian Riise; Rasmussen, Alice; Piet, Jacob

    2015-01-01

    the effects of mindfulness-based cognitive therapy (MBCT) for individuals with MCS. Methods The intention-to-treat sample (ITT) included 69 individuals who had been randomized to either MBCT or treatment as usual (TAU). The primary outcome measure was the Quick Environmental Exposure and Sensitivity Inventory...

  6. Statistical Measures for Usage-Based Linguistics

    Science.gov (United States)

    Gries, Stefan Th.; Ellis, Nick C.

    2015-01-01

    The advent of usage-/exemplar-based approaches has resulted in a major change in the theoretical landscape of linguistics, but also in the range of methodologies that are brought to bear on the study of language acquisition/learning, structure, and use. In particular, methods from corpus linguistics are now frequently used to study distributional…

  7. Automated seismic detection of landslides at regional scales: a Random Forest based detection algorithm

    Science.gov (United States)

    Hibert, C.; Michéa, D.; Provost, F.; Malet, J. P.; Geertsema, M.

    2017-12-01

    Detection of landslide occurrences and measurement of their dynamic properties during run-out is a high research priority but a logistical and technical challenge. Seismology has started to help in several important ways. Taking advantage of the densification of global, regional and local networks of broadband seismic stations, recent advances now permit the seismic detection and location of landslides in near-real-time. This seismic detection could greatly increase the spatio-temporal resolution at which we study landslide triggering, which is critical to better understand the influence of external forcings such as rainfall and earthquakes. However, automatically detecting seismic signals generated by landslides still represents a challenge, especially for events with small mass. The low signal-to-noise ratio classically observed for landslide-generated seismic signals and the difficulty of discriminating these signals from those generated by regional earthquakes or by anthropogenic and natural noise are some of the obstacles that have to be circumvented. We present a new method for automatically constructing instrumental landslide catalogues from continuous seismic data. We developed a robust and versatile solution, which can be implemented in any context where seismic detection of landslides or other mass movements is relevant. The method is based on a spectral detection of the seismic signals and the identification of the sources with a Random Forest machine learning algorithm. The spectral detection allows detecting signals with low signal-to-noise ratio, while the Random Forest algorithm achieves a high rate of positive identification of the seismic signals generated by landslides and other seismic sources. The processing chain is implemented to run in a High Performance Computing centre, which makes it possible to explore years of continuous seismic data rapidly. We present here the preliminary results of applying this processing chain to several years of continuous seismic data.
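    The classification stage of such a chain can be sketched with a standard Random Forest on per-event spectral features. The two features and the synthetic data below are illustrative stand-ins (landslide signals tend to be longer and lower-frequency than local earthquakes), not the catalogue or attributes used by the authors:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def synth(n, centroid, duration, label):
    """Synthetic per-event features: spectral centroid (Hz), duration (s)."""
    X = np.column_stack([rng.normal(centroid, 1.0, n),
                         rng.normal(duration, 10.0, n)])
    return X, np.full(n, label)

X_ls, y_ls = synth(300, centroid=5.0, duration=60.0, label=1)   # landslides
X_eq, y_eq = synth(300, centroid=12.0, duration=20.0, label=0)  # earthquakes
X = np.vstack([X_ls, X_eq])
y = np.concatenate([y_ls, y_eq])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"hold-out accuracy: {accuracy:.2f}")
```

In a real deployment the feature vector would be richer (e.g. spectrogram statistics, kurtosis, envelope shape), and the forest's per-class probability can be thresholded to trade detection rate against false alarms from earthquakes and noise.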

  8. Novel measurement-based indoor cellular radio system design

    OpenAIRE

    Aragón-Zavala, A

    2008-01-01

    A scalable, measurement-based radio methodology has been created for the design, planning and optimisation of indoor cellular radio systems. The development of this measurement-based methodology was performed having in mind that measurements are often required to validate radio coverage in a building. Therefore, the concept of using carefully calibrated measurements to design and optimise a system is feasible, since these measurements can easily be obtained prior to system deployment ...

  9. Synthesis and characterization of sugar-based methacrylates and their random copolymers by ATRP

    Directory of Open Access Journals (Sweden)

    G. Acik

    2017-10-01

    Various sugar-based methacrylate monomers have been prepared and randomly copolymerized with methyl methacrylate (MMA) using classical atom transfer radical polymerization (ATRP). First, four different sugar-based methacrylates are synthesized by a two-step method: (i) etherification of protected monosaccharides with epichlorohydrin and (ii) subsequent ring-opening reaction of the obtained epoxides with methacrylic acid (MAA) in the presence of triethylamine. Next, these monomers are copolymerized with MMA via ATRP at 90 °C to obtain the corresponding random copolymers. The molecular weights of the copolymers are determined by both GPC (gel permeation chromatography) and 1H-NMR (nuclear magnetic resonance spectroscopy) analyses and found to be 10600~16800 and 12200~18500 g/mol, respectively. Moreover, the copolymer compositions are also determined by 1H-NMR analysis using characteristic signals of the monomers and found to be about 94.1~97.8%, in good agreement with the feed ratios. In addition, the glass transition temperatures of the copolymers range from 101.2 to 102.9 °C, depending on the type and composition of the sugar-based methacrylate monomer. Overall, a series of well-defined random copolymers comprising different sugar-based methacrylates and methyl methacrylate were successfully synthesized by the classical ATRP method.

  10. Measurement Error Correction Formula for Cluster-Level Group Differences in Cluster Randomized and Observational Studies

    Science.gov (United States)

    Cho, Sun-Joo; Preacher, Kristopher J.

    2016-01-01

    Multilevel modeling (MLM) is frequently used to detect cluster-level group differences in cluster randomized trial and observational studies. Group differences on the outcomes (posttest scores) are detected by controlling for the covariate (pretest scores) as a proxy variable for unobserved factors that predict future attributes. The pretest and…

  11. A Unified 3D Mesh Segmentation Framework Based on Markov Random Field

    OpenAIRE

    Z.F. Shi; L.Y. Lu; D. Le; X.M. Niu

    2012-01-01

    3D mesh segmentation has become an important research field in computer graphics during the past decades. Many geometry-based and semantics-oriented approaches for 3D mesh segmentation have been presented. In this paper, we formulate mesh segmentation as a labeling problem. Inspired by Markov Random Field (MRF) based image segmentation, we propose a new framework for 3D mesh segmentation based on MRFs and use graph cuts to solve it. Any features of the 3D mesh can be integrated...
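    As a toy illustration of MRF-based labeling (the paper minimizes the energy with graph cuts; here a simple ICM sweep on a face-adjacency chain stands in, with made-up unary costs and a Potts smoothness term):

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for a mesh dual graph: 30 "faces" in a chain, 2 labels.
n_faces, n_labels = 30, 2
true = np.repeat([0, 1], 15)            # ground-truth segmentation

# Noisy unary costs: each face weakly prefers its true label.
unary = np.zeros((n_faces, n_labels))
unary[np.arange(n_faces), true] = -1.0
unary += 0.4 * rng.normal(size=unary.shape)

neighbors = [[j for j in (i - 1, i + 1) if 0 <= j < n_faces]
             for i in range(n_faces)]
lam = 1.0                               # Potts smoothness weight

labels = unary.argmin(axis=1)           # independent (unary-only) init
for _ in range(10):                     # ICM sweeps: greedy local updates
    for i in range(n_faces):
        cost = unary[i].copy()
        for j in neighbors[i]:
            cost += lam * (np.arange(n_labels) != labels[j])
        labels[i] = cost.argmin()

accuracy = np.mean(labels == true)
print(f"agreement with ground truth: {accuracy:.2f}")
```

ICM only finds a local minimum of the MRF energy; graph cuts, as used in the paper, give a global optimum for two labels and strong approximations for more, which is why it is the standard solver in this setting.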

  12. Miniaturized diffraction based interferometric distance measurement sensor

    Science.gov (United States)

    Kim, Byungki

    In this thesis, new metrology hardware is designed, fabricated, and tested to provide improvements over current MEMS metrology. The metrology system is a micromachined scanning interferometer (muSI) having sub-nm resolution in a compact design. The proposed microinterferometer forms a phase sensitive diffraction grating with interferometric sensitivity, while adding the capability of better lateral resolution by focusing the laser to a smaller spot size. A detailed diffraction model of the microinterferometer was developed to simulate the device performance and to suggest the location of photo detectors for integrated optoelectronics. A particular device is fabricated on a fused silica substrate using aluminum to form the deformable diffraction grating fingers and AZ P4620 photo resist (PR) for the microlens. The details of the fabrication processes are presented. The structure also enables optoelectronics to be integrated so that the interferometer with photo detectors can fit in an area that is 1 mm x 1 mm. The scanning results using a fixed-grating muSI demonstrated that it could measure vibration profiles as well as the static vertical (less than half a wavelength) and lateral dimensions of MEMS. The muSI, which is integrated with photo diodes, demonstrated its operation by scanning a cMUT. PID control was tested and resulted in improved scanned images. The integrated muSI demonstrated that the deformable grating could be used to tune the measurement and keep the interferometer in quadrature for highest sensitivity.

  13. Development of microcontroller based water flow measurement

    Science.gov (United States)

    Munir, Muhammad Miftahul; Surachman, Arif; Fathonah, Indra Wahyudin; Billah, Muhammad Aziz; Khairurrijal, Mahfudz, Hernawan; Rimawan, Ririn; Lestari, Slamet

    2015-04-01

    A digital instrument for measuring water flow was developed using an AT89S52 microcontroller, a DS1302 real time clock (RTC), and an EEPROM for external memory. The sensor used for probing the current was a propeller that rotates when immersed in a water flow. After one rotation, the sensor sends one pulse, and the number of pulses is counted for a certain counting time. The measurement data, i.e. the number of pulses per unit time, are converted into water flow velocity (m/s) through a mathematical formula. The microcontroller counts the pulses sent by the sensor, and the number of counted pulses is stored in the EEPROM memory. The time interval for counting is provided by the RTC and can be set by the operator. The instrument was tested under various time intervals ranging from 10 to 40 seconds and with several standard propellers owned by the Experimental Station for Hydraulic Structure and Geotechnics (BHGK), Research Institute for Water Resources (Pusair). Using the same propellers and water flows, it was shown that the water flow velocities obtained from the developed digital instrument and those found by the provided analog one are in close agreement.
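    The pulse-to-velocity conversion described above can be sketched as follows. The linear calibration v = a·n + b (with n the rotation rate in rotations per second) is the standard form for propeller current meters, but the constants here are illustrative, not those of the BHGK/Pusair propellers:

```python
# Hypothetical calibration constants for one propeller.
A_CAL = 0.25   # metres of water column per rotation (illustrative)
B_CAL = 0.02   # m/s offset (illustrative)

def flow_velocity(pulses: int, interval_s: float) -> float:
    """Water flow velocity (m/s) from pulses counted over an
    operator-set counting interval (one pulse per propeller rotation)."""
    n = pulses / interval_s      # rotations per second
    return A_CAL * n + B_CAL

# 48 pulses over a 20 s counting interval -> 2.4 rot/s -> 0.62 m/s
print(f"{flow_velocity(48, 20.0):.2f} m/s")
```

On the actual instrument this arithmetic runs on the microcontroller after each RTC-timed interval, with the raw counts also logged to EEPROM so the conversion can be redone offline with per-propeller calibration constants.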

  14. Study on the algorithm of computational ghost imaging based on discrete fourier transform measurement matrix

    Science.gov (United States)

    Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua

    2016-07-01

    On the basis of analyzing the cosine light field with a determined analytic expression and the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the algorithm of computational ghost imaging based on a discrete Fourier transform measurement matrix is deduced theoretically and compared with the algorithm of compressive computational ghost imaging based on a random measurement matrix. The reconstruction process and the reconstruction error are analyzed, and simulations are carried out to verify the theoretical analysis. When the number of sampling measurements is similar to the number of object pixels, the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix, the PSNR of the images reconstructed by the FGI and PGI algorithms is similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of the image reconstructed by the FGI algorithm decreases slowly, while the PSNR of the images reconstructed by the PGI and CGI algorithms decreases sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter and realize denoising in reconstruction, with a higher denoising capability than the CGI algorithm. The FGI algorithm can improve the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
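    A minimal 1-D simulation of the core idea — preset DFT illumination patterns inverted with the pseudo-inverse — might look like the following; the sizes and the test object are assumptions for illustration, not the paper's setup:

```python
import numpy as np

# Each row of A plays the role of one preset illumination pattern;
# y collects the corresponding bucket-detector values, and the object
# is recovered with the pseudo-inverse.
N = 64
x = np.zeros(N)
x[20:30] = 1.0                          # simple 1-D "object"

k = np.arange(N)[:, None]
n = np.arange(N)[None, :]
A = np.exp(-2j * np.pi * k * n / N)     # N x N DFT measurement matrix

y = A @ x                               # one measurement per pattern
x_hat = np.real(np.linalg.pinv(A) @ y)  # pseudo-inverse reconstruction

err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.2e}")
```

With a full-rank square DFT matrix the pseudo-inverse recovers the object to numerical precision; the trade-offs discussed in the abstract arise when the number of measurement rows drops below the number of pixels.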

  15. A simple heuristic for Internet-based evidence search in primary care: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Eberbach A

    2016-08-01

    Andreas Eberbach,1 Annette Becker,1 Justine Rochon,2 Holger Finkemeler,1 Achim Wagner,3 Norbert Donner-Banzhoff1 1Department of Family and Community Medicine, Philipp University of Marburg, Marburg, Germany; 2Institute of Medical Biometry and Informatics, University of Heidelberg, Heidelberg, Germany; 3Department of Sport Medicine, Justus-Liebig-University of Giessen, Giessen, Germany Background: General practitioners (GPs) are confronted with a wide variety of clinical questions, many of which remain unanswered. Methods: In order to assist GPs in finding quick, evidence-based answers, we developed a learning program (LP) with a short interactive workshop based on a simple three-step heuristic to improve their search and appraisal competence (SAC). We evaluated the LP effectiveness with a randomized controlled trial (RCT). Participants (intervention group [IG] n=20; control group [CG] n=31) rated acceptance and satisfaction and also answered 39 knowledge questions to assess their SAC. We controlled for previous knowledge in content areas covered by the test. Results: Main outcome – SAC: within both groups, the pre–post test shows significant (P=0.00) improvements in correctness (IG 15% vs CG 11%) and confidence (32% vs 26%) to find evidence-based answers. However, the SAC difference was not significant in the RCT. Other measures: Most workshop participants rated "learning atmosphere" (90%), "skills acquired" (90%), and "relevancy to my practice" (86%) as good or very good. The LP recommendations were implemented by 67% of the IG, whereas 15% of the CG already conformed to LP recommendations spontaneously (odds ratio 9.6, P=0.00). After literature search, the IG showed a (not significantly) higher satisfaction regarding "time spent" (IG 80% vs CG 65%), "quality of information" (65% vs 54%), and "amount of information" (53% vs 47%). Conclusion: Long-standing established GPs have a good SAC. Despite high acceptance, strong…

  16. Do cognitive measures and brain circuitry predict outcomes of exercise in Parkinson Disease: a randomized clinical trial.

    Science.gov (United States)

    King, L A; Peterson, D S; Mancini, M; Carlson-Kuhta, P; Fling, B W; Smulders, K; Nutt, J G; Dale, M; Carter, J; Winters-Stone, K M; Horak, F B

    2015-10-24

    There is emerging research detailing the relationship between balance/gait/falls and cognition. Imaging studies also suggest a link between structural and functional changes in the frontal lobe (a region commonly associated with cognitive function) and mobility. People with Parkinson's disease have important changes in cognitive function that may impact rehabilitation efficacy. Our underlying hypothesis is that cognitive function and frontal lobe connections with the basal ganglia and brainstem posture/locomotor centers are responsible for postural deficits in people with Parkinson's disease and play a role in rehabilitation efficacy. The purpose of this study is to 1) determine if people with Parkinson's disease can improve mobility and/or cognition after partaking in a cognitively challenging mobility exercise program and 2) determine if cognition and brain circuitry deficits predict responsiveness to exercise rehabilitation. This study is a randomized cross-over controlled intervention to take place at a University Balance Disorders Laboratory. The study participants will be people with Parkinson's disease who meet inclusion criteria for the study. The intervention will be 6 weeks of group exercise (case) and 6 weeks of group education (control). The exercise is a cognitively challenging program based on the Agility Boot Camp for people with PD. The education program is a 6-week program to teach people how to better live with a chronic disease. The primary outcome measure is the MiniBESTest and the secondary outcomes are measures of mobility, cognition and neural imaging. The results from this study will further our understanding of the relationship between cognition and mobility with a focus on brain circuitry as it relates to rehabilitation potential. This trial is registered at clinicaltrials.gov (NCT02231073).

  17. Effectiveness of a theory-based intervention to increase colorectal cancer screening among Iranian health club members: a randomized trial.

    Science.gov (United States)

    Salimzadeh, Hamideh; Eftekhar, Hassan; Majdzadeh, Reza; Montazeri, Ali; Delavari, Alireza

    2014-10-01

    Colorectal cancer is the third most commonly diagnosed cancer and the fourth leading cause of death in the world. There are few published studies that have used theory-based interventions designed to increase colorectal cancer screening in community lay health organizations. The present study was guided by the theoretical concepts of the preventive health model. Twelve health clubs of a municipal district in Tehran were randomized to two study groups with equal ratio. The control group received usual services throughout the study while the intervention group also received a theory-based educational program on colorectal cancer screening plus a reminder call. Screening behavior, the main outcome, was assessed 4 months after randomization. A total of 360 members aged 50 and older from 12 health clubs completed a baseline survey. Participants in the intervention group reported increased knowledge of colorectal cancer and screening tests at 4 months follow-up. The theory-based intervention significantly improved self-efficacy, perceived susceptibility, efficacy of screening, social support, and intention to be screened for colorectal cancer from baseline to 4 months follow-up. The theory-based intervention was found to have a significant effect on colorectal cancer screening use as measured by self-report. The findings could have implications for colorectal cancer screening program development and implementation in primary health care settings and through other community organizations.

  18. Bridge continuous deformation measurement technology based on fiber optic gyro

    Science.gov (United States)

    Gan, Weibing; Hu, Wenbin; Liu, Fang; Tang, Jianguang; Li, Sheng; Yang, Yan

    2016-03-01

    Bridges are an important part of modern transportation systems, and deformation is a key index in a bridge's safety evaluation. To measure the deformation curve of long-span bridges rapidly and to locate the maximum deformation accurately and in time, a continuous deformation measurement system (CDMS) based on an inertial platform is presented and validated in this paper. First, against the background of various bridge deformation measurement methods, deformation measurement based on a fiber optic gyro (FOG) is introduced. Second, the basic measurement principle based on the FOG is presented and the continuous curve trajectory is derived analytically. The measurement accuracy is then analyzed in theory and the factors relevant to ensuring it are discussed. Finally, deformation measurement experiments are conducted on a bridge across the Yangtze River. Experimental results show that the presented deformation measurement method is feasible, practical and reliable; the system can accurately and quickly locate the maximum deformation and has broad application prospects.
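    The double integration underlying such a system — pitch rate to slope angle, slope to elevation along the deck — can be sketched as below; the carriage speed, sampling rate and deflection shape are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# A FOG carried along the deck at known speed measures pitch rate;
# integrating once gives the slope angle, and integrating the slope
# (times the travel speed) gives the elevation profile.
v = 1.0                        # carriage speed along the deck (m/s)
dt = 0.01                      # FOG sampling interval (s)
t = np.arange(0, 100, dt)      # 100 s of travel -> ~100 m of deck
s = v * t

# Simulated true deflection: a shallow sine sag with 5 mm amplitude.
L = s[-1]
z_true = -0.005 * np.sin(np.pi * s / L)
theta_true = np.gradient(z_true, s)       # true slope (rad, small angle)
omega = np.gradient(theta_true, t)        # pitch rate the FOG would see

theta = np.cumsum(omega) * dt + theta_true[0]   # rate -> slope
z = np.cumsum(theta) * dt * v                   # slope -> elevation

max_dev_mm = np.max(np.abs(z - z_true)) * 1000
print(f"max reconstruction deviation: {max_dev_mm:.4f} mm")
```

In practice the dominant error source is not the quadrature shown here but gyro bias drift, which grows with travel time and is why the paper's accuracy analysis focuses on the factors bounding it.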

  19. A Transdermal Measurement Platform Based on Microfluidics

    Directory of Open Access Journals (Sweden)

    Wen-Ying Huang

    2017-01-01

    The Franz diffusion cell is one of the most widely used devices to evaluate transdermal drug delivery. However, this static, nonflowing system has some limitations, such as a relatively large solution volume and skin area and the development of gas bubbles during sampling. To overcome these disadvantages, this study provides a proof of concept for miniaturizing models of transdermal delivery by using a microfluidic chip combined with a diffusion cell. The proposed diffusion microchip system requires only 80 μL of sample solution and provides flow circulation. Two model compounds, Coomassie Brilliant Blue G-250 and potassium ferricyanide, were successfully tested in transdermal delivery experiments. The diffusion rate is high for a high sample concentration or a large membrane pore size. The developed diffusion microchip system is feasible and can be applied to transdermal measurement in the future.

  20. Physically transient photonics: random versus distributed feedback lasing based on nanoimprinted DNA.

    Science.gov (United States)

    Camposeo, Andrea; Del Carro, Pompilio; Persano, Luana; Cyprych, Konrad; Szukalski, Adam; Sznitko, Lech; Mysliwiec, Jaroslaw; Pisignano, Dario

    2014-10-28

    Room-temperature nanoimprinted, DNA-based distributed feedback (DFB) laser operation at 605 nm is reported. The laser is made of a pure DNA host matrix doped with gain dyes. At high excitation densities, the emission of the untextured dye-doped DNA films is characterized by a broad emission peak with an overall line width of 12 nm and superimposed narrow peaks, characteristic of random lasing. Moreover, direct patterning of the DNA films is demonstrated with a resolution down to 100 nm, enabling the realization of both surface-emitting and edge-emitting DFB lasers with a typical line width of <0.3 nm. The resulting emission is polarized, with a ratio between the TE- and TM-polarized intensities exceeding 30. In addition, the nanopatterned devices dissolve in water within less than 2 min. These results demonstrate the possibility of realizing various physically transient nanophotonics and laser architectures, including random lasing and nanoimprinted devices, based on natural biopolymers.

  1. Synchronization of random bit generators based on coupled chaotic lasers and application to cryptography.

    Science.gov (United States)

    Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang

    2010-08-16

    Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The latter in particular has demanding requirements: high generation rates of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates; however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high-rate physical RBGs is possible remains an open question. Here we propose a method whereby two fast RBGs, based on mutually coupled chaotic lasers, are synchronized. Using information-theoretic analysis we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.
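
    As a loose illustration of how bits can be distilled from a chaotic signal, the toy sketch below samples a logistic map (a stand-in for laser intensity dynamics, not the coupled-laser scheme of the paper) and XORs adjacent samples to reduce residual bias; the threshold, parameters, and function name are arbitrary assumptions for illustration only.

    ```python
    # Toy analogue of a chaos-based RBG: iterate a chaotic map, threshold each
    # sample to one raw bit, then XOR adjacent raw bits to whiten the stream.
    # Real systems sample GHz-bandwidth laser intensity with fast ADCs.
    def chaotic_bits(n, x0=0.123456, r=3.99):
        x, raw = x0, []
        for _ in range(2 * n):
            x = r * x * (1.0 - x)            # logistic map iteration (chaotic regime)
            raw.append(1 if x >= 0.5 else 0)  # one raw bit per sample
        # XOR pairs of adjacent samples to reduce bias
        return [raw[2 * i] ^ raw[2 * i + 1] for i in range(n)]

    bits = chaotic_bits(1000)
    ```

    Note that a deterministic map like this is only a visual stand-in: its output is reproducible from the seed, which is exactly the predictability problem that physical entropy sources address.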

  2. Improving Pediatric Basic Life Support Performance Through Blended Learning With Web-Based Virtual Patients: Randomized Controlled Trial.

    Science.gov (United States)

    Lehmann, Ronny; Thiessen, Christiane; Frick, Barbara; Bosse, Hans Martin; Nikendei, Christoph; Hoffmann, Georg Friedrich; Tönshoff, Burkhard; Huwendiek, Sören

    2015-07-02

    E-learning and blended learning approaches gain more and more popularity in emergency medicine curricula. So far, little data is available on the impact of such approaches on procedural learning and skill acquisition and their comparison with traditional approaches. This study investigated the impact of a blended learning approach, including Web-based virtual patients (VPs) and standard pediatric basic life support (PBLS) training, on procedural knowledge, objective performance, and self-assessment. A total of 57 medical students were randomly assigned to an intervention group (n=30) and a control group (n=27). Both groups received paper handouts in preparation of simulation-based PBLS training. The intervention group additionally completed two Web-based VPs with embedded video clips. Measurements were taken at randomization (t0), after the preparation period (t1), and after hands-on training (t2). Clinical decision-making skills and procedural knowledge were assessed at t0 and t1. PBLS performance was scored regarding adherence to the correct algorithm, conformance to temporal demands, and the quality of procedural steps at t1 and t2. Participants' self-assessments were recorded in all three measurements. Procedural knowledge of the intervention group was significantly superior to that of the control group at t1. At t2, the intervention group showed significantly better adherence to the algorithm and temporal demands, and better procedural quality of PBLS in objective measures than did the control group. These aspects differed between the groups even at t1 (after VPs, prior to practical training). Self-assessments differed significantly only at t1 in favor of the intervention group. Training with VPs combined with hands-on training improves PBLS performance as judged by objective measures.

  3. Thermal behavior for a nanoscale two ferromagnetic phase system based on random anisotropy model

    International Nuclear Information System (INIS)

    Muraca, D.; Sanchez, F.H.; Pampillo, L.G.; Saccone, F.D.

    2010-01-01

    Advances in theory that explain the magnetic behavior of two-phase nanocrystalline soft magnetic materials as a function of temperature are presented. The theory developed is based on the well known random anisotropy model and includes the exchange stiffness and anisotropy energies of both the amorphous and crystalline phases. The phenomenological behavior of the coercivity was obtained in the temperature range between the Curie temperatures of the amorphous and crystalline phases.

  4. Post-stratification based on a choice of a randomization device

    Directory of Open Access Journals (Sweden)

    Sarjinder Singh

    2014-06-01

    Full Text Available In this paper, we use the idea of post-stratification based on the respondents’ choice of a particular randomization device in order to estimate the population proportion of a sensitive characteristic. The proposed idea gives full freedom to the respondents and is expected to result in greater cooperation from them as well as to provide some increase in the relative efficiency of the newly proposed estimator.

  5. Computer-Based Cognitive Training for Mild Cognitive Impairment: Results from a Pilot Randomized, Controlled Trial

    OpenAIRE

    Barnes, Deborah E.; Yaffe, Kristine; Belfor, Nataliya; Jagust, William J.; DeCarli, Charles; Reed, Bruce R.; Kramer, Joel H.

    2009-01-01

    We performed a pilot randomized, controlled trial of intensive, computer-based cognitive training in 47 subjects with mild cognitive impairment (MCI). The intervention group performed exercises specifically designed to improve auditory processing speed and accuracy for 100 minutes/day, 5 days/week for 6 weeks; the control group performed more passive computer activities (reading, listening, visuospatial game) for similar amounts of time. Subjects had a mean age of 74 years and 60% were men; 7...

  6. The Effectiveness of School-Based Nutritional Education Program among Obese Adolescents: A Randomized Controlled Study

    OpenAIRE

    In-Iw, Supinya; Saetae, Tridsanun; Manaboriboon, Boonying

    2012-01-01

    The purpose of the study was to determine the change in body weight and body mass index (BMI), as well as diet behaviors, at 4 months after intervention between obese adolescent girls who participated in the school-based nutritional education program, addressed by a pediatrician, and those who attended a regular nutritional class. Methods. 49 obese girls were recruited from a secondary school and randomized into 2 groups, intervention and control. The intensive interactive nutri...

  7. Reducing procrastination using a smartphone-based treatment program: A randomized controlled pilot study

    OpenAIRE

    Christian Aljoscha Lukas; Matthias Berking

    2018-01-01

    Background: Procrastination affects a large number of individuals and is associated with significant mental health problems. Despite the deleterious consequences individuals afflicted with procrastination have to bear, there is a surprising paucity of well-researched treatments for procrastination. To fill this gap, this study evaluated the efficacy of an easy-to-use smartphone-based treatment for procrastination. Method: N=31 individuals with heightened procrastination scores were randomly a...

  8. Mindfulness-Based Cognitive Therapy as a Treatment for Chronic Tinnitus: A Randomized Controlled Trial

    OpenAIRE

    McKenna, L.; Marks, E. M.; Hallsworth, C. A.; Schaette, R.

    2017-01-01

    BACKGROUND: Tinnitus is experienced by up to 15% of the population and can lead to significant disability and distress. There is rarely a medical or surgical target and psychological therapies are recommended. We investigated whether mindfulness-based cognitive therapy (MBCT) could offer an effective new therapy for tinnitus. METHODS: This single-site randomized controlled trial compared MBCT to intensive relaxation training (RT) for chronic, distressing tinnitus in adults. Both treatments in...

  9. Randomized clinical trial of Appendicitis Inflammatory Response score-based management of patients with suspected appendicitis.

    Science.gov (United States)

    Andersson, M; Kolodziej, B; Andersson, R E

    2017-10-01

    The role of imaging in the diagnosis of appendicitis is controversial. This prospective interventional study and nested randomized trial analysed the impact of implementing a risk stratification algorithm based on the Appendicitis Inflammatory Response (AIR) score, and compared routine imaging with selective imaging after clinical reassessment. Patients presenting with suspicion of appendicitis between September 2009 and January 2012 from age 10 years were included at 21 emergency surgical centres and from age 5 years at three university paediatric centres. Registration of clinical characteristics, treatments and outcomes started during the baseline period. The AIR score-based algorithm was implemented during the intervention period. Intermediate-risk patients were randomized to routine imaging or selective imaging after clinical reassessment. The baseline period included 1152 patients, and the intervention period 2639, of whom 1068 intermediate-risk patients were randomized. In low-risk patients, use of the AIR score-based algorithm resulted in less imaging (19·2 versus 34·5 per cent; P appendicitis (6·8 versus 9·7 per cent; P = 0·034). Intermediate-risk patients randomized to the imaging and observation groups had the same proportion of negative appendicectomies (6·4 versus 6·7 per cent respectively; P = 0·884), number of admissions, number of perforations and length of hospital stay, but routine imaging was associated with an increased proportion of patients treated for appendicitis (53·4 versus 46·3 per cent; P = 0·020). AIR score-based risk classification can safely reduce the use of diagnostic imaging and hospital admissions in patients with suspicion of appendicitis. Registration number: NCT00971438 ( http://www.clinicaltrials.gov). © 2017 BJS Society Ltd Published by John Wiley & Sons Ltd.
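
    For readers unfamiliar with the score, the sketch below encodes the AIR score components and the three-tier stratification used in the trial. The point values and cutoffs follow the originally published AIR score (vomiting, right lower quadrant pain, rebound tenderness, temperature, neutrophil proportion, white cell count, C-reactive protein); they are shown for illustration only, not as a clinical tool, and should be verified against the source publication.

    ```python
    # Illustrative AIR score: sums points for seven clinical and laboratory
    # findings, then maps the total (0-12) to a low/intermediate/high risk group.
    def air_score(vomiting, rlq_pain, rebound, temp_c, pmn_pct, wbc, crp):
        score = 0
        score += 1 if vomiting else 0
        score += 1 if rlq_pain else 0
        score += {"none": 0, "light": 1, "medium": 2, "strong": 3}[rebound]
        score += 1 if temp_c >= 38.5 else 0
        score += 2 if pmn_pct >= 85 else (1 if pmn_pct >= 70 else 0)
        score += 2 if wbc >= 15.0 else (1 if wbc >= 10.0 else 0)  # 10^9 cells/L
        score += 2 if crp >= 50 else (1 if crp >= 10 else 0)      # mg/L
        return score

    def risk_group(score):
        # 0-4: low risk, 5-8: intermediate risk, 9-12: high risk
        return "low" if score <= 4 else ("intermediate" if score <= 8 else "high")
    ```

    In the trial, only intermediate-risk patients entered the randomization between routine imaging and selective imaging after clinical reassessment.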

  10. Self-Powered Random Number Generator Based on Coupled Triboelectric and Electrostatic Induction Effects at the Liquid-Dielectric Interface.

    Science.gov (United States)

    Yu, Aifang; Chen, Xiangyu; Cui, Haotian; Chen, Libo; Luo, Jianjun; Tang, Wei; Peng, Mingzeng; Zhang, Yang; Zhai, Junyi; Wang, Zhong Lin

    2016-12-27

    Modern cryptography increasingly employs random numbers generated from physical sources in lieu of conventional software-based pseudorandom numbers, primarily owing to the great demand for unpredictable, indecipherable cryptographic keys derived from true random numbers for information security. Thus far, true random numbers have been demonstrated only through thermal noise and/or quantum effects, which require expensive and complex equipment. In this paper, we demonstrate a method for the self-powered creation of true random numbers by using triboelectric technology to collect random signals from nature. This random number generator, based on coupled triboelectric and electrostatic induction effects at the liquid-dielectric interface, includes an elaborately designed triboelectric nanogenerator (TENG) with an irregular grating structure, an electronic-optical device, and an optical-electronic device. The random characteristics of raindrops are harvested through the TENG and consequently transformed and converted by the electronic-optical device and the optical-electronic device, which has a nonlinear characteristic. The cooperation of the mechanical, electrical, and optical signals ensures that the generator possesses complex nonlinear input-output behavior and contributes to increased randomness. The random number sequences are deduced from the final electrical signals received by the optical-electronic device using a standard algorithm. The obtained random number sequences exhibit good statistical characteristics, unpredictability, and unrepeatability. Our study supplies a simple, practical, and effective method to generate true random numbers, which can be widely used in cryptographic protocols, digital signatures, authentication, identification, and other information security fields.
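
    Claims of "good statistical characteristics" for a bit source are conventionally checked with suites such as NIST SP 800-22. As a minimal sketch (one test of many, not necessarily the evaluation used in the paper), the frequency (monobit) test flags sequences whose 0/1 balance deviates too far from one half; p-values below 0.01 are commonly taken as failures.

    ```python
    import math

    def monobit_p_value(bits):
        # NIST SP 800-22 frequency (monobit) test: map bits to +1/-1, normalize
        # the partial sum, and compute the two-sided tail probability. A large
        # imbalance between 0s and 1s yields a small p-value.
        n = len(bits)
        s = sum(1 if b else -1 for b in bits)
        s_obs = abs(s) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2.0))

    print(monobit_p_value([0, 1] * 5000))  # → 1.0 (perfectly balanced)
    print(monobit_p_value([0] * 1000))     # ~0.0: constant sequence fails
    ```

    Note the monobit test only checks balance: the alternating sequence above passes it despite being trivially predictable, which is why a full evaluation runs the entire suite (runs, block frequency, spectral tests, and so on).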

  11. Competency-Based Education: A Framework for Measuring Quality Courses

    Science.gov (United States)

    Krause, Jackie; Dias, Laura Portolese; Schedler, Chris

    2015-01-01

    The growth of competency-based education in an online environment requires the development and measurement of quality competency-based courses. While quality measures for online courses have been developed and standardized, they do not directly align with emerging best practices and principles in the design of quality competency-based online…

  12. Gamma radiation effects on random copolymers based on poly(butylene succinate) for packaging applications

    Science.gov (United States)

    Negrin, M.; Macerata, E.; Consolati, G.; Quasso, F.; Genovese, L.; Soccio, M.; Giola, M.; Lotti, N.; Munari, A.; Mariani, M.

    2018-01-01

    Within the context of new bioplastic materials, poly(butylene succinate) (PBS) and four novel poly(butylene/thiodiethylene succinate) random copolymers (PBS-PTDGS), in sheets as well as in films, were exposed to gamma radiation, in air and in water, and their behavior, along with the effect on their biodegradability, was investigated. The molecular weight data obtained from gel permeation chromatography indicate that the sensitivity to radiation increases with the amount of the sulfur-containing co-unit (TDGS). At 200 kGy the average molecular weight of the PBS film halves, while for P(BS60TDGS40) the residual molecular weight is about 20%. The calculated yields of intermolecular crosslinking (Gx) and chain scission (Gs) confirmed that scission predominates over crosslinking for all the aliphatic systems. As shown by thermal analyses, gamma radiation affects the thermal properties, leading to an increased crystallinity of the systems, most notably for PBS, and to lower decomposition temperatures. Variations of crystallinity with increasing absorbed dose were confirmed also by PALS analyses. Water contact angle measurements revealed post-irradiation wettability alterations that could positively affect polymer biodegradability. In particular, when irradiated in water at 100 kGy, PBS film exhibits a water contact angle decrease of about 17%, indicating enhanced wettability. After degradation in compost, changes in the surface morphology were observed by means of SEM and sample weight losses were determined, to different extents, according to the irradiation environment. Interestingly, after 52 days in compost, PBS films, both pristine and irradiated in air at 25 kGy, showed a residual weight of about 60%, while those irradiated in water at 25 kGy showed about 44%. Experimental data confirmed that gamma irradiation could represent a viable treatment to enhance the biodegradation in compost of PBS and PBS-based copolymers.

  13. Strong Hearts, Healthy Communities: A Community-Based Randomized Trial for Rural Women.

    Science.gov (United States)

    Seguin, Rebecca A; Paul, Lynn; Folta, Sara C; Nelson, Miriam E; Strogatz, David; Graham, Meredith L; Diffenderfer, Anna; Eldridge, Galen; Parry, Stephen A

    2018-05-01

    The aim of this study was to evaluate a multilevel cardiovascular disease (CVD) prevention program for rural women. This 6-month, community-based, randomized trial enrolled 194 sedentary rural women aged 40 or older with BMI ≥ 25 kg/m². Intervention participants attended 6 months of twice-weekly exercise, nutrition, and heart health classes (48 total) that included individual-, social-, and environment-level components. An education-only control program included didactic healthy lifestyle classes once a month (six total). The primary outcome measures were change in BMI and weight. Within-group and between-group multivariate analyses revealed that only intervention participants decreased BMI (-0.85 units; 95% CI: -1.32 to -0.39; P = 0.001) and weight (-2.24 kg; 95% CI: -3.49 to -0.99; P = 0.002). Compared with controls, intervention participants decreased BMI (difference: -0.71 units; 95% CI: -1.35 to -0.08; P = 0.03) and weight (1.85 kg; 95% CI: -3.55 to -0.16; P = 0.03) and improved C-reactive protein (difference: -1.15 mg/L; 95% CI: -2.16 to -0.15; P = 0.03) and Simple 7, a composite CVD risk score (difference: 0.67; 95% CI: 0.14 to 1.21; P = 0.01). Cholesterol decreased among controls but increased in the intervention group (-7.85 vs. 3.92 mg/dL; difference: 11.77; 95% CI: 0.57 to 22.96; P = 0.04). The multilevel intervention demonstrated modest but superior and meaningful improvements in BMI and other CVD risk factors compared with the control program. © 2018 The Obesity Society.

  14. Gender differences and a school-based obesity prevention program in Argentina: a randomized trial.

    Science.gov (United States)

    Rausch Herscovici, Cecile; Kovalskys, Irina; De Gregorio, María José

    2013-08-01

    To evaluate the impact of a school-based obesity prevention program that seeks to change food intake among students at schools in Rosario, Argentina. This was a prospective study involving 405 children 9-11 years of age at six schools in the poor areas of Rosario, Argentina, in May-October 2008. After matching for socioeconomic status, schools were selected by simple randomization; participants were assessed at baseline (T1) and again 6 months later, after completion of the intervention (T2). The program focused on increasing the children's knowledge of healthy nutrition and exercise through four workshops; educating the parents/caregivers; and offering healthy options at the school snack bar. The main outcome measures were the children's intake of healthy and unhealthy foods (assessed with a weekly food frequency questionnaire) and their body mass index (BMI). Of the 387 children assessed at T1, 369 were reassessed at T2 (205 intervention; 164 control). Girls at the schools where the intervention occurred increased their intake of three of the five healthy food items promoted by the program (fruits, vegetables, low-sugar cereals). Statistical significance was reached for skim milk (P = 0.03) and for pure orange juice (P = 0.05). Boys of both the intervention and control groups failed to improve their intake of healthy foods, but those of the intervention arm significantly reduced their intake of hamburgers and hot dogs (P = 0.001). Girls were more amenable to improving their dietary intake. Overall, the program was more likely to increase consumption of healthy food than to decrease intake of unhealthy foods. Gender differences should be taken into account when designing preventive interventions.

  15. Gender differences and a school-based obesity prevention program in Argentina: a randomized trial

    Directory of Open Access Journals (Sweden)

    Cecile Rausch Herscovici

    2013-08-01

    Full Text Available OBJECTIVE: To evaluate the impact of a school-based obesity prevention program that seeks to change food intake among students at schools in Rosario, Argentina. METHODS: This was a prospective study involving 405 children 9-11 years of age at six schools in the poor areas of Rosario, Argentina, in May-October 2008. After matching for socioeconomic status, schools were selected by simple randomization; participants were assessed at baseline (T1) and again 6 months later, after completion of the intervention (T2). The program focused on increasing the children's knowledge of healthy nutrition and exercise through four workshops; educating the parents/caregivers; and offering healthy options at the school snack bar. The main outcome measures were the children's intake of healthy and unhealthy foods (assessed with a weekly food frequency questionnaire) and their body mass index (BMI). RESULTS: Of the 387 children assessed at T1, 369 were reassessed at T2 (205 intervention; 164 control). Girls at the schools where the intervention occurred increased their intake of three of the five healthy food items promoted by the program (fruits, vegetables, low-sugar cereals). Statistical significance was reached for skim milk (P = 0.03) and for pure orange juice (P = 0.05). Boys of both the intervention and control groups failed to improve their intake of healthy foods, but those of the intervention arm significantly reduced their intake of hamburgers and hot dogs (P = 0.001). CONCLUSIONS: Girls were more amenable to improving their dietary intake. Overall, the program was more likely to increase consumption of healthy food than to decrease intake of unhealthy foods. Gender differences should be taken into account when designing preventive interventions.

  16. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    Science.gov (United States)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  17. A randomized controlled trial of a community-based nutrition education program for low-income parents.

    Science.gov (United States)

    Dollahite, Jamie S; Pijai, Erika I; Scott-Pierce, Michelle; Parker, Carol; Trochim, William

    2014-01-01

    Assess effectiveness of the Expanded Food and Nutrition Education Program on nutrition behaviors post-education and longitudinally. Switching replications randomized experimental design. Participants were randomly assigned to immediate education (IE) or delayed education (DE). Participants in IE received the intervention in the first 8 weeks, and those in DE in the second 8 weeks, with no intervention during the alternate periods. Data were collected in 3 repeated measures. Parents (n = 168 randomized; n = 134 completed) of children in 2 Head Start and 6 low-income schools. Eight weekly workshops, based on Eating Right is Basic-Enhanced, adapted to incorporate a dialogue approach with experiential learning. Ten-item self-reported behavior checklist on nutrition, food resource management, food safety, and food security; responses on a 5-point scale reporting frequency of behavior. Chi-square, analysis of variance, and multiple regression. Groups were demographically similar. Both groups reported improved behaviors pre- to post-education (T1 vs T2). The behavior change in IE was retained from T2 to T3. A multiple regression model of overall change, controlling for T1 score and educator, showed significant improvement (n = 134, β = 5.72, P < .001). Positive outcomes were supported by this experimental study in a usual program context, with reported behavior changes retained at least 2 months. Copyright © 2014 Society for Nutrition Education and Behavior. All rights reserved.

  18. Home based telemedicine intervention for patients with uncontrolled hypertension: a real-life, non-randomized study

    Science.gov (United States)

    2014-01-01

    Background Control of blood pressure is frequently inadequate in spite of the availability of several classes of well tolerated and effective antihypertensive drugs. Several factors, including the use of suboptimal doses of drugs, inadequate or ineffective treatments and poor drug compliance, may be the reason for this phenomenon. The aim of the current non-randomized study was to evaluate the effectiveness of a Home-Based Telemedicine service in patients with uncontrolled hypertension. Methods 74 patients were enrolled in the Home-Based Telemedicine group and 94 patients in the Usual Care group. At baseline and at the end of the study, patients in both groups were seen in a cardiology office. Patients in the Home-Based Telemedicine group were additionally followed by a physician-nurse, through scheduled and unscheduled telephone appointments. These patients also received a blood pressure measuring device that could transmit the readings to a central data monitor via a secure data connection. Results During the study period (80 ± 25 days), a total of 17401 blood pressure measurements were taken in the Home-Based Telemedicine group, corresponding to 236 ± 136 readings per patient and a mean of 3 ± 1.7 measurements daily. The scheduled telephone contacts (initiated by the nurse) averaged 5.2 ± 4.3 per patient (370 in total) and the unscheduled telephone contacts (initiated by the patients) averaged 0.4 ± 0.9 per patient (30 in total). The mean systolic blood pressure values decreased from 153 ± 19 mmHg to 130 ± 15 mmHg (p < 0.0001) at the end of the study and diastolic blood pressure values decreased from 89 ± 10 mmHg to 76 ± 11 mmHg (p < 0.0001). In the Usual Care group, the mean systolic blood pressure values decreased from 156 ± 16 mmHg to 149 ± 17 mmHg (p < 0.05) at the end of the study and diastolic blood pressure values decreased from 90 ± 8 mmHg to 86 ± 9 mmHg (p < 0.05). The changes in drug

  19. COMPARISON OF EIGENMODE-BASED AND RANDOM FIELD-BASED IMPERFECTION MODELING FOR THE STOCHASTIC BUCKLING ANALYSIS OF I-SECTION BEAM–COLUMNS

    KAUST Repository

    STAVREV, A.

    2013-03-01

    The uncertainty of geometric imperfections in a series of nominally equal I-beams leads to a variability of corresponding buckling loads. Its analysis requires a stochastic imperfection model, which can be derived either by the simple variation of the critical eigenmode with a scalar random variable, or with the help of the more advanced theory of random fields. The present paper first provides a concise review of the two different modeling approaches, covering theoretical background, assumptions and calibration, and illustrates their integration into commercial finite element software to conduct stochastic buckling analyses with the Monte-Carlo method. The stochastic buckling behavior of an example beam is then simulated with both stochastic models, calibrated from corresponding imperfection measurements. The simulation results show that for different load cases, the response statistics of the buckling load obtained with the eigenmode-based and the random field-based models agree very well. A comparison of our simulation results with corresponding Eurocode 3 limit loads indicates that the design standard is very conservative for compression dominated load cases. © 2013 World Scientific Publishing Company.
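
    The eigenmode-based Monte-Carlo workflow described above can be sketched as follows. Here `buckling_load` is a hypothetical knockdown law standing in for the finite element buckling analysis, and only the sampling structure, one Gaussian random amplitude scaling the critical eigenmode per sample, mirrors the method; all numbers are illustrative assumptions.

    ```python
    # Eigenmode-based imperfection model: imperfection field = critical eigenmode
    # times a scalar Gaussian amplitude. Each sampled amplitude is mapped to a
    # buckling load, and the response statistics are collected over the samples.
    import math
    import random
    import statistics

    P_CRIT = 100.0  # buckling load of the perfect structure (arbitrary units)

    def buckling_load(amplitude):
        # placeholder knockdown law: load decreases with imperfection magnitude;
        # in practice this is one geometrically nonlinear finite element run
        return P_CRIT / (1.0 + 2.0 * math.sqrt(abs(amplitude)))

    def monte_carlo_buckling(n_samples, sigma, seed=0):
        rng = random.Random(seed)
        loads = [buckling_load(rng.gauss(0.0, sigma)) for _ in range(n_samples)]
        return statistics.mean(loads), statistics.stdev(loads)

    mean_load, std_load = monte_carlo_buckling(2000, sigma=0.1)
    ```

    A random field model would replace the single scalar amplitude with a discretized field (e.g. a Karhunen-Loève expansion with several independent random coefficients), while the outer Monte-Carlo loop stays the same.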

  20. Evaluating a Web-Based Social Anxiety Intervention Among University Students: Randomized Controlled Trial.

    Science.gov (United States)

    McCall, Hugh Cameron; Richardson, Chris G; Helgadottir, Fjola Dogg; Chen, Frances S

    2018-03-21

    Treatment rates for social anxiety, a prevalent and potentially debilitating condition, remain among the lowest of all major mental disorders today. Although computer-delivered interventions are well poised to surmount key barriers to the treatment of social anxiety, most are only marginally effective when delivered as stand-alone treatments. A new, Web-based cognitive behavioral therapy (CBT) intervention called Overcome Social Anxiety was recently created to address the limitations of prior computer-delivered interventions. Users of Overcome Social Anxiety are self-directed through various CBT modules incorporating cognitive restructuring and behavioral experiments. The intervention is personalized to each user's symptoms, and automatic email reminders and time limits are used to encourage adherence. The purpose of this study was to conduct a randomized controlled trial to investigate the effectiveness of Overcome Social Anxiety in reducing social anxiety symptoms in a nonclinical sample of university students. As a secondary aim, we also investigated whether Overcome Social Anxiety would increase life satisfaction in this sample. Following eligibility screening, participants were randomly assigned to a treatment condition or a wait-list control condition. Only those assigned to the treatment condition were given access to Overcome Social Anxiety; they were asked to complete the program within 4 months. The social interaction anxiety scale (SIAS), the fear of negative evaluation scale (FNE), and the quality of life enjoyment and satisfaction questionnaire-short form (Q-LES-Q-SF) were administered to participants from both conditions during baseline and 4-month follow-up lab visits. 
Over the course of the study, participants assigned to the treatment condition experienced a significant reduction in social anxiety (SIAS). A comparison of social anxiety in the 2 conditions over the course of the study showed that those assigned to the treatment condition experienced significantly

  1. Calibration Base Lines for Electronic Distance Measuring Instruments (EDMI)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A calibration base line (CBL) is a precisely measured, straight-line course of approximately 1,400 m used to calibrate Electronic Distance Measuring Instruments...

  2. Simultaneous spacecraft orbit estimation and control based on GPS measurements via extended Kalman filter

    Directory of Open Access Journals (Sweden)

    Tamer Mekky Ahmed Habib

    2013-06-01

    Full Text Available The primary aim of this work is to provide simultaneous spacecraft orbit estimation and control based on global positioning system (GPS) measurements, suitable for application to the upcoming Egyptian remote sensing satellites. Disturbance resulting from the earth's oblateness up to the fourth order (i.e., J4) is considered. In addition, aerodynamic drag and random disturbance effects are taken into consideration.
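
    A generic extended Kalman filter loop of the kind such estimators build on can be sketched as below, reduced to a 1-D constant-velocity state with noiseless position fixes standing in for GPS data. The actual filter propagates full orbital dynamics (J2-J4 gravity terms, drag) with nontrivial Jacobians; the function name and all tuning values here are illustrative assumptions.

    ```python
    # Minimal (extended) Kalman filter: predict with the motion model, then
    # correct with each position measurement z. For this linear toy model the
    # Jacobian of the dynamics equals the model itself.
    def ekf_track(measurements, dt=1.0, q=1e-3, r=0.5):
        x, v = 0.0, 0.0                       # state estimate: position, velocity
        P = [[1.0, 0.0], [0.0, 1.0]]          # estimate covariance
        for z in measurements:
            # --- predict: x' = x + v*dt, P' = F P F^T + Q ---
            x = x + v * dt
            p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
            p01 = P[0][1] + dt * P[1][1]
            p10 = P[1][0] + dt * P[1][1]
            p11 = P[1][1] + q
            # --- update with position measurement z, H = [1, 0] ---
            S = p00 + r                       # innovation covariance
            K0, K1 = p00 / S, p10 / S         # Kalman gain
            y = z - x                         # innovation
            x, v = x + K0 * y, v + K1 * y
            P = [[(1 - K0) * p00, (1 - K0) * p01],
                 [p10 - K1 * p00, p11 - K1 * p01]]
        return x, v

    # noiseless ramp: true position 0.9*t, so the filter should recover v ~ 0.9
    pos, vel = ekf_track([0.9 * t for t in range(1, 51)])
    ```

    In the orbit-determination setting the same predict/correct structure is kept, but the state holds position and velocity vectors, the prediction integrates the perturbed two-body equations, and the estimated state feeds the controller.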

  3. An efficient ERP-based brain-computer interface using random set presentation and face familiarity.

    Directory of Open Access Journals (Sweden)

    Seul-Ki Yeom

    Full Text Available Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300-based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare the P300-based BCI performance of a conventional row-column (RC)-based paradigm with our approach, which combines a random set presentation paradigm with (non-)self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed, and thus the fluency, of BCI-enhanced communication with the widely used P300-based BCI setup.
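
    The paper's exact flash scheduling is not reproduced here; as an assumption, one simple way to realize a random set presentation for a 36-item speller is to partition the items into fresh random groups each round, so every item flashes exactly once per round without the fixed row/column structure of the RC paradigm.

    ```python
    # Random set presentation schedule for a 36-character speller: each round is
    # a random partition of the 36 items into 6 groups of 6. Unlike row-column
    # flashing, the groups change every round, avoiding adjacency artifacts.
    import random

    def random_set_rounds(n_rounds, n_items=36, group_size=6, seed=0):
        rng = random.Random(seed)
        rounds = []
        for _ in range(n_rounds):
            items = list(range(n_items))
            rng.shuffle(items)  # fresh random ordering for this round
            rounds.append([items[i:i + group_size]
                           for i in range(0, n_items, group_size)])
        return rounds

    rounds = random_set_rounds(3)
    ```

    Each group is then flashed in turn (using face images as stimuli in the paper's paradigm), and the target character is the one whose groups elicit the strongest P300 responses across rounds.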

  4. Internet-based self-help treatment for depression in multiple sclerosis: study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Boeschoten Rosa E

    2012-09-01

Full Text Available Abstract Background Depression in MS patients is frequent but often not treated adequately. An important underlying factor may be physical limitations that preclude face-to-face contact. Internet-based treatment has been shown to be effective for depressive symptoms in general and could thus be a promising tool for treatment in MS. Methods/design Here, we present a study protocol to investigate the effectiveness of a 5-week Internet-based self-help problem solving treatment (PST) for depressive symptoms in MS patients in a randomized controlled trial. We aim to include 166 MS patients with moderate to severe depressive symptoms who will be randomly assigned to an Internet-based intervention (with or without supportive text messages) or a waiting list control group. The primary outcome is the change in depressive symptoms, defined as a change in the sum score on the Beck Depression Inventory (BDI-II). Secondary outcomes will include measures of anxiety, fatigue, cognitive functioning, physical and psychological impact of MS, quality of life, problem solving skills, social support, mastery, satisfaction and compliance rate. Assessments will take place at baseline (T0), within a week after the intervention (T1), at four months (T2), and at ten months follow-up (T3; intervention group only). The control group will be measured at the same points in time. Analysis will be based on the intention-to-treat principle. Discussion If shown to be effective, Internet-based PST will offer new possibilities to reach and treat MS patients with depressive symptoms and to improve the quality of care. Trial Registration The Dutch Cochrane Center, NTR2772

  5. Fourier transform based scalable image quality measure.

    Science.gov (United States)

    Narwaria, Manish; Lin, Weisi; McLoughlin, Ian; Emmanuel, Sabu; Chia, Liang-Tien

    2012-08-01

We present a new image quality assessment (IQA) algorithm based on the phase and magnitude of the 2D (two-dimensional) discrete Fourier transform (DFT). The basic idea is to compare the phase and magnitude of the reference and distorted images to compute the quality score. However, it is well known that the Human Visual System's (HVS's) sensitivity to different frequency components is not the same. We accommodate this fact via a simple yet effective strategy of nonuniform binning of the frequency components. This process also leads to a reduced-space representation of the image, thereby enabling the reduced-reference (RR) prospects of the proposed scheme. We employ linear regression to integrate the effects of the changes in phase and magnitude; in this way, the required weights are determined via proper training and are hence more convincing and effective. Lastly, using the fact that phase usually conveys more information than magnitude, we use only the phase for RR quality assessment. This provides the crucial advantage of a further reduction in the required amount of reference image information, making the proposed method further scalable for RR scenarios. We report extensive experimental results using a total of 9 publicly available databases: 7 image databases (with a total of 3832 distorted images with diverse distortions) and 2 video databases (with a total of 228 distorted videos). These show that the proposed method is overall better than several of the existing full-reference (FR) algorithms and two RR algorithms. Additionally, there is a graceful degradation in prediction performance as the amount of reference image information is reduced, confirming its scalability prospects. To enable comparisons and future study, a Matlab implementation of the proposed algorithm is available at http://www.ntu.edu.sg/home/wslin/reduced_phase.rar.
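The core comparison of DFT phase and magnitude can be sketched as below. This is illustrative only: the paper's nonuniform frequency binning, HVS weighting, and regression-trained combination of the two scores are omitted, and the similarity measures chosen here are simple stand-ins.

```python
import numpy as np

def phase_magnitude_scores(ref, dist):
    """Compare the DFT phase and magnitude of a reference and a distorted
    image. Returns (phase similarity, magnitude similarity), both in [-1, 1]
    and equal to 1 for identical images."""
    F_ref, F_dist = np.fft.fft2(ref), np.fft.fft2(dist)
    # Phase similarity: mean cosine of the per-frequency phase difference.
    phase_sim = np.mean(np.cos(np.angle(F_ref) - np.angle(F_dist)))
    # Magnitude similarity: normalized correlation of the two spectra.
    m_ref, m_dist = np.abs(F_ref).ravel(), np.abs(F_dist).ravel()
    mag_sim = m_ref @ m_dist / (np.linalg.norm(m_ref) * np.linalg.norm(m_dist))
    return phase_sim, mag_sim

rng = np.random.default_rng(0)
img = rng.random((32, 32))
noisy = img + 0.3 * rng.standard_normal((32, 32))
ps_same, ms_same = phase_magnitude_scores(img, img)
ps_noisy, ms_noisy = phase_magnitude_scores(img, noisy)
```

Distortion lowers the phase similarity, which is the quantity the paper ultimately relies on for its most reduced-reference variant.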

  6. A Novel Approach for Multi Class Fault Diagnosis in Induction Machine Based on Statistical Time Features and Random Forest Classifier

    Science.gov (United States)

    Sonje, M. Deepak; Kundu, P.; Chowdhury, A.

    2017-08-01

Fault diagnosis and detection is an important area in the health monitoring of electrical machines. This paper applies a recently developed machine learning classifier, based on the random forest (RF) algorithm, to multi-class fault diagnosis in induction machines. Initially, stator currents are acquired from the induction machine under various conditions. After preprocessing the currents, fourteen statistical time features are estimated for each phase of the current. These parameters are used as inputs to the classifier. The main scope of the paper is to evaluate the effectiveness of the RF classifier for individual and mixed fault diagnosis in induction machines. Stator, rotor, and mixed faults (stator and rotor faults) are classified using the proposed classifier. The obtained performance measures are compared with those of a multilayer perceptron neural network (MLPNN) classifier. The results show much better performance measures and higher accuracy than the MLPNN classifier. To demonstrate the proposed fault diagnosis algorithm, experimentally obtained results are used to make the classifier more practical.
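The feature-extraction stage can be sketched as below. The paper uses fourteen statistical time features per phase; the exact list is not reproduced in the abstract, so the subset chosen here (mean, standard deviation, RMS, peak, skewness, kurtosis, crest factor) is an assumption for illustration.

```python
import numpy as np

def statistical_time_features(x):
    """A subset of the statistical time features commonly used for
    current-signal fault diagnosis (illustrative; the paper's exact
    fourteen-feature set is assumed, not reproduced)."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    std = x.std()
    rms = np.sqrt(np.mean(x ** 2))
    peak = np.max(np.abs(x))
    skew = np.mean((x - mean) ** 3) / std ** 3   # asymmetry of the waveform
    kurt = np.mean((x - mean) ** 4) / std ** 4   # peakedness / impulsiveness
    crest = peak / rms                           # crest factor
    return np.array([mean, std, rms, peak, skew, kurt, crest])

# One phase of an idealized 50 Hz stator current, sampled at 1 kHz for 1 s.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
phase_a = np.sin(2 * np.pi * 50 * t)
features = statistical_time_features(phase_a)
```

Such a feature vector, computed per phase, would then be fed to the random forest classifier.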

  7. Random Measurement Error as a Source of Discrepancies between the Reports of Wives and Husbands Concerning Marital Power and Task Allocation.

    Science.gov (United States)

    Quarm, Daisy

    1981-01-01

Findings for couples (N=119) show that the low between-spouse correlations for wife's work, money, and spare time are due in part to random measurement error. Suggests that increasing the reliability of measures by creating multi-item indices can also increase correlations. Car purchase, vacation, and child discipline were not accounted for by random measurement…

  8. An effective approach to attenuate random noise based on compressive sensing and curvelet transform

    International Nuclear Information System (INIS)

    Liu, Wei; Cao, Siyuan; Zu, Shaohuan; Chen, Yangkang

    2016-01-01

Random noise attenuation is an important step in seismic data processing. In this paper, we propose a novel denoising approach based on compressive sensing and the curvelet transform. We formulate the random noise attenuation problem as an L1-norm regularized optimization problem. We propose to use the curvelet transform as the sparse transform in the optimization problem to regularize the sparse coefficients, in order to separate signal and noise, and to use the gradient projection for sparse reconstruction (GPSR) algorithm to solve the formulated optimization problem with an easy implementation and fast convergence. We tested the performance of our proposed approach on both synthetic and field seismic data. Numerical results show that the proposed approach can effectively suppress the distortion near the edge of seismic events during the noise attenuation process and has high computational efficiency compared with the traditional curvelet thresholding and iterative soft thresholding based denoising methods. Moreover, compared with f-x deconvolution, the proposed denoising method is capable of eliminating random noise more effectively while preserving more useful signal. (paper)
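The underlying sparsity idea can be sketched in a few lines. Two assumptions for brevity: a unitary FFT stands in for the curvelet transform, and a single analysis/threshold/synthesis pass replaces the iterative GPSR solver; for an orthonormal transform W this one-step shrinkage solves min_x 0.5*||y - x||^2 + tau*||Wx||_1 exactly.

```python
import numpy as np

def transform_soft_threshold_denoise(y, tau):
    """One-step sparse denoising: analyze, soft-threshold, synthesize.
    A unitary FFT stands in for the curvelet transform (assumption)."""
    c = np.fft.fft(y, norm="ortho")                      # analysis
    mag = np.abs(c)
    shrink = np.maximum(mag - tau, 0.0) / np.maximum(mag, 1e-12)
    return np.real(np.fft.ifft(c * shrink, norm="ortho"))  # synthesis

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)                # a sparse-in-frequency "event"
noisy = clean + 0.5 * rng.standard_normal(512)   # additive random noise
denoised = transform_soft_threshold_denoise(noisy, tau=1.0)
```

The signal concentrates into a few large transform coefficients that survive the threshold, while the noise energy is spread thinly and is mostly removed; the curvelet transform plays the same role for curved seismic events in 2D.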

  9. Threshold-Based Random Charging Scheme for Decentralized PEV Charging Operation in a Smart Grid.

    Science.gov (United States)

    Kwon, Ojin; Kim, Pilkee; Yoon, Yong-Jin

    2016-12-26

Smart grids have been introduced to replace conventional power distribution systems, which lack real-time monitoring, in order to accommodate the future market penetration of plug-in electric vehicles (PEVs). When a large number of PEVs require simultaneous battery charging, charging coordination techniques become one of the most critical factors in optimizing the PEV charging performance and the conventional distribution system. In this case, considerable computational complexity at the central controller and exchange of real-time information among PEVs may occur. To alleviate these problems, a novel threshold-based random charging (TBRC) operation for a decentralized charging system is proposed. Using PEV charging thresholds and random access rates, the PEVs themselves can decide whether to issue charging requests. As PEVs with a high battery state do not transmit charging requests to the central controller, the complexity of the central controller decreases due to the reduction in charging requests. In addition, both the charging threshold and the random access rate are statistically calculated based on the average supply power of the PEV charging system and do not require real-time updates. By using the proposed TBRC with a tolerable PEV charging degradation, a 51% reduction in PEV charging requests is achieved.
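The per-vehicle decision rule can be sketched as below. The threshold and access-rate values are hypothetical placeholders; in the paper both are derived statistically from the average supply power rather than chosen by hand.

```python
import random

def tbrc_requests(soc_levels, threshold, access_rate, rng):
    """Threshold-based random charging (sketch): a PEV sends a charging
    request only if its battery state of charge is below the threshold
    AND it wins a Bernoulli trial with the random access rate."""
    requests = []
    for pev_id, soc in enumerate(soc_levels):
        if soc < threshold and rng.random() < access_rate:
            requests.append(pev_id)
    return requests

rng = random.Random(42)
soc = [rng.random() for _ in range(1000)]   # states of charge in [0, 1]
reqs = tbrc_requests(soc, threshold=0.7, access_rate=0.7, rng=rng)
```

With these illustrative parameters roughly half of the fleet stays silent in any given slot, which is the mechanism behind the reported reduction in requests reaching the central controller.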

  10. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    Science.gov (United States)

    Galelli, S.; Castelletti, A.

    2013-07-01

Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies - Marina catchment (Singapore) and Canning River (Western Australia) - representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of the input variables can be given a physically meaningful interpretation.
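The split rule that distinguishes Extra-Trees from CART-style trees can be sketched as below: instead of searching for the optimal cut-point, each candidate feature gets a single uniformly random cut, and only the choice among features is score-driven. This is a minimal single-node sketch, not a full ensemble.

```python
import random

def _variance(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

def extra_tree_split(X, y, rng):
    """Core Extra-Trees split (sketch): for each feature draw ONE uniformly
    random cut-point between that feature's min and max, then keep the
    feature/cut pair with the largest variance reduction. Skipping the
    cut-point optimization is what makes the trees cheap to grow."""
    n = len(X)
    total_var = _variance(y)
    best = None
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        lo, hi = min(col), max(col)
        if lo == hi:
            continue                      # constant feature, nothing to split
        cut = rng.uniform(lo, hi)         # random cut, not the optimal one
        left = [y[i] for i in range(n) if X[i][j] < cut]
        right = [y[i] for i in range(n) if X[i][j] >= cut]
        if not left or not right:
            continue
        score = total_var - (len(left) * _variance(left)
                             + len(right) * _variance(right)) / n
        if best is None or score > best[0]:
            best = (score, j, cut)
    return best                           # (variance reduction, feature, cut)

rng = random.Random(7)
X = [[i / 200.0, rng.random()] for i in range(200)]  # feature 1 is pure noise
y = [row[0] for row in X]                            # target depends on feature 0 only
best = extra_tree_split(X, y, rng)
```

Averaging many such randomized trees is what recovers accuracy, and accumulating each feature's variance reductions across the ensemble yields the variable-importance ranking mentioned in the abstract.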

  11. Triangulation-based edge measurement using polyview optics

    Science.gov (United States)

    Li, Yinan; Kästner, Markus; Reithmeier, Eduard

    2018-04-01

Laser triangulation sensors are non-contact measurement devices widely used in industry and research for profile measurements and quantitative inspections. Some technical applications, e.g. edge measurements, usually require a configuration of a single sensor and a translation stage, or a configuration of multiple sensors, to cover a measurement range beyond the scope of a single sensor. However, the cost of both configurations is high, due to the additional rotational axis or additional sensor. This paper presents a measurement system for strongly curved surfaces based on a single-sensor configuration. Utilizing a self-designed polyview optics and calibration process, the proposed measurement system achieves a field of view (FOV) of over 180° with high measurement accuracy at low cost. The capability of this measurement system is discussed in detail on the basis of experimental data.

  12. Home-based balance training using the Wii balance board: a randomized, crossover pilot study in multiple sclerosis.

    Science.gov (United States)

    Prosperini, Luca; Fortuna, Deborah; Giannì, Costanza; Leonardi, Laura; Marchetti, Maria Rita; Pozzilli, Carlo

    2013-01-01

To evaluate the effectiveness of home-based rehabilitation of balance using the Nintendo Wii Balance Board System (WBBS) in patients affected by multiple sclerosis (MS). In this 24-week, randomized, 2-period crossover pilot study, 36 patients having an objective balance disorder were randomly assigned in a 1:1 ratio to 2 counterbalanced arms. Group A started a 12-week period of home-based WBBS training followed by a 12-week period without any intervention; group B received the treatment in reverse order. As endpoints, we considered the mean difference (compared with baseline) in force platform measures (i.e., the displacement of the body center of pressure in 30 seconds), the 4-step square test (FSST), the 25-foot timed walking test (25-FWT), and the 29-item MS Impact Scale (MSIS-29), as evaluated after 12 weeks and at the end of the 24-week study period. The 2 groups did not differ in baseline characteristics. Repeated-measures analyses of variance showed significant time × treatment effects, indicating that WBBS was effective in ameliorating force platform measures (F = 4.608, P = .016), FSST (F = 3.745, P = .034), 25-FWT (F = 3.339, P = .048), and MSIS-29 (F = 4.282, P = .023). Five adverse events attributable to the WBBS training (knee or low back pain) were recorded, but only 1 patient withdrew from the study. Home-based WBBS training might provide an effective, engaging balance rehabilitation solution for people with MS. However, the risk of WBBS training-related injuries should be carefully weighed against the benefits. Further studies, including cost-effectiveness analyses, are warranted to establish whether WBBS may be useful in the home setting.

  13. Mapping Deforestation in North Korea Using Phenology-Based Multi-Index and Random Forest

    Directory of Open Access Journals (Sweden)

    Yihua Jin

    2016-12-01

Full Text Available Phenology-based multi-index with the random forest (RF) algorithm can be used to overcome the shortcomings of traditional deforestation mapping that involves pixel-based classification, such as ISODATA or decision trees, and single images. The purpose of this study was to investigate methods to identify specific types of deforestation in North Korea, and to increase the accuracy of classification, using phenological characteristics extracted with multi-index and random forest algorithms. The mapping of deforestation area based on RF was carried out by merging phenology-based multi-indices (i.e., normalized difference vegetation index (NDVI), normalized difference water index (NDWI), and normalized difference soil index (NDSI)) derived from MODIS (Moderate Resolution Imaging Spectroradiometer) products and topographical variables. Our results showed an overall classification accuracy of 89.38%, with a corresponding kappa coefficient of 0.87. In particular, for forest and farm land categories with similar phenological characteristics (e.g., paddy, plateau vegetation, unstocked forest, hillside field), this approach improved the classification accuracy in comparison with pixel-based methods and other classes. The deforestation types were identified by incorporating point data from high-resolution imagery, outcomes of image classification, and slope data. Our study demonstrated that the proposed methodology could be used for deciding on the restoration priority and monitoring the expansion of deforestation areas.
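The three indices that make up the multi-index stack can be computed per pixel as below. The band combinations follow common conventions; exact band choices (and the NDSI formulation in particular) vary between papers, so treat them as assumptions, and the reflectance values are toy numbers.

```python
import numpy as np

def spectral_indices(nir, red, swir, eps=1e-9):
    """Per-pixel normalized difference indices from NIR, red, and SWIR
    reflectances (band mapping assumed, not taken from the paper)."""
    ndvi = (nir - red) / (nir + red + eps)     # vegetation greenness
    ndwi = (nir - swir) / (nir + swir + eps)   # vegetation/soil moisture
    ndsi = (swir - nir) / (swir + nir + eps)   # bare-soil emphasis
    return ndvi, ndwi, ndsi

# Toy reflectances: pixel 0 is vegetated, pixel 1 is bare soil.
nir = np.array([0.45, 0.25])
red = np.array([0.08, 0.20])
swir = np.array([0.15, 0.30])
ndvi, ndwi, ndsi = spectral_indices(nir, red, swir)
```

Stacking these indices across a year of MODIS composites gives each pixel a phenological trajectory, which is what lets the random forest separate classes (e.g., paddy vs. unstocked forest) that look alike in any single image.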

  14. Controlling Chronic Diseases Through Evidence-Based Decision Making: A Group-Randomized Trial.

    Science.gov (United States)

    Brownson, Ross C; Allen, Peg; Jacob, Rebekah R; deRuyter, Anna; Lakshman, Meenakshi; Reis, Rodrigo S; Yan, Yan

    2017-11-30

Although practitioners in state health departments are ideally positioned to implement evidence-based interventions, few studies have examined how to build their capacity to do so. The objective of this study was to explore how to increase the use of evidence-based decision-making processes at both the individual and organization levels. We conducted a 2-arm, group-randomized trial with baseline data collection and follow-up at 18 to 24 months. Twelve state health departments were paired and randomly assigned to the intervention or control condition. In the 6 intervention states, a multiday training on evidence-based decision making was conducted from March 2014 through March 2015, along with a set of supplemental capacity-building activities. Individual-level outcomes were the evidence-based decision-making skills of public health practitioners; organization-level outcomes were access to research evidence and participatory decision making. Mixed analysis-of-covariance models were used to evaluate the intervention effect, accounting for the cluster-randomized trial design. Analysis was performed from March through May 2017. Participation 18 to 24 months after initial training was 73.5%. In mixed models adjusted for participant and state characteristics, the intervention group improved significantly in the overall skill gap (P = .01) and in 6 skill areas. Among the 4 organizational variables, only access to evidence and skilled staff showed an intervention effect (P = .04). Tailored and active strategies are needed to build capacity at the individual and organization levels for evidence-based decision making. Our study suggests several dissemination interventions for consideration by leaders seeking to improve public health practice.

  15. A cluster randomized control field trial of the ABRACADABRA web-based literacy intervention: Replication and extension of basic findings.

    Directory of Open Access Journals (Sweden)

    Noella Angele Piquette

    2014-12-01

Full Text Available The present paper reports a cluster randomized control trial evaluation of teaching using ABRACADABRA (ABRA), an evidence-based and web-based literacy intervention (http://abralite.concordia.ca), with 107 kindergarten and 96 grade 1 children in 24 classes (12 intervention, 12 control) from all 12 elementary schools in one school district in Canada. Children in the intervention condition received 10-12 hours of whole-class instruction using ABRA between pre- and post-test. Hierarchical linear modeling of post-test results showed significant gains in letter-sound knowledge for intervention classrooms over control classrooms. In addition, medium effect sizes favoring the intervention over regular teaching were evident for three of five outcome measures: letter-sound knowledge (d = +.66), phonological blending (d = +.52), and word reading (d = +.52). It is concluded that regular teaching with ABRA technology adds significantly to literacy in the early elementary years.

  16. Impulse attack-free four random phase mask encryption based on a 4-f optical system.

    Science.gov (United States)

    Kumar, Pramod; Joseph, Joby; Singh, Kehar

    2009-04-20

    Optical encryption methods based on double random phase encryption (DRPE) have been shown to be vulnerable to different types of attacks. The Fourier plane random phase mask (RPM), which is the most important key, can be cracked with a single impulse function attack. Such an attack is viable because the Fourier transform of a delta function is a unity function. Formation of a unity function can be avoided if RPMs are placed in front of both lenses in a 4-f optical setup, thereby protecting the DRPE from an impulse attack. We have performed numerical simulations to verify the proposed scheme. Resistance of this scheme is checked against the brute force and the impulse function attacks. The experimental results validate the feasibility of the scheme.
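The double random phase encoding that the four-mask 4-f scheme hardens can be simulated discretely in a few lines. This is a sketch of the classic two-mask DRPE building block only; the paper's additional masks in front of both lenses, which block the impulse attack, are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)

def random_phase_mask(shape, rng):
    # Unit-modulus random phase mask: exp(j*2*pi*u), u ~ U[0, 1)
    return np.exp(2j * np.pi * rng.random(shape))

def drpe_encrypt(img, m1, m2):
    """Classic DRPE: input-plane mask, Fourier transform, Fourier-plane
    mask, inverse transform (a discrete stand-in for the 4-f system)."""
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def drpe_decrypt(cipher, m1, m2):
    # Undo the Fourier-plane mask with its conjugate, then the input mask.
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(m2)) * np.conj(m1)

img = rng.random((64, 64))
m1 = random_phase_mask(img.shape, rng)
m2 = random_phase_mask(img.shape, rng)
cipher = drpe_encrypt(img, m1, m2)
recovered = np.real(drpe_decrypt(cipher, m1, m2))
```

The impulse attack exploits exactly this structure: encrypting a delta function makes the input-plane mask irrelevant and exposes m2 in the output, which is why the four-mask variant inserts masks before both lenses.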

  17. A novel root-index based prioritized random access scheme for 5G cellular networks

    Directory of Open Access Journals (Sweden)

    Taehoon Kim

    2015-12-01

Full Text Available Cellular networks will play an important role in realizing the newly emerging Internet-of-Everything (IoE). One of the challenging issues is to support the quality of service (QoS) during the access phase, while accommodating a massive number of machine nodes. In this paper, we show a new paradigm of multiple access priorities in the random access (RA) procedure and propose a novel root-index based prioritized random access (RIPRA) scheme that implicitly embeds the access priority in the root index of the RA preambles. The performance evaluation shows that the proposed RIPRA scheme can successfully support differentiated performance for different access priority levels, even though there exist a massive number of machine nodes.
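The idea of embedding priority in the preamble root index can be sketched as below. The partition of root indices among classes is a hypothetical example, not the paper's actual RIPRA mapping.

```python
import random

# Hypothetical partition of preamble root indices among priority classes.
ROOT_RANGES = {"high": range(0, 16), "mid": range(16, 48), "low": range(48, 128)}

def draw_preamble_root(priority, rng):
    """A node embeds its access priority implicitly by drawing its preamble
    root index from the range reserved for its class."""
    return rng.choice(list(ROOT_RANGES[priority]))

def detect_priority(root):
    """The base station recovers the priority from the detected root alone,
    with no extra signalling from the device."""
    for cls, rng_ in ROOT_RANGES.items():
        if root in rng_:
            return cls

rng = random.Random(0)
high_root = draw_preamble_root("high", rng)
low_root = draw_preamble_root("low", rng)
```

Because the priority is carried by the root index itself, differentiated treatment costs no additional message exchange during the access phase.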

  18. Research on electricity consumption forecast based on mutual information and random forests algorithm

    Science.gov (United States)

    Shi, Jing; Shi, Yunli; Tan, Jian; Zhu, Lei; Li, Hu

    2018-02-01

Traditional power forecasting models cannot efficiently take various factors into account, nor can they identify the relevant factors. In this paper, mutual information from information theory and the random forests algorithm from artificial intelligence are introduced into medium- and long-term electricity demand prediction. Mutual information can identify the highly related factors based on the average mutual information between a variety of variables and electricity demand; different industries may be highly associated with different variables. The random forests algorithm was used to build a forecasting model for each industry according to its correlated factors. Electricity consumption data from Jiangsu Province is taken as a practical example, and the above methods are compared with methods that disregard mutual information and industry differences. The simulation results show that the above method is scientific, effective, and provides higher prediction accuracy.
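The factor-screening step can be sketched with a histogram estimate of mutual information; an informative driver scores high while an unrelated one scores near zero. The variables and the bin count here are illustrative assumptions, not the paper's data or estimator.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) in nats: sum over cells of
    p(x,y) * log( p(x,y) / (p(x) p(y)) )."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(3)
driver = rng.random(5000)                              # e.g., an economic index
demand = 2.0 * driver + 0.1 * rng.standard_normal(5000)  # strongly driven demand
unrelated = rng.random(5000)                           # an irrelevant factor
mi_related = mutual_information(driver, demand)
mi_unrelated = mutual_information(driver, unrelated)
```

Factors scoring above a chosen cutoff would be kept as inputs to the per-industry random forest models.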

  19. Study on Stationarity of Random Load Spectrum Based on the Special Road

    Science.gov (United States)

    Yan, Huawen; Zhang, Weigong; Wang, Dong

    2017-09-01

One method for assessing the quality of special roads uses a wheel force sensor: the load spectrum of the vehicle is collected to reflect the quality of the road. According to the definition of a stochastic process, it is easy to see that the load spectrum is a stochastic process. However, the analysis methods and ranges of application of different random processes differ greatly, especially in engineering practice, which directly affects the design and development of the experiment. Therefore, determining the type of a random process has important practical significance. Based on an analysis of the digital characteristics of the road load spectrum, this paper determines that the road load spectrum in this experiment belongs to a stationary stochastic process, paving the way for follow-up modeling and feature extraction for the special road.
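A crude weak-stationarity check on such a record can be sketched as below: split the signal into segments and require the per-segment means and standard deviations to stay close to the global values. This heuristic is an assumption for illustration, not the paper's actual statistical procedure, and the tolerance is arbitrary.

```python
import numpy as np

def weakly_stationary(x, n_segments=8, tol=0.2):
    """Heuristic weak-stationarity check: per-segment first and second
    moments must stay within a tolerance band of the global ones."""
    x = np.asarray(x, dtype=float)
    segs = np.array_split(x, n_segments)
    means = np.array([s.mean() for s in segs])
    stds = np.array([s.std() for s in segs])
    scale = x.std()
    return bool((np.abs(means - x.mean()) < tol * scale).all()
                and (np.abs(stds - scale) < tol * scale).all())

rng = np.random.default_rng(11)
noise = rng.standard_normal(4000)                 # stationary-looking record
drifting = noise + np.linspace(0.0, 10.0, 4000)   # record with a trend
```

A drifting mean (e.g., a road whose roughness grows along the route) fails the check, signaling that stationary-process analysis tools should not be applied directly.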

  20. Analysis in nuclear power accident emergency based on random network and particle swarm optimization

    International Nuclear Information System (INIS)

    Gong Dichen; Fang Fang; Ding Weicheng; Chen Zhi

    2014-01-01

A GERT random network model of nuclear power accident emergency response was built in this paper, and intelligent computation was combined with the random network based on an analysis of the Fukushima nuclear accident in Japan. The emergency process was divided into series and parallel links, the parallel links being part of the series link. The overall allocation of resources was optimized first, and then the parallel links were analyzed. The effect of the resources used in the different links of the emergency was analyzed, and it is proposed that the corresponding particle velocity vector be limited under the condition of limited emergency resources. A resource-constrained particle swarm optimization was obtained by using a velocity projection matrix to correct the motion of the particles. The optimized allocation of resources in the emergency process was obtained, and the time consumption of nuclear power accident emergency response was reduced. (authors)
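The velocity-correction idea can be sketched for the simplest resource constraint, a fixed total. Projecting each particle's velocity onto the hyperplane sum(dx) = 0 with P = I - a a^T / (a^T a), a being the all-ones vector, keeps the position feasible after every move. This is a sketch of the projection-matrix idea under that assumed constraint, not the paper's exact matrix.

```python
import numpy as np

def project_velocity(v):
    """Project a PSO velocity onto the hyperplane sum(dx_i) = 0 so the
    particle's position keeps a fixed total-resource budget."""
    v = np.asarray(v, dtype=float)
    a = np.ones_like(v)
    return v - a * (a @ v) / v.size

x = np.array([3.0, 3.0, 4.0])        # current allocation, total budget 10
v = np.array([1.0, -0.5, 2.0])       # raw PSO velocity update
v_proj = project_velocity(v)
x_new = x + v_proj                   # still sums to the budget
```

Each swarm iteration would apply this correction after the usual inertia/cognitive/social velocity update, so the search never leaves the feasible allocation set.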

  1. Random Access for Machine-Type Communication based on Bloom Filtering

    DEFF Research Database (Denmark)

    Pratas, Nuno; Stefanovic, Cedomir; Madueño, Germán Corrales

    2016-01-01

We present a random access method inspired by Bloom filters that is suited for Machine-Type Communications (MTC). Each accessing device sends a signature during the contention process. A signature is constructed using the Bloom filtering method and contains information on the device identity and the connection establishment cause. We instantiate the proposed method over the current LTE-A access protocol. However, the method is applicable to a more general class of random access protocols that use preambles or other reservation sequences, as is expected to be the case in 5G systems. We show that our method utilizes the system resources more efficiently and achieves similar or lower latency of connection establishment in the case of synchronous arrivals, compared to the variant of the LTE-A access protocol that is optimized for MTC traffic. A dividend of the proposed method is that it allows the base station (BS)…
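The signature construction, hashing identity and establishment cause into k positions of an m-bit structure, can be sketched as below. The sizes, hash construction, and payload format are illustrative assumptions, not the paper's parameters.

```python
import hashlib

class BloomSignature:
    """Sketch of a Bloom-filter access signature: a device hashes its
    identity and connection establishment cause into k positions of an
    m-bit signature; the BS can then test membership without an explicit
    identity exchange (at the cost of possible false positives)."""

    def __init__(self, m=64, k=3):
        self.m, self.k, self.bits = m, k, [0] * m

    def _positions(self, payload):
        # k independent positions via salted SHA-256 (an assumption).
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{payload}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, payload):
        for p in self._positions(payload):
            self.bits[p] = 1

    def might_contain(self, payload):
        return all(self.bits[p] for p in self._positions(payload))

sig = BloomSignature()
sig.add("device-17:mobile-originated-data")   # hypothetical identity + cause
```

As with any Bloom filter, membership tests can yield false positives but never false negatives, which shapes the resource-efficiency/latency trade-off reported in the abstract.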

  2. History and measurement of the base and derived units

    CERN Document Server

    Treese, Steven A

    2018-01-01

    This book discusses how and why historical measurement units developed, and reviews useful methods for making conversions as well as situations in which dimensional analysis can be used. It starts from the history of length measurement, which is one of the oldest measures used by humans. It highlights the importance of area measurement, briefly discussing the methods for determining areas mathematically and by measurement. The book continues on to detail the development of measures for volume, mass, weight, time, temperature, angle, electrical units, amounts of substances, and light intensity. The seven SI/metric base units are highlighted, as well as a number of other units that have historically been used as base units. Providing a comprehensive reference for interconversion among the commonly measured quantities in the different measurement systems with engineering accuracy, it also examines the relationships among base units in fields such as mechanical/thermal, electromagnetic and physical flow rates and...

  3. A novel image encryption algorithm based on synchronized random bit generated in cascade-coupled chaotic semiconductor ring lasers

    Science.gov (United States)

    Li, Jiafu; Xiang, Shuiying; Wang, Haoning; Gong, Junkai; Wen, Aijun

    2018-03-01

In this paper, a novel image encryption algorithm based on the synchronization of physical random bits generated in a cascade-coupled semiconductor ring laser (CCSRL) system is proposed, and a security analysis is performed. In both the transmitter and receiver parts, the CCSRL system is a master-slave configuration consisting of a master semiconductor ring laser (M-SRL) with cross-feedback and a solitary SRL (S-SRL). The proposed image encryption algorithm includes image preprocessing based on conventional chaotic maps, pixel confusion based on a control matrix extracted from the physical random bits, and pixel diffusion based on a random bit stream extracted from the physical random bits. Firstly, the preprocessing method is used to eliminate the correlation between adjacent pixels. Secondly, physical random bits with verified randomness are generated based on chaos in the CCSRL system and are used to simultaneously generate the control matrix and the random bit stream. Finally, the control matrix and random bit stream are used in the encryption algorithm to change the positions and values of pixels, respectively. Simulation results and security analysis demonstrate that the proposed algorithm is effective and able to resist various typical attacks, and thus is an excellent candidate for secure image communication applications.
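The confusion (position shuffling) and diffusion (value masking) stages can be sketched as below. A keyed pseudorandom generator stands in for the physical random bits from the chaotic ring lasers, an assumption made so the sketch is self-contained; the preprocessing stage is omitted.

```python
import numpy as np

def encrypt(img, key_seed):
    """Sketch of confusion + diffusion: shuffle pixel positions with a
    keyed permutation (the 'control matrix' role), then XOR pixel values
    with a keyed byte stream (the 'random bit stream' role)."""
    rng = np.random.default_rng(key_seed)
    flat = img.ravel().copy()
    perm = rng.permutation(flat.size)                   # confusion
    stream = rng.integers(0, 256, flat.size, dtype=np.uint8)
    return flat[perm] ^ stream, perm, stream            # diffusion

def decrypt(cipher, perm, stream):
    flat = cipher ^ stream          # undo the diffusion
    out = np.empty_like(flat)
    out[perm] = flat                # invert the permutation
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
cipher, perm, stream = encrypt(img, key_seed=2024)
recovered = decrypt(cipher, perm, stream).reshape(8, 8)
```

In the paper the receiver regenerates the same permutation and stream from its synchronized laser rather than receiving them, which is what makes the physical synchronization the shared key.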

  4. A 1-year videoconferencing-based psychoeducational group intervention following bariatric surgery: results of a randomized controlled study.

    Science.gov (United States)

    Wild, Beate; Hünnemeyer, Katharina; Sauer, Helene; Hain, Bernhard; Mack, Isabelle; Schellberg, Dieter; Müller-Stich, Beat Peter; Weiner, Rudolf; Meile, Tobias; Rudofsky, Gottfried; Königsrainer, Alfred; Zipfel, Stephan; Herzog, Wolfgang; Teufel, Martin

    2015-01-01

For severely obese patients, bariatric surgery has been recommended as an effective therapy. The Bariatric Surgery and Education (BaSE) study aimed to assess the efficacy of a videoconferencing-based psychoeducational group intervention in patients after bariatric surgery. The BaSE study is a randomized, controlled multicenter clinical trial involving 117 patients undergoing bariatric surgery (mean preoperative body mass index [BMI] 49.9 kg/m(2), SD 6.4). Patients were enrolled between May 2009 and November 2012 and were randomly assigned to receive either conventional postsurgical visits or, in addition, a videoconferencing-based 1-year group program. Primary outcome measures were weight in kilograms, health-related quality of life (HRQOL), and general self-efficacy (GSE). Secondary outcome measures were depression symptoms and eating behavior. 94% of the patients completed the study. Mean weight loss for all patients was 45.9 kg (SD 16.4) 1 year after surgery (mean excess weight loss [EWL] 63%). Intention-to-treat analyses revealed no differences in weight loss, EWL, HRQOL, or self-efficacy between study groups at 1 year after surgery. However, patients with clinically significant depression symptoms (CSD) at baseline assigned to the intervention group (n = 29) had a significantly better HRQOL (P = .03), lower depression scores (P = .02), and a trend toward a better EWL (P = .06) 1 year after surgery compared with the control group (n = 20). We could not prove the efficacy of the group program for the whole study sample. However, results indicate that the intervention is effective for the important subgroup of patients with CSD. Copyright © 2015 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.

  5. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    Science.gov (United States)

    Pospisil, J.; Jakubik, P.; Machala, L.

    2005-11-01

This article reports the suggestion, realization and verification of a newly developed means of measuring the noiseless and locally shift-invariant modulation transfer function (MTF) of a digital video camera in the usual incoherent visible region of optical intensity, in particular of its combined imaging, detection, sampling and digitizing steps, which are influenced by the additive and spatially discrete photodetector, aliasing and quantization noises. The method relates to the still-camera automatic working regime and a static two-dimensional spatially continuous light-reflection random target with white-noise properties. The theoretical justification of the random-target method is also presented, exploiting the proposed simulation model of the linear optical intensity response and the possibility of expressing the resultant MTF as a normalized and smoothed ratio of the ascertainable output and input power spectral densities. The random-target and resultant image data were obtained and processed on a PC with computation programs developed on the basis of MATLAB 6.5. The presented examples and other results of the performed measurements demonstrate sufficient repeatability and acceptability of the described method for comparative evaluations of the performance of digital video cameras under various conditions.
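The power-spectral-density ratio at the heart of the method can be sketched as below: image a white-noise random target through a hypothetical camera and recover the camera's MTF as the square root of the output/input PSD ratio. The Gaussian MTF and all parameters are assumptions; the paper's smoothing, normalization, and noise handling are omitted.

```python
import numpy as np

rng = np.random.default_rng(9)
N = 256
target = rng.standard_normal((N, N))        # white-noise random target

# Hypothetical camera: a Gaussian low-pass MTF applied in the frequency domain.
fx = np.fft.fftfreq(N)
FX, FY = np.meshgrid(fx, fx)
true_mtf = np.exp(-(FX ** 2 + FY ** 2) * 200.0)
image = np.real(np.fft.ifft2(np.fft.fft2(target) * true_mtf))

# Estimate the MTF as the square root of the output/input PSD ratio.
psd_in = np.abs(np.fft.fft2(target)) ** 2
psd_out = np.abs(np.fft.fft2(image)) ** 2
est_mtf = np.sqrt(psd_out / psd_in)
```

In practice the input PSD is known from the target's white-noise design rather than measured, and the ratio must be smoothed to suppress the photodetector, aliasing, and quantization noise the abstract mentions.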

  6. A mindfulness-based stress prevention training for medical students (MediMind): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Kuhlmann, Sophie Merle; Bürger, Arne; Esser, Günter; Hammerle, Florian

    2015-02-08

    Medical training is very demanding and associated with a high prevalence of psychological distress. Compared to the general population, medical students are at greater risk of developing a psychological disorder. Various attempts at stress management training in medical school have achieved positive results in minimizing psychological distress; however, there are often limitations. Therefore, the use of a rigorous scientific method is needed. The present study protocol describes a randomized controlled trial to examine the effectiveness of a specifically developed mindfulness-based stress prevention training for medical students that includes selected elements of cognitive behavioral strategies (MediMind). This study protocol presents a prospective randomized controlled trial involving four assessment time points: baseline, post-intervention, one-year follow-up and five-year follow-up. The aims include evaluating the effect on stress, coping, psychological morbidity and personality traits with validated measures. Participants are allocated randomly to one of three conditions: MediMind, Autogenic Training or control group. Eligible participants are medical or dental students in the second or eighth semester of a German university. They form a population of approximately 420 students in each academic term. A final total sample size of 126 (at five-year follow-up) is targeted. The trainings (MediMind and Autogenic Training) comprise five weekly sessions lasting 90 minutes each. MediMind will be offered to participants of the control group once the five-year follow-up is completed. Allocation is randomized and stratified by course of study, semester, and gender. After descriptive statistics have been evaluated, inferential statistical analysis will be carried out with a repeated-measures ANOVA design with interactions between time and group. Effect sizes will be calculated using partial η-square values. Potential limitations of this study

  7. Mesoscale model response to random, surface-based perturbations — A sea-breeze experiment

    Science.gov (United States)

    Garratt, J. R.; Pielke, R. A.; Miller, W. F.; Lee, T. J.

    1990-09-01

    The introduction into a mesoscale model of random (in space) variations in roughness length, or random (in space and time) surface perturbations of temperature and friction velocity, produces a measurable, but barely significant, response in the simulated flow dynamics of the lower atmosphere. The perturbations are an attempt to include the effects of sub-grid variability into the ensemble-mean parameterization schemes used in many numerical models. Their magnitude is set in our experiments by appeal to real-world observations of the spatial variations in roughness length and daytime surface temperature over the land on horizontal scales of one to several tens of kilometers. With sea-breeze simulations, comparisons of a number of realizations forced by roughness-length and surface-temperature perturbations with the standard simulation reveal no significant change in ensemble mean statistics, and only small changes in the sea-breeze vertical velocity. Changes in the updraft velocity for individual runs, of up to several cm s-1 (compared to a mean of 14 cm s-1), are directly the result of prefrontal temperature changes of 0.1 to 0.2 K, produced by the random surface forcing. The correlation and magnitude of the changes are entirely consistent with a gravity-current interpretation of the sea breeze.

  8. Dispatching Plan Based on Route Optimization Model Considering Random Wind for Aviation Emergency Rescue

    Directory of Open Access Journals (Sweden)

    Ming Zhang

    2016-01-01

    Full Text Available Aviation emergency rescue is an effective means of natural disaster relief that is widely used in many countries. The dispatching plan of aviation emergency rescue guarantees the efficient implementation of this relief measure. A conventional dispatching plan that does not consider random wind factors leads to an imprecise quick-response scheme and serious safety issues. In this study, an aviation emergency rescue framework that considers the influence of random wind on flight trajectories is proposed. In this framework, the predicted wind information for a disaster area is updated by using unscented Kalman filtering technology. Then, considering the practical scheduling problem of aircraft emergency rescue, a multiobjective optimization model is formulated that maximizes relief-supply satisfaction and rescue-priority satisfaction while minimizing total rescue flight distance. Finally, the transport times of aircraft with and without the influence of random wind are analyzed on the basis of data from an earthquake disaster area. Results show that the proposed dispatching plan, which considers the constraints of updated wind speed and direction, is highly applicable in real operations.
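At its simplest, the influence of wind on transport time reduces to the classical wind-triangle relation between true airspeed, wind vector and ground speed. A minimal sketch with illustrative numbers (not data or the model from the study):

```python
import numpy as np

def ground_speed(airspeed, track_deg, wind_speed, wind_to_deg):
    """Ground speed along a desired track from the wind-triangle relation.
    wind_to_deg is the direction the wind blows TOWARD, in degrees from
    north; all speeds in km/h."""
    t, w = np.deg2rad(track_deg), np.deg2rad(wind_to_deg)
    track = np.array([np.sin(t), np.cos(t)])           # unit track vector
    wind = wind_speed * np.array([np.sin(w), np.cos(w)])
    tw = track @ wind                                  # along-track component
    disc = tw ** 2 + airspeed ** 2 - wind @ wind
    if disc < 0:
        raise ValueError("wind too strong to hold this track")
    return tw + np.sqrt(disc)

# Illustrative 200 km eastbound rescue leg at 180 km/h true airspeed.
dist = 200.0
t_still = dist / ground_speed(180.0, 90.0, 0.0, 0.0)    # no wind
t_head = dist / ground_speed(180.0, 90.0, 40.0, 270.0)  # 40 km/h headwind
t_tail = dist / ground_speed(180.0, 90.0, 40.0, 90.0)   # 40 km/h tailwind
```

A 40 km/h headwind on this leg lengthens the flight time noticeably more than the same tailwind shortens it, which is why ignoring wind biases a quick-response schedule.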

  9. Video- or text-based e-learning when teaching clinical procedures? A randomized controlled trial.

    Science.gov (United States)

    Buch, Steen Vigh; Treschow, Frederik Philip; Svendsen, Jesper Brink; Worm, Bjarne Skjødt

    2014-01-01

    This study investigated the effectiveness of two different levels of e-learning when teaching clinical skills to medical students. Sixty medical students were included and randomized into two comparable groups. The groups were given either a video-based or a text/picture-based e-learning module and subsequently underwent both theoretical and practical examination. A follow-up test was performed 1 month later. The students in the video group performed better than the illustrated text-based group in the practical examination, both in the primary test and in the follow-up test (P = 0.04). Video-based e-learning is superior to illustrated text-based e-learning when teaching certain practical clinical skills.

  10. School-based intervention to reduce anxiety in children: study protocol for a randomized controlled trial (PACES)

    Directory of Open Access Journals (Sweden)

    Stallard Paul

    2012-11-01

    Full Text Available Abstract Background Emotional problems such as anxiety and low mood in children are common, impair everyday functioning and increase the risk of severe mental health disorders in adulthood. Relatively few children with emotional health problems are identified and referred for treatment, indicating the need to investigate preventive approaches. Methods/Design The study is designed to be a pragmatic cluster randomized controlled trial evaluating the effectiveness of an efficacious school-based cognitive behavior therapy (CBT) prevention program (FRIENDS) on symptoms of anxiety and low mood in children 9 to 10 years of age. The unit of allocation is schools, which are assigned to one of three conditions: school-led FRIENDS, health-led FRIENDS or treatment as usual. Assessments will be undertaken at baseline, 6 months and 12 months. The primary outcome measure is change on the Revised Child Anxiety and Depression Scale. Secondary outcome measures assess changes in self-esteem, worries, bullying and life satisfaction. An economic evaluation will be undertaken. Discussion As of September 2011, 41 schools have been recruited and randomized. Final 12-month assessments are scheduled to be completed by May 2013. Trial Registration ISRCTN23563048

  11. Coordinate based random effect size meta-analysis of neuroimaging studies.

    Science.gov (United States)

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and coordinate based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by random-effects meta-analyses of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
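The cluster-wise analysis that ClusterZ performs builds on standard random-effects meta-analysis. A minimal sketch of the widely used DerSimonian-Laird estimator on illustrative effect sizes (not the ClusterZ implementation, which additionally handles censoring and FCDR control):

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooled estimate for per-study effects y with
    within-study variances v, using the DerSimonian-Laird tau^2."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se, tau2

# Illustrative standardized effects from five hypothetical studies.
mu, se, tau2 = dersimonian_laird([0.4, 0.7, 0.3, 0.9, 0.5],
                                 [0.04, 0.09, 0.05, 0.12, 0.06])
```

When the observed heterogeneity Q does not exceed its degrees of freedom, tau² is truncated to zero and the estimate coincides with the fixed-effect pooled mean.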

  12. ERROR DISTRIBUTION EVALUATION OF THE THIRD VANISHING POINT BASED ON RANDOM STATISTICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    C. Li

    2012-07-01

    Full Text Available POS, integrated by GPS / INS (Inertial Navigation Systems), has allowed rapid and accurate determination of position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper error distributions of vanishing points are studied and tested in order to substitute INS for MMS in some special land-based scenes, such as ground façades where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY), and how to set initial weights for the adjustment solution of single-image vanishing points is presented. Vanishing points are solved and their error distributions estimated with an iteration method using variable weights, the co-factor matrix and error-ellipse theory. Thirdly, under the condition of known error ellipses of two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for vanishing point coordinates and their error distributions are shown and analyzed.
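The triangle relationship exploited in the third step can be illustrated with the known property that, for three mutually orthogonal vanishing points, the principal point is the orthocenter of their triangle. A minimal Monte Carlo sketch with hypothetical coordinates and error ellipses (not the paper's implementation; camera distortion likewise ignored):

```python
import numpy as np

rng = np.random.default_rng(1)
P = np.array([320.0, 240.0])               # assumed principal point (pixels)

def third_vp(vx, vy, p=P):
    """Solve for VZ from the orthocenter property: for three mutually
    orthogonal vanishing points the principal point p is the orthocenter,
    so (p - vx).(vy - vz) = 0 and (p - vy).(vx - vz) = 0."""
    a = np.vstack([p - vx, p - vy])
    b = np.array([(p - vx) @ vy, (p - vy) @ vx])
    return np.linalg.solve(a, b)

# Hypothetical measured VX, VY with diagonal error-ellipse covariances.
vx0, cov_x = np.array([1500.0, 400.0]), np.diag([25.0, 16.0])
vy0, cov_y = np.array([-800.0, 500.0]), np.diag([36.0, 9.0])

# Monte Carlo propagation of the two error ellipses to VZ.
samples = np.array([third_vp(rng.multivariate_normal(vx0, cov_x),
                             rng.multivariate_normal(vy0, cov_y))
                    for _ in range(5000)])
vz_mean = samples.mean(axis=0)
vz_cov = np.cov(samples.T)                 # error distribution of VZ
```

The sample covariance of the propagated VZ points gives the error ellipse of the third vanishing point directly, without linearizing the geometry.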

  14. Can Team-Based Care Improve Patient Satisfaction? A Systematic Review of Randomized Controlled Trials

    Science.gov (United States)

    Wen, Jin; Schulman, Kevin A.

    2014-01-01

    Background Team-based approaches to patient care are a relatively recent innovation in health care delivery. The effectiveness of these approaches on patient outcomes has not been well documented. This paper reports a systematic review of the relationship between team-based care and patient satisfaction. Methods We searched MEDLINE, EMBASE, Cochrane Library, CINAHL, and PSYCHOINFO for eligible studies dating from inception to October 8, 2012. Eligible studies reported (1) a randomized controlled trial, (2) interventions including both team-based care and non-team-based care (or usual care), and (3) outcomes including an assessment of patient satisfaction. Articles with different settings between intervention and control were excluded, as were trial protocols. The reference lists of retrieved papers were also evaluated for inclusion. Results The literature search yielded 319 citations, of which 77 were screened for further full-text evaluation. Of these, 27 articles, reporting on 26 trials with a total of 15,526 participants, were included in the systematic review. The pooling result of dichotomous data (number of studies: 10) showed that team-based care had a positive effect on patient satisfaction compared with usual care (odds ratio, 2.09; 95% confidence interval, 1.54 to 2.84); however, combined continuous data (number of studies: 7) demonstrated that there was no significant difference in patient satisfaction between team-based care and usual care (standardized mean difference, −0.02; 95% confidence interval, −0.40 to 0.36). Conclusions Some evidence showed that team-based care is better than usual care in improving patient satisfaction. However, considering the pooling result of continuous data, along with the suboptimal quality of included trials, further large-scale and high-quality randomized controlled trials comparing team-based care and usual care are needed. PMID:25014674

  15. Measuring gas-residence times in large municipal incinerators, by means of a pseudo-random binary signal tracer technique

    International Nuclear Information System (INIS)

    Nasserzadeh, V.; Swithenbank, J.; Jones, B.

    1995-01-01

    The problem of measuring gas-residence time in large incinerators was studied by the pseudo-random binary sequence (PRBS) stimulus tracer response technique at the Sheffield municipal solid-waste incinerator (35 MW plant). The steady-state system was disturbed by the superimposition of small fluctuations in the form of a pseudo-random binary sequence of methane pulses, and the response of the incinerator was determined from the CO2 concentration in flue gases at the boiler exit, measured with a specially developed optical gas analyser with a high-frequency response. For data acquisition, an on-line PC was used together with the LAB Windows software system; the output response was then cross-correlated with the perturbation signal to give the impulse response of the incinerator. There was very good agreement between the gas-residence time for the Sheffield MSW incinerator as calculated by computational fluid dynamics (FLUENT model) and the gas-residence time at the plant as measured by the PRBS tracer technique. The results obtained from this research programme clearly demonstrate that the PRBS stimulus tracer response technique can be successfully and economically used to measure gas-residence times in large incinerator plants. It also suggests that the common commercial practice of characterising the incinerator operation by a single-residence-time parameter may lead to a misrepresentation of the complexities involved in describing the operation of the incineration system. (author)
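The principle of the PRBS technique is that, for a (nearly) white input sequence, the input-output cross-correlation is proportional to the system's impulse response, from which a mean residence time follows as the first moment. A minimal sketch with an assumed first-order mixing response standing in for the incinerator (illustrative only, not the plant data):

```python
import numpy as np

def prbs(nbits=10, taps=(10, 7)):
    """Maximal-length +/-1 pseudo-random binary sequence from a Fibonacci
    LFSR (x^10 + x^7 + 1 is primitive, so the period is 2^10 - 1)."""
    state = [1] * nbits
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(2 * state[-1] - 1)
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(out, float)

u = np.tile(prbs(), 8)                     # methane pulses, several periods
n = u.size

# Hypothetical plant: first-order mixing with a 12-sample time constant.
tau = 12.0
k = np.arange(80)
h_true = np.exp(-k / tau) / tau            # normalized impulse response
y = np.convolve(u, h_true)[:n]             # simulated CO2 response at exit

# Cross-correlating output with the PRBS input recovers the impulse
# response, up to the small -1/N offset of the m-sequence autocorrelation.
h_est = np.array([np.mean(y * np.roll(u, s)) for s in range(80)])
t_res = np.sum(k * h_est) / np.sum(h_est)  # mean gas-residence time estimate
```

Because the perturbation is tiny relative to the steady-state flow, the same correlation can be computed on a running plant without disturbing operation, which is the attraction of the method.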

  16. Performance-Based Measurement: Action for Organizations and HPT Accountability

    Science.gov (United States)

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  17. A Cluster Randomized Controlled Trial Testing the Effectiveness of Houvast: A Strengths-Based Intervention for Homeless Young Adults

    Science.gov (United States)

    Krabbenborg, Manon A. M.; Boersma, Sandra N.; van der Veld, William M.; van Hulst, Bente; Vollebergh, Wilma A. M.; Wolf, Judith R. L. M.

    2017-01-01

    Objective: To test the effectiveness of Houvast: a strengths-based intervention for homeless young adults. Method: A cluster randomized controlled trial was conducted with 10 Dutch shelter facilities randomly allocated to an intervention and a control group. Homeless young adults were interviewed when entering the facility and when care ended.…

  18. Can group-based reassuring information alter low back pain behavior? A cluster-randomized controlled trial

    DEFF Research Database (Denmark)

    Frederiksen, Pernille; Indahl, Aage; Andersen, Lars L

    2017-01-01

    -randomized controlled trial. METHODS: Publicly employed workers (n = 505) from 11 Danish municipality centers were randomized at center-level (cluster) to either intervention (two 1-hour group-based talks at the workplace) or control. The talks provided reassuring information together with a simple non

  19. A cluster randomized controlled trial testing the effectiveness of Houvast: A strengths-based intervention for homeless young adults

    NARCIS (Netherlands)

    Krabbenborg, M.A.M.; Boersma, S.N.; Veld, W.M. van der; Hulst, B. van; Vollebergh, W.A.M.; Wolf, J.R.L.M.

    2017-01-01

    Objective: To test the effectiveness of Houvast: a strengths-based intervention for homeless young adults. Method: A cluster randomized controlled trial was conducted with 10 Dutch shelter facilities randomly allocated to an intervention and a control group. Homeless young adults were interviewed

  20. Effectiveness of aquatic exercise and balneotherapy: a summary of systematic reviews based on randomized controlled trials of water immersion therapies.

    Science.gov (United States)

    Kamioka, Hiroharu; Tsutani, Kiichiro; Okuizumi, Hiroyasu; Mutoh, Yoshiteru; Ohta, Miho; Handa, Shuichi; Okada, Shinpei; Kitayuguchi, Jun; Kamada, Masamitsu; Shiozawa, Nobuyoshi; Honda, Takuya

    2010-01-01

    The objective of this review was to summarize findings on aquatic exercise and balneotherapy and to assess the quality of systematic reviews based on randomized controlled trials. Studies were eligible if they were systematic reviews based on randomized clinical trials (with or without a meta-analysis) that included at least 1 treatment group that received aquatic exercise or balneotherapy. We searched the following databases: Cochrane Database of Systematic Reviews, MEDLINE, CINAHL, Web of Science, JDream II, and Ichushi-Web for articles published from the year 1990 to August 17, 2008. We found evidence that aquatic exercise had small but statistically significant effects on pain relief and related outcome measures of locomotor diseases (eg, arthritis, rheumatoid diseases, and low back pain). However, long-term effectiveness was unclear. Because evidence was lacking due to the poor methodological quality of balneotherapy studies, we were unable to make any conclusions on the effects of intervention. There were frequent flaws regarding the description of excluded RCTs and the assessment of publication bias in several trials. Two of the present authors independently assessed the quality of articles using the AMSTAR checklist. Aquatic exercise had a small but statistically significant short-term effect on locomotor diseases. However, the effectiveness of balneotherapy in curing disease or improving health remains unclear.

  1. Subcopula-based measure of asymmetric association for contingency tables.

    Science.gov (United States)

    Wei, Zheng; Kim, Daeyoung

    2017-10-30

    For the analysis of a two-way contingency table, a new asymmetric association measure is developed. The proposed method uses the subcopula-based regression between the discrete variables to measure the asymmetric predictive powers of the variables of interest. Unlike the existing measures of asymmetric association, the subcopula-based measure is insensitive to the number of categories in a variable, and thus, the magnitude of the proposed measure can be interpreted as the degree of asymmetric association in the contingency table. The theoretical properties of the proposed subcopula-based asymmetric association measure are investigated. We illustrate the performance and advantages of the proposed measure using simulation studies and real data examples. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Effects of mobile phone-based app learning compared to computer-based web learning on nursing students: pilot randomized controlled trial.

    Science.gov (United States)

    Lee, Myung Kyung

    2015-04-01

    This study aimed to determine the effect of mobile-based discussion versus computer-based discussion on self-directed learning readiness, academic motivation, learner-interface interaction, and flow state. This randomized controlled trial was conducted at one university. Eighty-six nursing students who were able to use a computer, had home Internet access, and used a mobile phone were recruited. Participants were randomly assigned to either the mobile phone app-based discussion group (n = 45) or a computer web-based discussion group (n = 41). The effect was measured before and after an online discussion via self-reported surveys that addressed academic motivation, self-directed learning readiness, time distortion, learner-learner interaction, learner-interface interaction, and flow state. The change in extrinsic motivation on identified regulation in academic motivation (p = 0.011), as well as in independence and ability to use basic study skills (p = 0.047) and positive orientation to the future in self-directed learning readiness (p = 0.021), from pre-intervention to post-intervention was significantly more positive in the mobile phone app-based group than in the computer web-based discussion group. Interaction between learner and interface (p = 0.002), having clear goals (p = 0.012), and giving and receiving unambiguous feedback (p = 0.049) in flow state were significantly higher in the mobile phone app-based discussion group than in the computer web-based discussion group at post-test. The mobile phone might offer more valuable learning opportunities for discussion teaching and learning methods in terms of self-directed learning readiness, academic motivation, learner-interface interaction, and the flow state of the learning process compared to the computer.

  3. Modeling random telegraph signal noise in CMOS image sensor under low light based on binomial distribution

    International Nuclear Information System (INIS)

    Zhang Yu; Wang Guangyi; Lu Xinmiao; Hu Yongcai; Xu Jiangtao

    2016-01-01

    The random telegraph signal noise in the pixel source follower MOSFET is the principal component of the noise in the CMOS image sensor under low light. In this paper, a physical and statistical model of the random telegraph signal noise in the pixel source follower based on the binomial distribution is set up. The number of electrons captured or released by the oxide traps in unit time is described as a random variable obeying the binomial distribution. As a result, the output states and the corresponding probabilities of the first and second samples of the correlated double sampling circuit are acquired. The standard deviation of the output states after the correlated double sampling circuit can be obtained accordingly. In the simulation section, one hundred thousand samples of the source follower MOSFET have been simulated, and the simulation results show that the proposed model has similar statistical characteristics to the existing models under the effect of the channel length and the density of the oxide traps. Moreover, the noise histogram of the proposed model has been evaluated at different environmental temperatures. (paper)
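The single-trap special case of such a model can be sketched as follows: trap occupancy at the two correlated-double-sampling instants is simulated as Bernoulli trials (the n = 1 case of the binomial description), giving the three CDS output states and their standard deviation. All probabilities and the amplitude below are assumed for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix = 100_000            # simulated source-follower readouts
p_cap, p_emit = 0.3, 0.3   # assumed per-sample capture/emission probabilities
dv = 0.5                   # assumed RTS amplitude (mV) when the trap is filled

# Single-trap special case: the number of capture/release events per
# sampling interval is binomial; with one trap it reduces to a Bernoulli
# trial. occ1/occ2 are the trap occupancies at the two CDS instants.
occ1 = rng.random(n_pix) < p_cap / (p_cap + p_emit)   # stationary occupancy
flip = rng.random(n_pix) < np.where(occ1, p_emit, p_cap)
occ2 = np.where(flip, ~occ1, occ1)

# Correlated double sampling outputs the difference of the two samples.
cds = dv * (occ2.astype(float) - occ1.astype(float))
states, counts = np.unique(cds, return_counts=True)   # -dv, 0, +dv
rts_std = float(cds.std())                            # RTS noise after CDS
```

The histogram over `states` reproduces the characteristic three-peak shape of RTS noise after CDS; a multi-trap pixel would sum several such contributions with binomially distributed event counts.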

  4. An Improved Fast Compressive Tracking Algorithm Based on Online Random Forest Classifier

    Directory of Open Access Journals (Sweden)

    Xiong Jintao

    2016-01-01

    Full Text Available The fast compressive tracking (FCT) algorithm is a simple and efficient algorithm proposed in recent years, but it has difficulty dealing with factors such as occlusion, appearance changes and pose variation. The reasons are that, firstly, although the naive Bayes classifier is fast to train, it is not robust to noise, and secondly, its parameters must be tuned to each particular environment for accurate tracking. In this paper, we propose an improved fast compressive tracking algorithm based on an online random forest (FCT-ORF) for robust visual tracking. Firstly, we combine ideas from adaptive compressive sensing theory regarding the weighted random projection to exploit both local and discriminative information of the object. Secondly, an online random forest classifier is used for online tracking, which is adaptively more robust to noise and computationally efficient. The experimental results show that the proposed algorithm performs better than the fast compressive tracking algorithm under occlusion, appearance changes, and pose variation.

  5. Mindfulness-based cognitive therapy for multiple chemical sensitivity: a study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Hauge Christian Riise

    2012-09-01

    Full Text Available Abstract Background Multiple chemical sensitivity (MCS) is a condition characterized by recurrent, self-reported symptoms from multiple organ systems, attributable to exposure to a wide range of chemically unrelated substances at low levels. The pathophysiology is unknown, and affected individuals generally favor avoidance of the symptom-triggering substances as a coping strategy. The impact of MCS on daily life may thus be severe. An intervention that may effectively reduce the impact of MCS and alleviate the symptoms and the psychological distress associated with the condition is therefore highly needed. In this study we will assess the effects of a mindfulness-based cognitive therapy (MBCT) program on MCS. Methods/Design Using a randomized controlled trial (RCT) design, we will compare MBCT with treatment as usual (TAU). The MBCT intervention will include 8 weekly 2.5-hour sessions, and 45 minutes of mindfulness home practice 6 days each week. Participants will be asked to complete questionnaires at baseline, post-treatment, and at 6 and 12 months' follow-up. Based on sample size estimation, 82 participants will be randomized to either the MBCT intervention or to TAU. The primary outcome will be a measure of the impact of MCS on the participants' lives. The secondary outcome measures are physical symptoms of psychological distress, perceived stress, illness perceptions, QOL, and work ability. Lastly, we will assess whether any effect of MBCT on the primary effect measure is mediated by level of mindfulness, self-compassion, perceived stress, and rumination. Discussion This trial will provide important information on the effects of MBCT on MCS. Trial registration Clinical trials identifier NCT01240395

  6. [Effects of a neuropsychology program based on mindfulness on Alzheimer's disease: randomized double-blind clinical study].

    Science.gov (United States)

    Quintana Hernández, Domingo Jesús; Miró Barrachina, María Teresa; Ibáñez Fernández, Ignacio; del Pino, Angelo Santana; García Rodríguez, Javier; Hernández, Jaime Rojas

    2014-01-01

    The purpose of this research was to assess the effects of a mindfulness-based neuropsychological intervention on the clinical course of Alzheimer's disease. A two-year randomized and double-blind clinical trial was conducted on 127 probable Alzheimer's disease patients, according to the NINCDS-ADRDA criteria. Patients were grouped into three experimental groups (cognitive stimulation, progressive muscular relaxation, and mindfulness) plus a control group. All participants were receiving donepezil. Cognitive skills were assessed with CAMCOG and MMSE, the functional area with RDRS-2, and the NPI was used for psychopathology screening. Three treatment sessions per week were carried out for two years, and follow-up measurements were taken every six months. The global cognitive function, functionality and behavioral disorder measurements indicated that patients from the experimental group based on mindfulness were stable during the two years, while patients from the control group, as well as the other experimental groups, showed a mild but significant worsening of their mental capacities. The mindfulness-based neuropsychological program showed better cognitive and functional stability, as well as significant improvement in the psychopathological condition, of mild to moderate Alzheimer's patients. These results support the idea that a mindfulness-based intervention can produce a clinically relevant improvement in the treatment of dementia. More research is needed to confirm these data. Copyright © 2013 SEGG. Published by Elsevier España. All rights reserved.

  7. Randomized Trial of ConquerFear: A Novel, Theoretically Based Psychosocial Intervention for Fear of Cancer Recurrence.

    Science.gov (United States)

    Butow, Phyllis N; Turner, Jane; Gilchrist, Jemma; Sharpe, Louise; Smith, Allan Ben; Fardell, Joanna E; Tesson, Stephanie; O'Connell, Rachel; Girgis, Afaf; Gebski, Val J; Asher, Rebecca; Mihalopoulos, Cathrine; Bell, Melanie L; Zola, Karina Grunewald; Beith, Jane; Thewes, Belinda

    2017-12-20

    Purpose Fear of cancer recurrence (FCR) is prevalent, distressing, and long lasting. This study evaluated the impact of a theoretically/empirically based intervention (ConquerFear) on FCR. Methods Eligible survivors had curable breast or colorectal cancer or melanoma, had completed treatment (not including endocrine therapy) 2 months to 5 years previously, were age > 18 years, and had scores above the clinical cutoff on the FCR Inventory (FCRI) severity subscale at screening. Participants were randomly assigned at a one-to-one ratio to either five face-to-face sessions of ConquerFear (attention training, metacognitions, acceptance/mindfulness, screening behavior, and values-based goal setting) or an attention control (Taking-it-Easy relaxation therapy). Participants completed questionnaires at baseline (T0), immediately post-therapy (T1), and 3 (T2) and 6 months (T3) later. The primary outcome was FCRI total score. Results Of 704 potentially eligible survivors from 17 sites and two online databases, 533 were contactable, of whom 222 (42%) consented; 121 were randomly assigned to intervention and 101 to control. Study arms were equivalent at baseline on all measured characteristics. ConquerFear participants had clinically and statistically greater improvements than control participants from T0 to T1 on FCRI total and subscales (including psychological distress and triggers), as well as in general anxiety, cancer-specific distress (total), and mental quality of life and metacognitions (total). Differences in FCRI psychological distress and cancer-specific distress (total) remained significantly different at T3. Conclusion This randomized trial demonstrated efficacy of ConquerFear compared with attention control (Taking-it-Easy) in reduction of FCRI total scores immediately post-therapy and 3 and 6 months later and in many secondary outcomes immediately post-therapy. Cancer-specific distress (total) remained more improved at 3- and 6-month follow-up.

  8. Calibrated delivery drape versus indirect gravimetric technique for the measurement of blood loss after delivery: a randomized trial.

    Science.gov (United States)

    Ambardekar, Shubha; Shochet, Tara; Bracken, Hillary; Coyaji, Kurus; Winikoff, Beverly

    2014-08-15

    Trials of interventions for PPH prevention and treatment rely on different measurement methods for the quantification of blood loss and identification of PPH. This study's objective was to compare measures of blood loss obtained from two different measurement protocols frequently used in studies. Nine hundred women presenting for vaginal delivery were randomized to a direct method (a calibrated delivery drape) or an indirect method (a shallow bedpan placed below the buttocks and weighing the collected blood and blood-soaked gauze/pads). Blood loss was measured from immediately after delivery for at least one hour or until active bleeding stopped. Significantly greater mean blood loss was recorded by the direct than by the indirect measurement technique (253.9 mL and 195.3 mL, respectively; difference = 58.6 mL (95% CI: 31-86); p 500 mL (8.7% vs. 4.7%, p = 0.02). The study suggests a real and significant difference in blood loss measurement between these methods. Research using blood loss measurement as an endpoint needs to be interpreted taking measurement technique into consideration. This study has been registered at clinicaltrials.gov as NCT01885845.

  9. Dynamic approach to space and habitat use based on biased random bridges.

    Directory of Open Access Journals (Sweden)

    Simon Benhamou

Full Text Available BACKGROUND: Although habitat use reflects a dynamic process, most studies assess habitat use statically as if an animal's successively recorded locations reflected a point rather than a movement process. By relying on the activity time between successive locations instead of the local density of individual locations, movement-based methods can substantially improve the biological relevance of utilization distribution (UD) estimates (i.e. the relative frequencies with which an animal uses the various areas of its home range, HR). One such method rests on Brownian bridges (BBs). Its theoretical foundation (purely and constantly diffusive movements) is paradoxically inconsistent with both HR settlement and habitat selection. An alternative involves movement-based kernel density estimation (MKDE) through location interpolation, which may be applied to various movement behaviours but lacks a sound theoretical basis. METHODOLOGY/PRINCIPAL FINDINGS: I introduce the concept of a biased random (advective-diffusive) bridge (BRB) and show that the MKDE method is a practical means to estimate UDs based on simplified (isotropically diffusive) BRBs. The equation governing BRBs is constrained by the maximum delay between successive relocations warranting constant within-bridge advection (allowed to vary between bridges) but remains otherwise similar to the BB equation. Despite its theoretical inconsistencies, the BB method can therefore be applied to animals that regularly reorientate within their HRs and adapt their movements to the habitats crossed, provided that they were relocated with a high enough frequency. CONCLUSIONS/SIGNIFICANCE: Biased random walks can approximate various movement types at short times from a given relocation. Their simplified form constitutes an effective trade-off between too simple, unrealistic movement models, such as Brownian motion, and more sophisticated and realistic ones, such as biased correlated random walks (BCRWs), which are too

  10. Pigmented skin lesion detection using random forest and wavelet-based texture

    Science.gov (United States)

    Hu, Ping; Yang, Tie-jun

    2016-10-01

The incidence of cutaneous malignant melanoma, a disease of worldwide distribution and the deadliest form of skin cancer, has been increasing rapidly over the last few decades. Because advanced cutaneous melanoma is still incurable, early detection is an important step toward a reduction in mortality. Dermoscopy photographs are commonly used in melanoma diagnosis and can capture detailed features of a lesion. Great variability exists in the visual appearance of pigmented skin lesions. Therefore, in order to minimize the diagnostic errors that result from the difficulty and subjectivity of visual interpretation, an automatic detection approach is required. The objectives of this paper were to propose a hybrid method using random forests and the Gabor wavelet transform to accurately differentiate lesion from non-lesion areas in dermoscopy photographs, and to analyze segmentation accuracy. A random forest classifier consisting of a set of decision trees was used for classification. Gabor wavelets are a mathematical model of the visual cortical cells of the mammalian brain, and an image can be decomposed into multiple scales and multiple orientations by using them. The Gabor function has been recognized as a very useful tool in texture analysis, due to its optimal localization properties in both the spatial and frequency domains. Texture features are computed from the Gabor-filtered image. Experimental results indicate the following: (1) the proposed random forest-based algorithm outperformed the state of the art in pigmented skin lesion detection, and (2) the inclusion of Gabor wavelet-based texture features significantly improved segmentation accuracy.
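The pipeline described in this record (Gabor texture features feeding a random forest that labels each pixel as lesion or background) can be sketched roughly as follows. The kernel parameters, the synthetic test image, and the per-pixel feature set are illustrative assumptions, not the authors' configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def gabor_kernel(size=15, freq=0.2, theta=0.0, sigma=3.0):
    """Real part of a Gabor kernel (illustrative parameters)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def texture_features(image, thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Per-pixel feature vector: intensity plus |Gabor response| per orientation."""
    feats = [image]
    for theta in thetas:
        k = gabor_kernel(theta=theta)
        pad = np.zeros_like(image)
        pad[:k.shape[0], :k.shape[1]] = k
        # circular FFT convolution; the half-kernel shift does not matter here
        resp = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(pad)))
        feats.append(np.abs(resp))
    return np.stack(feats, axis=-1).reshape(-1, len(feats))

# toy "dermoscopy" image: a darker textured disc (lesion) on a brighter background
rng = np.random.default_rng(0)
h = w = 64
yy, xx = np.mgrid[:h, :w]
lesion = (yy - 32)**2 + (xx - 32)**2 < 14**2
image = np.where(lesion, 0.3, 0.8) + 0.05 * rng.standard_normal((h, w))

X = texture_features(image)
y = lesion.reshape(-1).astype(int)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict(X).reshape(h, w)
accuracy = (pred == lesion).mean()
```

In a real setting the forest would of course be trained on labeled lesions and evaluated on held-out images rather than on the training image itself.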

  11. Performance of medical residents in sterile techniques during central vein catheterization: randomized trial of efficacy of simulation-based training.

    Science.gov (United States)

    Khouli, Hassan; Jahnes, Katherine; Shapiro, Janet; Rose, Keith; Mathew, Joseph; Gohil, Amit; Han, Qifa; Sotelo, Andre; Jones, James; Aqeel, Adnan; Eden, Edward; Fried, Ethan

    2011-01-01

Catheter-related bloodstream infection (CRBSI) is a preventable cause of a potentially lethal ICU infection. The optimal method to teach health-care providers correct sterile techniques during central vein catheterization (CVC) remains unclear. We randomly assigned second- and third-year internal medicine residents trained by a traditional apprenticeship model to simulation-based plus video training or video training alone from December 2007 to January 2008, with a follow-up period to examine CRBSI ending in July 2009. During the follow-up period, a simulation-based training program in sterile techniques during CVC was implemented in the medical ICU (MICU). A surgical ICU (SICU) where no residents received study interventions was used for comparison. The primary outcome measures were median resident scores in sterile techniques and rates of CRBSI per 1,000 catheter-days. Of the 47 enrolled residents, 24 were randomly assigned to the simulation-based plus video training group and 23 to the video training group. Median baseline scores in both groups were equally poor: 12.5 to 13 (52%-54%) out of a maximum score of 24 (P = .95; median difference, 0; 95% CI, 0.2-2.0). After training, the median score was significantly higher for the simulation-based plus video training group: 22 (92%) vs 18 (75%) for the video training group (P training in sterile techniques during CVC is superior to traditional training or video training alone and is associated with a decreased rate of CRBSI. Simulation-based training in CVC should be routinely used to reduce iatrogenic risk. ClinicalTrials.gov; No.: NCT00612131; URL: clinicaltrials.gov.

  12. A stylistic classification of Russian-language texts based on the random walk model

    Science.gov (United States)

    Kramarenko, A. A.; Nekrasov, K. A.; Filimonov, V. V.; Zhivoderov, A. A.; Amieva, A. A.

    2017-09-01

A formal approach to text analysis based on the random walk model is suggested. The frequencies and reciprocal positions of the vowel letters are matched with a process of quasi-particle migration. A statistically significant difference in the migration parameters for texts of different functional styles is found, demonstrating that texts can be classified using the suggested method. Five groups of texts are singled out that can be distinguished from one another by the parameters of the quasi-particle migration process.
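The abstract does not spell out the migration model. As a hedged illustration of the general idea, one can map letters to walk steps and compare simple walk statistics between texts; the vowel set (an English stand-in for the Russian alphabet) and the +1/-1 step mapping are assumptions of this sketch, not the authors' construction:

```python
import numpy as np

VOWELS = set("aeiouy")  # English stand-in for the Russian vowel letters

def vowel_walk(text):
    """Map a text to a walk: +1 on a vowel letter, -1 on a consonant."""
    steps = [1 if ch in VOWELS else -1 for ch in text.lower() if ch.isalpha()]
    return np.cumsum(steps)

def walk_parameters(text):
    """Crude 'migration' parameters: drift and dispersion of the walk increments."""
    walk = vowel_walk(text)
    inc = np.diff(walk, prepend=0)
    return inc.mean(), inc.std()

# two short samples in different functional styles
legal = "whereas the party of the first part shall hereinafter be referred to as"
verse = "o wild west wind thou breath of autumns being thou from whose unseen presence"
drift_a, disp_a = walk_parameters(legal)
drift_b, disp_b = walk_parameters(verse)
```

With enough text per sample, such per-style parameter vectors could feed any standard classifier.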

  13. Introducing two Random Forest based methods for cloud detection in remote sensing images

    Science.gov (United States)

    Ghasemian, Nafiseh; Akhoondzadeh, Mehdi

    2018-07-01

Cloud detection is a necessary phase in satellite image processing to retrieve atmospheric and lithospheric parameters. Currently, some cloud detection methods based on the Random Forest (RF) model have been proposed, but they do not consider both the spectral and textural characteristics of the image. Furthermore, they have not been tested in the presence of snow/ice. In this paper, we introduce two RF-based algorithms, Feature Level Fusion Random Forest (FLFRF) and Decision Level Fusion Random Forest (DLFRF), to incorporate visible, infrared (IR) and thermal spectral and textural features (FLFRF), including the Gray Level Co-occurrence Matrix (GLCM) and Robust Extended Local Binary Pattern (RELBP_CI), or visible, IR and thermal classifiers (DLFRF), for highly accurate cloud detection on remote sensing images. FLFRF first fuses the visible, IR and thermal features. Thereafter, it uses the RF model to classify pixels as cloud, snow/ice and background, or thick cloud, thin cloud and background. DLFRF considers the visible, IR and thermal features (both spectral and textural) separately and inserts each set of features into an RF model. Then, it holds the vote matrix of each run of the model. Finally, it fuses the classifiers using the majority vote method. To demonstrate the effectiveness of the proposed algorithms, 10 Terra MODIS and 15 Landsat 8 OLI/TIRS images with different spatial resolutions are used in this paper. Quantitative analyses are based on manually selected ground truth data. Results show that adding RELBP_CI to the input feature set improves cloud detection accuracy. Also, the average cloud kappa values of FLFRF and DLFRF on MODIS images (1 and 0.99) are higher than those of other machine learning methods: Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), K Nearest Neighbor (KNN) and Support Vector Machine (SVM) (0.96). The average snow/ice kappa values of FLFRF and DLFRF on MODIS images (1 and 0.85) are higher than those of other traditional methods. The
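The contrast between decision-level fusion (one forest per feature block, majority vote) and feature-level fusion (one forest on the concatenated features) can be sketched compactly. Synthetic data stands in for the spectral/textural feature blocks, and the block boundaries and hyperparameters are illustrative only:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# synthetic stand-in: 30 features split into "visible", "IR" and "thermal" blocks
X, y = make_classification(n_samples=600, n_features=30, n_informative=12,
                           n_classes=3, random_state=0)
blocks = [slice(0, 10), slice(10, 20), slice(20, 30)]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# decision-level fusion (DLFRF-style): one RF per block, then majority vote
forests = [RandomForestClassifier(n_estimators=100, random_state=0)
           .fit(X_tr[:, b], y_tr) for b in blocks]
votes = np.stack([f.predict(X_te[:, b]) for f, b in zip(forests, blocks)])
fused = np.array([np.bincount(col).argmax() for col in votes.T])

# feature-level fusion (FLFRF-style): one RF on all features at once
flf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

acc_dlf = (fused == y_te).mean()
acc_flf = (flf.predict(X_te) == y_te).mean()
```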

  14. HIGH QUALITY FACADE SEGMENTATION BASED ON STRUCTURED RANDOM FOREST, REGION PROPOSAL NETWORK AND RECTANGULAR FITTING

    Directory of Open Access Journals (Sweden)

    K. Rahmani

    2018-05-01

Full Text Available In this paper we present a pipeline for high quality semantic segmentation of building facades using a Structured Random Forest (SRF), a Region Proposal Network (RPN) based on a Convolutional Neural Network (CNN), as well as rectangular fitting optimization. Our main contribution is that we employ features created by the RPN as channels in the SRF. We empirically show that this is very effective, especially for doors and windows. Our pipeline is evaluated on two datasets where we outperform current state-of-the-art methods. Additionally, we quantify the contribution of the RPN and the rectangular fitting optimization to the accuracy of the result.

  15. Information hiding based on double random-phase encoding and public-key cryptography.

    Science.gov (United States)

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the ciphertext and the two phase masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience to efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
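The DRPE half of the scheme is a standard optical construction and can be sketched with FFTs alone: one random phase mask in the input plane, a second in the Fourier plane. This sketch omits the RSA key-transport part entirely, and the image and mask sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def drpe_encrypt(img, phase1, phase2):
    """Double random-phase encoding: multiply by a phase mask in the input
    plane, then by a second mask in the Fourier plane."""
    m1 = np.exp(2j * np.pi * phase1)
    m2 = np.exp(2j * np.pi * phase2)
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def drpe_decrypt(cipher, phase2):
    """Undo the Fourier-plane mask; the input-plane phase factor drops out
    when the amplitude of the (real, non-negative) image is taken."""
    m2 = np.exp(2j * np.pi * phase2)
    return np.abs(np.fft.ifft2(np.fft.fft2(cipher) * np.conj(m2)))

img = rng.random((32, 32))   # stand-in for the plaintext image
p1 = rng.random((32, 32))    # the two random phase keys
p2 = rng.random((32, 32))

cipher = drpe_encrypt(img, p1, p2)      # complex, noise-like field
recovered = drpe_decrypt(cipher, p2)
```

In the paper's scheme it is the two phase keys (and the ciphertext) that would be carried to the receiver under RSA.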

  16. Using rapidly-exploring random tree-based algorithms to find smooth and optimal trajectories

    CSIR Research Space (South Africa)

    Matebese, B

    2012-10-01

Full Text Available Using rapidly-exploring random tree-based algorithms to find smooth and optimal trajectories ... and complex environments. The RRT algorithm is the most popular and has the ability to find a feasible solution faster than other algorithms. The drawback of using RRT is that, as the number of samples increases, the probability that the algorithm converges...
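A minimal RRT on an obstacle-free 2-D square illustrates the sampling loop the record refers to. The step size, bounds, iteration budget and goal tolerance are arbitrary illustrative choices, and real planners add collision checking and (for RRT*) rewiring:

```python
import numpy as np

rng = np.random.default_rng(0)

def rrt(start, goal, step=0.5, iters=4000, goal_tol=0.5, bounds=(0.0, 10.0)):
    """Minimal 2-D RRT on an obstacle-free square; returns a path or None."""
    nodes = [np.asarray(start, float)]
    parents = [-1]
    for _ in range(iters):
        sample = rng.uniform(*bounds, size=2)
        # extend the tree from the nearest existing node toward the sample
        i = int(np.argmin([np.linalg.norm(n - sample) for n in nodes]))
        direction = sample - nodes[i]
        new = nodes[i] + step * direction / (np.linalg.norm(direction) + 1e-12)
        nodes.append(new)
        parents.append(i)
        if np.linalg.norm(new - goal) < goal_tol:
            # walk back up the tree to recover the path
            path, j = [], len(nodes) - 1
            while j != -1:
                path.append(nodes[j])
                j = parents[j]
            return path[::-1]
    return None

path = rrt(start=(1.0, 1.0), goal=np.array([9.0, 9.0]))
```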

  17. Phase-only asymmetric optical cryptosystem based on random modulus decomposition

    Science.gov (United States)

    Xu, Hongfeng; Xu, Wenhui; Wang, Shuaihua; Wu, Shaofan

    2018-06-01

    We propose a phase-only asymmetric optical cryptosystem based on random modulus decomposition (RMD). The cryptosystem is presented for effectively improving the capacity to resist various attacks, including the attack of iterative algorithms. On the one hand, RMD and phase encoding are combined to remove the constraints that can be used in the attacking process. On the other hand, the security keys (geometrical parameters) introduced by Fresnel transform can increase the key variety and enlarge the key space simultaneously. Numerical simulation results demonstrate the strong feasibility, security and robustness of the proposed cryptosystem. This cryptosystem will open up many new opportunities in the application fields of optical encryption and authentication.

  18. Fault diagnosis in spur gears based on genetic algorithm and random forest

    Science.gov (United States)

    Cerrada, Mariela; Zurita, Grover; Cabrera, Diego; Sánchez, René-Vinicio; Artés, Mariano; Li, Chuan

    2016-03-01

There are growing demands for condition-based monitoring of gearboxes, and therefore new methods to improve the reliability, effectiveness and accuracy of gear fault detection ought to be evaluated. Feature selection is still an important aspect of machine learning-based diagnosis in order to reach good performance in the diagnostic models. On the other hand, random forest classifiers are suitable models for industrial environments where large data samples are not usually available for training such diagnostic models. The main aim of this research is to build a robust system for multi-class fault diagnosis in spur gears by selecting the best set of condition parameters in the time, frequency and time-frequency domains, which are extracted from vibration signals. The diagnostic system is built using genetic algorithms and a classifier based on random forest, in a supervised environment. The original set of condition parameters is reduced by around 66% of its initial size by using genetic algorithms, while still achieving an acceptable classification precision over 97%. The approach is tested on real vibration signals by considering several fault classes, one of them being an incipient fault, under different running conditions of load and velocity.
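The combination of a genetic algorithm for feature selection with a random-forest fitness function can be sketched as below. The GA operators, population size, mutation rate and synthetic data are illustrative assumptions, not the authors' setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# synthetic stand-in for condition parameters extracted from vibration signals
X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           random_state=0)

def fitness(mask):
    """Cross-validated RF accuracy on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = RandomForestClassifier(n_estimators=30, random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

# tiny genetic algorithm: bit-mask individuals, tournament selection,
# uniform crossover, bit-flip mutation, one elite kept per generation
pop = rng.integers(0, 2, size=(12, X.shape[1]))
for generation in range(5):
    scores = np.array([fitness(ind) for ind in pop])
    next_pop = [pop[scores.argmax()].copy()]          # elitism
    while len(next_pop) < len(pop):
        a, b = rng.choice(len(pop), size=2, replace=False)
        p1 = pop[a] if scores[a] >= scores[b] else pop[b]
        c, d = rng.choice(len(pop), size=2, replace=False)
        p2 = pop[c] if scores[c] >= scores[d] else pop[d]
        cross = rng.integers(0, 2, size=p1.shape).astype(bool)
        child = np.where(cross, p1, p2)
        flip = rng.random(child.shape) < 0.05          # mutation
        child = np.where(flip, 1 - child, child)
        next_pop.append(child)
    pop = np.array(next_pop)

best = pop[np.argmax([fitness(ind) for ind in pop])]
n_selected = int(best.sum())
```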

  19. Adaptive Markov Random Fields for Example-Based Super-resolution of Faces

    Science.gov (United States)

    Stephenson, Todd A.; Chen, Tsuhan

    2006-12-01

    Image enhancement of low-resolution images can be done through methods such as interpolation, super-resolution using multiple video frames, and example-based super-resolution. Example-based super-resolution, in particular, is suited to images that have a strong prior (for those frameworks that work on only a single image, it is more like image restoration than traditional, multiframe super-resolution). For example, hallucination and Markov random field (MRF) methods use examples drawn from the same domain as the image being enhanced to determine what the missing high-frequency information is likely to be. We propose to use even stronger prior information by extending MRF-based super-resolution to use adaptive observation and transition functions, that is, to make these functions region-dependent. We show with face images how we can adapt the modeling for each image patch so as to improve the resolution.
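The example-lookup step that supplies the MRF's observation (unary) term can be sketched as follows. The pairwise compatibility term between neighbouring patches, which the full method also needs, is omitted here, and the degradation model is a hypothetical 2x2 averaging blur:

```python
import numpy as np

rng = np.random.default_rng(0)

def patches(img, size=4):
    """All non-overlapping size x size patches, flattened row-major."""
    h, w = img.shape
    ps = [img[i:i+size, j:j+size].ravel()
          for i in range(0, h - size + 1, size)
          for j in range(0, w - size + 1, size)]
    return np.array(ps)

# training pair: a high-res image and its blurred "low-res" counterpart
hi = rng.random((32, 32))
lo = 0.25 * (hi + np.roll(hi, 1, 0) + np.roll(hi, 1, 1)
             + np.roll(hi, (1, 1), (0, 1)))

train_lo, train_hi = patches(lo), patches(hi)

def enhance(img, size=4):
    """For each low-res patch, paste the high-res example whose low-res
    counterpart matches best; this is the MRF observation term only, with
    no smoothness constraint between neighbouring patches."""
    out = img.copy()
    h, w = img.shape
    for i in range(0, h - size + 1, size):
        for j in range(0, w - size + 1, size):
            p = img[i:i+size, j:j+size].ravel()
            k = int(np.argmin(((train_lo - p) ** 2).sum(axis=1)))
            out[i:i+size, j:j+size] = train_hi[k].reshape(size, size)
    return out

restored = enhance(lo)
err_before = np.abs(lo - hi).mean()
err_after = np.abs(restored - hi).mean()
```

The adaptive variant proposed in the paper would additionally make the observation and transition functions depend on the region of the face being reconstructed.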

  20. Simulation-based camera navigation training in laparoscopy-a randomized trial

    DEFF Research Database (Denmark)

    Nilsson, Cecilia; Sørensen, Jette Led; Konge, Lars

    2017-01-01

    patient safety. The objectives of this trial were to examine how to train laparoscopic camera navigation and to explore the transfer of skills to the operating room. MATERIALS AND METHODS: A randomized, single-center superiority trial with three groups: The first group practiced simulation-based camera...... navigation tasks (camera group), the second group practiced performing a simulation-based cholecystectomy (procedure group), and the third group received no training (control group). Participants were surgical novices without prior laparoscopic experience. The primary outcome was assessment of camera.......033), had a higher score. CONCLUSIONS: Simulation-based training improves the technical skills required for camera navigation, regardless of practicing camera navigation or the procedure itself. Transfer to the clinical setting could, however, not be demonstrated. The control group demonstrated higher...

  1. Adaptive Markov Random Fields for Example-Based Super-resolution of Faces

    Directory of Open Access Journals (Sweden)

    Stephenson Todd A

    2006-01-01

Full Text Available Image enhancement of low-resolution images can be done through methods such as interpolation, super-resolution using multiple video frames, and example-based super-resolution. Example-based super-resolution, in particular, is suited to images that have a strong prior (for those frameworks that work on only a single image, it is more like image restoration than traditional, multiframe super-resolution). For example, hallucination and Markov random field (MRF) methods use examples drawn from the same domain as the image being enhanced to determine what the missing high-frequency information is likely to be. We propose to use even stronger prior information by extending MRF-based super-resolution to use adaptive observation and transition functions, that is, to make these functions region-dependent. We show with face images how we can adapt the modeling for each image patch so as to improve the resolution.

  2. SEM based overlay measurement between resist and buried patterns

    Science.gov (United States)

    Inoue, Osamu; Okagawa, Yutaka; Hasumi, Kazuhisa; Shao, Chuanyu; Leray, Philippe; Lorusso, Gian; Baudemprez, Bart

    2016-03-01

With the continuous shrink in pattern size and increased density, overlay control has become one of the most critical issues in semiconductor manufacturing. Recently, SEM-based overlay of AEI (After Etch Inspection) wafers has been used as a reference for, and for optimization of, optical overlay (both Image Based Overlay (IBO) and Diffraction Based Overlay (DBO)). Overlay measurement at the AEI stage helps monitor and forecast yield after etch formation and calibrate optical measurement tools; however, those overlay values are difficult to feed back directly to a scanner. Therefore, there is a clear need for SEM-based overlay measurements of ADI (After Develop Inspection) wafers in order to serve as a reference for optical overlay and make the necessary corrections before wafers go to etch. Furthermore, to make the corrections as accurate as possible, actual device-like feature dimensions need to be measured post-ADI. This device-size measurement, which can be performed over a very small area, is currently a unique capability of the CD-SEM. In this study, we assess SEM-based overlay measurement of ADI and AEI wafers by using a sample from an N10 process flow. First, we demonstrate SEM-based overlay performance at AEI by using a dual damascene process for the via 0 (V0) and metal 1 (M1) layers. We also discuss overlay measurements between the litho-etch-litho stages of a triple-patterned M1 layer and a double-patterned V0. Second, to illustrate the complexities in image acquisition and measurement, we measure overlay between the M1B resist and the buried M1A hard-mask trench. Finally, we show how a high accelerating voltage can detect buried pattern information by BSE (Backscattered Electron). In this paper we discuss the merits of this method versus standard optical metrology based corrections.

  3. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  4. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    Science.gov (United States)

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
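A rough sketch of the idea: compare two skewed groups after a common Box-Cox transformation, then back-transform the group medians to the original scale for interpretation. The simulated data and the use of a single pooled lambda are illustrative assumptions, not the authors' estimator (which also handles covariance adjustment and standard errors):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# skewed measurements for two hypothetical treatment arms
group_a = rng.lognormal(mean=1.0, sigma=0.6, size=200)
group_b = rng.lognormal(mean=1.3, sigma=0.6, size=200)

# fit a common Box-Cox lambda on the pooled sample
pooled = np.concatenate([group_a, group_b])
_, lam = stats.boxcox(pooled)

def bc(x, lam):
    """Box-Cox transform."""
    return (x**lam - 1) / lam if abs(lam) > 1e-12 else np.log(x)

def bc_inv(z, lam):
    """Inverse Box-Cox transform, back to the original scale."""
    return (lam * z + 1) ** (1 / lam) if abs(lam) > 1e-12 else np.exp(z)

# test the treatment effect on the (approximately normal) transformed scale
t, p = stats.ttest_ind(bc(group_a, lam), bc(group_b, lam))

# back-transform group medians for an interpretable original-scale difference
median_diff = (bc_inv(np.median(bc(group_b, lam)), lam)
               - bc_inv(np.median(bc(group_a, lam)), lam))
```

Because the Box-Cox transform is monotone, the back-transformed median equals the sample median, which is what makes the original-scale summary easy to read.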

  5. Effect of Yoga Based Lifestyle Intervention on Patients With Knee Osteoarthritis: A Randomized Controlled Trial

    Science.gov (United States)

    Deepeshwar, Singh; Tanwar, Monika; Kavuri, Vijaya; Budhi, Rana B.

    2018-01-01

    Objective: To investigate the effect of integrated approach of yoga therapy (IAYT) intervention in individual with knee Osteoarthritis. Design: Randomized controlled clincial trail. Participants: Sixty-six individual prediagnosed with knee osteoarthritis aged between 30 and 75 years were randomized into two groups, i.e., Yoga (n = 31) and Control (n = 35). Yoga group received IAYT intervention for 1 week at yoga center of S-VYASA whereas Control group maintained their normal lifestyle. Outcome measures: The Falls Efficacy Scale (FES), Handgrip Strength test (left hand LHGS and right hand RHGS), Timed Up and Go Test (TUG), Sit-to-Stand (STS), and right & left extension and flexion were measured on day 1 and day 7. Results: There were a significant reduction in TUG (p Yoga group. Conclusion: IAYT practice showed an improvement in TUG, STS, HGS, and Goniometer test, which suggest improved muscular strength, flexibility, and functional mobility. CTRI Registration Number: http://ctri.nic.in/Clinicaltrials, identifier CTRI/2017/10/010141. PMID:29867604

  6. Gearbox fault diagnosis based on deep random forest fusion of acoustic and vibratory signals

    Science.gov (United States)

    Li, Chuan; Sanchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego; Vásquez, Rafael E.

    2016-08-01

Fault diagnosis is an effective tool to guarantee safe operations in gearboxes. Acoustic and vibratory measurements in such mechanical devices are all sensitive to the existence of faults. This work addresses the use of a deep random forest fusion (DRFF) technique to improve fault diagnosis performance for gearboxes by using measurements from an acoustic emission (AE) sensor and an accelerometer that monitor the gearbox condition simultaneously. The statistical parameters of the wavelet packet transform (WPT) are first produced from the AE signal and the vibratory signal, respectively. Two deep Boltzmann machines (DBMs) are then developed for deep representations of the WPT statistical parameters. A random forest is finally suggested to fuse the outputs of the two DBMs as the integrated DRFF model. The proposed DRFF technique is evaluated using gearbox fault diagnosis experiments under different operational conditions, and achieves a classification rate of 97.68% for 11 different condition patterns. Compared to other peer algorithms, the addressed method exhibits the best performance. The results indicate that deep learning fusion of acoustic and vibratory signals may improve fault diagnosis capabilities for gearboxes.

  7. Home-based diabetes self-management coaching delivered by paraprofessionals: A randomized controlled trial.

    Science.gov (United States)

    Pauley, Tim; Gargaro, Judith; Chenard, Glen; Cavanagh, Helen; McKay, Sandra M

    2016-01-01

This study evaluated paraprofessional-led diabetes self-management coaching (DSMC) among 94 clients with type 2 diabetes recruited from a Community Care Access Centre in Ontario, Canada. Subjects were randomized to standard care or standard care plus coaching. Measures included the Diabetes Self-Efficacy Scale (DSES), Insulin Management Diabetes Self-Efficacy Scale (IMDSES), and Hospital Anxiety and Depression Scale (HADS). Both groups showed improvement in DSES (6.6 ± 1.5 vs. 7.2 ± 1.5, p  .05 for all) or depression scores (p > .05 for all), or anxiety (p > .05 for all) or depression (p > .05 for all) categories at baseline, postintervention, or follow-up. While all subjects demonstrated significant improvements in self-efficacy measures, there is no evidence to support paraprofessional-led DSMC as an intervention which conveys additional benefits over standard care.

  8. Markov chain beam randomization: a study of the impact of PLANCK beam measurement errors on cosmological parameter estimation

    Science.gov (United States)

    Rocha, G.; Pagano, L.; Górski, K. M.; Huffenberger, K. M.; Lawrence, C. R.; Lange, A. E.

    2010-04-01

We introduce a new method to propagate uncertainties in the beam shapes used to measure the cosmic microwave background to cosmological parameters determined from those measurements. The method, called Markov chain beam randomization (MCBR), randomly samples from a set of templates or functions that describe the beam uncertainties. The method is much faster than direct numerical integration over systematic “nuisance” parameters, and is not restricted to simple, idealized cases as is analytic marginalization. It does not assume the data are normally distributed, and does not require Gaussian priors on the specific systematic uncertainties. We show that MCBR properly accounts for and provides the marginalized errors of the parameters. The method can be generalized and used to propagate any systematic uncertainties for which a set of templates is available. We apply the method to the Planck satellite, and consider future experiments. Beam measurement errors should have a small effect on cosmological parameters as long as the beam fitting is performed after removal of 1/f noise.
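A toy illustration of the template-sampling idea, far from the Planck pipeline: at each Metropolis step a beam template is drawn at random, so the chain's posterior for the parameter absorbs the beam uncertainty. The model, the template set, and the noise level are invented for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy model: data = amplitude * beam_smearing + noise; the beam is uncertain,
# represented by a small set of plausible smearing templates
true_amp, true_beam = 2.0, 0.9
x = true_amp * true_beam + rng.normal(0, 0.05, size=100)
beam_templates = np.array([0.85, 0.88, 0.90, 0.92, 0.95])  # hypothetical set

def log_like(amp, beam):
    return -0.5 * np.sum((x - amp * beam) ** 2) / 0.05**2

# Metropolis chain over the amplitude; a fresh beam template is drawn at
# every step, so the chain marginalizes over beam uncertainty (the MCBR idea)
amp, samples = 1.0, []
ll = log_like(amp, rng.choice(beam_templates))
for _ in range(5000):
    prop = amp + rng.normal(0, 0.02)
    beam = rng.choice(beam_templates)
    ll_prop = log_like(prop, beam)
    if np.log(rng.random()) < ll_prop - ll:
        amp, ll = prop, ll_prop
    samples.append(amp)

posterior = np.array(samples[1000:])  # discard burn-in
```

The resulting posterior width for the amplitude includes the scatter induced by the template set, which is exactly the marginalization the method provides cheaply.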

  9. Bayesian reconstruction of seafloor shape from side-scan sonar measurements using a Markov Random Field

    OpenAIRE

    Woock, P.; Pak, Alexey

    2014-01-01

    To explore the seafloor, a side-scan sonar emits a directed acoustic signal and then records the returning (reflected) signal intensity as a function of time. The inversion of that process is not unique: multiple shapes may lead to identical measured responses. In this work, we suggest a Bayesian approach to reconstructing the 3D shape of the seafloor from multiple sonar measurements, inspired by the state-of-the-art methods of inverse raytracing that originated in computer vision. The space ...

  10. Internet-based photoaging within Australian pharmacies to promote smoking cessation: randomized controlled trial.

    Science.gov (United States)

    Burford, Oksana; Jiwa, Moyez; Carter, Owen; Parsons, Richard; Hendrie, Delia

    2013-03-26

Tobacco smoking leads to death or disability and a drain on national resources. The literature suggests that cigarette smoking continues to be a major modifiable risk factor for a variety of diseases and that smokers aged 18-30 years are relatively resistant to antismoking messages due to their widely held belief that they will not be lifelong smokers. To conduct a randomized controlled trial (RCT) of a computer-generated photoaging intervention to promote smoking cessation among young adult smokers within a community pharmacy setting. A trial was designed with 80% power based on the effect size observed in a published pilot study; 160 subjects were recruited (80 allocated to the control group and 80 to the intervention group) from 8 metropolitan community pharmacies located around Perth city center in Western Australia. All participants received standardized smoking cessation advice. The intervention group participants were also digitally photoaged by using the Internet-based APRIL Face Aging software so they could preview images of themselves as a lifelong smoker and as a nonsmoker. Due to the nature of the intervention, the participants and researcher could not be blinded to the study. The main outcome measure was quit attempts at 6-month follow-up, both self-reported and biochemically validated through testing for carbon monoxide (CO), and nicotine dependence assessed via the Fagerström scale. At 6-month follow-up, 5 of 80 control group participants (6.3%) suggested they had quit smoking, but only 1 of 80 control group participants (1.3%) consented to, and was confirmed by, CO validation. In the intervention group, 22 of 80 participants (27.5%) reported quitting, with 11 of 80 participants (13.8%) confirmed by CO testing. This difference in biochemically confirmed quit attempts was statistically significant (χ2(1) = 9.0, P = .003). A repeated measures analysis suggested the average intervention group smoking dependence score had also significantly dropped

  11. Vaginal Swab Test Compared With the Urethral Q-tip Test for Urethral Mobility Measurement: A Randomized Controlled Trial.

    Science.gov (United States)

    Meyer, Isuzu; Szychowski, Jeff M; Illston, Jana D; Parden, Alison M; Richter, Holly E

    2016-02-01

To assess whether use of a vaginal cotton-tipped swab is equivalent to the standard Q-tip test regarding urethral mobility. Secondarily, to examine whether both tests agree in hypermobility diagnosis, discomfort level, and patients' preference. In this randomized crossover trial, women with stress urinary incontinence without prolapse beyond the hymen were randomized to undergo either a vaginal or urethral mobility test first followed by the alternate approach. The primary outcome was the difference in rotation angle, from resting to maximum strain, between tests. The equivalence margin was ±10°. The secondary outcome was agreement in hypermobility diagnosis using two definitions: 1) maximum straining angle of 30° or greater from the horizontal plane; and 2) rotation angle 30° or greater. Discomfort was assessed using a 0-10 visual analog scale. Using 90% power assuming a standard deviation of 20°, 36 and 139 patients were needed for 10° and 5° equivalence margins, respectively. From January 2014 to March 2015, 140 women were randomized. The mean difference between the two tests was 5.1° (95% confidence interval 3.2-6.9°), meeting the predefined equivalence criteria. In the hypermobility diagnosis, the urethral and vaginal tests had no disagreement using definition 1 (P=.23), whereas the two tests disagreed using definition 2 (P=.03). The urethral approach had a higher discomfort level. The vaginal swab test is equivalent to the standard Q-tip test in measuring urethral mobility, with less discomfort, and is preferred by patients.

  12. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lo, P., E-mail: pechinlo@mednet.edu.ucla; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G. [Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California 90024 (United States); Argula, R.; Strange, C. [Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, South Carolina 29425 (United States)

    2015-05-15

Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed, which quantifies the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density-based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density-based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or p > 0.05. Conclusions: The proposed family of CT-based cyst measurements has better correlation with spirometric measures than previously used density-based CT measurements. It shows potential as a sensitive tool for quantitatively assessing the severity of LAM.

  13. Behavioral measures to reduce non-adherence in renal transplant recipients: a prospective randomized controlled trial.

    Science.gov (United States)

    Garcia, Márcia Fátima Faraldo Martinez; Bravin, Ariane Moyses; Garcia, Paula Dalsoglio; Contti, Mariana Moraes; Nga, Hong Si; Takase, Henrique Mochida; de Andrade, Luis Gustavo Modelli

    2015-11-01

Solid-organ transplant recipients present a high rate of non-adherence to drug treatment. Few interventional studies have included approaches aimed at increasing adherence. The objective of this study was to evaluate the impact of an educational and behavioral strategy on treatment adherence of kidney transplant recipients. In a randomized prospective study, incident renal transplant patients (n = 111) were divided into two groups: control group (received usual transplant patient education) and treatment group (usual transplant patient education plus ten additional weekly 30-min education/counseling sessions about immunosuppressive drugs and behavioral changes). Treatment adherence was assessed using the ITAS adherence questionnaire after 3 months. Renal function at 3, 6, and 12 months, and the incidence of transplant rejection were evaluated. The non-adherence rates were 46.4% and 14.5% in the control and treatment groups, respectively (p = 0.001). The relative risk for non-adherence was 2.59 times (CI 1.38-4.88) higher in the control group. Multivariate analysis demonstrated a 5.84 times (CI 1.8-18.8, p = 0.003) higher risk of non-adherence in the control group. There were no differences in renal function and rejection rates between groups. A behavioral and educational strategy addressing the patient's perceptions and knowledge about the anti-rejection drugs significantly improved the short-term adherence to immunosuppressive therapy.

  14. Assessing the Effectiveness of Case-Based Collaborative Learning via Randomized Controlled Trial.

    Science.gov (United States)

    Krupat, Edward; Richards, Jeremy B; Sullivan, Amy M; Fleenor, Thomas J; Schwartzstein, Richard M

    2016-05-01

    Case-based collaborative learning (CBCL) is a novel small-group approach that borrows from team-based learning principles and incorporates elements of problem-based learning (PBL) and case-based learning. CBCL includes a preclass readiness assurance process and case-based in-class activities in which students respond to focused, open-ended questions individually, discuss their answers in groups of 4, and then reach consensus in larger groups of 16. This study introduces CBCL and assesses its effectiveness in one course at Harvard Medical School. In a 2013 randomized controlled trial, 64 medical and dental student volunteers were assigned randomly to one of four 8-person PBL tutorial groups (control; n = 32) or one of two 16-person CBCL tutorial groups (experimental condition; n = 32) as part of a required first-year physiology course. Outcomes for the PBL and CBCL groups were compared using final exam scores, student responses to a postcourse survey, and behavioral coding of portions of video-recorded class sessions. Overall, the course final exam scores for CBCL and PBL students were not significantly different. However, CBCL students whose mean exam performance in prior courses was below the participant median scored significantly higher than their PBL counterparts on the physiology course final exam. The most common adjectives students used to describe CBCL were "engaging," "fun," and "thought-provoking." Coding of observed behaviors indicated that individual affect was significantly higher in the CBCL groups than in the PBL groups. CBCL is a viable, engaging, active learning method. It may particularly benefit students with lower academic performance.

  15. An Attribute Involved Public Key Cryptosystem Based on P-Sylow Subgroups and Randomization

    Directory of Open Access Journals (Sweden)

    Sumalatha GUNNALA

    2018-04-01

The Asymmetric Key Cryptosystem (AKC) or Public Key Encryption (PKE) is a mechanism used to encrypt messages with a public key and decrypt the enciphered messages with a private key. Attribute-Based Encryption (ABE) is an extension of the asymmetric key encryption scheme that allows users to encrypt and decrypt plaintext messages using keys based on the user's credentials, called attributes, such as social security number, PAN (Permanent Account Number), email id, or Aadhar number. Most existing ABE schemes rely on multiple attributes from which the access control policies are derived. These policies define the users' private keys, required for the decryption process and access to the confidential information. In this paper, we propose a new attribute-based asymmetric cryptosystem that uses the features of both schemes, PKE and ABE. Here, we use the value of an attribute, personal to the user, for the encryption and the decryption process. This scheme assures that the receiver will be able to access the secret data only if the recipient holds the valid attribute value. The asymmetric nature of this scheme is based on the p-Sylow subgroup assumption. In addition, a randomization factor is used in the encipherment process to strengthen the cipher further. This cryptosystem is an embodiment in which the merits of randomized asymmetric encryption and attribute-based encryption are integrated to achieve authentication on top of confidentiality to secure information transmission over public networks.

  16. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without a burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT) estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) allow one to avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
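The tour idea in this record can be sketched concretely: run a random walk on an undirected graph, end a tour each time the walk returns to a designated anchor node, and take a ratio of tour sums, importance-correcting each visit by node degree so the estimate targets the uniform average. The graph, the `age` attribute, the anchor choice, and the tour count below are hypothetical illustration values, not taken from the paper.

```python
import random

# toy undirected graph as adjacency lists (hypothetical example network)
graph = {
    0: [1, 2],
    1: [0, 2, 3],
    2: [0, 1, 4],
    3: [1, 4],
    4: [2, 3],
}
age = {0: 25, 1: 31, 2: 40, 3: 22, 4: 35}  # f(v): attribute to average

random.seed(7)
anchor = 1                      # regeneration node: each return ends a tour
deg = {v: len(nbrs) for v, nbrs in graph.items()}

# A simple RW is stationary w.r.t. pi(v) proportional to deg(v), so each
# visit is weighted by 1/deg(v) to target the uniform average; the ratio
# of tour sums needs no burn-in because tours are i.i.d. regeneration
# cycles starting and ending at the anchor.
num = den = 0.0
v = anchor
for _ in range(200):            # 200 regeneration tours
    while True:
        num += age[v] / deg[v]
        den += 1.0 / deg[v]
        v = random.choice(graph[v])
        if v == anchor:
            break
estimate = num / den            # target: mean(age) = 30.6
```

Because each tour is an independent regeneration cycle, averaging over tours replaces the burn-in period that conventional MCMC-style RW estimators discard.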

  17. Cognitive-Behavioral-Based Physical Therapy for Patients With Chronic Pain Undergoing Lumbar Spine Surgery: A Randomized Controlled Trial.

    Science.gov (United States)

    Archer, Kristin R; Devin, Clinton J; Vanston, Susan W; Koyama, Tatsuki; Phillips, Sharon E; George, Steven Z; McGirt, Matthew J; Spengler, Dan M; Aaronson, Oran S; Cheng, Joseph S; Wegener, Stephen T

    2016-01-01

    The purpose of this study was to determine the efficacy of a cognitive-behavioral-based physical therapy (CBPT) program for improving outcomes in patients after lumbar spine surgery. A randomized controlled trial was conducted on 86 adults undergoing a laminectomy with or without arthrodesis for a lumbar degenerative condition. Patients were screened preoperatively for high fear of movement using the Tampa Scale for Kinesiophobia. Randomization to either CBPT or an education program occurred at 6 weeks after surgery. Assessments were completed pretreatment, posttreatment and at 3-month follow-up. The primary outcomes were pain and disability measured by the Brief Pain Inventory and Oswestry Disability Index. Secondary outcomes included general health (SF-12) and performance-based tests (5-Chair Stand, Timed Up and Go, 10-Meter Walk). Multivariable linear regression analyses found that CBPT participants had significantly greater decreases in pain and disability and increases in general health and physical performance compared with the education group at the 3-month follow-up. Results suggest a targeted CBPT program may result in significant and clinically meaningful improvement in postoperative outcomes. CBPT has the potential to be an evidence-based program that clinicians can recommend for patients at risk for poor recovery after spine surgery. This study investigated a targeted cognitive-behavioral-based physical therapy program for patients after lumbar spine surgery. Findings lend support to the hypothesis that incorporating cognitive-behavioral strategies into postoperative physical therapy may address psychosocial risk factors and improve pain, disability, general health, and physical performance outcomes. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  18. A mindfulness-based intervention to control weight after bariatric surgery: Preliminary results from a randomized controlled pilot trial.

    Science.gov (United States)

    Chacko, Sara A; Yeh, Gloria Y; Davis, Roger B; Wee, Christina C

    2016-10-01

    This study aimed to develop and test a novel mindfulness-based intervention (MBI) designed to control weight after bariatric surgery. Randomized, controlled pilot trial. Beth Israel Deaconess Medical Center, Boston, MA, USA. Bariatric patients 1-5 years post-surgery (n=18) were randomized to receive a 10-week MBI or a standard intervention. Primary outcomes were feasibility and acceptability of the MBI. Secondary outcomes included changes in weight, eating behaviors, psychosocial outcomes, and metabolic and inflammatory biomarkers. Qualitative exit interviews were conducted post-intervention. Major themes were coded and extracted. Attendance was excellent (6 of 9 patients attended ≥7 of 10 classes). Patients reported high satisfaction and overall benefit of the MBI. The intervention was effective in reducing emotional eating at 6 months (-4.9±13.7 in mindfulness vs. 6.2±28.4 in standard, p for between-group difference=0.03) but not weight. We also observed a significant increase in HbA1C (0.34±0.38 vs. -0.06±0.31, p=0.03). Objective measures suggested trends of an increase in perceived stress and symptoms of depression, although patients reported reduced stress reactivity, improved eating behaviors, and a desire for continued mindfulness-based support in qualitative interviews. This novel mindfulness-based approach is highly acceptable to bariatric patients post-surgery and may be effective for reducing emotional eating, although it did not improve weight or glycemic control in the short term. Longer-term studies of mindfulness-based approaches may be warranted in this population. ClinicalTrials.gov identifier NCT02603601. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Community-based participatory research to design a faith-enhanced diabetes prevention program: The Better Me Within randomized trial.

    Science.gov (United States)

    Kitzman, Heather; Dodgen, Leilani; Mamun, Abdullah; Slater, J Lee; King, George; Slater, Donna; King, Alene; Mandapati, Surendra; DeHaven, Mark

    2017-11-01

Reducing obesity positively impacts diabetes and cardiovascular risk; however, evidence-based lifestyle programs, such as the diabetes prevention program (DPP), show reduced effectiveness in African American (AA) women. In addition to an attenuated response to lifestyle programs, AA women also demonstrate high rates of obesity, diabetes, and cardiovascular disease. To address these disparities, enhancements to evidence-based lifestyle programs for AA women need to be developed and evaluated with culturally relevant and rigorous study designs. This study describes a community-based participatory research (CBPR) approach to design a novel faith-enhancement to the DPP for AA women. A long-standing CBPR partnership designed the faith-enhancement from focus group data (N=64 AA adults) integrating five components: a brief pastor-led sermon, memory verse, in-class or take-home faith activity, promises to remember, and scripture and prayer integrated into participant curriculum and facilitator materials. The faith components were specifically linked to weekly DPP learning objectives to strategically emphasize behavioral skills with religious principles. Using a CBPR approach, the Better Me Within trial was able to enroll 12 churches, screen 333 AA women, and randomize 221 (mean age = 48.8 ± 11.2; mean BMI = 36.7 ± 8.4; 52% technical or high school) after collection of objective eligibility measures. A prospective, randomized, nested-by-church design will be used to evaluate the faith-enhanced DPP as compared to a standard DPP on weight, diabetes and cardiovascular risk, over a 16-week intervention and 10-month follow up. This study will provide essential data to guide enhancements to evidence-based lifestyle programs for AA women who are at high risk for chronic disease. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. A story about estimation of a random field of boulders from incomplete seismic measurements

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2005-01-01

This paper reports on the statistical interpretation of seismic diffraction measurements of boulder locations. The measurements are made in a corridor along the planned tunnel line for the later realized bored tunnel through the till deposits under the East Channel of the Great Belt in Denmark. ... The graphical registrations on seismograms do not make a proper interpretation possible without detailed knowledge about the joint distribution of the primary dimensions of the boulders. Therefore separate measurements were made of the dimensions of boulders deposited visibly on the cliff beaches of the Great Belt. ... By use of this important distribution information and of the observed homogeneity of the seismic point source field, together with the physical properties of diffraction, it became possible to make the wanted prediction. During the excavation the found boulders were counted.

  1. Random pulse generator

    International Nuclear Information System (INIS)

    Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing

    2007-01-01

Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time, whereas a normal pulse generator generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generators are reviewed, and 2 digital random pulse generators are introduced. (authors)
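Since radioactive decay is a Poisson process, the timing of such a random pulse train can be sketched in software by drawing exponentially distributed inter-arrival times. The pulse rate below is an assumed parameter for illustration, not a value from the record.

```python
import random

random.seed(2024)

def pulse_times(n, rate_hz):
    """Return n pulse arrival times (seconds) of a Poisson pulse train.

    Gaps between pulses of an ideal random (Poisson) pulse source are
    exponentially distributed with mean 1/rate_hz, unlike the fixed
    period of a normal pulse generator.
    """
    t, out = 0.0, []
    for _ in range(n):
        t += random.expovariate(rate_hz)
        out.append(t)
    return out

times = pulse_times(10_000, rate_hz=1000.0)  # assumed 1 kHz mean rate
```

A hardware implementation would feed such interval samples to a timer/DAC stage; the statistical core is just the exponential gap law.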

  2. Bias Correction and Random Error Characterization for the Assimilation of HRDI Line-of-Sight Wind Measurements

    Science.gov (United States)

    Tangborn, Andrew; Menard, Richard; Ortland, David; Einaudi, Franco (Technical Monitor)

    2001-01-01

A new approach to the analysis of systematic and random observation errors is presented in which the error statistics are obtained using forecast data rather than observations from a different instrument type. The analysis is carried out at an intermediate retrieval level, instead of the more typical state variable space. This method is applied to measurements made by the High Resolution Doppler Imager (HRDI) on board the Upper Atmosphere Research Satellite (UARS). HRDI, a limb sounder, is the only satellite instrument measuring winds in the stratosphere, and the only instrument of any kind making global wind measurements in the upper atmosphere. HRDI measures Doppler shifts in two different O2 absorption bands (gamma and B), and the retrieved products are the tangent point line-of-sight (LOS) wind component (level 2 retrieval) and UV winds (level 3 retrieval). This analysis is carried out on a level 1.9 retrieval, in which the contributions from different points along the line-of-sight have not been removed. Biases are calculated from O-F (observed minus forecast) LOS wind components and are separated into a measurement parameter space consisting of 16 different values. The bias dependence on these parameters (plus an altitude dependence) is used to create a bias correction scheme carried out on the level 1.9 retrieval. The random error component is analyzed by separating the gamma and B band observations and locating observation pairs where both bands are very nearly looking at the same location at the same time. It is shown that the two observation streams are uncorrelated and that this allows the forecast error variance to be estimated. The bias correction is found to cut the effective observation error variance in half.
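The paired-band trick for the random error component can be illustrated with synthetic data: if the two bands' observation errors are uncorrelated with each other and with the forecast error, the covariance of the two O-F series isolates the forecast error variance. All numbers below (error standard deviations, sample size) are made-up illustration values, not HRDI statistics.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
truth = rng.normal(0.0, 10.0, n)        # synthetic true LOS wind
f  = truth + rng.normal(0.0, 4.0, n)    # forecast,       sigma_f  = 4
o1 = truth + rng.normal(0.0, 6.0, n)    # gamma-band obs, sigma_o1 = 6
o2 = truth + rng.normal(0.0, 5.0, n)    # B-band obs,     sigma_o2 = 5

# O-F residuals share only the forecast error, so
#   cov(o1 - f, o2 - f) = var(f - truth) = sigma_f**2
d1, d2 = o1 - f, o2 - f
sigma_f2 = np.cov(d1, d2)[0, 1]         # estimate of sigma_f**2 (= 16)
```

With the forecast error variance in hand, each band's observation error variance follows by subtraction from var(O-F) for that band.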

  3. Web app based patient education in psoriasis - a randomized controlled trial.

    Science.gov (United States)

    Hawkins, Spencer D; Barilla, Steven; Feldman, Steven R

    2017-04-15

Patients report wanting more information about psoriasis and clear expectations from the onset of therapy. Dermatologists do not think patients receive or internalize adequate information. There is a need for further explanation of treatment regimens to increase knowledge, compliance, and patient satisfaction. Recent advancements in web technology have the potential to improve these psoriasis outcomes. A web based application was created to educate psoriasis patients using video, graphics, and textual information. An investigator blinded, randomized, controlled study evaluated the website's efficacy in 50 psoriasis patients at Wake Forest Baptist Health Dermatology. Patients were randomized into two groups: Group 1 received a link to the educational web app and a survey following their visit; Group 2 received a link to the survey with no educational web app. The survey assessed patient knowledge, self-reported adherence to medication, and adequacy of addressing concerns. Twenty-two patients completed the study. Patients in the web app group scored an average of 11/14 on the psoriasis knowledge quiz, whereas patients in the control group scored an average of 9/14, an improvement of roughly 18% (p=0.008, n=22). Web app based education via DermPatientEd.Com is an efficient way to improve knowledge, but we did not demonstrate improvements in self-reported medication adherence or the ability to address concerns of psoriasis patients.

  4. Random Finite Set Based Bayesian Filtering with OpenCL in a Heterogeneous Platform

    Directory of Open Access Journals (Sweden)

    Biao Hu

    2017-04-01

While most filtering approaches based on random finite sets have focused on improving performance, in this paper, we argue that computation times are very important in order to enable real-time applications such as pedestrian detection. Towards this goal, this paper investigates the use of OpenCL to accelerate the computation of random finite set-based Bayesian filtering in a heterogeneous system. In detail, we developed an efficient and fully-functional pedestrian-tracking system implementation, which can run under real-time constraints, meanwhile offering decent tracking accuracy. An extensive evaluation analysis was carried out to ensure the fulfillment of sufficient accuracy requirements. This was followed by extensive profiling analysis to spot the potential bottlenecks in terms of execution performance, which were then targeted to come up with an OpenCL-accelerated application. Video-throughput improvements from roughly 15 fps to 100 fps (6×) were observed on average while processing typical MOT benchmark videos. Moreover, the worst-case frame processing yielded an 18× advantage, from nearly 2 fps to 36 fps, thereby comfortably meeting the real-time constraints. Our implementation is released as open-source code.

  5. Can evidence change the rate of back surgery? A randomized trial of community-based education.

    Science.gov (United States)

    Goldberg, H I; Deyo, R A; Taylor, V M; Cheadle, A D; Conrad, D A; Loeser, J D; Heagerty, P J; Diehr, P

    2001-01-01

    Timely adoption of clinical practice guidelines is more likely to happen when the guidelines are used in combination with adjuvant educational strategies that address social as well as rational influences. To implement the conservative, evidence-based approach to low-back pain recommended in national guidelines, with the anticipated effect of reducing population-based rates of surgery. A randomized, controlled trial. Ten communities in western Washington State with annual rates of back surgery above the 1990 national average (158 operations per 100,000 adults). Spine surgeons, primary care physicians, patients who were surgical candidates, and hospital administrators. The five communities randomized to the intervention group received a package of six educational activities tailored to local needs by community planning groups. Surgeon study groups, primary care continuing medical education conferences, administrative consensus processes, videodisc-aided patient decision making, surgical outcomes management, and generalist academic detailing were serially implemented over a 30-month intervention period. Quarterly observations of surgical rates. After implementation of the intervention, surgery rates declined in the intervention communities but increased slightly in the control communities. The net effect of the intervention is estimated to be a decline of 20.9 operations per 100,000, a relative reduction of 8.9% (P = 0.01). We were able to use scientific evidence to engender voluntary change in back pain practice patterns across entire communities.

  6. Modeling spreading of oil slicks based on random walk methods and Voronoi diagrams

    International Nuclear Information System (INIS)

    Durgut, İsmail; Reed, Mark

    2017-01-01

    We introduce a methodology for representation of a surface oil slick using a Voronoi diagram updated at each time step. The Voronoi cells scale the Gaussian random walk procedure representing the spreading process by individual particle stepping. The step length of stochastically moving particles is based on a theoretical model of the spreading process, establishing a relationship between the step length of diffusive spreading and the thickness of the slick at the particle locations. The Voronoi tessellation provides the areal extent of the slick particles and in turn the thicknesses of the slick and the diffusive-type spreading length for all particles. The algorithm successfully simulates the spreading process and results show very good agreement with the analytical solution. Moreover, the results are robust for a wide range of values for computational time step and total number of particles. - Highlights: • A methodology for representation of a surface oil slick using a Voronoi diagram • An algorithm simulating the spreading of oil slick with the Voronoi diagram representation • The algorithm employs the Gaussian random walk method through individual particle stepping. • The diffusive spreading is based on a theoretical model of the spreading process. • Algorithm is computationally robust and successfully reproduces analytical solutions to the spreading process.
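A minimal sketch of the particle scheme this record describes: particles carry the slick, a Voronoi tessellation (here via `scipy.spatial.Voronoi`) gives each particle a cell area, thickness is taken as volume per particle divided by cell area, and the Gaussian step length is scaled by that thickness. The step-length law and all constants are hypothetical placeholders, not the paper's calibrated spreading model.

```python
import numpy as np
from scipy.spatial import ConvexHull, Voronoi

rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 2))        # particle positions (synthetic slick)
pts0 = pts.copy()

def cell_areas(points):
    """Area of each particle's Voronoi cell (NaN for unbounded cells)."""
    vor = Voronoi(points)
    areas = np.full(len(points), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:
            continue                   # boundary cell is unbounded
        # in 2-D, ConvexHull.volume is the polygon area
        areas[i] = ConvexHull(vor.vertices[region]).volume
    return areas

volume_per_particle = 1.0              # assumed oil volume per particle
for _ in range(10):                    # ten spreading time steps
    areas = cell_areas(pts)
    # unbounded boundary cells get the median area as a crude fallback
    areas = np.where(np.isnan(areas), np.nanmedian(areas), areas)
    thickness = volume_per_particle / areas
    step = 0.05 * np.sqrt(thickness)   # hypothetical step-length law
    pts = pts + rng.normal(size=pts.shape) * step[:, None]
```

The tessellation is rebuilt every step, so thicker (denser) parts of the slick take longer diffusive steps, which is the qualitative coupling the abstract describes.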

  7. Intelligent Fault Diagnosis of HVCB with Feature Space Optimization-Based Random Forest.

    Science.gov (United States)

    Ma, Suliang; Chen, Mingxuan; Wu, Jianwen; Wang, Yuhao; Jia, Bowen; Jiang, Yuan

    2018-04-16

Mechanical faults of high-voltage circuit breakers (HVCBs) always happen over long-term operation, so extracting the fault features and identifying the fault type have become a key issue for ensuring the security and reliability of power supply. Based on wavelet packet decomposition technology and the random forest algorithm, an effective identification system was developed in this paper. First, compared with the incomplete description of Shannon entropy, the wavelet packet time-frequency energy rate (WTFER) was adopted as the input vector for the classifier model in the feature selection procedure. Then, a random forest classifier was used to diagnose the HVCB fault, assess the importance of the feature variables and optimize the feature space. Finally, the approach was verified on actual HVCB vibration signals covering six typical fault classes. The comparative experiment results show that the classification accuracy of the proposed method reached 93.33% with the original feature space and up to 95.56% with the optimized input feature vector. This indicates that the feature optimization procedure is successful, and the proposed diagnosis algorithm has higher efficiency and robustness than traditional methods.
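The feature-space optimization step can be sketched with scikit-learn: train a random forest, rank features by impurity-based importance, and retrain on the top-ranked subset. The synthetic data below merely stands in for the 16 wavelet-packet energy-rate features and 6 fault classes; the generating model is an assumption for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# synthetic stand-in for WTFER features: 600 samples, 16 sub-band
# features, 6 fault classes; only the first 5 features are informative
X = rng.normal(size=(600, 16))
y = rng.integers(0, 6, size=600)
X[:, :5] += y[:, None] * 0.8           # class-dependent mean shift

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# rank sub-band features by importance and keep the strongest five
order = np.argsort(clf.feature_importances_)[::-1]
top = order[:5]
clf_opt = RandomForestClassifier(n_estimators=200, random_state=0)
clf_opt.fit(X[:, top], y)              # classifier on the optimized space
```

On real vibration data the same ranking step would prune uninformative wavelet sub-bands before the final diagnosis model is fit.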

  8. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

Due to the complexity of the system and lack of expertise, epistemic uncertainties may be present in the experts' judgment on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and thus offers good compatibility. It avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss after fusion. Original expert judgments are retained fairly objectively throughout the processing procedure. The construction of the cumulative probability function and the random sampling process require no human intervention or judgment and can easily be implemented in computer programs, giving the method an apparent advantage in evaluation practices for fairly large index systems.
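The final Monte Carlo step can be sketched as follows: each expert's judgment on an index is reduced to an interval, samples are drawn uniformly within the intervals, and the weights are averaged and renormalized over the samples. The interval values below are invented for illustration, and the paper's evidence-fusion stage that would produce them is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical interval judgments: experts x indices x [low, high]
intervals = np.array([
    [[0.2, 0.4], [0.1, 0.3], [0.4, 0.6]],   # expert 1
    [[0.3, 0.5], [0.1, 0.2], [0.3, 0.5]],   # expert 2
])

n_samples = 10_000
lows, highs = intervals[..., 0], intervals[..., 1]

# draw each judgment uniformly within its interval, average the experts,
# then renormalize so every sampled weight vector sums to one
draws = rng.uniform(lows, highs, size=(n_samples, *lows.shape))
weights = draws.mean(axis=1)
weights /= weights.sum(axis=1, keepdims=True)
w = weights.mean(axis=0)   # Monte Carlo estimate of the index weights
```

The spread of `weights` across samples also gives an uncertainty band on each index weight for free, which point-valued weighting methods cannot provide.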

  9. Global industrial impact coefficient based on random walk process and inter-country input-output table

    Science.gov (United States)

    Xing, Lizhi; Dong, Xianlei; Guan, Jun

    2017-04-01

The input-output table is comprehensive and detailed in describing the national economic system, capturing many economic relationships, including supply and demand information among industrial sectors. Complex network theory, a method for measuring the structure of complex systems, can describe the structural characteristics of a research object by measuring structural indicators of the social and economic system, revealing the relationship between the inner hierarchy and the external economic function. This paper builds GIVCN-WIOT models based on the World Input-Output Database in order to depict the topological structure of the Global Value Chain (GVC), and assumes that the competitive advantage of a nation is equal to the overall performance of its domestic sectors' impact on the GVC. From the perspective of econophysics, the Global Industrial Impact Coefficient (GIIC) is proposed to measure national competitiveness in gaining information superiority and intermediate interests. Analysis of the GIVCN-WIOT models yields several insights, including the following: (1) sectors with higher Random Walk Centrality contribute more to transmitting value streams within the global economic system; (2) the Half-Value Ratio can be used to measure the robustness of open-economy macroeconomics in the process of globalization; (3) the positive correlation between GIIC and GDP indicates that one country's global industrial impact could reveal its international competitive advantage.
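The random-walk view of an input-output table can be sketched by row-normalizing an inter-industry flow matrix into a transition matrix and computing the walk's stationary distribution, which indicates which sectors the value stream visits most. The 3-sector flow matrix is a made-up toy, and this is plain stationary-distribution centrality, not the paper's exact GIIC or Random Walk Centrality formula.

```python
import numpy as np

# hypothetical 3-sector inter-industry flow matrix (row = supplying sector)
F = np.array([
    [2.0, 1.0, 0.5],
    [0.5, 3.0, 1.0],
    [1.0, 0.5, 2.5],
])

P = F / F.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

# the stationary distribution pi solves pi = pi @ P, i.e. it is the left
# eigenvector of P for eigenvalue 1 (Perron vector of P transposed)
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()                     # normalize to a probability vector
```

Sectors with larger `pi` are visited more often by the value-stream walk, the intuition behind insight (1) above.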

  10. Apatite fission track analysis: geological thermal history analysis based on a three-dimensional random process of linear radiation damage

    International Nuclear Information System (INIS)

    Galbraith, R.F.; Laslett, G.M.; Green, P.F.; Duddy, I.R.

    1990-01-01

    Spontaneous fission of uranium atoms over geological time creates a random process of linearly shaped features (fission tracks) inside an apatite crystal. The theoretical distributions associated with this process are governed by the elapsed time and temperature history, but other factors are also reflected in empirical measurements as consequences of sampling by plane section and chemical etching. These include geometrical biases leading to over-representation of long tracks, the shape and orientation of host features when sampling totally confined tracks, and 'gaps' in heavily annealed tracks. We study the estimation of geological parameters in the presence of these factors using measurements on both confined tracks and projected semi-tracks. Of particular interest is a history of sedimentation, uplift and erosion giving rise to a two-component mixture of tracks in which the parameters reflect the current temperature, the maximum temperature and the timing of uplift. A full likelihood analysis based on all measured densities, lengths and orientations is feasible, but because some geometrical biases and measurement limitations are only partly understood it seems preferable to use conditional likelihoods given numbers and orientations of confined tracks. (author)

  11. Computer-Based Driving in Dementia Decision Tool With Mail Support: Cluster Randomized Controlled Trial.

    Science.gov (United States)

    Rapoport, Mark J; Zucchero Sarracini, Carla; Kiss, Alex; Lee, Linda; Byszewski, Anna; Seitz, Dallas P; Vrkljan, Brenda; Molnar, Frank; Herrmann, Nathan; Tang-Wai, David F; Frank, Christopher; Henry, Blair; Pimlott, Nicholas; Masellis, Mario; Naglie, Gary

    2018-05-25

    Physicians often face significant challenges in assessing automobile driving in persons with mild cognitive impairment and mild dementia and in deciding when to report to transportation administrators. Care must be taken to balance the safety of patients and other road users with the potential negative effects of issuing such reports. The aim of this study was to assess whether a computer-based Driving in Dementia Decision Tool (DD-DT) increased appropriate reporting of patients with mild dementia or mild cognitive impairment to transportation administrators. The study used a parallel-group cluster nonblinded randomized controlled trial design to test a multifaceted knowledge translation intervention. The intervention included a computer-based decision support system activated by the physician-user, which provides a recommendation about whether to report patients with mild dementia or mild cognitive impairment to transportation administrators, based on an algorithm derived from earlier work. The intervention also included a mailed educational package and Web-based specialized reporting forms. Specialists and family physicians with expertise in dementia or care of the elderly were stratified by sex and randomized to either use the DD-DT or a control version of the tool that required identical data input as the intervention group, but instead generated a generic reminder about the reporting legislation in Ontario, Canada. The trial ran from September 9, 2014 to January 29, 2016, and the primary outcome was the number of reports made to the transportation administrators concordant with the algorithm. A total of 69 participating physicians were randomized, and 36 of these used the DD-DT; 20 of the 35 randomized to the intervention group used the DD-DT with 114 patients, and 16 of the 34 randomized to the control group used it with 103 patients. The proportion of all assessed patients reported to the transportation administrators concordant with the recommendation did not differ between the intervention and control groups.

  12. Social support and education groups for single mothers: a randomized controlled trial of a community-based program.

    Science.gov (United States)

    Lipman, Ellen L; Boyle, Michael H

    2005-12-06

    Members of families headed by single mothers are at increased risk of psychosocial disadvantage and mental health problems. We assessed the effect of a community-based program of social support and education groups for single mothers of young children on maternal well-being and parenting. We recruited 116 single mothers of children 3 to 9 years old through community advertisements. Eligible mothers were randomly assigned either to participate in a 10-week program of group sessions (1.5 hours per week) offering social support and education, with a parallel children's activity group, or to receive a standard list of community resources and the option to participate in group sessions at the end of the follow-up period. Interviewers blinded to the randomization collected assessment data from all mothers at baseline and at 3 follow-up visits (immediately after the intervention and at 3 and 6 months after the intervention). Outcome measures were self-reported mood, self-esteem, social support and parenting. Between February 2000 and April 2003, the program was offered to 9 groups of single mothers. Most of the mothers in the trial reported high levels of financial and mental health problems. In the short term (after the intervention), mothers in the intervention group had improved scores for mood (standardized effect 0.55) and self-esteem (standardized effect 0.29) compared with mothers in the control group; scores for the other 2 measures did not differ between the groups. Growth curve analysis of program effects over the follow-up period showed improvement in all 4 outcomes, with no significant difference between the intervention and control groups. This community-based program of group sessions offering social support and education to low-income single mothers had positive short-term effects on mood and self-esteem but not on social support and parenting. Longer follow-up showed attenuation of these effects.

  13. Updated teaching techniques improve CPR performance measures: a cluster randomized, controlled trial.

    Science.gov (United States)

    Ettl, Florian; Testori, Christoph; Weiser, Christoph; Fleischhackl, Sabine; Mayer-Stickler, Monika; Herkner, Harald; Schreiber, Wolfgang; Fleischhackl, Roman

    2011-06-01

    The first-aid training necessary for obtaining a driver's license in Austria has a regulated and predefined curriculum but has been targeted for the implementation of a new course structure with less theoretical input, repetitive training in cardiopulmonary resuscitation (CPR) and structured presentations using innovative media. The standard and a new course design were compared in a prospective, participant- and observer-blinded, cluster-randomized controlled study. Six months after the initial training, we evaluated the confidence of the 66 participants in their skills, CPR effectiveness parameters and the correctness of their actions. The median self-confidence was significantly higher in the interventional group [IG; visual analogue scale (VAS): "0" = not confident at all, "100" = highly confident; median 57] than in the control group (CG; median VAS 41). The mean chest compression rate in the IG (98/min) was closer to the recommended 100/min than in the CG (110/min). The time to the first chest compression (IG: 25 s, CG: 36 s) and the time to the first defibrillator shock (IG: 86 s, CG: 92 s) were significantly shorter in the IG. Furthermore, the IG participants were safer in their handling of the defibrillator and more often initiated countermeasures against developing shock. The management of an unconscious person and of heavy bleeding did not differ between the two groups, even after shortening the lecture time. Motivation and self-confidence, as well as skill retention after six months, were shown to depend on the teaching methods and the time devoted to practical training. Courses may be reorganized and content rescheduled, even within predefined curricula, to improve course outcomes. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  14. Generalized index for spatial data sets as a measure of complete spatial randomness

    Science.gov (United States)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal-size bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of the variance between bin counts, is developed to indicate whether or not a spatial data set, generated from an exclusion or nonexclusion process, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal-sized bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state by comparing the values of the generalized index for different bin configurations: the values will be approximately the same if the data are at the CSR state, and will differ otherwise. In general, the generalized index is lower than its limiting value, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
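
    For the special case of equal, nonoverlapping bins, the classic quadrat-count version of such an index is easy to sketch. The code below is an illustration under that assumption, not the authors' generalized index: it computes the variance-to-mean ratio of bin counts, which is near 1 under CSR, below 1 for regular (exclusion) patterns, and above 1 for clustered ones.

    ```python
    import random

    def dispersion_index(points, bins):
        """Quadrat-count index of dispersion: variance of per-bin
        counts divided by their mean."""
        counts = [0] * len(bins)
        for x, y in points:
            for i, (x0, x1, y0, y1) in enumerate(bins):
                if x0 <= x < x1 and y0 <= y < y1:
                    counts[i] += 1
                    break
        m = sum(counts) / len(counts)
        var = sum((c - m) ** 2 for c in counts) / len(counts)
        return var / m

    # Uniform random points over the unit square approximate CSR.
    rng = random.Random(42)
    pts = [(rng.random(), rng.random()) for _ in range(2000)]
    # A 4 x 4 grid of equal, nonoverlapping bins.
    bins = [(i/4, (i+1)/4, j/4, (j+1)/4) for i in range(4) for j in range(4)]
    I = dispersion_index(pts, bins)
    ```

    For CSR data the counts are approximately Poisson, so `I` fluctuates around 1 regardless of density.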

  15. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    Science.gov (United States)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for comparison of the CRF-based methods. Furthermore, in order to find an effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  16. Quorum system and random based asynchronous rendezvous protocol for cognitive radio ad hoc networks

    Directory of Open Access Journals (Sweden)

    Sylwia Romaszko

    2013-12-01

    Full Text Available This paper proposes a rendezvous protocol for cognitive radio ad hoc networks, RAC2E-gQS, which utilizes (1) the asynchrony and randomness properties of the RAC2E protocol, and (2) a channel mapping protocol based on a grid Quorum System (gQS) that takes into account channel heterogeneity and asymmetric channel views. We show that combining the RAC2E protocol with grid-quorum based channel mapping yields a powerful RAC2E-gQS rendezvous protocol for asynchronous operation in a distributed environment, assuring rapid rendezvous between cognitive radio nodes under both symmetric and asymmetric channel views. We also propose an enhancement of the protocol, which uses a torus QS for slot allocation to deal with the worst-case scenario: a large number of channels with opposite ranking lists.
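
    The intersection guarantee that grid quorums provide can be illustrated briefly. In the sketch below (hypothetical, not the RAC2E-gQS implementation), each node's quorum is one row plus one column of an n x n slot frame, so any two quorums are guaranteed to share slots where rendezvous can occur, however the nodes' clocks drift.

    ```python
    def grid_quorum(n, row, col):
        """Quorum = every slot in one row plus every slot in one column
        of an n x n frame; any two such quorums must intersect."""
        return {row * n + c for c in range(n)} | {r * n + col for r in range(n)}

    # Two nodes pick different rows/columns yet still share slots.
    q1 = grid_quorum(4, row=0, col=1)
    q2 = grid_quorum(4, row=3, col=2)
    overlap = q1 & q2  # common slots where rendezvous can occur
    ```

    The guarantee follows because node A's row always crosses node B's column (and vice versa), giving at least two common slots per frame.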

  17. Markov random field and Gaussian mixture for segmented MRI-based partial volume correction in PET

    International Nuclear Information System (INIS)

    Bousse, Alexandre; Thomas, Benjamin A; Erlandsson, Kjell; Hutton, Brian F; Pedemonte, Stefano; Ourselin, Sébastien; Arridge, Simon

    2012-01-01

    In this paper we propose a segmented magnetic resonance imaging (MRI) prior-based maximum penalized likelihood deconvolution technique for positron emission tomography (PET) images. The model assumes the existence of activity classes that behave like a hidden Markov random field (MRF) driven by the segmented MRI. We utilize a mean-field approximation to compute the likelihood of the MRF. We tested our method on both simulated and clinical data (brain PET) and compared our results with PET images corrected with the re-blurred Van Cittert (VC) algorithm, the simplified Guven (SG) algorithm, and the region-based voxel-wise (RBV) technique. We demonstrated that our algorithm outperforms the VC algorithm, and outperforms the SG and RBV corrections when the segmented MRI is inconsistent with the PET image (e.g., mis-segmentation, lesions). (paper)

  18. Modified polarized geometrical attenuation model for bidirectional reflection distribution function based on random surface microfacet theory.

    Science.gov (United States)

    Liu, Hong; Zhu, Jingping; Wang, Kai

    2015-08-24

    The geometrical attenuation model given by Blinn has been widely used in geometrical-optics bidirectional reflectance distribution function (BRDF) models. Blinn's geometrical attenuation model, based on a symmetrical V-groove assumption and scalar ray theory, causes obvious inaccuracies in BRDF curves and neglects the effects of polarization. To address these issues, a modified polarized geometrical attenuation model based on random surface microfacet theory is presented by combining masking and shadowing effects with polarization effects. The p-polarized, s-polarized and unpolarized geometrical attenuation functions are given as separate expressions and are validated against experimental data from two samples. The results show that the modified polarized geometrical attenuation function achieves better physical rationality, improves the precision of the BRDF model, and widens its applicability to different polarizations.

  19. Estimators of the Relations of Equivalence, Tolerance and Preference Based on Pairwise Comparisons with Random Errors

    Directory of Open Access Journals (Sweden)

    Leszek Klukowski

    2012-01-01

    Full Text Available This paper presents a review of the author's results in the area of estimating the relations of equivalence, tolerance and preference within a finite set, based on multiple, stochastically independent pairwise comparisons with random errors, in binary and multivalent forms. These estimators require weaker assumptions than those used in the literature on the subject. Estimates of the relations are obtained from solutions to discrete optimization problems. They allow application of both types of comparisons, binary and multivalent (this fact relates to the tolerance and preference relations). The estimates can be verified in a statistical way; in particular, it is possible to verify the type of the relation. The estimates have been applied by the author to problems in forecasting, financial engineering and bio-cybernetics. (original abstract)

  20. Synthesis of Polystyrene-Based Random Copolymers with Balanced Number of Basic or Acidic Functional Groups

    DEFF Research Database (Denmark)

    Dimitrov, Ivaylo; Jankova Atanasova, Katja; Hvilsted, Søren

    2010-01-01

    Pairs of polystyrene-based random copolymers with a balanced number of pendant basic or acidic groups were synthesized utilizing the template strategy. The same poly[(4-hydroxystyrene)-ran-styrene] was used as a template backbone for modification. Two different synthetic approaches for the functionalization were applied. The first one involved direct functionalization of the template backbone through alkylation of the phenolic groups with suitable reagents. The second modification approach was based on "click" chemistry, where the introduction of alkyne groups onto the template backbone was followed by copper-catalyzed 1,3-cycloaddition of aliphatic sulfonate- or amine-containing azides. Both synthetic approaches proved to be highly efficient as evidenced by 1H NMR analyses. The thermal properties were evaluated by differential scanning calorimetry and thermal gravimetric analyses and were influenced...

  1. The effectiveness of xylitol in a school-based cluster-randomized clinical trial.

    Science.gov (United States)

    Lee, Wonik; Spiekerman, Charles; Heima, Masahiro; Eggertsson, Hafsteinn; Ferretti, Gerald; Milgrom, Peter; Nelson, Suchitra

    2015-01-01

    The purpose of this double-blind, cluster-randomized clinical trial was to examine the effects of xylitol gummy bear snacks on dental caries progression in primary and permanent teeth of inner-city school children. A total of 562 children aged 5-6 years were recruited from five elementary schools in East Cleveland, Ohio. Children were randomized by classroom to receive xylitol (7.8 g/day) or placebo (inulin fiber 20 g/day) gummy bears. Gummy bears were given three times per day for the 9-month kindergarten year within a supervised school environment. Children in both groups also received oral health education, a toothbrush and fluoridated toothpaste, topical fluoride varnish treatment and dental sealants. The numbers of new decayed, missing, and filled surfaces for primary teeth (dmfs) and permanent teeth (DMFS) from baseline to the middle of 2nd grade (exit exam) were compared between the treatment (xylitol/placebo) groups using an optimally-weighted permutation test for cluster-randomized data. The mean new d(3-6)mfs at the exit exam was 5.0 ± 7.6 and 4.0 ± 6.5 for the xylitol and placebo group, respectively. Similarly, the mean new D(3-6)MFS was 0.38 ± 0.88 and 0.48 ± 1.39 for the xylitol and placebo group, respectively. The adjusted mean difference between the two groups was not statistically significant: new d(3-6)mfs, mean 0.4 (95% CI -0.25, 0.8); new D(3-6)MFS, mean 0.16 (95% CI -0.16, 0.43). Xylitol consumption did not have additional benefit beyond other preventive measures. Caries progression in the permanent teeth of both groups was minimal, suggesting that other simultaneous prevention modalities may have masked the possible beneficial effects of xylitol in this trial. © 2014 S. Karger AG, Basel.

  2. Improving Adherence to Smoking Cessation Treatment: Smoking Outcomes in a Web-based Randomized Trial.

    Science.gov (United States)

    Graham, Amanda L; Papandonatos, George D; Cha, Sarah; Erar, Bahar; Amato, Michael S

    2018-03-15

    Partial adherence in Internet smoking cessation interventions presents treatment and evaluation challenges. Increasing adherence may improve outcomes. To present smoking outcomes from an Internet randomized trial of two strategies to encourage adherence to tobacco dependence treatment components: (i) a social network (SN) strategy to integrate smokers into an online community and (ii) free nicotine replacement therapy (NRT). In addition to intent-to-treat analyses, we used novel statistical methods to distinguish the impact of treatment assignment from treatment utilization. A total of 5,290 current smokers on a cessation website (WEB) were randomized to WEB, WEB + SN, WEB + NRT, or WEB + SN + NRT. The main outcome was 30-day point prevalence abstinence at 3 and 9 months post-randomization. Adherence measures included self-reported medication use (meds) and website metrics of skills training (sk) and community use (comm). Inverse Probability of Retention Weighting and Inverse Probability of Treatment Weighting jointly addressed dropout and treatment selection. Propensity weights were used to calculate Average Treatment effects on the Treated. Treatment assignment analyses showed no effects on abstinence for either adherence strategy. Abstinence rates were 25.7%-32.2% among participants who used all three treatment components (sk+comm+meds). Treatment utilization analyses revealed that among such participants, sk+comm+meds yielded large percentage-point increases in 3-month abstinence rates over sk alone across arms: WEB = 20.6 (95% CI = 10.8, 30.4), WEB + SN = 19.2 (95% CI = 11.1, 27.3), WEB + NRT = 13.1 (95% CI = 4.1, 22.0), and WEB + SN + NRT = 20.0 (95% CI = 12.2, 27.7). Novel propensity weighting approaches can serve as a model for establishing efficacy of Internet interventions and yield important insights about mechanisms. NCT01544153.
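
    The Average Treatment effect on the Treated via propensity weights can be sketched in a few lines. The helper below is illustrative only (toy data; the propensity scores are hypothetical and assumed already estimated): it reweights controls by e/(1-e) so their covariate mix mimics the treated group, then differences the means.

    ```python
    def att_propensity_weighted(y, t, e):
        """ATT = mean outcome of the treated minus the e/(1-e)-weighted
        mean outcome of the controls."""
        treated = [yi for yi, ti in zip(y, t) if ti == 1]
        w = [ei / (1 - ei) for ei, ti in zip(e, t) if ti == 0]
        yc = [yi for yi, ti in zip(y, t) if ti == 0]
        control_mean = sum(wi * yi for wi, yi in zip(w, yc)) / sum(w)
        return sum(treated) / len(treated) - control_mean

    # Toy data: binary outcomes, treatment flags, propensity scores.
    y = [1, 1, 0, 1, 0, 0, 1, 0]
    t = [1, 1, 1, 1, 0, 0, 0, 0]
    e = [0.8, 0.7, 0.6, 0.7, 0.4, 0.3, 0.5, 0.2]
    att = att_propensity_weighted(y, t, e)
    ```

    Controls with high propensity scores (those who "look like" the treated) receive large weights, which is the mechanism behind the trial's utilization analyses.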

  3. A pilot randomized controlled trial of mindfulness-based stress reduction for caregivers of family members with dementia.

    Science.gov (United States)

    Brown, Kirk Warren; Coogle, Constance L; Wegelin, Jacob

    2016-11-01

    The majority of care for those with Alzheimer's disease and other age-related dementias is provided in the home by family members. To date, there is no consistently effective intervention for reducing the significant stress burden of many family caregivers. The present pilot randomized controlled trial tested the efficacy of an adapted, eight-week mindfulness-based stress reduction (MBSR) program, relative to a near structurally equivalent, standard social support (SS) control condition, for reducing caregiver stress and enhancing the caregiver-recipient relationship. Thirty-eight family caregivers were randomized to MBSR or SS, with measures of diurnal salivary cortisol, perceived stress, mental health, experiential avoidance, caregiver burden, and relationship quality collected pre- and post-intervention and at three-month follow-up. MBSR participants reported significantly lower levels of perceived stress and mood disturbance at post-intervention relative to SS participants. At three-month follow-up, participants in both treatment conditions reported improvements on several psychosocial outcomes. At follow-up, there were no condition differences on these outcomes, nor did MBSR and SS participants differ in diurnal cortisol response change over the course of the study. Both MBSR and SS showed stress reduction effects, and MBSR showed no sustained neuroendocrine or psychosocial advantages over SS. The lack of treatment condition differences could be attributable to active ingredients in both interventions, and to population-specific and design factors.

  4. Fracture toughness measurements of WC-based hard metals

    International Nuclear Information System (INIS)

    Prakash, L.; Albert, B.

    1983-01-01

    The fracture toughness of WC-based cemented carbides was determined by different methods. The values obtained depend on the measurement procedure; hence the fracture toughness values of hard metals determined by different methods cannot be compared with one another. (orig.) [de]

  5. Multivariate Methods Based Soft Measurement for Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Shen Yin

    2014-01-01

    a decision. However, since the physicochemical indexes of wine can to some extent reflect the quality of the wine, a soft measure based on multivariate statistical methods can help the oenologist in wine evaluation.

  6. Dissemination of evidence-based cancer control interventions among Catholic faith-based organizations: results from the CRUZA randomized trial.

    Science.gov (United States)

    Allen, Jennifer D; Torres, Maria Idalí; Tom, Laura S; Leyva, Bryan; Galeas, Ana V; Ospino, Hosffman

    2016-05-18

    The CRUZA randomized trial tested the efficacy of an organizational-level intervention to increase the capacity of Catholic faith-based organizations (FBOs) serving Latinos to implement evidence-based strategies (EBS) for cancer control. Thirty-one Catholic parishes were enrolled. Twenty were randomized to a "capacity enhancement" (CE) intervention and 11 to a "standard dissemination" (SD) condition. Each received a Program Implementation Manual and Toolkit of materials culturally adapted for FBOs with Latino audiences for five types of EBS recommended by the US Preventive Services Community Guide. CE parishes were offered a menu of capacity-building activities over a 3-month period, while SD parishes were provided a one-time consultation by an Intervention Specialist. Baseline and follow-up surveys compared the number and types of EBS offered. At baseline, only one parish had offered any cancer-related program in the prior year, yet a third (36%) had offered some other type of health program or service. At post-intervention follow-up, all parishes offered a greater number of EBS. The only statistically significant difference between the CE and SD groups was the number of parishes offering small media interventions (90% in CE, 64% in SD; p < .05) [...] support to carry out programming. Further research is needed to examine the extent to which program offerings continued after the period of grant funding. Clinicaltrials.gov NCT01740219.

  7. Feasibility of scenario-based simulation training versus traditional workshops in continuing medical education: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Brendan Kerr

    2013-07-01

    Full Text Available Introduction: Although simulation-based training is increasingly used for medical education, its benefits in continuing medical education (CME) are less established. This study seeks to evaluate the feasibility of incorporating simulation-based training into a CME conference and compare its effectiveness with a traditional workshop in improving knowledge and self-reported confidence. Methods: Participants (N=27) were group randomized to either a simulation-based workshop or a traditional case-based workshop. Results: Post-training, knowledge assessment scores neither increased significantly in the traditional group (d=0.13; p=0.76) nor decreased significantly in the simulation group (d=-0.44; p=0.19). Self-reported comfort in patient assessment parameters increased in both groups (p<0.05 for all). However, only the simulation group reported an increase in comfort in patient management (d=1.1, p=0.051 for the traditional group and d=1.3, p=0.0003 for the simulation group). At 1 month, comfort measures in the traditional group increased consistently over time, while these measures in the simulation group increased post-workshop but decreased by 1 month, suggesting that some of the effects of training with simulation may be short-lived. Discussion: The use of simulation-based training was not associated with benefits in knowledge acquisition, knowledge retention, or comfort in patient assessment. It was associated with superior outcomes in comfort in patient management, but this benefit may be short-lived. Further studies are required to better define the conditions under which simulation-based training is beneficial.

  8. A randomized clinical trial in preterm infants on the effects of a home-based early intervention with the 'CareToy System'

    DEFF Research Database (Denmark)

    Sgandurra, Giuseppina; Lorentzen, Jakob; Inguaggiato, Emanuela

    2017-01-01

    The CareToy system is an innovative tele-rehabilitative tool, useful in providing intensive, individualized, home-based, family-centred Early Intervention (EI) in infants. Our aim was to evaluate, through a Randomized Clinical Trial (RCT), the effects of the CareToy intervention on early motor and visual development in preterm infants. 41 preterm infants (age range: 3.0-5.9 months of corrected age) were enrolled and randomized into two groups, CareToy and Standard Care. 19 infants randomized to the CareToy group performed a 4-week CareToy program, while 22 allocated to the control group completed 4 weeks of Standard Care. The Infant Motor Profile (IMP) was the primary outcome measure; the Alberta Infant Motor Scale (AIMS) and Teller Acuity Cards were secondary ones. Assessments were carried out at baseline (T0) and at the end of the CareToy training or Standard Care period (T1). T1 was the primary endpoint. After the RCT phase...

  9. Prediction of plant promoters based on hexamers and random triplet pair analysis

    Directory of Open Access Journals (Sweden)

    Noman Nasimul

    2011-06-01

    Full Text Available Abstract Background With an increasing number of plant genome sequences, it has become important to develop a robust computational method for detecting plant promoters. Although a wide variety of programs are currently available, the prediction accuracy of these still requires further improvement. The limitations of these methods can be addressed by selecting appropriate features for distinguishing promoters and non-promoters. Methods In this study, we proposed two feature selection approaches based on hexamer sequences: the Frequency Distribution Analyzed Feature Selection Algorithm (FDAFSA) and the Random Triplet Pair Feature Selecting Genetic Algorithm (RTPFSGA). In FDAFSA, adjacent triplet-pairs (hexamer sequences) were selected based on the difference in the frequency of hexamers between promoters and non-promoters. In RTPFSGA, random triplet-pairs (RTPs) were selected by exploiting a genetic algorithm that distinguishes frequencies of non-adjacent triplet pairs between promoters and non-promoters. Then, a support vector machine (SVM), a nonlinear machine-learning algorithm, was used to classify promoters and non-promoters by combining these two feature selection approaches. We referred to this novel algorithm as PromoBot. Results Promoter sequences were collected from the PlantProm database. Non-promoter sequences were collected from plant mRNA, rRNA, and tRNA of PlantGDB and plant miRNA of miRBase. Then, in order to validate the proposed algorithm, we applied a 5-fold cross validation test. Training data sets were used to select features based on FDAFSA and RTPFSGA, and these features were used to train the SVM. We achieved 89% sensitivity and 86% specificity. Conclusions We compared our PromoBot algorithm to five other algorithms. It was found that the sensitivity and specificity of PromoBot were as good as (or even better than) those of the algorithms tested. These results show that the two proposed feature selection methods based on hexamer frequencies are effective for distinguishing plant promoters from non-promoters.
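
    The hexamer features underlying both selection schemes can be sketched simply. The function below is a hypothetical illustration, not PromoBot itself: it turns a DNA sequence into the sliding-window hexamer frequency vector that a downstream SVM would consume.

    ```python
    from collections import Counter

    def hexamer_frequencies(seq):
        """Count every overlapping 6-mer in the sequence and normalize
        the counts into frequencies (they sum to 1)."""
        seq = seq.upper()
        total = len(seq) - 5  # number of sliding hexamer windows
        counts = Counter(seq[i:i + 6] for i in range(total))
        return {hexamer: c / total for hexamer, c in counts.items()}

    feats = hexamer_frequencies("TATAAATATAAA")
    ```

    Feature selection then amounts to keeping only the hexamers whose frequencies differ most between the promoter and non-promoter training sets.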

  10. Population-based versus practice-based recall for childhood immunizations: a randomized controlled comparative effectiveness trial.

    Science.gov (United States)

    Kempe, Allison; Saville, Alison; Dickinson, L Miriam; Eisert, Sheri; Reynolds, Joni; Herrero, Diana; Beaty, Brenda; Albright, Karen; Dibert, Eva; Koehler, Vicky; Lockhart, Steven; Calonge, Ned

    2013-06-01

    We compared the effectiveness and cost-effectiveness of population-based recall (Pop-recall) versus practice-based recall (PCP-recall) at increasing immunizations among preschool children. This cluster-randomized trial involved children aged 19 to 35 months needing immunizations in 8 rural and 6 urban Colorado counties. In Pop-recall counties, recall was conducted centrally using the Colorado Immunization Information System (CIIS). In PCP-recall counties, practices were invited to attend webinar training using CIIS and were offered financial support for mailings. The percentage up-to-date (UTD) and vaccine documentation were compared 6 months after recall. A mixed-effects model assessed the association between intervention and whether a child became UTD. Ten of 195 practices (5%) implemented recall in PCP-recall counties. Among children needing immunizations, 18.7% became UTD in Pop-recall versus 12.8% in PCP-recall counties (P < .05). Population-based recall was more effective than practice-based recall at increasing immunization rates in preschool children.

  11. Vitamin D measurement in the intensive care unit: methodology, clinical relevance and interpretation of a random value.

    Science.gov (United States)

    Krishnan, Anand; Venkatesh, Bala

    2013-08-01

    Vitamin D deficiency, as measured by a random level of 25-hydroxyvitamin D, is very prevalent in critically ill patients admitted to the ICU and is associated with adverse outcomes. Both 25(OH)vitamin D and 1α,25(OH)2D3 are difficult to analyse because of their lipophilic nature, affinity for VDBP and small concentrations. Also, the various tests used to estimate vitamin D levels show significant inter- and intra-assay variability, which significantly affects the veracity of the results obtained and confounds their interpretation. The two main types of assays are those that directly estimate vitamin D levels (HPLC, LC-MS/MS) and competitive binding assays (RIA, EIA). The former methods require skilled operators, with prolonged assay times and increased cost, whereas the latter are cheaper and easy to perform, but with decreased accuracy. The direct assays are not affected by lipophilic substances in plasma or heterophile antibodies, but may overestimate vitamin D levels by measuring the 3-epimers. These problems can be eliminated by adequate standardization of the test using SRMs provided by NIST, as well as by participating in proficiency schemes like DEQAS. It is therefore important to consider the test employed, as well as laboratory quality control, while interpreting vitamin D results. A single random measurement may not be reflective of the vitamin D status in ICU patients because of changes with fluid administration and intra-day variation in 25-hydroxyvitamin D levels. 1α,25(OH)2D3 may behave differently from 25-hydroxyvitamin D, both in plasma and at the tissue level, in inflammatory states. Measurement of tissue 1α,25(OH)2D3 levels may provide the true estimate of vitamin D activity.

  12. Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip

    Directory of Open Access Journals (Sweden)

    Jane Louie Fresco Zamora

    2015-01-01

    Full Text Available Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.

  13. Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip.

    Science.gov (United States)

    Zamora, Jane Louie Fresco; Kashihara, Shigeru; Yamaguchi, Suguru

    2015-01-01

    Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.
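
    The abstracts above do not spell out the heuristic itself, so the following is a minimal sketch of plain pairwise gossip averaging under assumed conditions: each smartphone holds a pressure-offset estimate, randomly chosen pairs of nodes move toward their mean, and fixed weather stations act as anchors whose ground-truth offset never drifts. All names and parameters are illustrative, not the paper's implementation.

```python
import random

def pairwise_gossip(offsets, anchors, rounds=1000, seed=0):
    """Pairwise gossip averaging of per-node calibration offsets.

    offsets: dict node -> current offset estimate (e.g. in hPa)
    anchors: set of nodes (fixed weather stations) whose offsets are
             ground truth and are never updated.
    Each round, two randomly chosen nodes exchange values and move to
    their mean; anchor nodes pull neighbours toward the reference
    without drifting themselves.
    """
    rng = random.Random(seed)
    nodes = list(offsets)
    est = dict(offsets)
    for _ in range(rounds):
        a, b = rng.sample(nodes, 2)
        mean = (est[a] + est[b]) / 2.0
        if a not in anchors:
            est[a] = mean
        if b not in anchors:
            est[b] = mean
    return est
```

    With a single station anchored at offset 0.0, biased smartphone readings are gradually pulled toward the ground truth, mirroring the calibration effect reported in the abstract.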

  14. Measuring Disorientation Based on the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Güyer, Tolga; Atasoy, Bilal; Somyürek, Sibel

    2015-01-01

    This study offers a new method to measure navigation disorientation in web-based systems, which are a powerful learning medium for distance and open education. The Needleman-Wunsch algorithm is used to measure disorientation in a more precise manner. The process combines theoretical and applied knowledge from two previously distinct research areas,…
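
    The Needleman-Wunsch algorithm itself is standard dynamic-programming global alignment. A minimal sketch (with illustrative scoring parameters, not the study's) that could score how far a learner's visited-page sequence strays from an ideal navigation path:

```python
def needleman_wunsch(seq_a, seq_b, match=1, mismatch=-1, gap=-1):
    """Global alignment score between two sequences (Needleman-Wunsch).

    For measuring disorientation, seq_a can be a learner's visited-page
    sequence and seq_b an ideal navigation path; a lower score means
    the two paths diverge more.
    """
    n, m = len(seq_a), len(seq_b)
    # dp[i][j] = best score aligning seq_a[:i] with seq_b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, m + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if seq_a[i - 1] == seq_b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # align both symbols
                           dp[i - 1][j] + gap,     # gap in seq_b
                           dp[i][j - 1] + gap)     # gap in seq_a
    return dp[n][m]
```

    For example, two identical three-page paths score 3 with these parameters, while each detour or skipped page lowers the score by the mismatch or gap penalty.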

  15. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<2 s) seismograms are strongly scattered by small-scale heterogeneities; in this study, we analyse long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves, or the scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
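
    The fitted power spectrum can be evaluated directly. A small sketch using the parameter estimates quoted in the abstract (ε = 0.05, a = 3.1 km); the spectrum rolls off around the corner wavenumber m ≈ 1/a, where it has dropped to a quarter of its flat low-wavenumber level:

```python
import math

def heterogeneity_spectrum(m, eps=0.05, a=3.1):
    """Power spectral density of the random velocity fluctuations,
    P(m) = 8*pi*eps^2*a^3 / (1 + a^2 m^2)^2,
    for an exponential-type autocorrelation.
    eps: RMS fractional fluctuation, a: correlation length (km),
    m: wavenumber (1/km). Parameter values follow the abstract."""
    return 8.0 * math.pi * eps**2 * a**3 / (1.0 + (a * m) ** 2) ** 2
```

    At m = 0 the spectrum is flat at 8πε²a³, and at the corner wavenumber m = 1/a it equals exactly one quarter of that value, which is the roll-off the study resolves.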

  16. Attrition and Adherence in a Web-Based Distress Management Program for Implantable Cardioverter Defibrillator Patients (WEBCARE): Randomized Controlled Trial

    DEFF Research Database (Denmark)

    Habibovic, M.; Cuijpers, P.; Alings, M.

    2014-01-01

    Background: WEB-Based Distress Management Program for Implantable CARdioverter defibrillator Patients (WEBCARE) is a Web-based randomized controlled trial, designed to improve psychological well-being in patients with an implantable cardioverter defibrillator (ICD). As in other Web-based trials, ...

  17. Rationale and design of a randomized controlled trial examining the effect of classroom-based physical activity on math achievement

    Directory of Open Access Journals (Sweden)

    Mona Have

    2016-04-01

    Full Text Available Abstract Background Integration of physical activity (PA) into the classroom may be an effective way of promoting the learning and academic achievement of children at elementary school. This paper describes the research design and methodology of an intervention study examining the effect of classroom-based PA on mathematical achievement, creativity, executive function, body mass index and aerobic fitness. Methods The study was designed as a school-based cluster-randomized controlled trial targeting schoolchildren in 1st grade, and was carried out between August 2012 and June 2013. Eligible schools in two municipalities in the Region of Southern Denmark were invited to participate in the study. After stratification by municipality, twelve schools were randomized to either an intervention group or a control group, comprising a total of 505 children with mean age 7.2 ± 0.3 years. The intervention was a 9-month classroom-based PA program that involved integration of PA into the math lessons delivered by the schools’ math teachers. The primary study outcome was change in math achievement, measured by a 45-minute standardized math test. Secondary outcomes were change in executive function (using a modified Eriksen flanker task and the Behavior Rating Inventory of Executive Function (BRIEF) questionnaire filled out by the parents), creativity (using the Torrance Tests of Creative Thinking, TTCT), aerobic fitness (by the Andersen intermittent shuttle-run test) and body mass index. PA during math lessons and total PA (including time spent outside school) were assessed using accelerometry. Math teachers used Short Message Service (SMS)-tracking to report on compliance with the PA intervention and on their motivation for implementing PA in math lessons. Parents used SMS-tracking to register their children’s PA behavior in leisure time. Discussion The results of this randomized controlled trial are expected to provide schools and policy-makers with significant new insights into the potential of classroom-based PA.

  18. Rationale and design of a randomized controlled trial examining the effect of classroom-based physical activity on math achievement.

    Science.gov (United States)

    Have, Mona; Nielsen, Jacob Have; Gejl, Anne Kær; Thomsen Ernst, Martin; Fredens, Kjeld; Støckel, Jan Toftegaard; Wedderkopp, Niels; Domazet, Sidsel Louise; Gudex, Claire; Grøntved, Anders; Kristensen, Peter Lund

    2016-04-11

    Integration of physical activity (PA) into the classroom may be an effective way of promoting the learning and academic achievement of children at elementary school. This paper describes the research design and methodology of an intervention study examining the effect of classroom-based PA on mathematical achievement, creativity, executive function, body mass index and aerobic fitness. The study was designed as a school-based cluster-randomized controlled trial targeting schoolchildren in 1st grade, and was carried out between August 2012 and June 2013. Eligible schools in two municipalities in the Region of Southern Denmark were invited to participate in the study. After stratification by municipality, twelve schools were randomized to either an intervention group or a control group, comprising a total of 505 children with mean age 7.2 ± 0.3 years. The intervention was a 9-month classroom-based PA program that involved integration of PA into the math lessons delivered by the schools' math teachers. The primary study outcome was change in math achievement, measured by a 45-minute standardized math test. Secondary outcomes were change in executive function (using a modified Eriksen flanker task and the Behavior Rating Inventory of Executive Function (BRIEF) questionnaire filled out by the parents), creativity (using the Torrance Tests of Creative Thinking, TTCT), aerobic fitness (by the Andersen intermittent shuttle-run test) and body mass index. PA during math lessons and total PA (including time spent outside school) were assessed using accelerometry. Math teachers used Short Message Service (SMS)-tracking to report on compliance with the PA intervention and on their motivation for implementing PA in math lessons. Parents used SMS-tracking to register their children's PA behavior in leisure time. The results of this randomized controlled trial are expected to provide schools and policy-makers with significant new insights into the potential of classroom-based PA.

  19. Employment-Based Abstinence Reinforcement as a Maintenance Intervention for the Treatment of Cocaine Dependence: A Randomized Controlled Trial

    Science.gov (United States)

    DeFulio, Anthony; Donlin, Wendy D.; Wong, Conrad J.; Silverman, Kenneth

    2009-01-01

    Context: Due to the chronic nature of cocaine dependence, long-term maintenance treatments may be required to sustain abstinence. Abstinence reinforcement is among the most effective means of initiating cocaine abstinence. Practical and effective means of maintaining abstinence reinforcement programs over time are needed. Objective: Determine whether employment-based abstinence reinforcement can be an effective long-term maintenance intervention for cocaine dependence. Design: Participants (N=128) were enrolled in a 6-month job skills training and abstinence initiation program. Participants who initiated abstinence, attended regularly, and developed needed job skills during the first six months were hired as operators in a data entry business and randomly assigned to an employment-only (Control, n = 24) or abstinence-contingent employment (n = 27) group. Setting: A nonprofit data entry business. Participants: Unemployed welfare recipients who persistently used cocaine while enrolled in methadone treatment in Baltimore. Intervention: Abstinence-contingent employment participants received one year of employment-based contingency management, in which access to employment was contingent on provision of drug-free urine samples under routine and then random drug testing. If a participant provided drug-positive urine or failed to provide a mandatory sample, then that participant received a temporary reduction in pay and could not work until urinalysis confirmed recent abstinence. Main Outcome Measure: Cocaine-negative urine samples at monthly assessments across one year of employment. Results: During the one year of employment, abstinence-contingent employment participants provided significantly more cocaine-negative urine samples than employment-only participants (79.3% and 50.7%, respectively; p = 0.004, OR = 3.73, 95% CI = 1.60 – 8.69). Conclusions: Employment-based abstinence reinforcement that includes random drug testing is effective as a long-term maintenance intervention.

  20. Effect of an Internet-Based Program on Weight Loss for Low-Income Postpartum Women: A Randomized Clinical Trial.

    Science.gov (United States)

    Phelan, Suzanne; Hagobian, Todd; Brannen, Anna; Hatley, Karen E; Schaffner, Andrew; Muñoz-Christian, Karen; Tate, Deborah F

    2017-06-20

    Postpartum weight retention increases lifetime risk of obesity and related morbidity. Few effective interventions exist for multicultural, low-income women. To test whether an internet-based weight loss program in addition to the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC program) for low-income postpartum women could produce greater weight loss than the WIC program alone over 12 months. A 12-month, cluster randomized, assessor-blind, clinical trial enrolling 371 adult postpartum women at 12 clinics in WIC programs from the California central coast between July 2011 and May 2015 with data collection completed in May 2016. Clinics were randomized to the WIC program (standard care group) or the WIC program plus a 12-month primarily internet-based weight loss program (intervention group), including a website with weekly lessons, web diary, instructional videos, computerized feedback, text messages, and monthly face-to-face groups at the WIC clinics. The primary outcome was weight change over 12 months, based on measurements at baseline, 6 months, and 12 months. Secondary outcomes included proportion returning to preconception weight and changes in physical activity and diet. Participants included 371 women (mean age, 28.1 years; Hispanic, 81.6%; mean weight above prepregnancy weight, 7.8 kg; mean months post partum, 5.2 months) randomized to the intervention group (n = 174) or standard care group (n = 197); 89.2% of participants completed the study. The intervention group produced greater mean 12-month weight loss compared with the standard care group (3.2 kg in the intervention group vs 0.9 kg in the standard care group). Among low-income postpartum women, an internet-based weight loss program in addition to the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC program) compared with the WIC program alone resulted in a statistically significant greater weight loss over 12 months. Further research is needed.

  1. A Randomized Controlled Trial of COMPASS Web-Based and Face-to-Face Teacher Coaching in Autism

    Science.gov (United States)

    Ruble, Lisa A.; McGrew, John H.; Toland, Michael D.; Dalrymple, Nancy J.; Jung, Lee Ann

    2013-01-01

    Objective Most children with autism rely on schools as their primary source of intervention, yet research has suggested that teachers rarely use evidence-based practices. To address the need for improved educational outcomes, a previously tested consultation intervention called the Collaborative Model for Promoting Competence and Success (COMPASS; Ruble, Dalrymple, & McGrew, 2010; Ruble, Dalrymple, & McGrew, 2012) was evaluated in a 2nd randomized controlled trial, with the addition of a web-based group. Method Forty-nine teacher–child dyads were randomized into 1 of 3 groups: (1) a placebo control (PBO) group, (2) COMPASS followed by face-to-face (FF) coaching sessions, and (3) COMPASS followed by web-based (WEB) coaching sessions. Three individualized goals (social, communication, and independence skills) were selected for intervention for each child. The primary outcome of independent ratings of child goal attainment and several process measures (e.g., consultant and teacher fidelity) were evaluated. Results Using an intent-to-treat approach, findings replicated earlier results with a very large effect size (d = 1.41) for the FF group and a large effect size (d = 1.12) for the WEB group relative to the PBO group. There were no differences in overall change across goal domains between the FF and WEB groups, suggesting the efficacy of videoconferencing technology. Conclusions COMPASS is effective and results in improved educational outcomes for young children with autism. Videoconferencing technology, as a scalable tool, has promise for facilitating access to autism specialists and bridging the research-to-practice gap. PMID:23438314

  2. Reducing procrastination using a smartphone-based treatment program: A randomized controlled pilot study

    Directory of Open Access Journals (Sweden)

    Christian Aljoscha Lukas

    2018-06-01

    Full Text Available Background: Procrastination affects a large number of individuals and is associated with significant mental health problems. Despite the deleterious consequences individuals afflicted with procrastination have to bear, there is a surprising paucity of well-researched treatments for procrastination. To fill this gap, this study evaluated the efficacy of an easy-to-use smartphone-based treatment for procrastination. Method: N = 31 individuals with heightened procrastination scores were randomly assigned to a blended smartphone-based intervention including two brief group counseling sessions and 14 days of training with the mindtastic procrastination app (MT-PRO), or to a waitlist condition. MT-PRO fosters the approach of functional and the avoidance of dysfunctional behavior by systematically utilizing techniques derived from cognitive bias modification approaches, gamification principles, and operant conditioning. Primary outcome was the course of procrastination symptom severity as assessed with the General Procrastination Questionnaire. Results: Participating in the smartphone-based treatment was associated with a significantly greater reduction of procrastination than was participating in the control condition (η² = .15). Conclusion: A smartphone-based intervention may be an effective treatment for procrastination. Future research should use larger samples and directly compare the efficacy of smartphone-based interventions and traditional interventions for procrastination. Keywords: Procrastination, Intervention, Treatment, Smartphone, Mobile health

  3. Recruiting and retaining family caregivers to a randomized controlled trial on mindfulness-based stress reduction.

    Science.gov (United States)

    Whitebird, Robin R; Kreitzer, Mary Jo; Lewis, Beth A; Hanson, Leah R; Crain, A Lauren; Enstad, Chris J; Mehta, Adele

    2011-09-01

    Caregivers for a family member with dementia experience chronic long-term stress that may benefit from new complementary therapies such as mindfulness-based stress reduction. Little is known however, about the challenges of recruiting and retaining family caregivers to research on mind-body based complementary therapies. Our pilot study is the first of its kind to successfully recruit caregivers for a family member with dementia to a randomized controlled pilot study of mindfulness-based stress reduction. The study used an array of recruitment strategies and techniques that were tailored to fit the unique features of our recruitment sources and employed retention strategies that placed high value on establishing early and ongoing communication with potential participants. Innovative recruitment methods including conducting outreach to health plan members and generating press coverage were combined with standard methods of community outreach and paid advertising. We were successful in exceeding our recruitment goal and retained 92% of the study participants at post-intervention (2 months) and 90% at 6 months. Recruitment and retention for family caregiver interventions employing mind-body based complementary therapies can be successful despite many challenges. Barriers include cultural perceptions about the use and benefit of complementary therapies, cultural differences with how the role of family caregiver is perceived, the use of group-based designs requiring significant time commitment by participants, and travel and respite care needs for busy family caregivers. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    Science.gov (United States)

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computation complexity.
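
    The abstract states that the MRF's graphical topology is determined by residue proximity in the protein structure. A minimal sketch of building such a topology from residue coordinates, with an assumed 8 Å cutoff on illustrative C-alpha positions (the paper's exact distance criterion is not given here):

```python
import itertools
import math

def contact_graph(coords, cutoff=8.0):
    """Build a structure-based MRF topology: nodes are residues, and an
    edge connects two residues whose coordinates (e.g. C-alpha atoms)
    lie within `cutoff` angstroms of each other. Both the cutoff and
    the coordinate choice are illustrative assumptions."""
    edges = set()
    for i, j in itertools.combinations(range(len(coords)), 2):
        if math.dist(coords[i], coords[j]) <= cutoff:
            edges.add((i, j))
    return edges
```

    The resulting edge set would define which pairs of positions share a pairwise (edge) potential in the MRF, while the per-residue node weights carry the positional coevolution signal the paper emphasizes.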

  5. Specialized home treatment versus hospital-based outpatient treatment for first-episode psychosis: a randomized clinical trial.

    Science.gov (United States)

    Dewa, Carolyn S; Zipursky, Robert B; Chau, Nancy; Furimsky, Ivana; Collins, April; Agid, Ofer; Goering, Paula

    2009-11-01

    This pilot study compared the effectiveness of specialized care that was home based versus hospital based for individuals experiencing their first psychotic episode. A randomized controlled trial design was used. A total of 29 subjects were interviewed at baseline, 3 and 9 months. Repeated measures analysis of variance was employed to test for statistically significant changes over time within and between groups with regard to community psychosocial functioning and symptom severity. Our findings indicate that subjects in both the home-based and hospital-based programmes significantly improved with regard to symptoms and community functioning over time. However, the rates of change over time were not significantly different between the two programmes. There was a statistically significant difference between programmes with regard to the proportion of subjects with less than two visits (i.e. either did not attend their first assessment or attended follow-up visits after their assessment). This was a modest pilot study and the sample was too small to allow definitive conclusions to be drawn. However, the results raise questions about differences in initial treatment engagement. They suggest the need for additional research focusing on interventions that promote initial treatment seeking. © 2009 The Authors. Journal compilation © 2009 Blackwell Publishing Asia Pty Ltd.

  6. A web-based lifestyle intervention for women with recent gestational diabetes mellitus: a randomized controlled trial.

    Science.gov (United States)

    Nicklas, Jacinda M; Zera, Chloe A; England, Lucinda J; Rosner, Bernard A; Horton, Edward; Levkoff, Sue E; Seely, Ellen W

    2014-09-01

    To test the feasibility and effectiveness of a Web-based lifestyle intervention based on the Diabetes Prevention Program modified for women with recent gestational diabetes mellitus to reduce postpartum weight retention. We randomly allocated 75 women with recent gestational diabetes mellitus to either a Web-based lifestyle program (Balance after Baby) delivered over the first postpartum year or to a control group. Primary outcomes were change in body weight at 12 months from 1) first postpartum measured weight; and 2) self-reported prepregnancy weight. There were no significant differences in baseline characteristics between groups including age, body mass index, race, and income status. Women assigned to the Balance after Baby program (n=36, three lost to follow-up) lost a mean of 2.8 kg (95% confidence interval -4.8 to -0.7) from 6 weeks to 12 months postpartum, whereas the control group (n=39, one lost to follow-up) gained a mean of 0.5 kg (-1.4 to +2.4) (P=.022). Women in the intervention were closer to prepregnancy weight at 12 months postpartum (mean change -0.7 kg; -3.5 to +2.2) compared with women in the control arm (+4.0 kg; +1.3 to +6.8) (P=.035). A Web-based lifestyle modification program for women with recent gestational diabetes mellitus decreased postpartum weight retention. ClinicalTrials.gov, www.clinicaltrials.gov, NCT01158131. I.

  7. Exposure and non-fear emotions: A randomized controlled study of exposure-based and rescripting-based imagery in PTSD treatment.

    Science.gov (United States)

    Langkaas, Tomas Formo; Hoffart, Asle; Øktedalen, Tuva; Ulvenes, Pål G; Hembree, Elizabeth A; Smucker, Mervin

    2017-10-01

    Interventions involving rescripting-based imagery have been proposed as a better approach than exposure-based imagery when posttraumatic stress disorder (PTSD) is associated with emotions other than fear. Prior research led to the study's hypotheses that (a) higher pretreatment non-fear emotions would predict relatively better response to rescripting as compared to exposure, (b) rescripting would be associated with greater reduction in non-fear emotions, and (c) pretreatment non-fear emotions would predict poor response to exposure. A clinically representative sample of 65 patients presenting a wide range of traumas was recruited from patients seeking and being offered PTSD treatment in an inpatient setting. Subjects were randomly assigned to 10 weeks of treatment involving either rescripting-based imagery (Imagery Rescripting; IR) or exposure-based imagery (Prolonged Exposure; PE). Patients were assessed on outcome and emotion measures at pretreatment, posttreatment and 12 months follow-up. Comparison to control benchmarks indicated that both treatments were effective, but no outcome differences between them appeared. None of the initial hypotheses were supported. The results from this study challenge previous observations and hypotheses about exposure mainly being effective for fear-based PTSD and strengthen the notion that exposure-based treatment is a generally effective treatment for all types of PTSD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. The Optimal Wavelengths for Light Absorption Spectroscopy Measurements Based on Genetic Algorithm-Particle Swarm Optimization

    Science.gov (United States)

    Tang, Ge; Wei, Biao; Wu, Decao; Feng, Peng; Liu, Juan; Tang, Yuan; Xiong, Shuangfei; Zhang, Zheng

    2018-03-01

    To select the optimal wavelengths in light extinction spectroscopy measurements, genetic algorithm-particle swarm optimization (GAPSO), based on the genetic algorithm (GA) and particle swarm optimization (PSO), is adopted. The change in the optimal wavelength positions under different feature-size and distribution parameters is evaluated. Moreover, the Monte Carlo method based on random probability is used to identify the number of optimal wavelengths, and good inversion results for the particle size distribution are obtained. The method proved to have the advantage of resisting noise. In order to verify the feasibility of the algorithm, spectra with bands ranging from 200 to 1000 nm are computed. Based on this, the measured data of standard particles are used to verify the algorithm.
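
    The abstract does not give the details of how GA and PSO are hybridized, so as a point of reference here is a minimal sketch of the plain PSO component minimizing an arbitrary fitness function. In the paper's setting the fitness would score particle-size-inversion error at candidate wavelengths; all parameters below are common illustrative defaults, not the study's values.

```python
import random

def pso(fitness, dim, n_particles=20, iters=200, seed=0,
        w=0.7, c1=1.5, c2=1.5, lo=-10.0, hi=10.0):
    """Minimal particle swarm optimization (the PSO half of GAPSO).
    Returns (best position, best fitness) found after `iters` rounds."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                # per-particle best positions
    pbest_f = [fitness(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_f[i])][:]
    g_f = min(pbest_f)                         # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < g_f:
                    g, g_f = pos[i][:], f
    return g, g_f
```

    A GA layer on top of this would typically apply crossover and mutation to the swarm between PSO updates to escape local optima; the exact coupling used in GAPSO is not specified in the abstract.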

  9. Telephone based cognitive behavioral therapy targeting major depression among urban dwelling, low income people living with HIV/AIDS: results of a randomized controlled trial.

    Science.gov (United States)

    Himelhoch, Seth; Medoff, Deborah; Maxfield, Jennifer; Dihmes, Sarah; Dixon, Lisa; Robinson, Charles; Potts, Wendy; Mohr, David C

    2013-10-01

    This pilot randomized controlled trial evaluated a previously developed manualized telephone based cognitive behavioral therapy (T-CBT) intervention compared to face-to-face (f2f) therapy among low-income, urban dwelling HIV infected depressed individuals. The primary outcome was the reduction of depressive symptoms as measured by the Hamilton rating scale for depression. The secondary outcome was adherence to HAART as measured by random telephone based pill counts. Outcome measures were collected by trained research assistants masked to treatment allocation. Analysis was based on intention-to-treat. Thirty-four participants met eligibility criteria and were randomly assigned to receive T-CBT (n = 16) or f2f (n = 18). There was no statistically significant difference in depression treatment outcomes comparing f2f to T-CBT. Within group evaluation demonstrated that both the T-CBT and the f2f psychotherapy groups resulted in significant reductions in depressive symptoms. Those who received the T-CBT were significantly more likely to maintain their adherence to antiretroviral medication compared to the f2f treatment. None of the participants discontinued treatment due to adverse events. T-CBT can be delivered to low-income, urban dwelling HIV infected depressed individuals resulting in significant reductions in depression symptoms and improved adherence to antiretroviral medication. ClinicalTrials.gov identifier: NCT01055158.

  10. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis

    DEFF Research Database (Denmark)

    Dobson, F; Hinman, R S; Hall, M

    2012-01-01

    OBJECTIVES: To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). METHODS: Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two...... investigating measurement properties of performance measures, including responsiveness and interpretability in people with hip and/or knee OA, is needed. Consensus on which combination of measures will best assess physical function in people with hip/and or knee OA is urgently required....

  11. Embedded Platform for Automatic Testing and Optimizing of FPGA Based Cryptographic True Random Number Generators

    Directory of Open Access Journals (Sweden)

    M. Varchola

    2009-12-01

    Full Text Available This paper deals with an evaluation platform for cryptographic True Random Number Generators (TRNGs) based on the hardware implementation of statistical tests for FPGAs. It was developed in order to provide an automatic tool that helps to speed up the TRNG design process and can provide new insights on the TRNG behavior, as will be shown on a particular example in the paper. It enables the statistical properties of various TRNG designs to be tested under various working conditions on the fly. Moreover, the tests are suitable to be embedded into cryptographic hardware products in order to recognize TRNG output of weak quality and thus increase robustness and reliability. The tests are fully compatible with the FIPS 140 standard and are implemented in the VHDL language as an IP core for vendor-independent FPGAs. A recent Flash-based Actel Fusion FPGA was chosen for preliminary experiments. The Actel version of the tests possesses an interface to the Actel CoreMP7 softcore processor, which is fully compatible with the industry-standard ARM7TDMI. Moreover, an identical test suite was implemented on the Xilinx Virtex 2 and 5 in order to compare the performance of the proposed solution with that of an already published one based on the same FPGAs. Clock frequencies 25% and 65% greater, respectively, were achieved while consuming almost equal resources of the Xilinx FPGAs. On top of that, the proposed FIPS 140 architecture is capable of processing one random bit per clock cycle, which results in a throughput of 311.5 Mbps for the Virtex 5 FPGA.
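
    The FIPS 140 suite referenced above comprises four statistical tests on a 20,000-bit sample: monobit, poker, runs, and long run. As a software illustration of what the hardware IP core checks, the monobit test with the FIPS 140-2 acceptance interval can be written as:

```python
def fips140_monobit(bits):
    """FIPS 140-2 monobit test on a 20,000-bit sample: the number of
    ones must lie strictly between 9725 and 10275 for the sample to
    pass. `bits` is a sequence of 0/1 integers."""
    if len(bits) != 20000:
        raise ValueError("FIPS 140 tests run on 20,000-bit samples")
    ones = sum(bits)
    return 9725 < ones < 10275
```

    A perfectly balanced stream passes, while a constant stream fails immediately; the hardware version makes the same decision one bit per clock cycle.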

  12. Deviation rectification for dynamic measurement of rail wear based on coordinate sets projection

    International Nuclear Information System (INIS)

    Wang, Chao; Ma, Ziji; Li, Yanfu; Liu, Hongli; Zeng, Jiuzhen; Jin, Tan

    2017-01-01

    Dynamic measurement of rail wear using a laser imaging system suffers from random vibrations in the laser-based imaging sensor which cause distorted rail profiles. In this paper, a simple and effective method for rectifying profile deviation is presented to address this issue. There are two main steps: profile recognition and distortion calibration. According to the constant camera and projector parameters, efficient recognition of measured profiles is achieved by analyzing the geometric difference between normal profiles and distorted ones. For a distorted profile, by constructing coordinate sets projecting from it to the standard one on triple projecting primitives, including the rail head inner line, rail waist curve and rail jaw, iterative extrinsic camera parameter self-compensation is implemented. The distortion is calibrated by projecting the distorted profile onto the x – y plane of a measuring coordinate frame, which is parallel to the rail cross section, to eliminate the influence of random vibrations in the laser-based imaging sensor. As well as evaluating the implementation with comprehensive experiments, we also compare our method with other published works. The results exhibit the effectiveness and superiority of our method for the dynamic measurement of rail wear. (paper)

  13. Deviation rectification for dynamic measurement of rail wear based on coordinate sets projection

    Science.gov (United States)

    Wang, Chao; Ma, Ziji; Li, Yanfu; Zeng, Jiuzhen; Jin, Tan; Liu, Hongli

    2017-10-01

    Dynamic measurement of rail wear using a laser imaging system suffers from random vibrations in the laser-based imaging sensor which cause distorted rail profiles. In this paper, a simple and effective method for rectifying profile deviation is presented to address this issue. There are two main steps: profile recognition and distortion calibration. According to the constant camera and projector parameters, efficient recognition of measured profiles is achieved by analyzing the geometric difference between normal profiles and distorted ones. For a distorted profile, by constructing coordinate sets projecting from it to the standard one on triple projecting primitives, including the rail head inner line, rail waist curve and rail jaw, iterative extrinsic camera parameter self-compensation is implemented. The distortion is calibrated by projecting the distorted profile onto the x-y plane of a measuring coordinate frame, which is parallel to the rail cross section, to eliminate the influence of random vibrations in the laser-based imaging sensor. As well as evaluating the implementation with comprehensive experiments, we also compare our method with other published works. The results exhibit the effectiveness and superiority of our method for the dynamic measurement of rail wear.
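    The self-compensation step above aligns a distorted profile to the standard one through corresponding point sets. The paper's projection primitives (rail head inner line, rail waist curve, rail jaw) are specific to its setup, but the underlying point-set alignment can be illustrated with a standard least-squares rigid transform (the Kabsch algorithm); the profile points below are hypothetical stand-ins.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t mapping point set P
    onto Q (rows are corresponding 2-D points), via the Kabsch algorithm."""
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = qc - R @ pc
    return R, t

# Hypothetical "standard" profile points, and a rotated/translated copy
# standing in for a vibration-distorted measurement.
rng = np.random.default_rng(0)
standard = rng.uniform(-50, 50, size=(40, 2))
theta = np.deg2rad(3.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
distorted = standard @ R_true.T + np.array([1.5, -0.8])

R, t = rigid_align(distorted, standard)
restored = distorted @ R.T + t
print(np.allclose(restored, standard))  # True
```

    With noisy real data the fit is least-squares rather than exact, which is why the paper iterates the extrinsic-parameter compensation.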

  14. Promoting first relationships: randomized trial of a relationship-based intervention for toddlers in child welfare.

    Science.gov (United States)

    Spieker, Susan J; Oxford, Monica L; Kelly, Jean F; Nelson, Elizabeth M; Fleming, Charles B

    2012-11-01

    We conducted a community-based, randomized controlled trial with intent-to-treat analyses of Promoting First Relationships (PFR) to improve parenting and toddler outcomes for toddlers in state dependency. Toddlers (10-24 months; N = 210) with a recent placement disruption were randomized to 10-week PFR or a comparison condition. Community agency providers were trained to use PFR in the intervention for caregivers. From baseline to postintervention, observational ratings of caregiver sensitivity improved more in the PFR condition than in the comparison condition, with an effect size for the difference in adjusted means postintervention of d = .41. Caregiver understanding of toddlers' social emotional needs and caregiver reports of child competence also differed by intervention condition postintervention (d = .36 and d = .42), with caregivers in the PFR condition reporting more understanding of toddlers and child competence. Models of PFR effects on within-individual change were significant for caregiver sensitivity and understanding of toddlers. At the 6-month follow-up, only 61% of original sample dyads were still intact and there were no significant differences on caregiver or child outcomes.
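    The d values in this record are standardized mean differences. The abstract does not report the underlying means or standard deviations, so the group summaries below are hypothetical, but the usual Cohen's d computation with a pooled standard deviation looks like this:

```python
import math

def cohens_d(m1, m2, sd1, sd2, n1, n2):
    # Standardized mean difference with a pooled standard deviation.
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Hypothetical group summaries (not taken from the trial itself):
# intervention mean 3.9 (SD 1.0, n 105) vs comparison mean 3.5 (SD 0.95, n 105)
print(round(cohens_d(3.9, 3.5, 1.0, 0.95, 105, 105), 2))  # → 0.41
```

    By common convention, d around 0.4 is a small-to-medium effect, which matches how such parenting-intervention results are usually characterized.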

  15. Recombinant streptokinase vs phenylephrine-based suppositories in acute hemorrhoids, randomized, controlled trial (THERESA-3)

    Science.gov (United States)

    Hernández-Bernal, Francisco; Castellanos-Sierra, Georgina; Valenzuela-Silva, Carmen M; Catasús-Álvarez, Karem M; Valle-Cabrera, Roselin; Aguilera-Barreto, Ana; López-Saura, Pedro A

    2014-01-01

    AIM: To compare the efficacy and safety of recombinant streptokinase (rSK) and phenylephrine-based suppositories in acute hemorrhoidal disease. METHODS: A multicenter (14 sites), randomized (1:1), open, parallel-group, active-controlled trial was done. After inclusion, subjects with acute symptoms of hemorrhoids, who gave their written, informed consent to participate, were centrally randomized to receive, as outpatients, rSK (200000 IU) or 0.25% phenylephrine suppositories, which had different organoleptic characteristics. Treatment was administered by the rectal route, one unit every 6 h during 48 h for rSK, and up to a maximum of 5 d (20 suppositories) for phenylephrine. Evaluations were performed at 3, 5 and 10 d post-inclusion. The main end-point was the 5th-day complete clinical response (disappearance of pain and edema, and ≥ 70% reduction of the lesion size). Time to response and need for thrombectomy were secondary efficacy variables. Adverse events were also evaluated. RESULTS: 5th-day complete response rates were 83/110 (75.5%) and 36/110 (32.7%) with rSK and phenylephrine suppositories, respectively. This 42.7% difference (95%CI: 30.5-54.2) was highly significant. CONCLUSION: rSK suppositories showed a significant advantage over a widely used over-the-counter phenylephrine preparation for the treatment of acute hemorrhoidal illness, with an adequate safety profile. PMID:24587636

  16. Recombinant streptokinase vs phenylephrine-based suppositories in acute hemorrhoids, randomized, controlled trial (THERESA-3).

    Science.gov (United States)

    Hernández-Bernal, Francisco; Castellanos-Sierra, Georgina; Valenzuela-Silva, Carmen M; Catasús-Álvarez, Karem M; Valle-Cabrera, Roselin; Aguilera-Barreto, Ana; López-Saura, Pedro A

    2014-02-14

    To compare the efficacy and safety of recombinant streptokinase (rSK) and phenylephrine-based suppositories in acute hemorrhoidal disease. A multicenter (14 sites), randomized (1:1), open, parallel-group, active-controlled trial was done. After inclusion, subjects with acute symptoms of hemorrhoids, who gave their written, informed consent to participate, were centrally randomized to receive, as outpatients, rSK (200000 IU) or 0.25% phenylephrine suppositories, which had different organoleptic characteristics. Treatment was administered by the rectal route, one unit every 6 h during 48 h for rSK, and up to a maximum of 5 d (20 suppositories) for phenylephrine. Evaluations were performed at 3, 5 and 10 d post-inclusion. The main end-point was the 5th-day complete clinical response (disappearance of pain and edema, and ≥ 70% reduction of the lesion size). Time to response and need for thrombectomy were secondary efficacy variables. Adverse events were also evaluated. 5th-day complete response rates were 83/110 (75.5%) and 36/110 (32.7%) with rSK and phenylephrine suppositories, respectively. This 42.7% difference (95%CI: 30.5-54.2) was highly significant. rSK suppositories showed a significant advantage over a widely used over-the-counter phenylephrine preparation for the treatment of acute hemorrhoidal illness, with an adequate safety profile.
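    The headline 42.7% figure in both records follows directly from the raw counts. A minimal sketch of the risk difference with a Wald-style 95% CI (the trial's exact CI method is not stated, so these bounds differ slightly from the reported 30.5-54.2):

```python
import math

def risk_difference_ci(e1, n1, e2, n2, z=1.96):
    """Risk difference between two event proportions, with a Wald 95% CI."""
    p1, p2 = e1 / n1, e2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# 5th-day complete response: 83/110 with rSK vs 36/110 with phenylephrine.
diff, lo, hi = risk_difference_ci(83, 110, 36, 110)
print(round(diff * 100, 1))  # → 42.7
```

    The Wald interval here comes out near 31-55 percentage points, consistent with the published interval computed by the trial's own (unspecified) method.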

  17. Randomized controlled trial of a computer-based module to improve contraceptive method choice.

    Science.gov (United States)

    Garbers, Samantha; Meserve, Allison; Kottke, Melissa; Hatcher, Robert; Ventura, Alicia; Chiasson, Mary Ann

    2012-10-01

    Unintended pregnancy is common in the United States, and interventions are needed to improve contraceptive use among women at higher risk of unintended pregnancy, including Latinas and women with low educational attainment. A three-arm randomized controlled trial was conducted at two family planning sites serving low-income, predominantly Latina populations. The trial tested the efficacy of a computer-based contraceptive assessment module in increasing the proportion of patients choosing an effective method of contraception (defined by the method's typical-use failure rate among women per year). Participants were randomized to complete the module and receive tailored health materials, to complete the module and receive generic health materials, or to a control condition. In intent-to-treat analyses adjusted for recruitment site (n=2231), family planning patients who used the module were significantly more likely to choose an effective contraceptive method: 75% among those who received tailored materials [odds ratio (OR)=1.56; 95% confidence interval (CI): 1.23-1.98] and 78% among those who received generic materials (OR=1.74; 95% CI: 1.35-2.25), compared to 65% among control arm participants. The findings support prior research suggesting that patient-centered interventions can positively influence contraceptive method choice. Copyright © 2012 Elsevier Inc. All rights reserved.
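    The ORs in this record are adjusted for recruitment site, so a crude calculation from the raw percentages will not reproduce them exactly, but the relationship between a proportion and an odds ratio can be sketched as follows (unadjusted, for illustration only):

```python
def odds_ratio(p1, p2):
    # Crude odds ratio comparing proportion p1 against reference proportion p2.
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Crude ORs from the reported rates (75% and 78% vs 65% control); the
# paper's ORs of 1.56 and 1.74 are site-adjusted, hence the difference.
print(round(odds_ratio(0.75, 0.65), 2))  # → 1.62
print(round(odds_ratio(0.78, 0.65), 2))  # → 1.91
```

    That the crude and adjusted ORs are close suggests recruitment site was only a modest confounder in this trial.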

  18. Continuous energy Monte Carlo calculations for randomly distributed spherical fuels based on statistical geometry model

    Energy Technology Data Exchange (ETDEWEB)

    Murata, Isao [Osaka Univ., Suita (Japan); Mori, Takamasa; Nakagawa, Masayuki; Itakura, Hirofumi

    1996-03-01

    A method to calculate the neutronics parameters of a core composed of randomly distributed spherical fuels has been developed, based on a statistical geometry model with a continuous energy Monte Carlo method. This method was implemented in the general-purpose Monte Carlo code MCNP, and a new code, MCNP-CFP, has been developed. This paper describes the model and method, how to use them, and the validation results. In the Monte Carlo calculation, the location of a spherical fuel is sampled probabilistically along the particle flight path from the spatial probability distribution of spherical fuels, called the nearest neighbor distribution (NND). This sampling method was validated through the following two comparisons: (1) Calculations of inventory of coated fuel particles (CFPs) in a fuel compact by both track length estimator and direct eva