WorldWideScience

Sample records for randomly selected final

  1. The reliability of randomly selected final year pharmacy students in ...

    African Journals Online (AJOL)

    Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...

  2. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

    Neurons in sensory cortex show stimulus selectivity and sparse population responses, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question of whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus-selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and a sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges from the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the underlying mechanism, evaluate the effects of model parameters on population sparseness and stimulus selectivity, and investigate the network response to mixtures of stimuli. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast the selectivity and sparseness generated by the balanced network with those of randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  3. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for the various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to the treatment versus the referent group. However, by chance an unequal number of individuals may be assigned to each arm of the study, decreasing the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Even so, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
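
    A minimal sketch of the scheme this abstract describes — permuted-block assignment where each block's size is drawn at random from a small set — assuming two arms, 1:1 allocation, and illustrative block sizes of 4, 6 and 8 (the function name and parameters are not from the paper):

```python
import random

def blocked_randomization(n_participants, block_sizes=(4, 6, 8), arms=("T", "C"), seed=42):
    """Permuted-block randomization with randomly selected block sizes.

    Each block's size is drawn at random from `block_sizes` (all multiples of
    the number of arms), then filled with an equal number of assignments per
    arm and shuffled, so allocation stays balanced while block boundaries
    remain unpredictable to the investigator.
    """
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        size = rng.choice(block_sizes)
        block = list(arms) * (size // len(arms))  # equal counts per arm within the block
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

print(blocked_randomization(20))
```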

  4. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

    Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis of the signal-to-interference-plus-noise ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters, including the number of beams and the maximum transmission power, on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.

  5. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...

  6. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...

  7. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs appear at random locations around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US for data transmission. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US nearest to the BS is chosen. In the Optimal Selection Protocol (OPT), the US with the highest channel gain to the BS is served. For performance evaluation, we derive exact and asymptotic closed-form expressions for the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte-Carlo simulations are performed to verify the theoretical results.
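
    The three protocols lend themselves to a quick Monte-Carlo comparison. A hedged sketch follows; the path-loss exponent, distance range, and outage threshold are assumptions for illustration, not the paper's system parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_trials, beta = 5, 20_000, 3.0   # beta: assumed path-loss exponent
snr_threshold = 1.0                         # assumed outage threshold

outage = {"RAN": 0, "MIND": 0, "OPT": 0}
for _ in range(n_trials):
    d = rng.uniform(0.1, 1.0, n_users)               # users placed at random distances
    gain = rng.exponential(1.0, n_users) * d**-beta  # Rayleigh fading -> exponential power gain
    chosen = {"RAN": rng.integers(n_users),          # random user
              "MIND": int(np.argmin(d)),             # nearest user
              "OPT": int(np.argmax(gain))}           # best instantaneous channel
    for proto, idx in chosen.items():
        outage[proto] += gain[idx] < snr_threshold

for proto, count in outage.items():
    print(f"{proto}: outage probability ~ {count / n_trials:.3f}")
```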

  8. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

    In this paper we suggest a method for selecting primitive polynomials of a special type. Such polynomials can be used efficiently as characteristic polynomials for linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to obtain the primitive ones. Finally, two primitive polynomials found by the proposed method were used in the pseudo-random number generator based on fuzzy logic (FRNG) suggested earlier by the authors. The sequences generated by the new version of FRNG have low correlation magnitude, high linear complexity, lower power consumption, are more balanced, and have better statistical properties.

  9. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  10. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group- or individual-level variation. Pooling assumes that both observations and their errors are independent and that resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, has not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions.
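
    In the spirit of point 5's simulation approach, a small sketch (all numbers illustrative, not from the paper) of why pooling can mislead when selection varies among individuals: the pooled slope summarizes, but hides, the between-animal spread that random coefficients are meant to capture:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_animals, n_obs = 10, 200
slopes = rng.normal(0.5, 1.5, n_animals)   # assumed individual-specific selection coefficients

X_all, y_all, per_animal = [], [], []
for b in slopes:
    x = rng.normal(size=(n_obs, 1))            # one habitat covariate per observation
    p = 1 / (1 + np.exp(-b * x[:, 0]))         # individual-specific selection probability
    y = rng.binomial(1, p)                     # used (1) vs available (0)
    per_animal.append(LogisticRegression().fit(x, y).coef_[0, 0])
    X_all.append(x)
    y_all.append(y)

pooled = LogisticRegression().fit(np.vstack(X_all), np.concatenate(y_all)).coef_[0, 0]
print("pooled slope:              ", round(pooled, 2))
print("mean of per-animal slopes: ", round(float(np.mean(per_animal)), 2))
print("sd of per-animal slopes:   ", round(float(np.std(per_animal)), 2))
```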

  11. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.

    2012-01-01

    Probabilistic Roadmap Methods (PRMs) are one of the most widely used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k, K'), which first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent in selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that, for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than those of the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
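
    A minimal sketch of the LocalRand(k, K') policy, assuming a Euclidean distance metric on toy 2-D configurations (a real PRM implementation would use a planner-specific metric and a local planner for the connection step):

```python
import numpy as np

def local_rand_neighbors(nodes, query_idx, k, K, rng=None):
    """LocalRand(k, K'): choose k candidates uniformly at random among the
    K' nodes closest to the query node (Euclidean metric for illustration)."""
    rng = rng or np.random.default_rng()
    d = np.linalg.norm(nodes - nodes[query_idx], axis=1)
    d[query_idx] = np.inf                 # exclude the query node itself
    closest = np.argsort(d)[:K]           # indices of the K' closest nodes
    return rng.choice(closest, size=k, replace=False)

nodes = np.random.default_rng(0).random((100, 2))  # toy 2-D configurations
print(local_rand_neighbors(nodes, query_idx=0, k=5, K=15))
```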

  12. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

    Probabilistic Roadmap Methods (PRMs) are one of the most widely used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k, K'), which first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent in selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that, for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than those of the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.

  13. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed methods to a real-world data set.

  14. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

    The prediction accuracy of short-term load forecasting (STLF) depends on the choice of prediction model and on the feature selection result. In this paper, a novel random forest (RF)-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI) value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of the RF trained on the optimal forecasting feature subset was higher than that of the original model and of comparative models based on support vector regression and artificial neural networks.
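
    A compact sketch of the pipeline on synthetic stand-in data (the paper's 243 load/time features and its improved sequential backward search are replaced by toy equivalents; scikit-learn's permutation_importance supplies the PI values):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Toy stand-in for the extracted load/time features (synthetic, assumed).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = 2 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2] + rng.normal(scale=0.1, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# PI value of each feature: score drop when that feature is permuted.
pi = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
ranked = np.argsort(pi.importances_mean)  # least important first

# Simplified backward search: drop the least important features while the
# test error does not degrade (the paper's search is more elaborate).
keep = list(range(X.shape[1]))
best_err = np.mean((rf.predict(X_te) - y_te) ** 2)
for f in ranked:
    trial = [i for i in keep if i != f]
    m = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr[:, trial], y_tr)
    err = np.mean((m.predict(X_te[:, trial]) - y_te) ** 2)
    if err <= best_err:
        keep, best_err = trial, err
print("selected features:", keep)
```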

  15. The Finnish final disposal programme proceeds to the site selection

    International Nuclear Information System (INIS)

    Seppaelae, T.

    1999-01-01

    Research for the selection of the final disposal site has been carried out since the beginning of the 1980s. Field studies were started in 1987; in recent years, the studied sites have included Olkiluoto in Eurajoki, Haestholmen in Loviisa, Romuvaara in Kuhmo and Kivetty in Aeaenekoski. Based on 40 years of operation of four power plant units, the estimate for the accumulation of spent fuel to be disposed of in Finland is 2,600 tU. A 'Decision in Principle' from the Finnish government is needed to select the final disposal site; Posiva submitted the application for this policy decision in May 1999. The intended site of the facility is Olkiluoto, which produces most of the spent fuel in Finland: disposal there would minimise the need for transports. In a poll among the inhabitants of Eurajoki, 60 per cent approved of the final disposal facility. After a positive decision of the government, Posiva will construct an underground research facility at Olkiluoto. The construction of the final disposal facility will take place in the 2010s; the facility should be operational in 2020. (orig.)

  16. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
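
    The rule reduces to drawing one random number per axis for each sample point on the grid. A sketch with illustrative grid dimensions (not the regulation's exact procedure or numbering scheme):

```python
import random

def random_grid_points(n_points, grid_x, grid_y, seed=None):
    """Pick sampling cells on a two-dimensional square grid by drawing, for
    each sample, one random number per axis — the general scheme described
    in 40 CFR 761.308. Grid dimensions here are illustrative only."""
    rng = random.Random(seed)
    return [(rng.randrange(grid_x), rng.randrange(grid_y)) for _ in range(n_points)]

print(random_grid_points(5, grid_x=10, grid_y=10, seed=1))
```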

  17. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.

  18. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    ... In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary ...

  19. The signature of positive selection at randomly chosen loci.

    OpenAIRE

    Przeworski, Molly

    2002-01-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequ...

  20. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

    The paper presents a simulation study on the performance of a target tracker using a selective track splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track splitting filter all the observations which fall inside a likelihood ellipse are used for update; however, in our proposed selective track splitting filter a smaller number of observations is used for track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering specific scenarios, which were ... performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario...

  1. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses the extrema of these quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usual in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
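
    A minimal sketch of the mutation operator alone — fit a parabola to the cost at three points along a random line and move to its extremum — applied to a toy cost function; the paper's crossover operator and stopping criterion are omitted:

```python
import numpy as np

def quadratic_line_mutation(f, x, rng, step=1.0):
    """One mutation of the random-line method: fit a parabola to f at three
    points along a random direction through x and return its extremum,
    falling back to the best sampled point if the parabola is degenerate."""
    d = rng.normal(size=x.size)
    d /= np.linalg.norm(d)                        # random unit direction
    ts = np.array([-step, 0.0, step])
    ys = np.array([f(x + t * d) for t in ts])
    a, b, _ = np.polyfit(ts, ys, 2)               # y ~ a*t^2 + b*t + c
    if abs(a) < 1e-12:                            # nearly degenerate quadratic
        return x + ts[np.argmin(ys)] * d
    t_star = -b / (2 * a)                         # extremum of the parabola
    return x + np.clip(t_star, -10 * step, 10 * step) * d

rng = np.random.default_rng(0)
f = lambda v: float(np.sum((v - 3.0) ** 2))       # toy cost function
x = rng.normal(size=5)
for _ in range(50):
    cand = quadratic_line_mutation(f, x, rng)
    if f(cand) < f(x):                            # greedy acceptance for the sketch
        x = cand
print(round(f(x), 6))
```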

  2. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Background: Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have 'hidden' advantages if the 'common good' produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results: Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of 'selfish' alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions: The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  3. Selection bias and subject refusal in a cluster-randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rochelle Yang

    2017-07-01

    Background: Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods: The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site's allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal-variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results: There were 2749 completed screening forms returned to research staff, with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1

  4. 76 FR 2336 - Dynamic Random Access Memory Semiconductors From the Republic of Korea: Final Results of...

    Science.gov (United States)

    2011-01-13

    ... Semiconductors From the Republic of Korea: Final Results of Countervailing Duty Administrative Review AGENCY... administrative review of the countervailing duty order on dynamic random access memory semiconductors from the... to a change in the net subsidy rate. The final net subsidy rate for Hynix Semiconductor, Inc. is...

  5. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim

    2012-01-01

    ... users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a ...

  6. Site selection - siting of the final repository for spent nuclear fuel

    International Nuclear Information System (INIS)

    2011-03-01

    SKB has selected Forsmark as the site for the final repository for spent nuclear fuel. The site selection is the end result of an extensive siting process that began in the early 1990s. The strategy and plan for the work was based on experience from investigations and development work over a period of more than ten years prior to then. This document describes the siting work and SKB's choice of site for the final repository. It also presents the information on which the choice was based and the reasons for the decisions made along the way. The document comprises Appendix PV to applications under the Nuclear Activities Act and the Environmental Code for licences to build and operate an encapsulation plant adjacent to the central interim storage facility for spent nuclear fuel in Oskarshamn, and to build and operate a final repository for spent nuclear fuel in Forsmark in Oesthammar Municipality

  7. Site selection - siting of the final repository for spent nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    2011-03-15

    SKB has selected Forsmark as the site for the final repository for spent nuclear fuel. The site selection is the end result of an extensive siting process that began in the early 1990s. The strategy and plan for the work was based on experience from investigations and development work over a period of more than ten years prior to then. This document describes the siting work and SKB's choice of site for the final repository. It also presents the information on which the choice was based and the reasons for the decisions made along the way. The document comprises Appendix PV to applications under the Nuclear Activities Act and the Environmental Code for licences to build and operate an encapsulation plant adjacent to the central interim storage facility for spent nuclear fuel in Oskarshamn, and to build and operate a final repository for spent nuclear fuel in Forsmark in Oesthammar Municipality

  8. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right

  9. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  10. TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD

    Directory of Open Access Journals (Sweden)

    A. Shamsoddini

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple-linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  11. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    Science.gov (United States)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple-linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  12. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex

    Science.gov (United States)

    Lindsay, Grace W.

    2017-01-01

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (
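
    One simple rate-based Hebbian update applied to random feedforward weights, as a sketch of the general mechanism only (the paper evaluates two specific Hebbian rules against recorded PFC data; the rule, the ReLU nonlinearity, the multiplicative normalization, and the crude selectivity proxy below are all assumptions of this sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, lr = 50, 20, 0.01

W0 = rng.normal(scale=1 / np.sqrt(n_in), size=(n_out, n_in))  # random feedforward weights
stimuli = rng.normal(size=(200, n_in))                         # input activity patterns

W = W0.copy()
for x in stimuli:
    r = np.maximum(W @ x, 0.0)   # rectified postsynaptic rates
    W += lr * np.outer(r, x)     # Hebbian: strengthen co-active pre/post pairs
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # normalization bounds the weights

def response_spread(weights):
    """Crude selectivity proxy: variance of each unit's response across stimuli."""
    responses = np.maximum(stimuli @ weights.T, 0.0)
    return float(responses.var(axis=0).mean())

print("before learning:", round(response_spread(W0), 3))
print("after learning: ", round(response_spread(W), 3))
```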

  13. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
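
    A sketch of the selection step, assuming items are numbered 1..N within the stratum and using Python's standard sampler rather than the specific random-number procedure described in the report:

```python
import random

def select_items(n_items_in_stratum, n_samples, seed=None):
    """Draw n1 distinct item numbers from the N items of a stratum using
    uniform pseudo-random numbers (illustrative, not the report's exact
    generator or verification-method assignment)."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, n_items_in_stratum + 1), n_samples))

print(select_items(n_items_in_stratum=200, n_samples=12, seed=7))
```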

  14. The mathematics of random mutation and natural selection for multiple simultaneous selection pressures and the evolution of antimicrobial drug resistance.

    Science.gov (United States)

    Kleinman, Alan

    2016-12-20

    The random mutation and natural selection phenomena act in a mathematically predictable way, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures when treating infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve against multiple selection pressures simultaneously and thereby invokes the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the equations describing this treatment failure are derived. These equations give guidance on how to use combination therapy for the treatment of cancers and infectious diseases and prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
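
    The multiplication rule the abstract invokes can be made concrete with a worked example; the per-replication probabilities below are illustrative assumptions, not the paper's estimates:

```python
# Worked example of the multiplication rule (illustrative numbers only):
# if a resistance mutation for each drug arises with probability ~1e-9 per
# replication, a variant simultaneously resistant to two independently acting
# drugs arises with probability ~1e-9 * 1e-9 = 1e-18 per replication.
p_drug_a = 1e-9
p_drug_b = 1e-9
p_both = p_drug_a * p_drug_b
print(f"single-drug resistance:  {p_drug_a:.0e} per replication")
print(f"simultaneous resistance: {p_both:.0e} per replication")
```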

  15. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that, at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a

  16. The signature of positive selection at randomly chosen loci.

    Science.gov (United States)

    Przeworski, Molly

    2002-03-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.

  17. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy-preserving classification that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of evaporative cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, the reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code

  18. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
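
    A simplified sketch of the idea: among random orthonormal beams, discard those whose interference at the primary receiver exceeds the limit, then pick the feasible beam with the highest SINR. The per-beam power optimization and the different feedback levels from the paper are omitted, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_beams, n_antennas = 4, 8
i_max, noise = 0.1, 1.0   # assumed interference limit and noise power

# Random orthonormal beams (columns), e.g. via QR of a Gaussian matrix.
beams, _ = np.linalg.qr(rng.normal(size=(n_antennas, n_beams)))
h_sec = rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas)  # secondary channel
h_pri = rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas)  # primary channel

sig = np.abs(h_sec.conj() @ beams) ** 2    # per-beam secondary signal power
intf = np.abs(h_pri.conj() @ beams) ** 2   # per-beam interference at primary receiver

feasible = intf <= i_max                   # beams meeting the interference constraint
if feasible.any():
    sinr = sig / noise
    best = np.flatnonzero(feasible)[np.argmax(sinr[feasible])]
    print("selected beam:", int(best), "SINR:", round(float(sinr[best]), 2))
else:
    print("no beam satisfies the interference constraint; back off or stay silent")
```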

  19. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology-selective stochastic jamming and examine their impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time-selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  20. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    Science.gov (United States)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. A water solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  1. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    In this paper, we present a systematic study reporting several deep insights into HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor, termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.

  2. Course Syllabi and Their Effects on Students' Final Grade Performance.

    Science.gov (United States)

    Serafin, Ana Gil

    This study examined the relationship between the changes introduced in a course syllabus for a course titled "Instructional Strategies" and the final grades obtained by freshman and sophomore students in three successive academic periods. A sample of 150 subjects was randomly selected from students enrolled in the course at the…

  3. Applied Math & Science Levels Utilized in Selected Trade & Industrial Vocational Education. Final Report.

    Science.gov (United States)

    Gray, James R.

    Research identified and evaluated the level of applied mathematics and science used in selected trade and industrial (T&I) subjects taught in the Kentucky Vocational Education System. The random sample was composed of 52 programs: 21 carpentry, 20 electricity/electronics, and 11 machine shop. The 96 math content items that were identified as…

  4. The procedure of alternative site selection within the report of the study group on the radioactive waste final repository selection process (AKEnd)

    International Nuclear Information System (INIS)

    Nies, A.

    2005-01-01

    The study group on the selection procedure for radioactive waste final repository sites presented its report in December 2002. The author discusses the consequences of this report with respect to site selection, focussing on two topics: the search for the best possible site and the avoidance of prejudgements.

  5. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can thus be implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large-scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performance of channel-aware algorithms.
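
    A sketch of the channel-aware greedy variant, using the A-optimality proxy trace((Hs^T Hs)^-1) as a stand-in for the paper's error measures; the blind variant would substitute the closed-form asymptotic approximations for this exact cost:

```python
import numpy as np

def greedy_select(H, k, reg=1e-6):
    """Greedy channel-aware selection: pick k of the n rows of H so that the
    least-squares error proxy trace((Hs^T Hs + reg*I)^-1) stays small.
    (An illustrative cost; not the paper's exact error measures.)"""
    n, m = H.shape
    selected = []
    for _ in range(k):
        best, best_cost = None, np.inf
        for i in set(range(n)) - set(selected):
            Hs = H[selected + [i]]
            cost = np.trace(np.linalg.inv(Hs.T @ Hs + reg * np.eye(m)))
            if cost < best_cost:
                best, best_cost = i, cost
        selected.append(best)
    return selected

rng = np.random.default_rng(0)
H = rng.normal(size=(30, 5))   # 30 candidate sensor observations, 5 parameters
print(greedy_select(H, k=8))
```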

  6. Final disposal of spent nuclear fuel - basis for site selection

    International Nuclear Information System (INIS)

    Anttila, P.

    1995-05-01

    International organizations, e.g. the IAEA, have published several recommendations and guides for the safe disposal of radioactive waste. Three major groups of issues affect the site selection process: geological, environmental and socioeconomic. The first step of the site selection process is an inventory of potential host rock formations. After that, potential study areas are screened to identify sites for detailed investigations, according to geological conditions and overall suitability for safe disposal. This kind of stepwise site selection procedure has been used in Finland and in Sweden, and a similar approach has been proposed in Canada. In accordance with the amendment to the Nuclear Energy Act that entered into force at the beginning of 1995, Imatran Voima Oy has to make preparations for the final disposal of spent fuel in the Finnish bedrock. For the possible site selection, the following geological factors, as internationally recommended and used in the Nordic countries, should be taken into account: topography, stability of bedrock, brokenness and fracturing of bedrock, size of bedrock blocks, rock type, predictability and natural resources. The bedrock of the Loviisa NPP site is part of the Vyborg rapakivi massif. As a whole the rapakivi granite area forms a potential target area, although other rock types or areas cannot be excluded from possible site selection studies. (25 refs., 7 figs.)

  7. The procedure of alternative site selection within the report of the study group on the radioactive waste final repository selection process (AKEnd)

    International Nuclear Information System (INIS)

    Brenner, M.

    2005-01-01

    The paper discusses the results of the report of the study group on the radioactive waste final repository selection process with respect to the alternative site selection procedure. Key points of the report are long-term safety, the consideration of alternative sites and the concept of a single repository. The critique of this report focuses on the topics of site selection and licensing procedures, civil participation, the time factor and the question of cost.

  8. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This
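
    The study performed the subsampling step in Microsoft Excel; a hedged Python equivalent is sketched below, assuming a CSV export of mapped homes with id/lat/lon columns (the file layout and names are assumptions, not the study's actual files):

```python
import csv
import random

def sample_households(infile, outfile, n, seed=2012):
    """Randomly subsample mapped homes exported from Google Earth/ArcMap and
    write GPS-ready waypoints. Expects columns: id, lat, lon (assumed)."""
    with open(infile, newline="") as f:
        homes = list(csv.DictReader(f))
    chosen = random.Random(seed).sample(homes, n)   # n distinct random homes
    with open(outfile, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "lat", "lon"])
        writer.writeheader()
        writer.writerows(chosen)

# Hypothetical usage, mirroring the study's 537 mapped homes -> 96 targets:
# sample_households("mapped_homes.csv", "survey_waypoints.csv", n=96)
```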

  9. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Background: A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods: The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results: A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions: The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only

  10. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple but efficient stochastic algorithm for finding optimal cuts, developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and the associated code, have recently been enhanced with the introduction of two new cut types, one of which has been used successfully in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics: one explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson, and the other optimizes SUSY searches using boosted objects and the razor variables.
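    The key idea of RGS, as described here, is that each signal event's own variable values define one candidate set of cuts, so the "grid" is sampled where the signal actually lives rather than scanned uniformly. A hedged Python sketch of that idea on toy data (two Gaussian variables and a simple s/sqrt(s+b) figure of merit; none of this is the published implementation):

        import random

        random.seed(1)
        # Toy events: (x1, x2) are two discriminating variables.
        signal     = [(random.gauss(2, 1), random.gauss(2, 1)) for _ in range(1000)]
        background = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(5000)]

        def figure_of_merit(cut, sig, bkg):
            """Count events passing x1 > c1 and x2 > c2; score with s / sqrt(s + b)."""
            c1, c2 = cut
            s = sum(1 for x1, x2 in sig if x1 > c1 and x2 > c2)
            b = sum(1 for x1, x2 in bkg if x1 > c1 and x2 > c2)
            return s / (s + b) ** 0.5 if s + b else 0.0

        # Random grid: each sampled signal event is itself a candidate cut point.
        candidates = random.sample(signal, 200)
        best = max(candidates, key=lambda c: figure_of_merit(c, signal, background))
        print("best cut:", best, "score:", figure_of_merit(best, signal, background))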

  11. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby ΔF is restricted. For schemes with ΔF restricted to 0.25% per

  12. Site-selection studies for final disposal of spent fuel in Finland

    International Nuclear Information System (INIS)

    Vuorela, P.; Aeikaes, T.

    1984-02-01

    In the waste management of the Industrial Power Company Ltd. (TVO), preparations are being made for the final disposal of unprocessed spent fuel into the Finnish bedrock. The site selection program will advance in three phases. In accordance with a decision laid down by the Finnish Government, the final disposal site must be selected by the end of the year 2000 at the latest. In the first phase, 1983-85, the main objective is to find homogeneous, stable bedrock blocks surrounded by fracture zones located at a safe distance from the planned disposal area. The work usually starts with a regional structural analysis of mosaics of Landsat-1 winter and summer imagery. Next an assortment of different maps, which cover the whole country, is used. Technical methods for geological and hydrogeological site investigations are being developed during the very first phase of the studies, and a borehole 1000 meters deep will be drilled in southwestern Finland. Studies for the final disposal of spent fuel or high-level reprocessing waste have been conducted in Finland since 1974. General suitability studies of the bedrock have been going on since 1977. The present results indicate that suitable investigation areas for the final disposal of highly active waste can be found in Finland.

  13. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

    Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results obtained with four randomly selected glucometers on participants with diabetes and control subjects, against a standard wet chemistry (hexokinase) method, in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive correlation (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy requirements set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, the measurements of the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.
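    For context, the ISO 15197:2013 accuracy criterion referred to here is commonly summarized as requiring at least 95% of meter readings to fall within ±15 mg/dl of the reference below 100 mg/dl, and within ±15% at or above it (these thresholds are stated as background, not quoted from the paper). A minimal Python check of that rule:

        def within_iso_2013(meter, reference):
            """True if one reading meets the commonly cited ISO 15197:2013 bound (mg/dl)."""
            if reference < 100:
                return abs(meter - reference) <= 15
            return abs(meter - reference) <= 0.15 * reference

        def passes_iso_2013(pairs):
            """pairs: (meter, reference) readings; require at least 95% within bounds."""
            ok = sum(within_iso_2013(m, r) for m, r in pairs)
            return ok / len(pairs) >= 0.95

        # Hypothetical (meter, hexokinase reference) readings, all in mg/dl.
        readings = [(98, 104), (210, 190), (55, 61), (160, 150)]
        print(passes_iso_2013(readings))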

  14. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Full Text Available Abstract Background Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results We used small numbers of randomly selected single nucleotide polymorphisms (SNPs) from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
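    The classifier setup described, naïve Bayes over genotypes at a handful of random SNPs, is easy to reproduce in outline. A sketch using scikit-learn's CategoricalNB with genotypes coded 0/1/2 (the data below are random placeholders, so accuracy will sit near chance; the reported 95% requires real HapMap genotypes and labels):

        import numpy as np
        from sklearn.naive_bayes import CategoricalNB

        rng = np.random.default_rng(0)
        n_people, n_snps = 300, 50                       # ~50 random SNPs, as in the study
        X = rng.integers(0, 3, size=(n_people, n_snps))  # genotypes coded 0, 1, 2
        y = rng.integers(0, 4, size=n_people)            # toy continent-of-origin labels

        clf = CategoricalNB().fit(X[:200], y[:200])      # train on part of the sample
        print("held-out accuracy:", clf.score(X[200:], y[200:]))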

  15. Salt Repository emplacement mode evaluation and selection: Final report

    International Nuclear Information System (INIS)

    1988-03-01

    This document describes the decision analysis performed to evaluate and compare emplacement modes for the Salt Repository. The study was commissioned to recommend one emplacement mode to the Salt Repository Project Office using multi-attribute decision analysis. The nature of the decision required analysis of uncertain outcomes and conflicting attributes; multi-attribute decision analysis offers a high degree of objectivity for such decisions, since the decision model is structured to allow only the facts to enter into the final decision. The analysis requires an explicit definition of the attributes used to evaluate the alternatives (e.g., cost, safety, environmental impact), a utility function over the attributes that incorporates both risk attitudes and trade-offs between attributes, and the probability distributions over the outcomes that would result from selecting one alternative over another. The decision process is described and results are given. A simulation model was developed to evaluate the probability distributions over the attributes. This report documents the logic, inputs, and results of this model. The final ranking of alternatives is given. Extensive technical backup documentation is included in the appendices to provide the quantitative basis for this decision. 5 refs., 2 figs., 8 tabs

  16. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach the investment goal, one has to select a combination of securities among different portfolios containing a large number of securities. The past records of each security alone do not guarantee the future return. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors about the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO), a paradigm for designing metaheuristic algorithms for combinatorial optimization problems, is used for solving the portfolio selection problem. Data from the BSE is used for illustration.
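    The risk measure named here, semi-absolute deviation, penalizes only downside deviations of the portfolio return. In its standard crisp form (the paper's fuzzy-random, λ-weighted version builds on this; the formula is stated as background, not quoted from the paper) it can be written, in LaTeX notation, as

        w(x) = \frac{1}{T}\sum_{t=1}^{T}\left|\min\left(0,\; \sum_{j=1}^{n}\bigl(r_{jt}-\bar{r}_j\bigr)\,x_j\right)\right|

    where x_j is the weight of security j, r_{jt} its return in period t, and \bar{r}_j its mean return. Because |min(0, z)| linearizes with one auxiliary variable per period, minimizing w(x) is a linear program, which is why such models solve faster than variance-based quadratic programs.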

  17. Pediatric selective mutism therapy: a randomized controlled trial.

    Science.gov (United States)

    Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco

    2017-10-01

    Selective mutism (SM) is a rare disease in children coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of a six-month standard psychomotor treatment and the positive changes in lifestyle in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EuDract 2015-001161-36). University third level Centre (Child and Adolescent Neuropsychiatry Clinic). The study population comprised 67 children in group A (psychomotricity treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist, and all the therapists shared the same protocol. The standard psychomotor session length is 45 minutes. At T0 and after 6 months (T1) of treatment, patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores, such as in social relations, anxious/depressed, social problems, and total problems. These findings suggest a benefit of psychomotor treatment in selective mutism, even if further studies are needed. The present study identifies psychomotricity as a safe and efficacious therapy for pediatric selective mutism.

  18. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
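    The two concepts the activity targets are easy to demonstrate in code: random selection chooses who enters the experiment, random assignment splits those chosen between arms. A toy Python sketch (names and counts invented):

        import random

        random.seed(7)
        population = [f"roach_{i}" for i in range(120)]

        # Random selection: which individuals enter the experiment at all.
        chosen = random.sample(population, 20)

        # Random assignment: which arm each selected individual lands in.
        random.shuffle(chosen)
        treatment, control = chosen[:10], chosen[10:]
        print("treatment:", treatment)
        print("control:  ", control)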

  19. 78 FR 12955 - Final Requirements, Definitions, and Selection Criteria-Native American Career and Technical...

    Science.gov (United States)

    2013-02-26

    ... career and technical education programs (20 U.S.C. 2326(e)). This notice does not preclude us from... DEPARTMENT OF EDUCATION 34 CFR Chapter IV [Docket ID ED-2012-OVAE-0053] Final Requirements, Definitions, and Selection Criteria--Native American Career and Technical Education Program (NACTEP) [Catalog...

  20. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.
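    The screening quantity here, the O vacancy formation energy, follows the standard supercell definition (stated as background, not quoted from the paper); for a neutral vacancy, in LaTeX notation,

        E_f(V_O) = E_{tot}[\mathrm{defect}] - E_{tot}[\mathrm{bulk}] + \mu_O

    so lowering the O chemical potential \mu_O (e.g., via the choice of scavenger metal) lowers the vacancy formation energy, consistent with vacancies dominating in the O-poor limit; charged vacancies add a term proportional to q E_F (plus finite-size corrections), which is where the Fermi energy enters.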

  1. Random a-adic groups and random net fractals

    Energy Technology Data Exchange (ETDEWEB)

    Li Yin [Department of Mathematics, Nanjing University, Nanjing 210093 (China)], E-mail: Lyjerry7788@hotmail.com; Su Weiyi [Department of Mathematics, Nanjing University, Nanjing 210093 (China)], E-mail: suqiu@nju.edu.cn

    2008-08-15

    Based on random a-adic groups, this paper investigates the relationship between the existence conditions of a positive flow in a random network and the estimation of the Hausdorff dimension of a proper random net fractal. Subsequently we describe some particular random fractals to which our results can be applied. Finally, the Mauldin and Williams theorem is shown to be a very important example of a random Cantor set with applications in physics, as shown in E-infinity theory.

  2. A metaheuristic optimization framework for informative gene selection

    Directory of Open Access Journals (Sweden)

    Kaberi Das

    Full Text Available This paper presents a metaheuristic framework using Harmony Search (HS) with a Genetic Algorithm (GA) for gene selection. The internal architecture of the proposed model broadly works in two phases. In the first phase, the model hybridizes HS with GA to compute and evaluate the fitness of randomly selected binary-string solutions, and HS then ranks the solutions in descending order of their fitness. In the second phase, offspring are generated using the crossover and mutation operations of GA, and finally those offspring whose fitness, evaluated by an SVM classifier, exceeds that of their parents are selected for the next generation. The accuracy of the final gene subsets obtained from this model has been evaluated using SVM classifiers. The merit of this approach is analyzed by experimental results on five benchmark datasets, and the results show impressive accuracy compared with existing feature selection approaches. The frequency of occurrence of the gene subsets selected by this model has also been computed, and the most often selected gene subsets, with probabilities in [0.1–0.9], have been chosen as optimal sets of informative genes. Finally, the performance of those selected informative gene subsets has been measured and established through probabilistic measures. Keywords: Gene Selection, Metaheuristic, Harmony Search Algorithm, Genetic Algorithm, SVM

  3. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    International Nuclear Information System (INIS)

    Szolnoki, Attila; Perc, Matjaz

    2009-01-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  4. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  5. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function's steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, first the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric functions (≥9.7%/dB) were selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.
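    The %/dB slope figures quoted refer to the steepness of the psychometric function at the 50% point. A common logistic parameterization for matrix sentence tests (an assumption of this note, not a formula from the paper) is, in LaTeX notation,

        p(L) = \frac{1}{1 + \exp\bigl(4\, s_{50}\, (L_{50} - L)\bigr)}

    where L is the speech level, L_{50} the speech reception threshold, and s_{50} the slope at that threshold; under this form, the reported improvement corresponds to s_{50} rising from 0.102/dB to 0.137/dB.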

  6. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta, and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  7. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Science.gov (United States)

    Bentley, R Alexander

    2008-08-27

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.
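    The random-copying null model used as the baseline here can be simulated in a few lines: each keyword use in a new time step either copies a use from the previous step or, with small probability, introduces a new keyword. A toy Python sketch (population size, innovation rate, and step count are illustrative):

        import random
        from collections import Counter

        random.seed(3)
        N, mu, steps = 1000, 0.01, 200    # uses per step, innovation rate, time steps
        pop = list(range(N))              # start from N distinct keywords
        next_label = N

        for _ in range(steps):
            new_pop = []
            for _ in range(N):
                if random.random() < mu:      # innovation: a brand-new keyword
                    new_pop.append(next_label)
                    next_label += 1
                else:                         # copying: imitate last step at random
                    new_pop.append(random.choice(pop))
            pop = new_pop

        # Frequency spectrum to compare against observed keyword frequencies.
        print(Counter(pop).most_common(5))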

  8. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  9. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  10. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  11. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
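    Stripped of the quantized-feedback machinery, the selection rule described is an argmax of the secondary link's SINR over all (beam, band) pairs whose interference at the primary receiver stays under the threshold. A toy numpy sketch (all channel numbers invented):

        import numpy as np

        rng = np.random.default_rng(0)
        n_beams, n_bands = 8, 4
        sinr = rng.exponential(1.0, size=(n_beams, n_bands))          # secondary-link SINR
        interference = rng.exponential(0.5, size=(n_beams, n_bands))  # power seen at primary RX
        i_max = 0.4                                                   # interference limit

        feasible = interference <= i_max
        if not feasible.any():
            raise RuntimeError("no beam/band pair satisfies the interference constraint")
        masked = np.where(feasible, sinr, -np.inf)   # rule out violating pairs
        beam, band = np.unravel_index(np.argmax(masked), masked.shape)
        print("selected beam/band:", beam, band)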

  12. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2013-01-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  13. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
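    For reference, a linear congruential generator is the one-line recurrence x_{n+1} = (a·x_n + c) mod m. A minimal Python version using one commonly published parameter set (the Numerical Recipes constants, not necessarily the parameters the RANDOM program selects):

        class LCG:
            """x_{n+1} = (a*x_n + c) % m, mapped to floats in [0, 1)."""
            def __init__(self, seed, a=1664525, c=1013904223, m=2**32):
                self.state, self.a, self.c, self.m = seed % m, a, c, m

            def next_float(self):
                self.state = (self.a * self.state + self.c) % self.m
                return self.state / self.m

        gen = LCG(seed=12345)
        print([round(gen.next_float(), 6) for _ in range(5)])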

  14. Analysis and applications of a frequency selective surface via a random distribution method

    International Nuclear Information System (INIS)

    Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo

    2014-01-01

    A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called a random surface. In this paper, stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency of utilizing microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked-patch random surface with dimensions of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For normal incidence, an 8-dB RCS reduction is achieved both by simulation and by measurement in 8 GHz–13 GHz. Oblique incidence at 30° is also investigated, in which a 7-dB RCS reduction is obtained in a frequency range of 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  15. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Directory of Open Access Journals (Sweden)

    R Alexander Bentley

    Full Text Available The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.

  16. On theoretical models of gene expression evolution with random genetic drift and natural selection.

    Directory of Open Access Journals (Sweden)

    Osamu Ogasawara

    2009-11-01

    Full Text Available The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis, which can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.

  17. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology.

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J

    2015-11-01

    The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.

  18. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

    In this paper, we construct protocols for two parties that do not trust each other, to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication, and the entropy of the outcome.

  19. Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.

    Science.gov (United States)

    Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L

    2013-10-01

    To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent-, and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline, however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over 3 month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  1. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  2. Day-ahead load forecast using random forest and expert input selection

    International Nuclear Information System (INIS)

    Lahouar, A.; Ben Hadj Slama, J.

    2015-01-01

    Highlights: • A model based on random forests for short-term load forecast is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customers' behavior, load profiles and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast has become more and more important in recent years due to electricity market deregulation and the integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among the main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators, such as load dispatch, rely on the short-term forecast, thus it should be as accurate as possible. To this end, this paper proposes a short-term load predictor, able to forecast the next 24 h of load. Using random forest, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if–then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecast ability. The proposed approach is tested on a real historical set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error that rarely exceeds 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays following a non-Gregorian calendar.

  3. Final state selection in the 4p photoemission of Rb by combining laser spectroscopy with soft-x-ray photoionization

    International Nuclear Information System (INIS)

    Schulz, J.; Tchaplyguine, M.; Rander, T.; Bergersen, H.; Lindblad, A.; Oehrwall, G.; Svensson, S.; Heinaesmaeki, S.; Sankari, R.; Osmekhin, S.; Aksela, S.; Aksela, H.

    2005-01-01

    Fine structure resolved 4p photoemission studies have been performed on free rubidium atoms in the ground state and after excitation into the [Kr]5p ²P₁/₂ and ²P₃/₂ states. The 4p⁵5p final states have been excited in the 4p⁶5s → 4p⁵5p conjugate shakeup process from ground-state atoms as well as by direct photoemission from laser-excited atoms. The relative intensities differ considerably in these three excitation schemes. The differences in the laser-excited spectra could be described well using calculations based on the pure jK-coupling scheme. Thereby it was possible to specify the character of the various final states. Furthermore, it has been possible to resolve two of the final states whose energy separation is smaller than the experimental resolution by selectively exciting them in a two-step scheme, where the laser selects the spin-orbit coupling in the intermediate state and determines the final-state coupling after x-ray photoemission.

  4. Generation of pseudo-random sequences for spread spectrum systems

    Science.gov (United States)

    Moser, R.; Stover, J.

    1985-05-01

    The characteristics of pseudo-random radio signal sequences (PRS) are explored. The randomness of the PRS is a matter of artificially altering the sequence of binary digits broadcast. A high autocorrelation between two time-shifted copies of a sequence indicates that the signals are the same and thus allows for position identification. Cross-correlations can also be calculated between different sequences; correlations closest to zero are obtained when the sequences are built from a large number of primes. Techniques for selecting optimal and maximal lengths for the sequences are reviewed. If the cross-correlations between sequences are near zero, signal channels can accommodate multiple users. Finally, Gold codes are discussed as a technique for maximizing code lengths.

  5. Distribution of orientation selectivity in recurrent networks of spiking neurons with different random topologies.

    Science.gov (United States)

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.

  6. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M.

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific for that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations. PMID:29163136

  7. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Soledad Ballesteros

    2017-11-01

    Full Text Available Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific for that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations.

  8. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial.

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific for that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations.

  9. Engineering evaluation of selective ion-exchange radioactive waste processing at Susquehanna Nuclear Power Plant: Final report

    International Nuclear Information System (INIS)

    Vance, J.N.

    1989-01-01

    This final report describes an engineering feasibility evaluation of the use and benefits of a selective ion exchange treatment process in the Susquehanna radwaste system. The evaluation addressed operability and processing capability concerns, radiological impacts of operating in the radwaste discharge mode, required hardware modifications to the radwaste and plant make-up systems, impacts on plant water quality limits, and impacts on higher waste classifications. An economic analysis is also reported, showing the economic benefit of the use of selective ion exchange. 1 ref., 4 figs., 13 tabs

  10. Feature Selection for Chemical Sensor Arrays Using Mutual Information

    Science.gov (United States)

    Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.

    2014-01-01

    We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
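    The filter approach described, scoring each feature by its mutual information with the class label and keeping the top scorers, takes only a few lines with scikit-learn. A sketch with toy data standing in for the sensor-array responses (feature counts and the synthetic label rule are invented):

        import numpy as np
        from sklearn.feature_selection import SelectKBest, mutual_info_classif

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 30))           # 200 samples, 30 sensor features (toy)
        y = (X[:, 3] + X[:, 7] > 0).astype(int)  # class depends only on features 3 and 7

        selector = SelectKBest(mutual_info_classif, k=5).fit(X, y)
        print("selected feature indices:", np.flatnonzero(selector.get_support()))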

  11. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasting estimates is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected, and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model's accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
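    The importance-ranking step that drives this kind of input selection is compact in code: fit a random forest, rank candidate inputs by impurity-based importance, and keep the strongest. A hedged scikit-learn sketch (toy lagged inputs, not the paper's wind data or its kernel ELM evaluation stage):

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 12))                          # 12 candidate lagged inputs (toy)
        y = 2 * X[:, 0] - X[:, 4] + 0.1 * rng.normal(size=500)  # toy wind-speed target

        rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        ranking = np.argsort(rf.feature_importances_)[::-1]
        print("inputs ranked by RF importance:", ranking[:5])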

  12. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage that would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact on structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small families.
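
    A toy illustration of the coverage arithmetic behind such a strategy: pick the K largest families and compute the fraction of sequences they cover. The heavy-tailed family sizes below are synthetic, not Pfam counts.

      import numpy as np

      rng = np.random.default_rng(2)
      family_sizes = np.sort(rng.zipf(1.7, size=8000))[::-1]  # heavy-tailed sizes

      K = 5000
      covered = family_sizes[:K].sum()
      total = family_sizes.sum()
      print(f"top-{K} families cover {covered / total:.0%} of sequences")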

  13. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been used successfully in genome-wide association studies (GWAS) to identify genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS with selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies p-value assessment to find a cut-off point that separates the SNPs into informative and irrelevant groups. The informative group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only SNPs from the two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node of a tree. This approach enables one to generate more accurate trees with a lower prediction error, while possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed
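
    A sketch of the two-stage, p-value-based grouping, assuming a chi-squared test per SNP; scikit-learn's forest does not expose per-node subspace sampling, so the final step here simply restricts the forest to the informative group, whereas the published ts-RF draws its feature subspaces from the groups when splitting nodes.

      import numpy as np
      from scipy.stats import chi2_contingency
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(3)
      X = rng.integers(0, 3, size=(300, 1000))      # synthetic genotypes (0/1/2)
      y = rng.integers(0, 2, size=300)              # synthetic case/control labels

      def snp_pvalue(g, labels):
          table = np.array([[np.sum((g == i) & (labels == c)) for i in range(3)]
                            for c in range(2)]) + 1  # +1 avoids empty cells
          return chi2_contingency(table)[1]

      p = np.array([snp_pvalue(X[:, j], y) for j in range(X.shape[1])])
      informative = np.where(p < 0.05)[0]           # stage 1: cut-off point
      highly = informative[p[informative] < 0.01]   # stage 2: strong subgroup

      rf = RandomForestClassifier(n_estimators=200, random_state=3)
      rf.fit(X[:, informative], y)
      print(len(informative), "informative SNPs,", len(highly), "highly informative")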

  14. Automatic Recognition of Chinese Personal Name Using Conditional Random Fields and Knowledge Base

    Directory of Open Access Journals (Sweden)

    Chuan Gu

    2015-01-01

    Full Text Available According to the features of Chinese personal names, we present an approach for Chinese personal name recognition based on conditional random fields (CRF) and a knowledge base. The method builds multiple features for the CRF model using Chinese characters as the processing unit, selects useful features with a knowledge-base selection algorithm and an incremental feature template, and finally implements automatic recognition of Chinese personal names in Chinese documents. Experimental results on an open real-world corpus demonstrate the effectiveness of our method, which achieves high recognition accuracy and recall.
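
    A character-level sketch of this kind of CRF tagger, assuming the third-party sklearn-crfsuite package; the toy corpus, BIO tags, and feature template are illustrative, and a real system would add the knowledge-base features the paper describes.

      import sklearn_crfsuite

      def char_features(sent, i):
          feats = {"char": sent[i], "is_first": i == 0}
          if i > 0:
              feats["prev_char"] = sent[i - 1]
          if i < len(sent) - 1:
              feats["next_char"] = sent[i + 1]
          return feats

      sents = ["张伟去北京", "李娜在上海"]
      tags = [["B-PER", "I-PER", "O", "O", "O"],
              ["B-PER", "I-PER", "O", "O", "O"]]

      X = [[char_features(s, i) for i in range(len(s))] for s in sents]
      crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
      crf.fit(X, tags)
      print(crf.predict([[char_features("王强来了", i) for i in range(4)]]))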

  15. Procedures for selecting and buying district heating equipment. Sofia district heating. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-11-01

    The aim of this Final Report, prepared for the project 'Procedures for Selecting and Buying District Heating Equipment - Sofia District Heating Company', is to establish an overview of the activities accomplished, the outputs delivered and the general experience gained as a result of the project. The main objective of the project is to enable Sofia District Heating Company to prepare specifications and tender documents, identify possible suppliers, evaluate offers, etc. in connection with the purchase of district heating equipment. This objective has been reached by using rehabilitation of sub-stations as an example requested by Sofia DH. The project was originally planned to be finalized by the end of 1995 but, owing to extensions of the scope of work, was prolonged until the end of 1997. The following main activities were accomplished: preparation of a detailed work plan; collection of background information; discussion and advice about technical specifications and tender documents for sub-station rehabilitation; input to terms of reference for a master plan study; input to the technical specification for heat meters; and collection of ideas for topics and examples related to dissemination of information to consumers about matters related to district heating consumption. (EG)

  16. Participant-selected music and physical activity in older adults following cardiac rehabilitation: a randomized controlled trial.

    Science.gov (United States)

    Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F

    2017-03-01

    To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental (n = 28) and control groups (n = 28). There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.

  17. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
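
    A sketch of the r2VIM selection rule: run several forests, divide each run's permutation importances by the absolute value of that run's minimal (noise-level) score, and keep variables exceeding a factor in every run. The data, the factor, and the run count are illustrative.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.inspection import permutation_importance

      rng = np.random.default_rng(4)
      X = rng.normal(size=(200, 500))
      y = (X[:, 0] + X[:, 1] + rng.normal(size=200) > 0).astype(int)

      factor, n_runs = 3.0, 5
      selected = np.ones(X.shape[1], dtype=bool)
      for run in range(n_runs):
          rf = RandomForestClassifier(n_estimators=100, random_state=run).fit(X, y)
          imp = permutation_importance(rf, X, y, n_repeats=5,
                                       random_state=run).importances_mean
          rel = imp / max(abs(imp.min()), 1e-12)   # importance relative to noise level
          selected &= rel >= factor                # must pass in every run
      print("selected variables:", np.where(selected)[0])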

  18. Random ensemble learning for EEG classification.

    Science.gov (United States)

    Hosseini, Mohammad-Parsa; Pompili, Dario; Elisevich, Kost; Soltanian-Zadeh, Hamid

    2018-01-01

    Real-time detection of seizure activity in epilepsy patients is critical in averting seizure activity and improving patients' quality of life. Accurate evaluation, presurgical assessment, seizure prevention, and emergency alerts all depend on the rapid detection of seizure onset. A new method of feature selection and classification for rapid and precise seizure detection is discussed wherein informative components of electroencephalogram (EEG)-derived data are extracted and an automatic method is presented using infinite independent component analysis (I-ICA) to select independent features. The feature space is divided into subspaces via random selection and multichannel support vector machines (SVMs) are used to classify these subspaces. The result of each classifier is then combined by majority voting to establish the final output. In addition, a random subspace ensemble using a combination of SVM, multilayer perceptron (MLP) neural network and an extended k-nearest neighbors (k-NN), called extended nearest neighbor (ENN), is developed for the EEG and electrocorticography (ECoG) big data problem. To evaluate the solution, a benchmark ECoG of eight patients with temporal and extratemporal epilepsy was implemented in a distributed computing framework as a multitier cloud-computing architecture. Using leave-one-out cross-validation, the accuracy, sensitivity, specificity, and both false positive and false negative ratios of the proposed method were found to be 0.97, 0.98, 0.96, 0.04, and 0.02, respectively. Application of the solution to cases under investigation with ECoG has also been effected to demonstrate its utility. Copyright © 2017 Elsevier B.V. All rights reserved.
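
    A sketch of the random-subspace ensemble with majority voting, assuming extracted EEG feature vectors in X and binary seizure labels in y; the subspace count and size are illustrative, and the I-ICA feature extraction step is omitted.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(5)
      X = rng.normal(size=(240, 60))            # placeholder EEG features
      y = rng.integers(0, 2, size=240)
      X_train, y_train, X_test = X[:200], y[:200], X[200:]

      n_subspaces, k = 9, 15
      members = []
      for _ in range(n_subspaces):
          idx = rng.choice(X.shape[1], size=k, replace=False)   # random subspace
          members.append((idx, SVC().fit(X_train[:, idx], y_train)))

      votes = np.array([clf.predict(X_test[:, idx]) for idx, clf in members])
      final = (votes.mean(axis=0) > 0.5).astype(int)            # majority vote
      print(final)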

  19. A randomized, controlled trial of oral propranolol in infantile hemangioma.

    Science.gov (United States)

    Léauté-Labrèze, Christine; Hoeger, Peter; Mazereeuw-Hautier, Juliette; Guibaud, Laurent; Baselga, Eulalia; Posiunas, Gintas; Phillips, Roderic J; Caceres, Hector; Lopez Gutierrez, Juan Carlos; Ballona, Rosalia; Friedlander, Sheila Fallon; Powell, Julie; Perek, Danuta; Metz, Brandie; Barbarot, Sebastien; Maruani, Annabel; Szalai, Zsuzsanna Zsofia; Krol, Alfons; Boccara, Olivia; Foelster-Holst, Regina; Febrer Bosch, Maria Isabel; Su, John; Buckova, Hana; Torrelo, Antonio; Cambazard, Frederic; Grantzow, Rainer; Wargon, Orli; Wyrzykowski, Dariusz; Roessler, Jochen; Bernabeu-Wittel, Jose; Valencia, Adriana M; Przewratil, Przemyslaw; Glick, Sharon; Pope, Elena; Birchall, Nicholas; Benjamin, Latanya; Mancini, Anthony J; Vabres, Pierre; Souteyrand, Pierre; Frieden, Ilona J; Berul, Charles I; Mehta, Cyrus R; Prey, Sorilla; Boralevi, Franck; Morgan, Caroline C; Heritier, Stephane; Delarue, Alain; Voisard, Jean-Jacques

    2015-02-19

    Oral propranolol has been used to treat complicated infantile hemangiomas, although data from randomized, controlled trials to inform its use are limited. We performed a multicenter, randomized, double-blind, adaptive, phase 2-3 trial assessing the efficacy and safety of a pediatric-specific oral propranolol solution in infants 1 to 5 months of age with proliferating infantile hemangioma requiring systemic therapy. Infants were randomly assigned to receive placebo or one of four propranolol regimens (1 or 3 mg of propranolol base per kilogram of body weight per day for 3 or 6 months). A preplanned interim analysis was conducted to identify the regimen to study for the final efficacy analysis. The primary end point was success (complete or nearly complete resolution of the target hemangioma) or failure of trial treatment at week 24, as assessed by independent, centralized, blinded evaluations of standardized photographs. Of 460 infants who underwent randomization, 456 received treatment. On the basis of an interim analysis of the first 188 patients who completed 24 weeks of trial treatment, the regimen of 3 mg of propranolol per kilogram per day for 6 months was selected for the final efficacy analysis. The frequency of successful treatment was higher with this regimen than with placebo (60% vs. 4%, P<0.001). A total of 88% of patients who received the selected propranolol regimen showed improvement by week 5, versus 5% of patients who received placebo. A total of 10% of patients in whom treatment with propranolol was successful required systemic retreatment during follow-up. Known adverse events associated with propranolol (hypoglycemia, hypotension, bradycardia, and bronchospasm) occurred infrequently, with no significant difference in frequency between the placebo group and the groups receiving propranolol. This trial showed that propranolol was effective at a dose of 3 mg per kilogram per day for 6 months in the treatment of infantile hemangioma. (Funded by

  20. Gene selection and classification for cancer microarray data based on machine learning and similarity measures

    Directory of Open Access Journals (Sweden)

    Liu Qingzhong

    2011-12-01

    Full Text Available Abstract Background Microarray data have a high dimension of variables and a small sample size. In microarray data analyses, two important issues are how to choose genes, which provide reliable and good prediction for disease status, and how to determine the final gene set that is best for classification. Associations among genetic markers mean one can exploit information redundancy to potentially reduce classification cost in terms of time and money. Results To deal with redundant information and improve classification, we propose a gene selection method, Recursive Feature Addition, which combines supervised learning and statistical similarity measures. To determine the final optimal gene set for prediction and classification, we propose an algorithm, Lagging Prediction Peephole Optimization. By using six benchmark microarray gene expression data sets, we compared Recursive Feature Addition with recently developed gene selection methods: Support Vector Machine Recursive Feature Elimination, Leave-One-Out Calculation Sequential Forward Selection and several others. Conclusions On average, with the use of popular learning machines including Nearest Mean Scaled Classifier, Support Vector Machine, Naive Bayes Classifier and Random Forest, Recursive Feature Addition outperformed other methods. Our studies also showed that Lagging Prediction Peephole Optimization is superior to random strategy; Recursive Feature Addition with Lagging Prediction Peephole Optimization obtained better testing accuracies than the gene selection method varSelRF.
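
    A hedged sketch in the spirit of Recursive Feature Addition: rank genes by a univariate score, then greedily add genes weakly correlated with those already chosen. The scoring function, correlation threshold, and stopping rule are illustrative, not the authors' exact procedure.

      import numpy as np
      from sklearn.feature_selection import f_classif

      rng = np.random.default_rng(6)
      X = rng.normal(size=(100, 2000))          # synthetic expression matrix
      y = rng.integers(0, 2, size=100)

      scores, _ = f_classif(X, y)
      order = np.argsort(scores)[::-1]          # best univariate genes first

      selected, max_corr = [], 0.6
      for g in order:
          if all(abs(np.corrcoef(X[:, g], X[:, s])[0, 1]) < max_corr
                 for s in selected):
              selected.append(g)                # add only weakly redundant genes
          if len(selected) == 20:
              break
      print("selected genes:", selected[:10], "...")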

  1. Energy conservation in selected buildings, Gdansk. 1. final report

    International Nuclear Information System (INIS)

    1997-02-01

    This Final Report marks the end of the implementation stage of the project: 'Energy Conservation in Selected Buildings in Gdansk, Poland' supported by the Danish Environment-related Energy Sector Programme for Poland under the Danish Energy Agency. The residential and commercial sectors together with public buildings account for 40-45% of the total energy consumption and are dominated by the use of space heating and hot water. The sector has a significant over-consumption of energy, which first of all is due to the lack of or too weak incentives for the individual tenants to decrease the energy consumption. Bad thermal insulation of buildings and inefficient central heating systems with a widespread lack of measurement and automatic control systems give cause for extensive heat losses. The objective of the project has been to document the effects of energy savings in 18 multi-family houses when different types of energy saving measures are applied. These measures include thermal insulation of buildings, refurbishment of the heating system and introduction of individual billing system for heating and hot tap water. Energy audits of 18 buildings were performed by combination of on-site inspection of all buildings and data collection from the available drawings, technical descriptions, etc. The on-site inspection was carried out by use of an energy audit scheme specially developed for this project. (EG)

  2. Energy conservation in selected buildings, Gdansk. 1. final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This Final Report marks the end of the implementation stage of the project: 'Energy Conservation in Selected Buildings in Gdansk, Poland' supported by the Danish Environment-related Energy Sector Programme for Poland under the Danish Energy Agency. The residential and commercial sectors together with public buildings account for 40-45% of the total energy consumption and are dominated by the use of space heating and hot water. The sector has a significant over-consumption of energy, which first of all is due to the lack of or too weak incentives for the individual tenants to decrease the energy consumption. Bad thermal insulation of buildings and inefficient central heating systems with a widespread lack of measurement and automatic control systems give cause for extensive heat losses. The objective of the project has been to document the effects of energy savings in 18 multi-family houses when different types of energy saving measures are applied. These measures include thermal insulation of buildings, refurbishment of the heating system and introduction of individual billing system for heating and hot tap water. Energy audits of 18 buildings were performed by combination of on-site inspection of all buildings and data collection from the available drawings, technical descriptions, etc. The on-site inspection was carried out by use of an energy audit scheme specially developed for this project. (EG)

  3. Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.

    Science.gov (United States)

    Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F

    2016-01-01

    The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months (modified United States Public Health Service). The Friedman test detected significant differences only after 18 months for marginal staining in the groups Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004). One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for groups Clearfil SE non-etch and Xeno V(+) etch, the clinical effectiveness of restorations was considered acceptable for the single-step and two-step self-etching systems with or without selective enamel etching in this 24-month clinical trial.

  4. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

    Full Text Available Abstract Background Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees, which are based on orthogonal splits in feature space. Results We propose to combine the best of both approaches, and evaluate the joint use of a feature selection based on recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than filtering according to different univariate statistical tests, or than using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, and the direct application of the regularized classifiers on the full set of features. Conclusion The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but, on an optimal subset of features, the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
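
    A sketch of the combination evaluated here: recursive feature elimination driven by random-forest Gini importance, followed by a regularized linear classifier (PLS regression used for discrimination); the synthetic spectra and the elimination fraction are illustrative.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(7)
      X = rng.normal(size=(150, 300))           # placeholder spectra
      y = rng.integers(0, 2, size=150)

      features = np.arange(X.shape[1])
      while len(features) > 30:                 # recursive elimination loop
          rf = RandomForestClassifier(n_estimators=200, random_state=0)
          rf.fit(X[:, features], y)
          keep = np.argsort(rf.feature_importances_)[len(features) // 5 :]
          features = features[keep]             # drop the lowest-importance 20%

      pls = PLSRegression(n_components=2).fit(X[:, features], y.astype(float))
      pred = (pls.predict(X[:, features]).ravel() > 0.5).astype(int)
      print("train accuracy:", (pred == y).mean())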

  5. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random-effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest

  6. Selected CD133⁺ progenitor cells to promote angiogenesis in patients with refractory angina: final results of the PROGENITOR randomized trial.

    Science.gov (United States)

    Jimenez-Quevedo, Pilar; Gonzalez-Ferrer, Juan Jose; Sabate, Manel; Garcia-Moll, Xavier; Delgado-Bolton, Roberto; Llorente, Leopoldo; Bernardo, Esther; Ortega-Pozzi, Aranzazu; Hernandez-Antolin, Rosana; Alfonso, Fernando; Gonzalo, Nieves; Escaned, Javier; Bañuelos, Camino; Regueiro, Ander; Marin, Pedro; Fernandez-Ortiz, Antonio; Neves, Barbara Das; Del Trigo, Maria; Fernandez, Cristina; Tejerina, Teresa; Redondo, Santiago; Garcia, Eulogio; Macaya, Carlos

    2014-11-07

    Refractory angina constitutes a clinical problem. The aim of this study was to assess the safety and the feasibility of transendocardial injection of CD133(+) cells to foster angiogenesis in patients with refractory angina. In this randomized, double-blinded, multicenter controlled trial, eligible patients were treated with granulocyte colony-stimulating factor, underwent an apheresis and electromechanical mapping, and were randomized to receive treatment with CD133(+) cells or no treatment. The primary end point was the safety of transendocardial injection of CD133(+) cells, as measured by the occurrence of major adverse cardiac and cerebrovascular events at 6 months. Secondary end points analyzed the efficacy. Twenty-eight patients were included (n=19 treatment; n=9 control). At 6 months, 1 patient in each group had ventricular fibrillation and 1 patient in each group died. One patient (treatment group) had a cardiac tamponade during mapping. There were no significant differences between groups with respect to efficacy parameters; however, the comparison within groups showed a significant improvement in the number of angina episodes per month (median absolute difference, -8.5 [95% confidence interval, -15.0 to -4.0]) and in angina functional class in the treatment arm but not in the control group. At 6 months, only one single-photon emission computed tomography (SPECT) parameter, the summed score, improved significantly in the treatment group at rest and at stress (median absolute difference, -1.0 [95% confidence interval, -1.9 to -0.1]) but not in the control arm. Our findings support feasibility and safety of transendocardial injection of CD133(+) cells in patients with refractory angina. The promising clinical results and favorable data observed in SPECT summed score may set up the basis to test the efficacy of cell therapy in a larger randomized trial. © 2014 American Heart Association, Inc.

  7. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new approach to variable-importance ranking, measures the importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. Of the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.
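
    A sketch of the %IncMSE-style ranking: permute one environmental variable at a time and record the increase in mean squared error of a fitted regression forest. The variable names echo the paper's list, but the data and the in-sample evaluation are simplifications.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(8)
      names = ["temp", "lag_temp", "rain", "lag_rain", "humidity", "altitude", "ndvi"]
      X = rng.normal(size=(400, len(names)))
      y = 2.0 * X[:, 5] + 0.5 * X[:, 6] + rng.normal(size=400)  # altitude-driven

      rf = RandomForestRegressor(n_estimators=300, random_state=8).fit(X, y)
      base_mse = np.mean((rf.predict(X) - y) ** 2)

      for j, name in enumerate(names):
          Xp = X.copy()
          Xp[:, j] = rng.permutation(Xp[:, j])          # break this variable's link
          inc = np.mean((rf.predict(Xp) - y) ** 2) - base_mse
          print(f"{name:9s} %IncMSE ~ {100 * inc / base_mse:6.1f}")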

  8. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows
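
    A back-of-envelope sketch of the dependence-overhead side of this trade-off, using the standard closed form for the expected number of receptions needed to collect g linearly independent RLNC packets over GF(q); the generation size and field sizes compared are illustrative, not the paper's Tmote Sky figures.

      def expected_receptions(g, q):
          """Expected packets received until g independent combinations arrive."""
          return sum(1.0 / (1.0 - q ** (i - g)) for i in range(g))

      for q in (2, 16, 256):
          g = 32                               # generation size (illustrative)
          rx = expected_receptions(g, q)
          print(f"GF({q:3d}): {rx:.2f} expected receptions "
                f"(overhead {rx - g:.3f} packets)")

    Larger fields shrink this overhead toward zero but make each coding operation more expensive, which is why the energy-per-bit optimum depends on the platform.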

  9. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    International Nuclear Information System (INIS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-01-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al., 2011, PNAS 108:16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude-modulated data, the same principle is not easily extended to phase-modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase-modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended
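
    A minimal sketch of the CS ℓ1-norm reconstruction underlying this approach, assuming undersampled rows of a unitary DFT as the measurement operator and an ISTA-style iteration; the sparse test spectrum, λ, and iteration count are illustrative, and this is not the authors' RQD implementation.

      import numpy as np

      rng = np.random.default_rng(9)
      n, m = 256, 64
      U = np.fft.fft(np.eye(n), axis=0) / np.sqrt(n)   # unitary DFT matrix
      rows = rng.choice(n, size=m, replace=False)      # non-uniform sample times
      A = U[rows, :]

      x_true = np.zeros(n, complex)
      x_true[[20, 75, 130]] = [3.0, 2.0 + 1.0j, 1.5]   # sparse "spectrum"
      y = A @ x_true

      def soft(z, t):                                  # complex soft threshold
          mag = np.maximum(np.abs(z), 1e-12)
          return np.maximum(mag - t, 0) / mag * z

      x, lam = np.zeros(n, complex), 0.01
      for _ in range(300):    # ISTA with step 1/L; L = 1 for unitary-matrix rows
          x = soft(x - A.conj().T @ (A @ x - y), lam)
      print("recovered peaks at:", np.where(np.abs(x) > 0.5)[0])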

  10. Curvature of random walks and random polygons in confinement

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2013-01-01

    The purpose of this paper is to study the curvature of equilateral random walks and polygons that are confined in a sphere. Curvature is one of several basic geometric properties that can be used to describe random walks and polygons. We show that confinement affects curvature quite strongly, and in the limit case where the confinement diameter equals the edge length the unconfined expected curvature value doubles from π/2 to π. To study curvature, a simple model of an equilateral random walk in spherical confinement in dimensions 2 and 3 is introduced. For this simple model we derive explicit integral expressions for the expected value of the total curvature in both dimensions. These expressions are functions that depend only on the radius R of the confinement sphere. We then show that the values obtained by numeric integration of these expressions agree with numerical average curvature estimates obtained from simulations of random walks. Finally, we compare the confinement effect on curvature of random walks with random polygons. (paper)
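
    A quick Monte Carlo sketch of the quantity being studied, under one simple confinement model (unit steps, rejection at the sphere boundary); the walk length, radius, and sample count are illustrative, and this rejection scheme need not match the authors' exact measure.

      import numpy as np

      rng = np.random.default_rng(10)

      def random_step(rng):
          v = rng.normal(size=3)
          return v / np.linalg.norm(v)          # uniform direction, unit length

      def confined_walk(n_steps, R, rng):
          pts = [np.zeros(3)]
          while len(pts) <= n_steps:
              cand = pts[-1] + random_step(rng)
              if np.linalg.norm(cand) <= R:     # reject steps leaving the sphere
                  pts.append(cand)
          return np.array(pts)

      def total_curvature(pts):
          e = np.diff(pts, axis=0)              # unit edges
          cosang = np.einsum("ij,ij->i", e[:-1], e[1:]).clip(-1, 1)
          return np.arccos(cosang).sum()        # sum of turning angles

      curv = [total_curvature(confined_walk(50, 1.5, rng)) for _ in range(200)]
      mean_per_vertex = np.mean(curv) / 49      # 49 interior vertices
      print(f"mean curvature per vertex: {mean_per_vertex:.3f} (pi/2 ~ 1.571)")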

  11. Dynamic Output Feedback Control for Nonlinear Networked Control Systems with Random Packet Dropout and Random Delay

    Directory of Open Access Journals (Sweden)

    Shuiqing Yu

    2013-01-01

    Full Text Available This paper investigates the dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based upon the Lyapunov theory. The quantitative relationship of the dropout rate, transition probability matrix, and nonlinear level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.

  12. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  13. Ontario Select Committee on Alternative Fuel Sources : Final Report

    International Nuclear Information System (INIS)

    Galt, D.

    2002-06-01

    On June 28, 2001, the Ontario Legislative Assembly appointed the Select Committee on Alternative Fuel Sources, comprised of representatives of all parties, with a broad mandate to investigate, report, and offer recommendations on options to support the development and application of environmentally sustainable alternatives to existing fossil fuel sources. The members of the Committee elected to conduct extensive public hearings, make site visits, attend relevant conferences, and carry out background research in order to examine a vast number of alternative fuel and energy sources that could be of relevance to the province of Ontario. A discussion paper (interim report) was issued by the Committee in November 2001, and the present document represents the final report, containing 141 recommendations covering 20 topics. The information contained in the report is expected to assist in the development and outline of policy and programs designed specifically to support alternative fuels and energy sources and applicable technologies. Policy issues are discussed, with the appropriate recommendations, in Part A of the report. Recommendations on specific alternative fuels and energy sources are included in Part B. It is believed that Ontario's dependence on traditional petroleum-based fuels and energy sources can be reduced through aggressive action on alternative fuels and energy. The benefits of such action would be felt in air quality, with social and economic benefits as well. 3 tabs.

  14. Selecting for Fast Protein-Protein Association As Demonstrated on a Random TEM1 Yeast Library Binding BLIP.

    Science.gov (United States)

    Cohen-Khait, Ruth; Schreiber, Gideon

    2018-04-27

    Protein-protein interactions mediate the vast majority of cellular processes. Though protein interactions obey the same basic chemical principles within the cell, the in vivo physiological environment may not allow equilibrium to be reached. Thus, thermodynamic affinity measured in vitro may not provide a complete picture of protein interactions in the biological context. Binding kinetics, composed of the association and dissociation rate constants, are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refine the selection conditions of a TEM1-β-lactamase library against its natural nanomolar-affinity binder, β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percentage of faster binders in the selected library. Fast-associating protein variants are of particular interest for drug development and other biotechnological applications.

  15. The procedure of alternative site selection within the report of the study group on the radioactive waste final repository selection process (AKEnd); Das Verfahren der alternativen Standortsuche im Bericht des Arbeitskreises Auswahlverfahren Endlagerstandorte (AKEnd)

    Energy Technology Data Exchange (ETDEWEB)

    Brenner, M. [Jena Univ. (Germany)

    2005-07-01

    The paper discusses the results of the report of the study group on the radioactive waste final repository selection process (AKEnd) with respect to the alternative site selection procedure. Key points of the report are long-term safety, the consideration of alternative sites, and the concept of a single repository. The critique of the report focuses on the site selection and licensing procedures, public participation, the time factor, and the question of cost.

  16. The Fault Diagnosis of Rolling Bearing Based on Ensemble Empirical Mode Decomposition and Random Forest

    OpenAIRE

    Qin, Xiwen; Li, Qiaoling; Dong, Xiaogang; Lv, Siqi

    2017-01-01

    Accurate diagnosis of rolling bearing faults is of great significance for the normal operation of machinery and equipment. A method combining Ensemble Empirical Mode Decomposition (EEMD) and Random Forest (RF) is proposed. Firstly, the original signal is decomposed into several intrinsic mode functions (IMFs) by EEMD, and the effective IMFs are selected. Then their energy entropy is calculated as the feature. Finally, the classification is performed by RF. In addition, the wavelet meth...
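
    A pipeline sketch of the EEMD-plus-RF scheme, assuming the third-party PyEMD package provides the decomposition (its EEMD class and eemd method are an assumption about that library); the synthetic vibration signals stand in for real bearing data.

      import numpy as np
      from PyEMD import EEMD
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(11)
      eemd = EEMD(trials=20)

      def energy_entropy_features(signal, n_imfs=5):
          imfs = eemd.eemd(signal)[:n_imfs]          # decompose into IMFs
          e = np.array([np.sum(imf ** 2) for imf in imfs])
          p = np.zeros(n_imfs)
          p[: len(e)] = e / e.sum()                  # IMF energy distribution
          return -p * np.log(np.where(p > 0, p, 1.0))  # per-IMF entropy terms

      t = np.linspace(0, 1, 512)
      signals, labels = [], []
      for k, f in enumerate((30.0, 90.0)):           # two synthetic "fault" classes
          for _ in range(20):
              s = np.sin(2 * np.pi * f * t) + 0.5 * rng.normal(size=t.size)
              signals.append(energy_entropy_features(s))
              labels.append(k)

      rf = RandomForestClassifier(n_estimators=200, random_state=11)
      rf.fit(np.array(signals), np.array(labels))
      print("train accuracy:", rf.score(np.array(signals), np.array(labels)))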

  17. Direct random insertion mutagenesis of Helicobacter pylori

    NARCIS (Netherlands)

    de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  18. Direct random insertion mutagenesis of Helicobacter pylori.

    NARCIS (Netherlands)

    Jonge, de R.; Bakker, D.; Vliet, van AH; Kuipers, E.J.; Vandenbroucke-Grauls, C.M.J.E.; Kusters, J.G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  19. Final report on implementation of energy conservation practices training in selected public housing developments

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This report on the implementation of energy conservation practices training in selected public housing developments represents an initiative of the Research and Education Division, Office of Minority Economic Impact, US Department of Energy. The Office of Minority Economic Impact (MI) was created by Congress in 1979, within the US Department of Energy, to afford the Secretary advice on the effect of DOE policies, regulations and other actions respecting minority participation in energy programs. The Director of MI is responsible for the conduct of ongoing research into the effects, including socio-economic and environmental, of national energy programs, policies, and regulations of the Department on minorities. Public housing in the United States is dominated by minorities, and public housing is a large consumer of residential energy. Consequently, this project is a logical merging of these two factors and an attempt to influence energy savings by improving public housing residents' energy-consumption practices. This final report attempts to capture the results of the current demonstration, and to incorporate the historical basis for today's results by reviewing the efforts that preceded the implementation of energy conservation practices training in selected public housing developments.

  20. Final report on implementation of energy conservation practices training in selected public housing developments

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This report on the implementation of energy conservation practices training in selected public housing developments represents an initiative of the Research and Education Division, Office of Minority Economic Impact, US Department of Energy. The Office of Minority Economic Impact (MI) was created by Congress in 1979, within the US Department of Energy, to afford the Secretary advice on the effect of DOE policies, regulations and other actions respecting minority participation in energy programs. The Director of MI is responsible for the conduct of ongoing research into the effects, including socio-economic and environmental, of national energy programs, policies, and regulations of the Department on minorities. Public housing in the United States is dominated by minorities, and public housing is a large consumer of residential energy. Consequently, this project is a logical merging of these two factors and an attempt to influence energy savings by improving public housing residents' energy-consumption practices. This final report attempts to capture the results of the current demonstration, and to incorporate the historical basis for today's results by reviewing the efforts that preceded the implementation of energy conservation practices training in selected public housing developments.

  1. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Full Text Available Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People's Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were obtained using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects to SDD in day-28 mortality (RR =1
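
    For illustration, the fixed-effects (inverse-variance) pooling such a meta-analysis performs on the log risk-ratio scale; the event counts below are invented, not data from the included RCTs.

      import numpy as np

      # (events, total) for the SOD arm and the SDD arm in four hypothetical trials
      trials = [((120, 1000), (115, 1000)),
                ((300, 2500), (310, 2500)),
                ((80, 700), (90, 710)),
                ((500, 5000), (480, 4900))]

      log_rr, weights = [], []
      for (e1, n1), (e2, n2) in trials:
          rr = (e1 / n1) / (e2 / n2)
          var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2   # variance of log RR
          log_rr.append(np.log(rr))
          weights.append(1 / var)                   # fixed-effects weights

      pooled = np.exp(np.average(log_rr, weights=weights))
      se = 1 / np.sqrt(np.sum(weights))
      ci = np.exp(np.log(pooled) + np.array([-1.96, 1.96]) * se)
      print(f"pooled RR {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")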

  2. Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment

    OpenAIRE

    Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby

    2016-01-01

    Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...

  3. Comparative analysis of instance selection algorithms for instance-based classifiers in the context of medical decision support

    International Nuclear Information System (INIS)

    Mazurowski, Maciej A; Tourassi, Georgia D; Malof, Jordan M

    2011-01-01

    When constructing a pattern classifier, it is important to make best use of the instances (a.k.a. cases, examples, patterns or prototypes) available for its development. In this paper we present an extensive comparative analysis of algorithms that, given a pool of previously acquired instances, attempt to select those that will be the most effective to construct an instance-based classifier in terms of classification performance, time efficiency and storage requirements. We evaluate seven previously proposed instance selection algorithms and compare their performance to simple random selection of instances. We perform the evaluation using a k-nearest neighbor classifier and three classification problems: one with simulated Gaussian data and two based on clinical databases for breast cancer detection and diagnosis, respectively. Finally, we evaluate the impact of the number of instances available for selection on the performance of the selection algorithms and conduct an initial analysis of the selected instances. The experiments show that for all investigated classification problems, it was possible to reduce the size of the original development dataset to less than 3% of its initial size while maintaining or improving the classification performance. Random mutation hill climbing emerges as the superior selection algorithm. Furthermore, we show that some previously proposed algorithms perform worse than random selection. Regarding the impact of the number of instances available for classifier development on the performance of the selection algorithms, we confirm that the selection algorithms are generally more effective as the pool of available instances increases. In conclusion, instance selection is generally beneficial for instance-based classifiers as it can improve their performance, reduce their storage requirements and improve their response time. However, choosing the right selection algorithm is crucial.
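
    A compact sketch of random mutation hill climbing for instance selection, the algorithm the study found superior: flip one instance in or out of the subset and keep the mutation when a held-out k-NN score does not degrade. The data, the subset initialization, and the iteration budget are illustrative.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(12)
      X = rng.normal(size=(400, 5))
      y = (X[:, 0] + X[:, 1] > 0).astype(int)
      X_pool, y_pool = X[:300], y[:300]           # instances eligible for selection
      X_val, y_val = X[300:], y[300:]             # evaluation set

      def score(mask):
          if mask.sum() < 5:
              return 0.0
          knn = KNeighborsClassifier(3).fit(X_pool[mask], y_pool[mask])
          return knn.score(X_val, y_val)

      mask = rng.random(300) < 0.1                # start from a small random subset
      best = score(mask)
      for _ in range(500):                        # random mutation hill climbing
          j = rng.integers(300)
          mask[j] = ~mask[j]
          new = score(mask)
          if new >= best:
              best = new
          else:
              mask[j] = ~mask[j]                  # revert the harmful mutation
      print(f"kept {mask.sum()} of 300 instances, validation accuracy {best:.2f}")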

  4. 14 CFR 1214.1105 - Final ranking.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Final ranking. 1214.1105 Section 1214.1105... Recruitment and Selection Program § 1214.1105 Final ranking. Final rankings will be based on a combination of... preference will be included in this final ranking in accordance with applicable regulations. ...

  5. CMOS Compressed Imaging by Random Convolution

    OpenAIRE

    Jacques, Laurent; Vandergheynst, Pierre; Bibet, Alexandre; Majidzadeh, Vahid; Schmid, Alexandre; Leblebici, Yusuf

    2009-01-01

    We present a CMOS imager with built-in capability to perform Compressed Sensing. The adopted sensing strategy is the random Convolution due to J. Romberg. It is achieved by a shift register set in a pseudo-random configuration. It acts as a convolutive filter on the imager focal plane, the current issued from each CMOS pixel undergoing a pseudo-random redirection controlled by each component of the filter sequence. A pseudo-random triggering of the ADC reading is finally applied to comp...
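
    A sketch of the sensing scheme described here: circular convolution of the focal-plane image with a pseudo-random ±1 sequence (computed via the FFT), followed by a pseudo-random subset of reads; the sizes and test image are placeholders, not the chip's actual parameters.

      import numpy as np

      rng = np.random.default_rng(13)
      n = 64 * 64
      image = rng.normal(size=n)               # stand-in for a focal-plane image

      # Pseudo-random +/-1 filter sequence, as a shift register would generate.
      h = rng.choice([-1.0, 1.0], size=n)
      convolved = np.fft.ifft(np.fft.fft(image) * np.fft.fft(h)).real

      m = n // 8                                # compression factor of 8
      keep = rng.choice(n, size=m, replace=False)
      measurements = convolved[keep]            # pseudo-randomly triggered reads
      print(measurements.shape)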

  6. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    Science.gov (United States)

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that estimates missing values and then performs variable selection to forecast a reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated into an integrated research dataset based on the ordering of the data. The proposed time-series forecasting model has three main steps. First, the study applies five imputation methods to fill in missing values rather than deleting them directly. Second, it identifies the key variables via factor analysis and then deletes the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, for comparison with the listed methods in terms of forecasting error. These experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, this experiment shows that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.
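
    An end-to-end sketch of the pipeline's shape: imputation, variable selection, then a Random Forest forecaster. A single mean imputer and importance-based selection stand in for the paper's five imputation methods and factor-analysis step; the data are synthetic.

      import numpy as np
      from sklearn.impute import SimpleImputer
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(14)
      X = rng.normal(size=(600, 12))                 # daily atmospheric variables
      X[rng.random(X.shape) < 0.05] = np.nan         # inject missing values

      X_filled = SimpleImputer(strategy="mean").fit_transform(X)
      level = 1.5 * X_filled[:, 0] + rng.normal(size=600)  # synthetic water level

      rf = RandomForestRegressor(n_estimators=200, random_state=14)
      rf.fit(X_filled, level)
      keep = np.argsort(rf.feature_importances_)[::-1][:5]  # select key variables

      rf_sel = RandomForestRegressor(n_estimators=200, random_state=14)
      rf_sel.fit(X_filled[:, keep], level)
      print("selected variables:", keep.tolist())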

  7. 47 CFR 1.1604 - Post-selection hearings.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  8. Random walks, random fields, and disordered systems

    CERN Document Server

    Černý, Jiří; Kotecký, Roman

    2015-01-01

    Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...

  9. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  10. Random Forest Application for NEXRAD Radar Data Quality Control

    Science.gov (United States)

    Keem, M.; Seo, B. C.; Krajewski, W. F.

    2017-12-01

    Identification and elimination of non-meteorological radar echoes (e.g., returns from ground, wind turbines, and biological targets) are the basic data quality control steps before radar data use in quantitative applications (e.g., precipitation estimation). Although WSR-88Ds' recent upgrade to dual-polarization has enhanced this quality control and echo classification, there are still challenges in detecting some non-meteorological echoes that show precipitation-like characteristics (e.g., wind turbine or anomalous propagation clutter embedded in rain). With this in mind, a new quality control method using Random Forest is proposed in this study. This classification algorithm is known to produce reliable results with less uncertainty. The method introduces randomness into sampling and feature selection and integrates the resulting multiple decision trees. The multidimensional structure of the trees can characterize the statistical interactions of the multiple features involved in complex situations. The authors explore the performance of the Random Forest method for NEXRAD radar data quality control. Training datasets are selected using several clear cases of precipitation and non-precipitation (but with some non-meteorological echoes). The model is structured using available candidate features (from the NEXRAD data) such as horizontal reflectivity, differential reflectivity, differential phase shift, copolar correlation coefficient, and their horizontal textures (e.g., local standard deviation). The influence of each feature on classification results is quantified by variable importance measures that are automatically estimated by the Random Forest algorithm. Therefore, the number and types of features in the final forest can be examined based on the classification accuracy. The authors demonstrate the capability of the proposed approach using several cases ranging from distinct to complex rain/no-rain events and compare the performance with the existing algorithms (e
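
    A sketch of the feature construction described here: dual-polarization fields plus their local textures (moving-window standard deviation) feeding a Random Forest classifier; the fields and labels are synthetic placeholders, not NEXRAD data.

      import numpy as np
      from scipy.ndimage import uniform_filter
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(15)
      shape = (64, 64)
      fields = {name: rng.normal(size=shape)
                for name in ("Z", "ZDR", "PhiDP", "RhoHV")}

      def local_std(a, size=5):
          mean = uniform_filter(a, size)
          sq_mean = uniform_filter(a * a, size)
          return np.sqrt(np.clip(sq_mean - mean ** 2, 0, None))

      feature_stack = [v for v in fields.values()]
      feature_stack += [local_std(v) for v in fields.values()]   # texture features
      X = np.stack(feature_stack, axis=-1).reshape(-1, len(feature_stack))
      y = rng.integers(0, 2, size=X.shape[0])   # placeholder met/non-met labels

      rf = RandomForestClassifier(n_estimators=100, random_state=15).fit(X, y)
      for name, imp in zip(list(fields) + [f"std({n})" for n in fields],
                           rf.feature_importances_):
          print(f"{name:10s} importance {imp:.3f}")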

  11. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
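
    The warning about biased out-of-bag estimates comes down to where the validation folds sit relative to the selection loop. The sketch below (synthetic data standing in for the StreamCat predictors) repeats backward elimination inside each fold so the held-out data stay external to the variable selection, which is the unbiased protocol the abstract advocates.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import KFold

    def backward_select(X, y, n_keep):
        """Iteratively drop the least important variable until n_keep remain."""
        kept = list(range(X.shape[1]))
        while len(kept) > n_keep:
            rf = RandomForestClassifier(n_estimators=200, random_state=0)
            rf.fit(X[:, kept], y)
            kept.remove(kept[int(np.argmin(rf.feature_importances_))])
        return kept

    rng = np.random.default_rng(1)
    X = rng.normal(size=(400, 20))
    y = (X[:, 0] - X[:, 1] + rng.normal(size=400) > 0).astype(int)

    # Selection is redone inside every fold; the test fold never informs it.
    scores = []
    for tr, te in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        kept = backward_select(X[tr], y[tr], n_keep=5)
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(X[tr][:, kept], y[tr])
        scores.append(rf.score(X[te][:, kept], y[te]))
    print("externally validated accuracy:", round(float(np.mean(scores)), 3))
    ```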

  12. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1980-03-01

    The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somehow unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt

  13. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1981-01-01

    The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somehow unifying perspective. Among these 'ingredients' the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system are found. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt

  14. Assessment of chimeric mice with humanized livers in new drug development: generation of pharmacokinetics, metabolism and toxicity data for selecting the final candidate compound.

    Science.gov (United States)

    Kamimura, Hidetaka; Ito, Satoshi

    2016-01-01

    1. Chimeric mice with humanized livers are expected to be a novel tool for new drug development. This review discusses four applications where these animals can be used efficiently to collect supportive data for selecting the best compound in the final stage of drug discovery. 2. The first application is selection of the final compound based on estimated pharmacokinetic parameters in humans. Since chimeric mouse livers are highly repopulated with human hepatocytes, hepatic clearance values in vivo could be used preferentially to estimate pharmacokinetic profiles for humans. 3. The second is prediction of human-specific or disproportionate metabolites. Chimeric mice reproduce human-specific metabolites of drugs under development to conform to ICH guidance M3(R2), except for compounds that were extensively eliminated by co-existing mouse hepatocytes. 4. The third is identifying metabolites with distinct pharmacokinetic profiles in humans. Slow metabolite elimination specifically in humans increases its exposure level, but if its elimination is faster in laboratory animals, the animal exposure level might not satisfy ICH guidance M3(R2). 5. Finally, two examples of reproducing acute liver toxicity in chimeric mice are introduced. Integrated pharmacokinetics, metabolism and toxicity information are expected to assist pharmaceutical scientists in selecting the best candidate compound in new drug development.

  15. Levy flights and random searches

    Energy Technology Data Exchange (ETDEWEB)

    Raposo, E P [Laboratorio de Fisica Teorica e Computacional, Departamento de Fisica, Universidade Federal de Pernambuco, Recife-PE, 50670-901 (Brazil); Buldyrev, S V [Department of Physics, Yeshiva University, New York, 10033 (United States); Da Luz, M G E [Departamento de Fisica, Universidade Federal do Parana, Curitiba-PR, 81531-990 (Brazil); Viswanathan, G M [Instituto de Fisica, Universidade Federal de Alagoas, Maceio-AL, 57072-970 (Brazil); Stanley, H E [Center for Polymer Studies and Department of Physics, Boston University, Boston, MA 02215 (United States)

    2009-10-30

    In this work we discuss some recent contributions to the random search problem. Our analysis includes superdiffusive Levy processes and correlated random walks in several regimes of target site density, mobility and revisitability. We present results in the context of mean-field-like and closed-form average calculations, as well as numerical simulations. We then consider random searches performed in regular lattices and lattices with defects, and we discuss a necessary criterion for distinguishing true superdiffusion from correlated random walk processes. We invoke energy considerations in relation to critical survival states on the edge of extinction, and we analyze the emergence of Levy behavior in deterministic search walks. Finally, we comment on the random search problem in the context of biological foraging.
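
    For readers who want to experiment, a Levy walk of the sort analyzed here is easy to simulate: draw step lengths from a power-law tail and directions uniformly. This is a generic sketch, not the authors' model; the exponent mu and the lower cutoff are free parameters.

    ```python
    import numpy as np

    def levy_walk(n_steps, mu=2.0, l_min=1.0, seed=0):
        """2-D random walk with power-law step lengths P(l) ~ l^(-mu), l >= l_min."""
        rng = np.random.default_rng(seed)
        u = rng.random(n_steps)
        lengths = l_min * u ** (-1.0 / (mu - 1.0))   # inverse-CDF sampling
        angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
        steps = np.column_stack((lengths * np.cos(angles),
                                 lengths * np.sin(angles)))
        return np.cumsum(steps, axis=0)

    path = levy_walk(10_000, mu=2.0)   # mu near 2 is the classic optimal-search regime
    print("net displacement:", np.hypot(*path[-1]))
    ```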

  16. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    -aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also

  17. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    Science.gov (United States)

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  18. Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5

    International Nuclear Information System (INIS)

    Wilderman, S.J.; Bielajew, A.F.

    2005-01-01

    The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes and hence shorter computation times than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making selection of step-sizes a daunting task for novice users. Further contributing to this problem, because of the decoupling of multiple scattering and continuous energy loss in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, to increase performance by decreasing the amount of effort expended simulating lower energy particles, EGS5 permits the fractional energy loss values which are used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy. This results in requiring the user to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material dependent input related to the size of problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)

  19. Geological site selection studies for the final disposal of spent nuclear fuel in Finland

    International Nuclear Information System (INIS)

    Salmi, M.; Vuorela, P.; Kuivamaeki, A.

    1985-10-01

    have been met with that should be avoided in the sites to be selected for the final disposal of nuclear waste

  20. Specified radioactive waste final disposal act

    International Nuclear Information System (INIS)

    Yasui, Masaya

    2001-01-01

    Radioactive wastes must be finally and safely disposed of far from human activities. The disposal act is a long-range task and needs to be understood and accepted by the public for site selection. This paper explains the basic policy of the Japanese Government for the final disposal act for specified radioactive wastes, the examination of site selection guidelines to promote residents' understanding, the general concept of the multi-barrier system for isolating the specified radioactive wastes, and research and technical development for radioactive waste management. (S. Ohno)

  1. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  2. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    Directory of Open Access Journals (Sweden)

    Jun-He Yang

    2017-01-01

    Full Text Available Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating a missing value followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated, in temporal order, into an integrated research dataset. The proposed time-series forecasting model has three main foci. First, this study uses five imputation methods to handle missing values rather than deleting them directly. Second, we identified the key variable via factor analysis and then deleted the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which was compared with the listed methods in terms of forecasting error. These experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, this experiment shows that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.

  3. Further development of public participation in the site-selection and approval process of a final repository in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Barth, Regine; Kallenbach-Herbert, Beate [Oeko-Institut e.V., Institute for Applied Ecology, Darmstadt (Germany); Arens, Georg [Federal Office for Radiation Protection (BfS), Salzgitter (Germany)

    2006-09-15

    This paper reflects the first findings of a current research project funded by the German Federal Office for Radiation Protection and conducted by an interdisciplinary working group of the OEko-Institute. One focus of this project is the systematic analysis of past and existing participatory processes in different nuclear and non-nuclear projects. On the basis of this analysis and a literature review a specific concept for public participation in the site-selection and approval process of a repository for high radioactive waste (HAW repository) in Germany will be derived. The concept shall foster transparency and acceptance. The working group of the OEko-Institute combines long standing research experience and an intimate knowledge of radioactive waste management including political, technical, management and social problems of final disposal on the one hand. On the other hand members play an active role in stakeholder processes of different non-nuclear projects as well as experience with a wide range of participative measures and their impact. This allows an approach which integrates the specific features of radioactive waste disposal with a wider perspective on the demands and opportunities of stakeholder processes. The procedure of site selection for a HAW repository in Germany still has to be specified. The procedure introduced by the 'Committee on a Site Selection Procedure for Repository Sites' (Arbeitskreis Auswahlverfahren Endlagerstandorte - AkEnd) has not been adopted. The Committee had suggested installing a negotiation group to discuss the AkEnd proposals in the so called 'Phase II'. This suggestion could not be followed because not all relevant stakeholders were willing to participate. An internal draft for a federal law implementing main elements of the AkEnd findings was developed by the Ministry for Environment in 2005, but has never been brought to the cabinet. Due to the change of Government in Germany, the next steps still are

  4. Further development of public participation in the site-selection and approval process of a final repository in Germany

    International Nuclear Information System (INIS)

    Barth, Regine; Kallenbach-Herbert, Beate; Arens, Georg

    2006-01-01

    This paper reflects the first findings of a current research project funded by the German Federal Office for Radiation Protection and conducted by an interdisciplinary working group of the OEko-Institute. One focus of this project is the systematic analysis of past and existing participatory processes in different nuclear and non-nuclear projects. On the basis of this analysis and a literature review a specific concept for public participation in the site-selection and approval process of a repository for high radioactive waste (HAW repository) in Germany will be derived. The concept shall foster transparency and acceptance. The working group of the OEko-Institute combines long standing research experience and an intimate knowledge of radioactive waste management including political, technical, management and social problems of final disposal on the one hand. On the other hand members play an active role in stakeholder processes of different non-nuclear projects as well as experience with a wide range of participative measures and their impact. This allows an approach which integrates the specific features of radioactive waste disposal with a wider perspective on the demands and opportunities of stakeholder processes. The procedure of site selection for a HAW repository in Germany still has to be specified. The procedure introduced by the 'Committee on a Site Selection Procedure for Repository Sites' (Arbeitskreis Auswahlverfahren Endlagerstandorte - AkEnd) has not been adopted. The Committee had suggested installing a negotiation group to discuss the AkEnd proposals in the so called 'Phase II'. This suggestion could not be followed because not all relevant stakeholders were willing to participate. An internal draft for a federal law implementing main elements of the AkEnd findings was developed by the Ministry for Environment in 2005, but has never been brought to the cabinet. Due to the change of Government in Germany, the next steps still are under consideration

  5. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm which generates random sampling patterns dedicated for event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.

  6. Thermodynamic method for generating random stress distributions on an earthquake fault

    Science.gov (United States)

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
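
    The core construction, drawing independent spectral coefficients weighted by a target power spectral density and transforming back, can be sketched in a few lines. The k^(-2)-style falloff below is a placeholder, not the formula the report derives from earthquake scaling relations.

    ```python
    import numpy as np

    def random_stress_1d(n, psd_exponent=2.0, seed=0):
        """Random field whose power spectral density falls off as k^(-psd_exponent)."""
        rng = np.random.default_rng(seed)
        k = np.fft.rfftfreq(n)
        k[0] = np.inf                            # zero out the mean (k = 0) mode
        amplitude = k ** (-psd_exponent / 2.0)   # amplitude = sqrt(PSD)
        coeffs = rng.normal(size=k.size) + 1j * rng.normal(size=k.size)
        return np.fft.irfft(amplitude * coeffs, n)

    stress = random_stress_1d(4096)
    print("std of synthetic stress field:", round(float(stress.std()), 4))
    ```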

  7. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.
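
    The experimental design is easy to mimic on synthetic data: train one SVM whose negatives are confirmed inactives and one whose negatives are random database compounds, then compare their behavior on a common screening set. Everything below is a stand-in (Gaussian vectors instead of molecular fingerprints), intended only to show the shape of the comparison.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)
    actives = rng.normal(1.0, 1.0, size=(100, 50))
    confirmed_inactive = rng.normal(-0.5, 1.0, size=(100, 50))
    random_db = rng.normal(0.0, 1.5, size=(100, 50))
    screen = rng.normal(0.5, 1.2, size=(500, 50))   # common background database

    for name, neg in [("confirmed inactives", confirmed_inactive),
                      ("random compounds", random_db)]:
        X = np.vstack([actives, neg])
        y = np.r_[np.ones(100), np.zeros(100)]
        svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
        print(f"negatives = {name}: hits flagged =", int(svm.predict(screen).sum()))
    ```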

  8. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
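
    The reported best configuration, a Random Forest fed a small number of recent lags, can be reproduced in outline as follows. The random-walk series below is a stand-in for the ARFIMA or temperature data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(2)
    series = np.cumsum(rng.normal(size=500))   # synthetic stand-in series

    n_lags = 3                                 # "low number of recent lagged variables"
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]

    rf = RandomForestRegressor(n_estimators=300, random_state=0)
    rf.fit(X[:-1], y[:-1])                     # hold out the final step
    print("one-step forecast:", rf.predict(X[-1:])[0], "actual:", y[-1])
    ```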

  9. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  10. Hydrogen selective membrane for the natural gas system. Development of CO{sub 2}-selective biogas membrane. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Vestboe, A.P.

    2012-02-15

    The project started as a literature study and technology development project for a hydrogen selective membrane for the natural gas system. The introduction of hydrogen (for example produced from wind turbines by surplus electricity) in the gas system makes it possible to store energy which can be selectively used with high energy conversion in fuel cells directly located at the end users. In order to make this possible, it is necessary to have a separating unit that can selectively remove hydrogen from the gas mixture and deliver it as fuel to the electrical generator (a fuel cell). In the project, several existing technologies were evaluated with regard to the application in view. It was concluded that while other technologies are mature, they are costly in energy and unsuitable for the relatively low-capacity applications in question close to the end users. Membrane technology was evaluated to be the most suitable, although the technology is still under development in many cases. In the project it was found that metallic membranes in the form of palladium-coated stainless steel discs would answer the need for the high purity required. Laboratory development yielded discs that could separate hydrogen from natural gas; however, the flux was low compared to the needs of the application. It was found that a pressure difference of at least 2 bar of hydrogen would be needed to get a high enough flux. Achieving this pressure would necessitate a compressor which would consume enough energy to invalidate the concept. When concluding on the results and the study, it was found that the direction of the project could be changed towards developing CO{sub 2}-selective membranes with the goal of developing membrane technology that could upgrade biogas by removing CO{sub 2}. The laboratory equipment and setup that were developed in the first part of the project could be used directly in this second part. In this second part of the project it was

  11. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
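
    The naive generator whose bias the paper exposes takes only a few lines: grow the walk one unit step at a time, resampling any step that exits the confinement sphere. This sketch implements that baseline (not the corrected algorithm the authors propose), which is enough to reproduce the non-uniform vertex distribution they analyze.

    ```python
    import numpy as np

    def confined_equilateral_walk(n_steps, radius=3.0, seed=0):
        """Unit-step walk in a sphere; each step uniform given the previous vertex."""
        rng = np.random.default_rng(seed)
        points = [np.zeros(3)]
        while len(points) <= n_steps:
            direction = rng.normal(size=3)
            direction /= np.linalg.norm(direction)   # uniform on the unit sphere
            candidate = points[-1] + direction
            if np.linalg.norm(candidate) <= radius:  # reject steps leaving the sphere
                points.append(candidate)
        return np.array(points)

    walk = confined_equilateral_walk(100)
    print("max distance from center:", np.linalg.norm(walk, axis=1).max())
    ```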

  12. Quantum randomness and unpredictability

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)

    2017-06-15

    Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's "symbolic" derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  13. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus and corresponding to the epitope mapped by affinity selection may also be used as a vaccine.

  14. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled, a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which should only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition.

  15. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    1997-01-01

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled, a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which should only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition.

  16. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this particular property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice to be included in the selection of potential miRNA candidates for experimental verification.
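
    Once the mean and standard deviation of the randomized-sequence MFE distribution are available (pre-computed or interpolated, as in the paper), scoring a candidate reduces to a normal-distribution tail probability. The numbers below are placeholders, not values from the study.

    ```python
    from math import erf, sqrt

    def mfe_p_value(mfe_candidate, mu_random, sigma_random):
        """P(random-sequence MFE <= candidate MFE) under a normal model."""
        z = (mfe_candidate - mu_random) / sigma_random
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    # A pre-miRNA-like hairpin should fall in the low tail of the random MFEs.
    print(mfe_p_value(-42.0, mu_random=-28.0, sigma_random=5.0))
    ```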

  17. Final disposal of radioactive wastes. Site selection criteria. Technical and economical factors

    International Nuclear Information System (INIS)

    Granero, J.J.

    1984-01-01

    General considerations and geological and socioeconomic criteria for the final disposal of radioactive wastes in geological formations are treated. More attention is given to the final disposal of high level radioactive wastes and to the different solutions pursued abroad which seem of interest for Spain. (author)

  18. Peer-selected "best papers" - are they really that "good"?

    Science.gov (United States)

    Wainer, Jacques; Eckmann, Michael; Rocha, Anderson

    2015-01-01

    Peer evaluation is the cornerstone of science evaluation. In this paper, we analyze whether or not a form of peer evaluation, the pre-publication selection of the best papers in Computer Science (CS) conferences, is better than random, when considering future citations received by the papers. Considering 12 conferences (for several years), we collected the citation counts from Scopus for both the best papers and the non-best papers. For a different set of 17 conferences, we collected the data from Google Scholar. For each data set, we computed the proportion of cases whereby the best paper has more citations. We also compare this proportion for years before 2010 and after to evaluate if there is a propaganda effect. Finally, we count the proportion of best papers that are in the top 10% and 20% most cited for each conference instance. The probability that a best paper will receive more citations than a non best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited. There is strong evidence that the selection of best papers in Computer Science conferences is better than a random selection, and that a significant number of the best papers are among the top cited papers in the conference.
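
    The headline statistic, the probability that a best paper out-cites a non-best paper, is the same quantity as an AUC over all best/non-best pairs (counting ties as half). A toy computation with made-up citation counts:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    best = rng.negative_binomial(5, 0.1, size=50)      # hypothetical citation counts
    others = rng.negative_binomial(5, 0.2, size=500)

    wins = (best[:, None] > others[None, :]).mean()
    ties = (best[:, None] == others[None, :]).mean()
    print("P(best paper > non-best paper):", round(wins + 0.5 * ties, 3))
    ```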

  19. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  20. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  1. Non-Selective Evolution of Growing Populations.

    Directory of Open Access Journals (Sweden)

    Karl Wienand

    Full Text Available Non-selective effects, like genetic drift, are an important factor in modern conceptions of evolution, and have been extensively studied for constant population sizes (Kimura, 1955; Otto and Whitlock, 1997). Here, we consider non-selective evolution in the case of growing populations that are of small size and have varying trait compositions (e.g. after a population bottleneck). We find that, in these conditions, populations never fixate to a trait, but tend to a random limit composition, and that the distribution of compositions "freezes" to a steady state. This final state is crucially influenced by the initial conditions. We obtain these findings from a combined theoretical and experimental approach, using multiple mixed subpopulations of two Pseudomonas putida strains in non-selective growth conditions (Matthijs et al, 2009) as a model system. The experimental results for the population dynamics match the theoretical predictions based on the Pólya urn model (Eggenberger and Pólya, 1923) for all analyzed parameter regimes. In summary, we show that exponential growth stops genetic drift. This result contrasts with previous theoretical analyses of non-selective evolution (e.g. genetic drift), which investigated how traits spread and eventually take over populations (fixate) (Kimura, 1955; Otto and Whitlock, 1997). Moreover, our work highlights how deeply growth influences non-selective evolution, and how it plays a key role in maintaining genetic variability. Consequently, it is of particular importance in life-cycle models (Melbinger et al, 2010; Cremer et al, 2011; Cremer et al, 2012) of periodically shrinking and expanding populations.

  2. Aprendizaje supervisado mediante random forests

    OpenAIRE

    Molero del Río, María Cristina

    2017-01-01

    Many real-life problems can be modeled as classification problems, such as the early detection of diseases or the granting of credit to a given individual. Supervised Classification deals with this type of problem: it learns from a sample with the final objective of inferring future observations. Nowadays, there is a wide range of Supervised Classification techniques. In this work we focus on Random Forests. The Random Forest...

  3. EU DEMO blanket concepts safety assessment. Final report of Working Group 6a of the Blanket Concept Selection Exercise

    International Nuclear Information System (INIS)

    Kleefeldt, K.; Porfiri, T.

    1996-06-01

    The European Union has been engaged since 1989 in a programme to develop tritium breeding blankets for application in a fusion power reactor. There are four blanket concepts under development. Two of them use lithium ceramics, the other two concepts employ an eutectic lead-lithium alloy (Pb-17Li) as breeder material. The two most promising concepts were to be selected in 1995 for further development. In order to prepare the selection, a Blanket Concept Selection Exercise (BCSE) was initiated by the participating associations under the auspices of the European Commission. This BCSE has been performed in 14 working groups which, in a comparative evaluation of the four blanket concepts, addressed specific fields. The safety working group addressed the safety implications. This report describes the methodology adopted, the safety issues identified, their comparative evaluation for the four concepts, and the results and conclusions of the working group to be entered into the overall evaluation. There, the results from all 14 working groups have been combined to yield a final ranking as a basis for the selection. In summary, the safety assessment showed that the four European blanket concepts can be considered as equivalent in terms of the safety rating adopted, with each concept, however, raising safety concerns of a different kind in different areas, which are substantiated in this report. (orig.) [de

  4. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  5. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)
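
    In practice, the unbiased selection the activity teaches is a draw without replacement from the candidate pool, for example:

    ```python
    import random

    subjects = [f"student_{i:02d}" for i in range(1, 31)]  # hypothetical roster
    random.seed(42)                                        # reproducible classroom draw
    print(random.sample(subjects, k=8))                    # 8 subjects, no repeats
    ```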

  6. GuiTope: an application for mapping random-sequence peptides to protein sequences.

    Science.gov (United States)

    Halperin, Rebecca F; Stafford, Phillip; Emery, Jack S; Navalkar, Krupa Arun; Johnston, Stephen Albert

    2012-01-03

    Random-sequence peptide libraries are a commonly used tool to identify novel ligands for binding antibodies, other proteins, and small molecules. It is often of interest to compare the selected peptide sequences to the natural protein binding partners to infer the exact binding site or the importance of particular residues. The ability to search a set of sequences for similarity to a set of peptides may sometimes enable the prediction of an antibody epitope or a novel binding partner. We have developed a software application designed specifically for this task. GuiTope provides a graphical user interface for aligning peptide sequences to protein sequences. All alignment parameters are accessible to the user including the ability to specify the amino acid frequency in the peptide library; these frequencies often differ significantly from those assumed by popular alignment programs. It also includes a novel feature to align di-peptide inversions, which we have found improves the accuracy of antibody epitope prediction from peptide microarray data and shows utility in analyzing phage display datasets. Finally, GuiTope can randomly select peptides from a given library to estimate a null distribution of scores and calculate statistical significance. GuiTope provides a convenient method for comparing selected peptide sequences to protein sequences, including flexible alignment parameters, novel alignment features, ability to search a database, and statistical significance of results. The software is available as an executable (for PC) at http://www.immunosignature.com/software and ongoing updates and source code will be available at sourceforge.net.

  7. GuiTope: an application for mapping random-sequence peptides to protein sequences

    Directory of Open Access Journals (Sweden)

    Halperin Rebecca F

    2012-01-01

    Full Text Available Abstract Background Random-sequence peptide libraries are a commonly used tool to identify novel ligands for binding antibodies, other proteins, and small molecules. It is often of interest to compare the selected peptide sequences to the natural protein binding partners to infer the exact binding site or the importance of particular residues. The ability to search a set of sequences for similarity to a set of peptides may sometimes enable the prediction of an antibody epitope or a novel binding partner. We have developed a software application designed specifically for this task. Results GuiTope provides a graphical user interface for aligning peptide sequences to protein sequences. All alignment parameters are accessible to the user including the ability to specify the amino acid frequency in the peptide library; these frequencies often differ significantly from those assumed by popular alignment programs. It also includes a novel feature to align di-peptide inversions, which we have found improves the accuracy of antibody epitope prediction from peptide microarray data and shows utility in analyzing phage display datasets. Finally, GuiTope can randomly select peptides from a given library to estimate a null distribution of scores and calculate statistical significance. Conclusions GuiTope provides a convenient method for comparing selected peptide sequences to protein sequences, including flexible alignment parameters, novel alignment features, ability to search a database, and statistical significance of results. The software is available as an executable (for PC) at http://www.immunosignature.com/software and ongoing updates and source code will be available at sourceforge.net.

  8. Creating, generating and comparing random network models with NetworkRandomizer.

    Science.gov (United States)

    Tosadori, Gabriele; Bestvina, Ivan; Spoto, Fausto; Laudanna, Carlo; Scardoni, Giovanni

    2016-01-01

    Biological networks are becoming a fundamental tool for the investigation of high-throughput data in several fields of biology and biotechnology. With the increasing amount of information, network-based models are gaining more and more interest and new techniques are required in order to mine the information and to validate the results. To fill the validation gap we present an app, for the Cytoscape platform, which aims at creating randomised networks and randomising existing, real networks. Since there is a lack of tools that allow performing such operations, our app aims at enabling researchers to exploit different, well known random network models that could be used as a benchmark for validating real, biological datasets. We also propose a novel methodology for creating random weighted networks, i.e. the multiplication algorithm, starting from real, quantitative data. Finally, the app provides a statistical tool that compares real versus randomly computed attributes, in order to validate the numerical findings. In summary, our app aims at creating a standardised methodology for the validation of the results in the context of the Cytoscape platform.
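
    The validation pattern the app supports, comparing a real network's attribute against an ensemble of degree-preserving randomizations, can be sketched outside Cytoscape with networkx (this uses networkx's double_edge_swap, not the app's own models):

    ```python
    import networkx as nx
    import numpy as np

    real = nx.karate_club_graph()
    observed = nx.average_clustering(real)

    randomized = []
    for seed in range(100):
        g = real.copy()
        # Rewire while preserving every node's degree.
        nx.double_edge_swap(g, nswap=10 * g.number_of_edges(),
                            max_tries=10**5, seed=seed)
        randomized.append(nx.average_clustering(g))

    print("observed clustering:", round(observed, 3),
          "| randomized mean:", round(float(np.mean(randomized)), 3))
    ```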

  9. Requirements for facilities transferring or receiving select agents. Final rule.

    Science.gov (United States)

    2001-08-31

    CDC administers regulations that govern the transfer of certain biological agents and toxins ("select agents"). These regulations require entities that transfer or receive select agents to register with CDC and comply with biosafety standards contained in the Third Edition of the CDC/NIH publication "Biosafety in Microbiological and Biomedical Laboratories ("BMBL")." On October 28,1999, CDC published a Notice of Proposed Rulemaking ("NPRM") seeking both to revise the biosafety standards facilities must follow when handling select agents and to provide new biosecurity standards for such facilities. These new standards are contained in the Fourth Edition of BMBL, which the NPRM proposed to incorporate by reference, thereby replacing the Third Edition. No comments were received in response to this proposal. CDC is therefore amending its regulations to incorporate the Fourth Edition.

  10. The MIXMAX random number generator

    Science.gov (United States)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail in empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat-map: the first computes multiplication by the MIXMAX matrix in O(N) operations; the second recursively computes its characteristic polynomial in O(N^2) operations; and the third applies skips of a large number of steps S to the sequence in O(N^2 log(S)) operations.
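
    The general mechanism, iterating an integer state vector under a unimodular matrix modulo a large prime, can be shown with a toy 3x3 example. The matrix below merely has determinant 1 for illustration; it is not the actual MIXMAX matrix, and a real implementation would use the paper's O(N) multiplication trick rather than a dense product.

    ```python
    P = 2**61 - 1                 # Mersenne prime modulus

    A = [[1, 1, 1],
         [1, 2, 2],
         [1, 2, 3]]               # toy unimodular matrix (det = 1)

    def step(state):
        """One iteration of the matrix map: state <- A @ state (mod P)."""
        return tuple(sum(a * s for a, s in zip(row, state)) % P for row in A)

    state = (1, 2, 3)
    for _ in range(5):
        state = step(state)
        print(state[0] / P)       # normalized output in [0, 1)
    ```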

  11. Correlates of smoking with socioeconomic status, leisure time physical activity and alcohol consumption among Polish adults from randomly selected regions.

    Science.gov (United States)

    Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna

    2010-12-01

    To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults. 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drink alcohol several times/week or every day in comparison to non-drinkers (p times higher compared to those with high educational attainment (p = 0.007). Among women we observed that students were the most frequent smokers. Female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicated that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among low educated men, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with the high risk of smoking will help in planning future preventive strategies.

  12. Random clustering ferns for multimodal object recognition

    OpenAIRE

    Villamizar Vergel, Michael Alejandro; Garrell Zulueta, Anais; Sanfeliu Cortés, Alberto; Moreno-Noguer, Francesc

    2017-01-01

    The final publication is available at link.springer.com. We propose an efficient and robust method for the recognition of objects exhibiting multiple intra-class modes, where each one is associated with a particular object appearance. The proposed method, called random clustering ferns, synergistically combines a single real-time classifier, based on the boosted assembly of extremely randomized trees (ferns), with an unsupervised and probabilistic approach in order to recognize efficient...

  13. COSTECH - HURIA JOURNAL VOL. 24 (2) COSTECH FINAL_NEW

    African Journals Online (AJOL)

    Prof Kigadye

    consecutive days; finally 6 animals were randomly picked from each treatment and slaughtered ... Organoleptic tests were conducted on samples of the mutton and goat meat ... Asian-Australasian Journal of Animal Sciences 27(1): 55-60.

  14. Marginal Bone Remodeling around healing Abutment vs Final Abutment Placement at Second Stage Implant Surgery: A 12-month Randomized Clinical Trial.

    Science.gov (United States)

    Nader, Nabih; Aboulhosn, Maissa; Berberi, Antoine; Manal, Cordahi; Younes, Ronald

    2016-01-01

    The periimplant bone level has been used as one of the criteria to assess the success of dental implants. It has been documented that the bone supporting two-piece implants undergoes resorption first following the second-stage surgery and later on further to abutment connection and delivery of the final prosthesis. The aim of this multicentric randomized clinical trial was to evaluate the crestal bone resorption around internal-connection dental implants using a new surgical protocol that aims to respect the biological distance, relying on the benefit of a friction-fit connection abutment (test group), compared with implants receiving conventional healing abutments at second-stage surgery (control group). A total of ... partially edentulous patients were consecutively treated at two private clinics, each with two adjacent two-stage implants. Three months after the first surgery, one of the implants was randomly allocated to the control group and was uncovered using a healing abutment, while the other implant received a standard final abutment that was seated and tightened to 30 Ncm. At each step of the prosthetic try-in, the abutment in the test group was removed and then retightened to 30 Ncm. Horizontal bone changes were assessed using periapical radiographs immediately after implant placement and at the 3- (second-stage surgery), 6-, 9- and 12-month follow-up examinations. At the 12-month follow-up, no implant failure was reported in either group. In the control group, the mean periimplant bone resorption was 0.249 ± 0.362 at M3, 0.773 ± 0.413 at M6, 0.904 ± 0.36 at M9 and 1.047 ± 0.395 at M12. The test group revealed a statistically significant lower marginal bone loss of 20.88% at M3 (0.197 ± 0.262), 22.25% at M6 (0.601 ± 0.386), 24.23% at M9 (0.685 ± 0.341) and 19.2% at M12 (0.846 ± 0.454). The results revealed that bone loss increased over time, with the greatest change in bone loss occurring between 3 and 6 months. Alveolar bone loss was significantly greater in the ...

  15. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    Science.gov (United States)

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity for cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90 min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90 min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220 min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used generalized estimating equation (GEE) analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  16. Selection of 3013 Containers for Field Surveillance

    International Nuclear Information System (INIS)

    Larry Peppers; Elizabeth Kelly; James McClard; Gary Friday; Theodore Venetz; Jerry Stakebade

    2007-01-01

    This report revises and combines three earlier reports dealing with the binning, statistical sampling, and sample selection of 3013 containers for field surveillance. It includes changes to the binning specification resulting from completion of the Savannah River Site packaging campaign and new information from the shelf-life program and field surveillance activities. The revised bin assignments result in changes to the random sample specification. These changes are necessary to meet the statistical requirements of the surveillance program. This report will be reviewed regularly and revised as needed. Section 1 of this report summarizes the results of an extensive effort to assign all of the current and projected 3013 containers in the Department of Energy (DOE) inventory to one of three bins (Innocuous, Pressure and Corrosion, or Pressure) based on potential failure mechanisms. Grouping containers into bins provides a framework to make a statistical selection of individual containers from the entire population for destructive and nondestructive field surveillance. The binning process consisted of three main steps. First, the packaged containers were binned using information in the Integrated Surveillance Program database and a decision tree. The second task was to assign those containers that could not be binned using the decision tree to a specific bin using container-by-container engineering review. The final task was to evaluate containers not yet packaged and assign them to bins using process knowledge. The technical basis for the decisions made during the binning process is included in Section 1. A composite decision tree and a summary table show all of the containers projected to be in the DOE inventory at the conclusion of packaging at all sites. Decision trees that provide an overview of the binning process and logic are included for each site. Section 2 of this report describes the approach to the statistical selection of containers for surveillance and ...
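
    The bin-then-sample workflow described above is simple to express in code. A minimal sketch of proportional random sampling within bins follows; the bin labels echo the report, but the inventory, sample fraction, and identifiers are made-up stand-ins, not surveillance-program values.

        import random

        random.seed(42)
        # Hypothetical container inventory: (container_id, bin) pairs.
        bins = ["Innocuous", "Pressure and Corrosion", "Pressure"]
        inventory = [(f"3013-{i:04d}", random.choice(bins)) for i in range(500)]

        def stratified_sample(items, fraction=0.05):
            by_bin = {}
            for cid, b in items:
                by_bin.setdefault(b, []).append(cid)
            # Draw a proportional random sample per bin (at least one container each).
            return {b: random.sample(ids, max(1, round(fraction * len(ids))))
                    for b, ids in by_bin.items()}

        for b, picks in stratified_sample(inventory).items():
            print(b, picks)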

  17. What is quantum in quantum randomness?

    Science.gov (United States)

    Grangier, P; Auffèves, A

    2018-07-13

    It is often said that quantum and classical randomness are of different nature, the former being ontological and the latter epistemological. However, so far the question of 'What is quantum in quantum randomness?', i.e. what is the impact of quantization and discreteness on the nature of randomness, remains to be answered. In a first part, we make explicit the differences between quantum and classical randomness within a recently proposed ontology for quantum mechanics based on contextual objectivity. In this view, quantum randomness is the result of contextuality and quantization. We show that this approach strongly impacts the purposes of quantum theory as well as its areas of application. In particular, it challenges current programmes inspired by classical reductionism, aiming at the emergence of the classical world from a large number of quantum systems. In a second part, we analyse quantum physics and thermodynamics as theories of randomness, unveiling their mutual influences. We finally consider new technological applications of quantum randomness that have opened up in the emerging field of quantum thermodynamics. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  18. Development of a selective surface vacuum collector. Final report

    Energy Technology Data Exchange (ETDEWEB)

    de Waal, H.; Simonis, F.

    1980-01-01

    To make solar energy useful for cooling applications, a flat-plate high-performance collector, which can supply solar energy at 100 to 150 °C, has been developed. To achieve a reasonable efficiency at these temperatures the thermal heat loss must be very small. This has been obtained by (1) concentration of sunlight (c = 1.6); (2) evacuation of the collector housing to eliminate convection currents (pressure ≤ 4 kPa); (3) a spectrally selective coating on the absorber; and (4) a low-conductivity gas in the collector housing (pressure ≈ 2 kPa). The collector consists of a metal box with a glass cover hermetically sealed to it in the way double-glazing units are manufactured. The sides of the V-trough concentrators support the glass cover. Measurements have been performed concerning the heat-loss factor and the durability of the vacuum. The first prototype, fitted with a spectrally selective coating of tin oxide on enameled steel (ε = 0.25), showed a heat loss of 2.0 W/m²·°C at 90 °C, in reasonable agreement with calculations. Improvements with respect to the spectrally selective coating and the use of a low-conductivity gas are necessary and will lead to a heat-loss factor of about 1 W/m²·°C. Measurements have shown that in the chosen system the desired vacuum level can be maintained for at least 10 to 15 years.

  19. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals and are therefore involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology today. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure, which copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered two-fold, 20-fold, and 6-fold increases in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
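
    The classification stage described above - a random forest trained on SMOTE-balanced data - can be sketched with scikit-learn and imbalanced-learn. The features and class ratio below are synthetic stand-ins, not the hairpin descriptors Mirnacle actually extracts.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from imblearn.over_sampling import SMOTE

        # Synthetic imbalanced data: ~5% positives, mimicking rare true pre-miRNAs.
        X, y = make_classification(n_samples=2000, n_features=20,
                                   weights=[0.95], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        # Oversample only the training fold, then fit the forest.
        X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal, y_bal)
        print("held-out accuracy:", clf.score(X_te, y_te))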

  20. The Fault Diagnosis of Rolling Bearing Based on Ensemble Empirical Mode Decomposition and Random Forest

    Directory of Open Access Journals (Sweden)

    Xiwen Qin

    2017-01-01

    Accurate diagnosis of rolling bearing faults is of great significance for the normal operation of machinery and equipment. A method combining Ensemble Empirical Mode Decomposition (EEMD) and Random Forest (RF) is proposed. Firstly, the original signal is decomposed into several intrinsic mode functions (IMFs) by EEMD, and the effective IMFs are selected. Then their energy entropy is calculated as the feature. Finally, the classification is performed by RF. In addition, a wavelet method is applied in the same processing chain in place of EEMD for comparison. The results of the comparison show that the EEMD method is more accurate than the wavelet method.
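
    The feature step - energy entropy over the retained IMFs - is compact to implement. The sketch below assumes the IMFs have already been produced (for example by the EEMD class of the PyEMD package) and uses synthetic narrow-band components in their place; only the entropy computation reflects the method described above.

        import numpy as np

        def energy_entropy(imfs):
            # imfs: array of shape (n_imfs, n_samples) from an EEMD decomposition.
            energies = np.array([np.sum(imf ** 2) for imf in imfs])
            p = energies / energies.sum()           # share of total energy per IMF
            return -np.sum(p * np.log(p + 1e-12))   # Shannon entropy of that split

        # Toy "IMFs": three frequency bands of a synthetic vibration signal.
        t = np.linspace(0, 1, 1024)
        imfs = np.vstack([np.sin(2 * np.pi * f * t) for f in (50, 120, 300)])
        print("energy entropy:", energy_entropy(imfs))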

  1. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  2. Countering the Consequences of Ego Depletion: The Effects of Self-Talk on Selective Attention.

    Science.gov (United States)

    Gregersen, Jón; Hatzigeorgiadis, Antonis; Galanis, Evangelos; Comoutos, Nikos; Papaioannou, Athanasios

    2017-06-01

    This study examined the effects of a self-talk intervention on selective attention in a state of ego depletion. Participants were 62 undergraduate students with a mean age of 20.02 years (SD = 1.17). The experiment was conducted in four consecutive sessions. Following baseline assessment, participants were randomly assigned into experimental and control groups. A two-session training was conducted for the two groups, with the experimental group using self-talk. In the final assessment, participants performed a selective attention test, including visual and auditory components, following a task inducing a state of ego depletion. The analysis showed that participants of the experimental group achieved a higher percentage of correct responses on the visual test and produced faster reaction times in both the visual and the auditory test compared with participants of the control group. The results of this study suggest that the use of self-talk can benefit selective attention for participants in states of ego depletion.

  3. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticidal, and cancer treatment selection pressures. This phenomenon operates in a mathematically predictable manner which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles of probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.
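
    As a hedged illustration of the kind of calculation involved (the paper's own equations are not reproduced here), the probability that at least one beneficial mutation occurs among n replications, each with per-replication mutation probability mu, is

        \[
        P(\text{at least one mutation}) = 1 - (1 - \mu)^{n} \approx 1 - e^{-n\mu},
        \]

    so resistance to a single selection pressure becomes likely only once n is of order 1/mu, and combining k independently acting pressures pushes the required number of replications toward order 1/mu^k - the usual argument for combining treatments.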

  4. A New Selectable Marker System for Genetic Studies of Bacteria: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Parsons, D; Tolmasky, M; Chain, P; Segelke, B W

    2011-03-18

    Genetic manipulations in bacteria currently rely on the introduction of antibiotic resistance genes into a bacterial strain; for those organisms that will be used for commercial or industrial applications, the genetic cassette encoding the antibiotic resistance is sometimes removed after selection. It is clear that if alternative technologies could obviate the need to introduce antibiotic resistance into bacteria, they would most certainly become a standard tool in molecular microbiology for commercial and industrial as well as research applications. Here, the authors present the development of a novel genetic engineering technology based on toxin-antitoxin systems to modify bacterial genomes without the use of antibiotic resistance in the mutagenesis process. The primary goal is to develop antibiotic-free selection for genetically altered select agent pathogens. They adapt the toxin-antitoxin system to enable gene replacement in select agent pathogens, since the NIH restrictions on introducing antibiotic resistance into select agent pathogens have hindered such research.

  5. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    The random walk hypothesis states that stock market prices do not follow a predictable trajectory, but are simply random. Before trying to predict a set of data, one should test it for randomness, because, despite the power and complexity of the available models, the results cannot otherwise be trusted. There are several methods for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
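
    A common way to put the hypothesis to a test, in the spirit of the R workflow mentioned above, is a unit-root test on log prices: under a random walk the series has a unit root. A minimal Python sketch with statsmodels follows, run on simulated data since no real price series is given here.

        import numpy as np
        from statsmodels.tsa.stattools import adfuller

        rng = np.random.default_rng(1)
        log_price = np.cumsum(rng.normal(0.0, 0.01, 1000))  # simulated random-walk log prices

        stat, pvalue, *_ = adfuller(log_price)
        # A large p-value means a unit root cannot be rejected - consistent with a random walk.
        print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")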

  6. The Goodness of Covariance Selection Problem from AUC Bounds

    OpenAIRE

    Khajavi, Navid Tafaghodi; Kuh, Anthony

    2016-01-01

    We conduct a study of graphical models and discuss the quality of model selection approximation by formulating the problem as a detection problem and examining the area under the curve (AUC). We specifically look at the model selection problem for jointly Gaussian random vectors. For Gaussian random vectors, this problem simplifies to the covariance selection problem, which is widely discussed in the literature by Dempster [1]. In this paper, we give the definition for the correlation appro...

  7. Organic Ferroelectric-Based 1T1T Random Access Memory Cell Employing a Common Dielectric Layer Overcoming the Half-Selection Problem.

    Science.gov (United States)

    Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping

    2017-09-01

    Organic electronics based on poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectric is facing great challenges in flexible circuits. As one indispensable part of integrated circuits, there is an urgent demand for low-cost and easy-fabrication nonvolatile memory devices. A breakthrough is made on a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor in order to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performances for driving or addressing purposes. As a result, a high hole mobility of 3.81 cm² V⁻¹ s⁻¹ (average) for 2,6-diphenylanthracene (DPA) and an electron mobility of 0.124 cm² V⁻¹ s⁻¹ (average) for N,N'-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN₂) are obtained in selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Variance Component Selection With Applications to Microbiome Taxonomic Data

    Directory of Open Access Journals (Sweden)

    Jing Zhai

    2018-03-01

    High-throughput sequencing technology has enabled population-based studies of the role of the human microbiome in disease etiology and exposure response. Microbiome data are summarized as counts or composition of the bacterial taxa at different taxonomic levels. An important problem is to identify the bacterial taxa that are associated with a response. One method is to test the association of a specific taxon with phenotypes in a linear mixed effect model, which incorporates phylogenetic information among bacterial communities. Another type of approach considers all taxa in a joint model and achieves selection via a penalization method, which ignores phylogenetic information. In this paper, we consider regression analysis by treating bacterial taxa at different levels as multiple random effects. For each taxon, a kernel matrix is calculated based on distance measures in the phylogenetic tree and acts as one variance component in the joint model. Taxonomic selection is then achieved by the lasso (least absolute shrinkage and selection operator) penalty on the variance components. Our method integrates biological information into the variable selection problem and greatly improves selection accuracy. Simulation studies demonstrate the superiority of our method versus existing methods such as the group lasso. Finally, we apply our method to a longitudinal microbiome study of Human Immunodeficiency Virus (HIV) infected patients. We implement our method using the high-performance computing language Julia. Software and detailed documentation are freely available at https://github.com/JingZhai63/VCselection.

  9. Random-walk simulation of selected aspects of dissipative collisions

    International Nuclear Information System (INIS)

    Toeke, J.; Gobbi, A.; Matulewicz, T.

    1984-11-01

    Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. Effective blocking of the drift through the mass flux induced by the temperature difference, while leaving the variances of the mass distributions unaltered, is found to be possible, provided an internuclear potential barrier is present. The presence of shell structure is found to lead to characteristic correlations between consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)

  10. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and that distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
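
    The two departures from randomness reported above - an excess of sequential pairs and a deficit of immediate repeats - are easy to quantify. A small sketch follows; the digit sequence is a made-up stand-in for a subject's call-outs.

        digits = [3, 4, 5, 1, 2, 7, 8, 2, 3, 9, 0, 1, 5, 6, 4, 5, 8, 9, 1, 2]

        pairs = list(zip(digits, digits[1:]))
        repeats = sum(1 for a, b in pairs if a == b)
        sequential = sum(1 for a, b in pairs if (b - a) % 10 == 1)  # e.g. 3->4, 9->0

        n = len(pairs)
        # For uniform random digits, each specific transition has probability 1/10.
        print(f"repeats: {repeats}/{n} (expected about {n / 10:.1f})")
        print(f"sequential pairs: {sequential}/{n} (expected about {n / 10:.1f})")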

  11. Selection and characterization of DNA aptamers

    NARCIS (Netherlands)

    Ruigrok, V.J.B.

    2013-01-01

    This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however, ...

  12. Pseudo-Random Number Generators

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1984-01-01

    Package features comprehensive selection of probabilistic distributions. Monte Carlo simulations resorted to whenever systems studied not amenable to deterministic analyses or when direct experimentation not feasible. Random numbers having certain specified distribution characteristics integral part of simulations. Package consists of collection of "pseudorandom" number generators for use in Monte Carlo simulations.

  13. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  14. The site selection process

    International Nuclear Information System (INIS)

    Kittel, J.H.

    1989-01-01

    One of the most arduous tasks associated with the management of radioactive wastes is the siting of new disposal facilities. Experience has shown that the performance of the disposal facility during and after disposal operations is critically dependent on the characteristics of the site itself. The site selection process consists of defining needs and objectives, identifying geographic regions of interest, screening and selecting candidate sites, collecting data on the candidate sites, and finally selecting the preferred site. Before the site selection procedures can be implemented, however, a formal legal system must be in place that defines broad objectives and, most importantly, clearly establishes responsibilities and accompanying authorities for the decision-making steps in the procedure. Site selection authorities should make every effort to develop trust and credibility with the public, local officials, and the news media. The responsibilities of supporting agencies must also be spelled out. Finally, a stable funding arrangement must be established so that activities such as data collection can proceed without interruption. Several examples, both international and within the US, are given

  15. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; the groups received the same volume of parecoxib or saline injection for 7 days. The necrotic area of the flap was measured, and specimens of the flap were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after the operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with the mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1, versus 2638.05 ± 132.2 in the control group (P < 0.01). The expression of VEGF in the study and control groups was 2779.45 ± 472.0 vs 4938.05 ± 123.6 (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was remarkably down-regulated as compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is proposed as the underlying biological mechanism.

  16. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Background: Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results: Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on the one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion: We propose to employ an alternative implementation of random forests that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and ...
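
    The bias is easy to reproduce: give a forest one informative binary predictor and one uninformative predictor with many distinct values, and impurity-based importance flatters the latter. The sketch below contrasts it with permutation importance, used here as a generic illustration; the authors' own remedy is a conditional-inference forest with subsampling, available in the R package party.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(0)
        n = 2000
        informative = rng.integers(0, 2, n)    # binary, truly predictive
        noise = rng.integers(0, 100, n)        # 100 distinct values, pure noise
        y = (informative ^ (rng.random(n) < 0.1)).astype(int)  # label = feature with 10% flips
        X = np.column_stack([informative, noise])

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        print("impurity importances:   ", clf.feature_importances_)
        perm = permutation_importance(clf, X, y, n_repeats=10, random_state=0)
        print("permutation importances:", perm.importances_mean)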

  17. Efficient Text Encryption and Hiding with Double-Random Phase-Encoding

    Directory of Open Access Journals (Sweden)

    Mohammad S. Alam

    2012-10-01

    In this paper, a double-random phase-encoding technique-based text encryption and hiding method is proposed. First, the secret text is transformed into a 2-dimensional array, and the higher bits of the elements in the transformed array are used to store the bit stream of the secret text, while the lower bits are filled with specific values. Then, the transformed array is encoded with the double-random phase-encoding technique. Finally, the encoded array is superimposed on an expanded host image to obtain the image embedded with hidden data. The performance of the proposed technique, including the hiding capacity, the recovery accuracy of the secret text, and the quality of the image embedded with hidden data, is tested via analytical modeling and a test data stream. Experimental results show that the secret text can be recovered either accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient. By using optical information processing techniques, the proposed method has been found to significantly improve the security of text information transmission, while ensuring hiding capacity at a prescribed level.
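
    The optical core of the scheme - double-random phase encoding - can be emulated numerically with two FFTs. The sketch below encrypts a small array holding text bytes and recovers it with conjugate phase masks; the array size and byte packing are illustrative assumptions, and the paper's bit-plane embedding into a host image is not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)

        def drpe_encrypt(img, phi1, phi2):
            # Random phase mask in the input plane, a second one in the Fourier plane.
            field = img * np.exp(2j * np.pi * phi1)
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * phi2))

        def drpe_decrypt(enc, phi1, phi2):
            # Invert with the conjugate phase masks.
            field = np.fft.ifft2(np.fft.fft2(enc) * np.exp(-2j * np.pi * phi2))
            return np.real(field * np.exp(-2j * np.pi * phi1))

        text = np.frombuffer(b"SECRET MESSAGE!!", dtype=np.uint8).reshape(4, 4).astype(float)
        phi1, phi2 = rng.random(text.shape), rng.random(text.shape)

        enc = drpe_encrypt(text, phi1, phi2)
        dec = drpe_decrypt(enc, phi1, phi2)
        print(bytes(np.rint(dec).astype(np.uint8).ravel()))  # b'SECRET MESSAGE!!'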

  18. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    Science.gov (United States)

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods can only deal with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt only a simple strategy, that is, transforming the multi-location proteins into multiple single-location proteins, which does not take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross-validation test on a benchmark dataset, we demonstrate that our proposed method, with its consideration of label correlations, clearly outperforms the baseline BR method without consideration of label correlations, indicating that correlations among different subcellular locations really exist and contribute to the improvement of prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public usage.

  19. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect ...

  20. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    Science.gov (United States)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  1. Intelligent Fault Diagnosis of HVCB with Feature Space Optimization-Based Random Forest.

    Science.gov (United States)

    Ma, Suliang; Chen, Mingxuan; Wu, Jianwen; Wang, Yuhao; Jia, Bowen; Jiang, Yuan

    2018-04-16

    Mechanical faults of high-voltage circuit breakers (HVCBs) inevitably occur over long-term operation, so extracting fault features and identifying the fault type have become key issues for ensuring the security and reliability of the power supply. Based on wavelet packet decomposition technology and the random forest algorithm, an effective identification system was developed in this paper. First, given the incomplete description provided by Shannon entropy, the wavelet packet time-frequency energy rate (WTFER) was adopted as the input vector for the classifier model in the feature selection procedure. Then, a random forest classifier was used to diagnose the HVCB fault, assess the importance of the feature variables and optimize the feature space. Finally, the approach was verified on actual HVCB vibration signals covering six typical fault classes. The comparative experimental results show that the classification accuracy of the proposed method reached 93.33% with the original feature space and up to 95.56% with the optimized input feature vector. This indicates that the feature optimization procedure is successful, and the proposed diagnosis algorithm has higher efficiency and robustness than traditional methods.

  2. Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView

    International Nuclear Information System (INIS)

    Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane

    2000-01-01

    Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that fits the experimental conditions best

  3. Data-Driven Derivation of an "Informer Compound Set" for Improved Selection of Active Compounds in High-Throughput Screening.

    Science.gov (United States)

    Paricharak, Shardul; IJzerman, Adriaan P; Jenkins, Jeremy L; Bender, Andreas; Nigsch, Florian

    2016-09-26

    Despite the usefulness of high-throughput screening (HTS) in drug discovery, for some systems, low assay throughput or high screening cost can prohibit the screening of large numbers of compounds. In such cases, iterative cycles of screening involving active learning (AL) are employed, creating the need for smaller "informer sets" that can be routinely screened to build predictive models for selecting compounds from the screening collection for follow-up screens. Here, we present a data-driven derivation of an informer compound set with improved predictivity of active compounds in HTS, and we validate its benefit over randomly selected training sets on 46 PubChem assays comprising at least 300,000 compounds and covering a wide range of assay biology. The informer compound set showed improvements in BEDROC(α = 100), PRAUC, and ROCAUC values, averaged over all assays, of 0.024, 0.014, and 0.016, respectively, compared to randomly selected training sets, all with paired t-test p-values < ...; the set was derived in an assay-agnostic fashion. This approach led to a consistent improvement in hit rates in follow-up screens without compromising scaffold retrieval. The informer set is adjustable in size depending on the number of compounds one intends to screen, as performance gains are realized for sets with more than 3,000 compounds, and this set is therefore applicable to a variety of situations. Finally, our results indicate that random sampling may not adequately cover descriptor space, drawing attention to the importance of the composition of the training set for predicting actives.

  4. Some results of the spectra of random Schroedinger operators and their application to random point interaction models in one and three dimensions

    International Nuclear Information System (INIS)

    Kirsch, W.; Martinelli, F.

    1981-01-01

    After deriving weak conditions under which the potential for the Schroedinger operator is well defined, the authors state an ergodicity assumption on this potential which ensures that the spectrum of the operator is a fixed, non-random set. Random point interaction Hamiltonians are then considered in this framework. Finally, the authors consider a model where, for sufficiently small fluctuations around the equilibrium positions, a finite number of gaps appears. (HSI)

  5. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.

  6. The effect of selection on genetic parameter estimates

    African Journals Online (AJOL)

    Unknown

    The South African Journal of Animal Science is available online at ... A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random ...

  7. Multi-Index Monte Carlo and stochastic collocation methods for random PDEs

    KAUST Repository

    Nobile, Fabio; Haji Ali, Abdul Lateef; Tamellini, Lorenzo; Tempone, Raul

    2016-01-01

    In this talk we consider the problem of computing statistics of the solution of a partial differential equation with random data, where the random coefficient is parametrized by means of a finite or countable sequence of terms in a suitable expansion. We describe and analyze a Multi-Index Monte Carlo (MIMC) method and a Multi-Index Stochastic Collocation (MISC) method. The former is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Instead of using first-order differences as in MLMC, MIMC uses mixed differences to reduce the variance of the hierarchical differences dramatically. This in turn yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL^-2). In the same vein, MISC is a deterministic combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. Provided there is enough mixed regularity, MISC can achieve better complexity than MIMC. Moreover, we show that in the optimal case the convergence rate of MISC is dictated only by the convergence of the deterministic solver applied to a one-dimensional spatial problem. We propose optimization procedures to select the most effective mixed differences to include in MIMC and MISC. Such optimization is a crucial step that allows us to make MIMC and MISC computationally effective. We finally show the effectiveness of MIMC and MISC with some computational tests, including tests with an infinite countable number of random parameters.
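
    The telescoping idea behind MLMC (which MIMC generalizes to mixed differences over several discretization indices) fits in a few lines. The sketch below uses a toy "solver" whose bias decays geometrically with the level; the integrand, sample counts, and decay rate are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)

        def solver(omega, level):
            # Toy solver: approximates f(omega) with discretization error ~ 2**-level.
            return np.sin(omega) + 2.0 ** -level * np.cos(omega)

        def mlmc(max_level, n_per_level):
            # E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}], each difference estimated
            # from coupled samples (the same omega on fine and coarse levels).
            total = 0.0
            for level in range(max_level + 1):
                omegas = rng.normal(size=n_per_level[level])
                fine = solver(omegas, level)
                coarse = solver(omegas, level - 1) if level > 0 else 0.0
                total += np.mean(fine - coarse)
            return total

        # Many samples on cheap coarse levels, few on expensive fine ones.
        print(mlmc(max_level=4, n_per_level=[4000, 2000, 1000, 500, 250]))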

  9. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    Science.gov (United States)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  10. Pseudo-random bit generator based on lag time series

    Science.gov (United States)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we use a delay in the generation of the time series. When these new series are plotted as xₙ against xₙ₊₁, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
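
    A minimal sketch of the idea: iterate the logistic map, keep only every lag-th sample so consecutive outputs no longer trace the map's parabola, and threshold to bits. The parameter value, lag, warm-up length, and thresholding rule are assumptions; the paper's exact construction, including its negative-parameter variant, is not reproduced here.

        def logistic_bits(n_bits, r=3.99, x0=0.41, lag=7, warmup=100):
            # Generate the trajectory, then emit every `lag`-th sample.
            x = x0
            series = []
            for _ in range(warmup + n_bits * lag):
                x = r * x * (1.0 - x)
                series.append(x)
            lagged = series[warmup::lag][:n_bits]
            return [1 if v >= 0.5 else 0 for v in lagged]

        bits = logistic_bits(64)
        print("".join(map(str, bits)))
        print("ones fraction:", sum(bits) / len(bits))  # crude monobit sanity check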

  11. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  12. Selection gradients, the opportunity for selection, and the coefficient of determination.

    Science.gov (United States)

    Moorad, Jacob A; Wade, Michael J

    2013-03-01

    We derive the relationship between R² (the coefficient of determination), selection gradients, and the opportunity for selection for univariate and multivariate cases. Our main result is to show that the portion of the opportunity for selection that is caused by variation in any trait is equal to the product of its selection gradient and its selection differential. This relationship is a corollary of the first and second fundamental theorems of natural selection, and it permits one to investigate the portions of the total opportunity for selection that are involved in directional selection, stabilizing (and diversifying) selection, and correlational selection, which is important to morphological integration. It also allows one to determine the fraction of fitness variation not explained by variation in measured phenotypes and therefore attributable to random (or, at least, unknown) influences. We apply our methods to a human data set to show how sex-specific mating success as a component of fitness variance can be decoupled from that owing to prereproductive mortality. By quantifying linear sources of sexual selection and quadratic sources of sexual selection, we illustrate that the former is stronger in males, while the latter is stronger in females.
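
    The main identity can be checked in one line. Writing relative fitness as w, a trait as z, the selection gradient as beta, and the selection differential as s (notation assumed here for illustration),

        \[
        \beta = \frac{\operatorname{Cov}(w, z)}{\operatorname{Var}(z)}, \qquad
        s = \operatorname{Cov}(w, z)
        \quad\Longrightarrow\quad
        \beta s = \frac{\operatorname{Cov}(w, z)^{2}}{\operatorname{Var}(z)}
                = R^{2} \operatorname{Var}(w) = R^{2} I,
        \]

    since the opportunity for selection I is the variance in relative fitness; the trait's share of I is thus exactly the product of its gradient and its differential.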

  13. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

    ... a 'comparisons with the best' approach is used for the data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  14. Performance of Universal Adhesive in Primary Molars After Selective Removal of Carious Tissue: An 18-Month Randomized Clinical Trial.

    Science.gov (United States)

    Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel

    2017-09-15

    To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for the restorations' longevity were evaluated using the Kaplan-Meier method, with multivariate Cox regression analysis with shared frailty used to assess the factors associated with failures (P < ...). The adhesion strategy did not influence the restorations' longevity (P = 0.06; 72.2 percent and 89.7 percent with etch-and-rinse and self-etch modes, respectively). The self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency toward a better outcome with the self-etch strategy.

  15. Random Fuzzy Differential Equations with Impulses

    Directory of Open Access Journals (Sweden)

    Ho Vu

    2017-01-01

    We consider random fuzzy differential equations (RFDEs) with impulses. Using the Picard method of successive approximations, we prove the existence and uniqueness of solutions to RFDEs with impulses under suitable conditions. Some properties of the solutions of RFDEs with impulses are studied. Finally, an example is presented to illustrate the results.

  16. Analysis of swaps in Radix selection

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam

    2011-01-01

    Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific ...

  17. Interaction of random wave-current over uneven and porous bottoms

    International Nuclear Information System (INIS)

    Suo Yaohong; Zhang Zhonghua; Zhang Jiafan; Suo Xiaohong

    2009-01-01

    Starting from linear wave theory, applying Green's second identity, and considering wave-current interaction over porous bottoms and variable water depth, a comprehensive mild-slope equation model of wave-current interaction is developed. Attention is then paid to the effect of random waves: using the method of Kubo et al., a model of the interaction between random waves and currents over uneven and porous bottoms is established. Finally, the characteristics of the random waves are discussed numerically from both the geometric-optics approximation and the target spectrum.

  18. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  19. MATERIALS FOR THE FINAL COVER OF SANITARY LANDFILLS

    OpenAIRE

    Davorin Kovačić

    1994-01-01

    The paper deals with the selection of materials for the sealing layer in the final cover of sanitary landfills. The sealing layer is the most critical component of the final cover. Its role is to minimize percolation of water through the final cover. Materials used for the construction of the sealing layer are either of mineral origin (compacted clay) or geosynthetic (geomembrane). They are most often used in combination creating composite liners. Recently alternative materials are also ...

  20. Expressing stochastic unravellings using random evolution operators

    International Nuclear Information System (INIS)

    Salgado, D; Sanchez-Gomez, J L

    2002-01-01

    We prove how the form of the most general invariant stochastic unravelling for Markovian (recently given in the literature by Wiseman and Diosi) and non-Markovian but Lindblad-type open quantum systems can be attained by imposing a single mathematical condition upon the random evolution operator of the system, namely a.s. trace preservation (a.s. stands for almost surely). The use of random operators ensures the complete positivity of the density operator evolution and characterizes the linear/non-linear character of the evolution in a straightforward way. It is also shown how three quantum stochastic evolution models - continuous spontaneous localization, quantum state diffusion and quantum mechanics with universal position localization - appear when concrete choices for the noise term of the random evolution operators are assumed. We finally conjecture how these operators may in the future be used in two different directions: both to connect quantum stochastic evolution models with random properties of space-time and to handle noisy quantum logical gates.

  1. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
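
    The allocation rule described here, a fixed active-to-placebo ratio with response-adaptive allocation among the active doses, can be illustrated with a Thompson-sampling-style sketch. This is a hypothetical simplification (Beta-Bernoulli posteriors, a 25% placebo share), not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def assign_next(successes, failures, placebo_share=0.25):
    """One allocation step of a restricted RAR sketch: placebo keeps a
    fixed share of subjects; the rest go to the active dose picked by
    Thompson sampling on the posterior probability of a poor outcome,
    using Beta(1, 1) priors.  The 25% share, the priors and all names
    here are illustrative assumptions, not the trial's values."""
    if rng.random() < placebo_share:
        return "placebo"
    draws = rng.beta(1 + failures, 1 + successes)  # sampled poor-outcome rates
    return int(np.argmin(draws))                   # dose with lowest sampled rate

# toy run: three doses with true poor-outcome rates 0.5, 0.4, 0.3
true_poor = np.array([0.5, 0.4, 0.3])
s, f = np.zeros(3), np.zeros(3)
for _ in range(300):
    arm = assign_next(s, f)
    if arm == "placebo":
        continue
    poor = rng.random() < true_poor[arm]
    f[arm] += poor
    s[arm] += 1 - poor
print(s + f)        # allocation drifts toward the best dose (index 2)
```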

  2. Distributional and efficiency results for subset selection

    NARCIS (Netherlands)

    Laan, van der P.

    1996-01-01

    Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is the population with

  3. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identify RNA-binding proteins from sequence information.
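
    A rough sketch of the mRMR-plus-IFS pipeline with a random forest, using scikit-learn. Mutual information supplies the relevance term; absolute correlation stands in for redundancy, a common simplification rather than the exact mRMR estimator used in the paper, and the features themselves are not reproduced:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score

def mrmr_rank(X, y):
    """Greedy mRMR-style ranking: maximize relevance (MI with the label)
    minus mean redundancy (absolute correlation with chosen features)."""
    relevance = mutual_info_classif(X, y, random_state=0)
    corr = np.abs(np.corrcoef(X, rowvar=False))
    chosen = [int(np.argmax(relevance))]
    while len(chosen) < X.shape[1]:
        remaining = [j for j in range(X.shape[1]) if j not in chosen]
        scores = [relevance[j] - corr[j, chosen].mean() for j in remaining]
        chosen.append(remaining[int(np.argmax(scores))])
    return chosen

def incremental_feature_selection(X, y, ranking):
    """IFS: evaluate nested feature sets of growing size, keep the best."""
    best_k, best_score = 1, -np.inf
    for k in range(1, len(ranking) + 1):
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        score = cross_val_score(rf, X[:, ranking[:k]], y, cv=5).mean()
        if score > best_score:
            best_k, best_score = k, score
    return ranking[:best_k], best_score
```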

  4. New Trends in Pseudo-Random Number Generation

    Science.gov (United States)

    Gutbrod, F.

    Properties of pseudo-random number generators are reviewed. The emphasis is on correlations between successive random numbers and their suppression by improvement steps. The generators under discussion are linear congruential generators, lagged Fibonacci generators with various operations, and the improvement techniques of combination, shuffling and decimation. The properties of the RANSHI generator are reviewed somewhat more extensively. The transition to 64-bit technology is discussed in several cases. The generators are subjected to several tests, which look both for short-range and for long-range correlations. Some performance figures are given for a Pentium Pro PC. Recommendations are presented in the final chapter.
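
    As a concrete illustration of one improvement step mentioned here, shuffling, the following sketch wraps a minimal-standard linear congruential generator in a Bays-Durham shuffle table, which breaks up short-range serial correlations of the bare generator. Constants follow the classic Park-Miller choice; this is illustrative, not the RANSHI generator:

```python
class ShuffledLCG:
    """Park-Miller LCG (x -> 16807*x mod 2^31 - 1) improved by a
    Bays-Durham shuffle: outputs come from a table refilled by the raw
    generator, so consecutive raw values never appear consecutively."""
    def __init__(self, seed=12345, table_size=32):
        self.m, self.a = 2147483647, 16807        # 2^31 - 1, 7^5
        self.state = seed % self.m or 1           # avoid the fixed point 0
        self.table = [self._raw() for _ in range(table_size)]
        self.last = self._raw()

    def _raw(self):
        self.state = (self.a * self.state) % self.m
        return self.state

    def next(self):
        # the previous output picks the table slot, which is then refilled
        idx = self.last * len(self.table) // self.m
        self.last = self.table[idx]
        self.table[idx] = self._raw()
        return self.last / self.m                 # uniform in (0, 1)

g = ShuffledLCG(seed=2024)
print([round(g.next(), 3) for _ in range(5)])
```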

  5. Peer-Selected “Best Papers”—Are They Really That “Good”?

    Science.gov (United States)

    Wainer, Jacques; Eckmann, Michael; Rocha, Anderson

    2015-01-01

    Background Peer evaluation is the cornerstone of science evaluation. In this paper, we analyze whether or not a form of peer evaluation, the pre-publication selection of the best papers in Computer Science (CS) conferences, is better than random when considering the future citations received by the papers. Methods Considering 12 conferences (over several years), we collected the citation counts from Scopus for both the best papers and the non-best papers. For a different set of 17 conferences, we collected the data from Google Scholar. For each data set, we computed the proportion of cases in which the best paper has more citations. We also compare this proportion for years before and after 2010 to evaluate whether there is a propaganda effect. Finally, we count the proportion of best papers that are in the top 10% and 20% most cited for each conference instance. Results The probability that a best paper will receive more citations than a non-best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited. Discussion There is strong evidence that the selection of best papers in Computer Science conferences is better than a random selection, and that a significant number of the best papers are among the top cited papers in the conference. PMID:25789480

  6. An improved label propagation algorithm based on node importance and random walk for community detection

    Science.gov (United States)

    Ma, Tianren; Xia, Zhengyou

    2017-05-01

    Currently, with the rapid development of information technology, electronic media for social communication are becoming more and more popular. Discovery of communities is a very effective way to understand the properties of complex networks. However, traditional community detection algorithms consider only the structural characteristics of a social organization, wasting much of the information carried by nodes and edges; meanwhile, these algorithms do not consider each node on its merits. The label propagation algorithm (LPA) is a near-linear-time algorithm that aims to find communities in a network, and it attracts many scholars owing to its high efficiency. In recent years, many improved algorithms based on LPA have been put forward. In this paper, an improved LPA based on random walk and node importance (NILPA) is proposed. Firstly, a list of node importance is obtained through calculation, and the nodes in the network are sorted in descending order of importance. On the basis of random walk, a matrix is constructed to measure the similarity of nodes, which avoids the random choice in LPA. Secondly, a new metric IAS (importance and similarity) is calculated from node importance and the similarity matrix, which we use to avoid the random selection in the original LPA and improve the algorithm's stability. Finally, tests on real-world and synthetic networks are given. The results show that this algorithm has better performance than existing methods in finding community structure.
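
    A baseline sketch of label propagation with an importance-based visit order (degree as a stand-in for the paper's node-importance score). The similarity-matrix machinery of NILPA is omitted; this only shows the skeleton the paper improves on:

```python
import random
from collections import Counter

def label_propagation(adj, importance=None, seed=0, max_sweeps=100):
    """Asynchronous LPA over an adjacency dict {node: set_of_neighbours}.
    Nodes are visited in descending importance (degree by default)."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}                 # every node starts alone
    if importance is None:
        importance = {v: len(adj[v]) for v in adj}
    order = sorted(adj, key=lambda v: -importance[v])
    for _ in range(max_sweeps):                  # safety cap on sweeps
        changed = False
        for v in order:
            if not adj[v]:
                continue
            counts = Counter(labels[u] for u in adj[v])
            top = max(counts.values())
            new = rng.choice([l for l, c in counts.items() if c == top])
            if new != labels[v]:
                labels[v], changed = new, True
        if not changed:
            break
    return labels

# toy graph: two triangles joined by one edge
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
print(label_propagation(adj))
```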

  7. Gendered dimensions in ERC grant selection - gendERC : Final Report

    NARCIS (Netherlands)

    Schiffbaenker, Helene; van den Besselaar, P.A.A.

    2017-01-01

    To explain lower success rates of female applicants in ERC grants, we collected data about past performance of the applicants and interviewed panel members about how selection criteria are practiced in general and specifically for female vs. male applicants. Controlling for past performance, we

  8. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  9. Pseudo-random number generation using a 3-state cellular automaton

    Science.gov (United States)

    Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta

    This paper investigates the potential of a 3-neighborhood, 3-state cellular automaton (CA) under periodic boundary conditions for pseudo-random number generation. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
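
    A generic sketch of the construction: a 3-state, 3-neighborhood CA under periodic boundary conditions, with the center cell emitted each step as a base-3 digit. The paper selects one specific rule after testing; the random rule table below is purely illustrative:

```python
import random

def make_rule(seed=1):
    """A random 3-neighbourhood, 3-state rule: one entry per (l, c, r)."""
    rng = random.Random(seed)
    return {(l, c, r): rng.randrange(3)
            for l in range(3) for c in range(3) for r in range(3)}

def ca_stream(rule, width=64, steps=1000, seed=7):
    """Evolve the CA with periodic boundaries (index -1 wraps around)
    and yield the centre cell each step as a base-3 digit."""
    rng = random.Random(seed)
    cells = [rng.randrange(3) for _ in range(width)]
    for _ in range(steps):
        cells = [rule[(cells[i - 1], cells[i], cells[(i + 1) % width])]
                 for i in range(width)]
        yield cells[width // 2]

digits = list(ca_stream(make_rule()))
print(digits[:20])
```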

  10. Theory of Randomized Search Heuristics in Combinatorial Optimization

    DEFF Research Database (Denmark)

    The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS) and the Metropolis algorithm. ... analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving...

  11. Using Random Forest Models to Predict Organizational Violence

    Science.gov (United States)

    Levine, Burton; Bobashev, Georgly

    2012-01-01

    We present a methodology to assess the proclivity of an organization to commit violence against nongovernment personnel. We fitted a Random Forest model using the Minority at Risk Organizational Behavior (MAROS) dataset. The MAROS data are longitudinal, so individual observations are not independent. We propose a modification to the standard Random Forest methodology to account for the violation of the independence assumption. We present the results of the model fit, an example of predicting violence for an organization, and, finally, a summary of the forest in a "meta-tree,"

  12. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, but y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function, and y a distributed, space- and/or time-varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and in probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
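
    In its simplest one-dimensional form, the method reconstructs u(y) from random evaluations by least squares on a polynomial basis, using more sample points than coefficients for stability. A toy numpy sketch; the test function and all names are ours:

```python
import numpy as np

rng = np.random.default_rng(0)

def discrete_least_squares(u, n_samples, degree):
    """Fit a polynomial metamodel of y -> u(y), y ~ U(-1, 1), from random
    evaluations via least squares on a Legendre basis.  Oversampling
    (n_samples >> degree + 1) keeps the fit stable."""
    y = rng.uniform(-1.0, 1.0, n_samples)
    V = np.polynomial.legendre.legvander(y, degree)   # design matrix
    coef, *_ = np.linalg.lstsq(V, u(y), rcond=None)
    return np.polynomial.legendre.Legendre(coef)

# smooth parameter-to-solution map, e.g. u(y) = 1 / (1 + 25 y^2)
u = lambda y: 1.0 / (1.0 + 25.0 * y**2)
metamodel = discrete_least_squares(u, n_samples=400, degree=20)
ytest = rng.uniform(-1.0, 1.0, 1000)
print(np.max(np.abs(metamodel(ytest) - u(ytest))))   # approximation error
```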

  13. Development of solar selective absorber layers on aluminium. Final report; Entwicklung solarselektiver Absorberschichten auf Aluminium fuer Solarkollektoren. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Hoenicke, D.; Moeller, T.; Schwarz, T.

    1998-01-31

    A new electrolytic process was developed to form solar-selective layers on aluminium. In the developed process, both the formation of the alumina layer and the deposition of metals into the layer take place in only one treatment step using a single electrolysis bath. The main step of the so-called ISOC method (impulse structured oxide ceramic) is the anodic oxidation of aluminium, which was carried out using a pulse technique at different voltages. During the anodic polarisation a thin alumina ceramic layer was formed, while the cathodic polarisation led to the deposition of metals such as copper and nickel. The conditions of the electrolysis were varied in order to determine optimal parameters for achieving solar-selective layers with high selectivity. Furthermore, a scale-up of the lab-scale apparatus to a mini plant was carried out. Finally, the corrosion resistance of the absorber layers was improved by the formation of a thin hydrophobic overlayer using a sol-gel treatment. (orig.) [German original, translated:] A novel treatment process for producing solar-selective absorber layers on aluminium was developed. In this electrochemical treatment, a pulse-structured oxide ceramic (ISOK) is produced on the aluminium surface in a single-stage process with one electrolyte, through a combination of anodic oxidation and bipolar pulse treatment. The anodic oxidation creates a structured aluminium oxide layer. During the bipolar pulse treatment, the metals Cu and Ni contained in the ISOK electrolyte are deposited onto or into the aluminium oxide surface. The ISOK treatment was developed from laboratory scale to an ISOK process at miniplant scale. The influence of the electrical parameters and of the chemical composition of the ISOK electrolytes was investigated. A post-treatment matched to the ISOK process, dipping in a sol-gel solution, produces a layer system with high solar selectivity.

  14. Black hole state evolution, final state and Hawking radiation

    International Nuclear Information System (INIS)

    Ahn, D

    2012-01-01

    The effect of a black hole state evolution on the Hawking radiation is studied using the final state boundary condition. It is found that the thermodynamic or statistical mechanical properties of a black hole depend strongly on the unitary evolution operator S, which determines the black hole state evolution. When the operator S is random unitary or pseudo-random unitary, a black hole emits thermal radiation as predicted by Hawking three decades ago. In particular, when the black hole mass of the final state vanishes, Hawking’s original result is retrieved. On the other hand, it is found that the emission of the Hawking radiation could be suppressed when the evolution of a black hole state is determined by the generator of the coherent state. Such a case can occur for some primordial black holes with Planck scale mass formed by primordial density fluctuations through the process of squeezing the zero-point quantum fluctuation of a scalar field. Those primordial black holes can survive until the present time and can contribute to cold dark matter. (paper)

  15. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    Science.gov (United States)

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters, which enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components: the deterministic mean value, and stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
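
    For reference, a textbook statement of the Robertson-Price identity underlying the decomposition, in our own notation; the paper's exact definitions of the demographic and environmental components differ in detail:

```latex
% Selection differential S of a trait z: covariance of relative fitness with z.
\[
  S \;=\; \operatorname{cov}\!\left(\frac{w}{\bar{w}},\, z\right)
\]
% Writing individual fitness as a deterministic part plus demographic and
% environmental noise, w = \mu + \delta_{d} + \delta_{e} (our notation),
% the covariance splits into three additive components:
\[
  S \;=\; \operatorname{cov}\!\left(\frac{\mu}{\bar{w}},\, z\right)
     + \operatorname{cov}\!\left(\frac{\delta_{d}}{\bar{w}},\, z\right)
     + \operatorname{cov}\!\left(\frac{\delta_{e}}{\bar{w}},\, z\right)
\]
```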

  16. Random Numbers and Monte Carlo Methods

    Science.gov (United States)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages Monte Carlo methods are very useful which sample the integration volume at randomly chosen points. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with given probability distribution which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
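
    The Metropolis algorithm mentioned here fits in a dozen lines. A minimal one-dimensional random-walk sketch for computing a thermodynamic average; the target density and step size are arbitrary choices for illustration:

```python
import math
import random

def metropolis(log_density, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis in 1-D: propose x' = x + U(-step, step),
    accept with probability min(1, p(x') / p(x))."""
    rng = random.Random(seed)
    x, logp = x0, log_density(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        logp_new = log_density(x_new)
        if logp_new >= logp or rng.random() < math.exp(logp_new - logp):
            x, logp = x_new, logp_new
        samples.append(x)
    return samples

# thermodynamic average <x^2> under a standard Gaussian is 1
draws = metropolis(lambda x: -0.5 * x * x, 0.0, 100_000)
print(sum(x * x for x in draws) / len(draws))
```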

  17. On the spectral properties of random finite difference operators

    International Nuclear Information System (INIS)

    Kunz, H.; Souillard, B.

    1980-01-01

    We study a class of random finite difference operators, a typical example of which is the finite difference Schroedinger operator with a random potential which arises in solid state physics in the tight binding approximation. We obtain, with probability one, in various situations, the exact location of the spectrum, and criteria for a given part of the spectrum to be pure point or purely continuous, or for the static electric conductivity to vanish. A general formalism is developed which transforms the study of these random operators into that of the asymptotics of a multiple integral constructed from a given recipe. Finally we apply our criteria and formalism to prove that, with probability one, the one-dimensional finite difference Schroedinger operator with a random potential has pure point spectrum and develops no static conductivity. (orig.)

  18. Programmable disorder in random DNA tilings

    Science.gov (United States)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  19. The influence of anti-infective periodontal treatment on C-reactive protein: a systematic review and meta-analysis of randomized controlled trials.

    Directory of Open Access Journals (Sweden)

    Ryan T Demmer

    Full Text Available Periodontal infections are hypothesized to increase the risk of adverse systemic outcomes through inflammatory mechanisms. The magnitude of effect, if any, of anti-infective periodontal treatment on systemic inflammation is unknown, as are the patient populations most likely to benefit. We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs) to test the hypothesis that anti-infective periodontal treatment reduces systemic C-reactive protein (CRP). MEDLINE, EMBASE, CENTRAL and CINAHL databases were searched using sensitivity-enhancing search terms. Eligible RCTs enrolled patients with periodontal infection, compared a clearly defined anti-infective periodontal intervention (experimental group) to an "inactive control" (no periodontal intervention) or to an "active control" (lower treatment intensity than the experimental group). Mean differences in final CRP values at the earliest post-treatment time point (typically 1-3 months) between experimental and control groups were analyzed using random-effects regression. Among 2,753 possible studies, 20 were selected, which included 2,561 randomized patients (median = 57). Baseline CRP values were >3.0 mg/L in 40% of trials. Among studies with a control group receiving no treatment, the mean difference in final CRP values between experimental treatment and control groups was -0.37 mg/L [95% CI = -0.64, -0.11] (P = 0.005), favoring experimental treatment. Trials in which the experimental group received antibiotics had stronger effects (P for interaction = 0.03), and the mean difference in final CRP values between experimental treatment and control was -0.75 mg/L [95% CI = -1.17, -0.33]. No treatment effect was observed among studies using an active treatment comparator. Treatment effects were stronger for studies that included patients with co-morbidities than for studies that included "systemically healthy" patients, although the interaction was not significant (P = 0.48). Anti-infective periodontal

  20. Key Aspects of Nucleic Acid Library Design for in Vitro Selection

    Science.gov (United States)

    Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.

    2018-01-01

    Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748

  1. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues.

    Science.gov (United States)

    Ma, Xin; Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, the binding propensity of DNA-binding residues, and the non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. The high prediction accuracy and performance comparisons with previous research suggest that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.

  2. Quantum random number generator based on quantum tunneling effect

    OpenAIRE

    Zhou, Haihan; Li, Junlin; Pan, Dong; Zhang, Weixing; Long, Guilu

    2017-01-01

    In this paper, we propose an experimental implementation of a quantum random number generator (QRNG) based on the inherent randomness of the quantum tunneling effect of electrons. We exploited InGaAs/InP diodes, whose valence band and conduction band share a quasi-constant energy barrier. We applied a bias voltage to the InGaAs/InP avalanche diode, which makes the diode work in Geiger mode, and triggered the tunneling events with a periodic pulse. Finally, after data collection and post-processing, our...

  3. A Heckman Selection- t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

    Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. This then allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
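
    For contrast with the selection-t model, the classic two-step Heckman estimator under normality can be sketched with statsmodels: a probit selection equation, then OLS on the observed outcomes augmented with the inverse Mills ratio. A rough sketch under standard assumptions, not the authors' maximum likelihood procedure:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(y, X, Z, observed):
    """Heckman's two-step estimator with normal errors.
    y: outcome (only meaningful where observed == 1)
    X: outcome-equation covariates, Z: selection-equation covariates,
    observed: 0/1 selection indicator."""
    # step 1: probit of the selection indicator on Z
    Zc = sm.add_constant(Z)
    probit = sm.Probit(observed, Zc).fit(disp=0)
    xb = Zc @ probit.params
    mills = norm.pdf(xb) / norm.cdf(xb)          # inverse Mills ratio
    # step 2: OLS on the selected subsample, adding the Mills ratio
    sel = observed.astype(bool)
    X2 = sm.add_constant(np.column_stack([X[sel], mills[sel]]))
    return sm.OLS(y[sel], X2).fit()
```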

  4. Feature-selective attention in healthy old age: a selective decline in selective attention?

    Science.gov (United States)

    Quigley, Cliodhna; Müller, Matthias M

    2014-02-12

    Deficient selection against irrelevant information has been proposed to underlie age-related cognitive decline. We recently reported evidence for maintained early sensory selection when older and younger adults used spatial selective attention to perform a challenging task. Here we explored age-related differences when spatial selection is not possible and feature-selective attention must be deployed. We additionally compared the integrity of feedforward processing by exploiting the well-established phenomenon of suppression of visual cortical responses attributable to interstimulus competition. The electroencephalogram was measured while older and younger human adults responded to brief occurrences of coherent motion in an attended stimulus composed of randomly moving, orientation-defined, flickering bars. Attention was directed to horizontal or vertical bars by a pretrial cue, after which two orthogonally oriented, overlapping stimuli or a single stimulus were presented. Horizontal and vertical bars flickered at different frequencies and thereby elicited separable steady-state visual-evoked potentials, which were used to examine the effect of feature-based selection and the competitive influence of a second stimulus on ongoing visual processing. Age differences were found in feature-selective attentional modulation of visual responses: older adults did not show consistent modulation of magnitude or phase. In contrast, the suppressive effect of a second stimulus was robust and comparable in magnitude across age groups, suggesting that bottom-up processing of the current stimuli is essentially unchanged in healthy old age. Thus, it seems that visual processing per se is unchanged, but top-down attentional control is compromised in older adults when space cannot be used to guide selection.

  5. Opportunistic Relay Selection with Cooperative Macro Diversity

    Directory of Open Access Journals (Sweden)

    Yu Chia-Hao

    2010-01-01

    Full Text Available We apply a fully opportunistic relay selection scheme to study cooperative diversity in a semianalytical manner. In our framework, idle Mobile Stations (MSs) are capable of being used as Relay Stations (RSs), and no relaying is required if the direct path is strong. Our relay selection scheme is fully selection based: either the direct path or one of the relaying paths is selected. Macro diversity, which is often ignored in analytical works, is taken into account together with micro diversity by using a complete channel model that includes both shadow fading and fast fading effects. The stochastic geometry of the network is taken into account by having a random number of randomly located MSs. The outage probability analysis of the selection differs from the case where only fast fading is considered. Under our framework, the distribution of the received power is formulated using different Channel State Information (CSI) assumptions to simulate both optimistic and practical environments. The results show that the relay selection gain can be significant given a suitable number of candidate RSs. Also, while relay selection according to incomplete CSI is diversity-suboptimal compared to relay selection based on full CSI, the loss in average throughput is not too significant. This is a consequence of the dominance of geometry over fast fading.

  6. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
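
    The theoretical values follow from treating a draw as a simple random sample without replacement from {1, ..., N}: each drawn number has mean (N+1)/2 and variance (N^2 - 1)/12, and two distinct drawn numbers have covariance -(N+1)/12. A quick check against simulated draws, using lotto 6/49 as an example:

```python
import numpy as np

def lotto_theory(N):
    """Moments of a single drawn number in lotto k/N."""
    mean = (N + 1) / 2
    var = (N * N - 1) / 12
    cov = -(N + 1) / 12          # between two distinct drawn numbers
    return mean, var, cov

rng = np.random.default_rng(0)
N, k, n_draws = 49, 6, 100_000
draws = np.array([rng.choice(N, size=k, replace=False) + 1
                  for _ in range(n_draws)])
print(lotto_theory(N))
print(draws.mean(), draws.var(), np.cov(draws[:, 0], draws[:, 1])[0, 1])
```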

  7. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    Science.gov (United States)

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering the nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types in examining the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results: Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at maximized variance fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model introduces a redefined basis for game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also exposes the fact that the number of polymorphic equilibria will depend on the algebraic expression of the population structure. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Velocity and Dispersion for a Two-Dimensional Random Walk

    International Nuclear Information System (INIS)

    Li Jinghui

    2009-01-01

    In the paper, we consider the transport of a two-dimensional random walk. The velocity and the dispersion of this two-dimensional random walk are derived. It mainly shows that: (i) by controlling the values of the transition rates, the direction of the random walk can be reversed; (ii) for some suitably selected transition rates, our two-dimensional random walk can be efficient in comparison with the one-dimensional random walk. Our work is motivated in part by the challenge of explaining the unidirectional transport of motor proteins. When motor proteins move at the turn points of their tracks (i.e., the cytoskeleton filaments and the DNA molecular tubes), some of our results in this paper can be used to deal with the problem. (general)

  9. Risk Attitudes, Sample Selection and Attrition in a Longitudinal Field Experiment

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten Igel

    with respect to risk attitudes. Our design builds in explicit randomization on the incentives for participation. We show that there are significant sample selection effects on inferences about the extent of risk aversion, but that the effects of subsequent sample attrition are minimal. Ignoring sample selection leads to inferences that subjects in the population are more risk averse than they actually are. Correcting for sample selection and attrition affects utility curvature, but does not affect inferences about probability weighting. Properly accounting for sample selection and attrition effects leads to findings of temporal stability in overall risk aversion. However, that stability is around different levels of risk aversion than one might naively infer without the controls for sample selection and attrition we are able to implement. This evidence of "randomization bias" from sample selection

  10. Final disposal of radioactive waste

    Directory of Open Access Journals (Sweden)

    Freiesleben H.

    2013-06-01

    Full Text Available In this paper the origin and properties of radioactive waste as well as its classification scheme (low-level waste – LLW, intermediate-level waste – ILW, high-level waste – HLW) are presented. The various options for conditioning of waste of different levels of radioactivity are reviewed. The composition, radiotoxicity and reprocessing of spent fuel and their effect on storage and options for final disposal are discussed. The current situation of final waste disposal in a selected number of countries is mentioned. Also, the role of the International Atomic Energy Agency with regard to the development and monitoring of international safety standards for both spent nuclear fuel and radioactive waste management is described.

  11. MATERIALS FOR THE FINAL COVER OF SANITARY LANDFILLS

    Directory of Open Access Journals (Sweden)

    Davorin Kovačić

    1994-12-01

    Full Text Available The paper deals with the selection of materials for the sealing layer in the final cover of sanitary landfills. The sealing layer is the most critical component of the final cover. Its role is to minimize percolation of water through the final cover. Materials used for the construction of the sealing layer are either of mineral origin (compacted clay) or geosynthetic (geomembrane). They are most often used in combination creating composite liners. Recently alternative materials are also used like paper mill sludge or discarded swelling clay.

  12. Lines of Descent Under Selection

    Science.gov (United States)

    Baake, Ellen; Wakolbinger, Anton

    2017-11-01

    We review recent progress on ancestral processes related to mutation-selection models, both in the deterministic and the stochastic setting. We mainly rely on two concepts, namely, the killed ancestral selection graph and the pruned lookdown ancestral selection graph. The killed ancestral selection graph gives a representation of the type of a random individual from a stationary population, based upon the individual's potential ancestry back until the mutations that define the individual's type. The pruned lookdown ancestral selection graph allows one to trace the ancestry of individuals from a stationary distribution back into the distant past, thus leading to the stationary distribution of ancestral types. We illustrate the results by applying them to a prototype model for the error threshold phenomenon.

  13. Implementing traceability using particle randomness-based textile printed tags

    Science.gov (United States)

    Agrawal, T. K.; Koehl, L.; Campagne, C.

    2017-10-01

    This article introduces a random particle-based traceability tag for textiles. The proposed tag not only acts as a unique signature for the corresponding textile product but is also easy to manufacture and hard to copy. It seeks applications in brand authentication and traceability in the textile and clothing (T&C) supply chain. A prototype was developed by a screen printing process, in which micron-scale particles were mixed with the printing paste and printed on cotton fabrics to attain the required randomness. To encode the randomness, an image of the developed tag was taken and analyzed using image processing. The randomness of the particles acts as a product key or unique signature which is required to decode the tag. Finally, washing and abrasion resistance tests were conducted to check the durability of the printed tag.

  14. Security Analysis of Randomize-Hash-then-Sign Digital Signatures

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde

    2012-01-01

    At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance of the underlying compression functions, such as for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online ... 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash...

  15. Real-time flavor tagging selection in ATLAS

    CERN Document Server

    Madaffari, D

    2016-01-01

    In high-energy physics experiments the online selection is crucial to reject the overwhelming number of uninteresting collisions. In particular, the ATLAS experiment includes b-jet selections in its trigger in order to select final states with significant heavy-flavor content. Dedicated selections are developed to identify fully hadronic final states containing b-jets in a timely manner while maintaining affordable trigger rates. ATLAS successfully operated b-jet trigger selections during both the 2011 and 2012 Large Hadron Collider data-taking campaigns. Work is ongoing to improve the performance of the online tagging algorithms to be deployed in Run 2 in 2015. An overview of the Run 1 ATLAS b-jet trigger strategy along with future prospects is presented in this paper. Data-driven techniques to extract the online b-tagging performance, a key ingredient for all analyses relying on such triggers, are also discussed and preliminary results presented.

  16. Feature Selection with the Boruta Package

    OpenAIRE

    Kursa, Miron B.; Rudnicki, Witold R.

    2010-01-01

    This article describes an R package, Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features which are proved by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
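
    The core idea can be sketched in Python with scikit-learn: append shuffled "shadow" copies of the features and keep only features that repeatedly beat the best shadow importance. This mimics the principle only, not the Boruta package's exact statistical test:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def shadow_feature_screen(X, y, n_rounds=20, seed=0):
    """Boruta-style screening: each round, shuffle every feature column
    independently to create 'shadow' probes, fit a random forest on the
    combined matrix, and record which real features beat the best shadow
    importance.  Features winning a majority of rounds are kept."""
    rng = np.random.default_rng(seed)
    hits = np.zeros(X.shape[1], dtype=int)
    for r in range(n_rounds):
        shadows = rng.permuted(X, axis=0)        # break feature-label link
        rf = RandomForestClassifier(n_estimators=300, random_state=r)
        rf.fit(np.hstack([X, shadows]), y)
        imp = rf.feature_importances_
        threshold = imp[X.shape[1]:].max()       # best shadow importance
        hits += imp[:X.shape[1]] > threshold
    return hits > n_rounds / 2                   # mask of retained features
```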

  17. Modeling random combustion of lycopodium particles and gas

    Directory of Open Access Journals (Sweden)

    M Bidabadi

    2016-06-01

    Full Text Available The random combustion of lycopodium particles has been modeled by many authors. In this paper, we extend this model and develop a different method by analyzing the effect of randomly distributed sources of combustible mixture. The flame structure is assumed to consist of a preheat-vaporization zone, a reaction zone and, finally, a post-flame zone. We divide the preheat zone into different parts and assume that the particle distributions in these sections are genuinely random. Meanwhile, it is presumed that the fuel particles vaporize first to yield gaseous fuel; in other words, most of the fuel particles are vaporized by the end of the preheat zone. The Zel'dovich number is assumed to be large; therefore, the reaction term in the preheat zone is negligible. In this work, the effect of the random distribution of particles in the preheat zone on combustion characteristics, such as burning velocity and flame temperature for different particle radii, is obtained.

  18. Monomer-dimer problem on random planar honeycomb lattice

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Haizhen [School of Mathematical Sciences, Xiamen University, Xiamen 361005, Fujian (China); Department of Mathematics, Qinghai Normal University, Xining 810008, Qinghai (China); Zhang, Fuji; Qian, Jianguo, E-mail: jqqian@xmu.edu.cn [School of Mathematical Sciences, Xiamen University, Xiamen 361005, Fujian (China)

    2014-02-15

    We consider the monomer-dimer (MD) problem on a random planar honeycomb lattice model, namely, the random multiple chain. This is a lattice system with non-periodic boundary condition, whose generating process is inspired by the growth of single-walled zigzag carbon nanotubes. By applying algebraic and combinatorial techniques we establish an expression for computing the MD partition function of bipartite graphs, which corresponds to the permanent of a matrix. Further, by using the transfer matrix argument we show that the problem of computing the permanent of a high-order matrix can be reduced to lower-order matrices for this family of lattices, based on which we derive an explicit recurrence formula for evaluating the MD partition function of multiple chains and random multiple chains. Finally, we analyze the expectation of the number of monomer-dimer arrangements on a random multiple chain and the asymptotic behavior of the annealed MD entropy when the multiple chain becomes infinite in width and length, respectively.

  19. Social phenotype extended to communities: expanded multilevel social selection analysis reveals fitness consequences of interspecific interactions.

    Science.gov (United States)

    Campobello, Daniela; Hare, James F; Sarà, Maurizio

    2015-04-01

    In social species, fitness consequences are associated with both individual and social phenotypes. Social selection analysis has quantified the contribution of conspecific social traits to individual fitness. There has been no attempt, however, to apply a social selection approach to quantify the fitness implications of heterospecific social phenotypes. Here, we propose a novel social selection based approach integrating the role of all social interactions at the community level. We extended multilevel selection analysis by including a term accounting for the group phenotype of heterospecifics. We analyzed nest activity as a model social trait common to two species, the lesser kestrel (Falco naumanni) and jackdaw (Corvus monedula), nesting in either single- or mixed-species colonies. By recording reproductive outcome as a measure of relative fitness, our results reveal an asymmetric system wherein only jackdaw breeding performance was affected by the activity phenotypes of both conspecific and heterospecific neighbors. Our model incorporating heterospecific social phenotypes is applicable to animal communities where interacting species share a common social trait, thus allowing an assessment of the selection pressure imposed by interspecific interactions in nature. Finally, we discuss the potential role of ecological limitations accounting for random or preferential assortments among interspecific social phenotypes, and the implications of such processes to community evolution. © 2015 The Author(s).

  20. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

    Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on the Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD) credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare the various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purposes, a randomly censored real data set is discussed.
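
    The maximum likelihood estimator under random right censoring has a closed form worth noting: with observed times t_i = min(X_i, C_i) and d uncensored events, the log-likelihood d*log(p) + (sum(t_i) - d)*log(1 - p) is maximized at p_hat = d / sum(t_i). A simulation sketch of this estimator alone; the paper's Bayesian machinery (beta priors, entropy and LINEX losses) is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

def geometric_mle_censored(t, delta):
    """MLE of p for X ~ Geometric(p) (trials to first success, pmf
    p(1-p)^(x-1)) from t = min(X, C) and delta = 1{event observed}.
    Censored observations contribute P(X > t) = (1-p)^t, which gives
    p_hat = d / sum(t)."""
    return delta.sum() / t.sum()

# simulate randomly censored geometric data and recover p
p_true, n = 0.3, 5000
x = rng.geometric(p_true, n)                 # event times
c = rng.geometric(0.1, n)                    # random censoring times
t, delta = np.minimum(x, c), (x <= c).astype(int)
print(geometric_mle_censored(t, delta))      # close to 0.3
```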

  1. Emotional textile image classification based on cross-domain convolutional sparse autoencoders with feature selection

    Science.gov (United States)

    Li, Zuhe; Fan, Yangyu; Liu, Weihua; Yu, Zeqi; Wang, Fengqin

    2017-01-01

    We aim to apply sparse autoencoder-based unsupervised feature learning to emotional semantic analysis for textile images. To tackle the problem of limited training data, we present a cross-domain feature learning scheme for emotional textile image classification using convolutional autoencoders. We further propose a correlation-analysis-based feature selection method for the weights learned by sparse autoencoders to reduce the number of features extracted from large size images. First, we randomly collect image patches on an unlabeled image dataset in the source domain and learn local features with a sparse autoencoder. We then conduct feature selection according to the correlation between different weight vectors corresponding to the autoencoder's hidden units. We finally adopt a convolutional neural network including a pooling layer to obtain global feature activations of textile images in the target domain and send these global feature vectors into logistic regression models for emotional image classification. The cross-domain unsupervised feature learning method achieves 65% to 78% average accuracy in the cross-validation experiments corresponding to eight emotional categories and performs better than conventional methods. Feature selection can reduce the computational cost of global feature extraction by about 50% while improving classification performance.

  2. Tracheal intubation in critically ill patients: a comprehensive systematic review of randomized trials.

    Science.gov (United States)

    Cabrini, Luca; Landoni, Giovanni; Baiardo Radaelli, Martina; Saleh, Omar; Votta, Carmine D; Fominskiy, Evgeny; Putzu, Alessandro; Snak de Souza, Cézar Daniel; Antonelli, Massimo; Bellomo, Rinaldo; Pelosi, Paolo; Zangrillo, Alberto

    2018-01-20

    We performed a systematic review of randomized controlled studies evaluating any drug, technique or device aimed at improving the success rate or safety of tracheal intubation in the critically ill. We searched PubMed, BioMed Central, Embase and the Cochrane Central Register of Clinical Trials, as well as the references of retrieved articles. Finally, pertinent reviews were also scanned to detect further studies, up to May 2017. The following inclusion criteria were considered: tracheal intubation in adult critically ill patients; randomized controlled trial; study performed in an Intensive Care Unit, Emergency Department or ordinary ward; and work published in the last 20 years. Exclusion criteria were pre-hospital or operating theatre settings and simulation-based studies. Two investigators selected studies for the final analysis. Extracted data included first author, publication year, characteristics of patients and clinical settings, intervention details, comparators and relevant outcomes. The risk of bias was assessed with the Cochrane Collaboration's Risk of Bias tool. We identified 22 trials on the use of a pre-procedure checklist (1 study), pre-oxygenation or apneic oxygenation (6 studies), sedatives (3 studies), neuromuscular blocking agents (1 study), patient positioning (1 study), video laryngoscopy (9 studies), and post-intubation lung recruitment (1 study). Pre-oxygenation with non-invasive ventilation (NIV) and/or high-flow nasal cannula (HFNC) showed a possible beneficial role. Post-intubation recruitment improved oxygenation, while the ramped position increased the number of intubation attempts and thiopental had negative hemodynamic effects. No effect was found for use of a checklist, apneic oxygenation (on oxygenation and hemodynamics), videolaryngoscopy (on number and length of intubation attempts), sedatives and neuromuscular blockers (on hemodynamics). Finally, videolaryngoscopy was associated with severe adverse effects in multiple trials. The limited available

  3. Pseudo-random-number generators and the square site percolation threshold.

    Science.gov (United States)

    Lee, Michael J

    2008-09-01

    Selected pseudo-random-number generators are applied to a Monte Carlo study of the two-dimensional square-lattice site percolation model. A generator suitable for high precision calculations is identified from an application specific test of randomness. After extended computation and analysis, an ostensibly reliable value of p_c = 0.59274598(4) is obtained for the percolation threshold.
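
    High-precision threshold estimation requires far more care than this, but the underlying Monte Carlo experiment can be sketched in a few lines; the lattice size, trial count and spanning criterion below are illustrative choices, not the paper's.

```python
import numpy as np
from scipy import ndimage

def spans(p, L, rng):
    """Return True if an open cluster connects the top and bottom rows of an
    L x L square lattice with site occupation probability p."""
    grid = rng.random((L, L)) < p
    labels, _ = ndimage.label(grid)          # 4-connected clusters
    top, bottom = set(labels[0]) - {0}, set(labels[-1]) - {0}
    return bool(top & bottom)

rng = np.random.default_rng(1)
L, trials = 128, 200
for p in (0.55, 0.5927, 0.63):
    hits = sum(spans(p, L, rng) for _ in range(trials))
    print(f"p={p:.4f}  spanning fraction ~ {hits / trials:.2f}")
```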

  4. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
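
    The two-group model itself is not reproduced in the abstract, but the simpler Bayesian formulation it generalizes — a single group, a Beta prior on the acceptable fraction, and all n inspected items acceptable — can be sketched directly; the prior parameters here are illustrative.

```python
from scipy.stats import beta

def prob_fraction_acceptable(n_sampled, p0=0.99, a=1.0, b=1.0):
    """Posterior probability that the acceptable fraction exceeds p0, given
    that n_sampled items were inspected and all were acceptable, under a
    Beta(a, b) prior on the acceptable fraction."""
    # With zero observed defects the posterior is Beta(a + n_sampled, b).
    return beta.sf(p0, a + n_sampled, b)

for n in (10, 50, 300):
    print(n, round(prob_fraction_acceptable(n), 3))
```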

  5. Effect of mirtazapine versus selective serotonin reuptake inhibitors on benzodiazepine use in patients with major depressive disorder: a pragmatic, multicenter, open-label, randomized, active-controlled, 24-week trial.

    Science.gov (United States)

    Hashimoto, Tasuku; Shiina, Akihiro; Hasegawa, Tadashi; Kimura, Hiroshi; Oda, Yasunori; Niitsu, Tomihisa; Ishikawa, Masatomo; Tachibana, Masumi; Muneoka, Katsumasa; Matsuki, Satoshi; Nakazato, Michiko; Iyo, Masaomi

    2016-01-01

    This study aimed to evaluate whether selecting mirtazapine as the first choice for a current depressive episode instead of selective serotonin reuptake inhibitors (SSRIs) reduces benzodiazepine use in patients with major depressive disorder (MDD). We concurrently examined the relationship between clinical responses and serum mature brain-derived neurotrophic factor (BDNF) and its precursor, proBDNF. We conducted an open-label randomized trial in routine psychiatric practice settings. Seventy-seven MDD outpatients were randomly assigned to the mirtazapine or predetermined SSRIs groups, and investigators arbitrarily selected sertraline or paroxetine. The primary outcome was the proportion of benzodiazepine users at weeks 6, 12, and 24 between the groups. We defined patients showing a ≥50% reduction in Hamilton depression rating scale (HDRS) scores from baseline as responders. Blood samples were collected at baseline and at weeks 6, 12, and 24. Sixty-five patients prescribed benzodiazepines from prescription day 1 were analyzed for the primary outcome. The percentage of benzodiazepine users was significantly lower in the mirtazapine than in the SSRIs group at weeks 6, 12, and 24 (21.4 vs. 81.8%; 11.1 vs. 85.7%; both P < …). Selecting mirtazapine as the first choice for a current depressive episode may reduce benzodiazepine use in patients with MDD. Trial registration UMIN000004144. Registered 2nd September 2010. The date of enrolment of the first participant to the trial was 24th August 2010. This study was retrospectively registered 9 days after the first participant was enrolled.

  6. Pseudo-random number generator for the Sigma 5 computer

    Science.gov (United States)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
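
    The abstract does not give the Sigma 5 constants, so this sketch uses the classic Park–Miller pair (prime modulus m = 2^31 − 1 with primitive root a = 16807) to illustrate the construction; the appropriate constants for another word length would differ.

```python
def lehmer(seed, m=2**31 - 1, a=16807):
    """Multiplicative linear congruential generator x <- a*x mod m.
    m is prime and a is a primitive root mod m, so the sequence cycles
    through all of 1..m-1 (full period m-1).  The (m, a) pair here is the
    classic Park-Miller choice, not necessarily the Sigma 5's."""
    x = seed
    while True:
        x = (a * x) % m
        yield x / m          # uniform variate in (0, 1)

gen = lehmer(seed=42)
print([round(next(gen), 6) for _ in range(5)])
```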

  7. Review of Random Phase Encoding in Volume Holographic Storage

    Directory of Open Access Journals (Sweden)

    Wei-Chia Su

    2012-09-01

    Full Text Available Random phase encoding is a unique technique for volume hologram which can be applied to various applications such as holographic multiplexing storage, image encryption, and optical sensing. In this review article, we first review and discuss diffraction selectivity of random phase encoding in volume holograms, which is the most important parameter related to multiplexing capacity of volume holographic storage. We then review an image encryption system based on random phase encoding. The alignment of phase key for decryption of the encoded image stored in holographic memory is analyzed and discussed. In the latter part of the review, an all-optical sensing system implemented by random phase encoding and holographic interconnection is presented.

  8. Managing salinity in Upper Colorado River Basin streams: Selecting catchments for sediment control efforts using watershed characteristics and random forests models

    Science.gov (United States)

    Tillman, Fred; Anning, David W.; Heilman, Julian A.; Buto, Susan G.; Miller, Matthew P.

    2018-01-01

    Elevated concentrations of dissolved-solids (salinity) including calcium, sodium, sulfate, and chloride, among others, in the Colorado River cause substantial problems for its water users. Previous efforts to reduce dissolved solids in upper Colorado River basin (UCRB) streams often focused on reducing suspended-sediment transport to streams, but few studies have investigated the relationship between suspended sediment and salinity, or evaluated which watershed characteristics might be associated with this relationship. Are there catchment properties that may help in identifying areas where control of suspended sediment will also reduce salinity transport to streams? A random forests classification analysis was performed on topographic, climate, land cover, geology, rock chemistry, soil, and hydrologic information in 163 UCRB catchments. Two random forests models were developed in this study: one for exploring stream and catchment characteristics associated with stream sites where dissolved solids increase with increasing suspended-sediment concentration, and the other for predicting where these sites are located in unmonitored reaches. Results of variable importance from the exploratory random forests models indicate that no simple source, geochemical process, or transport mechanism can easily explain the relationship between dissolved solids and suspended sediment concentrations at UCRB monitoring sites. Among the most important watershed characteristics in both models were measures of soil hydraulic conductivity, soil erodibility, minimum catchment elevation, catchment area, and the silt component of soil in the catchment. Predictions at key locations in the basin were combined with observations from selected monitoring sites, and presented in map-form to give a complete understanding of where catchment sediment control practices would also benefit control of dissolved solids in streams.
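
    Schematically, and with synthetic stand-ins for the real catchment attributes (the variable names below are illustrative, not the study's actual predictors), the exploratory model reduces to a standard random-forests importance analysis.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Stand-ins for catchment attributes (the real study used soil, climate,
# geology, topography, etc.); the response marks sites where dissolved
# solids increase with suspended-sediment concentration.
names = ["soil_K", "erodibility", "min_elev", "area", "silt_frac"]
X = rng.normal(size=(163, len(names)))
y = (X[:, 0] + 0.5 * X[:, 4] + rng.normal(scale=0.5, size=163)) > 0

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)
print("OOB accuracy:", round(rf.oob_score_, 2))
for name, imp in sorted(zip(names, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:12s} {imp:.3f}")
```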

  9. Randomizer for High Data Rates

    Science.gov (United States)

    Garon, Howard; Sank, Victor J.

    2018-01-01

    NASA as well as a number of other space agencies now recognize that the current recommended CCSDS randomizer used for telemetry (TM) is too short. When multiple applications of the PN8 maximal length sequence (MLS) are required in order to fully cover a channel access data unit (CADU), spectral problems in the form of elevated spurious discretes (spurs) appear. Originally the randomizer was called a bit transition generator (BTG) precisely because it was thought that its primary value was to ensure sufficient bit transitions to allow the bit/symbol synchronizer to lock and remain locked. We, NASA, have shown that the old BTG concept is a limited view of the real value of the randomizer sequence and that the randomizer also aids in signal acquisition as well as minimizing the potential for false decoder lock. Under the guidelines considered here, there are multiple maximal length sequences under GF(2) which appear attractive in this application. Although there may be mitigating reasons why another MLS could be selected, one sequence in particular possesses a combination of desired properties which sets it apart from the others.
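
    The CCSDS PN8 details are not spelled out in the abstract; the sketch below shows a generic 8-stage maximal-length LFSR whose 255-bit output illustrates how such a pseudo-noise randomizer is generated. The tap set is a standard maximal choice for 8 stages, used purely for illustration — it is not claimed to be the CCSDS polynomial.

```python
def lfsr_sequence(taps=(8, 6, 5, 4), state=0xFF, n=510):
    """Fibonacci LFSR over GF(2).  With maximal-length taps, an 8-stage
    register cycles through all 255 nonzero states, so the output stream is
    a pseudo-noise sequence of period 255."""
    bits = []
    for _ in range(n):
        bits.append(state & 1)         # output the lowest stage
        fb = 0
        for t in taps:                 # XOR the tapped stages
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << 7)
    return bits

seq = lfsr_sequence()
# Period check: the stream repeats every 255 bits.
print(seq[:255] == seq[255:510])
```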

  10. Real-time flavor tagging selection in ATLAS

    CERN Document Server

    Madaffari, D; The ATLAS collaboration

    2014-01-01

    In high-energy physics experiments at hadron colliders, online selection is crucial to reject most uninteresting collisions. In particular, the ATLAS experiment includes b-jet selections in its trigger strategy in order to select final states with heavy-flavor content and enlarge its physics potential. Dedicated selections are developed to quickly identify fully hadronic final states containing b-jets, while rejecting light QCD jets, and to maintain affordable trigger rates without raising jet energy thresholds. ATLAS successfully operated b-jet trigger selections during both the 2011 and 2012 data-taking campaigns, and work is ongoing to improve the performance of the tagging algorithms for the coming Run 2 in 2015. An overview of the ATLAS b-jet trigger strategy and its performance on real data is presented in this contribution, along with future prospects. Data-driven techniques to extract the online b-tagging performance, a key ingredient for all analyses relying on such triggers, are also discussed and results presented.

  11. Evaluation of Bearing Capacity of Strip Footing Using Random Layers Concept

    Directory of Open Access Journals (Sweden)

    Kawa Marek

    2015-09-01

    Full Text Available The paper deals with the evaluation of the bearing capacity of a strip foundation on random, purely cohesive soil. The proposed approach combines random field theory in the form of random layers with classical limit analysis and Monte Carlo simulation. For a given realization of the random field, the bearing capacity of the strip footing is evaluated by employing the kinematic approach of yield design theory. The results, in the form of histograms for both the bearing capacity of the footing and the optimal depth of the failure mechanism, are obtained for different thicknesses of the random layers. For zero and infinite random-layer thickness, the depth of the failure mechanism and the bearing capacity are derived in closed form. Finally, based on a sequence of Monte Carlo simulations, the bearing capacity of the strip footing corresponding to a certain probability of failure is estimated. While the mean value of the foundation bearing capacity increases with the thickness of the random layers, the ultimate load corresponding to a certain probability of failure appears to be a decreasing function of random layer thickness.

  12. Feature Selection with the Boruta Package

    Directory of Open Access Journals (Sweden)

    Miron B. Kursa

    2010-10-01

    Full Text Available This article describes an R package, Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a random forest classification algorithm. It iteratively removes the features which are shown by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
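
    Boruta itself is an R package; its core idea of random probes can be sketched in Python as "shadow" features — permuted copies of the real features — that set an importance bar each real feature must beat. This is a simplification of Boruta's full statistical test, not the package itself.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def shadow_feature_screen(X, y, n_rounds=20, rng=None):
    """One Boruta-style screening pass: a feature is flagged relevant if its
    random-forest importance beats the best importance among permuted
    'shadow' copies of all features, across repeated rounds."""
    rng = rng or np.random.default_rng(0)
    n, p = X.shape
    wins = np.zeros(p, dtype=int)
    for _ in range(n_rounds):
        shadows = rng.permuted(X, axis=0)   # destroy each feature-target link
        rf = RandomForestClassifier(n_estimators=200,
                                    random_state=int(rng.integers(1 << 31)))
        rf.fit(np.hstack([X, shadows]), y)
        real, shadow = rf.feature_importances_[:p], rf.feature_importances_[p:]
        wins += real > shadow.max()
    return wins / n_rounds                  # fraction of rounds each feature won

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 6))
y = (X[:, 0] - X[:, 2] > 0).astype(int)     # only features 0 and 2 matter
print(shadow_feature_screen(X, y, rng=rng).round(2))
```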

  13. Modification of appetite by bread consumption: A systematic review of randomized controlled trials.

    Science.gov (United States)

    Gonzalez-Anton, Carolina; Artacho, Reyes; Ruiz-Lopez, Maria D; Gil, Angel; Mesa, Maria D

    2017-09-22

    The inclusion of different ingredients or the use of different baking technologies may modify the satiety response to bread, and aid in the control of food intake. The aim of this study was to perform a systematic search of randomized clinical trials on the effect of bread consumption on appetite ratings in humans. The search equation was ("Bread"[MeSH]) AND ("Satiation"[MeSH] OR "Satiety response"[MeSH]), with the filter "clinical trials." As a result of this procedure, 37 publications were selected. The satiety response was considered the primary outcome. The studies were classified as follows: breads differing in their flour composition, breads differing in ingredients other than flours, breads with added organic acids, or breads made using different baking technologies. In addition, we have reviewed the data related to the influence of bread on the glycemic index, the insulinemic index and postprandial gastrointestinal hormone responses. The inclusion of appropriate ingredients such as fiber, proteins, legumes, seaweeds and acids into breads and the use of specific technologies may result in the development of healthier breads that increase satiety and satiation, which may aid in the control of weight gain and benefit postprandial glycemia. However, more well-designed randomized controlled trials are required to reach final conclusions.

  14. A Monte Carlo study of adsorption of random copolymers on random surfaces

    CERN Document Server

    Moghaddam, M S

    2003-01-01

    We study the adsorption problem of a random copolymer on a random surface, in which a self-avoiding walk in three dimensions interacts with a plane defining a half-space to which the walk is confined. Each vertex of the walk is randomly labelled A with probability p_p or B with probability 1 − p_p, and only vertices labelled A are attracted to the surface plane. Each lattice site on the plane is also labelled either A with probability p_s or B with probability 1 − p_s, and only lattice sites labelled A interact with the walk. We study two variations of this model: in the first case the A-vertices of the walk interact only with the A-sites on the surface. In the second case the constraint of selective binding is removed; that is, any contact between the walk and the surface that involves an A-labelling, either from the surface or from the walk, is counted as a visit to the surface. The system is quenched in both cases, i.e. the labellings of the walk and of the surface are fixed as thermodynam...

  15. Selective information sampling

    Directory of Open Access Journals (Sweden)

    Peter A. F. Fraser-Mackenzie

    2009-06-01

    Full Text Available This study investigates the amount and valence of information selected during single-item evaluation. One hundred and thirty-five participants evaluated a cell phone by reading hypothetical customer reports. Some participants were first asked to provide a preliminary rating based on a picture of the phone and some technical specifications. The participants who were given the customer reports only after they made a preliminary rating exhibited valence bias in their selection of customer reports. In contrast, the participants who did not make an initial rating sought subsequent information in a more balanced, albeit still selective, manner. The preliminary raters used the least amount of information in their final decision, resulting in faster decision times. The study appears to support the notion that selective exposure is utilized in order to develop cognitive coherence.

  16. Selection of mRNA 5'-untranslated region sequence with high translation efficiency through ribosome display

    International Nuclear Information System (INIS)

    Mie, Masayasu; Shimizu, Shun; Takahashi, Fumio; Kobatake, Eiry

    2008-01-01

    The 5'-untranslated region (5'-UTR) of mRNAs functions as a translation enhancer, promoting translation efficiency. Many in vitro translation systems exhibit a reduced efficiency in protein translation due to decreased translation initiation. The use of a 5'-UTR sequence with high translation efficiency greatly enhances protein production in these systems. In this study, we have developed an in vitro selection system that favors 5'-UTRs with high translation efficiency using a ribosome display technique. A random 5'-UTR library, comprised of 5'-UTRs fused to a His-tag and Renilla luciferase (R-luc), was translated in vitro in rabbit reticulocytes. By limiting the translation period, only mRNAs with high translation efficiency were translated. During translation, the mRNA, the ribosome and the translated His-tagged R-luc formed ternary complexes, which were collected via the translated His-tag using Ni particles. mRNA extracted from the ternary complexes was amplified using RT-PCR and sequenced. Finally, a 5'-UTR with high translation efficiency was obtained from the random 5'-UTR library.

  17. Universal Prevention for Anxiety and Depressive Symptoms in Children: A Meta-analysis of Randomized and Cluster-Randomized Trials.

    Science.gov (United States)

    Ahlen, Johan; Lenhard, Fabian; Ghaderi, Ata

    2015-12-01

    Although under-diagnosed, anxiety and depression are among the most prevalent psychiatric disorders in children and adolescents, leading to severe impairment, increased risk of future psychiatric problems, and a high economic burden to society. Universal prevention may be a potent way to address these widespread problems. There are several benefits to universal relative to targeted interventions because there is limited knowledge as to how to screen for anxiety and depression in the general population. Earlier meta-analyses of the prevention of depression and anxiety symptoms among children suffer from methodological inadequacies such as combining universal, selective, and indicated interventions in the same analyses, and comparing cluster-randomized trials with randomized trials without any correction for clustering effects. The present meta-analysis attempted to determine the effectiveness of universal interventions to prevent anxiety and depressive symptoms after correcting for clustering effects. A systematic search of randomized studies in PsycINFO, the Cochrane Library, and Google Scholar resulted in 30 eligible studies meeting the inclusion criteria, namely peer-reviewed, randomized or cluster-randomized trials of universal interventions for anxiety and depressive symptoms in school-aged children. Sixty-three percent of the studies reported outcome data regarding anxiety and 87% reported outcome data regarding depression. Seventy percent of the studies used randomization at the cluster level. There were small but significant effects regarding anxiety (.13) and depressive (.11) symptoms as measured at immediate posttest. At follow-up, which ranged from 3 to 48 months, effects were significantly larger than zero regarding depressive (.07) but not anxiety (.11) symptoms. There was no significant moderation effect of the following pre-selected variables: the primary aim of the intervention (anxiety or depression), deliverer of the intervention, gender distribution …
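
    The clustering correction is central to the abstract's argument. One standard form of such a correction — a plausible sketch, not necessarily the authors' exact procedure — deflates each cluster-randomized trial's sample size by the design effect 1 + (m − 1)ρ, where m is the cluster size and ρ the intraclass correlation.

```python
def effective_sample_size(n, cluster_size, icc):
    """Standard design-effect correction for cluster-randomized trials:
    deff = 1 + (m - 1) * ICC, and n_eff = n / deff."""
    deff = 1 + (cluster_size - 1) * icc
    return n / deff

# A trial randomizing 20 classrooms of 25 pupils with ICC = 0.02
print(round(effective_sample_size(n=500, cluster_size=25, icc=0.02), 1))
```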

  18. Prevalence of at-risk genotypes for genotoxic effects decreases with age in a randomly selected population in Flanders: a cross sectional study

    Directory of Open Access Journals (Sweden)

    van Delft Joost HM

    2011-10-01

    Full Text Available Abstract Background We hypothesized that in Flanders (Belgium), the prevalence of at-risk genotypes for genotoxic effects decreases with age due to morbidity and mortality resulting from chronic diseases. Rather than polymorphisms in single genes, the interaction of multiple genetic polymorphisms in low-penetrance genes involved in genotoxic effects might be of relevance. Methods Genotyping was performed on 399 randomly selected adults (aged 50-65) and on 442 randomly selected adolescents. Based on their involvement in processes relevant to genotoxicity, 28 low-penetrance polymorphisms affecting the phenotype in 19 genes were selected (xenobiotic metabolism, oxidative stress defense and DNA repair; 13, 6 and 9 polymorphisms, respectively). Polymorphisms which, based on the available literature, could not clearly be categorized a priori as leading to an 'increased risk' or a 'protective effect' were excluded. Results The mean number of risk alleles for all investigated polymorphisms was found to be lower in the 'elderly' (17.0 ± 2.9) than the 'adolescent' (17.6 ± 3.1) subpopulation (P = 0.002). These results were affected neither by gender nor by smoking. The prevalence of a high (> 17 = median) number of risk alleles was less frequent in the 'elderly' (40.6%) than the 'adolescent' (51.4%) subpopulation (P = 0.002). In particular for phase II enzymes, the mean number of risk alleles was lower in the 'elderly' (4.3 ± 1.6) than the 'adolescent' age group (4.8 ± 1.9, P < …). The prevalence of a high (> 4 = median) number of risk alleles was less frequent in the 'elderly' (41.3%) than the 'adolescent' subpopulation (56.3%, P < …). A high (> 8 = median) number of risk alleles for DNA repair enzyme-coding genes was less frequent in the 'elderly' (37.3%) than the 'adolescent' subpopulation (45.6%, P = 0.017). Conclusions These observations are consistent with the hypothesis that, in Flanders, the prevalence of at-risk alleles in genes involved in genotoxic effects decreases with age, suggesting that persons carrying a higher number of …

  19. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    Science.gov (United States)

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantees a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate its possible transferability to medicine and urology.

  20. Effects of prey abundance, distribution, visual contrast and morphology on selection by a pelagic piscivore

    Science.gov (United States)

    Hansen, Adam G.; Beauchamp, David A.

    2014-01-01

    Most predators eat only a subset of possible prey. However, studies evaluating diet selection rarely measure prey availability in a manner that accounts for temporal–spatial overlap with predators, the sensory mechanisms employed to detect prey, and constraints on prey capture. We evaluated the diet selection of cutthroat trout (Oncorhynchus clarkii) feeding on a diverse planktivore assemblage in Lake Washington to test the hypothesis that the diet selection of piscivores would reflect random (opportunistic) as opposed to non-random (targeted) feeding, after accounting for predator–prey overlap, visual detection and capture constraints. Diets of cutthroat trout were sampled in autumn 2005, when the abundance of transparent, age-0 longfin smelt (Spirinchus thaleichthys) was low, and 2006, when the abundance of smelt was nearly seven times higher. Diet selection was evaluated separately using depth-integrated and depth-specific (accounting for predator–prey overlap) prey abundance. The abundance of different prey was then adjusted for differences in detectability and vulnerability to predation to see whether these factors could explain diet selection. In 2005, cutthroat trout fed non-randomly by selecting against the smaller, transparent age-0 longfin smelt, but for the larger age-1 longfin smelt. After adjusting prey abundance for visual detection and capture, cutthroat trout fed randomly. In 2006, depth-integrated and depth-specific abundance explained the diets of cutthroat trout well, indicating random feeding. Feeding became non-random after adjusting for visual detection and capture. Cutthroat trout selected strongly for age-0 longfin smelt, but against similar-sized threespine stickleback (Gasterosteus aculeatus) and larger age-1 longfin smelt in 2006. Overlap with juvenile sockeye salmon (O. nerka) was minimal in both years, and sockeye salmon were rare in the diets of cutthroat trout. The direction of the shift between random and non-random selection …

  1. Efficacy and safety of fasudil in patients with subarachnoid hemorrhage. Final results of a randomized trial of fasudil versus nimodipine

    International Nuclear Information System (INIS)

    Zhao Jizong; Zhou Dingbiao; Guo Jing

    2011-01-01

    Fasudil is believed to be at least as effective as nimodipine for the prevention of cerebral vasospasm and subsequent ischemic injury in patients undergoing surgery for subarachnoid hemorrhage (SAH). We report the final results of a randomized, open trial comparing the efficacy and safety of fasudil with nimodipine. A total of 63 patients undergoing surgery for SAH received fasudil and 66 received nimodipine between 1998 and 2004. Symptomatic vasospasm, low-density areas on computed tomography (CT), clinical outcomes, and adverse events were all recorded, and the results were compared between the fasudil and nimodipine groups. The absence of symptomatic vasospasm, the occurrence of low-density areas associated with vasospasm on CT, and the occurrence of adverse events were similar between the two groups. The clinical outcomes were more favorable in the fasudil group than in the nimodipine group (p=0.040). The proportion of patients with a good clinical outcome was 74.5% (41/55) in the fasudil group and 61.7% (37/60) in the nimodipine group. There were no serious adverse events reported in the fasudil group. The present results suggest that fasudil is as effective as, or more effective than, nimodipine for the prevention of cerebral vasospasm and subsequent ischemic injury in patients undergoing surgery for SAH. (author)

  2. Why the null matters: statistical tests, random walks and evolution.

    Science.gov (United States)

    Sheets, H D; Mitchell, C E

    2001-01-01

    A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test', in contrast, operates on the sequencing of steps rather than on excursion. Applications of these tests to computer-generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise-immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, due to the large range of LRI rates produced by random walks. Examination of the published results of these tests shows that they have seldom produced a conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.
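
    The excursion-based logic common to the scaled maximum, LRI and Hurst approaches can be illustrated by simulating unbiased random walks matched to an observed series and asking how extreme its maximum excursion is. This is a schematic of the principle, not any one of the published tests.

```python
import numpy as np

def excursion_pvalues(series, n_null=5000, rng=None):
    """Compare the maximum excursion of an observed time series with the
    null distribution generated by unbiased random walks having the same
    number of steps and step variance.  Small p_high suggests directional
    change (trend); small p_low suggests stasis/stabilizing selection."""
    rng = rng or np.random.default_rng(0)
    steps = np.diff(series)
    obs = np.abs(series - series[0]).max()
    walks = rng.normal(0, steps.std(), size=(n_null, steps.size)).cumsum(axis=1)
    null_max = np.abs(walks).max(axis=1)
    p_high = (null_max >= obs).mean()   # fraction of null walks wandering as far
    p_low = (null_max <= obs).mean()
    return p_high, p_low

rng = np.random.default_rng(2)
trend = np.cumsum(rng.normal(0.3, 1.0, 100))     # directional change
print(excursion_pvalues(trend, rng=rng))          # p_high should be small
```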

  3. Changes in competence of public authorities in connection with final storage pursuant to the omnibus law on site selection

    International Nuclear Information System (INIS)

    Keienburg, Bettina

    2012-01-01

    The omnibus law on the siting and selection of a repository for heat-generating radioactive waste, with amendments to other laws, of June 13, 2012 is to fundamentally reshuffle the competences of public authorities for final storage. The federal government is to assume responsibilities formerly held by the federal states. Moreover, most of the existing competences of the Federal Office for Radiation Protection are to be transferred to a federal agency yet to be founded, called the Federal Office for Nuclear Safety in the present draft legislation. The Federal Office for Radiation Protection will retain its responsibility as project agent for repositories only in the phases of site exploration and licensing. Afterwards, the duty of final storage is transferred under the draft legislation to a third party. Under the draft, and unlike present regulations, this third party may only be a company whose sole owner is the federal government, which is also intended to strengthen the influence of the federal government under aspects of company law. Legislative efforts seeking to strengthen the federal government and its competences by assigning licensing duties for repositories to federal agencies are understandable under feasibility aspects and may even be in the emotional interest of the states and their representatives in public authorities who, merely because their work is connected with the disputed topic of final storage, often face attacks and accusations by the public. Nevertheless, the transfer of administrative duties to federal agencies is subject to constitutional limits which must be observed. These constitutional aspects are highlighted in the publication. It is left to the reader to assess the meaningfulness of establishing another independent high-level federal agency in the area of responsibility of the Federal Ministry of the Environment (BMU), i.e. a Federal Office for Nuclear Safety, alongside the …

  4. Natural Selection as an Emergent Process: Instructional Implications

    Science.gov (United States)

    Cooper, Robert A.

    2017-01-01

    Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…

  5. Assessing the germplasm of Laminaria (phaeophyceae) with random amplified polymorphic DNA (RAPD) method

    Science.gov (United States)

    He, Yingjun; Zou, Yuping; Wang, Xiaodong; Zheng, Zhiguo; Zhang, Daming; Duan, Delin

    2003-06-01

    Eighteen gametophytes, including L. japonica, L. ochotensis and L. longissima, were verified with the random amplified polymorphic DNA (RAPD) technique. Eighteen ten-base primers were chosen from 100 screened primers for the final amplification test. Among the total of 205 bands amplified, 181 (88.3%) were polymorphic. The genetic distance among different strains ranged from 0.072 to 0.391. The dendrogram constructed by the unweighted pair-group method with arithmetic mean (UPGMA) showed that the female and male gametophytes of the same cell lines could be grouped in pairs. This indicated that RAPD analysis could be used not only to distinguish different strains of Laminaria, but also to distinguish male and female gametophytes within the same cell lines. The systematic relationships remain ambiguous when judged by the present data alone; it seems that the RAPD marker is of limited use for elucidating the phylogenetic relationships among the species of Laminaria.
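
    The analysis pipeline — from a binary band matrix to a UPGMA dendrogram — can be sketched as follows. The 0/1 data are synthetic, and the Jaccard distance is an illustrative choice, since the abstract does not state the distance measure actually used.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
# Toy presence/absence matrix: 18 gametophytes x 205 RAPD bands
bands = rng.random((18, 205)) < 0.5
strains = [f"strain_{i}" for i in range(18)]

dist = pdist(bands, metric="jaccard")   # illustrative distance on 0/1 bands
tree = linkage(dist, method="average")  # 'average' linkage = UPGMA
dendrogram(tree, labels=strains, no_plot=True)  # set no_plot=False to draw
print("UPGMA tree with", tree.shape[0], "merges built")
```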

  6. Conversion of the random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    Conversion of the random amplified polymorphic DNA (RAPD) marker UBC#116 linked to Fusarium crown and root rot resistance gene (Frl) into a co-dominant sequence characterized amplified region (SCAR) marker for marker-assisted selection of tomato.

  7. Legal protection issues with regard to the site selection of a final repository for heat-generating radioactive wastes; Rechtsschutzfragen hinsichtlich der Standortauswahl eines Endlagers fuer waermeentwickelnde radioaktive Abfaelle

    Energy Technology Data Exchange (ETDEWEB)

    Keienburg, Bettina [KUEMMERLEIN Rechtsanwaelte und Notare, Essen (Germany)

    2014-10-15

    The site selection law (hereinafter StandAG), adopted on 23.07.2013 and entered fully into force on 01.01.2014, is intended above all to settle, by the year 2031, the question of the site of a final repository for highly radioactive waste. For this purpose, the act regulates a comprehensive procedure with multiple rounds of public participation and multiple interventions of the legislator. The legislator thus hopes for a permanently accepted settlement of the dispute - meaning acceptance. It remains to be seen whether this expectation is realistic. The StandAG and the decisions it provides for already raise potential for dispute. Added to this is considerable potential for dispute with regard to exploration in connection with the site selection - despite the legal regulations on exploring candidate and remaining sites - and the required authorisations. This potential is discussed below.

  8. Local lattice relaxations in random metallic alloys: Effective tetrahedron model and supercell approach

    DEFF Research Database (Denmark)

    Ruban, Andrei; Simak, S.I.; Shallcross, S.

    2003-01-01

    We present a simple effective tetrahedron model for local lattice relaxation effects in random metallic alloys on simple primitive lattices. A comparison with direct ab initio calculations for supercells representing random Ni0.50Pt0.50 and Cu0.25Au0.75 alloys as well as the dilute limit of Au-rich CuAu alloys shows that the model yields a quantitatively accurate description of the relaxation energies in these systems. Finally, we discuss the bond length distribution in random alloys.

  9. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete-time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood-bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated by the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
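
    The one-compartment claim is easy to verify numerically: with i.i.d. random retention fractions r_i, the average of the product r_1 r_2 … r_n equals (E[r])^n, an exponential in n. A toy check, with an arbitrary uniform distribution for r chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_paths = 50, 100_000
# Random retention fraction per time step (uniform on [0.8, 1.0] here,
# an arbitrary illustrative choice with mean E[r] = 0.9)
r = rng.uniform(0.8, 1.0, size=(n_paths, n_steps))
conc = np.cumprod(r, axis=1)                 # concentration along each path

mean_path = conc.mean(axis=0)                # Monte Carlo average
theory = 0.9 ** np.arange(1, n_steps + 1)    # (E[r])^n = exponential decay
print(np.abs(mean_path - theory).max())      # small: average matches e^{n ln E[r]}
```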

  10. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole

    Science.gov (United States)

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-01

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst-case scenario in which the adversary launches the most powerful attacks. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^-5. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
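
    The extraction step can be pictured as binary matrix multiplication by a seed-defined Toeplitz matrix. The sketch below uses toy dimensions in place of the experiment's 80 Gb × 45.6 Mb matrix.

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_extract(raw_bits, seed_bits, m):
    """Randomness extraction by Toeplitz hashing: output = T @ raw (mod 2),
    where the m x n Toeplitz matrix T is built from m + n - 1 seed bits."""
    n = len(raw_bits)
    assert len(seed_bits) == m + n - 1
    T = toeplitz(seed_bits[:m], seed_bits[m - 1:])   # first column, first row
    return (T @ np.asarray(raw_bits)) % 2

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, size=64)          # weakly random input bits
seed = rng.integers(0, 2, size=16 + 64 - 1)
print(toeplitz_extract(raw, seed, m=16))   # 16 nearly uniform output bits
```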

  11. Effect of a Counseling Session Bolstered by Text Messaging on Self-Selected Health Behaviors in College Students: A Preliminary Randomized Controlled Trial.

    Science.gov (United States)

    Sandrick, Janice; Tracy, Doreen; Eliasson, Arn; Roth, Ashley; Bartel, Jeffrey; Simko, Melanie; Bowman, Tracy; Harouse-Bell, Karen; Kashani, Mariam; Vernalis, Marina

    2017-05-17

    The college experience is often the first time when young adults live independently and make their own lifestyle choices. These choices affect dietary behaviors, exercise habits, techniques to deal with stress, and decisions on sleep time, all of which direct the trajectory of future health. There is a need for effective strategies that will encourage healthy lifestyle choices in young adults attending college. This preliminary randomized controlled trial tested the effect of coaching and text messages (short message service, SMS) on self-selected health behaviors in the domains of diet, exercise, stress, and sleep. A second analysis measured the ripple effect of the intervention on health behaviors not specifically selected as a goal by participants. Full-time students aged 18-30 years were recruited by word of mouth and campuswide advertisements (flyers, posters, mailings, university website) at a small university in western Pennsylvania from January to May 2015. Exclusions included pregnancy, eating disorders, chronic medical diagnoses, and prescription medications other than birth control. Of 60 participants, 30 were randomized to receive a single face-to-face meeting with a health coach to review results of behavioral questionnaires and to set a health behavior goal for the 8-week study period. The face-to-face meeting was followed by SMS text messages designed to encourage achievement of the behavioral goal. A total of 30 control subjects underwent the same health and behavioral assessments at intake and program end but did not receive coaching or SMS text messages. The texting app showed that 87.31% (2187/2505) of messages were viewed by intervention participants. Furthermore, 28 of the 30 intervention participants and all 30 control participants provided outcome data. Among intervention participants, 22 of 30 (73%) showed improvement in health behavior goal attainment, with the whole group (n=30) showing a mean improvement of 88% (95% CI 39-136). Mean …

  12. Interplay of Determinism and Randomness: From Irreversibility to Chaos, Fractals, and Stochasticity

    Science.gov (United States)

    Tsonis, A.

    2017-12-01

    We will begin our discussion of randomness by looking exclusively at our formal mathematical system, to show that even in this pure and strictly logical system one cannot do away with randomness. By employing simple mathematical models, we will identify the three possible sources of randomness: randomness due to the inability to find the rules (irreversibility), randomness due to the inability to have infinite power (chaos), and randomness due to stochastic processes. Subsequently we will move from the mathematical system to our physical world to show that randomness, through the quantum mechanical character of small scales, through chaos, and because of the second law of thermodynamics, is an intrinsic property of nature as well. We will then argue that the randomness in the physical world is consistent with the three sources of randomness suggested by the study of simple mathematical systems. Many examples, ranging from purely mathematical to natural processes, will be presented, which clearly demonstrate how the combination of rules and randomness produces the world we live in. Finally, the principle of least effort or the principle of minimum energy consumption will be suggested as the underlying principle behind this symbiosis between determinism and randomness.

  13. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  14. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  15. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
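
    Sampling the uniform random polygon model itself is straightforward: n vertices drawn independently and uniformly in the confined region, joined in sequence and closed. Computing knot invariants (determinants, colorings) of the resulting curve is the substantive step studied in these papers; the sketch below only generates the polygon.

```python
import numpy as np

def uniform_random_polygon(n, rng=None):
    """Sample the uniform random polygon model: n vertices independent and
    uniform in the unit cube (the 'confined space'), joined in sequence,
    with the last vertex connected back to the first."""
    rng = rng or np.random.default_rng()
    verts = rng.random((n, 3))
    edges = [(i, (i + 1) % n) for i in range(n)]   # closed polygon
    return verts, edges

verts, edges = uniform_random_polygon(100, np.random.default_rng(0))
lengths = [np.linalg.norm(verts[a] - verts[b]) for a, b in edges]
print(f"{len(edges)} edges, mean edge length {np.mean(lengths):.3f}")
```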

  16. Real-time flavor tagging selection in ATLAS

    CERN Document Server

    Sahinsoy, M; The ATLAS collaboration

    2014-01-01

    In high-energy physics experiments, online selection is crucial to reject most uninteresting collisions; in particular, b-jet selections, part of the ATLAS trigger strategy, are meant to select final states with heavy-flavor content. This is the only option to select fully hadronic final states containing b-jets, and is important to reject QCD light jets and maintain affordable trigger rates without raising jet energy thresholds. ATLAS operated b-jet triggers in both 2011 and 2012 data-taking campaigns and is now working to improve the performance of tagging algorithms for Run2. An overview of the ATLAS b-jet trigger strategy and its performance on real data is presented in this contribution, along with future prospects. Data-driven techniques to extract the online b-tagging performance, a key ingredient for all analyses relying on such triggers, are also discussed and results presented.

  17. The environmental factors to be considered in the site selection studies of the spent fuel final disposal

    International Nuclear Information System (INIS)

    Aeikaes, Timo

    1985-10-01

    The objective of the work has been to elucidate environmental factors which could have an influence on the selection of areas. The factors were identified and their significance evaluated by going through the present plan for the final disposal of spent fuel. Population density and transport conditions were the most important factors. Protected areas, groundwater reservoirs and restrictions presented in regional land-use plans were also noted. The potential areas have been identified by the Geological Survey of Finland. First, 327 large bedrock blocks were identified. The extent of the block areas was 100-200 km². The environmental factors of these areas were mapped and the areas were classified. The study was based on maps, published regional plans and an inventory of groundwater reservoirs. The Geological Survey of Finland selected 162 block areas for preliminary characterization and geological classification. 61 block areas were chosen for further geological studies. By interpretation of aerial photographs and field reconnaissance trips, the Geological Survey identified 134 potential investigation areas. A large block area typically contained two possible investigation areas. The extent of these areas varied between 5-10 km². The environmental factors of the 134 possible investigation areas were studied in detail. Owing to the earlier classification, the areas were typically sparsely populated forest areas. In the detailed study the main emphasis was put on evaluating population density, transport, and land ownership. Land ownership is important for practical reasons: the landowner's permission is needed for operations in the field. Areas were classified separately according to population density, transport and land ownership. In the classification, the most suitable areas were uninhabited regions with few landowners, located close (less than 10 km) to a railroad. Only a minority of the areas fell in this category with the requirement …

  18. Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection

    Science.gov (United States)

    Xu, Dongchen; Chi, Michelene T. H.

    2016-01-01

    Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…

  19. Selection of representative calibration sample sets for near-infrared reflectance spectroscopy to predict nitrogen concentration in grasses

    DEFF Research Database (Denmark)

    Shetty, Nisha; Rinnan, Åsmund; Gislum, René

    2012-01-01

    …the Puchwein and CADEX algorithms were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require minimal prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of the 118 samples of the complete calibration set, … The selection methods can effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy.
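
    The CADEX selection is commonly identified with the Kennard–Stone max–min procedure, sketched below on synthetic stand-ins for the spectral data (this identification is an assumption of the sketch; the sample counts are illustrative).

```python
import numpy as np

def kennard_stone(X, k):
    """CADEX / Kennard-Stone selection: start from the two most distant
    samples, then repeatedly add the sample whose minimum distance to the
    already-selected set is largest, giving a calibration set spread
    evenly through the spectral space."""
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    i, j = np.unravel_index(d.argmax(), d.shape)
    chosen = [i, j]
    while len(chosen) < k:
        rest = [s for s in range(len(X)) if s not in chosen]
        nxt = max(rest, key=lambda s: d[s, chosen].min())
        chosen.append(nxt)
    return chosen

rng = np.random.default_rng(0)
spectra = rng.normal(size=(118, 10))   # e.g. PCA scores of NIR spectra
print(kennard_stone(spectra, 20))      # indices of the calibration samples
```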

  20. Selective mutism.

    Science.gov (United States)

    Hua, Alexandra; Major, Nili

    2016-02-01

    Selective mutism is a disorder in which an individual fails to speak in certain social situations though speaks normally in other settings. Most commonly, this disorder initially manifests when children fail to speak in school. Selective mutism results in significant social and academic impairment in those affected by it. This review will summarize the current understanding of selective mutism with regard to diagnosis, epidemiology, cause, prognosis, and treatment. Studies over the past 20 years have consistently demonstrated a strong relationship between selective mutism and anxiety, most notably social phobia. These findings have led to the recent reclassification of selective mutism as an anxiety disorder in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. In addition to anxiety, several other factors have been implicated in the development of selective mutism, including communication delays and immigration/bilingualism, adding to the complexity of the disorder. In the past few years, several randomized studies have supported the efficacy of psychosocial interventions based on a graduated exposure to situations requiring verbal communication. Less data are available regarding the use of pharmacologic treatment, though there are some studies that suggest a potential benefit. Selective mutism is a disorder that typically emerges in early childhood and is currently conceptualized as an anxiety disorder. The development of selective mutism appears to result from the interplay of a variety of genetic, temperamental, environmental, and developmental factors. Although little has been published about selective mutism in the general pediatric literature, pediatric clinicians are in a position to play an important role in the early diagnosis and treatment of this debilitating condition.

  1. Statistical properties of random clique networks

    Science.gov (United States)

    Ding, Yi-Min; Meng, Jun; Fan, Jing-Fang; Ye, Fang-Fu; Chen, Xiao-Song

    2017-10-01

    In this paper, a random clique network model to mimic the large clustering coefficient and the modular structure that exist in many real complex networks, such as social networks, artificial networks, and protein interaction networks, is introduced by combining the random selection rule of the Erdős and Rényi (ER) model and the concept of cliques. We find that random clique networks having a small average degree differ from the ER network in that they have a large clustering coefficient and a power law clustering spectrum, while networks having a high average degree have similar properties as the ER model. In addition, we find that the relation between the clustering coefficient and the average degree shows a non-monotonic behavior and that the degree distributions can be fit by multiple Poisson curves; we explain the origin of such novel behaviors and degree distributions.
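
    The construction can be mimicked with networkx: instead of adding single random edges as in the ER model, repeatedly pick a small set of nodes and connect them completely. The parameters below are illustrative.

```python
import itertools
import networkx as nx
import numpy as np

def random_clique_network(n, n_cliques, clique_size, rng=None):
    """ER-style random clique network: instead of adding single edges
    between random node pairs, repeatedly pick `clique_size` nodes at
    random and connect them completely."""
    rng = rng or np.random.default_rng()
    G = nx.empty_graph(n)
    for _ in range(n_cliques):
        nodes = rng.choice(n, size=clique_size, replace=False)
        G.add_edges_from(itertools.combinations(nodes, 2))
    return G

G = random_clique_network(1000, 400, 4, np.random.default_rng(0))
print(f"<k> = {2 * G.number_of_edges() / G.number_of_nodes():.2f}, "
      f"C = {nx.average_clustering(G):.2f}")   # large C at small <k>
```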

  2. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection.

  3. GRD: An SPSS extension command for generating random data

    Directory of Open Access Journals (Sweden)

    Bradley Harding

    2014-09-01

    Full Text Available To master statistics and data analysis tools, it is necessary to understand a number of concepts, many of which are quite abstract. For example, sampling from a theoretical distribution can help individuals explore and understand randomness. Sampling can also be used to build exercises aimed at helping students master statistics. Here, we present GRD (Generator of Random Data), an extension command for SPSS (version 17 and above). With GRD, it is possible to get random data from a given distribution. In its simplest use, GRD will return a set of simulated data from a normal distribution. With subcommands to GRD, it is possible to get data from multiple groups, over multiple repeated measures, and with desired effect sizes. Group sizes can be equal or unequal. With further subcommands, it is possible to sample from any theoretical population (not simply the normal distribution), introduce non-homogeneous variances, fix or randomize subject effects, etc. Finally, GRD's generated data are in a format ready to be analyzed.

  4. Level sets and extrema of random processes and fields

    CERN Document Server

    Azais, Jean-Marc

    2009-01-01

    A timely and comprehensive treatment of random field theory with applications across diverse areas of study. Level Sets and Extrema of Random Processes and Fields discusses how to understand the properties of the level sets of paths as well as how to compute the probability distribution of their extremal values, which are two general classes of problems that arise in the study of random processes and fields and in related applications. This book provides a unified and accessible approach to these two topics and their relationship to classical theory and Gaussian processes and fields, and the most modern research findings are also discussed. The authors begin with an introduction to the basic concepts of stochastic processes, including a modern review of Gaussian fields and their classical inequalities. Subsequent chapters are devoted to Rice formulas, regularity properties, and recent results on the tails of the distribution of the maximum. Finally, applications of random fields to various areas of mathematics …

  5. Flow, transport and diffusion in random geometries II: applications

    KAUST Repository

    Asinari, Pietro

    2015-01-07

    Multilevel Monte Carlo (MLMC) is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrization of the input randomness is not available or too expensive. We present several applications of our MLMC algorithm for flow, transport and diffusion in random heterogeneous materials. The absolute permeability and effective diffusivity (or formation factor) of micro-scale porous media samples are computed and the uncertainty related to the sampling procedures is studied. The algorithm is then extended to transport problems and multiphase flows for the estimation of dispersion and relative permeability curves. The impact of water drops on random structured surfaces, with microfluidics applications to self-cleaning materials, is also studied and simulated. Finally, the estimation of new drag correlation laws for poly-dispersed dilute and dense suspensions is presented.
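
    The multilevel idea itself is compact enough to sketch. Below is a generic textbook MLMC estimator (not the authors' code) for a toy quantity of interest, E[S_T] of a geometric Brownian motion discretized by Euler steps; fine and coarse levels are coupled by sharing Brownian increments, and all parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def euler_pair(level, n_paths, T=1.0, mu=0.05, sigma=0.2, s0=1.0):
        """Simulate S_T on level l and l-1 with *shared* Brownian increments
        (the MLMC coupling); level l uses 2**l Euler steps."""
        n_f = 2 ** level                      # fine steps
        dt_f = T / n_f
        dW = rng.normal(0.0, np.sqrt(dt_f), size=(n_paths, n_f))
        s_f = np.full(n_paths, s0)
        for k in range(n_f):                  # fine path
            s_f = s_f + mu * s_f * dt_f + sigma * s_f * dW[:, k]
        if level == 0:
            return s_f, np.zeros(n_paths)
        dW_c = dW[:, 0::2] + dW[:, 1::2]      # coarse increments, same noise
        dt_c = 2 * dt_f
        s_c = np.full(n_paths, s0)
        for k in range(n_f // 2):             # coarse path
            s_c = s_c + mu * s_c * dt_c + sigma * s_c * dW_c[:, k]
        return s_f, s_c

    # MLMC estimator: telescoping sum over levels of mean(fine - coarse),
    # with fewer samples on the (expensive) finer levels.
    estimate = 0.0
    for level, n_paths in enumerate([10000, 4000, 1600, 640]):
        fine, coarse = euler_pair(level, n_paths)
        estimate += np.mean(fine - coarse)
    print("MLMC estimate of E[S_T]:", estimate, "(exact:", np.exp(0.05), ")")
    ```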

  6. Flow, transport and diffusion in random geometries II: applications

    KAUST Repository

    Asinari, Pietro; Ceglia, Diego; Icardi, Matteo; Prudhomme, Serge; Tempone, Raul

    2015-01-01

    Multilevel Monte Carlo (MLMC) is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrization of the input randomness is not available or too expensive. We present several applications of our MLMC algorithm for flow, transport and diffusion in random heterogeneous materials. The absolute permeability and effective diffusivity (or formation factor) of micro-scale porous media samples are computed and the uncertainty related to the sampling procedures is studied. The algorithm is then extended to transport problems and multiphase flows for the estimation of dispersion and relative permeability curves. The impact of water drops on random structured surfaces, with microfluidics applications to self-cleaning materials, is also studied and simulated. Finally, the estimation of new drag correlation laws for poly-dispersed dilute and dense suspensions is presented.

  7. Range Selection and Median

    DEFF Research Database (Denmark)

    Jørgensen, Allan Grønlund; Larsen, Kasper Green

    2011-01-01

    Range selection is the problem of preprocessing an input array A of n unique integers, such that given a query (i, j, k), one can report the k'th smallest integer in the subarray A[i], A[i+1], ..., A[j]. In this paper we consider static data structures in the word-RAM for range selection and several natural special cases thereof. The first special case is known as range median, which arises when k is fixed to ⌊(j − i + 1)/2⌋. The second case, denoted prefix selection, arises when i is fixed to 0. Finally, we also consider the bounded rank prefix selection problem and the fixed rank range selection problem. In the former, data structures must support prefix selection queries under the assumption that k is bounded by a value given at construction time, while in the latter, data structures must support range selection queries where k is fixed beforehand for all queries. We prove cell probe lower bounds...

  8. Selection of a tool to decision making for site selection for high level waste

    International Nuclear Information System (INIS)

    Madeira, J.G.; Alvin, A.C.M.; Martins, V.B.; Monteiro, N.A.

    2016-01-01

    The aim of this paper is to create a panel comparing some of the key decision-making support tools used in situations with the characteristics of the problem of selecting suitable areas for constructing a final deep geologic repository. The tools addressed in this work are well known and easy to implement. The decision-making process in matters of this kind is, in general, complex due to its multi-criteria nature and the conflicting opinions of various stakeholders. Thus, a comprehensive study of the literature on this subject was performed, specifically of documents of the International Atomic Energy Agency (IAEA), regarding the importance of the criteria involved in the decision-making process. Therefore, we highlighted six judgment attributes for selecting a decision support tool suitable for the problem. For this study, we selected the following multi-criteria tools: AHP, Delphi, Brainstorm, Nominal Group Technique and AHP-Delphi. Finally, the AHP-Delphi method proved to be the most appropriate for managing the multiple attributes inherent to the proposed problem. (authors)

  9. The Effect of Speed Alterations on Tempo Note Selection.

    Science.gov (United States)

    Madsen, Clifford K.; And Others

    1986-01-01

    Investigated the tempo note preferences of 100 randomly selected college-level musicians using familiar orchestral music as stimuli. Subjects heard selections at increased, decreased, and unaltered tempi. Results showed musicians were not accurate in estimating original tempo and showed consistent preference for faster than actual tempo.…

  10. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to

  11. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    Science.gov (United States)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    The Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their products to some customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve a CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving a soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve the performance of the heuristic method, although BRKGA-GS tends to yield worse results than the standard BRKGA.
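
    The random-key machinery can be illustrated compactly. The sketch below (Python; demands, capacity and the greedy splitting rule are our assumptions, and time windows are omitted) shows the decoder step on which BRKGA rests: a chromosome of keys in [0, 1) is sorted to obtain a customer visiting order, which is then cut into capacity-feasible routes.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    demand = np.array([4, 6, 3, 7, 5, 2])   # hypothetical customer demands
    capacity = 10                            # vehicle capacity

    def decode(keys):
        """BRKGA decoder: sort customers by their random key to get a
        visiting order, then cut the order into routes greedily whenever
        vehicle capacity would be exceeded (time windows omitted here)."""
        order = np.argsort(keys)
        routes, current, load = [], [], 0
        for c in order:
            if load + demand[c] > capacity:
                routes.append(current)
                current, load = [], 0
            current.append(int(c))
            load += demand[c]
        routes.append(current)
        return routes

    # In BRKGA, elite chromosomes are kept and the biased crossover prefers
    # elite keys; here we simply decode one random chromosome.
    keys = rng.random(len(demand))
    print(decode(keys))
    ```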

  12. Nonlinear Pricing with Random Participation

    OpenAIRE

    Jean-Charles Rochet; Lars A. Stole

    2002-01-01

    The canonical selection contracting programme takes the agent's participation decision as deterministic and finds the optimal contract, typically satisfying this constraint for the worst type. Upon weakening this assumption of known reservation values by introducing independent randomness into the agents' outside options, we find that some of the received wisdom from mechanism design and nonlinear pricing is not robust and the richer model which allows for stochastic participation affords a m...

  13. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...

  14. Impact of selective genotyping in the training population on accuracy and bias of genomic selection.

    Science.gov (United States)

    Zhao, Yusheng; Gowda, Manje; Longin, Friedrich H; Würschum, Tobias; Ranc, Nicolas; Reif, Jochen C

    2012-08-01

    Estimating marker effects based on routinely generated phenotypic data of breeding programs is a cost-effective strategy to implement genomic selection. Truncation selection in breeding populations, however, could have a strong impact on the accuracy to predict genomic breeding values. The main objective of our study was to investigate the influence of phenotypic selection on the accuracy and bias of genomic selection. We used experimental data of 788 testcross progenies from an elite maize breeding program. The testcross progenies were evaluated in unreplicated field trials in ten environments and fingerprinted with 857 SNP markers. Random regression best linear unbiased prediction method was used in combination with fivefold cross-validation based on genotypic sampling. We observed a substantial loss in the accuracy to predict genomic breeding values in unidirectional selected populations. In contrast, estimating marker effects based on bidirectional selected populations led to only a marginal decrease in the prediction accuracy of genomic breeding values. We concluded that bidirectional selection is a valuable approach to efficiently implement genomic selection in applied plant breeding programs.

  15. Noncontextuality with Marginal Selectivity in Reconstructing Mental Architectures

    Directory of Open Access Journals (Sweden)

    Ru Zhang

    2015-06-01

    Full Text Available We present a general theory of series-parallel mental architectures with selectively influenced stochastically non-independent components. A mental architecture is a hypothetical network of processes aimed at performing a task, of which we only observe the overall time it takes under variable parameters of the task. It is usually assumed that the network contains several processes selectively influenced by different experimental factors, and then the question is asked as to how these processes are arranged within the network, e.g., whether they are concurrent or sequential. One way of doing this is to consider the distribution functions for the overall processing time and compute certain linear combinations thereof (interaction contrasts). The theory of selective influences in psychology can be viewed as a special application of the interdisciplinary theory of (non)contextuality having its origins and main applications in quantum theory. In particular, lack of contextuality is equivalent to the existence of a hidden random entity of which all the random variables in play are functions. Consequently, for any given value of this common random entity, the processing times and their compositions (minima, maxima, or sums) become deterministic quantities. These quantities, in turn, can be treated as random variables with (shifted) Heaviside distribution functions, for which one can easily compute various linear combinations across different treatments, including interaction contrasts. This mathematical fact leads to a simple method, more general than the previously used ones, to investigate and characterize the interaction contrast for different types of series-parallel architectures.

  16. Statistical reviewers improve reporting in biomedical articles: a randomized trial.

    Directory of Open Access Journals (Sweden)

    Erik Cobo

    2007-03-01

    Full Text Available Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of either adding a statistical peer reviewer or suggesting the use of checklists such as CONSORT or STARD to clinical reviewers, or both. Interventions were defined as (1) the addition of a statistical reviewer to the clinical peer review process, and (2) suggesting reporting guidelines to reviewers; with "no statistical expert" and "no checklist" as controls. The two interventions were crossed in a 2x2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide an 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and at the final post-peer-review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review, and 129 were randomized. Of those, 14 that were lost to follow-up showed no differences in initial quality from the followed-up papers. Hence, 115 were included in the main analysis, with 16 rejected for publication after peer review. 21 (18.3%) of the 115 included papers were interventions, 46 (40.0%) were longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6-24.4). The effect of suggesting a guideline to the

  17. DNA-based random number generation in security circuitry.

    Science.gov (United States)

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
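
    The tests mentioned are specified in NIST SP 800-22; the abstract does not say which three were used. As an illustration, the sketch below implements the suite's simplest one, the frequency (monobit) test, in Python.

    ```python
    import math
    import random

    def monobit_test(bits):
        """Frequency (monobit) test from NIST SP 800-22: under the null of a
        fair random source, the +/-1 sum S is ~N(0, n), so the p-value is
        erfc(|S| / sqrt(n) / sqrt(2)); a sequence passes if p >= 0.01."""
        n = len(bits)
        s = sum(1 if b else -1 for b in bits)
        s_obs = abs(s) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2))

    # Try the test on Python's own pseudo-random bits.
    bits = [random.getrandbits(1) for _ in range(10000)]
    p = monobit_test(bits)
    print("p =", round(p, 4), "PASS" if p >= 0.01 else "FAIL")
    ```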

  18. Multistage Selection and the Financing of New Ventures

    OpenAIRE

    Jonathan T. Eckhardt; Scott Shane; Frédéric Delmar

    2006-01-01

    Using a random sample of 221 new Swedish ventures initiated in 1998, we examine why some new ventures are more likely than others to successfully be awarded capital from external sources. We examine venture financing as a staged selection process in which two sequential selection events systematically winnow the population of ventures and influence which ventures receive financing. For a venture to receive external financing its founders must first select it as a candidate for external fundin...

  19. Rural Women's Preference For Selected Programmes Of The ...

    African Journals Online (AJOL)

    The study focused on the rural women's preference for selected programmes of the National Special Programme for Food Security (NSPFS) in Imo State, Nigeria. Data were collected with the aid of structured interviews from 150 randomly selected women in the study area. Results from the study showed that respondents ...

  20. A New Random Walk for Replica Detection in WSNs

    Science.gov (United States)

    Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays, higher energy expenditure, and a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs like military and medical. PMID:27409082

  1. A New Random Walk for Replica Detection in WSNs.

    Science.gov (United States)

    Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays, higher energy expenditure, and a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs like military and medical.
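
    The constrained walk is straightforward to sketch. The snippet below (plain Python; the graph and names are ours, not the SSRWND protocol code) implements a walk with single-stage memory: it avoids stepping back to the node visited immediately before whenever another neighbour exists, which reduces revisits relative to a simple random walk.

    ```python
    import random

    def memory_random_walk(adj, start, steps, rng=random):
        """Random walk with single-stage memory: at each step, avoid
        returning to the previously visited node when another neighbour
        exists (a sketch of the idea, not the SSRWND protocol itself)."""
        path = [start]
        prev = None
        current = start
        for _ in range(steps):
            choices = [v for v in adj[current] if v != prev]
            if not choices:              # dead end: backtracking allowed
                choices = adj[current]
            nxt = rng.choice(choices)
            prev, current = current, nxt
            path.append(current)
        return path

    # Tiny sensor-network-like graph given as an adjacency list.
    adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}
    print(memory_random_walk(adj, start=0, steps=10))
    ```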

  2. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentration of heavy metals (zinc, cadmium, lead, copper, mercury, iron and calcium) in the head hair of a randomly selected sample of Kenyan people, using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS), has been undertaken. The percent relative standard deviation for each sample analysed using either of the techniques shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  3. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  4. Portfolio Selection with Jumps under Regime Switching

    Directory of Open Access Journals (Sweden)

    Lin Zhao

    2010-01-01

    Full Text Available We investigate a continuous-time version of the mean-variance portfolio selection model with jumps under regime switching. The portfolio selection problem is proposed and analyzed for a market consisting of one bank account and multiple stocks. The random regime switching is assumed to be independent of the underlying Brownian motion and jump processes. A Markov chain modulated diffusion formulation is employed to model the problem.

  5. Music in the cath lab: who should select it?

    Science.gov (United States)

    Goertz, Wolfram; Dominick, Klaus; Heussen, Nicole; vom Dahl, Juergen

    2011-05-01

    The ALMUT study set out to evaluate the anxiolytic effects of different music styles and of no music in 200 patients undergoing cardiac catheterization, and to assess whether it makes a difference if patients select one of these therapies or are randomized to one of them. The anxiolytic and analgesic effects of music have been described in previous trials. Some authors have suggested evaluating whether patient-selected music is more effective in reducing anxiety and stress levels than music selected by the physician. After randomization, 100 patients (group A) were allowed to choose between classical music, relaxing modern music, smooth jazz, and no music. One hundred patients (group B) were randomized directly to one of these therapies (n = 25 each). Complete data were available for 197 patients (65 ± 10 years; 134 male). Using the State-Trait Anxiety Inventory (STAI), all patients in group B who listened to music showed a significantly higher decrease of their anxiety level (STAI-State difference pre-post of 16.8 ± 10.2) compared to group A (13.3 ± 11.1; p = 0.0176). Patients without music (6.2 ± 6.7) had a significantly weaker reduction of anxiety compared to all music listeners (14.9 ± 10.7, p ...). These results on music in the cath lab support previous reports. Surprisingly, the hypothesis that the patient's choice of preferred music might yield higher benefits than a randomized assignment could be dismissed.

  6. Random matrix models for phase diagrams

    International Nuclear Information System (INIS)

    Vanderheyden, B; Jackson, A D

    2011-01-01

    We describe a random matrix approach that can provide generic and readily soluble mean-field descriptions of the phase diagram for a variety of systems ranging from quantum chromodynamics to high-T_c materials. Instead of working from specific models, phase diagrams are constructed by averaging over the ensemble of theories that possesses the relevant symmetries of the problem. Although approximate in nature, this approach has a number of advantages. First, it can be useful in distinguishing generic features from model-dependent details. Second, it can help in understanding the 'minimal' number of symmetry constraints required to reproduce specific phase structures. Third, the robustness of predictions can be checked with respect to variations in the detailed description of the interactions. Finally, near critical points, random matrix models bear strong similarities to Ginzburg-Landau theories, with the advantage of additional constraints inherited from the symmetries of the underlying interaction. These constraints can be helpful in ruling out certain topologies in the phase diagram. In this Key Issues Review, we illustrate the basic structure of random matrix models, discuss their strengths and weaknesses, and consider the kinds of system to which they can be applied.

  7. Project No. 8 - Final decommissioning plan

    International Nuclear Information System (INIS)

    2000-01-01

    Ignalina NPP should prepare the final Ignalina NPP Unit 1 decommissioning plan by March 31, 2002. This plan should include the following: a description of Ignalina NPP and the Ignalina NPP boundary that could be influenced by the decommissioning process; the decommissioning strategy selected and a logical substantiation for this selection; a description of the decommissioning actions suggested and a time schedule for the actions to be performed; a conceptual safety and environmental impact assessment covering ionizing radiation and other impacts on man and the environment; a description of the environmental monitoring programme proposed during the decommissioning process; a description of the waste management proposed; and an assessment of decommissioning expenses, including waste management, accumulated funds and other sources. Estimated project cost: 0.75 M EURO.

  8. Annuities under random rates of interest - revisited

    OpenAIRE

    Burnecki, K.; Marciniuk, A.; Weron, A.

    2001-01-01

    In the article we consider accumulated values of annuities-certain with yearly payments with independent random interest rates. We focus on annuities with payments varying in arithmetic and geometric progression which are important basic varying annuities (see Kellison, 1991). They appear to be a generalization of the types studied recently by Zaks (2001). We derive, via recursive relationships, mean and variance formulae of the final values of the annuities. As a consequence, we obtain momen...

  9. Contractor Selection in Saudi Arabia

    OpenAIRE

    M. A. Bajaber; M. A. Taha

    2012-01-01

    Contractor selection in Saudi Arabia is very important due to the large construction boom and the contractor's role in getting over construction risks. The need for investigating contractor selection is due to the following reasons: the large number of defaulted or failed projects (18%), the large number of disputes attributed to contractors during the project execution stage (almost twofold), the extension of the General Agreement on Tariffs and Trade (GATT) into the construction industry, and finally the few ...

  10. Stand basal area model for Cunninghamia lanceolata (Lamb.) Hook ...

    African Journals Online (AJOL)

    When evaluating the predictive accuracy of the final model, the first measurement was used for estimation of random parameters. The Chapman–Richards model was finally selected for the basic model based on model-fitting statistics, and both the fitting model and validation data with site-, block- and plot-level random ...

  11. Combined impact of negative lifestyle factors on cardiovascular risk in children: a randomized prospective study

    OpenAIRE

    Meyer, Ursina; Schindler, Christian; Bloesch, Tamara; Schmocker, Eliane; Zahner, Lukas; Puder, Jardena J; Kriemler, Susi

    2014-01-01

    PURPOSE: Negative lifestyle factors are known to be associated with increased cardiovascular risk (CVR) in children, but research on their combined impact on a general population of children is sparse. Therefore, we aimed to quantify the combined impact of easily assessable negative lifestyle factors on the CVR scores of randomly selected children after 4 years. METHODS: Of the 540 randomly selected 6- to 13-year-old children, 502 children participated in a baseline health assessment, and ...

  12. Brain Tumor Segmentation Based on Random Forest

    Directory of Open Access Journals (Sweden)

    László Lefkovits

    2016-09-01

    Full Text Available In this article we present a discriminative model for tumor detection from multimodal MR images. The main part of the model is built around the random forest (RF) classifier. We created an optimization algorithm able to select the important features for reducing the dimensionality of the data. This method is also used to find the training parameters used in the learning phase. The algorithm is based on random feature properties for evaluating the importance of the variable, the evolution of learning errors and the proximities between instances. The detection performances obtained have been compared with the most recent systems, offering similar results.

  13. Exploring pseudo- and chaotic random Monte Carlo simulations

    Science.gov (United States)

    Blais, J. A. Rod; Zhang, Zhan

    2011-07-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
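
    A minimal sketch of the variance-reduction comparison described here, assuming numpy: the same number of pseudo-random points is used either as plain Monte Carlo or stratified over equal sub-intervals of (0, 1); the integrand e^x is our toy choice, not one from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    f = lambda x: np.exp(x)          # integrand on (0, 1); exact integral e - 1

    n, strata = 10000, 100

    # Plain Monte Carlo with pseudo-random numbers on (0, 1).
    plain = f(rng.random(n)).mean()

    # Stratified sampling: n/strata points forced into each sub-interval,
    # which removes the between-strata component of the variance.
    k = n // strata
    u = (np.arange(strata)[:, None] + rng.random((strata, k))) / strata
    stratified = f(u).mean()

    exact = np.e - 1
    print("plain MC error:     ", abs(plain - exact))
    print("stratified MC error:", abs(stratified - exact))
    ```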

  14. Report on ''questions of site selection''

    International Nuclear Information System (INIS)

    Alt, Stefan; Kallenbach-Herbert, Beate; Neles, Julia

    2016-01-01

    The report on radioactive waste site selection questions covers the following issues: excluded options: disposal in space, Antarctic, Greenland or oceans, surface storage without final deep geologic repository; possible alternatives: final disposal in deep boreholes, long-term interim storage, transmutation; central confinement function for radioactive wastes - geologic and/or technical barriers? Final repository monitoring: geo-scientific exclusion criteria, geo-scientific minimum requirements, geo-scientific decision criteria; geo-scientific data: information status and handling of regions with non-sufficient geo-scientific data; scientific planning criteria: basis for definitions concerning the content, procedural aspects; analysis of the socio-economic potential; requirements for the disposal of further radioactive wastes; requirements concerning the containers for final disposal.

  15. Chemical library subset selection algorithms: a unified derivation using spatial statistics.

    Science.gov (United States)

    Hamprecht, Fred A; Thiel, Walter; van Gunsteren, Wilfred F

    2002-01-01

    If similar compounds have similar activity, rational subset selection becomes superior to random selection in screening for pharmacological lead discovery programs. Traditional approaches to this experimental design problem fall into two classes: (i) a linear or quadratic response function is assumed; (ii) some space-filling criterion is optimized. The assumptions underlying the first approach are clear but not always defendable; the second approach yields more intuitive designs but lacks a clear theoretical foundation. We model activity in a bioassay as the realization of a stochastic process and use the best linear unbiased estimator to construct spatial sampling designs that optimize the integrated mean square prediction error, the maximum mean square prediction error, or the entropy. We argue that our approach constitutes a unifying framework encompassing most proposed techniques as limiting cases and sheds light on their underlying assumptions. In particular, vector quantization is obtained, in dimensions up to eight, in the limiting case of very smooth response surfaces for the integrated mean square error criterion. Closest packing is obtained for very rough surfaces under the integrated mean square error and entropy criteria. We suggest using either the integrated mean square prediction error or the entropy as optimization criteria rather than approximations thereof, and propose a scheme for direct iterative minimization of the integrated mean square prediction error. Finally, we discuss how the quality of chemical descriptors manifests itself and clarify the assumptions underlying the selection of diverse or representative subsets.

  16. Fixation probability in a two-locus intersexual selection model.

    Science.gov (United States)

    Durand, Guillermo; Lessard, Sabin

    2016-06-01

    We study a two-locus model of intersexual selection in a finite haploid population reproducing according to a discrete-time Moran model with a trait locus expressed in males and a preference locus expressed in females. We show that the probability of ultimate fixation of a single mutant allele for a male ornament introduced at random at the trait locus given any initial frequency state at the preference locus is increased by weak intersexual selection and recombination, weak or strong. Moreover, this probability exceeds the initial frequency of the mutant allele even in the case of a costly male ornament if intersexual selection is not too weak. On the other hand, the probability of ultimate fixation of a single mutant allele for a female preference towards a male ornament introduced at random at the preference locus is increased by weak intersexual selection and weak recombination if the female preference is not costly, and is strong enough in the case of a costly male ornament. The analysis relies on an extension of the ancestral recombination-selection graph for samples of haplotypes to take into account events of intersexual selection, while the symbolic calculation of the fixation probabilities is made possible in a reasonable time by an optimizing algorithm. Copyright © 2016 Elsevier Inc. All rights reserved.
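
    To make the notion of fixation probability concrete, here is a deliberately simplified sketch (plain Python): a single-locus haploid Moran model rather than the paper's two-locus intersexual-selection model, with mutant relative fitness 1 + s. In the neutral case the estimate should approach the theoretical 1/N.

    ```python
    import random

    def moran_fixation_prob(N, s=0.0, trials=2000, rng=random):
        """Estimate the fixation probability of a single mutant in a haploid
        Moran model of size N; the mutant's relative fitness is 1 + s.
        (A simplified single-locus stand-in for the model discussed above.)"""
        fixed = 0
        for _ in range(trials):
            i = 1                                    # mutant copies
            while 0 < i < N:
                # Reproducer chosen with fitness weighting; the individual
                # it replaces is chosen uniformly at random.
                p_mut_repro = i * (1 + s) / (i * (1 + s) + (N - i))
                birth_is_mut = rng.random() < p_mut_repro
                death_is_mut = rng.random() < i / N
                i += (birth_is_mut and not death_is_mut) - (death_is_mut and not birth_is_mut)
            fixed += (i == N)
        return fixed / trials

    # Neutral case: theory gives 1/N.
    print(moran_fixation_prob(N=20, s=0.0), "vs theory", 1 / 20)
    ```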

  17. Evaluation of the tolerance to Finale® in the germination and regeneration of Cuban rice varieties (IACuba-17 and IACuba-19)

    Directory of Open Access Journals (Sweden)

    Daymí Abreu

    2005-01-01

    Full Text Available The selection agent used during shoot selection plays an important role in the efficiency of transgenic plant generation. In this work, the tolerance to the herbicide Finale® of two Cuban rice cultivars, IACuba-17 and IACuba-19, was evaluated; 10 days of exposure to 5 and 10 mg.l-1 of Finale® were enough to avoid seedlings of IAC-17 and IAC-19, respectively. Calluses cultivated (0, 2, 4 and 6 days) in the absence of Finale® in the regeneration medium were used to evaluate the minimal concentration of Finale® that totally inhibits shoot regeneration. Pre-induced calluses cultured for two days and 3 mg.l-1 of Finale® in the regeneration medium was the most efficient combination to select shoots during the generation of transgenic plants resistant to the herbicide. Our shoot selection procedure reduces to 3 weeks the time needed to obtain shoots during the generation of transgenic rice plants. Key words: germination, mature seeds, Oryza, phosphinothricin, regeneration, selection markers

  18. Energy efficient selective reforming of hydrocarbons. ERA-NET Bioenergy. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rodin, J.

    2010-07-15

    The research project 'Energy efficient selective reforming of hydrocarbons', funded by the Swedish Energy Agency and Energinet.dk, has now reached its end. This report is an overview of the work; details of the work within the different areas can be found in the reports from each part. In this project, an innovative method for tar removal and reformation of hydrocarbons was investigated: Chemical Looping Reforming (CLR). This gas treatment has the potential to be economically competitive, reliable and environmentally friendly (due to higher energy efficiency, amongst others). The aim of the CLR is to (1) eliminate downstream problems with tar, (2) simplify the energy recovery from the hot product gas, and (3) selectively save lighter hydrocarbons for the production of synthetic natural gas (SNG). A guarantor for the outcome of the project is the engagement of Goeteborg Energi, which has a commitment to build a 20 MW output SNG plant by 2012. DTU (Danish Technical University) was responsible for carrying out the laboratory part, where different oxygen carriers for the CLR were evaluated for their capability of selectively reforming hydrocarbons. The conclusion was that, of the four carriers tested, Mn and Ni40 were the most promising. CUT (Chalmers University of Technology) installed a 600 W CLR unit connected to a slipstream from the gasifier. During the 2010 firing season the CLR was tested with raw gas for 36 hours, and the results so far show that the equipment works as intended and that it can reduce the amount of tars substantially. GE (Goeteborg Energi AB), together with SEP (Scandinavian Energy Project AB) and CUT, studied the integration of a methane production plant into an existing boiler. The main focus of the study was the gasifier and the CLR. The integration of a 100 MW methane production plant is estimated to cost 1.3-2.4 billion SEK. The different work packages have altogether shown that a CLR is a possible solution to the tar problem

  19. How random is a random vector?

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2015-01-01

    Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation” –the square root of the generalized variance–is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index” –a derivative of the Wilks standard deviation–is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams”—tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.

  20. How random is a random vector?

    Science.gov (United States)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" -the square root of the generalized variance-is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" -a derivative of the Wilks standard deviation-is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams"-tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
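
    The definitions above are explicit enough to compute directly. A minimal numpy sketch (ours, not the paper's) of the Wilks standard deviation, showing how correlation between components shrinks it:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def wilks_sd(x):
        """Wilks standard deviation of a random-vector sample: the square
        root of the generalized variance, i.e. of det(covariance matrix)."""
        cov = np.cov(x, rowvar=False)
        return np.sqrt(np.linalg.det(cov))

    # Correlated 2-D sample: correlation shrinks det(cov), and hence the
    # Wilks standard deviation, relative to the independent case.
    independent = rng.normal(size=(5000, 2))
    target_cov = np.array([[1.0, 0.9], [0.9, 1.0]])
    correlated = independent @ np.linalg.cholesky(target_cov).T
    print(wilks_sd(independent))   # ~1 (identity covariance)
    print(wilks_sd(correlated))    # ~sqrt(1 - 0.9**2) ≈ 0.44
    ```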

  1. A Robust and Versatile Method of Combinatorial Chemical Synthesis of Gene Libraries via Hierarchical Assembly of Partially Randomized Modules

    Science.gov (United States)

    Popova, Blagovesta; Schubert, Steffen; Bulla, Ingo; Buchwald, Daniela; Kramer, Wilfried

    2015-01-01

    A major challenge in gene library generation is to guarantee a large functional size and diversity that significantly increases the chances of selecting different functional protein variants. The use of trinucleotide mixtures for controlled randomization results in superior library diversity and offers the ability to specify the type and distribution of the amino acids at each position. Here we describe the generation of a high-diversity gene library using tHisF of the hyperthermophile Thermotoga maritima as a scaffold. Combining various rational criteria with contingency, we targeted 26 selected codons of the thisF gene sequence for randomization at a controlled level. We have developed a novel method of creating full-length gene libraries by combinatorial assembly of smaller sub-libraries. Full-length libraries of high diversity can easily be assembled on demand from smaller and much less diverse sub-libraries, which circumvents the notoriously troublesome long-term archiving and repeated proliferation of high-diversity ensembles of phages or plasmids. We developed a generally applicable software tool for sequence analysis of mutated gene sequences that provides efficient assistance for the analysis of library diversity. Finally, the practical utility of the library was demonstrated in principle by assessing the conformational stability of library members and isolating protein variants with HisF activity from it. Our approach integrates a number of features of nucleic acid synthetic chemistry, biochemistry and molecular genetics into a coherent, flexible and robust method of combinatorial gene synthesis. PMID:26355961

  2. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)

  3. 77 FR 2606 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2012-01-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0004] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  4. 75 FR 9018 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2010-02-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2010-0034] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  5. Influence of Maximum Inbreeding Avoidance under BLUP EBV Selection on Pinzgau Population Diversity

    Directory of Open Access Journals (Sweden)

    Radovan Kasarda

    2011-05-01

    Full Text Available Evaluated was the effect of mating (random vs. maximum avoidance of inbreeding) under a BLUP EBV selection strategy. The existing population structure was analyzed by Monte Carlo stochastic simulation with the aim of minimizing the increase of inbreeding. Maximum avoidance of inbreeding under BLUP selection resulted in an increase of inbreeding comparable to random mating over an average of 10 generations of development. After 10 generations of simulation of the mating strategy, the observed increase was ΔF = 6.51% (2 sires), 5.20% (3 sires), 3.22% (4 sires) and 2.94% (5 sires). With an increased number of sires selected, a decrease of inbreeding was observed. With the use of 4 or 5 sires, the increase of inbreeding was comparable to random mating with phenotypic selection. To preserve genetic diversity and prevent population loss, it is important to minimize the increase of inbreeding in small populations. The classical approach was based on balancing the ratio of sires and dams in the mating program. By contrast, in most commercial populations a small number of sires is used with a high mating ratio.

  6. Supplier Selection by Coupling-Attribute Combinatorial Analysis

    Directory of Open Access Journals (Sweden)

    Xinyu Sun

    2017-01-01

    Full Text Available Increasing reliance on outsourcing has made supplier selection a critical success factor for a supply chain/network. In addition to cost, the synergy among product components and supplier selection criteria should be considered holistically during the supplier selection process. This paper shows this synergy using coupled-attribute analysis. The key coupling attributes, including total cost, quality, delivery reliability, and delivery lead time of the final product, are identified and formulated. A max-max model is designed to assist the selection of the optimal combination of suppliers. The results are compared with individual supplier selection. Management insights are also discussed.

  7. 48 CFR 36.607 - Release of information on firm selection.

    Science.gov (United States)

    2010-10-01

    ... firm selection. 36.607 Section 36.607 Federal Acquisition Regulations System FEDERAL ACQUISITION... Services 36.607 Release of information on firm selection. (a) After final selection has taken place, the contracting officer may release information identifying only the architect-engineer firm with which a contract...

  8. Adoption of selected innovations in rice production and their effect ...

    African Journals Online (AJOL)

    Adoption of selected innovations in rice production and their effect on farmers living standard in Bauchi local government area, Bauchi state, Nigeria. ... International Journal of Natural and Applied Sciences ... Simple random sampling technique was used for the selection of 82 rice growers from these villages. The data ...

  9. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    Science.gov (United States)

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, user-interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.

  10. Response to family selection and genetic parameters in Japanese quail selected for four week breast weight

    DEFF Research Database (Denmark)

    Khaldari, Majid; Yeganeh, Hassan Mehrabani; Pakdel, Abbas

    2011-01-01

    An experiment was conducted to investigate the effect of short-term selection for 4-week breast weight (4wk BRW) and to estimate genetic parameters of body weight and carcass traits. A selection (S) line and a control (C) line were randomly selected from a base population. Data were collected over ... was 0.35±0.06. There was a significant difference between lines for BW and carcass weights, but not for carcass percent components (P ...). ... carcass and leg weights were 0.46, 0.41 and 0.47, and 13.2, 16.2 and 4.4%, respectively. The genetic correlations of BRW with BW, carcass, leg, and back weights were 0.85, 0.88 and 0.72, respectively. Selection for 4wk BRW improved the feed conversion ratio (FCR) by about 0.19 units over the selection period. Inbreeding caused an insignificant decline in the means of some traits. Results from...

  11. Selection of a tool to support decision making for site selection for high level waste - 15010

    International Nuclear Information System (INIS)

    Madeira, J.G.; Alvim, A.C.M.; Martins, V.B.; Monteiro, N.A.

    2015-01-01

    The aim of this paper is to create a panel comparing some of the key decision-making support tools used in situations with the characteristics of the problem of selecting suitable areas for constructing a final deep geologic repository. The tools presented in this work are well known and easy to implement. The decision-making process in issues of this kind is, in general, complex due to its multi-criteria nature and the conflicting opinions of various stakeholders. Thus, a comprehensive study of the literature on this subject was performed, specifically of documents of the International Atomic Energy Agency (IAEA), regarding the importance of the criteria involved in the decision-making process. Therefore, we highlighted six judgment attributes for selecting an adequate support tool: transparency and reliability; subjectivity; updating and adapting; multi-criteria analysis; ease of deployment; and application time. We selected the following key decision-making support tools: AHP, Delphi, Brainstorm, Nominal Group Technique, and AHP-Delphi. Finally, the AHP-Delphi method was demonstrated to be the most appropriate for managing the multiple attributes inherent to the proposed problem

  12. Characterize and Model Final Waste Formulations and Offgas Solids from Thermal Treatment Processes - FY-98 Final Report for LDRD 2349

    Energy Technology Data Exchange (ETDEWEB)

    Kessinger, Glen Frank; Nelson, Lee Orville; Grandy, Jon Drue; Zuck, Larry Douglas; Kong, Peter Chuen Sun; Anderson, Gail

    1999-08-01

    The purpose of LDRD #2349, Characterize and Model Final Waste Formulations and Offgas Solids from Thermal Treatment Processes, was to develop a set of tools that would allow the user, based on the chemical composition of a waste stream to be immobilized, to predict the durability (leach behavior) of the final waste form and the phase assemblages present in the final waste form. The objectives of the project were: investigation, testing and selection of a thermochemical code; development of an auxiliary thermochemical database; synthesis of materials for leach testing; collection of leach data; use of leach data for leach model development; and thermochemical modeling. The progress toward completion of these objectives and a discussion of work that needs to be completed to arrive at a logical finishing point for this project will be presented.

  13. 40 CFR 205.57-2 - Test vehicle sample selection.

    Science.gov (United States)

    2010-07-01

    ... pursuant to a test request in accordance with this subpart will be selected in the manner specified in the... then using a table of random numbers to select the number of vehicles as specified in paragraph (c) of... with the designated AQL are contained in Appendix I, Table II. (c) The appropriate batch sample size...

  14. Selection of Celebrity Endorsers

    DEFF Research Database (Denmark)

    Hollensen, Svend; Schimmelpfennig, Christian

    2013-01-01

    Purpose – This research aims at shedding some light on the various avenues marketers can undertake until finally an endorsement contract is signed. The focus of the study lies on verifying the generally held assumption that endorser selection is usually taken care of by creative agencies, vetting several candidates by means of subtle evaluation procedures. Design/methodology/approach – Case study research has been carried out among companies experienced in celebrity endorsements to learn more about the endorser selection process in practice. Based on these cases, theory is inductively developed. Findings – Our research suggests that the generally held assumption of endorsers being selected and thoroughly vetted by a creative agency may not be universally valid. A normative model to illustrate the continuum of the selection process in practice is suggested, and the two polar case studies (Swiss brand...

  15. Record statistics of financial time series and geometric random walks.

    Science.gov (United States)

    Sabir, Behlool; Santhanam, M S

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.
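
    A small simulation, assuming numpy, illustrates the quantities studied here: it generates a geometric random walk, extracts the ages of its running-maximum records, and reports the longest one. This sketches the bookkeeping only; it does not reproduce the paper's exponent estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def record_ages(series):
        """Ages of the running-maximum records of a series: waiting times
        between successive records (the last, still-open record is aged up
        to the end of the series)."""
        record_times = [0]
        current_max = series[0]
        for t, x in enumerate(series[1:], start=1):
            if x > current_max:
                current_max = x
                record_times.append(t)
        record_times.append(len(series))
        return np.diff(record_times)

    # Geometric random walk: S_t = S_0 * exp(cumulative sum of Gaussian steps).
    log_steps = rng.normal(0.0, 0.02, size=100000)
    prices = np.exp(np.cumsum(log_steps))
    ages = record_ages(prices)
    print("number of records:", len(ages), "longest record age:", ages.max())
    ```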

  16. Mating schemes for optimum contribution selection with constrained rates of inbreeding

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2000-01-01

    The effect of non-random mating on genetic response was compared for populations with discrete generations. Mating followed a selection step where the average coancestry of selected animals was constrained, while genetic response was maximised. Minimum coancestry (MC), Minimum coancestry with a

  17. Teleology and randomness in the development of natural sciences research: systems, ontology and evolution

    Directory of Open Access Journals (Sweden)

    Paulo Pereira Martins Júnior

    2011-12-01

    Full Text Available This is an investigation on the subject of teleology, which has been dealt with throughout the history of human thought, with special emphasis on the interval related to the development of scientific theories referring to the study of Nature. The presentation of the subject starts with the conceptual definitions of teleology. Following this, the subject is revisited along the historical application of the concept in the development of science. In this respect, the first approach is about teleology in biology and the life sciences, with emphasis on the repercussion over the vitalist conception and natural selection. Hence, the discussion turns around the dialectic conceptions of teleological systems and random systems. Finally, this paper finishes with a thought about how these themes may be pertinent within environmental studies, wherein physical, biological and human systems are in co-operation, with the various applications of nuances and uses of the teleological concept.

  18. Redd Site Selection and Spawning Habitat Use by Fall Chinook Salmon, Hanford Reach, Columbia River: Final Report 1995 - 1998.

    Energy Technology Data Exchange (ETDEWEB)

    Geist, David R.

    1999-05-01

    year to year. The tendency to spawn in clusters suggests fall chinook salmon's use of spawning habitat is highly selective. Hydraulic characteristics of the redd clusters were significantly different than the habitat surrounding them. Velocity and lateral slope of the river bottom were the most important habitat variables in predicting redd site selection. While these variables explained a large proportion of the variance in redd site selection (86 to 96%), some unmeasured factors still accounted for a small percentage of actual spawning site selection. Chapter three describes the results from an investigation into the hyporheic characteristics of the two spawning areas studied in chapter two. This investigation showed that the magnitude and chemical characteristics of hyporheic discharge were different between and within two spawning areas. Apparently, fall chinook salmon used chemical and physical cues from the discharge to locate spawning areas. Finally, chapter four describes a unique method that was developed to install piezometers into the cobble bed of the Columbia River.

  19. PREFACE: The random search problem: trends and perspectives The random search problem: trends and perspectives

    Science.gov (United States)

    da Luz, Marcos G. E.; Grosberg, Alexander; Raposo, Ernesto P.; Viswanathan, Gandhi M.

    2009-10-01

    aircraft, a given web site). Regarding the nature of the searching drive, in certain instances it can be guided almost entirely by external cues, either by the cognitive (memory) or detective (olfaction, vision, etc) skills of the searcher. However, in many situations the movement is non-oriented, being in essence a stochastic process. Therefore, in such cases (and even when a small deterministic component in the locomotion exists) a random search effectively defines the final rates of encounters. Hence, one reason underlying the richness of the random search problem relates precisely to the `ignorance' of the locations of the randomly located targets. Contrary to conventional wisdom, the lack of complete information does not necessarily lead to greater complexity. As an illustrative example, consider the case of complete information. If the positions of all target sites are known in advance, then the question of in what sequential order to visit the sites so as to reduce the energy costs of locomotion becomes a rather challenging problem: the famous `travelling salesman' optimization query, belonging to the NP-complete class of problems. The ignorance of the target site locations, however, considerably modifies the problem and renders it not amenable to treatment by purely deterministic computational methods. In fact, as expected, the random search problem is not particularly suited to search algorithms that do not use elements of randomness. So, only a statistical approach to the search problem can adequately deal with the element of ignorance. In other words, the incomplete information renders the search under-determined, i.e., it is not possible to find the `best' solution to the problem because not all the information is given. Instead, one must guess, and probabilistic or stochastic strategies become unavoidable. Also, the random search problem bears a relation to reaction-diffusion processes, because the search involves a diffusive aspect, movement, as well as a

  20. OPAL: selection and acquisition of LEP data

    International Nuclear Information System (INIS)

    Le Du, P.

    1985-01-01

    The OPAL project (Omni Purpose Apparatus for LEP) is presented. It will serve as a framework and an example for explaining the main problems and limitations concerning the mode of event selection, acquisition and information transfer to the final registering system. A quick review of the different problems related to data selection and acquisition is made. [fr]

  1. Analytic regularity and collocation approximation for elliptic PDEs with random domain deformations

    KAUST Repository

    Castrillon, Julio

    2016-03-02

    In this work we consider the problem of approximating the statistics of a given Quantity of Interest (QoI) that depends on the solution of a linear elliptic PDE defined over a random domain parameterized by N random variables. The elliptic problem is remapped onto a corresponding PDE with a fixed deterministic domain. We show that the solution can be analytically extended to a well-defined region in C^N with respect to the random variables. A sparse grid stochastic collocation method is then used to compute the mean and variance of the QoI. Finally, convergence rates for the mean and variance of the QoI are derived and compared to those obtained in numerical experiments.

  2. Vast Portfolio Selection with Gross-exposure Constraints().

    Science.gov (United States)

    Fan, Jianqing; Zhang, Jingjin; Yu, Ke

    2012-01-01

    We introduce large portfolio selection using gross-exposure constraints. We show that with the gross-exposure constraint the empirically selected optimal portfolios based on estimated covariance matrices have performance similar to the theoretical optimal ones, and there is no error accumulation effect from the estimation of vast covariance matrices. This gives theoretical justification to the empirical results in Jagannathan and Ma (2003). We also show that the no-short-sale portfolio can be improved by allowing some short positions. The applications to portfolio selection, tracking, and improvements are also addressed. The utility of our new approach is illustrated by simulation and empirical studies on the 100 Fama-French industrial portfolios and the 600 stocks randomly selected from Russell 3000.
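
    A gross-exposure constraint is simply an L1 bound on the weight vector. The sketch below is an illustrative toy, not the paper's estimator: it minimizes the variance of a simulated portfolio subject to full investment and ||w||_1 ≤ c, using a general-purpose solver; the simulated returns, the value c = 1.6, and the solver choice are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy daily returns for p assets (stand-ins for real stock data).
p, n = 10, 500
returns = rng.normal(0.0005, 0.01, size=(n, p))
Sigma = np.cov(returns, rowvar=False)  # estimated covariance matrix

c = 1.6  # gross-exposure bound; c = 1 recovers the no-short-sale portfolio
constraints = [
    {"type": "eq",   "fun": lambda w: np.sum(w) - 1.0},        # fully invested
    {"type": "ineq", "fun": lambda w: c - np.sum(np.abs(w))},  # ||w||_1 <= c
]
w0 = np.full(p, 1.0 / p)
# SLSQP copes with the (nondifferentiable) L1 constraint well enough for a demo;
# a production solver would reformulate it as a linear/quadratic program.
res = minimize(lambda w: w @ Sigma @ w, w0, constraints=constraints)
print("weights:", np.round(res.x, 3), "\nportfolio variance:", res.fun)
```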

  3. Final storage of radioactive waste

    International Nuclear Information System (INIS)

    Ziehm, Cornelia

    2015-01-01

    As explained in the present article, operators of nuclear power plants are responsible for the safe final disposal of the radioactive wastes they produce on the strength of the polluter pays principle. To shift the burden of responsibility for safe disposal to society as a whole would violate this principle and is therefore not possible. The polluter pays principle follows from more general principles of the fair distribution of benefits and burdens. Instances of its implementation are to be found in the national Atomic Energy Law as well as in the European Radioactive Waste and Spent Fuel Management Directive. The polluters in this case are in particular responsible for financing the installation and operation of final disposal sites. The reserves accumulated so far for the decommissioning and dismantling of nuclear power plants and disposal of radioactive wastes, including the installation and operation of final disposal sites, should be transferred to a public-law fund. This fund should be supplemented by the polluters to cover further foreseeable costs not covered by the reserves accumulated so far, including a realistic cost increase factor, appropriate risk reserves as well as the costs of the site selection procedure and a share in the costs for the safe closure of the final disposal sites of Morsleben and Asse II. This would merely be implementing in the sphere of atomic law what has long been standard practice in other areas of environmental law involving environmental hazards.

  4. Novel β-lactamase-random peptide fusion libraries for phage display selection of cancer cell-targeting agents suitable for enzyme prodrug therapy

    Science.gov (United States)

    Shukla, Girja S.; Krag, David N.

    2010-01-01

    Novel phage-displayed random linear dodecapeptide (X12) and cysteine-constrained decapeptide (CX10C) libraries constructed in fusion to the amino-terminus of P99 β-lactamase molecules were used for identifying β-lactamase-linked cancer cell-specific ligands. The size and quality of both libraries were comparable to the standards of other reported phage display systems. Using the single-round panning method based on phage DNA recovery, we identified several β-lactamase fusion peptides that specifically bind to live human breast cancer MDA-MB-361 cells. The β-lactamase fusion to the peptides helped in conducting the enzyme activity-based clone normalization and cell-binding screening in a very time- and cost-efficient manner. The methods were suitable for 96-well readout as well as microscopic imaging. The success of the biopanning was indicated by the presence of ~40% cancer cell-specific clones among recovered phages. One of the binding clones appeared multiple times. The cancer cell-binding fusion peptides also shared several significant motifs. This opens a new way of preparing and selecting phage display libraries. The cancer cell-specific β-lactamase-linked affinity reagents selected from these libraries can be used for any application that requires a reporter for tracking the ligand molecules. Furthermore, these affinity reagents have also a potential for their direct use in the targeted enzyme prodrug therapy of cancer. PMID:19751096

  5. Glufosinate ammonium selection of transformed Arabidopsis.

    Science.gov (United States)

    Weigel, Detlef; Glazebrook, Jane

    2006-12-01

    INTRODUCTION: One of the most commonly used markers for the selection of transgenic Arabidopsis is resistance to glufosinate ammonium, an herbicide that is sold under a variety of trade names including Basta and Finale. Resistance to glufosinate ammonium is conferred by the bacterial bialophos resistance gene (BAR) encoding the enzyme phosphinothricin acetyltransferase (PAT). This protocol describes the use of glufosinate ammonium to select transformed Arabidopsis plants. The major advantage of glufosinate ammonium selection is that it can be performed on plants growing in soil and does not require the use of sterile techniques.

  6. Pseudo-random data acquisition geometry in 3D seismic survey; Sanjigen jishin tansa ni okeru giji random data shutoku reiauto ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Minegishi, M; Tsuburaya, Y [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    Influence of pseudo-random geometry on the imaging for 3D seismic exploration data acquisition has been investigated using a simple model, by comparison with a regular geometry. When constituting the wave front by the interference of elemental waves, pseudo-random geometry data did not always provide good results. In the case of a point diffractor, the imaging operation, where the constituted wave front was returned to the point diffractor by the interference of elemental waves for the spatially aliased records, did not always give clear images. In the case of multi-point diffractors, good images were obtained with less noise generation in spite of aliased records. There are many diffractors in actual geological structures, which corresponds to the case of multi-point diffractors. Finally, better images could be obtained by inputting records acquired using the pseudo-random geometry rather than by inputting spatially aliased records acquired using the regular geometry. 7 refs., 6 figs.

  7. Malaria parasitemia amongst pregnant women attending selected ...

    African Journals Online (AJOL)

    A cross-sectional study to determine malaria parasitemia amongst 300 randomly selected pregnant women attending government and private healthcare facilities in Rivers State was carried out. Blood samples were obtained through venous procedure and the presence or absence of Plasmodium was determined ...

  8. Polyatomic Trilobite Rydberg Molecules in a Dense Random Gas.

    Science.gov (United States)

    Luukko, Perttu J J; Rost, Jan-Michael

    2017-11-17

    Trilobites are exotic giant dimers with enormous dipole moments. They consist of a Rydberg atom and a distant ground-state atom bound together by short-range electron-neutral attraction. We show that highly polar, polyatomic trilobite states unexpectedly persist and thrive in a dense ultracold gas of randomly positioned atoms. This is caused by perturbation-induced quantum scarring and the localization of electron density on randomly occurring atom clusters. At certain densities these states also mix with an s state, overcoming selection rules that hinder the photoassociation of ordinary trilobites.

  9. Ellipsometry measurements of glass transition breadth in bulk films of random, block, and gradient copolymers.

    Science.gov (United States)

    Mok, M M; Kim, J; Marrou, S R; Torkelson, J M

    2010-03-01

    Bulk films of random, block and gradient copolymer systems were studied using ellipsometry to demonstrate the applicability of the numerical differentiation technique pioneered by Kawana and Jones for studying the glass transition temperature (Tg) behavior and thermal expansivities of copolymers possessing different architectures and different levels of nanoheterogeneity. In a series of styrene/n-butyl methacrylate (S/nBMA) random copolymers, Tg breadths were observed to increase from approximately 17 °C in styrene-rich cases to almost 30 °C in nBMA-rich cases, reflecting previous observations of significant nanoheterogeneity in PnBMA homopolymers. The derivative technique also revealed for the first time a substantial increase in glassy-state expansivity with increasing nBMA content in S/nBMA random copolymers, from 1.4 × 10^-4 K^-1 in PS to 3.5 × 10^-4 K^-1 in PnBMA. The first characterization of block copolymer Tg's and Tg breadths by ellipsometry is given, examining the impact of nanophase-segregated copolymer structure on ellipsometric measurements of the glass transition. The results show that, while the technique is effective in detecting the two Tg's expected in certain block copolymer systems, the details of the glass transition can become suppressed in ellipsometry measurements of a rubbery minor phase under conditions where the matrix is glassy; meanwhile, both transitions are easily discernible by differential scanning calorimetry. Finally, broad glass transition regions were measured in gradient copolymers, yielding in some cases extraordinary Tg breadths of 69-71 °C, factors of 4-5 larger than the Tg breadths of related homopolymers and random copolymers. Surprisingly, one gradient copolymer demonstrated a slightly narrower Tg breadth than the S/nBMA random copolymers with the highest nBMA content. This highlights the fact that nanoheterogeneity relevant to the glass transition response in selected

  10. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955-2013.

    Directory of Open Access Journals (Sweden)

    Humam Saltaji

    Full Text Available To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955-2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent

  11. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for the regimes (i) or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
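
    The push model itself takes only a few lines to simulate. The following sketch, with illustrative parameters (n = 300 and a radius chosen above the usual connectivity threshold), builds a random geometric graph in the unit square and counts the rounds the push protocol needs; it is a toy reconstruction from the abstract, not the authors' experimental setup.

```python
import math
import random

def random_geometric_graph(n, r, seed=0):
    """Nodes uniform in the unit square; edges join pairs within distance r."""
    rnd = random.Random(seed)
    pts = [(rnd.random(), rnd.random()) for _ in range(n)]
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= r:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def push_broadcast_time(adj, source=0, seed=1, max_rounds=10_000):
    """Rounds until all nodes are informed under the push model
    (caps at max_rounds if the graph happens to be disconnected)."""
    rnd = random.Random(seed)
    informed = {source}
    rounds = 0
    while len(informed) < len(adj) and rounds < max_rounds:
        newly = set()
        for u in informed:
            if adj[u]:
                newly.add(rnd.choice(adj[u]))  # push to one random neighbor
        informed |= newly
        rounds += 1
    return rounds

n = 300
r = 1.5 * math.sqrt(math.log(n) / n)  # above the connectivity threshold, w.h.p.
adj = random_geometric_graph(n, r)
print("broadcast rounds:", push_broadcast_time(adj))
```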

  12. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    Science.gov (United States)

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
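
    The iterative idea, repeatedly adding the candidate site most dissimilar from those already chosen, can be caricatured without a full MaxEnt model. The sketch below substitutes a plain greedy max-min distance rule in standardized environmental space for the authors' MaxEnt pipeline; the environmental table is synthetic and all variable ranges are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy environmental table: one row per candidate 20 km x 20 km site;
# columns mirror the four factors in the study (values are synthetic).
n_sites = 1000
env = np.column_stack([
    rng.normal(10, 5, n_sites),       # mean annual temperature (deg C)
    rng.gamma(2.0, 200.0, n_sites),   # mean annual precipitation (mm)
    rng.uniform(300, 2500, n_sites),  # elevation (m)
    rng.integers(0, 6, n_sites),      # vegetation type (coded)
])
z = (env - env.mean(axis=0)) / env.std(axis=0)  # standardize each factor

def greedy_dissimilar_sites(z, k):
    """Iteratively pick the site farthest (in env space) from those chosen."""
    chosen = [int(np.argmax(np.linalg.norm(z, axis=1)))]  # most extreme site
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(z[:, None, :] - z[chosen], axis=2), axis=1)
        chosen.append(int(np.argmax(d)))
    return chosen

print("selected sites:", greedy_dissimilar_sites(z, 8))
```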

  13. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  14. Random ancestor trees

    International Nuclear Information System (INIS)

    Ben-Naim, E; Krapivsky, P L

    2010-01-01

    We investigate a network growth model in which the genealogy controls the evolution. In this model, a new node selects a random target node and links either to this target node, or to its parent, or to its grandparent, etc.; all nodes from the target node to its most ancient ancestor are equiprobable destinations. The emerging random ancestor tree is very shallow: the fraction g_n of nodes at distance n from the root decreases super-exponentially with n, g_n = e^{-1}/(n − 1)!. We find that a macroscopic hub at the root coexists with highly connected nodes at higher generations. The maximal degree of a node at the nth generation grows algebraically as N^{1/β_n}, where N is the system size. We obtain the series of nontrivial exponents which are roots of transcendental equations: β_1 ≈ 1.351746, β_2 ≈ 1.682201, etc. As a consequence, the fraction p_k of nodes with degree k has an algebraic tail, p_k ∼ k^{−γ}, with γ = β_1 + 1 = 2.351746.
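
    The growth rule is simple enough to simulate directly. The following sketch (a reconstruction from the abstract, not the authors' code) grows a random ancestor tree and compares the empirical generation fractions g_n against e^{-1}/(n − 1)!.

```python
import math
import random
from collections import Counter

def grow_ancestor_tree(n_nodes, seed=0):
    """A new node picks a uniform random target, then attaches to a node chosen
    uniformly from the path target -> root (all ancestors equiprobable)."""
    rnd = random.Random(seed)
    parent = {0: None}  # node 0 is the root
    for new in range(1, n_nodes):
        path = [rnd.randrange(new)]          # random target among existing nodes
        while parent[path[-1]] is not None:  # climb from the target to the root
            path.append(parent[path[-1]])
        parent[new] = rnd.choice(path)
    return parent

N = 100_000
parent = grow_ancestor_tree(N)
depth = {0: 0}
for v in range(1, N):  # parents always carry smaller indices, so this is safe
    depth[v] = depth[parent[v]] + 1

gen = Counter(depth.values())
for n in sorted(gen):
    if n >= 1:
        predicted = math.exp(-1) / math.factorial(n - 1)
        print(f"g_{n}: simulated {gen[n] / N:.5f}, predicted {predicted:.5f}")
```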

  15. Towards a pro-health food-selection model for gatekeepers in ...

    African Journals Online (AJOL)

    The purpose of this study was to develop a pro-health food selection model for gatekeepers of Bulawayo high-density suburbs in Zimbabwe. Gatekeepers in five suburbs constituted the study population from which a sample of 250 subjects was randomly selected. Of the total respondents (N= 182), 167 had their own ...

  16. Random practice - one of the factors of the motor learning process

    Directory of Open Access Journals (Sweden)

    Petr Valach

    2012-01-01

    Full Text Available BACKGROUND: An important concept in acquiring motor skills is random practice (contextual interference, CI). The explanation of the effect of contextual interference is that the memory has to work more intensively, and therefore random practice provides a higher level of motor skill retention than blocked practice. Only active remembering of a motor skill gives it practical value for appropriate use in the future. OBJECTIVE: The aim of this research was to determine the difference in how motor skills in sport gymnastics are acquired and retained using two different teaching methods, blocked and random practice. METHODS: Blocked and random practice on three selected gymnastics tasks were applied in two groups of physical education students (blocked practice: group BP; random practice: group RP) for two months, in one session a week (80 trials in total). At the end of the experiment and 6 months after (retention tests), the groups were tested on the selected gymnastics skills. RESULTS: No significant differences in the level of gymnastics skills were found between the BP group and the RP group at the end of the experiment. However, the retention tests showed a significantly higher level of gymnastics skills in the RP group in comparison with the BP group. CONCLUSION: The results confirmed that retention of gymnastics skills using the teaching method of random practice was significantly higher than with blocked practice.

  17. The generation of 68 Gbps quantum random number by measuring laser phase fluctuations

    International Nuclear Information System (INIS)

    Nie, You-Qi; Liu, Yang; Zhang, Jun; Pan, Jian-Wei; Huang, Leilei; Payne, Frank

    2015-01-01

    The speed of a quantum random number generator is essential for practical applications, such as high-speed quantum key distribution systems. Here, we push the speed of a quantum random number generator to 68 Gbps by operating a laser around its threshold level. To achieve this rate, not only are a high-speed photodetector and a high sampling rate needed, but a very stable interferometer is also required. A practical interferometer with active feedback, instead of common temperature control, is developed to meet the requirement of stability. Phase fluctuations of the laser are measured by the interferometer with a photodetector and then digitized to raw random numbers at a rate of 80 Gbps. The min-entropy of the raw data is evaluated by modeling the system and is used to quantify the quantum randomness of the raw data. The bias of the raw data caused by other signals, such as classical and detection noises, can be removed by Toeplitz-matrix hashing randomness extraction. The final random numbers can pass the standard randomness tests. Our demonstration shows that high-speed quantum random number generators are ready for practical usage
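
    Of the pipeline above, the Toeplitz-hashing step is the most self-contained. The sketch below shows that step only, multiplying raw bits by a random Toeplitz matrix over GF(2); the min-entropy rate of 0.8 and the security margin are assumed values for illustration, not figures from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def toeplitz_extract(raw_bits, m, seed_bits):
    """Multiply raw bits by a random m x n Toeplitz matrix over GF(2).
    Needs n + m - 1 seed bits; the output length m should follow from the
    estimated min-entropy of the raw data (leftover hash lemma)."""
    n = raw_bits.size
    assert seed_bits.size == n + m - 1
    # T[i, j] = seed_bits[i - j + n - 1] defines the Toeplitz structure.
    i = np.arange(m)[:, None]
    j = np.arange(n)[None, :]
    T = seed_bits[i - j + n - 1]
    return (T @ raw_bits) % 2

raw = rng.integers(0, 2, size=1024)           # stand-in for digitized phase noise
min_entropy_per_bit = 0.8                     # assumed, from system modeling
m = int(min_entropy_per_bit * raw.size) - 64  # illustrative security margin
seed = rng.integers(0, 2, size=raw.size + m - 1)
out = toeplitz_extract(raw, m, seed)
print(out[:16])
```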

  18. Random fields, topology, and the Imry-Ma argument.

    Science.gov (United States)

    Proctor, Thomas C; Garanin, Dmitry A; Chudnovsky, Eugene M

    2014-03-07

    We consider an n-component fixed-length order parameter interacting with a weak random field in d=1, 2, 3 dimensions. Relaxation from the initially ordered state and spin-spin correlation functions are studied on lattices containing hundreds of millions of sites. At n ≤ d the presence of topological defects leads to strong metastability and glassy behavior, with the final state depending on the initial condition. At n=d+1, when topological structures are nonsingular, the system possesses a weak metastability. At n>d+1, when topological objects are absent, the final, lowest-energy state is independent of the initial condition. It is characterized by the exponential decay of correlations that agrees quantitatively with the theory based upon the Imry-Ma argument.

  19. Phage display peptide libraries: deviations from randomness and correctives

    Science.gov (United States)

    Ryvkin, Arie; Ashkenazy, Haim; Weiss-Ottolenghi, Yael; Piller, Chen; Pupko, Tal; Gershoni, Jonathan M

    2018-01-01

    Abstract Peptide-expressing phage display libraries are widely used for the interrogation of antibodies. Affinity selected peptides are then analyzed to discover epitope mimetics, or are subjected to computational algorithms for epitope prediction. A critical assumption for these applications is the random representation of amino acids in the initial naïve peptide library. In a previous study, we implemented next generation sequencing to evaluate a naïve library and discovered severe deviations from randomness in UAG codon over-representation as well as in high G phosphoramidite abundance causing amino acid distribution biases. In this study, we demonstrate that the UAG over-representation can be attributed to the burden imposed on the phage upon the assembly of the recombinant Protein 8 subunits. This was corrected by constructing the libraries using supE44-containing bacteria which suppress the UAG driven abortive termination. We also demonstrate that the overabundance of G stems from variable synthesis efficiency and can be corrected using compensating oligonucleotide mixtures calibrated by mass spectroscopy. Construction of libraries implementing these correctives results in markedly improved libraries that display random distribution of amino acids, thus ensuring that enriched peptides obtained in biopanning represent a genuine selection event, a fundamental assumption for phage display applications. PMID:29420788

  20. Topics in random walks in random environment

    International Nuclear Information System (INIS)

    Sznitman, A.-S.

    2004-01-01

    Over the last twenty-five years random motions in random media have been intensively investigated and some new general methods and paradigms have by now emerged. Random walks in random environment constitute one of the canonical models of the field. However in dimension bigger than one they are still poorly understood and many of the basic issues remain to this day unresolved. The present series of lectures attempt to give an account of the progresses which have been made over the last few years, especially in the study of multi-dimensional random walks in random environment with ballistic behavior. (author)

  1. A randomized controlled trial of an electronic informed consent process.

    Science.gov (United States)

    Rothwell, Erin; Wong, Bob; Rose, Nancy C; Anderson, Rebecca; Fedor, Beth; Stark, Louisa A; Botkin, Jeffrey R

    2014-12-01

    A pilot study assessed an electronic informed consent model within a randomized controlled trial (RCT). Participants who were recruited for the parent RCT project were randomly selected and randomized to either an electronic consent group (n = 32) or a simplified paper-based consent group (n = 30). Results from the electronic consent group reported significantly higher understanding of the purpose of the study, alternatives to participation, and who to contact if they had questions or concerns about the study. However, participants in the paper-based control group reported higher mean scores on some survey items. This research suggests that an electronic informed consent presentation may improve participant understanding for some aspects of a research study. © The Author(s) 2014.

  2. Double-blind randomized placebo-controlled study of bixa orellana in patients with lower urinary tract symptoms associated to benign prostatic hyperplasia

    Directory of Open Access Journals (Sweden)

    Luis Zegarra

    2007-08-01

    Full Text Available OBJECTIVE: To determine the efficacy of Bixa orellana (BO) in patients with benign prostatic hyperplasia (BPH) presenting moderate lower urinary tract symptoms (LUTS). MATERIALS AND METHODS: This is a prospective double-blind randomized placebo-controlled study. One thousand four hundred and seventy-eight patients presenting moderate LUTS associated with BPH were interviewed, from whom we selected 136 who fulfilled the inclusion and exclusion criteria. Allocation was performed at random in blocks of four to receive BO at a dose of 250 mg three times a day or placebo (Pbo) for 12 months; 68 patients were assigned to each group. From the patients in the study we obtained data on demographic, epidemiologic, symptom score, uroflowmetry and post-void residual urine variables. RESULTS: At baseline, both groups were clinically, demographically and biochemically comparable. Throughout the study, the variations of symptom score, the mean symptom-score delta at each visit and the final average delta were similar for both groups (BO −0.79 ± 1.87 and Pbo −1.07 ± 1.49; p = 0.33). Similarly, the variations of mean Qmax, the average Qmax delta and the final average delta were similar (BO 0.44 ± 1.07 and Pbo 0.47 ± 1.32; p = 0.88). The variations of mean post-void residual urine, the average post-void residual urine delta at each visit and the final average delta were similar for both groups (BO 4.24 ± 11.69 and Pbo 9.01 ± 18.66; p = 0.07). No differences were found in clinically significant improvement assessed with relative risk and risk differences, and the proportion of adverse effects was similar for both groups. CONCLUSION: Patients with BPH presenting moderate LUTS did not show any benefit from receiving BO when compared to placebo.

  3. Balancing treatment allocations by clinician or center in randomized trials allows unacceptable levels of treatment prediction.

    Science.gov (United States)

    Hills, Robert K; Gray, Richard; Wheatley, Keith

    2009-08-01

    Randomized controlled trials are the standard method for comparing treatments because they avoid the selection bias that might arise if clinicians were free to choose which treatment a patient would receive. In practice, allocation of treatments in randomized controlled trials is often not wholly random, with various 'pseudo-randomization' methods, such as minimization or balanced blocks, used to ensure good balance between treatments within potentially important prognostic or predictive subgroups. These methods avoid selection bias so long as full concealment of the next treatment allocation is maintained. There is concern, however, that pseudo-random methods may allow clinicians to predict future treatment allocations from previous allocation history, particularly if allocations are balanced by clinician or center. We investigate here to what extent treatment prediction is possible. Using computer simulations of minimization and balanced block randomizations, the success rates of various prediction strategies were investigated for varying numbers of stratification variables, including the patient's clinician. Prediction rates for minimization and balanced block randomization typically exceed 60% when clinician is included as a stratification variable and, under certain circumstances, can exceed 80%. Increasing the number of clinicians and other stratification variables did not greatly reduce the prediction rates. Without clinician as a stratification variable, prediction rates are poor unless few clinicians participate. Prediction rates are unacceptably high when allocations are balanced by clinician or by center. This could easily lead to selection bias that might suggest spurious, or mask real, treatment effects. Unless treatment is blinded, randomization should not be balanced by clinician (or by center), and clinician-center effects should be allowed for instead by retrospectively stratified analyses. © 2009 Blackwell Publishing Asia Pty Ltd and Chinese
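
    The core phenomenon is easy to reproduce in a few lines. The toy simulation below (an illustration, not the authors' simulation code) runs deterministic minimization balanced by clinician only and lets an observer guess each allocation from the running per-clinician counts; with ties broken at random, the guesser is right roughly three times out of four.

```python
import random

def simulate_trial(n_patients, n_clinicians, guesser, seed=0):
    """Deterministic minimization balanced by clinician, with an observer
    who predicts each allocation from the allocation history."""
    rnd = random.Random(seed)
    counts = [[0, 0] for _ in range(n_clinicians)]  # per-clinician arm counts
    correct = 0
    for _ in range(n_patients):
        c = rnd.randrange(n_clinicians)  # next patient's clinician
        guess = guesser(counts[c], rnd)
        a, b = counts[c]
        if a < b:
            arm = 0          # minimization assigns the lagging arm
        elif b < a:
            arm = 1
        else:
            arm = rnd.randrange(2)  # tie broken at random
        correct += (guess == arm)
        counts[c][arm] += 1
    return correct / n_patients

def guess_lagging_arm(clinician_counts, rnd):
    a, b = clinician_counts
    return 0 if a < b else 1 if b < a else rnd.randrange(2)

print(f"prediction rate: {simulate_trial(10_000, 20, guess_lagging_arm):.2%}")
```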

  4. From Bloch to random lasing in ZnO self-assembled nanostructures

    DEFF Research Database (Denmark)

    Garcia-Fernandez, Pedro David; Cefe, López

    2013-01-01

    In this paper, we present measurements on UV lasing in ZnO ordered and disordered nanostructures. Bloch lasing is achieved in the ordered structures by exploiting very low group-velocity Bloch modes in ZnO photonic crystals. In the second case, random lasing is observed in ZnO photonic glasses. We study the lasing threshold in both cases and its dependence on the structural parameters. Finally, we present the transition from Bloch to random lasing by deliberately doping a ZnO inverse photonic crystal with a controlled amount of lattice vacancies, effectively converting it into a translationally...

  5. 77 FR 35953 - Arts in Education National Program; Final Priority, Requirements, Definitions, and Selection...

    Science.gov (United States)

    2012-06-15

    DEPARTMENT OF EDUCATION [CFDA Number 84.351F] Arts in Education National Program; Final Priority... AGENCY: Department of Education. ACTION: Notice. SUMMARY: The Assistant Deputy Secretary for Innovation and Improvement announces... One commenter noted standards-based teaching that is unique to music education, and added that it would be beneficial...

  6. Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation

    Directory of Open Access Journals (Sweden)

    Gabriel Recchia

    2015-01-01

    Full Text Available Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, "noisy" permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics.
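
    Both binding operators take only a few lines of numpy. The sketch below, an illustration rather than the authors' implementation, binds random vectors by circular convolution (via FFT) and by random permutation, then checks recovery: correlation-based unbinding is approximate, while a permutation inverts exactly.

```python
import numpy as np

rng = np.random.default_rng(4)
d = 1024  # vector dimensionality (illustrative)

def conv_bind(a, b):
    """Circular convolution binding (HRR)."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def conv_unbind(c, a):
    """Approximate inverse: circular correlation with a."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

a, b = rng.normal(0, 1 / np.sqrt(d), (2, d))
c = conv_bind(a, b)
b_hat = conv_unbind(c, a)  # noisy reconstruction of b
cos = np.dot(b_hat, b) / (np.linalg.norm(b_hat) * np.linalg.norm(b))
print(f"convolution recovery cosine: {cos:.3f}")

perm = rng.permutation(d)      # random-permutation binding: reorder elements
p = b[perm]
b_back = np.empty(d)
b_back[perm] = p               # invert the permutation exactly
print("permutation recovery exact:", np.allclose(b_back, b))
```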

  7. Immigration And Self-Selection

    OpenAIRE

    George J. Borjas

    1988-01-01

    Self-selection plays a dominant role in determining the size and composition of immigrant flows. The United States competes with other potential host countries in the "immigration market". Host countries vary in their "offers" of economic opportunities and also differ in the way they ration entry through their immigration policies. Potential immigrants compare the various opportunities and are non-randomly sorted by the immigration market among the various host countries. This paper presents ...

  8. Spin, statistics, and geometry of random walks

    International Nuclear Information System (INIS)

    Jaroszewicz, T.; Kurzepa, P.S.

    1991-01-01

    The authors develop and unify two complementary descriptions of propagation of spinning particles: the directed random walk representation and the spin factor approach. Working in an arbitrary number of dimensions D, they first represent the Dirac propagator in terms of a directed random walk. They then derive the general and explicit form of the gauge connection describing parallel transport of spin and investigate the resulting quantum-mechanical problem of a particle moving on a sphere in the field of a nonabelian SO(D-1) monopole. This construction, generalizing Polyakov's results, enables them to prove the equivalence of the random walk and path-integral (spin factor) representation. As an alternative, they construct and discuss various Wess-Zumino-Witten forms of the spin factor. They clarify the role played by the coupling between the particle's spin and translational degrees of freedom in establishing the geometrical properties of particle's paths in spacetime. To this end, they carefully define and evaluate Hausdorff dimensions of bosonic and fermionic sample paths, in the covariant as well as nonrelativistic formulations. Finally, as an application of the developed formalism, they give an intuitive spacetime interpretation of chiral anomalies in terms of the geometry of fermion trajectories

  9. Increasing Prediction the Original Final Year Project of Student Using Genetic Algorithm

    Science.gov (United States)

    Saragih, Rijois Iboy Erwin; Turnip, Mardi; Sitanggang, Delima; Aritonang, Mendarissan; Harianja, Eva

    2018-04-01

    The final year project is very important for a student's graduation. Unfortunately, many students do not take their final projects seriously, and many ask someone else to do the work for them. In this paper, an application of genetic algorithms to predict whether a student's final year project is original is proposed. In the simulation, final project data from the last 5 years were collected. The genetic algorithm has several operators, namely population, selection, crossover, and mutation. The results suggest that the genetic algorithm gives better predictions than other comparable models. Experimental prediction results showed 70% accuracy, more accurate than previous research.
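
    The abstract names the standard operator set, so a generic sketch suffices to show the loop. The toy GA below evolves a bit string toward an arbitrary target; the fitness function and all parameters are stand-ins, since the paper's actual features and fitness are not specified in the abstract.

```python
import random

rnd = random.Random(5)

# Toy GA: evolve a bit string toward an arbitrary target profile.
# Target and fitness are stand-ins for the paper's (unspecified) project data.
TARGET = [rnd.randrange(2) for _ in range(30)]

def fitness(ind):
    return sum(g == t for g, t in zip(ind, TARGET))

def select(pop):  # tournament selection of size 3
    return max(rnd.sample(pop, 3), key=fitness)

def crossover(p1, p2):  # one-point crossover
    cut = rnd.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.02):  # per-gene bit flips
    return [1 - g if rnd.random() < rate else g for g in ind]

pop = [[rnd.randrange(2) for _ in TARGET] for _ in range(50)]
for _ in range(100):  # generations
    pop = [mutate(crossover(select(pop), select(pop))) for _ in pop]
best = max(pop, key=fitness)
print("best fitness:", fitness(best), "of", len(TARGET))
```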

  10. Robust portfolio selection based on asymmetric measures of variability of stock returns

    Science.gov (United States)

    Chen, Wei; Tan, Shaohua

    2009-10-01

    This paper addresses a new uncertainty set, the interval random uncertainty set, for robust optimization. The form of the interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply our interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.

  11. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...

  12. Optimal Site Selection of Electric Vehicle Charging Stations Based on a Cloud Model and the PROMETHEE Method

    Directory of Open Access Journals (Sweden)

    Yunna Wu

    2016-03-01

    Full Text Available The task of site selection for electric vehicle charging stations (EVCS) is hugely important from the perspective of harmonious and sustainable development. However, flaws and inadequacies in the currently used multi-criteria decision making methods could result in inaccurate and irrational decision results. First of all, the uncertainty of the information cannot be described integrally in the evaluation of EVCS site selection. Secondly, rigorous consideration of the mutual influence between the various criteria is lacking, which is mainly evidenced in two aspects: one is that correlations are ignored, and the other is that they are measured unreasonably. Last but not least, the ranking method adopted in previous studies is not very appropriate for evaluating the EVCS site selection problem. As a result of the above analysis, a Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE)-based decision system combined with the cloud model is proposed in this paper for EVCS site selection. Firstly, the use of the PROMETHEE method can bolster the confidence and visibility for decision makers. Secondly, the cloud model is recommended to describe the fuzziness and randomness of linguistic terms integrally and accurately. Finally, the Analytical Network Process (ANP) method is adopted to measure the correlation of the indicators, with a greatly simplified calculation of the parameters and the steps required.

  13. New Interval-Valued Intuitionistic Fuzzy Behavioral MADM Method and Its Application in the Selection of Photovoltaic Cells

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2016-10-01

    Full Text Available As one of the emerging renewable resources, the use of photovoltaic cells holds promise for offering clean and plentiful energy. The selection of the best photovoltaic cell for a promoter plays a significant role in maximizing income, minimizing costs and conferring high maturity and reliability, which is a typical multiple attribute decision making (MADM) problem. Although many prominent MADM techniques have been developed, most of them select the optimal alternative under the hypothesis that the decision maker or expert is completely rational and the decision data are represented by crisp values. However, in the selection processes of photovoltaic cells the decision maker is usually boundedly rational and the ratings of alternatives are usually imprecise and vague. To address these kinds of complex and common issues, in this paper we develop a new interval-valued intuitionistic fuzzy behavioral MADM method. We employ interval-valued intuitionistic fuzzy numbers (IVIFNs) to express the imprecise ratings of alternatives; and we construct LINMAP-based nonlinear programming models to identify the reference points under IVIFN contexts, which avoids the subjective randomness of selecting the reference points. Finally we develop a prospect theory-based ranking method to identify the optimal alternative, which takes fully into account the decision maker's behavioral characteristics such as reference dependence, diminishing sensitivity and loss aversion in the decision making process.

  14. Cathodic Protection Field Trials on Prestressed Concrete Components, Final Report

    Science.gov (United States)

    1998-01-01

    This is the final report in a study to demonstrate the feasibility of using cathodic protection (CP) on concrete bridge structures containing prestressed steel. The interim report, FHWA-RD-95-032, has more details on the installation of selected CP s...

  15. Understanding perspectives on sex-selection in India: an intersectional study

    OpenAIRE

    Sonya Davey, BA; Manisha Sharma, PhD MFA

    2014-01-01

    Background: Sex-selective abortion results in fewer girls than boys in India (914 girls:1000 boys). To understand perspectives about who is responsible for sex-selective abortion, our aim was to focus on narratives of vastly diverse stakeholders in Indian society. Methods: The qualitative study was undertaken in urban sectors of six northwestern Indian states. Ethnographic unstructured, conversation-style interviews with randomly selected participants were held for an unbiased study. To ca...

  16. Randomized Comparison of Selective Internal Radiotherapy (SIRT) Versus Drug-Eluting Bead Transarterial Chemoembolization (DEB-TACE) for the Treatment of Hepatocellular Carcinoma

    International Nuclear Information System (INIS)

    Pitton, Michael B.; Kloeckner, Roman; Ruckes, Christian; Wirth, Gesine M.; Eichhorn, Waltraud; Wörns, Marcus A.; Weinmann, Arndt; Schreckenberger, Mathias; Galle, Peter R.; Otto, Gerd; Dueber, Christoph

    2015-01-01

    Purpose: To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods: From 04/2010 to 07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, while TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was 05/2013. Results: Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25% in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions: No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and results can be used for sample size calculations of future studies.

  19. An explicit semantic relatedness measure based on random walk

    Directory of Open Access Journals (Sweden)

    HU Sihui

    2016-10-01

    Full Text Available The semantic relatedness calculation of an open-domain knowledge network is a significant issue. In this paper, a pheromone strategy is drawn from the thought of the ant colony algorithm and is integrated into the random walk, which is taken as the basic framework for calculating the semantic relatedness degree. The pheromone distribution is taken as a criterion for determining the tightness of semantic relatedness. A method of calculating the semantic relatedness degree based on random walk is proposed, and the exploration process of calculating the semantic relatedness degree is presented in an explicit way. The method mainly contains the Path Select Model (PSM) and the Semantic Relatedness Computing Model (SRCM). PSM is used to simulate the path selection of ants and pheromone release. SRCM is used to calculate the semantic relatedness by utilizing the information returned by ants. The result indicates that the method can complete the semantic relatedness calculation in linear complexity and extends the feasible strategies of semantic relatedness calculation.
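
    A heavily simplified reading of PSM/SRCM can be sketched as pheromone-biased random walks. In the toy below, relatedness is estimated as the fraction of ant walks that reach the target node, with successful paths reinforced; the graph, parameters, and reinforcement rule are all illustrative assumptions, not the paper's models.

```python
import random
from collections import defaultdict

def pheromone_relatedness(graph, source, target, n_ants=2000, max_steps=20, seed=6):
    """Fraction of ant walks from source that reach target within max_steps.
    Edges on successful walks receive extra pheromone, biasing later ants
    (a simplified ant-colony flavor, not the paper's exact PSM/SRCM)."""
    rnd = random.Random(seed)
    pheromone = defaultdict(lambda: 1.0)  # pheromone level per directed edge
    hits = 0
    for _ in range(n_ants):
        node, path = source, []
        for _ in range(max_steps):
            nbrs = graph[node]
            nxt = rnd.choices(nbrs, weights=[pheromone[(node, v)] for v in nbrs])[0]
            path.append((node, nxt))
            node = nxt
            if node == target:
                hits += 1
                for edge in path:       # reinforce the successful path
                    pheromone[edge] += 0.5
                break
    return hits / n_ants

# A tiny toy knowledge network (adjacency lists); labels are illustrative.
g = {"cat": ["pet", "animal"], "pet": ["cat", "dog"], "dog": ["pet", "animal"],
     "animal": ["cat", "dog", "plant"], "plant": ["animal"]}
print("relatedness(cat, dog) ~", pheromone_relatedness(g, "cat", "dog"))
```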

  20. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  1. Prediction of plant promoters based on hexamers and random triplet pair analysis

    Directory of Open Access Journals (Sweden)

    Noman Nasimul

    2011-06-01

    Full Text Available Abstract Background: With an increasing number of plant genome sequences, it has become important to develop a robust computational method for detecting plant promoters. Although a wide variety of programs are currently available, their prediction accuracy still requires further improvement. The limitations of these methods can be addressed by selecting appropriate features for distinguishing promoters and non-promoters. Methods: In this study, we proposed two feature selection approaches based on hexamer sequences: the Frequency Distribution Analyzed Feature Selection Algorithm (FDAFSA) and the Random Triplet Pair Feature Selecting Genetic Algorithm (RTPFSGA). In FDAFSA, adjacent triplet-pairs (hexamer sequences) were selected based on the difference in the frequency of hexamers between promoters and non-promoters. In RTPFSGA, random triplet-pairs (RTPs) were selected by exploiting a genetic algorithm that distinguishes frequencies of non-adjacent triplet pairs between promoters and non-promoters. Then, a support vector machine (SVM), a nonlinear machine-learning algorithm, was used to classify promoters and non-promoters by combining these two feature selection approaches. We referred to this novel algorithm as PromoBot. Results: Promoter sequences were collected from the PlantProm database. Non-promoter sequences were collected from plant mRNA, rRNA, and tRNA of PlantGDB and plant miRNA of miRBase. Then, in order to validate the proposed algorithm, we applied a 5-fold cross validation test. Training data sets were used to select features based on FDAFSA and RTPFSGA, and these features were used to train the SVM. We achieved 89% sensitivity and 86% specificity. Conclusions: We compared our PromoBot algorithm to five other algorithms. It was found that the sensitivity and specificity of PromoBot performed as well as (or even better than) the algorithms tested. These results show that the two proposed feature selection methods based on hexamer frequencies
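
    The hexamer (adjacent triplet-pair) features underlying both selection schemes are straightforward to extract. The sketch below computes a 4096-dimensional hexamer frequency vector for a toy sequence; in the paper's setting such vectors, after FDAFSA/RTPFSGA feature selection, would be fed to an SVM. The sequence here is synthetic, not real promoter data.

```python
from itertools import product

BASES = "ACGT"
HEXAMERS = ["".join(p) for p in product(BASES, repeat=6)]  # all 4096 hexamers
INDEX = {h: i for i, h in enumerate(HEXAMERS)}

def hexamer_frequencies(seq):
    """Overlapping hexamer counts normalized to frequencies (4096-dim vector)."""
    counts = [0] * len(HEXAMERS)
    total = 0
    for i in range(len(seq) - 5):
        h = seq[i:i + 6]
        if h in INDEX:  # skip windows containing ambiguous bases
            counts[INDEX[h]] += 1
            total += 1
    return [c / total for c in counts] if total else counts

# Toy sequence, not real promoter data.
vec = hexamer_frequencies("TATAAAGGCTATAAAAGGCTGACGTCA" * 4)
print("non-zero hexamer features:", sum(v > 0 for v in vec))
```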

  2. Functional Principal Component Analysis and Randomized Sparse Clustering Algorithm for Medical Image Analysis

    Science.gov (United States)

    Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao

    2015-01-01

    Due to the advancement in sensor technology, growing large medical image data have the ability to visualize the anatomical changes in biological tissues. As a consequence, medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes and the characterization of disease progression. But in the meantime, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features which provide no additional information for clustering analysis. The widely used methods for removing the irrelevant features are sparse clustering algorithms using a lasso-type penalty to select the features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value. In practice, they are difficult to determine. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both the liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms the current sparse clustering algorithms in image cluster analysis. PMID:26196383

  3. School-located Influenza Vaccinations for Adolescents: A Randomized Controlled Trial.

    Science.gov (United States)

    Szilagyi, Peter G; Schaffer, Stanley; Rand, Cynthia M; Goldstein, Nicolas P N; Vincelli, Phyllis; Hightower, A Dirk; Younge, Mary; Eagan, Ashley; Blumkin, Aaron; Albertin, Christina S; DiBitetto, Kristine; Yoo, Byung-Kwang; Humiston, Sharon G

    2018-02-01

    We aimed to evaluate the effect of school-located influenza vaccination (SLIV) on adolescents' influenza vaccination rates. In 2015-2016, we performed a cluster-randomized trial of adolescent SLIV in middle/high schools. We selected 10 pairs of schools (identical grades within pairs) and randomly allocated schools within pairs to SLIV or usual care control. At eight suburban SLIV schools, we sent parents e-mail notifications about upcoming SLIV clinics and promoted online immunization consent. At two urban SLIV schools, we sent parents (via student backpack fliers) paper immunization consent forms and information about SLIV. E-mails were unavailable at these schools. Local health department nurses administered nasal or injectable influenza vaccine at dedicated SLIV clinics and billed insurers. We compared influenza vaccination rates at SLIV versus control schools using school directories to identify the student sample in each school. We used the state immunization registry to determine receipt of influenza vaccination. The final sample comprised 17,650 students enrolled in the 20 schools. Adolescents at suburban SLIV schools had higher overall influenza vaccination rates than did adolescents at control schools (51% vs. 46%, p < .001; adjusted odds ratio = 1.27, 95% confidence interval 1.18-1.38, controlling for vaccination during the prior two seasons). No effect of SLIV was noted among urban schools on multivariate analysis. SLIV did not substitute for vaccinations in primary care or other settings; in suburban settings, SLIV was associated with increased vaccinations in primary care or other settings (adjusted odds ratio = 1.10, 95% confidence interval 1.02-1.19). SLIV in this community increased influenza vaccination rates among adolescents attending suburban schools. Copyright © 2018. Published by Elsevier Inc.

  4. Brain potentials index executive functions during random number generation.

    Science.gov (United States)

    Joppich, Gregor; Däuper, Jan; Dengler, Reinhard; Johannes, Sönke; Rodriguez-Fornells, Antoni; Münte, Thomas F

    2004-06-01

    The generation of random sequences is considered to tax different executive functions. To explore the involvement of these functions further, brain potentials were recorded in 16 healthy young adults while either engaging in random number generation (RNG) by pressing the number keys on a computer keyboard in a random sequence or in ordered number generation (ONG) necessitating key presses in the canonical order. Key presses were paced by an external auditory stimulus to yield either fast (1 press/800 ms) or slow (1 press/1300 ms) sequences in separate runs. Attentional demands of the random and ordered tasks were assessed by the introduction of a secondary task (key-press to a target tone). The P3 amplitude to the target tone of this secondary task was reduced during RNG, reflecting the greater consumption of attentional resources during RNG. Moreover, RNG led to a left frontal negativity peaking 140 ms after the onset of the pacing stimulus, whenever the subjects produced a true random response. This negativity could be attributed to the left dorsolateral prefrontal cortex and was absent when numbers were repeated. This negativity was interpreted as an index of the inhibition of habitual responses. Finally, in response-locked ERPs, a negative component peaking about 50 ms after the key-press was more prominent during RNG. Source localization suggested a medial frontal source. This effect was tentatively interpreted as a reflection of the greater monitoring demands during random sequence generation.

  5. Record statistics of a strongly correlated time series: random walks and Lévy flights

    Science.gov (United States)

    Godrèche, Claude; Majumdar, Satya N.; Schehr, Grégory

    2017-08-01

    We review recent advances on the record statistics of strongly correlated time series, whose entries denote the positions of a random walk or a Lévy flight on a line. After a brief survey of the theory of records for independent and identically distributed random variables, we focus on random walks. During the last few years, it was indeed realized that random walks are a very useful ‘laboratory’ to test the effects of correlations on the record statistics. We start with the simple one-dimensional random walk with symmetric jumps (both continuous and discrete) and discuss in detail the statistics of the number of records, as well as of the ages of the records, i.e. the lapses of time between two successive record breaking events. Then we review the results that were obtained for a wide variety of random walk models, including random walks with a linear drift, continuous time random walks, constrained random walks (like the random walk bridge) and the case of multiple independent random walkers. Finally, we discuss further observables related to records, like the record increments, as well as some questions raised by physical applications of record statistics, like the effects of measurement error and noise.
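
    (Illustrative numerical check, not from the review: a Monte Carlo estimate of the mean number of records of a symmetric random walk. For continuous symmetric jump distributions the mean record number grows as 2*sqrt(n/pi); the Gaussian jumps below are an assumed example.)

    import numpy as np

    rng = np.random.default_rng(0)

    def count_records(n_steps):
        """Number of records of a walk with Gaussian jumps; the start is the first record."""
        walk = np.concatenate(([0.0], np.cumsum(rng.standard_normal(n_steps))))
        return int(np.count_nonzero(walk == np.maximum.accumulate(walk)))

    n = 10_000
    estimate = np.mean([count_records(n) for _ in range(500)])
    print(estimate, 2.0 * np.sqrt(n / np.pi))   # both close to ~113 for large n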

  6. Random unitary operations and quantum Darwinism

    International Nuclear Information System (INIS)

    Balaneskovic, Nenad

    2016-01-01

    We study the behavior of Quantum Darwinism (Zurek, Nature Physics 5, 181-188 (2009)) within the iterative, random unitary operations qubit-model of pure decoherence (Novotný et al, New Jour. Phys. 13, 053052 (2011)). We conclude that Quantum Darwinism, which describes the quantum mechanical evolution of an open system from the point of view of its environment, is not a generic phenomenon, but depends on the specific form of initial states and on the type of system-environment interactions. Furthermore, we show that within the random unitary model the concept of Quantum Darwinism enables one to explicitly construct and specify artificial initial states of the environment that allow information about an open system of interest and its pointer-basis to be stored with maximal efficiency. We then investigate the behavior of Quantum Darwinism after introducing dissipation into the iterative random unitary qubit model with pure decoherence, in accord with V. Scarani et al (Phys. Rev. Lett. 88, 097905 (2002)), and reconstruct the corresponding dissipative attractor space. We conclude that in Zurek's qubit model Quantum Darwinism depends on the order in which pure decoherence and dissipation act upon an initial state of the entire system. We show explicitly that introducing dissipation into the random unitary evolution model in general suppresses Quantum Darwinism (regardless of the order in which decoherence and dissipation are applied) for all positive non-zero values of the dissipation strength parameter, even for those initial state configurations which, in Zurek's qubit model and in the random unitary model with pure decoherence, would lead to Quantum Darwinism. Finally, we discuss what happens with Quantum Darwinism after introducing into the iterative random unitary qubit model with pure decoherence (asymmetric) dissipation and dephasing, again in accord with V. Scarani et al (Phys. Rev. Lett. 88, 097905 (2002)), and reconstruct the corresponding attractor space.

  7. Fragmentation of random trees

    International Nuclear Information System (INIS)

    Kalay, Z; Ben-Naim, E

    2015-01-01

    We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and it is generated by sequential addition of nodes with each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N→∞. We obtain analytically the size density ϕ_s of trees of size s. The size density has a power-law tail ϕ_s ∼ s^(−α) with exponent α = 1 + (1/m). Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees. (paper)

  8. 13 scientists aced their science communication test at the FameLab final

    CERN Document Server

    Antonella Del Rosso

    2015-01-01

    On 8 May, the joint CERN and Swiss FameLab final took place in CERN’s Restaurant 1, which was transformed into a cosy setting for the special occasion. The jury selected Oskari Vinko, a Master’s student in synthetic biology at ETH Zurich, as the winner of the Swiss final while Lillian Smestad, a physicist in the Aegis collaboration, will be the first CERN finalist to go to the international final at the Cheltenham Science Festival. In addition, CMS physicist Christos Lazaridis was awarded the audience prize.   

  9. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    DEFF Research Database (Denmark)

    Kleven, Henrik Jacobsen; Knudsen, Martin B.; Kreiner, Claus Thustrup

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main findings. First, we find that the tax evasion rate is very small (0.3%) for income subject to third-party reporting, but substantial for self-reported income. Second, we find that marginal tax rates have a positive impact on tax evasion, but that this effect is small in comparison to avoidance responses. Third, we find that prior audits substantially increase self-reported income, implying that individuals update their beliefs about detection probability based on experiencing an audit. Fourth, threat-of-audit letters have a significant effect on self-reported income.

  10. Familial versus mass selection in small populations

    Directory of Open Access Journals (Sweden)

    Couvet Denis

    2003-07-01

    Abstract: We used diffusion approximations and a Markov-chain approach to investigate the consequences of familial selection on the viability of small populations both in the short and in the long term. The outcome of familial selection was compared to the case of a random mating population under mass selection. In small populations, the higher effective size associated with familial selection resulted in higher fitness for slightly deleterious and/or highly recessive alleles. Conversely, because familial selection leads to a lower rate of directional selection, a lower fitness was observed for more detrimental genes that are not highly recessive, and with high population sizes. However, in the long term, genetic load was almost identical for both mass and familial selection for populations of up to 200 individuals. In terms of mean time to extinction, familial selection did not have any negative effect at least for small populations (N ≤ 50). Overall, familial selection could be proposed for use in management programs of small populations since it increases genetic variability and short-term viability without impairing the overall persistence times.

  11. True random number generation from mobile telephone photo based on chaotic cryptography

    International Nuclear Information System (INIS)

    Zhao Liang; Liao Xiaofeng; Xiao Di; Xiang Tao; Zhou Qing; Duan Shukai

    2009-01-01

    A cheap, convenient and universal TRNG based on mobile telephone photos for producing random bit sequences is proposed. To settle the problems of sequential pixels and comparability, three chaos-based approaches are applied to post-process the generated binary image. The random numbers produced by three users are tested using the US NIST RNG statistical test software. The experimental results indicate that the Arnold cat map is the fastest way to generate a random bit sequence and can be accepted on a general PC. The 'MASK' algorithm also performs well. Finally, compared with the TRNG of Hu et al. [Hu Y, Liao X, Wong KW, Zhou Q. A true random number generator based on mouse movement and chaotic cryptography. Chaos, Solitons and Fractals 2007. doi: 10.1016/j.chaos.2007.10.022], the TRNG proposed in this paper is found to have many merits.

  12. [Intel random number generator-based true random number generator].

    Science.gov (United States)

    Huang, Feng; Shen, Hong

    2004-09-01

    This study aimed to establish a true random number generator on the basis of certain Intel chips. The random numbers were acquired by programming using Microsoft Visual C++ 6.0 via register reading from the random number generator (RNG) unit of an Intel 815 chipset-based computer with the Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 and X² R-squared tests; the results showed that the random numbers it generated satisfied the demands of independence and uniform distribution. We also statistically compared the random numbers generated by the Intel RNG-based true random number generator with those from a random number table, using the same amount of 7,500 random numbers in the same value domain, which showed that the SD, SE and CV of the Intel RNG-based generator were less than those of the random number table. A u-test comparing the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and solves some problems with random number tables in the acquisition of random numbers.
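
    (Illustrative sketch of a uniformity check in the spirit of the tests mentioned above; os.urandom stands in for the hardware RNG under test, and the 16-bin layout is an arbitrary choice, not the study's procedure.)

    import os
    from scipy.stats import chisquare

    data = os.urandom(5000)          # stand-in byte stream for the RNG under test
    counts = [0] * 16
    for byte in data:
        counts[byte >> 4] += 1       # bin each byte by its high nibble
    stat, p = chisquare(counts)      # H0: the 16 bins are equally likely
    print(f"chi2 = {stat:.1f}, p = {p:.3f}")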

  13. Mobile access to virtual randomization for investigator-initiated trials.

    Science.gov (United States)

    Deserno, Thomas M; Keszei, András P

    2017-08-01

    Background/aims Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting the R server using the Java R Interface. Mobile devices communicate via representational state transfer web services. Furthermore, a simple web-based setup allows non-statisticians to configure the appropriate statistics. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results Apps are provided for iOS and Android and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees a strictly sequential processing in all trial sites. Covering 88% of all randomization models that are used in recent trials, virtual randomization is broadly applicable to investigator-initiated trials.
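
    (Minimal sketch of one of the supported models, block randomization with random block sizes for a 1:1 two-arm trial; the system's actual R script and REST interface are not reproduced, and the function below is hypothetical.)

    import random

    def block_randomization(n_subjects, block_sizes=(4, 6), seed=42):
        """1:1 two-arm allocation in shuffled blocks of randomly chosen even size."""
        rng = random.Random(seed)
        allocations = []
        while len(allocations) < n_subjects:
            size = rng.choice(block_sizes)            # random block size hinders prediction
            block = ["A"] * (size // 2) + ["B"] * (size // 2)
            rng.shuffle(block)                        # arms stay balanced within each block
            allocations.extend(block)
        return allocations[:n_subjects]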

  14. Ray tracing method for simulation of laser beam interaction with random packings of powders

    Science.gov (United States)

    Kovalev, O. B.; Kovaleva, I. O.; Belyaev, V. V.

    2018-03-01

    Selective laser sintering is a technology for the rapid manufacturing of free-form parts, created as solid objects by selectively fusing successive layers of powder with a laser. This study is motivated by the currently insufficient understanding of the processes and phenomena of selective laser melting of powders, whose time scales differ by orders of magnitude. To construct random packings of mono- and polydispersed solid spheres, a generation algorithm based on the discrete element method is used. A numerical ray-tracing method is proposed to simulate the interaction of laser radiation with a random bulk packing of spherical particles and to predict the optical properties of the granular layer, the extinction and absorption coefficients, depending on the optical properties of the powder material.
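
    (Illustrative sketch of the core primitive of such a simulation, a ray-sphere intersection test; the direction vector is assumed unit-length, and the function is hypothetical rather than the authors' code.)

    import numpy as np

    def ray_sphere_hit(origin, direction, center, radius):
        """Distance along a unit-direction ray to the nearest sphere intersection, or None."""
        oc = origin - center
        b = float(np.dot(oc, direction))
        c = float(np.dot(oc, oc)) - radius * radius
        disc = b * b - c
        if disc < 0.0:
            return None                    # the ray misses the sphere
        t = -b - np.sqrt(disc)
        return t if t > 0.0 else None      # ignore intersections behind the origin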

  15. 77 FR 44475 - Final Definitions, Requirements, and Selection Criteria; Charter Schools Program (CSP)-Charter...

    Science.gov (United States)

    2012-07-30

    ... commenter cited the recent movie "Bully," which documented the effects of bullying, and stated that... the Executive order and subject to review by the Office of Management and Budget (OMB). Section 3(f... Executive order. This final regulatory action is not a significant regulatory action subject to review by...

  16. Conference report: 2012 Repository Symposium. Final storage in Germany. New start - ways and consequences of the site selection procedure

    International Nuclear Information System (INIS)

    Kettler, John

    2012-01-01

    The Aachen Institute for Nuclear Training invited participants to the 3-day '2012 Repository Symposium - Final Storage in Germany' held in Bonn. The subtitle of the event, 'New Start - Ways and Consequences of the Site Selection Procedure,' expressed the organizers' summary that the Repository Finding Act currently under discussion did not give rise to any expectation of a repository for high-level radioactive waste before 2080. The symposium was attended by more than 120 persons from Germany and abroad. They discussed the basic elements of the site selection procedure and its consequences on the basis of the draft so far known to the public. While extensive public participation is envisaged for the stage of finding a repository, this does not apply to the draft legislation in the same way. The legal determinations are negotiated in a small circle by the political parties and the state governments. Michael Sailer (Oeko-Institut e.V.) holds that agreement on a repository finding act is urgent. Prof. Dr. Bruno Thomauske (RWTH Aachen) arrives at the conclusion mentioned above, that no repository for high-level radioactive waste can start operation before 2080 on the basis of the Repository Finding Act. Dr. Bettina Keienburg, attorney at law, in her paper drew attention to the points of dispute in the draft legislation with regard to changes in competency of public authorities. The draft law indicated a clear shift of competency for finding a repository from the Federal Office for Radiation Protection to a federal agency yet to be set up. Prof. Dr. Christoph Moench outlined the deficiencies of the draft legislation in matters of refinancing and the polluter-pays principle. Among the tentative solutions discussed it was above all the Swedish model which was acclaimed most widely. (orig.)

  17. Antenna Selection for Full-Duplex MIMO Two-Way Communication Systems

    KAUST Repository

    Wilson-Nunn, Daniel; Chaaban, Anas; Sezgin, Aydin; Alouini, Mohamed-Slim

    2017-01-01

    Antenna selection for full-duplex communication between two nodes, each equipped with a predefined number of antennae and transmit/receive chains, is studied. Selection algorithms are proposed based on magnitude, orthogonality, and determinant criteria. The algorithms are compared to optimal selection obtained by exhaustive search as well as random selection, and are shown to yield performance fairly close to optimal at a much lower complexity. Performance comparison for a Rayleigh fading symmetric channel reveals that selecting a single transmit antenna is best at low signal-to-noise ratio (SNR), while selecting an equal number of transmit and receive antennae is best at high SNR.

  18. Assessing the Job Selection Criteria of Accounting Students: a Normative Approach

    OpenAIRE

    zubairu, umaru; Ismail, Suhaiza; Abdul Hamid, Fatima

    2017-01-01

    This research assessed to what extent final-year Muslim accounting students in Malaysia considered Islamic principles when choosing a job after graduation. 356 final-year Muslim accounting students in four Malaysian universities were surveyed using an open-ended job selection scenario. The results show that reality does not live up to the ideal: only 16% of the respondents apply Islamic principles in making a job selection decision. The remaining 84% are more concerned with other criteria such as...

  19. Drug-eluting versus plain balloon angioplasty for the treatment of failing dialysis access: Final results and cost-effectiveness analysis from a prospective randomized controlled trial (NCT01174472)

    Energy Technology Data Exchange (ETDEWEB)

    Kitrou, Panagiotis M., E-mail: panoskitrou@gmail.com [Department of Interventional Radiology, Patras University Hospital, School of Medicine, Rion 26504 (Greece); Katsanos, Konstantinos [Department of Interventional Radiology, Guy' s and St. Thomas’ Hospitals, NHS Foundation Trust, King' s Health Partners, London SE1 7EH (United Kingdom); Spiliopoulos, Stavros; Karnabatidis, Dimitris; Siablis, Dimitris [Department of Interventional Radiology, Patras University Hospital, School of Medicine, Rion 26504 (Greece)

    2015-03-15

    Highlights: • 1-year target lesion primary patency significantly higher after PCB application compared to plain balloon angioplasty in the failing dialysis access. • Significant difference in favor of PCB in cumulative primary patency of AVGs at 1 year. • No significant difference in cumulative primary patency of AVFs treated with PCB at 1 year. • Cost-effectiveness analysis performed. • Paclitaxel-coated balloon angioplasty proves to be a cost-effective option for treating dialysis access. -- Abstract: Objective: To report the final results and cost-effectiveness analysis of a prospective randomized controlled trial investigating drug-eluting balloon (DEB) versus plain balloon angioplasty (BA) for the treatment of failing dialysis access (NCT01174472). Methods: 40 patients were randomized to angioplasty with either DEB (n = 20) or BA (n = 20) for treatment of significant venous stenosis causing a failing dialysis access. Both arteriovenous fistulas (AVF) and synthetic arteriovenous grafts (AVG) were included. Angiographic follow up was scheduled every two months. Primary endpoints were technical success and target lesion primary patency at 1 year. Cumulative and survival analysis was performed. Incremental net benefit (INB) and incremental cost effectiveness ratio (ICER) were calculated and the cost-effectiveness acceptability curve (CEAC) was drawn. Results: Baseline variables were equally distributed between the two groups. At 1 year, cumulative target lesion primary patency was significantly higher after DEB application (35% vs. 5% after BA, p < 0.001). Overall, median primary patency was 0.64 years in case of DEB vs. 0.36 years in case of BA (p = 0.0007; unadjusted HR = 0.27 [95%CI: 0.13–0.58]; Cox adjusted HR = 0.23 [95%CI: 0.10–0.50]). ICER was 2198 Euros (€) per primary patency year of dialysis access gained. INB was 1068€ (95%CI: 31–2105€) for a willingness-to-pay (WTP) threshold of 5000€ (corresponding acceptability probability >97%).
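
    (Illustrative note: the reported quantities follow ICER = ΔC/ΔE and INB = WTP·ΔE − ΔC. The toy calculation below uses a placeholder effect difference, since the abstract does not give the raw cost and effect data, so it will not reproduce the trial's INB of 1068€.)

    icer = 2198.0                      # euros per primary-patency year gained (reported)
    wtp = 5000.0                       # willingness-to-pay threshold in euros (reported)
    delta_effect = 0.30                # hypothetical extra patency-years with DEB
    delta_cost = icer * delta_effect   # extra cost implied by the reported ICER
    inb = wtp * delta_effect - delta_cost
    print(f"implied delta_cost = {delta_cost:.0f} EUR, INB = {inb:.0f} EUR")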

  1. Evaluation of silodosin in comparison to tamsulosin in benign prostatic hyperplasia: a randomized controlled trial.

    Science.gov (United States)

    Pande, Satabdi; Hazra, Avijit; Kundu, Anup Kumar

    2014-01-01

    Benign prostatic hyperplasia (BPH) is the most common cause of lower urinary tract symptoms in elderly men. Selective alfa1-adrenergic antagonists are now first-line drugs in the medical management of BPH. We conducted a single-blind, parallel-group, randomized, controlled trial to compare the effectiveness and safety of the new alfa1-blocker silodosin versus the established drug tamsulosin in symptomatic BPH. Ambulatory male BPH patients, aged above 50 years, were recruited on the basis of the International Prostate Symptom Score (IPSS). Subjects were randomized in a 1:1 ratio to receive either tamsulosin 0.4 mg controlled release or silodosin 8 mg once daily after dinner for 12 weeks. The primary outcome measure was reduction in IPSS; the proportion of subjects achieving the target IPSS reduction with silodosin versus tamsulosin was also analyzed. Final IPSS at 12 weeks was significantly less than baseline for both groups; however, the groups remained comparable in terms of IPSS at all visits. There was a significant impact on sexual function (assessed by the IPSS sexual function score) in the silodosin arm compared with tamsulosin. Prostate size and uroflowmetry parameters did not change. Both treatments were well tolerated. Retrograde ejaculation was encountered only with silodosin and postural hypotension only with tamsulosin. Silodosin is comparable to tamsulosin in the treatment of BPH in Indian men; however, retrograde ejaculation may be troublesome for sexually active patients.

  2. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani...

  3. Radioactive waste repository site selection in the Republic of Slovenia

    International Nuclear Information System (INIS)

    Jeran, M.

    1992-01-01

    The report shows the procedure for the low- and intermediate-level radwaste (LLW and ILW) repository site selection and the work performed up to the present. The procedure for repository site selection is divided into four steps. In the first step, unsuitable areas are excluded by taking into consideration rough exclusion criteria. In the second step, the remaining suitable areas are screened to identify potential sites with respect to preference criteria. In the third step, three to five candidate sites will be assessed and selected among the potential sites. In the final, fourth step, detailed site investigation and confirmation of the one or two most suitable sites will follow. In Slovenia the first and second steps of site selection have been completed, while step three is now in its final stage. (author)

  4. Randomized Oversampling for Generalized Multiscale Finite Element Methods

    KAUST Repository

    Calo, Victor M.

    2016-03-23

    In this paper, we develop efficient multiscale methods for flows in heterogeneous media. We use the generalized multiscale finite element (GMsFEM) framework. GMsFEM approximates the solution space locally using a few multiscale basis functions. This approximation selects an appropriate snapshot space and a local spectral decomposition, e.g., the use of oversampled regions, in order to achieve an efficient model reduction. However, the successful construction of snapshot spaces may be costly if too many local problems need to be solved in order to obtain these spaces. We use a moderate quantity of local solutions (or snapshot vectors) with random boundary conditions on oversampled regions with zero forcing to deliver an efficient methodology. Motivated by the randomized algorithm presented in [P. G. Martinsson, V. Rokhlin, and M. Tygert, A Randomized Algorithm for the approximation of Matrices, YALEU/DCS/TR-1361, Yale University, 2006], we consider a snapshot space which consists of harmonic extensions of random boundary conditions defined in a domain larger than the target region. Furthermore, we perform an eigenvalue decomposition in this small space. We study the application of randomized sampling for GMsFEM in conjunction with adaptivity, where local multiscale spaces are adaptively enriched. Convergence analysis is provided. We present representative numerical results to validate the method proposed.
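
    (Minimal sketch of the randomized-sampling idea cited above: random probing inputs followed by orthogonalization, as in the Martinsson-Rokhlin-Tygert range finder. The GMsFEM-specific local solves with random boundary conditions are abstracted into a matrix-vector product; the function is hypothetical.)

    import numpy as np

    def randomized_basis(A, n_samples, seed=0):
        """Orthonormal basis for the range of A sampled with random probing vectors."""
        rng = np.random.default_rng(seed)
        omega = rng.standard_normal((A.shape[1], n_samples))  # random "boundary data"
        Q, _ = np.linalg.qr(A @ omega)                        # snapshots, then orthogonalize
        return Q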

  5. Particle swarm optimization-based automatic parameter selection for deep neural networks and its applications in large-scale and high-dimensional data.

    Science.gov (United States)

    Ye, Fei

    2017-01-01

    In this paper, we propose a new automatic hyperparameter selection approach for determining the optimal network configuration (network structure and hyperparameters) for deep neural networks using particle swarm optimization (PSO) in combination with a steepest gradient descent algorithm. In the proposed approach, network configurations were coded as a set of real-number m-dimensional vectors as the individuals of the PSO algorithm in the search procedure. During the search procedure, the PSO algorithm is employed to search for optimal network configurations via the particles moving in a finite search space, and the steepest gradient descent algorithm is used to train the DNN classifier with a few training epochs (to find a local optimal solution) during the population evaluation of PSO. After the optimization scheme, the steepest gradient descent algorithm is performed with more epochs and the final solutions (pbest and gbest) of the PSO algorithm to train a final ensemble model and individual DNN classifiers, respectively. The local search ability of the steepest gradient descent algorithm and the global search capabilities of the PSO algorithm are exploited to determine an optimal solution that is close to the global optimum. We conducted several experiments on hand-written characters and biological activity prediction datasets to show that the DNN classifiers trained by the network configurations expressed by the final solutions of the PSO algorithm, employed to construct an ensemble model and individual classifier, outperform the random approach in terms of generalization performance. Therefore, the proposed approach can be regarded as an alternative tool for automatic network structure and parameter selection for deep neural networks.
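
    (Compact sketch of a PSO search loop; the fitness placeholder stands in for the paper's few-epoch DNN training run, and the inertia/acceleration coefficients are conventional choices, not the paper's settings.)

    import numpy as np

    def pso(fitness, dim, n_particles=20, iters=50, bounds=(-1.0, 1.0), seed=0):
        """Minimize a fitness function over a box-shaped search space."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, dim))   # positions = candidate configurations
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_val = np.array([fitness(p) for p in x])
        gbest = pbest[pbest_val.argmin()]
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            vals = np.array([fitness(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            gbest = pbest[pbest_val.argmin()]
        return gbest, pbest_val.min()

    best, value = pso(lambda p: float(np.sum(p**2)), dim=5)   # sphere function as a stand-in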

  6. Using a Calendar and Explanatory Instructions to Aid Within-Household Selection in Mail Surveys

    Science.gov (United States)

    Stange, Mathew; Smyth, Jolene D.; Olson, Kristen

    2016-01-01

    Although researchers can easily select probability samples of addresses using the U.S. Postal Service's Delivery Sequence File, randomly selecting respondents within households for surveys remains challenging. Researchers often place within-household selection instructions, such as the next or last birthday methods, in survey cover letters to…

  7. Mucositis reduction by selective elimination of oral flora in irradiated cancers of the head and neck: a placebo-controlled double-blind randomized study

    International Nuclear Information System (INIS)

    Wijers, Oda B.; Levendag, Peter C.; Harms, Erik; Gan-Teng, A.M.; Schmitz, Paul I.M.; Hendriks, W.D.H.; Wilms, Erik B.; Est, Henri van der; Visch, Leo L.

    2001-01-01

    Purpose: The aim of the study was to test the hypothesis that aerobic Gram-negative bacteria (AGNB) play a crucial role in the pathogenesis of radiation-induced mucositis; consequently, selective elimination of these bacteria from the oral flora should result in a reduction of the mucositis. Methods and Materials: Head-and-neck cancer patients, when scheduled for treatment by external beam radiation therapy (EBRT), were randomized for prophylactic treatment with an oral paste containing either a placebo or a combination of the antibiotics polymyxin E, tobramycin, and amphotericin B (PTA group). Weekly, the objective and subjective mucositis scores and microbiologic counts of the oral flora were noted. The primary study endpoint was the mucositis grade after 3 weeks of EBRT. Results: Seventy-seven patients were evaluable. No statistically significant difference for the objective and subjective mucositis scores was observed between the two study arms (p=0.33). The percentage of patients with positive cultures of AGNB was significantly reduced in the PTA group (p=0.01). However, complete eradication of AGNB was not achieved. Conclusions: Selective elimination of AGNB of the oral flora did not result in a reduction of radiation-induced mucositis and therefore does not support the hypothesis that these bacteria play a crucial role in the pathogenesis of mucositis

  8. Comparative analysis of rationale used by dentists and patient for final esthetic outcome of dental treatment.

    Science.gov (United States)

    Reddy, S Varalakshmi; Madineni, Praveen Kumar; Sudheer, A; Gujjarlapudi, Manmohan Choudary; Sreedevi, B; Reddy, Patelu Sunil Kumar

    2013-05-01

    To compare and evaluate the perceptions of esthetics among dentists and patients regarding the final esthetic outcome of a dental treatment. Esthetics is a matter of perception, associated with the way different people look at an object: what constitutes esthetics for one person may not be acceptable to another, and it is therefore subjective in nature. This becomes most obvious during the post-treatment evaluation of esthetics by the dentist and the patient concerned, whose opinions seldom match. The study is therefore a necessary part of understanding the minds of dentist and patient regarding what constitutes esthetics. A survey was conducted by means of a questionnaire consisting of 10 questions, on two groups of people. The first group consisted of 100 dentists picked at random in Kanyakumari district of Tamil Nadu, India. The second group consisted of 100 patients who required complete denture prostheses and was divided into two subgroups: A, consisting of 50 men, and B, consisting of 50 women. In each subgroup, 25 patients were selected in the age group of 40 to 50 and 25 in the age group of 50 to 60. The questionnaire was given to both groups to fill in, and the responses were statistically analyzed to look for patterns of thought among them. Results were subjected to statistical analysis by Student's t-test. Perceptions of esthetics differ between the dentist, who is educated in the esthetic principles of treatment, and the patient, who has not received such education. Since the questions were formulated so that patients could better understand the underlying problem, the final outcome of the survey is proof that dentists need to take into account what the patient regards as esthetic in order to provide satisfactory treatment. CLINICAL AND ACADEMIC SIGNIFICANCE: The current study helps the dentist to better educate the patient regarding esthetics so that the patient appreciates the final esthetic outcome.

  9. Selected papers on noise and stochastic processes

    CERN Document Server

    1954-01-01

    Six classic papers on stochastic processes, selected to meet the needs of physicists, applied mathematicians, and engineers. Contents: 1. Chandrasekhar, S.: Stochastic Problems in Physics and Astronomy. 2. Uhlenbeck, G. E. and Ornstein, L. S.: On the Theory of the Brownian Motion. 3. Ming Chen Wang and Uhlenbeck, G. E.: On the Theory of the Brownian Motion II. 4. Rice, S. O.: Mathematical Analysis of Random Noise. 5. Kac, Mark: Random Walk and the Theory of Brownian Motion. 6. Doob, J. L.: The Brownian Movement and Stochastic Equations. Unabridged republication of the Dover reprint (1954). Pre

  10. Selection of the Mars Science Laboratory landing site

    Science.gov (United States)

    Golombek, M.; Grant, J.; Kipp, D.; Vasavada, A.; Kirk, Randolph L.; Fergason, Robin L.; Bellutta, P.; Calef, F.; Larsen, K.; Katayama, Y.; Huertas, A.; Beyer, R.; Chen, A.; Parker, T.; Pollard, B.; Lee, S.; Hoover, R.; Sladek, H.; Grotzinger, J.; Welch, R.; Dobrea, E. Noe; Michalski, J.; Watkins, M.

    2012-01-01

    The selection of Gale crater as the Mars Science Laboratory landing site took over five years, involved broad participation of the science community via five open workshops, and narrowed an initial >50 sites (25 by 20 km) to four finalists (Eberswalde, Gale, Holden and Mawrth) based on science and safety. Engineering constraints important to the selection included: (1) latitude (±30°) for thermal management of the rover and instruments, (2) elevation (below about −1 km) for sufficient atmosphere to slow the spacecraft during entry, descent and landing, and (3) a surface that is safe for landing and roving and not dominated by fine-grained dust. Science criteria important for the selection include the ability to assess past habitable environments, which include diversity, context, and biosignature (including organics) preservation. Sites were evaluated in detail using targeted data from instruments on all active orbiters, and especially Mars Reconnaissance Orbiter. All of the final four sites have layered sedimentary rocks with spectral evidence for phyllosilicates that clearly address the science objectives of the mission. Sophisticated entry, descent and landing simulations that include detailed information on all of the engineering constraints indicate all of the final four sites are safe for landing. Evaluation of the traversability of the landing sites and target “go to” areas outside of the ellipse using slope and material properties information indicates that all are trafficable and “go to” sites can be accessed within the lifetime of the mission. In the final selection, Gale crater was favored over Eberswalde based on its greater diversity and potential habitability.

  11. Arbitrary-step randomly delayed robust filter with application to boost phase tracking

    Science.gov (United States)

    Qin, Wutao; Wang, Xiaogang; Bai, Yuliang; Cui, Naigang

    2018-04-01

    Conventional filters such as the extended Kalman filter, unscented Kalman filter and cubature Kalman filter assume that the measurement is available in real-time and that the measurement noise is Gaussian white noise. In practice, both assumptions are invalid. To solve this problem, a novel algorithm is proposed in four steps. First, the measurement model is modified with Bernoulli random variables to describe the random delay. Then, the expressions for the predicted measurement and covariance are reformulated, removing the restriction that the maximum delay must be one or two steps and the assumption that the probabilities of the Bernoulli random variables taking the value one are equal. Next, the arbitrary-step randomly delayed high-degree cubature Kalman filter is derived based on the 5th-degree spherical-radial rule and the reformulated expressions. Finally, the arbitrary-step randomly delayed high-degree cubature Kalman filter is modified to the arbitrary-step randomly delayed high-degree cubature Huber-based filter using the Huber technique, which is essentially an M-estimator. The proposed filter is therefore robust not only to randomly delayed measurements, but also to glint noise. The application to a boost phase tracking example demonstrates the superiority of the proposed algorithms.

  12. Selective-imaging camera

    Science.gov (United States)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  13. Relationship of Sibling Structure and Interaction to Categorization Ability. Final Report.

    Science.gov (United States)

    Cicirelli, Victor G.; And Others

    This study identified behaviors of sibling pairs interacting on a cognitive task and related these behaviors to sibling structure variables (age and sex of each sibling and age spacing between them) and to measures of cognitive abilities of the younger sibling. Subjects were 160 sibling pairs randomly selected from appropriate subpopulations of…

  14. Early prevention of antisocial personality: long-term follow-up of two randomized controlled trials comparing indicated and selective approaches.

    Science.gov (United States)

    Scott, Stephen; Briskman, Jackie; O'Connor, Thomas G

    2014-06-01

    Antisocial personality is a common adult problem that imposes a major public health burden, but for which there is no effective treatment. Affected individuals exhibit persistent antisocial behavior and pervasive antisocial character traits, such as irritability, manipulativeness, and lack of remorse. Prevention of antisocial personality in childhood has been advocated, but evidence for effective interventions is lacking. The authors conducted two follow-up studies of randomized trials of group parent training. One involved 120 clinic-referred 3- to 7-year-olds with severe antisocial behavior for whom treatment was indicated, 93 of whom were reassessed between ages 10 and 17. The other involved 109 high-risk 4- to 6-year-olds with elevated antisocial behavior who were selectively screened from the community, 90 of whom were reassessed between ages 9 and 13. The primary psychiatric outcome measures were the two elements of antisocial personality, namely, antisocial behavior (assessed by a diagnostic interview) and antisocial character traits (assessed by a questionnaire). Also assessed were reading achievement (an important domain of youth functioning at work) and parent-adolescent relationship quality. In the indicated sample, both elements of antisocial personality were improved in the early intervention group at long-term follow-up compared with the control group (antisocial behavior: odds ratio of oppositional defiant disorder=0.20, 95% CI=0.06, 0.69; antisocial character traits: B=-4.41, 95% CI=-1.12, -8.64). Additionally, reading ability improved (B=9.18, 95% CI=0.58, 18.0). Parental expressed emotion was warmer (B=0.86, 95% CI=0.20, 1.41) and supervision was closer (B=-0.43, 95% CI=-0.11, -0.75), but direct observation of parenting showed no differences. Teacher-rated and self-rated antisocial behavior were unchanged. In contrast, in the selective high-risk sample, early intervention was not associated with improved long-term outcomes. Early intervention with indicated, clinic-referred children may therefore help prevent the development of antisocial personality.

  15. Using histograms to introduce randomization in the generation of ensembles of decision trees

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
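
    (Minimal sketch of the described steps, using a Gini-style criterion as an illustrative choice, since the record does not fix the splitting criterion; names are hypothetical.)

    import numpy as np

    def gini(labels):
        """Class-count impurity of a label array."""
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return 1.0 - float(np.sum(p * p))

    def histogram_random_split(x, y, n_bins=32, seed=0):
        """Score splits only at histogram bin edges, then randomize within the best interval."""
        rng = np.random.default_rng(seed)
        edges = np.histogram_bin_edges(x, bins=n_bins)
        best_score, best_k = np.inf, None
        for k in range(1, n_bins):
            left, right = y[x < edges[k]], y[x >= edges[k]]
            if left.size == 0 or right.size == 0:
                continue
            score = left.size * gini(left) + right.size * gini(right)
            if score < best_score:
                best_score, best_k = score, k
        # draw the actual split point at random in the interval around the best edge
        return rng.uniform(edges[best_k - 1], edges[best_k + 1])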

  16. Study of Randomness in AES Ciphertexts Produced by Randomly Generated S-Boxes and S-Boxes with Various Modulus and Additive Constant Polynomials

    Science.gov (United States)

    Das, Suman; Sadique Uz Zaman, J. K. M.; Ghosh, Ranjan

    2016-06-01

    In the Advanced Encryption Standard (AES), the standard S-Box is conventionally generated by using a particular irreducible polynomial {11B} in GF(2^8) as the modulus and a particular additive constant polynomial {63} in GF(2), though it can be generated by many other polynomials. In this paper, it has been shown that it is possible to generate secure AES S-Boxes by using other selected modulus and additive polynomials, and that S-Boxes can also be generated randomly, using a PRNG such as BBS. A comparative study has been made on the randomness of the corresponding AES ciphertexts, using the NIST Test Suite coded for this paper. It has been found that besides the standard one, other moduli and additive constants are also able to generate equally random or better ciphertexts; the same is true for random S-Boxes. As these new types of S-Boxes are user-defined, hence unknown, they are able to resist linear and differential cryptanalysis. Moreover, they act as additional key inputs to AES, thus increasing the key space.
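
    (Illustrative sketch of building an AES-style S-box from a chosen modulus and additive constant; the defaults are the standard {11B} and {63}, while the paper's specific alternative candidates are not listed here.)

    def gf_mul(a, b, modulus=0x11B):
        """Multiply in GF(2^8) modulo the given degree-8 irreducible polynomial."""
        result = 0
        while b:
            if b & 1:
                result ^= a
            a <<= 1
            if a & 0x100:
                a ^= modulus
            b >>= 1
        return result

    def gf_inv(a, modulus=0x11B):
        """Multiplicative inverse in GF(2^8) by exhaustive search; maps 0 to 0."""
        if a == 0:
            return 0
        for candidate in range(1, 256):
            if gf_mul(a, candidate, modulus) == 1:
                return candidate
        raise ValueError("modulus is not irreducible over GF(2)")

    def affine(x, constant=0x63):
        """AES affine transform: XOR of x with its four left-rotations, plus the constant."""
        y = 0
        for shift in range(5):
            y ^= ((x << shift) | (x >> (8 - shift))) & 0xFF
        return y ^ constant

    def make_sbox(modulus=0x11B, constant=0x63):
        return [affine(gf_inv(i, modulus), constant) for i in range(256)]

    sbox = make_sbox()
    assert sbox[0x00] == 0x63 and sbox[0x01] == 0x7C   # matches the standard AES S-box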

  17. Atomic structure calculations using the relativistic random phase approximation

    International Nuclear Information System (INIS)

    Cheng, K.T.; Johnson, W.R.

    1981-01-01

    A brief review is given for the relativistic random phase approximation (RRPA) applied to atomic transition problems. Selected examples of RRPA calculations on discrete excitations and photoionization are given to illustrate the need of relativistic many-body theories in dealing with atomic processes where both relativity and correlation are important

  18. Students' level of skillfulness and use of the internet in selected ...

    African Journals Online (AJOL)

    The study examined level of skillfulness and the use of the Internet for learning among secondary school students in Lagos State, Nigeria. The descriptive survey research method was adopted for the study. A sample of 450 students was randomly selected from the three secondary schools. One intact arm was selected from ...

  19. Identification and DNA fingerprinting of Legionella strains by randomly amplified polymorphic DNA analysis.

    OpenAIRE

    Bansal, N S; McDonell, F

    1997-01-01

    The randomly amplified polymorphic DNA (RAPD) technique was used in the development of a fingerprinting (typing) and identification protocol for Legionella strains. Twenty decamer random oligonucleotide primers were screened for their discriminatory abilities. Two candidate primers were selected. By using a combination of these primers, RAPD analysis allowed for the differentiation between all different species, between the serogroups, and further differentiation between subtypes of the same ...

  20. Are Final Comments in Web Survey Panels Associated with Next-Wave Attrition?

    Directory of Open Access Journals (Sweden)

    Cynthia McLauchlan

    2016-12-01

    Near the end of a web survey respondents are often asked whether they have further comments. Such final comments are usually ignored, in part because open-ended questions are challenging to analyse. We explored whether final comments are associated with next-wave attrition in survey panels. We categorized a random sample of final comments in the Longitudinal Studies for the Social Sciences (LISS) panel and the Dutch Immigrant panel into one of eight categories (neutral, positive, six subcategories of negative) and regressed the indicator of next-wave attrition on comment length, comment category and socio-demographic variables. In the Immigrant panel we found shorter final comments (<55 words) associated with decreased next-wave attrition relative to making no comment. Comments about unclear survey questions quadruple the odds of attrition and "other" (uncategorized) negative comments almost double the odds of attrition. In the LISS panel, making a comment (vs. not) and comment length are not associated with attrition. However, when specifying individual comment categories, neutral comments are associated with half the odds of attrition relative to not making a comment.

  1. Evolution of the concentration PDF in random environments modeled by global random walk

    Science.gov (United States)

    Suciu, Nicolae; Vamos, Calin; Attinger, Sabine; Knabner, Peter

    2013-04-01

    The evolution of the probability density function (PDF) of concentrations of chemical species transported in random environments is often modeled by ensembles of notional particles. The particles move in physical space along stochastic-Lagrangian trajectories governed by Ito equations, with drift coefficients given by the local values of the resolved velocity field and diffusion coefficients obtained by stochastic or space-filtering upscaling procedures. A general model for the sub-grid mixing also can be formulated as a system of Ito equations solving for trajectories in the composition space. The PDF is finally estimated by the number of particles in space-concentration control volumes. In spite of their efficiency, Lagrangian approaches suffer from two severe limitations. Since the particle trajectories are constructed sequentially, the demanded computing resources increase linearly with the number of particles. Moreover, the need to gather particles at the center of computational cells to perform the mixing step and to estimate statistical parameters, as well as the interpolation of various terms to particle positions, inevitably produce numerical diffusion in either particle-mesh or grid-free particle methods. To overcome these limitations, we introduce a global random walk method to solve the system of Ito equations in physical and composition spaces, which models the evolution of the random concentration's PDF. The algorithm consists of a superposition on a regular lattice of many weak Euler schemes for the set of Ito equations. Since all particles starting from a site of the space-concentration lattice are spread in a single numerical procedure, one obtains PDF estimates at the lattice sites at computational costs comparable with those for solving the system of Ito equations associated to a single particle. The new method avoids the limitations concerning the number of particles in Lagrangian approaches, completely removes the numerical diffusion, and keeps the computational cost essentially independent of the number of particles.
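
    (Minimal sketch of the global random walk idea in one dimension, diffusion only: all particles occupying a lattice site are spread in a single binomial draw rather than moved one by one. The drift terms and concentration-space coordinates of the full method are omitted; the function is hypothetical.)

    import numpy as np

    rng = np.random.default_rng(0)

    def grw_step(counts):
        """One unbiased GRW step: all particles on a site split left/right in one draw."""
        new = np.zeros_like(counts)
        for site, n in enumerate(counts):
            if n == 0:
                continue
            to_left = rng.binomial(n, 0.5)           # one binomial draw moves all n particles
            new[(site - 1) % counts.size] += to_left
            new[(site + 1) % counts.size] += n - to_left
        return new

    counts = np.zeros(101, dtype=np.int64)
    counts[50] = 10**6                               # a million particles cost ~one draw per site
    for _ in range(200):
        counts = grw_step(counts)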

  2. 10-Year Mortality Outcome of a Routine Invasive Strategy Versus a Selective Invasive Strategy in Non-ST-Segment Elevation Acute Coronary Syndrome: The British Heart Foundation RITA-3 Randomized Trial.

    Science.gov (United States)

    Henderson, Robert A; Jarvis, Christopher; Clayton, Tim; Pocock, Stuart J; Fox, Keith A A

    2015-08-04

    The RITA-3 (Third Randomised Intervention Treatment of Angina) trial compared outcomes of a routine early invasive strategy (coronary arteriography and myocardial revascularization, as clinically indicated) to those of a selective invasive strategy (coronary arteriography for recurrent ischemia only) in patients with non-ST-segment elevation acute coronary syndrome (NSTEACS). At a median of 5 years' follow-up, the routine invasive strategy was associated with a 24% reduction in the odds of all-cause mortality. This study reports 10-year follow-up outcomes of the randomized cohort to determine the impact of a routine invasive strategy on longer-term mortality. We randomized 1,810 patients with NSTEACS to receive routine invasive or selective invasive strategies. All randomized patients had annual follow-up visits up to 5 years, and mortality was documented thereafter using data from the Office of National Statistics. Over 10 years, there were no differences in mortality between the 2 groups (all-cause deaths in 225 [25.1%] vs. 232 patients [25.4%]: p = 0.94; and cardiovascular deaths in 135 [15.1%] vs. 147 patients [16.1%]: p = 0.65 in the routine invasive and selective invasive groups, respectively). Multivariate analysis identified several independent predictors of 10-year mortality: age, previous myocardial infarction, heart failure, smoking status, diabetes, heart rate, and ST-segment depression. A modified post-discharge Global Registry of Acute Coronary Events (GRACE) score was used to calculate an individual risk score for each patient and to form low-risk, medium-risk, and high-risk groups. Risk of death within 10 years varied markedly from 14.4 % in the low-risk group to 56.2% in the high-risk group. This mortality trend did not depend on the assigned treatment strategy. The advantage of reduced mortality of routine early invasive strategy seen at 5 years was attenuated during later follow-up, with no evidence of a difference in outcome at 10 years

  3. Comparative Study Of Two Non-Selective Cyclooxygenase ...

    African Journals Online (AJOL)

    The comparative study of the effects of two non-selective cyclooxygenase inhibitors ibuprofen and paracetamol on maternal and neonatal growth was conducted using 15 Sprague dawley rats, with mean body weight ranging between 165 and 179g. The rats were separated at random into three groups (A, B and C).

  4. Errors in causal inference: an organizational schema for systematic error and random error.

    Science.gov (United States)

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    To provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic errors result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Pseudo-Random Sequences Generated by a Class of One-Dimensional Smooth Map

    Science.gov (United States)

    Wang, Xing-Yuan; Qin, Xue; Xie, Yi-Xin

    2011-08-01

    We extend a class of one-dimensional smooth maps. We make sure that for each desired interval of the parameter the map's Lyapunov exponent is positive. We then propose a novel parameter perturbation method based on the good properties of the extended one-dimensional smooth map: the parameter r is perturbed in each iteration by the real number x_i generated by the iteration. The auto-correlation function and the NIST statistical test suite are finally used to illustrate the method's randomness. We provide an application of this method in image encryption. Experiments show that the pseudo-random sequences are suitable for this application.
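
    (Illustrative sketch of per-iteration parameter perturbation, using the logistic map as a stand-in for the paper's extended one-dimensional smooth map; the parameter window and perturbation rule below are assumptions, not the paper's.)

    def chaotic_bits(x0=0.37, n=1024, eps=1e-4):
        """Threshold the iterates of a map whose parameter is perturbed at every step."""
        r_min, r_max = 3.97, 4.0                          # assumed chaotic parameter window
        x, r = x0, 3.99
        bits = []
        for _ in range(n):
            x = r * x * (1.0 - x)                         # iterate the map
            r = r_min + (r + eps * x) % (r_max - r_min)   # perturb r with the iterate x_i
            bits.append(1 if x >= 0.5 else 0)
        return bits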

  6. A generalization of random matrix theory and its application to statistical physics.

    Science.gov (United States)

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

    To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples: inflation rates and air pressure data for 95 US cities.

  7. Risk-Controlled Multiobjective Portfolio Selection Problem Using a Principle of Compromise

    Directory of Open Access Journals (Sweden)

    Takashi Hasuike

    2014-01-01

    This paper proposes a multiobjective portfolio selection problem with most probable random distribution derived from current market data and other random distributions of boom and recession under the risk-controlled parameters determined by an investor. The current market data and information include not only historical data but also interpretations of economists’ oral and linguistic information, and hence, the boom and recession are often caused by these nonnumeric data. Therefore, investors need to consider several situations from most probable condition to boom and recession and to avoid the risk less than the target return in each situation. Furthermore, it is generally difficult to set random distributions of these cases exactly. Therefore, a robust-based approach for portfolio selection problems using the only mean values and variances of securities is proposed as a multiobjective programming problem. In addition, an exact algorithm is developed to obtain an explicit optimal portfolio using a principle of compromise.

  8. Role of selective interaction in wealth distribution

    International Nuclear Information System (INIS)

    Gupta, A.K.

    2005-08-01

    In our simplified description 'money' is wealth. A kinetic theory model of money is investigated in which two agents interact (trade) selectively and exchange a random amount of money between them while keeping the total money of all agents constant. The probability distribution of individual money (P(m) vs. m) is seen to be influenced by certain modes of selective interaction. The distributions shift away from the Boltzmann-Gibbs-like exponential distribution, and in some cases distributions emerge with power-law tails known as Pareto's law (P(m) ∝ m^-(1+α)). (author)
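
    The Boltzmann-Gibbs baseline that the selective-interaction rules deform can be reproduced in a few lines. The sketch below implements the non-selective version of the model (random pairing, random sharing of the combined money, fixed total); the paper's selective interaction modes, which produce the Pareto tails, are not specified in this record and are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
N, steps = 1000, 200_000
m = np.ones(N)                       # everyone starts with 1 unit

for _ in range(steps):
    i, j = rng.integers(N, size=2)   # random (non-selective) pairing
    if i == j:
        continue
    pot = m[i] + m[j]                # total money is conserved
    share = rng.random()             # random split of the combined money
    m[i], m[j] = share * pot, (1 - share) * pot

# the stationary distribution is close to Boltzmann-Gibbs, P(m) ~ exp(-m/<m>)
hist, edges = np.histogram(m, bins=30, density=True)
print(np.column_stack((edges[:-1], hist))[:5])
```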

  9. Wide brick tunnel randomization - an unequal allocation procedure that limits the imbalance in treatment totals.

    Science.gov (United States)

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2014-04-30

    In open-label studies, partial predictability of permuted block randomization provides potential for selection bias. To lessen the selection bias in two-arm studies with equal allocation, a number of allocation procedures that limit the imbalance in treatment totals at a pre-specified level, but do not require exact balance at the ends of the blocks, have been developed. In studies with unequal allocation, however, the task of designing a randomization procedure that sets a pre-specified limit on imbalance in group totals remains unresolved. Existing allocation procedures either do not preserve the allocation ratio at every allocation or do not include all allocation sequences that comply with the pre-specified imbalance threshold. Kuznetsova and Tymofyeyev described the brick tunnel randomization for studies with unequal allocation that preserves the allocation ratio at every step and, in the two-arm case, includes all sequences that satisfy the smallest possible imbalance threshold. This article introduces wide brick tunnel randomization for studies with unequal allocation that allows all allocation sequences with imbalance not exceeding any pre-specified threshold while preserving the allocation ratio at every step. In open-label studies, allowing a larger imbalance in treatment totals lowers the selection bias that arises from the predictability of treatment assignments. The applications of the technique in two-arm and multi-arm open-label studies with unequal allocation are described. Copyright © 2013 John Wiley & Sons, Ltd.
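
    A simplified sketch of imbalance-capped unequal randomization in the spirit described above, assuming a 2:1 ratio and a hypothetical imbalance threshold. The true wide brick tunnel procedure additionally tunes the transition probabilities so that the allocation ratio is preserved exactly at every step; here permissible arms are simply weighted by the target ratio.

```python
import random

def capped_allocation(n, ratio=(2, 1), threshold=1.5, seed=0):
    """Sketch of imbalance-capped unequal randomization for two arms.

    Imbalance is the deviation of arm-A totals from the target
    allocation fraction; assignments that would push it past the
    threshold are forbidden."""
    rng = random.Random(seed)
    p_a = ratio[0] / sum(ratio)
    n_a, seq = 0, []
    for k in range(1, n + 1):
        allowed = []
        if abs((n_a + 1) - k * p_a) <= threshold:
            allowed.append("A")
        if abs(n_a - k * p_a) <= threshold:
            allowed.append("B")
        arm = rng.choices(allowed,
                          weights=[p_a if a == "A" else 1 - p_a
                                   for a in allowed])[0]
        n_a += arm == "A"
        seq.append(arm)
    return seq

s = capped_allocation(30)
print("".join(s), s.count("A") / len(s))   # proportion close to 2/3
```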

  10. Random Deep Belief Networks for Recognizing Emotions from Speech Signals.

    Science.gov (United States)

    Wen, Guihua; Li, Huihui; Huang, Jubing; Li, Danyang; Xun, Eryang

    2017-01-01

    Human emotions can now be recognized from speech signals using machine learning methods; however, these methods are challenged by low recognition accuracy in real applications due to a lack of rich representation ability. Deep belief networks (DBN) can automatically discover multiple levels of representation in speech signals. To make full use of this advantage, this paper presents an ensemble of random deep belief networks (RDBN) for speech emotion recognition. It first extracts low-level features of the input speech signal and then uses them to construct many random subspaces. Each random subspace is provided to a DBN to yield higher-level features, which are fed to a classifier that outputs an emotion label. All output emotion labels are then fused through majority voting to decide the final emotion label for the input speech signal. Experimental results on benchmark speech emotion databases show that RDBN achieves better accuracy than the compared methods for speech emotion recognition.
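
    The ensemble logic of RDBN, separated from the deep-learning machinery, is a random-subspace ensemble with majority voting. In the sketch below a logistic regression stands in for each per-subspace DBN (an assumption for brevity), and the data are synthetic; only the subspace sampling and the vote fusion follow the description above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def random_subspace_ensemble(X_tr, y_tr, X_te, n_models=15, frac=0.5):
    """Random-subspace ensemble with majority voting.

    Each model sees a random subset of the feature columns; here a
    logistic regression replaces the per-subspace DBN of the paper,
    which would first learn higher-level features."""
    d = X_tr.shape[1]
    k = max(1, int(frac * d))
    votes = []
    for _ in range(n_models):
        cols = rng.choice(d, size=k, replace=False)
        clf = LogisticRegression(max_iter=1000).fit(X_tr[:, cols], y_tr)
        votes.append(clf.predict(X_te[:, cols]))
    votes = np.stack(votes)                     # (n_models, n_test)
    # majority vote over models for each test sample
    return np.array([np.bincount(col).argmax() for col in votes.T])

# toy usage with synthetic 'speech features'
X = rng.standard_normal((200, 40))
y = (X[:, :5].sum(axis=1) > 0).astype(int)
pred = random_subspace_ensemble(X[:150], y[:150], X[150:])
print("accuracy:", (pred == y[150:]).mean())
```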

  11. Random Deep Belief Networks for Recognizing Emotions from Speech Signals

    Directory of Open Access Journals (Sweden)

    Guihua Wen

    2017-01-01

    Full Text Available Human emotions can now be recognized from speech signals using machine learning methods; however, these methods are challenged by low recognition accuracy in real applications due to a lack of rich representation ability. Deep belief networks (DBN) can automatically discover multiple levels of representation in speech signals. To make full use of this advantage, this paper presents an ensemble of random deep belief networks (RDBN) for speech emotion recognition. It first extracts low-level features of the input speech signal and then uses them to construct many random subspaces. Each random subspace is provided to a DBN to yield higher-level features, which are fed to a classifier that outputs an emotion label. All output emotion labels are then fused through majority voting to decide the final emotion label for the input speech signal. Experimental results on benchmark speech emotion databases show that RDBN achieves better accuracy than the compared methods for speech emotion recognition.

  12. Intermittent random walks: transport regimes and implications on search strategies

    International Nuclear Information System (INIS)

    Gomez Portillo, Ignacio; Campos, Daniel; Méndez, Vicenç

    2011-01-01

    We construct a transport model for particles that alternate rests of random duration and flights with random velocities. The model provides a balance equation for the mesoscopic particle density obtained from the continuous-time random walk framework. By assuming power laws for the distributions of waiting times and flight durations (for any velocity distribution with finite moments), we find that the model can yield all the transport regimes, ranging from subdiffusion to ballistic motion, depending on the values of the characteristic exponents of the distributions. In addition, if the exponents satisfy a simple relationship, we show how the competition between the tails of the distributions gives rise to diffusive transport. Finally, we explore how the details of this intermittent transport process affect the success probability in an optimal search problem where an individual searcher looks for a target distributed (heterogeneously) in space. All the results are checked against numerical simulations.
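
    A minimal simulation of the intermittent process described above, assuming unit flight speed and Pareto-distributed rest and flight durations with illustrative tail exponents; the transport regime (from subdiffusive to superdiffusive) shifts with those exponents.

```python
import numpy as np

rng = np.random.default_rng(2)

def pareto_times(alpha, size):
    """Heavy-tailed durations with P(t > x) ~ x**-alpha and minimum 1."""
    return (1.0 - rng.random(size)) ** (-1.0 / alpha)

def intermittent_walk(n_cycles=5000, a_rest=1.5, a_fly=2.5):
    """1-D walker alternating power-law rests and constant-speed
    flights in a random direction; returns time and position after
    each rest+flight cycle."""
    rests = pareto_times(a_rest, n_cycles)
    flights = pareto_times(a_fly, n_cycles)
    direction = rng.choice([-1.0, 1.0], size=n_cycles)
    t = np.cumsum(rests + flights)
    x = np.cumsum(direction * flights)   # unit speed during flights
    return t, x

# ensemble mean-square displacement at the final cycle; its growth
# exponent versus time depends on the two tail exponents
final_x = np.array([intermittent_walk()[1][-1] for _ in range(300)])
print("MSD at final cycle:", (final_x ** 2).mean())
```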

  13. Cardiac resynchronization therapy: advances in optimal patient selection

    NARCIS (Netherlands)

    Bleeker, Gabe Berend

    2007-01-01

    Despite the impressive results of cardiac resynchronization therapy (CRT) in recent large randomized trials, a considerable number of patients fail to improve following CRT implantation when the established CRT selection criteria (NYHA class III-IV heart failure, LV ejection fraction ≤35 % and QRS

  14. Atlas ranking and selection for automatic segmentation of the esophagus from CT scans

    Science.gov (United States)

    Yang, Jinzhong; Haas, Benjamin; Fang, Raymond; Beadle, Beth M.; Garden, Adam S.; Liao, Zhongxing; Zhang, Lifei; Balter, Peter; Court, Laurence

    2017-12-01

    In radiation treatment planning, the esophagus is an important organ-at-risk that should be spared in patients with head and neck cancer or thoracic cancer who undergo intensity-modulated radiation therapy. However, automatic segmentation of the esophagus from CT scans is extremely challenging because of the structure’s inconsistent intensity, low contrast against the surrounding tissues, complex and variable shape and location, and random air bubbles. The goal of this study is to develop an online atlas selection approach that chooses a subset of optimal atlases for multi-atlas segmentation, to delineate the esophagus automatically. We performed atlas selection in two phases. In the first phase, we used the correlation coefficient of the image content in a cubic region between each atlas and the new image to evaluate their similarity and to rank the atlases in an atlas pool. A subset of atlases based on this ranking was selected, and deformable image registration was performed to generate deformed contours and deformed images in the new image space. In the second phase of atlas selection, we used Kullback-Leibler divergence to measure the similarity of local-intensity histograms between the new image and each of the deformed images, and the measurements were used to rank the previously selected atlases. Deformed contours were overlapped sequentially, from the most to the least similar, and the overlap ratio was examined. We further identified a subset of optimal atlases by analyzing the variation of the overlap ratio versus the number of atlases. The deformed contours from these optimal atlases were fused together using a modified simultaneous truth and performance level estimation algorithm to produce the final segmentation. The approach was validated with promising results using both internal data sets (21 head and neck cancer patients and 15 thoracic cancer patients) and external data sets (30 thoracic patients).
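
    The two ranking phases can be sketched independently of the registration and label-fusion steps, which are omitted here. The sketch below ranks synthetic 'atlases' by Pearson correlation within a region of interest (phase one) and by Kullback-Leibler divergence between intensity histograms (phase two); array sizes and noise levels are illustrative.

```python
import numpy as np

def rank_by_correlation(patient_roi, atlas_rois):
    """Phase 1: rank atlases by Pearson correlation of the image
    content in a cubic region (flattened to 1-D), most similar first."""
    p = patient_roi.ravel()
    scores = [np.corrcoef(p, a.ravel())[0, 1] for a in atlas_rois]
    return np.argsort(scores)[::-1]

def kl_divergence(p_img, q_img, bins=64, eps=1e-10):
    """Phase 2: KL divergence between intensity histograms of the new
    image and a deformed atlas image (lower = more similar)."""
    lo = min(p_img.min(), q_img.min())
    hi = max(p_img.max(), q_img.max())
    p, _ = np.histogram(p_img, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(q_img, bins=bins, range=(lo, hi), density=True)
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
patient = rng.normal(size=(8, 8, 8))
atlases = [patient + rng.normal(scale=s, size=patient.shape)
           for s in (0.2, 1.0, 3.0)]
print(rank_by_correlation(patient, atlases))                 # expect [0 1 2]
print([round(kl_divergence(patient, a), 3) for a in atlases])
```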

  15. Sexual differences in telomere selection in the wild.

    Science.gov (United States)

    Olsson, Mats; Pauliny, Angela; Wapstra, Erik; Uller, Tobias; Schwartz, Tonia; Miller, Emily; Blomqvist, Donald

    2011-05-01

    Telomere length is restored primarily through the action of the reverse transcriptase telomerase, which may contribute to a prolonged lifespan in some but not all species and may result in longer telomeres in one sex than the other. To what extent this is an effect of proximate mechanisms (e.g. higher stress in males, higher oestradiol/oestrogen levels in females), or is an evolved adaptation (stronger selection for telomere length in one sex), usually remains unknown. Sand lizard (Lacerta agilis) females have longer telomeres than males and better maintain telomere length through life than males do. We also show that telomere length more strongly contributes to life span and lifetime reproductive success in females than males and that telomere length is under sexually diversifying selection in the wild. Finally, we performed a selection analysis with number of recruited offspring into the adult population as a response variable with telomere length, life span and body size as predictor variables. This showed significant differences in selection pressures between the sexes with strong ongoing selection in females, with these three predictors explaining 63% of the variation in recruitment. Thus, the sexually dimorphic telomere dynamics with longer telomeres in females is a result of past and ongoing selection in sand lizards. Finally, we compared the results from our selection analyses based on Telometric-derived data to the results based on data generated by the software ImageJ. ImageJ resulted in shorter average telomere length, but this difference had virtually no qualitative effect on the patterns of ongoing selection. © 2011 Blackwell Publishing Ltd.

  16. Non-compact random generalized games and random quasi-variational inequalities

    OpenAIRE

    Yuan, Xian-Zhi

    1994-01-01

    In this paper, existence theorems of random maximal elements, random equilibria for the random one-person game and random generalized game with a countable number of players are given as applications of random fixed point theorems. By employing existence theorems of random generalized games, we deduce the existence of solutions for non-compact random quasi-variational inequalities. These in turn are used to establish several existence theorems of noncompact generalized random ...

  17. Implementation of client versus care-provider strategies to improve external cephalic version rates: a cluster randomized controlled trial

    NARCIS (Netherlands)

    Vlemmix, Floortje; Rosman, Ageeth N.; Rijnders, Marlies E.; Beuckens, Antje; Opmeer, Brent C.; Mol, Ben W. J.; Kok, Marjolein; Fleuren, Margot A. H.

    2015-01-01

    To determine the effectiveness of a client or care-provider strategy to improve the implementation of external cephalic version. Cluster randomized controlled trial. Twenty-five clusters; hospitals and their referring midwifery practices randomly selected in the Netherlands. Singleton breech

  18. Implementation of client versus care-provider strategies to improve external cephalic version rates: a cluster randomized controlled trial

    NARCIS (Netherlands)

    Vlemmix, F.; Rosman, A.N.; Rijnders, M.E.; Beuckens, A.; Opmeer, B.C.; Mol, B.W.J.; Kok, M.; Fleuren, M.A.H.

    2015-01-01

    Objective: To determine the effectiveness of a client or care-provider strategy to improve the implementation of external cephalic version. Design: Cluster randomized controlled trial. Setting: Twenty-five clusters; hospitals and their referring midwifery practices randomly selected in the

  19. Models of disordered media: some new results, including some new connections between composite-media, fluid-state, and random-flight theories

    International Nuclear Information System (INIS)

    Stell, G.

    1983-01-01

    Some new theoretical results on the microstructure of models of two-phase disordered media are given, as well as new quantitative bounds on the thermal conductivity that follow for one such model (randomly centered spherical inclusions). A second set of results is then given for random flights, including random flights with hit expectancy prescribed in a unit ball around the flight origin. Finally, some interesting correspondences are demonstrated, via the Ornstein-Zernike equation, between random-flight results, liquid-state results and percolation-theory results. 27 references, 6 figures, 4 tables

  20. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    1991-01-01

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification approach is used: first, the correlation functions are estimated by the Random Decrement Technique, and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and, finally, least-square fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output from a single-degree-of-freedom system loaded by white noise.
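
    The first stage, a Random Decrement estimate, amounts to averaging response segments that follow a triggering condition. The sketch below uses a level up-crossing trigger on a toy single-degree-of-freedom response driven by white noise; the trigger choice and the semi-implicit Euler integration are simplifying assumptions.

```python
import numpy as np

def random_decrement(x, level, seg_len):
    """Average all segments of x that start where the response
    up-crosses the trigger level; the average has the shape of the
    system's correlation function (a decaying free oscillation)."""
    triggers = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
    triggers = triggers[triggers + seg_len <= len(x)]
    segs = np.stack([x[t:t + seg_len] for t in triggers])
    return segs.mean(axis=0), len(triggers)

# toy SDOF system under white noise, semi-implicit Euler integration
rng = np.random.default_rng(3)
f0, zeta, dt, n = 1.0, 0.02, 0.01, 100_000
w = 2 * np.pi * f0
x = np.zeros(n)
v = 0.0
for k in range(1, n):
    a = -2 * zeta * w * v - w**2 * x[k - 1] + rng.standard_normal()
    v += a * dt
    x[k] = x[k - 1] + v * dt

sig, m = random_decrement(x, level=x.std(), seg_len=500)
print(m, sig[:5])   # number of averaged segments, start of the signature
```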

  1. A note on mate allocation for dominance handling in genomic selection

    Directory of Open Access Journals (Sweden)

    Toro Miguel A

    2010-08-01

    Full Text Available Abstract Estimation of non-additive genetic effects in animal breeding is important because it increases the accuracy of breeding value prediction and the value of mate allocation procedures. With the advent of genomic selection these ideas should be revisited. The objective of this study was to quantify the efficiency of including dominance effects and practising mate allocation under a whole-genome evaluation scenario. Four strategies of selection, carried out during five generations, were compared by simulation techniques. In the first scenario (MS), individuals were selected based on their own phenotypic information. In the second (GSA), they were selected based on the prediction generated by the Bayes A method of whole-genome evaluation under an additive model. In the third (GSD), the model was expanded to include dominance effects. These three scenarios used random mating to construct future generations, whereas in the fourth one (GSD + MA), matings were optimized by simulated annealing. The advantage of GSD over GSA ranges from 9 to 14% of the expected response and, in addition, using mate allocation (GSD + MA) provides an additional response ranging from 6% to 22%. However, mate selection can improve the expected genetic response over random mating only in the first generation of selection. Furthermore, the efficiency of genomic selection is eroded after a few generations of selection; thus, continued collection of phenotypic data and re-evaluation will be required.

  2. QUANTITATIVE GENETICS OF MORPHOLOGICAL DIFFERENTIATION IN PEROMYSCUS. II. ANALYSIS OF SELECTION AND DRIFT.

    Science.gov (United States)

    Lofsvold, David

    1988-01-01

    The hypothesis that the morphological divergence of local populations of Peromyscus is due to random genetic drift was evaluated by testing the proportionality of the among-locality covariance matrix, L, and the additive genetic covariance matrix, G. Overall, significant proportionality of L̂ and Ĝ was not observed, indicating the evolutionary divergence of local populations does not result from random genetic drift. The forces of selection needed to differentiate three taxa of Peromyscus were reconstructed to examine the divergence of species and subspecies. The selection gradients obtained illustrate the inadequacy of univariate analyses of selection by finding that some characters evolve in the direction opposite to the force of selection acting directly on them. A retrospective selection index was constructed using the estimated selection gradients, and truncation selection on this index was used to estimate the minimum selective mortality per generation required to produce the observed change. On any of the time scales used, the proportion of the population that would need to be culled was quite low, the greatest being of the same order of magnitude as the selective intensities observed in extant natural populations. Thus, entirely plausible intensities of directional natural selection can produce species-level differences in a period of time too short to be resolved in the fossil record. © 1988 The Society for the Study of Evolution.

  3. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    International Nuclear Information System (INIS)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun; Pan, Jian-Wei; Zhou, Hongyi; Ma, Xiongfeng

    2016-01-01

    We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between fast randomness generation and slow post-processing, we propose a pipelined extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
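
    The post-processing step named above, Toeplitz matrix hashing, is a linear extractor over GF(2). The sketch below builds the Toeplitz matrix from a seed and compresses raw bits in software; the input sizes and seed are placeholders, and a real device would implement the same arithmetic in an FPGA as described.

```python
import numpy as np

def toeplitz_extract(raw_bits, m, seed_bits):
    """Toeplitz-hashing randomness extractor over GF(2).

    Compresses n raw bits into m nearly uniform bits using an m x n
    Toeplitz matrix defined by n + m - 1 seed bits."""
    n = len(raw_bits)
    assert len(seed_bits) == n + m - 1
    # T[i, j] = seed[i - j + n - 1] gives the Toeplitz structure
    i = np.arange(m)[:, None]
    j = np.arange(n)[None, :]
    T = seed_bits[i - j + n - 1]
    return T.dot(raw_bits) % 2       # matrix-vector product mod 2

rng = np.random.default_rng(4)
raw = rng.integers(0, 2, size=1024)  # stand-in for digitized phase noise
seed = rng.integers(0, 2, size=1024 + 256 - 1)
out = toeplitz_extract(raw, 256, seed)
print(out[:16], out.mean())
```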

  4. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun, E-mail: zhangjun@ustc.edu.cn; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at the Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); CAS Center for Excellence and Synergetic Innovation Center in Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Zhou, Hongyi; Ma, Xiongfeng [Center for Quantum Information, Institute for Interdisciplinary Information Sciences, Tsinghua University, Beijing 100084 (China)

    2016-07-15

    We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between fast randomness generation and slow post-processing, we propose a pipelined extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.

  5. Rural Women's Response To Selected Crop Production ...

    African Journals Online (AJOL)

    The study centered on rural women's response to selected crop production technologies in Imo State with a view to making policy recommendations. Structured questionnaire and interview schedule were administered through the assistance of extension agents to 258 randomly sampled rural women farmers from the three ...

  6. Hospital-acquired complications in a randomized controlled clinical trial of a geriatric consultation team.

    Science.gov (United States)

    Becker, P M; McVey, L J; Saltz, C C; Feussner, J R; Cohen, H J

    1987-05-01

    As part of a controlled clinical trial of a geriatric consultation team (GCT), we investigated whether a GCT could affect the incidence of hospital-acquired complications in elderly patients. One hundred eighty-five patients, aged 75 years and older, were randomized into an intervention (N = 92) and a control (N = 93) group. Members of the intervention group received a GCT consultation and were routinely followed up throughout their hospitalization. The incidence of hospital-acquired complications for the entire study population was 38%. The type and rate of hospital-acquired complications in the intervention and control groups were not significantly different. Functional status on admission and admission to the psychiatry service were predictive for the occurrence of a hospital-acquired complication. In a broadly selected population such as this, the intensity of care available through a GCT was unable to reduce the occurrence of hospital-acquired complications. However, since this is only one aspect of GCT function, and others may be of great importance, such aspects, and more targeted populations, must be evaluated before final conclusions can be reached about GCT efficacy.

  7. How random are random numbers generated using photons?

    International Nuclear Information System (INIS)

    Solis, Aldo; Angulo Martínez, Alí M; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U’Ren, Alfred B; Hirsch, Jorge G

    2015-01-01

    Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined. (paper)
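
    Borel normality can be checked directly by counting m-bit blocks. The sketch below uses the tolerance sqrt(log2(N)/N) commonly used for this test in the algorithmic-randomness literature; block lengths and sample size are illustrative.

```python
import numpy as np

def borel_normality_check(bits, max_m=3):
    """Check Borel normality: every m-bit block should occur with
    frequency close to 2**-m, within roughly sqrt(log2(N)/N)."""
    N = len(bits)
    tol = np.sqrt(np.log2(N) / N)
    ok = True
    for m in range(1, max_m + 1):
        # non-overlapping m-bit blocks read as integers
        blocks = bits[: N - N % m].reshape(-1, m)
        vals = blocks.dot(1 << np.arange(m)[::-1])
        freqs = np.bincount(vals, minlength=2**m) / len(vals)
        dev = np.abs(freqs - 2.0**-m).max()
        ok &= dev <= tol
        print(f"m={m}: max deviation {dev:.4f} (tolerance {tol:.4f})")
    return ok

rng = np.random.default_rng(5)
print(borel_normality_check(rng.integers(0, 2, size=2**16)))
```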

  8. A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data

    KAUST Repository

    Babuška, Ivo; Nobile, Fabio; Tempone, Raul

    2010-01-01

    This work proposes and analyzes a stochastic collocation method for solving elliptic partial differential equations with random coefficients and forcing terms. These input data are assumed to depend on a finite number of random variables. The method consists of a Galerkin approximation in space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space, and naturally leads to the solution of uncoupled deterministic problems as in the Monte Carlo approach. It treats easily a wide range of situations, such as input data that depend nonlinearly on the random variables, diffusivity coefficients with unbounded second moments, and random variables that are correlated or even unbounded. We provide a rigorous convergence analysis and demonstrate exponential convergence of the “probability error” with respect to the number of Gauss points in each direction of the probability space, under some regularity assumptions on the random input data. Numerical examples show the effectiveness of the method. Finally, we include a section with developments posterior to the original publication of this work. There we review sparse grid stochastic collocation methods, which are effective collocation strategies for problems that depend on a moderately large number of random variables.
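
    The collocation idea can be shown on a one-dimensional model problem with a single random variable: solve the deterministic equation at each Gauss point of the probability density and combine the results with the quadrature weights. The coefficient, quadrature order, and quantity of interest below are illustrative assumptions, not the paper's examples.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# model problem: -(a(y) u')' = 1 on (0,1), u(0) = u(1) = 0,
# with log-normal coefficient a(y) = exp(0.3*y), y ~ N(0,1)
def solve(a, n=100):
    """Deterministic finite-difference solve; returns u at the midpoint."""
    h = 1.0 / n
    A = (2 * a * np.eye(n - 1)
         - a * np.eye(n - 1, k=1)
         - a * np.eye(n - 1, k=-1)) / h**2
    u = np.linalg.solve(A, np.ones(n - 1))
    return u[(n - 1) // 2]

# collocate at the Gauss points of the standard normal density
nodes, weights = hermegauss(8)            # probabilists' Hermite rule
weights = weights / weights.sum()         # normalize to probability weights
qoi = np.array([solve(np.exp(0.3 * y)) for y in nodes])
print("E[u(1/2)] ≈", qoi @ weights)
```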

  9. Dynamic probability of reinforcement for cooperation: Random game termination in the centipede game.

    Science.gov (United States)

    Krockow, Eva M; Colman, Andrew M; Pulford, Briony D

    2018-03-01

    Experimental games have previously been used to study principles of human interaction. Many such games are characterized by iterated or repeated designs that model dynamic relationships, including reciprocal cooperation. To enable the study of infinite game repetitions and to avoid endgame effects of lower cooperation toward the final game round, investigators have introduced random termination rules. This study extends previous research that has focused narrowly on repeated Prisoner's Dilemma games by conducting a controlled experiment of two-player, random termination Centipede games involving probabilistic reinforcement and characterized by the longest decision sequences reported in the empirical literature to date (24 decision nodes). Specifically, we assessed mean exit points and cooperation rates, and compared the effects of four different termination rules: no random game termination, random game termination with constant termination probability, random game termination with increasing termination probability, and random game termination with decreasing termination probability. We found that although mean exit points were lower for games with shorter expected game lengths, the subjects' cooperativeness was significantly reduced only in the most extreme condition with decreasing computer termination probability and an expected game length of two decision nodes. © 2018 Society for the Experimental Analysis of Behavior.

  10. Stochastic noncooperative and cooperative evolutionary game strategies of a population of biological networks under natural selection.

    Science.gov (United States)

    Chen, Bor-Sen; Yeh, Chin-Hsun

    2017-12-01

    We review current static and dynamic evolutionary game strategies of biological networks and discuss the lack of random genetic variations and stochastic environmental disturbances in these models. To include these factors, a population of evolving biological networks is modeled as a nonlinear stochastic biological system with Poisson-driven genetic variations and random environmental fluctuations (stimuli). To gain insight into the evolutionary game theory of stochastic biological networks under natural selection, the phenotypic robustness and network evolvability of noncooperative and cooperative evolutionary game strategies are discussed from a stochastic Nash game perspective. The noncooperative strategy can be transformed into an equivalent multi-objective optimization problem and is shown to display significantly improved network robustness to tolerate genetic variations and buffer environmental disturbances, maintaining phenotypic traits for longer than the cooperative strategy. However, the noncooperative case requires greater effort and more compromises between partly conflicting players. Global linearization is used to simplify the problem of solving nonlinear stochastic evolutionary games. Finally, a simple stochastic evolutionary model of a metabolic pathway is simulated to illustrate the procedure of solving for two evolutionary game strategies and to confirm and compare their respective characteristics in the evolutionary process. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Optimized bioregenerative space diet selection with crew choice

    Science.gov (United States)

    Vicens, Carrie; Wang, Carolyn; Olabi, Ammar; Jackson, Peter; Hunter, Jean

    2003-01-01

    Previous studies on optimization of crew diets have not accounted for choice. A diet selection model with crew choice was developed. Scenario analyses were conducted to assess the feasibility and cost of certain crew preferences, such as preferences for numerous-desserts, high-salt, and high-acceptability foods. For comparison purposes, a no-choice and a random-choice scenario were considered. The model was found to be feasible in terms of food variety and overall costs. The numerous-desserts, high-acceptability, and random-choice scenarios all resulted in feasible solutions costing between 13.2 and 17.3 kg ESM/person-day. Only the high-sodium scenario yielded an infeasible solution. This occurred when the foods highest in salt content were selected for the crew-choice portion of the diet. This infeasibility can be avoided by limiting the total sodium content in the crew-choice portion of the diet. Cost savings were found by reducing food variety in scenarios where the preference bias strongly affected nutritional content.

  12. Holographic memories with encryption-selectable function

    Science.gov (United States)

    Su, Wei-Chia; Lee, Xuan-Hao

    2006-03-01

    Volume holographic storage has received increasing attention owing to its potential for high storage capacity and access rate. Meanwhile, encrypted holographic memory using the random phase encoding technique is attractive to the optical community due to the growing demand for protection of information. In this paper, encryption-selectable holographic storage algorithms in LiNbO3 using angular multiplexing are proposed and demonstrated. Encryption-selectable holographic memory is an advanced concept in secure storage for content protection. It offers the flexibility to encrypt the data, or not, optionally during the recording process. In our system design, the choice between encrypted and non-encrypted storage is switched by a random phase pattern and a uniform phase pattern. Based on a 90-degree geometry, the input patterns, including those for encrypted and non-encrypted storage, are stored via angular multiplexing with reference plane waves at different incident angles. An image is optionally encrypted by sliding the ground glass into one of the recording waves or removing it in each exposure. The ground glass is the key for encryption. Moreover, it is also an important key available to the authorized user to decrypt the encrypted information.

  13. Comparison between paricalcitol and active non-selective vitamin D receptor activator for secondary hyperparathyroidism in chronic kidney disease: a systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Cai, Panpan; Tang, Xiaohong; Qin, Wei; Ji, Ling; Li, Zi

    2016-04-01

    The goal of this systematic review is to evaluate the efficacy and safety of paricalcitol versus active non-selective vitamin D receptor activators (VDRAs) for secondary hyperparathyroidism (SHPT) management in chronic kidney disease (CKD) patients. PubMed, EMBASE, Cochrane Central Register of Controlled Trials (CENTRAL), clinicaltrials.gov (inception to September 2015), and ASN Web site were searched for relevant studies. A meta-analysis of randomized controlled trials (RCTs) and quasi-RCTs that assessed the effects and adverse events of paricalcitol and active non-selective VDRA in adult CKD patients with SHPT was performed using Review Manager 5.2. A total of 10 trials involving 734 patients were identified for this review. The quality of included trials was limited, and very few trials reported all-cause mortality or cardiovascular calcification without any differences between two groups. Compared with active non-selective VDRAs, paricalcitol showed no significant difference in both PTH reduction (MD -7.78, 95% CI -28.59-13.03, P = 0.46) and the proportion of patients who achieved the target reduction of PTH (OR 1.27, 95% CI 0.87-1.85, P = 0.22). In addition, no statistical differences were found in terms of serum calcium, episodes of hypercalcemia, serum phosphorus, calcium × phosphorus products, and bone metabolism index. Current evidence is insufficient, showing paricalcitol is superior to active non-selective VDRAs in lowering PTH or reducing the burden of mineral loading. Further trials are required to prove the tissue-selective effect of paricalcitol and to overcome the limitation of current research.

  14. Outcomes in registered, ongoing randomized controlled trials of patient education.

    Directory of Open Access Journals (Sweden)

    Cécile Pino

    Full Text Available BACKGROUND: With the increasing prevalence of chronic noncommunicable diseases, patient education is becoming important to strengthen disease prevention and control. We aimed to systematically determine the extent to which registered, ongoing randomized controlled trials (RCTs) evaluating an educational intervention focus on patient-important outcomes (i.e., outcomes measuring patient health status and quality of life). METHODS: On May 6, 2009, we searched for all ongoing RCTs registered in the World Health Organization International Clinical Trials Registry platform. We used a standardized data extraction form to collect data and determined whether the outcomes assessed were (1) patient-important outcomes, such as clinical events, functional status, pain, or quality of life, or (2) surrogate outcomes, such as biological outcomes, treatment adherence, or patient knowledge. PRINCIPAL FINDINGS: We selected 268 of the 642 potentially eligible studies and assessed a random sample of 150. Patient-important outcomes represented 54% (178 of 333) of all primary outcomes and 46% (286 of 623) of all secondary outcomes. Overall, 69% of trials (104 of 150) used at least one patient-important outcome as a primary outcome and 66% (99 of 150) as a secondary outcome. Finally, for 31% of trials (46 of 150), primary outcomes were only surrogate outcomes. The results varied by medical area. In neuropsychiatric disorders, patient-important outcomes represented 84% (51 of 61) of primary outcomes, as compared with 54% (32 of 59) in malignant neoplasm and 18% (4 of 22) in diabetes mellitus trials. In addition, only 35% assessed the long-term impact of interventions (i.e., >6 months). CONCLUSIONS: There is a need to improve the relevance of outcomes and to assess the long-term impact of educational interventions in RCTs.

  15. An augmented space formulation of the optical conductivity of random semiconducting alloys

    International Nuclear Information System (INIS)

    Mookerjee, A.

    1984-08-01

    A formalism has been developed for the study of the optical conductivity of disordered semiconducting alloys. The effects of off-diagonal disorder, clustering and randomness in the electron-photon interaction matrix may be incorporated within it. The aim is ultimately to study GaAs_xSb_(1-x) as well as deep levels in this alloy. (author)

  16. Assessing the Job Selection Criteria of Accounting Students: A Normative Approach

    Directory of Open Access Journals (Sweden)

    Umaru Zubairu

    2017-08-01

    Full Text Available This research assessed to what extent final-year Muslim accounting students in Malaysia considered Islamic principles when choosing a job after graduation. 356 final-year Muslim accounting students in four Malaysian universities were surveyed using an open-ended job selection scenario. The result shows that reality does not live up to the ideal. Only 16% of the respondents apply Islamic principles in making a job selection decision. The remaining 84% are more concerned with other criteria such as personal interests, salary considerations, and company reputation.

  17. Correlated randomness and switching phenomena

    Science.gov (United States)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understanding switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon by discussing data on selected examples, including heart disease and Alzheimer disease.

  18. Visual coding with a population of direction-selective neurons.

    Science.gov (United States)

    Fiscella, Michele; Franke, Felix; Farrow, Karl; Müller, Jan; Roska, Botond; da Silveira, Rava Azeredo; Hierlemann, Andreas

    2015-10-01

    The brain decodes the visual scene from the action potentials of ∼20 retinal ganglion cell types. Among the retinal ganglion cells, direction-selective ganglion cells (DSGCs) encode motion direction. Several studies have focused on the encoding or decoding of motion direction by recording multiunit activity, mainly in the visual cortex. In this study, we simultaneously recorded from all four types of ON-OFF DSGCs of the rabbit retina using a microelectronics-based high-density microelectrode array (HDMEA) and decoded their concerted activity using probabilistic and linear decoders. Furthermore, we investigated how the modification of stimulus parameters (velocity, size, angle of moving object) and the use of different tuning curve fits influenced decoding precision. Finally, we simulated ON-OFF DSGC activity, based on real data, in order to understand how tuning curve widths and the angular distribution of the cells' preferred directions influence decoding performance. We found that probabilistic decoding strategies outperformed, on average, linear methods and that decoding precision was robust to changes in stimulus parameters such as velocity. The removal of noise correlations among cells, by random shuffling trials, caused a drop in decoding precision. Moreover, we found that tuning curves are broad in order to minimize large errors at the expense of a higher average error, and that the retinal direction-selective system would not substantially benefit, on average, from having more than four types of ON-OFF DSGCs or from a perfect alignment of the cells' preferred directions. Copyright © 2015 the American Physiological Society.

  19. Methods for a multicenter randomized trial for mixed urinary incontinence: rationale and patient-centeredness of the ESTEEM trial

    Science.gov (United States)

    Sung, Vivian W.; Borello-France, Diane; Dunivan, Gena; Gantz, Marie; Lukacz, Emily S.; Moalli, Pamela; Newman, Diane K.; Richter, Holly E.; Ridgeway, Beri; Smith, Ariana L.; Weidner, Alison C.; Meikle, Susan

    2016-01-01

    Introduction Mixed urinary incontinence (MUI) can be a challenging condition to manage. We describe the protocol design and rationale for the Effects of Surgical Treatment Enhanced with Exercise for Mixed Urinary Incontinence (ESTEEM) trial, designed to compare a combined conservative and surgical treatment approach versus surgery alone for improving patient-centered MUI outcomes at 12 months. Methods ESTEEM is a multi-site, prospective, randomized trial of female participants with MUI randomized to a standardized perioperative behavioral/pelvic floor exercise intervention plus midurethral sling versus midurethral sling alone. We describe our methods and four challenges encountered during the design phase: defining the study population, selecting relevant patient-centered outcomes, determining sample size estimates using a patient-reported outcome measure, and designing an analysis plan that accommodates MUI failure rates. A central theme in the design was patient-centeredness, which guided many key decisions. Our primary outcome is patient-reported MUI symptoms measured using the Urogenital Distress Inventory (UDI) score at 12 months. Secondary outcomes include quality of life, sexual function, cost-effectiveness, time to failure and need for additional treatment. Results The final study design was implemented in November 2013 across 8 clinical sites in the Pelvic Floor Disorders Network. As of February 27, 2016, 433 of the 472 targeted participants had been randomized. Conclusions We describe the ESTEEM protocol and our methods for reaching consensus for methodological challenges in designing a trial for MUI by maintaining the patient perspective at the core of key decisions. This trial will provide information that can directly impact patient care and clinical decision-making. PMID:27287818

  20. Comparing Effectiveness of Mindfulness-Based Relapse Prevention with Treatment as Usual on Impulsivity and Relapse for Methadone-Treated Patients: A Randomized Clinical Trial.

    Science.gov (United States)

    Yaghubi, Mehdi; Zargar, Fatemeh; Akbari, Hossein

    2017-07-01

    Impulsivity is one of the causes of relapse that can affect treatment outcomes. Studies have shown that addiction treatments can reduce impulsivity in drug-dependent individuals. Studies have also suggested that mindfulness is associated with impulsivity. However, no study has investigated the effectiveness of mindfulness-based intervention on impulsivity in opioid-dependent individuals. This study aimed to compare the effectiveness of mindfulness-based relapse prevention (MBRP) with treatment as usual (TAU) in terms of impulsivity and relapse for methadone-treated patients. The present randomized controlled clinical trial was performed in Kashan, Iran, in 2015. The study population was opioid-dependent patients referred to maintenance treatment centers. Seventy patients were selected by random sampling and assigned to two groups (MBRP and TAU) randomly. Participants in both groups completed the Barratt Impulsiveness Scale (BIS-11) as a pre-test, 8 weeks later as a post-test, and 2 months later as a follow-up. Both groups received methadone therapy. The MBRP group received 8 sessions of group therapy, while the control group did not receive any group psychotherapy sessions. Finally, data from 60 patients were analyzed statistically. The MBRP group showed significant reductions in impulsivity, relapse frequency, and relapse probability. These findings suggest that MBRP is useful for opioid-dependent individuals with high levels of impulsivity, and for relapse prevention.

  1. A randomized controlled trial investigating the use of a predictive nomogram for the selection of the FSH starting dose in IVF/ICSI cycles.

    Science.gov (United States)

    Allegra, Adolfo; Marino, Angelo; Volpes, Aldo; Coffaro, Francesco; Scaglione, Piero; Gullo, Salvatore; La Marca, Antonio

    2017-04-01

    The number of oocytes retrieved is a relevant intermediate outcome in women undergoing IVF/intracytoplasmic sperm injection (ICSI). This trial compared the efficiency of the selection of the FSH starting dose according to a nomogram based on multiple biomarkers (age, day 3 FSH, anti-Müllerian hormone) versus an age-based strategy. The primary outcome measure was the proportion of women with an optimal number of retrieved oocytes defined as 8-14. At their first IVF/ICSI cycle, 191 patients underwent a long gonadotrophin-releasing hormone agonist protocol and were randomized to receive a starting dose of recombinant (human) FSH, based on their age (150 IU if ≤35 years, 225 IU if >35 years) or based on the nomogram. Optimal response was observed in 58/92 patients (63%) in the nomogram group and in 42/99 (42%) in the control group (+21%, 95% CI = 0.07 to 0.35, P = 0.0037). No significant differences were found in the clinical pregnancy rate or the number of embryos cryopreserved per patient. The study showed that the FSH starting dose selected according to ovarian reserve is associated with an increase in the proportion of patients with an optimal response: large trials are recommended to investigate any possible effect on the live-birth rate. Copyright © 2017 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  2. Selective Gaseous Extraction: Research, Development and Training for Isotope Production, Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Bertch, Timothy C. [General Atomics]

    2014-03-31

    General Atomics and the University of Missouri Research Reactor (MURR) completed research and development of selective gaseous extraction of fission products from irradiated fuel, which included training and education of MURR students. The process used porous fuel and, after irradiation, flowed product gases through the fuel to selectively remove desired fission products, with the primary goal of demonstrating the removal of rhodium-105. High removal rates for the ruthenium/rhodium (Ru/Rh), tellurium/iodine (Te/I) and molybdenum/technetium (Mo/Tc) series were demonstrated. The success of this research provides for the reuse of the target for further production, significantly reducing the production of actinide wastes relative to processes that dissolve the target. This effort was conducted under DOE funding (DE-SC0007772). General Atomics' objective for the project was to conduct R&D on alternative methods to produce a number of radioactive isotopes currently needed for medical and industry applications, including rhodium-105 and other useful isotopes. Selective gaseous extraction was shown to be effective at removing radioisotopes of the ruthenium/rhodium, tellurium/iodine and molybdenum/technetium decay chains while having trace to no quantities of other fission products or actinides. This adds a new, credible method to the area of commercial isotope production beyond current techniques, while providing a significant potential reduction of process wastes. Waste reduction, along with reduced processing time and cost, provides superior economic feasibility, which may allow domestic production under full-cost-recovery practices. This provides the potential for improved access to domestically produced isotopes for medical diagnostics and treatment at reduced cost, providing for the public good.

  3. Immediate movement history influences reach-to-grasp action selection in children and adults.

    Science.gov (United States)

    Kent, Samuel W; Wilson, Andrew D; Plumb, Mandy S; Williams, Justin H G; Mon-Williams, Mark

    2009-01-01

    Action selection is subject to many biases. Immediate movement history is one such bias seen in young infants. Is this bias strong enough to affect adult behavior? Adult participants reached and grasped a cylinder positioned to require either pronation or supination of the hand. Successive cylinder positions changed either randomly or systematically between trials. Random positioning led to optimized economy of movement. In contrast, systematic changes in position biased action selection toward previously selected actions at the expense of movement economy. Thus, one switches to a new movement only when the savings outweigh the costs of the switch. Immediate movement history had an even larger influence on children aged 7-15 years. This suggests that switching costs are greater in children, which is consistent with their reduced grasping experience. The presence of this effect in adults suggests that immediate movement history exerts a more widespread and pervasive influence on patterns of action selection than researchers had previously recognized.

  4. Career preferences of final year medical students at a medical school in Kenya--A cross sectional study.

    Science.gov (United States)

    Dossajee, Hussein; Obonyo, Nchafatso; Ahmed, Syed Masud

    2016-01-11

    The World Health Organization (WHO) recommended physician to population ratio is 23:10,000. Kenya has a physician to population ratio of 1.8:10,000 and is among 57 countries listed as having a serious shortage of health workers. Approximately 52% of physicians work in urban areas, 6% in rural and 42% in peri-urban locations. This study explored factors influencing the choice of career specialization and location for practice among final year medical students by gender. A descriptive cross-sectional study was carried out on final year students in 2013 at the University of Nairobi's School of Medicine in Kenya. Sample size was calculated at 156 students for simple random sampling. Data collected using a pre-tested self-administered questionnaire included socio-demographic characteristics of the population, first and second choices for specialization. Outcome variables collected were factors affecting choice of specialty and location for practice. Bivariate analysis by gender was carried out between the listed factors and outcome variables with calculation of odds ratios and chi-square statistics at an alpha level of significance of 0.05. Factors included in a binomial logistic regression model were analysed to score the independent categorical variables affecting choice of specialty and location of practice. Internal medicine, Surgery, Obstetrics/Gynaecology and Paediatrics accounted for 58.7% of all choices of specialization. Female students were less likely to select Obs/Gyn (OR 0.41, 95% CI = 0.17-0.99) and Surgery (OR 0.33, 95% CI = 0.13-0.86) but eight times more likely to select Paediatrics (OR 8.67, 95% CI = 1.91-39.30). Surgery was primarily selected because of the 'perceived prestige of the specialty' (OR 4.3, 95% CI = 1.35-14.1). Paediatrics was selected due to 'ease of raising a family' (OR 4.08, 95% CI = 1.08-15.4). Rural origin increased the odds of practicing in a rural area (OR 2.5, 95% CI = 1.04-6.04). Training abroad was more likely

  5. Final disposal of spent fuel in the Finnish bedrock

    International Nuclear Information System (INIS)

    1992-12-01

    Teollisuuden Voima Oy (TVO) is preparing for the final disposal of spent nuclear fuel from the Olkiluoto nuclear power plant (TVO-I and TVO-II reactors). According to present estimates, a total of 1840 tU of spent fuel will be accumulated during the 40-year lifetime of the power plant. An interim storage facility for spent fuel (TVO-KPA Store) has operated at Olkiluoto since 1987. The spent fuel will be held in storage for several decades before it is shipped to the repository site. Both train and road transportation are possible. The spent fuel will be encapsulated in composite copper and steel canisters (ACP Canister) in a facility that will be built above ground on the site where the repository is located. The repository will be constructed at a depth of several hundred meters in the bedrock. In 1987 five areas were selected for preliminary site investigations. The safety analysis (TVO-92) that was carried out shows that the proposed safety criteria would be met at each of the candidate sites. Under expected future conditions there would never be significant releases of radioactive substances to the biosphere. The site investigations will be continued in the period 1993 to 2000. In parallel, an R and D programme will be devoted to the safety and technology of final disposal. The site for final disposal will be selected in the year 2000, with the aim of having the capability to start disposal operations in 2020

  6. Particle swarm optimization-based automatic parameter selection for deep neural networks and its applications in large-scale and high-dimensional data

    Science.gov (United States)

    2017-01-01

    In this paper, we propose a new automatic hyperparameter selection approach for determining the optimal network configuration (network structure and hyperparameters) for deep neural networks using particle swarm optimization (PSO) in combination with a steepest gradient descent algorithm. In the proposed approach, network configurations are coded as real-valued m-dimensional vectors, the individuals of the PSO algorithm in the search procedure. During the search procedure, the PSO algorithm is employed to search for optimal network configurations via the particles moving in a finite search space, and the steepest gradient descent algorithm is used to train the DNN classifier for a few training epochs (to find a local optimal solution) during the population evaluation of PSO. After the optimization scheme, the steepest gradient descent algorithm is run with more epochs and with the final solutions (pbest and gbest) of the PSO algorithm to train a final ensemble model and individual DNN classifiers, respectively. The local search ability of the steepest gradient descent algorithm and the global search capabilities of the PSO algorithm are exploited to determine an optimal solution that is close to the global optimum. We conducted several experiments on hand-written character and biological activity prediction datasets to show that the DNN classifiers trained with the network configurations expressed by the final solutions of the PSO algorithm, used to construct an ensemble model and individual classifiers, outperform the random approach in terms of generalization performance. Therefore, the proposed approach can be regarded as an alternative tool for automatic network structure and parameter selection for deep neural networks. PMID:29236718
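
    The outer search loop is ordinary particle swarm optimization over a box of hyperparameters. In the paper the fitness of a particle is the validation loss of a DNN trained for a few epochs; here it is replaced by a cheap smooth surrogate so the sketch runs standalone, and the inertia and acceleration constants are conventional defaults, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(6)

def pso(fitness, lo, hi, n_particles=12, iters=30, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO minimizing fitness over the box [lo, hi]^d."""
    d = len(lo)
    x = rng.uniform(lo, hi, (n_particles, d))
    v = np.zeros((n_particles, d))
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    g = pbest_f.argmin()
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest_f.argmin()
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f

# surrogate for "validation loss of a briefly trained network" at
# configuration p = (learning rate, hidden units); purely illustrative
loss = lambda p: (p[0] - 1e-3) ** 2 * 1e6 + (p[1] - 128) ** 2 / 1e4
best, f = pso(loss, lo=np.array([1e-5, 16.0]), hi=np.array([1e-1, 512.0]))
print("best (learning rate, hidden units):", best, "loss:", f)
```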

  7. Dimer coverings on random multiple chains of planar honeycomb lattices

    International Nuclear Information System (INIS)

    Ren, Haizhen; Zhang, Fuji; Qian, Jianguo

    2012-01-01

    We study dimer coverings on random multiple chains. A multiple chain is a planar honeycomb lattice constructed by successively fusing copies of a ‘straight’ condensed hexagonal chain at the bottom of the previous one in two possible ways. A random multiple chain is then generated by admitting the Bernoulli distribution on the two types of fusing, which describes a zeroth-order Markov process. We determine the expectation of the number of the pure dimer coverings (perfect matchings) over the ensemble of random multiple chains by the transfer matrix approach. Our result shows that, with only two exceptions, the average of the logarithm of this expectation (i.e., the annealed entropy per dimer) is asymptotically nonzero when the fusing process goes to infinity and the length of the hexagonal chain is fixed, though it is zero when the fusing process and the length of the hexagonal chain go to infinity simultaneously. Some numerical results are provided to support our conclusion, from which we can see that the asymptotic behavior fits well to the theoretical results. We also apply the transfer matrix approach to the quenched entropy and reveal that the quenched entropy of random multiple chains has a close connection with the well-known Lyapunov exponent of random matrices. Using the theory of Lyapunov exponents we show that, for some random multiple chains, the quenched entropy per dimer is strictly smaller than the annealed one when the fusing process goes to infinity. Finally, we determine the expectation of the free energy per dimer over the ensemble of the random multiple chains in which the three types of dimers in different orientations are distinguished, and specify a series of non-random multiple chains whose free energy per dimer is asymptotically equal to this expectation. (paper)

  8. Pseudo-Random Sequences Generated by a Class of One-Dimensional Smooth Map

    International Nuclear Information System (INIS)

    Wang Xing-Yuan; Qin Xue; Xie Yi-Xin

    2011-01-01

    We extend a class of one-dimensional smooth maps. We show that, over each desired interval of the parameter, the map's Lyapunov exponent is positive. We then propose a novel parameter perturbation method based on this property of the extended one-dimensional smooth map: the parameter r is perturbed at each iteration by the real number x_i generated by that iteration. Finally, the auto-correlation function and the NIST statistical test suite are used to demonstrate the randomness of the method. We provide an application of this method in image encryption. Experiments show that the pseudo-random sequences are suitable for this application. (general)
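
    A minimal sketch of the perturbation idea follows, with the logistic map standing in for the paper's extended smooth map (the map, the perturbation scale eps, and the thresholding to bits are all assumptions for illustration): the parameter r is nudged at every step by the current iterate, and the resulting bit stream is checked with a simple autocorrelation test.

```python
import numpy as np

def perturbed_map_sequence(n, x0=0.3, r0=3.99, eps=0.005):
    """Iterate a chaotic map, perturbing the parameter r at every step by
    the current iterate (logistic map used as a stand-in; eps is an
    assumed perturbation scale that keeps r in the chaotic regime)."""
    x, r = x0, r0
    xs = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        r = 3.99 + eps * (x - 0.5)   # r stays in roughly [3.9875, 3.9925]
        xs[i] = x
    return xs

xs = perturbed_map_sequence(100_000)
bits = (xs > 0.5).astype(int)        # threshold to a pseudo-random bit stream

# Normalized autocorrelation of the centered bit stream at small lags;
# values near 0 for lag > 0 are consistent with good randomness.
b = bits - bits.mean()
for lag in range(1, 6):
    ac = np.dot(b[:-lag], b[lag:]) / np.dot(b, b)
    print(f"lag {lag}: autocorrelation = {ac:+.4f}")
```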

  9. Robust Fuzzy Control for Fractional-Order Uncertain Hydroturbine Regulating System with Random Disturbances

    Directory of Open Access Journals (Sweden)

    Fengjiao Wu

    2016-01-01

    The robust fuzzy control of fractional-order hydroturbine regulating systems is studied in this paper. First, a more practical fractional-order hydroturbine regulating system with uncertain parameters and random disturbances is presented. Then, on the basis of interval matrix theory and a fractional-order stability theorem, a fuzzy control method is proposed for the fractional-order hydroturbine regulating system, and the stability condition is expressed as a group of linear matrix inequalities. Furthermore, the proposed method is robust to external random disturbances and uncertain parameters. Finally, the validity and superiority of the method are demonstrated by numerical simulations.
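
    To make "stability condition expressed as a group of linear matrix inequalities" concrete, the sketch below checks the classical integer-order Lyapunov LMIs with cvxpy for a stand-in linear model; the paper's actual conditions involve fractional orders, interval matrices, and fuzzy-rule local models, so this is only an analogue, not the paper's method.

```python
import numpy as np
import cvxpy as cp

# A stable test matrix standing in for one local (fuzzy-rule) linear model
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Lyapunov LMIs: P > 0 and A^T P + P A < 0 certify asymptotic stability.
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("LMI feasible:", prob.status == cp.OPTIMAL)
print("P =\n", P.value)
```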

  10. The effect of morphometric atlas selection on multi-atlas-based automatic brachial plexus segmentation

    International Nuclear Information System (INIS)

    Van de Velde, Joris; Wouters, Johan; Vercauteren, Tom; De Gersem, Werner; Achten, Eric; De Neve, Wilfried; Van Hoof, Tom

    2015-01-01

    The present study aimed to measure the effect of a morphometric atlas selection strategy on the accuracy of multi-atlas-based brachial plexus (BP) autosegmentation using the commercially available software package ADMIRE® and to determine the optimal number of selected atlases to use. Autosegmentation accuracy was measured by comparing all generated automatic BP segmentations with anatomically validated gold standard segmentations that were developed using cadavers. Twelve cadaver computed tomography (CT) atlases were included in the study. One atlas was selected as a patient in ADMIRE®, and multi-atlas-based BP autosegmentation was first performed with a group of morphometrically preselected atlases. In this group, the atlases were selected on the basis of similarity to the patient in shoulder protraction position. The number of selected atlases started at two and increased up to eight. Subsequently, a group of randomly chosen, non-selected atlases was used. In this second group, every possible combination of 2 to 8 random atlases was used for multi-atlas-based BP autosegmentation. For both groups, the average Dice similarity coefficient (DSC), Jaccard index (JI) and inclusion index (INI) were calculated, measuring the similarity of the generated automatic BP segmentations to the gold standard segmentation. Similarity indices of both groups were compared using an independent-sample t-test, and the optimal number of selected atlases was investigated using an equivalence trial. For each number of atlases, the average similarity indices of the morphometrically selected atlas group were significantly higher than those of the random group (p < 0.05). In this study, the highest similarity indices were achieved using multi-atlas autosegmentation with 6 selected atlases (average DSC = 0.598; average JI = 0.434; average INI = 0.733). Morphometric atlas selection on the basis of the protraction position of the patient significantly improves multi-atlas-based BP autosegmentation accuracy.
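
    For reference, the three similarity indices reported above are straightforward to compute on binary masks; a minimal numpy sketch follows. The inclusion index here is taken as the fraction of the gold standard covered by the automatic segmentation, which is one common definition; the paper's exact definition may differ.

```python
import numpy as np

def similarity_indices(auto, gold):
    """Dice similarity coefficient, Jaccard index and an inclusion index
    for two binary masks."""
    auto, gold = auto.astype(bool), gold.astype(bool)
    inter = np.logical_and(auto, gold).sum()
    union = np.logical_or(auto, gold).sum()
    dsc = 2.0 * inter / (auto.sum() + gold.sum())
    ji = inter / union
    ini = inter / gold.sum()   # assumed definition of the inclusion index
    return dsc, ji, ini

# Toy 2-D "segmentations" for illustration
auto = np.zeros((10, 10), dtype=bool); auto[2:7, 2:7] = True
gold = np.zeros((10, 10), dtype=bool); gold[3:8, 3:8] = True
print("DSC=%.3f, JI=%.3f, INI=%.3f" % similarity_indices(auto, gold))
```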

  11. Random matrix approach to cross correlations in financial data

    Science.gov (United States)

    Plerou, Vasiliki; Gopikrishnan, Parameswaran; Rosenow, Bernd; Amaral, Luís A.; Guhr, Thomas; Stanley, H. Eugene

    2002-06-01

    We analyze cross correlations between price fluctuations of different stocks using methods of random matrix theory (RMT). Using two large databases, we calculate cross-correlation matrices C of returns constructed from (i) 30-min returns of 1000 US stocks for the 2-yr period 1994-1995, (ii) 30-min returns of 881 US stocks for the 2-yr period 1996-1997, and (iii) 1-day returns of 422 US stocks for the 35-yr period 1962-1996. We test the statistics of the eigenvalues λ_i of C against a "null hypothesis" of a random correlation matrix constructed from mutually uncorrelated time series. We find that a majority of the eigenvalues of C fall within the RMT bounds [λ_-, λ_+] for the eigenvalues of random correlation matrices. We test the eigenvalues of C within the RMT bound for universal properties of random matrices and find good agreement with the results for the Gaussian orthogonal ensemble of random matrices, implying a large degree of randomness in the measured cross-correlation coefficients. Further, we find that the distribution of eigenvector components for the eigenvectors corresponding to the eigenvalues outside the RMT bound displays systematic deviations from the RMT prediction. In addition, we find that these "deviating eigenvectors" are stable in time. We analyze the components of the deviating eigenvectors and find that the largest eigenvalue corresponds to an influence common to all stocks. Our analysis of the remaining deviating eigenvectors shows distinct groups, whose identities correspond to conventionally identified business sectors. Finally, we discuss applications to the construction of portfolios of stocks that have a stable ratio of risk to return.
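
    The null-hypothesis test above can be reproduced in a few lines: for N uncorrelated time series of length T, the eigenvalues of the empirical correlation matrix should fall inside the Marchenko-Pastur bounds λ± = (1 ± sqrt(N/T))². The sketch uses simulated Gaussian "returns"; with real market data, the few eigenvalues above λ+ would carry the market mode and sector structure discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

N, T = 400, 2000                       # number of stocks, number of returns
returns = rng.standard_normal((T, N))  # mutually uncorrelated "returns" (null model)

C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

# RMT bounds for the eigenvalue density of a random correlation matrix
# (Marchenko-Pastur with Q = T/N and unit variance):
Q = T / N
lam_minus = (1 - np.sqrt(1 / Q)) ** 2
lam_plus = (1 + np.sqrt(1 / Q)) ** 2

outside = np.sum((eigvals < lam_minus) | (eigvals > lam_plus))
print(f"RMT bounds: [{lam_minus:.3f}, {lam_plus:.3f}]")
print(f"{outside} of {N} eigenvalues fall outside the bounds")
```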

  12. Selective retrieval of memory and concept sequences through neuro-windows

    OpenAIRE

    Kakeya, Hideki; Okabe, Yoichi

    1999-01-01

    This letter presents a cross-correlational associative memory model which realizes selective retrieval of pattern sequences. When hierarchically correlated sequences are memorized, sequences of the correlational centers can be defined as the concept sequences. The authors propose a modified neuro-window method which enables selective retrieval of memory sequences and concept sequences. It is also shown that the proposed model realizes capacity expansion of the memory which stores random sequences.
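
    The underlying cross-correlational memory can be sketched with the standard asymmetric Hebbian rule, in which pattern μ is wired to recall pattern μ+1 so that synchronous updates walk through the stored sequence. The neuro-window modification and the hierarchical concept sequences from the letter are not reproduced here; this is only the base mechanism.

```python
import numpy as np

rng = np.random.default_rng(3)

N, L = 400, 5                              # neurons, sequence length
xi = rng.choice([-1, 1], size=(L, N))      # random +/-1 pattern sequence

# Cross-correlation (asymmetric Hebbian) rule: pattern mu maps to mu+1.
W = sum(np.outer(xi[(m + 1) % L], xi[m]) for m in range(L)) / N

s = np.sign(xi[0] + 0.3 * rng.standard_normal(N))  # noisy cue of pattern 0
for t in range(1, L + 1):
    s = np.sign(W @ s)
    overlap = (s @ xi[t % L]) / N          # +1 means perfect recall
    print(f"step {t}: overlap with pattern {t % L} = {overlap:+.3f}")
```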

  13. Customized sequential designs for random simulation experiments: Kriging metamodeling and bootstrapping

    NARCIS (Netherlands)

    Beers, van W.C.M.; Kleijnen, J.P.C.

    2005-01-01

    This paper proposes a novel method to select an experimental design for interpolation in random simulation, especially discrete event simulation. (Though the paper focuses on Kriging, this design approach may also apply to other types of metamodels such as linear regression models.) Assuming that
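
    A simplified version of such a sequential Kriging design is sketched below: fit a Gaussian-process (Kriging) metamodel to the simulation inputs/outputs collected so far, then add the candidate input with the largest predictive uncertainty. The paper estimates that uncertainty via cross-validation and bootstrapping; this sketch substitutes the GP's own standard error, and the simulation function is a stand-in.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

def simulate(x):
    """Stand-in for one run of a random (discrete-event) simulation."""
    return np.sin(3 * x) + 0.1 * rng.standard_normal(x.shape)

# Small initial design on [0, 1], then grow it sequentially
X = np.array([[0.0], [0.5], [1.0]])
y = simulate(X[:, 0])
candidates = np.linspace(0, 1, 201).reshape(-1, 1)

for step in range(5):
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.01).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]     # most uncertain candidate wins
    X = np.vstack([X, x_new])
    y = np.append(y, simulate(x_new))
    print(f"step {step}: added design point x = {x_new[0]:.3f}")
```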

  14. Goal selection versus process control while learning to use a brain-computer interface

    Science.gov (United States)

    Royer, Audrey S.; Rose, Minn L.; He, Bin

    2011-06-01

    A brain-computer interface (BCI) can be used to accomplish a task without requiring motor output. Two major control strategies used by BCIs during task completion are process control and goal selection. In process control, the user exerts continuous control and independently executes the given task. In goal selection, the user communicates their goal to the BCI and then receives assistance executing the task. A previous study has shown that goal selection is more accurate and faster in use. An unanswered question is: which control strategy is easier to learn? This study directly compares goal selection and process control while learning to use a sensorimotor rhythm-based BCI. Twenty young healthy human subjects were randomly assigned to either a goal-selection or a process-control-based paradigm for eight sessions. At the end of the study, the best user from each paradigm completed two additional sessions using all paradigms randomly mixed. This study found that goal selection required a shorter training period and yielded greater speed, accuracy, and information transfer than process control. These results held for the best subjects as well as in the general subject population. The demonstrated characteristics of goal selection make it a promising option to increase the utility of BCIs intended for both disabled and able-bodied users.

  15. Auditory detection of an increment in the rate of a random process

    International Nuclear Information System (INIS)

    Brown, W.S.; Emmerich, D.S.

    1994-01-01

    Recent experiments have presented listeners with complex tonal stimuli consisting of components with values (i.e., intensities or frequencies) randomly sampled from probability distributions [e.g., R. A. Lutfi, J. Acoust. Soc. Am. 86, 934-944 (1989)]. In the present experiment, brief tones were presented at intervals corresponding to the intensity (rate) of a random process. Specifically, the intervals between tones were randomly selected from exponential probability functions. Listeners were asked to decide whether the tones presented during a defined observation interval represented a "noise" process alone or the "noise" with a "signal" process added to it. The number of tones occurring in any observation interval is a Poisson variable; receiver operating characteristics (ROCs) arising from Poisson processes have been considered by Egan [Signal Detection Theory and ROC Analysis (Academic, New York, 1975)]. Several sets of noise and signal intensities and observation interval durations were selected which were expected to yield equivalent performance. Rating ROCs were generated based on subjects' responses in a single-interval, yes-no task. The performance levels achieved by listeners and the effects of intensity and duration are compared to those predicted for an ideal observer.
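
    The ideal-observer setting described here is easy to simulate: tone counts in the observation interval are Poisson with rate λ_N (noise alone) or λ_N + λ_S (noise plus signal), and sweeping a count criterion traces out the ROC. The rates and duration below are illustrative, not the experiment's values.

```python
import numpy as np

rng = np.random.default_rng(5)

dur = 1.0                           # observation-interval duration (s), assumed
lam_noise, lam_signal = 8.0, 4.0    # assumed "noise" and added "signal" rates (tones/s)
n_trials = 20_000

# Tone counts in the interval are Poisson under each hypothesis
counts_n = rng.poisson(lam_noise * dur, n_trials)                   # noise alone
counts_sn = rng.poisson((lam_noise + lam_signal) * dur, n_trials)   # noise + signal

# ROC: sweep a count criterion k, respond "signal" if count >= k
ks = np.arange(0, counts_sn.max() + 2)
fa = np.array([(counts_n >= k).mean() for k in ks])    # false-alarm rate
hit = np.array([(counts_sn >= k).mean() for k in ks])  # hit rate

# Area under the ROC via the trapezoid rule (fa decreases as k grows)
auc = np.sum((fa[:-1] - fa[1:]) * (hit[:-1] + hit[1:]) / 2)
print(f"ideal-observer AUC for these rates: {auc:.3f}")
```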

  16. 13 CFR 108.380 - Final approval as a NMVC Company.

    Science.gov (United States)

    2010-01-01

    ... VENTURE CAPITAL (“NMVC”) PROGRAM Evaluation and Selection of NMVC Companies § 108.380 Final approval as a... amount of Regulatory Capital set forth in its application, pursuant to § 108.310(a)(1); and (B) The... at least 30 percent of its Regulatory Capital if the Conditionally Approved NMVC Company— (i) Already...

  17. Neuronal effects of nicotine during auditory selective attention.

    Science.gov (United States)

    Smucny, Jason; Olincy, Ann; Eichman, Lindsay S; Tregellas, Jason R

    2015-06-01

    Although the attention-enhancing effects of nicotine have been behaviorally and neurophysiologically well-documented, its localized functional effects during selective attention are poorly understood. In this study, we examined the neuronal effects of nicotine during auditory selective attention in healthy human nonsmokers. We hypothesized that we would observe significant effects of nicotine in attention-associated brain areas, driven by nicotine-induced increases in activity as a function of increasing task demands. A single-blind, prospective, randomized crossover design was used to examine neuronal response associated with a go/no-go task after 7 mg nicotine or placebo patch administration in 20 individuals who underwent functional magnetic resonance imaging at 3T. The task design included two levels of difficulty (ordered vs. random stimuli) and two levels of auditory distraction (silence vs. noise). Significant treatment × difficulty × distraction interaction effects on neuronal response were observed in the hippocampus, ventral parietal cortex, and anterior cingulate. Contrary to our hypothesis, U-shaped and inverted-U-shaped dependencies between the effects of nicotine on response and task demands were observed, depending on the brain area. These results suggest that nicotine may differentially affect neuronal response depending on task conditions. These results have important theoretical implications for understanding how cholinergic tone may influence the neurobiology of selective attention.

  18. Lamplighter model of a random copolymer adsorption on a line

    Directory of Open Access Journals (Sweden)

    L.I. Nazarov

    2014-09-01

    We present a model of the sequential self-packaging of an AB-diblock random copolymer with local quenched interactions on a one-dimensional infinite sticky substrate. It is assumed that A-A and B-B contacts are favorable, while A-B contacts are not. The position of each newly added monomer is selected so as to minimize the local contact energy. The model demonstrates self-organization behavior with a nontrivial dependence of the total energy E (the number of unfavorable contacts) on the number of chain monomers N: E ~ N^(3/4) for a quenched random, equally probable distribution of A- and B-monomers along the chain. The model is treated by mapping it onto the "lamplighter" random walk and a diffusion-controlled chemical reaction of the X+X → 0 type with subdiffusive motion of the reagents.

  19. Kalman Filtering for Discrete Stochastic Systems with Multiplicative Noises and Random Two-Step Sensor Delays

    Directory of Open Access Journals (Sweden)

    Dongyan Chen

    2015-01-01

    This paper is concerned with the optimal Kalman filtering problem for a class of discrete stochastic systems with multiplicative noises and random two-step sensor delays. Three Bernoulli-distributed random variables with known conditional probabilities are introduced to characterize the random two-step sensor delays that may occur during data transmission. By using the state augmentation approach and the innovation analysis technique, an optimal Kalman filter is constructed for the augmented system in the sense of the minimum mean square error (MMSE). Subsequently, the optimal Kalman filter is derived for the corresponding augmented system at the initial instants. Finally, a simulation example is provided to demonstrate the feasibility and effectiveness of the proposed filtering method.
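
    The state augmentation idea can be shown in a much-reduced setting: augment the state with its previous value so a one-step-delayed measurement becomes a linear function of the augmented state. Unlike the paper, this sketch assumes the Bernoulli delay indicator is observable at each step (so a standard Kalman filter applies) and omits multiplicative noises and two-step delays; all system matrices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative 2-state system
A = np.array([[1.0, 0.1], [0.0, 0.9]])
C = np.array([[1.0, 0.0]])
Qw, Rv = 0.01 * np.eye(2), np.array([[0.04]])
p_delay = 0.3                  # Bernoulli probability of a one-step delay

# Augmented state z_k = [x_k; x_{k-1}]
n = 2
F = np.block([[A, np.zeros((n, n))], [np.eye(n), np.zeros((n, n))]])
Gq = np.block([[Qw, np.zeros((n, n))], [np.zeros((n, n)), np.zeros((n, n))]])
H_now = np.hstack([C, np.zeros_like(C)])   # measurement of the current state
H_del = np.hstack([np.zeros_like(C), C])   # measurement of the delayed state

x, x_prev = np.array([1.0, 0.0]), np.array([1.0, 0.0])
z_hat, P = np.zeros(2 * n), np.eye(2 * n)

for k in range(50):
    # Simulate the plant and a possibly delayed measurement
    x_prev, x = x, A @ x + rng.multivariate_normal(np.zeros(n), Qw)
    H = H_del if rng.random() < p_delay else H_now
    y = H @ np.concatenate([x, x_prev]) + rng.normal(0, np.sqrt(Rv[0, 0]), 1)

    # Kalman filter on the augmented system (delay indicator assumed known)
    z_hat = F @ z_hat
    P = F @ P @ F.T + Gq
    S = H @ P @ H.T + Rv
    K = P @ H.T @ np.linalg.inv(S)
    z_hat = z_hat + K @ (y - H @ z_hat)
    P = (np.eye(2 * n) - K @ H) @ P

print(f"final estimation error: {np.linalg.norm(z_hat[:n] - x):.3f}")
```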

  20. Optimal redundant systems for works with random processing time

    International Nuclear Information System (INIS)

    Chen, M.; Nakagawa, T.

    2013-01-01

    This paper studies optimal redundancy policies for a manufacturing system processing jobs with random working times. The redundant units of the parallel and standby systems are subject to stochastic failures during the continuous production process. First, a job consisting of a single work is considered for both redundant systems, and the expected cost functions are obtained. Next, each redundant system is assumed to have a random number of units for a single work; the expected cost functions and the optimal expected numbers of units are derived. Subsequently, production processes of N tandem works are introduced for the parallel and standby systems, and the expected cost functions are summarized. Finally, the number of works is estimated by a Poisson distribution for the parallel and standby systems. Numerical examples are given to demonstrate the optimization problems of redundant systems.
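
    A toy version of the single-work parallel-system trade-off can make the optimization concrete: n units each fail exponentially, the work time is also exponential, and the expected cost balances a per-unit cost against a failure penalty. The cost structure and all rates below are assumptions for illustration, not the paper's model; the failure probability is evaluated in closed form via a binomial expansion.

```python
from math import comb

lam = 1.0      # unit failure rate (assumed)
theta = 0.5    # work-completion rate: mean working time 1/theta (assumed)
c_unit = 1.0   # cost per redundant unit (assumed)
c_fail = 50.0  # penalty if every unit fails before the work completes (assumed)

def p_system_failure(n):
    """P(all n parallel units fail before an exp(theta) work finishes):
    E[(1 - e^(-lam*T))^n] with T ~ exp(theta), via binomial expansion."""
    return sum(comb(n, k) * (-1) ** k * theta / (theta + k * lam)
               for k in range(n + 1))

def expected_cost(n):
    return c_unit * n + c_fail * p_system_failure(n)

costs = {n: expected_cost(n) for n in range(1, 15)}
n_star = min(costs, key=costs.get)
print(f"optimal number of units: {n_star} (expected cost {costs[n_star]:.3f})")
```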